Transcript
Danny Savard: Hello, everyone. We'll just give it a moment for people to join. Just a few more seconds. All right, I think we're good to go. Well, welcome, everyone, to our webinar. My name is Danny Savard. I'm the senior director of marketing at Kyligence. In this webinar, we want to talk about how to deliver projects in a challenging economy with low-code platforms. Really, what we want to show you today is that we can enable business users to perform self-service without relying too heavily on a data team. That means data teams can be more effective and more efficient, they can respond quicker, and they can better satisfy their internal audience and customers. And business users gain autonomy: they have the ability to create, manage, and derive their own metrics. Obviously the goal here is a balance between autonomy and governance, but we're not going to talk about data ops today. Really, what we want to show you is an example of a platform where business users can derive their own metrics with a GUI and visualize them very simply inside of Microsoft Excel. I also want to encourage you to stay until the very end. Spoiler alert: we have a free try-and-buy and an incredible offer of $1 an hour for utilization of the platform we're going to show you. So stick around for that. The format is very simple: we're going to do a quick presentation, a quick demo, and then we're going to answer your questions at the very end. With that, I'd like to introduce Harry Kurniawan, our Senior Solutions Architect, to take us through the presentation.

Harry Kurniawan: Hi, everyone. Welcome to today's webinar. My name is Harry Kurniawan. I'm a senior solutions architect with Kyligence. Today I'd like to share with you how you can be a data hero in this challenging economy. In my presentation, I will start with the promise of decision making using data, and then talk about the challenges in delivering the data projects needed to support that decision making. Later, I will introduce a no-code metrics platform to address the challenges I mentioned, and finally we'll have a Q&A session. If you are here today, chances are you are dealing with data directly or indirectly. Some of you may be involved in data projects or even leading data projects, so these challenges in delivering data projects may sound familiar to you: especially the long journey, or process, to unlock insight from data. And then a slow economy may impact your data projects as well. I will talk more about these challenges on the next slides. The good news is you can use a no-code metrics platform to overcome these challenges. First, the long journey to get insight from data. You probably know that data needs to be processed before you can get insight from it. These are examples of data sources you may encounter: CRM (customer relationship management) data, online sales data, Excel files, which are very popular data sources, and POS data from point-of-sale systems. Typically, data engineers load this data from the sources into a data warehouse. Once the data is in the data warehouse, BI developers or analysts can create visualizations or dashboards so business users can start analyzing the data. In some cases, business users export the data to Excel for wider distribution or further analysis. A slow economy adds additional problems and risks. You have probably heard the news about workforce reductions in this economic climate.
Many organizations also cut their budgets to save money; no surprise there. And finally, competing data projects. There's nothing new here: business functions and departments need data to make decisions, so departments like finance, sales, or operations create their own data projects, and these projects all compete for the same resources, which is IT resources. A no-code metrics platform can help alleviate the pain I mentioned earlier by allowing business users with no coding skills to build their own metrics and analyze data quickly, meaning less reliance on IT resources. As a result, data analysis can be done in hours versus weeks, and more cheaply than with a traditional data warehouse approach. And since the metrics platform is a SaaS offering, you can just sign up and be ready to go. Now I'd like to introduce you to Zen, the no-code metrics platform offering from Kyligence. It offers no-code capability, so business users with no coding skills can create their own metrics and analyze data quickly. Zen saves money by supporting unlimited users and billing based on actual usage. Zen is also budget friendly: it starts at $1 per hour. As a comparison, Tableau Cloud charges about $70 per user per month, so you can realize significant cost savings with Zen. It saves time and promotes collaboration: business users can reuse and share metrics across multiple projects or teams. And this is my favorite: the live connection. Why is it my favorite? Because I don't have to export data into Excel and then email it to everybody or drop it in Dropbox. So no messy emailing, no messy exporting, and better yet, when the data in the metrics platform is updated, I don't have to re-export it to Excel. That saves me time and avoids confusion about which Excel file contains the most recent data. Even better, since Excel is available on desktop and mobile devices, you can access your data from a desktop, an iPad or tablet, or even a smartphone. Now let's talk about the demo scenario. Suppose your organization wants to cut expenses across the board, and the CFO asks you to analyze cloud spending data, hoping that you can present some recommendations to reduce spending. Unfortunately, IT has a project backlog; it will take weeks to load the cloud spending data into the data warehouse before you can analyze it. So what should you do? Well, you can be a data hero. In the next demo, I will show you how an end user with no coding skills can load cloud spending data without IT help, then create metrics, and then analyze the data using attribute analysis, so you can generate ideas from the insights to reduce cloud spending. And finally, I will demonstrate how easy it is to connect Excel to Zen with a live connection. To recap: in this demo, I will show how to load cloud spending data without IT help, then create metrics and perform root cause analysis to get the insights we need to generate ideas for cutting spending. I'm also going to demonstrate how to create a dashboard quickly, and I will close with how to connect Excel to Zen with a live connection. All of this can be done easily by people with no coding skills. Let's load the data. Click the data tab and then click New to add a CSV file. Here we have a CSV file containing cloud spending; we're just going to load it. After the CSV file is loaded, we can sample the data. This is what the data looks like.
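[Editor's note: to make the shape of the demo data concrete, here is a minimal sketch of the same load-and-sample step done in Python with pandas instead of the Zen GUI. The file name and column names are assumptions inferred from the narration, not the actual demo file.]

```python
import pandas as pd

# Hypothetical file and column names, inferred from the demo narration.
df = pd.read_csv("cloud_spending.csv", parse_dates=["date"])

# Preview the data, much like the GUI's sampling step.
print(df.head())   # columns like: cost_center, platform, project,
print(df.dtypes)   # project_owner, actual_cost, date, cost_category, ...
```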
So there is a cost center column; in this sample data we see values like technical support and cloud engineering. There is a platform column, where we see AWS and Azure, then the project, the owner of the project, the actual spending, the date, and a cost category, with values like virtual machine, database, and network, plus a cost subcategory. All right, now let's create a metric. Click the metrics tab so we can create a metric that can be used for cloud spending analysis. We can click New and pick a basic metric, because basically we just want to aggregate, like a sum. We pick this cloud spending dataset based on the CSV we just loaded, then choose the actual cost and do a sum, and we can populate the rest of the fields. This is how it looks when it's done. I picked currency with the US dollar and a thousands separator, added a bunch of dimensions that can be used to slice the cost, no filter, and picked the time granularity at the day level. And I'll give it a name: Total Cloud Spending. From this metric we can gain insight quickly using a neat feature called root cause analysis. By the way, when we click the metric, it gives us a preview of spending over time. Look, this runs across 2021 into part of 2022. So let's go to root cause analysis. The purpose of doing a root cause analysis is that at this point, we have no idea what to do. We have the data, but we have no idea how we are going to cut expenses, so we need to get an insight. Let's analyze the data based on dimension attributes. Here we're going to add project owner, cost category, and maybe platform, because we want to know the breakdown by platform. Then we take a look. Okay, this is interesting: maybe we can reason about an increase. We see here that at the end of 2021, on December 31st, it looks like the cloud spending jumped, until about February 28th. Let's pick those dates; we want to know what's going on in this time period, so let's analyze it. All right, it came back, and we can look at it by the owner. It looks like John contributed the most to the cloud spending. We can definitely reach out to John and ask why he is spending a lot of money. Then we can look at the projects. It looks like one project contributed about an additional 800,000, and there is a possibility that this is correlated with John's project, or maybe Gloria's; we can investigate further by asking them. Looking at the cost category, it seems that virtual machines contributed the most spending. So this is an opportunity: maybe we can talk to IT and find a more efficient virtual machine. Then we can look at storage: maybe we can delete unused data. So this is a pretty cool feature where we can get insight quickly, and we can use all these insights to create recommendations for management, like which areas to focus on to cut cloud spending. Now let's create a dashboard. I'll give it a name: cloud spending by category. Then I can add metrics; I'll just pick Total Cloud Spending. That's it: select the metric and just drop it, and it will pick an appropriate chart. Let's expand it. By default, it's going to analyze the spending over time. Maybe we can analyze further.
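[Editor's note: the root cause analysis Harry describes amounts to ranking each dimension's values by how much they contributed to spending in the window where costs jumped. A minimal pandas sketch of that idea, reusing the hypothetical columns from the snippet above:]

```python
import pandas as pd

df = pd.read_csv("cloud_spending.csv", parse_dates=["date"])

# The window where the demo observed the spending jump.
window = df[(df["date"] >= "2021-12-31") & (df["date"] <= "2022-02-28")]

# Rank each dimension's values by total spend inside the window.
for dim in ["project_owner", "project", "cost_category", "platform"]:
    top = window.groupby(dim)["actual_cost"].sum().sort_values(ascending=False)
    print(f"\nTop contributors by {dim}:")
    print(top.head())
```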
Harry Kurniawan: Let's break it down by project, so we can see, and we can expand the slider to cover more time. As you can see, this purple one is one project: at the peak, its accumulated cost over time is 200K. And then what else? The orange one is almost 900,000. So this gives you a quick visualization to figure out the cloud spending for each project. We can create a dashboard, share it with users, and then let the users do their own slicing using different dimensions. Now let's switch to Excel. We can save this one. All right, it's saved. And can we share this dashboard? Yes, definitely; like any other dashboard, we can share it with business users. Now let's take a look at Excel. This is what Excel looks like connected to Zen using a live connection. Let's drag in the Total Cloud Spending metric that has been selected. Here we see the total cloud spending from the beginning: it's about 30 million. Let me check this checkbox so it will wait until we press the Update button before refreshing the data. I'd like to analyze by time. This is the breakdown of total cloud spending by month from January 2021 to March 2022. And we can try maybe cost category or cost center; let's take it to a column. So this is the breakdown of the cloud spending by cost center. We see big data analytics, business strategy, digital marketing, PMO; it's all interesting stuff. Now let's switch the breakdown: we want to see it by project. We can remove this one, drag the project here, and refresh. All right, now we have the breakdown by project over time. As you can see, this is very easy. All these things can be done in less than an hour and cost less than a coffee. It's amazing, isn't it? Thank you, everyone. Feel free to scan the QR code so you can try Kyligence Zen yourself, and have fun.

Danny Savard: All right, Harry, thank you very much. Let me get us back to that QR code, and we've got some questions that came up, so while I do that, let me ask you a question. The demo you just did used a CSV file, or it could have been from an S3 bucket, right? That was just the consumption data for AWS. The question that came up is: what about larger data sets from a traditional data warehouse, something bigger, something older, or something from a different location? Can you talk a little bit about that?

Harry Kurniawan: Sure, thank you, Danny. Yes, Kyligence Zen can support large historical data sets as well, like a data warehouse. The data can be converted to CSV, flat files, or Parquet files that can be loaded to AWS S3, standard data lake storage. Zen can support up to a petabyte of data, which is very large; a typical data warehouse is probably in the terabytes.

Danny Savard: Sure.

Harry Kurniawan: And one petabyte is 1,000 terabytes. So yes, it can definitely support large historical data.

Danny Savard: Another question that came in, because we've mentioned it a few times, is questioning our pricing: the pricing seems too good to be true. Are you saying that you could have 50 users and still pay $1 an hour? What are the details here?
Harry Kurniawan: Yes, that's correct. The pricing is based on usage, not on the number of users. The $1 per hour covers up to 100 gigabytes of data. Obviously, if your data is bigger than 100 gigabytes, you pay more, but for most casual users, 100 GB of typical usage would be enough.

Danny Savard: Okay. Let's see: if I have existing tools and I want to migrate metrics from SQL Server Reporting Services or Tableau or some other platform where I've already invested in the creation of metrics, how can I import or migrate them?

Harry Kurniawan: Oh, yes. As long as the reporting tool supports a standard format like XML or JSON, we have tools that are able to automate the migration of metrics from the tools you just mentioned, like SQL Server Reporting Services or Tableau or other reporting tools, to Kyligence Zen.

Danny Savard: Okay, great. I think that's it for questions, and that takes us to the bottom of the hour. We thank you very much for attending, and we look forward to seeing you next time for the next in our series of talks. So thanks again; have a good day.

Speaker4: All right. Thank you, everyone.
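[Editor's note: the cost comparison from the Q&A is easy to sanity-check. The sketch below uses the figures quoted in the webinar, $1 per hour of usage covering up to 100 GB versus roughly $70 per user per month, together with a hypothetical team of 50 users sharing 40 hours of usage a month; actual bills will vary with real usage.]

```python
# Figures quoted in the webinar; team size and hours are hypothetical.
users = 50
usage_hours_per_month = 40    # pooled active hours, since billing is usage-based
zen_rate = 1.00               # $/hour, covering up to 100 GB of data
tableau_rate = 70.00          # $/user/month, per the webinar's comparison

zen_monthly = zen_rate * usage_hours_per_month  # billed on usage, not seats
seat_monthly = tableau_rate * users             # billed per seat

print(f"Usage-based: ${zen_monthly:,.2f}/month")   # -> $40.00
print(f"Per-seat:    ${seat_monthly:,.2f}/month")  # -> $3,500.00
```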