Transcript
Hi, Mike Matchett with Small World Big Data, and we are here today talking about AI. Of course it's all about AI these days, but how do you really apply it effectively? How do you realistically bring it into your organization, and how do you make it something that's really adding value? You've got a lot of challenges if you're faced with an AI initiative today, particularly in folding AI into your business processes, and we've got some answers for you. We've got Rabi here today from Thesys, "the sys," if you will. Hang on a bit and we're going to dive into how you can leverage AI to do a better job of incorporating AI into your applications.

Hi, Mike. Thanks for having me.

Hey, Rabi. Let's start with this idea of generative AI and the problems people are having getting value out of it. If they're just looking at ChatGPT or Copilot, what are they struggling with?

Yeah. First of all, this whole technology wave is new, and people are trying to figure out a lot of stuff. But one of the key areas where we see people not thinking hard enough is how they will drive value to their end users. The end users don't care about the technology as much as they care about getting value out of it. And if you look at today's AI implementations, most of them, if not all of them, are regressing on the user interface. Earlier you would have a much richer visual interface for how things get done, and now it's almost a text-in, text-out interface. That's great for some use cases, I guess, but for a lot of enterprise use cases, business use cases, it's just not cutting it.

Yeah, you mentioned regression. It's like we said, hey, this Zoom thing we're using is really great, but for the next great thing we're going back to teletype. Why would we do that? We already have these ways of doing things. All right, so let's talk about this. When you started looking at what was happening in AI, particularly gen AI and the way people are using it, what did you think a better way would be? How did you and your co-founder start to formulate the idea that there's got to be an improved way of doing this?

Yeah. Like I said, when we were working on a copilot ourselves, we realized there are not a lot of tools or frameworks for thinking about enriching the interface for AI. With all these AI implementations it's almost a console-like interface: you type text and you get back text. But we have come a long way from the console of the '70s and '80s; now we are used to much richer visual experiences. We started thinking more deeply about what that could mean, and we realized it could mean far more than even the traditional SaaS experiences people have today. Traditional SaaS is built either for one person or, down the line, for, say, ten different groups of people. But AI has the potential to actually understand your use cases, your habits, your users, and then present them with an interface that is truly tailor-made for them. That is what we are trying to unlock at Thesys.
Basically, this is something people call generative UI: the whole idea of taking the outputs of LLMs, which are mostly text only, and enriching them and presenting them in a far more visual, far more interactive fashion to your end users.

I like that phrase, generative UI, because it's a nice little twist on generative AI. So we're really talking about how I make the UI that I'm using to access something, maybe Salesforce or my business app, more dynamic, and not just dynamic in a random or haphazard way, but in a smart way. I'm actually leveraging other AI to figure out how to present things to me and work with me on the interactions I'm having with the business AI I'm working on. Is that a good description?

Yeah, I think you have summarized it perfectly. Basically our AI interacts with your existing systems and everything you have built on top of them. It takes all that raw data, the text those systems spit out, and figures out the right visual representation for it. Let me give you an example. If you want a chart of all your sales metrics over the past one quarter or three quarters, text is probably not the best way to represent it, right? I can dump it in a huge table, but what are you going to do with that? It's much better to present it visually, in the form of a line chart or a pie chart. And once I have that chart, the next thing I want to do is drill down. I don't want to go back to the chat interface and figure out what prompt to write to drill down correctly. I just want to interact with the chart: tell it, hey, I want to look at this slice of the pie, drill down further, and go from there. That is our vision of how all generative experiences should look.

Yeah. So you're kind of duplicating that move from natural language query, where instead of figuring out a structured query I just type something in natural language into a prompt box, to what I'd call a natural HTML interface. You're taking this visual language we've come to know and love, the way we interact with browser apps, with widgets and buttons and drill-down charts and Ajax and all that, and bringing it to this world, so I can say: this is how I want to interact dynamically with my data. But for the system to be smart enough to present me that kind of UI, it has to have its own intelligence, its own AI going on. I know we're kind of having a meta discussion here, but what you're doing with this thing you're calling generative UI is helping somebody converse with an AI. Give me another example of this, and tell me a little bit about the audience reaction to this change in metaphor, from prompt engineering to natural click, drag, and drop?

Yeah. I mean, it's not that meta either. The design partners we are working with have already started integrating this, and they already see much higher retention and engagement with the agent than they did previously. Take the example I gave: people are already used to interacting with charts. We don't have to tell them how to interact with charts. It's actually much harder to convince them, or even teach them, how to write the exact prompt to drill down. But everybody knows: I select a part of the chart, I click on it, I drill down, I click back, I go back to the higher, zoomed-out view. So I don't think the challenge is going to be teaching your end users. They would love you for it.
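(Editor's note: to make the pattern concrete, here is a minimal, framework-free TypeScript sketch of the kind of generative UI flow described above. It is illustrative only and is not the Thesys SDK; the UISpec shape and the parseUISpec and render functions are hypothetical names invented for this example. The point is simply that the model is asked to return a structured description of what to show, and the application maps that description onto chart or table components it already trusts.)

```typescript
// Illustrative sketch only: the LLM returns a structured spec instead of prose,
// and the app renders that spec with components it already owns.

// Hypothetical spec the model is prompted to emit as JSON.
type UISpec =
  | { kind: "line_chart"; title: string; x: string[]; series: { name: string; values: number[] }[] }
  | { kind: "table"; title: string; columns: string[]; rows: (string | number)[][] }
  | { kind: "text"; body: string }; // fallback: plain prose

// Parse the model's raw text; fall back to plain text if it isn't a usable spec.
function parseUISpec(raw: string): UISpec {
  try {
    const spec = JSON.parse(raw);
    if (spec && (spec.kind === "line_chart" || spec.kind === "table")) {
      return spec as UISpec;
    }
  } catch {
    // not JSON, treat as prose
  }
  return { kind: "text", body: raw };
}

// Dispatch each spec kind to a renderer. The "renderers" return strings here to
// keep the sketch framework-free; a real app would return chart/table components.
function render(spec: UISpec): string {
  switch (spec.kind) {
    case "line_chart":
      return `[chart: ${spec.title}] ${spec.series.map(s => s.name).join(", ")} over ${spec.x.length} points`;
    case "table":
      return `[table: ${spec.title}] ${spec.rows.length} rows x ${spec.columns.length} columns`;
    case "text":
      return spec.body;
  }
}

// Example: a made-up model response to "show me sales by quarter".
const modelOutput = JSON.stringify({
  kind: "line_chart",
  title: "Sales by quarter",
  x: ["Q1", "Q2", "Q3"],
  series: [{ name: "Revenue", values: [120, 135, 160] }],
});

console.log(render(parseUISpec(modelOutput)));
// -> [chart: Sales by quarter] Revenue over 3 points
```

In the same sketch, drill-down would work the way Rabi describes: a click on a chart slice sends a structured event back to the model, rather than asking the user to compose a new prompt.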
All right. So what does it take to bring this in? If I'm looking at a business app like Salesforce and I want to explore generative UI with it, what are the implementation steps I'd have to follow to get that going?

Yeah. On day one you basically just come to Thesys and lay out what your visual language is. If you are Salesforce, you want your UI to look and feel a certain way. You don't want it blue and green one day and then suddenly your customer wakes up and everything is red. Even those minor things matter a lot when you are building for a large customer base like Salesforce's. Going from there, you lay out some day-zero rules for the AI to work with, like: if people ask me about this kind of data, show it in a chart; if people ask me about that kind of data, show it in a table. Then you integrate our SDK and let the AI do its magic. It starts collecting data behind the scenes from your actual users, it learns over time what works well and what doesn't, and basically every day when you get up, your app engagement is going to improve.
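(Editor's note: here is a hypothetical sketch of what that day-zero setup could look like in code. It is not the actual Thesys SDK or its configuration schema; the GenUIConfig type, the rule format, and chooseComponent are invented for illustration. Theme tokens keep the visual language consistent, the rules set default component choices for different kinds of data, and observed engagement is allowed to override those defaults over time, which is the learning loop described above.)

```typescript
// Illustrative sketch only, not the real Thesys configuration format.

type ComponentKind = "line_chart" | "pie_chart" | "table" | "text";

interface GenUIConfig {
  theme: { primaryColor: string; fontFamily: string };          // brand consistency
  rules: { whenDataLooksLike: string; prefer: ComponentKind }[]; // day-zero defaults
}

const config: GenUIConfig = {
  theme: { primaryColor: "#0057B8", fontFamily: "Inter" },
  rules: [
    { whenDataLooksLike: "time series metric", prefer: "line_chart" },
    { whenDataLooksLike: "share of a whole", prefer: "pie_chart" },
    { whenDataLooksLike: "record list", prefer: "table" },
  ],
};

// Pick a component: start from the day-zero rule, then let observed engagement
// (e.g. click-through per component) override the default over time.
function chooseComponent(
  dataShape: string,
  engagement: Partial<Record<ComponentKind, number>> = {},
): ComponentKind {
  const byRule =
    config.rules.find(r => r.whenDataLooksLike === dataShape)?.prefer ?? "text";

  const observed = (Object.entries(engagement) as [ComponentKind, number][])
    .sort((a, b) => b[1] - a[1])[0];

  if (observed && observed[1] > (engagement[byRule] ?? 0)) {
    return observed[0]; // learned preference beats the day-zero default
  }
  return byRule;
}

console.log(chooseComponent("time series metric"));                                  // -> line_chart
console.log(chooseComponent("time series metric", { table: 0.9, line_chart: 0.4 })); // -> table
```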
All right, you said something interesting there about dashboard design, and think how much effort goes into trying to find the right dashboard for people. This really overlaps with personalization and customization, tailoring information not just to the use case but to the person who's trying to use it. It's going to learn some of their idiosyncrasies, their personalities and behaviors and even quirks over time, and tailor what it's doing for them. Which is kind of cool: you're really presenting me with a dynamic interface. And it's not a personality itself. People are thinking they're going to end up with a talking-head kind of thing, but it's a system that's really working for you and learning from you as an enabler, which I think is a real promise of AI here. Where do you see this going next? When you look forward a couple of years, what is this kind of generative UI going to be able to do?

In a couple of years, I think this is going to become the way of building agents. What we are seeing right now is almost the first stage, the first iteration of agents, where just getting the back end right is hard enough that people are saying, okay, we'll release a text-in, text-out version. But I don't see that staying the status quo. Over time, every agent, every AI interface, every AI product interaction you have is going to feel more and more natural. It's going to take inspiration from everything out there and enrich it. Like you said, that's the promise of AI, and it's a no-brainer for me that this is how AI was supposed to be from day one.

Yeah. And if you look forward, as a sort of prognostication, we're really taking the idea of someone having to interface with an AI and embedding it in the system, where they don't even have to know they're working with an AI. Right now we're at level one, where everyone has to become a prompt engineer and be very conscious of hallucinations and token counts and all the rest of it, and really work with the AI there. We're moving to a world where it just becomes more natural, like when we went from structured queries to natural language query. Now we're going from prompt engineering to a visual interface, a visual dialog, a visual conversation, if you will, with your data, which I think is great.

Yeah. And very honestly, the way we think about it is that we want to do for agents and AI what Apple did for the operating system world. They brought it from a console-like system to an actual GUI, where people didn't have to learn things, where natural things were intuitive. The promise of UX that they delivered, we want to deliver a similar promise for the world of AI products.

I mean, don't get me wrong, I liked awk, sed, and bash from my sysadmin days. We could go back and write Perl scripts for everything. But yes, using an iPhone is a much easier experience for the world, and people don't see all the complexity behind it, because it's working for us now.

Don't get me wrong, I love awk and sed just as much; I think we discussed that we are both command-line fanboys. But today I use Claude and all the copilots a lot, and I'm a personal fan of Cursor. So even when you are writing an awk script now, you are probably going to use way more AI to write it than you did 20 years ago.

Yeah. And just by the way, since you mentioned all those: what you're doing is not dependent on a specific LLM that someone's trying to use. It's really your AI application coming in alongside and helping someone have a better conversation with whatever they're using.

Yeah, exactly. And all the models are going to become much richer in terms of domain knowledge, so we are going to see far more specialized models for doing A versus B. Our model basically takes the output of those models and turns it into a natural, visual interface. That in itself is a huge task, and that is the route we are going down. But we don't require that you use this model or that model; you can bring your own model to do the other stuff.
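(Editor's note: a rough sketch of the bring-your-own-model point, assuming a small provider interface. This is not Thesys code; the ModelProvider interface, HostedModel, EchoModel, and answerAsUI are invented names, and the HTTP request shape is just a generic chat-completions-style example. The idea is that the generative UI layer only needs some way to get text back from a model, so any provider that satisfies the interface can be swapped in.)

```typescript
// Illustrative sketch only: keep the UI layer model-agnostic behind a tiny interface.

interface ModelProvider {
  complete(prompt: string): Promise<string>;
}

// Hypothetical adapter for a hosted, chat-completions-style HTTP API.
// Adjust the URL and request body to whatever API you actually use.
class HostedModel implements ModelProvider {
  constructor(private baseUrl: string, private apiKey: string) {}

  async complete(prompt: string): Promise<string> {
    const res = await fetch(`${this.baseUrl}/v1/chat/completions`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${this.apiKey}`,
      },
      body: JSON.stringify({ messages: [{ role: "user", content: prompt }] }),
    });
    const data: any = await res.json();
    return data.choices?.[0]?.message?.content ?? "";
  }
}

// Stub provider for local testing; any model works as long as it fits the interface.
class EchoModel implements ModelProvider {
  async complete(prompt: string): Promise<string> {
    return `{"kind":"text","body":"stub answer to: ${prompt}"}`;
  }
}

// The UI layer never needs to know which provider it was handed.
async function answerAsUI(model: ModelProvider, question: string): Promise<string> {
  const raw = await model.complete(question);
  // In the earlier sketch, this is where parseUISpec/render would turn the raw
  // model output into a chart, table, or text component.
  return raw;
}

answerAsUI(new EchoModel(), "show me sales by quarter").then(console.log);
```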
All right. You're not quite in GA yet, but maybe at this point you can tell us: if someone watching this says, hey, I want to know more about Thesys, I want to see what you're doing and maybe follow along, what would you recommend they do?

We recently announced our funding and launched our website, so please go there and sign up for the waitlist. We'll start giving early access to all our customers by the end of Q1, then we are going to go GA, and you will be able to just log on to our platform and use it. But for now, just sign up on that waitlist and we'll get back to you very shortly.

And you're planning on having a free tier, I understand, so this isn't something somebody has to come up with a big budget for just to try out?

Yeah, exactly. We definitely have a free tier, but at the same time we are thinking very hard about enterprise problems like data isolation and making sure all your customer data is safe. So we'll have a free tier and an enterprise tier, and the enterprise tier comes with complete data isolation: all the knowledge, all the data about how your users use it, will never get mingled with the data of other customers.

Right. I appreciate this, Rabi. Thank you for being here today and explaining Thesys to us. It's new stuff, so it's hard to get our heads around sometimes; we have to talk about it a couple of times, and maybe next time you come around we'll do a little demo or something for folks. But thank you for being here today.

Thanks a lot, Mike, for having me.

All right, check it out. There's a lot happening in the AI world, and as we said, maybe it's not too meta, but this kind of meta application of AI to help you work with AI, I think, is the future. We're going to see AI agents collaborating on our behalf all throughout 2025 and beyond, so it's worth paying attention to. Take care, folks.