The PRQL: How LLMs are Transforming Enterprise Workflows with Mark Huang of Gradient.ai

October 30, 2023

In this bonus episode, Eric and Kostas preview their upcoming conversation with Mark Huang of Gradient.ai.

Notes:

The Data Stack Show is a weekly podcast powered by RudderStack, the CDP for developers. Each week we’ll talk to data engineers, analysts, and data scientists about their experience around building and maintaining data infrastructure, delivering data and data products, and driving better outcomes across their businesses with data.

RudderStack helps businesses make the most out of their customer data while ensuring data privacy and security. To learn more about RudderStack visit rudderstack.com

Transcription:

Eric Dodds 00:05
Welcome to The Data Stack Show prequel. This is a short bonus episode where we preview the upcoming show. You'll hear about the topics we're going to cover, and if they're interesting to you, you can catch the full-length show when it drops on Wednesday. Kostas, this week's show is with Mark Huang of Gradient.ai, and I'm really excited about this conversation because I think it's a great example of the type of thing that will further accelerate the usage and adoption of LLMs. So Gradient essentially takes open source LLMs, let's say Llama 2, right? You want to operationalize Llama 2 for some use case. They actually package Llama 2 into a service and essentially give you access to it via an API endpoint. So you get an API endpoint, and now you can literally send and receive data from Llama 2, and they take care of literally all of the infrastructure, which is pretty fascinating, I think. One of the big topics here is what this means for MLOps, but I think there are also implications for the data roles we traditionally associate with data engineering and data science workflows, the ones that manage this data lifecycle, because you're almost jumping over a lot of that, which is fascinating.
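[Editor's note: a minimal sketch of what "send and receive data via an API endpoint" can look like in practice. The URL, header names, and payload fields below are illustrative assumptions, not Gradient.ai's actual API.]

```python
import json

# Hypothetical hosted-LLM endpoint; placeholder URL, not a real service.
API_URL = "https://api.example.com/v1/llama2/completions"

def build_request(prompt: str, api_key: str, max_tokens: int = 256) -> dict:
    """Assemble the pieces of a single completion call: where to POST,
    the auth header, and a JSON body carrying the prompt."""
    return {
        "url": API_URL,
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"prompt": prompt, "max_tokens": max_tokens}),
    }

req = build_request("Summarize this support ticket:", api_key="sk-demo")
# An HTTP client (requests, urllib, etc.) would POST req["body"] to
# req["url"]; the provider runs the model and returns generated text,
# so none of the GPU infrastructure lives on the caller's side.
```

The point of the conversation is exactly this shape: the caller's side is a few lines of HTTP, and everything model- and hardware-related is the provider's problem.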

Kostas Pardalis 01:30
Yeah, 100%. And unfortunately, I didn't make it onto this recording, but I had the luxury of listening to it already, so I have to say it's a very fascinating conversation that you had with Mark there. But yeah, 100%, I do agree with you. I think there are a couple of different things here. The first one is access to the technology itself, right? I mean, just putting a REST API there literally lowers the bar for accessing such a complicated technology so much that pretty much everyone can go out there and build anything. Just by being a front end developer, you can now go and build an AI-driven application, right? Which is amazing in terms of the potential innovation that can be created. But there's, I think, another factor that many people might not think about, and that has to do with infrastructure, primarily the hardware infrastructure, right? It is extremely hard today for someone who wants to go and experiment and build around these systems; you just cannot get access to the hardware that you need to do that, right? So using a service that removes, let's say, all the logistics around doing that, I think is an amazing opportunity. And it is one of the reasons we see so much growth happening around AI right now, right? It's not just the technology; it's also how fast the industry managed to react, delivering products so that pretty much everyone out there can start accessing these technologies, which is amazing.

Eric Dodds 03:22
All right, that’s a wrap for the prequel. The full length episode will drop Wednesday morning. Subscribe now so you don’t miss it.