[00:00:00] Adil Saleh: Hey, greetings everybody. This is Adil on the Hyperengage podcast, and I'm super excited to delve deeply into conversations with the co-founders and CTOs who are doing the heavy lifting, while a lot of people like us who are not so technical take on the rest. Thinking through the job has become so easy.
And now 70% of the code is written by AI, and you can simply think and build things at the same time. A lot has changed in the last 18 to 24 months, with large language models coming up from China as well. So there's a lot that we have on the table
to discuss
[00:00:38] Adil Saleh: with one of the market leaders that we have today, the CTO of Agility, Joel. Joel, thank you very much for taking the time.
[00:00:46] Joel Varty: Oh, thanks for having me. It's great to be here.
[00:00:47] Adil Saleh: Likewise. But Joel, looking at your prior journey, I know it's not easy starting out in the CTO role without a lot of background, which isn't your case. In the marketing space, as a C-suite executive, it's easier to get by without that background,
[00:01:07] Adil Saleh: especially in this age of AI. But when it comes to a technical role like a CTO, how hard or easy is it to transform and shape yourself, compared to the background that you have?
When you compare technology and engineering today with the days around 2020 and before, how do you see it as a CTO?
[00:01:28] Joel Varty: It's a balancing act, and I find myself balancing a lot of different things in my day-to-day role. Part of it is that, as a headless CMS, our audience is split. There's the technical audience of developers, and they want to know how they can use the latest and greatest.
How can they use AI? How can they superpower their teams with new tools? Most of those are AI types of tools. But then we have the non-technical side, the marketers, the people creating content. They want better tooling as well. They want a simpler experience. And we want to be able to use our technology to deliver that to them.
So trying to balance those two audiences in what they want is a real challenge. And it keeps it super fun.
[00:02:08] Adil Saleh: Yeah, it is fun. And at the same time, as a product and an engineering organization, you've got to make sure, especially in this era of AI and the capability of all these large language models, that the job still gets done. It's still not easy when it comes to scale, so on a platform such as Agility, how do you manage scale?
Because for a lot of the things you're doing for tomorrow, nobody's going to clap today. As a technology leader, you've got to make sure you do things with a longer-term view, while also looking at the competition that is coming so rapidly.
So how do you actually manage the tech for scale?
[00:02:43] Joel Varty: Yeah, we've been thinking about the future for a long time. I've been with Agility for over 20 years, and we're still building on some of the similar stack items, but a lot of the stack has completely changed over that time, especially as new cloud technologies come around, and new ways of working with data.
We balance that with the compliance work that we need to do. Sometimes we have government clients, sometimes we have highly sensitive data, so how do we make sure that data is secure, but also scalable in the long term? And no matter what happens with AI, and I'm a huge proponent of augmenting teams, or even sometimes replacing teams with AI for certain types of work, you are always going to have data in the cloud somewhere.
That's what I think. And we want to host that data. We want to host that content for folks and make it super scalable for them to use. There are tons of infrastructure things we're doing all the time, and I would say probably over half of our engineering budget goes into infrastructure and making sure that what we're doing in the cloud is going to scale as folks do more with AI.
And what we're expecting is that our APIs that are servicing customers right now are going to get more and more traffic from AI tools, from agents, from MCP servers, things like that.
[00:04:01] Adil Saleh: Okay. And of course, as a product that has been in the space significantly longer than a lot of these new entrants trying to adopt edge technologies, one of the benefits you have is that you already have the customers, you already know your ICP, and you already have the tech you can extend with new AI capabilities to better serve those customers. Looking at your prior experience, I know that years back you were a software engineer and a solution architect.
That has changed a lot in this age. What is the number one thing you look for in the technical team that works for you as a crew to scale a platform like Agility into adjacent markets? Because now it's about going multi-product, multi-license, covering the adjacent use cases. I'm not talking about a one-stop shop or a one-size-fits-all scenario,
but more about covering those adjacent use cases as well. So how do you create this formation of real A-players who can come work for your vision? I know that you, as well as the company, don't sit still.
So how do you make those kind of decisions on people?
[00:05:20] Joel Varty: For most of our customers, Agility sits at the center of what we call a composable ecosystem. So we have to be the best technology, the most robust system. Most people will swap out other parts of the technology they work with in a more composable space.
They'll swap out their search provider, they'll swap out who they're using for analytics, but they very rarely swap out the CMS. Our customer lifetime is in excess of 15 years. People stay with Agility for a long time because the investment is there and because we continually expand on what we're doing.
At the same time, a lot of our engineering is just supporting older customers, customers who don't want to change. They want to scale up, but they want to use the same API version they were on in 2013, which is its own challenge as we're trying to evolve things on the other side. So that's the customer need.
The customer says, I want the latest and greatest. I want to do things with AI. I want to do more with less. I want to ship in half the time. So for my engineering team, when I'm interviewing folks, and when I'm looking at folks on the product side of things, I want to make sure they're up for that challenge.
We need to have the same mentality as our customers in one respect, but be a step ahead on what they might be needing. And so it's the wants versus the needs. A lot of the time we're teaching: people come to us and they want a certain thing, and our engineers, our support staff, and everyone on our team needs to teach folks, okay, this is what you want, and this is how we can solve it with the needs you actually have.
And that's a bit of a process we go through during onboarding, but it's a continual process, because there's a lot of turnover within our customers' teams. It's the same customer, but they have new staff who come on and have to relearn, and they talk to our technical staff.
My engineers actually do a lot of support. They talk to the other developers on staff at some of our partners and customers, so they have to be able to be customer facing. Our entire company is customer centric as well as engineering centric.
[00:07:23] Adil Saleh: Interesting. I was also thinking that, of course, it's harder for an engineer to meet customers at their level on the use case side,
but it's always beneficial, because you are the demand behind the tech and you understand the tech. A lot of these solution engineers are turning into customer success managers, from what we've seen, and that's becoming a new norm on platforms such as Agility. Now, there are some questions my team put in about agentic AI and these frameworks. You mentioned a little bit about support staff, change management, and training. So how do you see agentic workflows being built within Agility, or maybe using third parties? How do you see that fitting into your workflow, and how do you see agentic AI in general?
[00:08:10] Joel Varty: So, the two most exciting things I've seen in technology in the last year: last November, Anthropic came out with what's called MCP, the Model Context Protocol. Being able to implement that, in addition to the agent-to-agent protocol that Google announced about a month ago. Those two things working in conjunction are super exciting for me personally, mostly because I see the death of websites coming.
Now, that doesn't mean all websites are going to go away. The death of TV has also been predicted many times. That's not what I'm saying. What I'm saying is that people searching on Google or using ChatGPT, those are agents, and they're already using those agents, right?
Whether they know it or not. People searching on Google are getting Gemini as an agentic response. The ability for those agents to talk to our customers' agents, the ones they built or the ones we built, is a huge superpower. That means people can learn about your product or your offering or your service, or learn how to use it, without having to go to your website and read it and find it.
They don't have to watch a video. They're going to get that in whatever format they want back from that agent. Building agents that are talked to by other agents, which eventually a human is going to use, that is the future of the internet. And I think that superpower is going to evolve very quickly in the next few months or year as it becomes more discoverable, so that instead of doing a Google search to find content, my search
on whatever agentic platform I'm using will first find the correct agent to use and then start interacting with it. Once people start to encounter that more often, their perspective on the internet will change. The fact of the matter is, people are not surfing the web as much as they used to.
They're not searching Google. They're using the internet via an agent already, whether that's ChatGPT and OpenAI, or through Google, or maybe they're using Claude from Anthropic, or whatever tool they're using. Maybe there'll be something Apple-specific on Apple devices at some point, whatever.
There are only going to be a handful of them, though. Writing our own in-product agents, I don't think, is the best thing to do. Lots of products are benefiting from that, but having an agent-to-agent sort of system, I think, makes a lot more sense.
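For readers who haven't seen MCP: it runs over JSON-RPC 2.0, where a client first asks a server which tools it exposes and then calls one. A minimal sketch of those message shapes in Python (the method names follow the published MCP spec; the `search_content` tool and its arguments are invented for illustration):

```python
import json

def make_request(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request, the wire format MCP uses."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return msg

# Step 1: ask the server which tools it exposes.
list_tools = make_request(1, "tools/list")

# Step 2: invoke one of them. "search_content" is a hypothetical
# CMS tool; "arguments" carries the tool-specific parameters.
call_tool = make_request(2, "tools/call", {
    "name": "search_content",
    "arguments": {"query": "pricing"},
})

print(json.dumps(call_tool))
```

The agent-to-agent layer Joel pairs this with sits one level up: rather than a tool call inside one assistant, it is one agent's client calling another agent's published endpoint.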
[00:10:26] Adil Saleh: That's interesting, because you mentioned that a human having the ability to access a specialized agent for their use case is far greater than a product
having its own custom agents, like Gong, which launched loads of them a few months back, and a lot of platforms that are building their own agents specialized for their customers. Are you also thinking about having those, maybe internally for your own team, or for a handful of customers?
[00:10:51] Joel Varty: So internally, for our own team, we have agents written in Copilot that we've done for internal stuff, for internal documents and things like that. And that's not going away. There's certainly a lot to do there, though we want to be very specific about how we're using internal data.
That data is highly authorized. We do have an in-product agent that we've worked on, and we've done an internal beta as well. There's a ton of engineering that needs to go into making that agent experience really good, and I think it's worthwhile exploring. Long term, though, if I'm looking at the value for our customers, they're looking at their digital properties: a website, an app, something like that.
Do they want to spend a ton of engineering building everything that needs to go into a fully functioning in-product agent? Maybe they want to do that, but I think it's worth looking at another option: is there a specialized agent? What are the things people are doing through their site?
Maybe it's, I'm going to buy tickets, or I'm going to purchase some product through the site. Making that purchase experience available through an agent-to-agent experience is super valuable, because then people don't need to find your whole website and start from there.
They can start from where they're already at, and that's where they're going to be. People are not going to surf the web. They're just not surfing the web like they used to. They're not even searching Google like they used to. They're using their agent, and you want to be where your customers are.
[00:12:14] Adil Saleh: Yes, absolutely, because the search engine optimization part is also going to be affected by this. Now people are trying to rank inside different LLMs more than in Google as a search engine. So the SEO concept is going to change as well: how people do SEO, write content, get reference points. Marketing-wise, a lot is changing.
Interesting. And, this is from my technical team as well, since we are also a B2B SaaS company: how do you see the efficiency of these copilot tools at solving complex engineering problems? Of course, it's always human in the loop, but there are some approaches, like we're using Anthropic's models,
and we say they're the best so far. The second question is, how do you see the cost and everything at scale? I know DeepSeek and others are a lot cheaper, but of course they're not proven models, so a lot of companies that have security concerns, thinking two, three, five years down the line, are a little bit hesitant
about those language models. So how do you see it? As a technical leader, what do you suggest?
[00:13:23] Joel Varty: From a cost perspective, there has been a direct correlation between the complexity of a model, how well it performs, and its cost. Those costs have gone up and up as more and more LLMs, and the providers of them, try to solve the context problem.
How do we make these models hallucinate less as they try to do more complex things? If you look at Claude, and a couple of others like Cursor, there's the idea of a manifest. A lot of tools are coming up with the idea of steps now, and I think that's where you need the human in the loop: to generate that manifest, or to come up with the ideas about how to better prompt the LLM.
So it's a different kind of engineering, prompt engineering, that people are starting to get better at. But certainly the cost is a big problem, because as we've seen these things get better at doing work, their cost has increased. That is being offset by the fact that teams are trying to have fewer engineers on staff.
That will even out at a certain point. There will be a point where companies stop laying off engineers and the teams they have are the teams they need. And the LLM costs are going to keep increasing as that happens. So at some point someone's got to figure out that problem. Now, interestingly enough, you talk about security.
We actually haven't had, to my knowledge, a huge AI security breach yet that has been publicized. And thank God for that, knock on wood. When that does happen, there will be a shift in thinking and a lot more consideration of it. For my team, the way we've been working with LLMs, we do almost all of our stuff in Azure, through their AI Foundry.
That has been very good. Azure has a great security footprint, but not all LLMs have that. So where will the trust be? I think there will be a cost factor there too. It will be more expensive to use more trustworthy LLMs. That's just the way it's going to be.
At the same time, you're going to see smaller LLMs that we can use on device, and I think that is going to explode as well. The idea of putting an LLM on your phone: how much can run on there, what can it do, what is the device capable of? That will be a different function of cost, I think, because it'll be more a matter of purchasing the thing, or a subscription to it, and using it as much as your device can.
So we're going to see a lot of different things coming through, and as anyone who follows the AI world knows, it doesn't get slower; it just gets faster. So I think there's a lot more to look for. But the underlying technologies we've talked about are the big ones: the idea of an MCP server, the agent-to-agent protocol. These are the things that are going to dominate the conversation, because that's how stuff works together.
And building software that works with other software in a good way is the next level. At one point it was all APIs, REST APIs and things like that. Now it's agents. How can your agent talk to someone else's agent, and how will that work? Who is going to own the index of all those agents?
That's a good question. Will it be Google? That's another story: where does that list come from, and then how do I rank higher on that list? You talked about the difference between ranking for SEO and ranking for LLMs. Instead of ranking for keywords, now we're going to rank for how-to statements, actual prompt-type statements, which is interesting, because products like ours have always had a good documentation portal.
The LLM optimization we're getting from that is awesome. It's showing up all over the place, which is great, because we have lots of how-to articles. So people are going to write more how-to articles and fewer keyword-based articles. That's great. How we make our agents rank higher is a totally different thing that I have no idea on yet; we'll see.
If you're not familiar: each agent that you publish is going to have what's called an agent card, which is essentially the SEO for that agent. The descriptions in there are what's going to help folks rank. So there will be a wild west of agent ranking, I believe, that comes very soon.
So that's my prediction for the next sort of year and a half, is that there will be a lot of jockeying for positions there.
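For context on the agent cards just mentioned: in the A2A protocol, an agent publishes a small JSON document describing itself and its skills, and those description fields are what discovery, and eventually ranking, would key on. A rough sketch (the field names approximate the published A2A draft; the agent, URL, and skill are invented):

```python
import json

# A hypothetical agent card for a ticket-purchasing agent.
# Field names approximate the A2A draft; all values are made up.
agent_card = {
    "name": "TicketDesk",
    "description": "Finds and purchases event tickets by venue, date, and price range.",
    "url": "https://example.com/a2a",   # endpoint other agents would call
    "version": "1.0.0",
    "skills": [
        {
            "id": "buy-tickets",
            "name": "Buy tickets",
            "description": "Purchase tickets for a specific event and seat selection.",
        }
    ],
}

# The description fields are effectively the agent's SEO copy,
# which is the "agent ranking" Joel is predicting a wild west around.
print(json.dumps(agent_card, indent=2))
```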
[00:17:20] Adil Saleh: To agent brand. Okay. So these are basically built by teams internally for their own use cases. And of course individuals as well. They can build their own agents for different purposes.
You know, do you think that this more penetrating slightly more faster in the marketing teams, or do you also talk about the engineering teams?
[00:17:39] Joel Varty: It's going to be engineering teams that create these things, but in conjunction with marketing teams, because there's a public face to these things and how they're used.
And so while I think engineers are the early adopters of a lot of agent stuff, marketing teams are using a lot of AI tools as well. They're using the branded AI tools to do their things. So the places that marketers use are the places where we're going to want our agents to show up.
And yes, there are going to be a lot of new products that are themselves full-blown agents for doing specific tasks. But I think it's the more generic products, things like OpenAI and ChatGPT, that people will go to, and those will have agent discoverability built in, so they can find the specialized agents that do different things.
And the user interface that comes back from those agents is going to get more specific too. Right now you're just seeing text and images and videos come back. You're going to see a more componentized interface in those agent interfaces. I don't know what the protocol for that will be yet, whether it'll be React components or something more specific, but I think a protocol will come out for it, so that if I'm doing something like buying tickets or choosing a seat in a theater, that interface can be baked right in and I don't have to leave that experience.
We're seeing that in some internal agent tools. I believe those are gonna come to the external tools as well.
[00:18:57] Adil Saleh: Yeah, and the one thing that makes me excited is how fast things are happening. We're talking about months, a few quarters. If I were a CTO or a technical leader, I would be so excited. So, you are a CTO:
why didn't you think about building your own products? You could build loads of them, with your capability and experience and exposure, and of course knowledge. You could stand out as well.
[00:19:27] Joel Varty: Anybody can build anything. Anybody can build any product now.
And you can use AI even to decide which products to build. But getting product-market fit for that product is just as hard as it ever was, or harder, because there's such a plethora of things. And I don't know that everyone's excited about how fast things change. The SaaS market has changed in the last five to six years.
We're a SaaS company, and the SaaS companies that have been able to embrace enterprise customers are the ones doing the best. The ones servicing $50-a-month or $10-a-month accounts are having trouble, because they scale to a certain extent and then start running out of new users to find, and so they need a new product or something like that.
Whereas for those of us in the small-to-medium enterprise or full-on enterprise space, there's a lot more stability. Those customers are not as excited about change, so it's really interesting. As an engineer, I'm super excited about new stuff. I'm an early adopter of tons of technologies. I love it.
I'm teaching my customers how to be okay with some of that change, or how to pick the thing that is not going to fizzle out. As a leader of an engineering team, but also a leader of a SaaS company, I can't pick the wrong thing. If we pick the wrong thing, our customers are going to lose trust in us.
We need to build trust with our customers. So I tend to wait a little longer to weigh in on stuff. But I try more things. So I love the fact that things are changing really fast and there's a lot of different options. But you'll see this from other companies as well. It's like they're not gonna weigh in and kind of back a certain player until there's a clear leader because they wanna build trust with their enterprise customers.
And those enterprise customers are insurance companies, banks, media companies. They want to look future-facing, but they don't want to take risks. So there's a really big dichotomy here between risk and reward for new things that are coming out really fast.
And I have to say, as a person who's also interviewing, I'm hiring folks right now, and a lot of those folks are coming from blockchain companies that have gone under. There are a few AI companies that have gone under or fizzled out. There are a lot of Web3 folks out there who are not working anymore.
So there are a lot of these technologies that have been really hyped up and kind of fizzled. They're still great technology, but it's how you implement that technology in a product that has product-market fit. That's the magic, and AI can't replace that yet, because the market chooses. The market needs some time to figure out what is going to stick.
And we're starting to see it with AI, but we're also starting to see the technologies that are about how companies can work together better. And that's what's exciting to me.
[00:22:02] Adil Saleh: Yeah, that sounds really hard to achieve. With an enterprise-level product, even if your product is a couple of versions behind, if you change something, you've got to account for the five, ten, fifteen percent of the enterprise segment it's going to hit.
It's so hard to get them to adopt a new feature, new product enhancements, or any initiatives. So what kinds of initiatives are you taking product-wise? What makes you excited this year? I know we're one quarter in, but a lot is changing.
I love the way you're approaching this: taking on AI, individuals trying different things, experimenting continuously, but doing a lot of risk assessment. Keeping that in mind, what initiatives are you taking product-wise? You're focused on workflow integration and workflow autonomy.
How do you see this happening product-wise this year?
[00:22:57] Joel Varty: So we started an internal beta in Q4 of last year on an AI agent, just playing around to see what we could do. Hey, how can AI make our product better? And what we found is that people don't really care whether it's AI or not; they just want their job to be easier.
And I think we've refined what our AI agents will do internally. The biggest part for us is that people don't tend to create content in the CMS. They create content outside of it, and they actually use a lot of AI to create content outside of the CMS. That content tends to come in a very unstructured way.
The value of a headless CMS is that there's a lot of structure in the content. It has fields, and it's split up into different sections, so you can use it on your website, in your app, wherever it needs to go. We're using AI to take that unstructured content and create structured content based on it. And there are two ways we're doing it.
One is: hey, take my page that I have here, my Google Doc or my Word doc or whatever it is, my unstructured piece of text and a bunch of images that AI created, and create me a blog post or an article or a press release, structured as a correct content item. That's fairly straightforward. A little more complex is:
create me a page with components. You can define your component library in Agility, and you can define pages on a sitemap. So I want to be able to take my unstructured idea, which may or may not be totally fleshed out, and have AI generate two or three
examples of how that could be a page on my website that's going to be compelling to whatever audience I'm targeting or whatever I'm trying to achieve. Maybe that's a landing page for my ad. Maybe it's a contact-us page. Maybe it's a page describing my product or service. Taking the unstructured and making it structured is one of the hardest things our editors have been doing.
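The unstructured-to-structured step described above can be pictured as mapping a raw draft onto a typed content item. A toy sketch, with no real Agility API calls: the `blog-post` model, its fields, and the crude first-line-is-the-title heuristic are all invented stand-ins for what an LLM would actually do:

```python
# Toy sketch: turn an unstructured draft into a structured content item.
# The "blog-post" model and its fields are invented; a real system would
# hand the raw text to an LLM with the target model schema.
BLOG_POST_MODEL = {"model": "blog-post", "fields": ["title", "body"]}

def structure_content(raw_text, model=BLOG_POST_MODEL):
    lines = [ln.strip() for ln in raw_text.strip().splitlines() if ln.strip()]
    # Crude heuristic standing in for the LLM: first line becomes the
    # title field, the remainder becomes the body field.
    return {
        "contentModel": model["model"],
        "fields": {
            "title": lines[0],
            "body": "\n".join(lines[1:]),
        },
    }

draft = """Spring Launch Recap
We shipped three features this quarter.
Here is what changed and why."""

item = structure_content(draft)
print(item["fields"]["title"])  # -> Spring Launch Recap
```

The point is the shape of the output: named fields a headless CMS can render anywhere, rather than one undifferentiated blob of text.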
So that's serving the non-technical audience and making their life simpler. That's one use case we're looking at. The other use case is making all the things that are available in the Agility APIs right now available as an MCP server, so that somebody using Cursor, Windsurf, or Copilot can connect to that server, log in, and do those things as part of the AI chat in their IDE.
Some of that is going to be low level. Hey, create a content model that's like this, and it goes and does it. But it also might be a higher-level task, like: import this Excel file. Is it in the right format? Tell me how to do that. Those are the things developers have a hard time getting started on.
Or it might even be: hey, create me the code for a component I defined in Agility. Agility doesn't have any code inside of it; it just has the definition of the schema. So how can we use AI to infer: hey, I have a carousel component, what could that look like? My UI is React, or my UI is in SvelteKit; generate me the code for that.
The AI tools are getting a lot better at generating code, so we'll definitely want to work with some of the top providers to make sure our MCP server serves them really well. And I think more and more folks are going to be coding with AI. What tool they use,
I don't really care. All the tools are going to support MCP, and so we want to double down on that.
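Exposing existing API operations as MCP tools, as described here, largely means wrapping each one with a name, a description, and a JSON Schema for its arguments, so an IDE assistant can discover and call it. A hedged sketch of that shape (the `create_content_model` tool is hypothetical, and this in-memory registry stands in for a real MCP server speaking JSON-RPC to a real CMS API):

```python
# Minimal tool registry in the spirit of an MCP server.
# "create_content_model" is a hypothetical tool name; a real server
# would call the actual CMS API instead of returning a stub result.
TOOLS = {}

def tool(name, description, schema):
    """Register a callable as a discoverable tool with its input schema."""
    def wrap(fn):
        TOOLS[name] = {"description": description, "inputSchema": schema, "fn": fn}
        return fn
    return wrap

@tool(
    "create_content_model",
    "Create a new content model with the given name and fields.",
    {"type": "object", "properties": {"name": {"type": "string"},
                                      "fields": {"type": "array"}}},
)
def create_content_model(name, fields):
    return {"created": name, "fieldCount": len(fields)}

# What a client's tools/list would surface, and one tools/call round trip.
listed = sorted(TOOLS)
result = TOOLS["create_content_model"]["fn"]("carousel", ["title", "images"])
print(listed, result)
```

The description and schema do double duty: they are both the API contract and the text the model reads to decide when to call the tool.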
[00:26:01] Adil Saleh: Oh, perfect. And the API you mentioned is compatible with all sorts of workflows. Your SDKs pretty much define all of this, because they allowed us to do the API integration in the first place as well.
[00:26:15] Joel Varty:
If you have a great API on your product... and talking to anyone who's building a product out there or working on one: build a public API. It's hard. Authorize it properly. Work with development teams and get their feedback on it. Use it yourself to build some stuff on it, but build a public API.
I see a lot of tools out there that say, oh yeah, we have an API. Okay, where is it? Where is the Swagger doc? How does it work? Is it GraphQL? Let me play around with it. Oh, it's more of a private thing, and here are the specific endpoints. No, that's no good. Have a public API and let developers work with it.
And then, if you are a really good engineering company, you need to be able to maintain that API forever, for whatever customer is using it. You build trust with your customers by maintaining an API that works for them forever.
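Maintaining an API "forever", as described above, usually comes down to keeping old versioned contracts routable while new ones evolve beside them. A toy sketch of that idea (the versions, endpoint, and response shapes are invented):

```python
# Toy version router: old clients keep hitting the contract they shipped
# against (the 2013-era shape), new clients get the current one.
HANDLERS = {
    ("v1", "get_page"): lambda pid: {"id": pid, "html": "<p>legacy shape</p>"},
    ("v2", "get_page"): lambda pid: {"id": pid, "blocks": [{"type": "richtext"}]},
}

def dispatch(version, endpoint, *args):
    """Route a request to the handler for its pinned API version."""
    handler = HANDLERS.get((version, endpoint))
    if handler is None:
        raise KeyError(f"unsupported: {version}/{endpoint}")
    return handler(*args)

# A long-lived integration and a current one coexist against one service.
legacy = dispatch("v1", "get_page", 42)
current = dispatch("v2", "get_page", 42)
print(sorted(legacy), sorted(current))
```

The design cost is that every shipped version becomes a permanent test surface, which is exactly the engineering burden on older customers mentioned earlier in the conversation.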
[00:27:03] Adil Saleh: Yeah, I was also looking at the API documentation space. A lot of these tools, like Stoplight, now acquired by SmartBear, and a lot of companies in that space. It's a huge category but with very few products. All these technical co-founders, and we have friends working at these companies,
they're all about APIs. They make APIs the center of their engineering, and they keep updating them and investing heavily in those engineering efforts. And for these enterprise customers, you need to work hand in hand.
You need a dedicated team, just like you have an account manager for customer success. You need someone dedicated to helping them understand the API, make the relevant changes, and make sure it's compatible at all levels, at all times.
[00:27:49] Joel Varty: Your API is the doorway to your infrastructure.
And so yes, you have to have a great infrastructure. We've talked about all the engineering we've been doing on that, from an infrastructure and scalability side of things. But the API is the gateway. That's how people get to your infrastructure. And so you need to have some layers there.
It still needs to be fast, it needs to be robust, and it can't go down. And sometimes, like with our API, we're serving billions of requests through that thing. Measuring that becomes expensive, and you need to be able to measure your own work. There are lots of tools that will help you do that, but sometimes you may choose a tool to help you build your API that ends up costing more than the rest of your infrastructure as you scale.
So you have to think about these problems. You've got to have a little forethought, but sometimes the plane takes off, you're doing great, and you have to change out the landing gear before you land. Hey, we need a new way to measure everything. Or, shoot, we need to choose a new CDN. Those are tough pivots.
I've been there. In 2020 we went through three different CDNs before we found the one we landed on, the one that can actually protect our infrastructure the way we wanted.
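One common answer to the measurement-cost problem Joel raises, where metrics tooling can outgrow the rest of the stack at billions of requests, is sampling: record only a fixed fraction of events and scale the counts back up. A minimal sketch of the idea, not tied to any particular vendor:

```python
import random

def sampled_count(requests: int, sample_rate: float, seed: int = 0) -> float:
    """Estimate a request count by recording only a fraction of events."""
    rng = random.Random(seed)
    recorded = sum(1 for _ in range(requests) if rng.random() < sample_rate)
    # Each recorded event stands in for 1/sample_rate real requests.
    return recorded / sample_rate
```

With a 1% sample rate, storage and processing cost drop roughly 100x, while the estimate stays close to the true count at high volumes.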
[00:28:52] Adil Saleh: Yeah. If you don't mind, can you also mention them? Because we are pretty much on the same page. In the last 18 months, we have changed like three tools for integrations, like the Salesforce integrations we're doing, and a couple of tools for other server-side things. Now we are going with AWS managed servers. That is all I know; we're not that technical, but my CTO fills me in, and then some, when things hit the fan.
If you could mention some of the tech stack, from starting up to building out the engineering infrastructure, and some of the newer technologies you've tried that you see as a good fit, please do.
[00:29:29] Joel Varty: Yeah. So part of our infrastructure: we're on Azure, lots of SQL databases, lots of moving parts.
We've moved a lot of stuff to Cosmos DB because it's super scalable. But a lot of our customers love the fact that they can have all their data siloed in a SQL database, so that's important, and we're maintaining that for folks. That's at the very base. And we use Blob Storage in there too.
When cloud first became a thing, we actually started off in AWS, but since we're more Microsoft on the backend, we had a really good partnership with Microsoft and Azure, and that's been great. So we've been with them since the beta. However, I don't love their CDN. Azure CDN has gone through a bunch of different underlying providers.
The original Azure CDN was actually not Microsoft's own infrastructure. They have since added their own, but it ran through Verizon's CDN, which then got spun off, and the company it was spun off to has since gone bankrupt. So anyone who was on that CDN had to get migrated.
We looked for another provider and used a company called StackPath, which was pretty good, a lot cheaper, and gave us a really good deal, but they couldn't invalidate content fast enough. When our customers change content, they want that to appear on the API right away, or as fast as possible.
And we weren't able to make sure that was going to work globally with StackPath. StackPath has also since gone out of the CDN business, so I'm glad we didn't choose them for the long term. We ended up landing on a company called Fastly. Fastly is incredible because their architecture is built on a language called VCL, an offshoot of something called Varnish, which almost nobody uses directly anymore.
It's almost like a combination of assembly language and JavaScript, and it's super fast. Our edge code runs in microseconds, so we're able to do all kinds of stuff at the edge that works for both our assets and our APIs, like our JSON API. It works super fast, and we also have great licensing deals with them.
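The fast invalidation that made Fastly the right fit can be pictured as surrogate-key purging: each cached object is tagged with keys, and a content change purges every object carrying that key instead of flushing the whole cache. A toy model of the concept, not Fastly's actual API (paths and keys are made up):

```python
# Toy model of surrogate-key cache invalidation at a CDN edge.

class EdgeCache:
    def __init__(self):
        self._store = {}   # url -> cached response body
        self._keys = {}    # surrogate key -> set of urls tagged with it

    def put(self, url, body, surrogate_keys):
        # Cache a response and index it under each of its surrogate keys.
        self._store[url] = body
        for key in surrogate_keys:
            self._keys.setdefault(key, set()).add(url)

    def get(self, url):
        return self._store.get(url)

    def purge(self, key):
        # Invalidate only the objects tagged with this key.
        for url in self._keys.pop(key, set()):
            self._store.pop(url, None)

cache = EdgeCache()
cache.put("/api/item/42", '{"title": "old"}', ["item-42", "sitemap"])
cache.purge("item-42")  # an editor saves the item; only tagged URLs drop
```

Purging by key is what lets a change made by a content editor show up on the API almost immediately, without invalidating unrelated cached responses.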
In their case, for product-market fit, they met us where we're at for technology, and then they adapted the licensing to make sense for us. It's been working really well, and we'll probably use more and more services from them. Our WAF technology we may move out of our origin server and off to the edge to run at Fastly.
That's something we're looking at too: where does our WAF live, and how can it work most efficiently for detecting things like AI bots, where we want to know what's going on.
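An edge WAF check for AI bots, of the kind Joel mentions, often starts with simple User-Agent classification before any heavier fingerprinting. A minimal sketch; the signature list is illustrative, not complete or authoritative:

```python
# Flag known AI crawlers by User-Agent so they can be logged,
# rate-limited, or blocked at the edge. Substrings are examples only.

AI_BOT_SIGNATURES = ("gptbot", "claudebot", "ccbot", "perplexitybot")

def classify_request(user_agent: str) -> str:
    ua = user_agent.lower()
    if any(sig in ua for sig in AI_BOT_SIGNATURES):
        return "ai-bot"
    return "default"
```

In practice this would run in the edge layer itself, where a decision in microseconds matters, with the classification feeding rate limits or robots-style policies rather than a hard block.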
[00:31:48] Adil Saleh: Love that. Love that. I'm learning so much from you, Joel. This is exactly the kind of conversation I wanted to have, and that's the primary goal of doing the podcast: to learn from people who have been there before me.
My father says that you learn from other people's mistakes; you don't live long enough to make them all yourself. So it's all about experiences and learning from them. I can't overstate how much value this conversation has in my eyes, and for the people listening as well.
A lot of these founders are juggling so many different technologies and making a lot of mistakes, infrastructure-wise especially, with all these data analytics tools and platforms that are pre-product-market-fit for two or three years.
They have a good peer group, but they're paying consultants a lot of money, and they're part of communities that aren't helping them. So many founder problems start as another founder's problem. And for the last one to two years, max, a lot of investors have backed off from investing so heavily into AI.
They're pretty reluctant and want to see how it goes. Just like you with technology: you sit back, try different things, experiment, and see how it flows at scale. A lot of categories are evolving, but a lot of them are just starting out.
This creates doubts for a lot of first-time founders like us, who are getting funding to ship fast and invest in teams. A lot of them are bootstrapping, and they need people like you to share knowledge and experiences. There's a huge respect component that I have inside me for people like you.
So now, thinking about the culture on your team, and you can talk about just your own team if not the entire company: what kind of operating principles do you have? You also mentioned earlier that you're hiring for some roles. What kind of ingredients are you looking for in the people who work for you and with you? What kind of mindset, background, or personal traits make the culture what it is now?
[00:33:52] Joel Varty: I'm glad you mentioned bootstrapped. I have a ton of respect for anybody that's bootstrapped. Agility is a hundred percent bootstrapped company, built on the revenue you get from customers.
And that leads me to the first principle of hiring anybody on our team: you have to be customer first, customer centric, thinking about what the customer needs. Now, for us, that might be developers, that might be marketers, so it's kind of a different thing. And then you have the CFOs who actually pay for it.
So who the customer is really affects how you work with them. But everyone on our team, from engineers to sales to folks on support to folks on customer success, has to be customer centric. If you're not, you're not gonna be happy on our team, because we listen to our customers.
And so if you're the kind of person that gets stressed out by that, it's not gonna work. You also have to follow all of our core values, and we've really honed our core values: collaborate, build trust, keep it fun, all those different things. Learn it and teach it.
Everyone on our team has to be a learner and a teacher, and I think you learn things by teaching. If you're not into that, then you're not gonna work on our team. Those things are easy to state: this is what we believe in. And if you believe in them too, cool, you have a chance to be on our team, and if you don't, you're not gonna be happy on our team.
It gets a little deeper when you get to what you're willing to do: what works, what's your willingness to show up. We're a remote company, so you've gotta show up virtually every day and bring energy to the team. I look for folks that bring energy. Are you a battery?
Are you someone who really energizes people, or are you one of those people that sucks the energy out of the room? I wanna work with batteries. So we ask a lot of questions that aren't necessarily technical. They're: how are you gonna work with our core values?
What kind of energy are you gonna bring to the team? Are you gonna keep learning? I would rather have someone who learns super fast and can take in information than someone who comes in saying, oh, I'm an expert already and I know everything. That's not gonna work, because as we've just talked about, things are changing fast.
No matter whether you're senior, junior, intermediate, whatever, or maybe you've been around for longer than me, you still have to adapt and change and learn things. And I love watching people learn and grow. That is super fun. So that's what I look for in my team: someone who's also excited about that.
[00:35:57] Adil Saleh: Amazing. You know, it's all about having a hunger for something. You've got to get your head around what makes people hungry in everyday life, what motivates them, what they're passionate about.
These are the personal traits that drive their work, and it shows very quickly when they join, in the energy and the vibe they bring. And like you mentioned, a lot of people come in with a lot of background experience, and I'd better tell them, and even myself, that all of that might become obsolete very soon.
So you've got to keep up with new technologies, keep acquiring knowledge, and keep experimenting. And this AI, you know, AI is everywhere. It has become a part of everybody's life. My little sister was doing cybersecurity. I've known her for a very long time, and she's not so into tech; she hates tech, to be very honest. She's always been old school. Still, she has used my paid ChatGPT account for the last three years. Just two days ago we changed the password, because so many people were using it and because o3 was something we wanted to use for deep research, and it has limitations.
Just a few hours later she said, you guys changed the password, what's going on? So I said, oh yeah, I changed the password so you can have the new one. So it's all about making sure that you stick with it, adapt, and acquire knowledge, including from people like you, to get some direction, because visibility and direction are equally important when there's so much noise.
We've tried so much, and we loved Claude for engineering and tech teams. I myself am more into o3 and its capability for reasoning and deep research, and of course the memory that it has. So what do you think, by the way? What is your favorite? You must have tried all of them.
[00:37:47] Joel Varty: Yeah. What I love is that more and more tooling has model selection in there. When I code, I use GitHub Copilot a lot. I haven't used Cursor that much, but I'm starting to dabble in it, and it's really exciting.
I love the idea of doing a manifest, and I would love to have templates for junior developers, new developers, or folks that are new to AI: here's the template for how to build a whole project based on a few different things. Claude's super exciting there. I love GitHub Copilot, and we have all the different models available in there.
It's funny, 4o is still pretty good. 4.1, I'm starting to use that one, and you can only use it through the API right now. GPT-4.1 is wild. What I have hated is that there are things in prompting where you have to repeat the prompt near the end again, things like that.
OpenAI came out with a prompting guide for GPT-4.1 that is really specific about how to solve those problems, and I'm loving it. So I'm just using it through the API, not through a front-end tool yet; I don't know when they're gonna make it available there. It is amazing for generating code. It's amazing for doing specific tasks.
You can say things in your prompt, or in your system prompt if you're building an agent, like: don't stop until you're finished. That addresses the biggest problem when the context gets large in an agent, when you've gone back and forth a lot of times and there's a lot of data involved.
Every once in a while it'll try to solve a problem and then, you don't know why, it just stops responding. Then you're stuck, and everything kind of goes off the rails. Eliminating that from agent play is one of the things I've been working on.
And 4.1 has worked really well there. I love Claude, and there are actually a few things Gemini's really good at too. So I'm just trying different ones. The memory portion of o3 that you mentioned is really useful. I don't know how these tools are going to intermix LLMs, what models people will use, and whether more model selection is going to be a thing.
I can keep about five to six models in my head and what they can do, but what happens when it gets to 50? What happens when there's 150? How many models are we gonna have? It's a little crazy. And almost every model now augments with the current internet; it'll look things up on the internet as well.
So you're not stuck with whenever that model was last trained. So again, we're seeing lots of variants of things and capabilities here.
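The persistence pattern Joel describes for long agent runs, stating "don't stop until you're finished" in the system prompt and restating it near the end once the context grows, can be sketched as a message-builder. The function name, threshold, and message shapes below are hypothetical, just one way to structure it:

```python
# Sketch of a persistence instruction for an agent loop: set it once
# in the system prompt, then repeat it near the end of the context
# when the history gets long, so long runs don't stall mid-task.

PERSISTENCE = (
    "You are an agent. Keep going until the task is fully solved. "
    "Do not stop or yield back to the user until you are finished."
)

def build_messages(history: list, task: str) -> list:
    messages = [{"role": "system", "content": PERSISTENCE}]
    messages += history
    # Restate the key instruction once the context is long; models
    # weigh recent tokens heavily, which is why repeating it helps.
    if len(history) > 20:
        messages.append({"role": "system", "content": PERSISTENCE})
    messages.append({"role": "user", "content": task})
    return messages
```

The resulting message list would then be sent to whichever chat-completion API is in use; the repeat threshold is something you would tune per model and task.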
[00:40:05] Adil Saleh: Yeah, it's getting crazier with every month, every quarter. Okay, perfect. So you mentioned that it's API-based, not invite-based? You have to apply for API access, just like we had to before?
[00:40:20] Joel Varty: I believe if you have an OpenAI API key, then you can add 4.1. I know for us it's in Azure, so we can basically add it to our Azure AI Foundry and just start using it.
And that's really useful too. So we're building an agent. We started off using Semantic Kernel in Azure, we're using the Agents SDK, and we've actually used the AI SDK from Vercel as well. So we've used a few different tools, all to differing effect, but a lot of that is based on hitting our Azure AI Foundry
backend; the models are in there, and all the memory is stored in there as well. That's a really nice thing. If you're an enterprise, if you're building a product, all your AI data should be stored in your cloud, and for us it's all in our Azure cloud. If you're building a tool that uses OpenAI, it's gonna be in OpenAI's cloud.
And so you have to be okay with that, and that comes with its own benefits as well. I also think there's a future where, if you look at website personalization and things like that, you go to a website, it does a login with OpenAI, it knows who you are on OpenAI, and it personalizes the website based on the memory it has of you there.
That's really valuable. That could be really cool. It could be super creepy as well, but I think there are benefits and trade-offs there.
[00:41:27] Adil Saleh: There are definitely benefits. I mean, of course, you can get context, and as you mentioned earlier in the conversation, context is super important to make sure that these agents are specialized to your use cases.
Yes. And this has been the hardest thing. A lot of these companies, internally or externally, are working to make those agents specialized to different use cases for different segments of their customers. And I think agentic AI is all about that, as you mentioned earlier too: making sure it's specialized and personalized, with all the context and everything.
Yeah. Perfect. I mean, Joel, it was so amazing having you here and getting to know you, your knowledge, and your experiences. And of course, a lot of this was opinionated; I love that. This is what it should be: someone who has spent more than two decades
working with a product at scale, serving lots of happy customers, should be opinionated. I know this conversation is going to be extremely valuable for people, especially those building scalable products.
So thank you very much for your time. It was amazing having you.
[00:42:29] Joel Varty: Thanks for having me on. It's been a pleasure.
[00:42:31] Adil Saleh: Love that.