
EPISODE 12: JOHN KANAROWSKI

  • Lytical Ventures
  • Jun 19
  • 22 min read

In this episode of CyberThoughts, host Lucas Nelson speaks with John Kanarowski, CEO of Implicit (formerly Agolo). Kanarowski shares the journey of Implicit, an AI company that pivoted from summarization technology to entity intelligence and now specializes in product expertise solutions using large language models (LLMs).



Welcome to the Cyber Thoughts podcast, where we explore the world of cybersecurity through the eyes of practitioners and leaders in the field. In each episode, we invite a guest from the world of Infosec to share their insights and expertise on the latest trends and developments in the cybersecurity market.


Whether you're a seasoned Infosec professional or just starting in the field, this podcast is for you; our guests will provide valuable insights and perspectives on the challenges and opportunities facing the Infosec market.


Join us as we delve into the world of cybersecurity and learn from the experts on the Cyber Thoughts podcast.


PODCAST TRANSCRIPT


Lucas Nelson

Hi, welcome to CyberThoughts, the podcast where we explore the cybersecurity industry through leaders in the field. It's my pleasure today to introduce John Kanarowski, the CEO of one of our companies, formerly Agolo, now Implicit, to the pod. So John, why don't you give us the basics? I've known this as an AI company for a while, but how did you get involved? Give me kind of the entry point.


John Kanarowski

Yeah, absolutely. And thanks for having me on the podcast here, Lucas. A real pleasure. Yeah, to give you the history: as you mentioned, Agolo has been around for a while. The company actually has a history in AI, and in NLP specifically.


I got involved about a year and a half ago, and time flies. It's been a real fun journey. As a company, we've definitely navigated several big industry changes. It's been exciting, and we continue to navigate those changes as we go forward.


Lucas Nelson

Awesome. So you joined a company that had already been around for a bit. Tell us a little bit about that, like stepping into a new role. How did you look at the opportunity? How did you look at the company to decide to get involved in the first place?


John Kanarowski

Yeah, absolutely. So the initial founders decided that they wanted to pursue other opportunities, which is totally understandable after working on something for 10 years. And so they were looking for new leadership and new energy, in particular around navigating the changes in the landscape and on the go-to-market side of things. And that's kind of where I could help out. So it was a transition, and for a time, you know, I worked closely with the founders on that transition. And, you know, to answer your question around the opportunity and how we went about that, it was really looking closely at what particular technology expertise and skillset we had in-house and how that was relevant in an LLM and GenAI world. We had a lot of expertise and had built up a product that was relevant in the pre-LLM world. And then the question was, how do we stay relevant in this new world?


Lucas Nelson

You know, coming into a company, how do you view the tech? Did you do a lot of tech diligence? How did you decide to make that choice? Because you had other options. So what drew you to Implicit?


John Kanarowski

Yeah, so it's a great question. I've been in the tech world for the last 25 years, but I'm not a technologist by trade. So a lot of this is conversations with the team, but then also bringing in consultants as well, trusted advisors that have a lot of technical expertise, so that I can kind of triangulate with them and get there. They had very detailed conversations with the technical team to really understand our areas of expertise, because the team definitely had a lot of strengths and had developed very valuable products and technology. So it was a combination of direct conversations that I had, but then also bringing in people that could provide more technical depth and kind of triangulate my own perspective.


Lucas Nelson

Got it. So you found technology you liked, you found a team that you could work with, but clearly there needed to be some strategy pivot. How did you view that? How did you make that pivot? How did you dig in and decide, okay, I think this is the way to take this going forward?


John Kanarowski

Yeah, I think at a high level, the process is definitely one of developing a hypothesis based on all the conversations that we had, testing that hypothesis with the team and then also with the market, and then evolving it. You know, we're in a fortunate position where we do have several customers, long-standing customers, both commercial as well as government customers.


So having conversations with them about our product vision going forward was super important as well, and getting feedback from them on the new vision. And then, like I was saying, iterating as well, because it's unlikely that you're immediately going to come up with, okay, exactly, here's what the direction should be. You come up with an overall vision


and then bounce it off the team, bounce it off the customers and get their feedback and then adjust based off of that. And that's worked well for us.


Lucas Nelson

So we talk a lot, as an industry, about the founder journey and what founders do. But it's different when you step into a role this way. How do you hit the ground running? What's your 30, 60, 90-day plan when you come into what's essentially still a startup, but you kind of have to take it in a new direction and make it your own?


John Kanarowski

Yeah, so great question. The one thing about the 30 days is that it actually starts at, like, minus 15 days, right? As soon as you commit to, I'm gonna go here, the clock starts, and you start doing your third-party research. But then one of the key things that I ended up doing, which in retrospect I'm really glad I did, is I had a conversation with everyone in the company. And we're not a huge company, we're about 35 people, but conversations with everyone, with very open-ended questions, to kind of get their perspective on things. So that was really important in that first 30 days. The next thing was meeting with customers.


So to the extent it was possible, obviously I didn't want to disrupt the natural cadence of meetings, but to the extent it was possible, customer conversations, and then leading up to those, listening to past customer conversations. Pretty much every one is recorded these days, so going back and listening to a bunch of those recordings as well, so that I could get a real good feel for the customers. And then the other key piece was that technology assessment that we were talking about


before. And then as those puzzle pieces come together, you start seeing what you have and you start coming up with ideas. And then we phased into a kind of typical strategy brainstorming process. We did SWOT analyses, we worked on our vision and things like that, and that led to an initial hypothesis: okay, this is what we should focus on. Well, it was actually three things that we could focus on. So we had a three-fold hypothesis, and then we tested those with pretty extensive prospect and customer conversations. And this is towards the 90-day point.


We had something like 50 conversations with prospects, testing those three hypotheses that we'd come up with. And then we landed on one after that.


Lucas Nelson

Cool. So we've jumped in pretty fast, and I don't know that we've really set the table yet. The original technology was natural language processing, and then the world of LLMs comes in. So why don't you kind of explain


the technology that you started with, your hypothesis, and then what you're doing today to let everyone kind of sink their teeth into the product that you have now.


John Kanarowski

Yeah, absolutely. So, in a nutshell, the first product that we developed was a summarization product. You kind of have to rewind back to pre-ChatGPT days, around the 2015 to 2019 time period. Summarization was a really meaty problem that hadn't really been solved yet. And so the team did a fantastic job


of solving that and acquiring customers for it. And as part of that, and this obviously predates me, I joined a year and a half ago, but during the 2020 to '22 era, the team was actually an early adopter of the early versions of ChatGPT, so before it was anywhere even close to where it is now,


you know, mainstream. They were early adopters, and they started picking up on, wow, this is really powerful technology, and by the way, this is going to leapfrog what we've been working on. And that's obviously a pretty tough realization to make. So the initial product was summarization, and once they were going through that process, they realized, okay, they're going to get leapfrogged here very soon.


Lucas Nelson

Okay.


John Kanarowski

They made a pivot to entity intelligence. And essentially


what this was is taking the core summarization technology, the engine of that, which was based around entities and extracting those from documents and turning that into a product. And then that would be a middle layer that would support GenAI applications. And so when I joined, that was the product that the team was taking to market.


It's a very technical product. I mean, it's kind of hard to describe even as I'm describing it right now. And so one of the challenges was finding adopters for this. What we've pivoted to now is leveraging that core technology, but also integrating in several LLMs to solve a very specific business problem, which is product expertise. And we can talk more about that,


but that's what we've pivoted to, and that's what Implicit now provides.


Lucas Nelson

Cool. So why don't we use a generalized use case for product expertise, and then we can delve into that and maybe talk about the government. But what does a product expertise LLM tool do? Who uses it and for what?


John Kanarowski

Yeah, no, great question. So we have three or four design partners that we've worked on this with very closely. Think of these companies as large technology companies that have complex products. That's a really important one: they're product-centric companies, and a lot of times they have ecosystems of products.


As a result of that, there's an ecosystem of knowledge around those products, meaning there's various different knowledge bases. There's a bunch of different manuals in different formats. There's a lot of expertise tucked away in your CRM. So think of it as a federated knowledge ecosystem. And so because of that, it's hard for employees. Like if you're a customer facing employee, let's say you're a customer success manager at one of these companies.


It's hard to answer product questions, and particularly if you're a new employee. If you've been there for a while, this is what you learn, and this is your expertise and your differentiation. But if you're a new employee, understanding all of those products is really hard. And so that's what we help with, is building a large language model based product expertise chatbot.


And obviously we can deliver those answers in different ways as well. So it really helps employees that are customer facing; think of customer success, technical support, customer service people. But now, of course, the next question is, well, if you have all of that product expertise in one place, can't you also make this available directly to your customers? And that's the journey our customers are on as well. So once they have confidence in the answers, then also


making this available on their website directly to customers, so that the customers can answer their own questions. And then that has the impact of being able to deflect those calls, which would have turned into higher-cost support interactions, by just giving the customer the answer whenever they want it.


That in a nutshell is what we're doing. I can certainly go into more details on some use cases, but that's, yeah, that's the focus area.


Lucas Nelson

Very cool. So I guess why don't we dive in because you mentioned a use case in kind of cybersecurity, which as you know, is kind of my bread and butter, the place where I love to play. So I'd love to talk about that use case specifically. And then maybe we can talk about the government versus commercial.


John Kanarowski

Yeah, cool. So it turns out, and this is again one of those things that we didn't know, but it turns out the problem that we're solving is very prevalent in software companies in general, but in cybersecurity specifically. And the one use case that one of the prospects came to us with, and that we're now working with them on, has turned out to really resonate well. What they have is a system that automatically generates vulnerability reports; it's an AI-based solution. And what we're helping them with is, before that vulnerability report goes to customer success, we're running it through our engine. And basically,


analyzing it against a very specific library of content, which is their content. People use the term walled garden, but think of it as their proprietary content that's not out on the open internet. It's not available to the ChatGPTs of the world, but it is fed into our large language model. And so when that incident report comes in, we analyze it,


leveraging the data that we have, and then we come up with a hypothesis as to what caused this and what the potential solution could be. And then that is provided to the customer success person when they first set eyes on that vulnerability report. So it empowers them, it gives them more knowledge, and it also does so really quickly. So it saves them time as well.


And then they can in turn reach out to that customer however they want to, by email or Slack or whatever their preferred communication mode is, and engage in a conversation about that vulnerability report. And for this customer, the key thing was not, hey, we want to somehow avoid that call and automate this response. No, that's not what they wanted to do. They wanted that call, because it's actually an opportunity for them


to show value to their customer, but they want to make that call faster and more effective and more efficient. And that's what we're enabling them to do. And so then if you look at like the business impact or the ROI of that, it's from those downstream customer relationships. So improving just the customer satisfaction, which in turn will lead to a higher likelihood of retention, maybe upsell opportunities.

Things that come up as part of that vulnerability conversation. So it's a really exciting use case, and it's not one that we were initially aware of, but it's one of those where we're like, wow, this is really cool, and there's probably broader applicability for this.
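For readers who want to picture that flow, here is a minimal Python sketch, assuming invented names and data and a naive keyword-overlap retriever in place of Implicit's actual engine: an auto-generated vulnerability report is matched against a proprietary ("walled garden") library, and the retrieved passages are assembled into a prompt for an embedded LLM.

```python
from dataclasses import dataclass

# Hypothetical illustration only: enriching an auto-generated vulnerability
# report with context from a customer's proprietary library before it reaches
# a customer success manager. All names and data are invented.

@dataclass
class VulnReport:
    report_id: str
    product: str
    summary: str

# Tiny stand-in for the proprietary content library (manuals, KB articles, CRM notes).
LIBRARY = [
    {"doc_id": "kb-101", "product": "GatewayX",
     "text": "TLS certificate validation fails when the gateway clock drifts; resync NTP."},
    {"doc_id": "kb-214", "product": "GatewayX",
     "text": "Patch 4.2 resolves the certificate chain issue introduced in firmware 4.1."},
]

def retrieve_context(report: VulnReport, library: list, k: int = 3) -> list:
    """Naive keyword-overlap retrieval, restricted to the affected product."""
    report_terms = set(report.summary.lower().split())
    scored = []
    for doc in library:
        if doc["product"] != report.product:
            continue
        overlap = len(report_terms & set(doc["text"].lower().split()))
        if overlap:
            scored.append((overlap, doc))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in scored[:k]]

def draft_csm_briefing(report: VulnReport, context: list) -> str:
    """Assemble the prompt a hypothetical LLM call would receive; the model is
    constrained to answer only from the retrieved proprietary context."""
    sources = "\n".join(f"[{d['doc_id']}] {d['text']}" for d in context)
    return (
        f"Vulnerability report {report.report_id} ({report.product}):\n"
        f"{report.summary}\n\n"
        "Using ONLY the sources below, state the likely cause and a suggested fix,\n"
        "and say 'not enough information' if the sources do not cover it.\n\n"
        f"{sources}"
    )

if __name__ == "__main__":
    report = VulnReport("VR-0042", "GatewayX",
                        "TLS certificate validation failure reported on customer gateway")
    prompt = draft_csm_briefing(report, retrieve_context(report, LIBRARY))
    print(prompt)  # In a real pipeline this prompt would be sent to the embedded LLM.
```

In production, the retrieval step would presumably be the entity, graph, and vector search John describes later in the episode, and the assembled prompt would be sent to the model rather than printed.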


Lucas Nelson

Very cool. So I'm going to use that as a way to pivot. You work with both the commercial sector and government. You've got a pretty technical product that has to be modified for each of your customers, so you have to work with them closely.


John Kanarowski

Yeah.


Lucas Nelson

How is it the same working with those, and how does it differ working with the government versus working in the commercial sector?


John Kanarowski

Yeah. So it does feel like I'm driving down different sides of the street on the same day, right? Like I'm over in the UK and then I'm back in the US, because it is a very different experience. The use cases are similar, but just the way these organizations gather information, the way they make decisions, is very different.


You know, a lot of times people talk about speed as being a big difference. And I don't think that's what it's about really, because what I've noticed from our government customers is that they make decisions pretty quickly as well. What I would say is the major difference is the amount of buy-in and consensus that is required. What I've noticed on the government side is the complexity of the matrix.


there, like the overlay of decision-making authorities, is really pretty complicated. And so for our customers to get things done, they need buy-in from a bunch of other stakeholders within the government. And so that is a big difference versus, you know, working with cybersecurity companies or software companies or tech companies in general, where decision-making authority


is, obviously, matrixed there as well, and you need buy-in from a broader set of stakeholders, but it's definitely not as complex as on the government side. And so what this means is that when you are working with the government, you really need champions and sherpas and guides, people that understand it, and you have to trust them, right? Because there's no way you can really understand it all yourself.


And in particular, we're working with some customers that are in the DOD and in the intelligence space. So you absolutely, 100%, have to trust your guides to be successful and to help them, right? So, yeah, go ahead.


Lucas Nelson

Yeah.


Yeah, so how do you toggle? In some ways, you know, you are the sales leader, right? As CEO, your job is selling. How do you toggle between those different customers? What do you do on a daily basis? Do you have different teams? How do you view that job of, okay, I've got one lane that looks like this and one lane that looks very different? How do you manage that?


John Kanarowski

Yeah, absolutely. And as you probably gathered, my background is in sales. I grew up on the sales side of software companies. And yeah, so it's different teams. One of the things, as a startup, and particularly in this day and age, is you need to be lean. So we've got a very lean team here. But what we do is we supplement our team with advisors. So we work with a bunch of different advisors on the government side,


but also on the commercial side. That fills in gaps that we have in terms of industry expertise, relationships, things like that. So that's a big part of what we do. In terms of our team, supporting both ends up being the same. I mean, we're a small operation in the grand scheme of things, right? And so everyone here wears two hats and supports


both commercial and federal.


Lucas Nelson

So let's take a different route for a couple of seconds. We mentioned this was originally an NLP technology, natural language processing. The team saw the coming, let's call it tsunami, hurricane, sea change of LLMs.


John Kanarowski

Yeah.


Yep.


Lucas Nelson

You know, how has the company managed to weather that? Because that could have easily killed something, in that, wow, this major change is coming down the pipe and the stuff that we've been working on, while it still may be valuable, is not going to be the thing going forward.


John Kanarowski

Yeah, I think humility is a big part of it, and just, you know, kind of reading the tea leaves and then adjusting based off of that. And kudos to the previous team and the founders; they were really the ones that had their pulse on the changes and then made the right decisions. The other thing that I would say is having really good, transparent relationships with our investors, because you can't do this without support, right? So making sure that your investors are really informed and bought in and part of the decision-making process is just super important. And then the third thing I would say is, you know, staying close to your current customers, right? Because as much as we are navigating the tsunami,


you know what, if you're sitting on their side of the table, they're navigating the same tsunami, right? So the other thing is just being very empathetic to your customer's situation too and adapting with them.


Lucas Nelson

So it sounds like your customers may have been kind of the key resource to help you figure out where to go and what moves to make. If that's the case,


hone in on that a little bit for me. What was that period like for you? I know the team did some of it, but again, you came in and kind of had to figure out how to surf this particular wave.


John Kanarowski

Yeah, no, you're absolutely right. I think the first thing is just navigating the current relationships, right? The customers have trust in people; you do business with people. So just accepting that as a base foundation: look, these customers are doing business with our team, so let's start there.


Build from that foundation. And then in the first several conversations, really take a listening mode. That's super important, because you're the least smart person in the room, right? Everyone is much more knowledgeable than you are. So listening in those first conversations, and then that gives...


It gives you time to figure things out, first of all, really figure things out, but also to build trust with the customers. So then as you roll out changes, you have their buy-in as well. And yeah, that's what I would say worked for me in these conversations.


Lucas Nelson

So I'm going to get into the last set of general questions. But before I do that, if you were to say what differentiates


what you're doing today versus the other LLMs out there. I've got some ideas from your earlier answers, but I'd love to hear it crisply: how do you position yourself in a world where there's a bunch of different players?


John Kanarowski

Yeah, absolutely. So there's a couple of key things here, and the biggest one is around accuracy: generating more accurate answers, but also avoiding inaccurate answers. That's the high-level concept. So you can break down, okay, how do we deliver more accurate answers? There's a couple of components that go into that. One is the data that we bring in.


The second is how we process that data. And then the third is how we actually generate an answer at the point of the customer query. So the first thing is around data. Some people use the term small language model, but we basically focus a large language model on a very well-defined library. And so people use the term walled garden.


Or another analogy is, if you go into a library, we're not rifling through all sorts of different shelves looking for a book. We have a really well-organized system, and we're looking through a very specific shelf for exactly the right paragraph in the right book. So we have organized that library better and know exactly where to look. But the key thing is, rather than starting from, you know,


100,000 books, we're starting from a bookshelf of 300 books. And it varies from customer to customer. I mean, we have some customers that have a couple hundred documents in our data set; some have tens of thousands. But in every case, it's a focused library. So that's the first thing. And then we generate answers from that data set and only from that data set.


And then the second thing is around the document and data management side of things. Our team has a lot of expertise in NLP, natural language processing, and in particular around entity extraction. So when we go through a particular document, we have a preloaded taxonomy that has multiple dimensions, but the key ones are products and situations. And so when we go through that document,


Lucas Nelson

Okay.


John Kanarowski

we are going in and doing entity extraction for every single product, accessory, component, things like that, that is referenced in that document, and then also all the situations. Think of situations as areas or concepts where there are issues and actions. And so we're going through and doing a semantic search to find sections, might be a couple of sentences, might be a paragraph, might be more,


Lucas Nelson

Okay.


John Kanarowski

that relate to a particular situation, and then we're tagging that as well. And so our team is absolutely world-class at that step of going through these documents and doing the entity extraction. That goes back to our roots around the NLP expertise. So that's the second step. And, sorry, I should have mentioned, another key part of that second step is building a knowledge graph of all of those entities that we extract: the products, the situations, as well as the sections of the document. So we create a knowledge graph.

And then the third piece is at the time of the query, when the customer, could be an employee, could be an end user, is coming in with a question. Then we do a graph RAG process. So we're doing a graph query: what part of that knowledge graph is relevant here? But then we're layering on a vector search as well to really hone in on exactly the right part of the data set that we have. And so this enables us to hit that accuracy that we're talking about, which is industry leading, but also to do this in a really efficient way. And because this is a pretty standardized process that we do, we can save our customers GPUs as well. So, I know it's a little bit longer answer, but that's how we differentiate: really around accuracy, and then those various different data management steps that lead into that.
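To illustrate the two-stage retrieval John outlines, here is a small, hypothetical Python sketch, assuming toy data, invented entity labels, and a bag-of-words stand-in for real embeddings: a graph lookup narrows the candidate sections to those linked to the recognized product and situation entities, then a similarity rerank picks the passages to hand to the model.

```python
import math
from collections import Counter

# Hypothetical sketch of a graph-plus-vector retrieval step. All section text,
# entity names, and the toy "embedding" below are illustrative assumptions,
# not Implicit's implementation.

SECTIONS = {
    "s1": "Reset the GatewayX admin certificate when TLS handshakes fail.",
    "s2": "GatewayX firmware 4.2 changes the default cipher suites.",
    "s3": "SensorHub battery replacement procedure.",
}

# Knowledge graph as adjacency lists: entity -> ids of sections that mention it.
GRAPH = {
    "product:GatewayX": {"s1", "s2"},
    "product:SensorHub": {"s3"},
    "situation:tls_failure": {"s1"},
}

def embed(text: str) -> Counter:
    """Toy bag-of-words 'vector'; a real system would use learned embeddings."""
    return Counter(token.strip(".,?!;:") for token in text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, entities: list, k: int = 2) -> list:
    # Stage 1: graph query -- union of sections linked to the recognized entities.
    candidates = set().union(*(GRAPH.get(e, set()) for e in entities)) or set(SECTIONS)
    # Stage 2: vector-style rerank of only those candidates against the query.
    q = embed(query)
    ranked = sorted(candidates, key=lambda sid: cosine(q, embed(SECTIONS[sid])), reverse=True)
    return ranked[:k]

if __name__ == "__main__":
    # Entity recognition (mapping the question to products/situations) is assumed upstream.
    print(retrieve("Why do TLS handshakes fail on GatewayX?",
                   ["product:GatewayX", "situation:tls_failure"]))
```

In a real deployment the rerank would presumably use learned embeddings in a vector index, and the top sections would become the only context passed to the LLM, echoing the "answers only from that data set" point above.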


Lucas Nelson

Awesome. So let me ask a general question here. You're trying to surf a wave, to use a common metaphor. Where do you think the industry is headed? Where's the wave going in general? How are you going to stay out in front of it?


John Kanarowski

Yeah, so that is the million-dollar question, right? The key thing, at least for us, is staying very close to our customers. We have a pretty clear understanding of what the customer problem is that we're solving, and we know what type of customers have this problem. We're solving a part of that problem right now, the most valuable part in our view, but obviously there are additional parts to it as well. And so that's how we're staying focused: by staying focused on that customer problem that we're solving and then biting off additional layers of it. And that enables us


to stay one step ahead of all the changes that are happening. Now, the other thing that I would say, though, is that as much as we are a provider of GenAI solutions, we are also a consumer of GenAI solutions, right? We have embedded models in our solution, so we're also constantly looking at what's new, what models have come out,


for specific tasks within our solution. A lot of those things that we're doing, like entity extraction, building the graph, and then generating the answer, leverage large language models as well. And so as new models come out, we are constantly asking, what tool is the best for our particular use case? So that's how we're surfing that wave, if you will.


Just like everyone else, you know, making sure that we're staying current and using what's best for our particular needs.


Lucas Nelson

Cool.


What has you most excited for the next six months? What do you want to see happen next?


John Kanarowski

So I'm going to divide this answer based on the two different customer bases that we serve, commercial and federal. On the commercial side, we launched Implicit in March, right? So it's early days for us. Right now, we're engaging with a bunch of different prospects. So a key thing is converting those prospects into customers, driving value, driving real ROI. Super important.


Also, we want to be fortunate in that process and work with good customers, good people. Ultimately, that's what's going to drive our roadmap 18, 24 months from now. So that's a key thing for us. Then on the federal side, as you can imagine, this is a crazy time to be selling into the federal government. There is just a lot of volatility, a lot of uncertainty in our customers' minds. Their world is spinning. I mean, it's hard for us sitting outside of the government to truly appreciate that. So what I hope for there, and I'm sure we will get to this point, is more stability, more clarity around what


the strategic imperatives are. And a lot has changed even in the last four weeks; there's definitely been an increase in clarity. But over the next six months, I hope our customers get a clearer view of what their direction is and how they're going to support our warfighters in different ways, and how


AI can play a big role in that. There is a very large group of people within the government that are very knowledgeable about LLMs and the opportunity to leverage them within the government for a bunch of different obvious and non-obvious use cases. So there's going to be a lot of innovation there, and I think we are really well positioned to support them in a way that aligns with their strategies and in a way that's super efficient as well. We don't bring in the cost structure that other companies bring in, and because of that, we can deliver a lot of value at a price point that is super compelling to our federal customers.


Lucas Nelson

Great. To wrap up, what are you reading? What book would you recommend? What's on your mind?


John Kanarowski

Yeah, absolutely. So what I will say is I'm one of these people that used to read a ton, but now I feel like most of the ideas that I consume come in through podcasts, and if I "read," quote unquote, it's via Audible.


Lucas Nelson

Okay.


John Kanarowski

The one podcast, and all my friends know this, that I absolutely love is The Rest Is History. If you haven't listened to it, I definitely encourage you to; it's phenomenal. I love history in general, and it's two British guys, and they put out an episode like every other day. So there's a ton of stuff out there.


But yeah, that's the one I consume more than anything else. In fact, they actually did a tour of the US, and I happened to be in New York at the time they were there, so I went to their show. I was one of those geeks that actually saw them live. Yeah, it was cool.


Lucas Nelson

That's wonderful. Okay. So let me give you a chance to plug stuff. Where can people find Implicit? Where can they find you?


John Kanarowski

Yeah, implicit.cloud. So implicit.cloud is where we're at, and then I'm just john at implicit.cloud. That's the easiest way to get a hold of me. And yeah, I'd love to hear from anyone. I love networking and just having conversations, even if it's not obvious how we're going to collaborate or move forward.


You know, I love helping people out, and if there's any way I can do that, I certainly would love to. I'll share more about what we do, and if somebody wants to introduce us to someone who might care about our technology, that's awesome as well, obviously.


Lucas Nelson

Perfect. Well, thank you so much for joining me today. I really appreciate your time.


John Kanarowski

Yeah, thanks Lucas. My pleasure.


Lucas Nelson

Alright, bye.


John Kanarowski

Bye.



