Constellation Software hosted a call today about the impact of AI on its business, and I decided to write a brief article with the highlights. Note that most of the highlights are shared through quotes from the call (by any of the participants), with me just providing some quick thoughts to add context.
As some people already believed, AI has not “eaten” vertical market software (at least not yet). Mark Leonard started the call with a powerful story that demonstrates both his integrity and the fact that some people might be getting ahead of themselves with their forecasts:
In 2016, Geoff Hinton made a long-term forecast. For those of you who don't know him, Geoff is known as the godfather of AI and is a Nobel Prize winner for his work in the field. And long-term forecasting is very difficult. I talked about this before, and I'm happy to send you some sources/information if you'd like to delve into that further.
Geoff's forecast in 2016 was that radiologists were going to be rapidly replaced by AI, and specifically, he said people should stop training radiologists. In the intervening nine years since he made that forecast, the number of radiologists in the US has increased from 26,000 (these are US board-certified radiologists) to 30,500, a 17% increase. Now, that has outpaced population growth in that period, so the number of radiologists per capita is up from 7.9 to 8.5. Now, Geoff wasn't wrong about the applicability of AI to radiology. Where he was wrong was that the technology would replace people. Instead, it has augmented people. The quality of care delivered by radiologists has improved. And the number of practicing radiologists has increased.
So I told you this story to make two points. Firstly, you and I will never know a tiny fraction as much about AI as Geoff did. And secondly, despite his deep knowledge of AI, he was unable to predict how it would change the structure of the radiology profession.
So I think we're at a similar point today with the programming profession. It's difficult to say whether programming is facing a renaissance or a recession. Programmers could experience massive demand for their services if their efficiency improves tenfold. You can imagine not having to put up with software that does 80% of what you want. You'll be able to get software that does 100% of what you want, that's customized to your needs, and the falling cost of programming will drive that increased customization. What a wonderful outcome that would be. Equally, you can imagine a tenfold increase in programmer productivity driving a massive oversupply of programmers, particularly if demand for their services remained static. And similarly, if the 10x efficiency doesn't happen, if it's a 10% efficiency gain, you can imagine that there would be very modest changes to the current status quo. So we don't know which way this is going to go. We're monitoring the situation closely.
In this opening “speech”, Mark Leonard touches on an interesting topic that was discussed later on during the call: customization. An analyst asked the panel whether the potential for large customers to insource more software development to achieve customization is good or bad for Constellation. This was management’s reply:
Let me put that question into context, because I think it's a great question, but it's one that we have been confronting forever. I see vertical market software as sitting somewhere between horizontal applications that are cheap and cheerful and do 50% of what you want, and highly customized systems that do exactly what you want. Now, obviously you have to hit a certain price point to live in the middle, where most vertical market software companies live. You frequently have professional services to provide some customization, but those professional services, whether it be custom programming or otherwise, are expensive, and hence only a certain class of client can afford them. So those who graduate from horizontal point solutions laced together with Excel to vertical market software, at the low end, are going to have very little in the way of professional services and customization, custom interfaces, and custom reports. And at the high end, they're going to have highly customized systems where we are willing to do whatever they want as long as they have budget. And our people aren't cheap, so they're going to have to pay for that.
Well, that has always been the case. And the very largest clients frequently see their software as strategic. It isn't just the tools to do business; it's a way that they differentiate themselves from their competitors. And if you're dealing with a highly differentiated, large client, they're going to want to have that be proprietary to them, and they're going to try and capture as much of that information technology advantage within their own realm as they can and control it. And so we frequently do lose large clients to an SAP implementation or a proprietary implementation. And that's always been the case.
You know, we capture the small companies as they graduate from horizontals. We take them and some of them grow enormously and become very successful, large companies. And then they graduate to no longer using our systems, but to using a much more proprietary system that they have a much stronger hand in driving. Now, AI has the potential to allow us to do way more work on making the client happy and customizing our solutions. But it also allows the client to potentially do that. And so there's a natural tension there.
We obviously would love to capture that. Our clients, if they don't have a list of five years' worth of IT projects to get to, would obviously love to capture that as well. And so I think, to some extent, whenever we go see a large client's IT director, we're in a negotiation regarding what we'll do and what they'll do. And, you know, it's not going to be an easy answer. It's going to be somewhere in between. And AI makes it potentially way more exciting for us to provide customization, but it also makes it much more likely that the client will do it themselves.
The way I interpret this (I might be wrong) is that the risk/reward that AI brings in terms of customization is great for Constellation and probably not great for the ERP vendors that are currently the providers of the customized solutions. If Constellation can leverage AI to offer some sort of customization more cheaply than it can today, customers might not “graduate” from their VMS solutions, and therefore retention could inch higher. This reminded me of Intuit's strategy to disrupt the ERP market. Intuit recently launched IES (Intuit Enterprise Suite) to go upmarket and retain those customers that were graduating to more customized solutions that QuickBooks could not provide.
All this said, what everyone wanted to hear was whether AI is going to “eat” software, and management attacked this question from several angles. First, they made the distinction between the usefulness of AI when writing code from scratch and its usefulness when maintaining the software it has generated:
Obviously with different gains in different stages, but coming back to the original question, which was: should we consider rebuilding software rather than extending or maintaining what we already have, considering that these tools are better at writing code than maintaining code?
Well, regarding this question, it's important to consider that even if we rebuild with AI, we're still going to have to maintain it. So I think in some cases AI will enable us to modernize and rebuild our solutions more effectively than we were able to do before. But at the same time, maintaining, troubleshooting and bug fixing our solutions, we're still going to have to do that, whether the code has been written by AI or by humans ten years ago.
Mark Leonard basically pointed out that it’s unclear whether productivity improvements gained from writing code with AI outweigh the potential problems you can find down the line in terms of maintenance:
So just to jump in and sort of drive home that point. It's really easy to get excited about 10x improvements in programmer productivity as you generate that new application. But if that new application goes out into the field and generates scads of bugs reported by clients and is fundamentally difficult to change and improve, you may give up on the roundabouts what you made on the swings: you may end up with higher lifetime costs of the code base. And similarly, you have to take into account the efficiency of the code that the AI produces. I think we're in very early days. We know there are some wonderful advances in programmer efficiency on the front end, and we just don't know the answers on the back end yet, because we haven't lived with it for long enough.
Surprise, surprise: the reality seems to be much more nuanced than just assuming that AI will take over the entire lifecycle of software.
Secondly, they argued that it’s not data per se that’s the most relevant competitive advantage to them as an “incumbent,” but rather the workflows and processes that they’ve built through the years:
What might be more important than data will actually be processes and workflows. Our businesses have incredible knowledge of the end users' processes and workflows, often better than the end users themselves. And I believe this will be the big opportunity for us: looking at those processes, trying to reimagine some of them by embedding AI in certain areas, and going from systems of record, systems that capture data and allow you to edit and retrieve data, to systems of action, where in some cases the AI agent or the AI solution will perform certain steps that automate more of the human work. So one of the things that we're recommending to our businesses is, again, that looking at data is an important first step. But as has been discussed, the customers already have access to that data, and in some cases they are experimenting with other AI solutions on top of it. But the processes, the business rules, the workflows, that's something that we built into the systems, and I think we're going to be able to leverage that quite nicely.
This is a pretty good point because the customer’s data is not proprietary to Constellation but all of these processes and workflows are, giving the company an edge over newcomers. This is how Mark Leonard sees it:
I agree with Paul entirely. I believe that vertical market software is the distillation of a conversation between the vendor and the customer that has frequently gone on for a couple of decades. You distill those work practices down into algorithms and software and data and reports, and it captures so much about the business. Being able to examine that in a new way because of AI creates new opportunities to modify and change and suggest new approaches. So yeah, I'm hopeful that that unique and proprietary information will be of value.
All of the above, together with the belief that AI will primarily add value through software, led management to claim that software budgets might actually increase rather than decrease:
Answering this question, I should say we saw it with low code. Low code was the promise of automating all the processes; there wouldn't be software developers needed anymore, and anyone could contribute to low-code systems. Well, the reality was completely different. And I think because the scope will broaden and deepen, very interesting business cases will evolve. And if you talk about the software budget, it's only a cost compared to the very interesting business cases that will evolve. So I should say it could be that the software budget will become much bigger because of the broadened and deepened scope, and I think that will be the case because there are a lot more possibilities to optimize businesses. And of course, all the businesses using our software will also face stronger competition, so they have to respond to their competitors. And of course, AI will be the means to compete with those competitors. It will become even more important to have a unique selling proposition for the businesses using our VMS.
So I would say software budgets will certainly not be eaten by AI. They will be leveraged by AI.
Management also discussed how the company leverages LLMs and how its strategy protects it from worsening unit economics. Both things are related, so let's start with how Constellation has structured its access to LLMs to avoid being “price-gouged”:
So we've essentially created our own centralized platform that removes the various factions that currently exist, where, to a certain extent, you have to largely be within a given cloud provider to have native access to a given LLM, and so on and so forth. So there are these turf wars being created across the various cloud providers.
And so our strategy has been to play a very neutral, Switzerland-type role, where by centralizing things through strategic relationships, either directly with the model providers or with the platform providers, we've managed to negotiate some really aggressive deals and remove the element of these factions. They're all willing to play nice with us in the sandbox. So that puts us in a very unique position where, technically, we have access to 15,000 unique models, and that's because we're essentially coalescing anything that otherwise couldn't reside within other platforms. The other piece that I touched on very briefly, and Paul alluded to as well, is using on-prem-based assets where and when possible.
So to the extent that the LLM or AI model needs to be hyper-specific, or a specifically trained one that resides with a pre-existing best-of-breed provider, then sure, it may make sense to tap into that one. But for a basic, let's say, translation service, a summarization service, and a myriad of other functionality, you know, the on-prem one is plenty sufficient and capable of doing its own thing.
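To make that task-based routing idea concrete, here is a minimal sketch of what it could look like. It is purely illustrative: the endpoints, model names, and task categories are my own assumptions, not Constellation's actual platform.

```python
# Illustrative sketch of task-based model routing: basic, high-volume tasks go to an
# on-prem open-weight model, hyper-specific tasks go to an external provider.
# All endpoints and model names below are hypothetical placeholders.

ROUTES = {
    # basic tasks served by on-prem inference
    "translation":   {"base_url": "http://llm.internal:8000/v1", "model": "open-weight-8b"},
    "summarization": {"base_url": "http://llm.internal:8000/v1", "model": "open-weight-8b"},
    # specialized tasks served by an external best-of-breed provider
    "legal_review":  {"base_url": "https://api.provider.example/v1", "model": "frontier-model"},
}

def route(task: str) -> dict:
    """Pick the endpoint and model for a given task, defaulting to on-prem."""
    return ROUTES.get(task, ROUTES["summarization"])

print(route("translation"))   # routed on-prem
print(route("legal_review"))  # routed to the external provider
```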
This flexibility basically means that CSU will benefit from price wars across the different LLM providers (which they expect will happen) and will also be able to take advantage of its on-prem infrastructure to lower costs for customers (when possible). This, together with pricing to value, might be enough to maintain margins even as COGS increase:
Yeah, maybe it's worth briefly outlining what we know right now before we look into what might happen in the future. So as of right now, these model providers are charging anything between $1 and $3 per 1 million tokens. You can think of tokens roughly as words. Now, there are many studies showing that users of AI platforms consume, on average, between 50,000 tokens per month (these are the light users) and 1 million tokens per month (these are the heavy users).
So based on the data that we have right now, we can infer that if a CSI customer were to start embedding AI features into their product, there's going to be an estimated COGS per user of anything between, like, $1 and $8. So I think we can easily cover this and maintain our margins by having premium add-ons: if our end customers want to leverage these features, they can buy those premium add-ons.
Now, indeed, it's hard to predict how this will turn out in the future. But the good news is that currently, these large language models don't have a very large moat around them. Some of you might have read that when GPT-5 came out, there were some doubts regarding its performance. They had some issues with their deployment, and overnight there was a huge switch from OpenAI models to different providers, with virtually one line of code being changed on the consumer side. So this tells me that there will be high competition between these model providers, and this will keep the pressure on the cost. And I think in the long run, the price per token will go down. Now, again, we can speculate about how these companies will start to build up their moat so that it's harder to switch from one model to another. But so far, again, we don't have clear data or indicators toward that. I believe that, as of right now, if CSI companies are adopting AI features, we will be able to maintain our margins. And if one of the big providers starts to hike up their prices, we always have the option to use things like model routing, so using smaller models for different tasks, or even on-premise inference by leveraging open-weight LLMs.
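The “one line of code” switch is plausible because many model providers, as well as on-prem inference servers such as vLLM or Ollama, expose OpenAI-compatible endpoints, so moving between them is often just a matter of pointing the same client at a different base URL and model name. A minimal sketch, with placeholder URLs, keys and model names of my own invention:

```python
# Sketch of swapping model providers behind an OpenAI-compatible API.
# Base URL, API key and model name are placeholders, not real endpoints.
from openai import OpenAI

# The original setup might point at a hosted provider; switching to another
# provider or an on-prem server typically means changing only these values.
client = OpenAI(base_url="http://llm.internal:8000/v1", api_key="not-needed-on-prem")

response = client.chat.completions.create(
    model="open-weight-8b",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize this support ticket in two sentences."}],
)
print(response.choices[0].message.content)
```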
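As a quick sanity check on the per-user COGS figures quoted earlier in that answer, here is a back-of-the-envelope version of the math using only the numbers given on the call; the $1 to $8 range management cites presumably layers in additional assumptions (several AI features per user, prompt overhead, and so on) beyond this raw token arithmetic.

```python
# Back-of-the-envelope LLM COGS per end user per month, using the ranges quoted
# on the call. Figures are illustrative, not Constellation's actual costs.

def monthly_cogs_per_user(tokens_per_month: float, usd_per_million_tokens: float) -> float:
    """Cost of serving one user for a month at a given token price."""
    return tokens_per_month / 1_000_000 * usd_per_million_tokens

# Light user (~50,000 tokens/month) at the low end of quoted pricing ($1 per 1M tokens)
print(f"light user, cheap pricing:  ${monthly_cogs_per_user(50_000, 1.0):.2f}")
# Heavy user (~1,000,000 tokens/month) at the high end of quoted pricing ($3 per 1M tokens)
print(f"heavy user, high pricing:   ${monthly_cogs_per_user(1_000_000, 3.0):.2f}")
```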
Management also received questions on how AI changes the M&A landscape. The short answer is that they have not yet seen an impact on M&A and that it’s BAU (business as usual):
Yeah, I think the underlying assumption is that we're not opportunity constrained, and we are opportunity constrained. And so narrowing the aperture is a bad idea. We end up sitting on a whole pile of cash, and when you're striving to generate very high rates of return on your investments, sitting on cash is not a good plan. So we already are working hard to look at things outside of strictly vertical market software. We've done some horizontal stuff. We've done some hybrid hardware-software. We've done some hybrid data-software. And so I would say that AI is not reducing what we're looking at. I'd say it may influence the pricing on certain things where we see it having a current impact. But yeah, it isn't changing much in the M&A world.
The call ended in pure Mark Leonard fashion, just like it started: with a healthy dose of skepticism around forecasts and what AI can achieve. They first shared some stats on AI usage at one of the operating groups (they didn't say which one) and noted that some of these stats were worse than they would have expected considering how much hype AI is getting. Here they are:
27% of the business units in a given operating group are developing AI products for customers. These are split roughly half and half between (1) the customers’ customers and (2) the customers themselves, so CSU is also helping its customers add value through AI. They expect this will accelerate over the coming months
AI is being used for 29% of customer requests, which is significantly worse than ML expected
50% of the BUs in this operating group are using AI for sales and marketing
61% of their BUs are using AI tools in R&D
Roughly 3% of BUs had replaced people with AI tools
Constellation doesn’t claim to know the future of AI, but they do know that nobody else does either:
Let me encourage you not to listen to what you read without healthy skepticism. In the last few weeks alone, I've heard that a major soft drink company increased its US sales by 7 to 8% because of AI. And I had a look at its stock, and it went down.
I heard from the founder of a major software investor that AI just increases TAM, and that's, you know, wonderful. But you've got to consider the source. He's not about to say that software is threatened by AI.
I've heard from a bank CEO that AI is revolutionizing their business and is going to lead them to a brave new world. It's really important to dig in and try to understand, to be an anthropologist, to observe and test the claims that you hear, and to try to understand the current state of the art. There are two ways to do it. One is obviously through sleuthing the claims that you hear; obviously, if you have trusted partners from whom you're getting evidence, that makes life a whole lot easier. The other thing you can do is be a scientist instead of an anthropologist and, instead of observing, actually run experiments: you know, try AI, and ideally try it against the alternative, and see if you get a significant improvement in whatever it is that you're endeavoring to do. So, predicting the future is really, really hard, particularly at times like these. But monitoring what's happening in real time is a whole lot easier.
You've just got to approach it with, as Chris said, a healthy skepticism.
All in all, nothing thesis-changing was shared in the call (which was expected), but there were a couple of interesting highlights that confirm that…
AI has not eaten Constellation’s business just yet and the company has ways to leverage AI
Nobody knows what the future holds
Have a great day,
Leandro