Analyst · Bernstein Research.

Olivier Pomel
Yes. I mean, look, the way we currently sell a lot of these products is you show the difference in time spent. When the alternative is you try to solve a problem yourself, you have an outage, you start a bridge with 20 people on it, they look for 3 hours for the root cause, and you wake people up in the middle of the night for it, that's very expensive. It takes a lot of time, and there's a lot of customer impact because the outages are long. If instead you have the answer in 5 minutes, only the 3 right people are looking, and you have a fix within 10 minutes, you get a shorter impact on the customer, far fewer people involved internally, and lower cost. So it's fairly easy to make that case, and that's how we sell the value there.

Longer term, as I was saying earlier, the state of the art for incident resolution today is post hoc: you have an incident, you look into it, you diagnose it, and then you resolve it. So yes, maybe you cut the customer impact from 1 hour to 15 minutes, but you still have an issue, you still have impact, you still distract the team and have teammates working on it. I think what's going to happen longer term is the systems will get in front of issues. They will auto-diagnose issues. They will help pre-mitigate or pre-remediate potential issues. And for that, the analysis will have to run in stream, which is a very different thing. You can massage data and hand it to an LLM for post hoc analysis, and a lot of the value there is in gathering the data, but there's also quite a bit of value in the smarts done in the back end by the LLM, and that part is done by the Anthropics and the OpenAIs of the world today.
I think when you look at being in stream, looking at 3, 4, 5 orders of magnitude more data, looking at the data in real time, and passing judgment in real time on what's normal, what's anomalous, and what might be going wrong, doing that hundreds, thousands, millions of times per second, that's what is going to be our advantage and where it's going to be much harder for others to compete, especially general-purpose AI platforms.
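[Editor's note] The in-stream judgment described above, deciding point by point whether a metric is normal or anomalous without storing history, can be sketched with a simple online detector. This is purely illustrative (not Datadog's actual algorithm): it uses Welford's running mean/variance so each data point costs O(1) work, which is what makes per-event scoring feasible at millions of points per second.

```python
import math

class StreamingDetector:
    """Illustrative in-stream anomaly detector using a running z-score.

    Keeps only a running mean and variance (Welford's algorithm), so it
    judges each point as it arrives instead of analyzing data post hoc.
    """

    def __init__(self, threshold=3.0, warmup=30):
        self.threshold = threshold  # z-score beyond which a point is flagged
        self.warmup = warmup        # points to observe before flagging anything
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0               # running sum of squared deviations

    def observe(self, x):
        """Update running stats with x; return True if x looks anomalous."""
        anomalous = False
        if self.n >= self.warmup:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(x - self.mean) / std > self.threshold:
                anomalous = True
        # Welford's online update: constant time and memory per point.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

# Example: a latency stream oscillating around 100ms, then a 110ms spike.
detector = StreamingDetector(threshold=3.0, warmup=30)
normal_flags = [detector.observe(100 + (1 if i % 2 else -1)) for i in range(50)]
spike_flag = detector.observe(110)
```

The contrast with post hoc LLM analysis is the data path: here nothing is collected, batched, or shipped to a model; the baseline lives in a few floats updated per event.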