ESI Team

The AI Playbook

July 12, 2023

On the 26th episode of Enterprise Software Innovators, hosts Evan Reiser (Abnormal Security) and Saam Motamedi (Greylock Partners) talk with Naveen Zutshi, CIO of Databricks. Databricks is a leading AI enterprise software company enabling organizations to build, deploy, and maintain data solutions at scale. In this conversation, Naveen shares fascinating use cases from the next generation of large language models, how AI can transform every aspect of enterprise work, and the best ways to stay knowledgeable about the cutting edge of AI while building a culture of innovation.

As one of the premier AI enterprise software companies, Databricks sees firsthand how the new generation of large language models (LLMs) is having a measurable impact on businesses across industries. Naveen shares a fascinating use case from Rolls-Royce, the legendary engine manufacturer: “With Rolls-Royce, their model is selling engines for a very low price because they charge the customer per fly run, per hour of flights. They want to keep the engines in great working order and do a lot of preventative maintenance to prevent and reduce the amount of downtime for the engines because their entire revenue is tied to this. They use machine learning and AI for all of those purposes.” Whereas earlier approaches to preventative maintenance could not incorporate vast amounts of data, Rolls-Royce can now do extensive predictive modeling and analytics, optimizing its engines over the long term and driving its business forward.

Like many AI practitioners, Naveen is excited by the prospect of AI becoming a transformative technology across multiple business areas. He frames his optimism by first acknowledging that earlier uses of AI were siloed within the specific engineering teams who knew how to harness it, since there weren’t many user-friendly interfaces for interacting with its capabilities: “Before [this latest AI wave], you were using NLP and AI, but it was used primarily by data scientists and data engineers. It is now democratizing this whole notion of LLMs to the entire company. I can’t tell you how many business leaders, whether they’re in sales or other groups who want to start using the technology.” To that end, Naveen shares several fascinating use cases he sees emerging in this next wave of AI technology.

His first example centers on user experience and access to information within an enterprise organization, an area where large companies often struggle. Naveen believes AI can solve this problem: “Finding information is probably one of the lowest engagement scores for enterprises. I think we can turn complex requests into English-based requests without the need for engineering help and a lot of backend work once the prep is done, so I think that can make not just access to information, which is hindsight information, but also hopefully help with some of the predictive information as well.” By training models on larger and larger datasets, enterprises can use AI to make information within the organization more readily accessible across teams.
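To make the idea concrete, here is a minimal sketch of what an English-language interface over enterprise data could look like. The table schema, prompt, and use of the OpenAI chat API are illustrative assumptions, not details Naveen describes on the episode.

```python
import openai  # assumes the 2023-era openai SDK and an OPENAI_API_KEY in the environment

# Hypothetical schema for an internal analytics table; purely for illustration.
SCHEMA = """Table: support_tickets
Columns: ticket_id (int), account (text), product_area (text),
         opened_at (date), resolved_at (date), severity (text)"""

def english_to_sql(question: str) -> str:
    """Ask an LLM to translate a plain-English question into SQL over a known schema."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": f"You translate questions into SQL for this schema:\n{SCHEMA}\n"
                        "Return only the SQL query."},
            {"role": "user", "content": question},
        ],
        temperature=0,
    )
    return response.choices[0].message["content"]

# e.g. english_to_sql("How many high-severity tickets were opened last quarter?")
# The generated SQL would then run against the warehouse, with the rows summarized back to the user.
```

In a setup like this, a business user asks a question in plain English and never touches the SQL, which is the “no engineering help once the prep is done” outcome Naveen describes.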

Customer service is another area where this latest wave of generative AI capabilities will enable a much more dynamic and helpful experience for the average user. As Naveen describes it, Databricks is helping its customers do just that: “We built a knowledge base at kb.databricks.com. You vectorize it, embed it, do the LangChain, and then do the summarization. And you’re improving the accuracy of results, grounding the prompts, and coming out with a much more conversational bot that will help with both self-service for customers and reduce the number of tickets that need to be addressed by our customer support reps.”
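The workflow Naveen outlines maps closely to a standard retrieval-augmented generation pipeline: chunk the knowledge base, embed and index it, retrieve relevant passages, and ground the model’s answer in them. Below is a minimal sketch along those lines, assuming a mid-2023 LangChain and OpenAI setup with a local FAISS index; the actual stack behind the Databricks knowledge-base bot isn’t specified in the episode.

```python
# Minimal retrieval-augmented QA sketch in the spirit of the kb.databricks.com example.
# Assumes 2023-era LangChain APIs, OpenAI credentials, and locally loaded KB articles.
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS
from langchain.chat_models import ChatOpenAI
from langchain.chains import RetrievalQA
from langchain.schema import Document

# Placeholder knowledge-base articles; in practice these would be exported or scraped.
kb_articles = [Document(page_content="How to restart a stalled Databricks job ...")]

# 1. Chunk the articles, 2. embed the chunks, 3. index the vectors.
chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(kb_articles)
index = FAISS.from_documents(chunks, OpenAIEmbeddings())

# 4. Ground the LLM's answer in the retrieved chunks ("stuff" them into the prompt).
bot = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0),
    chain_type="stuff",
    retriever=index.as_retriever(search_kwargs={"k": 4}),
)

print(bot.run("My cluster won't start. What should I check first?"))
```

Because every answer is grounded in retrieved knowledge-base passages, the bot stays closer to documented fixes, which is what enables the self-service deflection of support tickets Naveen mentions.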

In such a fast-moving space, staying current with the latest trends can feel daunting even for seasoned technology leaders like Naveen, especially as the pace of innovation continues at a breakneck clip. To keep up, Naveen recommends that technical leaders dig into the white papers behind the latest AI models, since more are published all the time and there is no longer a single source of truth. And as these models continue to come down in cost, opportunities for learning and experimentation open up as well: “You used to think one model to rule them all, and now suddenly there is a plethora of both open source models as well as smaller models with fewer parameters. The cost curve has come down so dramatically, so quickly; it’s amazing to see. I think GPT-3 was $5 million; it now can be trained at $300,000 to $500,000.” If prices keep falling, beginning an AI journey will only become more inviting to IT leaders and other leaders across the organization.

Better yet, it is not just cost that is falling; the complexity of building with these models is decreasing too. Week-long hackathons built on the latest AI models are already showing tangible results, with the prospect of supercharging new initiatives. For IT leaders, it’s all about being open to experimentation. As Naveen puts it, “...let’s say you are doing 100 experiments, and 20 turn out to be amazing and 80 you have to discard. That is 20 more ways to turbocharge your revenue and productivity than you had before. Today, we are seeing a threshold change with AI that we had not seen before.”

From Naveen’s perspective, one of the most positive aspects of the current AI moment is how quickly and broadly the ecosystem moves. While it may seem daunting to some, he believes the opportunity outweighs the risk, and technology leaders should take full advantage of today’s landscape: “[With AI], you’re not marrying for a long term with a company. You are using it for certain use cases, and in many cases, you can have different models for different use cases. You can work with many companies rather than just one company. That would be my advice to CIOs because you’re right, the world is changing very rapidly and, whether ChatGPT’s in the front lead in terms of models, there are so many new companies that are coming up, Anthropic, Cohere, and others, as well as what Google is doing, what Microsoft is doing, what we are doing…or other companies are building their own models.” Given Naveen’s vantage point at Databricks, enterprise CIOs would be wise to heed his optimism and seize the opportunity in front of them.

Listen to Naveen's episode here and read the transcript here.