ElementsGPT Demo Wows Webinar Attendees with Generative AI
Last week, Elements.cloud, an independent software vendor in the Salesforce ecosystem, demonstrated generative AI prototypes in a public webinar sponsored by SalesforceBen. One demo was a GPT-style conversation with the metadata in your org. Another demo recommended org changes and generated user stories based on Elements business process management data. In this article I’ll describe what was presented in the webinar, why I think this raises the stakes in enterprise software, and how some of this magic works. ElementsGPT is currently in an early pilot with select customers. It is due to be available to all Elements customers and consulting partners by Dreamforce 2023.
- ElementsGPT Webinar Summary
- ElementsGPT Demo Excitement
- Customer Reaction: Why click when I can ask?
- Large Language Models and Emergent Properties
- Grounding LLMs with AI Orchestration
- Real Prompt Engineering
- Getting Ready for AI Orchestration
- Salesforce & Generative AI: A Target Rich Environment
- Resources Mentioned
ElementsGPT Webinar Summary
The webinar introduction focused on the future of AI, particularly on automating repetitive and time-consuming tasks, thus enhancing work enjoyment. The discussion revolved around the application of AI in business processes and the emergence of AI tools that can automate tasks with a click of a button.
Ksawery Lisinski, Elements’ presenter and VP of product management, initially addressed numerous managerial and business issues related to the influx of AI news. Ksawery offered a novel analysis of how various people within the Salesforce ecosystem will be influenced by AI over time.
We spent about a third of the webinar time on these issues, plus some surveys. The survey results showed that 66% of participants were excited, 24% were skeptical, and 10% were fearful of Generative AI in their workplace.
ElementsGPT Demo Excitement
The first demo that caught everyone’s attention was Elements’ version of the GPT chatbot. They first showed a metadata chatbot where you can make inquiries like “Show me the metadata used to support commissions.” This capability could be beneficial in numerous administrative and developer scenarios. For myself, this would save visits to the Workbench, awkward searches in VS Code, or a dive into the admin interface. This increases the value of Elements as an org discovery and maintenance tool.
But the user story and recommended changes generation demo was jaw-dropping for many attendees. Here the demo started with a graphical representation of a business process. Then, with the click of a button, a complete draft user story was generated, including a metadata impact analysis. The generated user story and impact analysis may then be used as a template or starting point for a complete user story and action items for doing work.
If users find the generated story helpful, then user story generation may save business analysts hours of time. Generating English-language user stories can also benefit users who speak English as a second language. I am excited about increasing the quality and production speed of user stories. This will make it faster to iterate on business requirements with constituents. I hope that by speeding up the technical end of business process management, end-users will become more engaged in system development.
Customer Reaction: Why click when I can ask?
I had a wide-ranging discussion about Elements, AI, and the future of the architect role with Andrew Russo, a user and architect in the Salesforce ecosystem. He is the enterprise architect for BACA Systems in Orion, Michigan. “Elements is amazing for metadata discovery and figuring out impact analysis,” explains Andrew. But, even with Elements in place, Andrew still finds himself clicking all over a Salesforce org to answer architectural questions.
Andrew is excited for metadata chat to become available. “I can do a full security review in about 40 minutes using the Salesforce admin interface. I can imagine getting that job done in just about five minutes with careful prompt engineering,” added Andrew. We went on to explore the idea of just asking a complex enterprise app like Salesforce to explain itself. That seemed to be something to look forward to as enterprise AI evolves.
Our discussion left us to ponder a new truism in the generative AI world: Why click when you can ask?
Large Language Models and Emergent Properties
Let’s dig into the AI technology used in these demos to understand what Elements is doing. The Large Language Model (LLM) APIs from OpenAI, Anthropic, and others have general-purpose knowledge management and planning capabilities that exceed their creators’ expectations. Sometimes, an LLM will behave unexpectedly and produce results that indicate it has learned something new.
AI scientists say the LLM may have developed emergent properties when it creates new skills based on general knowledge. Researchers have noticed this occurrence not only in language generation, but with generative AI in chemical and biological synthesis research. This is one of the reasons why GPT-4 is so good at writing Salesforce Apex and other specialized languages with little domain-specific training.
Grounding LLMs with AI Orchestration
Another major factor in the growth of AI-powered applications is the ability to ground an LLM in your organization’s proprietary data. Grounding data can be structured, like XML metadata, or unstructured, like a whitepaper PDF. To be compatible with an LLM, grounding data first needs to be vectorized into an intermediate form. This process of behind-the-scenes preparation of data for an AI application is called AI Orchestration.
Orchestration is important right now because an LLM has a limited context window: the amount of text it can hold in working memory at once. Today’s context windows are small enough that they are easy to overrun, so you currently can’t just load all your org’s data into one. Metadata intelligence is required to load only the most relevant vectorized grounding data.
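To make the context-window constraint concrete, here is a minimal sketch in Python. The rough four-characters-per-token estimate and the budget numbers are assumptions for illustration; a real application would use an actual tokenizer such as tiktoken.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token for English text.
    A real application would use a proper tokenizer instead."""
    return max(1, len(text) // 4)

def fits_context(chunks: list[str], context_limit: int = 8192,
                 reserved_for_reply: int = 1024) -> bool:
    """True if all chunks fit in the window with room left for the reply."""
    budget = context_limit - reserved_for_reply
    return sum(estimate_tokens(c) for c in chunks) <= budget

# A whole org's metadata will not fit; a few selected chunks will.
all_metadata = ["<CustomObject>...</CustomObject>"] * 50_000
selected = all_metadata[:20]
print(fits_context(all_metadata))  # False
print(fits_context(selected))      # True
```

This is why orchestration must choose a small, relevant subset of grounding data rather than sending everything.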
AI orchestration and grounding is how Elements plans to bring org metadata and Salesforce Well-Architected content into an LLM context. When you’re working on commissions, for example, ElementsGPT will load the relevant parts of your org’s metadata, plus the latest version of Salesforce’s architectural recommendations, into the context of your application implementation activity. This ensures the most relevant and up-to-date information is loaded and is not restricted by an LLM’s training cutoff date.
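The retrieval side of grounding can be sketched as a similarity search over pre-vectorized chunks. This is a toy illustration, not Elements’ actual implementation: the three-dimensional “embeddings” and chunk names are invented, and a real system would call an embedding API to produce the vectors.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Standard cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def select_grounding(query_vec, chunks, top_k=2):
    """chunks: (text, embedding) pairs vectorized ahead of time.
    Returns the top_k most relevant texts to splice into the prompt."""
    ranked = sorted(chunks, key=lambda c: cosine_similarity(query_vec, c[1]),
                    reverse=True)
    return [text for text, _ in ranked[:top_k]]

# Toy 3-dimensional "embeddings" standing in for real embedding vectors.
chunks = [
    ("CommissionRule__c metadata", [0.9, 0.1, 0.0]),
    ("Case escalation flow",       [0.1, 0.9, 0.0]),
    ("Well-Architected: sharing",  [0.7, 0.2, 0.1]),
]
query = [1.0, 0.0, 0.0]  # e.g. a question about commissions
print(select_grounding(query, chunks))
```

Only the selected chunks are placed in the context window, which is how orchestration keeps the prompt both relevant and within the size limit.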
Real Prompt Engineering
The User Story and recommended changes generation demo showcases an important evolution in how enterprise applications will begin to leverage generative AI. In the case of generating a user story, the user gets a draft story by just pressing a button. There was no chatting or other interaction with the AI required.
This is because ElementsGPT uses formal Prompt Engineering to implement the user story generation and find the org changes required. Behind the scenes, ElementsGPT is still engaging in a chat session with an OpenAI API. The difference is that the prompt being used is extraordinarily detailed, fine-tuned, and proprietary.
My review of AI literature has revealed that prompt engineering, the discipline of crafting, extending, and testing the prompts used with an LLM, can yield more accurate results. I have seen multi-page prompts where the prompt is parameterized with details relevant to the use case. You may have noticed this yourself, where longer, more detailed instructions generate better results with ChatGPT. ElementsGPT is doing something similar with its user story generation.
Not What You See on Twitter
Real prompt engineering, like what ElementsGPT is doing, is a world away from the examples of prompt engineering we see on Twitter. Instead, I have been exploring Google Scholar for superior examples of prompt engineering! This free course from DeepLearning.AI features an OpenAI developer advocate who explains prompt engineering from the developer perspective.
As enterprise AI evolves, I expect there to be more embedded prompt engineering under the hood. Companies will rely on carefully formulated prompts to minimize hallucinations: those cases when a helpful LLM invents credible but false information.
However, asking is still faster than clicking! Because of this, I expect the chat interface to persist. But the chat interface will increasingly become the domain of the architect and other experts. This is because a blank chat window is intimidating for most users.
Getting Ready for AI Orchestration
After last week’s webinar, I had the opportunity to meet with Ian Gotts, CEO of Elements.cloud, and AI specialist Ksawery Lisinski. They were kind enough to share the details of how they are building ElementsGPT. I also wanted to find out what they thought people needed to understand about generative AI.
In our wide-ranging discussion, Mr. Gotts wanted to emphasize the following points:
- Good Prompts: The first point Ian emphasized was the importance of generating good prompts. He mentioned that it’s not necessarily about having a long conversation, but about creating effective prompts for the AI.
- Good Inputs: The second point was about the need for good inputs to generate good prompts. In the AI world, good inputs mean a well-documented business and some form of documentation about how your organization works.
- AI Readiness: Ian stressed that people shouldn’t wait until ElementsGPT is generally available (GA) to start preparing. Instead, they should start thinking about how they can be AI-ready so that when GPT is put on top of their data, they can get answers instantly.
- Documentation: Ian pointed out that if your documentation has nuances, abbreviations that don’t make sense, or if you’re using several different objects for different purposes but never wrote that down, AI is not going to help you. Now is the time to start thinking about the important things in your organization, such as key objects, key flows, and key operational processes.
Salesforce & Generative AI: A Target Rich Environment
Elements.cloud won’t be the only Salesforce ecosystem vendor to jump into the generative AI waters with both feet. I covered a Copado announcement about user story generation earlier this month. Later this week I’ll be covering an announcement from Salesforce DevOps vendor Cloud Compliance about their new GPT spinout, GPTfy, which is the first shipping OpenAI API integration with Sales and Service Clouds. GPTfy incorporates some of the same prompt engineering features we see in ElementsGPT, but with user data instead of metadata.
I expect 3rd party and AppExchange implementations of OpenAI and other AI APIs in Salesforce to explode in 2023. And don’t get me started on the idea of combining Slack with ChatGPT or Anthropic Claude!
It is easy to imagine generative AI impacting nearly every Salesforce ecosystem vendor. These impacts will range from direct integrations of OpenAI APIs into products to improve usability and productivity, to things like SDRs using ChatGPT for personal productivity. These innovations, combined with Salesforce’s stated goal of integrating generative AI into every nook and cranny of their clouds, indicate a new wave of innovation in enterprise software is now well underway. Users should expect to reap the rewards from this innovation soon.