[Header image: two contrasting tech workshops, a warm, user-friendly Salesforce session on one side and an intense, data-driven Google session on the other.]

Salesforce and Google: Contrasting Approaches to Enterprise AI

In recent weeks, I had the privilege of attending groundbreaking hands-on workshops at the headquarters of two Silicon Valley giants, Salesforce and Google. These experiences offered a firsthand look at the contrasting approaches these tech titans are taking to bring enterprise-grade, large language model (LLM) applications to scale. As I immersed myself in the cutting-edge world of AI development, I couldn’t help but feel a sense of excitement and awe at the history unfolding before my eyes.

The workshops provided a fascinating glimpse into the future of enterprise software, where AI is not just a buzzword but a transformative force reshaping how businesses operate. Salesforce and Google, each with their unique strengths and philosophies, are at the forefront of this revolution, pushing the boundaries of what’s possible with LLMs and retrieval-augmented generation (RAG). As I navigated through the hands-on exercises and engaged with the brilliant minds behind these innovations, I knew I was witnessing a pivotal moment in Silicon Valley and computer history.

Salesforce: Low-Code, Business User-Friendly

At the “Build the Future with AI and Data Workshop” held at Salesforce Tower in downtown San Francisco, the focus was on empowering business users with a low-code, clicks-not-code approach. The workshop, attended by around 100 people, took place in a ballroom-sized auditorium. Each attendee received a free Generative AI-enabled org, pre-populated with a luxury travel destination application and set to expire after five days.

Data Cloud: Lots of Clicks

The workshop began with setting up data ingestion and objects to link AWS S3 buckets to Salesforce’s Data Cloud. The process was intricate, introducing new nomenclature reminiscent of nested SQL views and requiring a considerable number of setup steps before we could reach Prompt Builder.

I should note that when using Einstein Studio for the first time, you don’t normally need to do Data Cloud setup. We did it in this workshop so we could later include Data Cloud embeddings in a Prompt Builder retrieval.

Prompt Builder: Easy to Use

Prompt Builder was the highlight of the workshop. It allows for template variables and various prompt types, including the intriguing Field Prompt, which enables users to attach a prompt to a field. When editing a record, clicking the wizard button in that field executes the prompt, filling out the field automatically. This feature has the potential to greatly enhance data richness, with numerous use cases across industries.

Integrating Flow and Apex with Prompt Builder demonstrated the platform’s flexibility. Using Code Builder, we created an Apex class that returned a list, which Prompt Builder then merged into the prompt to formulate a reply. The seamless integration of these components showcased Salesforce’s commitment to providing a cohesive, user-friendly experience.
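
To make that pattern concrete, here is a minimal Python sketch of what the declarative pieces are doing conceptually: a data-provider function standing in for the Apex class, a template with merge fields standing in for the Prompt Builder template, and an LLM call whose output fills a record field, as a Field Prompt would. Every name in it is hypothetical; the real feature is clicks, not code.

    # Illustrative sketch only; the actual Salesforce features are declarative.
    # All function, field, and template names here are hypothetical.

    def get_recommended_activities(destination: str) -> list[str]:
        # Stands in for the Apex class built in Code Builder, which returned
        # a list for Prompt Builder to merge into the prompt.
        return ["sunset catamaran cruise", "private vineyard tour", "spa day"]

    def call_llm(prompt: str) -> str:
        # Stands in for the model invocation behind Prompt Builder.
        return "(model response would appear here)"

    TEMPLATE = (
        "Write a two-sentence description of {destination} for a luxury traveler. "
        "Mention these activities: {activities}."
    )

    record = {"destination": "Amalfi Coast", "description": ""}

    # A Field Prompt attaches the template to the description field; clicking the
    # wizard button merges record data into it and writes the response back.
    prompt = TEMPLATE.format(
        destination=record["destination"],
        activities=", ".join(get_recommended_activities(record["destination"])),
    )
    record["description"] = call_llm(prompt)
    print(record["description"])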

Einstein Copilot, Salesforce’s AI assistant, exhibited out-of-the-box capabilities when integrated with custom actions. By creating a Flow and integrating it into a custom action, users could invoke Einstein Copilot to assist with various tasks.

A Warmly Received Roadmap

Salesforce managers, including SVP of Product Management John Kucera, provided insights into the Generative AI roadmap during a briefing session. They emphasized upcoming features such as Recommended Actions, which package prompts into buttons, and improved context understanding for Einstein Copilot. The atmosphere in the room was warm, with genuine excitement and a sense of collaboration between Salesforce staff and attendees.

The workshop positioned Salesforce’s AI solution as an alternative to hiring an AI programmer and building AI orchestration with tools like those I used in the Google workshop. Salesforce’s approach focuses on a user-friendly interface for setting up data sources and custom actions, enabling users to leverage AI without relying on code. This low-code philosophy aims to democratize AI, making it accessible to a broader range of business users.

For organizations already invested in the Salesforce ecosystem, the platform’s embedded AI capabilities offer a compelling way to build expertise and leverage the power of Data Cloud. Salesforce’s commitment to rapidly rolling out embedded AI enhancements, all building on the familiar Admin user experience, makes it an attractive option for businesses seeking to adopt AI without the steep learning curve associated with coding.

While there was palpable enthusiasm among attendees, the workshop also highlighted the complexity of setting up data sources and the challenges of working with a new nomenclature. As Salesforce continues to refine its AI offerings, striking the right balance between flexibility and ease of use will be crucial to widespread adoption.

Google: Engineering-Centric, Code-Intensive

The “Build LLM-Powered Apps with Google” workshop, held on the Google campus in Mountain View, attracted around 150 attendees, primarily developers and engineers. We met in a large meeting room with circular tables. The event kicked off with a keynote presentation and detailed descriptions of Google’s efforts in creating retrieval-augmented generation (RAG) pipelines. We participated in a hands-on workshop, building a RAG database for an “SFO Assistant” chatbot designed to assist passengers at San Francisco airport.

Running Postgres and pgvector with BigQuery

Using Google Cloud Platform, we created a new VM running Postgres with the pgvector extension. We executed a series of commands to load the SFO database and establish a connection between Gemini and the database. The workshop provided step-by-step guidance, with Google staff helping when needed. Ultimately, we successfully ran a chatbot utilizing the RAG database.
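
For readers who want the flavor of that exercise, the sketch below compresses the same RAG pattern into Python: embed the question, pull the nearest rows from a pgvector table, and hand them to Gemini as context. The connection string, table and column names, and model identifiers are my own assumptions rather than the workshop’s exact assets, and the lab itself worked mostly through shell commands and Vertex AI rather than this client library.

    # Minimal RAG sketch: pgvector retrieval feeding a Gemini prompt.
    # DSN, table, columns, and model names are assumptions, not the lab's assets.
    import os
    import numpy as np
    import psycopg2
    import google.generativeai as genai
    from pgvector.psycopg2 import register_vector

    genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

    conn = psycopg2.connect("dbname=sfo user=postgres")  # hypothetical database
    register_vector(conn)  # let psycopg2 send/receive the vector type

    question = "Where can I find a quiet place to work near Terminal 2?"

    # 1. Embed the passenger's question.
    q_emb = genai.embed_content(model="models/text-embedding-004",
                                content=question)["embedding"]

    # 2. Retrieve the closest documents by vector distance.
    with conn.cursor() as cur:
        cur.execute(
            "SELECT content FROM sfo_docs ORDER BY embedding <-> %s LIMIT 5",
            (np.array(q_emb),),
        )
        context = "\n".join(row[0] for row in cur.fetchall())

    # 3. Ask Gemini to answer from the retrieved context only.
    model = genai.GenerativeModel("gemini-1.5-flash")
    reply = model.generate_content(
        f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    )
    print(reply.text)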

The workshop also showcased the power of BigQuery in generating prompts at scale through SQL statements. By crafting SQL queries that combined prompt engineering with retrieved data, we learned how to create personalized content, such as emails, for a group of customers in a single step. This demonstration highlighted the potential for efficient, large-scale content generation using Google’s tools.
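
As a rough illustration of the prompts-at-scale idea, the sketch below uses the BigQuery Python client to run ML.GENERATE_TEXT over a customer table, composing one personalized email prompt per row in a single query. The project, dataset, remote model, and table names are placeholders, not what we used in the session.

    # Sketch: generate a personalized email draft for every customer in one query.
    # Project, dataset, remote model, and table names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()

    sql = """
    SELECT
      customer_name,
      ml_generate_text_llm_result AS email_draft
    FROM ML.GENERATE_TEXT(
      MODEL `my_project.my_dataset.gemini_model`,
      (
        SELECT
          customer_name,
          CONCAT(
            'Write a short, friendly email inviting ', customer_name,
            ' to book a luxury trip to ', favorite_destination, '.'
          ) AS prompt
        FROM `my_project.my_dataset.customers`
      ),
      STRUCT(0.4 AS temperature, 256 AS max_output_tokens, TRUE AS flatten_json_output)
    )
    """

    for row in client.query(sql).result():
        print(row.customer_name, "->", row.email_draft)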

Gemini Assistant

One of the most exciting discoveries for me during the workshop was the Gemini Assistant for BigQuery, a standout IT Companion Chatbot tailored for the GCP ecosystem. Comparable to GitHub Copilot Chat or ChatGPT Plus, Gemini Assistant demonstrated a deep understanding of GCP and the ability to generate code snippets in various programming languages. What distinguishes Gemini Assistant is its strong grounding in GCP knowledge, enabling it to provide contextually relevant and accurate responses.

During the workshop, I had the opportunity to interact with Gemini Assistant firsthand. I was impressed by its ability to generate Python code and complex BigQuery SQL statements from simple text descriptions. This level of sophistication and contextual awareness has the potential to revolutionize how developers and engineers work within the GCP ecosystem, boosting productivity and simplifying complex tasks. Moreover, Gemini Assistant often provides sources, such as blog posts or GitHub repositories, to support its answers, enhancing confidence in its outputs.

Product Presentations

In addition to the hands-on workshop, we saw product presentations covering Vertex AI, Gemini Chat, Model Garden, and Anthropic’s Claude. These presentations offered insights into the latest advancements in Google’s AI ecosystem and its collaborations with leading AI companies. Arize Phoenix, an open-source project, introduced the “five pillars of LLM Observability,” underscoring the nascent stage of LLM app development and the critical importance of monitoring and understanding the behavior of these complex systems.

The Google workshop highlighted the company’s engineering-centric approach to AI development, emphasizing the use of code and CLI/API interactions to set up and operate systems. This contrasts with Salesforce’s low-code, business user-friendly approach, catering to different user personas and skill sets. However, both companies face the shared challenge of securing RAG embeddings, with Google actively developing new RAG security features in AlloyDB (its managed PostgreSQL-compatible database service) to address this concern.

Lots of Builders and Data Scientists

Throughout the event, I had the opportunity to connect with professionals from various Silicon Valley giants, including HP, Cisco, and Apple. These interactions revealed a wide array of applications under development, ranging from customer chatbots and finance administration tools to enterprise search solutions. The diverse use cases and the palpable enthusiasm among attendees underscored the growing importance and potential of enterprise AI.

Attending the Google workshop provided valuable insights into the company’s cutting-edge AI technologies, its commitment to empowering developers, and its collaborative approach to driving innovation. As Google continues to refine its AI offerings and address key challenges, such as data security and observability, it is well-positioned to play a pivotal role in shaping the future of enterprise AI.

Analysis: Different Approaches, Common Challenges

While Salesforce and Google cater to different audiences – business users and engineers, respectively – both face the critical challenge of securing RAG embeddings. Google is addressing this with new RAG security features in AlloyDB, while Salesforce’s Einstein Trust Layer leverages existing metadata and security models for CRM data. In this competitive landscape, Salesforce’s decades of experience implementing metadata and enterprise security models give it a leg up on the basic architecture required to secure RAG embeddings.
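
Neither company walked through implementation details, but the underlying requirement is that retrieval respect the same record-level permissions as the rest of the data. Below is a generic sketch of that idea, filtering a Postgres vector search by a per-user access list; it is one common pattern, not AlloyDB’s or the Einstein Trust Layer’s actual mechanism, and the documents table and allowed_users column are hypothetical.

    # Generic sketch: restrict a vector search to rows the requesting user may see.
    # The documents table and allowed_users column are hypothetical placeholders
    # for whatever sharing model the platform actually enforces.
    import numpy as np
    import psycopg2
    from pgvector.psycopg2 import register_vector

    def secure_retrieve(conn, query_embedding: np.ndarray, user_id: str, k: int = 5):
        register_vector(conn)  # ensure psycopg2 can send the vector parameter
        with conn.cursor() as cur:
            cur.execute(
                """
                SELECT content
                FROM documents
                WHERE %s = ANY(allowed_users)  -- enforce sharing before ranking
                ORDER BY embedding <-> %s      -- then rank by vector distance
                LIMIT %s
                """,
                (user_id, query_embedding, k),
            )
            return [row[0] for row in cur.fetchall()]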

Salesforce’s low-code approach prioritizes ease of use, while Google’s engineering-centric model offers flexibility and scale. Google’s massive data processing capabilities give it an edge for handling large volumes of inferences, while Salesforce’s strength lies in its deep integration with the Salesforce ecosystem.

However, a notable omission in Salesforce’s offerings is the lack of an IT Companion Chatbot. While Salesforce ISVs like Copado, Elements.cloud, and Metazoa are filling this gap, Salesforce’s absence in this area is concerning. An IT Companion Chatbot is crucial for alleviating the cognitive load of IT professionals, providing real-time support and solutions in complex environments like Salesforce and DevOps.

The presence of a Googler at the Salesforce event, advocating for the integration of Gemini into Salesforce’s Model Builder, highlights the recognition of Google’s superior AI capabilities. Although Salesforce management’s response suggested openness to collaboration, it also hinted at the challenges Salesforce faces in keeping pace with AI advancements.

Looking Ahead

The workshops at Salesforce and Google showcased the rapid pace of innovation and the growing importance of AI in shaping the future of enterprise software. As these tech giants continue to evolve their AI offerings, the focus on data security, observability, and ease of use will be paramount.

Salesforce’s low-code, user-friendly approach, and deep understanding of enterprise needs position it well, but the lack of an IT Companion Chatbot remains a significant gap. Google’s technological superiority and vast resources give it a clear advantage, but Salesforce’s ecosystem and customer relationships could help offset its limitations.

As a long-time Bay Area denizen, witnessing this pivotal moment in Silicon Valley and computer history is both exciting and thought-provoking. The race is on to provide the most compelling and trustworthy solutions for enterprise AI adoption, and the choices made by Salesforce, Google, and other players will shape the future of work for years to come.
