Ada Support

The ultimate guide to generative AI for customer service

Sarah Fox
Content Producer
AI & Automation | 30 min read

Thanks to generative AI, customer service has quickly scaled the corporate ladder and become a top executive priority. In fact, 85% of execs say generative AI will be interacting directly with customers in the next two years. As a result, 63% say that they will have invested in generative AI use cases that serve their agents by the end of 2023. 

If you're in customer service, you're probably wondering what AI is, how it applies to customer service, and if it will take your job. We're here to help you separate the buzz from reality.

If you’re looking to understand exactly how AI can transform customer service and provide the ROI leadership is looking for, you’re in the right place. Before we get into that, let’s start with the basics.

What is generative AI?

Generative AI is a subset of AI that creates (or “generates”) new content, data, or outputs. Generative AI systems learn and analyze patterns from existing data and use those patterns to produce outputs that resemble human-created content — everything from image generation, text, data augmentation, music composition, and more.

With all the hype we’ve seen around generative AI in the past year, it may seem like flashy new technology, but generative AI dates all the way back to the 1950s and 1960s. That being said, it wasn’t until 2014, with the introduction of a specific type of machine learning, generative adversarial networks (GANs), that generative AI applications were actually able to create novel content outputs.

Generative AI vs AI

And other commonly used and misconstrued terms

Disclaimer: In case you were wondering, yes, generative AI did help in the making of this blog post. In our research, we found that the definition of generative AI differs depending on which web page you land on and which dictionary you use. Some consider it a category of AI algorithms. Others use generative AI to describe deep-learning models and fit it under the broader category of machine learning. Another common definition describes it as AI designed to process prompts from users and respond with outputs modeled on a training data set. We used generative AI, more specifically ChatGPT, to help us compare, contrast, and define technical terms.

AI is often used as a blanket term to describe various advanced computer systems. While AI and generative AI are related, they refer to different aspects within the broader field of artificial intelligence. Some even refer to AI, generative AI, machine learning, and large language models (LLMs) synonymously, so it’s important to learn the differences. First, let’s differentiate AI from generative AI.

AI refers to the broader field of computer science focused on creating intelligent machines that can perform tasks typically performed by a human. AI can perform tasks like speech recognition, problem-solving, perception, and language understanding. It can also be designed for more specific tasks, like diagnosing medical conditions or playing chess. AI draws on a wide range of techniques, including machine learning, natural language processing, and expert systems. AI can analyze data and intelligently respond to what it sees, but it is limited to this function.

Generative AI, on the other hand, takes it a step further by using data to create content that’s entirely new and in various formats. Using machine learning models, generative AI produces original content based on patterns learned from existing data.

Both the fields of AI and generative AI continue to evolve and advance. Now, let’s go over some other terms you’ve likely seen floating around in relation to generative AI.

Machine learning is a subfield or method used in AI research. It involves the development of algorithms and models that enable computer systems to learn from data and make predictions or decisions. Machine learning systems learn from examples and adjust based on the data they're exposed to, allowing them to improve their performance over time without being explicitly programmed to do so.

Large language models (LLMs) are the newest and least clearly defined concept. A large language model, like OpenAI’s GPT-3, is a type of machine learning model trained on a large amount of text; it specializes in understanding and generating text to produce natural-sounding replies.

Deep learning is a subfield of machine learning that focuses on training neural networks to handle more complex patterns than traditional machine learning. 

Neural networks are inspired by the interconnected neurons of the human brain. They consist of interconnected units that process and transform input data into meaningful outputs, and they learn by being shown lots of examples, improving as they take in more information over time. They’re used for various tasks like pattern recognition, classification, regression, and more.

How does generative AI work?

Generative AI uses machine learning to analyze common patterns and arrangements in large sets of data, and then uses this information to generate new content similar to the existing data it's trained on. The more data or examples generative AI has to learn from, the more sophisticated it becomes.

Here’s a quick overview of the generative AI process:

Data collection and preprocessing: A dataset of relevant examples is collected — text, images, or any other type of data that the model is intended to generate. This data is preprocessed to ensure consistency and accuracy.

Model training: There are various models used in generative AI, chosen based on the nature of the data and the desired type of content generation. The most common models include:

  • Generative Adversarial Networks (GANs): A machine learning model in which two neural networks are trained simultaneously and compete with each other, each becoming more accurate in its predictions as a result.
  • Variational Autoencoder (VAEs): A VAE is an AI algorithm that encodes and decodes information. To do this, it maps large sets of information into smaller sets or representations. These smaller representations are called the latent space. The original information is hidden in the compressed representation in the latent space, which allows the VAE to decode — to reconstruct an image for example — by observing the latent space.
  • Autoregressive Model: Autoregressive models are used in machine learning to predict future behavior based on past behavior data.

Sampling from the model: Once the model is trained, you can start using it to generate new content. Provide a random input or seed to the model, and it will use its learned patterns to produce new data that follows similar characteristics as the training data.
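To make the training-then-sampling loop concrete, here's a toy sketch that uses a simple bigram (word-pair) model instead of a real neural network. The idea is the same as described above: learn which patterns appear in the training data, then generate new output that follows those patterns. The tiny corpus is made up purely for illustration:

```python
import random
from collections import defaultdict

def train_bigram_model(corpus):
    """Learn which word tends to follow which: the 'patterns' in the data."""
    model = defaultdict(list)
    for sentence in corpus:
        words = sentence.split()
        for current, following in zip(words, words[1:]):
            model[current].append(following)
    return model

def generate(model, seed, max_words=8, rng=None):
    """Sample new text that follows the statistical patterns of the corpus."""
    rng = rng or random.Random(0)
    output = [seed]
    for _ in range(max_words - 1):
        candidates = model.get(output[-1])
        if not candidates:  # no learned continuation: stop generating
            break
        output.append(rng.choice(candidates))
    return " ".join(output)

corpus = [
    "the agent resolved the ticket",
    "the bot resolved the inquiry",
]
model = train_bigram_model(corpus)
print(generate(model, "the"))
```

The seed input plays the role of the "random input or seed" mentioned above: the model only ever emits words it saw following the current word in training, which is a miniature version of producing "new data that follows similar characteristics as the training data."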

Fine-Tuning and Exploration: Depending on the application, you might fine-tune the model's parameters to adjust the quality or style of the generated content. You can explore the model's capabilities by trying different inputs, altering parameters, and experimenting with the generated outputs.

Evaluation: Evaluate the generated content based on various criteria such as realism, coherence, relevance, and aesthetics. Iteratively improve the model based on feedback and evaluation results.

Ethical and Bias Considerations: Be aware of potential ethical concerns such as bias, misinformation, and potential misuse of the generated content. Take steps to mitigate these concerns and ensure responsible use of generative AI.

Why is generative AI important?

While many continue to view generative AI with a degree of healthy skepticism, there’s no question that it has the potential to do extraordinary things.

"Investment in AI is predicted to increase by more than 300% over the next year."

- Forrester

Its ability to quickly create new and, honestly, rather impressive content drives further creativity and innovation — pushing the boundaries of human imagination in the generation of art, music, literature, and more. The fact that it can produce vast amounts of content so quickly, with a personalized touch, makes it useful across a variety of industries like marketing, ecommerce, and entertainment.

Generative models can simulate real-world scenarios, and this makes for a highly valuable tool for scientific research, engineering, and risk assessment. It can also suggest novel molecular structures, simulate protein folding and other complex biological processes to aid medical and scientific advancements.

"More than 100 million people in the U.S. will use generative AI by 2024. By 2025, that number is expected to reach 116.9 million."

- Hootsuite

Generative AI is also useful in data augmentation, AI research, and data imputation and denoising.

How does generative AI impact customer service?

People are speculating that AI is about to fundamentally change the way we do business. When it comes to customer service, it already has. By 2024, the global chatbot market is expected to reach $994 million.

As customer service leaders and employees, you’re probably still wondering exactly how generative AI can affect your customer service organization or your position on the team, and we don’t blame you. We’re here to help you make sense of it — and eliminate any notions that AI is going to steal your job.

While it is true that, like previous iterations of AI, generative AI will take over some customer support tasks, it’s also paving the way for new opportunities. Rather than diminishing the role of customer service professionals, generative AI will actually amplify the importance of human input.

"68% of workers say generative AI will help them better serve their customers."

- Salesforce

When generative AI and customer service teams work together in harmony, a magical thing happens: you’re automatically resolving more customer service inquiries with less human effort.

It’s also giving customer support teams the opportunity to actually evolve their careers, and go from roles like customer service agent to bot manager, or customer support advocate to conversational AI specialist.

Here’s how you can transition to AI-first and ensure AI matures over time alongside the investment and restructuring of your customer service team:

Phase 1: Deploy

Generative AI can scrape support documentation and use existing content to create and deliver answers to customer questions without manual training. This allows customer service teams to launch conversational AI in hours instead of weeks, shifting employee focus to identifying more automation opportunities, building action flows, optimizing, and auditing transcripts.

That means you don’t need a technical person to be an automation or bot builder. Instead, identify your best-performing agent and transition them into the role of Bot Specialist. Empower the Bot Specialist with the tools to take their career further — train and deploy the AI chatbot across support channels and establish Automated Customer Experience (ACX) benchmarks and KPIs.

To ensure success in this initial phase, ensure you have your people, technology, and strategy in place. This looks like:

People: Onboard the Bot Specialist and introduce them to the wider customer service organization as the AI chatbot partner who will be training and deploying the chatbot, as well as learning from partners' best-practice recommendations and generative AI expertise. Partners include:

  • A customer support leader, who they will report to on AI chatbot implementation and content management
  • The education team as a major stakeholder in knowledge base content management
  • The product/engineering team for integrations and embedding the chatbot across the website

Technology: Connect the generative AI chatbot to an agent platform and a knowledge base tool.

Strategy: Generative AI implementation needs a firm and consistent strategy. Here are some steps for a Bot Specialist to follow:

  • Check knowledge base documentation to ensure it's accurate, up to date, and optimized for generative AI
  • Set up weekly check-ins with the education team to surface any content gaps or opportunities found in chatbot transcripts
  • Set quarterly goals based on customer service OKRs
  • Provide monthly reporting against the goals to their manager

Phase 2: Learn

Once this initial setup is done and the AI chatbot is up and running, it's time to start paying close attention to bot analytics and insights. Work on implementing deeper integrations that power complex actions and increase automated resolutions.

At the same time, with generative AI not only writing content but helping improve and create variations of content, an employee in a Bot Specialist role could be promoted to a Bot Manager or work alongside a Bot Manager. The Bot Manager will continue to focus on the AI chatbot and content optimization. They’ll spend their time writing more high-value automation flows and ensuring content is consistent and on-brand.

Bot Managers will also want to partner with the education and engineering teams, as well as the product team, to report customer problems and trends and propose solutions for improvement. They will also oversee integrations with the CRM and connect the generative AI chatbot to any necessary software via API endpoints.

The Bot Manager's goal will be to configure chat to be the primary support channel, and in support of this, they will need to share AI insights across the company to get other teams thinking about how they can leverage this data and also become AI-first.

Phase 3: Improve

In phase three, the Director of ACX will be leading the charge, aligning with leaders across the company and championing the customer to influence core business decisions. They will go deeper into the AI chatbot strategy, ensuring that it's leveraging machine learning and that the team is integrating AI guidance and expanding AI capabilities within customer service.

The Director of ACX will oversee the project planning and implementation of new support programs, develop a long-term automation roadmap, and work cross-functionally to deliver customer insights to business development teams. And while they’re focusing on high-level KPIs and elevating the role of the customer service organization within the company, AI is automating complex use cases and automatically leveraging customer data to make informed decisions.

The benefits of generative AI for customer service

Generative AI can enhance the customer experience while improving efficiency for the business at unparalleled speed.

"By 2023, AI technologies are projected to increase business productivity by up to 40%."

- Forbes

AI-powered chatbots can quickly and accurately understand heterogeneous data compiled from different sources — like customer service transcripts or feedback surveys — at the speed and scale that’s needed, especially during periods of rapid growth. For the customer this means fast and consistent support around the clock. Here are some additional benefits:

  • Reduce operational costs
  • Offer personalized recommendations and solutions based on customer data
  • Resolve more inquiries at scale
  • Analyze data for insights
  • Provide multilingual support

Generative AI can instantly generate reports on customer insights, ensuring no inquiry, conversation, complaint, or suggestion gets lost or disregarded. Clear visibility into this data invites C-suite leaders to reconnect with the customer, closing the feedback loop and bringing CX organizations closer to the customer than ever before.

"Chatbots are expected to save businesses up to 2.5 billion hours of work."

- Juniper Research

Examples of generative AI for customer service

With the benefits of generative AI for customer service top of mind, you're probably wondering which companies are actually putting it to work and in what ways. Here are some key industries where companies are already successfully deploying generative AI chatbots in their customer service strategies. 

Travel and hospitality

With generative AI chatbots, airlines like Delta are able to provide detailed responses to travel, booking, and in-flight service inquiries at lightning speed, while also enabling customers to take action: checking in, tracking bags, and finding flights. Airports like Heathrow International use generative AI to reply to service queries and automatically summarize cases, saving agents time and effort and boosting productivity. And travel sites like Expedia are integrating generative AI — in this case, ChatGPT — into their mobile apps to provide a more conversational trip planning experience, with the generative AI bot providing recommendations and travel assistance.


Ecommerce and retail

Generative AI is making it easier for shoppers to find the products they're looking for. For global retail giant H&M, a generative AI chatbot has reduced response times by 70% over human agents, and shoppers can now access a generative AI-powered voice assistant in the mobile app. This improves the customer experience while significantly reducing the load on its customer service team.

Google's shopping service is often where the hunt for that perfect item begins, and now they've introduced a new generative AI-powered "try on" feature that allows shoppers to see how clothes look on a model with a similar body shape, skin tone, and size. Using a technique called diffusion, the company’s new generative AI engine is able to take a single image of a piece of clothing and realistically adapt it to various body shapes, showing how it will hang, drape, and fold. 

Healthcare, financial service, and more

Healthcare companies like SmileDirectClub are using generative AI to listen to and summarize customer calls to save agent time and improve the overall customer experience. Financial services company Wealthsimple uses a generative AI chatbot to answer FAQs and allow customers to quickly access their financial information, while also giving relevant guidance on money management. And productivity tool ClickUp uses its generative AI chatbot to give people access to instant product information and a seamless handoff to the right specialist when necessary.

Other examples of how generative AI can be used for customer service include drafting detailed email responses in record time, which resulted in 18% higher customer happiness scores for Octopus Energy.

Truth is, we’re just scratching the surface of what generative AI can do for customer service organizations — and the technology is advancing quickly. The more we use it, the smarter and more efficient it will become. Let's take a closer look at what you can and can't do with generative AI for customer service today.

What you can and can't do with generative AI for customer service

Smart customer service leaders recognize that generative AI isn’t a magic solution — it requires the proper approach and a degree of scrutiny. 

First, let’s explore what generative AI can do for customer service today.

Speed up content creation

Companies with the right AI-powered platform can use AI as a writing and building assistant. The AI assistant can source information from existing content and develop first drafts of chat flows to speed up time to value. Accelerating the pace of bot building allows CX leaders to launch a chatbot quickly and start making an impact with their customers.

It’s important to remember, while AI is there to help speed up the process, there still needs to be a human in the loop — someone to review that what’s been generated is safe, accurate, and helpful.

Reinvent the customer service organization

With generative AI taking some of the pressure off automation building, there’s more opportunity for CX teams to reorganize their teams and put their customer support staff in more strategic positions.

AI may be the ship, but you still need people to steer. Now, there’s more emphasis on the analytical side — digging into which interactions are working and which ones are underperforming, and then, determining different flows or interaction frameworks that can mitigate these concerns and get customers the answers they’re looking for.

It’s providing more clarity on how customer service organizations should be structured and opening up new career paths, allowing bot builders to evolve into true bot managers and for those managers to leverage other skills and empower other individuals.

Offer built-in conversational design best practices

Advances in generative AI are making automation more accessible to people without previous experience in this space, with conversational design best practices built into the responses and content it’s generating.

Bot managers can essentially type out a few bullet points, and the AI assistant can use those built-in best practices to reformat them for the channel the conversation is happening on — email, web chat, SMS, phone call, or something else.

Now let’s come to terms with what generative AI can’t do for customer service.

Set it and forget it

If you’re onboarding generative AI to your customer service, you need to make sure you have a deep understanding of how this technology works and what kind of guardrails you need to set up around it. You need to put a process in place to validate that the generated output is safe, accurate, and helpful.

Best practices for deploying generative AI in customer service

At this point, you might be itching to find out how to get started. Machine learning experts suggest you approach generative AI implementation with a human perspective and onboard the model like you would a new employee.

There’s an art to how you build, deploy, and grow automation over time. This is most often referred to as your automation strategy, but you could also consider this automation maturity.

There are three critical (and chronological) best practices to follow to deploy generative AI in customer service:

1. Automate your most common FAQs with a generative approach

With generative AI, you don’t have to waste time manually building out the answers to FAQs, so this is the best place to start. AI can crawl your support documentation and generate these answers for you, but you first need to mitigate the risk that your AI will resurface irrelevant, inaccurate, or harmful responses.

It starts with the knowledge base. Ensure all the information in your knowledge base is accurate and up to date. Then, train the AI to use it effectively. You should:

  • Connect to existing knowledge sources: Ensure your support documentation is centralized and works as the single source of truth for your agents and customers.
  • Create and maintain a brand persona: Establish a consistent brand persona that aligns with your company’s values and messaging. Apply it everywhere.
  • Speed up content creation: Use the AI assistant to accelerate the pace of building automated flows.
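To illustrate the "single source of truth" idea from the steps above, here's a toy retrieval sketch: score each knowledge base article by word overlap with the customer's question, and answer only from the best match. Real platforms use far more sophisticated matching (such as embeddings), and the article titles and text below are invented for the example:

```python
# Hypothetical, centralized knowledge base: one article per topic.
KNOWLEDGE_BASE = {
    "How to pay by credit card": "Yes, you can pay by credit card in your account settings.",
    "How to track your order": "Track your order from the Orders page using your order number.",
}

def best_article(question):
    """Pick the article whose title shares the most words with the question."""
    q_words = set(question.lower().split())

    def overlap(title):
        return len(q_words & set(title.lower().split()))

    title = max(KNOWLEDGE_BASE, key=overlap)
    return title if overlap(title) > 0 else None

def answer(question):
    """Answer strictly from the knowledge base; escalate when nothing matches."""
    title = best_article(question)
    if title is None:
        return "I'm not sure. Let me connect you with an agent."
    return KNOWLEDGE_BASE[title]

print(answer("Can I pay by credit card?"))
```

Because every reply is drawn from a single centralized source, updating an article in one place updates what the bot says everywhere, which is exactly why the documentation needs to be accurate before the AI is pointed at it.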

2. Integrate across systems to power personalization and more complex use cases

Chatbots need to do more than just provide information; they need to take action on customers’ behalf. To do this, AI needs to be connected with other systems from across the organization. Develop an API strategy using the tips below:

  • Ensure your team has access to a technical resource to help pull information like account balance, order status, account type, and so on. This allows the AI to take more action, like processing account upgrades, changing an address for an order delivery, and reviewing the account status.
  • Ensure your systems can give AI the information it needs by (1) mapping out which actions you want your customers to be able to take in your bot, (2) digging into where this data lives today and auditing the relevant APIs, with the help of a technical resource, to evaluate which options are available to pass information from one system to another, and (3) prioritizing deployment of each API integration, informed by data and potential ROI.
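As a rough sketch of what such an integration might look like, here's a hypothetical order-status action: the bot maps a recognized intent to an API call, then turns the structured response into a reply. The `fetch_order_status` function is a stand-in for a real call to your order-management system, and the order IDs and statuses are made up:

```python
def fetch_order_status(order_id):
    """Stand-in for a real API call to an order-management system."""
    fake_orders = {"A-1001": "shipped", "A-1002": "processing"}
    return fake_orders.get(order_id)

def handle_order_status_intent(order_id):
    """Turn the API's structured response into a customer-facing reply."""
    status = fetch_order_status(order_id)
    if status is None:
        # Graceful escalation when the system has no data for this request.
        return "I couldn't find that order. Let me connect you with an agent."
    return f"Order {order_id} is currently {status}."

print(handle_order_status_intent("A-1001"))
print(handle_order_status_intent("Z-9999"))
```

The escalation branch matters as much as the happy path: when the connected system can't answer, the bot should hand off to a human rather than guess.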

3. Elevate generative AI using analytics and insights to continually optimize automation

Over time, you’ll want to progress from simply measuring Automated Resolution to improving it. Generative AI gives you faster and more accurate insight into how you can do this, and it can also create new answers to unforeseen questions dynamically. Here are some best practices to take generative AI to this level:

  • Remove knowledge silos: Key sources of human expertise should be involved in this process to translate institutional knowledge from human expert to AI. Identify the experts, set up routing technology to connect them when needed, and set up the tooling that allows them to offer their expertise to train the AI.
  • Analyze and optimize: Transition from policy and content review to analysis and insights, make your bot manager the bridge between cross-functional teams, and position your bot manager as a trailblazer and subject matter expert on AI.

Knowledge base best practices for generative AI

1. Mutually exclusive and collectively exhaustive ontology

Thoughtful planning and preparation of your knowledge base architecture upfront will save you time and effort in the future. This architecture can also be referred to as an “ontology.”

Categories at each level of the knowledge tree should be mutually exclusive and collectively exhaustive. With a mutually exclusive and collectively exhaustive ontology, the AI will be able to find the answer to a large percentage of customer questions, and deliver it with a high degree of confidence.

Let’s break down those terms:

  • Mutually exclusive means that no two categories contain knowledge that overlaps. Having mutually exclusive categories ensures that you have a single source of truth for a piece of information. This removes the risk of inconsistency, and if you ever need to update the information, you only ever have to update it in one place.
  • Collectively exhaustive means that all together, the categories cover all the information that your customers need to know or may ask about.
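These two checks can even be automated. Here's a minimal sketch, assuming each category is represented as a set of topic tags; the category names and topics below are invented for the example:

```python
def audit_ontology(categories, required_topics):
    """Flag overlapping categories (not mutually exclusive) and
    uncovered topics (not collectively exhaustive)."""
    overlaps = {}
    names = list(categories)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            shared = categories[a] & categories[b]
            if shared:
                overlaps[(a, b)] = shared
    covered = set().union(*categories.values()) if categories else set()
    missing = required_topics - covered
    return overlaps, missing

categories = {
    "Billing": {"refunds", "invoices"},
    "Shipping": {"tracking", "refunds"},  # "refunds" also lives under Billing
}
required = {"refunds", "invoices", "tracking", "returns"}
overlaps, missing = audit_ontology(categories, required)
print(overlaps)  # topics filed in two places: fix for mutual exclusivity
print(missing)   # topics filed nowhere: fix for collective exhaustiveness
```

Running a check like this before connecting the AI helps ensure every customer question has exactly one home in the knowledge tree.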

2. Precise and exhaustive titles for clear context setting

If you have multiple sections on the same subject, with the same title, it will become very difficult for AI to crawl the information and give a customer an accurate answer. Titles and section headers should become increasingly precise as a customer goes down a branch of the knowledge tree.

Fun fact: descriptive titles are always a better bet than questions. But if your knowledge base is already organized as a list of questions and answers, ensure the answers are self-contained. For example, “Can I pay by credit card?” → “Yes, you can pay by credit card”, rather than simply “Yes.”

If many knowledge base articles describe similar topics, ensure the distinction is made in the titles and article body — this avoids information being used out of context when generating an answer.

3. Self-contained articles

The question of how much content (or how much detail) is too much is case-dependent. But the best practice is to maintain one topic per knowledge article, and to ensure that customers can get all the essential information about that one topic from that one article, so they don’t have to jump around between pages.

The generative AI toolkit for customer service leaders

Practical guides to evolve your team, strategy, and tech stack for an AI-first world.
