
Getting Started with Generative AI

9 minutes read time
Published on Apr 10th, 2024
Written by Janine Marchi

Throughout human history, most innovations – from pioneering inventions to bold scientific and medical advancements – have taken a long time to mature. With ChatGPT and other generative AI programs, however, it feels like the future of work has once again evolved overnight.

So it's no surprise that generative AI was one of the biggest and hottest themes at the 2024 Gartner Digital Workplace Summit. Generative AI is fundamentally changing how work gets done and creating new jobs, so the question on everyone's mind is how to adapt to AI at work.

The event covered a range of topics aimed at equipping organizations with the right resources, insights, and tools to get started and, most importantly, not be left behind. This blog covers the key themes, insights, and considerations for implementing generative AI within your organization.

1. Generative AI is Here and Companies Are Looking for Ways to Test and Implement

Embracing AI is no longer a choice but a necessity in today's competitive business landscape. In fact, survey results shared during the event indicated that the ability to leverage AI on the job will soon affect employee retention and attraction. In other words, employees would be willing to leave a job for the opportunity to use AI in their everyday work.

There's no doubt that employees across all verticals and job functions could benefit from spending less time looking for information, running more effective work processes and meetings, having an easier time writing and editing content, and more. This is where AI promises real productivity gains for employees.

Many employees likely still have fears about AI potentially taking over their jobs, but the question to ask is not how well can AI do my job, but rather how much better can I do my job using AI? What new capabilities will generative AI bring to my digital workplace?

It's a better-together story: humans will need to stay in the loop when it comes to AI. Studies have shown that when humans use generative AI technologies their productivity and writing improve, but generative AI alone doesn't necessarily produce accurate results. Content created with generative AI needs to be validated and fact-checked. Sure, ChatGPT passed the bar exam, but would you want it to represent you in court?

Organizations should experiment with generative AI and consider setting up cohorts to run different experiments. Unlike many traditional IT projects, many day-to-day AI use cases allow for fast implementation and short time-to-value cycles.

Pilots should not just confirm that the technology works; they should also reveal how generative AI fits into the future of your company. That's why organizations need to create a culture that embraces generative AI.

2. Create a Culture that Embraces Generative AI

Culture shapes how work gets done – everything from the way employees interact to how they make decisions and approach their work, including which tools they use and which rules they follow.

Include employees in the process 

An important step in embracing a generative AI culture shift is ensuring employees are included in the process – co-create the generative AI journey together. Make sure employees are informed and have the opportunity to actively participate in generative AI rollouts. According to Gartner, involving employees increases the chances of success by 14x.

Encourage experimentation and drive innovation  

Generative AI thrives in environments where experimentation and innovation are encouraged. Organizations should create spaces for employees to explore new ideas, test hypotheses, and iterate on solutions. This could involve setting aside dedicated time for innovative projects, establishing cross-functional teams, or implementing innovation challenges. 

Celebrate successful implementations and failures  

Organizations should recognize and reward employees for their contributions to AI projects whether they result in successful implementations or valuable insights gained from failures. Moreover, organizations should encourage a culture of safety where employees feel comfortable taking risks and experimenting without fear of retribution.  

Shifting organizational culture to embrace generative AI is a complex and ongoing process. However, by following best practices – such as securing leadership buy-in and investing in education – organizations can create an environment where generative AI thrives and drives growth and innovation.

3. Generative AI Use Cases and Where to Get Started

When it comes to generative AI, a common question at the Digital Workplace Summit was where to get started. Workgrid hosted a session on this exact topic, where we shared real-world examples of use cases our customers have implemented using the Workgrid AI Assistant. The best part: these examples are quick to implement thanks to Workgrid's out-of-the-box prebuilt templates and integrations with various third-party enterprise systems. With generative AI, it's recommended to start with a land-and-expand approach: begin with smaller use cases to test and drive adoption, then iterate and expand across the workforce.

Gartner analyst Darin Stewart covered this best practice and four key areas for getting started.

1. Copilots

Copilots are getting a lot of press and attention, and for good reason: they can provide instant productivity gains across a variety of tasks. Rather than digging layers deep into an application to complete a task (e.g., building a pivot table), vendors now embed copilots directly in their platforms so you can simply ask the copilot to create the pivot table for you.

Vendors are introducing new copilots at a fast rate, and many are restricted to a single vendor platform (e.g., Salesforce, Microsoft, GitHub). There can be a lot of value in these vendor-specific copilots; however, there are also enterprise copilots such as Workgrid that expand beyond a single platform, integrating across multiple systems and repositories to give employees multiple use cases under a single copilot.

2. Content Generation

As generative AI becomes more accessible, more platforms are incorporating copilots to help in content creation. This can significantly enhance both creativity and productivity. For individuals who are not accustomed to writing, the prospect of starting with a blank page can be quite daunting. 

When using generative AI, you can quickly generate an outline or draft within seconds. However, it's important for employees to keep in mind that anything produced with AI should be considered a draft. The quality of the content is unlikely to be sophisticated or publication ready. 

Teaching and mentoring employees on how to effectively utilize generative AI for various use cases, particularly in content generation and research, will be crucial. Prompt engineering is an emerging discipline that is expected to grow significantly. To facilitate learning and maximize the potential of generative AI, Gartner suggests developing a prompt library for employees. This library will serve as a valuable resource for employees to easily grasp and leverage the capabilities of generative AI. 
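
To make the idea of a prompt library concrete, here is a minimal sketch in Python. The template names and wording are hypothetical examples rather than a prescribed set; a real library would be curated and expanded by the teams rolling out generative AI.

```python
# A minimal, hypothetical prompt library: named templates employees can fill in.
PROMPT_LIBRARY = {
    "meeting_summary": (
        "Summarize the following meeting transcript in five bullet points, "
        "then list action items with owners:\n\n{transcript}"
    ),
    "blog_outline": (
        "Create an outline for a blog post aimed at {audience} about {topic}. "
        "Include an introduction, three to five sections, and a conclusion."
    ),
    "email_rewrite": (
        "Rewrite the following email so it is concise and friendly, "
        "keeping all key details:\n\n{email}"
    ),
}

def build_prompt(name: str, **kwargs) -> str:
    """Look up a template by name and fill in its placeholders."""
    return PROMPT_LIBRARY[name].format(**kwargs)

if __name__ == "__main__":
    print(build_prompt("blog_outline", audience="HR leaders", topic="AI upskilling"))
```

Even a simple shared library like this gives employees working examples they can copy, adapt, and learn from.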

3. Knowledge Discovery

Enterprise search continues to be a major source of frustration in modern digital workplaces, leading to significant negative impacts on productivity and business decision-making. It's not uncommon for employees to make poor choices simply because they couldn't locate the necessary information, resulting in guesswork. 

Over the past decade, Siri has revolutionized our expectations. We have become accustomed to asking questions and receiving instant answers, rather than having to sift through numerous search results. This is the kind of experience your employees desire and require. With the help of generative AI, you have the capability to provide them with relevant and accurate results through a conversational interface.

Traditional search, commonly utilized by most enterprises, operates on the basis of specific keywords, requiring users to have precise knowledge of what they are searching for. On the other hand, AI-driven search leverages natural language processing (NLP) and semantic search techniques to comprehend the intent, meaning, and relevance behind search queries, going beyond mere keyword matching. 

AI-powered search enables users to search unstructured data, including documents, videos, and emails, regardless of the formatting. This advanced technology ensures that relevant information can be retrieved without the need for extensive manual searching. 
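
As a concrete sketch of the difference, the example below implements the core of a semantic search step: embed a handful of documents and a query, then rank documents by cosine similarity rather than keyword overlap. It assumes the OpenAI Python SDK, the text-embedding-3-small model, and an OPENAI_API_KEY in the environment purely for illustration; any embedding provider or enterprise search product would follow the same pattern, and the sample documents are made up.

```python
# Minimal semantic search sketch: rank documents by embedding similarity.
# Assumes the OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY
# environment variable; any embedding model or provider could be substituted.
import math
from openai import OpenAI

client = OpenAI()

def embed(texts: list[str]) -> list[list[float]]:
    """Return one embedding vector per input text."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return [item.embedding for item in resp.data]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

documents = [
    "Expense reports must be submitted within 30 days of travel.",
    "The VPN client can be installed from the IT self-service portal.",
    "New hires enroll in benefits during their first two weeks.",
]

query = "How do I get reimbursed for a work trip?"

doc_vectors = embed(documents)
query_vector = embed([query])[0]

# Keyword search would likely miss the expense-policy document (few shared
# keywords); semantic similarity can still rank it first because the intent matches.
ranked = sorted(zip(documents, doc_vectors),
                key=lambda d: cosine(query_vector, d[1]), reverse=True)
for doc, _ in ranked:
    print(doc)
```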

4. Knowledge Capture

The transfer of knowledge is a very important task, yet it’s extremely time-consuming. Nobody has the time to sit down and write out everything they know. Generative AI is a great solution to help get around the challenges of knowledge capture. 

Generative AI offers various methods to capture information and transform it into valuable resources. For instance, in the context of remote work, numerous meeting and collaboration platforms can generate transcripts. By leveraging generative AI, you can effortlessly summarize these transcripts.

Consider the abundance of valuable data that is often confined within support tickets. With generative AI, the resolution to an issue can be summarized and turned into an article, saving time on knowledge capture and article creation.
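
As an illustrative sketch (not Workgrid's implementation), the snippet below turns a resolved support ticket into a draft knowledge-base article with a single LLM call. The ticket text, prompt wording, and gpt-4o-mini model are assumptions for the example; it uses the OpenAI Python SDK, but any chat-capable LLM would work the same way, and the output should be treated as a draft for a human to validate.

```python
# Hypothetical sketch: summarize a resolved support ticket into a draft KB article.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable;
# any chat-completion-capable LLM could be substituted.
from openai import OpenAI

client = OpenAI()

ticket = """
User reported they could not connect to the VPN after a laptop refresh.
Resolution: reinstalled the VPN client from the self-service portal and
re-enrolled the device certificate. Connection restored.
"""

prompt = (
    "Turn the following resolved support ticket into a short knowledge-base "
    "article with a title, a 'Symptoms' section, and numbered resolution steps. "
    "Do not include any names or personal information:\n\n" + ticket
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model; use whatever your organization has approved
    messages=[{"role": "user", "content": prompt}],
)

# The generated article is a draft: a knowledge manager should review it before publishing.
print(response.choices[0].message.content)
```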

4. AI Upskilling is Necessary to Fully Harness AI Capabilities

For most organizations, the current workforce's skill set is not keeping up with the demands created by AI. AI reskilling is not limited to technical roles; it encompasses a wide range of disciplines, including data analysis, machine learning, and AI ethics.

Part of the upskilling process will involve providing employees with clear guidelines to understand the role of AI in their specific job and within the company. It is important to include employees in the AI journey, ensuring that they are educated on the plans and know what is expected of them once the implementation takes place. Keep your team updated on the AI roadmap and communicate how you intend to invest in AI in the upcoming months and years. Highlight how these new tools will enhance employees' roles and contribute to the achievement of your company's goals.

Employees at all levels should prioritize developing a foundational knowledge of AI concepts and how they can apply AI to their work. Training programs should aim to enhance technical skills related to AI while also fostering a mindset of curiosity and experimentation.

To effectively address the digital dexterity gap, organizations must communicate clear expectations and provide resources to bridge the disparity between the availability of technology and employees' ability to leverage it.

Training employees will cover a wide range of topics. Examples include, but are not limited to: 

Prompt Engineering

Prompt engineering will be increasingly important. Understanding how to structure a prompt enables more accurate and relevant responses, and even minor alterations to a prompt can yield significantly different results.
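
As a small, hypothetical illustration of how much structure matters, compare the two prompts below: they ask for the same deliverable, but the second specifies audience, length, and format, which typically produces far more usable output.

```python
# Two hypothetical prompts for the same task. The structured version specifies
# audience, length, and format, which typically yields a far more usable result.
vague_prompt = "Write something about our new expense policy."

structured_prompt = (
    "You are writing for employees who travel occasionally.\n"
    "Summarize the new expense policy in three short bullet points, "
    "then add one sentence on where to submit receipts.\n"
    "Keep the tone friendly and the total length under 120 words."
)
```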

Data and Privacy

Ensure that employees do not enter private information into the models and that they remove personal information when completing knowledge capture tasks. 

Data Validation

Help employees understand that generative AI should be used as a jump start or a tool to reduce time and effort in completing tasks, but any content generated should be validated.

Ethical Considerations 

As organizations embrace AI, it is crucial to ensure the responsible use of AI technologies. Prioritize ethical considerations by establishing guidelines and frameworks for ethical AI development, including considerations related to bias, fairness, transparency, and accountability.

Establish AI Policies 

Create clear AI policies that outline acceptable uses of AI in the workplace as well as clear guidance on best practices for using AI at work.

5. Considerations for Vendor Copilots and the Vision for a Unified Enterprise Copilot

As the popularity of AI continues to grow, organizations are faced with the task of developing a strategy for implementing and leveraging copilots in the enterprise. With a wide range of vendors incorporating copilots into their platforms, it is important to consider several factors before enabling these AI capabilities across multiple platforms. While vendor-specific copilots can undoubtedly enhance an employee's day, careful consideration is necessary to ensure successful integration. 
 
The top five factors to consider before purchasing vendor-specific copilots: 

  1. Output quality – many copilots are still early in their development, so their output may not be the best quality.  

  2. Limited functionality – functionality may be very limited and only benefit a small subset of employees.  

  3. Restricted to a single platform – functionality is limited to the specific platform it's available in (e.g., Salesforce, Microsoft, GitHub). 

  4. No control over the LLM – the vendor controls the LLM and how your data is handled.  

  5. Expensive – many copilots are costly and can double your licensing cost; Microsoft Copilot, for instance, starts at $30 per user per month.  

Finding the Right AI Solution

A key technology recommended at the Gartner summit was Retrieval-Augmented Generation (RAG).

According to AWS, "Retrieval-Augmented Generation (RAG) is the process of optimizing the output of a large language model, so it references an authoritative knowledge base outside of its training data sources before generating a response."

This is important because, without RAG, an LLM takes a user's prompt and generates a response based only on the information the model was trained on or already knows. That limits the relevance of what the model can generate, because models are only trained up to a certain date. Ask a simple question about something from 2024 and you may get a response along the lines of "I don't have access to that specific data."

With RAG, however, the LLM doesn't need to be trained in advance to answer the prompt. The prompt is augmented in real time with relevant, contextual information before being passed to the LLM. The result is that you can greatly expand the reach of the data being searched without spending large amounts of resources to create, tune, and train models.
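
To make the flow concrete, here is a minimal RAG sketch that follows the retrieve-augment-generate pattern described above: find the most relevant passage in a small knowledge base, add it to the prompt as context, and only then call the LLM. The knowledge-base entries, model names, and prompt wording are assumptions for illustration; it uses the OpenAI Python SDK (with an OPENAI_API_KEY in the environment) simply as an example client and is not Workgrid's implementation.

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable;
# the knowledge base, model names, and prompt wording are illustrative only.
import math
from openai import OpenAI

client = OpenAI()

knowledge_base = [
    "As of March 2024, the travel policy caps hotel stays at $250 per night.",
    "The 2024 holiday calendar adds Juneteenth as a company holiday.",
    "IT support hours are 8am-6pm Eastern, Monday through Friday.",
]

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return [item.embedding for item in resp.data]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

question = "What is the nightly hotel limit under the current travel policy?"

# 1. Retrieve: find the passage most relevant to the question.
doc_vectors = embed(knowledge_base)
q_vector = embed([question])[0]
best_passage = max(zip(knowledge_base, doc_vectors),
                   key=lambda d: cosine(q_vector, d[1]))[0]

# 2. Augment: put the retrieved passage into the prompt as authoritative context.
augmented_prompt = (
    f"Answer the question using only the context below.\n\n"
    f"Context: {best_passage}\n\nQuestion: {question}"
)

# 3. Generate: the LLM answers from the supplied context rather than stale training data.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": augmented_prompt}],
)
print(response.choices[0].message.content)
```

In production, the retrieval step would typically run against a vector index over your enterprise content rather than an in-memory list, but the shape of the pipeline stays the same.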

What made this session particularly intriguing was that it presented a methodology comparable to the one Workgrid used in developing its own AI Work Assistant. Workgrid's AI Assistant addresses many of the challenges associated with implementing a vendor-specific copilot. Here are some of the ways it does so:

  • Expands beyond a single platform - the AI Assistant integrates across your entire digital workplace ecosystem (third-party business systems and knowledge sources) to provide employees with a single, conversational AI experience to find information and complete tasks.

  • Provides control over your data - Workgrid does not train the LLM with your data, ensuring your data and sensitive information remain under your control. Some copilots may use your data to train an LLM that is shared across multiple customers.

  • Enables organizations to bring their own LLM - Workgrid offers the flexibility for customers to integrate their own LLMs for specific apps.

  • Leverages RAG and other natural language processing (NLP) technologies - enhancing the ability to provide accurate and reliable answers across multiple data sources while addressing challenges faced by many LLMs, including hallucinations, outdated knowledge, and non-transparent reasoning.

Learn more about how the Workgrid AI Assistant can help support your digital employee experience. Request a demo now.

