As businesses evolve in the digital age, the need for quick, accurate, and efficient data retrieval and problem-solving has never been more crucial. One tool that’s making waves in this regard is ChatGPT, a conversational agent based on the GPT-4 architecture. You have probably heard warnings about the accuracy of ChatGPT’s answers; that concern, together with the model’s lack of important business or domain context, is the main obstacle keeping more organizations from getting real value out of ChatGPT.

But how does ChatGPT gain the information it needs to answer questions or perform tasks?
This is where the integration of a knowledge base comes in.

ChatGPT doesn’t have enough data

As we explore ChatGPT’s inner workings, it’s important to note its core principle: the use of embeddings.
Embeddings are used to translate text into high-dimensional numerical vectors that encapsulate both semantic and syntactical meaning.
This is pivotal for training the neural network to create contextually appropriate and coherent responses.

Here’s how it works:

  1. Word Embeddings: This refers to how words are represented inside the language model. Each word is replaced by a dense vector representation that encapsulates its semantic meaning. The model learns and refines these embeddings during the training phase.
  2. Tokenization and Embedding: In essence, tokenization breaks text down into individual tokens. These tokens are then embedded into vector representations, where each dimension captures a different aspect of a token’s meaning.
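
To make these two steps concrete, here is a minimal sketch in Python. It uses the tiktoken tokenizer library that OpenAI’s models rely on and a toy, randomly initialized embedding table; the table dimensions are illustrative assumptions, not the values used inside GPT-4.

```python
import numpy as np
import tiktoken  # OpenAI's open-source tokenizer library

# 1. Tokenization: break the text into integer token IDs.
enc = tiktoken.get_encoding("cl100k_base")
tokens = enc.encode("How do I resolve a networking issue?")
print(tokens)  # a list of integer token IDs

# 2. Embedding: map each token ID to a dense vector.
# In a real model this table is learned during training;
# here it is random and purely illustrative.
VOCAB_SIZE = enc.n_vocab  # size of the tokenizer's vocabulary
EMBED_DIM = 64            # illustrative dimension; real models use thousands
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(VOCAB_SIZE, EMBED_DIM))

vectors = embedding_table[tokens]  # one 64-dimensional vector per token
print(vectors.shape)               # (number_of_tokens, 64)
```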

This article explores the practical aspects of using ChatGPT, including its accuracy, cost, and potential risks.

We’ll also discuss how ChatGPT can be integrated with your company’s SharePoint, Google Drive, Confluence, or any other knowledgebase source and how you can utilize it for custom data.

ChatGPT Training on Company Knowledgebase – Truth or Myth

ChatGPT, a large language model (LLM), is frequently sought after to augment a company’s knowledgebase. However, employing this LLM effectively requires a clear understanding of the concepts of training, prompting, and vector embeddings and how they influence the integration of ChatGPT.

In pursuit of an enhanced knowledgebase, there’s a common misconception that uploading an encyclopedia-like amount of data as part of a “training” process will reshape the model’s understanding. Yet this is far from the truth. ChatGPT’s training builds general context awareness; it does not memorize specific knowledge sources or databases. Keeping this distinction in mind balances the practical use of ChatGPT against the caution required when handling enterprise knowledge.

What is Training in Large Language Models (LLMs)?

Training in Language Models like ChatGPT refers to the initial phase where a model learns by ingesting and processing vast amounts of text data. The primary objective is to enable the model to generate coherent and contextually accurate responses when prompted with queries. During the training phase, the model learns patterns, associations, and structures in the text data, allowing it to make educated guesses when faced with new, unseen text.

  • Example: If you are training a model on customer service interactions, it will analyze the back-and-forth dialogue between agents and customers. The model would attempt to recognize patterns such as common queries, appropriate responses, and sentiment to engage with the user later on effectively.

Why Won’t Uploading an Encyclopedia Train the Model?

There is a common misconception that uploading a single comprehensive document like an encyclopedia or a company’s organizational structure will make the model an “expert” in that domain. However, this is far from the truth.

  • Focused Effort: The initial training and fine-tuning are more about a focused effort involving multiple examples to help the model understand and label unseen instances. A single document is not sufficient for this.
  • Domain-specific Information: A one-off upload doesn’t give the model a deep understanding of the nuances in unique, domain-specific data.
  • Myth-busting: Just like a model won’t comprehend a company’s organizational structure based on a single document, uploading an encyclopedia will not make it an expert on every subject in the encyclopedia.
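
For a sense of what “multiple examples” means in practice, fine-tuning data is usually prepared as many short, labeled conversations rather than one long document. The snippet below sketches the general shape of such a dataset in the chat-style JSONL format used for fine-tuning OpenAI models; the file name and the example dialogues are hypothetical.

```python
import json

# Hypothetical training examples: many short dialogues, not one encyclopedia.
examples = [
    {"messages": [
        {"role": "system", "content": "You are a helpful IT support agent."},
        {"role": "user", "content": "My VPN keeps disconnecting."},
        {"role": "assistant", "content": "Let's start by checking your client version..."},
    ]},
    {"messages": [
        {"role": "system", "content": "You are a helpful IT support agent."},
        {"role": "user", "content": "How do I reset my password?"},
        {"role": "assistant", "content": "Open the account portal and choose 'Forgot password'..."},
    ]},
]

# Fine-tuning expects one JSON object per line (JSONL) --
# typically hundreds or thousands of examples, not a single document.
with open("training_data.jsonl", "w") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")
```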

What is Prompting in LLMs?

Prompting, on the other hand, refers to the queries or statements used to trigger a model into generating a specific response. These prompts can be finely tuned to get the most relevant and accurate output from the model. Prompting becomes especially crucial in the application phase where you engage the model in real-time or batch tasks.

  • Example: You could use the prompt “How do I resolve a networking issue?” and expect the model to generate a step-by-step guide based on its previous training. However, the output’s accuracy and usefulness largely depend on the effectiveness of the training and prompt engineering.
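
As a minimal sketch of what prompting looks like in code, the snippet below sends that query through the OpenAI Python SDK with a system message that frames the answer; the model name and wording are assumptions you would adapt to your own setup.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model name; use whichever model you have access to
    messages=[
        # The system message steers the style and scope of the answer.
        {"role": "system", "content": "You are an IT support assistant. Answer with numbered steps."},
        # The prompt itself.
        {"role": "user", "content": "How do I resolve a networking issue?"},
    ],
)

print(response.choices[0].message.content)
```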

What are Vector Embeddings in LLMs?

Vector embeddings in Language Models (LLM) like ChatGPT refer to the representation of words, phrases, or even entire sentences as vectors in a continuous mathematical space. This enables the model to understand semantic similarity between different pieces of text, and forms the basis of the model’s ability to generate coherent and contextually relevant text. Each word or piece of text is transformed into a series of numbers that capture its meaning and context within the dataset the model was trained on. Words that are contextually or semantically similar will have vector representations that are close to each other in this multi-dimensional space.

  • Example: If the model has been trained to recognize that “How do I fix a bug?” and “What are the steps to resolve a software issue?” are similar queries, then the vector embeddings for these sentences would be closely aligned in the mathematical space, leading the model to produce similar or contextually relevant responses.
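
A minimal sketch of this idea, assuming access to OpenAI’s embeddings endpoint: embed the two queries and compare them with cosine similarity. The embedding model name is an assumption; any embedding model would illustrate the same point.

```python
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(text: str) -> np.ndarray:
    """Return the embedding vector for a piece of text."""
    result = client.embeddings.create(
        model="text-embedding-3-small",  # illustrative embedding model
        input=text,
    )
    return np.array(result.data[0].embedding)

a = embed("How do I fix a bug?")
b = embed("What are the steps to resolve a software issue?")

# Cosine similarity close to 1.0 means the two texts are semantically similar.
similarity = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
print(f"cosine similarity: {similarity:.3f}")
```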

Cost-Effective Approach: Training vs. Prompt Engineering

For further details, you can refer to our article “AI Model Training or Prompt Engineering? Cost-effective Approach”, which elaborates on why prompt engineering could be a more efficient way to get the desired outputs. In many cases, you don’t need to go through the resource-heavy process of retraining the model. Instead, careful crafting of prompts can lead to equally satisfying results.

Deployment Options

When it comes to deploying ChatGPT in a business environment, the approach can differ based on specific needs, existing tech stacks, and budgets. Here are some options to consider for a seamless and effective deployment:

1. Custom Development

With custom development, you build your own chatbot interface from scratch and integrate it with the OpenAI API.

This option provides the greatest flexibility, allowing you to plug virtually any data source into ChatGPT: specific files, websites, cloud storage (SharePoint, Google Drive, Dropbox), or parsed web results (a minimal retrieval sketch follows the pros and cons below).

This requires proprietary development, and the first version may take up to a month to launch.

  • Pro: Total Control and Customization
    • Tailored to your organization’s specific needs and nuances.
    • It can be integrated with any existing system.
  • Con: Time-consuming and Resource-Heavy
    • Requires significant time, typically ranging from a few weeks to months.
    • Requires skilled developers.
  • Cost and Timelines
    • Costs can vary widely based on project complexity, but ballpark estimates often start at around $5,000.
    • Timelines might range from 1 to 6 months, depending on feature sets.

Note: For specific numbers, it’s advisable to request a proposal; it’s free and will give you a detailed estimate.
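
Here is the minimal retrieval sketch mentioned above. It shows the pattern a custom build typically follows: embed your documents once, find the chunks closest to the user’s question, and pass them to ChatGPT as context. The model names, example documents, and helper function are assumptions for illustration, not a production design.

```python
import numpy as np
from openai import OpenAI

client = OpenAI()

# Hypothetical knowledge base snippets (in practice: files, SharePoint pages, wiki articles).
documents = [
    "Invoices are processed by the finance team within 5 business days.",
    "VPN access requires a ticket approved by your line manager.",
    "Production deployments happen every Tuesday after the change review.",
]

def embed(texts: list[str]) -> np.ndarray:
    """Embed a batch of texts into a matrix of vectors."""
    result = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in result.data])

doc_vectors = embed(documents)          # index the knowledge base once
question = "How do I get VPN access?"
q_vector = embed([question])[0]

# Retrieve the most similar document by cosine similarity.
scores = doc_vectors @ q_vector / (
    np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q_vector)
)
best_doc = documents[int(np.argmax(scores))]

# Prompt ChatGPT with the retrieved context instead of retraining the model.
answer = client.chat.completions.create(
    model="gpt-4o",  # illustrative model name
    messages=[
        {"role": "system", "content": f"Answer using only this context: {best_doc}"},
        {"role": "user", "content": question},
    ],
)
print(answer.choices[0].message.content)
```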

2. Power Virtual Agents for SharePoint

ChatGPT can be integrated directly into SharePoint using Microsoft’s Power Virtual Agents.

Power Virtual Agents lets you create powerful AI-powered chatbots for a range of requests—from providing simple answers to common questions to resolving issues requiring complex conversations.

Open the Bots panel and click New Bot. Enter a name, select the language and environment, and then press Create. This will spin up a new Power Virtual Agents bot. Note: it takes a few minutes for the bot to set up.

Microsoft has added boosted conversations to Power Virtual Agents. You can link an external website, and the bot will start generating answers whenever it can’t find a relevant topic. The improved version now supports up to 4 public websites and 4 internal Microsoft sources (SharePoint sites and OneDrive).

  • Pro: Easy Integration with Microsoft Ecosystem
    • No need for custom development.
    • Syncs well with other Microsoft Office products.
  • Con: Limited to SharePoint Capabilities
    • May not support some custom features.
    • Tied into the SharePoint ecosystem, which might not work for everyone.

3. Gemini (Google AI Labs) for Google Drive

While integrating ChatGPT with Google Docs requires some technical acumen, it can be incredibly beneficial. By creating an Apps Script with your own OpenAI API key, you can use the power of ChatGPT within Google Docs.

Add-ons like GPT for Sheets and Docs, AI Email Writers, and Reclaim.ai can help you fuse AI into Google Workspace. They can dramatically change the way you work with Google Docs, improving productivity and enhancing text creation and editing. However, this approach does not yet let ChatGPT leverage all of the data you have on Google Drive.

Google hinted at an assistant that would read from Google Drive at its latest I/O presentation, but no details have been provided yet.

  • Pro: Seamless with Google Services
    • This might be the most straightforward option if your company uses Google Workspace.
  • Con: Less Customization
    • Offers fewer customization options than full-fledged custom development and does not leverage the knowledge base beyond the current document.

4. ChatGPT on Atlassian Confluence Knowledgebase

You can use third-party plugins like Copilot: ChatGPT for Confluence to integrate ChatGPT into your Confluence knowledgebase.

This option offers an easy plug-and-play solution but is limited to specific knowledge base systems, such as Atlassian Confluence.

  • Pro: Seamless with Confluence
    • Quick setup.
    • Enables ChatGPT to answer queries using your existing Confluence Knowledgebase.
  • Con: Data Limitations
    • If your data is not fully in Confluence, you might not be leveraging the full potential of ChatGPT with Confluence.

In summary, the best deployment option will depend on your existing ecosystem, the resources you have at your disposal, and the specific use-cases you have in mind for ChatGPT. Each approach comes with its own sets of pros and cons, and understanding these can help you make an informed decision.

Leveraging ChatGPT for Company’s Knowledgebase: A Case Study on an AI Consulting Firm with PDLT

Consulting firms accumulate tremendous amounts of data, but only a small percentage of it is ever used. One tax and accounting consulting firm realized it needed to enhance its knowledge management system so consultants could respond to clients and deliver services faster.

Situation

Previously, the consulting firm’s associates struggled to quickly find and understand the complex regulations necessary to tailor solutions for its clients.

For example, explaining new rules or comparing laws from different areas often took longer than clients expected. This delay in responding impacted customer satisfaction and led to fewer repeat clients.

Solution

In response to these challenges, the firm engaged PDLT, an AI consulting company skilled in building MVPs (Minimum Viable Products) within less than three months. The AI consulting firm swiftly went to work and developed a proprietary system that indexed hundreds of terabytes of diverse documents.

The system used vector embeddings to process information effectively, transforming the vast quantity of unstructured text into high-dimensional numerical vectors. This enabled more efficient text retrieval and understanding. Once the vectorization was complete, the firm’s own data served as a company-specific knowledgebase inside a web application built on the OpenAI API, creating a unique “ChatGPT knowledgebase”.

PDLT applied chatbot user-experience best practices, enabling consultants to access vital information whenever they need it.
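
For a rough sense of what such indexing involves, the sketch below splits long documents into overlapping chunks and stores each chunk’s embedding together with its source, which is the general shape of the index described above. The chunk sizes, document contents, and the in-memory list standing in for a vector database are all illustrative assumptions, not PDLT’s actual implementation.

```python
import numpy as np
from openai import OpenAI

client = OpenAI()

def chunk(text: str, size: int = 800, overlap: int = 100) -> list[str]:
    """Split a long document into overlapping character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size - overlap)]

# Hypothetical source documents (in practice: thousands of regulatory files and reports).
corpus = {
    "vat_rules_2023.txt": "Chapter 1. Value added tax applies to ... (long document text)",
    "payroll_guidance.txt": "Payroll must be processed by the 25th of each month ... (long text)",
}

index = []  # stands in for a real vector database
for name, text in corpus.items():
    pieces = chunk(text)
    result = client.embeddings.create(model="text-embedding-3-small", input=pieces)
    for piece, item in zip(pieces, result.data):
        index.append({"source": name, "text": piece, "vector": np.array(item.embedding)})

print(f"Indexed {len(index)} chunks from {len(corpus)} documents")
```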

Outcome

The results were noticeable and immediate. After the project’s first iteration was delivered within a month, the consulting firm began experiencing its impact. Because the information became immediately accessible, a single junior-level associate began to provide senior-level advice, and the cost per report and the time to first response improved significantly.

In essence, this AI-empowered knowledgebase allowed the consultants to deliver agile and accurate solutions to the clients, furthered by ChatGPT’s remarkable abilities in understanding, processing, and providing relevant results.

Integrating ChatGPT with Your Company’s Knowledgebase

Your business’s knowledge base is a valuable information repository that supports your internal teams and external customers. However, traditional searching and information retrieval methods can feel cumbersome and outdated. By integrating an advanced AI like ChatGPT into your knowledgebase, you can modernize how users interact with your information, making finding what they need easier and more intuitive.

Here is how to proceed:

1. Identification of Core Knowledge Areas:

Identify the key domains of knowledge within your business. These could include product details, troubleshooting guides, business strategies, or any other crucial information needed for daily operations.

Example: If you run an IT service company, your core knowledge areas could include hardware requirements, software troubleshooting, client relationship management, and cybersecurity protocols. ChatGPT can enhance the searching and retrieval process from the volume of guides, manuals, and past queries stored in your knowledge base.

2. Define the Risks and Required Access Levels:

Ensure that sensitive data within your business is properly safeguarded. Not every piece of information should be accessible by every user. Design different access levels based on your organization’s roles, departments, or security clearance.

Example: Your product development process might be confidential, so you’ll restrict access to only product team members. In contrast, generic troubleshooting guides for common technical issues could be widely accessible to all staff and customers.

3. Limit the Knowledgebase Indexing to Respective Access Levels:

With access levels defined, apply these restrictions to your knowledgebase indexing. That way, ChatGPT can serve relevant data based on an individual user’s access level rather than showing results they are not authorized to view.

Example: When a customer service executive asks ChatGPT a question about available software updates, they receive a response relevant to their level of access – information that’s safe to share with customers. Simultaneously, an engineer receives more in-depth technical data about the software update because their access level permits it.
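
A minimal sketch of this idea, under assumed role names and access levels: tag every indexed chunk with the minimum level required to read it, and filter the candidate chunks before retrieval so ChatGPT never sees content the user is not cleared for.

```python
# Hypothetical access levels: higher numbers mean broader clearance.
ACCESS_LEVELS = {"customer": 0, "support_agent": 1, "engineer": 2}

# Each indexed chunk carries the minimum level required to read it.
index = [
    {"text": "Public troubleshooting guide for login errors.", "min_level": 0},
    {"text": "Internal escalation procedure for billing disputes.", "min_level": 1},
    {"text": "Detailed changelog and rollback plan for software update 4.2.", "min_level": 2},
]

def visible_chunks(role: str) -> list[str]:
    """Return only the chunks this role is allowed to see."""
    level = ACCESS_LEVELS[role]
    return [c["text"] for c in index if c["min_level"] <= level]

# Retrieval and prompting then run only over the visible chunks:
# a support agent sees customer-safe material, an engineer also sees the rollback plan.
print(visible_chunks("support_agent"))
print(visible_chunks("engineer"))
```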

4. Integrate ChatGPT with Existing Information Systems:

Depending on your existing information management platforms, like SharePoint, Confluence, or Google Drive, your method of ChatGPT integration will differ.

Structuring your company’s knowledgebase benefits you through:

  • Enhanced information accessibility: Empower employees and customers to find precise information instantly
  • Increased efficiency: Reduce the time spent manually searching through data, resulting in improved productivity.
  • Modernized user interaction: Improve user experience with a more intuitive and AI-enhanced interaction interface with your knowledgebase.

These steps might seem overwhelming at first, but remember, you’re not alone in this journey. AI consulting firms like Pragmatic DLT (PDLT) can help guide you through the entire process of setting up ChatGPT for your knowledgebase in less than three months. By leveraging this game-changing AI, you’ll be creating a significantly more efficient and effective knowledge management process for your business.

Potential Applications in Companies

Pragmatic DLT (PDLT) focuses on leveraging cutting-edge technologies such as AI to streamline various business processes, enabling companies to rapidly develop a minimum viable product (MVP) in less than three months. Below are some of the key areas where PDLT’s expertise can be highly beneficial.

Customer Service

  • AI Chatbots: PDLT can help in the development of AI-driven customer service chatbots capable of answering FAQs, solving common issues, and even upselling products, thereby freeing up human resources for more complex tasks.
    • Recommendation: Develop a chatbot with pre-identified common queries and deploy it on your website and social media channels.
  • Automated Ticketing System: With AI, it’s easier to categorize, prioritize, and assign customer queries to the relevant departments.
    • Recommendation: Integrate an AI-powered ticketing system with your CRM to better manage and resolve customer queries.
  • Sentiment Analysis: AI can analyze customer interactions and feedback to gauge overall sentiment, helping in proactive service adjustments.
    • Recommendation: Use sentiment analysis on customer reviews and chat logs to identify areas needing improvement.

Employee Onboarding

  • Automated Training Modules: PDLT can help create AI-driven training modules that adapt to the learning pace and style of individual new hires.
    • Recommendation: Implement AI-driven training programs to accelerate the onboarding process and better retain information.
  • Document Verification: AI can automate the cumbersome process of document verification, ensuring a faster and more accurate onboarding.
    • Recommendation: Use AI-based OCR (Optical Character Recognition) technology to automate the verification of documents like IDs, certifications, etc.
  • Resource Allocation: AI algorithms can predict the resources a new hire might need based on the department, job role, and past data.
    • Recommendation: Utilize AI for predictive analytics to prepare in advance the resources required for new hires, such as laptops, software licenses, and workspace setups.

Decision Support

  • Predictive Analytics: PDLT’s expertise in AI can be used for developing predictive models that analyze past and current data to forecast future trends.
    • Recommendation: Use predictive analytics to forecast sales, employee turnover, or customer behavior to proactively make data-driven decisions.
  • Risk Assessment: AI can evaluate potential risks in various business decisions, providing a quantitative measure of risk factors.
    • Recommendation: Integrate risk assessment algorithms into your decision-making processes to evaluate the potential impact of various choices.
  • Market Research: AI can sift through vast amounts of market data to identify opportunities or threats faster than traditional methods.
    • Recommendation: Use AI tools to perform market research, sentiment analysis, and competitive analysis to stay ahead in your industry.

The adoption of AI in these sectors can drastically improve efficiency, customer satisfaction, and ultimately, profitability. PDLT can be your ideal partner in this journey, offering tailor-made solutions that fit your business needs.

Conclusion

ChatGPT’s potential for business optimization is evident, particularly in data retrieval and problem-solving. Through various integration options—custom development, Power Virtual Agents for SharePoint, Google AI Labs for Google Drive, and Atlassian Confluence—the technology offers flexibility in deployment tailored to business needs and existing ecosystems.

Key Takeaways:

  1. Data Accuracy: While concerns exist, coupling ChatGPT with a robust knowledge base can mitigate them.
  2. Practical Use-cases: ChatGPT can enhance customer service, automate troubleshooting, and provide real-time decision support, among other applications.
  3. Training and Prompting: No magic bullet; effective use demands understanding of training, prompting, and vector embeddings.
  4. Cost and Time: Implementation timelines and costs vary; ballpark figures start at around $5,000 and one month.
  5. Risks and Access Levels: Attention to data sensitivity and access levels is non-negotiable.
  6. Consulting Options: Specialist firms can expedite and de-risk implementation.