Durabuilt Windows & Doors | TensorIoT

Business Problem

Durabuilt Windows & Doors is a leading manufacturer in Western Canada, recognized for its commitment to quality and innovation. Built on family entrepreneurship, the company has consistently prioritized growth and improvement, earning a place on Canada’s Best Managed Companies list since 2012. Durabuilt’s dedication to service and market leadership sets it apart as an industry pioneer.

  • Customer Support Challenges: Existing support staff struggle to provide timely and personalized responses to diverse customer inquiries.
  • Efficiency Issues: The current system lacks scalability, leading to longer response times and decreased customer satisfaction.
  • Need for Innovation: The company seeks to leverage Generative AI to enhance customer interactions and operational efficiency.

Solution

GoML supported TensorIoT in developing a Generative AI chatbot proof of concept, providing expertise in Large Language Models (LLMs) and Amazon Bedrock integration to enhance customer support through fast, personalized responses. This collaboration helped TensorIoT validate scalable, efficient GenAI solutions tailored to Durabuilt’s unique requirements.

Architecture

  • AWS S3 Bucket: The warranty text data (e.g., Website Warranty Text.txt) is stored in Amazon S3.
  • Amazon Bedrock Titan Embedding Model: Generates embeddings (vector representations) of the warranty text data stored in S3.
  • Amazon OpenSearch Vector Database: The vectorized data (embeddings) from the Bedrock Titan model is stored in the OpenSearch vector database. This allows for efficient vector search and retrieval capabilities.
  • Amazon SageMaker Jupyter Notebook: Acts as the central interface for integrating, orchestrating, and testing the embedding workflow, vector database, and models, and hosts the backend logic, processing, and computation.
  • Amazon Bedrock Claude (RAG): Claude is used in a Retrieval-Augmented Generation (RAG) pattern to generate responses grounded in the information retrieved from OpenSearch.
  • Gradio UI: Provides a user-friendly interface for end-users to interact with the model and get responses. It communicates with SageMaker to get predictions or information.
  • AWS Console: Provides management and control over the resources and services involved in this architecture. It also links with SageMaker and other components for operational monitoring.
  • Data Flow: Warranty text moves from S3 to Bedrock for embedding and then into OpenSearch for storage; at query time, SageMaker orchestrates retrieval from OpenSearch and calls Claude to produce responses, which are delivered to end-users through the Gradio UI. The code sketches below illustrate each step.
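
As a rough illustration of the ingestion path above, the sketch below reads the warranty text from S3, generates Titan embeddings through Bedrock, and writes them into an OpenSearch k-NN index. The region, bucket, domain endpoint, and index name are placeholders rather than values from the engagement.

```python
"""Ingestion sketch: S3 warranty text -> Titan embeddings -> OpenSearch k-NN index."""
import json
import boto3
from opensearchpy import OpenSearch, RequestsHttpConnection, AWSV4SignerAuth

REGION = "us-west-2"                                    # assumed region
BUCKET = "durabuilt-warranty-data"                      # hypothetical bucket name
KEY = "Website Warranty Text.txt"                       # warranty document from the case study
INDEX = "warranty-embeddings"                           # hypothetical index name
OS_HOST = "search-warranty.us-west-2.es.amazonaws.com"  # placeholder domain endpoint

s3 = boto3.client("s3", region_name=REGION)
bedrock = boto3.client("bedrock-runtime", region_name=REGION)

# 1. Pull the raw warranty text from S3 and split it into simple fixed-size chunks.
text = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read().decode("utf-8")
chunks = [text[i:i + 1000] for i in range(0, len(text), 1000)]

def titan_embed(passage: str) -> list:
    """Return the Titan text embedding (a 1536-dimension vector) for one passage."""
    resp = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",
        body=json.dumps({"inputText": passage}),
        contentType="application/json",
        accept="application/json",
    )
    return json.loads(resp["body"].read())["embedding"]

# 2. Connect to the OpenSearch domain with SigV4 auth and create a k-NN index.
auth = AWSV4SignerAuth(boto3.Session().get_credentials(), REGION)
os_client = OpenSearch(
    hosts=[{"host": OS_HOST, "port": 443}],
    http_auth=auth, use_ssl=True, verify_certs=True,
    connection_class=RequestsHttpConnection,
)
os_client.indices.create(
    index=INDEX,
    body={
        "settings": {"index": {"knn": True}},
        "mappings": {"properties": {
            "text": {"type": "text"},
            "embedding": {"type": "knn_vector", "dimension": 1536},
        }},
    },
    ignore=400,  # tolerate "index already exists" on re-runs
)

# 3. Embed each chunk and store it next to its vector for retrieval later.
for i, chunk in enumerate(chunks):
    os_client.index(index=INDEX, id=str(i),
                    body={"text": chunk, "embedding": titan_embed(chunk)})
```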
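
At query time, a retrieval-augmented flow along the lines below would embed the customer's question, fetch the closest warranty chunks from OpenSearch, and pass them to Claude on Bedrock as grounding context. It reuses `bedrock`, `os_client`, `titan_embed`, and `INDEX` from the ingestion sketch; the Claude model ID, prompt wording, and sample question are assumptions, not details from the case study.

```python
"""Query-time RAG sketch: embed the question, retrieve context, ask Claude."""
import json

def retrieve(question: str, k: int = 3) -> list:
    """Return the k warranty chunks whose embeddings are closest to the question."""
    qvec = titan_embed(question)
    hits = os_client.search(
        index=INDEX,
        body={"size": k, "query": {"knn": {"embedding": {"vector": qvec, "k": k}}}},
    )["hits"]["hits"]
    return [h["_source"]["text"] for h in hits]

def answer(question: str) -> str:
    """Build a grounded prompt from the retrieved context and call Claude on Bedrock."""
    context = "\n\n".join(retrieve(question))
    prompt = (
        "\n\nHuman: You are a customer-support assistant for Durabuilt Windows & Doors. "
        "Answer the question using only the warranty context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\n\nAssistant:"
    )
    resp = bedrock.invoke_model(
        modelId="anthropic.claude-v2",  # assumed model ID; other Bedrock Claude models work similarly
        body=json.dumps({"prompt": prompt, "max_tokens_to_sample": 512}),
        contentType="application/json",
        accept="application/json",
    )
    return json.loads(resp["body"].read())["completion"].strip()

print(answer("How long is the warranty on sealed glass units?"))  # illustrative question
```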
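
Finally, a minimal Gradio front end, launched from the SageMaker notebook, can expose the `answer` function to end-users. The labels and `share=True` setting are illustrative choices for a proof of concept.

```python
"""Gradio UI sketch: a simple question box wired to the RAG pipeline above."""
import gradio as gr

demo = gr.Interface(
    fn=answer,  # the RAG pipeline defined in the previous sketch
    inputs=gr.Textbox(label="Ask about your Durabuilt warranty"),
    outputs=gr.Textbox(label="Response"),
    title="Durabuilt Warranty Assistant (PoC)",
)

# share=True prints a temporary public URL, which is convenient when running in SageMaker.
demo.launch(share=True)
```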

Outcomes

Increased Response Efficiency: Rapid, contextually accurate answers reduce response times.

Enhanced Customer Satisfaction: Faster, personalized support improves customer satisfaction.

Operational Cost Savings: Efficient query classification and streamlined data retrieval reduce support-related operational costs.