Demystifying AWS’s AI toolbox

At the AWS London Summit 2024, Generative AI and Machine Learning took center stage. In this post, I aim to highlight the differences between three key AWS offerings: SageMaker, Bedrock, and Q Business.

AWS SageMaker (Classic Machine Learning)

AWS SageMaker is a comprehensive platform that streamlines the entire machine learning lifecycle on AWS. Users can efficiently prepare data, train models at scale (with built-in algorithms or pre-trained models), tune them for performance, deploy them into production, and monitor them in real time. All of these features are available in one Integrated Development Environment (IDE) with tools like notebooks (JupyterLab) and pipelines (MLOps).

SageMaker provides a broad set of built-in algorithms, covering both supervised and unsupervised learning, that can be applied across industries and use cases. For example, banks and financial institutions can use SageMaker to develop models for risk assessment, fraud detection, stock price forecasting, and algorithmic trading.
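To make this concrete, here is a minimal sketch of that workflow using the SageMaker Python SDK and the built-in XGBoost algorithm for a fraud-detection classifier. The bucket, role ARN and dataset locations are placeholders, and the hyperparameters would need proper tuning for a real workload.

```python
# A minimal sketch of the SageMaker workflow: train a built-in XGBoost model
# on labelled transactions (fraud / not fraud), then deploy an endpoint.
# Role ARN, bucket and dataset paths are placeholders.
import sagemaker
from sagemaker import image_uris
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/MySageMakerRole"  # placeholder execution role

# Retrieve the built-in XGBoost container image for the current region.
container = image_uris.retrieve("xgboost", session.boto_region_name, version="1.7-1")

estimator = Estimator(
    image_uri=container,
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://my-bucket/fraud-model/output",  # placeholder bucket
    sagemaker_session=session,
)
estimator.set_hyperparameters(objective="binary:logistic", num_round=100)

# Train on a CSV dataset (label in the first column), then deploy an endpoint.
estimator.fit({"train": TrainingInput("s3://my-bucket/fraud-model/train.csv", content_type="text/csv")})
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```

Deleting the endpoint with predictor.delete_endpoint() when it is no longer needed avoids ongoing charges, since SageMaker pricing is based on the underlying compute resources.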

AWS Bedrock (Generative AI / LLM)

LLM stands for “Large Language Model”: a type of neural network model trained on vast amounts of text data to understand and generate human-like language.

AWS Bedrock is a fully managed service that offers a choice of high-performing foundation models from leading AI companies such as AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API, along with a broad set of capabilities users need to build generative AI applications with security and privacy.
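To illustrate the “single API” point, the sketch below calls a Bedrock-hosted model through the boto3 Converse API; switching providers is largely a matter of changing the modelId. The model identifier and region are examples, and access to the model must already be enabled in the account.

```python
# A minimal sketch of calling a Bedrock-hosted model through the Converse API.
# Assumes boto3 credentials are configured and the chosen model is enabled
# in the account; the modelId and region below are only examples.
import boto3

bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock_runtime.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # example model identifier
    messages=[
        {"role": "user", "content": [{"text": "Summarise the key AML obligations for a retail bank."}]}
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```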

Models can be customized privately with users’ own data to create a personalized experience. Amazon Bedrock makes a separate copy of the base model that is accessible only to those users, and their data is not used to train the original base models.
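As a rough sketch of what that customization looks like in practice, a fine-tuning job can be started through the Bedrock control-plane client along these lines. The base model identifier, IAM role and S3 locations are placeholders, and the training data is assumed to be a JSONL file of prompt/completion pairs.

```python
# A rough sketch of starting a Bedrock fine-tuning (model customization) job.
# Role ARN, S3 URIs and the base model identifier are placeholders; the chosen
# base model must support customization in the target region.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

bedrock.create_model_customization_job(
    jobName="kyc-assistant-finetune-001",
    customModelName="kyc-assistant-v1",
    roleArn="arn:aws:iam::123456789012:role/BedrockCustomizationRole",
    baseModelIdentifier="amazon.titan-text-express-v1",  # example base model
    customizationType="FINE_TUNING",
    trainingDataConfig={"s3Uri": "s3://my-bucket/finetune/train.jsonl"},
    outputDataConfig={"s3Uri": "s3://my-bucket/finetune/output/"},
    hyperParameters={"epochCount": "2", "learningRate": "0.00001", "batchSize": "1"},
)
```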

In addition to fine-tuning and continued pre-training, Bedrock also offers RAG (Retrieval Augmented Generation), an approach that fetches data from company data sources and enriches the prompt with it to deliver more relevant and accurate responses. No matter how comprehensive the training data behind a fine-tuned general LLM is, there is always the potential for missing data points that could contribute to answering new questions.
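Conceptually, a RAG call looks like the sketch below: retrieve relevant passages from a company data source, inject them into the prompt, and ask the model to answer only from that context. The fetch_relevant_docs helper is hypothetical and stands in for whatever retrieval layer is used (a vector store, an enterprise search index, or Bedrock’s managed knowledge bases).

```python
# A conceptual sketch of Retrieval Augmented Generation with Bedrock.
# fetch_relevant_docs() is a hypothetical placeholder for the retrieval step.
import boto3

bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")


def fetch_relevant_docs(question: str) -> list[str]:
    """Hypothetical retrieval step: return passages relevant to the question."""
    return ["Our onboarding policy requires two forms of ID for new customers..."]


def answer_with_rag(question: str) -> str:
    # Enrich the prompt with retrieved company data before calling the model.
    context = "\n\n".join(fetch_relevant_docs(question))
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    response = bedrock_runtime.converse(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # example model identifier
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"maxTokens": 512, "temperature": 0.1},
    )
    return response["output"]["message"]["content"][0]["text"]


print(answer_with_rag("What documents do we need from a new customer?"))
```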

As an example, financial institutions can leverage generative AI to streamline the labour-intensive task of customer due diligence. Through the analysis of extensive customer personal data, the technology can speed up customer onboarding, reduce false alerts, and improve the precision of risk evaluations. It also plays a crucial role in upholding compliance with rigorous Anti-Money Laundering (AML) and Know Your Customer (KYC) regulations.

AWS Q Business (AI chat assistant for the workplace)

Amazon Q Business is a generative AI–powered assistant with a built-in user interface that can answer questions (with reference to internal documentation), provide summaries, generate content, and securely complete tasks based on data and information in the users’ enterprise systems.
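Besides the built-in web UI, the assistant can also be queried programmatically. The sketch below uses the boto3 qbusiness ChatSync operation; the application ID and user identity are placeholders and assume an application with connected data sources already exists.

```python
# A minimal sketch of querying an Amazon Q Business application with boto3.
# The application ID and user ID are placeholders; an application, index
# and data source connectors are assumed to be set up already.
import boto3

qbusiness = boto3.client("qbusiness", region_name="us-east-1")

response = qbusiness.chat_sync(
    applicationId="11111111-2222-3333-4444-555555555555",  # placeholder application ID
    userId="jane.doe@example.com",                          # placeholder user identity
    userMessage="Summarise the Q3 onboarding policy changes.",
)

print(response["systemMessage"])
for source in response.get("sourceAttributions", []):
    print("Source:", source.get("title"))
```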

One of the key features is the set of built-in connectors to popular document repositories such as Salesforce, Google Drive, Microsoft 365, SharePoint, Gmail, Slack, Atlassian, and ServiceNow.

Amazon Q Business provides administrative controls (guardrails), such as the ability to block entire topics and to filter both questions and final answers using keywords. For example, the assistant can be configured to avoid talking about “financial advice” or about a specific “Project X”. What’s more, it can be set so that it never provides generic answers (from the general LLM) but answers only from the organization’s sources, to avoid “hallucinations” (inaccurate answers).
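For illustration, those controls can also be set through the API along the lines of the sketch below (the console offers the same options); the application ID is a placeholder and the field names should be double-checked against the current qbusiness API reference.

```python
# A hedged sketch of configuring Amazon Q Business guardrails via the API.
# The application ID is a placeholder; field names follow the qbusiness
# UpdateChatControlsConfiguration operation and should be verified.
import boto3

qbusiness = boto3.client("qbusiness", region_name="us-east-1")

qbusiness.update_chat_controls_configuration(
    applicationId="11111111-2222-3333-4444-555555555555",  # placeholder application ID
    # Answer only from the organization's own indexed content, not the base LLM.
    responseScope="ENTERPRISE_CONTENT_ONLY",
    # Keyword-style filtering applied to questions and final answers.
    blockedPhrasesConfigurationUpdate={
        "blockedPhrasesToCreateOrUpdate": ["financial advice", "Project X"]
    },
)
```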

More importantly, Amazon Q Business end users can perform specific tasks supported by third-party applications from within their web experience chat, such as creating a LiveDataset ticket as a result of a conversation.

Key takeaways

In summary, Q Business offers the capability to connect an LLM to extensive corporate datasets, Bedrock provides an API for developing diverse generative AI applications, and SageMaker empowers data scientists and engineers to craft domain-specific ML models.

 

High-level comparison

| Service | Supported models | Model tuning | AI expertise required | User interaction | Pricing |
| --- | --- | --- | --- | --- | --- |
| AWS SageMaker | Supervised, unsupervised | Yes | High | IDE | Underlying compute resources |
| AWS Bedrock | LLMs | Yes | Medium | API | Pay as you go |
| AWS Q Business | LLMs | No | Low | Web interface | Per user |

Before delving into the AI realm, it’s crucial to assess whether machine learning aligns with your specific use case.