What is Amazon Bedrock? AWS Generative AI Tool Overview

Generative AI (GenAI) has emerged as a transformational technology. Amazon Bedrock is one of the latest offerings from AWS that combines the security, scalability, and reliability of cloud services with GenAI. This article aims to dive deep into Bedrock, exploring its core features, benefits, use cases, and how it fits into the broader AWS ecosystem.

What we will cover:

  1. What is Amazon Bedrock?
  2. Key features of Amazon Bedrock
  3. Amazon Bedrock pricing
  4. Generative AI applications use cases
  5. How to get started with Amazon Bedrock
  6. Amazon Bedrock tutorial

What is Amazon Bedrock?

Amazon Bedrock is a generative AI tool for application development released by AWS. It’s a fully managed service that provides access to leading foundation models (FMs) through a single API endpoint.

With Bedrock, AWS aims to democratize access to GenAI technologies and simplify the development of GenAI applications. Bedrock allows companies to easily consume various FMs to build powerful GenAI applications for their business use cases without needing elaborate machine learning skills or the hassle of managing the associated infrastructure.

The available FM models have been trained on massive datasets using cutting-edge techniques and can be further optimized to specialize in specific tasks. Building or training these models from scratch would be a cumbersome, time-consuming, and highly costly endeavor for enterprises. With the simplicity of Amazon Bedrock, developers can easily and securely experiment with different models, customize them for specialized tasks, and integrate them with their existing applications or evolve and reimagine their products.

Key features of Amazon Bedrock

Let’s look at some of the most important features of AWS Bedrock.

1. Access to a range of leading foundation models (FMs)

Bedrock offers various FMs from prominent AI companies, such as Anthropic, AI21 Labs, Cohere, Meta, Mistral AI, Stability AI, and Amazon’s own models. Since different models are best suited for different tasks, Bedrock gives teams the flexibility to choose the right model for each scenario.

2. Simplified and managed experience for GenAI applications

Amazon Bedrock is a fully managed and serverless service, entirely abstracting the need to manage infrastructure components for your foundation models. It provides a single API access regardless of the chosen model, simplifying integrations, operations, and model version upgrades.
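As a sketch of what this single API looks like in practice, the snippet below builds an Anthropic-style request body and sends it through the `bedrock-runtime` client with boto3. The model ID, region, and body format here are illustrative assumptions; each model family defines its own request shape.

```python
# Minimal sketch of Bedrock's single invocation API via boto3.
# NOTE: model ID, region, and request body format are illustrative assumptions.
import json


def build_request(prompt: str, max_tokens: int = 512) -> str:
    """Build a JSON body in the Anthropic messages format."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })


def invoke(prompt: str, model_id: str = "anthropic.claude-3-sonnet-20240229-v1:0") -> str:
    """Send the prompt to Bedrock and return the generated text."""
    import boto3  # requires AWS credentials with Bedrock model access
    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.invoke_model(modelId=model_id, body=build_request(prompt))
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]
```

Swapping to another provider’s model changes only the model ID and the body format; the `invoke_model` call itself stays the same.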

3. Model customization and Retrieval Augmented Generation (RAG)

The real power of FMs emerges when companies can privately and effectively customize and adapt these models with their own proprietary data. For customization, Bedrock offers fine-tuning features and creates a separate private copy of the model. To pair models with recent, up-to-date information, companies leverage Retrieval Augmented Generation (RAG), a technique that enhances a model’s context with proprietary data sources for more accurate and informed responses.
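To make the RAG idea concrete, here is a toy, self-contained sketch of the flow: retrieve the most relevant snippet from a private corpus, then prepend it to the prompt before sending it to the FM. Real implementations use vector embeddings for retrieval; the keyword overlap below is a stand-in to keep the example dependency-free.

```python
# Toy RAG flow: retrieve context, then augment the prompt with it.
# Keyword overlap stands in for real embedding-based retrieval.
def retrieve(question: str, corpus: list[str]) -> str:
    """Return the corpus snippet sharing the most words with the question."""
    q_words = set(question.lower().split())
    return max(corpus, key=lambda doc: len(q_words & set(doc.lower().split())))


def augment_prompt(question: str, corpus: list[str]) -> str:
    """Build a context-enriched prompt for the foundation model."""
    context = retrieve(question, corpus)
    return (
        "Use the following context to answer.\n"
        f"Context: {context}\n"
        f"Question: {question}"
    )


corpus = [
    "Our refund policy allows returns within 30 days of purchase",
    "Support is available Monday to Friday from 9am to 5pm CET",
]
prompt = augment_prompt("How many days do I have for a refund", corpus)
```

The augmented prompt now carries the proprietary refund policy, so the model can answer accurately even though that fact was never in its training data.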

4. Built-in security, privacy, and safety

With Bedrock, the data never leave your AWS environments and are encrypted in transit and at rest. Users can leverage their existing AWS security controls and services, such as KMS for encryption, IAM policies, CloudWatch for monitoring, CloudTrail for governance, and network design based on Amazon VPC.

When a base model is fine-tuned, a private copy of that model is used, and your proprietary data isn’t used to improve the base model. Bedrock is in scope for common compliance standards, including ISO, SOC, and CSA STAR Level 2; it is HIPAA eligible and can be used in compliance with the GDPR.

Bedrock’s Guardrails functionality allows companies to enforce policies and safety controls on their model responses. Guardrails provide a layer of safeguards to promote safe and responsible usage of FMs by blocking unsafe topics, filtering harmful content, and redacting personally identifiable information.
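To illustrate the kind of safeguard Guardrails applies, the sketch below redacts personally identifiable information from a model response before it reaches the user. This is a plain-Python illustration of the concept, not the Bedrock Guardrails API, and the patterns cover only simple e-mail and US-style phone formats.

```python
# Concept sketch of a PII-redaction safeguard (not the Guardrails API itself).
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")


def redact_pii(text: str) -> str:
    """Replace e-mail addresses and phone numbers with placeholder tags."""
    text = EMAIL.sub("{EMAIL}", text)
    return PHONE.sub("{PHONE}", text)


safe = redact_pii("Contact jane.doe@example.com or 555-867-5309 for help.")
```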

5. Leverage Agents for executing multi-step tasks

There are use cases where companies would like to automate processes and execute complex multi-step tasks based on the model’s response. With Agents, users can fast-track their prompt creation with customized instructions, orchestrate a sequence of actions, call the necessary APIs to fulfill the desired task, and monitor and trace the agent’s reasoning and orchestration of complex tasks.
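The pattern behind Agents can be sketched as a loop: the model produces a plan of actions, and the agent executes the matching API call for each step. Bedrock Agents handle this orchestration (plus prompt creation and reasoning traces) for you; the tool names and plan format below are purely illustrative.

```python
# Toy sketch of the agent pattern: execute a model-produced plan step by step.
def lookup_order(order_id: str) -> str:
    # Stand-in for a real API call an agent would make.
    return f"Order {order_id}: shipped"


TOOLS = {"lookup_order": lookup_order}


def run_agent(plan: list[tuple[str, str]]) -> list[str]:
    """Execute a plan of (tool, argument) steps and collect each result."""
    results = []
    for tool, arg in plan:
        results.append(TOOLS[tool](arg))  # call the API this step requires
    return results


trace = run_agent([("lookup_order", "A-42")])
```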

Amazon Bedrock pricing

Amazon Bedrock pricing is based on the usage of the service and depends on the pricing model and foundation model.

There are mainly three pricing models for Amazon Bedrock:

  1. On-demand
  2. Provisioned throughput
  3. Model customization

On-demand

On this plan, you pay only for your usage with no long-term commitments. Typically, you pay for each inference operation performed using the models available in Amazon Bedrock. Pricing depends on the number of input and output tokens processed and on the chosen foundation model.

A token, typically a few characters long, is the basic unit of text a model processes, both for inputs (prompts) and outputs. For image generation models, you are charged per image generated.

A subset of on-demand pricing is Batch mode, which lets you process a set of prompts in bulk, receive the responses in a single output file, and store them on S3. Batch mode follows on-demand pricing.
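To see how on-demand charges add up, here is a small arithmetic sketch. The per-1,000-token rates used are hypothetical placeholders; real rates vary per foundation model and are listed on the official pricing page.

```python
# Sketch of on-demand inference cost. Rates below are hypothetical placeholders.
def inference_cost(input_tokens: int, output_tokens: int,
                   price_in_per_1k: float, price_out_per_1k: float) -> float:
    """Tokens are billed per 1,000, with separate input and output rates."""
    return (input_tokens / 1000) * price_in_per_1k \
        + (output_tokens / 1000) * price_out_per_1k


# e.g. a 2,000-token prompt with a 500-token answer at $0.003 in / $0.015 out
cost = inference_cost(2000, 500, 0.003, 0.015)  # 0.006 + 0.0075 = $0.0135
```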

Provisioned throughput

For some models, you can purchase provisioned throughput, which guarantees availability and is useful for large, consistent inference workloads. Provisioned throughput requires a 1- or 6-month commitment for baseline usage.

Model customization

For customizing models, you are charged for the model training based on tokens, and the model storage is charged per month. Inference for customized models requires a provisioned throughput plan. 

For detailed pricing information on each available provider, foundation model, and pricing model, check out the official Amazon Bedrock Pricing.

Generative AI applications use cases

Here are the three biggest use cases for GenAI in general.

Content generation

With Amazon Bedrock, you can leverage any of the FMs to generate new content in text format, such as stories, blogs, social media posts, product descriptions, and images for marketing, campaigns, websites, presentations, artwork, and illustrations. Content generation with GenAI is already extensively used for personalization, marketing, entertainment, and gaming.

Virtual assistants

Chatbots and virtual assistants have emerged as one of the most common use cases for enhancing customer experience and improving support. Virtual assistants based on GenAI technology, equipped with proprietary information and knowledge bases, can efficiently address customer requests, automate tasks, and provide solutions and guidance.

Text summarization

Text summarization is another excellent use case for GenAI technology. With the help of FMs, we can increase productivity by generating concise summaries of books, stories, research papers, and technical documentation, and by fast-tracking information extraction. This is particularly handy for legal documents, educational and academic content, and meeting summaries.

How to get started with Amazon Bedrock

To get access to Amazon Bedrock and start building, you need an AWS account. Then, follow the instructions to Set up Amazon Bedrock and request model access for the models you want to enable for your AWS account.

The easiest way to experiment with Bedrock is by using the Playground feature to try different models before deciding which one to use. You can use the Playground feature for text, image, and chat cases. The playground also provides configurable parameters for each model related to response length, randomness, and diversity of answers. 
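The playground’s knobs map to inference parameters you later set in API calls. The dictionary below shows common ones and what they control; exact names and ranges differ per model family, so treat these keys as illustrative assumptions.

```python
# Common inference parameters (key names vary per model family).
params = {
    "temperature": 0.7,       # randomness: lower values = more deterministic
    "top_p": 0.9,             # nucleus sampling: limit choices to top probability mass
    "max_tokens": 256,        # upper bound on response length
    "stop_sequences": ["\n\nHuman:"],  # strings that stop generation early
}
```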

Here’s an example of using the `Image Playground` functionality with the `SDXL 1.0 Stability` AI model:

[Image: the Image Playground with the SDXL 1.0 Stability AI model]

Check out the Supported Base Models to see all the available foundation models. To get the most out of them, check out the Prompt Engineering guidelines. Each model has its own configuration and inference parameters, which can be used alongside the prompt to influence responses.

To evaluate different models, you can use the Model Evaluation feature, which lets you compare the outputs of different models and decide which one to use. Another helpful feature is the managed Knowledge Base, which combines FMs with private data sources. This functionality leverages Retrieval Augmented Generation (RAG), enhancing model responses with specific, up-to-date information.

Amazon Bedrock tutorial

Let’s look at a demo to get a better taste of Bedrock’s capabilities. To follow along, check out this GitHub repository: Amazon Bedrock – Introductory Demo

Note: Running this demo will incur costs on your AWS account.

Prerequisites:

  • AWS account (sandbox account recommended)
  • IAM user or role with Administrator access or the required permissions to access Amazon Bedrock and its FMs. Configure this principal’s credentials in your environment’s default AWS profile (AWS_PROFILE). Also, ensure that you have enabled Model access on Amazon Bedrock.
  • Python 3.9+
  • Internet access

To get started, create a Python virtual environment:

python -m venv demo
cd demo
source bin/activate

Then, clone the repository and install the dependencies:

git clone https://github.com/aws-samples/amazon-bedrock-intro-demo.git
cd amazon-bedrock-intro-demo
pip install -r requirements.txt

Finally, launch the Streamlit application:

streamlit run main.py

This command will open a new browser tab with the deployed application that looks like this:

[Image: the demo application’s home page]

There, you will find several applications and use cases calling the Bedrock APIs for different models. Let’s take a look at some of them.

On the left side panel, select `QA – FM Comparison`, where you can choose any of the available models and compare them for QA use cases. In this case, I am comparing the `amazon.titan-tg1-large` and `anthropic.claude-v2`:

[Image: QA – FM Comparison between the two models]

Feel free to play around and experiment with different models and different questions. Next, let’s check the `Chat – FMs` option. This time, I opted for the `meta.llama2-13b-chat-v1` model and asked it to explain cloud computing like I am five years old.

[Image: Chat – FMs response from the meta.llama2-13b-chat-v1 model]

Moving to the `Code Translation` option allows us to translate code blocks from one programming language to another. As an example, I used a JavaScript function that checks whether a number is prime and translated it to Python:

[Image: Code Translation output]

Lastly, let’s explore the `RAG – Document(s)` option, which allows you to upload a PDF document and answer questions or retrieve information based on the document. It even shows you a comparison of the model’s answer with and without the RAG technique. 

As for the document, I uploaded the European Parliament’s Artificial Intelligence Act and asked the model to summarize its contents. Without RAG, the model is unaware of this document, and its answer is entirely irrelevant. Using the RAG method and leveraging the document, the answer was quite on point:

[Image: RAG – Document(s) answers with and without RAG]

Key points

In this article, we explored Amazon Bedrock as a way to start building generative AI applications on AWS. We dove into its functionality, covered different use cases, and discussed how to get started. Lastly, we reviewed an introductory demo application that leverages Bedrock and calls different foundation models behind the scenes via an API.

In the meantime, go ahead and learn how a platform like Spacelift can help you and your organization fully manage cloud resources within minutes.

Spacelift is a CI/CD platform for infrastructure-as-code that supports tools like Terraform, Pulumi, Kubernetes, and more. For example, it enables policy-as-code, which lets you define policies and rules that govern your infrastructure automatically. You can even invite your security and compliance teams to collaborate on and approve certain workflows and policies for parts that require a more manual approach. You can check it for free by creating a trial account or booking a demo with one of our engineers.

Thank you for reading, and I hope you enjoyed this as much as I did!

The Most Flexible CI/CD Automation Tool

Spacelift is an alternative to using homegrown solutions on top of a generic CI. It helps overcome common state management issues and adds several must-have capabilities for infrastructure management.

Start free trial