AWS bolsters generative AI offerings with Amazon Bedrock API

AWS has launched Amazon Bedrock, a new service allowing businesses to access a range of generative AI foundation models from a single API

Aiming to compete with Microsoft-backed ChatGPT developer OpenAI, as well as Bard developer Google, Amazon Bedrock sets out to give businesses greater flexibility by offering multiple in-house and third-party models, all accessible through a single API.

The new product will allow business customers to create internal capabilities such as data search, report generation and image creation for marketing, as well as customer-facing tools including chatbot assistants.

With use cases for generative AI tools continuing to evolve, AWS looks to meet business demand for fast access to bespoke solutions that support long-term goals, along with seamless integration into existing infrastructure.

The service will offer the following choice of generative AI foundation models to businesses:

  • Amazon Titan — capable of classifying, generating and summarising text, as well as extracting information and responding to queries.
  • Claude — developed by Anthropic, this large language model (LLM) is designed for automated Q&As and workflow automation.
  • Jurassic-2 — AI21 Labs' offering is multilingual, supporting text in Spanish, French, German, Portuguese, Italian and Dutch.
  • Stable Diffusion — Stability.ai's model can generate images, logos and designs as well as artwork.
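
Bedrock's pitch is that switching between the models above is a matter of changing a model identifier rather than integrating a different vendor's SDK. The preview API was not publicly documented at the time of writing, so the snippet below is a minimal sketch based on the boto3 "bedrock-runtime" client as it later shipped in the AWS SDK; the model ID and request body shown are illustrative assumptions.

```python
import json
import boto3

# Sketch only: uses the bedrock-runtime client as later released in boto3.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# The same InvokeModel call covers every foundation model on offer; only the
# model ID and the model-specific request body change.
response = bedrock.invoke_model(
    modelId="ai21.j2-mid-v1",  # illustrative Jurassic-2 model ID
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "prompt": "Summarise this quarterly report in three bullet points: ...",
        "maxTokens": 200,
    }),
)

print(json.loads(response["body"].read()))
```

Swapping in Titan, Claude or Stable Diffusion would mean changing the modelId and the body payload to that model's expected format, while the surrounding call and AWS tooling stay the same.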

“With Bedrock’s serverless experience, customers can easily find the right model for what they’re trying to get done, get started quickly, privately customise FMs with their own data, and easily integrate and deploy them into their applications using the AWS tools and capabilities they are familiar with, without having to manage any infrastructure,” said Swami Sivasubramanian, vice-president of database, analytics and machine learning at AWS, in a blog post.

“One of the most important capabilities of Bedrock is how easy it is to customise a model. Customers simply point Bedrock at a few labeled examples in Amazon S3, and the service can fine-tune the model for a particular task without having to annotate large volumes of data.

“Bedrock makes the power of FMs accessible to companies of all sizes so that they can accelerate the use of ML across their organisations and build their own generative AI applications because it will be easy for all developers.”
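
Sivasubramanian's point about pointing Bedrock at "a few labeled examples in Amazon S3" corresponds to running a model customisation job. As the preview API was not public at the time, the following is a hedged sketch based on the CreateModelCustomizationJob operation that later appeared in the boto3 "bedrock" control-plane client; the bucket names, IAM role and hyperparameter values are placeholder assumptions.

```python
import boto3

# Sketch only: control-plane client for model customisation.
bedrock = boto3.client("bedrock", region_name="us-east-1")

bedrock.create_model_customization_job(
    jobName="support-faq-tuning",                     # assumed job name
    customModelName="titan-support-faq",              # assumed output model name
    roleArn="arn:aws:iam::123456789012:role/BedrockCustomizationRole",  # assumed role
    baseModelIdentifier="amazon.titan-text-express-v1",
    # A small set of labelled prompt/completion examples stored in S3 is the
    # starting point for fine-tuning, per the quote above.
    trainingDataConfig={"s3Uri": "s3://example-bucket/labelled-examples.jsonl"},
    outputDataConfig={"s3Uri": "s3://example-bucket/custom-model-output/"},
    hyperParameters={"epochCount": "2", "batchSize": "1", "learningRate": "0.00001"},
)
```

Once the job completes, the resulting custom model can be invoked through the same single API as the off-the-shelf foundation models.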

Early trials

Amazon Bedrock is currently in a limited preview phase, with a select number of business customers trialling the API, including document creation platform Coda.

“As a longtime happy AWS customer, we’re excited about how Amazon Bedrock can bring quality, scalability, and performance to Coda AI,” said co-founder and CEO Shishir Mehrotra.

“Since all our data is already on AWS, we are able to quickly incorporate generative AI using Bedrock, with all the security and privacy we need to protect our data built-in.

“With tens of thousands of teams running on Coda, reliability and scalability are really important.”

To help developers drive value from generative AI technology, AWS recently introduced a global accelerator scheme for startups.

Related:

How to embrace generative AI in your enterprise: What are the use cases for embedding generative AI in your enterprise? How can it help ease the burden of repetitive admin? What are its limitations? Find out here.

How ChatGPT will transform marketing: Marketing departments that do not embrace AI such as ChatGPT will find themselves at a disadvantage. But ultimately, although people can use generative AI, ChatGPT cannot run a business.

Aaron Hurst

Aaron Hurst is Information Age's senior reporter, providing news and features around the hottest trends across the tech industry.

Related Topics

API
AWS
Generative AI