AWS for M&E Blog
Increase engagement with localized content using Amazon Bedrock Flows
Content producers and publishers have large collections with thousands, if not hundreds of thousands, of articles that could be localized for new audiences and geographies, delivering increased engagement in novel and emerging markets. Localization is, broadly, the process of transforming content (lexical choice, tone shifts, and translation) for new audiences from one geography to another.
Human-driven localization cannot scale to the volume required, so automated translation is often used. However, automated localization has struggled with quality, especially in contextualizing for the nuances of a specific market. This often leads to localized content earning lower engagement from consumers than the original. With foundation models supporting multiple languages and dialects, media and entertainment customers are increasingly leveraging generative AI to deliver higher quality localized content.
Amazon Bedrock Flows offers an intuitive, no-code visual builder and a set of APIs to seamlessly link state-of-the-art (SOTA) foundation models (such as Anthropic’s Claude and Amazon Nova) within Amazon Bedrock. It also integrates with other AWS services to automate user-defined generative AI workflows that go beyond submitting a prompt to a large language model.
Amazon Bedrock Prompt Management provides a streamlined interface to create, evaluate, version control, and share prompts. It helps developers and prompt engineers achieve the best responses from foundation models for their use cases.
We’ll demonstrate how you can take advantage of Amazon Bedrock features (such as Flows, Prompt Management, and different foundation models) to quickly build and test a workflow. The workflow takes existing content, produces localized copy, and delivers an evaluation of the changes made for editorial review.
Scenario overview
As an online publisher planning to expand your readership to new geographies and channels without relying exclusively on net-new local content, you want to create a workflow that:
- Localizes the existing text content for a specific country and language to better align with local markets and advertising strategy.
- Adapts content for new, emerging channels (like short-form social media) using style guides currently loaded into Amazon Bedrock Knowledge Bases to help content editors check their work.
- Provides an evaluation on metrics (such as factual correctness, length, dialect and overall changes in meaning) so content editors can make an informed choice to publish or make further changes.
Prerequisites
Before creating the flow and prompts, make sure you have the following in place:
- An Amazon Web Services (AWS) account and a user with an AWS Identity and Access Management (IAM) role authorized to use Amazon Bedrock. For guidance, refer to Getting started with Amazon Bedrock. Make sure the role includes the permissions for using Flows and Prompt Management, as explained in Prerequisites for Amazon Bedrock Flows and in Prerequisites for prompt management.
- Access to the models of your choice for invocation and evaluation. For guidance, refer to Manage access to Amazon Bedrock foundation models.
- Our demonstration uses Amazon Nova Pro, Cohere Command R, Cohere Embed Multilingual v3 and Anthropic’s Claude 3.7 Sonnet models in the Oregon (us-west-2) Region.
- Amazon Bedrock Knowledge Bases configured with an Amazon Simple Storage Service (Amazon S3) data source.
- For this demonstration our knowledge base already contains style guides on how to create content for social media.
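As a quick sanity check that your knowledge base is populated, you can query it directly with the Retrieve API before wiring it into the flow. The following sketch separates the request payload from the API call so the payload can be inspected without AWS access; the knowledge base ID and query text are placeholders, not values from this demonstration.

```python
def build_retrieval_request(kb_id, query, top_k=3):
    """Assemble the keyword arguments for a bedrock-agent-runtime Retrieve call.

    kb_id is a placeholder; use the knowledge base ID shown in the Bedrock console.
    """
    return {
        "knowledgeBaseId": kb_id,
        "retrievalQuery": {"text": query},
        "retrievalConfiguration": {
            "vectorSearchConfiguration": {"numberOfResults": top_k}
        },
    }


def query_style_guides(kb_id, query, region="us-west-2"):
    """Retrieve style-guide passages; requires AWS credentials and Bedrock access."""
    import boto3  # deferred so build_retrieval_request stays usable offline

    client = boto3.client("bedrock-agent-runtime", region_name=region)
    response = client.retrieve(**build_retrieval_request(kb_id, query))
    return [result["content"]["text"] for result in response["retrievalResults"]]
```

If the call returns style-guide passages for a query such as "short-form social media style", the data source has been ingested and the flow's retrieval node should behave the same way.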
Create the prompts
Before creating the flow, you need to create the two prompts that the flow will use later.
Our first prompt localizes the content, and we used Anthropic’s latest Claude 3.7 Sonnet foundation model, which supports languages such as English, French, Spanish, Portuguese, Japanese and others.
When creating the prompts, variables such as the article text are expressed as {{article}}, and the target language as {{language}}, so their values can be passed between flow nodes, allowing for greater flexibility and reuse.
The second prompt evaluates the original and localized content to provide feedback to the content editor. To keep this evaluation independent of the localization prompt, it uses Amazon Nova Pro, a highly capable multimodal model with a strong combination of accuracy, speed, and cost for a wide range of tasks, supporting over 200 languages.
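The prompts can be registered programmatically as well as through the console. The following sketch builds a Prompt Management variant with the same {{article}} and {{language}} placeholders using the boto3 bedrock-agent client; the prompt name, template text, and model ID are illustrative assumptions, not the exact prompt used in this walkthrough.

```python
def build_localization_variant(model_id="anthropic.claude-3-7-sonnet-20250219-v1:0"):
    """Build a Prompt Management variant that localizes {{article}} into {{language}}.

    The template text and model ID are illustrative placeholders.
    """
    template = (
        "Localize the following article for readers in {{country}}, "
        "writing in {{language}}. Preserve factual content and explain "
        "the changes you made.\n\n{{article}}"
    )
    return {
        "name": "localize",
        "templateType": "TEXT",
        "modelId": model_id,
        "templateConfiguration": {
            "text": {
                "text": template,
                "inputVariables": [
                    {"name": "article"},
                    {"name": "country"},
                    {"name": "language"},
                ],
            }
        },
    }


def create_localization_prompt():
    """Register the prompt; requires AWS credentials and Bedrock access."""
    import boto3  # deferred so the payload builder stays inspectable offline

    client = boto3.client("bedrock-agent", region_name="us-west-2")
    return client.create_prompt(
        name="article-localization",
        variants=[build_localization_variant()],
    )
```

Declaring each placeholder under inputVariables is what lets the flow builder later map node outputs onto the prompt's inputs.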
Create and configure the flow
Using the Amazon Bedrock console, navigate to Flows under Builder Tools and create a new flow as shown in Figure 4.
Upon creation, the browser redirects to the flow builder; otherwise, use Edit in flow builder to switch to the visual interface.
In the Flow Builder, there are different node types you can use to create your flow. You can drag and drop different nodes onto the canvas and create links between them after defining variables directly in each node, or by loading saved prompts.
For this demonstration the following flow was created (Figure 5).
You can review specific settings for each node by selecting it in the Flow Builder.
- To simulate text content being sent from a content management system, the Flow input accepts a JSON object with the following attributes:
- article: text of the article selected for localization
- country: the target country for localization
- language: the target language for localization
- query: a text prompt to retrieve relevant style guides from Amazon Bedrock Knowledge Bases
- Using the Cohere Command R model, the Get_Style_Guides node parses the query text from the input and retrieves relevant results from the knowledge base. The output from this node augments how the localized text is generated.
- The Localization prompt node uses the previously created prompt to localize the input text.
- The Evaluation prompt node uses the previously created prompt to evaluate the input and localized text.
- The two Flow output nodes then stream the text output from each of the two prompt nodes as separate events.
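Once the flow is published behind an alias, the same workflow can be invoked outside the console. The sketch below assembles the JSON input document described above and collects each output node's streamed document with the boto3 bedrock-agent-runtime client; the flow ID, alias ID, and node name are assumptions for illustration (FlowInputNode is the builder's default input node name and may differ in your flow).

```python
def build_flow_inputs(article, country, language, query, node_name="FlowInputNode"):
    """Assemble the JSON document the flow input node expects."""
    return [
        {
            "nodeName": node_name,
            "nodeOutputName": "document",
            "content": {
                "document": {
                    "article": article,
                    "country": country,
                    "language": language,
                    "query": query,
                }
            },
        }
    ]


def run_flow(flow_id, alias_id, inputs, region="us-west-2"):
    """Invoke the flow and gather each output node's streamed document.

    Requires AWS credentials and access to the published flow alias.
    """
    import boto3  # deferred so build_flow_inputs can be used without AWS access

    client = boto3.client("bedrock-agent-runtime", region_name=region)
    response = client.invoke_flow(
        flowIdentifier=flow_id,
        flowAliasIdentifier=alias_id,
        inputs=inputs,
    )
    outputs = {}
    for event in response["responseStream"]:
        if "flowOutputEvent" in event:
            output = event["flowOutputEvent"]
            outputs[output["nodeName"]] = output["content"]["document"]
    return outputs  # one entry per Flow output node
```

Because the two Flow output nodes stream as separate flowOutputEvent entries, a content management system can surface the localized copy and the evaluation to editors independently.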
Large language models (LLMs) can generate incorrect information due to hallucinations. Amazon Bedrock Flows integrate with Amazon Bedrock Guardrails to let you configure safeguards that identify, block or filter unwanted responses in your flow.
Test the flow
Using the Test flow pane in the Flow Builder, you can quickly test your complete workflow, with each flow output node streaming output in near real time as the flow executes. To demonstrate how generative AI can work with Spanish across European and Latin American markets, the following examples use the same input text but change the target country for localization.
The following (Figure 7) is an example of localized text in Spanish. Note how the localization step provides detailed feedback on what changes were made and provides additional context for the specific geography.
In the final step, the evaluation uses a different foundation model to independently compare the original and localized content, as well as provide scored feedback to the content editor.
Pricing
Amazon Bedrock Flows charges per 1,000 node transitions required to execute your workflow. A transition occurs when data flows from one node to another (for example, an input node transitioning to a prompt node). There are no additional charges for using Amazon Bedrock Prompt Management.
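A quick back-of-the-envelope estimate of the transition charge can be sketched as follows; the per-1,000-transition rate and transition count below are illustrative assumptions, so check current Amazon Bedrock pricing for your Region before budgeting.

```python
def flow_cost_usd(transitions_per_run, runs, price_per_1000=0.035):
    """Estimate Amazon Bedrock Flows transition charges in USD.

    price_per_1000 is an illustrative rate, not a quoted price. Model
    inference, knowledge base, and vector store costs are billed separately.
    """
    return transitions_per_run * runs / 1000 * price_per_1000


# For example, a flow with 6 node transitions executed 10,000 times:
estimate = flow_cost_usd(transitions_per_run=6, runs=10_000)
```

Because the charge scales only with node transitions, the model invocation cost (input and output tokens) usually dominates for long articles.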
Amazon Bedrock model usage charges vary with the type of model used and the number of input and output tokens. Note that this demonstration also uses Amazon Bedrock Knowledge Bases configured with an Amazon OpenSearch Serverless vector database and an Amazon S3 data source.
Pricing will vary depending on the AWS Region used. All resources for our demonstration have been configured in the Oregon (us-west-2) Region. Reference Amazon Bedrock, Amazon S3 and Amazon OpenSearch Service pricing as needed.
Conclusion
We demonstrated how media and entertainment customers can configure an article localization workflow using a low-code, serverless architecture. By using Amazon Bedrock Flows and Prompt Management, content owners and editors can leverage the benefits of generative AI to deliver more engaging content for new audiences.
Contact an AWS Representative to learn how we can help accelerate your business.
Further reading
- Amazon Bedrock Flows User Guide: Build an end-to-end generative AI workflow with Amazon Bedrock Flows
- Amazon Bedrock Prompt Management User Guide: Construct and store reusable prompts with Prompt management in Amazon Bedrock
- Amazon Bedrock Flows is now generally available with enhanced safety and traceability
- Evaluating prompts at scale with Prompt Management and Prompt Flows for Amazon Bedrock