Posted On: Mar 29, 2024
We are excited to announce that Knowledge Bases for Amazon Bedrock now lets you create custom prompts, giving you greater control over how the foundation model (FM) personalizes its responses. Additionally, you can configure the number of retrieved passages, which improves accuracy by providing additional context to the FM.
Prompts significantly impact how FMs process information and generate responses. With custom prompts, you can tailor the prompt instructions by adding context, user input, or output indicators so that the model generates responses that more closely match your use case. For example, you can define the output language and format (e.g., “Generate the answer in Spanish”) and save the effort of setting up separate post-processing and orchestration steps. The custom prompt is an optional parameter; if it is not provided, the default system prompt is used.
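For illustration, here is a minimal sketch of passing a custom prompt template to the RetrieveAndGenerate API using the boto3 bedrock-agent-runtime client. The knowledge base ID, model ARN, query text, and prompt wording are placeholder assumptions, not values from this announcement:

```python
import boto3

# Runtime client for Knowledge Bases for Amazon Bedrock
client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

# Hypothetical custom prompt template; $search_results$ is the placeholder
# that Knowledge Bases replaces with the retrieved passages at query time.
custom_prompt = (
    "You are a helpful assistant. Answer the question using only the "
    "search results below, and generate the answer in Spanish.\n\n"
    "$search_results$"
)

response = client.retrieve_and_generate(
    input={"text": "What is our refund policy?"},  # example user query
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "KB1234567890",  # placeholder knowledge base ID
            "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2",
            "generationConfiguration": {
                # Custom prompt template overriding the default system prompt
                "promptTemplate": {"textPromptTemplate": custom_prompt}
            },
        },
    },
)

print(response["output"]["text"])
```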
Additionally, you can now control the amount of information used to generate the final response by adjusting the number of retrieved passages. The process of breaking long input texts into smaller segments, or passages, is called chunking. Previously, when querying a knowledge base, the Retrieve API returned up to five chunks. Knowledge Bases now supports up to 100 retrieved chunks, enabling more relevant and comprehensive information retrieval.
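As a quick sketch, the Retrieve API accepts the number of results in its vector search configuration; the knowledge base ID and query text below are placeholder assumptions:

```python
import boto3

client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

# Request up to 50 chunks instead of the previous maximum of 5.
response = client.retrieve(
    knowledgeBaseId="KB1234567890",  # placeholder knowledge base ID
    retrievalQuery={"text": "What is our refund policy?"},
    retrievalConfiguration={
        "vectorSearchConfiguration": {"numberOfResults": 50}
    },
)

# Each result contains the retrieved passage text, its source location, and a relevance score.
for result in response["retrievalResults"]:
    print(result["content"]["text"])
```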
These two capabilities are now available in the US East (N. Virginia) and US West (Oregon) AWS Regions. To learn more, refer to the Knowledge Bases for Amazon Bedrock documentation. To get started, visit the Amazon Bedrock console or use the RetrieveAndGenerate API and Retrieve API.