AWS Machine Learning Blog

Revolutionize trip planning with Amazon Bedrock and Amazon Location Service

Have you ever stumbled upon a breathtaking travel photo and instantly wondered where it was and how to get there? With 1.3 billion international arrivals in 2023, international travel is poised to exceed pre-pandemic levels and break tourism records in the coming years. Each of these travelers needs to plan where they’ll stay, what they’ll see, and how they’ll get from place to place. This is where AWS and generative AI can revolutionize the way we plan and prepare for our next adventure. With the significant developments in the field of generative AI, intelligent applications powered by foundation models (FMs) can help users map out an itinerary through an intuitive natural conversation interface. It’s like having your own personal travel agent whenever you need it.

Amazon Bedrock is the place to start when building applications that will amaze and inspire your users. Amazon Bedrock is a fully managed service that makes it straightforward to build and scale generative AI applications, offering a choice of high-performing FMs from leading companies like AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon through a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI. It enables you to privately customize the FM of your choice with your data using techniques such as fine-tuning, prompt engineering, and Retrieval Augmented Generation (RAG), and to build agents that run tasks using your enterprise systems and data sources while adhering to security and privacy requirements.

In this post, we show you how to build a generative AI-powered trip-planning service that revolutionizes the way travelers discover and explore destinations. By using advanced AI technology and Amazon Location Service, the trip planner lets users translate inspiration into personalized travel itineraries. This innovative service goes beyond traditional trip planning methods, offering real-time interaction through a chat-based interface and maintaining scalability, reliability, and data security through AWS native services.

Architecture

The following figure shows the architecture of the solution.

The solution workflow consists of the following steps:

  1. A user interacts with an AWS Amplify frontend to initiate a trip planning request, either through text or by uploading an image. The user can access and interact with the generated trip itinerary through the frontend application, which includes visualizations on maps powered by Amazon Location Service and Amplify.
  2. If an image is uploaded, it is stored in Amazon Simple Storage Service (Amazon S3), and a custom AWS Lambda function uses a machine learning model deployed on Amazon SageMaker to analyze the image, extract a list of candidate place names with similarity scores, and return the place name with the highest score. The user’s request is then sent to Amazon API Gateway, which triggers a Lambda function that interacts with Amazon Bedrock, using Anthropic’s Claude Instant V1 FM to process the request and generate a natural language response about the place’s location.
  3. If the user interacts using text, the request triggers the Amazon Bedrock FM directly, which provides the natural language response about the place’s location.
  4. Amazon Location Service is integrated to provide precise location data (coordinates) based on the place name. If the user prompt asks for suggestions such as nearby points of interest (POIs), the service also pinpoints these POIs on the map within the chat interface.
  5. A Lambda function combines the generative AI response from Amazon Bedrock with the location data from Amazon Location Service to create a personalized and context-aware trip itinerary, as illustrated in the sketch after this list.
  6. The conversation history of the user is stored in Amazon DynamoDB.
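
To make steps 2 through 5 more concrete, the following is a minimal sketch (not the solution’s actual code) of a Lambda handler that calls Anthropic’s Claude Instant on Amazon Bedrock and then looks up coordinates with Amazon Location Service. The handler name, event fields, and place index name are illustrative assumptions.

import json
import os

import boto3

bedrock = boto3.client("bedrock-runtime")
location = boto3.client("location")

# Illustrative resource name; the actual solution defines its own place index
PLACE_INDEX = os.environ.get("PLACE_INDEX_NAME", "TripPlannerPlaceIndex")


def lambda_handler(event, context):
    user_prompt = event["prompt"]  # for example "Plan a day around Marina Bay Sands"
    # Place name from the SageMaker image model if an image was uploaded, otherwise the prompt itself
    place_name = event.get("place_name", user_prompt)

    # Claude Instant on Amazon Bedrock uses the Text Completions prompt format
    body = json.dumps({
        "prompt": f"\n\nHuman: {user_prompt}\n\nAssistant:",
        "max_tokens_to_sample": 1024,
        "temperature": 0.0,
    })
    llm_response = bedrock.invoke_model(
        modelId="anthropic.claude-instant-v1", body=body
    )
    itinerary = json.loads(llm_response["body"].read())["completion"]

    # Geocode the place name so the frontend can plot it on the map
    places = location.search_place_index_for_text(
        IndexName=PLACE_INDEX, Text=place_name, MaxResults=1
    )["Results"]
    coordinates = places[0]["Place"]["Geometry"]["Point"] if places else None

    return {"itinerary": itinerary, "coordinates": coordinates}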

Core benefits of Amazon Bedrock and Amazon Location Service

Amazon Bedrock provides capabilities to build generative AI applications with security, privacy, and responsible AI practices. Being serverless, it allows secure integration and deployment of generative AI capabilities without managing infrastructure.

Amazon Location Service offers cost-effective, high-quality location-based services. It provides geospatial data based on coordinates, enabling accurate mapping, geofencing, and tracking capabilities for various applications. With a single API across multiple providers, it offers seamless integration, flexibility, and efficient application development with built-in health monitoring and AWS service integration.

By integrating Amazon Bedrock with Amazon Location Service, the virtual trip planning application uses the strengths of both services. Amazon Bedrock enables the use of top FMs for specific use cases and customization for generating contextual responses, while Amazon Location Service provides location data and mapping capabilities. This integration offers tailored trip recommendations through engaging responses powered by generative AI and intuitive visualization on maps.

Key features

Other currently available search engines often require multiple customer touch points and actions to gather information; this virtual trip planner streamlines the process into a seamless, intuitive experience. With a few clicks, users can access location coordinates, personalized itineraries, and real-time assistance, eliminating the need for cumbersome navigation across various sites and internet tabs. These features are presented in a web UI that was designed as a one-stop solution for our users. The following figure shows the start of a trip-planning chat.

A key feature of this generative AI solution is chat-based natural language interaction, which introduces a user-friendly conversational interface. This capability enables users to engage in dynamic conversations, articulating their preferences, interests, and constraints in a conversational manner. Notably, this functionality eliminates the need to navigate through complex tasks solely to plan a trip, fostering a more personalized and human-like interaction. The following figure shows the first question and response in the solution.

This application uses Anthropic’s Claude Instant V1 on Amazon Bedrock, which is designed to respond with context-specific insights, dynamically adapting to the ongoing conversation. Through natural language processing algorithms and machine learning techniques, the large language model (LLM) analyzes the user’s queries in real time, extracting relevant context and intent to deliver tailored responses. Whether the user is seeking recommendations for accommodations, exploring POIs, or inquiring about transportation options, the model uses contextual understanding to provide accurate and personalized assistance. This responsiveness makes sure that each interaction feels intuitive and fluid, mimicking the experience of conversing with a knowledgeable travel expert who anticipates and addresses the user’s needs seamlessly throughout the conversation. The following figure shows the continuation of the interaction and depicts the user’s question, the response, and a map that reflects the information in the response.

This innovative feature harmonizes with the evolving needs of users, providing a comprehensive solution that significantly enhances the overall travel experience. The user-centric approach of the solution reflects a commitment to simplifying the trip planning process, allowing travelers to seamlessly translate inspiration into personalized and enjoyable travel itineraries.

Project conceptual walkthrough: Virtual trip planner

LangChain is a framework for developing applications powered by LLMs. It helps you build context-aware applications by connecting an LLM to sources of context (prompt instructions, few-shot examples, and other content that grounds its responses to users), and it relies on the LLM to reason, answering based on the provided context and deciding which actions to take.

LangChain enables you to create your own customized agent. The core idea of agents is to use a language model to choose a sequence of actions to take. These actions can invoke functions ranging from a simple calculation to a complex internet search or API call. You write a prompt template that provides a list of tool names the agent can use, and the agent decides, based on its inputs, which function to execute and outputs the result according to the prompt’s guidance. Here is an example from LangChain.

The following code snippet imports the necessary libraries, including the Bedrock class from LangChain’s LLM module. It then initializes the Amazon Bedrock client with the necessary parameters: the model_id, the region_name, and the model keyword arguments (model_kwargs).

from langchain.llms.bedrock import Bedrock

# REGION is defined elsewhere in the application, for example "us-east-1"
llm = Bedrock(
    model_id="anthropic.claude-instant-v1",
    # model_id="anthropic.claude-v2:1",
    model_kwargs={
        "max_tokens_to_sample": 20000,
        "temperature": 0.0
    },
    region_name=REGION,
    verbose=True
)

Amazon Location Service offers cost-effective location-based services (LBS) with high-quality data from trusted providers like Esri, HERE, and GrabMaps. This enables developers to build advanced location-enabled applications that include location data and functionality such as maps, POIs, geocoding, routing, geofences, tracking, and health monitoring metrics. This virtual trip planner highlights the integration of Amazon Location Service with Amazon Bedrock to build location-enabled applications.

The SearchPlaceIndexForText API from Amazon Location Service enables users to geocode free-form text, such as addresses, names, cities, or regions, facilitating the search for places or POIs. By using optional parameters, such as bounding box or country filters, and biasing searches towards specific positions globally, users can refine their search results. Notably, the API allows users to search for places near a given position using BiasPosition or to filter results within a bounding box using FilterBBox. The search results are presented in descending order of relevance, providing users with a list of POIs along with their coordinates for visualization on maps in the user interface.

To use this functionality, the user input needs to be translated into the appropriate Action and parameters required by Amazon Location Service. For instance, if a user enters “Find coffee shops near Central Park, New York City,” the application would parse this input and convert it into the corresponding Action and parameters for the SearchPlaceIndexForText API. This could involve setting the search Text to “coffee shops”, the BiasPosition to the coordinates of Central Park, and potentially applying country filters or a bounding box to narrow down the search area.
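
As a rough sketch of that translation (the place index name, region, and Central Park coordinates below are illustrative assumptions), the parsed request could map to a search_place_index_for_text call like the following. Note that BiasPosition and FilterBBox can’t be combined in a single request, so the agent uses whichever fits the query.

import boto3

location = boto3.client("location", region_name="us-east-1")

response = location.search_place_index_for_text(
    IndexName="TripPlannerPlaceIndex",  # hypothetical place index name
    Text="coffee shops",
    BiasPosition=[-73.9665, 40.7812],   # [longitude, latitude] near Central Park
    FilterCountries=["USA"],
    MaxResults=10,
)
for result in response["Results"]:
    place = result["Place"]
    print(place["Label"], place["Geometry"]["Point"])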

After the user input is translated into the required Action and parameters, Amazon Location Service processes the request and provides the relevant location coordinates and names of coffee shops near Central Park. This information is then passed to the generative AI component of the application, which uses it to generate human-friendly responses or visualizations for the user interface.

By seamlessly integrating Amazon Location Service with generative AI, the application delivers a natural and intuitive experience for users, allowing them to search for places using conversational language while using its powerful geocoding capabilities. The following excerpt shows an example agent exchange, in which the user’s input is translated into a tool action and Amazon Location Service returns the tool response.

USER'S INPUT
--------------------
Here is the user's input (remember to respond with a markdown code snippet of a json blob with a single action, and NOTHING else):
Recommend me places in Marina Bay Sands
AI:  ```json
{
  "action": "FindPlaceRecommendations",
  "action_input": "Marina Bay Sands, Singapore"
}
```
Human: TOOL RESPONSE:
---------------------
[{'Place': {'AddressNumber': '10', 'Categories': ['PointOfInterestType', 'Hotel'], 'Country': 'SGP', 'Geometry': {'Point': [103.8585399, 1.2821027]}, 'Interpolated': False, 'Label': 'Marina Bay Sands, 10 Bayfront Avenue, Singapore, SGP', 'Municipality': 'Singapore', 'Neighborhood': 'Marina', 'Region': 'Singapore', 'Street': 'Bayfront Avenue'}, 'Relevance': 1}, {'Place': {'AddressNumber': '1', 'Categories': ['PointOfInterestType'], 'Country': 'SGP', 'Geometry': {'Point': [103.8610554, 1.2849601]}, 'Interpolated': False, 'Label': 'Marina Bay Sands, 1 Bayfront Avenue, Singapore, SGP', 'Municipality': 'Singapore', 'Neighborhood': 'Marina', 'Region': 'Singapore', 'Street': 'Bayfront Avenue', 'SupplementalCategories': ['EV Charging Station']}, 'Relevance': 1}, {'Place': {'AddressNumber': '1', 'Categories': ['PointOfInterestType'], 'Country': 'SGP', 'Geometry': {'Point': [103.8601178, 1.2825414]}, 'Interpolated': False, 'Label': 'Marina Bay Sands, 1 Bayfront Avenue, 018971, Singapore, SGP', 'Municipality': 'Singapore', 'Neighborhood': 'Marina', 'PostalCode': '018971', 'Region': 'Singapore', 'Street': 'Bayfront Avenue', 'SupplementalCategories': ['Building']}, 'Relevance': 1}, {'Place': {'Categories': ['PointOfInterestType', 'Hotel'], 'Country': 'SGP', 'Geometry': {'Point': [103.85976, 1.28411]}, 'Interpolated': False, 'Label': 'Marina Bay Sands, SGP'}, 'Relevance': 1}, {'Place': {'AddressNumber': '8', 'Categories': ['PointOfInterestType'], 'Country': 'SGP', 'Geometry': {'Point': [103.8592831, 1.2832098]}, 'Interpolated': False, 'Label': 'Marina Bay Sands Casino, 8 Bayfront Avenue, 018956, Singapore, SGP', 'Municipality': 'Singapore', 'Neighborhood': 'Marina', 'PostalCode': '018956', 'Region': 'Singapore', 'Street': 'Bayfront Avenue', 'SupplementalCategories': ['Casino']}, 'Relevance': 0.9997}, {'Place': {'AddressNumber': '1', 'Categories': ['PointOfInterestType', 'Hotel'], 'Country': 'SGP', 'Geometry': {'Point': [103.8601305, 1.2825665]}, 'Interpolated': False, 'Label': 'Marina Bay Sands Hotel, 1 Bayfront Avenue, 018971, Singapore, SGP', 'Municipality': 'Singapore', 'Neighborhood': 'Marina', 'PostalCode': '018971', 'Region': 'Singapore', 'Street': 'Bayfront Avenue'}, 'Relevance': 0.9997}, {'Place': {'Categories': ['PointOfInterestType'], 'Country': 'SGP', 'Geometry': {'Point': [103.8600853, 1.2831384]}, 'Interpolated': False, 'Label': 'MARINA BAY SANDS HOTEL, Singapore, SGP', 'Municipality': 'Singapore', 'Neighborhood': 'Marina', 'Region': 'Singapore', 'SupplementalCategories': ['Bus Stop']}, 'Relevance': 0.9997}, {'Place': {'AddressNumber': '2', 'Categories': ['PointOfInterestType'], 'Country': 'SGP', 'Geometry': {'Point': [103.8585042, 1.2828475]}, 'Interpolated': False, 'Label': 'Marina Bay Sands Skating Rink, 2 Bayfront Avenue, 018970, Singapore, SGP', 'Municipality': 'Singapore', 'Neighborhood': 'Marina', 'PostalCode': '018970', 'Region': 'Singapore', 'Street': 'Bayfront Avenue', 'SupplementalCategories': ['Ice Skating Rink']}, 'Relevance': 0.9995}, {'Place': {'AddressNumber': '1', 'Categories': ['PointOfInterestType', 'Pharmacy'], 'Country': 'SGP', 'Geometry': {'Point': [103.8601178, 1.2825414]}, 'Interpolated': False, 'Label': "Nature's Farm Marina Bay Sands, 1 Bayfront Avenue, 018971, Singapore, SGP", 'Municipality': 'Singapore', 'Neighborhood': 'Marina', 'PostalCode': '018971', 'Region': 'Singapore', 'Street': 'Bayfront Avenue'}, 'Relevance': 0.9704999999999999}, {'Place': {'AddressNumber': '10', 'Categories': ['PointOfInterestType', 'Tourist Attraction'], 'Country': 'SGP', 'Geometry': 
{'Point': [103.861031, 1.285204]}, 'Interpolated': False, 'Label': 'Marina Bay Sands Skypark, 10 Bayfront Avenue, Singapore, SGP', 'Municipality': 'Singapore', 'Neighborhood': 'Marina', 'Region': 'Singapore', 'Street': 'Bayfront Avenue'}, 'Relevance': 0.9653}]

The following code sample is a function that queries locations using Amazon Location Service and takes parameters such as the index name, text, country, maximum results, categories, and region. The function initializes the Amazon Location Service client, sets search parameters based on the input, performs the search using search_place_index_for_text, and returns the results. In case of a ResourceNotFoundException, it prints an error message and returns an empty list.

import boto3
from typing import List


def query_locations(
    index_name: str,
    text: str = None,
    country: str = None,
    max_results: int = 10,
    categories: List = [],
    region='ap-northeast-1'
):
    # Initialize the Amazon Location Service client
    client = boto3.client('location', region_name=region)
    # Specify the parameters for the search
    parameters = {
        'IndexName': index_name,
        'MaxResults': max_results
    }
    if text is not None:
        parameters['Text'] = text
    if len(categories) > 0:
        parameters['FilterCategories'] = categories
    if country is not None:
        parameters['FilterCountries'] = [country]
    try:
        # Perform the search
        response = client.search_place_index_for_text(**parameters)
        # Extract and return the results
        locations = response['Results']
        return locations
    except client.exceptions.ResourceNotFoundException as e:
        print(f"Error: {e}")
        return []

In a LangChain agent, an LLM is used as a reasoning engine to determine which actions to take and in which order. You can also customize the prompt template (known as prompt engineering) to make the model generate the desired content. Building an agent requires you to customize the agent methods to form an appropriate prompt. LangChain provides some prebuilt classes, such as ConversationalChatAgent. In the following code snippet, a ConversationalChatAgent class that inherits the Agent class from LangChain is defined. The class definition is similar to the LangChain ConversationalChatAgent class.

# Supporting imports (paths correspond to the LangChain 0.x releases used in this post)
from typing import Any, List, Optional, Sequence

from langchain.agents.agent import Agent, AgentOutputParser
from langchain.agents.conversational_chat.output_parser import ConvoOutputParser
from langchain.agents.conversational_chat.prompt import PREFIX, SUFFIX, TEMPLATE_TOOL_RESPONSE
from langchain.agents.utils import validate_tools_single_input
from langchain.callbacks.base import BaseCallbackManager
from langchain.chains import LLMChain
from langchain.pydantic_v1 import Field
from langchain.schema.language_model import BaseLanguageModel
from langchain.tools.base import BaseTool


class ConversationalChatAgent(Agent):
    """An agent designed to hold a conversation in addition to using tools."""
    output_parser: AgentOutputParser = Field(default_factory=ConvoOutputParser)
    template_tool_response: str = TEMPLATE_TOOL_RESPONSE
    @classmethod
    def _get_default_output_parser(cls, **kwargs: Any) -> AgentOutputParser:
        return ConvoOutputParser()
    @property
    def _agent_type(self) -> str:
        raise NotImplementedError
    @property
    def observation_prefix(self) -> str:
        """Prefix to append the observation with."""
        return "Observation: "
    @property
    def llm_prefix(self) -> str:
        """Prefix to append the llm call with."""
        return "Thought:"
    @classmethod
    def _validate_tools(cls, tools: Sequence[BaseTool]) -> None:
        super()._validate_tools(tools)
        validate_tools_single_input(cls.__name__, tools)

    # ... other methods

Note the from_llm_and_tools method, shown in the following snippet. It creates a prompt template and initializes an LLM chain that uses the prompt and the provided functionalities (tools) to generate the content. This is where you can customize the prompt template.

    @classmethod
    def from_llm_and_tools(
        cls,
        llm: BaseLanguageModel,
        tools: Sequence[BaseTool],
        callback_manager: Optional[BaseCallbackManager] = None,
        output_parser: Optional[AgentOutputParser] = None,
        system_message: str = PREFIX,
        human_message: str = SUFFIX,
        input_variables: Optional[List[str]] = None,
        **kwargs: Any,
    ) -> Agent:
        """Construct an agent from an LLM and tools."""
        cls._validate_tools(tools)
        _output_parser = output_parser or cls._get_default_output_parser()
        prompt = cls.create_prompt(
            tools,
            system_message=system_message,
            human_message=human_message,
            input_variables=input_variables,
            output_parser=_output_parser,
        )
        llm_chain = LLMChain(
            llm=llm,
            prompt=prompt,
            callback_manager=callback_manager,
        )
        tool_names = [tool.name for tool in tools]
        return cls(
            llm_chain=llm_chain,
            allowed_tools=tool_names,
            output_parser=_output_parser,
            **kwargs,
        )

cls.create_prompt creates a BasePromptTemplate object, which is the object LangChain uses to build the actual prompt for the LLM. Instead of supplying your own BasePromptTemplate object, you can modify the human_message and system_message values passed to the from_llm_and_tools method, as shown in the preceding example.
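
As a minimal sketch of that customization (the prompt string and tool list here are placeholders rather than the solution’s actual prompts), the customized agent can be wrapped in a standard LangChain AgentExecutor with conversation memory:

from langchain.agents import AgentExecutor
from langchain.memory import ConversationBufferMemory

# Placeholder system prompt; the real application supplies its own travel-planning guidance
CUSTOM_SYSTEM_MESSAGE = "You are a helpful travel assistant. Use the provided tools to look up places."

tools = [FindPOIsByCountry()]  # for example, the tool defined later in this post

agent = ConversationalChatAgent.from_llm_and_tools(
    llm=llm,
    tools=tools,
    system_message=CUSTOM_SYSTEM_MESSAGE,
)
agent_executor = AgentExecutor.from_agent_and_tools(
    agent=agent,
    tools=tools,
    memory=ConversationBufferMemory(memory_key="chat_history", return_messages=True),
    verbose=True,
)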

As you progress through the code, it’s important to understand how the different components work together to create an agent capable of finding POIs based on user queries.

The following code defines a class named FindPOIsByCountry, which is a subclass of BaseTool. This class is designed to find POIs in a specific country and includes a description of when to use the tool and examples of queries that it can handle. The _run method within this class takes a query and attempts to identify the country mentioned in the query using the pycountry library. It then calls the query_locations function, passing parameters such as the Amazon Location Service index name, text (query), maximum results, categories of interest (for example, amusement park or museum), identified country code, and region.

import pycountry
from typing import Optional

from langchain.callbacks.manager import CallbackManagerForToolRun
from langchain.tools.base import BaseTool

# AMAZON_LOCATION_INDEX_NAME and REGION are defined elsewhere in the application
class FindPOIsByCountry(BaseTool):
    name = "FindPOIsByCountry"
    description = "Only use this tool when you need to find a list of points of interests like mountains or scenic locations in a certain country. Never use this tool until users explicitly ask about finding points of interests like mountains or scenic locations. A 'landmark' is also a point of interest. An example of a sentence that uses landmark is: 'I want to see the Eiffel Tower'. An example of a question that uses landmarks or points of interests is: 'What cool places are there in Japan?'"

    def _run(
        self, query: str, run_manager: Optional[CallbackManagerForToolRun] = None
    ) -> str:
        """Use the tool."""
        print("FindPOIsByCountry query", query)
        country_code = None
        try:
            countries = pycountry.countries.search_fuzzy(query)
            country_code = countries[0].alpha_3
        except Exception as e:
            print(e)
            print("Setting country to None")
        return query_locations(
            index_name=AMAZON_LOCATION_INDEX_NAME,
            text=query,
            max_results=10,
            categories=["Amusement Park", "Aquarium", "Museum", "Shopping Mall", "Tourist Attraction"],
            country=country_code,
            region=REGION
        )

We take this further by implementing a query_nearby_locations feature that uses BiasPosition and FilterBBox. BiasPosition is an optional parameter from Amazon Location Service that indicates a preference for places that are closer to a specified position. FilterBBox is an optional parameter that limits the search results to places within the provided bounding box. By constructing a bounding box around the bias position, the function narrows the search to locations within that area. The key difference from query_locations lies in how locations are filtered based on proximity in query_nearby_locations.

def query_nearby_locations(
        index_name: str,
        bias_position: List[float],
        text: str = None,
        filter_country: str = None,
        max_results: int = 10,
        categories: List = [],
        region='ap-northeast-1'
):
    # Initialize the Amazon Location Service client
    client = boto3.client('location', region_name=region)
    # Bounding box of +/- 3 degrees around the bias position [longitude, latitude]
    filterbbox = [bias_position[0] - 3, bias_position[1] - 3, bias_position[0] + 3, bias_position[1] + 3]
    # Specify the parameters for the search
    parameters = {
        'IndexName': index_name,
        'MaxResults': max_results,
        # BiasPosition can't be combined with FilterBBox in the same request
        # 'BiasPosition': bias_position,
        'FilterBBox': filterbbox
    }
    print(text, categories)
    if text is not None:
        parameters['Text'] = text
    if len(categories) > 0:
        parameters['FilterCategories'] = categories
    if filter_country is not None:
        parameters['FilterCountries'] = [filter_country]
    try:
        # Perform the search
        response = client.search_place_index_for_text(**parameters)
        print(response)
        # Extract and return the results
        locations = response['Results']
        return locations
    except client.exceptions.ResourceNotFoundException as e:
        print(f"Error: {e}")
        return []

The functions discussed in this section can be integrated into the backend of your project, tightly coupled with the generative AI component. With the implementation details and guidance provided, you can use the power of Amazon Location Service and the generative AI capabilities of Amazon Bedrock, orchestrated through LangChain, to build a conversational application that allows users to search for nearby points of interest using natural language queries. By integrating the query_nearby_locations function, parsing user input, customizing the LangChain agent’s prompt template, and developing a user-friendly interface, you can create an intuitive experience where users can discover relevant locations within specified proximities or bounding boxes. As you build your application, focus on implementing robust error handling, considering edge cases, and thoroughly testing the application before deploying it to a production environment. With this foundation, you can create innovative location-based applications that seamlessly blend the power of Amazon Location Service and Amazon Bedrock using Anthropic’s Claude Instant.
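
As a final illustrative sketch (assuming the agent_executor and tools built in the earlier snippets), a single conversational turn could then be handled as follows:

# Assumes agent_executor was created from the customized ConversationalChatAgent
# with FindPOIsByCountry and the nearby-location tool registered
reply = agent_executor.run(input="What cool places are there in Japan?")
print(reply)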

Conclusion

Harnessing the power of generative AI enables this web solution to interpret user queries and dynamically generate personalized travel itineraries. The application offers a user-friendly experience, where users interact with the system through a chat-based interface and receive relevant responses based on context. It serves as a transformative tool that seamlessly guides users to discover more information about locations and explore additional points of interest. To get started on building your own innovative solutions, explore Amazon Bedrock now and start your journey today.


About the Authors

Yao Cong (YC) Yeo is a Solutions Architect at Amazon Web Services, empowering Singapore’s ISVs and SMBs in their cloud transformation journeys and guiding customers to optimize workloads and maximize their AWS cloud potential. YC specializes in the application security domain within cloud security, ensuring robust and secure cloud implementations. In the generative AI space, YC delivers thought leadership content to bridge the gap between technical possibilities and business objectives in the evolving digital landscape.

Loke Jun Kai is an AI/ML Specialist Solutions Architect at AWS. He works on go-to-market motions and strategic opportunities in the ASEAN region. Jun Kai has provided technical and visionary guidance for customers across industries and segments, from large enterprises to startups. Outside of work, he enjoys following all things related to venture capital and playing tennis.

Abhi Fabhian is a Solutions Architect at Amazon Web Services based in Indonesia, providing expert technical guidance on cloud technologies to clients across various sectors in Indonesia, helping them optimize their cloud experience. Outside of work he enjoys sports, cars, music and playing games.

Tung Cao is a Solutions Architect at Amazon Web Services based in Vietnam, covering Vietnam’s SMBs and ISVs on their journey to the cloud and helping them optimize and innovate their business processes. Tung specializes in AI/ML, helping provide cutting-edge solutions that enhance customer experiences, streamline operations, and drive data-driven decision-making, enabling businesses to leverage advanced technologies like machine learning and deep learning to gain competitive advantages.

Siraphop (Fufu) Thaisangsa-nga is a Solutions Architect at Amazon Web Services based in Thailand, dedicated to guiding local businesses through their cloud transformation journeys. With a deep understanding of the Thai market, Fufu helps companies leverage AWS services to innovate, scale, and improve their operational efficiency, excelling in tailoring cloud solutions to meet the unique needs of Thai businesses across various industries.