IBM Consulting creates innovative AWS solutions in French Hackathon
In March 2023, IBM Consulting delivered an Innovation Hackathon in France, aimed at designing and building innovative solutions for real customer use cases on the AWS Cloud.
In this post, we briefly explore six of the solutions considered and demonstrate the AWS architectures created and implemented during the Hackathon.
Solution 1: Optimize digital channels monitoring and management for Marketing
Monitoring the impact of a marketing campaign, such as customer and competitor reactions on digital media channels, can require significant effort. Digital campaign managers need this data to evaluate customer segment penetration and overall campaign effectiveness. Information can be collected via digital-channel API integrations or on the digital channel user interface (UI): API integrations require frequent maintenance, while UI data collection can be labor-intensive.
On the AWS Cloud, IBM designed an augmented digital campaign manager solution, to assist digital campaign managers with digital-channel monitoring and management. This solution monitors social media APIs and, when APIs change, automatically updates the API integration, ensuring accurate information collection (Figure 1).
- Amazon Simple Storage Service (Amazon S3) and AWS Lambda are used to garner new digital estates, such as new social media APIs, and assess data quality.
- Amazon Kinesis Data Streams is used to decouple data ingestion from data query and storage.
- Lambda retrieves the required information from Amazon DynamoDB, such as the most relevant brands; natural language processing (NLP) is applied to the retrieved data, such as URL, bio, about, and verification status.
- Amazon S3 and Amazon CloudFront are used to present a dashboard where end-users can check, enrich, and validate collected data.
- When graph API calls detect an error/change, Lambda checks API documentation to update/correct the API call.
- A new Lambda function is generated with the updated API call.
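As a rough illustration, the decision in the last two steps, store the data or trigger regeneration of the API integration, could look like the following Python sketch. The error codes and function names here are hypothetical assumptions, not IBM's actual implementation.

```python
# Illustrative sketch of the API-change detection step (Solution 1).
# Error codes and the response format are assumed for the example.

def needs_integration_update(api_response: dict) -> bool:
    """Return True when a social-media Graph API call reports an error
    suggesting the integration code is out of date (assumed codes)."""
    error = api_response.get("error")
    if error is None:
        return False
    # Deprecated-field / unknown-parameter style errors (hypothetical values)
    return error.get("code") in {12, 100, 190}

def plan_action(api_response: dict) -> str:
    """Decide whether to store the payload or regenerate the integration."""
    if needs_integration_update(api_response):
        return "regenerate-integration"  # trigger the Lambda that rebuilds the API call
    return "store-data"                  # forward the payload to Kinesis / S3
```

In the deployed architecture, the `"regenerate-integration"` branch would invoke the Lambda that consults the API documentation and emits a new function, while `"store-data"` continues the normal ingestion path.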
Solution 2: 4th party logistics consulting service for a greener supply chain
Logistics companies have a wealth of trip data, both first- and third-party, and can leverage it to provide new customer services, such as trip-booking options optimized for carbon footprint, duration, or cost.
IBM designed an AWS solution (Figure 2) that enables the customer to book goods transport by selecting from different route options that combine transport modes, specifying departure location, arrival location, and cargo weight, and viewing each route's carbon emissions. Proposed options include the greenest, fastest, and cheapest routes. Additionally, the user can provide financial and time constraints.
- User connects to web-app UI, hosted on Amazon S3.
- Amazon API Gateway receives user requests from web app; requests are forwarded to Lambda.
- Lambda calculates the best trip options based on the user’s prerequisites, such as carbon emissions.
- Lambda estimates carbon emissions; estimates are combined with trip options at Step 3.
- Amazon Neptune graph database is used to efficiently store and query trip data.
- Different Lambda instances are used to ingest data from on-premises data sources and send customer bookings through the customer ordering system.
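The option-ranking logic in Step 3 can be sketched in a few lines of Python. The route fields, constraint handling, and sample data below are illustrative assumptions, not the production algorithm.

```python
# Sketch of trip-option ranking (Solution 2, Step 3).
# Field names and sample routes are assumptions for illustration.

def best_options(routes, max_cost=None, max_hours=None):
    """Filter routes by the user's financial/time constraints, then
    pick the greenest, fastest, and cheapest of what remains."""
    eligible = [
        r for r in routes
        if (max_cost is None or r["cost_eur"] <= max_cost)
        and (max_hours is None or r["duration_h"] <= max_hours)
    ]
    if not eligible:
        return {}
    return {
        "greenest": min(eligible, key=lambda r: r["co2_kg"]),
        "fastest":  min(eligible, key=lambda r: r["duration_h"]),
        "cheapest": min(eligible, key=lambda r: r["cost_eur"]),
    }

# Hypothetical multimodal routes returned by the Neptune-backed query
routes = [
    {"id": "rail+truck", "co2_kg": 120, "duration_h": 30, "cost_eur": 900},
    {"id": "air",        "co2_kg": 800, "duration_h": 8,  "cost_eur": 2400},
    {"id": "sea+truck",  "co2_kg": 60,  "duration_h": 96, "cost_eur": 600},
]
picks = best_options(routes, max_cost=2500)
```

With this sample data the greenest and cheapest pick is the sea+truck combination, while air is the fastest; the real engine would draw candidate routes from the Neptune graph and emission estimates from the dedicated Lambda in Step 4.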
Solution 3: Purchase order as a service
In the context of vendor-managed inventory and vendor-managed replenishment, inventory and logistics companies want to check warehouse stock levels to identify the best available options for goods transport. Their objective is to optimize the availability of warehouse stock for order fulfillment: when a purchase order (PO) is received, the required goods are already available in the correct warehouse, enabling swift delivery with minimal lead time and cost.
IBM designed an AWS PO-as-a-service solution (Figure 3), using warehouse data to forecast future customer POs. Based on this forecast, the solution plans and optimizes warehouse goods availability and, hence, the logistics required for PO fulfillment.
- AWS Amplify provides a web/mobile UI where users can set constraints (such as minimum/maximum warehouse capacity) and check warehouse states and POs in progress. Additionally, the UI proposes possible optimized POs, automatically generated by the solution. If the user accepts one of these solution-generated POs, they benefit from optimized delivery time, cost, and carbon footprint.
- Lambda receives Amazon Forecast inferences and reads/writes PO information on Amazon DynamoDB.
- Forecast provides inferences regarding the most probable future POs. Forecast uses POs, warehouse data, and goods delivery data to automatically train a machine learning (ML) model that is used to generate forecast inferences.
- Amazon DynamoDB stores PO and warehouse information.
- Lambda pushes PO, warehouse, and goods delivery data from Amazon DynamoDB into Amazon S3. These data are used to re-train the Forecast ML model, ensuring high-quality forecasting inferences.
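To make the PO-generation step concrete, here is a minimal sketch of how a solution-generated PO could be derived from a Forecast inference. The safety-stock rule, parameter names, and units are hypothetical, chosen only to illustrate the idea.

```python
# Sketch of deriving a solution-generated PO from a demand forecast
# (Solution 3). The replenishment rule below is an assumed heuristic.

def propose_po(forecast_demand: int, current_stock: int,
               min_stock: int, max_capacity: int):
    """Propose a PO quantity that covers forecast demand while keeping
    stock within the warehouse's min/max constraints; None if no PO is needed."""
    target = forecast_demand + min_stock        # cover demand, keep safety stock
    shortfall = max(0, target - current_stock)
    if shortfall == 0:
        return None                             # stock already sufficient
    # never order beyond the warehouse's remaining capacity
    return min(shortfall, max_capacity - current_stock)
```

A proposal such as `propose_po(forecast_demand=500, current_stock=200, min_stock=100, max_capacity=1000)` would be surfaced in the Amplify UI for the user to accept or reject.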
Solution 4: Optimize environmental impact associated with engineers’ interventions for customer fiber connections
Telco companies that provide end-users' internet connections need engineers to execute field tasks, like deploying, activating, and repairing subscribers' lines. In this scenario, it's important to identify the most efficient itinerary for each engineer.
IBM designed an AWS solution that automatically generates engineers’ itineraries that consider criteria such as mileage, carbon-emission generation, and electric-/thermal-vehicle availability.
The solution (Figure 4) provides:
- Customer management teams with a mobile dashboard showing carbon-emissions estimates for all engineers’ journeys, both in-progress and planned
- Engineers with a mobile application that includes an optimized itinerary and trip updates based on real-time traffic and unexpected events
- Management team and engineers connect to web/mobile application, respectively. Amazon Cognito provides authentication and authorization, Amazon S3 stores application static content, and API Gateway receives and forwards API requests.
- AWS Step Functions implements the different workflows. Application logic is implemented in Lambda, which connects to DynamoDB to get trip data (current route and driver location); Amazon Location Service provides itineraries, and an Amazon SageMaker ML model implements the itinerary optimization engine.
- Independently from online users, trip data are periodically sent to API Gateway and stored in Amazon S3.
- SageMaker notebook periodically uses Amazon S3 data to re-train the trip optimization ML model with updated data.
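The optimization engine must trade mileage against carbon emissions and vehicle availability. A toy version of that scoring could look like the sketch below; the emission factors and blending weight are illustrative assumptions, not values from the actual model.

```python
# Toy itinerary scoring for Solution 4. Emission factors (g CO2 per km)
# and the 50/50 blending weight are assumed for illustration.

EMISSION_G_PER_KM = {"electric": 20, "thermal": 180}

def itinerary_score(distance_km: float, vehicle: str,
                    carbon_weight: float = 0.5) -> float:
    """Blend mileage and estimated CO2 (kg) into one cost; lower is better."""
    co2_kg = distance_km * EMISSION_G_PER_KM[vehicle] / 1000
    return (1 - carbon_weight) * distance_km + carbon_weight * co2_kg

def assign_vehicle(distance_km: float, available: list) -> str:
    """Pick the available vehicle type with the lowest blended score."""
    return min(available, key=lambda v: itinerary_score(distance_km, v))
```

In the deployed solution this role is played by the SageMaker model, which is periodically re-trained on the trip data accumulated in Amazon S3.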
Solution 5: Improve the effectiveness of customer SAP level 1 support by reducing response times for common information requests
Companies using SAP usually provide first-level support to their internal SAP users. These users engage support (usually via a ticketing system) to ask for help with SAP issues or to request additional information. A high volume of information requests means significant effort to retrieve and provide information available in resources like SAP notes/documentation or similar past support requests.
IBM designed an AWS solution (Figure 5) that, based on support request information, automatically provides a short list of the most probable solutions, each with a confidence score.
- Lambda receives ticket information, such as ticket number, business service, and description.
- Lambda processes the ticket data, and Amazon Translate translates the text into the country's native language and English.
- SageMaker ML model receives the question and provides the inference.
- If the inference has a high confidence score, Lambda provides it immediately as output.
- If the inference has a low confidence score, Amazon Kendra receives the question, searches automatically through indexed company information and provides the best answer available. Lambda then provides the answer as output.
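The confidence-based routing in the last two steps can be summarized in a short Python sketch. The threshold value is an assumption, and the model and Kendra calls are passed in as stubs rather than real AWS SDK calls.

```python
# Sketch of confidence-based answer routing (Solution 5).
# The 0.8 threshold is an assumed cut-off; model_infer and
# kendra_search stand in for the SageMaker endpoint and Kendra query.

CONFIDENCE_THRESHOLD = 0.8

def answer_ticket(question: str, model_infer, kendra_search) -> dict:
    """Return the ML model's answer when confident; otherwise fall back
    to a Kendra search over indexed company documentation."""
    answer, confidence = model_infer(question)
    if confidence >= CONFIDENCE_THRESHOLD:
        return {"answer": answer, "source": "sagemaker",
                "confidence": confidence}
    return {"answer": kendra_search(question), "source": "kendra",
            "confidence": confidence}
```

In production, `model_infer` would call the SageMaker endpoint and `kendra_search` would issue an Amazon Kendra query; the dict returned here is what Lambda would hand back to the ticketing system.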
Solution 6: Improve contact center customer experience providing faster and more accurate customer support
Insured customers often interact with insurer companies using contact centers, requesting information and services regarding their insurance policies.
IBM designed an AWS solution that improves end-customer experience and contact center agent efficiency by providing automated summarization of customer-agent calls and chats. This enables:
- The agent to quickly recall the customer's need in subsequent interactions
- The contact center supervisor to quickly understand the objective of each case (intervening if necessary)
- Insured customers to quickly get the information they need, without repeating information already provided
Summarization capability is provided by generative AI, leveraging large language models (LLMs) on SageMaker.
- A pretrained LLM from Hugging Face is stored on Amazon S3.
- The LLM is fine-tuned and trained using Amazon SageMaker.
- The LLM is made available as a SageMaker API endpoint, ready to provide inferences.
- The insured user contacts customer support; the user request goes through the voice/chatbot, then reaches Amazon Connect.
- Lambda queries the LLM; the inference is sent to an Amazon Connect instance, where it is enriched with knowledge-base search using Amazon Connect Wisdom.
- If the user–agent conversation was a voice interaction (like a phone call), then the call recording is transcribed using Amazon Transcribe. Then, Lambda is called for summarization.
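The channel-dependent flow above, transcribe voice first, then summarize, can be sketched as follows. The `transcribe` and `summarize` callables are stand-ins for Amazon Transcribe and the SageMaker LLM endpoint; the interaction fields are assumed for the example.

```python
# Sketch of the summarization flow (Solution 6). transcribe() and
# summarize() are stubs for Amazon Transcribe and the LLM endpoint.

def summarize_interaction(interaction: dict, transcribe, summarize) -> str:
    """Voice interactions are transcribed first; chat transcripts go
    straight to the LLM summarization endpoint."""
    if interaction["channel"] == "voice":
        text = transcribe(interaction["recording_uri"])  # speech-to-text
    else:
        text = interaction["transcript"]                 # chat log as-is
    return summarize(text)
```

The resulting summary is what the agent, supervisor, and follow-up interactions consume, so neither the customer nor the agent has to reconstruct the case history manually.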
In this blog post, we have explored how IBM Consulting delivered an Innovation Hackathon in France. During the Hackathon, IBM worked backward from real customer use cases, designing and building innovative solutions using AWS services.