2024

KINTO Technologies Improves Development Agility with Amazon Bedrock, Reducing Data Analysis Man-Hours by 50% While Improving Quality

50%

Reduction in man-hours for X follower analysis

Improved accuracy of code reviews

Reduced load of creating pull requests

Reduced load of tech blog reviews

Richer, more expressive tech blog content

Overview

KINTO Technologies Co., Ltd., a financial services company in the Toyota Group, is responsible for developing the car subscription service "KINTO." To improve development agility, the company is expanding its generative AI use cases and has built an in-house primary code review tool and a minimal generative AI platform using Amazon Bedrock. These use cases have spread throughout the company, raising the technical level of the entire engineering team.

Business Issues | Creating Company-wide Use Cases Aiming to Be the Group's No. 1 Generative AI Utilization Team

KINTO Technologies (hereafter, KTC) supports the mobility services developed by the Toyota Group with technology. Since its establishment in 2021, the company has pursued a policy of in-house development and has actively trained highly skilled engineers. “To become a company that uses the latest technology more than anywhere else, we incorporate it into product development while constantly keeping up with new information. To develop such human resources, we are creating an environment where it is easy for people in the field to take on challenges, for example by fully supporting study sessions, offering various trainings, encouraging participation in external communities, and promoting tech blogs,” says Hitoshi Kageyama, CIO and Executive Vice President.

Against this background, the company focused on generative AI from an early stage, launching a cross-departmental project and calling on all employees to create use cases.

“Driven by the goal of becoming number one in generative AI within the Toyota Group, we actively encourage employees to try things without fear of failure. Once ideas that emerge spontaneously take shape, the company fully supports them, and they are deployed company-wide as success stories.” (Mr. Kageyama)

Two representative use cases that emerged from this initiative are the primary review of program code using Amazon Bedrock and the minimal generative AI utilization platform.


"We would like AWS to be the first to provide the latest technical information on generative AI while collaborating with us to create cutting-edge use cases."

Hitoshi Kageyama
Executive Vice President of KINTO Technologies Co., Ltd.

Solution | Adopting Amazon Bedrock for Its Compatibility with AWS, the Company's Core Technology

The primary review of program code uses the open-source AI tool PR-Agent. The DBRE group, a cross-company organization at KTC that helps solve database-related issues, adopted PR-Agent to automate the primary review of pull requests (PRs), aiming to reduce the time developers spend on code reviews.

Motoaki Hoshino, DBRE Engineer in the DBRE Group, said, "We adopted PR-Agent after evaluating that it is OSS, supports multiple language models, and is easy to implement. We evaluated large language models (LLMs) for their integration with the AWS services used at KTC and for how quickly they could be deployed. As a result, we adopted Claude 3.5 Sonnet, available on Amazon Bedrock."
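
The snippet below is a minimal sketch, not KTC's implementation: PR-Agent drives the model invocation internally, but calling Claude 3.5 Sonnet on Amazon Bedrock with boto3 looks roughly like this. The region, review prompt, and diff placeholder are illustrative assumptions.

```python
# Hedged sketch: calling Claude 3.5 Sonnet on Amazon Bedrock via the Converse API.
# The review prompt and region are illustrative assumptions, not KTC's actual setup.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")  # example region

diff = "..."  # unified diff of the pull request under review

response = bedrock.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
    system=[{
        "text": "You are a code reviewer. Point out bugs, risky changes, and "
                "issues a human reviewer is likely to overlook."
    }],
    messages=[{"role": "user", "content": [{"text": f"Review this diff:\n{diff}"}]}],
    inferenceConfig={"maxTokens": 2048, "temperature": 0.2},
)

# The assistant's review text is returned in the first content block.
print(response["output"]["message"]["content"][0]["text"])
```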

When introducing PR-Agent, the team iterated on prompt tuning so that the intended output could be obtained for a given input. Prompts are managed on the GitHub Wiki, which makes it easy for team members to modify them.

“Because Amazon Bedrock is a managed service, we were able to concentrate on developing prompts without the workload of connecting to models.” (Mr. Hoshino)

The introduction of PR-Agent has reduced the load of PR creation and improved review accuracy.

“Creating a PR title and description, which used to take about 10 minutes, is now completed instantly. Because it points out parts that are often overlooked during review, the accuracy of code reviews has improved.” (Mr. Hoshino)

PR-Agent, which specializes in code reviews, is also used by the company for tech blogs. Masaki Hirose, Principal DBRE Engineer in the DBRE Group, said, "Introducing it to tech blogs has reduced the review load. In addition to checking typos and grammar, it differs from conventional static analysis review tools in that it can also review what content should be included to make an article more appealing and how to make it easier to understand, and users have told us that the LLM's feedback was helpful."

Currently, building on the technology developed with PR-Agent, the company is working on a from-scratch schema review using generative AI. In the future, it plans to apply this to multiple projects to provide an even better developer experience.

The other use case, simplified data analysis, was initiated because no existing services suited small-scale cases. Generic AI web applications lack the necessary features, while specialized AI applications with retrieval-augmented generation (RAG) are highly functional but expensive. The company therefore decided to build a simple, serverless analysis platform that does not require RAG.

Takuya Uehira of the Data Analysis Group, which supports data analysis across the entire company, said, "Initially, we thought generative AI could be used to create SQL. There was also a need to summarize questionnaire results and other material from the field, so we started building a minimal generative AI utilization platform. 'Minimal' here means the minimum necessary architecture with nothing wasted. Even so, it can handle batch input of about 140,000 characters."

For development, Amazon Bedrock and AWS Lambda were adopted in a serverless configuration to reduce maintenance costs and man-hours. The prompt generation process is provided as an add-in, so even users unfamiliar with programming can use it easily.

"The deciding factor for adopting AWS was that all the necessary features for building AI applications are serverless. Prototype development was completed in about 2 weeks. The feature is that users can arbitrarily change the AI mode, and since the number of modes increases simply by creating a prompt and placing it on Amazon Simple Storage Service (Amazon S3), new functions can always be used.” (Mr. Uehira)

The minimal generative AI utilization platform is currently being used for tasks such as searching through more than 100 data analysis dashboards to find those that align with the user's objectives, and analyzing the profiles of users who follow X accounts of various services and products.

“In the X follower analysis project, we were able to reduce man-hours by around 50%. Because reference data and system prompts can be provided in advance, some users say it has become easier to get good results from their prompts.” (Mr. Uehira)

Going forward, the company plans to expand applications to areas such as automatic SQL generation and automatic data extraction in Japanese. Hideki Tomonaga, Principal Data Scientist and Assistant Manager, said, "Our plan is to expand the user base to all employees. To do that, we need to raise awareness, so we will continue to share information through tech blogs and other channels."

Architecture

Primary review of program code

Minimal generative AI utilization platform


Implementation Effects | Gaining a Range of Benefits by Applying Generative AI to the Business

KTC is seeing an increase in generative AI use cases applied to its business. Encouraged by these results, Kageyama intends to build an organizational structure to support this momentum going forward.

“New technologies such as generative AI are appearing one after another, and it is becoming difficult to keep up at the individual level. In the next phase, we will aim to raise our technical level while managing this organizationally as a company. I would like AWS to continue being the first to provide the latest technical information, and to work with us to create cutting-edge use cases for Japanese companies.” (Mr. Kageyama)

Customer Profile: KINTO Technologies Co., Ltd.

With the vision of "bringing 'emotion' to each person's 'mobility' as a leading mobility platform," the company designs, develops, and operates information systems in the digital field. As products supporting the Toyota Group's mobility services, it develops the subscription service "KINTO ONE," the customization service "KINTO FACTORY," and "PRISM," an inspiration app that uses AI to suggest travel destinations.

Hitoshi Kageyama

Hideki Tomonaga

Masaki Hirose

Takuya Uehira

Motoaki Satoyama

Key Services Currently In Use

Amazon Bedrock

The easiest way to build and scale generative AI applications using foundation models.

AWS Lambda

AWS Lambda lets you run code without provisioning or managing servers. You pay only for the compute time you consume.

Amazon S3

Amazon Simple Storage Service (Amazon S3) is an object storage service that offers industry-leading scalability, data availability, security, and performance.

Amazon API Gateway

Amazon API Gateway is a fully managed service that makes it easy for developers to create, publish, maintain, monitor, and secure APIs at any scale. The API serves as a "front door" for applications to access data, business logic, and functionality from back-end services.



Get Started

Organizations of all sizes across all industries are transforming their businesses and delivering on their missions every day using AWS. Contact our experts and start your own AWS journey today.