The Tehanu project in Rwanda’s Volcanoes National Park has demonstrated a groundbreaking use of generative AI to infer and act on the interests of mountain gorillas. Using technology from Amazon Web Services (AWS) and AWS Partner Anthropic, with the support of AWS Partner Adastra, Tehanu created an automated pipeline to process behavioral data on gorillas, enabling the first-ever digital financial transactions by a non-human species. The AI solution synthesized vast academic and observational data, aligning conservation actions with species-specific preferences while supporting biodiversity efforts. This scalable, innovative approach sets a precedent for using AI to foster coexistence across species worldwide.
UK-based charity Lyra in Africa helps children in rural Tanzania attend and complete secondary school. Established in 2012, it has delivered 15 hostels for girls at rural government schools and provides an offline digital learning program to partner high schools, each equipped with a computer lab stocked with preloaded learning content. Lyra also works with two teacher training colleges to boost IT literacy among teachers. Until recently, the organization relied on service providers in the UK to collect donations from individual sponsors, but that was proving an expensive fundraising strategy. Lyra turned to AWS Partner Softcat to build a low-cost online donation platform on Amazon Web Services (AWS). The system automates payment processing and recordkeeping, helping the charity automatically claim relief under the UK government’s Gift Aid tax program.
Darwinbox wanted to reduce the time its PyTorch models took to score resumes against job descriptions. AWS Premier Partner Minfy helped the company move inference to Amazon SageMaker on AWS Inferentia, achieving 87% faster inference without retraining. The key steps were compiling the models with the Neuron SDK, extending the SageMaker inference containers, using SageMaker Inference Recommender to find optimal configurations, and sending requests in mini-batches; the compilation step is sketched below.
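The following is a minimal sketch of the Neuron compilation step, assuming an Inf1/torch-neuron environment and a Hugging Face-style PyTorch encoder. The model name, sequence length, and batch size are illustrative placeholders, not Darwinbox's actual pipeline; the compiled artifact is what an extended SageMaker container would load at serving time.

```python
import torch
import torch_neuron  # registers the torch.neuron namespace (Neuron SDK compiler)
from transformers import AutoTokenizer, AutoModel

MODEL_ID = "sentence-transformers/all-MiniLM-L6-v2"  # placeholder model
MAX_LEN = 128   # fixed sequence length baked into the compiled graph (assumed)
BATCH = 8       # mini-batch size sent per request (assumed)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID, torchscript=True)
model.eval()

# Example inputs with the fixed shape the compiled graph will expect.
dummy = tokenizer(
    ["example resume text"] * BATCH,
    padding="max_length",
    truncation=True,
    max_length=MAX_LEN,
    return_tensors="pt",
)
example_inputs = (dummy["input_ids"], dummy["attention_mask"])

# Ahead-of-time compile the traced graph for Inferentia; no retraining,
# only compilation of the existing weights.
neuron_model = torch.neuron.trace(model, example_inputs)

# Save the compiled artifact for packaging into the SageMaker model archive.
neuron_model.save("model_neuron.pt")
```

At inference time, requests would be padded to the same fixed shape and grouped into mini-batches of the compiled size, since Neuron-compiled graphs expect the input dimensions used at trace time.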