AWS DeepLens Challenge
Build machine learning projects using AWS DeepLens and make a difference!
Through a series of themed challenges running in 2018, you have the opportunity to combine your ideas with the capability of AWS DeepLens to create machine learning projects that can have a positive impact on the world. These challenges will help you gain valuable machine learning experience within a fun, collaborative, and inspiring environment. Along the way, you'll help improve people's lives and support non-profits that benefit our society.
It has been said that giving is not about making a donation, but about making a difference. The rewards will vary by challenge, but they include a donation per project submission to a challenge-related non-profit organization and a blog showcase for the projects with the most potential for positive impact. The biggest reward of all is that you get to learn while building something that could have a profound beneficial effect on the community. Project submissions and their authors will be promoted by AWS.
AWS DeepLens allows developers of all skill levels to get started with deep learning through example projects, computer vision models, tutorials, and real-world, hands-on exploration on a physical device. You can choose from a collection of pre-trained models, available in the AWS DeepLens console and ready to run on the device with a single click, or build custom models with Amazon SageMaker. Check out the AWS DeepLens resources available to get started today. Join the AWS DeepLens Challenge Slack Channel to collaborate with your fellow participants!
Challenge #1: Inclusivity
We’re continuing the popular Inclusivity Challenge that started this summer. Use your DeepLens to create a project that fosters inclusion, overcomes barriers, and strengthens bonds between people. Your project might help bridge a gap you have seen in your daily life, or it could focus on any aspect of inclusivity, such as developmental disability, race, gender, education, geographic location, or sexual orientation. Let’s put machine learning to work to make the world a better place for everyone!
Here’s Gene Boes, the CEO of the Northwest Center, to explain why inclusivity is important and how DeepLens can help.
Here’s a project called ASLens, created by Chris Coombs. ASLens uses AWS DeepLens to translate the American Sign Language alphabet to speech.
For each project that meets the entry criteria, you will receive $50 in AWS credits, and we will donate $249 to the Northwest Center, a charity that serves children and adults with developmental disabilities.
Challenge #2: Sustainability
For the Sustainability Challenge, use your DeepLens to create a project that highlights respect for the environment. We’re hoping to see projects that deal with ecological challenges: you might identify flora and fauna, track plant growth rates, encourage waste reduction, or even improve waste sorting, as you’ll see in the video below.
Here’s Amazon Worldwide Sustainability Manager, Alexis Fuge, talking about some of the innovative work Amazon has been doing to use DeepLens to help with waste management.
If you’re interested in the flora and fauna aspect of sustainability, check out this project created by Paul Langdon called Backyard Birder. It identifies birds and squirrels and keeps track of how many squirrels disrupt the bird feeders each day.
For each project that meets the entry criteria, you will receive $50 in AWS credits, and we will donate $249 to Engineers for a Sustainable World, a nonprofit network made up of people passionate about technical sustainability from across the world.
Challenge #3: Games
For the Games Challenge, use your DeepLens to create a project that focuses on playing games, sports, and puzzles. Your project may cover any area of gaming from single player to multi-player games, cooperative or competitive, either local or over the internet around the world. Feel free to adapt an existing game or create your own from scratch.
Here’s Geogrify's CEO, Kate Edwards, with a look at the subject of games.
For an example of a project that fits this challenge, take a look at this DeepLens-powered Simon Says game created by Abdul Rahman Abdul Ghani and Michael Bawiec. This project showcases a Simon Says deep learning platform where everyone can join the same global game, using DeepLens to verify the correct action of each player.
For each project that meets the entry criteria, you will receive $50 in AWS credits, and we will donate $249 to Girls Who Code, a group that creates workshops and books for girls to learn programming at an early age.
Challenge #4: Health
For the Health Challenge, you can use your DeepLens to develop a project about boosting your mental or physical health. Think of something in your life you want to improve, and figure out how DeepLens and AI might help get you there. We’re hoping to see projects that range from helping patients with chronic diseases to yoga pose recognition and exercise counting.
We have a couple of example projects that fit this challenge. Check out DermLens created by Terje Norderhaug and Tom Woolf, which allows patients with psoriasis to monitor and manage their condition using DeepLens, and Exercise Counter created by Tomas Holcman which counts and records repetitions of exercises to show personal statistics over time.
For each project that meets the entry criteria, you will receive $50 in AWS credits, and we will donate $249 to Hopecam, a non-profit which donates video cameras and tablets to overcome the social isolation experienced by children in treatment for cancer.
If AWS offers to make a donation to a charity for every eligible entry, AWS will donate $249 for each eligible entry to a challenge. A minimum donation of $5,000 will be made to a charity of AWS’s choice, and the total donation will be capped at $10,000. For an entry to be eligible, all of the fields in the submission form must be completed and the entry must be submitted by each challenge’s respective deadline.