Microsoft's annual Imagine Cup student developer competition would be easy to mistake for a commercial app accelerator. These weren't science fair presentations or even 24-hour hackathon projects; almost every entry was a near-professional prototype, and many of the finalists had interest from investors, government departments and universities in carrying their projects forward.
That didn't just apply to the three grand finalists, all of whom have investors, partners, medical facilities or government departments ready to work with them, but also to many of the 49 teams from around the world who made it to the final. That includes projects like Hachy, which uses a smartphone camera and Azure's custom image recognition service, instead of a $4,000 spectrometer, to check the progress of fertilized chicken eggs on their way to hatching. The Dairy Association of China wants to start using that app.
Boomerang, a project that geotags asthma inhalers with Azure IoT and Wi-Fi to help sufferers (or their parents) find where they last left the inhaler, is in a 12-month, 200-user study with Johns Hopkins University to validate clinical outcomes (if you can find your inhaler, you'll use it more often).
Drugsafe uses OCR and machine learning to check whether drug packaging is counterfeit (about 40% of drugs in India are counterfeit); the team is talking to pharmaceutical companies like Bayer and Pfizer about signing up for a Power BI dashboard that shows them which drug packaging is easiest to counterfeit.
Ezaki-lab's machine learning-based automated fish feeding system has been running at a fish farm for nine months and has already saved nearly $3,000 in unnecessary food costs by not pouring out fish food when there aren't any fish around. Adam ROBO is an automated eye test system, built with Node, Python and Cosmos DB, that's already in use in 100 cities in Brazil; the team has a patent and interest from universities and multiple medical authorities, as well as from the World Health Organization, which wants to use the patient data to study the effect of factors like higher ultraviolet levels on eye health.
Pengram built a remote assistance system that overlays 3D CAD models on the equipment technicians are working with, using augmented reality headsets like the HoloLens, pairing them with a remote expert who works in a VR headset; the team is already working with a Fortune 100 company that wants to use it to assess its energy grid, as well as with HTC, which wants to offer it in the education market.
Vinculum is working with the UN High Commissioner for Refugees to help reunite separated refugee families using facial recognition on family photos. InterviewBot, a UK entry that gives you feedback on video interviews (everything from whether you mentioned key phrases from your CV to how often you said 'um') to help you prepare for real-life interviews, was already pitching to university careers departments, but the team is also talking to LinkedIn.
Microsoft doesn't run the Imagine Cup as a form of "unicorn hunting", Charlotte Yarkoni, the CVP for growth ecosystems, assured us. "It's meant to be a showcase of technology and innovation," she said. "A big part of the value for Microsoft is not just being engaged in the community but getting these insights; not just 'what are the problems this generation is thinking about' because that's what grows into this innovation but 'how are you thinking about solving it'."
And this year, the way a lot of the teams are thinking about solving it is with AI and machine learning. Around 40% of the entries in the entire competition had AI-based solutions, and the proportion was even higher among the 49 finalists. Only ten of the projects didn't specifically talk about using image recognition, speech recognition, machine learning, deep learning or other AI techniques (and that doesn't mean they weren't using them). Some of them used Microsoft's Cognitive Services, like the Face API, text analytics, speech to text and custom vision. But just as many had built their own machine learning models, either in the Azure ML service or using open source frameworks like TensorFlow, for everything from monitoring beehives to classifying the taste of coffee to checking whether a pineapple is ripe.
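Calling a trained custom vision model from a student project is genuinely little code. As a rough sketch (the endpoint region, project ID, iteration name and key below are all placeholders, and the URL shape follows the Custom Vision REST prediction API as documented; check the portal for your own values), classifying an image is a single authenticated POST of the raw image bytes:

```python
# Sketch: asking an Azure Custom Vision model to classify one image over
# plain HTTPS. All identifiers below are placeholders, not real credentials.
import urllib.request

def build_prediction_request(endpoint, project_id, iteration, key, image_bytes):
    """Return a urllib Request that submits one image for classification."""
    url = (f"https://{endpoint}/customvision/v3.0/Prediction/"
           f"{project_id}/classify/iterations/{iteration}/image")
    return urllib.request.Request(
        url,
        data=image_bytes,  # raw image bytes, e.g. a candled-egg photo
        headers={
            "Prediction-Key": key,                      # from the portal
            "Content-Type": "application/octet-stream",
        },
        method="POST",
    )

req = build_prediction_request(
    "westus2.api.cognitive.microsoft.com",  # placeholder region endpoint
    "my-project-id", "Iteration1", "my-prediction-key", b"<image bytes>")
print(req.full_url)
```

Sending the request (with `urllib.request.urlopen(req)`) returns JSON with per-tag probabilities; the training side, uploading and tagging sample images, happens in the Custom Vision portal rather than in code.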
Of course, just using AI techniques doesn't mean the results are correct or helpful.
Team TBC used machine learning "to determine psychogeometric characteristics of a person from a photograph" to help recruiters find people who would fit the organizational culture of the company. Psychogeometry is usually about analysing someone's personality type by which geometric shape they choose, and this idea seems closer to phrenology. Plus, hiring for culture fit isn't how you get diversity.
If you're sarcastic in an interview, will the sentiment analysis of your interview transcript pick it up correctly?
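Probably not, because most sentiment analysis scores words rather than intent. A deliberately naive bag-of-words scorer (the lexicon and sentences here are invented for illustration, not taken from any real service) shows the failure mode: a sarcastic complaint reads as glowing praise.

```python
# Toy bag-of-words sentiment scorer -- a deliberately naive illustration of
# why sarcasm defeats word-level sentiment analysis. The lexicon is made up.
LEXICON = {"great": 1, "wonderful": 1, "love": 1,
           "terrible": -1, "awful": -1, "hate": -1}

def naive_sentiment(text):
    """Sum the lexicon scores of the words in the text (positive = happy)."""
    words = text.lower().replace(",", " ").replace(".", " ").split()
    return sum(LEXICON.get(w, 0) for w in words)

# A sarcastic complaint scores as strongly positive:
print(naive_sentiment("Oh great, I just love waiting. Wonderful."))  # 3
```

Real services use trained models rather than a fixed lexicon, but the underlying problem is the same: without tone of voice or context, "great" counts as positive no matter how it was meant.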
And while there's a lot of research in the area, can iCry2Talk, the machine-learning system that was one of the three grand finalists, really tell you whether your baby is crying because it's hungry, tired, lonely or needs burping or a new nappy? You can't ask a baby what it meant, and if the app tells you to change the nappy and your baby stops crying because the change meant they got the cuddle they really wanted, can you be sure what the cry meant?
But the fact that students can tackle such ambitious ideas, good or bad, with any chance of success is a tribute to how accessible AI techniques have become. It's always heartwarming to see the enthusiasm of the Imagine Cup students for tackling hard real-world problems that urgently need solving. It's impressive that AI tools have advanced far enough that even students can pick them up quickly enough to use in projects like these. And with so much interest in taking these projects further, we might get to see whether those AI tools provide real, useful results in use (because that's not guaranteed).