As part of Roboflow's new partnership with OpenCV, I had the opportunity to be involved with the first round of the 2021 OpenCV AI Competition. If you haven't heard of it, it's the world's biggest spatial AI competition, with over 1,200 teams competing for over $400,000 in prizes. The first round required teams to submit a project proposal. Over 200 proposals will be selected to move on to the second round, where each team gets three months, Luxonis OAK-D devices, and additional infrastructure to make their proposal a reality.
This meant that I got to see a bunch of innovative, cutting-edge applications of computer vision, submitted from all around the world! While reading through these proposals, I noticed a few themes in how people are looking to innovate. Here are five emerging trends in computer vision applications.
- Using computer vision to assist the visually impaired. Driven in part by the competition format, I saw dozens of submissions focused on using computer vision to make everyday tasks easier for the visually impaired. Examples include real-time understanding of the objects and surfaces around the wearer: where cars are (and what trajectory they are on), where roads, sidewalks, and crosswalks are, and more application-specific indicators like which aisle you are in when grocery shopping. These all leverage a new type of spatial sensing fused with AI called Spatial AI.
- Using computer vision to bolster workplace safety. In the workplace, dangers aren't limited to those with visual impairments. For example, each year 65,000 people wearing hard hats are injured and 1,000 are killed! A number of submissions focused on reducing workplace injuries and deaths by identifying particularly risky areas and warning individuals who step into them.
- Using computer vision to slow the spread of COVID. Speaking of workplace safety, perhaps it's no surprise that one of the most common topics was the single thing that has united all of us: COVID. Teams submitted proposals to build technology to help encourage social distancing in enclosed spaces like stores, or to detect whether people were wearing face masks or using other PPE.
- Using computer vision in agriculture. Working for Roboflow, I’ve gotten the chance to see a lot of applications of computer vision to agriculture. (It comes with the territory of having a lot of Iowans in the office!) However, the competition submissions explored newer examples of how to make farming and growing cheaper, more efficient, and faster. More accurately estimating the size and quantity of plant leaves and replicating an extremely expensive phenotyping setup at a fraction of the cost are two such examples of improving agriculture with computer vision.
- Using computer vision to help animals. We commonly hear examples of computer vision in healthcare for humans, but a lot of those techniques may apply to animals as well. For example, monitoring multiple animal patients at the same time has the potential to improve outcomes, and submissions in the competition sought to do exactly that. There were also submissions focused on monitoring various species of animals to improve conservancy efforts around the globe.
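To make the workplace-safety idea above concrete, here's a minimal sketch of the kind of check such a system might run once a detector has localized each worker: a plain ray-casting point-in-polygon test against a hand-drawn danger zone. All names, coordinates, and the zone itself are hypothetical illustrations, not taken from any competition entry.

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: is (x, y) inside the polygon (a list of vertices)?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge cross the horizontal ray extending right from (x, y)?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


def workers_in_danger(worker_positions, danger_zone):
    """Return indices of workers standing inside the danger zone."""
    return [i for i, pos in enumerate(worker_positions)
            if point_in_polygon(pos, danger_zone)]
```

For example, with `danger_zone = [(0, 0), (10, 0), (10, 10), (0, 10)]`, a worker at `(5, 5)` would be flagged while one at `(20, 5)` would not. In a real system, the worker positions would come from a person detector (e.g., the feet point of each bounding box), projected into floor coordinates.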
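Similarly, the social-distancing proposals mostly reduce to a simple geometric check once people have been detected: compare pairwise distances between detection centers against a threshold. The sketch below is an assumption-laden illustration (it takes `(x1, y1, x2, y2)` pixel boxes and a pixel-space threshold); a real system would calibrate pixel distances to real-world distances, for instance using the OAK-D's depth output.

```python
import math


def centroid(box):
    """Center of an (x1, y1, x2, y2) bounding box."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2, (y1 + y2) / 2)


def too_close_pairs(boxes, min_distance):
    """Return index pairs of detections closer than min_distance (in pixels)."""
    centers = [centroid(b) for b in boxes]
    pairs = []
    for i in range(len(centers)):
        for j in range(i + 1, len(centers)):
            if math.dist(centers[i], centers[j]) < min_distance:
                pairs.append((i, j))
    return pairs
```

With three detections at `(0, 0, 10, 10)`, `(5, 0, 15, 10)`, and `(100, 100, 110, 110)` and a 50-pixel threshold, only the first two would be flagged as too close.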
These only provide a glimpse into the impressive, innovative projects that are being put together by teams around the globe. Those teams advancing to round 2 will be announced on Wednesday, 3 March – and we can’t wait!
As the competition continues, I’ll be sure to share more details around some of the innovative proposals, who advances to future rounds, and who ultimately wins – I’m excited to see these proposals come to life!
Wondering where to get started with your own computer vision problems? Brainstorm an idea, define your computer vision problem, and check out Roboflow and OpenCV! (If you use OAK devices and need to know “how to deploy to the Luxonis OAK-1” or “how to deploy to the OAK-D,” we’ve got you covered!)
This post originally appeared on the Roboflow Blog.