It is forecast that in 2012, 450 million smartphones with cameras will be sold, increasing to 650 million units in 2013. Those with an interest in commercial applications of computer vision simply cannot afford to ignore this growth in “smart cameras” enabled by mobile devices. This tutorial is intended to be hands-on. We will:
- Review OpenCV, the Open Source Computer Vision Library.
- Cover some of the tools for developing vision applications on mobile devices, with a focus on OpenCV.
- Show, step by step, how to implement vision applications on Android and iOS.
- Walk through the implementation of a classical face detection sample.
- This application can serve as a stub that attendees can modify for their own applications. As time permits, we will guide and advise attendees in starting their own applications.
- Slides for the iOS part:
- The sample application:
- OpenCV iOS tutorials:
Prerequisites for the practice session
- OpenCV development page
- OpenCV Android page
- Tutorials on OpenCV with Android: Using Android binary package with Eclipse, Using OpenCV with Android binary package
- Tutorials on OpenCV for iOS
- UPDATE: Bring the USB-to-microUSB cable along with your Android/iOS device!
iOS part prerequisites
- Bring a Mac with OS X 10.7 and Xcode 4.3 or later. OS X 10.8 DP and Xcode 4.5 beta might work too. That will be enough to run some of the samples in the iOS Simulator; note that the camera is not available in the simulator.
- In order to run the sample apps on a device, you should enroll as an iOS developer at http://developer.apple.com (which costs $99 per year) and then register your device using the Xcode Organizer: http://developer.apple.com/library/ios/#documentation/ToolsLanguages/Conceptual/Xcode4UserGuide/Devices/Devices.html
- To check that you did everything correctly, download, build, and run the GLCameraRipple sample on your device: http://developer.apple.com/library/ios/#samplecode/GLCameraRipple/Introduction/Intro.html
- Bring this registered device and its USB-to-Dock connector cable.
Android part prerequisites
- Bring a laptop (running Windows 7 or Mac OS X; Linux may work too) with the Java Development Kit installed (http://java.oracle.com). We also suggest you complete the “Introduction into Android Development” tutorial, which provides step-by-step instructions on setting up the Android development environment. That will be enough to run some of the samples on the emulator.
- Bring your device and its USB cable. It would be great if you learn how to run the standard Android samples on your device before the tutorial; otherwise, because of the huge variety of devices, it may or may not work during the tutorial.
Dr. Vincent Rabaud joined Willow Garage in January 2011 as a research engineer in computer vision. With a background in structure from motion, his current focus is teaching a robot to recognize objects for grasping. Among other things, he is working on acquiring a database of household objects, developing 3D object recognition, and implementing fast feature detection on cellphones. His research interests include 3D, tracking, face recognition, and anything that involves underusing CPUs by feeding them very fast algorithms. Dr. Rabaud completed his PhD at UCSD, advised by Professor Belongie. He also holds an MS in space mechanics and space imagery from SUPAERO and a BS/MS in optimization from the Ecole Polytechnique.
Andrey Pavlenko has been working in the computer vision field for the last two years. He developed the Java API for the OpenCV library and made key contributions to OpenCV for Android development, the Android samples, and the tutorials. Before joining Itseez, Andrey worked at the Intel Nizhny Novgorod Lab for 13 years on various projects, including video codecs and debugging and performance-analysis tools. He also holds an MS in Computer Science from Nizhny Novgorod State University.