OpenCV provides a set of samples for Android developers. These samples show how OpenCV can be used from both the Java and the native (C++) level of Android. There are two groups of samples: samples demonstrating the Java and C++ OpenCV APIs, and a group of sample applications. The first group is named “Tutorial #” and covers aspects important for a beginner: using OpenCV in Java and C++, working with the camera, and mixing Java and C++ calls to OpenCV in a single application.
Java and C++ API usage
The first group of samples illustrates how to use the OpenCV Java API in your project. Follow the “OpenCV for Android SDK” tutorial to learn how to build them:
- Tutorial 1 – Camera Preview – shows the simplest way an Android application can use OpenCV, i.e. via the OpenCV application helper classes. It displays a full-screen preview using either the Java or the native camera API and allows switching between them.
- Tutorial 2 – Mixed Processing – shows ways of pre-processing camera preview frames with both Java and C++ calls to OpenCV.
- Tutorial 3 – Camera Control – shows basic camera manipulation in an OpenCV-based Android application. In particular, it changes the camera preview resolution, activates the camera’s built-in effects, and takes and saves a still picture.
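The camera-preview pattern the tutorials build on can be sketched as a minimal Activity using OpenCV’s helper classes (`CameraBridgeViewBase` and `CvCameraViewListener2` from the OpenCV4Android Java API). The layout resource identifiers (`R.layout.activity_main`, `R.id.camera_view`) are placeholders; a real project defines them in its own layout XML:

```java
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.Mat;

import android.app.Activity;
import android.os.Bundle;

public class PreviewActivity extends Activity
        implements CameraBridgeViewBase.CvCameraViewListener2 {

    private CameraBridgeViewBase mCameraView;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        mCameraView = (CameraBridgeViewBase) findViewById(R.id.camera_view);
        mCameraView.setCvCameraViewListener(this);
    }

    @Override
    public void onResume() {
        super.onResume();
        // The OpenCV native library must be loaded before the preview starts.
        if (OpenCVLoader.initDebug()) {
            mCameraView.enableView();
        }
    }

    @Override
    public void onPause() {
        super.onPause();
        if (mCameraView != null) {
            mCameraView.disableView();
        }
    }

    @Override
    public void onCameraViewStarted(int width, int height) {}

    @Override
    public void onCameraViewStopped() {}

    @Override
    public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
        // Returning the frame unmodified simply displays the preview;
        // per-frame OpenCV processing would go here.
        return inputFrame.rgba();
    }
}
```

All OpenCV work happens in `onCameraFrame`, which is exactly where the tutorials insert their Java or JNI processing calls.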
There are also several sample applications written mostly in Java. They grab a frame from the camera, do some processing with OpenCV, and visualize the result on the device screen.
- Sample – image-manipulations – demonstrates how OpenCV can be used as an image processing and manipulation library. It supports several filters, and demonstrates color space conversions and working with histograms. It has no special relation to computer vision, but OpenCV’s powerful imgproc module may be useful in a wide range of applications, especially in the field of computational photography.
- Sample – 15-puzzle – shows how a simple game can be implemented with just a few calls to OpenCV. It is available on Google Play.
- Sample – face-detection – the simplest implementation of face detection on Android. It supports two modes of execution: the Java wrapper for the cascade classifier, available by default, and a manually crafted JNI call to a native class that also supports tracking. Even the Java version shows close to real-time performance on a Google Nexus One device.
- Sample – color-blob-detection – shows a trivial implementation of a color blob tracker. The user points to a region, and the algorithm tries to select the whole blob of similar color. Working with the touch interface and contours is demonstrated.
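The core of the face-detection sample, the Java cascade-classifier wrapper, can be sketched as follows. This is a simplified illustration, not the sample’s actual code: the cascade file path is a placeholder (the sample copies the cascade from its raw resources at startup), and the minimum-face-size fraction is an assumed value chosen to keep detection fast. `Imgproc.rectangle` is the OpenCV 3.x+ name (it was `Core.rectangle` in OpenCV 2.4):

```java
import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;
import org.opencv.core.Rect;
import org.opencv.core.Scalar;
import org.opencv.core.Size;
import org.opencv.imgproc.Imgproc;
import org.opencv.objdetect.CascadeClassifier;

public class FaceAnnotator {
    private final CascadeClassifier mCascade;

    public FaceAnnotator(String cascadePath) {
        // cascadePath is a placeholder, e.g. a file extracted from app resources.
        mCascade = new CascadeClassifier(cascadePath);
    }

    // Draws a green rectangle around every detected face, in place.
    public void annotate(Mat rgbaFrame) {
        Mat gray = new Mat();
        Imgproc.cvtColor(rgbaFrame, gray, Imgproc.COLOR_RGBA2GRAY);

        // Ignoring faces smaller than ~20% of the frame height is an
        // assumption that trades recall for speed.
        int minSize = Math.round(gray.rows() * 0.2f);
        MatOfRect faces = new MatOfRect();
        mCascade.detectMultiScale(gray, faces, 1.1, 2, 0,
                new Size(minSize, minSize), new Size());

        for (Rect r : faces.toArray()) {
            Imgproc.rectangle(rgbaFrame, r.tl(), r.br(),
                    new Scalar(0, 255, 0, 255), 3);
        }
        gray.release();
    }
}
```

The JNI mode of the sample follows the same detect-then-draw structure, but calls a native `DetectionBasedTracker` class instead of the Java wrapper.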
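The idea behind the color-blob-detection sample can be illustrated with a minimal sketch: threshold the frame around the touched color in HSV space with `Core.inRange`, then extract the blob’s outline with `Imgproc.findContours`. The hue/saturation/value tolerances below are assumed values for illustration, not the sample’s tuned parameters:

```java
import java.util.ArrayList;
import java.util.List;

import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfPoint;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;

public class ColorBlobSketch {
    // Returns the contours of regions whose color is close to hsvColor,
    // the HSV color picked by the user's touch.
    public List<MatOfPoint> detect(Mat rgbaFrame, Scalar hsvColor) {
        Mat hsv = new Mat();
        Imgproc.cvtColor(rgbaFrame, hsv, Imgproc.COLOR_RGB2HSV_FULL);

        // Accept pixels within a fixed hue tolerance of the selected color;
        // the bounds here are illustrative assumptions.
        Scalar lower = new Scalar(hsvColor.val[0] - 10, 50, 50);
        Scalar upper = new Scalar(hsvColor.val[0] + 10, 255, 255);
        Mat mask = new Mat();
        Core.inRange(hsv, lower, upper, mask);

        List<MatOfPoint> contours = new ArrayList<MatOfPoint>();
        Imgproc.findContours(mask, contours, new Mat(),
                Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);

        hsv.release();
        mask.release();
        return contours;
    }
}
```

Drawing the returned contours onto the frame with `Imgproc.drawContours` gives the on-screen blob selection the sample demonstrates.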