
OpenCV and FAST corners on the iPhone 3G

[Screenshot: FAST corners detected on the iPhone 3G]

Note: An updated post featuring extremely fast FAST corner detection is here: http://www.ml.sun.ac.za/mobile/stable-fast-corners-on-3gs-official-api/

As part of my research I am developing tools and libraries for use at the MIH Medialab at Stellenbosch that will allow us to create Augmented Reality applications on mobile devices. The screenshot shows the output of the Cambridge FAST corner detection algorithm running on an iPhone 3G. Video frames are captured from the iPhone’s camera using the camera callback hook trick published by the brilliant iPhone hacker Norio Nomura (to whom I am eternally grateful). Each frame is processed by OpenCV, which was compiled for the iPhone as static libraries, and then passed to the FAST corner detector with non-maximal suppression enabled. The resulting points are simply drawn on a transparent UIView overlaid on the cameraPreview view.
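The segment test at the heart of FAST is simple enough to sketch in a few lines of plain C. This is an illustrative reimplementation, not the optimised Cambridge code or the OpenCV build used above: a pixel counts as a corner if enough contiguous pixels on a radius-3 circle around it are all brighter, or all darker, than the centre by a threshold.

```c
/* Offsets of the 16-pixel Bresenham circle (radius 3) used by FAST. */
static const int circle_dx[16] = { 0, 1, 2, 3, 3, 3, 2, 1, 0,-1,-2,-3,-3,-3,-2,-1};
static const int circle_dy[16] = {-3,-3,-2,-1, 0, 1, 2, 3, 3, 3, 2, 1, 0,-1,-2,-3};

/* Returns 1 if the pixel at (x, y) passes the FAST segment test:
 * at least `arc` contiguous circle pixels are all brighter than
 * centre + t, or all darker than centre - t.
 * Caller must keep (x, y) at least 3 pixels away from the image border. */
static int fast_is_corner(const unsigned char *img, int stride,
                          int x, int y, int t, int arc)
{
    int centre = img[y * stride + x];
    int run_bright = 0, run_dark = 0;
    int i;
    /* Walk the circle twice so a contiguous run may wrap around index 15. */
    for (i = 0; i < 32; i++) {
        int k = i & 15;
        int p = img[(y + circle_dy[k]) * stride + (x + circle_dx[k])];
        if (p > centre + t) { if (++run_bright >= arc) return 1; }
        else run_bright = 0;
        if (p < centre - t) { if (++run_dark >= arc) return 1; }
        else run_dark = 0;
    }
    return 0;
}
```

With `arc` set to 9 this corresponds to the FAST-9 variant; the real detector adds a corner score so that non-maximal suppression can keep only the strongest response in each neighbourhood.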

With the input image scaled down once, corner detection can be done on the iPhone in near real-time. If necessary, I plan to investigate a multi-level detection scheme whereby corners detected on a coarse version of the image are refined using multiple image pyramid levels.
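A single pyramid level of the kind mentioned above amounts to a 2×2 box-filter downsample. A generic sketch in C (the scaling in the actual app is done by OpenCV, not this code):

```c
/* Build the next pyramid level by 2x2 box-filter downsampling:
 * each output pixel is the rounded average of a 2x2 block of input
 * pixels. Assumes src_w and src_h are even; dst must have room for
 * (src_w / 2) * (src_h / 2) pixels. */
static void pyramid_halve(const unsigned char *src, int src_w, int src_h,
                          unsigned char *dst)
{
    int dst_w = src_w / 2;
    int x, y;
    for (y = 0; y < src_h / 2; y++) {
        for (x = 0; x < dst_w; x++) {
            const unsigned char *p = src + (2 * y) * src_w + 2 * x;
            dst[y * dst_w + x] = (unsigned char)
                ((p[0] + p[1] + p[src_w] + p[src_w + 1] + 2) / 4);
        }
    }
}
```

Applying this repeatedly yields the coarse-to-fine levels a multi-level detector would search: detect on the smallest level first, then refine candidate positions on the finer levels.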

Instructions on compiling OpenCV for the iPhone can be found at the following URLs:

http://niw.at/articles/2009/03/14/using-opencv-on-iphone/en
http://lambdajive.wordpress.com/2008/12/20/cross-compiling-for-iphone/
http://ildan.blogspot.com/2008/07/creating-universal-static-opencv.html
http://www.computer-vision-software.com/blog/2009/04/opencv-vs-apple-iphone/

All of these explain how to compile OpenCV for the iPhone on an OS X machine; I decided, however, to compile OpenCV on the iPhone itself. Compilation alone took about 8 hours! All I had to do was make sure the GNU C++ compiler and the standard C++ headers were installed on the phone via Cydia. When compiling my FAST test app on the iPhone I had to include the following build flags in my Makefile:

LDFLAGS = -arch arm -lobjc
LDFLAGS += -lz -lstdc++ -lcv -lcxcore -lcvaux
LDFLAGS += -framework PhotoLibrary #NB! +++
LDFLAGS += -L"/usr/lib/opencv"

The resulting binary is about 6 MB in size. For more instructions on compiling code on the iPhone itself and setting up a build environment, have a look at this example from Singaja, found at http://www.hackint0sh.org/f9/90899.htm: Accelerometer.zip. You may need to run uicache (installed via Cydia) and restart SpringBoard for your open-toolchain app to show up in the menu.

The next step is finding and implementing a suitable tracking algorithm to track the FAST corners and estimate the camera pose. Hopefully the iPhone’s accelerometer data can be used to aid tracking. For an awesome iPhone Parallel Tracking and Mapping demo, have a look at this video from Georg Klein of Cambridge: http://www.youtube.com/watch?v=pBI5HwitBX4


One Comment

  1. Jacobus on Thursday 15 April, 09:53 AM

     Very good idea. The bombs that rise up out of the book are very well done. You're having great fun.
