All code included in this blog post is public domain, and may be freely reused without any restrictions. I want to emphasize that the code I am sharing here is more of a proof-of-concept than a rigorously built library. My intention is for this to be a starting point for anyone interested in creating a library. (I might do so myself eventually.)
To start writing a JNI program, one must create a Java class with native method stubs that will be implemented in C. The following Java program contains the stubs I used:
public class Webcam {
    // Causes the native library to be loaded from the system.
    static {System.loadLibrary("webcam");}

    // Three primary methods for use by clients:

    // Call start() before doing anything else.
    public static void start(int w, int h) throws java.io.IOException {
        width = w;
        height = h;
        setup();
        start = System.currentTimeMillis();
        frames = 0;
    }

    // Call grabFrame() every time you want to see a new image.
    // For best results, call it frequently!
    public static byte[] grabFrame() throws java.io.IOException {
        byte[] result = makeBuffer();
        grab(result);
        frames += 1;
        return result;
    }

    // Call end() when you are done grabbing frames.
    public static double end() throws java.io.IOException {
        duration = System.currentTimeMillis() - start;
        dispose();
        return 1000.0 * frames / duration;
    }

    // Three native method stubs, all private:

    // Called by start() before any frames are grabbed.
    private static native void setup() throws java.io.IOException;
    // Called by grabFrame(), which creates a new buffer each time.
    private static native void grab(byte[] img) throws java.io.IOException;
    // Called by end() to clean things up.
    private static native void dispose() throws java.io.IOException;

    // Specified by the user, and retained for later reference.
    private static int width, height;
    // Record-keeping data to enable calculation of frame rates.
    private static int frames;
    private static long start, duration;

    // Utility methods
    public static int getBufferSize() {return width * height * 2;}
    public static byte[] makeBuffer() {return new byte[getBufferSize()];}
    public static void start() throws java.io.IOException {
        start(160, 120);
    }
    public static int getWidth() {return width;}
    public static int getHeight() {return height;}
}
Once the class has been created and compiled, we use javah to generate a C header file, which will be included by the C program we will write. From the above file, the following function prototypes are generated for the C program:
#include <jni.h>
JNIEXPORT void JNICALL Java_Webcam_setup
  (JNIEnv *, jclass);

JNIEXPORT void JNICALL Java_Webcam_grab
  (JNIEnv *, jclass, jbyteArray);

JNIEXPORT void JNICALL Java_Webcam_dispose
  (JNIEnv *, jclass);
In each prototype, the JNIEnv* parameter represents the Java environment that the C code can access, and the jclass parameter denotes the class of which the static method is a member. I will defer the implementations of these functions to the end of this post. Following JNI conventions, the implementations are in a file named WebcamImp.c.

To compile the JNI C code, we need to use our cross-compiler to generate a shared library. Once this is done, we need to copy the shared library onto the EV3, specifically into a directory where it will actually be found at runtime. The following commands were sufficient for achieving this:
sudo apt-get install libv4l-dev
~/CodeSourcery/Sourcery_G++_Lite/bin/arm-none-linux-gnueabi-gcc \
  -shared -o libwebcam.so WebcamImp.c -fpic \
  -I/usr/lib/jvm/java-7-openjdk-i386/include -std=c99 -Wall
scp libwebcam.so root@10.0.1.1:/usr/lib
At this point, it is convenient to have an easy test to make sure the system works. To that end, add the following main() method to Webcam.java. You should be able to run it just like any other LeJOS EV3 program. It will print a period for every frame it successfully grabs. Of course, make sure your webcam is plugged in before you try this!
// At the top of the file
import lejos.hardware.Button;
// Add to the Webcam.java class
public static void main(String[] args) throws java.io.IOException {
    int goal = 25;
    start();
    for (int i = 0; i < goal; ++i) {
        grabFrame();
        System.out.print(".");
    }
    System.out.println();
    double fps = end();
    System.out.println(fps + " frames/s");
    while (!Button.ESCAPE.isDown());
}
Finally, we are ready to examine the JNI C code. To create this implementation, I adapted the Video4Linux2 example driver found at http://linuxtv.org/downloads/v4l-dvb-apis/capture-example.html. Here is an overview of my modifications to the driver:
- I reorganized the code to match the Webcam.java specification:
  - Java_Webcam_setup() calls open_device(), init_device(), and start_capturing().
  - The interior loop in mainloop() became the basis for the implementation of Java_Webcam_grab().
  - Java_Webcam_dispose() calls stop_capturing(), uninit_device(), and close_device().
  - The Webcam.java main() largely takes over the responsibilities of main() in the driver.
- I converted all error messages to throws of java.io.IOException.
- I added a global 200-char buffer to store the exception messages.
- I only used the code for memory-mapped buffers. This worked fine for my camera; there might exist cameras that do not support this option.
- I forced the image format to YUYV. For image processing purposes, having intensity information is valuable, and using this format provides it with a minimum of hassle.
- I allow the user to suggest the image dimensions, but I also send back the actual width/height values the driver and camera agreed upon.
- The first frame grab after starting the EV3 consistently times out when calling select(), but all subsequent grabs work fine. I added a timeout counter to the frame grabbing routine to avoid spurious exceptions while still checking for timeouts.
- It may not be beautiful, but in my experience it works reliably.
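A word on why YUYV is convenient for image processing: it packs pixels as Y0 U Y1 V, two bytes per pixel, so the luminance (intensity) samples sit at every even byte offset and a grayscale image falls out of a simple stride-2 copy. Here is a minimal self-contained sketch; the YuyvDemo class and its toGray() helper are my own names for illustration, not part of the Webcam code:

```java
// Hypothetical helper (not part of Webcam.java): extract the grayscale
// channel from a YUYV frame. YUYV stores pixels as Y0 U Y1 V, so the
// luminance samples occupy every even-indexed byte.
public class YuyvDemo {
    static byte[] toGray(byte[] yuyv, int width, int height) {
        byte[] gray = new byte[width * height];
        for (int i = 0; i < gray.length; i++) {
            gray[i] = yuyv[2 * i]; // skip the interleaved U/V bytes
        }
        return gray;
    }

    public static void main(String[] args) {
        // Two pixels sharing one U/V pair: Y0=10, U=0, Y1=20, V=0.
        byte[] frame = {10, 0, 20, 0};
        byte[] gray = toGray(frame, 2, 1);
        System.out.println(gray[0] + " " + gray[1]); // prints "10 20"
    }
}
```

Applied to the array returned by grabFrame(), the same loop yields a getWidth()-by-getHeight() intensity image.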
Without further ado, then, here is the code:
#include "Webcam.h"
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <assert.h>
#include <getopt.h> /* getopt_long() */
#include <fcntl.h> /* low-level i/o */
#include <unistd.h>
#include <errno.h>
#include <sys/stat.h>
#include <sys/types.h>
#include <sys/time.h>
#include <sys/mman.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#define REPORT_TIMEOUT 1
#define DEVICE "/dev/video0"
#define EXCEPTION_BUFFER_SIZE 200
#define MAX_TIMEOUTS 2
#define FORMAT V4L2_PIX_FMT_YUYV
#define CLEAR(x) memset(&(x), 0, sizeof(x))
struct buffer {
    void *start;
    size_t length;
};

static int fd = -1;
static struct buffer *buffers;
static unsigned int n_buffers;
static char exceptionBuffer[EXCEPTION_BUFFER_SIZE];
// Throw a java.io.IOException whose message is in exceptionBuffer
static jint error_exit(JNIEnv *env) {
    fprintf(stderr, "Throwing exception:\n");
    fprintf(stderr, "%s\n", exceptionBuffer);
    return (*env)->ThrowNew(env, (*env)->FindClass(env, "java/io/IOException"),
                            exceptionBuffer);
}

static jint errno_exit(JNIEnv *env, const char *s) {
    snprintf(exceptionBuffer, EXCEPTION_BUFFER_SIZE, "%s error %d, %s",
             s, errno, strerror(errno));
    return error_exit(env);
}
// Retry ioctl() calls that are interrupted by signals.
static int xioctl(int fh, int request, void *arg) {
    int r;
    do {
        r = ioctl(fh, request, arg);
    } while (-1 == r && EINTR == errno);
    return r;
}
static jboolean open_device(JNIEnv *env) {
    struct stat st;
    if (-1 == stat(DEVICE, &st)) {
        snprintf(exceptionBuffer, EXCEPTION_BUFFER_SIZE,
                 "Cannot identify '%s': %d, %s", DEVICE, errno, strerror(errno));
        return error_exit(env);
    }
    if (!S_ISCHR(st.st_mode)) {
        snprintf(exceptionBuffer, EXCEPTION_BUFFER_SIZE,
                 "%s is no device", DEVICE);
        return error_exit(env);
    }
    fd = open(DEVICE, O_RDWR /* required */ | O_NONBLOCK, 0);
    if (-1 == fd) {
        snprintf(exceptionBuffer, EXCEPTION_BUFFER_SIZE,
                 "Cannot open '%s': %d, %s", DEVICE, errno, strerror(errno));
        return error_exit(env);
    }
    return JNI_TRUE;
}
static jboolean init_device(JNIEnv *env, jclass cls) {
    struct v4l2_capability cap;
    struct v4l2_cropcap cropcap;
    struct v4l2_crop crop;
    struct v4l2_format fmt;
    struct v4l2_requestbuffers req;
    unsigned int min;

    if (-1 == xioctl(fd, VIDIOC_QUERYCAP, &cap)) {
        if (EINVAL == errno) {
            snprintf(exceptionBuffer, EXCEPTION_BUFFER_SIZE,
                     "%s is no V4L2 device", DEVICE);
            return error_exit(env);
        } else {
            return errno_exit(env, "VIDIOC_QUERYCAP");
        }
    }
    if (!(cap.capabilities & V4L2_CAP_VIDEO_CAPTURE)) {
        snprintf(exceptionBuffer, EXCEPTION_BUFFER_SIZE,
                 "%s is no video capture device", DEVICE);
        return error_exit(env);
    }
    if (!(cap.capabilities & V4L2_CAP_STREAMING)) {
        snprintf(exceptionBuffer, EXCEPTION_BUFFER_SIZE,
                 "%s does not support streaming i/o", DEVICE);
        return error_exit(env);
    }

    /* Select video input, video standard and tune here. */
    CLEAR(cropcap);
    cropcap.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (0 == xioctl(fd, VIDIOC_CROPCAP, &cropcap)) {
        crop.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        crop.c = cropcap.defrect; /* reset to default */
        if (-1 == xioctl(fd, VIDIOC_S_CROP, &crop)) {
            switch (errno) {
            case EINVAL:
                /* Cropping not supported. */
                break;
            default:
                /* Errors ignored. */
                break;
            }
        }
    } else {
        /* Errors ignored. */
    }

    jfieldID width_id = (*env)->GetStaticFieldID(env, cls, "width", "I");
    jfieldID height_id = (*env)->GetStaticFieldID(env, cls, "height", "I");
    if (NULL == width_id || NULL == height_id) {
        snprintf(exceptionBuffer, EXCEPTION_BUFFER_SIZE,
                 "width or height not present in Webcam.java");
        return error_exit(env);
    }

    CLEAR(fmt);
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = (*env)->GetStaticIntField(env, cls, width_id);
    fmt.fmt.pix.height = (*env)->GetStaticIntField(env, cls, height_id);
    fmt.fmt.pix.pixelformat = FORMAT;
    fmt.fmt.pix.field = V4L2_FIELD_INTERLACED;
    if (-1 == xioctl(fd, VIDIOC_S_FMT, &fmt)) {
        return errno_exit(env, "VIDIOC_S_FMT");
    }
    /* Report back the dimensions the driver actually selected. */
    (*env)->SetStaticIntField(env, cls, width_id, fmt.fmt.pix.width);
    (*env)->SetStaticIntField(env, cls, height_id, fmt.fmt.pix.height);

    /* Buggy driver paranoia. */
    min = fmt.fmt.pix.width * 2;
    if (fmt.fmt.pix.bytesperline < min)
        fmt.fmt.pix.bytesperline = min;
    min = fmt.fmt.pix.bytesperline * fmt.fmt.pix.height;
    if (fmt.fmt.pix.sizeimage < min)
        fmt.fmt.pix.sizeimage = min;

    CLEAR(req);
    req.count = 4;
    req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    if (-1 == xioctl(fd, VIDIOC_REQBUFS, &req)) {
        if (EINVAL == errno) {
            snprintf(exceptionBuffer, EXCEPTION_BUFFER_SIZE,
                     "%s does not support memory mapping", DEVICE);
            return error_exit(env);
        } else {
            return errno_exit(env, "VIDIOC_REQBUFS");
        }
    }
    if (req.count < 2) {
        snprintf(exceptionBuffer, EXCEPTION_BUFFER_SIZE,
                 "Insufficient buffer memory on %s", DEVICE);
        return error_exit(env);
    }
    buffers = calloc(req.count, sizeof(*buffers));
    if (!buffers) {
        snprintf(exceptionBuffer, EXCEPTION_BUFFER_SIZE, "Out of memory");
        return error_exit(env);
    }
    for (n_buffers = 0; n_buffers < req.count; ++n_buffers) {
        struct v4l2_buffer buf;
        CLEAR(buf);
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = n_buffers;
        if (-1 == xioctl(fd, VIDIOC_QUERYBUF, &buf))
            return errno_exit(env, "VIDIOC_QUERYBUF");
        buffers[n_buffers].length = buf.length;
        buffers[n_buffers].start =
            mmap(NULL /* start anywhere */,
                 buf.length,
                 PROT_READ | PROT_WRITE /* required */,
                 MAP_SHARED /* recommended */,
                 fd, buf.m.offset);
        if (MAP_FAILED == buffers[n_buffers].start) {
            return errno_exit(env, "mmap");
        }
    }
    return JNI_TRUE;
}
static jboolean start_capturing(JNIEnv *env) {
    unsigned int i;
    enum v4l2_buf_type type;
    for (i = 0; i < n_buffers; ++i) {
        struct v4l2_buffer buf;
        CLEAR(buf);
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = i;
        if (-1 == xioctl(fd, VIDIOC_QBUF, &buf)) {
            return errno_exit(env, "VIDIOC_QBUF");
        }
    }
    type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (-1 == xioctl(fd, VIDIOC_STREAMON, &type)) {
        return errno_exit(env, "VIDIOC_STREAMON");
    }
    return JNI_TRUE;
}
JNIEXPORT void JNICALL Java_Webcam_setup(JNIEnv *env, jclass cls) {
    if (open_device(env)) {
        if (init_device(env, cls)) {
            start_capturing(env);
        }
    }
}
static void process_image(const void *p, int size, JNIEnv *env, jbyteArray img) {
    (*env)->SetByteArrayRegion(env, img, 0, size, (jbyte*)p);
}
static jboolean read_frame(JNIEnv *env, jbyteArray img) {
    struct v4l2_buffer buf;
    CLEAR(buf);
    buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    buf.memory = V4L2_MEMORY_MMAP;
    if (-1 == xioctl(fd, VIDIOC_DQBUF, &buf)) {
        switch (errno) {
        case EAGAIN:
            return JNI_FALSE;
        case EIO:
            /* Could ignore EIO, see spec. */
            /* fall through */
        default:
            return errno_exit(env, "VIDIOC_DQBUF");
        }
    }
    assert(buf.index < n_buffers);
    process_image(buffers[buf.index].start, buf.bytesused, env, img);
    if (-1 == xioctl(fd, VIDIOC_QBUF, &buf)) {
        return errno_exit(env, "VIDIOC_QBUF");
    }
    return JNI_TRUE;
}
JNIEXPORT void JNICALL Java_Webcam_grab(JNIEnv *env, jclass cls,
                                        jbyteArray img) {
    int timeouts = 0;
    for (;;) {
        fd_set fds;
        struct timeval tv;
        int r;
        FD_ZERO(&fds);
        FD_SET(fd, &fds);
        /* Timeout. */
        tv.tv_sec = 2;
        tv.tv_usec = 0;
        r = select(fd + 1, &fds, NULL, NULL, &tv);
        if (-1 == r) {
            if (EINTR == errno)
                continue; /* interrupted by a signal; retry */
            errno_exit(env, "select");
            return;
        }
        if (0 == r) {
            timeouts++;
#ifdef REPORT_TIMEOUT
            fprintf(stderr, "timeout %d (out of %d)\nTrying again\n", timeouts,
                    MAX_TIMEOUTS);
#endif
            if (timeouts > MAX_TIMEOUTS) {
                snprintf(exceptionBuffer, EXCEPTION_BUFFER_SIZE, "select timeout");
                error_exit(env);
                return;
            }
            continue;
        }
        if (read_frame(env, img))
            break;
        if ((*env)->ExceptionCheck(env))
            return; /* read_frame threw; do not keep looping */
        /* EAGAIN - continue select loop. */
    }
}
static void stop_capturing(JNIEnv *env) {
    enum v4l2_buf_type type;
    type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (-1 == xioctl(fd, VIDIOC_STREAMOFF, &type))
        errno_exit(env, "VIDIOC_STREAMOFF");
}

static void uninit_device(JNIEnv *env) {
    unsigned int i;
    for (i = 0; i < n_buffers; ++i)
        if (-1 == munmap(buffers[i].start, buffers[i].length))
            errno_exit(env, "munmap");
    free(buffers);
}

static void close_device(JNIEnv *env) {
    if (-1 == close(fd))
        errno_exit(env, "close");
    fd = -1;
}

JNIEXPORT void JNICALL Java_Webcam_dispose(JNIEnv *env, jclass cls) {
    stop_capturing(env);
    uninit_device(env);
    close_device(env);
}