Saturday, August 2, 2014

Lego Mindstorms, Past and Future

I've been a longtime user of the Lego Mindstorms robots.  I have used them primarily for teaching the Hendrix College course Robotics Explorations Studio, and also in some of my published work.  I employed the original Mindstorms RCX for several years.  In spite of its limitations (32K RAM, 3-character output screen, limited sensors), it was great fun to use.  It was powerful enough to run LeJOS, a port of the Java virtual machine.  In general, Lego has done a great job of making the software and hardware of the Mindstorms robots sufficiently open for developers to create their own operating systems to meet their needs, and the LeJOS series has been a great example of this.

Starting in 2008, we transitioned to Lego Mindstorms NXT.  I created a lab manual for the course employing the NXT and the pblua language.  The NXT was, overall, a dramatic improvement that opened up many new possibilities; a short sketch exercising some of its new features follows the list below.  Significant improvements incorporated into the NXT included:
  • A fourth sensor port.
  • Rotation sensors incorporated into the motors.
  • A distance sensor (ultrasonic).
  • Many new third-party sensors (e.g. compass, gyroscope).
  • 64K RAM.
  • A rechargeable battery.  
  • A 100x60 pixel LCD output.
  • A USB connection for uploading programs.
  • A 32 bit ARM7 CPU.
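
To give a flavor of what these additions made possible, here is a minimal LeJOS NXJ sketch that spins a motor while reading its built-in rotation sensor and then polls an ultrasonic sensor on the new fourth port.  (Our lab manual actually used pblua, so this Java version, along with its class name and port assignments, is just an illustrative assumption.)

    import lejos.nxt.LCD;
    import lejos.nxt.Motor;
    import lejos.nxt.SensorPort;
    import lejos.nxt.UltrasonicSensor;

    public class NxtFeatureDemo {
        public static void main(String[] args) throws InterruptedException {
            // Ultrasonic distance sensor plugged into the (new) fourth sensor port.
            UltrasonicSensor sonar = new UltrasonicSensor(SensorPort.S4);

            // One full revolution, tracked by the rotation sensor built into the motor.
            Motor.A.rotate(360);
            LCD.drawString("Tacho:", 0, 0);
            LCD.drawInt(Motor.A.getTachoCount(), 7, 0);

            // Poll the distance (in cm) for about ten seconds.
            for (int i = 0; i < 20; i++) {
                LCD.drawString("Dist cm:", 0, 2);
                LCD.drawInt(sonar.getDistance(), 9, 2);
                Thread.sleep(500);
            }
        }
    }
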
The most recent incarnation is Lego Mindstorms EV3.  Superficially, it might not appear to be as large an improvement as the RCX to NXT transition.  The LCD screen is slightly larger (178x128 pixels), the kit includes some new types of sensors and motors, and it has a fourth motor port (to match the four sensor ports).  

But under the surface, the changes are again revolutionary:  
  • 64 megabytes of RAM (i.e., three orders of magnitude more!)
  • The ARM9 CPU is six times faster (300 MHz).
  • Its operating system kernel is a version of Linux.
  • It has a micro-SD slot (allowing for up to 32 gigabytes of persistent storage).
  • It has a USB port.
I have always been interested in using image processing as a robotic sensor.  Up until now, I have been placing a netbook atop a specially designed NXT model and using the netbook's webcam.  But the USB port means that a webcam can now be plugged directly into a Mindstorms robot, with the image processing taking place as part of the EV3 program, as sketched below.
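
To make the idea concrete, here is a minimal sketch of what "image processing as part of the EV3 program" could look like, written against the LeJOS EV3 video API (lejos.hardware.video.Video).  The accessor via BrickFinder, the 160x120 resolution, and the YUYV frame layout are assumptions drawn from the LeJOS EV3 documentation rather than finished course code.  It treats the webcam as a crude brightness sensor by averaging the luma bytes of each frame.

    import java.io.IOException;

    import lejos.hardware.BrickFinder;
    import lejos.hardware.lcd.LCD;
    import lejos.hardware.video.Video;

    public class WebcamBrightness {
        public static void main(String[] args) throws IOException {
            // Assumed accessor for the first USB webcam; opens a small YUYV stream.
            Video video = BrickFinder.getDefault().getVideo();
            video.open(160, 120);
            byte[] frame = video.createFrame();   // 2 bytes per pixel in YUYV

            for (int i = 0; i < 200; i++) {
                video.grabFrame(frame);

                // Average the luma (Y) bytes, which sit at the even offsets of YUYV.
                long sum = 0;
                for (int p = 0; p < frame.length; p += 2) {
                    sum += frame[p] & 0xFF;
                }
                int avg = (int) (sum / (frame.length / 2));

                LCD.clear();
                LCD.drawString("Brightness: " + avg, 0, 4);
            }
            video.close();
        }
    }

In practice the frame would feed something more interesting than an average (blob detection, line finding, and so on), but the open-grab-process loop stays the same.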

As it happens, observing that this is possible is much easier than actually implementing it.

Due to my familiarity with Java and the maturity of the LeJOS project, I've decided to try to make this work using the EV3 version of LeJOS.  This requires the following steps:
In writing the above posts, I learned that the Blogger interface provides no obvious means of formatting source code.  A very convenient page for generating formatted code is http://codeformatter.blogspot.com/.

I've put together a short demo that you can watch of the EV3 LCD screen showing webcam video in real time.
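
The demo boils down to a loop much like the following sketch: grab a frame, threshold its luma channel, and push the result to the 178x128 LCD pixel by pixel.  As before, the Video accessor, the YUYV layout, and the LCD.setPixel(x, y, color) call are assumptions based on the LeJOS EV3 documentation, not the exact code used in the video.

    import java.io.IOException;

    import lejos.hardware.BrickFinder;
    import lejos.hardware.lcd.LCD;
    import lejos.hardware.video.Video;

    public class WebcamToLcd {
        static final int W = 160, H = 120;   // fits within the 178x128 LCD

        public static void main(String[] args) throws IOException {
            Video video = BrickFinder.getDefault().getVideo();
            video.open(W, H);
            byte[] frame = video.createFrame();

            for (int i = 0; i < 500; i++) {   // run for a short demo
                video.grabFrame(frame);
                for (int y = 0; y < H; y++) {
                    for (int x = 0; x < W; x++) {
                        int luma = frame[2 * (y * W + x)] & 0xFF;
                        // 1 draws a black pixel on the EV3 LCD, 0 clears it.
                        LCD.setPixel(x, y, luma < 128 ? 1 : 0);
                    }
                }
            }
            video.close();
        }
    }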
