Workshop in Technology for Sense-Expansion

Last week I had the pleasure of attending a 5-day workshop at Arizona State University titled simply “Workshop in Technology for Sense-Expansion.” The workshop was hosted by Dr. Frank Wilczek (recipient of the 2004 Nobel Prize in Physics). It was intended to introduce the biomechanics and molecular mechanics of human sensation, and to describe the two primary forms of sensory information we use: sight and hearing.

We discussed the nature of these two signals and how they can be used for sense-expansion, that is, the ability to see or hear more than we currently can. The basic idea is that sight and hearing can be combined to encode information that is normally unavailable to us. For example, if the UV spectrum were mapped to a range of audible frequencies, we could then “see” UV light. The same is true for IR, or any other electromagnetic radiation that our eyes do not detect (a toy sketch of such a mapping appears below).

In the workshop we were introduced to several forms of image processing in Python. In particular, we used a technique called “temporal image processing” (TIP) to elucidate differences in images that are otherwise difficult to discern, starting with RGB images. We then applied this to hyperspectral data acquired from the PARC hyperspectral camera (currently under development, but Dr. Wilczek had a prototype for us to work with). The PARC hyperspectral camera is a cutting-edge digital imaging device that captures a wide range of frequencies rather than just red, green, and blue. A normal RGB image is a height × width × 3 array: the image resolution (height × width) with one layer per color channel (sometimes a fourth layer of alpha scaling values is included). The hyperspectral camera’s image data, however, is height × width × 148, meaning 148 different frequency bands are acquired in every shot. Sketches of both ideas also appear below.

We used the hyperspectral camera to take pictures of a variety of objects, using “iLuminator” boxes that allowed us to control the frequency of light illuminating the object (pictured below). We also used Arduinos to construct “synesthesia gloves” that allow the wearer to hold their hand over a color and receive both auditory and visual information describing the color sampled (also pictured below).

Overall, the workshop was a wonderful experience. I learned a lot of useful and interesting tools in Python, and had a great time working with the Arduinos and the hyperspectral camera. I even wrote a simple GUI to easily control the illumination settings in the iLuminator boxes. This workshop was a preliminary run of a course that Dr. Wilczek hopes to make available to everyone on the web.
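To make the frequency-mapping idea concrete, here is a toy sketch (my own illustration, not workshop code) that linearly maps a UV wavelength onto an audible pitch and writes a short tone to a WAV file. The 100–400 nm and 200–2000 Hz ranges are arbitrary choices for demonstration:

```python
import wave
import numpy as np

def uv_to_audio_freq(wavelength_nm, uv_range=(100.0, 400.0),
                     audio_range=(200.0, 2000.0)):
    """Linearly map a UV wavelength (nm) onto an audible frequency (Hz)."""
    lo, hi = uv_range
    f_lo, f_hi = audio_range
    t = (wavelength_nm - lo) / (hi - lo)   # 0..1 across the UV band
    return f_lo + t * (f_hi - f_lo)

def write_tone(freq_hz, path, seconds=0.5, rate=44100):
    """Write a sine tone at freq_hz to a 16-bit mono WAV file."""
    t = np.linspace(0, seconds, int(rate * seconds), endpoint=False)
    samples = (0.5 * np.sin(2 * np.pi * freq_hz * t) * 32767).astype(np.int16)
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)                # 16-bit samples
        wav.setframerate(rate)
        wav.writeframes(samples.tobytes())

# A 365 nm (near-UV) reading becomes a ~1.8 kHz tone.
write_tone(uv_to_audio_freq(365.0), "uv_tone.wav")
```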
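“Temporal image processing” isn’t a standard library term, so this is just my reading of the idea: put the difference between two images into the time domain by flickering between them, so that static content looks steady while changes pop out. A minimal matplotlib sketch:

```python
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

def flicker(img_a, img_b, interval_ms=250):
    """Alternate two same-sized RGB arrays in time; static content looks
    steady while differences flicker and catch the eye."""
    fig, ax = plt.subplots()
    ax.axis("off")
    frame = ax.imshow(img_a)
    def update(i):
        frame.set_data(img_a if i % 2 == 0 else img_b)
        return (frame,)
    anim = FuncAnimation(fig, update, interval=interval_ms, blit=True)
    plt.show()
    return anim

# e.g. two exposures of the same scene under different illumination:
# flicker(plt.imread("scene_white.png"), plt.imread("scene_uv.png"))
```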
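And here is what working with a 148-band cube looks like in NumPy. The random array stands in for a real capture, and the 350–1000 nm wavelength span is my guess rather than a camera spec:

```python
import numpy as np

# A stand-in for one capture: height x width x 148 bands.
cube = np.random.rand(512, 640, 148)

# Assume the bands sample wavelengths evenly across the camera's range.
wavelengths = np.linspace(350, 1000, cube.shape[2])

# Pull out the single band nearest 365 nm (near-UV).
band_365 = cube[:, :, np.argmin(np.abs(wavelengths - 365))]

# Collapse the cube to a crude pseudo-RGB preview by averaging the
# bands that fall in each color's wavelength range.
def band_mean(cube, wavelengths, lo, hi):
    mask = (wavelengths >= lo) & (wavelengths < hi)
    return cube[:, :, mask].mean(axis=2)

rgb = np.dstack([
    band_mean(cube, wavelengths, 600, 700),  # red
    band_mean(cube, wavelengths, 500, 600),  # green
    band_mean(cube, wavelengths, 400, 500),  # blue
])
print(band_365.shape, rgb.shape)  # (512, 640) (512, 640, 3)
```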

The “iLuminator” box, which contains strips of red, green, blue, amber, pink, and UV LEDs, driven by an Arduino and 5 FemtoBuck LED drivers. The LED intensities are controlled by a simple GUI application I wrote in Python. The front of the box has a cover, so the interior is illuminated only by the overhead LEDs.
img_0726

Inside the iLuminator.
img_0727

The Arduino Micro and FemtoBuck LED drivers.
img_0728
My simple Python GUI for controlling LED intensity. The program lets you select and connect to a serial port, and provides sliders and toggles to control LED intensity (as well as spin boxes to manually set the intensity and read off the current settings). It is written in PyQt4 and uses PySerial to manage the USB connection. You can download the code here (I’ve had to switch to using Mega to host my files now that Dropbox is ending their link service. I had some trouble using the download links with Safari, but Chrome seems to work fine).
python-gui
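The full GUI is in the download above; as a flavor of how little code this takes, here is a stripped-down PyQt4/PySerial sketch with one slider per LED channel. The single-letter wire format and port name are placeholders, not the protocol my program actually uses:

```python
import sys
import serial                      # PySerial
from PyQt4 import QtCore, QtGui

PORT = "/dev/ttyUSB0"              # adjust to your Arduino's port

class LedSlider(QtGui.QWidget):
    """A single slider that streams 0-255 intensity values over serial."""
    def __init__(self, ser, channel):
        super(LedSlider, self).__init__()
        self.ser = ser
        self.channel = channel     # which LED strip, e.g. "R"
        slider = QtGui.QSlider(QtCore.Qt.Horizontal, self)
        slider.setRange(0, 255)
        slider.valueChanged.connect(self.send)
        layout = QtGui.QVBoxLayout(self)
        layout.addWidget(QtGui.QLabel(channel, self))
        layout.addWidget(slider)

    def send(self, value):
        # Hypothetical wire format: "<channel><value>\n", e.g. "R128\n".
        self.ser.write("{}{}\n".format(self.channel, value).encode())

if __name__ == "__main__":
    app = QtGui.QApplication(sys.argv)
    ser = serial.Serial(PORT, 9600, timeout=1)
    window = QtGui.QWidget()
    layout = QtGui.QHBoxLayout(window)
    for ch in ["R", "G", "B", "A", "P", "U"]:  # red, green, blue, amber, pink, UV
        layout.addWidget(LedSlider(ser, ch))
    window.setWindowTitle("iLuminator control")
    window.show()
    sys.exit(app.exec_())
```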
The top of the “synesthesia glove,” showing the 3V Pro Trinket, Pro Trinket LiPo backpack (and associated LiPo battery), Sound Board, and NeoPixel Jewel, all from Adafruit.
img_0729
The bottom of the “synesthesia glove,” showing the RGB and UV sensors from Adafruit, which were used to measure the light (color) reflected from objects that the glove was placed against.
img_0730
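The glove’s firmware runs on the Pro Trinket, but the color-to-sound mapping at its heart is easy to sketch in Python. This is one plausible mapping (hue to pitch, brightness to intensity), not the exact one we used:

```python
import colorsys

def color_to_tone(r, g, b, f_lo=220.0, f_hi=880.0):
    """Map an RGB sensor reading (0-255 each) to an audible frequency.

    Hue picks the pitch (red = low, violet = high); brightness could
    drive the volume or the NeoPixel intensity.
    """
    hue, _, value = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return f_lo + hue * (f_hi - f_lo), value

# A saturated green reading maps to roughly the middle of the range:
freq, brightness = color_to_tone(30, 200, 40)
print(round(freq), round(brightness, 2))   # ~447 Hz, 0.78
```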
An early setup of the PARC hyperspectral camera pointed at some geological specimens that fluoresce (fluorite and uranium minerals, to name a few). The camera was later placed in an iLuminator to control illumination frequencies and intensities.
img_0731
The same specimens up close, illuminated with UV light.
img_0732
Dr. Wilczek and students working with the camera. My iLuminator box is at bottom left. The camera was placed inside the box, and we captured images of many objects under different light frequencies.
img_0733
An interesting “pinhole” camera effect from the spatial arrangement of the LEDs inside the box.
img_0735
