Introduction: Ommatid Spherical Display: Electronics, Programming and Interactivity

About: 2015 Autodesk AiR. I can't respond to messages here: please send me email!

This Instructable talks about the electronics and programming of the Ommatid Spherical Display. It is named after the ommatidia, the individual units of the compound insect eye that inspired it. You can read about the physical construction of the globe and base in Ommatid: Constructing the Enclosure.

The Ommatid is a spherical globe that serves as both an information display and a physical input device. Each triangular facet has an RGB LED for producing colored light as well as an IR sensor system that detects, by reflected infrared light, when the facet is being touched. There are 76 facets in all, and thus 76 LEDs and 76 sensor channels. The interior of the optical globe is an icosahedron with 20 triangular sides. Except for the bottom side, which is used to attach the globe and to run wires, each side has a printed circuit board with a microcontroller, 4 color LEDs, and 4 infrared LED/detector pairs.

Thanks to Mitch Altman for the photos in this step.

Update: All design files and code (still in progress) can be found on github at https://github.com/headrotor/ommatid


Step 1: Prototyping the Optical Detector

In order to sense touch on the outside of the sphere, I elected to use an infrared detector. In principle this is simple: a powerful IR LED shines out through each facet, and a sensitive photodiode senses the light reflected from your hand. In practice this was a bit trickier: because the resin in the globe is not perfectly transparent, some of the light is lost on the way out and on the way in, and some is reflected internally. In addition, common incandescent lamps and motion sensors are sources of infrared light that can fool the sensor.

A common way of making the sensing more robust is to modulate the light: by sampling the sensor with the IR LED on and subtracting the reading taken with the LED off, you get a signal that is largely immune to ambient IR. This was the basis of my IR detection circuit. The schematic above has two reflectance circuits so I could investigate crosstalk: one channel's sensor picking up the other channel's illumination and vice-versa.
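The ambient-rejection arithmetic is simple enough to show in a few lines of Python. This is an illustration with made-up ADC readings, not code from the project; note that with a pull-up sensor circuit, more light means a lower reading.

```python
# Hedged sketch of the ambient-IR rejection math. ADC values are invented
# for illustration; with a pull-up on the phototransistor, more incident
# light pulls the ADC reading DOWN.

def reflectance(adc_led_on, adc_led_off):
    """Signal proportional to reflected IR, independent of ambient light.

    Subtracting the LED-off reading cancels the ambient contribution,
    leaving only the light reflected from the IR LED.
    """
    return adc_led_off - adc_led_on

# Ambient light alone: both readings identical, so the signal is zero.
no_hand = reflectance(adc_led_on=900, adc_led_off=900)
# A hand reflects the IR LED, pulling the LED-on reading further down.
hand = reflectance(adc_led_on=650, adc_led_off=900)
```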

In operation, the IR LEDs D1 and D2 are turned on by FETs Q1 and Q2, driven by the logic output pins of a microcontroller (denoted IR1 and IR2). To turn on D1, a logic HIGH is applied to IR1, the gate of Q1, and similarly for D2, Q2, and IR2. Resistors R2, R3, R5 and R6 limit the current through the diodes. The IR sensors T1 and T2 are phototransistors that conduct when IR light shines on them. In the dark they are high impedance, and the resistors R1 and R4 pull up the sensor outputs S1 and S2 to nearly Vdd. As the light increases, the phototransistor conducts more and more current, the voltage drop across R1 and R4 increases, and the voltage on S1 and S2 falls. These sensor outputs go to an analog-to-digital converter on the microcontroller so we can do the subtraction mentioned above. L1 and L2 are WS2811B color LEDs: it was important to include these so I could test that the IR circuit was not responding to visible light from the LEDs!

The scope trace shown above has the LED drive in yellow and the sensor output in blue: as the LED is turned on, the sensor voltage drops. I tested this with a section of the optical globe to make sure it was transparent enough to IR, and the signal did in fact vary enough to measure when I touched the globe.

Step 2: Initial Circuit Prototypes

Though the circuit was fairly simple, I had to try several types of components to get a circuit that performed reliably. In particular, I used very bright IR LEDs that draw 100 mA of current, and very sensitive Darlington phototransistors that have much higher gain than typical phototransistors. I prototyped these on several hand-wired circuit boards, and settled on the two-channel circuit of the previous step's schematic, which I then milled on an Othermill.

Step 3: PCB Design and Prototype

When I was satisfied that I had the right components and that the circuit worked, I designed a PCB with four channels of IR sensors and four RGB LEDs. To read the analog output of the phototransistors, I used the ADCs built into the Atmel ATTINY1634 microcontroller. Between the four drive outputs and the four ADC inputs, serial communication, and clock signals, there was only one GPIO pin left: I used this for a status LED which was helpful for debugging.

This circuit board was tiny, but I managed to fit everything using only two layers. On the front, facing outward, are the IR and RGB LEDs, the phototransistors, and the LED drive transistors. On the back are the microcontroller, the crystal and resonant capacitors, bypass capacitors, the status LED, and an RS-485 transceiver IC for serial communication with the host. To save room I used only one pair of current-limiting resistors for the IR LEDs, which means I can only turn on one at a time. This is not a real limitation: lighting them one at a time is good practice anyway, as it minimizes crosstalk and saves power.

To test this circuit, I had 5 boards manufactured by OSHPark in Portland, who produce excellent-quality boards at a very reasonable price. I stuffed these and wrote some initial firmware to test the sensor circuits: it lit each IR LED sequentially and measured the corresponding sensor output.

Step 4: Production PCB Assembly

Once the circuit board was designed and tested, I could move on to production. Because the interior of the optical globe is an icosahedron with 20 sides, I would need 19 total PCBs (the bottom side of the icosahedron is sacrificed for the mounting pillar). I had 20 made just in case I needed a spare. I had them made overseas with white soldermask: they were quite inexpensive but took more than a month to arrive due to customs delays.

Using the new reflow oven at Pier 9, I reflow-soldered the SMD crystals, which were not easy to hand-solder. I hand-soldered all the other components; because I did not have a good solder stencil, this was just as fast.

Step 5: Production PCBs

Here are close-ups of one of the 19 PCBs that tile the inside of the optical sphere. On the bottom are the microcontroller and RS-485 transceiver ICs, as well as the clock crystal, programming header, status LED, LED current-limiting resistors, and bypass capacitors.

On the top side you can see the four triangular sections; each mates with a segment of the sphere. In the middle of each triangle is the WS2812B RGB LED; on either side are the IR LED (white square) and IR phototransistor (black dot), as well as the switching transistor (3-pin SMD package).

Step 6: PCB Programming and Testing

Once assembled, each of the 19 PCBs needed to be programmed and tested. First I had to program the fuse bits to run with an external fast clock, then load the firmware that responds to serial requests by returning the four sensor values. This code took some work: I wound up running the ADCs at full speed, that is, starting the next conversion in the interrupt routine triggered by completion of the previous one. In the interrupt routine I turned on the appropriate IR LED. After cycling through all four IR LEDs and sensors, I did it again with the IR LEDs turned off, storing the values. I then subtracted the on value from the off value, giving a positive number proportional to the sensed reflectance and compensated for ambient IR.
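The measurement cycle can be modeled in a few lines of Python. The real firmware is AVR C running in the ADC interrupt; the helper functions here are illustrative stand-ins, and the interrupt-driven sequencing is flattened into a simple loop.

```python
# Python model of the firmware's measurement cycle (the actual code is
# interrupt-driven AVR C; read_adc and set_ir_led are hypothetical stand-ins).

def measure_cycle(read_adc, set_ir_led):
    """Read all four channels with IR LEDs on, then off, and subtract.

    read_adc(ch)       -> ADC reading for channel ch
    set_ir_led(ch, on) -> switch channel ch's IR LED (only one lit at a time)
    Returns four values proportional to reflected IR, ambient-compensated.
    """
    on_vals = []
    for ch in range(4):
        set_ir_led(ch, True)           # light only this channel's IR LED
        on_vals.append(read_adc(ch))
        set_ir_led(ch, False)
    off_vals = [read_adc(ch) for ch in range(4)]   # ambient-only readings
    # More reflected light pulls the sensor voltage lower, so off - on
    # yields a positive reflectance signal.
    return [off - on for off, on in zip(off_vals, on_vals)]
```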

Each PCB had its own unique address programmed into it. For sanity-checking I had the microcontroller blink out the address in Morse code on startup so I could make sure I had them programmed correctly. All the PCBs were connected to the same RS-485 bus with a "speak-only-when-spoken-to" protocol: the microcontroller puts data on the bus only in response to a query from the host to its particular address. This prevents bus contention.
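From the host side, a single poll looks something like the sketch below. The actual wire format is defined in the code on github; the framing here (a one-byte address out, four 16-bit sensor values back) is purely an assumption for illustration.

```python
# Hedged sketch of one "speak-only-when-spoken-to" transaction. The real
# protocol framing lives in the github repo; this byte layout is invented.
import struct

def query_board(serial_port, address):
    """Poll one board; only the addressed board replies, so the shared
    RS-485 bus never sees two transmitters at once."""
    serial_port.write(bytes([address]))   # host transmits the board address
    reply = serial_port.read(8)           # 4 sensor channels x 2 bytes each
    return struct.unpack(">4H", reply)    # four big-endian 16-bit values
```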

Step 7: PCB Assembly and Integration

Once individually fabricated and tested, it was time to wire all the PCBs together. Each PCB has six electrical connections: 5V power, ground, LED data in, LED data out, and two wires for the RS-485 differential serial data. The LED data lines are daisy-chained: each PCB is connected to its upstream and downstream neighbors. All the other connections are busses, in that every board is connected to the same four wires. To help connect the busses, I made a bus board out of protoboard that attaches to the threaded rod. This also has sockets to mate with connectors for the top of the globe.

Step 8: Architecture and Programming

Though I could have driven each board's LEDs from the local microcontroller, the architecture I chose keeps the LED chain and the sensor array completely separate. This was for a very pragmatic reason: if for some reason I could not get the sensor array or multiprocessor communication working, well, I would still have a perfectly great spherical display! In any event, it all worked out well. One drawback of this architecture is that it needs a central computer to drive the LEDs in response to sensor events (as opposed to a local architecture where each board's microcontroller could light the local LEDs with no data communication necessary). I chose a Raspberry Pi because I knew its serial drivers handle my high-rate, non-standard 250 kbaud communication well; this is not the case with many Linux systems. Unlike the BeagleBone, it also has 4 USB ports, so I could connect the RS-485 serial interface, the Fadecandy board that drives the LEDs, and a USB WiFi dongle without a hub. This is crucial, as all this gear eventually needs to fit in the hollowed-out interior of the base.

So the general architecture works like this: the host computer (Raspberry Pi) repeatedly queries each PCB for its sensor values in round-robin fashion. Because each PCB lights its IR LEDs only in response to a query, between-channel optical crosstalk is eliminated. After the state of all the sensors is read, the LED pattern is calculated in response and sent via OPC (Open Pixel Control) to the Fadecandy board.
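One pass of that loop can be sketched as follows. Here query_board, compute_pattern, and send_opc are hypothetical stand-ins for the real routines; send_opc would wrap an OPC client such as the `opc` module that ships with Fadecandy.

```python
# Sketch of one pass of the host's main loop. All three callables are
# illustrative stand-ins, not the project's actual function names.

NUM_BOARDS = 19      # 20-sided icosahedron minus the mounting face
LEDS_PER_BOARD = 4   # 19 * 4 = 76 facets

def run_once(query_board, compute_pattern, send_opc):
    # Round-robin poll: each board lights its IR LEDs only when queried,
    # so no two boards' sensors are ever active at the same time.
    sensors = [query_board(addr) for addr in range(NUM_BOARDS)]
    # Map the 19 x 4 sensor readings to 76 (r, g, b) facet colors.
    pixels = compute_pattern(sensors)
    # Ship the frame to the Fadecandy over Open Pixel Control.
    send_opc(pixels)
    return pixels
```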

I still have considerable work to do on the programming and interactivity: right now the Ommatid is running a glorified test program written in Python. In the future I will be putting the sensor query and LED generation code in separate processes to improve latency, and I might put the top and bottom spheres on separate RS-485 busses to double throughput. I also have brave ideas (and some progress in simulating) about sensing more sophisticated gestures like pinching and globe-spinning (for trackball-like responsiveness) as well as using the connection graph for simulating reactive patterns. If I can calculate the Laplacian on the triangular mesh, I can simulate the wave equation and reaction-diffusion patterns in response to touch. I'd also like to investigate making the Ommatid generate OSC events so it can be used as a musical controller. Imagine a performance where Ableton is controlled with an Ommatid rather than a laptop!
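As a taste of the Laplacian idea, here is a minimal sketch: the graph Laplacian at each facet is the sum of its neighbors' values minus its degree times its own value, and a leapfrog update of the discrete wave equation follows directly. The neighbor lists would come from the real facet adjacency; everything beyond the textbook definitions is an assumption.

```python
# Minimal sketch of wave propagation on the facet graph; neighbor lists
# would come from the actual triangular-mesh adjacency.

def laplacian(values, neighbors):
    """Graph Laplacian: L[i] = sum of neighbor values - degree(i) * values[i]."""
    return [sum(values[j] for j in nbrs) - len(nbrs) * values[i]
            for i, nbrs in enumerate(neighbors)]

def wave_step(u, u_prev, neighbors, c2=0.2):
    """One leapfrog step of the discrete wave equation u'' = c^2 * L(u)."""
    lap = laplacian(u, neighbors)
    return [2 * u[i] - u_prev[i] + c2 * lap[i] for i in range(len(u))]
```

A touch event would seed a displacement at the touched facet, and repeated wave_step calls would ripple it outward across the sphere.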