Originally the idea was to have a "video screen" made up of single-coloured LEDs, but using these fully-addressable RGB WS2812 jobbies, we're concentrating more on patterns and simple shapes, rather than actual images or text. So the plan is to have an editor where the user can create individual "frames" for a pattern, which will be stored in an eeprom chip (maybe something simple like a 24C256 serial eeprom). Then an animation pattern can be played by simply saying "display frame x, scroll left for y frames, with a delay of z milliseconds between each".
Our fully addressable RGB LEDs have multiple levels of red, green and blue, thanks to a 3-byte interface (each colour has an intensity of 0-255, i.e. 0x00 to 0xFF, allowing the full range of 24-bit RGB colours to be displayed). But in truth, there's not really much difference on the LED output between RGB(255,0,0) and RGB(250,0,0). Similarly, the purple produced using RGB(255,0,255) looks very similar to the one produced by RGB(240,0,240), and the same goes for other subtly different intensities of red and blue.
Since each "frame" is a 6x5 grid (30 pixels) and each pixel is 3 bytes, storing the RGB values for a single full-colour frame requires 90 bytes. For both lenses, that's 180 bytes. For a single frame. Our 24C256 eeprom has 32,768 bytes of memory. That sounds like a lot, but it works out at just 182 individual frames (32768 / 180 = 182.04). This may be enough, but it'd be nice to have the extra capacity for "stop frame" animation, should the need arise.
So we've decided that instead of a full 24-bit colour interface, giving a palette of 16 million colours, we're going to borrow the GIF image format idea and use a palette for each image that makes up a frame - simply reduce each colour to 2R, 2G, 2B and use the numbers 0-63 to refer to each individual colour.
Sure, we won't get 255 different shades of purple, for example, but we'll have about four. And four different shades of purple is more than enough for drawing patterns on our glasses lenses! This allows us to store the colour of each pixel in the image in a single byte, meaning one frame of animation requires only 30 bytes per lens. And our eeprom can store over a thousand individual frames of animation (546 per lens).
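As a rough sketch of what that reduction might look like in code (the function name and exact bit layout here are just our illustration, nothing final):

```cpp
#include <stdint.h>

// Reduce a 24-bit RGB colour to a 6-bit 2R-2G-2B palette value (0-63).
// Each 8-bit channel is cut down to just its top two bits.
uint8_t rgbToPalette(uint8_t r, uint8_t g, uint8_t b) {
    return ((r >> 6) << 4) | ((g >> 6) << 2) | (b >> 6);
}

// e.g. rgbToPalette(255, 0, 255) = b110011 = 51 (our brightest purple)
```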
When converting our stored image to a 24-bit value for the LEDs, we'll simply read the first 6 bits of the value from eeprom and break them into 3 lots of 2-bit values (each part having a value of 0-3) - there's a quick code sketch of this after the list below.
On a value zero (b00) we send, obviously, zero (0x00).
On a value one (b01) we send 255/3 = 85 (or 0x55 in hex).
On a value two (b10) we send 2*(255/3) = 170 (or 0xAA in hex).
On a value three (b11) we send 255 (or 0xFF in hex).
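In code, that expansion might look something like this (just a sketch - the multiply-by-85 trick maps 0-3 exactly onto 0x00, 0x55, 0xAA and 0xFF):

```cpp
#include <stdint.h>

// Expand one 6-bit palette value (0-63) back into three 8-bit
// channel values for the WS2812s. Each 2-bit part (0-3) is
// multiplied by 85 to give 0, 85, 170 or 255.
void paletteToRGB(uint8_t p, uint8_t &r, uint8_t &g, uint8_t &b) {
    r = ((p >> 4) & 0x03) * 85;
    g = ((p >> 2) & 0x03) * 85;
    b = (p & 0x03) * 85;
}
```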
In truth, when reducing colours to a low bit-depth, red and blue tend to lose out to green. So reducing a 24-bit 8R-8G-8B colour to 16 bits is usually done by reducing to 5R 6G 5B. Green gets the extra bit, as the human eye can apparently detect a wider range of shades in the green spectrum (http://en.wikipedia.org/wiki/High_color). We've also found, with some RGB LEDs, that the red component can sometimes be a little overpowering (though on some, it's the blue element). So when we come to actually implement our bit-depth reduction, we can "weight" two of the three component colours, should any one be more prominent than the others. In practice, we may actually end up with 2R 3G 3B, for example.
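If we did end up with 2R 3G 3B, the conversion functions might become something like this (purely illustrative - note that 2+3+3 = 8 bits, so each pixel still fits in a single byte):

```cpp
#include <stdint.h>

// A possible weighted reduction: 2 bits of red, 3 each of green and blue.
uint8_t rgbToWeighted(uint8_t r, uint8_t g, uint8_t b) {
    return ((r >> 6) << 6) | ((g >> 5) << 3) | (b >> 5);
}

// Expanding back out: scale each part up to the 0-255 range.
void weightedToRGB(uint8_t p, uint8_t &r, uint8_t &g, uint8_t &b) {
    r = (p >> 6) * 85;                 // 2-bit value 0-3 -> 0, 85, 170, 255
    g = ((p >> 3) & 0x07) * 255 / 7;   // 3-bit value 0-7 -> 0-255
    b = (p & 0x07) * 255 / 7;
}
```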
So we'll simply store each frame of our animations as a stream of 30 bytes (each pixel being a palette colour reference) in eeprom, jump to the address where the image information begins, and read back 30 bytes. We could even program the microcontroller to perform simple animations from a single frame, instead of having to keep reloading the information.
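Here's roughly what that frame read might look like, assuming an Arduino-style Wire library purely for illustration (the device address, frame layout and function name are all our assumptions):

```cpp
#include <Wire.h>

const uint8_t EEPROM_ADDR = 0x50;   // typical 24C256 address with A0-A2 tied low
const uint8_t FRAME_SIZE = 30;      // one palette byte per pixel, 6x5 grid

// Read one 30-byte frame from the eeprom into buf.
// Frame n simply starts at byte address n * FRAME_SIZE.
// (Wire.begin() needs to have been called once in setup().)
void readFrame(uint16_t frameNumber, uint8_t *buf) {
    uint16_t addr = frameNumber * FRAME_SIZE;

    // Set the eeprom's internal address pointer (16-bit address, MSB first)
    Wire.beginTransmission(EEPROM_ADDR);
    Wire.write((uint8_t)(addr >> 8));
    Wire.write((uint8_t)(addr & 0xFF));
    Wire.endTransmission();

    // Then do a sequential read of the whole frame
    Wire.requestFrom(EEPROM_ADDR, FRAME_SIZE);
    for (uint8_t i = 0; i < FRAME_SIZE && Wire.available(); i++) {
        buf[i] = Wire.read();
    }
}
```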
For example, we might load an image into memory and draw it from the top-right corner to the bottom-left, using the sequence below:
We draw our images from top-left to bottom-right simply because of the LED pixel layout - the data-in pins are on the bottom right of the LED, so it makes sense to push the data into the first pixel from the right, shunting it along to the next LED (to the left).
If we wanted to scroll the frame upwards, for example, we don't need to go back to the eeprom to load a second frame of animation - we simply do some array value shuffling, as shown on the spreadsheet under "shift up": array(1) takes the value in array(7), array(7) takes the value in array(13), array(13) takes the value that was in array(19), and so on. Once all the values have been shifted around in the array, the 30 colours (90 bytes) of data are shifted into the LED matrix, and the new, scrolled image appears.
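As a sketch, with the 30-pixel frame buffer laid out six pixels per row (zero-based here, and the wrap-the-top-row-to-the-bottom choice is ours - blanking it would work just as well):

```cpp
#include <stdint.h>
#include <string.h>

const int WIDTH = 6, HEIGHT = 5;
const int PIXELS = WIDTH * HEIGHT;   // 30 palette bytes per lens

// Scroll the frame buffer up one row: each pixel takes the value of
// the pixel one row (six positions) below it, exactly as in the
// "shift up" sequence above.
void scrollUp(uint8_t frame[PIXELS]) {
    uint8_t topRow[WIDTH];
    memcpy(topRow, frame, WIDTH);                   // save row 0
    memmove(frame, frame + WIDTH, PIXELS - WIDTH);  // shift rows 1-4 up one row
    memcpy(frame + PIXELS - WIDTH, topRow, WIDTH);  // old top row wraps to bottom
}
```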
Of course, all this is fine in theory. What we need to do is finish soldering up at least one lens, to allow us to actually test it all out!