arduino – Samposium
https://samnicholls.net — The Exciting Adventures of Sam

We made a Lego DNA Sequencer to teach kids about DNA, Sequencing and Phenotypes
https://samnicholls.net/2017/03/15/lego-sequencer/
Wed, 15 Mar 2017 23:24:23 +0000

Every year the university is host to over a thousand primary school pupils as part of British Science Week. Last year you may remember I ran an activity that taught you how to be a sequence aligner through the medium of Lego. Describing what I’d like to do differently in future, I included the following:

To improve, my idea would be to get participants to construct a short genome out of Lego pieces that can be truly “sequenced” by pushing it through some sort of colour sensor or camera apparatus attached to an Arduino inside a future iteration of the trusty SAMTECH Sequencer range.

So I’d like to introduce the Sam and Tom Industrys Legogen Sequenceer(TM) 9001:

The Sequencer

The Lego “Sequenceer” has a tray that holds a small stack of Lego bricks (up to 12). Once loaded, a toothed side to the tray is pulled along a track with a gear attached to a stepper motor that was commandeered from an old Epson printer. An RGB colour sensor sits just over the track on a small acrylic bridge, in very close proximity to the passing bricks. The stepper is programmed to actuate a calibrated number of steps per brick, stopping to allow the RGB sensor to take a reading of a brick’s colour before moving to the next.

Our software then translates these readings to one of the four DNA bases and prints the result on the serial port for the user to see. Once complete, the stepper mode is reversed and the tray is “ejected” such that it protrudes once more out the front of the machine to allow for unloading and loading of the next Lego brick DNA sequence.
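One way to implement that base-calling step is a nearest-colour lookup against a set of calibrated reference readings. This is only a host-testable sketch of the idea, not the actual firmware: the reference RGB values are invented, and of the four mappings only blue = T is mentioned later in the post.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical reference colours for the four brick types. Only blue -> T
// is confirmed in the post; the other three mappings are illustrative.
struct BrickColour { char base; uint8_t r, g, b; };

const BrickColour kBricks[] = {
    {'A', 220,  40,  40},  // red    (assumed)
    {'C', 240, 220,  50},  // yellow (assumed)
    {'G',  40, 180,  60},  // green  (assumed)
    {'T',  40,  70, 200},  // blue   (stated in the post)
};

// Classify a raw RGB reading by nearest reference colour, using squared
// Euclidean distance in RGB space.
char callBase(uint8_t r, uint8_t g, uint8_t b) {
    char best = '?';
    long bestDist = -1;
    for (const BrickColour &ref : kBricks) {
        long dr = (long)r - ref.r, dg = (long)g - ref.g, db = (long)b - ref.b;
        long dist = dr * dr + dg * dg + db * db;
        if (bestDist < 0 || dist < bestDist) { bestDist = dist; best = ref.base; }
    }
    return best;
}
```

In practice the reference values would come from calibrating the sensor against each brick colour under the sequencer's own lighting.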

The Activity

My previous activity tasked pupils with aligning pre-constructed Lego DNA sequences to each other. The goal was to introduce the idea of short read sequencing and how alignment of a short sequence is hard, let alone that of an unexpectedly-long human genome (the highest guesses for base pair length were in the thousands). In general, the students enjoyed it, but I felt I could massively improve the task if we used Lego for its intended purpose: construction.

This year, after quizzing pupils on what they know about DNA, I once again introduce the concept of using 2×2 blocks of Lego as DNA bases. We have four colours, matching the four bases of DNA we see in our own genomes. I explain that today we will construct a 10 base DNA sequence, representing the entire genome of a monster, and that we can find out what that genome is with our sequencer.

I designed some official-looking forms onto which the pupils could copy out the base calls for each of the ten bases on their monster’s genome, each of which happens to be a gene that controls an interesting phenotype:

Our young scientists then look at our “monster datasheet”, which decodes each of the genes and tells them what the phenotype for each of the alleles will be:

Now given their sequenced monster genome, and the datasheet, we ask the pupils to make their monster’s phenotype a reality – draw their monster to add to our records! The hardest part? Naming their newly discovered monster. Finally they add their name and school, to credit them with their discovery. Here was our first of the day. I stole our department’s stamp for an additional feeling of official-ness:
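The datasheet is essentially a two-level lookup from (gene, base) to a trait. A hypothetical slice of it in code, with entirely made-up gene positions and traits; the post only notes that blue (T) tended to produce 8-legged, 8-eyed monsters:

```cpp
#include <cassert>
#include <string>

// Hypothetical decoding of two of the ten genes. Positions, alleles and
// traits here are invented for illustration, not the real datasheet.
std::string decodeAllele(int gene, char base) {
    if (gene == 0) { // hypothetical "legs" gene
        switch (base) {
            case 'A': return "2 legs";
            case 'C': return "4 legs";
            case 'G': return "6 legs";
            case 'T': return "8 legs";
        }
    }
    if (gene == 1) { // hypothetical "eyes" gene
        switch (base) {
            case 'A': return "1 eye";
            case 'C': return "2 eyes";
            case 'G': return "3 eyes";
            case 'T': return "8 eyes";
        }
    }
    return "unknown";
}
```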

Conclusion

I’m so pleased with how this project has turned out. Not only is the sequencer itself the coolest electronics project I’ve had a hand in, but the reaction to the stand on the day from children and adults alike was so positive. The pupils loved the concept: getting to build something with Lego, and then drawing a cool monster that has special powers is clearly a recipe for success. It is rewarding to see that they’ve enjoyed the activity, as well as to know they can now tell you something about DNA and phenotypes too.

This is a significant improvement on my stand from last year. We get to look at bases and DNA, sequencing and, importantly, how changes in bases correspond to changes in a phenotype: which makes us who we are. The activity tells much more of a story, rather than just aligning Lego bricks to explain that a hard problem is indeed hard. Here, we get to physically construct a tangible proxy to a genome and find out something about it through an experiment. Finally we add the result to the sum of knowledge on the subject of monsters discovered so far. My wall is now full of unique and interesting monster records, each drawn with some artistic license from the scientist in charge. Though, despite the large sequence space, I was amused to note the bias for 8 legged and 8 eyed monsters due to an environmental pressure for the selection of the T nucleotide: the colour blue.

Once again, I’ve had much more fun than my regular grumpy-self was expecting. I’ve been reminded that public engagement can be very rewarding.

Engage engagement

Some pictures of the stand from the day:

Finally, here are the images from our inaugural Monster Lab:

[Gallery: Monster Lab — Aber Uni Science Week 2017]

tl;dr

  • Tom and I upgraded the old SAMTECH 9000 to create the Arduino-powered self-scanning Legogen Sequenceer 9001
  • Spending the entire day with children continues to be absolutely exhausting
  • Public engagement continues to be rewarding
  • Lego is an excellent way to introduce the concept of DNA, and genomes
  • Blue is a particularly popular favourite colour
  • Primary school children aren’t as dumb as everyone says they are
  • Packs of genuine 2×2 Lego bricks are really bloody expensive
Building a Beehive Monitor: A Prototype
https://samnicholls.net/2017/02/28/beehive-monitor-p1/
Tue, 28 Feb 2017 00:31:11 +0000

Ever since I got my own beehive last summer, I’ve been toying with the idea of cramming it full of interesting sensors to report on the status of my precious hive. During Winter, it is too cold to perform inspections, and since my final inspection last October I’ve grown increasingly curious and anxious as to whether the bees had survived. Last week, I finally got a spell of reasonable weather and I’m happy to report that the bees are doing well.

It has been some time since I’d last seen inside the hive. During and following the inspection, I was once again inspired to find a way to monitor their progress. Although my motivation is primarily self-interest, as I find the hive fascinating, monitoring a hive both poses some interesting technical problems and provides a platform on which to actually quantify the colony's status. Such information could become much more valuable in future when we have more hives, as it would be possible to establish a baseline from which to detect anomalies early. But again, mostly because beehives are cool.

Proof of Concept Monitor and Transmitter

I’d been discussing these ideas with Tom, who you might remember from the lightstick project as the half of the newly created Sam and Tom Industrys venture that is actually capable of making things with electronics.

We began assembling a quick proof of concept. Handily, Tom had a bunch of DHT22 temperature and humidity sensors lying around, which provided a good start. As for a microcontroller, I’ve wanted an excuse to play with radio communication for some time, and this was a nice opportunity to try out a pair of Moteino R4s. We also discussed a future idea to use Bluetooth, because the idea of pairing my phone with a beehive during an inspection seemed both amusing and a useful way to download data that is less practical to transmit over radio.

We set to work. Tom retrofitted some code he had previously written to manage wireless comms between Moteinos, while I cut and badly stripped some wires. After some breadboard battling, we had a working prototype in a quarter of an hour. Not content with this, we attached the sensors to lengths of flat cable, then added a TEMT6000 ambient light sensor and a NeoPixel ring for no particular reason other than the fact we love NeoPixels so much.

Base Station Prototype Receiver

We verified that the packets containing the sensor data were being broadcast. The Moteino queries the sensors and loads the results into a struct, which is then transmitted over radio as binary by the RFM69 library. Now we needed a base station to pick up the transmissions and do something interesting.
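The post doesn't show the packet struct itself, so here is a hypothetical sketch of the layout plus the type-byte and length checks, as plain host-testable C++ with the radio calls left out. All field names and the type value are assumptions:

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>

// Hypothetical packet layout: the only thing the post specifies is that
// the first byte identifies the packet type. packed avoids the alignment
// padding that bit us later.
struct __attribute__((packed)) SensorPacket {
    uint8_t type;       // packet type discriminator (only one type defined)
    int16_t temp_c10;   // temperature in tenths of a degree C (DHT22)
    uint16_t humid_10;  // relative humidity in tenths of a percent
    uint16_t light;     // TEMT6000 ambient light reading
};

// Serialise to a byte buffer, as handed to the radio library for transmission.
size_t packPacket(const SensorPacket &pkt, uint8_t *buf) {
    memcpy(buf, &pkt, sizeof(pkt));
    return sizeof(pkt);
}

// Receiver side: check the type byte and length before trusting the payload.
bool unpackPacket(const uint8_t *buf, size_t len, SensorPacket &out) {
    if (len != sizeof(SensorPacket) || buf[0] != 0x01) return false;
    memcpy(&out, buf, sizeof(out));
    return true;
}
```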

We already had packet handling and receiving code for the second Moteino, but as we’d decided (for the time being at least) that the endpoint for the sensor data was to be hosted on the internet, we needed WiFi. Enter our WiFi enabled friend: the ESP8266.

We needed to establish a software serial link between the Moteino and ESP8266, a trivial use-case solved by the Arduino SoftwareSerial library. I did encounter some familiar confusion arising from our trusty pinout diagram: connecting the marked RX and TX pins on the NodeMCU to the Moteino interfered with my ability to actually program it. We instead set the RX/TX pins to the unused D1 and D2 respectively.

The receiving Moteino captures each transmitted packet, reads the first byte to determine its type (we’ve defined only one, but it’s good to future proof early) and if it is the right size, passes the binary data through its serial interface to the ESP8266 before returning an acknowledgement to the transmitting Moteino.

The NodeMCU uses the ESP8266WiFi library to establish a connection to the SSID defined in our configuration file. The ESP8266 Arduino repository serves as both a helpful lookup of the source, and typically features examples of each library.

Once connection to the designated SSID has been negotiated, the controller listens to the software serial receive pin for data. When available, the first byte is read (as before on the Moteinos) to determine the packet type. If the packet type is valid, we listen for as many bytes as necessary to fill the struct for that particular packet type. We ran into a lot of head scratching here, eventually discovering that our packet structs were larger than calculated, as uint8_t‘s are padded with 3 extra bytes to keep the wider fields aligned on the architecture of the NodeMCU. This cost us almost an hour of debugging many off-by-some-bytes oddities in our packet handling.
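The padding behaviour is easy to reproduce off the microcontroller. A minimal sketch, assuming a struct mixing a uint8_t type byte with a wider field; most desktop compilers align 4-byte fields the same way the NodeMCU toolchain does:

```cpp
#include <cassert>
#include <cstdint>

// A lone uint8_t followed by a uint32_t picks up 3 bytes of padding so the
// wider field stays 4-byte aligned: the struct "grows" past the sum of its
// members, breaking any length calculation done by hand.
struct Padded {
    uint8_t type;   // 1 byte...
                    // ...+ 3 invisible padding bytes inserted here
    uint32_t value; // must start on a 4-byte boundary
};

// packed removes the padding, at the possible cost of slower (or, on some
// MCUs, faulting) unaligned accesses to the wider members.
struct __attribute__((packed)) Unpadded {
    uint8_t type;
    uint32_t value;
};
```

Declaring the wire-format structs packed on both ends (or ordering fields widest-first) is the usual way to sidestep the off-by-some-bytes oddities.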

Once a valid packet’s binary stream has been loaded into the appropriate struct, we translate it to a JSON string via the rather helpful ArduinoJSON library. As the ESP8266HTTPClient requires a char* payload, I wasn’t sure of the correct way to actually transmit the JSONObject, so I rather unpleasantly abused its printTo function to write it into a fixed size char array:

[...]
    // Build a JSON document from the binary payload received over serial
    StaticJsonBuffer<256> jsonBuffer;
    JsonObject& json = jsonBuffer.createObject();

    jsonifyPayload(json, payload, pkt_type);

    // begin() takes the endpoint's SHA1 fingerprint, required for HTTPS
    http.begin(endpoint, fingerprint);
    http.addHeader("Content-Type", "application/json");

    // printTo writes the JSON into a fixed-size buffer that http.POST accepts
    char msg[256];
    json.printTo(msg);
    Serial.print(msg);

    int httpCode = http.POST(msg);
[...]

Note the fingerprint in the http.begin: a required parameter containing the SHA1 fingerprint of the endpoint address if you are attempting to use HTTPS. Without this, the library will just refuse to make the connection. It would have been trivial to diagnose this earlier if we could actually work out how to properly enable debug messages from the ESP8266HTTPClient library. Despite our best attempts, we had to force the debug mechanism to work by manually editing away the guards around the DEBUG_HTTPCLIENT #define in the library header, in the installed packages directory. No idea why. Grim.

More time was wasted on a lot of tail chasing after what appeared to be the delivery of a successful POST request returned an httpCode of -5. It would later turn out that the Python SimpleHTTPServer example I lazily downloaded from the web failed to set the headers of the response to the POST request, causing the ESP8266HTTPClient to believe the connection to the server was lost (an all-else-fails assumption it makes if it is unable to parse a response from the server). The -5 refers to a definition in the library header: #define HTTPC_ERROR_CONNECTION_LOST (-5).

Finally, we received HTTP_CODE_OK responses back from our test server, which happily dumped our JSON payload to the terminal on my laptop… after I remembered to join the WiFi network we’d connected the NodeMCU to, at least.

The Web Interface: Back from the Dead

We had established end-to-end communications and confirmed that sensor data could find its way to a test server hosted on the local network. The final piece of the puzzle was to throw up a website designed to receive, store and present the data. Somewhat fortuitously, a project that I had begun and abandoned in 2014 seemed to fit the bill: Poisson, a flask-fronted, redis-backed, schema-less monitoring system. Despite not looking at it in almost three years, I felt there might be some mileage in revising it for our purpose.

Initially, Poisson was designed to monitor the occurrence of arbitrary events, referred to by a key. A GET request with a key would cause Poisson to add a timestamped entry to the redis database for that key. For example, one could count occurrences of passengers, customers, sales or other time sensitive events by emitting GET requests with a shared key to the Poisson server. The goal was to generate interesting Poisson distribution statistics from the timestamped data, but I never got as far as the fun mathematics. The front end made use of the hip-at-the-time socket.io framework, capable of pushing newly detected events in real time to connected clients, and drawing some pretty, pretty graphs using d3 and cubism.

After some quick tweaks that allowed requests to also specify a value for the latest event (instead of assuming 1) and provided an API endpoint for POST requests that wrapped the mechanism of the original GET for each key:value pair in the JSON data, we were ready to go.

Until I discovered all this newfangled websocket nonsense didn’t seem to play well with Apache in practice. After much frustration, the best solution appeared to involve installing gevent with pip on the server, which, according to the flask-socketio deployment documentation, causes flask-socketio to replace the development server (typically inappropriate for production use) with an embedded server capable of working in production. I could then invoke the application myself to bind it to some local port. After some fiddling, I found the following two lines in my apache config were all that was needed to correctly proxy the locally-run Poisson application through my existing webserver.

ProxyPass / http://localhost:5000/
ProxyPassReverse / http://localhost:5000/

Testing the Prototypes (not in a hive)

Clearly, we’re not hive-ready yet. We’ve planted the current version of the hardware at Tom’s house (to avoid having to deal with getting the ESP8266 to work with eduroam‘s 802.1X authentication) and had the server running long enough to collect over 100,000 events and iron out any small issues. We’re currently polling approximately every 30 seconds, and the corresponding deployment of the Poisson interface is configured to show over 24 hours of data at a resolution of 2 minutes, and the past 90 minutes at a resolution of 30 seconds.

All in all, I’m pretty pleased with what we’ve managed to accomplish in around two days of half-baked work, and the interface is probably the shiniest piece of web I’ve had the displeasure of making since Triage.

Conclusion

We’ve collected over 100,000 events with little issue. I’ve ordered some additional sensors of interest; on the way are some sound sensors and infrared sensors. The latter I shall be using in an attempt to count (or at least monitor) bee in/out activity at the hive entrance.

It won’t be until nearer the middle-to-end of March that the weather will likely pick up enough to mess around with the hive at length, though we must bee-proof the sensors and construct a weatherproof box for the transmitting Moteino before then. You can also look at the microcontroller code in our Telehive repository, and the Poisson flask application repository.

In the meantime, go spy on Tom’s flat live.

tl;dr

  • Moteinos are awesome, low power radio communications are cool
  • The Arduino IDE can be a monumental pain in the ass sometimes
  • We had to manually uncomment lines in the headers for the ESP8266HTTPClient library to enable debugging, we still cannot fathom why
  • You need to specify the SHA1 fingerprint of your remote server as a second parameter to http.begin if you want to use HTTPS with ESP8266HTTPClient
  • Your crap Python test server should set the headers for its response, else your ESP8266HTTPClient connection will think it has failed
  • ESP8266HTTPClient uses its own response codes that are #define‘d in its headers, not regular HTTP codes
  • uint8_t‘s are padded with 3 extra bytes to keep wider fields aligned on the NodeMCU ESP8266 architecture
  • This kind of stuff continues to be fraught with frustration, but is rather rewarding when it all comes together
Controlling a NeoPixel LED Strip and SD Card with an Arduino ESP8266: My Lightstick Prototype
https://samnicholls.net/2016/11/07/lightstick-p1/
Mon, 07 Nov 2016 10:26:12 +0000

For quite some time now, I’ve been wanting to make an LED lightstick, both as a nice side-project to learn some hardware concepts and to take pretty long-exposure pictures. However, I’d always been a bit unsure as to where to start. Luckily for me, I’ve crossed paths with someone who knows what they are doing: in the shape of Dr. Tom Blanchard.

To skip the technical rambling and see some pretty pictures: check out my Lightstick Flickr album.

The Initial Prototype

Part of the reason I had put off starting this project was having to acquire and assemble many RGB LEDs into an array myself. Luckily Adafruit stock configurations of LEDs with an integrated driver (which they term NeoPixels) in various form factors, including chained together in 1m strips. Perfect! Now I could control a metre of RGB LEDs with just one pin from a microcontroller and some power. Having neatly avoided the hard task of sourcing and wiring-up enough RGB LEDs, and devising a signalling strategy to control them, the project was off the ground.
On Tom’s advice I purchased or commandeered the following components:

There are already plenty of good tutorials on driving Neopixels with an Arduino (Adafruit’s help pages are great), but I’ll attempt to briefly describe the construction of our first prototype, without giving the impression that this is a tutorial.

The primary structural component is sellotape, affixing the NeoPixel strip to the wooden dowel (which was not previously used by Tom to hang laundry), as well as the Arduino, voltage regulator and cabling. The battery cables are connected to the VIN and GND pins on the input side of the voltage regulator, which is responsible for stepping down the voltage to the 5V desired by the Arduino and NeoPixels. The VOUT and GND regulator pins are connected separately to both the Arduino and the NeoPixel strip.

[Photo: the assembled prototype]

Tom bastardised a USB Type-B cable to power the Arduino Uno from the voltage regulator. The NeoPixels are controlled with just one pin from the microcontroller (the green cable in the diagram below); any spare digital output pin will do. There seemed to be an additional grounding cable (white) from the NeoPixel strip, so I plugged this into the Arduino GND.

[Wiring diagram, with the green signal cable]

In terms of programming, the hard part of interfacing with the NeoPixels has been dealt with by the lovely Adafruit team already, and their library can be easily downloaded via the Arduino IDE (see instructions for details). To test the prototype we just loaded the strandtest sample code.

Switching the Arduino for an ESP8266

The Arduino is not only rather chunky, but pretty overpowered for simply driving a NeoPixel strip. Despite a working prototype, I wanted to switch out the Arduino for something smaller. Checking my robotics toolbox, I found an ESP8266 lying around in the form of a NodeMCU.

Programming an ESP8266 with the Arduino IDE

Helpfully, we can program the ESP8266 from the Arduino IDE. This is nice, because although I don’t particularly like the Arduino IDE, it means I don’t have to learn how to program the controller from the command line properly; a boon for my lazy self. The relevant ESP8266 board configuration packages can be installed using the Arduino IDE Boards Manager (short instructions here).

For testing purposes I borrowed a NeoPixel Ring, primarily because it can be powered over USB (via any of the 3.3V output pins), and also for its more manageable size when breadboarding. I attached the NeoPixel Ring to one of the 3.3V pins and GND of the board. I modified strandtest to adjust the number of NeoPixels in the constructor (the ring has 12, compared to the 144 of the 1m strip) and inserted the signal cable into pin D6.

Nothing happened.

After more swearing and head scratching than I would like to admit, I discovered that the ESP8266 [1] uses GPIO pin numbers, not physical pin numbers. That is, one must refer to the pinout diagram to discover the correct pin numbers to use when referring to them in code. More embarrassingly, and perhaps more critically, I found that the circuit worked if I shifted my power and ground cables on the breadboard so that they actually lined up with the 3.3V and GND pins of the NodeMCU. Woops.

[Photo: the NodeMCU and NeoPixel ring on the breadboard]

For future reference I’ve found the following two pinout diagrams quite invaluable.
Note the diagrams are in different orientations, which totally didn’t cause me to incorrectly wire my NodeMCU twice.

[Diagram: NodeMCU pinout]

[Diagram: ESP8266 DevKit pinout]

ESP8266 Lightstick Prototype

Having got the ESP8266 working with the NeoPixel ring on the breadboard, it was trivial to switch out the Arduino in our prototype. Simply connect the VIN and GND pins to the voltage regulator output in place of the Arduino, and connect the NeoPixel signal cable to any free GPIO pin (remember, D6 is GPIO 12). When programming the NodeMCU over USB, always unplug VIN and GND: otherwise the circuit will attempt to light the entire NeoPixel strip over USB…

Controlling an SD Card Reader with an ESP8266

Tom and I had an idea where we wanted to be able to load LED configurations from files on an SD card. Unfortunately this involved moving everything back to a breadboard, ruining the lovely miniaturised form factor we’d earned by switching to the ESP8266 in the first place. I unsoldered the cables attached to the controller and repinned everything on the breadboard, with an SD card reader I had ordered for another project that I’ve been putting off.

Getting the SD card working turned out to be a rather frustrating side-quest. I had difficulty finding documentation online, as well as decently drawn wiring diagrams, when searching for terms like “8266” and “SD Card”. Tom suggested I instead search for information on the “SPI” (Serial Peripheral Interface) pins. This yielded better results, but I still had trouble trying to read an SD card. I was using the various code examples hosted on Github by the Arduino project for the SD card library to test the set-up, to no avail. I referred to my pinout diagrams repeatedly, before eventually noticing that one of them was upside down when compared to the other, which had complicated my attempts to wire and configure the board.

Despite this, I still couldn’t seem to get a signal over the SDCMD (CS) pin. After almost half an hour of searching, I read that this SPI interface would often conflict with operational code already running on the microcontroller. I don’t know if this is true, but I swapped over to the HSPI pins (Hardware SPI) on the opposite side of the board (GPIO 12 to GPIO 15) in the following configuration:

  • CS to D8 (GPIO 15)
  • DI to D7 (GPIO 13)
  • DO to D6 (GPIO 12)
  • CLK to D5 (GPIO 14)
  • 3v and GND to the likely suspects…

[Photo: the SD card reader wired to the HSPI pins]

Success! Finally, the CardInfo and ReadWrite examples worked! I could initialise an SD card, as well as read and write arbitrary files on that card. I created a quick file format where the first line gives the number of RGB colour lines in the file, N, followed by that many 10-byte lines containing 0-padded strings defining an RGB colour in the format RRRGGGBBB\n. For example:

4
255020147
255000000
000255000
000000255
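Off the microcontroller, parsing this format is straightforward. A host-side sketch of the parser, minus the SD library calls; the function and struct names here are made up, not those in the Lightstick repository:

```cpp
#include <sstream>
#include <string>
#include <vector>

struct Colour { int r, g, b; };

// Parse the lightstick file format: a count line, then that many
// zero-padded RRRGGGBBB lines.
std::vector<Colour> parseColourFile(const std::string &text) {
    std::istringstream in(text);
    int n = 0;
    in >> n;                // first line: number of colour lines, N
    std::string line;
    std::getline(in, line); // consume the remainder of the count line
    std::vector<Colour> colours;
    for (int i = 0; i < n && std::getline(in, line); ++i) {
        if (line.size() < 9) continue; // skip malformed lines
        Colour c;
        c.r = std::stoi(line.substr(0, 3));
        c.g = std::stoi(line.substr(3, 3));
        c.b = std::stoi(line.substr(6, 3));
        colours.push_back(c);
    }
    return colours;
}
```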

The latest prototype can now read this format from an SD card and cycle through the selected colours. Cool! If you’re interested, our Lightstick code is on Github. Below is the v0.2 prototype breadboard affixed to the ex-laundry stick, with my favourite structural adhesive [2].

[Photo: the v0.2 prototype breadboard on the ex-laundry stick]

Pretty Photos

Our lightstick was ready for a proper test. We wandered campus last night looking for interesting patches of darkness to paint and took a bunch of long-exposure images. I’ve put a few of them on Flickr (below). Although just a prototype, I’m really pleased with the results. There are still a bunch of enhancements to make and a few issues to iron out, particularly voltage monitoring (so as not to drain the battery permanently), and levelling the brightness of some of the LED colours, which can wash out the others.
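On the brightness levelling point, one possible approach (a sketch of an idea, not what we've implemented) is to scale each colour so its brightest channel never exceeds a shared cap, keeping hue while stopping saturated colours from washing out dimmer ones:

```cpp
#include <algorithm>
#include <cstdint>

struct RGB { uint8_t r, g, b; };

// Scale a colour so its peak channel is at most `cap`, preserving the
// ratio between channels (and so, roughly, the hue). The cap value would
// be tuned by eye against the camera.
RGB levelBrightness(RGB c, uint8_t cap) {
    uint8_t peak = std::max(c.r, std::max(c.g, c.b));
    if (peak == 0 || peak <= cap) return c; // already at or below the cap
    RGB out;
    out.r = (uint8_t)((uint16_t)c.r * cap / peak);
    out.g = (uint8_t)((uint16_t)c.g * cap / peak);
    out.b = (uint8_t)((uint16_t)c.b * cap / peak);
    return out;
}
```

A per-channel gamma correction would be the next refinement, since the LEDs' perceived brightness is not linear in the duty cycle.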

[Flickr album: Lightstick]

It’s really cool to have actually put something together that works!

tl;dr

  • NeoPixels are pretty cool
  • Sellotape is a perfectly acceptable structural component no matter what Tom says
  • The ESP8266 is a tiny board of awesome
  • When specifying pins on the ESP8266, you need to use the GPIO number (e.g. D6 == GPIO 12)
  • When comparing pinout diagrams, ensure they are the same orientation to avoid wiring everything incorrectly multiple times
  • Electronics seems to be quite a frustrating endeavour
  • It’s quite easy to fuck up electronics and make things not work permanently [3]
  • I MADE A THING

  1. Or at least the board I have… 
  2. Tom finally suggested that I should perhaps consider switching to electrical tape, for its anti-static, anti-conductive properties, which I gather come in handy in this field, or something. 
  3. I haven’t done this yet, but I’ve come close to irrevocably draining my new battery 