Introduction: Pumpkin Spicer

It's a pumpkin with Sean Spicer's face on it. It emits a pumpkin spice scent. It's Pumpkin Spicer!

Bonus: Laser-engraved Sean Spicer face. Sean Spicer soundbites.

Step 1: Overall Architecture/Design

Inside the pumpkin will be the following, for the basic version:

  • A lamp (in my case, an LED or incandescent lightbulb)
  • Pumpkin spice scent emitter (in my case, a hacked-together wax melter with a Glade pumpkin spice wax melt)

Extra stuff for the advanced version:

  • Motion sensor
  • Microcontroller board (in my case, a NodeMCU v2 ESP8266 development board)
  • Audio playback module (in my case, a DFRobot DFPlayer Mini)
  • Speakers (in my case, some old computer speakers)
  • Switchable lighting (in my case, some signage LEDs)

A Raspberry Pi would likely be easier to use for this project, if you use one that has an audio output port. The ESP8266 doesn't have a proper audio DAC, which is why I'm using the DFPlayer; you wouldn't need that if you used a Raspberry Pi.

The motion sensor will send a signal to the microcontroller when it detects someone walking by. The microcontroller will react to this by choosing a random sound clip and sending a message to the audio module to play that sound clip. It will also (optionally) provide a Wi-Fi-accessible web interface so you can tell it to play specific sound clips using your phone or computer. The audio module will read sound files from a microSD card or USB flash drive and send the analog audio to the speakers. The pumpkin spice scent emitter will just run continuously because that's how wax melts work—if you use an aerosol spray device instead, you could also trigger it from motion, but I couldn't find a pumpkin spice aerosol locally.

Step 2: Pumpkin Spice Scent

I bought some Glade wax melts with pumpkin spice scent for about $5 from Walmart. They seem to be really expensive (like $50) everywhere online, at least in Canada.

If you have a Glade wax melter, or if you're using an aerosol, skip the rest of this step.

I didn't have a wax melter for them, so I looked it up and found that Big Clive has done a teardown video on one. He says it has a 20-watt heater. To test the wax, I put a cube in the bottom of a soft drink can and warmed it over a 60-watt incandescent lightbulb. It smelled more like plain wax than pumpkin spice, but just sniffing the cold wax cubes smelled great, so I guessed the wax was being overheated.

I was planning to build a resistive heater (equivalent to the Glade one), but ran out of time before Maker Faire—see later steps. Anyway, I just put the can bottom next to the lightbulb in the pumpkin and it worked fine to produce the scent without overheating the wax.

Step 3: Test Laser Cutting and Engraving of a Pumpkin

At Protospace we have two laser cutters, a Rabbit RL-80-1290 (80 watts, HV-excited CO2) and a Trotec Speedy 300 (80 watts, RF-excited CO2). I'm using the former for this project. The Trotec does have an option for engraving to different depths based on the darkness of the image, but I found that that wouldn't work well on a pumpkin—see the next step.

For cutting, I found that speed 0.5 (where speed 1 is a bit more or less than 1 mm/s—I don't remember which way it deviates) and power 100 (80 watts) was sufficient to cut all the way through one side of the pumpkin. However, this laser cutter has a longer lens on it than most hobby CO2 laser cutters do, so you might have to do multiple passes at different depths if you have a standard lens.

For engraving, I found that there was a wide range of settings that would work. This instructable says that pumpkin engraves well using the same settings as for leather. I've never done leather, but that might be a good starting point, and should be easy to look up for whatever power of laser cutter you have.

Anyway, find some good settings and techniques for your laser cutter and your pumpkins, by trial and error.

Step 4: Design Spicer Face

Start with a picture of Sean Spicer. I used this one. Copyright-wise, it's in the public domain because it was taken by an employee of the US government.

Extract Spicer's face. I used Pixlr, a Web-based Photoshop clone that happens to also be owned by Autodesk (the owner of Instructables). I cropped the image and then applied a layer mask to extract his face. To be able to apply a layer mask, the layer must first be changed from a background layer to a regular layer, which you can do by double-clicking the lock icon on it in the Layers palette. Then add a layer mask using the icon at the bottom of the Layers palette, and paint on the mask to hide the background so that only the face remains visible (in most editors, black on the mask hides and white reveals). If you hide or reveal the wrong areas, just paint the opposite colour back onto the mask. That's the advantage of using a layer mask instead of erasing the background: it's easily reversible if you take away too much.

The rest of this step is stuff that didn't work for me/this image/this laser cutter. I present it for completeness and in case it works for someone else. What worked for me is in the next step.

Once that was done, I first tried layered engraving by making several copies of the layer, and thresholding each at a different value. Then I exported each layer to a separate 1-bit BMP file, which is what the laser cutter software we use for the Rabbit laser cutter likes. Then I engraved each layer one after the other, such that where the image was lighter (e.g. Sean's forehead) it would be engraved more times. That produced the Frankenstein-looking face you see on the pumpkin.

That wasn't the look I was going for.

I next tried tracing the contours of his face using the "Trace Bitmap" feature of Inkscape, an open-source vector graphics editor. To do that, import the raster image, right-click it, and choose "Trace Bitmap". In the screenshots above, I used the cropped but not extracted one. That produced something that looked very similar to the thresholded layers I created in Pixlr, so I gave up on this.

My next attempt was manual tracing, i.e. drawing Sean Spicer's face by hand based on the photo. I used the pen tool in Inkscape. When I was done I made the lines orange and put them on a black background to simulate how it would look vector-engraved on a pumpkin. It didn't look recognizable.

Then I thought to try halftone…

Step 5: Spicer Face Halftone and Good Engraving

This is the method you'll probably want to use. I went back to Pixlr to try halftone. What I found worked best was:

  1. Scale up the image by 3 times or so using Image Size.
  2. Run the halftone filter with Inverse on and Add off.
  3. Export 1-bit PNG. (This is the fourth image above. Go ahead and download it; the subsequent steps will probably vary depending on your laser cutting software.)
  4. Open in Microsoft Paint.
  5. Export 1-bit BMP.
  6. Engrave! I tested a couple of times on some regular printer paper and also on the back/bottom of the pumpkin.

Step 6: Gut the Pumpkin

You can also do this after engraving, but the pumpkin will be lighter to handle if you do it first. Also, if you gut it first, you can see how it looks illuminated as soon as you take it out of the laser cutter, and you won't risk damaging the engraving (which I learned is easy to do with a halftone engraving) while gutting it. Scrape the inside well, especially where you're planning to engrave. I should have scraped mine more to let more light through.

Also, consider doing something to enhance the contrast. You could spray-paint the outside of the pumpkin black (and maybe then orange again), or cover it with something else opaque. You'll need to do that before engraving, because once the pumpkin is out of the laser cutter, you'll have a very difficult time getting it perfectly aligned to engrave again. Anyway, I didn't do that stuff and it worked okay, but could have been better.

Step 7: Engrave the Pumpkin!

If you're satisfied with everything so far, go ahead and put your pumpkin in the laser cutter front-side-up, and run the engrave! Then put in a lamp of some kind, and the pumpkin spice wax melts. I used an LED bulb first, and it was super bright (way brighter than the room lights at Protospace, even through the pumpkin), but it wasn't hot enough to melt the wax. A 60 W incandescent bulb melted the wax, but was a bit dim. Unfortunately, I only had one of those outlet-to-bulb adapters. I should have scraped the pumpkin thinner at the front, but I was in a huge hurry by this point.

Anyway, the minimum viable product is now done! It's a pumpkin with Sean Spicer's face on it, and it produces a pumpkin spice scent!

Step 8: Show It Off!

Maker Faire Calgary was this past weekend, and I was volunteering to staff the Protospace booth. I didn't get Pumpkin Spicer done in time for day 1 of the Faire, but I did (barely) get the MVP done for day 2.

Step 9: Get Sean Spicer Audio Clips

For the audio portion of this project, you'll need to obtain some audio clips of Sean Spicer saying things.

Wikimedia Commons has a lot of videos of his press conferences. Like the image of his face I used, they're from Voice of America and therefore in the public domain, so we're free to use them.

You'll need to download them, convert them to audio-only, cut them up, and save them as MP3 for the DFPlayer Mini to use.

To download them, right-click the download button for the lowest-quality Ogg file in the "Transcode status" section of the file page on Wikimedia Commons and choose "Save file as…" (or equivalent in your web browser). Save all the ones you want into a folder. Then open VLC and choose "Convert / Save…" in the File menu. Add your files in the top pane of the window. Click "Convert / Save" at the bottom. In the next window choose "Audio - Vorbis (OGG)" from the Profile popup menu and choose a destination for the files. Then click "Start". This shouldn't take long.

Now open one of the audio files in Audacity and play it until Sean says something silly. Select the relevant section of the audio by dragging over it. Then choose "Export Selected Audio…" from the File menu and save it as an MP3 with constant (preset) bit rate (because that's more likely to be compatible).

Name all of your files as NNN.mp3, where each N is a digit (e.g. 001.mp3 to 999.mp3) and put them in a folder called 01. Put that folder on your microSD card, right in the root directory (i.e. not inside another folder on the card). Now put the card in the DFPlayer.

Step 10: Get Started With the ESP8266 or Other Arduino-compatible Board

To react to the motion sensor's signal and send the necessary signals to the DFPlayer, we'll need to use a microcontroller. I've wanted to learn to use the ESP8266 for ages, so that's what I'm using. (Specifically, I'm using an ESP8266 development board called the NodeMCU v2. I was going to use a SparkFun ESP8266 Thing Dev Board, which I was pretty sure I had, but I couldn't find it.) The ESP8266's built-in Wi-Fi will enable web control of the Sean Spicer soundbites. I'll be programming it using the Arduino IDE and language, though, so if you want to use any other Arduino-compatible board (such as an Arduino/Genuino Uno, Intel Galileo, etc.) you should be able to port the code easily by removing or adapting the parts that deal with Wi-Fi and the web interface.

Download and install the latest Arduino IDE, if you don't already have it. As of this writing, it's version 1.8.5. Once it's installed, install the ESP8266 board definitions. There are good instructions for this here and here, so I won't rewrite that.

Connect your board to your computer. The NodeMCU v2 has a built-in USB-to-serial converter chip (the Silicon Labs CP2102), so I can just use a USB cable. Some other boards, including the Arduino Uno, have such a chip, but others, such as the Arduino Pro Mini and the SparkFun ESP8266 Thing (non-Dev), don't. If your board doesn't have the chip, you'll need to use an external converter, such as an FTDI or CP2102 board or cable.

Now, choose your board and COM port from the Tools menu and try running some example sketches to make sure it's all working. (A note about the NodeMCU v2: Use the NodeMCU 1.0 (ESP-12E Module) board definition in the Arduino IDE.) I tried Blink, WiFiScan, and HelloServer, and they all worked just fine right away.
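
For reference, Blink on the NodeMCU is essentially the stock example; the only board-specific wrinkle to know about is that the on-board LED is active-low, so writing LOW turns it on:

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);      // the on-board LED
}

void loop() {
  digitalWrite(LED_BUILTIN, LOW);    // LED on (it's active-low on the NodeMCU)
  delay(500);
  digitalWrite(LED_BUILTIN, HIGH);   // LED off
  delay(500);
}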

Step 11: [ESP8266-specific] What We Can Learn From the HelloServer Example

In the previous step I mentioned that one of the examples I used to make sure my ESP8266 was working was the HelloServer example. We can learn a few things by inspecting its code:

const char* ssid = "........";  // at the top of the code
const char* password = "........";

WiFi.begin(ssid, password);  // in setup()

I initially thought that this sketch would cause the ESP8266 to create its own Wi-Fi network (i.e. act as an access point) with the SSID and password set here. This is not the case. This sketch actually connects to an existing Wi-Fi network. You need to type in the SSID (network name) and password of that network here, so that the ESP8266 can connect to it.

ESP8266WebServer server(80);  // back at the top of the code

That looks like it creates an object of type ESP8266WebServer, listening on port 80.

void handleRoot() {server.send(200, "text/plain", "hello from esp8266");}
void handleNotFound() {…}

These are functions that define the web server's responses to certain requests. They don't define what those requests are, though; that's handled by the following bindings in setup():

server.on("/", handleRoot);
server.onNotFound(handleNotFound);

These, being in the setup() function, only run once when the ESP8266 starts the program. They bind the response functions defined above to specific requests that the web server may receive. Notably, you don't call the function (e.g. handleRoot(), with () after the name); you just pass it by reference (i.e. you give the name of the function, and the server will call it when it needs it).

You can also bind a response inline:

server.on("/inline", [](){server.send(200, "text/plain", "this works as well");});

The [](){} syntax is a C++ lambda (an anonymous function defined right where it's used), and this shows that you don't need to bind a named function to a request; you can just specify the response directly in the binding if you want to. Using named functions is good for abstraction and DRY purposes, though: you can call the same handler function from multiple requests without repeating the whole response code.

server.begin();

Don't forget that line!

Also, in loop():

server.handleClient();

The server doesn't run in the background; handleClient() checks for and processes any pending requests each time it's called, so the main loop has to keep calling it. Don't forget that line either.

One final thing. Back up in setup(), we find:

if (MDNS.begin("esp8266")) {
    Serial.println("MDNS responder started");
}

This code starts the mDNS responder. mDNS is a way to provide DNS on a local network without needing a DNS server. In other words, it should let you type "esp8266.local" into the address bar of your browser, instead of typing in the IP address of the ESP8266. Unfortunately, it didn't seem to work for me. I did see "MDNS responder started" in the serial console, so it did start, but it seems Chrome and/or Windows doesn't know how to use mDNS or .local addresses. (Note for Chrome users: When you type "esp8266.local" into your address bar, Chrome will think you want to do a Google search for that string. Typing in "http://esp8266.local" should prevent that. But, for the above reason, it still may not work.)

It would sure be nice to be able to type in "pumpkinspicer.local" to be able to control Pumpkin Spicer. But it doesn't seem to be working, and I want to use an Android device to control it, but Android doesn't support mDNS, so I guess that's not going to work. Maybe I'll have the ESP8266 create its own network called "Pumpkin Spicer" instead.
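
If I do go that route, the ESP8266 Arduino core can run as an access point with WiFi.softAP(). A minimal sketch would look something like this (the network name and password here are just placeholders; the password must be at least 8 characters):

#include <ESP8266WiFi.h>

void setup() {
  Serial.begin(115200);
  WiFi.mode(WIFI_AP);
  WiFi.softAP("Pumpkin Spicer", "pumpkinspice");   // placeholder SSID and password
  Serial.print("AP IP address: ");
  Serial.println(WiFi.softAPIP());                 // defaults to 192.168.4.1
}

void loop() {
}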

One thing that I didn't see was any call to yield(). My understanding from listening to my friends talking about ESP8266 programming is that you have to call yield() often to allow the Wi-Fi to work; otherwise, the ESP8266 will crash. I'm assuming there's a call to yield() inside server.handleClient().
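
For what it's worth, the usual advice I've heard is that if you write your own long-running loop, you should sprinkle yield() (or delay()) into it yourself. Something like this, where doSomethingSlow() is just a stand-in for whatever lengthy work you're doing:

void loop() {
  for (int i = 0; i < 10000; i++) {
    doSomethingSlow();   // stand-in for your own lengthy work
    yield();             // let the Wi-Fi stack run so the ESP8266 doesn't reset
  }
  server.handleClient();
}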

That's what I learned just by reading this one example sketch. Not bad for not having read any documentation yet.

Step 12: [If Necessary] Modify Motion Sensor

The PIR motion sensor I have is this one. Its chip (EG4001) has a built-in timer that keeps the output high for a stated 20 seconds after it detects any motion. It also does not allow retriggering (extending the time of high output by continued motion). 20 seconds is too long for this application—it would stay high for a while after the audio clip ends, and that would prevent triggering of another audio clip by a new motion. Lack of retriggering support isn't really a problem for this application—I want a new signal for each new motion anyway.

The EG4001's datasheet is only available in Chinese. Fortunately, Google Translate allows you to upload a PDF and get it translated. Unfortunately, the output lacks images, and the text is sometimes overlapping. Fortunately, I have fixed those problems in the above screenshot by putting the original and the translated version side-by-side and editing the translated version so the text doesn't overlap.

The key point here is that the timer works by counting cycles of an oscillator (how most electronic timers work), and it always counts the same number of cycles (dependent on the version of the chip you've got—I've got the A version), so the period/frequency of the oscillator determines how long the timer will run. That is determined by the values of the resistor and the capacitor connected to the chip. The diagram at the upper left of the Chinese datasheet page shows how this works: The capacitor charges through the external resistor. When it reaches 0.6*Vdd, the internal resistor is switched on to discharge the capacitor. When the capacitor reaches 0.4*Vdd, the internal resistor is switched off and the capacitor again charges through the external resistor.

This means that we can replace the external capacitor and resistor with ones of different values to obtain a different oscillator period, and therefore a different timer delay. The datasheet provides the necessary formulas:

Tosc = 0.4*Rt*Ct*(Rt/(Rt-20 kΩ))

Tx = 100,000*Tosc (Time high after motion detected)

Ti = 20,000*Tosc (Time motion cannot be detected again after Tx ends; 20,000 is for the A version of the chip)

The stock values are Rt = 510 kΩ and Ct = 1.6 nF. With these values, the formulas give Tosc ≈ 0.34 ms, so Tx works out to roughly 34 seconds and Ti to roughly 7 seconds, which is far too long for this project.

According to those formulas, to minimize Tx, we need to minimize the Rt and/or Ct values. However, Rt cannot be lower than 100 kΩ. I was planning to replace them with two 56.2 kΩ resistors in series (for 112.4 kΩ) and three 33 pF capacitors in parallel (for 99 pF = 0.1 nF), resulting in Tx of 0.5 s and Ti of 0.1 s.
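
As a sanity check, plugging those planned values into the datasheet formulas gives roughly:

Tosc = 0.4*112.4 kΩ*0.099 nF*(112.4/(112.4-20)) ≈ 5.4 µs

Tx = 100,000*5.4 µs ≈ 0.54 s

Ti = 20,000*5.4 µs ≈ 0.11 s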

However, do you know what's a really small capacitor? No capacitor! And do you know what's easier than removing a capacitor and a resistor and replacing them with five components that the board isn't made to fit? Just removing one capacitor! Does that work? It does! Here's the behavior of the motion sensor after just removing the stock capacitor and not replacing it with anything:

[Video: the motion sensor's behaviour with the timing capacitor removed.]

That'll work well enough for me. YMMV, though. The datasheet doesn't specify a minimum capacitance value as far as I can tell, but the chip might not be designed for nearly zero capacitance.

Step 13: Connect PIR Motion Sensor to ESP8266

The motion sensor has an onboard voltage regulator that drops the supply you give it (5 V to 9 V) down to 3.3 V for the actual sensor and chip, so its output signal is 3.3 V. The ESP8266 board I'm using will be powered from USB and has its own 3.3 V regulator. Since the sensor's output is already 3.3 V, it can be connected directly to a GPIO pin. (If your sensor outputs a 5 V signal instead, note that the ESP8266 is not 5 V-tolerant, so you'd need a resistor voltage divider to reduce the signal voltage.)

Step 14: Program ESP8266 to Take Input From PIR Sensor

The ESP8266 needs to take the signal from the motion sensor, and the best way to do that is to use an interrupt. In the Arduino language, you use attachInterrupt() to do that. The ESP8266 supports interrupts on all GPIO pins except 16, so don't connect your sensor to pin 16.

To keep the ISR short, I just wrote it to change a variable from false to true. Then, when control goes back to the main part of the program and the main loop runs its next iteration, it will notice that the variable is now set to true, and will send a message to the DFPlayer to play a sound file. In this test program, though, it only turns on the LED for 0.3 seconds, to show that it's working.
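
Here's roughly what that test sketch looks like. The pin choice (GPIO5, a.k.a. D1 on the NodeMCU) is just my suggestion, and newer versions of the ESP8266 core want the ISR placed in RAM, hence the ICACHE_RAM_ATTR attribute (called IRAM_ATTR on current cores):

const int pirPin = 5;                  // GPIO5 (D1 on the NodeMCU); any pin except GPIO16
volatile bool motionSensed = false;    // set by the ISR, cleared by loop()

ICACHE_RAM_ATTR void handleMotion() {
  motionSensed = true;                 // keep the ISR as short as possible
}

void setup() {
  pinMode(pirPin, INPUT);
  pinMode(LED_BUILTIN, OUTPUT);
  digitalWrite(LED_BUILTIN, HIGH);     // the LED is active-low, so HIGH = off
  attachInterrupt(digitalPinToInterrupt(pirPin), handleMotion, RISING);
}

void loop() {
  if (motionSensed) {
    motionSensed = false;
    digitalWrite(LED_BUILTIN, LOW);    // LED on
    delay(300);                        // 0.3 seconds
    digitalWrite(LED_BUILTIN, HIGH);   // LED off
  }
}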

Step 15: Connect DFPlayer to ESP8266

My DFPlayer came with one of its header rows soldered at an angle, so it wouldn't sit in the breadboard. To fix that, I took the plastic part off the header—it just slides off—and bent the pins straight with my fingers.

For the connections, see the DFPlayer wiki page and the DFPlayer manual [PDF download]. Connect Vcc to the 3.3 V pin or the Vin pin on the ESP8266. (The DFPlayer's input range is 3.2–5 V.) Connect the DFPlayer's RX and TX to two GPIO pins on the ESP8266, crossing them over (the DFPlayer's RX is driven by the pin the ESP8266 transmits on, and its TX feeds the pin the ESP8266 receives on). Make a note of which pins you used.

Find a way to connect your speakers to the DFPlayer. I used a headphone socket salvaged from an old optical drive. You could also just cut the plug off your speakers' cable and hard-wire it.

Step 16: Connect Speakers

Get some speakers that will fit inside your pumpkin with room to spare. The DFPlayer can output a low-level signal that's appropriate for headphones and computer speakers that have built-in amplifiers, and it can also output an amplified signal to drive unamplified speakers up to 3 watts. I'm using the former option and hooking it up to some computer speakers.

Stuff the speakers inside your pumpkin. Maybe also cut some more holes in your pumpkin to let the sound out better.

Step 17: Program ESP8266 to Control DFPlayer

You will need to download the latest DFPlayer Mini Arduino library from here. Also at that link is some sample code to show you how to use it.

In the main loop, first check if motionSensed is true. If it is, pick a random number between 1 and the number of audio clips you have. Then send a command using DFPlayer.playFolder(a, b); where a is your folder (1, if you named it 01 as I said above) and b is your file (1–999). Also set motionSensed back to false so that the next motion can be detected.

The DFPlayer also has a BUSY pin that is low when it's playing and high when it's not playing. When you're checking if motionSensed is true, you should also check that BUSY is high, so that you don't start playing a new audio clip before the current one is finished.

I'm also going to use that BUSY signal to enable some LED lights inside the pumpkin so that Sean glows more brightly when he's speaking.
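
To tie Steps 14 through 17 together, here's a sketch of how I'd combine the interrupt flag, the BUSY check, the glow LEDs, and playFolder(). All the pin assignments, the clip count, and the volume are assumptions to adjust for your own wiring and files, and the Wi-Fi/web-interface part is left out for brevity:

#include <SoftwareSerial.h>
#include <DFRobotDFPlayerMini.h>

const int pirPin   = 5;     // GPIO5 (D1): PIR sensor output
const int busyPin  = 4;     // GPIO4 (D2): DFPlayer BUSY (low while playing)
const int lightPin = 14;    // GPIO14 (D5): signage LEDs, driven through a transistor
const int dfRxPin  = 12;    // GPIO12 (D6): receives from the DFPlayer's TX
const int dfTxPin  = 13;    // GPIO13 (D7): transmits to the DFPlayer's RX
const int numClips = 10;    // however many NNN.mp3 files you put in folder 01

SoftwareSerial dfSerial(dfRxPin, dfTxPin);
DFRobotDFPlayerMini dfPlayer;
volatile bool motionSensed = false;

ICACHE_RAM_ATTR void handleMotion() {
  motionSensed = true;
}

void setup() {
  pinMode(pirPin, INPUT);
  pinMode(busyPin, INPUT);
  pinMode(lightPin, OUTPUT);
  dfSerial.begin(9600);               // the DFPlayer talks at 9600 baud
  dfPlayer.begin(dfSerial);
  dfPlayer.volume(25);                // 0-30
  randomSeed(micros());
  attachInterrupt(digitalPinToInterrupt(pirPin), handleMotion, RISING);
}

void loop() {
  bool playing = (digitalRead(busyPin) == LOW);    // BUSY is low while a clip is playing
  digitalWrite(lightPin, playing ? HIGH : LOW);    // glow while Sean is speaking

  if (motionSensed) {
    if (!playing) {
      int clip = random(1, numClips + 1);          // random() excludes the upper bound
      dfPlayer.playFolder(1, clip);                // folder 01, file NNN.mp3
    }
    motionSensed = false;                          // clear the flag either way
  }
}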

Participated in the Halloween Contest 2017 and the Audio Contest 2017.