
Inventor22

1 Instructable · 93,273 Views · 67 Comments · USA
Software engineer and long-time hacker, but new to Instructables! Started tinkering with Arduino in high school, building cool little robots.

Achievements

10K+ Views: Earned a bronze medal
Organization Contest: Grand Prize winner
  • FindyBot3000 - a Voice Controlled Organizer

    Yes and yes. Visual Studio is available for Mac, as are the Particle IDE and the other tools you'll need for this project. In the comments below, Strickce did have some trouble launching Visual Studio from Chrome on a Mac, but I posted manual instructions for connecting to the SQL database when you get to that step. Let me know if you run into any issues.

  • FindyBot3000 - a Voice Controlled Organizer

    Could you upload a screenshot of what you see (or don't see, for that matter)? I'm not experiencing any trouble when downloading the PDF.

  • Sure thing, go for it. Looking forward to seeing your version!

    Wish I could be of more help there, but I've never used a Mac, so at best any advice would be a shot in the dark. Good to know you were able to get things working with Windows 10, though, and have the database all fixed up.

    When I was first starting out with projects, I remember just how defeating it was to get hung up on a problem with no idea how to fix it, and how much I appreciated any help I got. Now, years later with a bit of experience under my belt, I get to pay it forward.

  • New step is up showing how to manually connect to the SQL database from Visual Studio. Let me know how it goes.

    No problem. I think the issue you're running into is Safari-specific. My browser is Chrome, running on a Windows machine, and I don't run into any problems. Some Google-fu turned up this thread, which may help: https://discussions.apple.com/thread/5058055

    If that doesn't, then the next easiest option is to try a different browser, like Chrome or Firefox. Using the browser method here configures the firewall rules and auto-fills some of the connection strings, making things a bit easier.

    If you're not interested in installing another browser, hang tight: I'm working on another step to show how to manually connect to the database through Visual Studio.

    You bet. A brute-force solution was along the lines of what I was thinking too. There are unfortunately a lot of edge cases with this. On occasion, I've had Google interpret a command like "Insert seven one eighth inch pins" as "71 eighth inch pins" instead of the desired "7 1/8 inch pins". These are the kinds of cases where things get a lot more tricky.

    Your idea of a mapping of "five" -> 5, "four" -> 4, etc. will work well enough to solve the bulk of these mistranslated scenarios, though. Nice to hear you're having a good time with this project!

  • Hey Strickce, I posted a new step outlining how to update the database. It's a lot more straightforward than working with SQL queries directly. Good luck, and let me know if you have any questions.

  • Right on! As for the issue with trying to insert five of an item, I've run into the same thing, sometimes with 'four' as well. I resorted to the same solution you came up with as a quick fix.

    I have no idea why the Google Assistant treats 'five' differently than other numbers -- I'm not sure even a Google engineer could tell you; it's notoriously difficult to decode why a neural net does what it does. My guess, though, is that it's because of the contexts in which 'five' can be used, for example: "Hey, great work on that project, high five!". In this case, any user would want the words 'high five' in the sentence, and not 'high 5'. Train on enough data with "five" as the desired end result, as opposed to "5", and you'd get what we get here. Just a guess though.

    There's a software fix I had in mind for this, but I haven't had time to get around to it. I'll post another comment if I implement a fix.
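As a minimal sketch of the word-to-digit quick fix discussed here (the table and function name are my own illustration, not code from the project):

```python
# Illustrative quick fix: replace standalone number words that the
# assistant sometimes leaves untranslated ('five', 'four', ...) with digits.
WORD_TO_DIGIT = {
    "one": "1", "two": "2", "three": "3", "four": "4", "five": "5",
    "six": "6", "seven": "7", "eight": "8", "nine": "9",
}

def normalize_count_words(command: str) -> str:
    """Lowercase the command and swap standalone number words for digits."""
    return " ".join(WORD_TO_DIGIT.get(w, w) for w in command.lower().split())

print(normalize_count_words("Insert five resistors"))  # insert 5 resistors
```

Note that a word-for-word swap like this doesn't help with the fraction cases mentioned above ("seven one eighth inch pins"), which need real context to resolve.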

  • Yup, you're spot on. As long as each row strip is getting power, jumping only the middle data line every other row should be good to go. However, this does mean you'll need to connect up the power supply to test all the LEDs.

    Yeah, I had the same problem originally -- I'd forget what I called a part and end up wasting time looking for it manually. The tagging feature is key for letting others search for parts, as you mentioned.

    I looked into creating a custom Google Assistant app, but the learning curve was a bit too steep for my timeline, and the integration with IFTTT was dead simple. So I made the trade-off: ease of use for a little less functionality. A UI may come later down the road, including a little LCD to show part inventories and other debug info without the need to debug through a computer.

    Thanks for the comment!

    Glad to hear you solved the problem, Strickce. In my experience, 90% of the bugs with this project have been stray or missing characters here and there which break the JSON format one way or another. A good way to analyze and test the Azure Function for bugs is to run it locally with Visual Studio and send it requests using Postman (https://www.getpostman.com/). I've added the requests I used for testing to GitHub: https://github.com/Inventor22/FindyBot3000/tree/master/Testing/Postman. Just import that file into Postman and you should be good to go.

    As for 6 boxes (nice!), you're right: you'll just need to update the array dimensions (in both the Particle Photon and the Azure Function code). There are also LED offset and LED width arrays (boxLedOffsetByColumnTop, boxLedWidthByColumnTop, etc.) which will need to be updated to account for the couple of extra cabinets. The code is pretty janky around that, so let me know if you need clarification on any of it.
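To see how a single stray character breaks the JSON format, here is a minimal illustration (the payload shape is invented for the example, not the project's actual schema):

```python
import json

good = '{"command": "FindItem", "data": "yellow leds"}'
bad = '{"command": "FindItem", "data": "yellow leds",}'  # one stray trailing comma

print(json.loads(good)["data"])  # yellow leds
try:
    json.loads(bad)
except json.JSONDecodeError as err:
    # The parser rejects the trailing comma with a position you can act on.
    print("Broken payload:", err.msg, "at column", err.colno)
```

Running requests like this locally (or through Postman) makes these one-character breakages obvious before the function is deployed.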

    Hi Taran, I use a database to store all the items that have been inserted with the "Insert Item" command. When the user asks for an item, "Ok Google, find yellow LEDs", the program extracts the words "yellow leds", then uses them as a 'key' to look up the entry in the database. If an entry is found, the coordinates of the box the item is in are sent to the organizer, which then lights up the LEDs for that box.
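That lookup flow can be sketched with an in-memory SQLite table standing in for the project's cloud database (the schema, table, and column names here are assumptions for illustration):

```python
import sqlite3

# In-memory stand-in for the real database; names are illustrative.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE items (name TEXT PRIMARY KEY, box_row INTEGER, box_col INTEGER)")
db.execute("INSERT INTO items VALUES ('yellow leds', 3, 7)")

def find_item(phrase: str):
    """Use the extracted words, lowercased, as the key; return box coordinates or None."""
    return db.execute(
        "SELECT box_row, box_col FROM items WHERE name = ?", (phrase.lower(),)
    ).fetchone()

print(find_item("Yellow LEDs"))  # (3, 7)
```

If the lookup returns coordinates, they would be sent on to the organizer to light up the matching box; `None` means the item was never inserted.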

    Great to hear. I chose Azure for a couple of reasons: I was already familiar with the Azure ecosystem, and I wanted to learn about serverless functions. I thought about running a local SQL (or other) database, or even rolling my own via EEPROM or an SD card, but I specifically wanted this device to be an IoT thing, so I quickly threw out those ideas.

    In a future revision I may opt for on-chip storage and voice recognition, most likely running off a Raspberry Pi.

  • Thanks. That sounds great, a 3D setup would definitely be more compact. A bunch of different ideas could be used to represent the Z-axis, I like the flashing LEDs idea you mentioned. For sure, feel free to PM me or post any questions here.

  • Thanks, I've updated the Instructable with a new link and price.

  • Inventor22 followed Nikus
      • DIY 3D Printed Dremel CNC
      • 3D Printed Snowmobile
      • 3D Printed $15 Camera Slider
  • I can't tell from the listing whether the LEDs are individually addressable or not. However, given that the product page doesn't show any images of different colored LEDs on the same strip, I'm going to assume that the LEDs are not individually addressable. There was no sign of a datasheet either, so I really can't recommend this product for this project.

    If the Aliexpress prices are outside your budget, there is a cheaper alternative: using regular LEDs and an LED driver chip (MAX7219). The MAX7219 chips aren't the cheapest either, though: https://www.sparkfun.com/products/9622 This route does get fairly technical fairly quickly, in terms of wiring and programming.

    Looks like IFTTT only supports English so far, as KonradO6 mentioned: https://help.ifttt.com/hc/en-us/articles/360001445233-Is-IFTTT-available-in-multiple-languages-

    Thanks audrius. The drawers on the Akro-Mils cabinet are well made, from a semi-rigid plastic. Durable enough to store any bolts and screws. No loose bits of plastic from the injection molding process, and the design of the drawer makes it easy to pull out/push in without snagging on the frame. The black plastic frame to hold all the drawers is made from a more rigid, but more brittle plastic. The screw mounts on the back leave something to be desired though; I wouldn't trust a wall mounted cabinet if it was filled with screws, batteries, and other heavier items. That's not an issue in this project though, as the cabinets are sandwiched in the wooden frame.

  • Aliexpress is the cheapest place I know of for LED strips. If you really want to save money, you can do away with the LED strips altogether and wire up individual LEDs instead. I did this with a previous version of this project that I built with a few friends. You can see it in operation here: https://www.youtube.com/watch?v=0K0eq_KBbkQ

    Can you link the model number of the LED strip you're thinking of using? I'll check whether it's compatible with the libraries I use, and we can go from there.

      • How to Make Infinity Mirror Heart With Arduino and RGB Leds
      • $20 Arduino Obstacle Avoidance Smart Car
      • 24 in 1 Clock - a Clock Tells Time of All Time Zones. Made From Paper
  • If you plan on using the Akro-Mils cabinets that I linked in the Instructable, I recommend against using LED strips with 5050 LEDs. The horizontal spacers between rows are only 3-4mm wide, so if you use a 5mm-wide or wider LED strip with 5050 LEDs, you run the risk of snagging the strip when you pull a box open. The 4mm-wide LED strips I linked to don't have this problem.

  • Thanks AlexJ74!

    Yeah, Google Assistant is available on iPhone: https://itunes.apple.com/us/app/google-assistant/id1220976145?mt=8

    You will need to change some code around if you use NodeMCU, primarily in two main ways:

    1. Use the Arduino versions of the neomatrix.h, Adafruit_GFX.h, neopixel.h, and ArduinoJson.h libraries that I make use of in FindyBot3000.ino. These should auto-resolve as you load the code into the Arduino IDE.

    2. You can't use the webhooks as implemented in the code, as they are configured through the Particle website (see Step 23: Software - Link Particle Photon to Azure Function). There should be a way to work around this, but I couldn't tell you how off the top of my head, as I've never used NodeMCU. You might need to write custom HTTP requests. I did find this Instructable, which looks promising: https://www.instructables.com/IoT-Air-Freshner-with-NodeMCU-Arduino-IFTTT-and-Ad/ and also this: https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-arduino-huzzah-esp8266-get-started Those should help get you started.

    • FindyBot3000 - a Voice Controlled Organizer
      29,046 views
      451 favorites
      109 comments
  • Inventor22 commented on ChrisN219's instructable Portal 2 Turret Gun

    This is epic!!! Thanks for the Instructable. Throwing this one on the projects bucket list for sure. Could you let me know the HTML behind those 'View in 3D' Fusion 360 parts you linked?

    Ah great, I'll get in touch with Tomatoskins and see what ancient Instructables wisdom they hold. Thanks for the kudos on the organizer!

      • Coke Machine Level Detector - Now With Speech!
      • Portal 2 Turret - Master Turret Control
      • Portal 2 Turret Gun
  • Ah yeah, I came across this project several years ago, actually. Industrial versions of part-finding systems like this exist all over the place, but not too many hobby-level ones. In Amazon warehouses, they actually have robots shuttling around entire shelves of items, and other algorithms that plan the shortest-path trajectory between parts bins for a human to pick items up for packaging.

    The Cartesian bin push-out feature mentioned in some of the comments on this project is actually implemented in the project you linked -- pretty cool.

    As for entering commands via text, that's already supported by default :). You can open up Google Assistant on any Android device (that you've logged into) and type in the same command you would otherwise speak. Take a look at the photo I've attached. It's the verbatim command I used in the demo video.

  • I was obsessed with Rube Goldberg contraptions back in high school; glad to tip my hat to him with this project, hah.

    While IFTTT has Alexa support, it's not as advanced as the Google Assistant support. You can only get away with basic commands using Alexa + IFTTT, unless you write a custom applet (far more complicated).

    As an example, to turn the display on and off using Alexa, you would need to make two applets: one for turning the display on ("Alexa, turn the display on") and another to turn the display off ("Alexa, turn the display off"). There is currently no way to have text 'ingredients' (as IFTTT calls them) to extract custom text.

    With Google Assistant, you can configure an IFTTT applet that takes wildcards: "Ok Google, turn the display $". The applet extracts whatever words you say after 'display', stores them in the variable '$', and passes that off to another program -- the Particle Photon in this case.

    On the Particle Photon side, we receive an event with the data stored in '$'. If $ == 'on', the display is turned on; if $ == 'off', the display is turned off.
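That last dispatch step can be sketched language-agnostically (shown here in Python for clarity; the real handler is C++ firmware on the Photon, and the names below are illustrative):

```python
def handle_display_event(data: str) -> str:
    """Dispatch on the wildcard text IFTTT captured in '$'."""
    actions = {
        "on": "display enabled",   # e.g. power up the LED matrix
        "off": "display blanked",  # e.g. clear and disable the matrix
    }
    # Normalize the spoken text, then look up the matching action.
    return actions.get(data.strip().lower(), f"unrecognized command: {data!r}")

print(handle_display_event("on"))  # display enabled
```

A table-driven dispatch like this keeps adding new voice commands to a one-line change, rather than a growing if/else chain.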

    No worries, this is a lot of stuff to digest in one go.

    Git is a distributed version-control system -- basically, it allows multiple developers to collaborate on the same source code. It helps prevent a million different versions of the same files from floating around and getting lost in an organizational mess. GitHub hosts Git repositories, and this is where the code is stored.

    The Azure Function needs some code to run, but when we first deploy it, it's basically an empty service that exists in the cloud, doing nothing. This is where continuous integration with GitHub comes in. By linking the specific repository in your GitHub account to the Azure Function, any changes you make to the GitHub repo automatically get uploaded to run in the Azure Function.

    Now, Git is pretty complicated at first, which is why I chose to use the GitHub Desktop client in the Instructable (https://desktop.github.com/). But here is a video tutorial on how to use Git, and why it's useful: https://www.youtube.com/watch?v=HVsySz-h9r4

    Thanks bbrain, you're right: 1 pixel per bin would be sufficient. With the density of LEDs I chose, however, I can scroll text across the screen. I used that feature to display how many of an item I have left.

      • PVC Class
      • Welding Class
      • Glue Class
  • Thanks zkus!

    I had no idea how bright the LEDs were going to be when I bought them, so I just did some quick napkin calculations from the specs for max power consumed. The manufacturer listed 18W/meter @ 60 pixels/meter, so 14 meters at 100% brightness gives 252W. We can use the power equation P = IV to get a rough estimate of the amps needed: I = P/V = 252W / 5V = 50.4A. So the 60A power supply I used should be more than sufficient.

    I did a quick sanity check too: knowing that the rough current drawn by other LEDs I've worked with is 20mA, we find 0.020A * 3 LEDs/pixel * 60 pixels/row * 14 rows = 50.4A -- the same number as before, great!

    I ran the LEDs at 30% brightness in the video, without gamma correction, and even then they were too bright. So you're spot on that I could have gone with a smaller power supply. It's probably worth throwing those calculations into the Instructable too, so others know why such a large power supply was used.
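The napkin math above, written out with the figures quoted in the comment:

```python
# Supply-current estimate from the manufacturer's max-power spec.
watts_per_meter = 18          # at 60 pixels/meter, full brightness
meters = 14
volts = 5

total_watts = watts_per_meter * meters       # 252 W
amps_from_power = total_watts / volts        # I = P/V = 50.4 A

# Sanity check from typical per-LED current: 20 mA per LED, 3 LEDs per pixel.
ma_per_led = 20
amps_from_leds = ma_per_led * 3 * 60 * 14 / 1000  # 50.4 A again

print(amps_from_power, amps_from_leds)  # 50.4 50.4
```

Both estimates land on the same figure, which is why the 60A supply is comfortably oversized for this build.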

    Thanks virtualjc!

    Right on serhardt!

    Thanks Gusgonnet!

  • Inventor22 followed the microcontrollers, leds, 3D-Printing, robots and 7 other channels