(Top picture: microphone hidden on a bookshelf; bottom picture: another microphone hidden inside this cute stuffed animal's nose.)
This Instructable introduces "ARbotics" (Augmented Reality robotics), and in particular, robotics for phenomenological augmented reality.
Let's begin with a simple example: seeing otherwise invisible sound waves, as well as metasensing (metaveillance) of sound.
What is metasensing (metaveillance)?
A meta-conversation is a conversation about conversations. A meta-joke is a joke about jokes. Metadata is data about data (like the GPS coordinates that record where a picture was taken, which are embedded in the JPEG header information of the picture).
Likewise, metaveillance is the seeing (veillance) of sight or other senses. Metasensing is the sensing of sensing, or the sensing of sensors, or sensing their capacity to sense.
A nice simple example to begin with is the sensing of microphones, and the sensing of their capacity to sense. In order to make this sensory data visible, we use the S.W.I.M. (Sequential Wave Imprinting Machine), invented by S. Mann in 1974, which was the subject of a previous Instructable.
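The core idea behind the SWIM can be sketched in a few lines of code. A test tone is played near the suspected microphone, and the SWIM's light is driven by the product of the microphone's picked-up signal and a reference copy of the tone. Because the time-varying parts cancel in that product, the light's brightness depends only on the wand's position, producing a stationary ("sitting") wave pattern in a long-exposure photograph. The simulation below is a minimal sketch under assumed parameters (a 1 kHz tone and room-temperature speed of sound); the function names are illustrative, not from the original article.

```python
import numpy as np

# Assumed parameters for illustration (not from the original article):
SPEED_OF_SOUND = 343.0              # m/s, roughly room temperature
FREQ = 1000.0                       # Hz, test tone played near the hidden mic
WAVELENGTH = SPEED_OF_SOUND / FREQ  # ~0.343 m spatial period of the pattern

def swim_brightness(distance_m, freq=FREQ, c=SPEED_OF_SOUND):
    """Lock-in style product: the microphone hears the test tone with a
    phase delay proportional to distance.  Multiplying the picked-up signal
    by the reference tone and keeping the DC term gives a brightness that
    depends only on position, not time -- a 'sitting wave'."""
    phase = 2 * np.pi * freq * distance_m / c
    return 0.5 * (1 + np.cos(phase))  # normalized LED brightness, 0..1

# Crude ASCII rendering of the pattern along a 1 m rail:
for x in np.linspace(0, 1.0, 11):
    bars = int(round(swim_brightness(x) * 20))
    print(f"{x:4.1f} m  {'#' * bars}")
```

Note how the brightness peaks repeat once per wavelength: that spatial period is what makes the microphone's capacity to sense directly visible.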
The SWIM may be mounted on a robot, and various kinds of robots can be used. Perhaps the simplest approach is a slide rail with a rail car, a motorized toy car or truck, or both (e.g., a simple reciprocating robot mounted in the bed of a radio-controlled pickup truck). A large pickup truck is a good choice (as in the figure above) because a motorized slide rail and rail car mount easily in its open cargo bed: the rail sweeps the SWIM back and forth while the truck drives forward through the space, automating the process of bug-sweeping.
I built a working prototype for this project a long time ago, using a Tamiya 58372 Ford F-350 High-Lift truck kit; that same kit happens to still be available from Amazon.com. Alternatively, you can usually find an old radio-controlled model kit, car, truck, boat, drone, or the like at a garage sale, flea market, or dumpster (people often throw away perfectly good radio-controlled toys because of some minor problem that is easy to fix).
The "Bugbot" is not merely a robotic bug sweeper, but, more importantly, it is an augmented reality visualization machine.
A little bit of machine learning can go a long way here, and the simple "Bugbots" you build can form the basis for a wonderfully fun research program in ARbugbotics.
ARbug-bots are great for teaching, too. What's not to love about using toy cars to find hidden microphones in cute little stuffed animals? Awaken your inner child with this spy-versus-counterspy (spybot-versus-counterspybot) narrative: a great way to teach robotics and science at the same time!
So let's get started! We'll begin by measuring the speed of sound (or of light) and making it "sit still". This will give us "sitting waves" that make visible a microphone's capacity to hear.
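The measurement itself is simple arithmetic: the "sitting wave" pattern repeats once per wavelength, so if you know the frequency of the test tone, the speed of sound can be read straight off the photograph via c = f × λ. A minimal sketch, assuming a 1 kHz test tone and an example spatial period (the numbers here are illustrative):

```python
# Speed of sound from a SWIM photograph: c = f * lambda.
freq_hz = 1000.0               # frequency of the test tone (assumed)
measured_wavelength_m = 0.343  # spatial period read off the photograph (example)
speed_of_sound = freq_hz * measured_wavelength_m
print(f"c = {speed_of_sound:.0f} m/s")  # about 343 m/s at room temperature
```

The same relation works in reverse: if you trust the speed of sound, the photographed wavelength tells you the tone's frequency, and for radio waves the identical formula applies with c ≈ 3 × 10⁸ m/s.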
The SWIM will form the basis for unlocking the power of AR to make visible the otherwise invisible sound waves, radio waves, gravitational waves, and many other waves that surround us.
Mount the SWIM on a rail or other structure that allows it to move back and forth, and then motorize this sweeping movement -- RoboSWIM!
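The motorized sweep is just a triangle-wave position profile: the rail car runs from one end of the rail to the other and back, over and over. A hypothetical sketch of the setpoint generator (rail length, sweep period, and function names are my own assumptions, not from the original article):

```python
def sweep_position(t, rail_length_m=0.5, period_s=4.0):
    """Triangle-wave position for a reciprocating rail car: the car moves
    from 0 to rail_length_m and back once per period_s seconds."""
    phase = (t % period_s) / period_s        # fraction of the current cycle, 0..1
    if phase < 0.5:
        return 2 * phase * rail_length_m     # outbound stroke
    return 2 * (1 - phase) * rail_length_m   # return stroke

# Sample the setpoint ten times per second over one full sweep cycle:
setpoints = [sweep_position(t / 10) for t in range(0, 41)]
```

In a real build these setpoints would be fed to whatever motor driver your rail car uses, while the truck's forward motion (driven manually or by a second profile) carries the sweep through the room.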
Paint everything black if you can, so that only the lights contribute to the visuals.