Cloaking devices are a mainstay of science fiction. In Star Trek, spaceships such as Klingon and Romulan warbirds become invisible across the full electromagnetic spectrum, causing all sorts of headaches for Captain Kirk and the rest of the Federation.
This project creates the special effect of cloaking the Alexa Echo, at least from the Walabot radar, by physically manipulating the environment under Alexa voice control.
Alexa, set CLOAK to 100%
While experimenting with the Walabot for the Mess-o-Meter project, I became fascinated with visualising the data the Walabot produces, and with its sensitivity.
One application for this is to use the data itself for special effect generation.
In this project the special effect is to simulate a cloaking device by physically moving an object, in this case the Alexa Echo itself, in and out of radar visibility.
The default mode of the Walabot is to work as a differential sensor. This means that during a calibration step it captures the current environment and sets it as the reference. After calibration, only differences from that reference are reported.
So to cloak a device we first calibrate the scene with the device in a certain position. When the device is moved, it generates a detectable radar signature which is visualised in the radar sweep. If the device is then moved back into its original position, it becomes hidden in the calibrated background scene again.
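The real calibration happens inside the Walabot SDK, but the differential idea itself is simple. As a rough illustration only (plain Python, not the actual WalabotAPI calls), treating the scene as a list of signal strengths:

```python
def calibrate(scene):
    """Capture the current scene as the reference background."""
    return list(scene)

def differential_signal(scene, reference, threshold=0.1):
    """Report only where the scene differs from the calibrated reference."""
    return [abs(s - r) > threshold for s, r in zip(scene, reference)]

# Calibrate with the Echo in its 'cloaked' position.
background = [0.0, 1.0, 0.0, 0.0]   # the Echo's signature at one location
reference = calibrate(background)

# Echo unmoved: no differences, so nothing appears on the radar.
print(any(differential_signal(background, reference)))  # False

# Echo rotated: its signature shifts, generating a detectable difference.
moved = [0.0, 0.0, 1.0, 0.0]
print(any(differential_signal(moved, reference)))       # True
```

The "cloak" is just the second call returning to the first case: moving the device back makes the differential signal vanish.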
Voilà! The device disappears from the radar, and it appears cloaked.
To make this work we need to be able to move and restore the device's position in a repeatable way. To achieve this, a servo was used to rotate an Alexa Echo in place. Because of the Walabot's sensitivity, even a difference in the position of the USB cable is enough to generate a detectable change.
We're going to re-align the (USB) port nacelle to cloak the device -- Scotty
Of course, in the movies computers are always voice controlled, so naturally we need to use Alexa to control our cloaking device.
This project uses the same approach to Alexa integration as my (prize winning) entry for the Alexa Smart Home Contest.
The basic approach is:
- Use Virtual Breadboard Smart Home Alexa Skill Controls in a Virtual Breadboard App project.
- Run the App to publish your custom Smart Home Endpoint and link Alexa to the Virtual Breadboard Cloud Smart Home Skill.
- Wire the Virtual Breadboard Alexa Control into a Virtual Logic controller.
- Use an Edge:Bit to connect your Virtual Breadboard Logic Controller to the real world.
Like the Mess-o-Meter application, we use a play on words to create a natural voice command within the constraints of the Alexa Smart Home API supported by the Virtual Breadboard Alexa Skill.
For this application we use the AlexaPercentageController which responds to the utterance:
Alexa, set <DeviceName> to <number> percent.
So by using the name 'CLOAK' for the endpoint device, we create a custom utterance that can be handled directly by the Virtual Breadboard Alexa Skill and makes sense for the application:
Alexa, set CLOAK to 100%
That totally sounds like something Captain Kirk would say.
The great thing about Virtual Breadboard is that it makes complex things easy. The logic controller for this example is simple, with only 3 components:
- AlexaPercentageController - converts a voice command into a voltage range
- ServoGenerator - converts the voltage range into servo signals
- Edge:Bit Servo - decodes the servo signals and passes them through to hardware
The idea for this application is to use the servo to physically move the Alexa Echo, or at least the USB cable connected to it, into different physical positions. Only two positions are used, 100% and 0%, but any position in between could be used.
The percentage is set through the AlexaPercentageController, either by voice control or by dragging the slider, generating a virtual analog voltage. The voltage value is then encoded into a periodic servo pulse by the Servo Function Generator. This signal is decoded by the Servo component, which rotates the virtual servo accordingly and also passes the signal through to a real servo when connected via an Edge:Bit.
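The exact voltage range and pulse timings aren't specified in the project, but assuming a typical 0-5 V virtual range and the common hobby-servo convention (1-2 ms pulse every 20 ms, 0-180 degrees), the three stages of the chain can be sketched as:

```python
def percent_to_voltage(percent, v_max=5.0):
    """AlexaPercentageController: 0-100% -> virtual analog voltage."""
    return (percent / 100.0) * v_max

def voltage_to_pulse_ms(voltage, v_max=5.0):
    """Servo Function Generator: voltage -> pulse width, 1 ms (0%) to 2 ms (100%)."""
    return 1.0 + (voltage / v_max) * 1.0

def pulse_to_angle(pulse_ms):
    """Servo: decode pulse width to shaft angle, 0-180 degrees."""
    return (pulse_ms - 1.0) * 180.0

# "Alexa, set CLOAK to 100 percent" -> rotate back to the calibrated position.
pulse = voltage_to_pulse_ms(percent_to_voltage(100))
print(pulse, pulse_to_angle(pulse))  # 2.0 180.0
```

Any intermediate percentage maps linearly onto the same chain, which is why positions between 0% and 100% would also work.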
The Virtual Breadboard Alexa Smart Home controls make it a snap to roll your own Alexa-controlled Smart Home applications. Refer to the previous project for full details, but basically it's simply a matter of setting a few properties and running the application to publish to the Virtual Breadboard Cloud, and you're ready to link to and control from Alexa.
TIP: The Alexa endpoint is defined in the AlexaPercentageController properties. As a minimum, Alexa:EndpointId, Alexa:FriendlyName and Alexa:Categories should be set.
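The property names come from the tip above; the values below are hypothetical, just to illustrate how a minimal endpoint definition might look:

```python
# Hypothetical example values for the minimum AlexaPercentageController properties.
endpoint_properties = {
    "Alexa:EndpointId": "cloak-device-01",  # unique id for this endpoint
    "Alexa:FriendlyName": "CLOAK",          # the name spoken in the utterance
    "Alexa:Categories": "OTHER",            # device category shown in the Alexa app
}

print(endpoint_properties["Alexa:FriendlyName"])  # CLOAK
```

The FriendlyName is what makes "Alexa, set CLOAK to 100 percent" resolve to this endpoint.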
The Servo Function Generator converts the voltage range into a periodic pulsed signal suitable for driving a servo motor.
The Edge:Bit is a smart USB interface device which transparently connects and passes signals through to a wide range of open source modules and devices.
With the Edge:Bit you can plug-and-play real world sensors and actuators such as the Servo in your Virtual Breadboard projects.
To setup the shot the Walabot was mounted on a camera stand and a USB camera was mounted 'over the shoulder' of the Walabot to give the same point of view.
The Alexa Echo was mounted on a Servo (with a little help from some tape)
The servo was connected to the Edge:Bit and mounted in the Walabot packaging itself. Handy packaging, this - very robust.
It's really fun to make these sorts of 'mini explainer' type videos. Modern tools make this sort of thing unimaginably accessible compared to a few years ago.
To take this shot:
- The Mess-o-Meter radar visualisation software was used stand-alone (with Alexa notifications turned off) to generate the radar sweep from the Walabot data source.
- Camtasia was used to capture the 'live' shot above, with screenshots recorded at the same time as the webcam and audio. Camtasia was also used to resize and export the sub-video components.
- PowerPoint was used to composite the video by overlaying the videos in 3D view mode and linking them together as animations playing at the same time. The PowerPoint typing effect was also used for the opening text in the video.
- Camtasia was used again to capture the PowerPoint-composed video and apply the cloaking ripple transition and other audio effects to generate the final clip.
Walabot enables interesting spatial data to be easily accessed by developers.
Generating physically based special effects by controlling the motion of objects in the Walabot's field of view provides unique visualisation possibilities beyond what might easily be generated algorithmically.
In this project the data was visualised as a radar sweep but it's easy to imagine extending this to other visualisation platforms such as the HoloLens to create sophisticated special effects which have value as content in their own right.