CARGO CULT

Well, we survived, and two weeks later, it’s time to dredge up a few memories and put something down here (way more images and video coming soon! We swear).

First off, every aspect of Mirage went just about perfectly. Well, except for that damned speed sensor… finicky component.

Our dedicated parking spot was next to the legendary Disco Duck with PlayaSkool, at 3:00 & Esplanade.

We played live shows, running a sound-reactive Kinect screen while the band The Adversary played atop the Mirage - full drum kit, amps, keys, mics, and FX boards - to small hordes of happy burners.

Render Tests!

It’s been a busy month since our test build at Iveson Ranch on Fourth of Juplaya… Mirage is all shiny and clean, with a new interior, dedicated PA system, speed sensor, Kinect camera portholes, and a brand new brain.

Photos of the build and other bits are coming soon, but meanwhile check out some demos for a small selection of interactive visual content:

This first clip shows some of the audio-reactive stuff Earl’s been working on: 3D, glitchy, smeary, and pulsing. This is a real-time render from VDMX using Syphon, and all transitions and FX changes have been automated, loosely synchronized to the tempo of the music:

This next recording shows some of the potential of the two Kinect cameras being mounted to one side of the vehicle (though this video is old and particularly choppy - it shows early experiments with render styles and audio->color changes, captured on a 2008 laptop; the New Brain is real-time smooth!). The combined point clouds will be scaled and translated to fit the entire width of the screen, like a slightly magnified mirror. This will be overlaid in different ways with other clips and effects.
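
As a toy sketch of the transform involved (made-up names and numbers, not the actual patch): each camera's depth image gets squeezed into one half of the screen, and the whole thing is flipped horizontally so it reads as a mirror.

    // Toy mapping of two Kinect cameras onto one wide screen: each 640x480
    // depth image covers half the width, flipped so the output is a mirror.
    struct ScreenPt { float x, y; };

    ScreenPt toScreen(float kx, float ky, int camIndex,  // camIndex: 0 or 1
                      float screenW, float screenH) {
      const float KINECT_W = 640.0f, KINECT_H = 480.0f;
      float half = screenW / 2.0f;
      float x = (kx / KINECT_W) * half + camIndex * half;  // pack side by side
      float y = (ky / KINECT_H) * screenH;
      return { screenW - x, y };  // horizontal flip = mirror effect
    }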

And here’s the most recent render that attempts to show what our speed sensor is meant for. At rest, animal walk cycles proceed left to right across the entire screen, as though the animals are moving in a circle around the vehicle. Be aware that the left and right halves of the video represent passenger side and driver side, respectively. The output is wrapped around the Mirage’s skin, with the nose in the center… As soon as the car begins to move, the preset changes to show the animals horizontally fixed and mirrored, with an (almost) properly parallaxed landscape that shifts between mountain ranges and an abstract city skyline. Here, too, automation of transitions, parameter changes, backgrounds, etc. makes the entire system hands-free, and all timing is directly linked to either the speed of the vehicle or the tempo of the music.

(Particularly pretty around 02:30, IMHO)
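
A toy version of that speed-to-background link (invented names and factors; the real automation lives in VDMX): every layer scrolls opposite the vehicle, nearer layers faster, so the scenery appears pinned to the playa.

    // Toy parallax driven by the speed sensor's "offset rate". Each layer
    // scrolls opposite the vehicle's motion; nearer layers move faster.
    struct Layer {
      float parallax;  // 1.0 = pinned to the ground, 0.0 = infinitely far away
      float offsetPx;  // current horizontal scroll offset in pixels
    };

    void updateLayers(Layer* layers, int count, float speedPxPerSec, float dt) {
      for (int i = 0; i < count; ++i) {
        layers[i].offsetPx -= speedPxPerSec * layers[i].parallax * dt;
      }
    }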

As always, more video and details to come… 

Speed sensor almost complete! Still working to make the unit stable and self-contained, but this is already promising.

Assembling a wireless speed sensor. Remarkably simple, actually. The reference manual wasn’t even consulted.
Just an Arduino Uno with a Hall effect sensor and a WiFly card.
The Hall effect sensor detects magnetic flux as eight magnets positioned around the rear wheel rotate past it.
The time delay between pulses is used to compute an “offset rate” that is fed into the visualization via OSC over the network.
With this, elements of the visualization will move in the opposite direction of the vehicle at the same speed, creating the illusion that the content has always existed, floating stationary above the playa, invisible to the naked eye until revealed by the slow passage of a Mirage.
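
For the tinkerers, the whole thing boils down to something like the sketch below. Pin choice, wheel size, and the output path are illustrative guesses, and the real unit wraps the value in an OSC packet for the WiFly rather than printing it:

    // Minimal speed-sensor sketch for an Arduino Uno. A Hall effect sensor on
    // pin 2 pulses once per magnet; eight magnets ride the rear wheel.
    const byte HALL_PIN = 2;                  // interrupt-capable pin on the Uno
    const float MAGNETS_PER_REV = 8.0;
    const float WHEEL_CIRCUMFERENCE_M = 2.0;  // made up - measure the real wheel

    volatile unsigned long lastPulseMicros = 0;
    volatile unsigned long pulseIntervalMicros = 0;

    void onPulse() {
      unsigned long now = micros();
      pulseIntervalMicros = now - lastPulseMicros;
      lastPulseMicros = now;
    }

    void setup() {
      Serial.begin(9600);
      pinMode(HALL_PIN, INPUT_PULLUP);
      attachInterrupt(digitalPinToInterrupt(HALL_PIN), onPulse, FALLING);
    }

    void loop() {
      noInterrupts();                          // grab a consistent copy
      unsigned long interval = pulseIntervalMicros;
      interrupts();

      float speedMps = 0.0;
      if (interval > 0) {
        // one pulse = 1/8 revolution; interval is in microseconds
        float revsPerSec = 1000000.0 / (interval * MAGNETS_PER_REV);
        speedMps = revsPerSec * WHEEL_CIRCUMFERENCE_M;
      }

      Serial.println(speedMps);  // stand-in for the OSC "offset rate" send
      delay(50);
    }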

Mirage 3.0 - Gearing Up

And so it begins again.

Our expanded crew is planning another test build at Fourth of Juplaya, and the list of improvements is tremendous.

First off, we’ve got two new members who will be instrumental in adding a dedicated sound system as well as improvements to the interior and exterior of the vehicle: a cushy place to cuddle inside, a comfier dance floor on top, and a killer PA for private parties out in the deep or to augment the sound for anyone who wants to jack into our mixer - via cables or radio frequency magic.

New to the mix will be two Kinect cameras mounted to one side of the Mirage. Not only will our visual content be more refined and highly audio-reactive, but now up to 12 people will be able to influence the visuals directly by becoming part of virtual simulations reflected back to the playa. We can paint light with our hands and bodies, see our reflections as VR avatars, become silhouetted masks for layered video content… the possibilities are almost too numerous.

The wireless network emanating from Mirage will also allow anyone to connect, open a browser window, and send a text message, photo, or video directly to the screen.

Additional goodies include a speed sensor for input from the forward motion of the vehicle and a new touch screen interface for simple command-and-control. Oh, and for 4JP, we opted to test with a projector instead of renting the LED rig, so now we have that in our arsenal. Projecting visuals onto the desert floor, fellow burners, and ambient dust? Sounds good.

Stay tuned for more as the pieces come together…

Mirage 2.0 makes its debut on the Playa. 

Music: “We Fail” by deadmau5

The release of this video got us a nice little writeup over at hackaday.

Details, details…

So, at this point we basically have this massive vehicle with a really long external monitor wrapped around it, in the form of a 0.25 PPI screen of 168x24 ColorKinetics pixels.

It’s powered by a sweet little generator mounted atop the vehicle, at the back of the illuminated dance floor.

A server rack houses a Philips VSMPro controlling seven sPDS power supplies and a rat’s nest of cables that drive 84 LED strings, separated into 14 panels. All the VSMPro does is take a 1024x768 HDMI input and map it to the LEDs as though they were an external monitor.
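
In code, the mapping amounts to something like this (the VSMPro does it internally; this sketch just illustrates the idea, with our real panel dimensions but invented sampling code):

    // Illustrative only: sample the 1024x150 band (vertically centered in the
    // 1024x768 frame) down to the 168x24 LED grid, nearest-neighbor style.
    #include <cstdint>

    const int FRAME_W = 1024, FRAME_H = 768;   // HDMI input to the VSMPro
    const int BAND_W = 1024, BAND_H = 150;     // the slice VDMX actually fills
    const int LED_W = 168, LED_H = 24;         // total LED nodes around the car

    // frame: FRAME_W x FRAME_H RGB, row-major; leds: LED_W x LED_H RGB out
    void sampleLeds(const uint8_t* frame, uint8_t* leds) {
      int bandTop = (FRAME_H - BAND_H) / 2;    // band sits in the vertical center
      for (int ly = 0; ly < LED_H; ++ly) {
        for (int lx = 0; lx < LED_W; ++lx) {
          // sample the center of each LED's ~6x6 pixel cell
          int sx = (lx * BAND_W + BAND_W / 2) / LED_W;
          int sy = bandTop + (ly * BAND_H + BAND_H / 2) / LED_H;
          const uint8_t* src = frame + 3 * (sy * FRAME_W + sx);
          uint8_t* dst = leds + 3 * (ly * LED_W + lx);
          dst[0] = src[0]; dst[1] = src[1]; dst[2] = src[2];
        }
      }
    }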

VDMX output is simply rendered to the center of the input frame at ~1024x150 (do the math, it works out, I swear - about six source pixels per LED node in each direction). Here’s what VDMX renders to the VSMPro input:

Here’s a rough, top-down diagram of our LED arrangement:

And here’s the custom door solution that allowed us to get in and out of the damn thing. A genius layout designed by Brendan Burke:

We’ve got audio feeding into the laptop with frequency analysis that controls parameters within VDMX and its sub-patches. 
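
As a cartoon of that signal path (VDMX does its own audio analysis; this just shows the flavor of turning sound into a control value): estimate the energy near one frequency with a Goertzel filter, then smooth it before mapping it to an FX parameter.

    // Cartoon of audio-reactivity: energy near freqHz over a sample window,
    // via the Goertzel algorithm, then smoothed into a stable control value.
    #include <cmath>

    float bandEnergy(const float* samples, int n, float freqHz, float sampleRate) {
      const float PI = 3.14159265f;
      float coeff = 2.0f * cosf(2.0f * PI * freqHz / sampleRate);
      float s1 = 0.0f, s2 = 0.0f;
      for (int i = 0; i < n; ++i) {
        float s0 = samples[i] + coeff * s1 - s2;
        s2 = s1;
        s1 = s0;
      }
      return s1 * s1 + s2 * s2 - coeff * s1 * s2;  // squared magnitude
    }

    // Exponential smoothing keeps the mapped parameter from flickering.
    float smoothed(float prev, float now, float alpha) {
      return prev + alpha * (now - prev);
    }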


Just one of 5 different VDMX layouts created over the course of 8 days.

A decent library of video clips, games, and Quartz patches, layered with an ungodly number of effects, and an iPad2 controlling every aspect of the final mix. Sequences with automated changes were also programmed to give Earl a friggin break once in a while.


Various control surfaces designed in TouchOSC

…we even have a USB knob controller hot-glued to the dashboard that, depending on which way you rotate it, blacks out the LEDs covering the windshield so we can see while driving. It even “wipes” across the windshield from the center to the edges as you turn it, which looks extremely slick.
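
The wipe itself is dead simple in principle (this is a toy sketch with invented names, not the actual patch): the knob position sets how many columns, spreading outward from the windshield's center column, get zeroed.

    // Toy windshield wipe: knob in [0,1] blacks out columns outward from the
    // center of the windshield region of the LED buffer.
    #include <cstdint>
    #include <cstdlib>

    void wipeWindshield(uint8_t* leds, int rowStride, int height,
                        int firstCol, int lastCol, float knob) {
      int cols = lastCol - firstCol + 1;
      int center = firstCol + cols / 2;
      int reach = (int)(knob * (cols / 2 + 1));  // how far the wipe has spread
      for (int y = 0; y < height; ++y) {
        for (int x = firstCol; x <= lastCol; ++x) {
          if (std::abs(x - center) < reach) {
            uint8_t* p = leds + 3 * (y * rowStride + x);
            p[0] = p[1] = p[2] = 0;  // blacked out: we can see the road again
          }
        }
      }
    }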

Of course, there are plenty more details involved, from the frame/body construction to the cable runs, the custom electrical system, the local ethernet/wifi network and VSM setup, and the functionality of peripherals powered by the deep-cycle marine cells… but that’s why we added the “ask” link on our blog.

If you want more details, let us know.

Software Output Tests

These are Earl’s first demos of what VDMX could do at our specified dimensions, presented to the rest of the crew after taking notes at Fourth of Juplaya. Based on the dimensions of the mesh on each side of the vehicle (roughly 84x24), and after a lot of math, it was determined that the actual dimensions of the signal should be 512x150 for each half (roughly six source pixels per LED node in each direction: 512/84 ≈ 6.1 and 150/24 = 6.25). Thankfully, VDMX has highly customizable output settings, so we mirrored its output and rendered a 1024x150 image onto a 1024x768 frame. The Mirage simply acts as a really big external monitor for the MacBook.
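
In code, the mirroring step amounts to something like this (VDMX’s output settings handled it for us; the function below is just to show the idea):

    // Mirror the left 512x150 half across the center (the Mirage's nose)
    // to fill the full 1024x150 band. Illustrative only.
    #include <cstdint>

    void mirrorBand(uint8_t* band) {  // band: 1024x150 RGB, left half filled
      const int W = 1024, H = 150;
      for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W / 2; ++x) {
          const uint8_t* src = band + 3 * (y * W + x);
          uint8_t* dst = band + 3 * (y * W + (W - 1 - x));
          dst[0] = src[0]; dst[1] = src[1]; dst[2] = src[2];
        }
      }
    }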

So, imagine the video below is wrapped around the vehicle, with the nose in the center.

This second demo is tighter.

The third test showed some of the “flight simulator” capabilities that Earl built out to take advantage of body tracking with the Kinect. It used to be here, but YouTube removed it because of some naughty footage I used to emphasize the theme, “Fertility 2.0.” I’d appeal it, but it’s hard to explain artistic merit to a faceless machine. Anyway, it also showed our ability to play PONG with the iPad2 as a remote, two-person controller.

Fourth of Juplaya - Proving Grounds

These are videos from our test build at Fourth of Juplaya 2012. Mostly working out kinks with VDMX and planning out how to map nodes in the mesh properly.

Massive thanks to everyone at Iveson Ranch just up from the 12-Mile entrance for their kindness and hospitality!

Building out Mirage at Fourth of Juplaya 2012.