The XKCD “Giant Head” Enhanced Depth Perception Project

Many of you will probably have seen this one from late August already. I haven’t found any indication that Mr. Munroe has actually done this yet, but there’s no reason the idea shouldn’t work, in principle. Doing so requires a viewer with an individually addressable video display for each eye, but these are not too hard to come by. And large-parallax static stereograms taken with synchronized still cameras are well known. Here’s a nice one from Flickr user 4423TKTM toxic:

3d cloud island (crosseye)

But I’m curious about the real-time full-motion video version. You’d need two HD webcams set up on, say, mountain peaks a couple miles apart. Perhaps overlooking an airport? Might do audio, too, while you’re at it, so you could use stereo earphones to get the full-on “giant head” effect…

50 thoughts on “The XKCD “Giant Head” Enhanced Depth Perception Project”

  1. William Crawford says:

    We’ve got 3D monitors and televisions now, so the output part isn’t a problem.  We’re just waiting for someone to manage the video capture.  :)

  2. Matt Fedorko says:

    I thought this would be a great thing to have at a children’s museum to demonstrate scale — sort of like “Powers of 10” on steroids, where each kid could get a pair of glasses. It’s a wonderful idea, for sure.

  3. Woody says:

I’m just wondering why the cams would need to be a couple miles apart, rather than the few hundred feet mentioned in the strip? Seems like it would be overkill and make clouds seem really small.

  4. Anonymous says:

    I’ve been making some preliminary plans to set this up at a festival next summer.  I was thinking opposite corners of a field, about 300-400 yards apart.  Would be curious to hear if anyone had recommendations for suitable off-the-shelf headset hardware.

  5. Nicolás Diego Badano says:

I think there’s quite a big “gotcha” to this setup: which direction do you point the cameras? Both cameras should be aimed at the same point in the middle of the expected “volume of vision” for best performance, and this may be quite difficult to do. If not done properly, the 3D effect would be lost, as the two eyes would see very different pictures that the brain can’t correlate.

    1. Anonymous says:

      Yeah, this will be tricky.  I was thinking the cameras could perhaps sit on computerized telescope mounts so they could be precisely aimed.  The towers will also have to be very stable, as even a little bit of wind-induced wobble would, as you say, probably destroy the effect.

  6. Anonymous says:

The charm really seems to be that it’s real time and real. I would love just to be tricked into believing I was moving at that scale, which would be impossible to do with real cameras. I work in mocap, and the minute those 3D Sony glasses or the Vuzix 1080p AR glasses come out I will scoop them up and start playing around.

  9. Jonathan Peterson says:

It’s worth noting that you don’t even need the video glasses.  Using the cross-eyed 3D trick you can get a nice 3D image out of that parallax photo from Flickr, no problem.

  10. Matthew Simicsak says:

Couple of ideas:

    1: mount the cameras on UAVs

    2: display the image using one of the 3D TVs that can show two different images to two different people, but use the normal 3D glasses instead of the special two-person ones.

  11. Anonymous says:

    For those of you that are interested in trying this out, the practice of creating stereo images of very distant objects is known as “hyperstereo.” If you search for hyperstereo you’ll find the info you need about specific camera separation (aka interocular/interaxial distance).

In my experience with 3D photography, a bigger separation between your cameras doesn’t translate to “more 3D.” There’s a certain threshold beyond which it will just start to confuse your brain.

    1. Kent Durvin says:

A rule of thumb is that the closest object should be at least 5 times the baseline (the distance between the cameras); that makes the closest object appear at about 5 times the distance between your eyes, or about 12 inches.
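That rule of thumb is easy to check numerically. Here’s a minimal sketch, assuming a typical human interocular distance of about 2.5 inches (the function names and example numbers are illustrative, not from the comment):

```python
# Hyperstereo rule of thumb: the nearest object should be at least
# 5x the camera baseline, and the scene appears scaled down by a
# factor of (eye separation / camera baseline).
EYE_SEPARATION_IN = 2.5  # typical human interocular distance, inches

def min_subject_distance_ft(baseline_ft):
    """Closest object should be at least 5x the baseline."""
    return 5 * baseline_ft

def perceived_distance_in(actual_distance_ft, baseline_ft):
    """Apparent distance after scaling the scene down by
    eye-separation / baseline (both converted to inches)."""
    scale = EYE_SEPARATION_IN / (baseline_ft * 12.0)
    return actual_distance_ft * 12.0 * scale

# Cameras 100 ft apart: nearest object at 500 ft, which appears
# about 12.5 inches from the viewer -- roughly "about 12 inches."
print(min_subject_distance_ft(100))     # 500
print(perceived_distance_in(500, 100))  # 12.5
```

Note that the perceived distance of the closest object comes out the same no matter the baseline, as long as the 5x rule is followed; the baseline only sets how far away the real scenery has to be.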

  12. John Edgar Park says:

    I’ve been goading the stereographers at work to do this.

  13. Dave Brunker says:

    Why not ask the experts?  The 3D Center of Art and Photography in Portland, Oregon is a club and museum of stereoscopic 3D enthusiasts.  They could give details on what distances would be needed and at what angles the cameras would have to be pointed.  The URL is http://www.3dcenter.us/  I’d really like to hear more about this.

  15. Kent Durvin says:

    I also have a pair of cameras on a board for a 3′ baseline that works for fireworks.

  16. Kent Durvin says:

It is not necessary to have synchronized cameras. Clouds don’t change that fast. I have taken many nice 3D pairs from airplanes looking out the window. Take one, wait a second, and take another. The delay depends on how close the clouds are. Mountains and landscape also work this way. Shooting 3D is easy, but viewing is harder. I just display them side by side and do a cross-eyed trick. The images overlap, and then I bring the center 3D image into focus. It takes a little practice.

    1. Anonymous says:

      I am going on an airplane trip soon- will try this :)

      1. william beaty says:

So, process any video to extract two streams, one or two seconds apart? Then place them side by side in real-time playback? Then go find some YouTube vids shot from aircraft windows.
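The bookkeeping for that two-stream idea is simple enough to sketch. This is a toy illustration, not a player; the frame rate and delay values are placeholder assumptions:

```python
def stereo_frame_pairs(n_frames, fps, delay_s):
    """Pair each frame index with the frame delay_s seconds later.
    Showing the two streams side by side gives a time-offset stereo
    pair -- the take-one-wait-a-second trick, applied to video."""
    offset = round(fps * delay_s)
    return [(i, i + offset) for i in range(n_frames - offset)]

# A 10-second, 30 fps clip with a 1-second offset:
pairs = stereo_frame_pairs(n_frames=300, fps=30, delay_s=1.0)
# The first pair is (0, 30): frame 0 beside the frame one second later.
```

As Kent notes for stills, the right delay depends on how close the clouds are; a longer delay acts like a wider baseline, so too much of it will start to confuse your brain.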

  17. Kent Durvin says:

I want to do this for a meteor shower. A baseline of a few miles is needed, with clear skies. Time exposures are needed to capture random meteors, so the cameras would have to be synchronized, but not perfectly. Starting an intervalometer on both cameras at the same time would work. Aiming would be tricky, but I think centering on a certain star would do it.

  18. JPP says:

Check this out: two cameras 297 m (about 1,000 feet) apart.

    http://www.flickr.com/photos/ermione3/5068188095/in/photostream

  19. David Chatting says:

    Kenichi Okada did a project along these lines at the Royal College of Art, London – http://www.kenichiokada.com/projects/2008/wideeyes.html

Comments are closed.

I am descended from 5,000 generations of tool-using primates. Also, I went to college and stuff. I am a long-time contributor to MAKE magazine and makezine.com. My work has also appeared in ReadyMade, c't – Magazin für Computertechnik, and The Wall Street Journal.

View more articles by Sean Michael Ragan
