Structured Light 3D Scanning
This is the same technique used to capture Thom Yorke's face in Radiohead's "House of Cards" video. I'll walk you through setting up your projector and camera, and capturing images that can be decoded into a 3D point cloud using a Processing application.

Point Clouds with Depth of Field from Kyle McDonald on Vimeo.

House of Cards Google Code project

I've included a lot of discussion about how to make this technique work, and the general theory behind it. The basic idea is:

1. Download ThreePhase
2. Rotate a projector clockwise to "portrait"
3. Project the included images in /patterns, taking a picture of each
4. Resize the photos and replace the example images in /img with them
5. Run ThreePhase to decode the photos into a 3D point cloud

Step 1: Theory: Triangulation

[Picture: Theory: Triangulation]
If you just want to make a scan and don't care how it works, skip to Step 3! These first two steps are just some discussion of the technique.

Triangulation from Inherent Features
Most 3D scanning is based on triangulation (the exception being time-of-flight systems like Microsoft's "Natal"). Triangulation works on the basic trigonometric principle that measuring two angles and one side of a triangle is enough to recover all the remaining sides and angles.

If we take a picture of a small white ball from two perspectives, we will get two angle measurements (based on the position of the ball in the camera's images). If we also know the distance between the two cameras, we have two angles and a side. This allows us to calculate the distance to the white ball. This is how motion capture works (lots of reflective balls, lots of cameras). It is related to how humans see depth, and is used in disparity-based 3D scanning (for example, Point Grey's Bumblebee).
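The two-camera setup above reduces to a few lines of trigonometry. Here is a minimal sketch, assuming the angles are measured between the baseline and each camera's line of sight (the class and parameter names are illustrative, not from ThreePhase):

```java
// Depth of a point seen from two cameras a known baseline apart.
// alpha and beta are the angles (in radians) between the baseline
// and each camera's line of sight to the target.
class StereoTriangulation {
    static double depth(double baseline, double alpha, double beta) {
        // With cameras at (0,0) and (baseline,0) and the point at (X,Z):
        // tan(alpha) = Z/X and tan(beta) = Z/(baseline - X), so
        // Z = baseline / (1/tan(alpha) + 1/tan(beta)).
        return baseline / (1.0 / Math.tan(alpha) + 1.0 / Math.tan(beta));
    }

    public static void main(String[] args) {
        // Symmetric case: both cameras see the target at 45 degrees,
        // so it sits halfway along the baseline at depth baseline/2.
        System.out.println(depth(1.0, Math.toRadians(45), Math.toRadians(45))); // ~0.5
    }
}
```

With the two angles and the baseline known, depth falls out directly; everything hard about stereo scanning is in finding the *same* feature in both images, which is exactly why motion capture uses easy-to-spot reflective balls.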

Triangulation from Projected Features
Instead of using multiple image sensors, we can replace one with a laser pointer. If we know the angle of the laser pointer, that's one angle. The other comes from the camera again, except we're looking for a laser dot instead of a white ball. The distance between the laser and camera gives us the side, and from this we can calculate the distance to the laser dot.

But cameras aren't limited to one point at a time; we could scan a whole line. This is the foundation of systems like the DAVID 3D Scanner, which sweep a line laser across the scene.

Or, better yet, we could project a bunch of lines and track them all simultaneously. This is called structured light scanning.
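ThreePhase does this with three sinusoidal stripe patterns, each shifted by 120 degrees; at every pixel, the three brightness values pin down the phase of the stripe that landed there. The per-pixel core is the standard three-step phase-shifting formula. A minimal sketch (the real ThreePhase code differs in structure, and these names are mine):

```java
// Per-pixel phase recovery from three photos of a sinusoidal stripe
// pattern, each shifted by 120 degrees (the three-step phase-shifting
// formula; i1, i2, i3 are one pixel's brightness in the three photos).
class PhaseDecode {
    // Wrapped phase in (-pi, pi].
    static double wrappedPhase(double i1, double i2, double i3) {
        return Math.atan2(Math.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3);
    }

    public static void main(String[] args) {
        // Simulate one pixel lit by a sinusoid with phase 0.7 rad,
        // offset 100 and amplitude 80:
        double phi = 0.7;
        double i1 = 100 + 80 * Math.cos(phi - 2 * Math.PI / 3);
        double i2 = 100 + 80 * Math.cos(phi);
        double i3 = 100 + 80 * Math.cos(phi + 2 * Math.PI / 3);
        System.out.println(wrappedPhase(i1, i2, i3)); // recovers ~0.7
    }
}
```

The recovered phase is "wrapped" (it repeats every stripe), which is why the decoder also has a phase-unwrapping step before the phases can be turned into depths.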
Shirleydavis 2 months ago

It's cool :)

RocketPenguin 2 months ago

Ok, so I placed the pictures into ThreePhase, and it isn't doing anything. All it does is open to a white screen. What is the solution?

Here are my three test pictures:

thermallyme 3 months ago

It's nice

It's trendy :)

seif935 3 months ago

Yep. While searching the internet I found this structured-light scanner software. It cost me 50 euros, as I remember, but it works perfectly for me, especially since it doesn't force you to use a DSLR camera (like the FlexScan3D software) or a very precise video projector (like the DAVID 3D scanner software does).

I guess it's the cheapest 3D scanner software now.


smartmiltoys 6 months ago

I started working on a structured light scanner using a SHOWWX as
well. I’m planning on replacing the red laser in the projector with an
infrared one, with a corresponding filter on the camera. This way, the
scan is invisible to the human eye.

It also leaves the blue and green lasers in place to allow for
projection onto the scanned surface in real time (well, with a few
hundred millisecond delay). For example, you could place a white sphere
in front of the scanner, find its shape and location, and project the
Earth onto it. As you move the object, it continues to be tracked,
allowing your projection to stay on it.

E.C 7 months ago

Hi, I'm new to 3D scanning. I'd like to test the ThreePhase software.

The images in the patterns folder are 1024×768. This is not the native resolution of my projector. Could/should I make new patterns in my projector's native resolution for a better result?

Also, the line pattern is not sharp. Why is this?

yieldlymph 9 months ago

I am also getting the error "Cannot find class or type named PriorityQueue"

Any ideas?

Yes. I have an idea.


[A post] mentions that when Processing moved from v1 to v2, ThreePhase broke, and "Richard" of the above-named URL came up with a crowbar solution: cut out the controls entirely, by:

* renaming Controls.pde to Controls.pdeRENAMED

* moving its [four, public float] global variables over into ThreePhase.pde

* adding a line to the top of PhaseUnwrap.pde:

"import java.util.PriorityQueue;" [so THAT's where the class or type is...]

For my part, I also:

* added this to the bottom of ThreePhase.pde's void setup() method, to automatically get a PLY mesh output (which I can view in meshlab for example):

"exportMesh = true;"

(under "update = true;")

* commented out the "if(takeScreenshot) {" block in the same method, because it seems to call the camera

* copied the (gone-missing from Controls.pde) "String getTimestamp() {" method from Controls.pde into Export.pde (because it complained when trying to create the export)

This is me using a meat hook on a crowbar--because I don't know bupkus about object-oriented programming or "processing" (what a name for something they knew in advance people were going to have to google)

Anyways, with this sort of bumbling, I was able to get past the errors and get a PLY file (stored in the same dir as the PDEs).

[Idle thought: wonder what the output would look like taking the three "patterns" as the three "images": NULL image, right?]

Good luck. [and Thanks, Richard, over on meetup!]

BTW, meshlab showed a "phase error" in the result: the entire right shoulder is raised compared to the left.

davidbarcomb 9 months ago

Very nice. Thanks

OlegKitFace 11 months ago

Updated for Processing 2.X:


ChrisH1 1 year ago

I am also getting the error "Cannot find class or type named PriorityQueue"

Any ideas?

JKPina (replying to ChrisH1) 11 months ago

Same issue (also with LinkedList in ThreePhase-1). I put:

import java.util.*;

in every source file, but a new error appeared: java.lang.NullPointerException :/


fastbobble 1 year ago

That's astounding

gorgeddamp 1 year ago


Hey guys,

if I have just a single-dot laser, I can determine X, Y and Z with the help of triangulation.

So Z is Z = (f*b*tan(alpha)) / (f + x*tan(alpha))

f - focal length

b - laser-camera distance

alpha - laser angle

So I have a problem there. I calculated x because I need it in mm. So I did the following:

x = (x_pixel / pixel_width_image - 0.5) * SensorSizeX

But my measurements are wrong. I am sure the formula for Z is right, but I am not sure about x. Is there someone who can help me?
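The single-dot formula in this question can be checked numerically. A minimal sketch, assuming a pinhole camera at the origin, the laser offset b along the sensor's x axis, alpha measured between the laser beam and the baseline, and x converted from pixels to millimeters on the sensor (all names are illustrative, not from ThreePhase):

```java
// Single-dot laser triangulation, checked numerically.
class LaserTriangulation {
    // Convert a pixel column to millimeters on the sensor, measured
    // from the image center.
    static double pixelToMm(double xPixel, double imageWidthPx, double sensorWidthMm) {
        return (xPixel / imageWidthPx - 0.5) * sensorWidthMm;
    }

    // Z = f*b*tan(alpha) / (f + x*tan(alpha)), with f, b, and x all
    // in mm and alpha in radians.
    static double depth(double fMm, double bMm, double alphaRad, double xMm) {
        double t = Math.tan(alphaRad);
        return (fMm * bMm * t) / (fMm + xMm * t);
    }

    public static void main(String[] args) {
        double f = 8.0;                    // focal length, mm
        double b = 100.0;                  // laser-camera distance, mm
        double alpha = Math.toRadians(60); // laser angle
        // A dot imaged at the exact image center (x = 0 mm) gives
        // Z = b*tan(alpha), about 173.2 mm here.
        double x = pixelToMm(320, 640, 4.8); // center pixel of a 640-wide image
        System.out.println(depth(f, b, alpha, x));
    }
}
```

One thing worth double-checking with this conversion is the sign and origin of x: it must be measured from the image center, in the same units as f, and positive in a consistent direction relative to the laser offset; flipping either convention skews every depth estimate.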

illrings 1 year ago

It's nice

airbugger 1 year ago

It's extraordinary

VitaminX 1 year ago

Nicely done... Amazing instructable
