This is an attempt to make a 3D scanner with a webcam, projector, and a web app.
Here is a demo of the scanner running:

We used:
- [Logitech C920 Webcam](https://www.logitech.com/en-us/products/webcams/c920s-pro-hd-webcam.960-001257.html)
- [AnyBeam Pico Mini Projector](https://www.amazon.com/AnyBeam-Projector-Focus-Free-Lightweight-Compatible/dp/B088BG59QR)
In broad strokes, the steps to produce some 3D data are:
- Calibrate the intrinsics of the projector and the camera, which essentially means determining their focal lengths and image dimensions.
- Calibrate the extrinsics of the camera, which essentially means determining its position and rotation in space.
- Take a reference image of the scene to be scanned.
- Project a plane onto the scene.
- Sweep the plane in one direction.
- Convert the camera view and the reference image to greyscale and subtract the reference from the camera view.
- Threshold the resulting image.
- Determine the median of each group of white pixels.
- Normalize points in camera and projector space, then intersect each camera view ray with the projected plane to determine a point in real space.
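The last step boils down to a ray–plane intersection. Here is a minimal numpy sketch of the idea, assuming the camera sits at the origin and the swept plane is given as a point and a normal in the same space (the actual app's representation may differ):

```python
import numpy as np

def intersect_ray_plane(ray_dir, plane_point, plane_normal, ray_origin=np.zeros(3)):
    # Camera ray: p = ray_origin + t * ray_dir
    # Plane:      dot(p - plane_point, plane_normal) = 0
    denom = np.dot(ray_dir, plane_normal)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the projected plane
    t = np.dot(plane_point - ray_origin, plane_normal) / denom
    return ray_origin + t * ray_dir

# e.g. a ray straight down the optical axis hitting a wall at z = 100
point = intersect_ray_plane(np.array([0.0, 0.0, 1.0]),
                            np.array([0.0, 0.0, 100.0]),
                            np.array([0.0, 0.0, 1.0]))
```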
## Intrinsics Calibration
This is a well-studied topic with well-validated calibration techniques available in tools like [OpenCV](https://docs.opencv.org/4.x/dc/dbb/tutorial_py_calibration.html). To save time we ended up doing this quick and dirty and just looked up the focal lengths for the projector and camera. They are defined in the initial state like so:
```javascript
camera: {
  focalLength: 1460,
  width: 1920,
  height: 1080
},
projector: {
  focalLength: 1750,
  width: 1280,
  height: 720
}
```
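With these numbers, converting a pixel to a normalized viewing ray is just the pinhole model. A sketch in Python, assuming the principal point sits at the image centre (which our quick-and-dirty calibration implicitly assumes):

```python
import numpy as np

def pixel_to_ray(px, py, focal_length, width, height):
    # Shift the pixel so the optical axis maps to (0, 0),
    # then divide by the focal length (in pixels) to normalize.
    x = (px - width / 2) / focal_length
    y = (py - height / 2) / focal_length
    ray = np.array([x, y, 1.0])
    return ray / np.linalg.norm(ray)

# the centre pixel of the camera maps straight down the optical axis
ray = pixel_to_ray(960, 540, 1460, 1920, 1080)
```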
## Extrinsic Calibration
To calibrate the extrinsics we create a reference image in real space which we will line up with a projected image. Once again this is a hack, but it's quick! In our case we drew the corners of a rectangle on a piece of cardboard.

We then project this image over the cardboard and line up the corners.

To project this image we open the web app.

Turn on the video stream.

Open the background and place it on the projector.


Make the background full screen.

Now you're ready to project background images.
Select crosses and hit draw background.

Then move the projector to match the corners on the cardboard.
Now let's align the camera rotation by projecting two parallel lines and painting two red reference lines on the camera view, then rotating the camera until all four lines are parallel. Select "vertical-lines" and check the "Overlay Lines" box.

Now turn off the overlay and project the crosses again. Click on each cross in the camera view to log its pixel coordinates in the console.

We'll plug these into an OpenCV Python program to generate the extrinsic calibration.
Set the points in the `imgPoints` variable in `calibrate_cam.py`. Here is an example; the order is left-top, right-top, left-bottom, right-bottom:
```python
imgPoints = np.array([[
    [600.015625, 388.015625],
    [1325.015625, 374.015625],
    [611.015625, 886.015625],
    [1321.015625, 880.015625]
]], dtype=np.float32)
```
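The calibration also needs the matching real-space coordinates of the four crosses. The variable name and measurements below are illustrative, not taken from `calibrate_cam.py` — substitute the actual corner spacing you drew on the cardboard, in the same order as `imgPoints`:

```python
import numpy as np

# Hypothetical positions of the crosses on the cardboard, in cm,
# placed on the z = 0 plane; order is LT, RT, LB, RB to match imgPoints.
objPoints = np.array([[
    [0.0, 0.0, 0.0],
    [40.0, 0.0, 0.0],
    [0.0, 30.0, 0.0],
    [40.0, 30.0, 0.0]
]], dtype=np.float32)
```

These two arrays are what a `cv2.solvePnP`-style call pairs up to recover the camera's rotation and translation.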

Use the script's output to set the camera position variable in `index.js` with the second array. You'll have to invert the signs.
```javascript
state.cameraPos = [
  4.43436471,
  -72.2131717,
  -43.49579285
]
```
Notice that for now we are assuming the rotations of the camera are negligible.
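Under that assumption, the camera's world position is just the negated translation vector — which is where the sign flip comes from. In general it is −Rᵀt. A numpy sketch (the function and variable names here are ours, not from `calibrate_cam.py`):

```python
import numpy as np

def camera_position(rvec, tvec):
    # Rodrigues formula: rotation vector -> rotation matrix.
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        R = np.eye(3)
    else:
        k = np.asarray(rvec, dtype=np.float64).reshape(3) / theta
        K = np.array([[0.0, -k[2], k[1]],
                      [k[2], 0.0, -k[0]],
                      [-k[1], k[0], 0.0]])
        R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    # World-space camera centre from the extrinsics: C = -R^T t
    return -R.T @ np.asarray(tvec, dtype=np.float64).reshape(3)

# With negligible rotation this reduces to simply negating tvec.
pos = camera_position([0.0, 0.0, 0.0], [-4.43436471, 72.2131717, 43.49579285])
```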
## Setting Reference
To set your reference, place your object in the scene, set the "blank" background, then click "Set Camera Reference".
This will show you the reference image and a snapshot of the current camera view minus the reference.
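The subtraction itself is straightforward; here is a sketch of the idea in numpy (the greyscale weights are the usual luma coefficients, and the threshold value is an assumption — tune it to your lighting):

```python
import numpy as np

def difference_mask(frame, reference, threshold=40):
    # frame, reference: HxWx3 uint8 RGB captures.
    # Convert both to greyscale, subtract, and threshold the result
    # so only pixels brightened by the projection survive.
    grey = lambda img: img.astype(np.float32) @ np.array([0.299, 0.587, 0.114])
    diff = grey(frame) - grey(reference)
    return (diff > threshold).astype(np.uint8) * 255
```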

## Scanning
Select a plane to project as a background. You can use a rectangle or a Gaussian.


Clicking "process" will show the average position of the white pixel clump in each column.
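The per-column averaging can be sketched like so — for each column of the thresholded mask, average the row indices of the lit pixels to get a sub-pixel line position (a sketch of the idea, not the app's exact code):

```python
import numpy as np

def column_centres(mask):
    # mask: HxW boolean image of the thresholded difference frame.
    # For each column, take the mean row index of lit pixels,
    # giving one sub-pixel sample of the projected line per column.
    rows = np.arange(mask.shape[0], dtype=np.float64)[:, None]
    counts = mask.sum(axis=0)
    sums = (rows * mask).sum(axis=0)
    return np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)
```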

Hit "scan" to sweep the plane across the scene, which will generate a height map and download a PLY file.

Resulting in this height map:

And these point clouds:


These results are imprecise, probably mostly attributable to our rough calibration, but despite that we can see that we are getting some real 3D data.
## A Few Notes
There are a number of obvious areas of improvement in the approach described.
### Exposure
We should definitely set the exposure on the camera before scanning. Unfortunately there seems to be a bug in WebKit that prevents this from being possible on a Mac. You can check the available camera settings and attempt to set constraints like so:
```javascript
const initStream = async () => {
  const videoConstraints = {
    frameRate: 20,
    width: 1920,
    height: 1080,
  };
  const mediaDevices = navigator.mediaDevices;
  // Pick out the external webcam by label and request it explicitly.
  const devices = (await mediaDevices.enumerateDevices())
    .filter(x => x.label.includes("USB Camera"));
  if (devices.length > 0) {
    videoConstraints.deviceId = { exact: devices[0].deviceId };
  }
  return mediaDevices.getUserMedia({
    video: videoConstraints,
    audio: false,
  });
};

const stream = await initStream();
const [track] = stream.getVideoTracks();
console.log(navigator.mediaDevices.getSupportedConstraints());
const capabilities = track.getCapabilities();
const settings = track.getSettings();
console.log({ track, capabilities, settings });
// On WebKit this resolves without error, but the setting never takes effect.
await track.applyConstraints({
  advanced: [{
    focusMode: "manual",
  }],
});
```
### Camera Calibration
Camera calibration should definitely be done in JavaScript.
For the extrinsic calibration we could project a known image onto the background and programmatically determine its location in camera space rather than doing so by clicking.
Additionally, the actual calibration step, which we currently run with Python and OpenCV, should be ported to JavaScript.
### Parallelization/WebGL
All of the canvas rendering and painting could be ported to WebGL for significantly improved performance.
## Resources
- [MediaStreamTrack Spec](https://www.w3.org/TR/mediacapture-streams/#mediastreamtrack)
- [Nice Write Up on DIY Scanner and Camera Calibration](http://mesh.brown.edu/byo3d/notes/byo3D.pdf)