The idea of using triangulation for distance measurement has been known since the time of Pythagoras, when his famous formula became available to mathematicians.
What is new in this design is laser power control via the "blooming" effect of a CMOS camera. Here this "negative" effect is put to work instead of an ADC, so there is no need for an expensive "no-blooming" camera! (More information at this link: http://dpanswers.com/content/tech_defects.php ) There are a few other design approaches that I have been trying in hardware/software, and some of them are not fully implemented yet (the project has just started).
The Power Control Loop (PCL) provides stable readings of the reflected light beams, no matter what the reflectivity of the object's surface is, how well the background is illuminated, or what the distance range is! edited: / (Regarding measurement stability in varying illumination conditions: right now there is a resistor for manually adjusting the comparator trigger level, which depends on the average "black-fixed" video level. I am going to get rid of it shortly.) /
Probably someone could "hack" a camera and redesign its built-in AGC to provide a stable, "fixed-white" video signal level. But that would be extremely difficult with these SMD components and the lack of documentation, and it is too complicated for the average hobbyist. Plus, after that the camera would no longer be usable for its main purpose.
The Arduino has a small amount of RAM and a low-power 8-bit microprocessor, so full image processing cannot be done. Instead, the built-in 1-bit analog comparator forms a visual map, where each cell stores the time when an event was captured. As a video frame is drawn from top to bottom, line by line, the line number corresponds to the Y coordinate, and the timestamp within that line to X. At this stage the project is more a test bench than a final solution ;-).
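A minimal sketch of the event map described above (plain C++, with hypothetical names; the real sketch is in the download link below). One 16-bit timer capture is kept per video line: the line index is Y, the capture tick is X.

```cpp
#include <cstdint>

// Hypothetical 1-bit event map: one timer capture per video line.
// A value of 0 means "no edge detected on this line".
const int kLines = 235;            // active lines per frame (from the text)
uint16_t eventMap[kLines] = {0};   // 235 x 2 bytes = 470 bytes of RAM

// Conceptually called from the comparator-capture interrupt:
// 'line' comes from counting horizontal syncs, 'ticks' from Timer 1,
// assumed here to be reset at the start of each line.
void recordEvent(int line, uint16_t ticks) {
    if (line >= 0 && line < kLines) eventMap[line] = ticks;
}

// Y is simply the line number; X is the capture time in timer ticks
// (0..831 at 16 MHz over the 52 us active line).
int eventX(int line) { return eventMap[line]; }
```

This keeps the whole "image" under half a kilobyte, which is why a full frame buffer is unnecessary on an 8-bit part.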
Right now I am looking for optical zoom devices, to cover long and short distances automatically. Green lasers, I am sure, would bring better resolution; I just have to find a couple of them at an affordable price.
The approximate range with a low-cost CMOS camera and without optical zoom is 0.2 - 10 meters. Accuracy greatly depends on the laser base/spacing. The laser base also defines the minimum size of the trackable object in the Z coordinate. An NTSC camera
has a viewing angle of 52 degrees. Forget about pixel resolution for a moment, we are in the analog television world ;-). The math to calculate the distance (I call it the Z coordinate) is pretty simple:
D = B / tan(phi), where D is the distance, B is the laser base, and phi is the angle the camera reports.
Phi = 52 degrees / 832 = 0.0625 degree per coordinate step. (See below where 832 comes from.)
D = B / tan((X2 - X1) * 0.0625). For example, with B = 6 cm, X1 = 500, X2 = 512,
D = 0.06 / tan((512 - 500) * 0.0625 degree) = 4.58 meters.
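The worked example above can be checked in code. A small sketch (plain C++, not taken from the actual Arduino source):

```cpp
#include <cmath>

const double kPi = 3.14159265358979323846;

// Angular resolution: 52-degree field of view over 832 timer ticks.
const double kDegPerTick = 52.0 / 832.0;   // 0.0625 degree per tick

// Distance from the parallax between the two laser dots.
// B: laser base in meters; x1, x2: dot positions in timer ticks.
double distanceMeters(double B, int x1, int x2) {
    double phiDeg = (x2 - x1) * kDegPerTick;    // parallax angle, degrees
    double phiRad = phiDeg * kPi / 180.0;
    return B / std::tan(phiRad);                // D = B / tan(phi)
}
```

With B = 0.06 m, x1 = 500, x2 = 512 this returns about 4.58 m, matching the example above.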
- better accuracy;
- no interruption in distance measurements, even when the object is in the "vision field" area. I defined this area for PCL operation. The width of the vision field is adaptive; it narrows down to just a few lines after system start-up!
This version of the software is capable of following one object along the X axis. edited: see the Version 2 release notes below. The object has to be visible by itself (a rocket, a vehicle, any source of light in general) OR highlighted by an external light source, not focused, to cover a bigger area. The object's reflectivity then defines the necessary power of the light source for a specific distance range. Optical zoom would significantly improve the system's performance.
Tracking in the vertical (Y axis) is not implemented yet, but the coordinate is reported on the serial monitor. The Z-dimension math is not included; a simple trigonometric formula can be used. Calibration of the mechanical setup would be necessary in order to get meaningful measurement results.
Some technical specifications: edge-detection resolution is 52 microseconds per line x 16 MHz = 832 pixels; 235 lines per frame; 235 x 832 overall, at 60 frames per second. There is no problem getting 486 vertical lines at the lower speed of 30 fps. The 52 microseconds is an essential characteristic of the NTSC standard: the active line duration. I used an NTSC camera; for PAL/SECAM it is the same. 16 MHz is the oscillator frequency of the Arduino Uno board. I say "edge detection" instead of spatial resolution because in the current setup only the left edge is detected, with the highest possible time accuracy (16 MHz). Technically it would be easy to modify the settings to detect the right edge as well, and to measure the size and shape of an object, or track a few of them at the same time. But it would no longer be 832 pixels, as the interrupt routine's timing overhead would slow down the time response, and there is not much memory in the Arduino for complex analysis of the picture anyway. This is why I decided not to bother with the right side. In the current design, the timing capture of an event is done completely in hardware. Here is a link with details on the video format: http://en.wikipedia.org/wiki/Analog_television#Synchronization
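The resolution figures quoted above follow from two constants. A quick check in plain C++ (illustrative only; constant names are mine):

```cpp
// NTSC active line duration and Arduino Uno clock frequency.
const double kActiveLineSec = 52e-6;   // 52 microseconds
const double kClockHz       = 16e6;    // 16 MHz

// Horizontal edge-detection resolution in timer ticks ("pixels").
int ticksPerLine() {
    return (int)(kActiveLineSec * kClockHz + 0.5);   // 832
}

// RAM needed for the event map: one 16-bit timestamp per line.
int mapBytes(int lines) { return lines * 2; }        // 235 -> 470 bytes
```

At 235 lines the map is only 470 bytes, which comfortably fits the ATmega328's 2 KB of SRAM; this is the budget behind the "left edge only" decision.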
If you can imagine a balloon brightly lit from the left side, with its right edge invisible in shadow, that would be a close approximation. For sync extraction: an LM1881. Its two sync outputs, vertical and horizontal, are attached to pins 2 and 3. The hardware interrupt feature of the ATmega328 continuously updates the synchronization information, i.e. the current line number. Time capture is done by the analog comparator and Timer 1. A DC voltage has to be adjusted in order to trigger the comparator reliably at a specific level of the video signal.
One more thing to mention: driving the servo motors. To avoid interrupt-routine races between the line/frame syncs and the servo software library, which would cause jitter in the servo position plus timing noise on the video raster, I did not use the standard Arduino Servo library; instead I generate the servo pulses synchronously with the frame sync. The frequency is up to 60 Hz instead of the usual 50 Hz, but there are no complaints from my Parallax servo, and I think it would be the same with other motors as well.
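The timing behind the frame-synchronous servo drive is standard hobby-servo arithmetic. A sketch of the numbers involved (plain C++; the pulse-width range is the usual convention, not measured from the author's Parallax servo):

```cpp
// Typical hobby-servo pulse: roughly 1000-2000 us, 1500 us = center.
// The refresh period here is one video frame (60 Hz) instead of the
// usual 50 Hz; most servos only care about the pulse width itself.
const int kMinPulseUs = 1000;
const int kMaxPulseUs = 2000;

// Map a target angle (0-180 degrees) to a pulse width in microseconds.
int angleToPulseUs(int angleDeg) {
    if (angleDeg < 0)   angleDeg = 0;
    if (angleDeg > 180) angleDeg = 180;
    return kMinPulseUs + (int)((long)angleDeg * (kMaxPulseUs - kMinPulseUs) / 180);
}

// Refresh period when one pulse is emitted per frame at 60 fps.
const long kFramePeriodUs = 1000000L / 60;   // ~16667 us instead of 20000 us
```

Emitting the pulse right after the vertical sync means the servo interrupt never fires during the active video lines, which is what removes the raster noise.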
Link to download sketch: Arduino_Laser_TRF
12 December 2011 ***** VERSION 2 *****
* There are 4 main features in the project:
* 1. LOCALIZATION XY. CMOS Camera, LM1881. -------------------------47$
* 2. RANGE FINDER Z. Two Lasers plus Camera.-------------------------50$
* 3. TRACKING 3D. 2 Servo Motors, plus all of the above.----------80$
* 4. TRACE TOOL. Doesn't require hardware, software only.-------.
* The features are designed to be independent, so you can start by building part 1, then move on to the next stage, and so on,
* depending on your budget, parts availability, or your interest!
* This version of the software is capable of tracking an object in 3D space: X, Y and Z coordinates. The tracking feature requires
* the object to be visible in the normal visible spectrum or near IR. The spectral range could be extended to thermal
* vision with different image sensors. The object either emits light by itself (a rocket, a star, any source of light in general)
* OR is highlighted by an external light source, not focused, to cover a bigger area. For distance measurements
* (the Z plane), the object's reflectivity and the distance define the necessary power of the external light source.
* The Z coordinate / distance is calculated in real time, based on the simple trigonometric formula D = B / tan(phi).
* Calibration of the mechanical setup would be necessary in order to get accurate measurement results, especially
* at long distances, where the angle phi becomes really small.
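The small-angle warning above is easy to quantify: with a 6 cm laser base, a single-tick (0.0625-degree) change in parallax swings the reading enormously at long range. A sketch using the same formula (plain C++, my own numbers for illustration):

```cpp
#include <cmath>

const double kPi = 3.14159265358979323846;
const double kDegPerTick = 0.0625;   // 52 degrees / 832 ticks

// D = B / tan(phi), with phi given in whole timer ticks of parallax.
double distanceMeters(double B, int parallaxTicks) {
    double phiRad = parallaxTicks * kDegPerTick * kPi / 180.0;
    return B / std::tan(phiRad);
}
// With B = 0.06 m: 2 ticks of parallax reads ~27.5 m, 1 tick ~55 m,
// so near the top of the range one quantization step halves or doubles
// the reported distance. A longer base (or careful calibration) is what
// keeps long-distance readings meaningful.
```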
Link to download sketch: Arduino_Laser_TRF_V2
* I am not removing the first version from the download section, as it is smaller and it is easier to understand the logic
behind some of the software sub-modules. Please be advised that V1 has bugs in the servo motor position calculation. V1 is preferable for hobbyists who want only the localization and distance-measurement features.
NEXT LEVEL DEGREES OF FREEDOM: 6D++