Processing Measurement Results#
Overview#
The blaze-101 camera measures the distance the light travels for each sensor pixel.
Using these distances, the camera calculates x,y,z coordinates in a right-handed coordinate system for each sensor pixel. The coordinate system's origin is at the camera's optical center, which is located inside the camera housing. The y axis points down, and the z axis points away from the camera.
The camera provides 3D information either as a depth map or as a point cloud. In a depth map, z coordinates are encoded as 16-bit gray values. As shown below, all 3D coordinate values can be calculated from these gray values. A point cloud contains the x,y,z coordinates for each sensor pixel as floating point numbers. The unit is mm.
If there is no valid depth information for a sensor pixel (e.g., due to outlier removal or insufficient light, i.e., light that is not strong enough to pass the confidence threshold), the corresponding values in a depth map or a point cloud are set to the value defined by the Scan3dInvalidDataValue parameter (default setting is 0).
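As a minimal sketch of how the invalid-data value can be handled, the helper below counts the pixels of a depth map that carry valid depth information. The function name is hypothetical, and the sketch assumes the depth map is available as a buffer of 16-bit gray values and that the invalid value has been read from the Scan3dInvalidDataValue parameter (0 by default):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Count the pixels of a depth map that carry valid depth information.
// invalidValue is the value reported by the Scan3dInvalidDataValue
// parameter (0 by default).
std::size_t countValidPixels(const std::vector<uint16_t>& depthMap,
                             uint16_t invalidValue)
{
    std::size_t valid = 0;
    for (uint16_t g : depthMap)
    {
        if (g != invalidValue)
        {
            ++valid;
        }
    }
    return valid;
}
```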
Processing Depth Maps#
Depth maps consist of 16-bit gray values. For each sensor pixel, the camera converts the z coordinate value to a gray value and stores it in the depth map.
In combination with the camera's calibration data provided by the Scan3dCoordinateScale, Scan3dPrincipalPointU, Scan3dPrincipalPointV, and Scan3dFocalLength parameters, complete 3D information can be retrieved from a depth map.
Refer to the GrabDepthMap C++ sample for how to configure a camera to send depth maps and how to access the depth map data.
Calculating 3D Coordinates from the 2D Depth Map#
To convert a depth map's 16-bit gray values to z coordinates in mm, use the following formula:
z [mm] = gray2mm * g
where:
g = gray value from the depth map
gray2mm = value of the Scan3dCoordinateScale parameter
For calculating the x and y coordinates, use the following formulas:
x [mm] = (u-cx) * z / f
y [mm] = (v-cy) * z / f
where:
(u,v) = column and row in the depth map
f = value of the Scan3dFocalLength parameter, i.e., the focal length of the camera's lens
(cx,cy) = values of the Scan3dPrincipalPointU and Scan3dPrincipalPointV parameters, i.e., the principal point
C++ Sample Code#
// Query the conversion factor required to convert gray values to distances:
// Choose the z axis first...
GenApi::CEnumerationPtr(camera.GetParameter("Scan3dCoordinateSelector"))->FromString("CoordinateC");
// ... then retrieve the conversion factor.
const double gray2mm = GenApi::CFloatPtr(camera.GetParameter("Scan3dCoordinateScale"))->GetValue();
// Retrieve calibration data from the camera.
const double cx = GenApi::CFloatPtr(camera.GetParameter("Scan3dPrincipalPointU"))->GetValue();
const double cy = GenApi::CFloatPtr(camera.GetParameter("Scan3dPrincipalPointV"))->GetValue();
const double f = GenApi::CFloatPtr(camera.GetParameter("Scan3dFocalLength"))->GetValue();
const int width = (int)parts[0].width;
const int height = (int)parts[0].height;
uint16_t* pDepthMap = (uint16_t*) parts[0].pData;
for ( int v = 0; v < height; ++v ) {
    for ( int u = 0; u < width; ++u ) {
        const uint16_t g = pDepthMap[v * width + u];
        const double z = gray2mm * g;
        const double x = (u - cx) * z / f;
        const double y = (v - cy) * z / f;
    }
}
Processing Point Clouds#
No further processing is required to extract 3D information from a point cloud since the point cloud consists of x,y,z coordinate triples within the camera's coordinate system.
Refer to the FirstSample C++ sample for how to configure a camera for sending a point cloud and how to access the data.
If you need a depth map in addition to a point cloud, refer to the ConvertPointCloud2DepthMap C++ sample, which illustrates how to compute grayscale and RGB depth maps from a point cloud.
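As a minimal sketch of iterating over such a buffer, the helper below computes the average z value of all valid points. The function name is hypothetical; the sketch assumes the point cloud is laid out as one (x, y, z) float triple per pixel, in mm, and that invalid points use the default Scan3dInvalidDataValue of 0:

```cpp
#include <cstddef>

// A point cloud buffer holds one (x, y, z) float triple per sensor pixel,
// in mm. Compute the average z value of all valid points, assuming the
// invalid-data value is 0 (the Scan3dInvalidDataValue default).
double averageZ(const float* pointCloud, std::size_t numPoints)
{
    double sum = 0.0;
    std::size_t valid = 0;
    for (std::size_t i = 0; i < numPoints; ++i)
    {
        const float z = pointCloud[3 * i + 2];
        if (z != 0.0f)
        {
            sum += z;
            ++valid;
        }
    }
    return valid > 0 ? sum / valid : 0.0;
}
```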
Shifting the Origin of the Coordinate System to the Front of the Camera Housing#
The origin of the camera's coordinate system is located at the camera's optical center, which is inside the camera housing. If you prefer coordinates in a coordinate system located at the front of the camera housing, i.e., translated along the z axis, a constant, device-specific offset has to be subtracted from the z coordinates. The required offset can be retrieved from the camera by getting the value of the ZOffsetOriginToCameraFront parameter:
const double offset = GenApi::CFloatPtr(camera.GetParameter("ZOffsetOriginToCameraFront"))->GetValue();
If (x,y,z) are the coordinates of a point in the camera's coordinate system, the corresponding coordinates (x',y',z') in a coordinate system that is shifted along the z axis to the front of the camera housing can be determined using the following formulas:
x' = x
y' = y
z' = z - offset
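Applied to a whole point cloud, the translation above amounts to subtracting the offset from every valid z coordinate. A minimal sketch, with a hypothetical function name, assuming the (x, y, z) float-triple buffer layout described earlier and the default invalid-data value of 0:

```cpp
#include <cstddef>

// Shift point cloud coordinates from the optical center to the front of
// the camera housing by subtracting the device-specific z offset (in mm)
// from every z coordinate. Invalid points (z == 0 by default) are left
// untouched so they stay marked as invalid.
void shiftOriginToCameraFront(float* pointCloud, std::size_t numPoints,
                              float offset)
{
    for (std::size_t i = 0; i < numPoints; ++i)
    {
        float& z = pointCloud[3 * i + 2];
        if (z != 0.0f)
        {
            z -= offset;
        }
    }
}
```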
Calculating Distances#
Given a point's coordinates (x,y,z) in mm, the distance of that point to the camera's optical center can be calculated using the following formula:
d = sqrt( x*x + y*y + z*z )
The distance d' to the front of the camera housing can be calculated as follows:
z' = z - offset
d' = sqrt( x*x + y*y + z'*z')
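The two distance formulas translate directly into code. A minimal sketch with hypothetical function names:

```cpp
#include <cmath>

// Distance (in mm) of a point (x, y, z) to the camera's optical center.
double distanceToOpticalCenter(double x, double y, double z)
{
    return std::sqrt(x * x + y * y + z * z);
}

// Distance (in mm) to the front of the camera housing: shift z by the
// device-specific offset first, then measure.
double distanceToCameraFront(double x, double y, double z, double offset)
{
    const double zShifted = z - offset;
    return std::sqrt(x * x + y * y + zShifted * zShifted);
}
```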
Visualizing Depth Information as RGB images#
You can calculate RGB values from z coordinates or distance values using the following scheme for a rainbow color mapping. This can be useful for improved visualization of the data.
First, a depth value from the [minDepth..maxDepth] value range is converted into a 10-bit value. This 10-bit depth value is mapped to 4 color ranges, where each range has a resolution of 8 bits.
minDepth and maxDepth = values of the DepthMin and DepthMax parameters, i.e., the camera's current depth ROI
| Depth Value | Mapped to Color Range |
|---|---|
| 0..255 | red to yellow (255,0,0) -> (255,255,0) |
| 256..511 | yellow to green (255,255,0) -> (0,255,0) |
| 512..767 | green to aqua (0,255,0) -> (0,255,255) |
| 768..1023 | aqua to blue (0,255,255) -> (0,0,255) |
In the following code snippet, depth is either a z value or a distance value in mm.
const int minDepth = (int) GenApi::CIntegerPtr(camera.GetParameter("DepthMin"))->GetValue();
const int maxDepth = (int) GenApi::CIntegerPtr(camera.GetParameter("DepthMax"))->GetValue();
const double scale = 65535.0 / (maxDepth - minDepth);
for each pixel {
    // Set depth either to the corresponding z value or
    // a distance value calculated from the z value.
    // Clip depth if required.
    if (depth < minDepth)
        depth = minDepth;
    else if (depth > maxDepth)
        depth = maxDepth;
    // Compute RGB values.
    const uint16_t gray = (uint16_t)((depth - minDepth) * scale);
    const uint16_t val = gray >> 6 & 0xff; // position within the color range
    const uint16_t sel = gray >> 14;       // which of the 4 color ranges
    uint32_t res = val << 8 | 0xff;
    if (sel & 0x01)
    {
        res = (~res) >> 8 & 0xffff;
    }
    if (sel & 0x02)
    {
        res = res << 8;
    }
    const uint8_t r = res & 0xff;
    res = res >> 8;
    const uint8_t g = res & 0xff;
    res = res >> 8;
    const uint8_t b = res & 0xff;
}
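The mapping can also be packaged as a self-contained function, which makes it easy to check the boundary colors (red at minDepth, blue at maxDepth) against the table above. A minimal sketch with a hypothetical function name and return type:

```cpp
#include <cstdint>

struct Rgb { uint8_t r, g, b; };

// Map a depth value in [minDepth, maxDepth] (mm) to a rainbow RGB color,
// following the scheme above: red -> yellow -> green -> aqua -> blue.
Rgb depthToRgb(double depth, double minDepth, double maxDepth)
{
    // Clip depth to the [minDepth, maxDepth] range.
    if (depth < minDepth) depth = minDepth;
    if (depth > maxDepth) depth = maxDepth;
    const double scale = 65535.0 / (maxDepth - minDepth);
    const uint16_t gray = static_cast<uint16_t>((depth - minDepth) * scale);
    const uint16_t val = (gray >> 6) & 0xff; // position within the color range
    const uint16_t sel = gray >> 14;         // which of the 4 color ranges
    uint32_t res = (val << 8) | 0xff;
    if (sel & 0x01)
    {
        res = (~res >> 8) & 0xffff;
    }
    if (sel & 0x02)
    {
        res = res << 8;
    }
    return Rgb{ static_cast<uint8_t>(res & 0xff),
                static_cast<uint8_t>((res >> 8) & 0xff),
                static_cast<uint8_t>((res >> 16) & 0xff) };
}
```

For example, a depth equal to minDepth maps to pure red and a depth equal to maxDepth maps to pure blue.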