# Processing Measurement Results (Stereo ace)
## Overview
Basler Stereo ace cameras use the rectified stereo image pair to compute disparity, error, and confidence images.
After rectification, an object point is guaranteed to be projected into the same row in both the left and the right image. The column into which that point is projected in the right image, however, is always further to the left, i.e., lower than or equal to the point's column in the left image. Disparity is the difference between the point's column in the left image and its column in the right image, and it expresses the depth, i.e., the distance of the object point from the camera. For example, an object point that appears at column 620 in the left image and at column 600 in the right image has a disparity of 20 pixels. The disparity image stores the disparity values of all pixels in the left camera image. The larger the disparity, the closer the object point. A disparity of 0 means that the projections of the object point are in the same image column and the object point is at an infinite distance.
There are some factors that hinder the reliable calculation of disparities:
- Occlusions: Occlusions are areas on the left sides of objects that aren't visible to the right camera. They are "hidden" from view, and therefore the disparity can't be determined there.
- Lack of texture: Stereo matching requires surface texture. The projector of the Stereo ace allows you to add texture to the scene, except in extreme cases, e.g., completely black objects that don't reflect any light.
Pixels for which the disparity can't be determined are marked as invalid by assigning them a disparity value of 0. To distinguish between invalid disparity measurements and disparity measurements of 0 for objects that are infinitely far away, the disparity value for the latter is set to the smallest possible disparity value above 0.
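As an illustration, the following C++ sketch separates invalid pixels from valid measurements in a raw 16-bit disparity buffer. The buffer layout and all names are assumptions for this example, not part of the camera API.

```cpp
// Minimal sketch: classifying the pixels of a raw 16-bit disparity image.
// The buffer and its layout are assumptions for this example.
#include <cstddef>
#include <cstdint>
#include <vector>

struct DisparityStats
{
    std::size_t invalid = 0; // disparity could not be determined (raw value 0)
    std::size_t valid = 0;   // usable disparity measurement
};

DisparityStats ClassifyDisparities(const std::vector<std::uint16_t>& disparityBuffer)
{
    DisparityStats stats;
    for (std::uint16_t d16 : disparityBuffer)
    {
        // A raw value of 0 marks an invalid pixel. Object points at infinite
        // distance are encoded with the smallest value above 0 instead.
        if (d16 == 0)
            ++stats.invalid;
        else
            ++stats.valid;
    }
    return stats;
}
```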
To compute disparity values, the stereo matching algorithm has to find corresponding object points in the left and right camera images. These are points that represent the same object point in the scene. For stereo matching, the Basler Stereo ace uses SGM (Semi-Global Matching).
## Computing Depth Images and Point Clouds
The disparity image contains 16-bit unsigned integer values. These values must be multiplied by the scale value given in the Scan3dCoordinateScale parameter to get the disparity values d in pixels. To compute the 3D object coordinates from the disparity values, the focal length and the baseline as well as the principal point are required. These values are available via the Scan3dFocalLength, Scan3dBaseline, Scan3dPrincipalPointU, and Scan3dPrincipalPointV parameters. The focal length and principal point depend on the image resolution of the selected component. Knowing these values, the pixel coordinates and the disparities can be transformed into 3D object coordinates in the camera coordinate system.
Info
Before reading Scan3dFocalLength, Scan3dBaseline, Scan3dPrincipalPointU, and Scan3dPrincipalPointV, you must set the ComponentSelector parameter to Disparity as these values are component-dependent.
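As an illustration, the following sketch reads these values with the pylon C++ API. It assumes an already opened Pylon::CInstantCamera and omits error handling.

```cpp
// Minimal sketch: reading the component-dependent 3D reconstruction
// parameters with the pylon C++ API. Assumes 'camera' is already opened.
#include <pylon/PylonIncludes.h>

struct Scan3dParams
{
    double coordinateScale;
    double focalLength;
    double baseline;
    double principalPointU;
    double principalPointV;
};

Scan3dParams ReadScan3dParameters(Pylon::CInstantCamera& camera)
{
    GenApi::INodeMap& nodemap = camera.GetNodeMap();

    // The Scan3d* values are component-dependent, so select Disparity first.
    Pylon::CEnumParameter(nodemap, "ComponentSelector").SetValue("Disparity");

    Scan3dParams p;
    p.coordinateScale = Pylon::CFloatParameter(nodemap, "Scan3dCoordinateScale").GetValue();
    p.focalLength     = Pylon::CFloatParameter(nodemap, "Scan3dFocalLength").GetValue();
    p.baseline        = Pylon::CFloatParameter(nodemap, "Scan3dBaseline").GetValue();
    p.principalPointU = Pylon::CFloatParameter(nodemap, "Scan3dPrincipalPointU").GetValue();
    p.principalPointV = Pylon::CFloatParameter(nodemap, "Scan3dPrincipalPointV").GetValue();
    return p;
}
```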
Assuming that $d16_{ik}$ is the 16-bit disparity value at column $i$ and row $k$ of a disparity image, the float disparity in pixels $d_{ik}$ is given by the following formula:
$$ d_{ik}=d16_{ik} \cdot \mathrm{Scan3dCoordinateScale} $$
Using these parameters, the 3D reconstruction in meters is given by:
$$ P_x=\left(i+0.5-\mathrm{Scan3dPrincipalPointU}\right) \frac{\mathrm{Scan3dBaseline}}{d_{ik}} $$
$$ P_y=\left(k+0.5-\mathrm{Scan3dPrincipalPointV}\right) \frac{\mathrm{Scan3dBaseline}}{d_{ik}} $$
$$ P_z=\mathrm{Scan3dFocalLength} \frac{\mathrm{Scan3dBaseline}}{d_{ik}} $$
The set of all object points computed from the disparity image gives the point cloud, which can be used for 3D modeling applications. The disparity image is converted into a depth image by replacing the disparity value in each pixel with the value of Pz.
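The following sketch applies the formulas above to a complete disparity image, producing both a point cloud and a depth image. The row-major buffer layout and the function signature are assumptions for this example; the parameter values are those read from the camera as shown above.

```cpp
// Minimal sketch: converting a 16-bit disparity image into a point cloud
// and a depth image, using the reconstruction formulas above.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Point3d { double x, y, z; };

void Reconstruct(const std::vector<std::uint16_t>& disparity16, int width, int height,
                 double scale, double focalLength, double baseline,
                 double principalU, double principalV,
                 std::vector<Point3d>& pointCloud, std::vector<double>& depth)
{
    pointCloud.clear();
    // Invalid pixels keep a depth value of 0 to mark them as invalid.
    depth.assign(static_cast<std::size_t>(width) * height, 0.0);

    for (int k = 0; k < height; ++k)        // row
    {
        for (int i = 0; i < width; ++i)     // column
        {
            const std::size_t idx = static_cast<std::size_t>(k) * width + i;
            const std::uint16_t d16 = disparity16[idx];
            if (d16 == 0)
                continue; // invalid pixel, no 3D point

            const double d = d16 * scale;          // disparity in pixels
            const double bOverD = baseline / d;    // common factor B / d_ik

            Point3d p;
            p.x = (i + 0.5 - principalU) * bOverD;
            p.y = (k + 0.5 - principalV) * bOverD;
            p.z = focalLength * bOverD;

            pointCloud.push_back(p);
            depth[idx] = p.z; // depth image stores Pz per pixel
        }
    }
}
```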
Tip
Instead of reading the parameters directly, you can use chunk data. That way, the information required for the 3D calculation is appended to every image via the ChunkScan3dCoordinateScale, ChunkScan3dFocalLength, ChunkScan3dBaseline, ChunkScan3dPrincipalPointU, and ChunkScan3dPrincipalPointV chunk parameters. To enable chunk data, set the ChunkModeActive parameter to true. Before reading these chunk parameters, you must set the ChunkComponentSelector parameter to Disparity.
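A minimal sketch of this approach with the pylon C++ API, again assuming an opened Pylon::CInstantCamera and omitting error handling:

```cpp
// Minimal sketch: enabling chunk data and reading the 3D reconstruction
// values attached to a grabbed image.
#include <pylon/PylonIncludes.h>

void ReadChunkScan3dParameters(Pylon::CInstantCamera& camera)
{
    GenApi::INodeMap& nodemap = camera.GetNodeMap();

    // Attach the 3D reconstruction parameters to every image.
    Pylon::CBooleanParameter(nodemap, "ChunkModeActive").SetValue(true);

    camera.StartGrabbing(1);
    Pylon::CGrabResultPtr grabResult;
    camera.RetrieveResult(5000, grabResult, Pylon::TimeoutHandling_ThrowException);

    // The chunk values are read from the grab result's chunk data node map.
    GenApi::INodeMap& chunkNodemap = grabResult->GetChunkDataNodeMap();

    // The chunk values are component-dependent as well.
    Pylon::CEnumParameter(chunkNodemap, "ChunkComponentSelector").SetValue("Disparity");

    double scale    = Pylon::CFloatParameter(chunkNodemap, "ChunkScan3dCoordinateScale").GetValue();
    double baseline = Pylon::CFloatParameter(chunkNodemap, "ChunkScan3dBaseline").GetValue();
    // ... read the remaining ChunkScan3d* values the same way.
}
```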
## Computing Distances
If a point's coordinates (x, y, z) are given in mm, the distance of that point from the optical center of the camera can be calculated using the following formula:
$$ d = \sqrt{x^2 + y^2 + z^2} $$
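As a minimal C++ illustration:

```cpp
#include <cmath>

// Distance of a point (x, y, z), given in mm, from the camera's optical center.
double DistanceFromOpticalCenter(double x, double y, double z)
{
    return std::sqrt(x * x + y * y + z * z);
}
```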