Image source

Sends image frames received from a camera. The image source can work in two different modes:

  • Receive and pass images from a camera at the speed the camera produces them. The camera may operate in free-run or hardware-triggered mode.

  • Read images from a camera, but trigger acquisition through software every time a signal is received in the trigger input. This mode allows an external software controller, such as a timer, to trigger images.

Inputs

  • trigger: Software trigger. If this input is connected, images are captured only when an object is received in it. The trigger input accepts any object type.

  • cameraId: The ID of the selected camera. The camera ID must be composed of a driverId, a colon and a camera ID, for example β€œwebcam:usb-Logitech_Ltd._video-index0”.

    The ID can also be given as a regular expression. This is indicated by surrounding the ID with forward slashes. For example, β€œ/^webcam:.*0$/” uses the first web camera whose ID ends with β€œ0”. β€œ/./” (Automatic) matches all cameras and thus uses the very first camera found by the camera manager. See device IDs.

    If cameraId is a colon (:), a temporary virtual camera will be created. The object received in the trigger input will then be used as a file name to read.

  • autoConfig: If this flag is true, the camera will be configured with settings read from the camera configuration database at start-up. Otherwise, camera settings will not be touched with the exception of trigger mode, which will be set to software trigger (and back) if the trigger input is connected.

  • position: The method of requesting camera position parameters. While it is possible to change camera position parameters manually at run time, the position database will only be queried during start-up. Thus, changes made to the position database will take effect only after a restart.

  • frame: The position and orientation of the world coordinate system with respect to the camera. This parameter has an effect only in the ManualPosition mode. See coordinate frames for more information.

  • calibration: The method of requesting camera calibration parameters. While it is possible to change manual calibration parameters at run time, the calibration database will only be queried during start-up. Thus, changes made to the calibration database will take effect only after a restart.

  • sensorType: The type of the camera’s image sensor.

  • focalLength: The focal length of the camera in pixels.

  • autoCenter: If this flag is true, the principal point will be automatically placed at the center of each outgoing image. Otherwise, the fixed value given by the principalPoint parameter will be used.

  • principalPoint: The location of the principal point in pixel coordinates. The principal point is the point where the optical axis hits the image sensor.

  • distortionFactors: Radial distortion factors k1 and k2. Tangential distortion factors p1 and p2 are currently not used.

  • triggerTimeout: The number of milliseconds the tool will wait for an image after a software trigger. If no image is received from the camera within triggerTimeout milliseconds, a run-time error will be generated. Setting triggerTimeout to zero makes the tool wait forever.

  • autoDecode: If set to false, the tool emits images in the raw color format provided by the camera driver, such as Bayer or YUV. If set to true, images are always decoded to RGB before they are emitted.
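The cameraId matching rules above can be sketched as follows. The selectCamera function is a hypothetical helper, not part of the tool's API; it merely illustrates the documented ID syntax: an exact driverId:cameraId match, or a regular expression wrapped in forward slashes.

```cpp
#include <regex>
#include <string>
#include <vector>

// Return the first available camera ID matched by `cameraId`, or an
// empty string if nothing matches. A cameraId wrapped in forward
// slashes is treated as a regular expression; any other value
// requires an exact match.
std::string selectCamera(const std::string &cameraId,
                         const std::vector<std::string> &available)
{
    if (cameraId.size() >= 2 &&
        cameraId.front() == '/' && cameraId.back() == '/') {
        // Strip the surrounding slashes and search each candidate ID.
        std::regex re(cameraId.substr(1, cameraId.size() - 2));
        for (const auto &id : available)
            if (std::regex_search(id, re))
                return id;
    } else {
        for (const auto &id : available)
            if (id == cameraId)
                return id;
    }
    return {};
}
```

With this sketch, β€œ/./” returns the first camera in the list, mirroring the Automatic behavior described above.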
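Together, focalLength, principalPoint and distortionFactors describe a standard pinhole camera model with radial distortion. The sketch below uses a hypothetical Intrinsics struct that is not part of the tool's API; it shows how such parameters map a normalized camera-frame point to pixel coordinates under the common radial model x_d = x·(1 + k1·r² + k2·r⁴).

```cpp
#include <cmath>

// Hypothetical container for the intrinsic parameters documented above.
struct Intrinsics {
    double focalLength; // focal length in pixels (focalLength)
    double cx, cy;      // principal point in pixels (principalPoint)
    double k1, k2;      // radial distortion factors (distortionFactors)
};

// Project a normalized camera-frame point (x, y) = (X/Z, Y/Z) to
// pixel coordinates (u, v), applying radial distortion first.
void project(const Intrinsics &c, double x, double y,
             double &u, double &v)
{
    double r2 = x * x + y * y;                     // squared radius
    double d = 1.0 + c.k1 * r2 + c.k2 * r2 * r2;   // radial factor
    u = c.focalLength * x * d + c.cx;
    v = c.focalLength * y * d + c.cy;
}
```

With k1 = k2 = 0 the model reduces to an undistorted pinhole projection, which is why leaving distortionFactors at zero is a safe default for an uncalibrated camera.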

Outputs

  • image: The image received from a camera driver or from the image property.

  • imageCount: The number of images sent since last reset.

  • missedFrames: The number of captured image frames missed due to exhausted computing resources since last reset. This output is sent only when frames are missed, and it is not synchronous with the other outputs. Use it for informational purposes only; attaching heavy analysis to it can easily cause more frames to be missed.

enum Calibration

Methods for getting calibration data.

Values:

enumerator NoCalibration

The tool makes no changes to calibration parameters.

If incoming images have no calibration data, neither will the outgoing ones. Note that .kuva image files contain calibration data. If you feed such images to the image source with calibration set to NoCalibration, the calibration parameters stored in the images will be preserved.

enumerator DatabaseCalibration

Calibration data is fetched from a calibration database based on the currently selected camera.

If no camera is selected, DatabaseCalibration is the same as NoCalibration.

enumerator ManualCalibration

Calibration data is entered manually.

The manually entered data will be used even if the incoming image contains calibration parameters.

enum Position

Methods for getting position data.

Values:

enumerator NoPosition

The tool makes no changes to position parameters.

If incoming images have no position data, neither will the outgoing ones. Note that .kuva image files contain position data. If you feed such images to the image source with position set to NoPosition, the position parameters stored in the images will be preserved.

enumerator DatabasePosition

Position data is fetched from a position database based on the currently selected camera.

If no camera is selected, DatabasePosition is the same as NoPosition.

enumerator ManualPosition

Position data is entered manually.

The manually entered data will be used even if the incoming image contains position parameters.

enum SensorType

Image sensor types.

Values:

enumerator AutoDetectSensor

Detect sensor type automatically based on the information given by the camera driver.

enumerator AreaSensor

Force the type to area scan.

enumerator LineSensor

Force the type to line scan.

QObject *driver() const

Returns a pointer to the current camera driver, or zero if no camera driver is being used.

This function makes it possible to access the camera driver directly.