Class VisualNPipeline

Class Documentation

class aslam::VisualNPipeline

An interface for pipelines that turn images into VisualNFrames.

This is the abstract interface for visual pipelines that turn raw images into VisualNFrame data. The underlying pipeline may include undistortion or rectification, image contrast enhancement, feature detection and descriptor computation, or other operations.

The class has two NCamera calibration structs that represent the intrinsic and extrinsic calibration of the camera system. The “input” calibration (getInputNCameras()) represents the calibration of the raw camera system, before any image processing, resizing, or undistortion has taken place. The “output” calibration (getOutputNCameras()) represents the calibration parameters of the images and keypoints that are set in the VisualNFrame struct. These are the camera parameters after image processing, resizing, undistortion, etc.

The class should synchronize images with nearby timestamps and handle out-of-order images. When all frames of a VisualNFrame are complete, it is added to a list of output frames in the order of completion. This list should be sorted by time (oldest first), and the number of elements can be queried via getNumFramesComplete(). The getNext() function retrieves the oldest complete VisualNFrame and leaves the rest in the queue. The getLatestAndClear() function retrieves the newest VisualNFrame and discards anything older.
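The grouping of images into nframes can be illustrated with a minimal standalone sketch. This is a hypothetical model of the matching logic, not the aslam implementation: an incoming image joins a pending nframe whose anchor timestamp lies within the tolerance, otherwise it starts a new one.

```cpp
#include <cassert>
#include <cstdint>
#include <cstdlib>
#include <map>
#include <vector>

// Hypothetical sketch of timestamp-based synchronization: each incoming
// image joins the pending nframe whose anchor timestamp is within the
// tolerance, otherwise it starts a new nframe. Illustrative only; this is
// NOT the actual aslam::VisualNPipeline implementation.
struct FrameGrouper {
  int64_t tolerance_ns;
  // Maps an nframe's anchor timestamp to the camera indices received so far.
  std::map<int64_t, std::vector<size_t>> pending;

  void addImage(size_t camera_index, int64_t timestamp_ns) {
    for (auto& entry : pending) {
      if (std::llabs(entry.first - timestamp_ns) <= tolerance_ns) {
        entry.second.push_back(camera_index);  // Same synchronized frame.
        return;
      }
    }
    pending[timestamp_ns] = {camera_index};  // Start a new nframe.
  }
};
```

An nframe would be considered complete once it has collected one image per camera in the camera system.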

Public Functions

ASLAM_POINTER_TYPEDEFS(VisualNPipeline)
ASLAM_DISALLOW_EVIL_CONSTRUCTORS(VisualNPipeline)
VisualNPipeline(size_t num_threads, const std::vector<VisualPipeline::Ptr> &pipelines, const NCamera::Ptr &input_camera_system, const NCamera::Ptr &output_camera_system, int64_t timestamp_tolerance_ns)

Initialize a working pipeline.

Parameters
  • num_threads[in] The number of processing threads.

  • pipelines[in] The ordered image pipelines, one pipeline per camera in the same order as they are indexed in the camera system.

  • input_camera_system[in] The camera system of the raw images.

  • output_camera_system[in] The camera system of the processed images.

  • timestamp_tolerance_ns[in] The maximum difference, in nanoseconds, between two image timestamps for them to be considered part of the same synchronized frame.
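To make the tolerance concrete, two timestamps belong to the same synchronized frame when their absolute difference does not exceed timestamp_tolerance_ns. A minimal sketch of that check (illustrative, not the aslam code):

```cpp
#include <cstdint>
#include <cstdlib>

// Illustrative sketch of the timestamp tolerance check; not the actual
// aslam implementation.
bool sameSynchronizedFrame(int64_t t1_ns, int64_t t2_ns,
                           int64_t timestamp_tolerance_ns) {
  return std::llabs(t1_ns - t2_ns) <= timestamp_tolerance_ns;
}
```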

~VisualNPipeline()
void shutdown()

Shut down the thread pool and release blocking waiters.

void processImage(size_t camera_index, const cv::Mat &image, int64_t timestamp)

Add an image to the visual pipeline.

This function is called by the user when an image is received. The pipeline then processes the image and constructs VisualNFrames.

Parameters
  • camera_index[in] The index of the camera that this image corresponds to.

  • image[in] The image data.

  • timestamp[in] The time in integer nanoseconds.

bool processImageBlockingIfFull(size_t camera_index, const cv::Mat &image, int64_t timestamp, size_t max_output_queue_size)

Same as processImage(), except that the call blocks if the output queue exceeds the specified limit.

Parameters
  • camera_index[in] The index of the camera that this image corresponds to.

  • image[in] The image data.

  • timestamp[in] The time in integer nanoseconds.

  • max_output_queue_size[in] The maximum size of the output queue. The call blocks once this limit has been reached. Because the frames are processed in a thread pool, the actual queue size may exceed this limit by the number of nframes currently being processed.

Returns

Returns false if the queue is shut down.
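The backpressure semantics can be sketched with a standalone bounded queue (illustrative only; the real pipeline also accounts for frames still in the thread pool, as noted above):

```cpp
#include <condition_variable>
#include <deque>
#include <mutex>

// Illustrative bounded queue mirroring processImageBlockingIfFull's
// semantics: push() blocks while the queue is at capacity and returns
// false once the queue has been shut down. Not the actual aslam code.
template <typename T>
class BlockingBoundedQueue {
 public:
  explicit BlockingBoundedQueue(size_t max_size) : max_size_(max_size) {}

  bool push(const T& value) {
    std::unique_lock<std::mutex> lock(mutex_);
    // Wait until there is room or the queue has been shut down.
    not_full_.wait(lock,
                   [this] { return queue_.size() < max_size_ || shutdown_; });
    if (shutdown_) return false;
    queue_.push_back(value);
    return true;
  }

  bool pop(T* value) {
    std::unique_lock<std::mutex> lock(mutex_);
    if (queue_.empty()) return false;
    *value = queue_.front();
    queue_.pop_front();
    not_full_.notify_one();  // A producer may now make progress.
    return true;
  }

  void shutdown() {
    std::lock_guard<std::mutex> lock(mutex_);
    shutdown_ = true;
    not_full_.notify_all();  // Release any blocked producers.
  }

 private:
  size_t max_size_;
  std::deque<T> queue_;
  std::mutex mutex_;
  std::condition_variable not_full_;
  bool shutdown_ = false;
};
```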

bool processImageNonBlockingDroppingOldestNFrameIfFull(size_t camera_index, const cv::Mat &image, int64_t timestamp, size_t max_output_queue_size)

Same as processImage(), except that the call erases the oldest element of the output queue if it exceeds the maximum size.

Parameters
  • camera_index[in] The index of the camera that this image corresponds to.

  • image[in] The image data.

  • timestamp[in] The time in integer nanoseconds.

  • max_output_queue_size[in] The maximum size of the output queue. The call erases the oldest element of the output queue once this limit has been reached. Because the frames are processed in a thread pool, the actual queue size may exceed this limit by the number of nframes currently being processed.

Returns

Returns true if the oldest nframe has been dropped.

size_t getNumFramesComplete() const

Get the number of completed VisualNFrames waiting to be retrieved.

size_t getNumFramesProcessing() const

Get the number of frames being processed.

std::shared_ptr<VisualNFrame> getNext()

Get the next available set of processed frames.

This may not be the latest data; it is simply the next in a FIFO queue. If there are no VisualNFrames waiting, this returns a NULL pointer.

bool getNextBlocking(std::shared_ptr<VisualNFrame> *nframe)

Get the next available set of processed frames.

Blocks if the queue is empty.

Returns

Returns false if the queue is shut down.

std::shared_ptr<VisualNFrame> getLatestAndClear()

Get the latest available data and clear anything older.

Returns

If there are no VisualNFrames waiting, this returns a NULL pointer.
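The difference between getNext() and getLatestAndClear() can be shown on a plain deque standing in for the time-ordered output queue, oldest first (illustrative only, not the aslam implementation):

```cpp
#include <deque>

// Illustrative retrieval policies on a time-ordered output queue (oldest
// first). getNext() corresponds to taking the front; getLatestAndClear()
// takes the back and discards everything older. Not the actual aslam code.
template <typename T>
bool takeOldest(std::deque<T>* queue, T* out) {
  if (queue->empty()) return false;
  *out = queue->front();
  queue->pop_front();  // The rest of the queue is left intact.
  return true;
}

template <typename T>
bool takeLatestAndClear(std::deque<T>* queue, T* out) {
  if (queue->empty()) return false;
  *out = queue->back();
  queue->clear();  // Discard everything older than the newest frame.
  return true;
}
```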

bool getLatestAndClearBlocking(std::shared_ptr<VisualNFrame> *nframe)

Get the latest available data and clear anything older.

Blocks if the queue is empty.

Returns

Returns false if the queue is shut down.

std::shared_ptr<const NCamera> getInputNCameras() const

Get the input camera system that corresponds to the images passed in to processImage().

Because this pipeline may do things like image undistortion or rectification, the input and output camera systems may not be the same.

std::shared_ptr<const NCamera> getOutputNCameras() const

Get the output camera system that corresponds to the VisualNFrame data that comes out.

Because this pipeline may do things like image undistortion or rectification, the input and output camera systems may not be the same.

void waitForAllWorkToComplete() const

Blocks until all waiting frames are processed.

Public Static Functions

static VisualNPipeline::Ptr createTestVisualNPipeline(size_t num_cameras, size_t num_threads, int64_t timestamp_tolerance_ns)

Create a test visual npipeline.

Parameters
  • num_cameras[in] The number of cameras in the pipeline (and thus the number of frames per VisualNFrame).

  • num_threads[in] The number of threads used in the pipeline for processing the data.

  • timestamp_tolerance_ns[in] Timestamp tolerance, in nanoseconds, for frames to be considered part of the same VisualNFrame.

Returns

Pointer to the visual npipeline.