We have two nodes in our system: one node publishes frames from the inbuilt camera, and the other node takes the image, processes it, and shows it using imshow in OpenCV. The messages are Images from sensor_msgs. Both the publisher and the subscriber have queue_size set to 1, and on the subscriber we also set buff_size to 2**24 (we are using rospy).
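For reference, a minimal sketch of our publisher node (the topic name and camera index are just illustrative):

```python
#!/usr/bin/env python
import rospy
import cv2
from sensor_msgs.msg import Image
from cv_bridge import CvBridge


def main():
    rospy.init_node('camera_publisher')
    pub = rospy.Publisher('/camera/image', Image, queue_size=1)
    bridge = CvBridge()
    cap = cv2.VideoCapture(0)   # inbuilt camera
    rate = rospy.Rate(30)       # roughly 30 FPS

    while not rospy.is_shutdown():
        ok, frame = cap.read()
        if not ok:
            continue
        msg = bridge.cv2_to_imgmsg(frame, encoding='bgr8')
        msg.header.stamp = rospy.Time.now()  # timestamp used for delay measurement
        pub.publish(msg)
        rate.sleep()


if __name__ == '__main__':
    main()
```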
In the publisher node we set the timestamp to rospy.Time.now(), and in the subscriber node we compare the current time with the timestamp of the message to check how old the frame from the camera is, i.e. the communication delay. If the subscriber node does nothing else than show the image, the delay is very small, a few milliseconds. However, if we simulate that the subscriber node needs some time to process the frame before showing it, the delays increase substantially (we measure the delay before the message is "processed"). For example, with a simulated processing delay of 1 second, every incoming frame is already 1 second old when the callback starts, on top of the additional 1 second of "processing time".

If queues worked the way we expect, the subscriber's input queue (of size 1) should be overwritten with a fresh frame just before the callback is called (given that the publisher runs at 30 FPS), which would leave only the 1 second of processing delay. Right now it seems that the previous message is somehow cached, or that the subscriber queue is only updated when the callback function is called and is never overwritten in between.
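And the subscriber side, roughly, where the sleep stands in for the simulated processing (again a simplified sketch, names are illustrative):

```python
#!/usr/bin/env python
import rospy
import cv2
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()


def callback(msg):
    # Measure how old the frame is *before* any processing is done.
    age = (rospy.Time.now() - msg.header.stamp).to_sec()
    rospy.loginfo('frame age before processing: %.3f s', age)

    rospy.sleep(1.0)  # simulated processing time

    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
    cv2.imshow('frame', frame)
    cv2.waitKey(1)


def main():
    rospy.init_node('camera_subscriber')
    rospy.Subscriber('/camera/image', Image, callback,
                     queue_size=1, buff_size=2**24)
    rospy.spin()


if __name__ == '__main__':
    main()
```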
Are we thinking about this the wrong way, or is this expected behaviour?