Tracking the behavior, gaze, and fine-scaled movements of animals and birds has long been difficult for researchers, largely because of the scarcity of large annotated image datasets for markerless pose tracking, captured from multiple angles with accurate 3D annotations. The complexity of observing and understanding the intricate behavior of birds and animals has prompted a worldwide effort to devise better tracking methods.
To tackle this challenge, researchers from the Cluster of Excellence Center for the Advanced Study of Collective Behavior (CASCB) at the University of Konstanz have developed a dataset to advance behavioral research. With this markerless method, they have made it possible to trace the fine-scaled behaviors of individual birds and observe their movements.
The research team created a markerless method to detect and track bird postures from video recordings, which they call 3D-POP (3D Posture of Pigeons). With this method, one can record video of pigeons and infer the gaze and behavior of every individual bird, so it is no longer necessary to attach movement transmitters to the animals in order to track them.
The dataset also enables researchers to study the behavioral patterns of birds collectively using just two cameras. The researchers exploited the fact that, for birds, many key behaviors such as feeding (pecking at the ground), preening, vigilance (head scanning), courtship (head bowing), or walking can be quantified by tracking head and body orientations.
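To illustrate the idea, here is a minimal sketch of how head and body orientation could be derived from 3D keypoints and used to flag a behavior such as pecking. The keypoint names, coordinates, and thresholds below are hypothetical and are not taken from the 3D-POP annotation scheme.

```python
import numpy as np

# Hypothetical 3D keypoints (in metres) for one pigeon in one frame.
# Keypoint names and coordinates are illustrative, not the dataset's actual schema.
keypoints = {
    "beak":       np.array([0.10, 0.00, 0.10]),
    "head_back":  np.array([0.06, 0.00, 0.16]),
    "body_front": np.array([0.00, 0.00, 0.15]),
    "body_back":  np.array([-0.12, 0.00, 0.16]),
}

def unit(v):
    return v / np.linalg.norm(v)

head_dir = unit(keypoints["beak"] - keypoints["head_back"])        # head orientation
body_dir = unit(keypoints["body_front"] - keypoints["body_back"])  # body orientation

# Head pitch relative to the horizontal plane (negative = pointing down).
head_pitch_deg = np.degrees(np.arcsin(head_dir[2]))
# Angle between head and body orientation (large, fast changes could indicate head scanning).
head_body_deg = np.degrees(np.arccos(np.clip(head_dir @ body_dir, -1.0, 1.0)))

# Crude illustrative rule: a steep downward head pitch with the beak near the
# ground could be flagged as a pecking (feeding) posture.
is_pecking = head_pitch_deg < -45 and keypoints["beak"][2] < 0.05
print(f"head pitch: {head_pitch_deg:.1f} deg, "
      f"head-body angle: {head_body_deg:.1f} deg, pecking: {is_pecking}")
```

Applied frame by frame, simple rules like this (or a classifier trained on the same orientation features) could turn raw posture tracks into behavior labels.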
The 3D-POP dataset includes video recordings of 18 unique pigeons in group sizes of 1, 2, 5, and 10, captured from multiple views. It also provides ground truth for identity, 2D-3D trajectories, and 2D-3D posture for all individuals across the entire dataset of 300K frames, as well as object-detection annotations in the form of bounding boxes.
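As a rough sketch of how such annotations might be consumed, the snippet below reads per-frame, per-individual records and pulls out bounding boxes. The file path, column names, and layout are assumptions for illustration only; the actual 3D-POP release may organize its files differently.

```python
import pandas as pd

# Hypothetical path and schema; not the dataset's actual file layout.
ANNOTATION_CSV = "annotations/sequence_n10_day3.csv"

def load_annotations(path):
    """Read per-frame, per-individual annotations into a DataFrame.

    Assumed columns (illustrative): frame, bird_id,
    bbox_x, bbox_y, bbox_w, bbox_h, plus x/y/z columns per 3D keypoint.
    """
    return pd.read_csv(path)

def bounding_boxes_for_frame(df, frame_idx):
    """Return one (bird_id, [x, y, w, h]) pair per individual visible in a frame."""
    rows = df[df["frame"] == frame_idx]
    return [
        (row.bird_id, [row.bbox_x, row.bbox_y, row.bbox_w, row.bbox_h])
        for row in rows.itertuples()
    ]

if __name__ == "__main__":
    annotations = load_annotations(ANNOTATION_CSV)
    for bird_id, bbox in bounding_boxes_for_frame(annotations, frame_idx=0):
        print(bird_id, bbox)
```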
The researchers collected the dataset from pigeons moving on a jute fabric (3.6 m x 4.2 m). They scattered grain on the fabric to encourage the pigeons to feed in that area. The feeding area was positioned inside a large enclosure (15 m x 7 m x 4 m) equipped with a motion capture (mo-cap) system. The mo-cap system consisted of 30 motion capture cameras (12 Vicon Vero 2.2 and 18 Vicon Vantage-5 cameras; 100 Hz). At the corners of the feeding area, they placed 4 high-resolution (4K) Sony cameras mounted on standard tripods, along with an Arduino-based synchronization box that flashes RGB and infrared LED lights every 5 seconds. The experiments ran for six days, with 10 of the 18 pigeons chosen at random each day.
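The periodic LED flashes give all cameras a shared reference event. As a simple sketch of how that could be used, the snippet below estimates the frame offset between two video streams from detected flash frames; the frame rate and flash indices are made-up values, and in practice the flashes would first be detected in each video (for example, from a brightness spike).

```python
import numpy as np

# Hypothetical alignment of two camera streams using the LED flashes emitted
# every 5 seconds by the synchronization box. Values below are illustrative.
FPS = 30                                            # assumed video frame rate
flash_frames_cam_a = np.array([12, 162, 312, 462])  # detected flash frames, camera A
flash_frames_cam_b = np.array([20, 170, 320, 470])  # detected flash frames, camera B

# If both cameras run at the same rate, the per-flash offset is constant.
offset = int(np.round((flash_frames_cam_b - flash_frames_cam_a).mean()))
print(f"camera B lags camera A by {offset} frames ({offset / FPS:.2f} s)")

# Frame i in camera A then corresponds to frame i + offset in camera B.
def to_cam_b(frame_in_a: int) -> int:
    return frame_in_a + offset
```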
The method is proving useful for tracking animals' behavior, gaze, and fine-scaled movements. The researchers suggest that the annotation method can also be applied to other birds and animals, so that the behavior of other species can be studied and analyzed in the same way.
Check out the Paper and Reference Article. All credit for this research goes to the researchers on this project. Also, don't forget to join our 26k+ ML SubReddit, Discord Channel, and Email Newsletter, where we share the latest AI research news, cool AI projects, and more.
Rachit Ranjan is a consulting intern at MarktechPost. He is currently pursuing his B.Tech from the Indian Institute of Technology (IIT) Patna. He is actively shaping his career in the field of Artificial Intelligence and Data Science and is passionate about and dedicated to exploring these fields.