Researchers Developed a Novel Markerless AI Method to Track Bird Postures in 3D Using Video Recordings

Tracking the behavior, gaze, and fine-scaled movements of animals and birds has long been challenging for researchers, in part because large datasets of annotated animal images for markerless pose tracking, captured from multiple angles with accurate 3D annotations, remain scarce. The complexity of observing and understanding the intricate behavior of birds and animals has led to a global effort to devise innovative tracking methods.

To tackle this challenge, the researchers from the Cluster of Excellence Center for the Advanced Study of Collective Behavior (CASCB) at the University of Konstanz have developed a dataset to advance behavioral research. With this markerless method, they have made it possible to track the fine-scaled behaviors of individual birds and observe their movements.

The research team successfully created a markerless method to identify and track bird postures from video recordings, which they call 3D-POP (3D Posture of Pigeons). With this method, one can record video of pigeons and identify the gaze and behavior of each individual bird, so it is no longer necessary to attach movement transmitters to the animals in order to track and identify them.


The dataset also enables researchers to study the behavioral patterns of groups of birds using just two cameras. The researchers exploited the fact that, for birds, many key behaviors, such as feeding (pecking at the ground), preening, vigilance (head scanning), courtship (head bowing), or walking, can be quantified by tracking head and body orientations.
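To make the idea concrete, here is a minimal sketch of how a behavior might be inferred from head orientation. The keypoint names, geometry, and thresholds below are illustrative assumptions, not the authors' actual pipeline:

```python
import numpy as np

# Illustrative sketch (not the authors' code): classify a coarse behavior
# from 3D head keypoints. Keypoint names and thresholds are assumptions.

def head_pitch_deg(beak_xyz, nape_xyz):
    """Pitch of the nape-to-beak axis relative to horizontal, in degrees."""
    v = np.asarray(beak_xyz, float) - np.asarray(nape_xyz, float)
    horiz = np.linalg.norm(v[:2])
    return np.degrees(np.arctan2(v[2], horiz))  # negative = beak pointing down

def classify_behavior(beak_xyz, nape_xyz, pitch_thresh=-45.0):
    """Very coarse rule: a steeply downward-pointing head suggests pecking/feeding."""
    pitch = head_pitch_deg(beak_xyz, nape_xyz)
    return "feeding" if pitch < pitch_thresh else "vigilance"

# A bird whose beak is well below the nape is labelled as feeding.
print(classify_behavior(beak_xyz=(0.0, 0.0, -0.05), nape_xyz=(0.0, 0.02, 0.0)))
```

A real classifier would of course use the full posture over time rather than a single-frame threshold, but the sketch shows why head and body orientation alone carry so much behavioral signal.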

To build 3D-POP, the researchers recorded video of 18 unique pigeons in group sizes of 1, 2, 5, and 10, from many different views. They provide ground truth for identity, 2D-3D trajectories, and 2D-3D posture for all individuals across the entire dataset of 300K frames. The dataset also includes annotations for object detection in the form of bounding boxes.
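A dataset like this is typically consumed as per-frame, per-individual records. The snippet below sketches how such annotations might be parsed; the column names and file layout are hypothetical placeholders, not the actual 3D-POP format:

```python
import csv
import io

# Hypothetical annotation layout: one row per bird per frame, holding an
# identity, a bounding box, and a 2D keypoint. The column names below are
# assumptions for illustration, not the real 3D-POP schema.
sample = io.StringIO(
    "frame,bird_id,x1,y1,x2,y2,beak_x,beak_y\n"
    "0,p01,100,120,180,210,140,130\n"
    "0,p02,300,140,360,220,330,150\n"
)

# Index the annotations by frame, then by individual identity.
annotations = {}
for row in csv.DictReader(sample):
    frame = int(row["frame"])
    bbox = tuple(float(row[k]) for k in ("x1", "y1", "x2", "y2"))
    annotations.setdefault(frame, {})[row["bird_id"]] = {
        "bbox": bbox,
        "beak": (float(row["beak_x"]), float(row["beak_y"])),
    }

print(len(annotations[0]))  # two birds annotated in frame 0
```

Grouping by frame and identity makes it straightforward to pair the bounding boxes with the trajectory and posture ground truth for each individual.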

The researchers collected the dataset from pigeons moving on a jute fabric (3.6 m x 4.2 m), scattering grains on the fabric to encourage the pigeons to feed in that area. The feeding area was located inside a large enclosure (15 m x 7 m x 4 m) equipped with a motion capture (mo-cap) system consisting of 30 cameras (12 Vicon Vero 2.2 and 18 Vicon Vantage-5 cameras; 100 Hz). At the corners of the feeding area, they placed four high-resolution (4K) Sony action cameras mounted on standard tripods, along with an Arduino-based synchronization box that flashes RGB and infrared LED lights every 5 seconds. The 18 pigeons were used in experiments over 6 days, with 10 pigeons selected at random each day.
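The periodic LED flashes give each camera a shared time reference: a flash shows up as a brightness spike in every view, so matching spike indices aligns the recordings. The sketch below illustrates that idea on synthetic frames; the thresholds, region of interest, and function names are assumptions, not the authors' implementation:

```python
import numpy as np

# Illustrative sketch of LED-based synchronization (assumed approach, not the
# authors' code): find frames where mean brightness inside an LED region
# spikes, then align cameras by matching the flash frame indices.

def flash_frames(frames, roi, thresh=200.0):
    """Return indices of frames whose mean intensity inside `roi` exceeds `thresh`."""
    y0, y1, x0, x1 = roi
    return [i for i, f in enumerate(frames)
            if f[y0:y1, x0:x1].mean() > thresh]

# Synthetic 8-bit grayscale frames: simulated LED flashes at frames 3 and 8.
frames = [np.full((10, 10), 50, dtype=np.uint8) for _ in range(10)]
frames[3][:] = 255
frames[8][:] = 255

offsets = flash_frames(frames, roi=(0, 10, 0, 10))
print(offsets)  # [3, 8]
```

Once each camera's flash indices are known, the frame-index difference between cameras gives the offset needed to line up all four 4K views with the mo-cap clock.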

This method is proving useful for tracking animals’ behavior, gaze, and fine-scaled movements. The researchers suggest that the annotation method can also be applied to other birds and animals, allowing researchers to study and analyze their behavior as well.


Check out the Paper and Reference Article. All credit for this research goes to the researchers on this project.


Rachit Ranjan is a consulting intern at MarktechPost. He is currently pursuing his B.Tech from the Indian Institute of Technology (IIT) Patna. He is actively shaping his career in the field of Artificial Intelligence and Data Science and is passionate about exploring these fields.



