Perception, Sensors, and Control

Sensor fusion is an essential part of my future research. It combines data from multiple sensors to build a comprehensive picture of the environment, which is critical for applications such as autonomous driving. Edge computing also plays a crucial role in this work: it moves data processing and storage away from centralized servers toward the edge of the network, closer to the data source, reducing latency and improving response times.
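As a minimal illustration of the fusion idea, the sketch below combines two noisy range estimates of the same quantity by weighting each with the inverse of its variance, the same principle that underlies Kalman-filter-based fusion. The sensor names and noise figures are hypothetical, and a real pipeline would fuse many sensors over a dynamic state rather than a single static measurement.

import numpy as np

def fuse(estimates, variances):
    """Fuse independent estimates of the same quantity by
    inverse-variance weighting (the static Kalman update)."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances
    fused = np.sum(weights * estimates) / np.sum(weights)
    fused_variance = 1.0 / np.sum(weights)
    return fused, fused_variance

# Hypothetical example: a lidar and a radar both measure the range
# to the same obstacle, with different noise characteristics.
lidar_range, lidar_var = 10.2, 0.05   # metres, metres^2
radar_range, radar_var = 10.6, 0.40

fused_range, fused_var = fuse([lidar_range, radar_range],
                              [lidar_var, radar_var])
print(f"fused range: {fused_range:.2f} m (variance {fused_var:.3f})")

The fused estimate lands closer to the lower-noise sensor, and its variance is smaller than either input, which is exactly the benefit that scales up in a full sensor-fusion system.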

3D perception is another essential area of development. This technology allows machines to interpret and understand three-dimensional data, with applications ranging from augmented reality to robotics. Finally, real-time command and control systems are being developed to coordinate complex actions involving multiple agents, as in disaster response or military operations. By continuing to improve these technologies, future research will strive to deliver more innovative and efficient solutions better suited to our increasingly connected world.
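To make the notion of 3D data concrete, the short sketch below converts raw lidar returns (range, azimuth, elevation) into Cartesian XYZ points, the basic representation that downstream perception algorithms consume. The sample values are hypothetical and stand in for a single scan column.

import numpy as np

def ranges_to_points(ranges, azimuths, elevations):
    """Convert lidar returns in spherical form (range r, azimuth az,
    elevation el, angles in radians) into an N x 3 array of XYZ points."""
    r = np.asarray(ranges, dtype=float)
    az = np.asarray(azimuths, dtype=float)
    el = np.asarray(elevations, dtype=float)
    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = r * np.sin(el)
    return np.stack([x, y, z], axis=-1)

# Hypothetical returns from a single scan column
points = ranges_to_points(
    ranges=[5.0, 5.1, 5.3],
    azimuths=np.deg2rad([0.0, 0.2, 0.4]),
    elevations=np.deg2rad([-1.0, 0.0, 1.0]),
)
print(points.shape)  # (3, 3)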

Real-Time 3D Perception

This project involves the development of a real-time 3D perception system using an Ouster 64-beam lidar, a DJI Matrice UAV, and an NVIDIA Jetson for hardware-accelerated edge computing. The Ouster lidar provides high-resolution 3D point clouds for environment mapping and object detection. The DJI Matrice UAV carries the sensor safely and accurately through environments that may be hazardous or difficult to reach on foot. Finally, the NVIDIA Jetson TX1 or Xavier processor handles rapid data processing and inference, enabling faster and more accurate decisions in challenging environments.
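As a rough sketch of what one onboard processing step might look like (not the project's actual pipeline), the code below voxel-downsamples a lidar point cloud to lighten the load on the embedded processor and then flags points above an assumed ground plane as potential obstacles. The frame size, voxel size, and height threshold are illustrative assumptions.

import numpy as np

def voxel_downsample(points, voxel_size=0.25):
    """Keep one representative point per voxel (the first point seen)
    to reduce the number of points the embedded processor must handle."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, first_idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(first_idx)]

def flag_obstacles(points, ground_z=0.0, min_height=0.3):
    """Very coarse obstacle test: anything sufficiently above the
    assumed ground plane is treated as a potential obstacle."""
    return points[points[:, 2] > ground_z + min_height]

# Hypothetical frame: random points standing in for one lidar scan.
frame = np.random.uniform(low=[-20.0, -20.0, -0.1],
                          high=[20.0, 20.0, 3.0],
                          size=(65536, 3))
downsampled = voxel_downsample(frame, voxel_size=0.25)
obstacles = flag_obstacles(downsampled, ground_z=0.0, min_height=0.3)
print(f"{len(frame)} raw points -> {len(downsampled)} after downsampling, "
      f"{len(obstacles)} flagged as potential obstacles")

A production pipeline would replace the height threshold with learned or geometric object detection, but the downsample-then-detect structure reflects the latency constraints of edge hardware like the Jetson.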

Ultrasound Transducers