Today's robots that navigate their environments autonomously rely on lasers, radar, infrared, or some combination of these technologies to gauge distance, recognize objects, and avoid collisions. Google uses laser-based LIDAR on its self-driving car to sense objects. Its system can accurately pick up a pedestrian crossing the street 100 meters away, which is a real feat, but it also reportedly costs $60,000. And that's just one of the sensing technologies Google employs on its cars. Some drones use infrared, a thermal sensing technology that's used in night vision, to perform collision avoidance. But there's a problem: IR sensors don't work well in daylight, which is a pretty significant limitation when you're talking about drones. The problem is that refined versions of those sensing technologies are very expensive. And robots that do use electronic vision to sense their environment typically have only one functional eye in any given scenario, meaning depth and distance can't be gauged from visual input alone.

The Stereolabs ZED Camera is billed as "The World's First High Definition 3D Camera for Depth Sensing". One of the main applications for the ZED is robotic vision, which is a perfect complement to the Jetson Tegras. Here's a brief install and demonstration video on both the Jetson TK1 and Jetson TX1 Development Kits.

The ZED camera is a stereoscopic imaging camera which contains two high definition imagers, one mounted on the left and the other on the right side of the camera enclosure. The camera provides a video stream, each frame of which consists of a side-by-side composite of an image from each imager. The video stream is sent over USB 3.0 to a host, where each frame is converted to a depth map using the host GPU. The Stereolabs SDK uses the geometry of the fixed distance between the imaging elements, together with the known field of view of the imagers, to calculate an accurate depth map. The ZED can sense depth between 1 and 20 meters.

Installation on either of the Jetson Development Kits is straightforward. The Stereolabs SDK requires that CUDA be installed on the Jetson and that USB 3.0 be enabled on the TK1; on the TX1, USB 3.0 is enabled by default. To get the ZED SDK, go to the Stereolabs Developers website, scroll to the ZED SDK for Jetson section, and download the appropriate version for the Jetson being used. After downloading, in a Terminal switch over to the directory where you downloaded the SDK, then 'chmod +x' the downloaded file and execute it from the terminal. The installer places the SDK files in /usr/local/zed. There are a couple of useful tools in the 'tools' folder, and some sample applications in the 'sample' folder.

Note: More detailed and specific directions for installation of the SDK are available from Getting Started with Jetson TK1 and the ZED.

The video shows the difference in performance of the Jetson TK1 versus the Jetson TX1. While the Jetson TK1 runs the depth map application at about 8 frames per second, the Jetson TX1 runs the same program at around 18 frames per second. In the video, you can also see that there are some issues with the video when using the explorer program.

Just having opened the box and connected the camera, I don't know what type of performance to expect on either platform, but this is certainly something to start building on.
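Since each frame arrives as a single side-by-side composite, the left and right views can be recovered by splitting the frame down the middle. Here's a minimal NumPy sketch; the 2560x720 frame size is just an illustrative example, as the actual dimensions depend on the video mode selected:

```python
import numpy as np

def split_stereo_frame(frame):
    """Return the (left, right) halves of a side-by-side stereo frame."""
    height, width, channels = frame.shape
    half = width // 2
    return frame[:, :half], frame[:, half:]

# Dummy composite frame: two 1280x720 views packed side by side.
composite = np.zeros((720, 2560, 3), dtype=np.uint8)
left, right = split_stereo_frame(composite)
print(left.shape, right.shape)   # (720, 1280, 3) (720, 1280, 3)
```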
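The depth calculation boils down to stereo triangulation: given the baseline B between the two imagers and the focal length f in pixels, a point that appears with a disparity of d pixels between the left and right images lies at depth Z = f * B / d. A toy sketch of that relationship follows; the 0.12 m baseline and 700-pixel focal length are illustrative assumptions, not the SDK's calibrated values:

```python
# Toy stereo triangulation: Z = f * B / d.
# baseline_m and focal_px below are illustrative assumptions,
# not the ZED's actual calibrated parameters.

def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.12):
    """Depth in meters for a given pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Larger disparity means closer; as disparity shrinks, depth grows,
# which is why range is bounded (roughly 1 to 20 m for the ZED).
print(round(depth_from_disparity(84.0), 2))   # near end: 1.0 m
print(round(depth_from_disparity(4.2), 2))    # far end: 20.0 m
```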
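The install steps described above amount to a couple of terminal commands. The installer filename below is a placeholder; substitute the actual file you downloaded for your Jetson:

```shell
# Switch to wherever the SDK installer was downloaded.
cd ~/Downloads
# Placeholder name; use the actual installer file you downloaded.
chmod +x ZED_SDK_for_Jetson.run
./ZED_SDK_for_Jetson.run
# The installer places the SDK files in /usr/local/zed
```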