Lidar – NVIDIA Technical Blog
News and tutorials for developers, data scientists, and IT admins

Sensing New Frontiers with Neural Lidar Fields for Autonomous Vehicle Simulation
Zan Gojcic | Published 2023-07-27 | http://www.open-lab.net/blog/?p=68457
[Image: A still view of a lidar point cloud in a driving scene generated by neural lidar fields]

Autonomous vehicle (AV) development requires massive amounts of sensor data for perception development. Developers typically get this data from two sources: replay streams of real-world drives, or simulation. However, real-world datasets offer limited flexibility, as the data is fixed to only the objects, events, and view angles captured by the physical sensors. It is also difficult to simulate…

Source

Create High-Quality Computer Vision Applications with Superb AI Suite and NVIDIA TAO Toolkit
Tyler McKean | Published 2023-06-12 | http://www.open-lab.net/blog/?p=66232
[Image: Tennis game]

Data labeling and model training are consistently ranked as the most significant challenges teams face when building an AI/ML infrastructure. Both are essential steps in the ML application development process, and if not done correctly, they can lead to inaccurate results and decreased performance. See the AI Infrastructure Ecosystem of 2022 report from the AI Infrastructure Alliance for more…

Source

Build High Performance Robotic Applications with NVIDIA Isaac ROS Developer Preview 3
Suhas Hariharapura Sheshadri (https://www.linkedin.com/in/suhassheshadri/) | Published 2023-04-18 | http://www.open-lab.net/blog/?p=63440
[Image: Robots in manufacturing facility]

Robots are increasing in complexity, with a higher degree of autonomy, a greater number and diversity of sensors, and more sensor fusion-based algorithms. Hardware acceleration is essential to run these increasingly complex workloads, enabling robotics applications that can run larger workloads with more speed and power efficiency. The mission of NVIDIA Isaac ROS has always been to empower…

Source

NVIDIA Jetson Project of the Month: An AI-Powered Autonomous Miniature Race Car Gets on Track
Jason Black | Published 2023-01-25 | http://www.open-lab.net/blog/?p=60024

The 65th annual Daytona 500 will take place on February 19, 2023, and for many, this elite NASCAR event is the pinnacle of the car racing world. For now, there are no plans to see an autonomous vehicle racing against cars with drivers, but it's not too hard to imagine that scenario at a future race. At CES in early January, there was a competition to test the best autonomous racing vehicles.

Source

Detecting Objects in Point Clouds Using ROS 2 and TAO-PointPillars
Asawaree Bhide | Published 2022-09-30 | http://www.open-lab.net/blog/?p=55668

Accurate, fast object detection is an important task in robotic navigation and collision avoidance. Autonomous agents need a clear map of their surroundings to navigate to their destination while avoiding collisions. For example, in warehouses that use autonomous mobile robots (AMRs) to transport objects, avoiding hazardous machines that could potentially damage robots has become a challenging…
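The core idea behind PointPillars-style detectors like TAO-PointPillars is to bin raw lidar points into vertical columns ("pillars") on a fixed x-y grid before encoding them into features. A minimal sketch of that grouping step, assuming illustrative grid parameters rather than TAO-PointPillars defaults:

```python
# Sketch of the "pillar" grouping idea behind PointPillars-style detectors:
# points are binned into vertical columns on a fixed x-y grid; a network
# later encodes each pillar's points into features. The pillar size here
# is an illustrative choice, not a TAO-PointPillars default.
from collections import defaultdict

def group_into_pillars(points, pillar_size=0.16):
    """points: iterable of (x, y, z) tuples -> dict mapping grid cell to its points."""
    pillars = defaultdict(list)
    for x, y, z in points:
        cell = (int(x // pillar_size), int(y // pillar_size))
        pillars[cell].append((x, y, z))
    return pillars

# Two nearby points fall into the same pillar; the third gets its own cell.
points = [(0.05, 0.05, 0.2), (0.10, 0.06, 1.1), (1.00, 1.00, 0.0)]
pillars = group_into_pillars(points)
```

Because the grid is 2D, pillar features can be scattered back into a dense pseudo-image and processed with ordinary 2D convolutions, which is what makes this family of detectors fast.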

Source

Detecting Objects in Point Clouds with NVIDIA CUDA-Pointpillars
Lei Fan | Published 2022-01-11 | http://www.open-lab.net/blog/?p=42822

A point cloud is a data set of points in a coordinate system. Points contain a wealth of information, including three-dimensional coordinates X, Y, Z; color; classification value; intensity value; and time. Point clouds mostly come from lidars that are commonly used in various NVIDIA Jetson use cases, such as autonomous machines, perception modules, and 3D modeling. One of the key…
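The point record described above can be sketched as a simple data structure. The field names here are illustrative of a generic lidar format, not a specific NVIDIA API:

```python
# Minimal sketch of a lidar point record; field names are illustrative.
from dataclasses import dataclass

@dataclass
class LidarPoint:
    x: float             # 3D coordinates, meters
    y: float
    z: float
    intensity: float     # return-signal strength
    classification: int  # e.g. ground, vehicle, vegetation
    timestamp: float     # capture time, seconds

# A point cloud is then just a collection of such records.
cloud = [
    LidarPoint(1.2, 0.5, -0.1, 0.8, 1, 0.00),
    LidarPoint(4.7, -2.3, 0.4, 0.3, 2, 0.01),
]
# Example query: the farthest return from the sensor origin.
max_range = max((p.x**2 + p.y**2 + p.z**2) ** 0.5 for p in cloud)
```

In practice the same fields are packed into flat arrays (one per attribute or one interleaved buffer) so they can be transferred to the GPU efficiently.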

Source

Webinar: Learn How NVIDIA DriveWorks Gets to the Point with Lidar Sensor Processing
Katie Washabaugh | Published 2021-08-12 | http://www.open-lab.net/blog/?p=36199

With NVIDIA DriveWorks SDK, autonomous vehicles can bring their understanding of the world to a new dimension. The SDK enables autonomous vehicle developers to easily process three-dimensional lidar data and apply it to specific tasks, such as perception or localization. You can learn how to implement this critical toolkit in our expert-led webinar, Point Cloud Processing on DriveWorks, Aug. 25.

Source

Accelerating Lidar for Robotics with NVIDIA CUDA-based PCL
Lei Fan | Published 2021-02-01 | http://www.open-lab.net/blog/?p=23786

Many Jetson users choose lidars as their major sensors for localization and perception in autonomous solutions. Lidars describe the spatial environment around the vehicle as a collection of three-dimensional points known as a point cloud. Point clouds sample the surfaces of surrounding objects at long range and with high precision, which makes them well suited for higher-level obstacle perception…
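A classic PCL-style operation on such clouds, and the kind of filter a CUDA-accelerated library speeds up, is the voxel-grid downsample: all points falling in a voxel are replaced by their centroid. A CPU-only sketch, with an illustrative voxel size:

```python
# CPU sketch of a voxel-grid downsample: bin points into cubic voxels
# and keep one centroid per occupied voxel. The voxel size is an
# illustrative choice; GPU libraries parallelize this binning step.
from collections import defaultdict

def voxel_downsample(points, voxel=0.5):
    """points: list of (x, y, z) tuples -> one centroid per occupied voxel."""
    bins = defaultdict(list)
    for p in points:
        key = tuple(int(c // voxel) for c in p)
        bins[key].append(p)
    return [
        tuple(sum(coord) / len(pts) for coord in zip(*pts))
        for pts in bins.values()
    ]

# The first two points share a voxel and merge; the third survives alone.
cloud = [(0.1, 0.1, 0.0), (0.2, 0.2, 0.0), (2.0, 2.0, 0.0)]
downsampled = voxel_downsample(cloud)
```

Downsampling like this is usually the first stage of a lidar pipeline, since it bounds the point count before registration or segmentation runs.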

Source

Autonomous Vehicle Radar Perception in 360 Degrees
Shane Murray | Published 2019-11-27 | http://www.open-lab.net/blog/?p=15945

Our radar perception pipeline delivers 360-degree surround perception around the vehicle, using production-grade radar sensors operating at the 77GHz automotive microwave band. Signals transmitted and received at microwave wavelengths are resistant to impairment from poor weather (such as rain, snow, and fog), and active ranging sensors do not suffer reduced performance at night…

Source

Point Cloud Processing with NVIDIA DriveWorks SDK
Hope Allen | Published 2019-05-28 | http://www.open-lab.net/blog/?p=14504

The NVIDIA DriveWorks SDK contains a collection of CUDA-based, low-level point cloud processing modules optimized for NVIDIA DRIVE AGX platforms. The DriveWorks Point Cloud Processing modules include common algorithms that any AV developer working with point cloud representations would need, such as accumulation and registration. Figure 1 shows NVIDIA test vehicles outfitted with lidar.
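The accumulation step mentioned above can be sketched independently of any SDK: successive lidar sweeps are transformed by the ego pose at capture time and merged into one cloud in a common frame. This sketch uses simple 2D (x, y, yaw) poses for illustration; a real stack like DriveWorks works with full 6-DoF transforms:

```python
# Sketch of point-cloud accumulation: rotate and translate each sweep by
# its ego pose, then merge into a single cloud in the common frame.
# 2D poses are used here purely for illustration.
import math

def accumulate(sweeps):
    """sweeps: list of (pose, points); pose = (tx, ty, yaw_radians)."""
    merged = []
    for (tx, ty, yaw), points in sweeps:
        c, s = math.cos(yaw), math.sin(yaw)
        for x, y in points:
            merged.append((c * x - s * y + tx, s * x + c * y + ty))
    return merged

# The same landmark observed from two poses lands at two world positions.
sweeps = [
    ((0.0, 0.0, 0.0), [(1.0, 0.0)]),
    ((2.0, 0.0, math.pi / 2), [(1.0, 0.0)]),
]
merged = accumulate(sweeps)
```

Registration then goes the other way: given two overlapping clouds, it estimates the transform between them, typically by iteratively matching nearest points.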

Source
