At CES 2025, NVIDIA announced key updates to NVIDIA Isaac, a platform of accelerated libraries, application frameworks, and AI models that speeds the development of AI robots. NVIDIA Isaac streamlines the development of robotic systems from simulation to real-world deployment. In this post, we discuss the new advances in NVIDIA Isaac: NVIDIA Isaac Sim is a reference…
Training physical AI models used to power autonomous machines, such as robots and autonomous vehicles, requires huge amounts of data. Acquiring large sets of diverse training data can be difficult, time-consuming, and expensive. Data is often limited due to privacy restrictions or concerns, or simply may not exist for novel use cases. In addition, the available data may not apply to the full range…
Programming robots for real-world success requires a training process that accounts for unpredictable conditions, different surfaces, variations in object size, shape, and texture, and more. Consequently, physically accurate simulations are vital for training AI-enabled robots before deployment. Crafting physically accurate simulations requires advanced programming skills to fine-tune algorithms…
Physical AI-powered robots need to autonomously sense, plan, and perform complex tasks in the physical world. These include transporting and manipulating objects safely and efficiently in dynamic and unpredictable environments. Robot simulation enables developers to train, simulate, and validate these advanced systems through virtual robot learning and testing. It all happens in physics…
The era of AI robots powered by physical AI has arrived. Physical AI models understand their environments and autonomously complete complex tasks in the physical world. Many of the complex tasks—like dexterous manipulation and humanoid locomotion across rough terrain—are too difficult to program and rely on generative physical AI models trained using reinforcement learning (RL) in simulation.
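Skills like these are learned by trial and error in a simulator rather than hand-coded. As an illustrative sketch of that RL pattern (not NVIDIA's tooling — the corridor environment, rewards, and hyperparameters below are invented), a tabular Q-learning loop on a toy 1-D "simulator" shows the core idea:

```python
import random

# Toy stand-in for a physics simulator: a 1-D corridor of 5 states.
# The "robot" starts at state 0 and is rewarded for reaching state 4.
N_STATES, GOAL, ACTIONS = 5, 4, (-1, +1)  # action 0: step left, action 1: step right

def step(state, action):
    next_state = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if next_state == GOAL else 0.0
    return next_state, reward, next_state == GOAL

def train(episodes=500, alpha=0.5, gamma=0.9, eps=0.3, seed=0):
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(N_STATES)]  # Q[state][action]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # Epsilon-greedy: explore randomly sometimes, otherwise act greedily.
            a = rng.randrange(2) if rng.random() < eps else max((0, 1), key=lambda i: q[s][i])
            s2, r, done = step(s, ACTIONS[a])
            # Q-learning update toward the bootstrapped target.
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = train()
policy = [max((0, 1), key=lambda i: q[s][i]) for s in range(N_STATES)]
print(policy)  # states 0-3 all choose action 1 (step toward the goal)
```

Real robot training replaces the corridor with a physics simulator and the table with a neural network policy, but the sense-act-update loop is the same.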
Smart cities are the future of urban living. Yet they can present various challenges for city planners, most notably in the realm of transportation. To be successful, various aspects of the city—from environment and infrastructure to business and education—must be functionally integrated. This can be difficult, as managing traffic flow alone is a complex problem full of challenges such as…
According to the American Society for Quality (ASQ), defects cost manufacturers nearly 20% of overall sales revenue. The products that we interact with on a daily basis—like phones, cars, televisions, and computers—must be manufactured with precision so that they can deliver value in varying conditions and scenarios. AI-based computer vision applications are helping to catch defects in the…
Building AI models from scratch is incredibly difficult, requiring mountains of data and an army of data scientists. With the NVIDIA TAO Toolkit, you can use the power of transfer learning to fine-tune NVIDIA pretrained models with your own data and optimize them for inference—without AI expertise or large training datasets. You can now experience the TAO Toolkit through NVIDIA LaunchPad…
Today, NVIDIA announced the general availability of the latest version of the TAO Toolkit. As a low-code version of the NVIDIA Train, Adapt, and Optimize (TAO) framework, the toolkit simplifies and accelerates the creation of AI models for speech and vision AI applications. With TAO, developers can use the power of transfer learning to create production-ready models customized and optimized…
AI applications are powered by models. Deep learning models are built on mathematical algorithms and trained using data and human expertise. These models can accurately predict outcomes based on input data such as images, text, or speech. Building, training, and optimizing these models is critical and time-intensive. Domain expertise and countless hours of computation are needed to…
All AI applications are powered by models. Models can help spot defects in parts, detect the early onset of disease, translate languages, and much more. But building custom models for a specific use requires mountains of data and an army of data scientists. NVIDIA TAO, an AI-model-adaptation framework, simplifies and accelerates the creation of AI models. By fine-tuning state-of-the-art…
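TAO's own interface isn't reproduced here; as a hedged sketch of the transfer-learning idea it builds on — freeze a pretrained backbone and train only a small task head on your data — consider this toy example (the backbone function, dataset, and hyperparameters are all invented for illustration):

```python
import math

# A stand-in for a frozen pretrained backbone: a fixed feature map.
# (In a real workflow this would be a pretrained network whose weights
# are not updated; only the small head below is trained.)
def backbone(x):
    return [x[0], x[1], x[0] * x[1]]  # fixed features, never trained

def train_head(data, epochs=200, lr=0.5):
    """Fit only a logistic-regression head on top of the frozen features."""
    w, b = [0.0, 0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            f = backbone(x)
            p = 1.0 / (1.0 + math.exp(-(sum(wi * fi for wi, fi in zip(w, f)) + b)))
            g = p - y  # gradient of the log loss w.r.t. the logit
            w = [wi - lr * g * fi for wi, fi in zip(w, f)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    f = backbone(x)
    return 1 if sum(wi * fi for wi, fi in zip(w, f)) + b > 0 else 0

# Tiny labeled dataset for the new task (XOR-like, separable in feature space).
data = [((1, 1), 1), ((-1, -1), 1), ((1, -1), 0), ((-1, 1), 0)]
w, b = train_head(data)
print([predict(w, b, x) for x, _ in data])  # matches the labels [1, 1, 0, 0]
```

Because the backbone already encodes useful structure, only a handful of labeled examples and a few parameters need training — the reason transfer learning works without large datasets.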
Molecular simulation communities have faced the accuracy-versus-efficiency dilemma in modeling the potential energy surface and interatomic forces for decades. Deep Potential, the artificial neural network force field, solves this problem by combining the speed of classical molecular dynamics (MD) simulation with the accuracy of density functional theory (DFT) calculation.1 This is achieved by…
The NVIDIA NGC catalog is a hub of GPU-optimized deep learning, machine learning, and HPC applications. With highly performant software containers, pretrained models, industry-specific SDKs, and Helm charts, you can simplify and accelerate your end-to-end workflows. The NVIDIA NGC team works closely with our internal and external partners to update the content in the catalog on a regular basis.
Building a state-of-the-art deep learning model is a complex and time-consuming process. To achieve this, the large datasets collected for the model must be of high quality. Once the data is collected, it must be prepared, and the model must then be trained and optimized over several iterations. This is not always an option for many enterprises looking to bring their AI applications to market faster while reducing…
The NVIDIA NGC team is hosting a webinar with live Q&A to dive into our new Jupyter notebook, available from the NGC catalog. Learn how to use these resources to kickstart your AI journey. NVIDIA NGC Jupyter Notebook Day: Building a 3D Medical Imaging Segmentation Model, Thursday, July 22 at 9:00 AM PT. Image segmentation deals with placing each pixel (or voxel in the case of 3D) of an…
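Voxel-wise segmentation quality is commonly scored with the Dice coefficient, which compares predicted and ground-truth masks. A minimal sketch in plain Python on tiny nested-list volumes (not the notebook's actual code):

```python
def dice(pred, truth):
    """Dice coefficient between two binary 3D voxel volumes (nested lists)."""
    flat = lambda v: [x for plane in v for row in plane for x in row]
    p, t = flat(pred), flat(truth)
    inter = sum(a * b for a, b in zip(p, t))   # voxels marked 1 in both masks
    denom = sum(p) + sum(t)
    return 1.0 if denom == 0 else 2.0 * inter / denom

# Two 2x2x2 volumes: each mask marks 4 voxels, 3 of which overlap.
pred  = [[[1, 1], [1, 0]], [[1, 0], [0, 0]]]
truth = [[[1, 1], [0, 0]], [[1, 0], [1, 0]]]
print(dice(pred, truth))  # 2*3 / (4 + 4) = 0.75
```

Real 3D medical pipelines compute the same score over large tensor volumes (and often optimize a differentiable Dice loss during training), but the metric itself is this simple.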
Deep learning research requires working at scale. Training on massive datasets or multilayered deep networks is computationally intensive and, because deep learning models are memory-bound, can take an impractically long time. The key is to compose deep learning models in a structured way so that they are decoupled from the engineering and the data, enabling researchers to conduct fast research.
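One common way to achieve that decoupling, in the style popularized by frameworks such as PyTorch Lightning, is to keep the model's math in one class and the training-loop engineering in another. This skeleton is an illustrative sketch of the pattern, not any framework's real API:

```python
class Model:
    """Research code: defines only what to compute, not how to run it."""
    def __init__(self):
        self.w = 0.0
    def training_step(self, batch):
        x, y = batch
        pred = self.w * x
        loss = (pred - y) ** 2          # squared error for one sample
        grad = 2 * (pred - y) * x       # its gradient w.r.t. w
        return loss, grad
    def apply_gradient(self, grad, lr):
        self.w -= lr * grad

class Trainer:
    """Engineering code: looping, scheduling, and (in real frameworks)
    devices, checkpointing, and multi-GPU scaling."""
    def __init__(self, epochs=100, lr=0.05):
        self.epochs, self.lr = epochs, lr
    def fit(self, model, dataset):
        for _ in range(self.epochs):
            for batch in dataset:
                loss, grad = model.training_step(batch)
                model.apply_gradient(grad, self.lr)
        return model

# Fit y = 3x; the model class never touches loop or schedule details.
data = [(1.0, 3.0), (2.0, 6.0), (-1.0, -3.0)]
model = Trainer().fit(Model(), data)
print(round(model.w, 2))  # ≈ 3.0
```

Because the `Trainer` owns the engineering, swapping in a bigger model or scaling to more hardware changes the research code not at all — the decoupling the paragraph above describes.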
Data is the backbone of building state-of-the-art, accurate AI models, and easy access to high-quality datasets can reduce overall development time significantly. However, the required data may be siloed, may come from different sources (for example, sensors, images, or documents), and may be in structured as well as unstructured formats. Manually moving and transforming data from different…
Announcing the availability of MATLAB R2021a on the NGC catalog, NVIDIA's hub of GPU-optimized AI and HPC software. The latest version provides full support for running deep learning, automotive, and scientific analysis on NVIDIA Ampere GPUs. In addition to supporting Ampere GPUs, the latest version also includes the following features and benefits: Download the MATLAB R2021a…
A container is a portable unit of software that combines the application and all its dependencies into a single package that is agnostic to the underlying host OS. In a high-performance computing (HPC) environment, containers remove the need for building complex environments or maintaining environment modules, making it easy for researchers and systems administrators to deploy their HPC…
The NVIDIA NGC catalog, a GPU-optimized hub for HPC, ML, and AI applications, now has a new look and we're really excited about it! Over the past few months, we've been working with our community, design, and research teams to bring you an enhanced user experience, engineered to deliver the most relevant content and features faster than ever before. The new user interface…
AI workflows are complex. Building an AI application is no trivial task, as it takes various stakeholders with domain expertise to develop and deploy the application at scale. Data scientists and developers need easy access to software building blocks, such as models and containers, that are not only secure and highly performant but also have the necessary underlying architecture to build their…
HPC development environments are typically complex configurations composed of multiple software packages, each providing unique capabilities. In addition to the core set of compilers used for building software from source code, they often include a number of specialty packages covering a broad range of operations such as communications, data structures, mathematics, I/O control…
Hospitals today are seeking to overhaul their existing digital infrastructure to improve their internal processes, deliver better patient care, and reduce operational expenses. Such a transition is required if hospitals are to cope with the needs of a burgeoning human population, accumulation of medical patient data, and a pandemic. The goal is not only to digitize existing infrastructure but…
AI is going mainstream and is quickly becoming pervasive in every industry—from autonomous vehicles to drug discovery. However, developing and deploying AI applications is a challenging endeavor. The process requires building a scalable infrastructure by combining hardware, software, and intricate workflows, which can be time-consuming as well as error-prone. To accelerate the end-to-end AI…
The MLPerf consortium's mission is to "build fair and useful benchmarks" that provide an unbiased training and inference performance reference for ML hardware, software, and services. MLPerf Training v0.7 is the third training round and continues to evolve to stay on the cutting edge. This round consists of eight different workloads that cover a broad diversity of use cases…
Many system administrators use environment modules to manage software deployments. The advantages of environment modules are that they allow you to load and unload software configurations dynamically in a clean fashion, providing end users with the best experience when it comes to customizing a specific configuration for each application. However, robustly supporting HPC and deep learning…
HPC applications are critical to solving the biggest computational challenges in scientific research. There is a constant need to drive efficiencies in hardware and software stacks to run larger scientific models and speed up simulations. High communication costs between GPUs prevent you from maximizing performance from your existing hardware. To address this, we've built NVIDIA…
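The workhorse collective behind multi-GPU communication libraries is all-reduce, often implemented as a ring to keep per-link traffic constant as GPU count grows. A pure-Python simulation of ring all-reduce's data movement (a sketch of the algorithm family such libraries use, not any library's actual implementation; for simplicity it assumes each rank's vector length equals the number of ranks):

```python
def ring_allreduce(data):
    """Simulate a ring all-reduce: every rank ends with the elementwise sum.

    Phase 1 (reduce-scatter): each rank passes one chunk per step to its
    ring neighbor, accumulating partial sums; after n-1 steps, rank r owns
    the fully reduced chunk (r+1) % n.
    Phase 2 (all-gather): the reduced chunks circulate for n-1 more steps
    until every rank holds the complete summed vector.
    """
    n = len(data)                      # n ranks, each with a length-n vector
    chunks = [list(v) for v in data]
    for t in range(n - 1):             # reduce-scatter
        sent = [chunks[r][(r - t) % n] for r in range(n)]   # snapshot per step
        for r in range(n):
            chunks[r][(r - 1 - t) % n] += sent[(r - 1) % n]
    for t in range(n - 1):             # all-gather
        sent = [chunks[r][(r + 1 - t) % n] for r in range(n)]
        for r in range(n):
            chunks[r][(r - t) % n] = sent[(r - 1) % n]
    return chunks

# 3 "GPUs", each holding a 3-element gradient; every rank ends with the
# elementwise sum [12, 15, 18].
out = ring_allreduce([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
print(out)
```

Each rank sends and receives only one chunk per step, so total bytes moved per link stay bounded regardless of how many GPUs join the ring — the property that makes this pattern attractive when inter-GPU bandwidth is the bottleneck.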
AI is moving from research to production, and enterprises are embracing its power to develop and deploy applications across a wide variety of domains, including retail analytics, medical imaging, autonomous driving, and smart manufacturing. However, developing and deploying open source AI software comes with its own challenges. Data scientists need optimized software, developers need the right…