Kit SDK 107.0 is a major release focused primarily on robotics development.
With emerging use cases such as digital humans, agents, podcasts, images, and video generation, generative AI is changing the way we interact with PCs. This paradigm shift calls for new ways of interfacing with and programming generative AI models. However, getting started can be daunting for PC developers and AI enthusiasts. Today, NVIDIA released a suite of NVIDIA NIM microservices on…
The next generation of AI-driven robots like humanoids and autonomous vehicles depends on high-fidelity, physics-aware training data. Without diverse and representative datasets, these systems are not properly trained and face testing risks due to poor generalization, limited exposure to real-world variations, and unpredictable behavior in edge cases. Collecting massive real-world datasets for…
AI is transforming how we experience our favorite games. It is unlocking new levels of visuals, performance, and gameplay possibilities with neural rendering and generative AI-powered characters. With game development becoming more complex, AI is also playing a role in helping artists and engineers realize their creative visions. At GDC 2025, NVIDIA is building upon NVIDIA RTX Kit…
The release of NVIDIA Video Codec SDK 13.0 marks a significant upgrade, adding support for the latest-generation NVIDIA Blackwell GPUs. This version brings a wealth of improvements aimed at elevating both video encoding and decoding capabilities. From enhanced compression efficiency to better throughput and encoding quality, SDK 13.0 addresses the ever-evolving demands of the video ecosystem.
NVIDIA announces the implementation of a Multi-View High Efficiency Video Coding (MV-HEVC) encoder in the latest NVIDIA Video Codec SDK release, version 13.0. This significant update marks a major leap forward in hardware-accelerated, multi-view video compression. It offers enhanced compression efficiency and quality for stereoscopic and 3D video applications compared to simulcast encoding.
Hardware support for ray tracing triangle meshes was introduced as part of NVIDIA RTX in 2018. But ray tracing for hair and fur has remained a compute-intensive problem that has been difficult to further accelerate. That is, until now. NVIDIA GeForce 50 Series GPUs include a major advancement in the acceleration of ray tracing for hair and fur: hardware ray tracing support for the linear…
Neural rendering is the next era of computer graphics. By integrating neural networks into the rendering process, we can take dramatic leaps forward in performance, image quality, and interactivity to deliver new levels of immersion. NVIDIA RTX Kit is a suite of neural rendering technologies to ray-trace games with AI, render scenes with immense geometry, and create game characters with…
Geometric detail in computer graphics has increased exponentially in the past 30 years. To render high-quality assets with higher instance counts and greater triangle density, NVIDIA introduced RTX Mega Geometry. RTX Mega Geometry is available today through NVIDIA RTX Kit, a suite of rendering technologies to ray trace games with AI, render scenes with immense geometry, and create game characters…
The next generation of NVIDIA graphics hardware has arrived. Powered by NVIDIA Blackwell, GeForce RTX 50 Series GPUs deliver groundbreaking new RTX features such as DLSS 4 with Multi Frame Generation, and NVIDIA RTX Kit with RTX Mega Geometry and RTX Neural Shaders. The NVIDIA RTX Blackwell architecture introduces fifth-generation Tensor Cores to drive AI workloads and fourth-generation RT Cores with…
NVIDIA DLSS 4 is the latest iteration of DLSS, introduced with the NVIDIA GeForce RTX 50 Series GPUs. It includes several new features. Here's how you can get started with DLSS 4 in your integrations. This post focuses on the Streamline SDK, which provides a plug-and-play framework for simplified plugin integration. The NVIDIA Streamline SDK is an open-source framework that…
Take the three self-paced courses at no cost through the NVIDIA Deep Learning Institute (DLI).
Tune in January 16 at 9:00 AM PT for a live recap of the latest developer announcements at CES 2025, followed by a Q&A.
NVIDIA today unveiled next-generation hardware for gamers, creators, and developers—the GeForce RTX 50 Series desktop and laptop GPUs. Alongside these GPUs, NVIDIA introduced NVIDIA RTX Kit, a suite of neural rendering technologies to ray trace games with AI, render scenes with immense geometry, and create game characters with lifelike visuals. RTX Kit enhances geometry, textures, materials…
Grab your copy of GPU Zen 3 to learn about the latest in real-time rendering.
Filmmaking is an intricate and complex process that involves a diverse team of artists, writers, visual effects professionals, technicians, and countless other specialists. Each member brings their unique expertise to the table, collaborating to transform a simple idea into a captivating cinematic experience. From the initial spark of a story to the final cut, every step requires creativity…
NVIDIA OptiX is the API for GPU-accelerated ray tracing with CUDA, and is often used to render scenes containing a wide variety of objects and materials. During an OptiX launch, when a ray intersects a geometric primitive, a hit shader is executed. The question of which shader is executed for a given intersection is answered by the Shader Binding Table (SBT). The SBT may also be used to map input…
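To make the SBT's role concrete, here is a minimal, hypothetical Python model of the record-selection arithmetic: an instance contributes a base offset, the geometry's index within its acceleration structure is scaled by a stride (commonly the number of ray types), and the trace call contributes a per-ray-type offset. The record names and scene layout are illustrative, not from any real OptiX program; consult the OptiX Programming Guide for the authoritative formula.

```python
def sbt_record_index(instance_sbt_offset, gas_sbt_index, sbt_stride, sbt_offset):
    """Simplified model of hit-group record selection: combine the
    instance's base offset, the geometry's index scaled by the stride,
    and the offset passed at the trace call."""
    return instance_sbt_offset + gas_sbt_index * sbt_stride + sbt_offset

# Hypothetical SBT: 2 geometries x 2 ray types (radiance, shadow).
hit_records = ["mesh0_radiance", "mesh0_shadow", "mesh1_radiance", "mesh1_shadow"]

# A radiance ray (offset 0) hits the second geometry (index 1); stride is 2.
idx = sbt_record_index(instance_sbt_offset=0, gas_sbt_index=1, sbt_stride=2, sbt_offset=0)
print(hit_records[idx])  # mesh1_radiance
```

Changing `sbt_offset` to 1 would select the shadow record for the same geometry, which is how one trace call can pick a different shader set per ray type.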
NVIDIA just announced a series of small language models (SLMs) that increase the amount and type of information digital humans can use to augment their responses. This includes new large-context models that provide more relevant answers and new multi-modal models that allow images as inputs. These models are available now as part of NVIDIA ACE, a suite of digital human technologies that brings…
Meshes are one of the most important and widely used representations of 3D assets. They are the default standard in the film, design, and gaming industries, and they are natively supported by virtually all 3D software and graphics hardware. A 3D mesh can be considered a collection of polygon faces, most commonly consisting of triangles or quadrilaterals.
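The faces-plus-vertices structure described above can be sketched in a few lines of plain Python (no particular SDK assumed): positions live in a vertex list, and each face is a tuple of integer indices into that list.

```python
# A unit quad as a triangle mesh: 4 shared vertices, 2 triangular faces.
vertices = [
    (0.0, 0.0, 0.0),
    (1.0, 0.0, 0.0),
    (1.0, 1.0, 0.0),
    (0.0, 1.0, 0.0),
]
# Each face is a triple of indices into the vertex list; sharing indices
# (rather than duplicating positions) is what makes meshes compact.
faces = [(0, 1, 2), (0, 2, 3)]

def face_corners(face):
    """Resolve a face's integer indices back to 3D positions."""
    return [vertices[i] for i in face]

print(face_corners(faces[1]))  # [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
```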
One of the great pastimes of graphics developers and enthusiasts is comparing specifications of GPUs and marveling at the ever-increasing counts of shader cores, RT cores, teraflops, and overall computational power with each new generation. Achieving the maximum theoretical performance represented by those numbers is a major focus in the world of graphics programming. Massive amounts of rendering…
We are entering a new era of AI-powered digital workflow, where Windows 365 Cloud PCs are dynamic platforms that host AI technologies and reshape traditional processes. GPU acceleration unlocks the potential for AI-augmented workloads running on Windows 365 Cloud PCs, enabling advanced computing capabilities for everyone. The integration of NVIDIA GPUs with NVIDIA RTX Virtual Workstation…
Producing commercials is resource-intensive, requiring physical locations and various props and setups to display products in different settings and environments for more accurate consumer targeting. This traditional process is not only expensive and time-consuming but also can be destructive to the physical environment. It leaves you with no ability to capture a new angle after you return home.
The NVIDIA RTX AI for Windows PCs platform offers a thriving ecosystem of thousands of open-source models for application developers to leverage and integrate into Windows applications. Notably, llama.cpp is one popular tool, with over 65K GitHub stars at the time of writing. Originally released in 2023, this open-source repository is a lightweight, efficient framework for large language model…
Gaming has always pushed the boundaries of graphics hardware. The most popular games typically required robust GPU, CPU, and RAM resources on a user's PC or console—that is, until the advent of GeForce NOW and cloud gaming. Today, with the power of interactive streaming from the cloud, any user on almost any device can play the latest and greatest of today's games. However…
At Unreal Fest 2024, NVIDIA released new Unreal Engine 5 on-device plugins for NVIDIA ACE, making it easier to build and deploy AI-powered MetaHuman characters on Windows PCs. ACE is a suite of digital human technologies that provide speech, intelligence, and animation powered by generative AI. Developers can also now access a new Audio2Face-3D plugin for AI-powered facial animations in…
Accelerate your OpenUSD workflows with this free curriculum for developers and 3D practitioners.
The NVIDIA Maxine AI developer platform is a suite of NVIDIA NIM microservices, cloud-accelerated microservices, and SDKs that offer state-of-the-art features for enhancing real-time video and audio. NVIDIA partners use Maxine features to create better virtual interaction experiences and improve human connections with their applications. Making and maintaining eye contact are rare in virtual…
Today, over 80% of internet traffic is video. This content is generated by and consumed across various devices, including IoT gadgets, smartphones, computers, and TVs. As pixel density and the number of connected devices grow, continued investment in fast, efficient, high-quality video encoding and decoding is essential. The latest NVIDIA data center GPUs, such as the NVIDIA L40S and NVIDIA…
NVIDIA Holoscan for Media is now ready to be used in live production, taking advantage of the best of both networking and GPU technologies. Holoscan for Media is a software-defined, AI-enabled platform that lets live video pipelines run on the same infrastructure as AI. It delivers applications from established and emerging vendors across the ecosystem on repurposable…
Text-to-image diffusion models can generate diverse, high-fidelity images based on user-provided text prompts. They operate by mapping a random sample from a high-dimensional space, conditioned on a user-provided text prompt, through a series of denoising steps. This results in a representation of the corresponding image. These models can also be used for more complex tasks such as image…
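The "random sample through a series of denoising steps" description can be caricatured in a few lines. This toy sketch is not a real diffusion model: a real sampler uses a neural network to predict noise conditioned on the text prompt, whereas here the target is fixed so the sketch stays self-contained. It only shows the shape of the loop: start from noise, repeatedly move the sample toward a cleaner estimate.

```python
import random

def toy_denoise(sample, steps=10):
    """Caricature of a diffusion sampler: each step moves the sample
    partway toward a clean target. A real model would predict the update
    with a prompt-conditioned neural network instead of a fixed target."""
    target = [0.5] * len(sample)  # stand-in for the "clean image"
    for _ in range(steps):
        sample = [s + 0.5 * (t - s) for s, t in zip(sample, target)]
    return sample

random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(4)]  # the initial random sample
image = toy_denoise(noise)
print(all(abs(v - 0.5) < 0.01 for v in image))  # True: converged toward the target
```

Each step halves the distance to the target, so after 10 steps the residual noise has shrunk by a factor of about 1,000, mirroring how iterative denoising gradually reveals an image.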
At Gamescom 2024, NVIDIA announced our first on-device small language model (SLM) for improving the conversation abilities of game characters. We also announced that the first game to showcase NVIDIA ACE and digital human technologies is Amazing Seasun Games' Mecha BREAK, bringing its characters to life and providing a more dynamic and immersive gameplay experience on NVIDIA GeForce RTX AI PCs.
Effective video communication is important for everyone who communicates online. For businesses, educators, and content creators, it is vital. NVIDIA Maxine is a suite of NVIDIA-accelerated SDKs and cloud-native containerized NVIDIA NIM microservices for deploying AI features that enhance real-time audio and video for video conferencing, digital humans, virtual presence, and content creation.
Shaders are specialized programs that run on the GPU and manipulate rays, pixels, vertices, and textures to achieve unique visual effects. With shaders, you can add creative expression and realism to the rendered image. They're essential in ray tracing for simulating realistic lighting, shadows, and reflections. We love shaders, but they can be hard to debug. Shader calculations are complex…
Generative physical AI models can understand and execute actions with fine or gross motor skills within the physical world. Understanding and navigating in the 3D space of the physical world requires spatial intelligence. Achieving spatial intelligence in physical AI involves converting the real world into AI-ready virtual representations that the model can understand.
At SIGGRAPH 2024 this week, NVIDIA is showcasing the latest advancements in the NVIDIA Maxine AI developer platform, available through NVIDIA AI Enterprise. This platform empowers you to deploy cutting-edge AI features that enhance audio and video quality and enable augmented reality effects. NVIDIA just announced the upcoming availability of Maxine Video Relighting for early access…
Recent advancements in generative AI and multi-view reconstruction have introduced new ways to rapidly generate 3D content. However, to be useful for downstream applications like robotics, design, AR/VR, and games, it must be possible to manipulate these 3D models in a physically plausible way. This poses a major challenge to traditional physics simulation algorithms, which were designed to…
Developers from advertising agencies to software vendors are empowering global brands to deliver hyperpersonalization for digital experiences and visual storytelling with product configurator solutions. Integrating NVIDIA Omniverse with OpenUSD and generative AI into product configurators enables solution providers and software developers to deliver interactive, ray-traced…
Complimentary trainings on OpenUSD, digital humans, LLMs, and more with hands-on labs for Full Conference and Experience attendees.
With the rise of chatbots and virtual assistants, customer interactions have evolved to embrace the versatility of voice and text inputs. However, integrating visual and personalized components into these interactions is essential for creating immersive, user-centric experiences. Enter UneeQ, a leading platform known for its creation of lifelike digital characters through AI-powered…
Generative AI, the ability of algorithms to process various types of inputs—such as text, images, audio, video, and code—and generate new content, is advancing at an unprecedented rate. While this technology is making significant strides across multiple industries, one sector that stands to benefit immensely is the Architecture, Engineering, and Construction (AEC) industry.
NVIDIA Video Codec SDK provides a comprehensive set of APIs for hardware-accelerated video encode and decode on Windows and Linux. The 12.2 release improves video quality for high-efficiency video coding (HEVC). It offers a significant reduction in bit rates, particularly for natural video content. This post details the following new features: The lookahead level can help analyze…
NVIDIA Omniverse is a platform that enables you to build applications for complex 3D and industrial digitalization workflows based on Universal Scene Description (OpenUSD). The platform's modular architecture breaks down into core technologies and services, which you can directly integrate into tools and applications, customizing as needed. This approach simplifies integration…
In today's digital age, creating realistic animated characters is crucial for filmmakers, game developers, and content creators looking to bring their visions to life. Reallusion is at the forefront of this cutting-edge art form, using powerful AI technologies like NVIDIA Audio2Face and NVIDIA Maxine to craft lifelike digital humans and character animations. A major challenge exists in…
NVIDIA RTX Video is a collection of AI video enhancements that improve the visual quality of lower-quality video. Originally released as a driver API, NVIDIA RTX Video is now available as an SDK, letting you integrate these effects directly into your own applications. NVIDIA RTX Video Super Resolution upscales video while removing compression artifacts.
NVIDIA ACE—a suite of generative AI-enabled digital human technologies—is now generally available for developers. Packaged as NVIDIA NIM microservices, ACE enables developers to deliver high-quality natural language understanding, speech synthesis, and facial animation for gaming, customer service, healthcare, and more. NVIDIA is also introducing ACE PC NIM microservices for deployment…
AI is rapidly changing industrial visual inspection. In a factory setting, visual inspection is used for many issues, including detecting defects and missing or incorrect parts during assembly. Computer vision can help identify problems with products early on, reducing the chances of them being delivered to customers. However, developing accurate and versatile object detection models remains…
Integrate RTX into your own game and understand what ReSTIR means for the future of real-time lighting in this May 21 webinar.
Professional workflows have become more complex with the increased demand for graphics-intensive scenarios. From regular office applications to demanding manufacturing, architectural, engineering, and media and entertainment applications, GPU-enabled Cloud PCs are enabling professionals to access those applications from anywhere and on any device from the cloud. Windows 365 Cloud PCs are now…
Text-to-image diffusion models have been established as a powerful method for high-fidelity image generation based on given text. Nevertheless, diffusion models do not always grant the desired alignment between the given input text and the generated image, especially for complicated idiosyncratic prompts that are not encountered in real life. Hence, there is growing interest in efficiently fine…
Universal Scene Description, also called OpenUSD or USD, is an open and extensible framework for creating, editing, querying, rendering, collaborating, and simulating within 3D worlds. Invented by Pixar Animation Studios, USD is much more than a file format. It is an ecosystem and interchange paradigm that models, labels, classifies, and combines a wide range of data sources into a composed ground…
NVIDIA Holoscan for Media is now available to all developers looking to build next-generation live media applications on fully repurposable clusters. Holoscan for Media is a software-defined platform for building and deploying applications for live media. It revolutionizes application development by providing an IP-based, cloud-native architecture that isn't constrained by dedicated hardware…
NVIDIA AI Workbench, a toolkit for AI and ML developers, is now generally available as a free download. It features automation that removes roadblocks for novice developers and makes experts more productive. Developers can experience a fast and reliable GPU environment setup and the freedom to work, manage, and collaborate across heterogeneous platforms regardless of skill level.
The union of ray tracing and AI is pushing graphics fidelity and performance to new heights. Helping you build optimized, bug-free applications in this era of rendering technology, the latest release of NVIDIA Nsight Graphics introduces new features for ray tracing development, including tools to help you harness AI acceleration. Check out what's new in the NVIDIA Nsight Graphics 2024.1…
At GDC 2024, NVIDIA announced that leading AI application developers such as Inworld AI are using NVIDIA digital human technologies to accelerate the deployment of generative AI-powered game characters, alongside updated NVIDIA RTX SDKs that simplify the creation of beautiful worlds. You can incorporate the full suite of NVIDIA digital human technologies or individual microservices into…
As ray tracing becomes the predominant rendering technique in modern game engines, a single GPU RayGen shader can now perform most of the light simulation of a frame. To manage this level of complexity, it becomes necessary to observe a decomposition of shader performance at the HLSL or GLSL source-code level. As a result, shader profilers are now a must-have tool for optimizing ray tracing.
NVIDIA Holoscan for Media is a software-defined platform for building and deploying applications for live media. Recent updates introduce a user-friendly developer interface and new capabilities for application deployment to the platform. Holoscan for Media now includes Helm Dashboard, which delivers an intuitive user interface for orchestrating and managing Helm charts.
GPU-driven rendering has long been a major goal for many game applications. It enables better scalability for handling large virtual scenes and reduces cases where the CPU could bottleneck a game's performance. Short of running the game's logic on the GPU, I see the pinnacle of GPU-driven rendering as a scenario in which the CPU sends the GPU only the new frame's camera information…
When it comes to game application performance, GPU-driven rendering enables better scalability for handling large virtual scenes. Direct3D 12 (D3D12) introduces work graphs as a programming paradigm that enables the GPU to generate work for itself on the fly. For an introduction to work graphs, see Advancing GPU-Driven Rendering with Work Graphs in Direct3D 12. This post features a Direct3D…
Learn how AI and NVIDIA Maxine are transforming the video streaming and conferencing industry.
We are so excited to be back in person at GTC this year at the San Jose Convention Center. With thousands of developers, industry leaders, researchers, and partners in attendance, attending GTC in person gives you the unique opportunity to network with legends in technology and AI, and experience NVIDIA CEO Jensen Huang's keynote live on-stage at the SAP Center. Past GTC alumni? Get 40%
Gain a foundational understanding of USD, the open and extensible framework for creating, editing, querying, rendering, collaborating, and simulating within 3D worlds.
Connect with industry leaders, learn from technical experts, and collaborate with peers at NVIDIA GTC 2024 Developer Days.
This post was updated on April 17, 2024. For developers working on ray tracing applications for both DirectX 12 and Vulkan, ray tracing validation is here to help you improve performance, find hard-to-debug issues, and root-cause crashes. Unlike existing debug solutions, ray tracing validation performs checks at the driver level, which enables it to identify potential problems that…
Developers and enterprises can now deploy lifelike virtual and mixed reality experiences with Varjo's latest XR-4 series headsets, which are integrated with NVIDIA technologies. These XR headsets match the resolution that the human eye can see, providing users with realistic visual fidelity and performance. The latest XR-4 series headsets support NVIDIA Omniverse and are powered by NVIDIA…
HOMEE AI, an NVIDIA Inception member based in Taiwan, has developed an "AI-as-a-service" spatial planning solution to disrupt the $650B global home decor market. They're helping furniture makers and home designers find new business opportunities in the era of industrial digitalization. Using NVIDIA Omniverse, the HOMEE AI engineering team developed an enterprise-ready service to deliver…
Many PC games are designed around an eight-core console with an assumption that their software threading system 'just works' on all PCs, especially regarding the number of threads in the worker thread pool. This was a reasonable assumption not too long ago, when most PCs had similar core counts to consoles: the CPUs were just faster and performance just scaled. In recent years, though…
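One common alternative to a hard-coded worker count is to size the pool from the machine actually running the game. This minimal Python sketch illustrates the idea only; the "reserve one core for the main thread" heuristic is a hypothetical example, and real engines tune pool sizing per title and per platform.

```python
import os
from concurrent.futures import ThreadPoolExecutor

# Detect the hardware thread count instead of assuming a console's eight cores.
hw_threads = os.cpu_count() or 8      # fall back if detection fails
workers = max(1, hw_threads - 1)      # illustrative: leave headroom for the main thread

# Run a trivial workload across the sized pool.
with ThreadPoolExecutor(max_workers=workers) as pool:
    results = list(pool.map(lambda n: n * n, range(8)))
print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

On a 6-core laptop this creates 5 workers; on a 32-core desktop, 31 - which is exactly the scaling behavior a fixed thread count fails to provide.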
Join us at the Game Developers Conference March 18-22 to discover how the latest generative AI and NVIDIA RTX technologies are accelerating game development.
On March 19, learn how to build generative AI-enabled 3D pipelines and tools using Universal Scene Description for industrial digitalization.
Railroad simulation is important in modern transportation and logistics, providing a virtual testing ground for the intricate interplay of tracks, switches, and rolling stock. It serves as a crucial tool for engineers and developers to fine-tune and optimize railway systems, ensuring efficiency, safety, and cost-effectiveness. Physically realistic simulations enable comprehensive scenario…
Get up to speed on the current state of ray tracing in the NVIDIA RTX Branch of Unreal Engine and what's coming next.
The NVIDIA Maxine developer platform redefines video conferencing and editing by providing developers and businesses with a variety of low-code implementation options. These include GPU-accelerated AI microservices, SDKs, and NVIDIA-hosted API endpoints for AI enhancement of audio and video streams in real time. The latest Maxine developer platform release introduces early access to Voice…
At CES, NVIDIA shared that SDXL Turbo, LCM-LoRA, and Stable Video Diffusion are all being accelerated by NVIDIA TensorRT. These enhancements allow GeForce RTX GPU owners to generate images in real time and save minutes generating videos, vastly improving workflows. SDXL Turbo achieves state-of-the-art performance with a new distillation technology, enabling single-step image…
NVIDIA is announcing the Generative AI on RTX PCs Developer Contest, designed to inspire innovation within the developer community. Build and submit your next innovative generative AI project on a Windows PC with an RTX system, and you could win an RTX 4090 GPU, a full GTC in-person conference pass, and more great prizes.
Generative AI technologies are revolutionizing how games are produced and played. Game developers are exploring how these technologies can accelerate their content pipelines and provide new gameplay experiences previously thought impossible. One area of focus, digital avatars, will have a transformative impact on how gamers will interact with non-playable characters (NPCs). Historically…
Convai is a versatile developer platform for designing characters with advanced multimodal perception abilities. These characters are designed to integrate seamlessly into both the virtual and real worlds. Whether you're a creator, game designer, or developer, Convai enables you to quickly modify a non-playable character (NPC), from backstory and knowledge to voice and personality.
Capturing video footage and playing games at 8K resolution with 60 frames per second (FPS) is now possible, thanks to advances in camera and display technologies. Major leading multimedia companies including RED Digital Cinema, Nikon, and Canon have already introduced 8K60 cameras for both the consumer and professional markets. On the display side, with the newest HDMI 2.1 standard…
In 2019, if you wanted to check out the cutting edge in video game graphics, you needed an NVIDIA GeForce RTX 20 Series GPU and a copy of a game that was released in 1997, Quake II. With those pieces in hand, you could be among the first players in the world to see path tracing running in real time on a consumer GPU. "Somehow a game from 1997 convinced me it was time to upgrade…
Six years ago, real-time ray tracing was seen as a pipe dream. Back then, cinematic-quality rendering required computer farms to slowly bake every frame overnight—a painstaking process. By 2018, this level of performance was achievable in real time, at 45 frames per second, enabling applications like video games to take a massive leap in graphical quality. As part of our RTX 500…
As we approach the end of another exciting year at NVIDIA, it's time to look back at the most popular stories from the NVIDIA Technical Blog in 2023. Groundbreaking research and developments in fields such as generative AI, large language models (LLMs), high-performance computing (HPC), and robotics are leading the way in transformative AI solutions and capturing the interest of our readers.
In the fourth installment of this series on the superpowers of OpenUSD, learn how any digital content creation tool can be connected to USD. OpenUSD's data source interoperability allows data from different tools to be used in the same scene or project.
Swap chains are an integral part of how you get rendering data output to a screen. They usually consist of some group of output-ready buffers, each of which can be rendered to one at a time in rotation. In parallel with rendering to one of a swap chain's buffers, some other buffer in the swap chain is generally read from for display output. This post covers best practices when working with…
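The render-while-display rotation described above can be modeled abstractly. This toy Python class is a conceptual model only, not any real graphics API: it shows how a fixed ring of buffers is cycled so the application always renders into one buffer while another is scanned out.

```python
from collections import deque

class ToySwapChain:
    """Conceptual swap chain: a fixed ring of buffer indices where one
    buffer is rendered into while another is read for display."""
    def __init__(self, count=3):
        self.buffers = deque(range(count))  # buffers in presentation order

    def acquire(self):
        """Index of the buffer the app may render into this frame."""
        return self.buffers[0]

    def present(self):
        """Hand the rendered buffer to the display and rotate the ring."""
        self.buffers.rotate(-1)

chain = ToySwapChain(count=3)
frames = []
for _ in range(4):
    frames.append(chain.acquire())  # render target for this frame
    chain.present()
print(frames)  # [0, 1, 2, 0] - triple buffering wraps around
```

With `count=2` the same loop would alternate `[0, 1, 0, 1]`, which is the classic double-buffering pattern; a third buffer gives the app a frame of slack between rendering and display.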
NVIDIA today unveiled major upgrades to the NVIDIA Avatar Cloud Engine (ACE) suite of technologies, bringing enhanced realism and accessibility to AI-powered avatars and digital humans. These latest animation and speech capabilities enable more natural conversations and emotional expressions. Developers can now easily implement and scale intelligent avatars across applications using new…
In this webinar, see how YouTube creator JSFILMZ uses NVIDIA RTX and how it enables him to iterate on creative ideas.
Intrinsics can be thought of as higher-level abstractions of specific hardware instructions. They offer direct access to low-level operations or hardware-specific features, enabling increased performance. In this way, operations can be performed across threads within a warp, also known as a wavefront. The following code example illustrates this with…
There are some useful intrinsic functions in the NVIDIA GPU instruction set that are not included in standard graphics APIs. Updated from the original 2016 post to add information about new intrinsics and cross-vendor APIs in DirectX and Vulkan. For example, a shader can use warp shuffle instructions to exchange data between threads in a warp without going through shared memory…
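To see why shuffle-based data exchange is useful, here is a pure-Python model of the classic shuffle-down warp reduction. This is a simulation of the pattern, not GPU code: on real hardware each "lane" is a thread and the exchange happens in registers via an intrinsic such as CUDA's `__shfl_down_sync`, with no shared memory involved.

```python
WARP_SIZE = 32

def shfl_down(values, delta):
    """Model of a warp shuffle-down: lane i reads the value held by
    lane i + delta; lanes past the end of the warp keep their own value."""
    return [values[i + delta] if i + delta < len(values) else values[i]
            for i in range(len(values))]

def warp_reduce_sum(values):
    """Tree reduction built from shuffles: halve the stride each step;
    after log2(32) = 5 steps, lane 0 holds the sum of the whole warp."""
    delta = WARP_SIZE // 2
    while delta >= 1:
        shifted = shfl_down(values, delta)
        values = [a + b for a, b in zip(values, shifted)]
        delta //= 2
    return values[0]  # only lane 0's result is meaningful

print(warp_reduce_sum(list(range(32))))  # 496, i.e. 0 + 1 + ... + 31
```

Five exchange steps replace a 32-iteration loop through shared memory, which is the performance win the post alludes to.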
Whole human brain imaging of 100 brains at a cellular level within a 2-year timespan, and subsequent analysis and mapping, requires accelerated supercomputing and computational tools. This need is well matched by NVIDIA technologies, which range across hardware, computational systems, high-bandwidth interconnects, domain-specific libraries, accelerated toolboxes, and curated deep-learning models…
NVIDIA recently caught up with veteran-level lighting artist Ted Mebratu to find out how he pushed real-time lighting to its limits with the Rainy Neon Lights scene created by environment artist Maarten Hof. Using an NVIDIA RTX 3090 Ti and the NVIDIA RTX Branch of Unreal Engine (NvRTX), Mebratu spoke to NVIDIA about his aspirations for the scene and pushing the limits of real-time…
For manufacturing and industrial enterprises, efficiency and precision are essential. To streamline operations, reduce costs, and enhance productivity, companies are turning to digital twins and discrete-event simulation. Discrete-event simulation enables manufacturers to optimize processes by experimenting with different inputs and behaviors that can be modeled and tested step by step.
Learn how game developers can add leading-edge NVIDIA RTX technologies to Unreal Engine with custom branches.
Take a deep dive into denoising diffusion models, from building a U-Net to training a text-to-image model.
By using descriptor types, you can bind resources to shaders and specify how those resources are accessed. This creates efficient communication between the CPU and GPU and enables shaders to access the necessary data during rendering.
Differentiable Slang easily integrates with existing codebases—from Python, PyTorch, and CUDA to HLSL—to aid multiple computer graphics tasks and enable novel data-driven and neural research. In this post, we introduce several code examples using differentiable Slang to demonstrate the potential use across different rendering applications and the ease of integration. This is part of a series…
NVIDIA just released a SIGGRAPH Asia 2023 research paper, SLANG.D: Fast, Modular and Differentiable Shader Programming. The paper shows how a single language can serve as a unified platform for real-time, inverse, and differentiable rendering. The work is a collaboration between MIT, UCSD, UW, and NVIDIA researchers. This is part of a series on Differentiable Slang. For more information about…
NVIDIA offers a large suite of tools for graphics debugging, including NVIDIA Nsight Systems for CPU debugging and Nsight Graphics for GPU debugging. Nsight Aftermath is useful for analyzing crash dumps. Thanks to Patrick Neill, Jeffrey Kiel, Justin Kim, Andrew Allan, and Louis Bavoil for their help with this post.
Photogrammetry is the process of capturing images and stitching them together to create a digital model of the physical world.
The broadcast industry is undergoing a transformation in how content is created, managed, distributed, and consumed. This transformation includes a shift from traditional linear workflows bound by fixed-function devices to flexible and hybrid, software-defined systems that enable the future of live streaming. Developers can now apply to join the early access program for NVIDIA Holoscan for…
Explore how ray-traced caustics combined with NVIDIA RTX features can enhance the performance of your games.
Moment Factory is a global multimedia entertainment studio that combines specializations in video, lighting, architecture, sound, software, and interactivity to create immersive experiences for audiences around the world. At NVIDIA GTC 2024, Moment Factory will showcase digital twins for immersive location-based entertainment with Universal Scene Description (OpenUSD). To see the latest…
This post covers best practices when working with shaders on NVIDIA GPUs. To get a high and consistent frame rate in your applications, see all Advanced API Performance tips. Shaders play a critical role in graphics programming by enabling you to control various aspects of the rendering process. They run on the GPU and are responsible for manipulating vertices, pixels, and other data.
Ray and path tracing algorithms construct light paths by starting at the camera or the light sources and intersecting rays with the scene geometry. As objects are hit, new secondary rays are generated on these surfaces to continue the paths. In theory, these secondary rays will not yield an intersection with the same triangle again, as intersections at a distance of zero are excluded by the…
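In practice, finite floating-point precision can make a secondary ray re-hit the very triangle it just left. One common mitigation (among several, and sketched here in plain Python rather than shader code) is to nudge the new ray's origin along the surface normal before tracing; the epsilon value below is purely illustrative, and production renderers typically scale the offset with the magnitude of the hit position.

```python
EPS = 1e-4  # illustrative offset; real renderers scale this adaptively

def offset_ray_origin(hit_point, normal):
    """Move a secondary ray's origin slightly off the surface along the
    geometric normal, so a zero-distance self-intersection is avoided."""
    return tuple(p + EPS * n for p, n in zip(hit_point, normal))

# A hit on a horizontal surface: the new origin floats just above it.
origin = offset_ray_origin((1.0, 2.0, 3.0), (0.0, 1.0, 0.0))
print(origin)  # (1.0, 2.0001, 3.0)
```

For rays continuing through a surface (refraction), the same helper is used with the negated normal so the offset lands on the far side.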