Allison Gray – NVIDIA Technical Blog News and tutorials for developers, data scientists, and IT admins 2022-08-21T23:38:08Z http://www.open-lab.net/blog/feed/ Allison Gray <![CDATA[Deploying Deep Neural Networks with NVIDIA TensorRT]]> http://www.open-lab.net/blog/parallelforall/?p=7674 2022-08-21T23:38:08Z 2017-04-03T05:53:05Z Figure 1: NVIDIA TensorRT provides 16x higher energy efficiency for neural network inference with...]]>

Editor’s Note: An updated version of this post, with additional tutorial content, is now available. See “How to Speed Up Deep Learning Using TensorRT”. NVIDIA TensorRT is a high-performance deep learning inference library for production environments. Power efficiency and speed of response are two key metrics for deployed deep learning applications, because they directly affect the user experience…
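The post walks through building an optimized inference engine from a trained network offline, so the deployed application only has to load and run it. As a rough illustration, here is a minimal sketch using the modern TensorRT Python API, which postdates the original 2017 article (that post used the C++ API with Caffe models); the ONNX file name and the FP16 flag are assumptions for the example:

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:        # hypothetical trained model
    if not parser.parse(f.read()):
        raise RuntimeError(parser.get_error(0))

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.FP16)      # reduced precision for efficiency
engine_bytes = builder.build_serialized_network(network, config)

with open("model.engine", "wb") as f:      # serialized engine for deployment
    f.write(engine_bytes)
```

Serializing the optimized engine ahead of time means the production service pays the optimization cost once, at build time, rather than at every startup.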

Source

]]>
Allison Gray <![CDATA[Exploring the SpaceNet Dataset Using DIGITS]]> http://www.open-lab.net/blog/parallelforall/?p=7083 2022-08-21T23:37:56Z 2016-09-01T04:23:57Z DigitalGlobe, CosmiQ Works and NVIDIA recently announced the launch of the SpaceNet online satellite imagery repository. This public dataset of high-resolution...]]>

DigitalGlobe, CosmiQ Works and NVIDIA recently announced the launch of the SpaceNet online satellite imagery repository. This public dataset of high-resolution satellite imagery contains a wealth of geospatial information relevant to many downstream use cases such as infrastructure mapping, land usage classification and human geography estimation. The SpaceNet release is unprecedented: it’s the…
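For readers who want to explore the imagery itself, a minimal sketch of opening one tile with the rasterio package; the library choice and the local file name are illustrative assumptions, not part of the SpaceNet announcement (the data is distributed as GeoTIFF tiles):

```python
import rasterio

# Hypothetical local path to one downloaded SpaceNet GeoTIFF tile.
with rasterio.open("RIO_tile_0001.tif") as src:
    print(src.count, src.width, src.height)   # bands, pixel dimensions
    print(src.crs, src.bounds)                # georeferencing metadata
    bands = src.read()                        # ndarray of shape (bands, rows, cols)
```

The georeferencing metadata is what makes the dataset useful for mapping tasks: each pixel can be tied back to a real-world coordinate.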

Source

]]>
Allison Gray <![CDATA[Production Deep Learning with NVIDIA GPU Inference Engine]]> http://www.open-lab.net/blog/parallelforall/?p=6823 2022-08-21T23:37:54Z 2016-06-20T06:01:59Z Figure 1. NVIDIA GPU Inference Engine (GIE) provides even higher efficiency and performance for...]]>

[Update September 13, 2016: GPU Inference Engine is now TensorRT] Today at ICML 2016, NVIDIA announced its latest Deep Learning SDK updates, including DIGITS 4, cuDNN 5.1 (CUDA Deep Neural Network Library) and the new GPU Inference Engine. NVIDIA GPU Inference Engine (GIE) is a high-performance deep learning inference solution for production environments. Power efficiency and speed of response…
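As a companion to the engine-building sketch above, here is a minimal sketch of running a serialized engine at inference time. It assumes the TensorRT 8.x binding API and the pycuda package, neither of which appears in the original GIE post, plus a hypothetical engine file with one input binding followed by one output binding:

```python
import numpy as np
import pycuda.autoinit           # creates a CUDA context on import
import pycuda.driver as cuda
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
with open("model.engine", "rb") as f:      # engine built earlier (hypothetical)
    engine = trt.Runtime(logger).deserialize_cuda_engine(f.read())
context = engine.create_execution_context()

# One host/device buffer pair per binding, in binding order.
host_bufs, dev_bufs = [], []
for i in range(engine.num_bindings):
    dtype = trt.nptype(engine.get_binding_dtype(i))
    size = trt.volume(engine.get_binding_shape(i))
    host_bufs.append(np.zeros(size, dtype=dtype))
    dev_bufs.append(cuda.mem_alloc(host_bufs[-1].nbytes))

host_bufs[0][:] = np.random.rand(host_bufs[0].size)   # stand-in input data
cuda.memcpy_htod(dev_bufs[0], host_bufs[0])
context.execute_v2([int(d) for d in dev_bufs])        # synchronous inference
cuda.memcpy_dtoh(host_bufs[-1], dev_bufs[-1])
print(host_bufs[-1][:5])                              # first few output values
```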

Source

]]>
Allison Gray <![CDATA[NVIDIA and IBM Cloud Support ImageNet Large Scale Visual Recognition Challenge]]> http://www.open-lab.net/blog/parallelforall/?p=5725 2022-08-21T23:37:36Z 2015-08-13T13:00:33Z This year’s ImageNet Large Scale Visual Recognition Challenge (ILSVRC) is about to begin. Every year, organizers from the University of North Carolina at...]]>

This year’s ImageNet Large Scale Visual Recognition Challenge (ILSVRC) is about to begin. Every year, organizers from the University of North Carolina at Chapel Hill, Stanford University, and the University of Michigan host the ILSVRC, an object detection and image classification competition, to advance the fields of machine learning and pattern recognition. Competitors are given more than 1.2…

Source

]]>
Allison Gray <![CDATA[Easy Multi-GPU Deep Learning with DIGITS 2]]> http://www.open-lab.net/blog/parallelforall/?p=5492 2022-08-21T23:37:33Z 2015-07-07T13:00:24Z DIGITS is an interactive deep learning development tool for data scientists and researchers, designed for rapid development and deployment of an optimized deep...]]>

DIGITS is an interactive deep learning development tool for data scientists and researchers, designed for rapid development and deployment of an optimized deep neural network. NVIDIA introduced DIGITS in March 2015, and today we are excited to announce the release of DIGITS 2, which includes automatic multi-GPU scaling. Whether you are developing an optimized neural network for a single data set…

Source

]]>
Allison Gray <![CDATA[Get Ready for the Low-Power Image Recognition Challenge with Jetson TK1]]> http://www.open-lab.net/blog/parallelforall/?p=5056 2022-08-21T23:37:32Z 2015-04-13T13:00:03Z Image recognition and GPUs go hand-in-hand, particularly when using deep neural networks (DNNs). The strength of GPU-based DNNs for image recognition has been...]]>

Image recognition and GPUs go hand-in-hand, particularly when using deep neural networks (DNNs). The strength of GPU-based DNNs for image recognition has been unequivocally demonstrated by their success over the past few years in the ImageNet Large Scale Visual Recognition Challenge (ILSVRC), and DNNs have recently achieved classification accuracy on par with trained humans, as Figure 1 shows.
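To make the classification task concrete, here is a hedged sketch of ImageNet-style inference with a pretrained network. It uses PyTorch and torchvision 0.13+, which are illustrative assumptions and not part of the original Jetson TK1 post, and a hypothetical image path:

```python
import torch
from PIL import Image
from torchvision import models, transforms

# Standard ImageNet preprocessing: resize, center-crop, normalize.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2).eval()
img = preprocess(Image.open("cat.jpg").convert("RGB")).unsqueeze(0)

with torch.no_grad():
    probs = model(img).softmax(dim=1)
print(probs.argmax(dim=1).item())   # index of the predicted ImageNet class
```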

Source

]]>
Allison Gray <![CDATA[DIGITS: Deep Learning GPU Training System]]> http://www.open-lab.net/blog/parallelforall/?p=4973 2022-08-21T23:37:31Z 2015-03-17T17:19:28Z The hottest area in machine learning today is Deep Learning, which uses Deep Neural Networks (DNNs) to teach computers to detect recognizable concepts in data....]]>

The hottest area in machine learning today is Deep Learning, which uses Deep Neural Networks (DNNs) to teach computers to detect recognizable concepts in data. Researchers and industry practitioners are using DNNs in image and video classification, computer vision, speech recognition, natural language processing, and audio recognition, among other applications. The success of DNNs has been…

Source

]]>