Dong Meng – NVIDIA Technical Blog
News and tutorials for developers, data scientists, and IT admins

Simplifying and Accelerating Machine Learning Predictions in Apache Beam with NVIDIA TensorRT
Published December 16, 2022 – http://www.open-lab.net/blog/?p=59150

Loading and preprocessing data for running machine learning models at scale often requires seamlessly stitching the data processing framework and inference engine together. In this post, we walk through the integration of NVIDIA TensorRT with the Apache Beam SDK and show how complex inference scenarios can be fully encapsulated within a data processing pipeline. We also demonstrate how terabytes…
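
As a rough illustration of the encapsulation described above, here is a minimal sketch of a Beam pipeline running TensorRT inference through Beam's RunInference transform and its TensorRT engine handler. The engine path, batch sizes, and input shape are placeholder assumptions, not the post's exact code.

```python
# Minimal sketch: TensorRT inference inside an Apache Beam pipeline.
# Assumes a prebuilt TensorRT engine at /models/model.trt (hypothetical
# path) and NumPy array inputs; batch sizes and shapes are illustrative.
import numpy as np
import apache_beam as beam
from apache_beam.ml.inference.base import RunInference
from apache_beam.ml.inference.tensorrt_inference import TensorRTEngineHandlerNumPy

engine_handler = TensorRTEngineHandlerNumPy(
    min_batch_size=1,
    max_batch_size=4,
    engine_path="/models/model.trt",  # hypothetical engine location
)

with beam.Pipeline() as pipeline:
    _ = (
        pipeline
        | "CreateInputs" >> beam.Create(
            [np.zeros((1, 3, 224, 224), dtype=np.float32)]  # dummy input
        )
        | "RunTensorRT" >> RunInference(engine_handler)
        | "LogResults" >> beam.Map(print)
    )
```

Because the handler is just another transform parameter, the same pipeline shape works locally or on a distributed runner; only the engine artifact and worker environment change.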

Accelerating Machine Learning Model Inference on Google Cloud Dataflow with NVIDIA GPUs
Published July 21, 2021 – http://www.open-lab.net/blog/?p=34954

Today, in partnership with NVIDIA, Google Cloud announced that Dataflow is bringing GPUs to the world of big data processing to unlock new possibilities. With Dataflow GPUs, users can now leverage the power of NVIDIA GPUs in their machine learning inference workflows. Here we show you how to access these performance benefits with BERT. Google Cloud’s Dataflow is a managed service for executing a…
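
To make the workflow concrete, below is a minimal sketch of how a Beam pipeline can be submitted to Dataflow with GPU-attached workers. The project ID, bucket, and container image are hypothetical placeholders; the worker_accelerator service option follows Dataflow's documented syntax for requesting NVIDIA GPUs.

```python
# Minimal sketch: launching a Beam pipeline on Dataflow with GPU workers.
# Project, region, bucket, and image names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-gcp-project",            # hypothetical project ID
    region="us-central1",
    temp_location="gs://my-bucket/tmp",  # hypothetical staging bucket
    experiments=["use_runner_v2"],
    # Request one NVIDIA T4 per worker and auto-install the driver.
    dataflow_service_options=[
        "worker_accelerator=type:nvidia-tesla-t4;count:1;install-nvidia-driver"
    ],
    # Custom worker image with CUDA and the BERT model dependencies.
    sdk_container_image="gcr.io/my-gcp-project/beam-gpu:latest",
)

with beam.Pipeline(options=options) as pipeline:
    # BERT steps (tokenization, inference, postprocessing) attach here.
    _ = pipeline | beam.Create(["example sentence"])
```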

Accelerating Analytics and AI with Alluxio and NVIDIA GPUs
Published March 23, 2021 – http://www.open-lab.net/blog/?p=25111

Data processing is increasingly making use of NVIDIA GPU computing for massive parallelism. Advancements in accelerated compute mean that access to storage must also be quicker, whether in analytics, artificial intelligence (AI), or machine learning (ML) pipelines. The benefits of GPU acceleration are limited if data access dominates the execution time. GPU-based processing drives a…
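
One common pattern for closing that data-access gap is to put Alluxio between remote storage and the GPU pipeline, so workers read from a local cache instead of slow remote storage. The sketch below assumes a hypothetical Alluxio FUSE mount point and dataset; the column names and paths are illustrative, not from the post.

```python
# Minimal sketch: feeding a GPU data-processing step from an Alluxio
# FUSE mount. Alluxio exposes remote storage (e.g., S3 or HDFS) as a
# local POSIX path and caches hot data near the compute nodes.
import cudf  # RAPIDS GPU DataFrame library

# Hypothetical mount point, e.g. created with the alluxio-fuse utility.
DATA_PATH = "/mnt/alluxio/datasets/transactions.parquet"

gdf = cudf.read_parquet(DATA_PATH)          # read through the Alluxio cache
daily = gdf.groupby("day")["amount"].sum()  # GPU-accelerated aggregation
print(daily.head())
```

On repeated reads, the data is served from Alluxio's cache rather than the remote store, which is what keeps the GPUs from sitting idle on I/O.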
