We love seeing all of the NVIDIA GPU-related tweets – here are some we came across this week:
When you receive news that you get 45k extra to spend on GPU hardware. #bayer #deeplearning pic.twitter.com/3on4R0jydh
— Ken Heyndrickx (@KenScience) June 10, 2017
https://twitter.com/alpha_blues/status/874559723718279168
Just landed on my desk! More #GPU power for my group's #deeplearning research :) many thanks to @nvidia corporation for the donation pic.twitter.com/f6s9PAi3Al
— Benoit Huet (@Benoit_Huet) June 9, 2017
hackathon success, StarLord (aka Castro's hydro) runs as fast on a GPU as a node of CPUs. Still more work to be done…
— Michael Zingale (@Michael_Zingale) June 9, 2017
I finished building this server last night. 10 @NvidiaAI Titan X Pascal GPUs #DeepLearning @datalogdotai @AccelerateAI @mypollydotai pic.twitter.com/Z3AN4bhGjA
— Jack C Crawford (@jackccrawford) June 15, 2017
https://twitter.com/olexandr/status/871837067176792065
Thanks to #NVIDIA for supporting our research! pic.twitter.com/W1GFr6i5L0
— Pattern Recognition and Earth Observation Lab (@PATREO_UFMG) June 12, 2017
Visiting @alaincxs of @signalboxai showing their deep learning platform built on AWS #gpu #deeplearning https://t.co/RkDSnZTFV0
— AWS Startups (@AWSstartups) June 13, 2017
In fact, GPUs have been central to DL, because the computation/$ ratio is so much better than that of CPUs. This GPU costs around USD 600. Not bad!
— RealScientists (@realscientists) June 12, 2017
https://twitter.com/MostafaElzoghbi/status/875043568704008192
Do you tweet? Follow us on Twitter at @GPUComputing and @NVIDIA.