Using GPUs to Scale and Speed-up Deep Learning

Training complex deep learning models on large datasets takes a long time. In this course, you will learn how to use accelerated GPU hardware to overcome the scalability problem in deep learning.

1,050 views
IBM
edX
  • Estimated time to complete: about 5
  • Intermediate
  • English
Note: Due to changes on the hosting platform, the start dates above are for reference only.

What You Will Learn

Explain what a GPU is, how it can speed up computation, and its advantages compared with CPUs.

Implement deep learning networks on GPUs.

Train and deploy deep learning networks for image and video classification as well as for object recognition.

Course Overview

Training a complex deep learning model with a very large dataset can take hours, days, and occasionally weeks. So, what is the solution? Accelerated hardware.

You can use accelerated hardware such as Google's Tensor Processing Unit (TPU) or an Nvidia GPU to accelerate your convolutional neural network computations in the cloud. These chips are specifically designed to support the training of neural networks, as well as the use of trained networks (inference). Accelerated hardware has been shown to significantly reduce training time.
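As a minimal sketch of the device-placement idea (assuming TensorFlow 2.x is installed; the tensor shapes here are arbitrary), you can check for a GPU and explicitly place an operation on it, falling back to the CPU when none is present:

```python
import tensorflow as tf

# List available accelerators; returns an empty list on CPU-only machines.
gpus = tf.config.list_physical_devices("GPU")
print("GPUs found:", gpus)

# Pick a device explicitly; fall back to the CPU if no GPU is available.
device = "/GPU:0" if gpus else "/CPU:0"
with tf.device(device):
    a = tf.random.uniform((1024, 1024))
    b = tf.random.uniform((1024, 1024))
    c = tf.matmul(a, b)  # runs on the chosen device

print(c.shape)
```

The same code runs unchanged on a laptop or a GPU server; only the device string chosen at runtime differs, which is what makes TensorFlow workloads easy to move onto accelerated hardware.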

But your data might be sensitive, and you may not feel comfortable uploading it to a public cloud, preferring to analyze it on-premise. In this case, you need an in-house system with GPU support. One solution is to use IBM Power Systems with Nvidia GPUs and PowerAI. The PowerAI platform supports popular machine learning libraries and dependencies, including TensorFlow, Caffe, Torch, and Theano.

In this course, you'll learn what GPU-based accelerated hardware is and how it can benefit your deep learning scaling needs. You'll also deploy deep learning networks on GPU-accelerated hardware for several problems, including the classification of images and videos.

Course Syllabus

Module 1 – Quick Review of Deep Learning
  • Intro to Deep Learning
  • Deep Learning Pipeline

Module 2 – Hardware-Accelerated Deep Learning
  • How to accelerate a deep learning model
  • Running TensorFlow operations on CPUs vs. GPUs
  • Convolutional Neural Networks on GPUs
  • Recurrent Neural Networks on GPUs

Module 3 – Deep Learning in the Cloud
  • Deep Learning in the Cloud
  • How to use a GPU

Module 4 – Distributed Deep Learning
  • Distributed Deep Learning

Module 5 – PowerAI Vision
  • Computer Vision
  • Image Classification
  • Object Recognition in Videos
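The core idea behind Module 4, data parallelism, can be sketched in plain NumPy (a hypothetical illustration, not course material): each simulated worker computes a gradient on its own data shard, and the averaged gradient updates the shared weights, mimicking the all-reduce step of distributed training.

```python
import numpy as np

def grad_mse(w, X, y):
    """Gradient of mean-squared error for the linear model y ~ X @ w."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w

# Data parallelism: split the data across 4 simulated workers,
# average their gradients (the "all-reduce"), then update once.
w = np.zeros(3)
shards = np.array_split(np.arange(len(y)), 4)
for _ in range(200):
    grads = [grad_mse(w, X[idx], y[idx]) for idx in shards]
    w -= 0.1 * np.mean(grads, axis=0)

print(np.round(w, 2))  # close to true_w
```

Because every worker only touches its own shard, the per-step compute scales down with the number of workers; the trade-off, covered in distributed deep learning, is the communication cost of averaging gradients.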
