Exploring Serverless Computing for Neural Network Training (Conference Paper)

abstract

  • Serverless, or function-as-a-service (FaaS), runtimes have shown significant efficiency and cost benefits for event-driven cloud applications. Although serverless runtimes are limited to applications requiring lightweight computation and memory, such as machine learning prediction and inference, they have shown improvements on these applications beyond other cloud runtimes. Training deep learning models, however, can be both compute and memory intensive. We investigate the use of serverless runtimes while leveraging data parallelism for large models, show the challenges and limitations due to the tightly coupled nature of such models, and propose modifications to the underlying runtime implementations that would mitigate them. For hyperparameter optimization of smaller deep learning models, we show that serverless runtimes can provide a significant benefit.
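
  The hyperparameter-optimization case the abstract highlights is embarrassingly parallel, which is why it maps well onto serverless. As a minimal sketch only (not code from the paper), the example below fans out independent trials as serverless invocations; it assumes AWS Lambda via boto3, and a deployed function named "hp-trial" that trains one small model per invocation and returns its validation accuracy. The function name and payload schema are hypothetical.

    # Illustrative only: fan out hyperparameter trials as serverless invocations.
    # Assumes a deployed Lambda function "hp-trial" (hypothetical) that trains one
    # small model per call and returns {"val_acc": float} as its JSON payload.
    import json
    import itertools
    from concurrent.futures import ThreadPoolExecutor

    import boto3

    lam = boto3.client("lambda")

    def run_trial(params):
        # Synchronous invocation; each trial runs in its own isolated function instance.
        resp = lam.invoke(
            FunctionName="hp-trial",          # hypothetical function name
            InvocationType="RequestResponse",
            Payload=json.dumps(params),
        )
        return params, json.loads(resp["Payload"].read())["val_acc"]

    # Small grid over learning rate and batch size (example values).
    grid = [{"lr": lr, "batch_size": bs}
            for lr, bs in itertools.product([1e-3, 1e-2, 1e-1], [32, 64, 128])]

    # One local thread per in-flight invocation; the training itself runs server-side.
    with ThreadPoolExecutor(max_workers=len(grid)) as pool:
        results = list(pool.map(run_trial, grid))

    best_params, best_acc = max(results, key=lambda r: r[1])
    print(f"best {best_params} -> val_acc={best_acc:.3f}")

  Because each trial is independent, the only coordination is collecting results; this is the opposite of the tightly coupled data-parallel training case, where workers must exchange gradients and the paper proposes runtime modifications to mitigate the resulting limitations.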

name of conference

  • 2018 IEEE 11th International Conference on Cloud Computing (CLOUD)

published proceedings

  • Proceedings of the 2018 IEEE 11th International Conference on Cloud Computing (CLOUD)

altmetric score

  • 4.35

author list (cited authors)

  • Feng, L., Kudva, P., Da Silva, D., & Hu, J.

citation count

  • 46

complete list of authors

  • Feng, Lang; Kudva, Prabhakar; Da Silva, Dilma; Hu, Jiang

publication date

  • July 2018