Seminar #36

Time: 2022-11-12 10:00-11:00  Location: Zoom (online seminar)

This week, Yao Class alumnus Prof. Zhihao Jia will give us an 【online】 introduction to 【AI systems】!

  • Zhihao Jia (贾志豪)

    For this Seminar we are honored to have Prof. Zhihao Jia, Assistant Professor of Computer Science at Carnegie Mellon University, introduce frontier research in computer systems. Prof. Jia graduated from Tsinghua's Yao Class in 2013 and later received his Ph.D. from Stanford University. His research focuses on building systems for emerging applications such as machine learning, quantum computing, and large-scale data analytics. This talk covers the automated discovery of machine learning optimizations, introducing the computation-graph optimizer TASO and the distributed-training optimizer FlexFlow.

    Automated Discovery of Machine Learning Optimizations

    Abstract: As an increasingly important workload, machine learning (ML) applications require different performance optimization techniques from traditional runtimes and compilers. In particular, to accelerate ML applications, it is generally necessary to perform ML computations on heterogeneous hardware and parallelize computations using multiple data dimensions, neither of which is even expressible in traditional compilers and runtimes. In this talk, I will describe my work on automated discovery of performance optimizations to accelerate ML computations. TASO, the Tensor Algebra SuperOptimizer, optimizes the computation graphs of deep neural networks (DNNs) by automatically generating potential graph optimizations and formally verifying their correctness. TASO outperforms rule-based graph optimizers in existing ML systems (e.g., TensorFlow, TensorRT, and TVM) by up to 3x by automatically discovering novel graph optimizations, while also requiring significantly less human effort. FlexFlow is a system for accelerating distributed DNN training. FlexFlow identifies parallelization dimensions not considered in existing ML systems (e.g., TensorFlow and PyTorch) and automatically discovers fast parallelization strategies for a specific parallel machine. FlexFlow has been used to train production ML models that do not scale well in current ML systems, achieving over 10x performance improvement.
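    TASO's generate-and-verify approach can be illustrated with a toy graph substitution. The rewrite below (fusing two matrix multiplications that share a right-hand operand into a single multiplication on the concatenated left operands, followed by a split) is a hypothetical example chosen for illustration, not a rule quoted from TASO; TASO verifies such substitutions formally with a theorem prover, whereas this sketch only checks one instance numerically:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 8))
    B = rng.standard_normal((6, 8))
    C = rng.standard_normal((8, 5))

    # Original graph: two independent matmul operators sharing operand C.
    out1, out2 = A @ C, B @ C

    # Rewritten graph: one fused matmul on the concatenated inputs,
    # then a split to recover the two original outputs.
    fused = np.concatenate([A, B], axis=0) @ C
    new1, new2 = fused[:4], fused[4:]

    # Check that the substitution preserves semantics on this instance.
    assert np.allclose(out1, new1) and np.allclose(out2, new2)
    print("substitution preserves semantics")
    ```

    A fused kernel launch is often faster than two small ones on a GPU, which is why a graph optimizer might prefer the rewritten form even though both graphs compute the same function.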

    No prior knowledge of artificial intelligence or computer systems is required for this talk. All students are welcome!

  • 【Time and location, once more】Beijing time, Saturday, Nov. 12, 10:00-11:00 AM. Click here for time-zone conversion.
  • 【Zoom】 889 8461 8902

Contact us

Make IIIS Great Again!

Tsinghua University Yao Class Seminar