ScalaGrad: A Statically Typed Automatic Differentiation Library for Safer Data Science
Metadata only
Date
2024
Type
Conference Paper
ETH Bibliography
yes
Abstract
While the data science ecosystem is dominated by programming languages that do not feature a strong type system, it is widely agreed that using strongly typed programming languages leads to more maintainable and less error-prone code and ultimately more trustworthy results. We believe Scala 3 would be an excellent contender for data science in a strongly typed language, but it lacks a general automatic differentiation library, e.g., for gradient-based learning. We present ScalaGrad, a general and type-safe automatic differentiation library designed for Scala. It builds on and improves a novel approach from the functional programming community using immutable duals, which is conceptually simple, asymptotically optimal and allows differentiation of higher-order code. We demonstrate the ease of use, robust performance, and versatility of ScalaGrad through its applications to deep learning, higher-order optimization, and gradient-based sampling. Specifically, we show an execution speed comparable to PyTorch for a simple deep learning use case, capabilities for higher-order differentiation, and opportunities to design more specialized libraries decoupled from ScalaGrad. As data science challenges evolve in complexity, ScalaGrad provides a pathway to harness the inherent advantages of strongly typed languages, ensuring both robustness and maintainability.
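To illustrate the general idea of differentiation with immutable duals that the abstract alludes to, the following is a minimal, hypothetical Scala 3 sketch of forward-mode automatic differentiation using an immutable dual-number type. It is not ScalaGrad's actual API; the names Dual and derive are illustrative assumptions only.

// Hypothetical sketch of forward-mode AD with immutable dual numbers.
// Not ScalaGrad's API; Dual and derive are illustrative names.
final case class Dual(value: Double, tangent: Double):
  def +(that: Dual): Dual = Dual(value + that.value, tangent + that.tangent)
  def *(that: Dual): Dual =
    // product rule: (f * g)' = f' * g + f * g'
    Dual(value * that.value, tangent * that.value + value * that.tangent)

object Dual:
  def const(x: Double): Dual = Dual(x, 0.0)

// Differentiate a scalar function by seeding the tangent with 1.0.
def derive(f: Dual => Dual)(x: Double): Double =
  f(Dual(x, 1.0)).tangent

@main def demo(): Unit =
  // d/dx (x*x + 3x) = 2x + 3, so at x = 2.0 we expect 7.0
  val f: Dual => Dual = x => x * x + Dual.const(3.0) * x
  println(derive(f)(2.0)) // prints 7.0

Because the dual values are immutable, each derivative computation is a pure function of its inputs, which is the property that makes this style of AD composable and amenable to differentiating higher-order code.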
Publication status
published
Book title
2024 11th IEEE Swiss Conference on Data Science (SDS)
Pages / Article No.
Publisher
IEEE
Event
Subject
Automatic Differentiation; Scala 3; ScalaGrad
Notes
Conference Presentation held on May 31, 2024.