Changelog
Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities from 70 countries including Stanford, MIT, Harvard, and Cambridge.
This release is a book publication announcement, not a software release. No code changes, dependencies, or developer action required.
D2L has gone 1.0.0-beta0! We thank all the 296 contributors for making this happen!
Chapters 1-11 are forthcoming from Cambridge University Press (early 2023).

Add PaddlePaddle implementation for the d2l library (compatible with classic.d2l.ai).
See the PR in d2l-zh.
We are happy to release D2L 1.0.0-alpha1.post0. We thank all the contributors who have made this open-source textbook better for everyone.
This minor release includes the following updates:
We are excited to announce the release of D2L 1.0.0-alpha0! We thank all the 265 contributors who have made this open-source textbook better for everyone.
We have added the following new topics, with discussions of more recent methods such as ResNeXt, RegNet, ConvNeXt, Vision Transformer, Swin Transformer,...
This release fixes issues with installing the d2l package and running d2l notebooks on Google Colab with Python 3.7, and updates PyTorch and TensorFlow to their respective latest versions.
More concretely, this release includes the following upgrades/fixes:
This release supports running the book with SageMaker Studio Lab for free and introduces several fixes:
Dive into Deep Learning is now available on arXiv!
We have added TensorFlow implementations up to Chapter 11 (Optimization Algorithms).
The following chapters have been significantly improved for v1.0:
We have added the brand-new Chapter: Attention Mechanisms:
Attention Cues
Attention Pooling: Nadaraya-Watson Kernel Regression
Attention S...
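As a rough illustration of the kind of method covered in the new Attention Pooling section, here is a minimal NumPy sketch of Nadaraya-Watson kernel regression (this is not the book's own code; function and variable names are ours, and the Gaussian bandwidth is an arbitrary choice):

```python
import numpy as np

def nadaraya_watson(x_query, x_train, y_train, bandwidth=0.5):
    """Predict y at x_query as a kernel-weighted average of y_train."""
    # Pairwise differences between query points and training inputs
    diffs = x_query[:, None] - x_train[None, :]
    # Gaussian kernel weights; rows normalize to 1 over training points
    weights = np.exp(-0.5 * (diffs / bandwidth) ** 2)
    weights /= weights.sum(axis=1, keepdims=True)
    # Prediction is the attention-weighted average of the targets
    return weights @ y_train

# Toy data: noiseless samples from a sine curve
x_train = np.linspace(0, 5, 50)
y_train = np.sin(x_train)
preds = nadaraya_watson(np.array([1.0, 2.5]), x_train, y_train)
```

The normalized kernel weights play the role of attention weights: each query attends to nearby training points, which is how the book motivates attention pooling.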
We have added PyTorch implementations up to Chapter 11 (Optimization Algorithms). Chapters 1-7 and Chapter 11 have also been adapted to TensorFlow.
The following chapters have been significantly improved for v1.0: