Among the most widely used machine learning (ML) technologies today is the open-source PyTorch framework.
PyTorch got its start at Facebook (now known as Meta) in 2016, with the 1.0 release debuting in 2018. In September 2022, Meta moved the PyTorch project to the new PyTorch Foundation, which is operated by the Linux Foundation. Today, PyTorch developers took the next major step forward for PyTorch, announcing the first experimental release of PyTorch 2.0. The new release promises to help accelerate ML training and development, while maintaining backward compatibility with existing PyTorch application code.
“We added an additional feature called `torch.compile` that users have to newly insert into their codebases,” Soumith Chintala, lead maintainer of PyTorch, told VentureBeat. “We are calling it 2.0 because we think users will find it a significant new addition to the experience.”
The new compiler in PyTorch that makes all the difference for ML
There have been discussions in the past about when the PyTorch project should call a new release 2.0.
In 2021, for example, there was a brief discussion on whether PyTorch 1.10 should be labeled as a 2.0 release. Chintala said that PyTorch 1.10 didn’t have enough fundamental changes from 1.9 to warrant a major number upgrade to 2.0.
The most recent generally available release of PyTorch is version 1.13, which came out at the end of October. A key feature in that release came from an IBM code contribution enabling the machine learning framework to work more effectively with commodity Ethernet-based networking for large-scale workloads.
Chintala emphasized that now is the right time for PyTorch 2.0 because the project is introducing a new paradigm in the PyTorch user experience, called torch.compile, that delivers speedups that weren’t possible in the default eager mode of PyTorch 1.0.
He explained that across the roughly 160 open-source models on which the PyTorch project validated early builds of 2.0, there was a 43% speedup, and the models worked reliably with the one-line addition to the codebase.
“We expect that with PyTorch 2, people will change the way they use PyTorch day-to-day,” Chintala said.
He said that with PyTorch 2.0, developers will start their experiments with eager mode and, once they get to training their models for long periods, activate compiled mode for additional performance.
“Data scientists will be able to do with PyTorch 2.x the same things that they did with 1.x, but they can do them faster and at a larger scale,” Chintala said. “If your model was training over 5 days, and with 2.x’s compiled mode it now trains in 2.5 days, then you can iterate on more ideas with this added time, or build a bigger model that trains within the same 5 days.”
More Python coming to PyTorch 2.x
PyTorch gets the first part of its name (Py) from the open-source Python programming language that is widely used in data science.
Modern PyTorch releases, however, haven’t been entirely written in Python — as parts of the framework are now written in the C++ programming language.
“Over the years, we’ve moved many parts of torch.nn from Python into C++ to squeeze that last-mile performance,” Chintala said.
Chintala said that within the later 2.x series (but not in 2.0), the PyTorch project expects to move code related to torch.nn back into Python. He noted that C++ is typically faster than Python, but the new compiler (torch.compile) ends up being faster than running the equivalent code in C++.
“Moving these parts back to Python improves hackability and lowers the barrier for code contributions,” Chintala said.
Work on PyTorch 2.0 will be ongoing for the next several months, with general availability not expected until March 2023. Alongside the development effort is the transition for PyTorch from being governed and operated by Meta to being its own independent effort.
“It is early days for the PyTorch Foundation, and you will hear more over a longer time horizon,” Chintala said. “The foundation is in the process of executing various handoffs and establishing goals.”