
PyTorch 1.8 Release, including Compiler and Distributed Training updates, and New Mobile Tutorials

By: Leadtek AI Expert


The PyTorch 1.8 release brings a host of new and updated APIs, ranging from additional NumPy-compatible APIs to support for improving and scaling your code for performance at both inference and training time.
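For example, the new torch.linalg module (beta in 1.8) is modeled after NumPy's np.linalg. Below is a minimal sketch; the matrix shapes and values are illustrative.

import torch

a = torch.randn(3, 3)

# NumPy-style linear algebra via the torch.linalg module
frob = torch.linalg.norm(a)   # like np.linalg.norm
det = torch.linalg.det(a)     # like np.linalg.det

# eigendecomposition of a symmetric matrix, like np.linalg.eigh
sym = a @ a.T
eigenvalues, eigenvectors = torch.linalg.eigh(sym)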

In addition, the distributed RPC framework included in PyTorch 1.8 provides mechanisms for multi-machine model training through a set of primitives that allow for remote communication, and a higher-level API to automatically differentiate models split across several machines.
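As an illustration, here is a minimal two-process sketch of a remote call with the RPC primitives; the worker names, master address, and port are assumptions for a single-machine demo.

import os
import torch
import torch.distributed.rpc as rpc

def add_tensors(x, y):
    # ordinary Python function, executed on whichever worker receives the call
    return x + y

def run(rank, world_size):
    # assumed single-host rendezvous; the address and port are illustrative
    os.environ["MASTER_ADDR"] = "localhost"
    os.environ["MASTER_PORT"] = "29500"
    rpc.init_rpc(f"worker{rank}", rank=rank, world_size=world_size)
    if rank == 0:
        # synchronously run add_tensors on worker1 and ship the result back
        result = rpc.rpc_sync("worker1", add_tensors,
                              args=(torch.ones(2), torch.ones(2)))
        print(result)  # tensor([2., 2.])
    rpc.shutdown()  # waits for all outstanding work before exiting

if __name__ == "__main__":
    torch.multiprocessing.spawn(run, args=(2,), nprocs=2)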

The distributed RPC framework makes it easy to run functions remotely, supports referencing remote objects without copying the underlying data around, and provides autograd and optimizer APIs to transparently run backward passes and update parameters across RPC boundaries. These features can be categorized into four sets of APIs, shown working together in the sketch after the list below.

1. Remote Procedure Call (RPC)

2. Remote Reference (RRef)

3. Distributed Autograd

4. Distributed Optimizer
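As a hedged sketch of how these four API sets combine, the snippet below assumes an RPC setup like the two-worker example above and runs on worker0; the worker names and the tiny linear model are illustrative.

import torch
import torch.distributed.rpc as rpc
import torch.distributed.autograd as dist_autograd
from torch.distributed.optim import DistributedOptimizer

# (2) RRef: a handle to a module that lives on worker1; no data is copied here
remote_linear = rpc.remote("worker1", torch.nn.Linear, args=(4, 4))

def forward_on_owner(module_rref, x):
    # runs on the RRef's owner; local_value() is the actual nn.Linear
    return module_rref.local_value()(x)

def parameter_rrefs(module_rref):
    # wrap each remote parameter in an RRef so the optimizer can reach it
    return [rpc.RRef(p) for p in module_rref.local_value().parameters()]

with dist_autograd.context() as context_id:
    # (1) RPC: execute the forward pass on worker1
    out = rpc.rpc_sync("worker1", forward_on_owner,
                       args=(remote_linear, torch.randn(2, 4)))
    loss = out.sum()

    # (3) Distributed autograd: backward propagates across the RPC boundary
    dist_autograd.backward(context_id, [loss])

    # (4) Distributed optimizer: update the parameters that live on worker1
    param_rrefs = rpc.rpc_sync("worker1", parameter_rrefs, args=(remote_linear,))
    opt = DistributedOptimizer(torch.optim.SGD, param_rrefs, lr=0.05)
    opt.step(context_id)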


Learn More Here

https://pytorch.org/blog/pytorch-1.8-released/

https://pytorch.org/docs/stable/rpc.html

https://pytorch.org/docs/master/rpc.html#remotemodule




