Software:PyTorch

Short description: Open source machine learning library
PyTorch
Original author(s):
  • Adam Paszke
  • Sam Gross
  • Soumith Chintala
  • Gregory Chanan
Developer(s): Meta AI
Initial release: September 2016[1]
Repository: github.com/pytorch/pytorch
Written in: Python, C++, CUDA
Operating system: Linux, macOS, Windows
Platform: IA-32, x86-64, ARM64
Available in: English
Type: Library for machine learning and deep learning
License: BSD-3[2]
Website: pytorch.org

PyTorch is a machine learning framework based on the Torch library,[3][4][5] used for applications such as computer vision and natural language processing,[6] originally developed by Meta AI and now part of the Linux Foundation umbrella.[7][8][9][10] It is free and open-source software released under the modified BSD license. Although the Python interface is more polished and the primary focus of development, PyTorch also has a C++ interface.[11]

A number of pieces of deep learning software are built on top of PyTorch, including Tesla Autopilot,[12] Uber's Pyro,[13] Hugging Face's Transformers,[14] PyTorch Lightning,[15][16] and Catalyst.[17][18]

PyTorch provides two high-level features:[19]

  • Tensor computing (like NumPy) with strong acceleration via graphics processing units (GPUs)
  • Deep neural networks built on a tape-based automatic differentiation system
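
As a brief illustration of the second feature, the following sketch uses the automatic differentiation system to compute a gradient; the function and values are arbitrary:

import torch

# A tensor that participates in gradient tracking
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# A scalar function of x: y = sum(x**2)
y = (x ** 2).sum()

# Backpropagation populates x.grad with dy/dx = 2*x
y.backward()

print(x.grad)
# Output: tensor([2., 4., 6.])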

History

Meta (formerly known as Facebook) operated both PyTorch and Convolutional Architecture for Fast Feature Embedding (Caffe2), but models defined by the two frameworks were mutually incompatible. The Open Neural Network Exchange (ONNX) project was created by Meta and Microsoft in September 2017 for converting models between frameworks. Caffe2 was merged into PyTorch at the end of March 2018.[20] In September 2022, Meta announced that PyTorch would be governed by the PyTorch Foundation, a newly created independent organization and a subsidiary of the Linux Foundation.[21]

PyTorch 2.0 was released on 15 March 2023.[22]

PyTorch tensors

Main page: Tensor (machine learning)

PyTorch defines a class called Tensor (torch.Tensor) to store and operate on homogeneous multidimensional rectangular arrays of numbers. PyTorch tensors are similar to NumPy arrays, but can also be stored and operated on by a CUDA-capable NVIDIA GPU. PyTorch has also been developing support for other GPU platforms, for example, AMD's ROCm and Apple's Metal framework.[23]

PyTorch supports various sub-types of Tensors.[24]
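
The following sketch, with arbitrary values, shows tensors being created with specific dtypes and moved between devices; the GPU branch assumes a CUDA-capable device is available:

import torch

# Create a 2x3 tensor of 32-bit floats on the CPU
x = torch.ones(2, 3, dtype=torch.float32)

# Create a 64-bit integer tensor, one of several supported dtypes
i = torch.tensor([[1, 2], [3, 4]], dtype=torch.int64)

# Move the float tensor to a CUDA GPU, if one is available
if torch.cuda.is_available():
    x = x.to("cuda")

# Convert back to a NumPy array (requires a CPU tensor)
n = x.cpu().numpy()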

Differences from physics "tensors"

A tensor in physics is similar to a PyTorch tensor in that it is essentially a multidimensional array. The one feature of a physicist's tensor that a PyTorch tensor lacks is that some of the indices are written as subscripts and others as superscripts, as in [math]\displaystyle{ {{t_{ij}^k}_{l}}^{m} }[/math]. The pair (number of superscripts, number of subscripts) is called the type of the tensor; the tensor [math]\displaystyle{ t }[/math] above, for example, has type (2,3). A superscripted index is called a contravariant index, and a subscripted index is called a covariant index. A physicist's tensor supports four fundamental operations:

  1. Addition
  2. Tensor product
  3. Contraction
  4. Basis change

Of these, only the basis change operation is affected by the co/contravariance distinction. The other three operations are unaffected by it and are therefore easy to implement for PyTorch tensors, as sketched below. Thus, whether a PyTorch "tensor" deserves to be called a tensor is a matter of opinion.
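
A minimal sketch of the first three operations on PyTorch tensors, using torch.einsum for the tensor product and contraction; the shapes and values are arbitrary:

import torch

a = torch.randn(3, 3)
b = torch.randn(3, 3)

# 1. Addition: elementwise, requires matching shapes
s = a + b

# 2. Tensor product: combining all index pairs gives a
#    tensor of shape (3, 3, 3, 3)
p = torch.einsum('ij,kl->ijkl', a, b)

# 3. Contraction: summing over a repeated index; contracting
#    both indices of a yields its trace, a scalar
t = torch.einsum('ii->', a)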

A mathematician's tensor is also subtly different from a physicist's tensor, but the difference is so minor that it has no effect on applications. For the sake of completeness: for a mathematician, a tensor of type (m,n) over a vector space [math]\displaystyle{ V }[/math] is an element of the vector space [math]\displaystyle{ \operatorname{Hom}(\underbrace{V \otimes \dotsb \otimes V}_n, \underbrace{V \otimes \dotsb \otimes V}_m) }[/math]. The first three of the four operations above can then be expressed in a basis-independent way, making the fourth unnecessary. A physicist's tensor results from picking a basis for [math]\displaystyle{ V }[/math], which turns a mathematician's tensor into a multidimensional array; since the choice of basis is arbitrary, a basis change operation is needed to eliminate this arbitrariness, and that is exactly operation 4 above.
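
As an illustration of operation 4: once a basis is chosen, a type-(1,1) tensor is just a matrix, and a basis change acts on it by conjugation. A minimal sketch, with an arbitrarily chosen invertible basis change matrix:

import torch

# A type-(1,1) tensor, represented as a matrix in some basis
t = torch.randn(3, 3)

# An invertible basis change matrix (columns are the new
# basis vectors expressed in the old basis)
p = torch.tensor([[1., 1., 0.],
                  [0., 1., 0.],
                  [0., 0., 2.]])

# The same tensor expressed in the new basis: P^-1 t P
t_new = torch.linalg.inv(p) @ t @ p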

PyTorch neural networks


PyTorch defines a module called nn (torch.nn) to describe neural networks and to support training.

Example

The following program shows the low-level functionality of the library with a simple example:

import torch
dtype = torch.float
device = torch.device("cpu") # This executes all calculations on the CPU
# device = torch.device("cuda:0") # This executes all calculations on the GPU

# Create a tensor and fill it with random numbers
a = torch.randn(2, 3, device=device, dtype=dtype)
print(a) # Output of tensor A
# Output: tensor([[-1.1884,  0.8498, -1.7129],
#                  [-0.8816,  0.1944,  0.5847]])

# Create a tensor and fill it with random numbers
b = torch.randn(2, 3, device=device, dtype=dtype)
print(b) # Output of tensor B
# Output: tensor([[ 0.7178, -0.8453, -1.3403],
#                  [ 1.3262,  1.1512, -1.7070]])

print(a*b) # Output of a multiplication of the two tensors
# Output: tensor([[-0.8530, -0.7183,  2.2958],
#                  [-1.1692,  0.2238, -0.9981]])

print(a.sum()) # Output of the sum of all elements in tensor A
# Output: tensor(-2.1540)

print(a[1,2]) # Output of the element in the third column of the second row (zero based)
# Output: tensor(0.5847)

print(a.max()) # Output of the maximum value in tensor A
# Output: tensor(0.8498)

The following code block shows an example of the higher-level functionality provided by the nn module. A neural network with linear layers is defined in the example.

import torch

from torch import nn # Import the nn sub-module from PyTorch

class NeuralNetwork(nn.Module): # Neural networks are defined as classes

    def __init__(self): # Layers and variables are defined in the __init__ method
        super().__init__() # Must be called in every network
        self.flatten = nn.Flatten() # Defining a flattening layer
        self.linear_relu_stack = nn.Sequential( # Defining a stack of layers
            nn.Linear(28*28, 512), # Linear layers have an input and an output size
            nn.ReLU(), # ReLU is one of many activation functions provided by nn
            nn.Linear(512, 512),
            nn.ReLU(),
            nn.Linear(512, 10),
        )

    def forward(self, x): # This method defines the forward pass
        x = self.flatten(x)
        logits = self.linear_relu_stack(x)
        return logits
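
A short usage sketch, assuming 28x28 inputs such as MNIST-style images; the batch below is random data rather than a real dataset, and the training step is a minimal illustration:

model = NeuralNetwork()

# A batch of four random 28x28 "images"
x = torch.randn(4, 28, 28)

logits = model(x) # Forward pass; logits has shape (4, 10)

# One illustrative training step with a loss and an optimizer
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

y = torch.randint(0, 10, (4,)) # Random class labels
loss = loss_fn(logits, y)

optimizer.zero_grad() # Clear old gradients
loss.backward() # Compute gradients via autograd
optimizer.step() # Update the network's parameters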

References

  1. Chintala, Soumith (1 September 2016). "PyTorch Alpha-1 release". https://github.com/pytorch/pytorch/releases/tag/v0.1.1. 
  2. Claburn, Thomas (12 September 2022). "PyTorch gets lit under The Linux Foundation". The Register. https://www.theregister.com/2022/09/12/pytorch_meta_linux_foundation/. 
  3. Yegulalp, Serdar (19 January 2017). "Facebook brings GPU-powered machine learning to Python". InfoWorld. https://www.infoworld.com/article/3159120/artificial-intelligence/facebook-brings-gpu-powered-machine-learning-to-python.html. 
  4. Lorica, Ben (3 August 2017). "Why AI and machine learning researchers are beginning to embrace PyTorch". O'Reilly Media. https://www.oreilly.com/ideas/why-ai-and-machine-learning-researchers-are-beginning-to-embrace-pytorch. 
  5. Ketkar, Nikhil (2017). "Introduction to PyTorch" (in en). Deep Learning with Python. Apress, Berkeley, CA. pp. 195–208. doi:10.1007/978-1-4842-2766-4_12. ISBN 9781484227657. 
  6. "Natural Language Processing (NLP) with PyTorch – NLP with PyTorch documentation" (in en). http://dl4nlp.info/en/latest/. 
  7. Patel, Mo (2017-12-07). "When two trends fuse: PyTorch and recommender systems" (in en). O'Reilly Media. https://www.oreilly.com/ideas/when-two-trends-fuse-pytorch-and-recommender-systems. 
  8. Mannes, John. "Facebook and Microsoft collaborate to simplify conversions from PyTorch to Caffe2" (in en). TechCrunch. https://techcrunch.com/2017/09/07/facebook-and-microsoft-collaborate-to-simplify-conversions-from-pytorch-to-caffe2/. "FAIR is accustomed to working with PyTorch – a deep learning framework optimized for achieving state of the art results in research, regardless of resource constraints. Unfortunately in the real world, most of us are limited by the computational capabilities of our smartphones and computers." 
  9. Arakelyan, Sophia (2017-11-29). "Tech giants are using open source frameworks to dominate the AI community" (in en-US). https://venturebeat.com/2017/11/29/tech-giants-are-using-open-source-frameworks-to-dominate-the-ai-community/. 
  10. "PyTorch strengthens its governance by joining the Linux Foundation" (in en). https://pytorch.org/blog/PyTorchfoundation/. 
  11. "The C++ Frontend". https://pytorch.org/cppdocs/frontend.html. 
  12. Karpathy, Andrej. "PyTorch at Tesla - Andrej Karpathy, Tesla". https://www.youtube.com/watch?v=oBklltKXtDE. 
  13. "Uber AI Labs Open Sources Pyro, a Deep Probabilistic Programming Language" (in en-US). Uber Engineering Blog. 2017-11-03. https://eng.uber.com/pyro/. 
  14. PYTORCH-TRANSFORMERS: PyTorch implementations of popular NLP Transformers, PyTorch Hub, 2019-12-01, https://pytorch.org/hub/huggingface_pytorch-transformers/, retrieved 2019-12-01 
  15. PYTORCH-Lightning: The lightweight PyTorch wrapper for ML researchers. Scale your models. Write less boilerplate, Lightning-Team, 2020-06-18, https://github.com/PyTorchLightning/pytorch-lightning/, retrieved 2020-06-18 
  16. "Ecosystem Tools" (in en). https://pytorch.org/ecosystem/. 
  17. GitHub - catalyst-team/catalyst: Accelerated DL & RL, Catalyst-Team, 2019-12-05, https://github.com/catalyst-team/catalyst, retrieved 2019-12-05 
  18. "Ecosystem Tools" (in en). https://pytorch.org/ecosystem/. 
  19. "PyTorch – About". https://pytorch.org/about/. 
  20. "Caffe2 Merges With PyTorch". 2018-04-02. https://medium.com/@Synced/caffe2-merges-with-pytorch-a89c70ad9eb7. 
  21. Edwards, Benj (2022-09-12). "Meta spins off PyTorch Foundation to make AI framework vendor neutral". https://arstechnica.com/information-technology/2022/09/meta-spins-off-pytorch-foundation-to-make-ai-framework-vendor-neutral/. 
  22. "PyTorch 2.0 brings new fire to open-source machine learning". VentureBeat. 15 March 2023. https://venturebeat.com/ai/pytorch-2-0-brings-new-fire-to-open-source-machine-learning/. 
  23. "Introducing Accelerated PyTorch Training on Mac" (in en). https://pytorch.org/blog/introducing-accelerated-pytorch-training-on-mac/. 
  24. "An Introduction to PyTorch – A Simple yet Powerful Deep Learning Library". 2018-02-22. https://www.analyticsvidhya.com/blog/2018/02/pytorch-tutorial/. 
