
Commit 33f7cf1

Generate Python docs from pytorch/pytorch@85df746
1 parent 1a11123 commit 33f7cf1

File tree

2,263 files changed (+12455, -12766 lines)


main/.buildinfo

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
 # Sphinx build info version 1
 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: 344e96a845f17de0d62bb1f2ca2bf2f5
+config: 61a61cd8dca8de4bbcf802434e40be6b
 tags: 645f666f9bcd5a90fca523b33c5a78b7
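The only substantive change in .buildinfo is the config hash: Sphinx stores a fingerprint of the build configuration here and does a full rebuild when it no longer matches, which is why a routine docs regeneration like this one touches so many files. A minimal sketch of how such a fingerprint could be computed (the exact fields and serialization Sphinx hashes are an assumption here, not taken from this commit):

    import hashlib

    def config_fingerprint(config: dict) -> str:
        # Hash a stable textual form of the configuration; any change in the
        # effective config yields a new digest and forces a full rebuild.
        payload = "\n".join(f"{key}={config[key]!r}" for key in sorted(config))
        return hashlib.md5(payload.encode("utf-8")).hexdigest()

    # Hypothetical config values, for illustration only.
    old_cfg = {"html_theme": "pytorch_sphinx_theme", "release": "2.4"}
    new_cfg = {"html_theme": "pytorch_sphinx_theme", "release": "2.5"}
    print(config_fingerprint(old_cfg) == config_fingerprint(new_cfg))  # False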

main/_images/RReLU.png

157 Bytes

main/_images/ReduceLROnPlateau.png

52 Bytes

main/_sources/community/persons_of_interest.rst.txt

Lines changed: 3 additions & 3 deletions
@@ -349,9 +349,9 @@ XLA
 TorchServe
 ~~~~~~~~~~
 
-- Li Ning (`lxning <https://github.com/lxning>`__)
-- Ankith Gunapal (`agunapal <https://github.com/agunapal>`__)
-- Hamid Shojanazeri (`HamidShojanazeri <https://github.com/HamidShojanazeri>`__)
+- (emeritus) Li Ning (`lxning <https://github.com/lxning>`__)
+- (emeritus) Ankith Gunapal (`agunapal <https://github.com/agunapal>`__)
+- (emeritus) Hamid Shojanazeri (`HamidShojanazeri <https://github.com/HamidShojanazeri>`__)
 - (emeritus) Mark Saroufim (`msaroufIm <https://github.com/msaroufIm>`__)
 - (emeritus) Manoj Rao (`mycpuorg <https://github.com/mycpuorg>`__)
 - (emeritus) Vamshi Dantu (`vdantu <https://github.com/vdantu>`__)

main/_sources/generated/exportdb/index.rst.txt

Lines changed: 8 additions & 8 deletions
@@ -19,8 +19,8 @@ support in export please create an issue in the pytorch/pytorch repo with a modu
 :caption: Tags
 
 torch.escape-hatch
-torch.cond
 torch.dynamic-shape
+torch.cond
 python.closure
 torch.dynamic-value
 python.data-structure
@@ -233,7 +233,7 @@ cond_branch_class_method
 
 .. note::
 
-Tags: :doc:`torch.cond <torch.cond>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.cond <torch.cond>`
 
 Support Level: SUPPORTED
 
@@ -315,7 +315,7 @@ cond_branch_nested_function
 
 .. note::
 
-Tags: :doc:`torch.cond <torch.cond>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.cond <torch.cond>`
 
 Support Level: SUPPORTED
 
@@ -394,7 +394,7 @@ cond_branch_nonlocal_variables
 
 .. note::
 
-Tags: :doc:`torch.cond <torch.cond>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.cond <torch.cond>`
 
 Support Level: SUPPORTED
 
@@ -574,7 +574,7 @@ cond_operands
 
 .. note::
 
-Tags: :doc:`torch.cond <torch.cond>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.cond <torch.cond>`
 
 Support Level: SUPPORTED
 
@@ -666,7 +666,7 @@ cond_predicate
 
 .. note::
 
-Tags: :doc:`torch.cond <torch.cond>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.cond <torch.cond>`
 
 Support Level: SUPPORTED
 
@@ -1162,7 +1162,7 @@ dynamic_shape_map
 
 .. note::
 
-Tags: :doc:`torch.map <torch.map>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.map <torch.map>`
 
 Support Level: SUPPORTED
 
@@ -2157,7 +2157,7 @@ dynamic_shape_round
 
 .. note::
 
-Tags: :doc:`python.builtin <python.builtin>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`python.builtin <python.builtin>`
 
 Support Level: NOT_SUPPORTED_YET
 

main/_sources/generated/exportdb/python.builtin.rst.txt

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@ dynamic_shape_round
 
 .. note::
 
-Tags: :doc:`python.builtin <python.builtin>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`python.builtin <python.builtin>`
 
 Support Level: NOT_SUPPORTED_YET
 

main/_sources/generated/exportdb/torch.cond.rst.txt

Lines changed: 5 additions & 5 deletions
@@ -5,7 +5,7 @@ cond_branch_class_method
 
 .. note::
 
-Tags: :doc:`torch.cond <torch.cond>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.cond <torch.cond>`
 
 Support Level: SUPPORTED
 
@@ -87,7 +87,7 @@ cond_branch_nested_function
 
 .. note::
 
-Tags: :doc:`torch.cond <torch.cond>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.cond <torch.cond>`
 
 Support Level: SUPPORTED
 
@@ -166,7 +166,7 @@ cond_branch_nonlocal_variables
 
 .. note::
 
-Tags: :doc:`torch.cond <torch.cond>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.cond <torch.cond>`
 
 Support Level: SUPPORTED
 
@@ -346,7 +346,7 @@ cond_operands
 
 .. note::
 
-Tags: :doc:`torch.cond <torch.cond>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.cond <torch.cond>`
 
 Support Level: SUPPORTED
 
@@ -438,7 +438,7 @@ cond_predicate
 
 .. note::
 
-Tags: :doc:`torch.cond <torch.cond>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.cond <torch.cond>`
 
 Support Level: SUPPORTED
 

main/_sources/generated/exportdb/torch.dynamic-shape.rst.txt

Lines changed: 7 additions & 7 deletions
@@ -5,7 +5,7 @@ cond_branch_class_method
 
 .. note::
 
-Tags: :doc:`torch.cond <torch.cond>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.cond <torch.cond>`
 
 Support Level: SUPPORTED
 
@@ -87,7 +87,7 @@ cond_branch_nested_function
 
 .. note::
 
-Tags: :doc:`torch.cond <torch.cond>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.cond <torch.cond>`
 
 Support Level: SUPPORTED
 
@@ -166,7 +166,7 @@ cond_branch_nonlocal_variables
 
 .. note::
 
-Tags: :doc:`torch.cond <torch.cond>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.cond <torch.cond>`
 
 Support Level: SUPPORTED
 
@@ -270,7 +270,7 @@ cond_operands
 
 .. note::
 
-Tags: :doc:`torch.cond <torch.cond>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.cond <torch.cond>`
 
 Support Level: SUPPORTED
 
@@ -362,7 +362,7 @@ cond_predicate
 
 .. note::
 
-Tags: :doc:`torch.cond <torch.cond>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.cond <torch.cond>`
 
 Support Level: SUPPORTED
 
@@ -535,7 +535,7 @@ dynamic_shape_map
 
 .. note::
 
-Tags: :doc:`torch.map <torch.map>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.map <torch.map>`
 
 Support Level: SUPPORTED
 
@@ -600,7 +600,7 @@ dynamic_shape_round
 
 .. note::
 
-Tags: :doc:`python.builtin <python.builtin>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`python.builtin <python.builtin>`
 
 Support Level: NOT_SUPPORTED_YET
 

main/_sources/generated/exportdb/torch.map.rst.txt

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@ dynamic_shape_map
 
 .. note::
 
-Tags: :doc:`torch.map <torch.map>`, :doc:`torch.dynamic-shape <torch.dynamic-shape>`
+Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`torch.map <torch.map>`
 
 Support Level: SUPPORTED
 
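Every exportdb page above shows the same mechanical edit: the ``Tags:`` line now lists ``torch.dynamic-shape`` before ``torch.cond``, ``torch.map``, or ``python.builtin``. Nothing about the cases changed, only the order in which the generator emitted the tags. A minimal sketch of how that kind of reordering can happen when tags live in an unordered collection (the helper below is hypothetical, not the actual exportdb generator):

    # Hypothetical renderer for the "Tags:" line of an exportdb case page.
    def render_tags_line(tags) -> str:
        return "Tags: " + ", ".join(f":doc:`{t} <{t}>`" for t in tags)

    case_tags = {"torch.cond", "torch.dynamic-shape"}  # a set has no guaranteed order
    print(render_tags_line(case_tags))                 # order can differ between builds

    # Emitting from a sorted list (or any fixed ordering) keeps the output stable:
    print(render_tags_line(sorted(case_tags)))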

main/_sources/notes/serialization.rst.txt

Lines changed: 0 additions & 166 deletions
@@ -339,172 +339,6 @@ if one does not have access to the ``torch.load`` callsites.
 if ``weights_only`` was not passed as an argument.
 
 
-.. _serializing-python-modules:
-
-Serializing torch.nn.Modules and loading them in C++
------------------------------------------------------
-
-See also: `Tutorial: Loading a TorchScript Model in C++ <https://pytorch.org/tutorials/advanced/cpp_export.html>`_
-
-ScriptModules can be serialized as a TorchScript program and loaded
-using :func:`torch.jit.load`.
-This serialization encodes all the modules’ methods, submodules, parameters,
-and attributes, and it allows the serialized program to be loaded in C++
-(i.e. without Python).
-
-The distinction between :func:`torch.jit.save` and :func:`torch.save` may not
-be immediately clear. :func:`torch.save` saves Python objects with pickle.
-This is especially useful for prototyping, researching, and training.
-:func:`torch.jit.save`, on the other hand, serializes ScriptModules to a format
-that can be loaded in Python or C++. This is useful when saving and loading C++
-modules or for running modules trained in Python with C++, a common practice
-when deploying PyTorch models.
-
-To script, serialize and load a module in Python:
-
-::
-
-    >>> scripted_module = torch.jit.script(MyModule())
-    >>> torch.jit.save(scripted_module, 'mymodule.pt')
-    >>> torch.jit.load('mymodule.pt')
-    RecursiveScriptModule( original_name=MyModule
-    (l0): RecursiveScriptModule(original_name=Linear)
-    (l1): RecursiveScriptModule(original_name=Linear) )
-
-
-Traced modules can also be saved with :func:`torch.jit.save`, with the caveat
-that only the traced code path is serialized. The following example demonstrates
-this:
-
-::
-
-    # A module with control flow
-    >>> class ControlFlowModule(torch.nn.Module):
-        def __init__(self):
-            super().__init__()
-            self.l0 = torch.nn.Linear(4, 2)
-            self.l1 = torch.nn.Linear(2, 1)
-
-        def forward(self, input):
-            if input.dim() > 1:
-                return torch.tensor(0)
-
-            out0 = self.l0(input)
-            out0_relu = torch.nn.functional.relu(out0)
-            return self.l1(out0_relu)
-
-    >>> traced_module = torch.jit.trace(ControlFlowModule(), torch.randn(4))
-    >>> torch.jit.save(traced_module, 'controlflowmodule_traced.pt')
-    >>> loaded = torch.jit.load('controlflowmodule_traced.pt')
-    >>> loaded(torch.randn(2, 4)))
-    tensor([[-0.1571], [-0.3793]], grad_fn=<AddBackward0>)
-
-    >>> scripted_module = torch.jit.script(ControlFlowModule(), torch.randn(4))
-    >>> torch.jit.save(scripted_module, 'controlflowmodule_scripted.pt')
-    >>> loaded = torch.jit.load('controlflowmodule_scripted.pt')
-    >> loaded(torch.randn(2, 4))
-    tensor(0)
-
-The above module has an if statement that is not triggered by the traced inputs,
-and so is not part of the traced module and not serialized with it.
-The scripted module, however, contains the if statement and is serialized with it.
-See the `TorchScript documentation <https://pytorch.org/docs/stable/jit.html>`_
-for more on scripting and tracing.
-
-Finally, to load the module in C++:
-
-::
-
-    >>> torch::jit::script::Module module;
-    >>> module = torch::jit::load('controlflowmodule_scripted.pt');
-
-See the `PyTorch C++ API documentation <https://pytorch.org/cppdocs/>`_
-for details about how to use PyTorch modules in C++.
-
-.. _saving-loading-across-versions:
-
-Saving and loading ScriptModules across PyTorch versions
------------------------------------------------------------
-
-The PyTorch Team recommends saving and loading modules with the same version of
-PyTorch. Older versions of PyTorch may not support newer modules, and newer
-versions may have removed or modified older behavior. These changes are
-explicitly described in
-PyTorch’s `release notes <https://github.com/pytorch/pytorch/releases>`_,
-and modules relying on functionality that has changed may need to be updated
-to continue working properly. In limited cases, detailed below, PyTorch will
-preserve the historic behavior of serialized ScriptModules so they do not require
-an update.
-
-torch.div performing integer division
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-In PyTorch 1.5 and earlier :func:`torch.div` would perform floor division when
-given two integer inputs:
-
-::
-
-    # PyTorch 1.5 (and earlier)
-    >>> a = torch.tensor(5)
-    >>> b = torch.tensor(3)
-    >>> a / b
-    tensor(1)
-
-In PyTorch 1.7, however, :func:`torch.div` will always perform a true division
-of its inputs, just like division in Python 3:
-
-::
-
-    # PyTorch 1.7
-    >>> a = torch.tensor(5)
-    >>> b = torch.tensor(3)
-    >>> a / b
-    tensor(1.6667)
-
-The behavior of :func:`torch.div` is preserved in serialized ScriptModules.
-That is, ScriptModules serialized with versions of PyTorch before 1.6 will continue
-to see :func:`torch.div` perform floor division when given two integer inputs
-even when loaded with newer versions of PyTorch. ScriptModules using :func:`torch.div`
-and serialized on PyTorch 1.6 and later cannot be loaded in earlier versions of
-PyTorch, however, since those earlier versions do not understand the new behavior.
-
-torch.full always inferring a float dtype
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-In PyTorch 1.5 and earlier :func:`torch.full` always returned a float tensor,
-regardless of the fill value it’s given:
-
-::
-
-    # PyTorch 1.5 and earlier
-    >>> torch.full((3,), 1) # Note the integer fill value...
-    tensor([1., 1., 1.]) # ...but float tensor!
-
-In PyTorch 1.7, however, :func:`torch.full` will infer the returned tensor’s
-dtype from the fill value:
-
-::
-
-    # PyTorch 1.7
-    >>> torch.full((3,), 1)
-    tensor([1, 1, 1])
-
-    >>> torch.full((3,), True)
-    tensor([True, True, True])
-
-    >>> torch.full((3,), 1.)
-    tensor([1., 1., 1.])
-
-    >>> torch.full((3,), 1 + 1j)
-    tensor([1.+1.j, 1.+1.j, 1.+1.j])
-
-The behavior of :func:`torch.full` is preserved in serialized ScriptModules. That is,
-ScriptModules serialized with versions of PyTorch before 1.6 will continue to see
-torch.full return float tensors by default, even when given bool or
-integer fill values. ScriptModules using :func:`torch.full` and serialized on PyTorch 1.6
-and later cannot be loaded in earlier versions of PyTorch, however, since those
-earlier versions do not understand the new behavior.
-
 .. _utility functions:
 
 Utility functions
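The surviving context in the hunk above sits in the discussion of ``torch.load`` and its ``weights_only`` argument, which remains in the serialization notes. For orientation, a minimal usage sketch of that argument (the file name and module below are placeholders, not taken from this commit):

    import torch

    # Save only the state_dict (plain tensors and containers), then load it back
    # with weights_only=True, which restricts unpickling to tensor data rather
    # than arbitrary Python objects.
    model = torch.nn.Linear(4, 2)
    torch.save(model.state_dict(), "linear.pt")

    state = torch.load("linear.pt", weights_only=True)
    model.load_state_dict(state)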
