# LeNet

> [Backpropagation Applied to Handwritten Zip Code Recognition](https://doi.org/10.1162/neco.1989.1.4.541)

## Abstract

The ability of learning networks to generalize can be greatly enhanced by providing constraints from the task domain. This paper demonstrates how such constraints can be integrated into a backpropagation network through the architecture of the network. This approach has been successfully applied to the recognition of handwritten zip code digits provided by the U.S. Postal Service. A single network learns the entire recognition operation, going from the normalized image of the character to the final classification.
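The architectural constraint described above is convolutional weight sharing: the same small set of weights is applied at every spatial position of the input. Below is a minimal, illustrative PyTorch sketch of a LeNet-5-style classifier for 28×28 MNIST digits. It is not the exact model defined in `lenet5_mnist.py`; the layer sizes and activations are assumptions that follow the commonly used LeNet-5 variant.

```python
import torch
import torch.nn as nn


class LeNet5(nn.Module):
    """LeNet-5-style network: convolutions share weights across spatial
    positions, encoding the task-domain constraint from the paper."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5),   # 28x28 -> 24x24
            nn.Tanh(),
            nn.AvgPool2d(kernel_size=2),      # 24x24 -> 12x12
            nn.Conv2d(6, 16, kernel_size=5),  # 12x12 -> 8x8
            nn.Tanh(),
            nn.AvgPool2d(kernel_size=2),      # 8x8 -> 4x4
        )
        self.classifier = nn.Sequential(
            nn.Linear(16 * 4 * 4, 120),
            nn.Tanh(),
            nn.Linear(120, 84),
            nn.Tanh(),
            nn.Linear(84, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = torch.flatten(x, 1)
        return self.classifier(x)


# Forward pass on a batch of normalized 28x28 grayscale digit images.
logits = LeNet5()(torch.randn(4, 1, 28, 28))
print(logits.shape)  # torch.Size([4, 10])
```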

## Citation

```bibtex
@ARTICLE{6795724,
  author={Y. {LeCun} and B. {Boser} and J. S. {Denker} and D. {Henderson} and R. E. {Howard} and W. {Hubbard} and L. D. {Jackel}},
  journal={Neural Computation},
  title={Backpropagation Applied to Handwritten Zip Code Recognition},
  year={1989},
  volume={1},
  number={4},
  pages={541-551},
  doi={10.1162/neco.1989.1.4.541}
}
```