Resources | Learn the basics of linear algebra in deep learning with Python and numpy


This article introduces a tutorial blog series by Hadrien Jean, a PhD candidate in Paris, which aims to help beginners and advanced beginners master the concepts of linear algebra that underlie deep learning and machine learning. Mastering these skills can improve your ability to understand and apply a variety of data science algorithms.

For beginners, the theoretical treatment in Deep Learning (Ian Goodfellow, Yoshua Bengio, Aaron Courville) may be somewhat brief. Building on the linear algebra of the book's second chapter, the author introduces the foundations of linear algebra for machine learning. Readers can consult the original book, its Chinese edition, or the Chinese-language notes for the basic introduction to each section, or refer directly to the derivations in the blog. In addition to detailed derivations of some concepts, the author adds several examples and provides Python/NumPy implementation code.

    • Blog Address: https://hadrienj.github.io/posts/Deep-Learning-Book-Series-Introduction/

    • GitHub Address: https://github.com/hadrienj/deepLearningBook-Notes

    • "Deep Learning" Chinese version: Https://github.com/exacity/deeplearningbook-chinese


"Deep Learning" chapter II catalogue.

The blog's table of contents.

Purely symbolic derivations can be too abstract, so in the blog the author generally lists concrete cases first and then gives the symbolic expression.

For example, colored arrays of numbers are used to explain the basic definitions:

The differences among scalars, vectors, matrices, and tensors.

Symbolic representation:

Then Python/NumPy sample code is given:

Building arrays with NumPy.
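The original post shows this as code screenshots; the following is a minimal sketch of what such code looks like (the variable names are illustrative, not taken from the blog):

    import numpy as np

    # A scalar is a single number.
    s = 5

    # A vector is a 1-D array of numbers.
    v = np.array([1, 2, 3])

    # A matrix is a 2-D array of numbers.
    A = np.array([[1, 2],
                  [3, 4],
                  [5, 6]])

    # A tensor generalizes this to three or more dimensions.
    T = np.zeros((2, 3, 4))

    print(v.shape)  # (3,)
    print(A.shape)  # (3, 2)
    print(T.shape)  # (2, 3, 4)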

For some operations, the author gives intuitive, easy-to-understand diagrams:

The unit circle and the ellipse obtained by transforming it with matrix A; the plotted vectors are the two eigenvectors of A.
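As a rough illustration of what this figure shows (the matrix below is an assumed example, not the one used in the blog), the unit circle can be sampled, transformed by A, and compared with the eigenvectors of A:

    import numpy as np

    # Illustrative symmetric matrix (not taken from the blog).
    A = np.array([[3.0, 1.0],
                  [1.0, 2.0]])

    # Sample points on the unit circle.
    theta = np.linspace(0, 2 * np.pi, 200)
    circle = np.vstack((np.cos(theta), np.sin(theta)))  # shape (2, 200)

    # Transforming the circle by A yields an ellipse.
    ellipse = A @ circle

    # For a symmetric A, the eigenvectors give the directions of the
    # ellipse's axes and the eigenvalues give their lengths.
    eigvals, eigvecs = np.linalg.eig(A)
    print(eigvals)
    print(eigvecs)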

For some more complex objects, the author also provides function visualizations and interactive interfaces, for example the quadratic form function f(x) = xᵀAx that arises in eigendecomposition.

Visualizations of its positive definite, negative definite, and indefinite cases:

The interactive interface for the positive definite quadratic form:
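The interactive plots cannot be reproduced in text, but a minimal sketch of the underlying computation, using example matrices chosen here for illustration rather than taken from the blog, looks like this:

    import numpy as np

    def quadratic_form(A, x):
        # Evaluate f(x) = x^T A x.
        return x.T @ A @ x

    # Illustrative matrices (not from the blog):
    A_pos = np.array([[2.0, 0.0], [0.0, 1.0]])     # positive definite: f(x) > 0 for all x != 0
    A_neg = -A_pos                                  # negative definite: f(x) < 0 for all x != 0
    A_indef = np.array([[1.0, 0.0], [0.0, -1.0]])   # indefinite: f(x) takes both signs

    x = np.array([1.0, 2.0])
    for name, A in [("positive definite", A_pos),
                    ("negative definite", A_neg),
                    ("indefinite", A_indef)]:
        print(name, quadratic_form(A, x))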

The last section, on PCA (principal component analysis), is a comprehensive application of the concepts introduced earlier, which readers can treat as a self-guided exercise.

PCA as a change-of-coordinate-system problem.

The eigenvectors of the covariance matrix.

Rotating the data to obtain the maximum variance along one axis.
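A sketch of this pipeline in NumPy is given below; the toy data and parameters are assumptions for illustration, not the blog's own example. PCA via eigendecomposition of the covariance matrix can be written as:

    import numpy as np

    # Toy correlated 2-D data (illustrative only).
    rng = np.random.default_rng(0)
    X = rng.multivariate_normal(mean=[0, 0], cov=[[3, 2], [2, 2]], size=500)

    # 1. Center the data.
    X_centered = X - X.mean(axis=0)

    # 2. Eigendecomposition of the covariance matrix (symmetric, so use eigh).
    C = np.cov(X_centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(C)

    # 3. Sort eigenvectors by decreasing eigenvalue and rotate the data.
    order = np.argsort(eigvals)[::-1]
    eigvecs = eigvecs[:, order]
    X_rotated = X_centered @ eigvecs

    # After rotation, the first axis carries the maximum variance.
    print(np.var(X_rotated, axis=0))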

Happy learning to all!

Article reproduced from: Heart of the Machine.

