Revisiting the old books to deeply understand what linear algebra is.

Linear algebra is a major branch of higher algebra. We know that an equation of the first degree is called a linear equation, and the algebra that deals with linear equations and linear operations is called linear algebra. The most important objects in linear algebra are the determinant and the matrix. In the 19th century determinants and matrices received great attention, and thousands of articles were written on these two topics. From a mathematical point of view, the concept of a vector is merely an ordered triple of numbers; however, with force or velocity as its direct physical meaning, it lets us write down in mathematics, immediately, what is said in physics. Vectors are used for the gradient, divergence, and curl. Similarly, determinants and matrices are like the derivative: although dy/dx is only a symbol in mathematics, standing for the lengthy expression involving the limit of Δy/Δx, the derivative itself is a powerful concept that enables us to directly and creatively picture what happens physically. So although on the surface determinants and matrices are just a language, or a shorthand, many of their vivid concepts can provide a key to new fields of thought, and the two have proved to be highly useful tools in mathematics and physics.

Linear algebra and matrix theory were introduced and developed along with the study of systems of linear equations and their coefficients. The concept of the determinant was first proposed in the 17th century by the Japanese mathematician Seki Takakazu (Seki Kōwa). In 1683 he wrote a book called Kai Fukudai no Hō ("Method of Solving the Dissimulated Problems"), in which the concept of the determinant and its expansion are clearly described. The first European to propose the concept of the determinant was the German mathematician Leibniz (1693). In his Introduction à l'analyse des lignes courbes algébriques (1750), Cramer presented the important basic formula for solving systems of linear equations, now familiar to everyone as Cramer's rule. In 1764 Bézout systematized the procedure for determining the sign of each term of a determinant, and for a given system of n homogeneous linear equations in n unknowns he proved that the vanishing of the coefficient determinant is the condition for the system to have a nonzero solution. Vandermonde was the first to give a systematic exposition of the theory of determinants (that is, to separate determinant theory from the theory of linear equations); he also gave a rule for expanding a determinant in terms of second-order minors and their complementary minors. For this study of the determinant in its own right he is the founder of the theory. In his 1772 paper on integral calculus and the system of the world, Laplace proved some of Vandermonde's rules and generalized his method of expanding determinants; the expansion by the minors contained in r chosen rows, together with their complementary minors, is still named after Laplace. In 1841 the German mathematician Jacobi also summarized and developed the systematic theory of determinants. Another student of the determinant was Cauchy, the greatest mathematician in France at the time.
Cauchy developed the theory of determinants further: in his notation he arranged the elements in a square array and was the first to use the double-subscript notation. He also discovered the multiplication formula for two determinants, and he improved and proved Laplace's expansion theorem. The earliest implicit use of the matrix concept, by comparison, appears in Lagrange's work on bilinear forms in the late 1700s. Lagrange wanted to characterize the maxima and minima of multivariate functions; his method is now known as the method of Lagrange multipliers. To carry it out he required, first, that the first-order partial derivatives be zero, and additionally that a condition on the matrix of second-order partial derivatives hold; this condition is what is today called positive or negative definiteness, although Lagrange did not use matrices explicitly.
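As a modern aside, Cramer's rule and Bézout's determinant criterion are easy to state in code. Here is a small illustrative sketch in plain Python for a 2x2 system (the function name `solve_2x2` is ours, not from the historical sources):

```python
# Cramer's rule for the 2x2 system
#   a11*x + a12*y = b1
#   a21*x + a22*y = b2
# The system has a unique solution exactly when the coefficient
# determinant a11*a22 - a12*a21 is nonzero (Bezout's criterion).

def solve_2x2(a11, a12, a21, a22, b1, b2):
    det = a11 * a22 - a12 * a21
    if det == 0:
        raise ValueError("coefficient determinant is zero: no unique solution")
    # Cramer's rule: replace each column by the right-hand side in turn.
    x = (b1 * a22 - a12 * b2) / det
    y = (a11 * b2 - b1 * a21) / det
    return x, y

# Solve x + 2y = 5, 3x + 4y = 11.
print(solve_2x2(1, 2, 3, 4, 5, 11))   # (1.0, 2.0)
```

For larger systems Cramer's rule is far less efficient than elimination, which is one reason it survives mainly as a theoretical formula.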

Gauss proposed Gaussian elimination around 1800 and used it to solve least-squares problems in astronomical calculations and, later, in measurements of the earth's surface. (The branch of applied mathematics concerned with measuring the earth, determining its shape, or locating positions on it exactly is called geodesy.) Although Gauss's name is attached to this technique for successively eliminating the variables of linear equations, Chinese manuscripts from centuries earlier had already explained how to solve a system of three equations in three unknowns by essentially the same elimination method. For many years Gaussian elimination was regarded as part of the development of geodesy rather than of mathematics. The first appearance of Gauss-Jordan elimination in print was in a handbook on geodesy written by Wilhelm Jordan; many people mistakenly assume that the Jordan in Gauss-Jordan elimination is the famous mathematician Camille Jordan.
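The elimination procedure described above can be sketched in a few lines of Python. This is an illustrative toy implementation with partial pivoting (the function name is ours), not a production solver:

```python
# A minimal Gaussian-elimination sketch for an n x n system Ax = b.

def gaussian_eliminate(A, b):
    n = len(A)
    # Work on a copy of the augmented matrix [A | b].
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        # Partial pivoting: bring the largest pivot into row k.
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        # Eliminate the k-th variable from the rows below.
        for i in range(k + 1, n):
            factor = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= factor * M[k][j]
    # Back substitution on the resulting triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

# Three equations in three unknowns, the case solved in the ancient text.
A = [[2.0, 1.0, 1.0], [1.0, 3.0, 2.0], [1.0, 0.0, 0.0]]
b = [4.0, 5.0, 6.0]
print(gaussian_eliminate(A, b))   # approximately [6.0, 15.0, -23.0]
```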

With the rich development of matrix algebra came the need for suitable notation and for a suitable definition of matrix multiplication, and the two arose at about the same time and place. In 1848, J. J. Sylvester in England first proposed the word "matrix", derived from the Latin word for womb, as a name for an array of numbers. Matrix algebra was then developed by Arthur Cayley in 1855. Cayley studied the composition of linear transformations and put forward the definition of matrix multiplication under which the coefficient matrix of the composite transformation ST is the product of the matrix S and the matrix T. He went on to study algebraic questions including matrix inversion. The famous Cayley-Hamilton theorem, which asserts that a square matrix is a root of its own characteristic polynomial, was stated by Cayley in his 1858 memoir on the theory of matrices. The use of a single letter A to represent a matrix was crucial to the development of matrix algebra, and in the early stages of that development the formula det(AB) = det(A)det(B) provided a connection between matrix algebra and determinants. Cauchy first introduced the term "characteristic equation"; he proved that matrices of order greater than 3 have eigenvalues and that every real symmetric determinant of any order has real eigenvalues, gave the concept of similar matrices, proved that similar matrices have the same eigenvalues, and studied permutation theory.
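Both facts mentioned above, the Cayley-Hamilton theorem and det(AB) = det(A)det(B), can be checked numerically for 2x2 matrices. A plain-Python sketch (all names and the sample matrices are ours):

```python
# Check two classical facts for 2x2 matrices:
#   1. Cayley-Hamilton: A^2 - trace(A)*A + det(A)*I = 0.
#   2. det(AB) = det(A) * det(B).

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def det2(A):
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

A = [[1, 2], [3, 4]]
B = [[0, 1], [5, 2]]

# Cayley-Hamilton for A: the characteristic polynomial of a 2x2 matrix
# is t^2 - trace(A)*t + det(A), and substituting A gives the zero matrix.
tr, d = A[0][0] + A[1][1], det2(A)
A2 = matmul(A, A)
I = [[1, 0], [0, 1]]
CH = [[A2[i][j] - tr * A[i][j] + d * I[i][j] for j in range(2)]
      for i in range(2)]
print(CH)                                      # [[0, 0], [0, 0]]
print(det2(matmul(A, B)), det2(A) * det2(B))   # 10 10
```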

Mathematicians also tried to develop a vector algebra, but there was no natural definition of the product of two vectors that worked in arbitrary dimensions. The first vector algebra involving a noncommutative vector product (that is, v × w need not equal w × v) was proposed by Hermann Grassmann in his book Die lineale Ausdehnungslehre (1844). Grassmann's point of view also introduced the product of a column matrix with a row matrix, whose result is what is now called a rank-1, or simple, matrix. At the end of the 19th century the American mathematical physicist Josiah Willard Gibbs published his famous Elements of Vector Analysis. Later the physicist P. A. M. Dirac proposed that the product of a row vector with a column vector is a scalar. The row and column vectors in common use today were given to us by physicists in the 20th century.
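The two products mentioned in this paragraph are easy to see concretely. A plain-Python sketch (sample vectors are ours): a column vector times a row vector gives a rank-1 matrix, while a row vector times a column vector gives a scalar.

```python
col = [2, 3]   # entries of a column vector v
row = [4, 5]   # entries of a row vector w^T

# Outer product v * w^T: every column is a multiple of v, so rank is 1.
outer = [[vi * wj for wj in row] for vi in col]
print(outer)   # [[8, 10], [12, 15]]

# Inner product w^T * v: a single scalar.
inner = sum(vi * wi for vi, wi in zip(col, row))
print(inner)   # 23
```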

The development of matrices is closely bound up with linear transformations; until the 19th century, however, matrices occupied only a limited place within the theory of linear transformations. The modern definition of a vector space was given by Peano in 1888. After the Second World War, with the development of modern digital computers, matrices took on new meaning, especially in the numerical analysis of matrices. Because of the rapid development and wide application of computers, many practical problems can now be solved quantitatively by discrete numerical computation. Linear algebra, as the mathematics of discrete problems, has therefore become an essential mathematical foundation for scientific and technical personnel engaged in research and engineering design.
Reference: http://www.math.sjtu.edu.cn/jidi/xxds/kcjs/fzs.htm


 
