Description
Format: 16mo (16开) · Paper: offset paper · Binding: paperback, perfect bound · Boxed set: No · ISBN: 9787302668077
Gilbert Strang's Linear Algebra (5th Edition) is a classic linear algebra textbook. It presents all the core concepts of linear algebra in an accessible way, weaving in applications throughout to show just how useful the subject is.
The linear algebra content covers determinants, matrices, systems of linear equations and vectors, eigenvalues and eigenvectors of matrices, quadratic forms, and applications of the Mathematica software. Every chapter includes exercises, with answers given at the back of the book. The book aims to highlight the essentials, move from the elementary to the advanced, and stay easy to follow, with an emphasis on suitability for teaching. It can serve as a textbook for engineering students at colleges and universities, and as a textbook or reference for undergraduates in other non-mathematics majors.
Chapter 1: Introduction to vectors. Built around vectors and the dot product, it introduces linear combinations and linear independence in the plane and in space.
Chapter 2: Solving linear equations. Starting from this basic problem, it naturally introduces matrices, Gaussian elimination, elementary matrices, and invertible matrices, and presents the LU factorization.
Chapter 3: Vector spaces and subspaces. Linear systems are viewed geometrically; the rank of a matrix and the dimension of a space are introduced, leading to the Fundamental Theorem of Linear Algebra.
Chapter 4: Orthogonality. The orthogonality relations among the four fundamental subspaces, least squares, and Gram-Schmidt orthogonalization.
Chapter 5: Determinants. Determinants are introduced through volumes, and their basic properties are proved.
Chapter 6: Eigenvalues and eigenvectors. Computing high powers of a square matrix motivates both concepts; the chapter then covers diagonalization, symmetric matrices, and positive definite matrices.
Chapter 7: The singular value decomposition. This fundamental theorem is presented along with many applications, such as solving ordinary differential equations and image compression.
Chapter 8: Linear transformations. The abstract idea of a linear transformation, its matrix representation, diagonalization, and the pseudoinverse.
Chapter 9: Complex vectors and matrices. How complex matrices arise naturally; Hermitian and unitary matrices; and the Fast Fourier Transform, a tool of enormous practical use in engineering.
Chapter 10: Applications. A chapter devoted to applications of linear algebra across many fields.
Chapter 11: Numerical linear algebra. Linear algebra revisited from the viewpoint of computation; an introductory look at algorithms and scientific computing.
Chapter 12: Linear algebra in probability and statistics. Basic concepts of probability and statistics examined from a linear-algebra perspective, especially multivariate random variables, the multivariate normal distribution, and weighted least squares.
1 Vectors and Matrices 1
1.1 Vectors and Linear Combinations 2
1.2 Lengths and Angles from Dot Products 9
1.3 Matrices and Their Column Spaces 18
1.4 Matrix Multiplication AB and CR 27
2 Solving Linear Equations Ax = b 39
2.1 Elimination and Back Substitution 40
2.2 Elimination Matrices and Inverse Matrices 49
2.3 Matrix Computations and A = LU 57
2.4 Permutations and Transposes 64
2.5 Derivatives and Finite Difference Matrices 74
3 The Four Fundamental Subspaces 84
3.1 Vector Spaces and Subspaces 85
3.2 Computing the Nullspace by Elimination: A = CR 93
3.3 The Complete Solution to Ax = b 104
3.4 Independence, Basis, and Dimension 115
3.5 Dimensions of the Four Subspaces 129
4 Orthogonality 143
4.1 Orthogonality of Vectors and Subspaces 144
4.2 Projections onto Lines and Subspaces 151
4.3 Least Squares Approximations 163
4.4 Orthonormal Bases and Gram-Schmidt 176
4.5 The Pseudoinverse of a Matrix 190
5 Determinants 198
5.1 3 by 3 Determinants and Cofactors 199
5.2 Computing and Using Determinants 205
5.3 Areas and Volumes by Determinants 211
6 Eigenvalues and Eigenvectors 216
6.1 Introduction to Eigenvalues: Ax = λx 217
6.2 Diagonalizing a Matrix 232
6.3 Symmetric Positive Definite Matrices 246
6.4 Complex Numbers and Vectors and Matrices 262
6.5 Solving Linear Differential Equations 270
7 The Singular Value Decomposition (SVD) 286
7.1 Singular Values and Singular Vectors 287
7.2 Image Processing by Linear Algebra 297
7.3 Principal Component Analysis (PCA by the SVD) 302
8 Linear Transformations 308
8.1 The Idea of a Linear Transformation 309
8.2 The Matrix of a Linear Transformation 318
8.3 The Search for a Good Basis 327
9 Linear Algebra in Optimization 335
9.1 Minimizing a Multivariable Function 336
9.2 Backpropagation and Stochastic Gradient Descent 346
9.3 Constraints, Lagrange Multipliers, Minimum Norms 355
9.4 Linear Programming, Game Theory, and Duality 364
10 Learning from Data 370
10.1 Piecewise Linear Learning Functions 372
10.2 Creating and Experimenting 381
10.3 Mean, Variance, and Covariance 386
Appendix 1 The Ranks of AB and A + B 400
Appendix 2 Matrix Factorizations 401
Appendix 3 Counting Parameters in the Basic Factorizations 403
Appendix 4 Codes and Algorithms for Numerical Linear Algebra 404
Appendix 5 The Jordan Form of a Square Matrix 405
Appendix 6 Tensors 406
Appendix 7 The Condition Number of a Matrix Problem 407
Appendix 8 Markov Matrices and Perron-Frobenius 408
Appendix 9 Elimination and Factorization 410
Appendix 10 Computer Graphics 414
Index of Equations 419
Index of Notations 422
Index 423
One goal of this Preface can be achieved right away. You need to know about the video lectures for MIT’s Linear Algebra course Math 18.06. Those videos go with this book, and they are part of MIT’s OpenCourseWare. The direct links to linear algebra are
https://ocw.mit.edu/courses/18-06-linear-algebra-spring-2010/
https://ocw.mit.edu/courses/18-06sc-linear-algebra-fall-2011/
On YouTube those lectures are at https://ocw.mit.edu/1806videos and /1806scvideos
The first link brings the original lectures from the dawn of OpenCourseWare. Problem solutions by graduate students (really good) and also a short introduction to linear algebra were added to the new 2011 lectures. And the course today has a new start: the crucial ideas of linear independence and the column space of a matrix have moved near the front.
I would like to tell you about those ideas in this Preface.
Start with two column vectors a1 and a2. They can have three components each, so they correspond to points in 3-dimensional space. The picture needs a center point which locates the zero vector:

$$a_1 = \begin{bmatrix} 2 \\ 3 \\ 1 \end{bmatrix} \qquad a_2 = \begin{bmatrix} 1 \\ 4 \\ 2 \end{bmatrix} \qquad \text{zero vector} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$$
The vectors are drawn on this 2-dimensional page. But we all have practice in visualizing three-dimensional pictures. Here are a1, a2, 2a1, and the vector sum a1 + a2.
[Figure: the vectors a1 = (2, 3, 1), 2a1 = (4, 6, 2), and the sum a1 + a2 = (3, 7, 3), all drawn from the zero vector.]
That picture illustrated two basic operations: adding vectors a1 + a2 and multiplying a vector by 2. Combining those operations produced a "linear combination" 2a1 + a2:

Linear combination = c a1 + d a2 for any numbers c and d

Those numbers c and d can be negative. In that case c a1 and d a2 will reverse their directions: they go right to left. Also very important, c and d can involve fractions. Here is a picture with a lot more linear combinations. Eventually we want all vectors c a1 + d a2.
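The preface's example is easy to check numerically. The short sketch below is not from the book; it assumes NumPy is available and simply reuses the vectors a1 = (2, 3, 1) and a2 = (1, 4, 2) defined above to form the sum a1 + a2, the multiple 2a1, and a general combination c a1 + d a2.

```python
import numpy as np

# Column vectors from the preface example
a1 = np.array([2, 3, 1])
a2 = np.array([1, 4, 2])

# The two basic operations: vector addition and multiplication by a scalar
print(a1 + a2)   # [3 7 3]  -> the vector sum a1 + a2
print(2 * a1)    # [4 6 2]  -> the scalar multiple 2*a1

# A general linear combination c*a1 + d*a2 (c and d may be negative or fractional)
def linear_combination(c, d):
    return c * a1 + d * a2

print(linear_combination(2, 1))     # 2*a1 + a2 = [5 10 4]
print(linear_combination(-1, 0.5))  # reversed direction plus a fraction: [-1.5 -1. 0.]
```

As c and d run over all real numbers, these combinations trace out a plane through the zero vector, since a1 and a2 do not lie on the same line.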