

Linear algebra provides a beautiful illustration of the interplay between algebra and geometry, in that it is by nature both algebraic and geometric. Our intuition concerning lines and planes in space acquires an algebraic interpretation that then makes sense more generally in higher dimensions.

Indeed, it is fair to say that linear algebra lies at the foundation of modern mathematics, physics, statistics, and many other disciplines. Linear problems appear in geometry, analysis, and many applied areas. From a pedagogical point of view, linear algebra is an ideal subject for students to learn to think about mathematical concepts and to write rigorous mathematical arguments. One of our goals in writing this text—aside from presenting the standard computational aspects and some interesting applications—is to guide the student in this endeavor.

We hope this book will be a thought-provoking introduction to the subject and its myriad applications, one that will be interesting to the science or engineering student but will also help the mathematics student make the transition to more abstract advanced courses.

We have tried to keep the prerequisites for this book to a minimum. Although many of our students will have had a course in multivariable calculus, we do not presuppose any exposure to vectors or vector algebra. We assume only a passing acquaintance with the derivative and integral in Section 6 of Chapter 3 and Section 4 of Chapter 4. Of course, in the discussion of differential equations in Section 3 of Chapter 7, we expect a bit more, including some familiarity with power series, in order for students to understand the matrix exponential.

We have also added solutions to many more exercises at the back of the book, hoping that this will help some of the students; in the case of exercises requiring proofs, these will provide additional worked examples that many students have requested. We continue to believe that good exercises are ultimately what makes a superior mathematics text. We hope that they give readers an idea of how the subject developed and who the key players were.

We indicate the end of a proof by the symbol. We have also introduced the Leslie matrix and an application to population dynamics in Section 6. In particular, we now obtain all the orthogonality relations among these four subspaces in Section 2. Until the end of Section 1, we have tied the computation of determinants to row operations only, proving at the end that this implies multilinearity.
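The Leslie-matrix application mentioned above is easy to experiment with. The sketch below uses a hypothetical three-age-class model; the fecundity and survival rates are invented for illustration and are not taken from the text:

```python
# Hypothetical Leslie model with three age classes.
fecundity = [0.0, 1.5, 1.0]  # offspring per individual in each age class
survival = [0.8, 0.5]        # fraction of each age class surviving to the next

def leslie_step(population):
    """Advance the population vector one time step (p_new = L p)."""
    newborns = sum(f * p for f, p in zip(fecundity, population))
    survivors = [s * p for s, p in zip(survival, population)]
    return [newborns] + survivors

population = [100.0, 50.0, 25.0]
for _ in range(10):
    population = leslie_step(population)
print(population)
```

Iterating the step many times reveals the dominant eigenvalue of the Leslie matrix, which governs the long-run growth rate of the population.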

We have included more solved problems at the back of the book and, in many cases, have added similar new exercises. We have added some additional blue boxes, as well as a table giving the locations of them all. And we have added more examples early in the text, including more sample proof arguments. We next treat systems of linear equations, starting with a discussion of hyperplanes in R^n, then introducing matrices and Gaussian elimination to arrive at reduced echelon form and the parametric representation of the general solution.

We then discuss consistency and the relation between solutions of the homogeneous and inhomogeneous systems. We conclude with a selection of applications. Multiplication of matrices is viewed as a generalization of multiplication of matrices by vectors, introduced in Chapter 1, but then we come to understand that it represents composition of linear transformations.
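This composition point of view can be checked numerically. In the sketch below, `mat_vec` and `mat_mul` are our own helper names, not notation from the text; `mat_mul` builds the product column by column, so that (AB)x = A(Bx) for every vector x:

```python
def mat_vec(A, x):
    """Multiply matrix A (a list of rows) by vector x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def mat_mul(A, B):
    """Matrix product AB: column j of AB is A applied to column j of B."""
    n = len(B[0])
    cols = [mat_vec(A, [row[j] for row in B]) for j in range(n)]
    return [[cols[j][i] for j in range(n)] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
x = [5, 7]
# Composition: multiplying by AB agrees with multiplying by B, then by A.
assert mat_vec(mat_mul(A, B), x) == mat_vec(A, mat_vec(B, x))
```

Defining the product column by column is exactly the statement that matrix multiplication represents composition of the corresponding linear transformations.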

We now have separate sections for inverse matrices and for elementary matrices, where the LU decomposition is introduced, and we introduce the notion of the transpose. We expect that most instructors will treat elementary matrices lightly. The heart of the traditional linear algebra course enters in Chapter 3, where we deal with subspaces, linear independence, bases, and dimension. We continue our study of linear transformations in the context of the change-of-basis formula.

The diagonalization problem emerges naturally, and we return to it fully in Chapter 6. We give a more thorough treatment of determinants in Chapter 5 than is typical for introductory texts. We have, however, moved the geometric interpretation of signed area and signed volume to the last section of the chapter.

We characterize the determinant by its behavior under row operations and then give the usual multilinearity properties. Chapter 6 is devoted to a thorough treatment of eigenvalues, eigenvectors, diagonalizability, and various applications.
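This characterization is also how determinants are computed in practice: reduce the matrix to triangular form, flipping the sign for each row swap, and multiply the diagonal entries. A minimal sketch (the function name and the examples are ours, not the text's):

```python
def det(matrix):
    """Determinant via row operations: a row swap flips the sign, and
    adding a multiple of one row to another leaves the determinant fixed."""
    A = [row[:] for row in matrix]   # work on a copy
    n = len(A)
    sign = 1.0
    for col in range(n):
        pivot = next((r for r in range(col, n) if A[r][col] != 0), None)
        if pivot is None:
            return 0.0               # no pivot in this column: singular
        if pivot != col:
            A[col], A[pivot] = A[pivot], A[col]
            sign = -sign
        for r in range(col + 1, n):
            factor = A[r][col] / A[col][col]
            A[r] = [a - factor * b for a, b in zip(A[r], A[col])]
    result = sign
    for i in range(n):
        result *= A[i][i]            # product of the diagonal of the triangular form
    return result

print(det([[2, 1], [1, 3]]))  # 5.0
```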

We conclude the section with an optional discussion of Markov processes and stochastic matrices. In the last section, we prove the Spectral Theorem, which we believe to be—at least in this most basic setting—one of the important theorems all mathematics majors should know; we include a brief discussion of its application to conics and quadric surfaces. Chapter 7 consists of three independent special topics. Although Jordan canonical form does not ordinarily appear in introductory texts, it is conceptually important and widely used in the study of systems of differential equations and dynamical systems.
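The idea of a Markov process can be sketched in a few lines. The two-state transition matrix below is hypothetical, invented only for illustration; column j lists the probabilities of moving out of state j, so each column sums to 1 (a stochastic matrix):

```python
# Hypothetical two-state stochastic matrix (columns sum to 1).
P = [[0.9, 0.2],
     [0.1, 0.8]]

def step(P, v):
    """One step of the process: v_new = P v."""
    return [sum(P[i][j] * v[j] for j in range(len(v))) for i in range(len(P))]

v = [1.0, 0.0]            # start surely in state 0
for _ in range(100):
    v = step(P, v)
# The distribution approaches the steady state (2/3, 1/3).
print([round(x, 3) for x in v])  # [0.667, 0.333]
```

The limiting distribution is the eigenvector of P with eigenvalue 1, which is the connection to the eigenvalue theory developed in Chapter 6.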

We discuss the notion of perspective projection, which is how computer graphics programs draw images on the screen. We give special thanks to our colleagues Ed Azoff and Roy Smith, who have suggested improvements for the second edition. We would also like to thank the following colleagues around the country, who reviewed the manuscript and offered many helpful comments for the improved second edition: Richard Blecksmith, Mike Daven, Jochen Denzler, Darren Glass, S. Ravindran, William T.

We are also indebted to Gil Strang for shaping the way most of us have taught linear algebra during the last decade or two. The authors welcome your comments and suggestions.

We believe it is essential to plan the course so as to have time to come to grips with diagonalization and applications of eigenvalues, including at least one day devoted to the Spectral Theorem.

Thus, every instructor will have to make choices and elect to treat certain topics lightly, and others not at all. For such a course, there should be ample material to cover, treading lightly on the mechanics and spending more time on the theory and various applications, especially Chapter 7.

We believe strongly that presenting proofs in class is only one ingredient; the students must play an active role by wrestling with proofs in homework as well. Although we have parted ways with most modern-day authors of linear algebra textbooks by avoiding technology, we have included a few problems for which a good calculator or computer software will be more than helpful. In addition, when teaching the course, we encourage our students to take advantage of their calculators or available software.

Those instructors who are strong believers in the use of technology will no doubt have a preferred supplementary manual to use.

Distinguishing among points in R^n, vectors starting at the origin, and vectors starting elsewhere is always a confusing point at the beginning of any introductory linear algebra text. Another mathematical and pedagogical issue is that of using only column vectors to represent elements of R^n. But for reasons having to do merely with typographical ease, we have not hesitated to write vectors horizontally, as n-tuples, from time to time in the text or in exercises when it should cause no confusion.

Similarly, we tread lightly in Chapter 5, skipping the proof of Proposition 2. We have moved the discussion of the geometry of determinants to Section 3; instructors who have the extra day or so should certainly include it. Because we try to emphasize geometry and orthogonality more than most texts, we introduce the orthogonal complement of a subspace early in Chapter 3.

In rewriting, we have devoted all of Section 2 to the four fundamental subspaces. We always end the course with a proof of the Spectral Theorem and a few days of applications, usually including difference equations and Markov processes but skipping the optional Section 6. We do not cover Section 7. Instructors who choose to cover abstract vector spaces Section 3.

A few of the exercises will require some calculus skills. With careful planning, we are able to cover all of the mandatory topics and all of the recommended supplementary topics, but we consider ourselves lucky to have any time at all left for Chapter 7. Above all else, we sincerely hope you will have fun.

To this end, there are approximately exercises, a large portion of them having multiple parts. These include computations, applied problems, and problems that ask you to come up with examples.

It is our intent to help you in your quest to become a better mathematics student. In some cases, studying the examples will provide a direct line of approach to a problem, or perhaps a clue. But in others, you will need to do some independent thinking. We have provided many examples that demonstrate the ideas and computational tools necessary to do most of the exercises. Nevertheless, you may sometimes believe you have no idea how to get started on a particular problem.

In more conceptual problems, it may help to make up an example illustrating what you are trying to show; you might try to understand the problem in two or three dimensions, since often a picture will give you insight. In other words, learn to play a bit with the problem and feel more comfortable with it. Remember that in multi-part problems, the hypotheses given at the outset hold throughout the problem. Moreover, usually (but not always) we have arranged such problems in such a way that you should use the results of part (a) in trying to do part (b), and so on.

Resist as long as possible the temptation to refer to the solutions! If your instructor assigns them, you should make sure you understand how to do them.

Once again, we hope you will have fun as you embark on your voyage to learn linear algebra. We come across two ways of describing hyperplanes: either parametrically or as solutions of a Cartesian equation. Going back and forth between these two formulations will be a major theme of this text. The fundamental tool that is used in bridging these descriptions is Gaussian elimination, a standard algorithm used to solve systems of linear equations.
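For a square system with a unique solution, Gaussian elimination followed by back-substitution looks like this in outline. The function `solve` and the sample system are our illustrations, and the sketch deliberately ignores the inconsistent and underdetermined cases treated in the text:

```python
def solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting.
    Assumes A is square and invertible (unique solution)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix [A | b]
    for col in range(n):
        # Swap in the row with the largest pivot for numerical stability.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate the entries below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            M[r] = [a - factor * p for a, p in zip(M[r], M[col])]
    # Back-substitution on the triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

print(solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0]))  # [0.8, 1.4]
```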

We close the chapter with a variety of applications, some not of a geometric nature. We write x = (x1, x2); this is the algebraic representation of the vector x.

Thanks to Descartes, we can identify the ordered pair (x1, x2) with a point in the Cartesian plane, R^2. The relationship of this point to the origin (0, 0) gives rise to the geometric interpretation of the vector x, namely, the arrow pointing from (0, 0) to (x1, x2), as illustrated in Figure 1. The vector x has length and direction. We denote the zero vector (0, 0) by 0 and agree that it has no direction.

We say two vectors are equal if they have the same coordinates, or, equivalently, if they have the same length and direction.
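Length and direction translate directly into coordinate computations. In this small sketch the helper names `length` and `direction` are ours:

```python
import math

def length(x):
    """Euclidean length of a vector in R^n."""
    return math.sqrt(sum(xi * xi for xi in x))

def direction(x):
    """Unit vector with the same direction as x (undefined for the zero vector)."""
    n = length(x)
    return [xi / n for xi in x]

x = [3.0, 4.0]
print(length(x))     # 5.0
print(direction(x))  # [0.6, 0.8]
```

Two nonzero vectors are then equal exactly when their lengths agree and their direction vectors agree, matching the coordinate definition of equality.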






