Squashed 'third_party/eigen/' changes from 61d72f6..cf794d3


Change-Id: I9b814151b01f49af6337a8605d0c42a3a1ed4c72
git-subtree-dir: third_party/eigen
git-subtree-split: cf794d3b741a6278df169e58461f8529f43bce5d
diff --git a/doc/TutorialLinearAlgebra.dox b/doc/TutorialLinearAlgebra.dox
index b09f354..a727241 100644
--- a/doc/TutorialLinearAlgebra.dox
+++ b/doc/TutorialLinearAlgebra.dox
@@ -40,8 +40,9 @@
     <tr>
         <th>Decomposition</th>
         <th>Method</th>
-        <th>Requirements on the matrix</th>
-        <th>Speed</th>
+        <th>Requirements<br/>on the matrix</th>
+        <th>Speed<br/> (small-to-medium)</th>
+        <th>Speed<br/> (large)</th>
         <th>Accuracy</th>
     </tr>
     <tr>
@@ -49,6 +50,7 @@
         <td>partialPivLu()</td>
         <td>Invertible</td>
         <td>++</td>
+        <td>++</td>
         <td>+</td>
     </tr>
     <tr class="alt">
@@ -56,6 +58,7 @@
         <td>fullPivLu()</td>
         <td>None</td>
         <td>-</td>
+        <td>- -</td>
         <td>+++</td>
     </tr>
     <tr>
@@ -63,6 +66,7 @@
         <td>householderQr()</td>
         <td>None</td>
         <td>++</td>
+        <td>++</td>
         <td>+</td>
     </tr>
     <tr class="alt">
@@ -70,13 +74,23 @@
         <td>colPivHouseholderQr()</td>
         <td>None</td>
         <td>+</td>
-        <td>++</td>
+        <td>-</td>
+        <td>+++</td>
     </tr>
     <tr>
         <td>FullPivHouseholderQR</td>
         <td>fullPivHouseholderQr()</td>
         <td>None</td>
         <td>-</td>
+        <td>- -</td>
+        <td>+++</td>
+    </tr>
+    <tr class="alt">
+        <td>CompleteOrthogonalDecomposition</td>
+        <td>completeOrthogonalDecomposition()</td>
+        <td>None</td>
+        <td>+</td>
+        <td>-</td>
         <td>+++</td>
     </tr>
     <tr class="alt">
@@ -84,21 +98,40 @@
         <td>llt()</td>
         <td>Positive definite</td>
         <td>+++</td>
+        <td>+++</td>
         <td>+</td>
     </tr>
     <tr>
         <td>LDLT</td>
         <td>ldlt()</td>
-        <td>Positive or negative semidefinite</td>
+        <td>Positive or negative<br/> semidefinite</td>
         <td>+++</td>
+        <td>+</td>
         <td>++</td>
     </tr>
+    <tr class="alt">
+        <td>BDCSVD</td>
+        <td>bdcSvd()</td>
+        <td>None</td>
+        <td>-</td>
+        <td>-</td>
+        <td>+++</td>
+    </tr>
+    <tr>
+        <td>JacobiSVD</td>
+        <td>jacobiSvd()</td>
+        <td>None</td>
+        <td>-</td>
+        <td>- - -</td>
+        <td>+++</td>
+    </tr>
 </table>
+To get an overview of the true relative speed of the different decompositions, check this \link DenseDecompositionBenchmark benchmark \endlink.
 
 All of these decompositions offer a solve() method that works as in the above example.
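+For reference, here is a minimal sketch of that common pattern (the matrix and vector below
+are filled with random values purely for illustration):
+\code
+Eigen::MatrixXd A = Eigen::MatrixXd::Random(4, 4); // random matrices are almost surely invertible
+Eigen::VectorXd b = Eigen::VectorXd::Random(4);
+// Every decomposition in the table is used the same way: factorize, then call solve().
+Eigen::PartialPivLU<Eigen::MatrixXd> lu(A);        // or, equivalently, A.partialPivLu()
+Eigen::VectorXd x = lu.solve(b);                   // solves A x = b
+// The one-liner form works with any of the other decompositions as well, e.g.:
+Eigen::VectorXd y = A.colPivHouseholderQr().solve(b);
+\endcode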
 
 For example, if your matrix is positive definite, the above table says that a very good
-choice is then the LDLT decomposition. Here's an example, also demonstrating that using a general
+choice is then the LLT or LDLT decomposition. Here's an example, also demonstrating that using a general
 matrix (not a vector) as right hand side is possible.
 
 <table class="example">
@@ -167,8 +200,11 @@
 
 \section TutorialLinAlgLeastsquares Least squares solving
 
-The best way to do least squares solving is with a SVD decomposition. Eigen provides one as the JacobiSVD class, and its solve()
-is doing least-squares solving.
+The most accurate way to do least squares solving is with an SVD decomposition.
+Eigen provides two implementations.
+The recommended one is the BDCSVD class, which scales well for large problems
+and automatically falls back to the JacobiSVD class for smaller problems.
+For both classes, the solve() method does least squares solving.
 
 Here is an example:
 <table class="example">
@@ -179,9 +215,10 @@
 </tr>
 </table>
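+In condensed form (a sketch; here A and b stand for an already-filled system matrix and
+right-hand side), the two SVD-based solvers are invoked like this:
+\code
+// Thin U and V are sufficient for least squares solving.
+// BDCSVD (recommended): scales well to large problems.
+Eigen::VectorXd x1 = A.bdcSvd(Eigen::ComputeThinU | Eigen::ComputeThinV).solve(b);
+// JacobiSVD: very accurate, but best reserved for small matrices.
+Eigen::VectorXd x2 = A.jacobiSvd(Eigen::ComputeThinU | Eigen::ComputeThinV).solve(b);
+\endcode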
 
-Another way, potentially faster but less reliable, is to use a LDLT decomposition
-of the normal matrix. In any case, just read any reference text on least squares, and it will be very easy for you
-to implement any linear least squares computation on top of Eigen.
+Other methods, potentially faster but less reliable, are to use a Cholesky decomposition of the
+normal matrix or a QR decomposition. Our page on \link LeastSquares least squares solving \endlink
+has more details.
+
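+A minimal sketch of these two alternatives (assuming a MatrixXd A with full column rank and a
+VectorXd b; the variable names are placeholders):
+\code
+// Normal equations solved with a Cholesky (LDLT) factorization:
+// fast, but it squares the condition number of A.
+Eigen::VectorXd x_normal = (A.transpose() * A).ldlt().solve(A.transpose() * b);
+// QR with column pivoting: a good compromise between speed and reliability.
+Eigen::VectorXd x_qr = A.colPivHouseholderQr().solve(b);
+\endcode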
 
 \section TutorialLinAlgSeparateComputation Separating the computation from the construction