====
Why?
====
.. _chapter-features:

* **Code Quality** - Ceres Solver has been used in production at
  Google for more than four years now. It is clean, extensively tested
  and well documented code that is actively developed and supported.

* **Modeling API** - It is rarely the case that one starts with the
  exact and complete formulation of the problem that one is trying to
  solve. Ceres's modeling API has been designed so that the user can
  easily build and modify the objective function, one term at a time,
  without worrying about how the solver is going to deal with the
  resulting changes in the sparsity/structure of the underlying
  problem. A short sketch of this workflow follows the list below.

  - **Derivatives** Supplying derivatives is perhaps the most tedious
    and error-prone part of using an optimization library. Ceres
    ships with `automatic`_ and `numeric`_ differentiation, so you
    never have to compute derivatives by hand (unless you really want
    to). Moreover, Ceres allows you to mix automatic, numeric and
    analytical derivatives in any combination that you want.

  - **Robust Loss Functions** Most non-linear least squares problems
    involve data. If there is data, there will be outliers. Ceres
    allows the user to *shape* their residuals using a
    :class:`LossFunction` to reduce the influence of outliers.

  - **Local Parameterization** In many cases, some parameters lie on a
    manifold other than Euclidean space, e.g., rotation matrices. In
    such cases, the user can specify the geometry of the local tangent
    space by specifying a :class:`LocalParameterization` object.

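  For illustration, here is a minimal sketch of this workflow. The
  ``ExampleResidual`` functor and the observed value are purely
  hypothetical; the problem is built one residual block at a time,
  the derivatives come from automatic differentiation, and a
  :class:`HuberLoss` tempers the influence of outliers:

  .. code-block:: c++

     #include "ceres/ceres.h"

     // Hypothetical residual: penalizes the distance between a scalar
     // parameter x and a single observed value. Automatic
     // differentiation supplies the derivatives.
     struct ExampleResidual {
       explicit ExampleResidual(double observed) : observed_(observed) {}
       template <typename T>
       bool operator()(const T* const x, T* residual) const {
         residual[0] = x[0] - observed_;
         return true;
       }
       double observed_;
     };

     int main() {
       double x = 0.0;
       ceres::Problem problem;
       // Terms are added (and can later be removed) one at a time; the
       // HuberLoss reduces the influence of outlying observations.
       problem.AddResidualBlock(
           new ceres::AutoDiffCostFunction<ExampleResidual, 1, 1>(
               new ExampleResidual(3.0)),
           new ceres::HuberLoss(1.0), &x);
       return 0;
     }

  A :class:`LocalParameterization` is attached to a parameter block in
  the same one-call spirit, via ``Problem::SetParameterization``.
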
* **Solver Choice** Depending on the size, sparsity structure, time &
  memory budgets, and solution quality requirements, different
  optimization algorithms will suit different needs. To this end,
  Ceres Solver comes with a variety of optimization algorithms (a
  configuration sketch follows this list):

  - **Trust Region Solvers** - Ceres supports Levenberg-Marquardt,
    Powell's Dogleg, and Subspace dogleg methods. The key
    computational cost in all of these methods is the solution of a
    linear system. To this end, Ceres ships with a variety of linear
    solvers - dense QR and dense Cholesky factorization (using
    `Eigen`_ or `LAPACK`_) for dense problems, sparse Cholesky
    factorization (`SuiteSparse`_, `CXSparse`_ or `Eigen`_) for large
    sparse problems, and custom Schur complement based dense, sparse,
    and iterative linear solvers for `bundle adjustment`_ problems.

  - **Line Search Solvers** - When the problem size is so large that
    storing and factoring the Jacobian is not feasible, or a low
    accuracy solution is required cheaply, Ceres offers a number of
    line search based algorithms. These include a number of variants
    of Non-linear Conjugate Gradients, BFGS and LBFGS.

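  As a rough sketch (the particular choices below are arbitrary, and
  ``problem`` stands in for a previously built :class:`Problem`),
  switching between these algorithms is largely a matter of setting a
  few fields of :class:`Solver::Options`:

  .. code-block:: c++

     ceres::Solver::Options options;

     // A sparse trust region configuration, e.g. Levenberg-Marquardt
     // with a sparse Cholesky based linear solver.
     options.minimizer_type = ceres::TRUST_REGION;
     options.trust_region_strategy_type = ceres::LEVENBERG_MARQUARDT;
     options.linear_solver_type = ceres::SPARSE_NORMAL_CHOLESKY;

     // Alternatively, a line search configuration for problems where
     // factoring the Jacobian is not feasible:
     //   options.minimizer_type = ceres::LINE_SEARCH;
     //   options.line_search_direction_type = ceres::LBFGS;

     ceres::Solver::Summary summary;
     ceres::Solve(options, &problem, &summary);
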
* **Speed** - Ceres Solver has been extensively optimized, with C++
  templating, hand-written linear algebra routines, and OpenMP or
  C++11 threads based multithreading of the Jacobian evaluation and
  the linear solvers.

* **Solution Quality** Ceres is the `best performing`_ solver on the NIST
  problem set used by Mondragon and Borchers for benchmarking
  non-linear least squares solvers.

* **Covariance estimation** - Evaluate the sensitivity/uncertainty of
  the solution by evaluating all or part of the covariance
  matrix. Ceres is one of the few solvers that allows you to do
  this analysis at scale, as sketched below.

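  A minimal sketch of the covariance API, assuming ``x`` is a
  3-dimensional parameter block of an already solved, hypothetical
  ``problem``:

  .. code-block:: c++

     ceres::Covariance::Options options;
     ceres::Covariance covariance(options);

     // Request the 3x3 covariance block of x with itself.
     std::vector<std::pair<const double*, const double*>> covariance_blocks;
     covariance_blocks.push_back(std::make_pair(x, x));
     CHECK(covariance.Compute(covariance_blocks, &problem));

     double covariance_xx[3 * 3];
     covariance.GetCovarianceBlock(x, x, covariance_xx);
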
* **Community** Since its release as open source software, Ceres
  has developed an active developer community that contributes new
  features, bug fixes and support.

* **Portability** - Runs on *Linux*, *Windows*, *Mac OS X*, *Android*,
  and *iOS*.

* **BSD Licensed** The BSD license offers the flexibility to ship your
  application without having to release your own source code.

.. _best performing: https://groups.google.com/forum/#!topic/ceres-solver/UcicgMPgbXw
.. _bundle adjustment: http://en.wikipedia.org/wiki/Bundle_adjustment
.. _SuiteSparse: http://www.cise.ufl.edu/research/sparse/SuiteSparse/
.. _Eigen: http://eigen.tuxfamily.org/
.. _LAPACK: http://www.netlib.org/lapack/
.. _CXSparse: https://www.cise.ufl.edu/research/sparse/CXSparse/
.. _automatic: http://en.wikipedia.org/wiki/Automatic_differentiation
.. _numeric: http://en.wikipedia.org/wiki/Numerical_differentiation