Announcement: scikits.optimization 0.1

I’m pleased to announce the first release of one of my projects. This scikit provides a generic framework for unconstrained cost function minimization. It is built on a separation principle (the cost function, the stopping criterion, the direction search and the line search are independent, interchangeable components) and is completely object-oriented.
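To make the separation principle concrete, here is a minimal sketch of how such a framework can wire the components together. The class and parameter names are hypothetical, chosen for illustration; this is not the package's actual API.

```python
# Hypothetical sketch of the separation principle: the optimizer loop only
# wires together pluggable components, each of which can be swapped freely.
class Optimizer:
    def __init__(self, function, step, line_search, criterion, x0):
        self.function = function
        self.step = step                # direction search (e.g. gradient)
        self.line_search = line_search  # 1D minimization along the direction
        self.criterion = criterion      # stopping rule
        self.state = {'iteration': 0, 'x': x0}

    def optimize(self):
        while not self.criterion(self.state):
            direction = self.step(self.function, self.state['x'])
            alpha = self.line_search(self.function, self.state['x'], direction)
            self.state['x'] = [xi + alpha * di
                               for xi, di in zip(self.state['x'], direction)]
            self.state['iteration'] += 1
        return self.state['x']

# Minimize f(x) = x^2 from x0 = 1.0 with a fixed step length.
f = lambda x: x[0] ** 2
grad_step = lambda f, x: [-2.0 * x[0]]           # -gradient of x^2
fixed_search = lambda f, x, d: 0.1               # constant step length
limit = lambda state: state['iteration'] >= 100  # iteration-limit criterion

opt = Optimizer(f, grad_step, fixed_search, limit, x0=[1.0])
x_min = opt.optimize()  # converges toward 0
```

Because each component only sees the shared state, swapping the direction search or line search requires no change to the optimizer loop itself.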

Several optimizers are available:

  • Nelder-Mead or simplex minimization
  • Unconstrained gradient-based minimization

The usual stopping criteria can be used:

  • Iteration limit
  • Parameter change (relative and absolute)
  • Cost function change (relative and absolute)
  • Composite criterion generation (AND/OR)
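A short sketch of how such composable criteria might look, with an OR composite built from an iteration limit and an absolute parameter change. The class names are illustrative, not the package's actual API.

```python
# Hypothetical stopping criteria: each one is a callable over the optimizer
# state, so composites can combine them with AND/OR logic.
class IterationLimit:
    def __init__(self, max_iterations):
        self.max_iterations = max_iterations
    def __call__(self, state):
        return state['iteration'] >= self.max_iterations

class AbsoluteParameterChange:
    def __init__(self, tol):
        self.tol = tol
    def __call__(self, state):
        return abs(state['x'] - state['x_old']) < self.tol

class OrCriterion:
    def __init__(self, *criteria):
        self.criteria = criteria
    def __call__(self, state):
        # Stop as soon as any sub-criterion is satisfied.
        return any(c(state) for c in self.criteria)

# Stop after 200 iterations OR when the parameter stops moving.
stop = OrCriterion(IterationLimit(200), AbsoluteParameterChange(1e-8))
```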

Different direction searches are available:

  • Gradient
  • Several conjugate-gradient methods (Fletcher-Reeves, …)
  • Decorators for selecting part of the gradient
  • Marquardt step
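As an illustration of a conjugate-gradient direction search, here is the Fletcher-Reeves update (a sketch of the standard formula, not the package's code): the new direction mixes the steepest-descent direction with the previous search direction.

```python
# Fletcher-Reeves conjugate-gradient direction update:
#   beta_k = ||g_k||^2 / ||g_{k-1}||^2
#   d_k    = -g_k + beta_k * d_{k-1}
# The first iteration simply uses d_0 = -g_0.
def fletcher_reeves_direction(grad_new, grad_old, direction_old):
    beta = (sum(g * g for g in grad_new)
            / sum(g * g for g in grad_old))
    return [-g + beta * d for g, d in zip(grad_new, direction_old)]

d = fletcher_reeves_direction([0.5, 0.5], [1.0, 0.0], [-1.0, 0.0])
```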

Finally several line searches (1D minimization) were coded:

  • Fibonacci and golden-section methods (exact line searches)
  • Wolfe-Powell soft and strong rules
  • Goldstein line search
  • Cubic interpolation
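For reference, an exact line search by golden-section search can be sketched as follows (a textbook version, assuming the 1D function is unimodal on the bracket; this is not the package's implementation).

```python
# Golden-section search: shrink the bracket [a, b] by the inverse golden
# ratio each step, reusing one interior evaluation point per iteration.
def golden_section(phi, a, b, tol=1e-8):
    inv_phi = (5 ** 0.5 - 1) / 2  # 1 / golden ratio, about 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if phi(c) < phi(d):
            b, d = d, c                 # minimum lies in [a, d]
            c = b - inv_phi * (b - a)
        else:
            a, c = c, d                 # minimum lies in [c, b]
            d = a + inv_phi * (b - a)
    return (a + b) / 2

t_min = golden_section(lambda t: (t - 2.0) ** 2, 0.0, 5.0)
```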

Additional helper classes can be used:

  • Finite difference differentiation (central and forward)
  • Quadratic cost (for least-squares estimation)
  • Levenberg-Marquardt approximation for least-squares estimation
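The central finite-difference helper can be sketched like this (illustrative only): each partial derivative is approximated by (f(x + h e_i) - f(x - h e_i)) / (2 h), which is second-order accurate in h.

```python
# Central finite-difference gradient approximation.
def central_difference_gradient(f, x, h=1e-6):
    grad = []
    for i in range(len(x)):
        x_plus = list(x)
        x_plus[i] += h
        x_minus = list(x)
        x_minus[i] -= h
        grad.append((f(x_plus) - f(x_minus)) / (2.0 * h))
    return grad

# Gradient of x0^2 + 3*x1 at (1, 0) is (2, 3).
g = central_difference_gradient(lambda x: x[0] ** 2 + 3.0 * x[1], [1.0, 0.0])
```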

Although this is version 0.1, the code is quite stable and is already used in the learn scikit.

The package can be easy-installed or can be found on PyPI.

Several tutorials are available, or will become available in the future, at the following locations:


3 thoughts on “Announcement: scikits.optimization 0.1”

    1. OpenOpt is a wrapper around several different optimizer implementations. We had a wrapper for this package, but it was very limited. As OpenOpt has a common interface for all optimizers, we would have needed a wrapper per combination, which would be just too many for my framework.
      Dmitrey worked on some parts that are not yet stabilized, though (I didn’t find the time to add and test everything he did during his first GSoC).

      Thanks for the comment 😉
