February 18th 2014

Book review: Learning scikit-learn – Machine Learning in Python

It seems that Packt Publishing is on a Machine Learning in Python publishing spree. After Building Machine Learning Systems in Python, for which I was a technical reviewer, Packt published Learning scikit-learn – Machine Learning in Python last November.

Continue Reading »


4 Comments »

September 4th 2013

Book: Building Machine Learning Systems in Python

I recently had the opportunity to be a technical reviewer for the new Building Machine Learning Systems in Python. As I took part in the book, I won’t write a review like I did for other books.

First, I have to say that I was impressed by the quality of the content. Although there were a few things that I didn’t find excellent (I still need to check how my reviews changed the book), it’s the best book I’ve read from Packt so far. It has a good balance between code and explanation, an equilibrium that is rarely achieved.

I don’t think it is possible to write a better book on Machine Learning in Python, unless the ecosystem evolves with new algorithms. Which it will, and it will mean a new edition of the book! Neat!


3 Comments »

May 11th 2013

Announcement: scikits.optimization 0.3

I’m pleased to announce a new version of scikits.optimization. The main focus of this iteration was to finish the usual unconstrained optimization algorithms.

Changelog

  • Fixes on the Simplex state implementation
  • Added several Quasi-Newton steps (BFGS, rank 1 update…)

The scikit can be installed with pip/easy_install or downloaded from PyPI.
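For readers who want to see what a quasi-Newton step computes, here is a minimal NumPy sketch of the textbook BFGS update of the inverse-Hessian approximation. It only illustrates the formula and is not the scikit’s own implementation.

    import numpy as np

    def bfgs_inverse_hessian_update(H, s, y):
        """Textbook BFGS update of the inverse-Hessian approximation H.

        s = x_{k+1} - x_k   (step that was taken)
        y = g_{k+1} - g_k   (change in the gradient)
        H_{k+1} = (I - rho s y^T) H (I - rho y s^T) + rho s s^T, rho = 1 / (y^T s)
        """
        rho = 1.0 / np.dot(y, s)
        I = np.eye(len(s))
        V = I - rho * np.outer(s, y)
        return V @ H @ V.T + rho * np.outer(s, s)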


No Comments yet »

May 2nd 2013

Comparison of optimization algorithms

In the next version of scikits.optimization, I’ve added some Quasi-Newton steps. Before this version is released, I thought I would compare several methods of optimizing the Rosenbrock function.
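To give an idea of what such a comparison looks like, here is a minimal sketch using SciPy’s generic interface rather than the scikits.optimization API: it runs one gradient-free method and two gradient-based methods on the Rosenbrock function and reports the number of function evaluations each one needed.

    import numpy as np
    from scipy.optimize import minimize, rosen, rosen_der

    x0 = np.array([-1.2, 1.0])  # classic starting point for the Rosenbrock function
    for method in ("Nelder-Mead", "CG", "BFGS"):
        jac = rosen_der if method != "Nelder-Mead" else None
        res = minimize(rosen, x0, jac=jac, method=method)
        print(f"{method:12s} f(x*)={res.fun:.2e}  nfev={res.nfev}")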
Continue Reading »


2 Comments »

December 18th 2012

Optimization scikit: Polytope (Simplex/Nelder-Mead) optimization

Now that version 0.2 of scikits.optimization is out, here is a tutorial on the gradient-free optimizer based on the simplex algorithm.

When the only thing you have is the cost function and you don’t have dozens of parameters, the first thing to try is a simplex algorithm.
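As a quick illustration of the idea (shown here with SciPy’s Nelder-Mead implementation rather than the scikit’s Polytope class), a gradient-free minimization needs nothing but the cost function:

    import numpy as np
    from scipy.optimize import minimize

    def cost(x):
        # Any black-box cost function will do; no gradient is needed.
        return (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2

    res = minimize(cost, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
    print(res.x)  # converges close to [1, -2]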

Continue Reading »


No Comments yet »

November 15th 2012

Announcement: scikits.optimization 0.2

It has been a while, too long for sure, since my last update on this scikit. I’m pleased to announce that some algorithms, as well as some tests, are finally fixed.

Changelog:

  • Fixed Polytope/Simplex/Nelder-Mead
  • Fixed the Quadratic Hessian helper class

Additional tutorials will be available in the coming weeks.


1 Comment »

July 5th 2011

Book review: Sage Beginner’s Guide

I heard about Sage when I started learning Python, but I never quite got on the bandwagon. Now, this Beginner’s Guide seems like a good place to start.

Continue Reading »


No Comments yet »

March 8th 2011

Book review: Data Analysis with Open Source Tools

When faced with a new dataset, the issue is to figure out how it should be analyzed. A lot of books address the theoretical way of doing it, but this book gives practical clues. Besides, it isn’t based on commercial tools like MATLAB, but on open source tools that can be freely downloaded from the Internet.

Continue Reading »


No Comments yet »

February 1st 2011

Electronic: The purpose of an oversampling filter

A few months ago, I posted a note on an overdrive. The main issue with this kind of non-linear filter is aliasing, a process that adds spurious acoustic content to the signal. The best way to solve the issue is to oversample the input before processing the signal.
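Here is a minimal NumPy/SciPy sketch of the idea (not the code from the original post): upsample the signal, apply the non-linearity at the higher rate, then low-pass filter and decimate back to the original rate. The tanh waveshaper stands in for a real overdrive stage.

    import numpy as np
    from scipy.signal import resample_poly

    def oversampled_overdrive(x, ratio=4):
        """Apply a simple tanh waveshaper at `ratio` times the input sample rate.

        resample_poly low-pass filters while changing the rate, so most of the
        aliased content created by the non-linearity is removed before decimation.
        """
        up = resample_poly(x, ratio, 1)         # oversample
        shaped = np.tanh(3.0 * up)              # static non-linearity (stand-in overdrive)
        return resample_poly(shaped, 1, ratio)  # low-pass filter and decimate back

    fs = 44100.0
    t = np.arange(0, 0.01, 1.0 / fs)
    out = oversampled_overdrive(np.sin(2 * np.pi * 1000.0 * t))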

Continue Reading »


No Comments yet »

October 5th 2010

Electronic: Simulation of a simple overdrive

There are some effects that are simpler than others. Digital ones are generally easier than analog ones, and purely digital filters are also easier than digitally-transformed analog ones. Linear filters such as band-pass, band-stop, … are easy to design digitally, chorus can be achieved through some spectral computations, and delay and reverberation are computationally expensive but easy to code.

It is said that analog devices have a unique sound that digital devices cannot achieve. In fact, much of it is due to the simplifications that occur when digitizing an analog device. One of the most blatant examples is the overdrive, which I took from Simulanalog.
Continue Reading »


5 Comments »

