Francesco Orabona


Parameter-free Optimization Algorithms
There is currently no single place collecting my recent parameter-free optimization algorithms.
I implemented COCOB (a.k.a. SGD for deep networks without learning rates) in TensorFlow here:
I also added two variants of it to Vowpal Wabbit (VW); see the options "coin" and "pistol".
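To give a flavor of how these methods work, here is a minimal Python sketch of the coin-betting principle that underlies COCOB. This is an illustration of the general idea via the Krichevsky-Trofimov (KT) bettor, not the exact COCOB-Backprop update; the function name and the choice of initial endowment are mine.

```python
# Sketch of parameter-free online learning via coin betting: the
# Krichevsky-Trofimov (KT) bettor turns a stream of 1-d gradients
# into iterates with no learning rate to tune.

def kt_coin_betting(gradients, eps=1.0):
    """Run the KT bettor on 1-d (sub)gradients assumed to lie in [-1, 1].

    `eps` is the initial endowment; returns the iterates x_1, x_2, ...
    """
    wealth = eps        # money currently available to bet
    coin_sum = 0.0      # running sum of past "coin outcomes" -g_i
    iterates = []
    for t, g in enumerate(gradients, start=1):
        beta = coin_sum / t     # KT betting fraction, always in (-1, 1)
        x = beta * wealth       # the iterate is a signed bet on the coin
        iterates.append(x)
        wealth += -g * x        # win or lose according to the coin -g
        coin_sum += -g          # update the empirical average of the coin
    return iterates
```

Since |beta| < 1 and the coins are in [-1, 1], the wealth stays positive, and the bet size adapts automatically to the observed gradients without any step-size parameter.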

yamall - Yet Another Machine Learning Library
yamall is a Java machine learning library with a fast black-box local learner and a Hadoop implementation. It supports most of the state-of-the-art features used in machine learning systems, e.g. namespaces, feature hashing, and single-pass and multi-pass stochastic gradient descent.
yamall was born from the need for a secure Java implementation of state-of-the-art machine learning algorithms. The local version imitates the interface of Vowpal Wabbit, to ease migration; at the same time, to harness the full power of yamall you can call its Java functions directly from your code. There is also a Hadoop version for scaling to big datasets.
I designed and coded it while at Yahoo Research and it is now open source.
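The feature hashing mentioned above can be sketched in a few lines (illustrative only, not yamall's actual Java code; the function name and hash choice are mine):

```python
# Sketch of the VW-style hashing trick: string features are mapped
# into a fixed-size weight vector by hashing their names, so the
# model never needs a dictionary of all feature names.
import hashlib

def hash_features(features, num_bits=18, namespace=""):
    """Map (name, value) feature pairs into a sparse {index: value} dict.

    `namespace` is prepended to each feature name, mimicking the
    namespace separation mentioned above.
    """
    size = 1 << num_bits    # 2**num_bits weight slots
    hashed = {}
    for name, value in features:
        key = (namespace + "^" + name).encode("utf-8")
        idx = int(hashlib.md5(key).hexdigest(), 16) % size
        hashed[idx] = hashed.get(idx, 0.0) + value  # collisions just add up
    return hashed
```

Hashing bounds the memory of the learner in advance: colliding features share a weight, which in practice costs little accuracy for a large enough table.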

DOGMA - Discriminative Online (Good?) Matlab Algorithms
DOGMA is a MATLAB toolbox for online learning. It implements many online learning algorithms in a unified framework. The main aim of the library is simplicity: all the implemented algorithms are easy to use, understand, and modify. For this reason, all the implementations are in plain MATLAB, resorting to MEX files only when strictly necessary.
The library focuses on linear and kernel online algorithms, mainly developed in the "relative mistake bound" framework. Examples are the Perceptron, Passive-Aggressive, ALMA, NORMA, SILK, Projectron, RBP, Banditron, etc.
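As an illustration of the kind of algorithm DOGMA implements, here is the classic Perceptron sketched in Python (DOGMA itself is in MATLAB); it updates only on mistakes, which is exactly what relative mistake bound analyses count:

```python
# Sketch of the online Perceptron: process examples one at a time,
# predict with the current weights, and update only when the
# prediction is wrong.

def perceptron(stream):
    """Train on a stream of (x, y) pairs, x a list of floats, y in {-1, +1}.

    Returns the learned weights and the number of mistakes made.
    """
    w = None
    mistakes = 0
    for x, y in stream:
        if w is None:
            w = [0.0] * len(x)          # lazily size the weight vector
        margin = sum(wi * xi for wi, xi in zip(w, x))
        if y * margin <= 0:             # mistake (or zero margin)
            mistakes += 1
            w = [wi + y * xi for wi, xi in zip(w, x)]  # Perceptron update
    return w, mistakes
```

On linearly separable data the number of mistakes is bounded independently of the stream length, which is the prototype of the relative mistake bounds mentioned above.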

Old Stuff
In the past I was quite active as a programmer for the Texas Instruments TI-89 calculator. In particular, my program "Control's Toolbox" has also been used as a teaching tool in Control Theory courses. You can still find my programs here.