Software

The following software packages can be downloaded for non-commercial use. Note that, in general, the Matlab software is written for ease of understanding rather than computational speed, while the C software emphasizes speed and efficiency over readability. If our software contributes to successful publications, we would be grateful for an acknowledgement, as this helps us secure future research funding.

NOTE: We are migrating our software repository to gitweb. If a Download link takes you to a git repository, you can download the latest version of the software by clicking the snapshot link at the right of the top line of the repository (the line labeled master).


Dynamic Movement Primitive Software:

Download

Motor Primitives Based on Dynamic Systems Theory: Dynamic motor primitives encode kinematic movement plans in terms of the time evolution of nonlinear differential equations. The movement plans can be executed by an appropriate controller, e.g., a computed-torque controller or a simple PD controller. The motor primitives can be learned from sample trajectories using locally weighted regression. Good references are Ijspeert, Nakanishi, & Schaal (2003) in the NIPS proceedings and Schaal, Peters, Nakanishi, & Ijspeert (2003) in the IROS workshop proceedings (see Publications).
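
As a rough illustration, a minimal discrete movement primitive with the learned forcing term set to zero reduces to a critically damped second-order system that converges smoothly to the goal. The constants and variable names below are common textbook choices and may differ from those in the downloadable code.

    % Minimal discrete dynamic movement primitive sketch (forcing term omitted),
    % integrated with Euler steps. Constants are illustrative assumptions.
    alpha_z = 25; beta_z = alpha_z/4; alpha_x = 8;
    tau = 1; dt = 0.001; T = round(tau/dt);
    g = 1;                  % goal position
    y = 0; z = 0; x = 1;    % position, scaled velocity, canonical phase
    Y = zeros(T,1);
    for t = 1:T
      f  = 0;                                        % a learned forcing term would depend on x
      zd = (alpha_z*(beta_z*(g - y) - z) + f) / tau; % transformation system
      yd = z / tau;
      xd = -alpha_x * x / tau;                       % canonical system
      z = z + zd*dt;  y = y + yd*dt;  x = x + xd*dt;
      Y(t) = y;
    end
    plot((1:T)'*dt, Y);     % smooth point-to-point movement toward g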


Path Integral Reinforcement (PI2) Learning Software:

Download

Policy Improvement with Path Integrals (PI2): PI2 is a probabilistic trajectory-based reinforcement learning method which directly learns the parameters of a parameterized policy. It is one of the most efficient and easy to use direct policy learning methods to date. A good reference is Theodorou, Buchli, & Schaal (2010) in the Journal of Machine Learning Research (see Publications). The matlab sample code under this download link requires the Dynamic Movement Primitive matlab code above to work.
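
At its core, PI2 updates the policy parameters with a probability-weighted average of the exploration noise, where roll-outs with lower cost receive exponentially more weight. The sketch below shows just this update step with placeholder costs and an assumed soft-max constant; the downloadable code implements the full algorithm on top of the motor primitives.

    % PI2-style parameter update sketch (a single update; names are assumptions).
    K = 20;  D = 10;                    % number of roll-outs, policy parameters
    theta = zeros(D,1);                 % current policy parameters
    noise = 0.1*randn(K,D);             % exploration noise per roll-out
    S = sum((repmat(theta',K,1) + noise).^2, 2);   % placeholder roll-out costs
    h = 10;                             % soft-max sensitivity (assumed)
    expS = exp(-h*(S - min(S)) / (max(S) - min(S) + 1e-10));
    P = expS / sum(expS);               % probability weight per roll-out
    dtheta = noise' * P;                % probability-weighted noise average
    theta = theta + dtheta;             % cost decreases over repeated updates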


Locally Weighted Statistical Learning Software:

Download

Locally Weighted Regression (LWR): A memory-based nonparametric learning system that uses leave-one-out cross validation to optimize the bandwidth of the kernel. The bandwidth is only optimized globally, such that functions with large changes in the Hessian tend to overfit. A good reference is Atkeson, Moore, & Schaal (1997) in the journal Artificial Intelligence Review (see Publications).
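
For each query point, LWR weights the training data with a kernel centered at the query and fits a local linear model by weighted least squares. Below is a minimal one-dimensional sketch with an assumed Gaussian kernel and a fixed bandwidth; the downloadable code instead optimizes the bandwidth by leave-one-out cross validation.

    % Minimal 1-D locally weighted regression prediction (Gaussian kernel).
    X  = linspace(0,1,100)';            % training inputs
    y  = sin(2*pi*X) + 0.1*randn(100,1);% noisy training targets
    xq = 0.3;                           % query point
    h  = 0.05;                          % kernel bandwidth (assumed, not optimized)
    w  = exp(-0.5*((X - xq)/h).^2);     % kernel weights
    A  = [X - xq, ones(size(X))];       % local linear model around xq
    W  = diag(w);
    beta = (A'*W*A) \ (A'*W*y);         % weighted least squares fit
    yq = beta(2);                       % prediction = local intercept at xq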


Download

Receptive Field Weighted Regression (RFWR): An incremental nonparametric function approximator (not memory-based) that uses an incremental approximation to leave-one-out cross validation to optimize the bandwidth of each kernel locally. A very fast, efficient, and incremental learning system for low-dimensional problems; see LWPR (below) for a solution for higher dimensions. A good reference is Schaal & Atkeson (1998) in Neural Computation (see Publications).
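
The sketch below shows a strongly simplified incremental update in the spirit of RFWR: a single receptive field with a fixed Gaussian kernel whose local linear model is updated by weighted recursive least squares. The actual software additionally adapts the kernel bandwidths and adds and prunes receptive fields.

    % Simplified single-receptive-field update (illustrative, not the software).
    c = 0.5;  D = 100;                  % receptive field center and distance metric
    beta = zeros(2,1);                  % local linear model [slope; offset]
    P = 1e4*eye(2);                     % inverse covariance estimate
    for n = 1:1000
      x  = rand;  y = sin(2*pi*x) + 0.05*randn;     % one incoming data point
      w  = exp(-0.5*D*(x - c)^2);       % kernel activation of this data point
      xt = [x - c; 1];                  % local coordinates plus bias
      e  = y - beta'*xt;                % prediction error
      P  = P - (P*(xt*xt')*P) / (1/w + xt'*P*xt);   % weighted recursive least squares
      beta = beta + w*P*xt*e;           % weighted model update
    end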


Download

Locally Weighted Projection Regression (LWPR): In essence, a similar algorithm to RFWR, but it uses special projection regression techniques to deal efficiently with high-dimensional spaces. The algorithm has linear computational complexity in the number of input dimensions. It is also numerically very robust and should always perform at least as well as RFWR. This algorithm is highly recommended as a replacement for RFWR, but do not expect the Matlab implementation to really exploit the computational efficiency of LWPR. Good references are Schaal & Atkeson (1998) in Neural Computation, for a start, and Vijayakumar & Schaal (ICML 2000) for details (see Publications). A full journal paper is in preparation and can be obtained by emailing sethu@usc.edu.
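
The projection idea can be illustrated with a batch partial least squares fit: the regression is decomposed into a few univariate regressions along data-driven projection directions, which is what keeps the cost linear in the number of input dimensions. This is only a rough batch illustration, not the incremental algorithm used inside each receptive field of LWPR.

    % Batch partial least squares sketch with R projections (illustrative only).
    N = 500; d = 20; R = 3;             % samples, input dimensions, projections
    X = randn(N,d);  y = X(:,1) - 2*X(:,2) + 0.1*randn(N,1);
    Xr = X - repmat(mean(X),N,1);       % centered copies for deflation
    yr = y - mean(y);
    yhat = mean(y)*ones(N,1);
    for r = 1:R
      u = Xr'*yr;  u = u/norm(u);       % projection direction (correlation with residual)
      s = Xr*u;                         % 1-D projected inputs
      b = (s'*yr)/(s'*s);               % univariate regression along s
      yhat = yhat + b*s;
      yr = yr - b*s;                    % deflate output residuals
      Xr = Xr - s*((s'*Xr)/(s'*s));     % deflate inputs
    end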

Bayesian Learning Software:

Download

Variational Bayesian Least Squares: A variational Bayesian algorithm that performs efficient high-dimensional linear regression, handling large numbers of irrelevant and redundant dimensions in the input data. Good references are Ting et al. (2005) in the NIPS proceedings and D'Souza, Vijayakumar & Schaal (2004) in the ICML proceedings (see Publications).
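
As a rough illustration of the kind of problem addressed here, the sketch below runs a generic automatic-relevance-determination style EM loop for linear regression, which drives the weights of irrelevant input dimensions toward zero. It is not the variational algorithm of the papers.

    % Generic ARD-style EM for linear regression (illustrative, not VBLS itself).
    N = 200; d = 30;
    X = randn(N,d);
    w_true = [2; -3; zeros(d-2,1)];     % only the first two inputs matter
    y = X*w_true + 0.1*randn(N,1);
    alpha = ones(d,1);  beta = 1;       % per-dimension precisions, noise precision
    for it = 1:100
      Sigma = inv(beta*(X'*X) + diag(alpha));   % posterior covariance
      m     = beta*Sigma*X'*y;                  % posterior mean of the weights
      gamma = 1 - alpha.*diag(Sigma);           % effective degrees of freedom
      alpha = gamma ./ (m.^2 + 1e-12);          % relevance update
      beta  = (N - sum(gamma)) / sum((y - X*m).^2);
    end
    % m ends up with near-zero weights for the irrelevant input dimensions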


Download

Bayesian Regression with Input Noise for High-Dimensional Data: A Bayesian treatment of factor analysis in the joint data space that can accurately identify parameters in a high-dimensional linear regression problem when the input data are noise-contaminated. An application to nonlinear parameter identification in rigid-body dynamics is described in Ting, Mistry, Peters, Schaal & Nakanishi (2006) in the RSS proceedings. Another good reference is Ting, D'Souza & Schaal (2006) in the ICML proceedings (see Publications).
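
The sketch below only illustrates the underlying problem: ordinary least squares is biased toward zero when the inputs are noisy, whereas an errors-in-variables method is not. Total least squares via the SVD is used here as a simple stand-in and is not the Bayesian factor-analysis model of the papers.

    % Illustration of the input-noise (errors-in-variables) problem.
    N = 2000;  w_true = 2;
    x_clean = randn(N,1);
    x = x_clean + 0.5*randn(N,1);       % observed, noise-contaminated inputs
    y = w_true*x_clean + 0.5*randn(N,1);% equal noise level so plain TLS applies
    w_ols = (x'*x) \ (x'*y);            % biased toward zero (attenuation)
    [~,~,V] = svd([x y], 0);            % total least squares via the SVD
    w_tls = -V(1,2) / V(2,2);           % much closer to w_true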


Download

Real-time Automatic Outlier Detection: A weighted least squares-like approach to outlier detection, where each data sample has a weight associated with it. The model treats the weights probabilistically and learns their optimal values, avoiding heuristic error functions, sampling, or the tuning of open parameters (such as a threshold). Details on how to perform automatic outlier detection in linear regression can be found in Ting, D'Souza & Schaal (2007) in the ICRA proceedings. The approach can also be incorporated into a Kalman filter, allowing real-time automatic outlier detection on streaming data. A good reference is the CLMC technical report by Ting, Theodorou & Schaal (2007) (see Publications).
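
The flavor of per-sample weighting can be illustrated with iteratively reweighted least squares, where the weights come from a heavy-tailed (Student-t style) noise model so that samples with large residuals are down-weighted. This sketch is only an illustration; the papers derive the weights from a full Bayesian model and extend the idea to the Kalman filter.

    % Iteratively reweighted least squares with Student-t style sample weights.
    N = 200; d = 3;
    X = [randn(N,d-1), ones(N,1)];
    w_true = [1; -2; 0.5];
    y = X*w_true + 0.1*randn(N,1);
    y(1:10) = y(1:10) + 5;              % inject a few gross outliers
    w = ones(N,1);  nu = 3;  sigma2 = 1;
    for it = 1:20
      Wd   = diag(w);
      beta = (X'*Wd*X) \ (X'*Wd*y);     % weighted least squares
      r2   = (y - X*beta).^2;
      sigma2 = sum(w.*r2) / N;          % weighted noise variance
      w = (nu + 1) ./ (nu + r2/sigma2); % small weight for large residuals
    end
    % the weights w(1:10) of the injected outliers end up much smaller than the rest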

Humanoid Simulation Software:
