%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
If you find this code useful, please cite our work as:

Zorzi, M., Testolin, A. and Stoianov, I. (2013)
'Modeling language and cognition with deep unsupervised learning:
a tutorial overview.' Frontiers in Psychology.

Testolin, A., Stoianov, I., De Filippo De Grazia, M. and Zorzi, M. (2013)
'Deep unsupervised learning on a desktop PC: A primer for cognitive
scientists'. Frontiers in Psychology.

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

The code is not tied to a specific benchmark task and can easily be
adapted to other learning problems.
To maximize portability, all implementations assume that the data are
stored in 3-D matrices of size x*y*z, where x is the mini-batch size,
y is the input size (i.e., the number of visible units) and z is the total
number of mini-batches.
We also provide instructions for running our code on the popular MNIST
example, which we used as a benchmark task.
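To make the data layout concrete, here is a small sketch in Python/NumPy
(the toolbox itself is MATLAB; the sizes below are hypothetical values
for the MNIST example, not ones prescribed by the toolbox):

```python
import numpy as np

# Hypothetical sizes for illustration: 125 patterns per mini-batch,
# 784 visible units (28x28 images), 480 mini-batches (60000 patterns).
batch_size, input_size, n_batches = 125, 784, 480

# Flat dataset with one pattern per row.
flat_data = np.random.rand(batch_size * n_batches, input_size)

# Rearrange into the x*y*z layout described above:
# x = mini-batch size, y = input size, z = number of mini-batches.
batchdata = flat_data.reshape(n_batches, batch_size, input_size).transpose(1, 2, 0)

print(batchdata.shape)  # (125, 784, 480)
```

Mini-batch k is then simply the 2-D slice batchdata[:, :, k].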

Note that NVIDIA drivers must be properly installed on your system, along
with a CUDA-capable graphics card with at least 1 GB of dedicated memory.
The MATLAB Parallel Computing Toolbox (v2011 or above) must also be
properly installed on the system.

Network and learning parameters can be set directly inside the script
'deeptrain_GPU.m'. Running the script will train a deep belief network.
At the end of the learning phase, a MATLAB file containing the results
will be created in the source code directory. Note that the training
data must be supplied as a separate '.mat' file
(e.g., 'MNIST_data_n.mat' for the MNIST example).



---------------------------------------------------------------------
MNIST example

Step 1: Data preparation
The raw MNIST dataset must first be converted into a suitable format:

1. Download the following 4 files from http://yann.lecun.com/exdb/mnist
   * train-images-idx3-ubyte.gz
   * train-labels-idx1-ubyte.gz
   * t10k-images-idx3-ubyte.gz
   * t10k-labels-idx1-ubyte.gz

2. Unzip them by executing
   * gunzip train-images-idx3-ubyte.gz
   * gunzip train-labels-idx1-ubyte.gz
   * gunzip t10k-images-idx3-ubyte.gz
   * gunzip t10k-labels-idx1-ubyte.gz

NB: Make sure that the file names have not been changed during unzipping
(in particular, some tools replace '-' with '.')

3. Convert the raw images into MATLAB/Octave format by copying them into
the source code directory and running the routine 'converter.m'.
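The raw files use the simple IDX format: a big-endian header of four
magic bytes (zero, zero, a data-type code, and the number of dimensions),
then one 32-bit size per dimension, then the raw data. For reference,
a rough IDX reader sketched in Python (not part of the toolbox, which
does this step in MATLAB via 'converter.m') could look like this:

```python
import struct
import numpy as np

def read_idx(path):
    """Parse an IDX file (the raw MNIST format): a big-endian header
    [0, 0, dtype, ndim], then ndim int32 dimension sizes, then the
    raw data in row-major order."""
    with open(path, 'rb') as f:
        zero, dtype, ndim = struct.unpack('>HBB', f.read(4))
        dims = struct.unpack('>' + 'I' * ndim, f.read(4 * ndim))
        assert dtype == 0x08  # unsigned byte, as used by MNIST
        return np.frombuffer(f.read(), dtype=np.uint8).reshape(dims)
```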

4. Save the training set into a suitable 3-D matrix by running the routine
'makebatches.m' (the mini-batch size can be set inside that file before
running it). This will produce a file called 'MNIST_data_n.mat', where n
is the chosen mini-batch size.
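The batching step essentially scales the pixel values to [0, 1] and packs
the shuffled patterns into the x*y*z array described earlier. A rough
Python/NumPy sketch of this operation (illustrative only; the function
name, default batch size, and shuffling details are hypothetical, and the
actual step is performed in MATLAB by 'makebatches.m'):

```python
import numpy as np

def make_batches(images, batch_size=125, seed=0):
    """Shuffle the patterns, scale pixel values to [0, 1], and pack them
    into the x*y*z mini-batch array (x = batch size, y = input size,
    z = number of mini-batches). Leftover patterns that do not fill a
    complete mini-batch are dropped."""
    n_patterns, input_size = images.shape
    n_batches = n_patterns // batch_size
    rng = np.random.default_rng(seed)
    order = rng.permutation(n_patterns)[:n_batches * batch_size]
    data = images[order].astype(np.float64) / 255.0
    return data.reshape(n_batches, batch_size, input_size).transpose(1, 2, 0)
```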


Step 2: Unsupervised deep learning
Now you are ready to train a deep belief network using 'deeptrain_GPU.m'.
Make sure you set all the desired hyperparameters before proceeding.
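As described in the cited papers, a deep belief network is trained
greedily, one restricted Boltzmann machine (RBM) layer at a time, using
contrastive divergence. For orientation, here is a minimal CD-1 update
for a binary RBM sketched in Python/NumPy; this is not the toolbox's GPU
code, and the learning rate and all sizes are placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b_vis, b_hid, lr=0.1):
    """One contrastive-divergence (CD-1) step for a binary RBM.
    v0: mini-batch of visible patterns, shape (batch_size, n_visible).
    W, b_vis, b_hid are updated in place; returns reconstruction error."""
    # Positive phase: hidden probabilities and a sampled hidden state.
    h0_prob = sigmoid(v0 @ W + b_hid)
    h0 = (rng.random(h0_prob.shape) < h0_prob).astype(float)
    # Negative phase: one step of Gibbs sampling back to the visible layer.
    v1_prob = sigmoid(h0 @ W.T + b_vis)
    h1_prob = sigmoid(v1_prob @ W + b_hid)
    # Gradient estimates averaged over the mini-batch.
    n = v0.shape[0]
    W += lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / n
    b_vis += lr * (v0 - v1_prob).mean(axis=0)
    b_hid += lr * (h0_prob - h1_prob).mean(axis=0)
    return float(np.mean((v0 - v1_prob) ** 2))
```

Each trained layer's hidden activations then serve as the input data for
the next RBM in the stack.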


Step 3: Testing the network
Unlike supervised deep networks, hierarchical generative models are not
evaluated by classification performance alone. However, the model can be
probed with a variety of techniques, such as plotting the receptive
fields of the neurons or performing read-outs at different levels of
representation.
Some useful routines to perform such analyses can be found here:
http://ccnl.psy.unipd.it/research/deeplearning
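As one illustration of a read-out, a simple linear classifier can be fit
on the activations of a given layer and its accuracy compared across
levels. The sketch below (Python/NumPy, not one of the toolbox routines)
uses a least-squares read-out on one-hot targets:

```python
import numpy as np

def linear_readout(hidden, labels, n_classes):
    """Fit a least-squares linear read-out on top of a layer's
    activations and return its training accuracy."""
    X = np.hstack([hidden, np.ones((hidden.shape[0], 1))])  # add bias column
    Y = np.eye(n_classes)[labels]                           # one-hot targets
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    pred = (X @ beta).argmax(axis=1)
    return float((pred == labels).mean())
```

To compare representational levels, propagate the data through the
learned weights of each layer in turn and apply the same read-out to
every layer's activations.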
