UFLDL learning notes and programming assignment: Vectorization
UFLDL offers a new tutorial that is better than the previous one: it starts from the basics, is systematic and clearly organized, and includes programming exercises.
According to people in the deep learning community, you can learn DL directly from it without first having to dig into other machine learning algorithms.
So I recently started working through it. The tutorial, combined with the Matlab programming exercises, is excellent.
The address of the new tutorial is: http://ufldl.stanford.edu/tutorial/
This note covers the section: http://ufldl.stanford.edu/tutorial/supervised/Vectorization/
What is vectorization? How do you vectorize?
In short, vectorization in code means replacing for-loop operations with matrix operations.
Some examples are provided in the tutorial.
For example, when computing predictions in linear regression, the naive approach computes one sample at a time, so m samples require m loop iterations.
With vectorization, the predictions for all m examples are obtained at once with a single matrix multiplication.
Personally, I think that to write vectorized code you must keep a clear picture of the model's main data (the matrices) in your head, especially their dimensions.
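The loop-versus-matrix-multiplication idea above can be sketched as follows. This is a Python/NumPy sketch (the tutorial itself uses Matlab), with made-up data where `X` is an m-by-n design matrix and `theta` an n-vector:

```python
import numpy as np

# Hypothetical data: m = 4 samples, n = 3 features.
X = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0],
              [1.0, 0.0, 1.0]])    # shape (m, n)
theta = np.array([0.5, -1.0, 2.0])  # shape (n,)

# Loop version: one prediction per sample, m iterations.
m = X.shape[0]
preds_loop = np.empty(m)
for i in range(m):
    preds_loop[i] = X[i].dot(theta)

# Vectorized version: all m predictions in one matrix-vector product.
preds_vec = X.dot(theta)

print(np.allclose(preds_loop, preds_vec))  # the two versions agree
```

Keeping the shapes in mind, (m, n) times (n,) gives (m,), is exactly the "clear impression of the matrix dimensions" mentioned above.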
Why vectorize?
1. The code is concise; 2. it runs fast.
On speed: the loops you write by hand are much slower than the matrix operations Matlab encapsulates, because Matlab's numerical libraries were written by experts on top of parallel computing. Of course, you might manage to write code faster than those libraries.
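A rough illustration of this speed gap (again a Python/NumPy sketch rather than Matlab; the exact numbers depend on the machine, but the vectorized BLAS-backed call is typically orders of magnitude faster than an interpreted loop):

```python
import time
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal(1_000_000)
b = rng.standard_normal(1_000_000)

# Loop version of a dot product: interpreted, one element at a time.
t0 = time.perf_counter()
s_loop = 0.0
for x, y in zip(a, b):
    s_loop += x * y
t_loop = time.perf_counter() - t0

# Vectorized version: a single BLAS-backed dot product.
t0 = time.perf_counter()
s_vec = a.dot(b)
t_vec = time.perf_counter() - t0

print(f"loop: {t_loop:.4f}s  vectorized: {t_vec:.4f}s")
```

Both compute the same value; only the execution strategy differs.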
At first I tried to do everything myself and wrote some CUDA kernel functions for the computation; the result, unsurprisingly, was much slower than cuBLAS.
So if you want to write a CNN framework faster than Caffe, be prepared for a lot of work.
In this section, the programming assignment is to vectorize the linear regression and logistic regression code.
Since I have already written the vectorized versions in earlier posts, I will not list the code here.
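For reference, the vectorized form for the logistic regression part of the assignment looks roughly like the following. This is a Python/NumPy sketch of the standard cost/gradient formulas, not the author's actual Matlab submission; the function name and data are made up:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_cost_grad(theta, X, y):
    """Vectorized logistic regression cost and gradient.

    X: (m, n) design matrix, y: (m,) labels in {0, 1}, theta: (n,).
    No loop over samples: everything is matrix/vector arithmetic.
    """
    m = X.shape[0]
    h = sigmoid(X.dot(theta))          # (m,) predicted probabilities
    eps = 1e-12                        # guard against log(0)
    cost = -np.mean(y * np.log(h + eps) + (1 - y) * np.log(1 - h + eps))
    grad = X.T.dot(h - y) / m          # (n,) gradient, one matrix product
    return cost, grad

# Tiny made-up example: 3 samples, intercept column plus one feature.
X = np.array([[1.0, 0.5], [1.0, -1.5], [1.0, 2.0]])
y = np.array([1.0, 0.0, 1.0])
theta = np.zeros(2)
cost, grad = logistic_cost_grad(theta, X, y)
print(cost, grad)
```

The linear regression case is the same pattern with `h = X.dot(theta)` and a squared-error cost; in both, the loop over samples disappears into `X.dot(...)` and `X.T.dot(...)`.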
Linger
Link: http://blog.csdn.net/lingerlanlan/article/details/38390429