Idea: use an RNN to model the order in which a user browses pages, use an FNN to simulate collaborative filtering (CF), and train the two networks jointly.
RNN Network structure:
Each state in the input layer represents a page the user browses, encoded as a one-hot vector; states 0 through 3 are the pages browsed in order. Because the RNN accepts only a limited number of input steps, the earliest pages are dropped once a user browses too many. To retain that information, the paper introduces a history state that summarizes the states before the window and feeds it in as a separate input.
The history state vector is represented as follows:
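The note does not reproduce the paper's exact formula, so here is a minimal numpy sketch of the idea only: pages that fall out of the RNN window are folded into one extra state vector. The exponential `decay` weighting and the normalization are assumptions of this sketch, not taken from the paper.

```python
import numpy as np

def one_hot(page_id, num_pages):
    """One-hot encode a page id over the page vocabulary."""
    v = np.zeros(num_pages)
    v[page_id] = 1.0
    return v

def history_state(browsed, window, num_pages, decay=0.8):
    """Summarize pages older than the RNN window as one extra state.

    `decay` is a hypothetical exponential-decay weight (an assumption
    of this sketch): pages browsed longer ago count less.
    """
    old = browsed[:-window] if len(browsed) > window else []
    h = np.zeros(num_pages)
    for age, page in enumerate(reversed(old), start=1):
        h += (decay ** age) * one_hot(page, num_pages)
    if old:
        h /= h.sum()  # normalize so it resembles a distribution
    return h
```

With a window of 4, a user who browsed pages `[0, 1, 2, 3, 4, 5]` has pages 0 and 1 summarized into the history state, with page 1 (more recent) weighted higher.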
FNN simulating CF:
The input is a 0-1 vector over items, indicating which items the user has purchased. The output is a vector of the same item length, giving the current user's purchase probability for each item. (Although the input format covers all users, at training time the loss is computed for one specific user, so the output is that user's item purchase probabilities, learned inside the network.)
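A minimal numpy sketch of this FNN branch: a 0-1 purchase vector goes through a hidden ReLU layer and a softmax over all items. The layer sizes and the random weights are placeholders; in the paper the weights would be learned jointly with the RNN.

```python
import numpy as np

rng = np.random.default_rng(0)
num_items, hidden = 5, 8

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max())  # subtract max for numerical stability
    return e / e.sum()

# Hypothetical weights; stand-ins for parameters learned during training.
W1 = rng.normal(scale=0.1, size=(hidden, num_items))
W2 = rng.normal(scale=0.1, size=(num_items, hidden))

def fnn_forward(purchases):
    """purchases: 0-1 vector of the items the user bought.
    Returns a probability distribution over all items for this user."""
    h = relu(W1 @ purchases)
    return softmax(W2 @ h)

p = fnn_forward(np.array([1.0, 0.0, 1.0, 0.0, 0.0]))
```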
The two networks are then joined: a softmax is applied after each network's last ReLU layer, and the outputs of the two softmaxes are fed into a final softmax. The overall structure is as follows:
The first four columns are the RNN, unrolled for 4 steps; the last column is the FNN.
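A sketch of the final combination step, under one plausible reading of the note: each branch's per-item softmax output is mixed by a small weight vector and the mixed scores pass through a final softmax. The stand-in branch scores and the mixing weights `w` are hypothetical, not values from the paper.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

num_items = 5
p_rnn = softmax(np.array([0.2, 1.0, 0.1, 0.0, 0.3]))  # stand-in RNN branch output
p_fnn = softmax(np.array([0.5, 0.1, 0.9, 0.2, 0.0]))  # stand-in FNN branch output

# Hypothetical learned mixing weights for the two branches (an assumption
# of this sketch); the mixed per-item scores go through one final softmax.
w = np.array([0.6, 0.4])
p_final = softmax(w[0] * p_rnn + w[1] * p_fnn)
```

Since both branches already output distributions over the same item set, the final softmax just resharpens their weighted combination into a single recommendation distribution.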
Paper note: Personal Recommendation Using Deep Recurrent Neural Networks in NetEase