Now we derive the variational parameter updates, which appear in the paper's appendix. To avoid getting bogged down in detail and losing sight of the overall picture, you need not work through these derivations on a first pass through LDA. First, write L as a function of γ and φ. From the earlier definition of L:

$$ L(\gamma, \phi; \alpha, \beta) = E_q[\log p(\theta, \mathbf{z}, \mathbf{w} \mid \alpha, \beta)] - E_q[\log q(\theta, \mathbf{z})] \tag{1} $$
Expanding (1) into five separate expectations gives:

$$ L = E_q[\log p(\theta \mid \alpha)] + E_q[\log p(\mathbf{z} \mid \theta)] + E_q[\log p(\mathbf{w} \mid \mathbf{z}, \beta)] - E_q[\log q(\theta)] - E_q[\log q(\mathbf{z})] \tag{2} $$
The five expectations in (2) are evaluated with the following formulas, derived in the author's appendix:

$$ E_q[\log p(\theta \mid \alpha)] = \log \Gamma\Big(\textstyle\sum_{j=1}^{k} \alpha_j\Big) - \sum_{i=1}^{k} \log \Gamma(\alpha_i) + \sum_{i=1}^{k} (\alpha_i - 1)\Big(\Psi(\gamma_i) - \Psi\Big(\textstyle\sum_{j=1}^{k} \gamma_j\Big)\Big) $$

$$ E_q[\log p(\mathbf{z} \mid \theta)] = \sum_{n=1}^{N} \sum_{i=1}^{k} \phi_{ni} \Big(\Psi(\gamma_i) - \Psi\Big(\textstyle\sum_{j=1}^{k} \gamma_j\Big)\Big) $$

$$ E_q[\log p(\mathbf{w} \mid \mathbf{z}, \beta)] = \sum_{n=1}^{N} \sum_{i=1}^{k} \sum_{j=1}^{V} \phi_{ni} \, w_n^j \log \beta_{ij} $$

$$ E_q[\log q(\theta)] = \log \Gamma\Big(\textstyle\sum_{j=1}^{k} \gamma_j\Big) - \sum_{i=1}^{k} \log \Gamma(\gamma_i) + \sum_{i=1}^{k} (\gamma_i - 1)\Big(\Psi(\gamma_i) - \Psi\Big(\textstyle\sum_{j=1}^{k} \gamma_j\Big)\Big) $$

$$ E_q[\log q(\mathbf{z})] = \sum_{n=1}^{N} \sum_{i=1}^{k} \phi_{ni} \log \phi_{ni} $$

Here Ψ is the digamma function, and the recurring quantity Ψ(γ_i) − Ψ(Σ_j γ_j) is the Dirichlet expectation E_q[log θ_i].
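The identity E_q[log θ_i] = Ψ(γ_i) − Ψ(Σ_j γ_j) does all the work in the five expectations above, so it is worth checking numerically. The following is a small sketch of my own (not from the paper): it compares the closed form against a Monte Carlo estimate from Dirichlet samples; the digamma implementation and the γ values are illustrative choices.

```python
# Sanity check: E_q[log theta_i] = Psi(gamma_i) - Psi(sum_j gamma_j)
# for theta ~ Dirichlet(gamma). Names and values here are my own.
import math
import random

def digamma(x):
    # Psi(x) via the recurrence Psi(x) = Psi(x + 1) - 1/x, then the
    # standard asymptotic series once x >= 6.
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x - f * (1.0/12 - f * (1.0/120 - f/252))

random.seed(0)
gamma_ = [2.0, 3.0, 5.0]   # made-up variational Dirichlet parameters
exact = [digamma(g) - digamma(sum(gamma_)) for g in gamma_]

draws = 100_000
acc = [0.0, 0.0, 0.0]
for _ in range(draws):
    # Sample theta ~ Dirichlet(gamma_) by normalizing Gamma draws.
    g = [random.gammavariate(a, 1.0) for a in gamma_]
    s = sum(g)
    for i in range(3):
        acc[i] += math.log(g[i] / s)
mc = [a / draws for a in acc]
# mc[i] and exact[i] agree to roughly two decimal places.
```

Sampling a Dirichlet as normalized independent Gamma draws keeps the check dependency-free; any numerical library's digamma would serve equally well.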
Next, we take the partial derivatives of L with respect to φ and γ, set them to zero, and solve for each parameter in turn.
Starting with φ, we simplify L in (2), keeping only the terms involving φ_ni and adding a Lagrange multiplier for the constraint Σ_j φ_nj = 1:

$$ L_{[\phi_{ni}]} = \phi_{ni}\Big(\Psi(\gamma_i) - \Psi\Big(\textstyle\sum_{j=1}^{k} \gamma_j\Big)\Big) + \phi_{ni} \log \beta_{iv} - \phi_{ni} \log \phi_{ni} + \lambda_n \Big(\textstyle\sum_{j=1}^{k} \phi_{nj} - 1\Big) $$

where v denotes the vocabulary index of the word w_n.
Taking the partial derivative with respect to φ_ni:

$$ \frac{\partial L}{\partial \phi_{ni}} = \Psi(\gamma_i) - \Psi\Big(\textstyle\sum_{j=1}^{k} \gamma_j\Big) + \log \beta_{iv} - \log \phi_{ni} - 1 + \lambda_n $$
Setting it to zero and solving yields:

$$ \phi_{ni} \propto \beta_{iv} \exp\Big(\Psi(\gamma_i) - \Psi\Big(\textstyle\sum_{j=1}^{k} \gamma_j\Big)\Big) $$
For γ, the same steps apply. Keeping only the terms of L that involve γ:

$$ L_{[\gamma]} = \sum_{i=1}^{k} (\alpha_i - 1)\Big(\Psi(\gamma_i) - \Psi\Big(\textstyle\sum_{j=1}^{k} \gamma_j\Big)\Big) + \sum_{n=1}^{N} \sum_{i=1}^{k} \phi_{ni}\Big(\Psi(\gamma_i) - \Psi\Big(\textstyle\sum_{j=1}^{k} \gamma_j\Big)\Big) - \log \Gamma\Big(\textstyle\sum_{j=1}^{k} \gamma_j\Big) + \sum_{i=1}^{k} \log \Gamma(\gamma_i) - \sum_{i=1}^{k} (\gamma_i - 1)\Big(\Psi(\gamma_i) - \Psi\Big(\textstyle\sum_{j=1}^{k} \gamma_j\Big)\Big) $$

Differentiating with respect to γ_i:

$$ \frac{\partial L}{\partial \gamma_i} = \Psi'(\gamma_i)\Big(\alpha_i + \sum_{n=1}^{N} \phi_{ni} - \gamma_i\Big) - \Psi'\Big(\textstyle\sum_{j=1}^{k} \gamma_j\Big) \sum_{j=1}^{k} \Big(\alpha_j + \sum_{n=1}^{N} \phi_{nj} - \gamma_j\Big) $$

Setting this to zero gives the update:

$$ \gamma_i = \alpha_i + \sum_{n=1}^{N} \phi_{ni} $$
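Since each update depends on the other parameter, they are iterated to convergence as coordinate ascent on one document. Below is a minimal sketch of that loop; the function and variable names are my own, and the toy α, β, and document are made-up illustration data, not from the paper.

```python
# Coordinate ascent on phi and gamma for a single document (a sketch,
# assuming beta_[k][v] = p(word v | topic k) is given and fixed).
import math

def digamma(x):
    # Psi(x) via the recurrence Psi(x) = Psi(x + 1) - 1/x, then the
    # standard asymptotic series once x >= 6.
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x - f * (1.0/12 - f * (1.0/120 - f/252))

def variational_estep(doc, alpha, beta_, n_iter=100):
    """doc: list of word ids; alpha: length-K prior; returns (phi, gamma)."""
    K, N = len(alpha), len(doc)
    phi = [[1.0 / K] * K for _ in range(N)]        # uniform initialization
    gamma = [alpha[k] + N / K for k in range(K)]
    for _ in range(n_iter):
        g_sum = digamma(sum(gamma))
        for n, v in enumerate(doc):
            # phi_{nk} proportional to beta_{kv} * exp(Psi(gamma_k) - Psi(sum_j gamma_j)),
            # normalized so each row sums to 1.
            row = [beta_[k][v] * math.exp(digamma(gamma[k]) - g_sum) for k in range(K)]
            s = sum(row)
            phi[n] = [p / s for p in row]
        # gamma_k = alpha_k + sum_n phi_{nk}
        gamma = [alpha[k] + sum(phi[n][k] for n in range(N)) for k in range(K)]
    return phi, gamma

alpha = [0.5, 0.5]                                  # toy symmetric prior
beta_ = [[0.7, 0.2, 0.1], [0.1, 0.2, 0.7]]          # toy topic-word table
doc = [0, 0, 2]
phi, gamma = variational_estep(doc, alpha, beta_)
```

Note that because each φ row is normalized, the γ update guarantees Σ_i γ_i = Σ_i α_i + N, which is a quick invariant to check in any implementation.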
Main reference: Blei, Ng, and Jordan, "Latent Dirichlet Allocation".
LDA Topic Model Learning Note 3.5: Derivation of variational parameters