Building your Recurrent Neural Network - Step by Step
Welcome to Course 5's first assignment! In this assignment, you'll implement your first recurrent neural network in NumPy.
Recurrent neural networks (RNNs) are very effective for natural language processing and other sequence tasks because they have "memory". They can read inputs $x^{\langle t \rangle}$ (such as words) one at a time, and remember some information/context through the hidden-layer activations that are passed from one time step to the next. This allows a unidirectional RNN to take information from the past to process later inputs. A bidirectional RNN can take context from both the past and the future.
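The "memory" described above can be sketched as a single forward time step in NumPy. This is a minimal illustration, not the assignment's prescribed implementation; the parameter names (`Wax`, `Waa`, `ba`) and the `tanh` activation are assumptions chosen for the sketch.

```python
import numpy as np

def rnn_step(xt, a_prev, Wax, Waa, ba):
    """One RNN time step: the new hidden state mixes the current
    input xt with the previous hidden state a_prev (the "memory")."""
    return np.tanh(Wax @ xt + Waa @ a_prev + ba)

rng = np.random.default_rng(0)
n_x, n_a, m = 3, 5, 10               # input size, hidden size, batch size
xt = rng.standard_normal((n_x, m))   # inputs at time step t
a_prev = np.zeros((n_a, m))          # initial hidden state
Wax = rng.standard_normal((n_a, n_x))
Waa = rng.standard_normal((n_a, n_a))
ba = np.zeros((n_a, 1))

a_next = rnn_step(xt, a_prev, Wax, Waa, ba)
print(a_next.shape)  # (5, 10)
```

Feeding `a_next` back in as `a_prev` at the next time step is what lets information from earlier inputs influence later ones.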
Notation:
- Superscript $[l]$ denotes an object associated with the $l^{th}$ layer.
    - Example: $a^{[4]}$ is the $4^{th}$ layer activation. $W^{[5]}$ and $b^{[5]}$ are the $5^{th}$ layer parameters.
- Superscript $(i)$ denotes an object associated with the $i^{th}$ example.
    - Example: $x^{(i)}$ is the $i^{th}$ training example input.
- Superscript $\langle t \rangle$ denotes an object at the $t^{th}$ time step.
    - Example: $x^{\langle t \rangle}$ is the input $x$ at the $t^{th}$ time step. $x^{(i)\langle t \rangle}$ is the input at the $t^{th}$ time step of example $i$.
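The indexing notation above maps naturally onto NumPy slicing. As a sketch, assume the inputs are stacked into one array `x` of shape `(n_x, m, T_x)` (features, examples, time steps); this layout is an assumption for illustration, not fixed by the text.

```python
import numpy as np

# Hypothetical input tensor: n_x features, m examples, T_x time steps.
n_x, m, T_x = 3, 4, 6
x = np.arange(n_x * m * T_x).reshape(n_x, m, T_x)

i, t = 1, 2
x_i_t = x[:, i, t]   # x^{(i)<t>}: features of example i at time step t
print(x_i_t.shape)   # (3,)
```

Slicing with `x[:, :, t]` would similarly give $x^{\langle t \rangle}$, the inputs for all examples at time step $t$.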