Edited, memorised or added to reading queue on 14-Dec-2025 (Sun)


#deep-learning #keras #lstm #python #sequence
RNNs are inherently deep in time, since their hidden state is a function of all previous hidden states. The question that inspired this paper was whether RNNs could also benefit from depth in space; that is from stacking multiple recurrent hidden layers on top of each other, just as feedforward layers are stacked in conventional deep networks. — Speech Recognition With Deep Recurrent Neural Networks, 2013


Parent (intermediate) annotation

7.1. The Stacked LSTM. RNNs are inherently deep in time, since their hidden state is a function of all previous hidden states. The question that inspired this paper was whether RNNs could also benefit from depth in space; that is, from stacking multiple recurrent hidden layers on top of each other, just as feedforward layers are stacked in conventional deep networks. — Speech Recognition With Deep Recurrent Neural Networks, 2013. In the same work, they found that the depth of the network was more important to model skill than the number of memory cells in a given layer. Stacked LSTMs are now a stable technique…

Original toplevel document (pdf)

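The "depth in time" vs. "depth in space" distinction above can be sketched in plain NumPy (a minimal illustration, not the paper's implementation; all sizes and weight names here are assumptions): each layer runs its own recurrence over time, and the hidden-state sequence of one layer becomes the input sequence of the next.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_layer(x_seq, W_x, W_h, b):
    """Run one vanilla RNN layer over a (T, d_in) input sequence.
    Returns the full (T, d_h) hidden-state sequence."""
    T = x_seq.shape[0]
    d_h = W_h.shape[0]
    h = np.zeros(d_h)
    out = np.empty((T, d_h))
    for t in range(T):
        # Depth in time: h_t is a function of all previous hidden states.
        h = np.tanh(x_seq[t] @ W_x + h @ W_h + b)
        out[t] = h
    return out

def stacked_rnn(x_seq, params):
    """Depth in space: feed each layer's hidden sequence to the next layer."""
    seq = x_seq
    for W_x, W_h, b in params:
        seq = rnn_layer(seq, W_x, W_h, b)
    return seq

# Illustrative sizes: 5 time steps, 3 input features, 4 hidden units, 2 layers.
T, d_in, d_h, n_layers = 5, 3, 4, 2
params, d = [], d_in
for _ in range(n_layers):
    params.append((rng.normal(size=(d, d_h)) * 0.1,
                   rng.normal(size=(d_h, d_h)) * 0.1,
                   np.zeros(d_h)))
    d = d_h

out = stacked_rnn(rng.normal(size=(T, d_in)), params)
print(out.shape)  # → (5, 4)
```

In Keras the same stacking is done by passing each LSTM layer's full hidden-state sequence (rather than only its last state) to the layer above it.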




Flashcard 7777541033228

Tags
#deep-learning #keras #lstm #python #sequence
Question

Input Weights.

Used to weight input for the [...] time step.

Answer
current


Parent (intermediate) annotation

Input Weights. Used to weight input for the current time step.

Original toplevel document (pdf)

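The flashcard's point can be made concrete with a minimal, hypothetical update rule (plain NumPy; the sizes and names are illustrative assumptions, not from the card): the input weights act on the input of the current time step, while separate recurrent weights act on the previous hidden state.

```python
import numpy as np

d_in, d_h = 3, 4  # illustrative sizes
rng = np.random.default_rng(1)
W_x = rng.normal(size=(d_in, d_h))  # input weights: applied to x_t, the CURRENT step's input
W_h = rng.normal(size=(d_h, d_h))   # recurrent weights: applied to h_{t-1}
b = np.zeros(d_h)

x_t = rng.normal(size=d_in)  # input for the current time step
h_prev = np.zeros(d_h)       # hidden state carried over from the previous step

# The input weights weight the input for the current time step:
h_t = np.tanh(x_t @ W_x + h_prev @ W_h + b)
print(h_t.shape)  # → (4,)
```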







#deep #keras #learning #tensorflow #tfc-II
Model development for production continues to be a combination of automatic and hand-designed learning—which is often crucial for proprietary needs or advantages. But designing by hand does not mean starting from scratch; typically, you would start with a stock model and make tweaks and adjustments.


Parent (intermediate) annotation

Model development for production continues to be a combination of automatic and hand-designed learning—which is often crucial for proprietary needs or advantages. But designing by hand does not mean starting from scratch; typically, you would start with a stock model and make tweaks and adjustments. To do this effectively, you need to know how the model works and why it works that way, the concepts that underlie its design, and the pros and cons of alternative building blocks you will learn from other SOTA models.

Original toplevel document (pdf)





#deep #keras #learning #tensorflow #tfc-II
designing by hand does not mean starting from scratch; typically, you would start with a stock model and make tweaks and adjustments. To do this effectively, you need to know how the model works and why it works that way, the concepts that underlie its design, and the pros and cons of alternative building blocks you will learn from other SOTA models


Parent (intermediate) annotation

Model development for production continues to be a combination of automatic and hand-designed learning—which is often crucial for proprietary needs or advantages. But designing by hand does not mean starting from scratch; typically, you would start with a stock model and make tweaks and adjustments. To do this effectively, you need to know how the model works and why it works that way, the concepts that underlie its design, and the pros and cons of alternative building blocks you will learn from other SOTA models

Original toplevel document (pdf)





#tensorflow #tensorflow-certificate

Tracking your experiments

One really good habit is to track the results of your experiments. There are tools to help us!



Parent (intermediate) annotation

Tracking your experiments. One really good habit is to track the results of your experiments. There are tools to help us! Resource: Try: TensorBoard, a component of the TensorFlow library, to help track modelling experiments; Weights & Biases

Original toplevel document

TfC 01 regression
Take away: You should minimize the time between your experiments (that's why you should start with smaller models). The more experiments you do, the more things you figure out that don't work. Tracking your experiments: One really good habit is to track the results of your experiments. There are tools to help us! Resource: Try: TensorBoard, a component of the TensorFlow library, to help track modelling experiments; Weights & Biases. Saving and loading models. Two formats: SavedModel format (including the optimizer's step) and HDF5 format. What about TensorFlow Serving format? # Save the entire model using SavedModel model_3…
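The habit above doesn't require a particular tool — TensorBoard and Weights & Biases are the ones named in the note, but even a minimal hand-rolled log captures the idea. A sketch using only the standard library (the file name and fields are assumptions, not from the note):

```python
import csv
import time
from pathlib import Path

LOG = Path("experiments.csv")  # hypothetical log file
FIELDS = ["run_id", "timestamp", "learning_rate", "n_layers", "val_loss"]

def log_experiment(run_id, learning_rate, n_layers, val_loss):
    """Append one experiment's settings and result to the CSV log."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({"run_id": run_id,
                         "timestamp": time.strftime("%Y-%m-%dT%H:%M:%S"),
                         "learning_rate": learning_rate,
                         "n_layers": n_layers,
                         "val_loss": val_loss})

# e.g. after each training run:
log_experiment("run-001", 1e-3, 2, 0.42)
log_experiment("run-002", 1e-4, 3, 0.37)
```

Because every run lands in one file with the same columns, comparing experiments (and spotting what doesn't work) becomes a matter of sorting by `val_loss`.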




#RNN #ariadne #behaviour #consumer #deep-learning #patterns #priority #recurrent-neural-networks #retail #simulation #synthetic-data

Customer Shopping Pattern Model

We study the customer behaviour through time with equal time steps (intervals), as demonstrated in Figure 3. The time interval can be weekly, bi-weekly, monthly, etc. Since the first purchase time differs among customers, we define a lower limit and an upper limit in time. The lower limit refers to the start point of our study and the upper limit refers to the end point of our study through time. In this case, we can define equal time intervals between the lower and upper limits, as shown in Figure 3. The shopper's purchase is then identified in each time interval, and the R, F, and M variables are computed with respect to any point of interest.



Parent (intermediate) annotation

Customer Shopping Pattern Model. We study the customer behaviour through time with equal time steps (intervals), as demonstrated in Figure 3. The time interval can be weekly, bi-weekly, monthly, etc. Since the first purchase time differs among customers, we define a lower limit and an upper limit in time. The lower limit refers to the start point of our study and the upper limit refers to the end point of our study through time. In this case, we can define equal time intervals between the lower and upper limits, as shown in Figure 3. The shopper's purchase is then identified in each time interval and the R, F, and M variables are computed with respect to any point of interest. For example, if the point of interest is at t4, the recency is the time difference between the last purchase before the time of interest and the point of interest itself. The time difference…

Original toplevel document (pdf)

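The interval scheme above can be sketched in plain Python (a minimal illustration; the toy purchase data, interval width, and the exact recency convention are assumptions, not the paper's): purchases are bucketed into equal intervals between the lower and upper limits, and R, F, M are computed with respect to a point of interest.

```python
# Toy purchase history: (time in days from the lower limit, amount spent).
purchases = [(3, 20.0), (10, 35.0), (24, 15.0), (31, 50.0)]  # assumed data

lower, upper, step = 0, 40, 10  # lower/upper limits and equal interval width
intervals = [(t, t + step) for t in range(lower, upper, step)]

# Identify the shopper's purchases in each time interval.
per_interval = [[(t, amt) for t, amt in purchases if a <= t < b]
                for a, b in intervals]

def rfm_at(point_of_interest):
    """R, F, M with respect to a point of interest in time."""
    past = [(t, amt) for t, amt in purchases if t <= point_of_interest]
    if not past:
        return None
    recency = point_of_interest - max(t for t, _ in past)  # time since last purchase
    frequency = len(past)                                  # purchases so far
    monetary = sum(amt for _, amt in past)                 # total spend so far
    return recency, frequency, monetary

print([len(p) for p in per_interval])  # → [1, 1, 1, 1]
print(rfm_at(24))                      # → (0, 3, 70.0)
```

Sliding the point of interest across the intervals yields an R/F/M sequence per customer, which is the kind of time-step input a recurrent model can consume.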