Edited, memorised or added to reading queue on 12-Nov-2024 (Tue)


#recurrent-neural-networks #rnn
A Bayesian non-parametric approach to flexibly model purchasing dynamics depending on calendar time effects, inter-event timing and customer lifetime was recently proposed by Dew and Ansari (2018).


Parent (intermediate) annotation

achieved by applying more general hidden Markov models (see, e.g., Netzer, Lattin, & Srinivasan (2008), Schweidel, Bradlow, & Fader (2011), Romero, van der Lans, & Wierenga (2013)). A Bayesian non-parametric approach to flexibly model purchasing dynamics depending on calendar time effects, inter-event timing and customer lifetime was recently proposed by Dew and Ansari (2018). However, all such approaches come at the cost of additional model complexity, rising computational cost, and a loss in the sufficiency …

Original toplevel document (pdf)





#has-images #recurrent-neural-networks #rnn
traditional statistical forecasting models often suffer from poor efficiency when increasing model complexity and heavily rely on manual feature engineering and data labeling


Parent (intermediate) annotation

capability both in the short and in the long-term with limited engineering requirements at low computational cost and providing a direct link toward managerial decision-making. Recognizing that traditional statistical forecasting models often suffer from poor efficiency when increasing model complexity and heavily rely on manual feature engineering and data labeling, Table 1 picks up these issues and compares some of the key differences between stochastic BTYD models and the deep learning approach we present in this section …

Original toplevel document (pdf)





Flashcard 7666402725132

Tags
#ML-engineering #ML_in_Action #learning #machine #software-engineering
Question
Nothing is more demoralizing than building an ML solution that solves the [...] problem
Answer
wrong


Parent (intermediate) annotation

Nothing is more demoralizing than building an ML solution that solves the wrong problem

Original toplevel document (pdf)








#deep-learning #keras #lstm #python #sequence

3 common examples for managing state:

  • A prediction is made at the end of each sequence and sequences are independent. State should be reset after each sequence by setting the batch size to 1.
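The first pattern can be illustrated with a hand-rolled recurrent cell in plain NumPy (weights and sizes are hypothetical; a stateless Keras LSTM trained with batch_size=1 has the same effect of discarding state after every sequence):

```python
import numpy as np

# Minimal sketch: independent sequences, so the hidden state is zeroed
# before each sequence and a prediction is made only at the end of it.
rng = np.random.default_rng(0)
W_x = rng.normal(size=(1, 4))   # input-to-hidden weights (hypothetical)
W_h = rng.normal(size=(4, 4))   # hidden-to-hidden weights (hypothetical)

def run_sequence(seq):
    h = np.zeros(4)             # state reset: fresh state per sequence
    for x in seq:
        h = np.tanh(x @ W_x + h @ W_h)
    return h                    # final state used for the prediction

seq_a = rng.normal(size=(10, 1))
seq_b = rng.normal(size=(10, 1))
h_a, h_b = run_sequence(seq_a), run_sequence(seq_b)

# Because state is reset, processing seq_b is unaffected by seq_a:
assert np.allclose(run_sequence(seq_b), h_b)
```

With batch size 1 in Keras, each batch is exactly one sequence, so the stateless layer's automatic end-of-batch reset coincides with the end of each sequence.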


Parent (intermediate) annotation

To make this more concrete, below are 3 common examples for managing state:

  • A prediction is made at the end of each sequence and sequences are independent. State should be reset after each sequence by setting the batch size to 1.
  • A long sequence was split into multiple subsequences (many samples each with many time steps). State should be reset after the network has been exposed to the entire sequence by making …

Original toplevel document (pdf)





Flashcard 7666406919436

Tags
#RNN #ariadne #behaviour #consumer #deep-learning #priority #recurrent-neural-networks #retail #simulation #synthetic-data
Question
We focus on predicting the probability P(o_u | x^u_1, ..., x^u_T) of a consumer u to place an order o_u, which we model as a [...] classification problem. For instance, we could be interested in orders in general or of specific products.
Answer
binary


Parent (intermediate) annotation

We focus on predicting the probability P(o_u | x^u_1, ..., x^u_T) of a consumer u to place an order o_u, which we model as a binary classification problem. For instance, we could be interested in orders in general or of specific products.
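The binary classification setup can be sketched in plain NumPy (all names and weights are hypothetical, and a simple mean over time steps stands in for the recurrent encoding of x^u_1, ..., x^u_T):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def order_probability(x_seq, w, b):
    """P(o_u | x^u_1, ..., x^u_T): summarize the feature sequence
    (here a mean over time steps, standing in for an RNN encoding)
    and squash through a sigmoid for a binary classification output."""
    summary = x_seq.mean(axis=0)
    return sigmoid(summary @ w + b)

rng = np.random.default_rng(1)
x_seq = rng.normal(size=(12, 3))   # T=12 time steps, 3 features per step
p = order_probability(x_seq, w=rng.normal(size=3), b=0.0)
assert 0.0 < p < 1.0               # a valid order probability
```

The sigmoid output is what makes the problem binary: one probability per consumer, thresholded or ranked downstream, whether the "order" of interest is any order or one for a specific product.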

Original toplevel document (pdf)








#deep-learning #keras #lstm #python #sequence

Input must be three-dimensional, comprised of samples, time steps, and features in that order.

Samples. These are the rows in your data. One sample may be one sequence.
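Assuming NumPy and a univariate series, the required [samples, time steps, features] layout can be produced with a single reshape:

```python
import numpy as np

# 24 observations of a univariate series
data = np.arange(24)

# Reshape into [samples, time steps, features]:
# 4 samples (sequences), 6 time steps each, 1 feature
X = data.reshape(4, 6, 1)
print(X.shape)  # (4, 6, 1)
```

This is the three-dimensional shape the first hidden layer expects, e.g. `input_shape=(6, 1)` in Keras, where the sample dimension is left implicit.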



Parent (intermediate) annotation

The first hidden layer in the network must define the number of inputs to expect, e.g. the shape of the input layer. Input must be three-dimensional, comprised of samples, time steps, and features in that order.

  • Samples. These are the rows in your data. One sample may be one sequence.
  • Time steps. These are the past observations for a feature, such as lag variables.
  • Features. These are columns in your data.

Original toplevel document (pdf)





Flashcard 7666411113740

Tags
#deep-learning #keras #lstm #python #sequence
Question
LSTMs may not be ideal for all sequence prediction problems. For example, in [...] forecasting, often the information relevant for making a forecast is within a small window of past observations
Answer
time series


Parent (intermediate) annotation

LSTMs may not be ideal for all sequence prediction problems. For example, in time series forecasting, often the information relevant for making a forecast is within a small window of past observations
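The "small window of past observations" framing can be sketched as fixed-size lag windows (hypothetical helper, plain NumPy), the setup in which simpler window-based models often compete with LSTMs:

```python
import numpy as np

def make_windows(series, window):
    """Frame a series as fixed-size lag windows: each row of X holds the
    last `window` observations, and y is the next value to forecast."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y

series = np.arange(10)
X, y = make_windows(series, window=3)
print(X.shape)  # (7, 3): 7 windows of 3 lag values each
print(y[0])     # 3: the value following the window [0, 1, 2]
```

When such a window captures all the relevant information, a model that sees only the window has no disadvantage, and the LSTM's ability to carry state across long ranges buys little.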

Original toplevel document (pdf)








Flashcard 7666412948748

Tags
#tensorflow #tensorflow-certificate
Question
  1. Converting [...] columns

For example: Use pandas get_dummies() function

Answer
non-numerical


Parent (intermediate) annotation

Converting non-numerical columns. For example: use the pandas get_dummies() function.

Original toplevel document

TfC_01_FINAL_EXAMPLE.ipynb
Getting dataset ready for tensorflow: converting non-numerical columns, for example with the pandas get_dummies() function.

insurance_one_hot = pd.get_dummies(insurance, dtype="int32")  # avoid bool dtype, which causes problems with model fitting in TensorFlow
insurance_one_hot

# Create X and y values (features and labels)
y = insurance_one_hot['charges']
X = insurance_one_hot.drop('charges', axis=1)
# y = y.values  # This is not necessary
# X = X.values
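A self-contained sketch of the same get_dummies() pattern on a toy frame (hypothetical columns, since the insurance data itself is not shown here):

```python
import pandas as pd

# A non-numerical "sex" column becomes one-hot integer columns;
# dtype="int32" avoids the bool dtype noted above.
df = pd.DataFrame({"sex": ["male", "female", "male"], "age": [40, 35, 28]})
one_hot = pd.get_dummies(df, dtype="int32")
print(list(one_hot.columns))  # ['age', 'sex_female', 'sex_male']
```

Numeric columns pass through unchanged; each category of an object column becomes its own 0/1 indicator, which is the form TensorFlow models can consume directly.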