Edited, memorised or added to reading queue on 19-Nov-2024 (Tue)

Flashcard 7656668794124

Tags
#feature-engineering #lstm #recurrent-neural-networks #rnn
Question
The LSTM neural network would be well-suited for modeling online customer behavior across [...] websites since it can naturally capture inter-sequence and inter-temporal interactions from multiple streams of clickstream data without growing exponentially in complexity.
Answer
multiple

Status: not learned · Measured difficulty: 37% [default] · Repetition number in this series: 0

Parent (intermediate) annotation

The LSTM neural network would be well-suited for modeling online customer behavior across multiple websites since it can naturally capture inter-sequence and inter-temporal interactions from multiple streams of clickstream data without growing exponentially in complexity.
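
For illustration, a minimal sketch of this idea using the Keras functional API (the sequence length, feature count, and layer sizes are placeholder assumptions): each website's clickstream is a separate input sequence encoded by its own LSTM, and the encodings are concatenated before the prediction head.

# Minimal sketch (assumed shapes): two clickstream sources,
# each a sequence of 50 events with 8 features per event.
from tensorflow import keras
from tensorflow.keras import layers

site_a = keras.Input(shape=(50, 8), name="site_a_clicks")
site_b = keras.Input(shape=(50, 8), name="site_b_clicks")

# One LSTM per stream captures the temporal structure of that stream.
enc_a = layers.LSTM(32)(site_a)
enc_b = layers.LSTM(32)(site_b)

# Concatenating the encodings lets the head model interactions across streams.
merged = layers.concatenate([enc_a, enc_b])
output = layers.Dense(1, activation="sigmoid")(merged)

model = keras.Model(inputs=[site_a, site_b], outputs=output)
model.compile(optimizer="adam", loss="binary_crossentropy")

Adding another stream only adds one more input and LSTM branch, so the model grows roughly linearly with the number of streams rather than exponentially.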

Original toplevel document (pdf)

Flashcard 7663773682956

Tags
#ggplot2
Question
The problem here is that by default scales::[...] multiplies its input value by 100. This can be controlled by the scale parameter.
Answer
percent()

Status: not learned · Measured difficulty: 37% [default] · Repetition number in this series: 0

Parent (intermediate) annotation

The problem here is that by default scales::percent() multiplies its input value by 100. This can be controlled by the scale parameter.

Original toplevel document

Something is not right here! 4000%!? That seems a bit excessive. The problem here is that by default scales::percent() multiplies its input value by 100. This can be controlled by the scale parameter:

scales::percent(100, scale = 1)
## [1] "100%"

However, scale_y_continuous() expects a function as input for its labels parameter, not the actual labels themselves. Thus, using percent() is not an option anymore. Fortu

Flashcard 7667204099340

Tags
#deep-learning #has-images #keras #lstm #python #sequence
[unknown IMAGE 7104054824204]
Question
For example, if we had two time steps and one feature for a [...] sequence with two lag observations per row, it would be specified as in listing 4.5
Answer
univariate

Status: not learned · Measured difficulty: 37% [default] · Repetition number in this series: 0

Parent (intermediate) annotation

For example, if we had two time steps and one feature for a univariate sequence with two lag observations per row, it would be specified as in listing 4.5
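
The original listing 4.5 is not reproduced here, but a reshape of this kind typically looks like the sketch below (toy values assumed): the two lag observations per row become two time steps of a single feature in the 3D [samples, time steps, features] layout an LSTM layer expects.

import numpy as np

# Assumed toy data: each row holds [lag1, lag2] for a univariate series.
X = np.array([[1.0, 2.0],
              [2.0, 3.0],
              [3.0, 4.0]])
y = np.array([3.0, 4.0, 5.0])

# LSTM input must be 3D: [samples, time steps, features].
X = X.reshape((X.shape[0], 2, 1))
print(X.shape)  # (3, 2, 1) -> 3 samples, 2 time steps, 1 feature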

Original toplevel document (pdf)

Flashcard 7667291393292

Tags
#feature-engineering #lstm #recurrent-neural-networks #rnn
Question
The dimensionality of the vector is often reduced through word [...], a technique used in natural language processing
Answer
embedding

Status: not learned · Measured difficulty: 37% [default] · Repetition number in this series: 0

Parent (intermediate) annotation

The dimensionality of the vector is often reduced through word embedding, a technique used in natural language processing
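
As a rough sketch of the idea in Keras (the vocabulary size and embedding width are assumptions): an Embedding layer maps each word index to a dense 64-dimensional vector instead of a sparse one-hot vector the size of the vocabulary.

from tensorflow import keras
from tensorflow.keras import layers

# Assumed sizes: a 10,000-word vocabulary reduced to 64-dimensional embeddings.
model = keras.Sequential([
    layers.Embedding(input_dim=10_000, output_dim=64),
    layers.LSTM(32),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")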

Original toplevel document (pdf)

Flashcard 7667293228300

Tags
#deep-learning #keras #lstm #python #sequence
Question
Sequence prediction problems must be [...] as supervised learning problems. That is, data must be transformed from a sequence into input and output pairs.
Answer
re-framed

Status: not learned · Measured difficulty: 37% [default] · Repetition number in this series: 0

Parent (intermediate) annotation

Sequence prediction problems must be re-framed as supervised learning problems. That is, data must be transformed from a sequence into input and output pairs.
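
A small sketch of this re-framing (the helper name and window length are illustrative): a sliding window turns each run of lag values into an input and the following value into its output.

# Hypothetical helper: turn a sequence into (input window, next value) pairs.
def to_supervised(sequence, n_lags):
    X, y = [], []
    for i in range(len(sequence) - n_lags):
        X.append(sequence[i:i + n_lags])
        y.append(sequence[i + n_lags])
    return X, y

X, y = to_supervised([10, 20, 30, 40, 50], n_lags=2)
# X = [[10, 20], [20, 30], [30, 40]]
# y = [30, 40, 50]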

Original toplevel document (pdf)

Flashcard 7667295587596

Tags
#ML_in_Action #learning #machine #software-engineering
Question
Part 3 (chapters 14–16) focuses on “the after”: specifically, considerations related to streamlining production release, retraining, monitoring, and attribution for a project. With examples focused on [...], feature stores, and a passive retraining system, you’ll be shown how to implement systems and architectures that can ensure that you’re building the minimally complex solution to solve a business problem with ML.
Answer
A/B testing

Status: not learned · Measured difficulty: 37% [default] · Repetition number in this series: 0

Parent (intermediate) annotation

(chapters 14–16) focuses on “the after”: specifically, considerations related to streamlining production release, retraining, monitoring, and attribution for a project. With examples focused on A/B testing, feature stores, and a passive retraining system, you’ll be shown how to implement systems and architectures that can ensure that you’re building the minimally complex solution to solve

Original toplevel document (pdf)

Flashcard 7667297422604

Tags
#deep-learning #keras #lstm #python #sequence
Question

The choice of [...] [number] will influence both:

- The internal state accumulated during the forward pass.

- The gradient estimate used to update weights on the backward pass.

Answer
time steps

Status: not learned · Measured difficulty: 37% [default] · Repetition number in this series: 0

Parent (intermediate) annotation

The choice of time steps [number] will influence both: - The internal state accumulated during the forward pass. - The gradient estimate used to update weights on the backward pass.
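
A short sketch of how this choice shows up in practice (values are illustrative): framing the same series with different numbers of time steps per sample changes how long a window the LSTM accumulates state over on the forward pass, and how far backpropagation through time unrolls when estimating the gradient on the backward pass.

import numpy as np

series = np.arange(12, dtype="float32")

# The same data framed with 2, 4, or 6 time steps per sample.
for n_steps in (2, 4, 6):
    n_samples = len(series) // n_steps
    X = series[: n_samples * n_steps].reshape((n_samples, n_steps, 1))
    print(n_steps, X.shape)  # shapes: (6, 2, 1), (3, 4, 1), (2, 6, 1)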

Original toplevel document (pdf)
