#Docker

# Docker look at the log of an exited container (with timestamps)

docker logs -t [CONTAINER NAME]

#### Flashcard 7560894483724

Tags
#abm #agent-based #priority #rooftop-solar #simulation #synthetic-data
Question
Our third assumption is that each individual makes independent decisions at each time t, conditional on state x. Again, if x includes all [...] relevant to an agent’s decision, this assumption is relatively innocuous
features

status measured difficulty not learned 37% [default] 0

#### Parent (intermediate) annotation

Open it
Our third assumption is that each individual makes independent decisions at each time t, conditional on state x. Again, if x includes all features relevant to an agent’s decision, this assumption is relatively innocuous

#### Original toplevel document (pdf)

cannot see any pdfs

#### Flashcard 7560896318732

Tags
#deep-learning #keras #lstm #python #sequence
Question
How to Convert Categorical Data to Numerical Data. This involves two steps: 1. Integer Encoding. 2. [...] Encoding.
One Hot

status measured difficulty not learned 37% [default] 0

#### Parent (intermediate) annotation

Open it
How to Convert Categorical Data to Numerical Data. This involves two steps: 1. Integer Encoding. 2. One Hot Encoding.

#### Original toplevel document (pdf)

cannot see any pdfs
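The two steps above can be sketched in plain Python. The `labels` data and function names here are made up for illustration; in practice Keras users would typically reach for `to_categorical`, and scikit-learn offers `LabelEncoder` and `OneHotEncoder`.

```python
def integer_encode(labels):
    """Step 1: Integer Encoding -- map each distinct category to an index."""
    categories = sorted(set(labels))
    index = {cat: i for i, cat in enumerate(categories)}
    return [index[label] for label in labels], categories

def one_hot_encode(encoded, num_classes):
    """Step 2: One Hot Encoding -- map each index to a binary vector."""
    vectors = []
    for value in encoded:
        vec = [0] * num_classes
        vec[value] = 1
        vectors.append(vec)
    return vectors

labels = ["cold", "hot", "warm", "cold"]   # made-up example data
encoded, categories = integer_encode(labels)
onehot = one_hot_encode(encoded, len(categories))
print(encoded)    # [0, 1, 2, 0]
print(onehot[0])  # [1, 0, 0]
```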

#### Flashcard 7560898153740

Tags
#deep-learning #keras #lstm #python #sequence
Question
LSTMs may not be ideal for all sequence prediction problems. For example, in time series forecasting, often the information relevant for making a forecast is within a small window of past observations. Often an [...] with a window or a linear model may be a less complex and more suitable model
MLP

status measured difficulty not learned 37% [default] 0

#### Parent (intermediate) annotation

Open it
LSTMs may not be ideal for all sequence prediction problems. For example, in time series forecasting, often the information relevant for making a forecast is within a small window of past observations. Often an MLP with a window or a linear model may be a less complex and more suitable model

#### Original toplevel document (pdf)

cannot see any pdfs
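The windowed alternative mentioned above can be sketched in plain Python: a sliding window turns a time series into fixed-size (input, target) pairs that a linear model or MLP can fit. The function name `make_windows` and the example series are ours, for illustration only.

```python
def make_windows(series, window):
    """Split a time series into (lag-window, next-value) training pairs.

    Each input is the previous `window` observations; the target is the
    value that immediately follows. This fixed-size view is what an MLP
    or linear model consumes, in contrast to an LSTM's learned memory.
    """
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return X, y

series = [10, 20, 30, 40, 50]          # made-up example series
X, y = make_windows(series, window=2)
print(X)  # [[10, 20], [20, 30], [30, 40]]
print(y)  # [30, 40, 50]
```

Each row of `X` could then be fed to any fixed-input regressor; no recurrent state is needed when the relevant history fits in the window.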

Tags
#Docker
Question

# Docker look at the log of an exited container (with timestamps)

docker logs [...] [CONTAINER NAME]
-t

status measured difficulty not learned 37% [default] 0

Open it
Docker look at the log of an exited container (with timestamps) docker logs -t [CONTAINER NAME]

#### Annotation 7560902872332

 #feature-engineering #lstm #recurrent-neural-networks #rnn The effect of a direct mailing does not end after the campaign is over, and the customer has made her decision to respond or not. An advertising campaign or customer retention program can impact customers' behaviors for several weeks, even months. Customers tend to remember past events, at least partially. Hence, the effects of marketing actions tend to carry over into numerous subsequent periods

#### Parent (intermediate) annotation

Open it
The effect of a direct mailing does not end after the campaign is over, and the customer has made her decision to respond or not. An advertising campaign or customer retention program can impact customers' behaviors for several weeks, even months. Customers tend to remember past events, at least partially. Hence, the effects of marketing actions tend to carry over into numerous subsequent periods (Lilien, Rangaswamy, & De Bruyn, 2013; Schweidel & Knox, 2013; Van Diepen et al., 2009). The LSTM neural network, which we introduce next, is a kind of RNN that has been modified

#### Original toplevel document (pdf)

cannot see any pdfs

#### Flashcard 7560904707340

Tags
#feature-engineering #lstm #recurrent-neural-networks #rnn
Question
Because of their typical [...], the hidden states of RNN models are usually more potent than that of hidden markov models (e.g., Netzer, Lattin, & Srinivasan, 2008), which are commonly used in marketing to capture customer dynamics.
high-dimensionality

status measured difficulty not learned 37% [default] 0

#### Parent (intermediate) annotation

Open it
Because of their typical high-dimensionality, the hidden states of RNN models are usually more potent than that of hidden markov models (e.g., Netzer, Lattin, & Srinivasan, 2008), which are commonly used in marketing to capture customer dynamics.

#### Original toplevel document (pdf)

cannot see any pdfs

#### Flashcard 7560907066636

Tags
#feature-engineering #lstm #recurrent-neural-networks #rnn
Question
In this paper, we have shown that recent neural network architectures, traditionally used in natural language processing and machine translation, could effectively do away with the complicated and time-consuming step of feature engineering, even when applied to highly [...] problems such as predicting the future behaviors of a panel of customers.
structured

status measured difficulty not learned 37% [default] 0

#### Parent (intermediate) annotation

Open it
traditionally used in natural language processing and machine translation, could effectively do away with the complicated and time-consuming step of feature engineering, even when applied to highly structured problems such as predicting the future behaviors of a panel of customers.

#### Original toplevel document (pdf)

cannot see any pdfs

#### Annotation 7560909163788

 #causality #statistics Given that we have tools to measure association, how can we isolate causation? In other words, how can we ensure that the association we measure is causation, say, for measuring the causal effect of 𝑋 on 𝑌 ? Well, we can do that by ensuring that there is no non-causal association flowing between 𝑋 and 𝑌

Open it

#### Original toplevel document (pdf)

cannot see any pdfs

#### Flashcard 7560911523084

Tags
#feature-engineering #has-images #lstm #recurrent-neural-networks #rnn
[unknown IMAGE 7103902780684]
Question
Fig. 2. Classic [...] neural network (A), recurrent neural network (B), and “unrolled” graphical representation of a recurrent neural network (C) where we use sequence data (x 1 ,x 2 ,x 3 ) to make sequence predictions (y 1 ,y 2 ,y 3 ) while preserving information through the hidden states h 1 ,h 2 ,h 3
feedforward

status measured difficulty not learned 37% [default] 0

#### Parent (intermediate) annotation

Open it
Fig. 2. Classic feedforward neural network (A), recurrent neural network (B), and “unrolled” graphical representation of a recurrent neural network (C) where we use sequence data (x 1 ,x 2 ,x 3 ) to make sequence

#### Original toplevel document (pdf)

cannot see any pdfs

#### Flashcard 7560913358092

Tags
#deep-learning #keras #lstm #python #sequence
Question
LSTMs may not be ideal for all sequence prediction problems. For example, in time series forecasting, often the information relevant for making a forecast is within a [...] window of past observations
small

status measured difficulty not learned 37% [default] 0

#### Parent (intermediate) annotation

Open it
LSTMs may not be ideal for all sequence prediction problems. For example, in time series forecasting, often the information relevant for making a forecast is within a small window of past observations

#### Original toplevel document (pdf)

cannot see any pdfs

#### Annotation 7560923319564

 #pytest #python #unittest Beware of float return values! 0.1 + 0.1 + 0.1 == 0.3 Sometimes false assert 0.1 + 0.1 + 0.1 == 0.3, "Usual way to compare does not always work with floats!" Instead use: assert 0.1 + 0.1 + 0.1 == pytest.approx(0.3)

#### Flashcard 7560925154572

Tags
#pytest #python #unittest
Question

Beware of float return values!
0.1 + 0.1 + 0.1 == 0.3 Sometimes false

assert 0.1 + 0.1 + 0.1 == 0.3, "Usual way to compare does not always work with floats!"

assert 0.1 + 0.1 + 0.1 == pytest.[...](0.3)

approx

status measured difficulty not learned 37% [default] 0

Open it
Beware of float return values! 0.1 + 0.1 + 0.1 == 0.3 Sometimes false assert 0.1 + 0.1 + 0.1 == 0.3, "Usual way to compare does not always work with floats!" Instead use: assert 0.1 + 0.1 + 0.1 == pytest.approx(0.3)
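The pitfall above can be reproduced directly. `pytest.approx` compares within a tolerance; the standard library's `math.isclose` performs the same kind of tolerance-based check, and is used here so the sketch stays dependency-free outside a pytest test suite.

```python
import math

total = 0.1 + 0.1 + 0.1

# Naive equality fails: binary floats cannot represent 0.1 exactly,
# so the sum accumulates a tiny rounding error.
print(total == 0.3)              # False
print(total)                     # 0.30000000000000004

# Tolerance-based comparison succeeds; inside a pytest suite,
# `assert total == pytest.approx(0.3)` makes an equivalent check.
print(math.isclose(total, 0.3))  # True
```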

#### Annotation 7560929873164

 [unknown IMAGE 7560930397452] #has-images Multiple assertions in one unit test: the test will pass only if both assertions pass.
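A minimal sketch of this behavior, with a made-up function under test: pytest stops a test at the first failing assertion, so both must hold for the test to pass.

```python
def add(a, b):
    """Toy function under test (made up for illustration)."""
    return a + b

def test_add():
    # Both assertions must hold for the test to pass; if the first
    # fails, the test is reported as failed and the second never runs.
    result = add(2, 3)
    assert result == 5, "sum should be 5"
    assert isinstance(result, int), "sum of ints should be an int"

test_add()  # completes without raising, i.e. the test passes
```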