# on 12-Jan-2023 (Thu)

#### Annotation 7103919557900

 #feature-engineering #lstm #recurrent-neural-networks #rnn It is worth noting that although our study focuses on LSTM neural networks, there are other variants of the RNN as well, such as the Gated Recurrent Unit (GRU), which uses internal recurrence and a gating mechanism along with the external recurrence of the RNN (Cho et al., 2014; Chung, Gulcehre, Cho, & Bengio, 2014). However, research suggests that none of the existing variants of the LSTM significantly improves on the vanilla LSTM neural network.

#### pdf

cannot see any pdfs

#### Annotation 7552574295308

 #ML-engineering #ML_in_Action #learning #machine #software-engineering These inexperienced DS team members, performing only the most cursory research, adapt a basic demo from a blog post. While their basic testing shows promise, they fail to thoroughly research the implementation details required to employ the model on their data. By retraining the pretrained model on only a few hundred images of two of the many thousands of products in their corpus of images, they obtain misleading results that hide the problem with their approach.

#### pdf

cannot see any pdfs

#Docker

# Docker look at the log of an exited container

docker logs -t [CONTAINER NAME]
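A typical workflow sketch (the container name `web` is illustrative, not from the original note): first list exited containers to find the name, then read the log with timestamps.

```shell
# List all containers, including exited ones
docker ps -a --filter "status=exited"

# Show the log of a stopped container, with timestamps (-t / --timestamps)
docker logs -t web
```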

Tags
#Docker
Question

# Docker look at the log of an exited container

docker logs -t [[...]]
CONTAINER NAME

status measured difficulty not learned 37% [default] 0

Open it
Docker look at the log of an exited container docker logs -t [CONTAINER NAME]

#### Annotation 7559033261324

Debugging Shiny applications
#R #debugger #has-images #shiny

#### Breakpoint Limitations

Unfortunately, breakpoints aren’t helpful in all situations. For technical reasons, breakpoints can only be used inside the shinyServer function. You can’t use them in code in other .R files. And breakpoints can tell you something about why code is executing, but they can’t always tell you why something isn’t executing.

### browser() statements

The browser() statement is another useful debugging tool. It acts like a breakpoint: when evaluated, it halts execution and enters the debugger. You can add it anywhere an R expression is valid.

Unlike breakpoints, browser() works everywhere, so it’s suitable for use in any code invoked by your Shiny app. You can also invoke browser() conditionally to create conditional breakpoints; for instance:
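A minimal sketch (the input name `input$bins` and the threshold are illustrative, not from the original text):

```r
# Inside a render function in server.R:
output$distPlot <- renderPlot({
  if (input$bins > 50) {
    browser()  # halts here only when the condition holds
  }
  hist(rnorm(500), breaks = input$bins)
})
```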

The downside of browser() is that you need to re-run your Shiny application to apply it, and you need to remember to take it out afterwards.

#### Annotation 7559036931340

 #filesharing #linux #network Using ls and cd commands on network folders. This depends on how the NAS is set up. Does it use Windows sharing? Then you can use smbclient to connect to the server, and there you can use ls and cd.
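A hedged sketch of the smbclient approach (the server name `nas.local`, share name `share`, and username are illustrative): connect to the SMB share, then browse at the interactive prompt.

```shell
# Connect to a Windows/SMB share on the NAS (names are examples)
smbclient //nas.local/share -U myuser

# At the "smb: \>" prompt you can then browse the remote folder:
#   ls           list the current remote directory
#   cd folder    change remote directory
#   get file     download a file to the local machine
```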

#### Flashcard 7559042436364

Tags
#feature-engineering #lstm #recurrent-neural-networks #rnn
Question
Park and Fader (2004) leveraged internet clickstream data from multiple websites, such that relevant information from one website could be used to explain behavior on the other. The LSTM neural network would be well-suited for modeling online customer behavior across multiple websites since it can naturally capture [...] and inter-temporal interactions from multiple streams of clickstream data without growing exponentially in complexity.
inter-sequence

status measured difficulty not learned 37% [default] 0

#### Parent (intermediate) annotation

Open it
ebsite could be used to explain behavior on the other. The LSTM neural network would be well-suited for modeling online customer behavior across multiple websites since it can naturally capture inter-sequence and inter-temporal interactions from multiple streams of clickstream data without growing exponentially in complexity.

#### Original toplevel document (pdf)

cannot see any pdfs

#### Flashcard 7559046368524

Tags
#deep-learning #keras #lstm #python #sequence
Question
The promise of recurrent neural networks is that the [...] dependence and contextual information in the input data can be learned. A recurrent network whose inputs are not fixed but rather constitute an input sequence can be used to transform an input sequence into an output sequence while taking into account contextual information in a flexible way