
#deep-learning #keras #lstm #python #sequence

7.1. The Stacked LSTM

RNNs are inherently deep in time, since their hidden state is a function of all previous hidden states. The question that inspired this paper was whether RNNs could also benefit from depth in space; that is, from stacking multiple recurrent hidden layers on top of each other, just as feedforward layers are stacked in conventional deep networks. — Speech Recognition With Deep Recurrent Neural Networks, 2013

In the same work, the authors found that the depth of the network mattered more to model skill than the number of memory cells in a given layer. Stacked LSTMs are now a stable technique for challenging sequence prediction problems. A Stacked LSTM architecture can be defined as an LSTM model composed of multiple LSTM layers. Each LSTM layer provides a sequence output rather than a single value output to the LSTM layer above it: one output per input time step, rather than one output time step for all input time steps.
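
As a concrete illustration, here is a minimal Stacked LSTM sketch using the Keras API (the layer sizes of 50 units and the input shape of 10 time steps with 1 feature are illustrative assumptions, not values from the text):

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Two stacked LSTM layers; the lower layer must return the full
# sequence (one output per input time step) for the layer above it.
model = keras.Sequential([
    keras.Input(shape=(10, 1)),              # 10 time steps, 1 feature
    layers.LSTM(50, return_sequences=True),  # sequence out -> next LSTM
    layers.LSTM(50),                         # single output for the sequence
    layers.Dense(1),
])
model.compile(optimizer='adam', loss='mse')

# Dummy batch: 32 sequences of 10 time steps with 1 feature each.
X = np.random.random((32, 10, 1)).astype('float32')
y = np.random.random((32, 1)).astype('float32')
model.fit(X, y, epochs=1, verbose=0)

Dropping return_sequences=True on the lower layer would hand the upper LSTM a single vector rather than a sequence and fail with a shape error, since LSTM layers expect 3D input (batch, time steps, features).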



Flashcard 7629867453708

Question

Predictive customer scores

The company develops analytics—often using several types of machine-learning algorithms—to understand and track what is [...] customer satisfaction and business performance, and to detect specific events in customer journeys.

Answer
influencing



Flashcard 7629873220876

Tags
#DAG #causal #edx
Question

Two sources of bias:

- common cause ([...])

- conditioning on common effect (selection bias)

Answer
confounding
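
For intuition, a toy NumPy simulation of both biases (the variable names follow the usual DAG convention: treatment A, outcome Y, confounder L, collider C; all values are synthetic):

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Confounding: a common cause L drives both A and Y, so A and Y are
# correlated even though neither affects the other.
L = rng.normal(size=n)
A = L + rng.normal(size=n)
Y = L + rng.normal(size=n)
print(np.corrcoef(A, Y)[0, 1])  # ~0.5, a spurious association

# Selection bias: A and Y are independent, but conditioning on their
# common effect C (a collider) induces an association.
A2 = rng.normal(size=n)
Y2 = rng.normal(size=n)
C = A2 + Y2 + rng.normal(size=n)
sel = C > 1.0  # selecting on the collider
print(np.corrcoef(A2[sel], Y2[sel])[0, 1])  # negative, selection-induced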



Flashcard 7669103856908

Tags
#deep-learning #keras #lstm #python #sequence
Question
The computational unit of the LSTM network is called the memory cell, memory block, or just [...] for short.
Answer
cell
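
In Keras, for instance, the number of such cells in a layer is set by the units argument (a minimal sketch; the value 32 is arbitrary):

from tensorflow.keras.layers import LSTM

# An LSTM layer containing 32 memory cells (called "units" in Keras).
layer = LSTM(32)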



Flashcard 7669105954060

Tags
#recurrent-neural-networks #rnn
Question
it also accurately predicts periods of elevated transaction activity and captures other forms of purchase [...] that can be leveraged in simulations of future sequences of customer transactions.
Answer
dynamics



Flashcard 7669107789068

Tags
#R #debugger #shiny
Question
Unlike breakpoints, browser() works everywhere, so it’s suitable for use in any code invoked by your [...] app.
Answer
Shiny


Original toplevel document

Debugging shiny applications
The browser() statement is another useful debugging tool. It acts like a breakpoint: when evaluated, it halts execution and enters the debugger. You can add it anywhere an R expression is valid. Unlike breakpoints, browser() works everywhere, so it’s suitable for use in any code invoked by your Shiny app. You can also invoke browser() conditionally to create conditional breakpoints; for instance: if (input$bins > 50) browser(). The downside of browser() is that you need to re-run your …

Flashcard 7669109886220

Tags
#tensorflow #tensorflow-certificate
Question
tf.ones([...])


<tf.Tensor: shape=(10, 7), dtype=float32, numpy=
array([[1., 1., 1., 1., 1., 1., 1.],
       [1., 1., 1., 1., 1., 1., 1.],
       [1., 1., 1., 1., 1., 1., 1.],
       [1., 1., 1., 1., 1., 1., 1.],
       [1., 1., 1., 1., 1., 1., 1.],
       [1., 1., 1., 1., 1., 1., 1.],
       [1., 1., 1., 1., 1., 1., 1.],
       [1., 1., 1., 1., 1., 1., 1.],
       [1., 1., 1., 1., 1., 1., 1.],
       [1., 1., 1., 1., 1., 1., 1.]], dtype=float32)>

Answer
[10, 7]
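
For reference, tf.ones takes a shape (as a list or tuple) and an optional dtype, so the call above can be reproduced as follows:

import tensorflow as tf

# A 10x7 tensor of ones; dtype defaults to float32.
t = tf.ones([10, 7])
print(t.shape)  # (10, 7)
print(t.dtype)  # <dtype: 'float32'>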


#Master2-TAI-Presentation-article-virologie

Evaluation of antiviral activities of the drugs

To analyze the antiviral activities of chloroquine phosphate (CQ, Sigma-Aldrich, no. C6628) and hydroxychloroquine sulfate (HCQ, MCE, no. HY-B1370), the cytotoxicity of the two drugs on Vero E6 cells was first determined using cell counting kit-8 (CCK8) (Beyotime, China) according to the manufacturer's protocol. Then, Vero E6 cells (1 × 10⁵ cells/well) cultured in 48-well cell-culture plates were pre-treated with different concentrations of the two compounds for 1 h at 37 °C, followed by infection with virus at different MOIs (0.01, 0.02, 0.2, and 0.8) for 2 h.

Subsequently, the virus-drug mixture was removed and the cells were extensively washed with PBS. Then, fresh drug-containing medium was added and maintained until 48 h p.i., and the virus yield in the infected cell supernatant was quantified by quantitative real-time RT-PCR (qRT-PCR) (Takara, Cat. no. 9766) as described previously.² Briefly, total viral RNA was isolated and cDNA was synthesized by reverse transcription. Quantitative PCR was performed using the cDNA as template with specific primers against the receptor-binding domain (RBD) of the viral spike gene. A standard curve was generated using serially diluted plasmid standards. After determining the viral RNA copies in each group by qRT-PCR, dose-response curves were plotted using GraphPad Prism 6 software.



#Master2-TAI-Presentation-article-virologie
Time-of-addition experiment

Time-of-addition experiments with CQ and HCQ were performed as described previously.² Briefly, Vero E6 cells (1 × 10⁵ cells/well) were treated with CQ (10 μM), HCQ (30 μM), or PBS control at different stages of virus infection (Entry, Full-time, and Post-entry), and then infected with virus at an MOI of 0.07. At 16 h p.i., the virus yield in the infected cell supernatant was quantified by qRT-PCR. The expression of viral NP in the infected cells was analyzed by immunofluorescence analysis (IFA) using anti-NP rabbit sera (1:1000 dilution) and Alexa 488-labeled goat anti-rabbit IgG (1:500 dilution; Abcam) as the primary and secondary antibodies, respectively. The nuclei were stained with Hoechst 33258 dye (Beyotime, China). The cells were imaged by fluorescence microscopy.


#Master2-TAI-Presentation-article-virologie

For co-localization analysis of virions and EEA1⁺ EEs, the fixed cells were incubated with rabbit sera against NP (anti-NP, 1:500 dilution) and mouse anti-EEA1 antibody (1:100 dilution, Cell Signaling Technology) as the primary antibodies, and then stained with Alexa 647-labeled goat anti-rabbit IgG (1:500 dilution; Abcam) and Alexa 488-labeled rabbit anti-mouse IgG (1:500 dilution; Abcam) as the secondary antibodies, respectively. For co-localization analysis of virions and LAMP1⁺ ELs, mouse monoclonal antibody against NP (1:100 dilution) and rabbit anti-LAMP1 antibody (1:500 dilution, Cell Signaling Technology) were used as the primary antibodies, and Alexa 647-labeled rabbit anti-mouse IgG (1:500 dilution; Abcam) and Alexa 488-labeled goat anti-rabbit IgG (1:500 dilution; Abcam) were used as the secondary antibodies, respectively. Fluorescence images were obtained using a confocal microscope (Nikon A1RMP two-photon microscope). Then, the proportions of SARS-CoV-2 particles (yellow) co-localized with EEs or ELs (green) relative to all particles (red) in the cells were quantified and analyzed (n > 30) with ImageJ (Colocalization Threshold plugin).



#Master2-TAI-Presentation-article-virologie
The mechanism of CQ and HCQ in inhibition of virus entry

To analyze the mode of action of CQ and HCQ in inhibiting virus entry, co-localization analysis of virions with early endosomes (EEs) or endolysosomes (ELs) was carried out. Vero E6 cells (2.5 × 10⁵ cells/well) cultured in 35-mm glass-bottom culture dishes were pre-treated with CQ (50 μM), HCQ (50 μM), or PBS control for 1 h before virus attachment, and then incubated with SARS-CoV-2 (MOI = 10) at 4 °C to allow virus binding for 1 h. After being washed three times with pre-chilled PBS, the cells were further cultured in fresh drug-containing medium at 37 °C for 90 min, and then fixed and subjected to IFA.