Edited, memorised or added to reading queue

on 09-Oct-2025 (Thu)


Flashcard 7734847999244

Tags
#deep-learning #embeddings
Question
Using a similar idea to how we get word embeddings, we can draw an analogy: a word is like a product; a [...] is like ONE customer's shopping sequence; an article is like the collection of ALL customers' shopping sequences
Answer
sentence

status: not learned
measured difficulty: 37% [default]
last interval [days]:
repetition number in this series: 0
memorised on:
scheduled repetition:
scheduled repetition interval:
last repetition or drill:

Parent (intermediate) annotation

Using a similar idea to how we get word embeddings, we can draw an analogy: a word is like a product; a sentence is like ONE customer's shopping sequence; an article is like the collection of ALL customers' shopping sequences
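As an illustration of this analogy, here is a minimal sketch, assuming gensim 4.x and a hypothetical list of purchase histories: each customer's shopping sequence plays the role of a sentence, and each product ID plays the role of a word. The toy data and parameter values are assumptions for illustration, not part of the original card.

from gensim.models import Word2Vec

# Hypothetical data: one inner list per customer, products in purchase order.
shopping_sequences = [
    ["milk", "bread", "eggs"],
    ["bread", "butter", "jam"],
    ["eggs", "bacon", "bread"],
]

# Train skip-gram embeddings over the "corpus" of all customers' sequences,
# exactly as one would train word embeddings over the sentences of an article.
model = Word2Vec(sentences=shopping_sequences, vector_size=32, window=2,
                 min_count=1, sg=1)

print(model.wv["bread"])               # the learned product embedding
print(model.wv.most_similar("bread"))  # products bought in similar contexts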

Original toplevel document (pdf)


Flashcard 7762250697996

Tags
#deep-learning #keras #lstm #python #sequence
Question
As part of framing your problem, you must split long sequences into subsequences that are long enough to [...] for making predictions, yet short enough to train the network efficiently
Answer
capture relevant context

status: not learned
measured difficulty: 37% [default]
last interval [days]:
repetition number in this series: 0
memorised on:
scheduled repetition:
scheduled repetition interval:
last repetition or drill:

Parent (intermediate) annotation

As part of framing your problem, you must split long sequences into subsequences that are long enough to capture relevant context for making predictions, yet short enough to train the network efficiently
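A minimal sketch of that trade-off, assuming NumPy and tensorflow.keras and a hypothetical subsequence length of 25 steps: a long series is cut into sliding windows that become the LSTM training samples.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

long_sequence = np.arange(1000, dtype="float32") / 1000.0  # hypothetical long series
subseq_len = 25  # long enough to capture context, short enough to train efficiently

# Slide a window over the long sequence: each subsequence predicts the next value.
X = np.array([long_sequence[i:i + subseq_len]
              for i in range(len(long_sequence) - subseq_len)])
y = long_sequence[subseq_len:]
X = X.reshape((X.shape[0], subseq_len, 1))  # [samples, timesteps, features]

model = Sequential([LSTM(32, input_shape=(subseq_len, 1)), Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

Here subseq_len is the framing choice the card refers to: too small and the subsequences lose the context needed for prediction, too large and each training step through time becomes slow and hard to optimise.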

Original toplevel document (pdf)
