Edited, memorised or added to reading queue on 03-Jan-2025 (Fri)


Flashcard 7545578196236

Tags
#DAG #causal #edx
Question
Inverse probability [...] is in fact just one of the group of so-called G-methods
Answer
weighting

status: not learned · measured difficulty: 37% [default] · repetition number in this series: 0 (no repetitions scheduled yet)

Parent (intermediate) annotation

Inverse probability weighting is in fact just one of the group of so-called G-methods

Original toplevel document (pdf)


Flashcard 7628373232908

Tags
#DAG #causal #edx
Question
So all these methods for confounding adjustment -- stratification, matching, inverse probability weighting, G-formula, G-estimation -- have two things in common. First, they require data on the [...] that block the backdoor path. If those data are available, then the choice of one of these methods over the others is often a matter of personal taste. Unless the treatment is time-varying -- then we have to go to G-methods
Answer
confounders


Parent (intermediate) annotation

So all these methods for confounding adjustment -- stratification, matching, inverse probability weighting, G-formula, G-estimation -- have two things in common. First, they require data on the confounders that block the backdoor path. If those data are available, then the choice of one of these methods over the others is often a matter of personal taste. Unless the treatment is time-varying -- then we have to go to G-methods.

Original toplevel document (pdf)

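The stratification idea in this card can be made concrete with a small simulation. The sketch below (plain Python, toy data; the variable names and the data-generating process are illustrative, not from the course) shows a crude treated-vs-untreated contrast being biased by a confounder Z, and the bias disappearing once we stratify on Z and standardize over its distribution.

```python
import random

random.seed(0)

# Toy data: binary confounder Z raises both the probability of treatment A
# and the probability of outcome Y, opening a backdoor path A <- Z -> Y.
n = 10_000
data = []
for _ in range(n):
    z = random.random() < 0.5                    # confounder
    a = random.random() < (0.8 if z else 0.2)    # Z raises P(treatment)
    # True treatment effect on Y is +0.1; Z adds +0.3 on its own.
    y = random.random() < (0.2 + 0.1 * a + 0.3 * z)
    data.append((z, a, y))

def mean_y(rows):
    return sum(y for _, _, y in rows) / len(rows)

# Crude (confounded) contrast: biased upward because treated units
# are more likely to have Z = True.
treated = [r for r in data if r[1]]
untreated = [r for r in data if not r[1]]
crude = mean_y(treated) - mean_y(untreated)

# Stratified contrast, standardized over the distribution of Z:
# within each stratum the backdoor path is blocked.
adjusted = 0.0
for z_val in (False, True):
    stratum = [r for r in data if r[0] == z_val]
    t = [r for r in stratum if r[1]]
    u = [r for r in stratum if not r[1]]
    adjusted += (len(stratum) / n) * (mean_y(t) - mean_y(u))

print(f"crude: {crude:.3f}, adjusted: {adjusted:.3f}")  # adjusted is near +0.10
```

The same stratified means could feed matching or the G-formula; with a single binary confounder all of these adjustment methods coincide, which is why the card calls the choice "a matter of personal taste".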

Flashcard 7628979309836

Tags
#deep-learning #keras #lstm #python #sequence
Question
Batch: A pass through a subset of samples in the training dataset after which the [...] are updated. One epoch consists of one or more batches
Answer
network weights


Parent (intermediate) annotation

Batch: A pass through a subset of samples in the training dataset after which the network weights are updated. One epoch consists of one or more batches

Original toplevel document (pdf)

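The batch/epoch distinction can be seen in a minimal minibatch gradient-descent loop. This is a plain-Python sketch of the general idea (toy 1-D regression, all names illustrative), not Keras code: the weight is updated once per batch, and one epoch is one full pass over all batches.

```python
import random

random.seed(1)

# Toy regression target: y = 3x, learned with a single weight w.
xs = [random.uniform(-1, 1) for _ in range(64)]
ys = [3.0 * x for x in xs]

w = 0.0
batch_size = 16
lr = 0.5
updates = 0

for epoch in range(20):                      # 20 epochs
    for start in range(0, len(xs), batch_size):  # 4 batches per epoch
        xb = xs[start:start + batch_size]
        yb = ys[start:start + batch_size]
        # Mean-squared-error gradient computed on this batch only
        grad = sum(2 * (w * x - y) * x for x, y in zip(xb, yb)) / len(xb)
        w -= lr * grad                       # one weight update per batch
        updates += 1

print(f"w = {w:.3f} after {updates} updates")  # 4 batches x 20 epochs = 80 updates
```

In Keras terms, `batch_size` and `epochs` would be arguments to `fit`; here the nesting of the two loops makes the relationship explicit.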

Flashcard 7673834245388

Tags
#abm #agent-based #machine-learning #model #priority #synergistic-integration
Question
Semisupervised learning falls between supervised and unsupervised learning. It is an approach that combines a small amount of labeled data with a large amount of unlabeled data during training, used when the cost of [...] work may render large, fully labeled training sets infeasible, whereas the acquisition of unlabeled data is relatively inexpensive.
Answer
labeling


Parent (intermediate) annotation

Semisupervised learning falls between supervised and unsupervised learning. It is an approach that combines a small amount of labeled data with a large amount of unlabeled data during training, used when the cost of labeling work may render large, fully labeled training sets infeasible, whereas the acquisition of unlabeled data is relatively inexpensive.

Original toplevel document (pdf)

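Self-training is one common way to exploit cheap unlabeled data alongside a few labels. The sketch below is an illustrative toy (1-D data, nearest-centroid classifier, all names and thresholds chosen for the example): each round it pseudo-labels the unlabeled points it is most confident about and adds them to the training set.

```python
import random

random.seed(2)

# A handful of labeled points from two 1-D clusters (classes 0 and 1)...
labeled = [(random.gauss(0.0, 0.5), 0) for _ in range(3)] + \
          [(random.gauss(4.0, 0.5), 1) for _ in range(3)]
# ...plus many cheap unlabeled points from the same clusters.
unlabeled = [random.gauss(0.0, 0.5) for _ in range(200)] + \
            [random.gauss(4.0, 0.5) for _ in range(200)]

def centroids(points):
    """Nearest-centroid 'classifier': one mean per class."""
    return {label: sum(x for x, y in points if y == label) /
                   len([x for x, y in points if y == label])
            for label in (0, 1)}

train = list(labeled)
for _ in range(5):                       # a few self-training rounds
    c = centroids(train)
    # Confidence = margin between the distances to the two centroids;
    # sort so the most confident unlabeled points come first.
    scored = sorted(unlabeled,
                    key=lambda x: -abs(abs(x - c[0]) - abs(x - c[1])))
    confident, unlabeled = scored[:40], scored[40:]
    # Pseudo-label the confident points and add them to the training set.
    train += [(x, 0 if abs(x - c[0]) < abs(x - c[1]) else 1)
              for x in confident]

c = centroids(train)
predict = lambda x: 0 if abs(x - c[0]) < abs(x - c[1]) else 1
print(predict(-0.2), predict(4.3))
```

Real libraries wrap the same loop more robustly (e.g. scikit-learn's `SelfTrainingClassifier`); the point here is only that six labels plus 400 unlabeled points suffice to place the decision boundary.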

#causal #inference

The G-methods family, as developed by James Robins and colleagues, specifically includes:

  1. Inverse probability weighting (IPW)
  2. G-computation
  3. G-estimation of structural nested models

Inverse probability matching (propensity score matching) is not technically a G-method. While it shares the fundamental goal of addressing confounding and selection bias, it employs a different mathematical framework and estimation approach.
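A minimal sketch of the first G-method, inverse probability weighting, on simulated data. Everything here is illustrative (toy data-generating process, propensities taken as known rather than estimated): each subject is weighted by the inverse of the probability of the treatment they actually received given the confounder Z, creating a pseudo-population in which treatment is independent of Z.

```python
import random

random.seed(3)

# Toy data: confounder Z raises both P(treatment A) and P(outcome Y).
# The true treatment effect on Y is +0.10.
n = 20_000
rows = []
for _ in range(n):
    z = random.random() < 0.5
    p_a = 0.8 if z else 0.2                  # propensity P(A=1 | Z), known here
    a = random.random() < p_a
    y = random.random() < (0.2 + 0.1 * a + 0.3 * z)
    rows.append((a, y, p_a))

def weighted_mean_y(arm):
    """Weighted outcome mean in one arm: weight = 1 / P(A = arm | Z)."""
    num = den = 0.0
    for a, y, p_a in rows:
        if a == arm:
            w = 1 / (p_a if arm else 1 - p_a)
            num += w * y
            den += w
    return num / den

ipw_effect = weighted_mean_y(1) - weighted_mean_y(0)
print(f"IPW effect estimate: {ipw_effect:.3f}")  # close to the true +0.10
```

In practice the propensities would be estimated (e.g. by logistic regression) rather than known, and stabilized weights are often used to tame extreme values; those refinements are omitted to keep the weighting step itself visible.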

status: not read (no reprioritisations or reading dates recorded)