Edited, memorised or added to reading list

on 26-Sep-2018 (Wed)


Flashcard 3401628126476

Question
In Python, there is a syntactic convention for marking variables and methods in a class as private. For example, how do you mark a method privateMethod(self, x) as private?
Answer
__privateMethod(self, x)
^^ note the leading double underscore
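Not part of the original card, but a minimal sketch of what the convention actually does: a leading double underscore triggers Python's name mangling, which renames the attribute rather than enforcing access control. The `Account` class here is purely illustrative.

```python
class Account:
    def __init__(self, balance):
        self.__balance = balance           # stored as _Account__balance

    def __privateMethod(self, x):          # leading double underscore: "private"
        return self.__balance + x

    def deposit(self, x):
        return self.__privateMethod(x)     # usable as normal inside the class

acct = Account(100)
print(acct.deposit(5))                     # 105
# acct.__privateMethod(5) raises AttributeError outside the class, but the
# mangled name acct._Account__privateMethod(5) still works, so this is a
# convention, not real access control.
```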


statusnot learnedmeasured difficulty37% [default]last interval [days]               
repetition number in this series0memorised on               scheduled repetition               
scheduled repetition interval               last repetition or drill

pdf

cannot see any pdfs







#has-images

Studying in the US: Chicago and Northern Michigan




The US and China have lately been locked in a heated trade war, which has hurt the broader economy; the stock market is lifeless too.
But politics is politics: America's academic and research standards are without question the best in the world, and that draws students from every country. I studied in the US last year; digging out these old photos, it all feels like yesterday, so let me share my study-abroad experience.
1. The flight over
I connected through Hong Kong. Cathay Pacific's in-flight meals are genuinely superb, beating every mainland carrier hands down. The flight attendants' Mandarin was shaky, and many spoke only English or Cantonese, so I didn't force myself to strike up a conversation.




2. School and daily life
The school I attended is a top-15 private university in the US, right next to Lake Michigan, with gorgeous scenery. In summer the undergraduates have a sailing class, which is very cool; in winter the lake is blanketed in snow, silent as if cut off from the world, and you can even skate on it.
That year's Nobel laureate in Chemistry was a kindly old professor in a neighboring school: academically brilliant, yet very approachable. A rather different league from some professors at universities back home.









Eating out every day is too expensive, usually $15-25 per person per meal (roughly one to two hundred RMB), so to save money I bought ingredients at the Asian supermarket and cooked for myself. After all, my family was paying the tuition, so I economized where I could. Not exactly delicious, but at least edible...






3. Chicago
Chicago is a city with a long gang tradition; in some areas gang shootouts break out regularly and bystanders get hurt. As a rule of thumb, the north side is mostly white and relatively safe, while the south side is mostly Black and generally avoided as too dangerous.
As one of the three biggest cities in the US, it is on the whole pretty decent.





4. Fall foliage in northern Michigan
Autumn in northern Michigan is a great time to see the maple leaves. On my Chinese driver's license I rented a Dodge and drove from Chicago up to northern Michigan; the one-way trip took 10+ hours, with my roommate and me taking turns at the wheel. The scenery was worth the trouble, though.
We spent a night at an Airbnb along the way. By coincidence the host, though American through and through, loved China: he had a ping-pong table at home and souvenirs from an earlier trip to Sichuan. We hit it off and talked for ages.











That was roughly my first taste of America. I later traveled to Mexico, California, New York, and Boston as well; I may share those another time, haha.

Happy Mid-Autumn Festival, everyone!

statusnot read reprioritisations
last reprioritisation on reading queue position [%]
started reading on finished reading on




#ISAN3010 #book #ch4
Statistical analysis can enable managers to focus on the big picture and in turn make reasonably correct and unbiased business decisions. In addition, because a picture is worth a thousand words, statistical data plotted graphically can paint a picture of the entire business case and can be used by project managers to support their arguments while negotiating or when they find themselves cornered





#ISAN3010 #book #ch4
Scattered, fragmented, and unorganized data, when properly organized and analyzed, becomes information. Information, when properly interpreted, becomes knowledge. Knowledge, when properly used, enables informed, effective, and rational decision-making.





#ISAN3010 #book
Data: Raw (unanalyzed) results of measurements and observations constitute data. It can be qualitative or quantitative





#ISAN3010 #book
Qualitative data: Qualitative data refers to a quality or attribute. It is non-numerical and descriptive, and can only be observed or felt, not measured





#ISAN3010 #book
Quantitative data: Refers to quantity or numbers. It is numerical and can be measured





#ISAN3010 #book
Information: Analyzed and organized data becomes information.





#ISAN3010 #book
Statistics: Statistics is referred to as the methodology of gathering, organizing, analyzing, and interpreting data for decision-making





#ISAN3010 #book
Population: This includes all measurements or observations that are of interest; for example, all stakeholders of a project





#ISAN3010 #book
Sample: This is a subset of the population—for example, the project stakeholders or subjects selected to take part in a survey





#ISAN3010 #book
Probability trial: This is an experiment conducted to collect responses or specific measurements from selected subjects; for example, rolling a die





#ISAN3010 #book
Outcome or result: This is an output obtained after conducting a single probability trial; for example, obtaining 6 after rolling a die





#ISAN3010 #book
Sample space: This is a collection of all possible outcomes or results of a probability experiment; for example, {1, 2, 3, 4, 5, 6}.





#ISAN3010 #book
Event: This is a specific set of select outcomes of a probability trial and is a subset of the sample space. For example, {1, 3, 5} is the event representing all odd outcomes of a die-rolling experiment with the set of possible outcomes (sample space) given by {1, 2, 3, 4, 5, 6}





#ISAN3010 #book
In classical or theoretical probability, each outcome of a probability experiment or trial is equally likely to occur: P(Classical) = Number of outcomes in an event / Total number of all outcomes in a sample space
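The formula above can be sketched in a few lines of Python (not from the book; the function and variable names are mine):

```python
from fractions import Fraction

def classical_probability(event, sample_space):
    # P = number of outcomes in the event / total outcomes in the sample space
    return Fraction(len(set(event) & set(sample_space)), len(sample_space))

die = {1, 2, 3, 4, 5, 6}                       # one roll of a fair die
print(classical_probability({1, 3, 5}, die))   # 1/2
```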





#ISAN3010 #book
In empirical or statistical probability, each outcome of a probability experiment or trial is not equally likely to occur; rather, the probability of occurrence of each outcome depends on the results of a probability experiment. P(Empirical) = Frequency of occurrence of an event / Total frequency of occurrence of all events in a sample space
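A minimal sketch of the empirical formula, using observed frequencies instead of equally likely outcomes (the roll data is hypothetical):

```python
from collections import Counter

def empirical_probability(observations, event):
    # frequency of the event / total frequency of all observed outcomes
    counts = Counter(observations)
    return sum(counts[o] for o in event) / len(observations)

rolls = [6, 2, 6, 3, 6, 1, 5, 6, 4, 6]    # hypothetical rolls of a loaded die
print(empirical_probability(rolls, {6}))  # 0.5 -- six came up in 5 of 10 rolls
```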





#ISAN3010 #book
Probability Range The range of probabilities includes all probabilities between 0 (0%) and 1 (100%), both extremes inclusive. 0 ≤ P(E) ≤ 1





#ISAN3010 #book
Conditional probability is the occurrence of a certain event after the occurrence of another event. It is denoted by P(X | Y), which implies probability of occurrence of event X , given that event Y already occurred.
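For equally likely outcomes, P(X | Y) reduces to counting the overlap of the two events; a small sketch (not from the book, die example is mine):

```python
def conditional_probability(X, Y):
    # P(X | Y) = |X intersect Y| / |Y| when all outcomes are equally likely
    return len(X & Y) / len(Y)

X = {2, 4, 6}   # rolled an even number
Y = {4, 5, 6}   # rolled greater than 3
print(conditional_probability(X, Y))   # 2/3: P(even | greater than 3)
```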





#ISAN3010 #book
Designing a Statistical Study A statistical study involves the collection and analysis of data and can be designed by following these steps: 1. Identify the topic (variable) of interest and the domain (population) of study. 2. Develop a detailed plan for collecting data; if you use a sample, make sure it is representative of the population. 3. Collect the data. 4. Describe the data using descriptive statistics techniques. 5. Analyze the data using statistical techniques. 6. Make inferences about the population using inferential statistics. 7. Interpret the analysis results. 8. Make decisions based on the interpretation of the analysis results





#ISAN3010 #book
These questionnaires are what are called statistical surveys. They are used to collect quantitative information (factual or just opinions) from the target population, called a sample in statistical language.





#ISAN3010 #book
Experiments are conducted to collect the factual data via measurements of the experiments’ results





#ISAN3010 #book
In this technique of data collection, data is collected by simply observing the sample population without any type of influence or experimental manipulation





#ISAN3010 #book
The mean has both advantages and disadvantages. It is a reliable measure because all data values in the probability distribution are used to calculate the mean, and it can be used for both continuous and discrete quantitative data. However, the mean can be influenced by outliers (a data value that is far off from the rest of data values) in the probability distribution.





#ISAN3010 #book
Mean The mean of a probability distribution is equal to the sum of all possible values in the distribution divided by the total number of values in the distribution. It is often referred to as the arithmetic average of a probability distribution.





#ISAN3010 #book
The median has advantages and disadvantages. It is less affected by outliers and skewed data than the mean and is usually the preferred measure of central tendency when the distribution is not symmetrical. However, the median cannot be identified for categorical nominal data, because it cannot be logically ordered.





#ISAN3010 #book
Mode The mode is the value that occurs most frequently in a data set. If no data value occurs more than once, the data set does not have a mode. If two data values have the same frequency of occurrence in the data set, then each of the two values is a mode and this type of data set is called bimodal
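The mode rules above (no mode when nothing repeats, multiple modes when values tie) can be sketched directly; this helper is mine, not from the book:

```python
from collections import Counter

def modes(data):
    counts = Counter(data)
    top = max(counts.values())
    if top == 1:
        return []            # no value occurs more than once: no mode
    return sorted(v for v, c in counts.items() if c == top)

print(modes([2, 4, 4, 6, 7, 7, 9]))   # [4, 7] -- a bimodal data set
print(modes([1, 2, 3]))               # [] -- no mode
```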





#ISAN3010 #book
The PERT (Program Evaluation and Review Technique) three-point estimation technique for estimating the duration of a project activity is an example of a weighted mean or average. According to this technique, the estimated duration of a project activity is obtained by calculating the weighted mean of the pessimistic, realistic (most likely), and optimistic values of the duration using the formula: Duration = (P + 4R + O) / 6, where P is the pessimistic value, R is the realistic value, and O is the optimistic value. The realistic or most likely estimate is weighted 4 times more than the pessimistic and optimistic estimates.
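The PERT formula is a one-liner in code; the example durations are made up:

```python
def pert_duration(pessimistic, realistic, optimistic):
    # weighted mean with the most likely (realistic) estimate weighted 4x
    return (pessimistic + 4 * realistic + optimistic) / 6

# Hypothetical activity: P = 10 days, R = 6 days, O = 4 days
print(pert_duration(10, 6, 4))   # (10 + 24 + 4) / 6, about 6.33 days
```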





#ISAN3010 #book
Weighted Mean A weighted mean is calculated by using data values that have different weights assigned to them: x̄ = Σ(x·w) / Σw, where w is the weight of each data value x.
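A direct translation of the weighted-mean formula (the grade example is hypothetical):

```python
def weighted_mean(values, weights):
    # sum of (value x weight) divided by the sum of the weights
    return sum(x * w for x, w in zip(values, weights)) / sum(weights)

# Hypothetical course grade: exam score 80 with weight 3, homework 90 with weight 1
print(weighted_mean([80, 90], [3, 1]))   # (240 + 90) / 4 = 82.5
```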





#ISAN3010 #book
Range Range is the difference between the maximum and minimum values in a quantitative data set.





#ISAN3010 #book
Probability Distribution The assignment of a probability to each of the possible outcomes of a random statistical experiment is called a probability distribution.





#ISAN3010 #book
Random Variable The outcome of a probability distribution represented in numerical form is called a random variable, denoted by the letter x. An example of a random variable would be the number of support calls a company’s call center received in 24 hours.





#ISAN3010 #book
Discrete versus Continuous Random Variables. Discrete: possible outcomes are finite and countable (data can take only certain values). Continuous: possible outcomes are infinite and uncountable (data can take any value in an interval)





#ISAN3010 #book
Poisson Distribution The following are the characteristics of a Poisson distribution: ■ The experiment involves counting the number of times an event occurs in an interval of area, volume, or time. ■ The event is equally likely to occur in each interval. ■ The number of occurrences in one interval has no dependency on the number of occurrences in other intervals.
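The Poisson probability mass function for such a count is P(X = k) = λᵏe^(−λ)/k!; a small sketch (the call-center rate is hypothetical):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    # P(X = k) for a Poisson distribution with mean rate lam per interval
    return (lam ** k) * exp(-lam) / factorial(k)

# Hypothetical call center averaging 3 calls per minute:
print(round(poisson_pmf(2, 3), 3))   # 0.224 -- chance of exactly 2 calls in a minute
```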





#ISAN3010 #book
The following are the characteristics of a normal distribution: ■ The bell-shaped normal curve is symmetric about the mean μ. ■ The total area under the normal curve is equal to one. ■ The mean, mode, and median of a normal distribution are equal. ■ The normal curve peaks at the mean and slopes down as it moves away from the mean; it appears to touch the x-axis as it moves farther from the mean, but it never does.





#ISAN3010 #book
Mean, Variance, and Standard Deviation of a Binomial Distribution The following are the characteristics of a binomial distribution: ■ A binomial experiment involves a fixed number of trials, and all trials are independent of each other. ■ Each trial has only two outcomes: a success or a failure.





#ISAN3010 #book
A normal distribution that has mean 0 and standard deviation 1 is called a standard normal distribution.
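Any normal value can be mapped onto the standard normal distribution with the z-score transformation z = (x − μ)/σ; a minimal sketch with made-up numbers:

```python
def z_score(x, mu, sigma):
    # standardize: how many standard deviations x lies from the mean
    return (x - mu) / sigma

# Hypothetical population with mean 70 and standard deviation 10:
print(z_score(85, 70, 10))   # 1.5 -- x = 85 sits 1.5 standard deviations above the mean
```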





#ISAN3010 #book
Confidence Intervals A confidence interval is an interval estimate: a range of values together with a specified probability (confidence) that the value of the parameter of interest lies within it.





#ISAN3010 #book
Point Estimate versus Interval Estimate A point estimate of a parameter of interest is a single value of a statistic, whereas an interval estimate specifies a range for the probability of having the parameter of interest within it, as shown in Figure 4.5. The point estimate is more accurate





#ISAN3010 #book
The level of confidence, denoted by the letter c , indicates the probability that an interval estimate contains the parameter of interest.





#ISAN3010 #book
Predictive analytics includes a variety of statistical techniques that are used to analyze current and historical factual information to make predictions about future uncertainties.





#ISAN3010 #book
establishing a mathematical equation as a model to represent the interactions between the participating variables.





#ISAN3010 #book
A linear regression model portrays the correlation between one dependent variable and one or more independent variables





#rothbard #vietnam
American intervention in Vietnam began when the American government, under Franklin Roosevelt, in 1941, delivered an ultimatum to Japan to get its armed forces out of China and Indochina, from what would later be Vietnam. This ultimatum set the stage for Pearl Harbor


Subverting Peace and Freedom - LewRockwell
ir attitude toward foreign policy was the Vietnam war. America’s imperial war in Vietnam was, indeed, a microcosm of what has been tragically wrong with American foreign policy in this century. American intervention in Vietnam did not begin, as most people believe, with Kennedy or Eisenhower or even Truman. It began no later than the date when the American government, under Franklin Roosevelt, on November 26, 1941, delivered a sharp and insulting ultimatum to Japan to get its armed forces out of China and Indochina, from what would later be Vietnam. This U.S. ultimatum set the stage inevitably for Pearl Harbor. Engaged in a war in the Pacific to oust Japan from the Asian continent, the United States and its OSS (predecessor to the CIA) favored and aided Ho Chi Minh’s Communist-run national re




Flashcard 3404425465100

Tags
#rothbard #vietnam
Question
American intervention in Vietnam began when the American government, under [quem], [ano], delivered an ultimatum to Japan to get its armed forces out of China and Indochina, from what would later be Vietnam. This ultimatum set the stage for Pearl Harbor
Answer
Franklin Roosevelt
1941










Flashcard 3404427824396

Tags
#rothbard #vietnam
Question
American intervention in Vietnam began when the American government, under Franklin Roosevelt, in 1941, delivered an ultimatum to [país] to get its armed forces out of China and Indochina, from what would later be Vietnam. This ultimatum set the stage for [evento]
Answer
Japan
Pearl Harbor










it is considered good practice to generate a fresh key for every received payment. This technique exploits the fact that verifying the integrity of the ledger does not require knowing exactly who took part in the transactions, only that they followed the agreed-upon rules of the system





Flashcard 3404432280844

Question
What are X.509 certificates?
Answer
[default - edit me]









Bitcoin has BIP 70, which specifies a way of signing a “payment request” using X.509 certificates linked to the web PKI, giving a cryptographically secured and standardised way of knowing who you are dealing with. Identities in this system are the same as used in the web PKI: a domain name, email address or EV (extended validation) organisation name.





States may define fields of type Party, which encapsulates an identity and a public key. When a state is deserialised from a transaction in its raw form, the identity field of the Party object is null and only the public (composite) key is present. If a transaction is deserialised in conjunction with X.509 certificate chains linking the transient public keys to long term identity keys, the identity field is set. In this way a single data representation can be used for both the anonymised case, such as when validating dependencies of a transaction, and the identified case, such as when trading directly with a counterparty. Trading flows incorporate sub-flows to transmit certificates for the keys used, which are then stored in the local database. However the transaction resolution flow does not transmit such data, keeping the transactions in the chain of custody pseudonymous.





Flashcard 3404439883020

Question
What is BIP 32?
Answer
[default - edit me]









Flashcard 3404450630924

Question
How does corda handle deterministic key derivation?
Answer
Corda allows for but does not mandate the use of deterministic key derivation schemes such as BIP 32. The infrastructure does not assume any mathematical relationship between public keys because some cryptographic schemes are not compatible with such systems. Thus we take the efficiency hit of always linking transient public keys to longer-term keys with X.509 certificates.









Flashcard 3404465573132

Question
Explain Deterministic key derivation
Answer
[default - edit me]









It is sometimes convenient to reveal a small part of a transaction to a counterparty in a way that allows them to check the signatures and sign it themselves. A typical use case for this is an oracle, defined as a network service that is trusted to sign transactions containing statements about the world outside the ledger only if the statements are true. Here are some example statements an oracle might check: • The price of a stock at a particular moment was X. • An agreed upon interest rate at a particular moment was Y. • Whether a specific organisation has declared bankruptcy. • Weather conditions in a particular place at a particular time.





Flashcard 3404562304268

Question
Explain a Merkle hash tree
Answer
[default - edit me]









One way to implement oracles would be to have them sign a small data structure which is then embedded somewhere in a transaction (in a state or command). We take a different approach in which oracles sign the entire transaction, and data the oracle doesn’t need to see is “torn off” before the transaction is sent





By presenting a counterparty with the data elements that are needed along with the Merkle branches linking them to the root hash, as seen in the diagrams below, that counterparty can sign the entire transaction whilst only being able to see some of it. Additionally, if the counterparty needs to be convinced that some third party has already signed the transaction, that is also straightforward. Typically an oracle will be presented with the Merkle branches for the command or state that contains the data, and the timestamp field, and nothing else. The resulting signature contains flag bits indicating which parts of the structure were presented for signing to avoid a single signature covering more than expected.
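The tear-off mechanism described above rests on ordinary Merkle proofs. The sketch below is not from the paper (it uses plain SHA-256 and made-up leaf labels); it shows how revealing one leaf plus its branch lets a counterparty recompute the root without seeing the other components:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last hash on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_branch(leaves, index):
    """Sibling hashes linking leaves[index] up to the root."""
    level = [h(leaf) for leaf in leaves]
    branch = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        branch.append(level[index ^ 1])    # sibling at this level
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return branch

def verify(leaf, index, branch, root):
    node = h(leaf)
    for sibling in branch:
        node = h(sibling + node) if index & 1 else h(node + sibling)
        index //= 2
    return node == root

# Made-up transaction components; only the timestamp is revealed to the oracle.
tx_parts = [b"command", b"state", b"timestamp", b"attachment"]
root = merkle_root(tx_parts)
branch = merkle_branch(tx_parts, 2)
print(verify(b"timestamp", 2, branch, root))   # True
```

The oracle sees one leaf and two hashes, yet its signature over the root commits to the whole transaction.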





Each state in a transaction specifies a contract (boolean function) that is invoked with the entire transaction as input. All contracts must accept in order for the transaction to be considered valid





Encumbrances allow a state to specify another state that must be present in any transaction that consumes it. For example, a time lock contract can define a state that contains the time at which the lock expires, and a simple contract that just compares that time against the transaction timestamp. The asset state can be included in a spend-to-self transaction that doesn’t change the ownership of the asset but does include a time lock state in the outputs. Now if the asset state is used, the time lock state must also be used, and that triggers the execution of the time lock contract.





Flashcard 3404574625036

Question
[default - edit me]
Answer
We investigated such learning across the life span, between 4–85 years of age with an implicit probabilistic sequence learning task, and we found that the difference in implicitly learning high vs. low probability events - measured by raw reaction time (RT) - exhibited a rapid decrement around age of 12


The Best Time to Acquire New Skills: Age-related Differences in Implicit Sequence Learning across Human Life Span
Yet, the ontogenetic changes in humans’ implicit learning abilities have not yet been characterized, and, thus, their role in acquiring new knowledge efficiently during development is unknown. We investigated such learning across the life span, between 4–85 years of age with an implicit probabilistic sequence learning task, and we found that the difference in implicitly learning high vs. low probability events - measured by raw reaction time (RT) - exhibited a rapid decrement around age of 12. Accuracy and z-transformed data showed partially different developmental curves suggesting a re-evaluation of analysis methods in developmental research. The decrement in raw RT differ







#Anki

My investigation into changing the setting and algorithm optimization only led to more frustration and questions. There are a lot of parameters and settings in Anki:

Steps, Graduating Interval, Easy Interval, Starting Ease, Easy bonus, Interval Modifier, Steps and New Interval for lapses.

It’s not difficult to understand what each parameter means. The real question is how the parameters affect each other. They are so interconnected that changing one could have a huge influence on reviews and on the behavior of the algorithm. It is like the butterfly effect.
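To see how the parameters feed into one another, here is a rough sketch of the review-scheduling arithmetic described in Anki's manual (simplified: it ignores interval fuzz, the "Hard" answer, and per-deck minimum/maximum clamps, and the function name is my own):

```python
def next_interval(interval, ease, answer, interval_modifier=1.0,
                  easy_bonus=1.3, lapse_new_interval=0.0):
    """Simplified sketch of Anki's review scheduling.
    Returns (new interval in days, new ease factor)."""
    if answer == "again":                      # a lapse
        # New Interval for lapses scales the previous interval;
        # the ease also takes a 20-percentage-point penalty
        return max(1, round(interval * lapse_new_interval)), ease - 0.20
    new = interval * ease * interval_modifier  # a "Good" answer
    if answer == "easy":
        new *= easy_bonus                      # Easy bonus on top
        ease += 0.15
    return max(interval + 1, round(new)), ease

# a 10-day card at the default 2.5 Starting Ease:
print(next_interval(10, 2.5, "good"))                            # (25, 2.5)
print(next_interval(100, 2.5, "again", lapse_new_interval=0.4))  # (40, 2.3)
```

Even in this stripped-down model the interconnection is visible: `ease`, `interval_modifier`, and `easy_bonus` multiply together, so nudging one parameter shifts every future interval, and each "Easy" or "Again" answer also feeds back into the ease used next time.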


Bookuctivity – Books + Productivity




#Anki

In Anki, maybe the combination of decreasing the Starting Ease and Easy Bonus WHILE increasing the Steps and New Interval for lapses works better. Or maybe every 10% increase of Starting Ease has to be matched with a 30% decrease of the Easy Bonus.

I simply did not know how to change them. I still don’t.






#Anki
This is the concept of desirable difficulty: the more you struggle to recall the answer, given that you succeed, the better and stronger your long-term retention will be. So, increasing the Interval Modifier will hurt my long-term retention because it makes reviews easier, undermining the effect of desirable difficulty.






#Anki

In the default setting, if you get a card wrong, the New Interval for lapses renders it new again, as if you had never seen or learned the card.

From Anki’s Manual:

New interval controls how much Anki should reduce the previous interval by. If the card had a 100 day interval, the default of 0% would reduce the interval to 0

So, for example, suppose you have answered a card correctly (Easy) 10 times over 3 years. If you then fail it ONCE (Again), Anki banishes the card back to square one. I discovered this brutal New Interval setting only after 4 years of using Anki.
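The manual's arithmetic, as a quick sketch (`lapse_interval` is a hypothetical helper; Anki also applies a minimum-interval floor, one day by default):

```python
def lapse_interval(previous_interval, new_interval_pct):
    """Interval after a lapse: the previous interval scaled by the
    'New Interval' percentage, with a one-day floor."""
    return max(1, round(previous_interval * new_interval_pct))

print(lapse_interval(100, 0.0))   # 1  -- the default 0%: back to square one
print(lapse_interval(100, 0.4))   # 40 -- a 40% setting keeps most of the progress
```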






#Anki

The default setting is highly inefficient, as having to start over is completely unnecessary. A one-time corrective review of that card should probably make it retrievable again for a long time (years, maybe; a personal guess). Having to review it over and over again leads to a lot of frustration. Imagine the pain of reviewing a card for years and then slipping once… all the previous effort to expand the review schedule. Poof. Gone. It’s like repeatedly building a sand castle only to have it washed away moments later.

I wonder how many users are still on that brutal default setting… I am sure a lot of people have never bothered to tweak the settings, let alone read the manual to understand what each parameter means and how it affects the algorithm. I think once people know how the default setting behaves, they will tweak it immediately.






#Anki
If you’re still using Anki, I HIGHLY recommend tweaking the default setting. It doesn’t have to be exactly like mine; there is still a lot of guessing and “personal experience” involved, but it’s definitely more efficient than the default. One thing is certain: the New Interval for lapses MUST NOT BE ZERO. I would suggest setting it above 40%. As I explained, it doesn’t make sense to banish the card and “learn” it again.

