Edited, memorised or added to reading queue

on 17-Jan-2023 (Tue)


Flashcard 5039829683468

Tags
#has-images
Question
PROVOKE example



status: not learned | measured difficulty: 37% [default] | repetition number in this series: 0

pdf

cannot see any pdfs







#feature-engineering #lstm #recurrent-neural-networks #rnn
Because of their typically high dimensionality, the hidden states of RNN models are usually more potent than those of hidden Markov models (e.g., Netzer, Lattin, & Srinivasan, 2008), which are commonly used in marketing to capture customer dynamics. The HMM has N discrete hidden states (where N is typically small) and, therefore, has only log2(N) bits of information available to capture the sequence history (Brown & Hinton, 2001). The RNN, on the other hand, has distributed hidden states, which means that each input generally results in changes across all the hidden units of the RNN (Ming et al., 2017). RNNs combine a large number of distributed hidden states with nonlinear dynamics to update those states, giving them a substantially larger representational capacity than an HMM.
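As a back-of-the-envelope illustration of this capacity gap (the numbers below are hypothetical, not from the source):

```python
import math

# An HMM occupies exactly one of N discrete states at a time, so its memory
# of the sequence history is log2(N) bits. An RNN's state is distributed
# across H hidden units; even if each unit carried only a single bit, that
# would already be H bits.
N = 32    # hypothetical number of HMM hidden states
H = 128   # hypothetical number of RNN hidden units

hmm_bits = math.log2(N)
rnn_bits_lower_bound = H

print(hmm_bits, rnn_bits_lower_bound)  # 5.0 128
```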




#feature-engineering #lstm #recurrent-neural-networks #rnn
While an RNN can carry forward useful information from one timestep to the next, it is much less effective at capturing long-term dependencies (Bengio, Simard, & Frasconi, 1994; Pascanu, Mikolov, & Bengio, 2013). This limitation turns out to be a crucial problem in marketing analytics.




#feature-engineering #lstm #recurrent-neural-networks #rnn
The effect of a direct mailing does not end once the campaign is over and the customer has decided whether or not to respond. An advertising campaign or customer retention program can impact customers' behaviors for several weeks, even months. Customers tend to remember past events, at least partially. Hence, the effects of marketing actions tend to carry over into numerous subsequent periods (Lilien, Rangaswamy, & De Bruyn, 2013; Schweidel & Knox, 2013; Van Diepen et al., 2009). The LSTM neural network, which we introduce next, is a kind of RNN that has been modified to effectively capture long-term dependencies in the data.




#feature-engineering #lstm #recurrent-neural-networks #rnn
The logit model performs remarkably well at wide targeting depths (e.g., lift at 20%), whereas the random forest model shines at narrow ones (lift at 1%). This result suggests that the best traditional model to deploy depends on the degree of targeting the analyst seeks. Random forest models are particularly good at identifying tiny niches of super-responsive donors and are therefore well suited for ultra-precise targeting. Logit models generalize well; hence, they appear more appropriate for targeting wider portions of the donor base.
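For readers unfamiliar with the metric, a minimal sketch of how "lift at k%" can be computed (the scores and responses below are invented for illustration):

```python
import numpy as np

def lift_at(scores, responded, k):
    """Response rate among the top-k fraction of scored customers,
    divided by the overall response rate."""
    n_top = max(1, int(len(scores) * k))
    top = np.argsort(scores)[::-1][:n_top]  # indices of highest-scored customers
    return responded[top].mean() / responded.mean()

# Hypothetical model scores and observed responses for 10 donors
scores = np.array([0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05])
responded = np.array([1, 1, 0, 1, 0, 0, 0, 0, 0, 0], dtype=float)

print(lift_at(scores, responded, 0.2))  # top 20% respond at 1.0 vs 0.3 overall: lift ≈ 3.33
```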




#feature-engineering #lstm #recurrent-neural-networks #rnn

Brand Choice and Market Share Forecasting Using Scanner Data

Demand forecasting for products within a category is a critical task for retailers and brand managers alike. The multinomial logit model (MNL) is commonly used to predict brand choice and market share from marketing-mix and loyalty variables (Guadagni & Little, 1983). Artificial feedforward neural networks (ANN) have also been shown to effectively predict household brand choices, as well as brand market shares (Agrawal & Schorling, 1996). Since brand choices can be modeled as sequential choices, and data complexity increases exponentially with the number of brands (with interaction effects), LSTM neural networks offer a suitable alternative. As in our study, brand choices and the decision environment could be encoded the way we encoded solicitations and donations: as a multidimensional vector. We conjecture that testing the performance of LSTM neural networks in the context of brand choices would constitute an exciting replication area.
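As a sketch of the MNL mechanics (the coefficients and prices below are invented for illustration, not estimated from data):

```python
import numpy as np

# Multinomial logit: each brand gets a utility that is linear in its
# marketing-mix variables; choice probabilities are a softmax over utilities.
intercepts = np.array([0.5, 0.0, -0.2])  # hypothetical brand constants
beta_price = -1.5                        # hypothetical price sensitivity
prices = np.array([2.0, 1.8, 1.5])       # hypothetical shelf prices

utility = intercepts + beta_price * prices
probs = np.exp(utility) / np.exp(utility).sum()

print(probs.round(3))  # choice probabilities, summing to 1
```

With these made-up numbers the cheapest brand ends up with the highest utility, so it gets the largest choice probability.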





#RNN #ariadne #behaviour #consumer #deep-learning #priority #retail #simulation #synthetic-data
From a neural network architectures perspective, the work closest to ours is Deep Neural Network Ensembles for Time Series Classification [8]. In this paper, the authors show how an ensemble of multiple convolutional neural networks can improve upon the state-of-the-art performance of individual neural networks. They use six deep learning classifiers, including the Multi-Layer Perceptron, Fully Convolutional Neural Network, Residual Network, Encoder [20], Multi-Channels Deep Convolutional Neural Networks [29], and Time Convolutional Neural Network [28]. The first three were originally proposed in [24]. We propose the application of such architectures in the consumer choice world and apply the concept of entity embeddings [9] along with neural network architectures like the Multi-Layer Perceptron, Long Short-Term Memory (LSTM), Temporal Convolutional Networks (TCN) [13], and TCN-LSTM.




#Shell #linux

How to Force User to Change Password at Next Login in Linux

User name: ravi

# passwd --expire ravi




Flashcard 7559702777100

Question
After reading this book, you will fully understand what deep learning is, when to use it, and what its limitations are.
Answer
[default - edit me]








Image recognition, time-series forecasting, sentiment analysis, image and text generation, etc.




#feature-engineering #lstm #recurrent-neural-networks #rnn
In this paper, we have shown that recent neural network architectures, traditionally used in natural language processing and machine translation, could effectively do away with the complicated and time-consuming step of feature engineering, even when applied to highly structured problems such as predicting the future behaviors of a panel of customers.


Parent (intermediate) annotation

In this paper, we have shown that recent neural network architectures, traditionally used in natural language processing and machine translation, could effectively do away with the complicated and time-consuming step of feature engineering, even when applied to highly structured problems such as predicting the future behaviors of a panel of customers. We apply the LSTM neural networks to predict customer responses in direct marketing and discuss its possible application in other contexts within marketing, such as market-share forecas

Original toplevel document (pdf)





#feature-engineering #lstm #recurrent-neural-networks #rnn
The RNN processes the entire sequence of available data without having to summarize it into features.


Parent (intermediate) annotation

The RNN processes the entire sequence of available data without having to summarize it into features. Since customer transactions occur sequentially, they can be modeled as a sequence prediction task using an RNN as well, where all firm actions and customer responses are represented by

Original toplevel document (pdf)





Flashcard 7559715884300

Tags
#bayes #programming #r #statistics
Question
The role of Bayesian inference is to compute the exact relative credibilities of candidate parameter values, while also taking into account their [...] probabilities.
Answer
prior


Parent (intermediate) annotation

The role of Bayesian inference is to compute the exact relative credibilities of candidate parameter values, while also taking into account their prior probabilities.

Original toplevel document (pdf)








[unknown IMAGE 7104054824204] #deep-learning #has-images #keras #lstm #python #sequence
You can specify the input shape argument, which expects a tuple containing the number of time steps and the number of features
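A minimal NumPy-only sketch of that shape convention (array values are invented; in Keras the matching argument would be input_shape=(2, 1)):

```python
import numpy as np

# A univariate sequence arranged as 5 samples with two lag observations each
X = np.array([[1, 2], [2, 3], [3, 4], [4, 5], [5, 6]])

# Reshaped into the 3-D layout an LSTM layer expects:
# (samples, time steps, features)
X = X.reshape((X.shape[0], 2, 1))
print(X.shape)  # (5, 2, 1)
```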


Parent (intermediate) annotation

You can specify the input shape argument that expects a tuple containing the number of time steps and the number of features. For example, if we had two time steps and one feature for a univariate sequence with two lag observations per row, it would be specified as on listing 4.5

Original toplevel document (pdf)





#deep-learning #keras #lstm #python #sequence
For a multiclass classification problem, the results may be in the form of an array of probabilities (assuming a one-hot encoded output variable) that may need to be converted to a single class output prediction using the argmax() NumPy function.
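A small sketch with invented probabilities:

```python
import numpy as np

# Hypothetical softmax output for 4 samples over 3 classes
probs = np.array([
    [0.10, 0.70, 0.20],
    [0.80, 0.10, 0.10],
    [0.20, 0.30, 0.50],
    [0.05, 0.05, 0.90],
])

# argmax along axis 1 turns each row of probabilities into a class index
classes = np.argmax(probs, axis=1)
print(classes)  # [1 0 2 2]
```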


Parent (intermediate) annotation

…provided by a linear activation function. For a binary classification problem, the predictions may be an array of probabilities for the first class that can be converted to a 1 or 0 by rounding. For a multiclass classification problem, the results may be in the form of an array of probabilities (assuming a one-hot encoded output variable) that may need to be converted to a single class output prediction using the argmax() NumPy function. Alternately, for classification problems, we can use the predict_classes() function that will automatically convert uncrisp predictions to crisp integer class values.

Original toplevel document (pdf)





#feature-engineering #lstm #recurrent-neural-networks #rnn
Response models in direct marketing predict customer responses from past customer behavior and marketing activity. These models often summarize past events using features such as recency or frequency and the process of feature engineering has received significant attention
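To make the contrast concrete, a toy sketch (with hypothetical data) of the hand-engineered recency/frequency-style features such response models rely on:

```python
import numpy as np

# Days-ago and amounts of one customer's past purchases (hypothetical values)
days_ago = np.array([5, 40, 90, 200])
amounts = np.array([10.0, 25.0, 5.0, 40.0])

recency = int(days_ago.min())     # days since most recent purchase
frequency = len(days_ago)         # number of past purchases
monetary = float(amounts.mean())  # average purchase amount

print(recency, frequency, monetary)  # 5 4 20.0
```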


Parent (intermediate) annotation

Response models in direct marketing predict customer responses from past customer behavior and marketing activity. These models often summarize past events using features such as recency or frequency (e.g., Blattberg, Kim, & Neslin, 2008; Malthouse, 1999; Van Diepen, Donkers, & Franses, 2009), and the process of feature engineering has received significant attention

Original toplevel document (pdf)





Flashcard 7559727680780

Tags
#RNN #ariadne #behaviour #consumer #deep-learning #priority #recurrent-neural-networks #retail #simulation #synthetic-data
Question
Consumer behaviour in e-commerce can be described by sequences of [...] with a webshop. We show that recurrent neural networks (RNNs) are a natural fit for modelling and predicting consumer behaviour.
Answer
interactions


Parent (intermediate) annotation

Consumer behaviour in e-commerce can be described by sequences of interactions with a webshop. We show that recurrent neural networks (RNNs) are a natural fit for modelling and predicting consumer behaviour.

Original toplevel document (pdf)








#abm #agent-based #machine-learning #model #priority
The connection between input and decision is then handled objectively by an Artificial Neural Network. This also means that the model is highly adaptive. If the goals of the agents, their input or properties of the system change, retraining the Neural Network is the only adaptation that is necessary.


Parent (intermediate) annotation

…based model, namely the definition of the rules and equations governing agent behaviour, is translated to the definition of the goals of each agent and which parts of the system they can observe. The connection between input and decision is then handled objectively by an Artificial Neural Network. This also means that the model is highly adaptive. If the goals of the agents, their input or properties of the system change, retraining the Neural Network is the only adaptation that is necessary. Section 3 showcases this flexibility with different examples. In addition to the framework's flexibility and objectivity, it also enables an intuitive way to include bounded rationality

Original toplevel document (pdf)





#deep-learning #keras #lstm #python #sequence
The choice of activation function is most important for the output layer as it will define the format that predictions will take.
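A NumPy sketch of how the three standard output-layer activations shape the predictions (the raw scores are invented):

```python
import numpy as np

z = np.array([2.0, -1.0, 0.5])  # raw output-layer scores for one sample

linear = z                              # regression: unbounded real values
sigmoid = 1.0 / (1.0 + np.exp(-z))      # binary: per-unit probabilities in (0, 1)
softmax = np.exp(z) / np.exp(z).sum()   # multiclass: probabilities summing to 1

print(softmax.sum())  # ≈ 1.0
```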


Parent (intermediate) annotation

The choice of activation function is most important for the output layer as it will define the format that predictions will take. For example, below are some common predictive modeling problem types and the structure and standard activation function that you can use in the output layer: Regression: Linear activati

Original toplevel document (pdf)





#RNN #ariadne #behaviour #consumer #deep-learning #priority #recurrent-neural-networks #retail #simulation #synthetic-data
In the future, predictions at the level of products and individual tastes will be our focus, enabling sophisticated recommendation products. This will require richer input descriptions at individual time steps. Likewise, more sophisticated RNN architectures will be promising for future research.


Parent (intermediate) annotation

…are employing RNNs in production now, which offers significant advantages over existing methods: reduced feature engineering; improved empirical performance; and better prediction explanations. In the future, predictions on the level of products and individual tastes will be in our focus, enabling sophisticated recommendation products. This will require richer input descriptions at individual time-steps. Likewise, more sophisticated RNN architectures will be promising for future research

Original toplevel document (pdf)





Flashcard 7559735020812

Tags
#feature-engineering #lstm #recurrent-neural-networks #rnn
Question
The LSTM network forms a chain of repeating modules, like any RNN, but the modules, apart from the external recurrent function of the RNN, possess an [...] (or self-loop), which lets the gradients flow for long durations without exploding or vanishing
Answer
internal recurrence


Parent (intermediate) annotation

The LSTM network forms a chain of repeating modules, like any RNN, but the modules, apart from the external recurrent function of the RNN, possess an internal recurrence (or self-loop), which lets the gradients flow for long durations without exploding or vanishing
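The internal recurrence can be sketched in the standard LSTM formulation (symbols follow common convention and are not taken from this source). The cell state is updated additively,

$$c_t = f_t \odot c_{t-1} + i_t \odot \tanh(W_c x_t + U_c h_{t-1} + b_c),$$

where $f_t$ and $i_t$ are the forget and input gates. Because $c_t$ depends on $c_{t-1}$ through an elementwise gated sum rather than through a repeatedly squashed nonlinearity, gradients can flow along the cell-state path for long durations without exploding or vanishing.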

Original toplevel document (pdf)








#feature-engineering #lstm #recurrent-neural-networks #rnn
Demand forecasting for products within a category is a critical task for retailers and brand managers alike. The multinomial logit model (MNL) is commonly used to predict brand choice and market share using marketing-mix and loyalty variables (Guadagni & Little, 1983)


Parent (intermediate) annotation

Brand Choice and Market Share Forecasting Using Scanner Data Demand forecasting for products within a category is a critical task for retailers and brand managers alike. The multinomial logit model (MNL) is commonly used to predict brand choice and market share using marketing-mix and loyalty variables (Guadagni & Little, 1983). Artificial feedforward neural networks (ANN) have also been shown to effectively predict household brand choices, as well as brand market shares (Agrawal & Schorling, 1996). Since

Original toplevel document (pdf)





#feature-engineering #lstm #recurrent-neural-networks #rnn
Because of their typically high dimensionality, the hidden states of RNN models are usually more potent than those of hidden Markov models (e.g., Netzer, Lattin, & Srinivasan, 2008), which are commonly used in marketing to capture customer dynamics.


Parent (intermediate) annotation

Because of their typical high-dimensionality, the hidden states of RNN models are usually more potent than that of hidden markov models (e.g., Netzer, Lattin, & Srinivasan, 2008), which are commonly used in marketing to capture customer dynamics. The HMM has N discrete hidden states (where N is typically small) and, therefore, has only log 2 (N) bits of information available to capture the sequence history (Brown & Hinton, 2

Original toplevel document (pdf)





#feature-engineering #lstm #recurrent-neural-networks #rnn
The HMM has N discrete hidden states (where N is typically small) and, therefore, has only log2(N) bits of information available to capture the sequence history (Brown & Hinton, 2001). The RNN, on the other hand, has distributed hidden states, which means that each input generally results in changes across all the hidden units of the RNN (Ming et al., 2017). RNNs combine a large number of distributed hidden states with nonlinear dynamics to update those states, giving them a substantially larger representational capacity than an HMM.


Parent (intermediate) annotation

…states of RNN models are usually more potent than those of hidden Markov models (e.g., Netzer, Lattin, & Srinivasan, 2008), which are commonly used in marketing to capture customer dynamics. The HMM has N discrete hidden states (where N is typically small) and, therefore, has only log2(N) bits of information available to capture the sequence history (Brown & Hinton, 2001). On the other hand, the RNN has distributed hidden states, which means that each input generally results in changes across all the hidden units of the RNN (Ming et al., 2017). RNNs combine a large number of distributed hidden states with nonlinear dynamics to update these hidden states, thereby allowing it to have a more substantial representational capacity when compared with an HMM

Original toplevel document (pdf)





Flashcard 7559744457996

Tags
#deep-learning #keras #lstm #python #sequence
Question
[...] prediction involves predicting an output sequence given an input sequence. For example: Input Sequence: 1, 2, 3, 4, 5 Output Sequence: 6, 7, 8, 9, 1
Answer
Sequence-to-sequence


Parent (intermediate) annotation

Sequence-to-sequence prediction involves predicting an output sequence given an input sequence. For example: Input Sequence: 1, 2, 3, 4, 5 Output Sequence: 6, 7, 8, 9, 1

Original toplevel document (pdf)








Flashcard 7559749176588

Tags
#Shell #linux
Question
Answer
passwd


Parent (intermediate) annotation

How to Force User to Change Password at Next Login in Linux User name: ravi # passwd --expire ravi

Original toplevel document

How to Force User to Change Password at Next Login in Linux