Edited, memorised or added to reading queue

on 12-May-2024 (Sun)


#tensorflow #tensorflow-certificate

Preprocessing data

ct = make_column_transformer((OneHotEncoder(dtype="int32"), ['Sex']), remainder="passthrough")  # other columns unchanged
ct.fit(X_train) 
X_train_transformed = ct.transform(X_train)
X_test_transformed = ct.transform(X_test)
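
For context, a minimal self-contained sketch of this preprocessing step (the toy DataFrame, its column names and the train/test split below are made up for illustration, not taken from the notebook):

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.compose import make_column_transformer
from sklearn.preprocessing import OneHotEncoder

# Hypothetical abalone-style data: one categorical column ('Sex') plus numeric features
df = pd.DataFrame({"Sex": ["M", "F", "I", "M"],
                   "Length": [0.45, 0.35, 0.53, 0.44],
                   "Diameter": [0.36, 0.26, 0.42, 0.35]})
X_train, X_test = train_test_split(df, test_size=0.25, random_state=42)

ct = make_column_transformer((OneHotEncoder(dtype="int32"), ["Sex"]),
                             remainder="passthrough")  # other columns unchanged
ct.fit(X_train)                              # learn the categories on the training set only
X_train_transformed = ct.transform(X_train)
X_test_transformed = ct.transform(X_test)    # reuse the fitted encoder on the test set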

TfC_01_ADDITIONAL_01_Abalone.ipynb
Preprocessing data ct = make_column_transformer((OneHotEncoder(dtype="int32"), ['Sex']), remainder="passthrough") #other columns unchanged ct.fit(X_train) X_train_transformed = ct.transform(X_train) X_test_transformed = ct.transform(X_test) Predictions valuation_predicts = model.predict(X_valuation_transformed) (array([[ 9.441547], [10.451973], [10.48082 ], ..., [10.401164], [13.13452 ], [ 8.081818]], dtype=float32), (6041




Flashcard 7626542943500

Tags
#has-images #recurrent-neural-networks #rnn
[unknown IMAGE 7101511240972]
Question
Note that the model is completely [...] about further extensions: all individual-level, cohort-level, time-varying, or time-invariant covariates are simply encoded as categorical input variables, and are handled equally by the model
Answer
agnostic


Parent (intermediate) annotation

Note that the model is completely agnostic about further extensions: all individual-level, cohort-level, time-varying, or time-invariant covariates are simply encoded as categorical input variables, and are handled equally by the model.
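
This is a design statement rather than code in the source, but the general idea of feeding every covariate as a categorical (embedded) input to a recurrent network can be sketched as follows; the layer sizes, vocabulary size and covariate count are invented for illustration and this is not the paper's architecture:

import tensorflow as tf

n_steps, n_covariates, vocab_size, embed_dim = 12, 3, 50, 8

# All covariates (individual-level, cohort-level, time-varying or time-invariant)
# arrive as integer category codes, one per covariate per time step.
cov_in = tf.keras.Input(shape=(n_steps, n_covariates), dtype="int32")
emb = tf.keras.layers.Embedding(vocab_size, embed_dim)(cov_in)           # (batch, steps, covariates, embed_dim)
emb = tf.keras.layers.Reshape((n_steps, n_covariates * embed_dim))(emb)  # concatenate embeddings per time step
rnn = tf.keras.layers.LSTM(32)(emb)
out = tf.keras.layers.Dense(1)(rnn)
model = tf.keras.Model(cov_in, out)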

Original toplevel document (pdf)








Flashcard 7626604809484

Tags
#deep-learning #keras #lstm #python #sequence
Question

One Hot Encoding

For categorical variables where no such [...] relationship exists, the integer encoding is not enough.

Answer
ordinal


Parent (intermediate) annotation

One Hot Encoding For categorical variables where no such ordinal relationship exists, the integer encoding is not enough.
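
A tiny sketch of the distinction (the colour values are made up): integer encoding imposes an artificial order on the categories, one-hot encoding does not.

import numpy as np
import tensorflow as tf

colors = ["red", "green", "blue", "green"]

# Integer encoding implies an ordering (red < green < blue) that does not exist
vocab = {"red": 0, "green": 1, "blue": 2}
int_encoded = np.array([vocab[c] for c in colors])                 # [0, 1, 2, 1]

# One-hot encoding removes the spurious ordinal relationship
one_hot = tf.keras.utils.to_categorical(int_encoded, num_classes=3)
# [[1,0,0], [0,1,0], [0,0,1], [0,1,0]]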

Original toplevel document (pdf)








#recurrent-neural-networks #rnn
Each prediction is generated by drawing a sample from the multinomial output distribution calculated by the bottom network layer; our model therefore does not produce point or interval estimates; each output is a simulated draw.


Parent (intermediate) annotation

Each prediction is generated by drawing a sample from the multinomial output distribution calculated by the bottom network layer; our model therefore does not produce point or interval estimates, each output is a simulated draw. To make the predicted transaction sequences robust against sampling noise, we repeat this process for each customer several times and take the mean expected number of transactions in a
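
As a rough sketch of this simulate-and-average idea (the probabilities, number of customers and repeat count below are invented; this is not the paper's code):

import numpy as np

# Hypothetical output of the network's multinomial layer: for each of 3 customers,
# a distribution over 0..4 transactions in the next period.
probs = np.array([[0.60, 0.20, 0.10, 0.05, 0.05],
                  [0.10, 0.30, 0.30, 0.20, 0.10],
                  [0.80, 0.10, 0.05, 0.03, 0.02]])

rng = np.random.default_rng(42)
n_repeats = 100

# Each prediction is a simulated draw, not a point estimate; repeating the draws
# and averaging makes the predicted transaction counts robust to sampling noise.
draws = np.array([[rng.choice(len(p), p=p) for p in probs] for _ in range(n_repeats)])
expected_transactions = draws.mean(axis=0)   # mean simulated transactions per customer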

Original toplevel document (pdf)





Flashcard 7626829991180

Tags
#tensorflow #tensorflow-certificate
Question
changeable_tensor = tf.Variable([10, 7])

changeable_tensor[0] = 77

Output:
TypeError: 'ResourceVariable' object does not support item assignment


changeable_tensor[0].assign(77)

Output:
<tf.[...] 'UnreadVariable' shape=(2,) dtype=int32, numpy=array([77,  7], dtype=int32)>

Answer
Variable


Tensorflow basics
changeable_tensor = tf.Variable([10, 7]) changeable_tensor[0] = 77 Output: TypeError: 'ResourceVariable' object does not support item assignment changeable_tensor[0].assign(77) Output: <tf.Variable 'UnreadVariable' shape=(2,) dtype=int32, numpy=array([77, 7], dtype=int32)>
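
A runnable version of the same snippet, for completeness:

import tensorflow as tf

changeable_tensor = tf.Variable([10, 7])

# Plain item assignment is not supported on a tf.Variable ...
try:
    changeable_tensor[0] = 77
except TypeError as err:
    print(err)   # 'ResourceVariable' object does not support item assignment

# ... use .assign() on the indexed element instead
changeable_tensor[0].assign(77)
print(changeable_tensor)   # <tf.Variable ... numpy=array([77,  7], dtype=int32)>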







Flashcard 7626831039756

Tags
#has-images #tensorflow #tensorflow-certificate
[unknown IMAGE 7626420784396]
Question
Typical workflow: build a model -> fit it -> evaluate -> [...] -> fit -> evaluate -> ....
Answer
tweak


Parent (intermediate) annotation

Typical workflow: build a model -> fit it -> evaluate -> tweak -> fit -> evaluate -> ....

Original toplevel document

TfC 01 regression
activation functions # 2. Compiling: change optimizer or its parameters (eg. learning rate) # 3. Fitting: more epochs, more data ### How? # from smaller model to larger model Evaluating models Typical workflow: build a model -> fit it -> evaluate -> tweak -> fit -> evaluate -> .... Building model: experiment Evaluation model: visualize What can visualize? the data model itself the training of a model predictions ## The 3 sets (or actually 2 sets: training and test
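
A compact sketch of that build -> fit -> evaluate -> tweak loop with tf.keras (the synthetic data and the single-layer model are purely illustrative):

import numpy as np
import tensorflow as tf

# Toy regression data
X = np.arange(-50, 50, 2, dtype=np.float32)
y = X + 10

# 1. Build a model
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.compile(loss="mae", optimizer=tf.keras.optimizers.SGD(), metrics=["mae"])

# 2. Fit it
model.fit(tf.expand_dims(X, axis=-1), y, epochs=50, verbose=0)

# 3. Evaluate
print(model.evaluate(tf.expand_dims(X, axis=-1), y, verbose=0))

# 4. Tweak (more layers/units, different optimizer or learning rate, more epochs), then fit and evaluate again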







Flashcard 7626833661196

Tags
#tensorflow #tensorflow-certificate
Question

Preprocessing data

ct = make_column_transformer((OneHotEncoder(dtype="int32"), ['Sex']), remainder="passthrough")  # other columns unchanged
ct.fit(X_train) 
X_train_transformed = ct.transform(X_train)
X_test_transformed = ct.transform([...])
Answer
X_test


Parent (intermediate) annotation

sformer((OneHotEncoder(dtype="int32"), ['Sex']), remainder="passthrough") #other columns unchanged ct.fit(X_train) X_train_transformed = ct.transform(X_train) X_test_transformed = ct.transform(X_test)

Original toplevel document

TfC_01_ADDITIONAL_01_Abalone.ipynb
Preprocessing data ct = make_column_transformer((OneHotEncoder(dtype="int32"), ['Sex']), remainder="passthrough") #other columns unchanged ct.fit(X_train) X_train_transformed = ct.transform(X_train) X_test_transformed = ct.transform(X_test) Predictions valuation_predicts = model.predict(X_valuation_transformed) (array([[ 9.441547], [10.451973], [10.48082 ], ..., [10.401164], [13.13452 ], [ 8.081818]], dtype=float32), (6041







Flashcard 7626835234060

Tags
#has-images #tensorflow #tensorflow-certificate
[unknown IMAGE 7626420784396]
Question

## The 3 sets (or actually 2 sets: training and test set) - USING ONLY TensorFlow

tf.random.set_seed(999)

X_train, X_test = [...](tf.random.shuffle(X, seed=42), num_or_size_splits=[40, 10])

Answer
tf.split


Parent (intermediate) annotation

## The 3 sets (or actually 2 sets: training and test set) - USING ONLY TensorFlow tf.random.set_seed(999) X_train, X_test = tf.split(tf.random.shuffle(X, seed=42), num_or_size_splits=[40, 10])
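
A self-contained sketch of that split (the 50-element tensor and the 40/10 sizes mirror the snippet; the data itself is made up):

import tensorflow as tf

X = tf.range(50)                              # hypothetical dataset of 50 examples

tf.random.set_seed(999)                       # global seed
shuffled = tf.random.shuffle(X, seed=42)      # operation-level seed
X_train, X_test = tf.split(shuffled, num_or_size_splits=[40, 10])

print(X_train.shape, X_test.shape)            # (40,) (10,)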

Original toplevel document

TfC 01 regression
evaluate -> tweak -> fit -> evaluate -> .... Building model: experiment Evaluation model: visualize What can visualize? the data model itself the training of a model predictions ## The 3 sets (or actually 2 sets: training and test set) tf.random.set_seed(999) X_train, X_test = tf.split(tf.random.shuffle(X, seed=42), num_or_size_splits=[40, 10]) def plot_predictions(train_data = X_train, train_labels = y_train, test_data = X_test, test_labels = y_test, predictions = y_pred): """ Plots training data, testing_data """ plt.figure(







[unknown IMAGE 7626420784396] #has-images #tensorflow #tensorflow-certificate
Saving and loading models

Two formats:

  • SavedModel format (including optimizer's step)
  • HDF5 format

What about TensorFlow Serving format?

# Save the entire model using SavedModel
model_3.save("best_model_3_SavedModel")
# SavedModel is in principle a protobuf (.pb) file
# Save model in HDF5 format:
model_3.save("best_model_3_HDF5.h5")

TfC 01 regression
is to track the results of your experiments. There are tools to help us! Resource: Try: Tensorboard - a component of Tensorflow library to help track modelling experiments Weights & Biases Saving and loading models Two formats: SavedModel format (including optimizer's step) HDF5 format What about TensorFlow Serving format? # Save the entire model using SavedModel model_3.save("best_model_3_SavedModel") # SavedModel is in principle a protobuf (.pb) file # Save model in HDF5 format: model_3.save("best_model_3_HDF5.h5") Load model loaded_model_SM = tf.keras.models.load_model('/content/best_model_3_SavedModel') loaded_model_SM.summary()




[unknown IMAGE 7626420784396] #has-images #tensorflow #tensorflow-certificate

Load model

loaded_model_SM = tf.keras.models.load_model('/content/best_model_3_SavedModel')
loaded_model_SM.summary()
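
For symmetry, the HDF5 file saved above can be reloaded with the same function (the path is assumed from the earlier save call in this note):

import tensorflow as tf

loaded_model_h5 = tf.keras.models.load_model('best_model_3_HDF5.h5')
loaded_model_h5.summary()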


TfC 01 regression
# Save the entire model using SavedModel model_3.save("best_model_3_SavedModel") # SavedModel is in principle a protobuf (.pb) file # Save model in HDF5 format: model_3.save("best_model_3_HDF5.h5") Load model loaded_model_SM = tf.keras.models.load_model('/content/best_model_3_SavedModel') loaded_model_SM.summary()




[unknown IMAGE 7626420784396] #has-images #tensorflow #tensorflow-certificate

Saving and loading models

Two formats:

  • SavedModel format (including optimizer's step)
  • HDF5 format


Parent (intermediate) annotation

Saving and loading models Two formats: SavedModel format (including optimizer's step) HDF5 format What about TensorFlow Serving format? # Save the entire model using SavedModel model_3.save("best_model_3_SavedModel") # SavedModel is in principle a protobuf (.pb) file # Save model in HD

Original toplevel document

TfC 01 regression
is to track the results of your experiments. There are tools to help us! Resource: Try: Tensorboard - a component of Tensorflow library to help track modelling experiments Weights & Biases Saving and loading models Two formats: SavedModel format (including optimizer's step) HDF5 format What about TensorFlow Serving format? # Save the entire model using SavedModel model_3.save("best_model_3_SavedModel") # SavedModel is in principle a protobuf (.pb) file # Save model in HDF5 format: model_3.save("best_model_3_HDF5.h5") Load model loaded_model_SM = tf.keras.models.load_model('/content/best_model_3_SavedModel') loaded_model_SM.summary()




Flashcard 7626842311948

Tags
#has-images #tensorflow #tensorflow-certificate
[unknown IMAGE 7626420784396]
Question

Saving and loading models

Two formats:

  • [...] format (including optimizer's step)
  • HDF5 format
Answer
SavedModel


Parent (intermediate) annotation

Saving and loading models Two formats: SavedModel format (including optimizer's step) HDF5 format

Original toplevel document

TfC 01 regression
is to track the results of your experiments. There are tools to help us! Resource: Try: Tensorboard - a component of Tensorflow library to help track modelling experiments Weights & Biases Saving and loading models Two formats: SavedModel format (including optimizer's step) HDF5 format What about TensorFlow Serving format? # Save the entire model using SavedModel model_3.save("best_model_3_SavedModel") # SavedModel is in principle a protobuf (.pb) file # Save model in HDF5 format: model_3.save("best_model_3_HDF5.h5") Load model loaded_model_SM = tf.keras.models.load_model('/content/best_model_3_SavedModel') loaded_model_SM.summary()







Flashcard 7626844146956

Tags
#has-images #tensorflow #tensorflow-certificate
[unknown IMAGE 7626420784396]
Question

Saving and loading models

Two formats:

  • SavedModel format (including optimizer's step)
  • [...]
Answer
HDF5 format


Parent (intermediate) annotation

Saving and loading models Two formats: SavedModel format (including optimizer's step) HDF5 format

Original toplevel document

TfC 01 regression
is to track the results of your experiments. There are tools to help us! Resource: Try: Tensorboard - a component of Tensorflow library to help track modelling experiments Weights & Biases Saving and loading models Two formats: SavedModel format (including optimizer's step) HDF5 format What about TensorFlow Serving format? # Save the entire model using SavedModel model_3.save("best_model_3_SavedModel") # SavedModel is in principle a protobuf (.pb) file # Save model in HDF5 format: model_3.save("best_model_3_HDF5.h5") Load model loaded_model_SM = tf.keras.models.load_model('/content/best_model_3_SavedModel') loaded_model_SM.summary()







Flashcard 7626845719820

Tags
#tensorflow #tensorflow-certificate
Question
# Calculate MSE "by hand" in steps - identify functions

abs_err = tf.abs(tf.subtract(tf.cast(y_test, dtype=tf.float32), [...](y_pred)))
sq_abs_err = tf.multiply(abs_err, abs_err)
sq_abs_err
tf.math.reduce_mean(sq_abs_err)



<tf.Tensor: shape=(), dtype=float32, numpy=155.11417>

Answer
tf.squeeze


Calculate MSE &quot;by hand&quot; in steps - identify functions
# Calculate MSE "by hand" in steps - identify functions abs_err = tf.abs(tf.subtract(tf.cast(y_test, dtype=tf.float32), tf.squeeze(y_pred))) sq_abs_err = tf.multiply(abs_err, abs_err) sq_abs_err tf.math.reduce_mean(sq_abs_err) <tf.Tensor: shape=(), dtype=float32, numpy=155.11417>