Edited, memorised or added to reading queue

on 12-Jan-2025 (Sun)


#tensorflow #tensorflow-certificate
Finding the best learning rate
#model
tf.random.set_seed(42)

model_7 = tf.keras.Sequential([
    tf.keras.layers.Dense(4, activation='relu'),
    tf.keras.layers.Dense(4, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid')
])

model_7.compile(optimizer=tf.keras.optimizers.Adam(),
                loss='binary_crossentropy',
                metrics=['accuracy'])

# callback: raise the learning rate from 1e-4 by a factor of 10 every 20 epochs
lr_scheduler = tf.keras.callbacks.LearningRateScheduler(lambda epoch: 1e-4 * 10**(epoch/20))

history_7 = model_7.fit(X_train, y_train, epochs=100, callbacks=[lr_scheduler])

# plot learning rate vs the loss
plt.figure(figsize=(10, 7))
plt.semilogx(history_7.history['lr'], history_7.history['loss'])
plt.xlabel('Learning Rate')
plt.ylabel('Loss')
plt.title('Learning Rate vs Loss');
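The sweep logs one loss value per learning rate. A common heuristic (not stated in the card) is to take the rate at the lowest loss and back off by about 10x, since the loss usually starts diverging just past that point. A minimal sketch with a synthetic loss curve standing in for history_7:

```python
import numpy as np

# Synthetic stand-in for the logged sweep above: lr follows the same
# schedule (1e-4 * 10**(epoch/20)); the loss curve shape is illustrative.
lrs = 1e-4 * 10 ** (np.arange(100) / 20)
losses = np.concatenate([np.linspace(0.7, 0.1, 60), np.linspace(0.1, 2.0, 40)])

# Heuristic: take the lr where the loss is lowest, then back off by 10x.
best_lr = lrs[np.argmin(losses)] / 10
print(f"suggested learning rate: {best_lr:.2e}")
```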

TfC_02_classification-PART_1




Flashcard 7674693815564

Tags
#ML-engineering #ML_in_Action #learning #machine #software-engineering
Question
The most effective way to solve those business problems that we’re all tasked with as data science (DS) practitioners is to follow a process designed around preventing [...], confusion, and complexity.
Answer
rework


Parent (intermediate) annotation

The most effective way to solve those business problems that we’re all tasked with as data science (DS) practitioners is to follow a process designed around preventing rework, confusion, and complexity.

Original toplevel document (pdf)








Flashcard 7674695650572

Tags
#recurrent-neural-networks #rnn
Question
Different seq2seq models can be created depending on how we manipulate the input data; i.e., we can [...] certain parts of the input sequence and train the model to predict what is missing, to “fill in the blanks”.
Answer
conceal


Parent (intermediate) annotation

Different seq2seq models can be created depending on how we manipulate the input data; i.e., we can conceal certain parts of the input sequence and train the model to predict what is missing, to “fill in the blanks”.

Original toplevel document (pdf)

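The concealment idea can be sketched with plain NumPy: hide a random subset of token positions and keep the originals as targets. The MASK_ID token and the -1 ignore value are assumptions for illustration, not from the source:

```python
import numpy as np

rng = np.random.default_rng(42)

MASK_ID = 0  # hypothetical reserved id for a "[MASK]" token (assumption)
tokens = np.array([5, 12, 7, 3, 9, 41, 2, 8])

# Conceal ~25% of positions: the model would be trained to reconstruct
# the original tokens at exactly these masked positions.
mask = rng.random(tokens.shape) < 0.25
inputs = np.where(mask, MASK_ID, tokens)   # what the model sees
targets = np.where(mask, tokens, -1)       # -1 marks positions ignored by the loss

print(inputs)
print(targets)
```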







Flashcard 7674697747724

Tags
#tensorflow #tensorflow-certificate
Question

Bag of tricks to improve model

  1. Create model - more [...], more neurons, different activation
Answer
layers


Parent (intermediate) annotation

Bag of tricks to improve model: Create model - more layers, more neurons, different activation

Original toplevel document

TfC_02_classification-PART_1
Bag of tricks to improve model:

  1. Create model - more layers, more neurons, different activation
  2. Compile model - other loss, other optimizer, change optimizer parameters
  3. Fit the model - more epochs, more data examples
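As a sketch of these tricks, a hypothetical scaled-up binary classifier: more layers, more neurons, a different hidden activation, and a changed optimizer parameter (all layer sizes and the 0.01 learning rate are illustrative, not from the source):

```python
import tensorflow as tf

tf.random.set_seed(42)

# Hypothetical "bigger" variant: more layers, more neurons, a different
# hidden activation, and a tuned optimizer parameter (all illustrative).
bigger_model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation='tanh'),
    tf.keras.layers.Dense(32, activation='tanh'),
    tf.keras.layers.Dense(16, activation='tanh'),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
bigger_model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
                     loss='binary_crossentropy',
                     metrics=['accuracy'])
print(len(bigger_model.layers))  # 4
```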







Flashcard 7674705874188

Tags
#tensorflow #tensorflow-certificate
Question

# Get the patterns of a layer in our network

weights, biases = model_35.[...][1].get_weights()

Answer
layers


Parent (intermediate) annotation

# Get the patterns of a layer in our network
weights, biases = model_35.layers[1].get_weights()

Original toplevel document

TfC_02_classification-PART_2
Important: this time there is a problem with the loss function. In the case of categorical_crossentropy the labels have to be one-hot encoded; in the case of labels as integers use SparseCategoricalCrossentropy.

# Get the patterns of a layer in our network
weights, biases = model_35.layers[1].get_weights()
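Since model_35 itself is not shown on this page, a tiny stand-in model illustrates what get_weights() returns for a Dense layer: a kernel of shape (inputs, units) and a bias of shape (units,):

```python
import tensorflow as tf

tf.random.set_seed(42)

# Tiny stand-in network (model_35 from the card is not shown here).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(4, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])

# layers[1] is the second Dense layer: kernel (4, 1), bias (1,)
weights, biases = model.layers[1].get_weights()
print(weights.shape, biases.shape)
```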