A Generative Adversarial Network for generating new synthetic art.
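For orientation, here is a minimal, hedged sketch of the kind of GAN training step such a project typically builds with a custom loop. The network shapes, latent size, and optimizer settings below are illustrative assumptions, not taken from the repository.

```python
import tensorflow as tf

latent_dim = 100  # assumed latent size for illustration

# Small placeholder generator and discriminator (28x28 grayscale images assumed).
generator = tf.keras.Sequential([
    tf.keras.Input(shape=(latent_dim,)),
    tf.keras.layers.Dense(7 * 7 * 64, activation="relu"),
    tf.keras.layers.Reshape((7, 7, 64)),
    tf.keras.layers.Conv2DTranspose(32, 4, strides=2, padding="same", activation="relu"),
    tf.keras.layers.Conv2DTranspose(1, 4, strides=2, padding="same", activation="sigmoid"),
])
discriminator = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, 4, strides=2, padding="same", activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1),  # logits
])

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
g_opt = tf.keras.optimizers.Adam(1e-4)
d_opt = tf.keras.optimizers.Adam(1e-4)

@tf.function
def gan_train_step(real_images):
    batch = tf.shape(real_images)[0]
    noise = tf.random.normal([batch, latent_dim])
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fake_images = generator(noise, training=True)
        real_logits = discriminator(real_images, training=True)
        fake_logits = discriminator(fake_images, training=True)
        # Discriminator: label real as 1 and fake as 0. Generator: fool the discriminator.
        d_loss = bce(tf.ones_like(real_logits), real_logits) + \
                 bce(tf.zeros_like(fake_logits), fake_logits)
        g_loss = bce(tf.ones_like(fake_logits), fake_logits)
    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))
    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, discriminator.trainable_variables),
                              discriminator.trainable_variables))
    return g_loss, d_loss
```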
Implementation of Linear Regression using TensorFlow's low-level API with a custom tf.GradientTape training loop. Covers manual gradient computation, weight updates, and visualization of predicted vs. actual values, for an educational understanding of core training mechanics.
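A minimal sketch of this style of low-level loop, assuming synthetic data and hand-picked hyperparameters (both are illustrative, not taken from the repository):

```python
import numpy as np
import tensorflow as tf

# Synthetic data: y = 3x + 2 plus a little noise.
X = np.random.rand(200, 1).astype("float32")
y = (3.0 * X + 2.0 + 0.1 * np.random.randn(200, 1)).astype("float32")

# Trainable parameters: weight and bias.
w = tf.Variable(tf.random.normal([1, 1]))
b = tf.Variable(tf.zeros([1]))

learning_rate = 0.1
for epoch in range(100):
    with tf.GradientTape() as tape:
        y_pred = tf.matmul(X, w) + b                   # forward pass
        loss = tf.reduce_mean(tf.square(y_pred - y))   # MSE loss
    # Manual gradient computation and parameter update.
    dw, db = tape.gradient(loss, [w, b])
    w.assign_sub(learning_rate * dw)
    b.assign_sub(learning_rate * db)

print("w:", w.numpy().ravel(), "b:", b.numpy().ravel())
```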
An in-depth guide to customizing model.fit() in TensorFlow/Keras by overriding the train_step function. Covers the manual implementation of the forward pass, loss calculation, gradient application, and metric updates. Includes a basic GAN implementation as a practical example.
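As a rough illustration of the pattern, the sketch below overrides train_step in a Keras Model subclass; the toy regression model, MSE loss, and random data are assumptions made only to keep the example runnable.

```python
import numpy as np
import tensorflow as tf

class CustomModel(tf.keras.Model):
    def train_step(self, data):
        x, y = data
        with tf.GradientTape() as tape:
            y_pred = self(x, training=True)                         # forward pass
            loss = tf.reduce_mean(tf.keras.losses.mse(y, y_pred))   # loss calculation
        # Gradient computation and application.
        grads = tape.gradient(loss, self.trainable_variables)
        self.optimizer.apply_gradients(zip(grads, self.trainable_variables))
        return {"loss": loss}

# Usage: build functionally, compile with an optimizer, then model.fit()
# drives the custom train_step above.
inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dense(1)(inputs)
model = CustomModel(inputs, outputs)
model.compile(optimizer="adam")

x = np.random.rand(64, 4).astype("float32")
y = np.random.rand(64, 1).astype("float32")
model.fit(x, y, epochs=2, verbose=0)
```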
A hands-on guide to automatic differentiation in TensorFlow using tf.GradientTape. Covers computing gradients for variables vs. constants, using tape.watch(), visualizing derivatives, and handling multiple parameters.
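A compact sketch of those mechanics; the functions being differentiated are arbitrary examples.

```python
import tensorflow as tf

# Variables are watched by the tape automatically.
x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x ** 2
print(tape.gradient(y, x))  # dy/dx = 2x = 6.0

# Constants must be watched explicitly with tape.watch().
c = tf.constant(3.0)
with tf.GradientTape() as tape:
    tape.watch(c)
    y = c ** 2
print(tape.gradient(y, c))  # 6.0

# Gradients with respect to multiple parameters at once.
w = tf.Variable(2.0)
b = tf.Variable(1.0)
with tf.GradientTape() as tape:
    loss = (w * 4.0 + b - 10.0) ** 2
dw, db = tape.gradient(loss, [w, b])
print(dw.numpy(), db.numpy())  # -8.0, -2.0
```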
An end-to-end implementation of a custom training and validation loop for a CNN on the Fashion MNIST dataset. This project demonstrates low-level model training using tf.GradientTape and tf.keras.metrics, without relying on model.fit().
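A condensed sketch of such a loop follows; the small CNN architecture, batch size, and epoch count are placeholder assumptions rather than the project's actual configuration.

```python
import tensorflow as tf

(x_train, y_train), (x_val, y_val) = tf.keras.datasets.fashion_mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0
x_val = x_val[..., None].astype("float32") / 255.0

train_ds = tf.data.Dataset.from_tensor_slices((x_train, y_train)).shuffle(1024).batch(64)
val_ds = tf.data.Dataset.from_tensor_slices((x_val, y_val)).batch(64)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),  # logits
])
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
optimizer = tf.keras.optimizers.Adam()
train_acc = tf.keras.metrics.SparseCategoricalAccuracy()
val_acc = tf.keras.metrics.SparseCategoricalAccuracy()

for epoch in range(2):
    # Training loop: forward pass, gradients, update, metric accumulation.
    for x, y in train_ds:
        with tf.GradientTape() as tape:
            logits = model(x, training=True)
            loss = loss_fn(y, logits)
        grads = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(grads, model.trainable_variables))
        train_acc.update_state(y, logits)
    # Validation loop: no gradient tape, inference mode only.
    for x, y in val_ds:
        val_acc.update_state(y, model(x, training=False))
    print(f"epoch {epoch}: train_acc={train_acc.result().numpy():.3f} "
          f"val_acc={val_acc.result().numpy():.3f}")
    train_acc.reset_state()
    val_acc.reset_state()
```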
Custom TensorFlow training loops for image classification: a foundational CNN trained on Eurosat with tf.GradientTape, and an optimized MNIST MLP with BatchNorm, Dropout, and learning rate scheduling for higher accuracy.
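As a sketch of the second setup's key pieces, the snippet below wires BatchNorm, Dropout, and a decaying learning rate into a custom training step; layer sizes and schedule parameters are assumed for illustration.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(256),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.ReLU(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(10),  # logits
])

# Exponentially decaying learning rate fed directly to the optimizer.
schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3, decay_steps=1000, decay_rate=0.9)
optimizer = tf.keras.optimizers.Adam(learning_rate=schedule)
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

@tf.function
def train_step(x, y):
    with tf.GradientTape() as tape:
        logits = model(x, training=True)   # training=True enables BatchNorm updates and Dropout
        loss = loss_fn(y, logits)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss
```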