
mini-batch-gradient-descent
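For orientation only: mini-batch gradient descent updates model parameters using the gradient of the loss averaged over a small random batch of examples, sitting between full-batch gradient descent (whole dataset per step) and stochastic gradient descent (one example per step). Below is a minimal NumPy sketch for linear regression; the synthetic data, learning rate, batch size, and epoch count are illustrative assumptions and are not taken from any repository listed on this page.

```python
import numpy as np

# Minimal mini-batch gradient descent sketch for linear regression (illustrative only).
# Assumptions: synthetic data, mean-squared-error loss, fixed learning rate and batch size.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))                 # 1000 samples, 3 features (made up)
true_w, true_b = np.array([2.0, -1.0, 0.5]), 0.3
y = X @ true_w + true_b + 0.1 * rng.normal(size=1000)

w, b = np.zeros(3), 0.0
lr, batch_size, epochs = 0.05, 32, 20

for epoch in range(epochs):
    perm = rng.permutation(len(X))             # reshuffle the data once per epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]   # indices of one mini-batch
        Xb, yb = X[idx], y[idx]
        err = Xb @ w + b - yb                  # prediction error on this batch
        grad_w = Xb.T @ err / len(idx)         # gradient of MSE w.r.t. weights
        grad_b = err.mean()                    # gradient of MSE w.r.t. bias
        w -= lr * grad_w                       # update parameters from this batch only
        b -= lr * grad_b

print("learned weights:", w, "bias:", b)
```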

Here are 69 public repositories matching this topic...

Course: Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization, the second course of the Deep Learning specialization. This repository contains all the solved exercises. https://www.coursera.org/learn/neural-networks-deep-learning

  • Updated Jan 11, 2019
  • Jupyter Notebook

A five-course specialization covering the foundations of Deep Learning, from building CNNs, RNNs & LSTMs to choosing model configurations and techniques such as Adam, Dropout, BatchNorm, and Xavier/He initialization.

  • Updated Apr 26, 2019
  • Jupyter Notebook

Your all-in-one machine learning resource: from-scratch implementations, ensemble learning, and real-world model tuning. This repository is a complete collection of 25+ essential ML algorithms written in clean, beginner-friendly Jupyter Notebooks. Each algorithm is explained with intuitive theory, visualizations, and a hands-on implementation.

  • Updated Jul 22, 2025
  • Jupyter Notebook
