
zhang-guodong/Deep-Learning-Specialization


My solutions to Quizzes and Programming Assignments of the specialization .

Please Star or Fork if it helps. Thanks.

  • Quiz: Recurrent Neural Networks
  • Programming Assignment: Building your Recurrent Neural Network - Step by Step
  • Programming Assignment: Dinosaur Island - Character-Level Language Modeling
  • Programming Assignment: Jazz Improvisation with LSTM
  • Quiz: Natural Language Processing & Word Embeddings
  • Programming Assignment: Operations on Word Vectors - Debiasing
  • Programming Assignment: Emojify (raw file only; the solved notebook was lost by mistake)
  • Quiz: Sequence Models & Attention Mechanism
  • Programming Assignment: Neural Machine Translation
  • Programming Assignment: Trigger Word Detection
  • Quiz: Transformers
  • Programming Assignment: Transformers Architecture with TensorFlow

Deep-Learning-Specialization-Coursera

This repo contains the updated versions of all the assignments/labs (done by me) of the Deep Learning Specialization on Coursera by Andrew Ng. It includes building various deep learning models from scratch and implementing them for object detection, facial recognition, autonomous driving, neural machine translation, trigger word detection, etc. Deep Learning Specialization, Coursera [updated version 2021].

GitHub Repo

Announcement

[!IMPORTANT] Check our latest paper (accepted in ICDAR’23) on Urdu OCR

UTRNet

This repo contains all of the solved assignments of Coursera's popular Deep Learning Specialization, a series of 5 courses offered by deeplearning.ai.

Instructor: Prof. Andrew Ng

This Specialization was updated in April 2021 to reflect developments in deep learning and programming frameworks. One of the biggest changes was the shift from TensorFlow 1 to TensorFlow 2; new material was also added. However, most of the older online repositories still don't have the updated code. This repo contains the updated versions of the assignments. Happy Learning :)
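For anyone unfamiliar with that change, here is a minimal sketch (my own illustration, not code from the assignments) of the TensorFlow 2 style the updated notebooks rely on: eager execution and `tf.GradientTape` replace the TensorFlow 1 placeholder/`Session.run` workflow.

```python
# Minimal TF2-style sketch: values are computed eagerly and gradients come
# from GradientTape, with no placeholders or Session.run as in TF1.
import tensorflow as tf

w = tf.Variable(3.0)

with tf.GradientTape() as tape:
    loss = w ** 2              # evaluated immediately (eager execution)

grad = tape.gradient(loss, w)  # d(w^2)/dw = 2w = 6.0
print(grad.numpy())
```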

Programming Assignments

Course 1: Neural Networks and Deep Learning

  • W2A1 - Logistic Regression with a Neural Network mindset
  • W2A2 - Python Basics with Numpy
  • W3A1 - Planar data classification with one hidden layer
  • W4A1 - Building your Deep Neural Network: Step by Step
  • W4A2 - Deep Neural Network for Image Classification: Application

Course 2: Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

  • W1A1 - Initialization
  • W1A2 - Regularization
  • W1A3 - Gradient Checking
  • W2A1 - Optimization Methods
  • W3A1 - Introduction to TensorFlow

Course 3: Structuring Machine Learning Projects

  • There were no programming assignments in this course. It was completely theoretical.
  • Here is a link to the course

Course 4: Convolutional Neural Networks

  • W1A1 - Convolutional Model: step by step
  • W1A2 - Convolutional Model: application
  • W2A1 - Residual Networks
  • W2A2 - Transfer Learning with MobileNet
  • W3A1 - Autonomous Driving - Car Detection
  • W3A2 - Image Segmentation - U-Net
  • W4A1 - Face Recognition
  • W4A2 - Neural Style Transfer

Course 5: Sequence Models

  • W1A1 - Building a Recurrent Neural Network - Step by Step
  • W1A2 - Character level language model - Dinosaurus land
  • W1A3 - Improvise A Jazz Solo with an LSTM Network
  • W2A1 - Operations on word vectors
  • W2A2 - Emojify
  • W3A1 - Neural Machine Translation With Attention
  • W3A2 - Trigger Word Detection
  • W4A1 - Transformer Network
  • W4A2 - Named Entity Recognition - Transformer Application
  • W4A3 - Extractive Question Answering - Transformer Application

I've uploaded these solutions here only as a help for those who get stuck somewhere; it may save them some time. I strongly recommend not copying any part of the code directly (from here or anywhere else) while doing the assignments of this specialization. The assignments are fairly easy, and one learns a great deal by doing them. Thanks to the deeplearning.ai team for giving us this treasure.

Connect with me

Name: Abdur Rahman

Institution: Indian Institute of Technology Delhi

Find me on:

LinkedIn

Deep-Learning-Specialization

Coursera Deep Learning Specialization: Sequence Models.

This course will teach you how to build models for natural language, audio, and other sequence data. Thanks to deep learning, sequence algorithms are working far better than just two years ago, and this is enabling numerous exciting applications in speech recognition, music synthesis, chatbots, machine translation, natural language understanding, and many others.

  • Understand how to build and train Recurrent Neural Networks (RNNs), and commonly-used variants such as GRUs and LSTMs.
  • Be able to apply sequence models to natural language problems, including text synthesis.
  • Be able to apply sequence models to audio applications, including speech recognition and music synthesis.
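As a rough illustration of the first objective, here is a minimal sketch of building and training a small LSTM classifier in Keras (TensorFlow 2). The shapes, hyperparameters, and random data are purely illustrative and are not taken from any assignment; a `GRU` or `Bidirectional(LSTM(...))` layer could be swapped in at the same spot.

```python
# Illustrative "build and train" sketch for a sequence model in Keras.
import numpy as np
import tensorflow as tf

T_x, n_features = 30, 8                              # 30 time steps, 8 features per step

inputs = tf.keras.Input(shape=(T_x, n_features))
x = tf.keras.layers.LSTM(64)(inputs)                 # or GRU(64), or Bidirectional(LSTM(64))
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Random data just to show the training call.
X = np.random.randn(100, T_x, n_features).astype("float32")
y = np.random.randint(0, 2, size=(100, 1))
model.fit(X, y, epochs=2, batch_size=16)
```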

Week 1: Sequence Models

Learn about recurrent neural networks. This type of model has been proven to perform extremely well on temporal data. It has several variants including LSTMs, GRUs and Bidirectional RNNs, which you are going to learn about in this section.
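For orientation, here is a minimal sketch of the single forward step that the "step by step" assignment has you implement. The variable names (`xt`, `a_prev`, `Wax`, `Waa`, `Wya`, `ba`, `by`) follow the course's notation, but the code below is an illustration, not the graded solution.

```python
# One RNN cell forward step, in the course's notation:
#   a_next = tanh(Waa @ a_prev + Wax @ xt + ba)
#   y_hat  = softmax(Wya @ a_next + by)
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z, axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def rnn_cell_forward(xt, a_prev, Wax, Waa, Wya, ba, by):
    a_next = np.tanh(Waa @ a_prev + Wax @ xt + ba)   # new hidden state
    yt_pred = softmax(Wya @ a_next + by)             # prediction at time t
    return a_next, yt_pred

# Illustrative dimensions: n_x inputs, n_a hidden units, n_y outputs, batch size m.
n_x, n_a, n_y, m = 3, 5, 2, 10
rng = np.random.default_rng(0)
xt, a_prev = rng.standard_normal((n_x, m)), rng.standard_normal((n_a, m))
Wax, Waa = rng.standard_normal((n_a, n_x)), rng.standard_normal((n_a, n_a))
Wya = rng.standard_normal((n_y, n_a))
ba, by = np.zeros((n_a, 1)), np.zeros((n_y, 1))

a_next, yt_pred = rnn_cell_forward(xt, a_prev, Wax, Waa, Wya, ba, by)
print(a_next.shape, yt_pred.shape)   # (5, 10) (2, 10)
```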

Assignment of Week 1

  • Quiz 1: Recurrent Neural Networks
  • Programming Assignment: Building a recurrent neural network - step by step
  • Programming Assignment: Dinosaur Island - Character-Level Language Modeling
  • Programming Assignment: Jazz improvisation with LSTM

Week 2: Natural Language Processing & Word Embeddings

Natural language processing with deep learning is an important combination. Using word vector representations and embedding layers, you can train recurrent neural networks that achieve outstanding performance in a wide variety of industries. Examples of applications are sentiment analysis, named entity recognition, and machine translation.
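A minimal sketch of the embedding-layer idea in Keras is shown below: integer word indices are mapped to dense vectors, and the resulting sequence is fed to an LSTM classifier. The vocabulary size, embedding dimension, sequence length, and the five output classes (echoing the Emojify task) are assumptions made only for illustration.

```python
# Illustrative embedding-layer sketch: word indices -> dense vectors -> LSTM.
import tensorflow as tf

vocab_size, embed_dim, max_len = 10_000, 50, 20      # assumed, for illustration

inputs = tf.keras.Input(shape=(max_len,), dtype="int32")      # word indices
x = tf.keras.layers.Embedding(vocab_size, embed_dim)(inputs)  # (batch, 20, 50)
x = tf.keras.layers.LSTM(32)(x)
outputs = tf.keras.layers.Dense(5, activation="softmax")(x)   # e.g. 5 emoji classes
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```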

Assignment of Week 2

  • Quiz 2: Natural Language Processing & Word Embeddings
  • Programming Assignment: Operations on word vectors - Debiasing
  • Programming Assignment: Emojify

Week 3: Sequence models & Attention mechanism

Sequence models can be augmented using an attention mechanism. This algorithm will help your model understand where it should focus its attention given a sequence of inputs. This week, you will also learn about speech recognition and how to deal with audio data.
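The core of the mechanism can be sketched in a few lines of NumPy. Note that the course's model computes alignment scores with a small dense network over the concatenated encoder state and previous decoder state; the dot-product score below is a simpler stand-in used only to show how the attention weights and context vector are formed.

```python
# Illustrative attention step: score each encoder time step against the
# decoder state, softmax the scores into weights, take the weighted sum.
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

T_x, n_a = 10, 4                        # encoder steps, state size (assumed)
rng = np.random.default_rng(1)
a = rng.standard_normal((T_x, n_a))     # encoder hidden states a<1..T_x>
s_prev = rng.standard_normal(n_a)       # previous decoder state s<t-1>

scores = a @ s_prev                     # dot-product alignment scores (stand-in)
alphas = softmax(scores)                # attention weights, sum to 1 over T_x
context = alphas @ a                    # context vector fed to the decoder
print(alphas.shape, context.shape)      # (10,) (4,)
```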

Assignment of Week 3

  • Quiz 3: Sequence models & Attention mechanism
  • Programming Assignment: Neural Machine Translation with Attention
  • Programming Assignment: Trigger word detection

Course Certificate

Certificate
