This tutorial is the third part of the “Text Generation in Deep Learning with Tensorflow & Keras” series.

In this series, we have been covering all the topics related to Text Generation with sample implementations in Python.

In this tutorial, we will focus on how to build an Efficient TensorFlow Input Pipeline for Word-Level Text Generation.

First, we will download a sample corpus (text file).

After opening the file and reading it line-by-line, we will split the text into words.

Then, we will generate pairs including an input word sequence (X) and an output word (y).

Using the tf.data API and…
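As a quick preview, here is a minimal sketch of such a word-level pipeline. The file name, sequence length, and batch size below are illustrative assumptions, not the tutorial's exact values:

```python
import tensorflow as tf

corpus_path = "corpus.txt"   # hypothetical sample corpus file
seq_length = 5               # length of the input word sequence X (assumption)

# Read the file line by line and split the text into words
with open(corpus_path, "r", encoding="utf-8") as f:
    words = [w for line in f for w in line.split()]

# Map words to integer ids with a simple vocabulary
vocab = sorted(set(words))
word_to_id = {w: i for i, w in enumerate(vocab)}
ids = [word_to_id[w] for w in words]

# Slide a window of seq_length + 1 words and split it into (X, y)
dataset = tf.data.Dataset.from_tensor_slices(ids)
dataset = dataset.window(seq_length + 1, shift=1, drop_remainder=True)
dataset = dataset.flat_map(lambda window: window.batch(seq_length + 1))
dataset = dataset.map(lambda seq: (seq[:-1], seq[-1]))  # X = input words, y = next word

# Batch and prefetch so the input pipeline stays efficient
dataset = dataset.batch(64).prefetch(tf.data.AUTOTUNE)
```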


We have just started a new tutorial series on Text Generation.


The series will consist of the following parts:

Part A: Fundamentals

Part B: Tensorflow Data Pipeline for Character Level Text Generation

Part C: Tensorflow Data Pipeline for Word Level Text Generation

Part D: Recurrent Neural Network (LSTM) Model for Character Level Text Generation

Part E: Encoder-Decoder Model for Character Level Text Generation

Part F: Recurrent Neural Network (LSTM) Model for Word Level Text Generation

Part G: Encoder-Decoder Model for Word Level Text Generation

We share the complete Python/TensorFlow/Keras code as Colab notebooks.

Furthermore, you will be able to watch YouTube videos for a visual explanation of these posts.

Enjoy!

Murat Karakaya Akademi.


This tutorial is the second part of the “Text Generation in Deep Learning with Tensorflow & Keras” series.

In this series, we have been covering all the topics related to Text Generation with sample implementations in Python.

In this tutorial, we will focus on how to build an Efficient TensorFlow Input Pipeline for Character-Level Text Generation.

First, we will download a sample corpus (text file).

After opening the file and reading it line-by-line, we will convert it to a single line of text.

Then, we will split the text into input character sequence (X) and output character (y).

Using tf.data.Dataset
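As a preview, the character-level pipeline can be sketched as follows. The file name and sequence length are illustrative assumptions, not the tutorial's exact values:

```python
import tensorflow as tf

corpus_path = "corpus.txt"   # hypothetical sample corpus file
seq_length = 100             # length of the input character sequence X (assumption)

# Read the file line by line and join it into a single line of text
with open(corpus_path, "r", encoding="utf-8") as f:
    text = " ".join(line.strip() for line in f)

# Map each character to an integer id
vocab = sorted(set(text))
char_to_id = {c: i for i, c in enumerate(vocab)}
ids = [char_to_id[c] for c in text]

# Cut the id stream into chunks of seq_length + 1 characters and
# split each chunk into the input sequence X and the output character y
dataset = tf.data.Dataset.from_tensor_slices(ids)
dataset = dataset.batch(seq_length + 1, drop_remainder=True)
dataset = dataset.map(lambda seq: (seq[:-1], seq[-1]))  # X = characters, y = next character

dataset = dataset.batch(64).prefetch(tf.data.AUTOTUNE)
```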



INDEX PAGE

This is the index page of the “Text Generation in Deep Learning” series.

We will cover all the topics related to Text Generation with sample implementations in Python, TensorFlow, and Keras.

You can access the codes, videos, and posts from the links below.

If you would like to learn more about Deep Learning with practical coding examples, please subscribe to my YouTube Channel or follow my blog on Medium. Do not forget to turn on notifications so that you will be notified when new parts are uploaded.

MEDIUM BLOG LINKS:

Part A: Fundamentals

Part B: Tensorflow Data Pipeline for Character Level Text Generation

Part…


Fundamentals

This tutorial is the first part of the “Text Generation in Deep Learning” series.

We will cover all the topics related to Text Generation with sample implementations in Python, TensorFlow, and Keras.

You can access the codes, videos, and posts from the links below.

In this part, we will learn the Fundamentals of Text Generation in Deep Learning.

If you would like to learn more about Deep Learning with practical coding examples, please subscribe to my YouTube Channel or follow my blog on Medium. …


This is the fourth part of the “How to solve Classification Problems in Keras?” series.

Before starting this tutorial, I strongly suggest you go over Part A: Classification with Keras to learn all related concepts.

The link to all the parts of the series is provided in the video description.

In this tutorial, we will focus on how to solve Multi-Label Classification Problems in Deep Learning with Tensorflow & Keras.

First, we will download a sample Multi-label dataset.

In multi-label classification problems, we mostly encode the true labels with multi-hot vectors.

We will experiment with various combinations of the last layer’s…
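As a small illustration of multi-hot label encoding, here is a sketch; the class count and label indices are made-up examples, and the sigmoid/binary-crossentropy pairing shown is the commonly used choice for multi-hot targets:

```python
import tensorflow as tf

num_classes = 5
sample_labels = [[0, 3], [2], [1, 2, 4]]   # each sample can belong to several classes

# Taking the element-wise max over one-hot vectors yields a multi-hot vector per sample
multi_hot = [tf.reduce_max(tf.one_hot(l, depth=num_classes), axis=0) for l in sample_labels]
print(tf.stack(multi_hot).numpy())
# [[1. 0. 0. 1. 0.]
#  [0. 0. 1. 0. 0.]
#  [0. 1. 1. 0. 1.]]

# With multi-hot targets, a sigmoid last layer with binary crossentropy is a common pairing
output_layer = tf.keras.layers.Dense(num_classes, activation="sigmoid")
loss = tf.keras.losses.BinaryCrossentropy()
```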


Today, we will focus on how to solve Classification Problems in Deep Learning with Tensorflow & Keras.

When we design a model in Deep Neural Networks, we need to know how to select proper label encoding, Activation, and Loss functions, along with accuracy metrics according to the classification task at hand.

Thus, in this tutorial, we will first investigate the types of Classification Problems. Then, we will see the most frequently used label encodings in Keras. We will learn how to select Activation & Loss functions according to the given classification type and label encoding. …
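As a rough preview, the pairings below are the ones most commonly used in Keras; this is only a sketch, and the exact combinations examined in the tutorial may differ:

```python
import tensorflow as tf

# Binary classification:        1 unit,  sigmoid + BinaryCrossentropy
# Multi-class, integer labels:  N units, softmax + SparseCategoricalCrossentropy
# Multi-class, one-hot labels:  N units, softmax + CategoricalCrossentropy
# Multi-label, multi-hot labels:N units, sigmoid + BinaryCrossentropy

num_classes = 10  # illustrative value
multi_class_head = tf.keras.layers.Dense(num_classes, activation="softmax")
multi_class_loss = tf.keras.losses.SparseCategoricalCrossentropy()  # for integer-encoded labels
```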


In this tutorial, we will focus on how to Build Efficient TensorFlow Input Pipelines for Image Datasets in Deep Learning with Tensorflow & Keras.

First, we will review the tf.data library. Then, we will download a sample image and label files. After gathering all the image file paths in the directories, we will merge the file names with labels to create the train and test datasets. Using tf.data.Dataset methods, we will learn how to map, prefetch, cache, and batch the datasets correctly so that the data input pipeline will be efficient in terms of time and performance. We will discuss how…
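A minimal sketch of such an image pipeline is shown below; the file paths, labels, image size, and batch size are placeholder assumptions, not the tutorial's actual dataset:

```python
import tensorflow as tf

file_paths = ["images/cat_001.jpg", "images/dog_001.jpg"]   # placeholder image paths
labels = [0, 1]                                             # placeholder integer labels

def load_image(path, label):
    # Read, decode, and resize one image; the target size is an assumption
    image = tf.io.read_file(path)
    image = tf.io.decode_jpeg(image, channels=3)
    image = tf.image.resize(image, [224, 224]) / 255.0
    return image, label

dataset = (tf.data.Dataset.from_tensor_slices((file_paths, labels))
           .map(load_image, num_parallel_calls=tf.data.AUTOTUNE)  # decode images in parallel
           .cache()                                               # cache decoded images
           .batch(32)                                             # group samples into batches
           .prefetch(tf.data.AUTOTUNE))                           # overlap preprocessing with training
```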


This is the third part of the “How to solve Classification Problems in Keras?” series.

If you have not gone over Part A and Part B, please review them before continuing with this tutorial.

The link to all parts is provided below.

In this tutorial, we will focus on how to solve Multi-Class Classification Problems in Deep Learning with Tensorflow & Keras.

First, we will download the MNIST dataset.

In multi-class classification problems, we have two options to encode the true labels by using either:

  • integer numbers, or
  • one-hot vectors

We will experiment with both encodings to observe the effect…
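As a quick sketch of the two encodings (the loss functions shown are the standard Keras pairings for each):

```python
import tensorflow as tf

# Load MNIST; labels arrive as integers 0..9
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()

# Option 1: integer labels used directly with SparseCategoricalCrossentropy
sparse_labels = y_train                                   # e.g. [5, 0, 4, ...]
sparse_loss = tf.keras.losses.SparseCategoricalCrossentropy()

# Option 2: one-hot encoded labels used with CategoricalCrossentropy
one_hot_labels = tf.keras.utils.to_categorical(y_train, num_classes=10)
one_hot_loss = tf.keras.losses.CategoricalCrossentropy()
```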


Hello Everyone!

Due to the importance of Seq2Seq learning, I prepared a series of posts and videos.

You can access all my SEQ2SEQ Learning videos on the Murat Karakaya Akademi YouTube channel, in ENGLISH or in TURKISH.

You can access all the tutorials on my Medium blog.

[Figure: Encoder-Decoder Model with Global Attention]

Here is the list of the tutorials:

Part A: AN INTRODUCTION TO SEQ2SEQ LEARNING AND A SAMPLE SOLUTION WITH MLP NETWORK

Part B: SEQ2SEQ LEARNING WITH RECURRENT NEURAL NETWORKS (LSTM)

Part C: SEQ2SEQ LEARNING WITH A BASIC ENCODER…

Murat Karakaya

Assoc. Prof. Computer Engineering
