
UNet-based-Denoising-Autoencoder-In-PyTorch

Cleaning printed text using Denoising Autoencoder based on UNet architecture in PyTorch

Acknowledgement

The UNet architecture used here is borrowed from https://github.com/jvanvugt/pytorch-unet. The only modification made to that architecture is the addition of dropout layers.
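For illustration only, here is a minimal sketch of how dropout could be added to a UNet-style convolutional block; the class name, arguments, and dropout placement below are assumptions and not the exact code of this repository or of jvanvugt/pytorch-unet:

    import torch.nn as nn

    class ConvBlockWithDropout(nn.Module):
        # Hypothetical UNet conv block: two conv + ReLU layers followed by
        # spatial dropout, illustrating the kind of modification described above.
        def __init__(self, in_ch, out_ch, p_drop=0.3):
            super().__init__()
            self.block = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
                nn.Dropout2d(p=p_drop),  # the added dropout layer
            )

        def forward(self, x):
            return self.block(x)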

Requirements

  • torch >= 0.4
  • torchvision >= 0.2.2
  • opencv-python
  • numpy >= 1.7.3
  • matplotlib
  • tqdm
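One way to install the dependencies above (assuming a pip-based environment; no pinned requirements file is implied here):

    pip install torch torchvision opencv-python numpy matplotlib tqdm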

Generating Synthetic Data

Set the total number of synthetic images to be generated (num_synthetic_imgs) and the percentage of training data (train_percentage) in config.py. Then run

python generate_synthetic_dataset.py

It will generate the synthetic data in a directory named data (this can be changed in config.py) in the root directory.
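As a rough sketch, the data-generation entries in config.py might look like the following; num_synthetic_imgs and train_percentage come from the description above, while the values and the data_dir name are illustrative assumptions:

    # config.py -- synthetic data settings (values are illustrative assumptions)
    num_synthetic_imgs = 1000    # total number of synthetic images to generate
    train_percentage   = 0.8     # fraction of the images used for training
    data_dir           = 'data'  # output directory (hypothetical name), created in the repo root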

Training

Set the desired values of lr, epochs, and batch_size in config.py.

Start Training

In config.py,

  • set resume to False
python train.py
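A hedged sketch of what the training entries in config.py might look like for a fresh run; resume, lr, epochs, and batch_size are named above, but the values shown are assumptions:

    # config.py -- training settings (values are illustrative assumptions)
    resume     = False   # start training from scratch
    lr         = 1e-3    # learning rate
    epochs     = 12      # number of training epochs
    batch_size = 16      # training batch size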

Resume Training

In config.py,

  • set resume to True and
  • set ckpt to the path of the model to be loaded, e.g. ckpt = 'model02.pth'
python train.py
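Correspondingly, resuming might look like this in config.py (the checkpoint path follows the example above):

    # config.py -- resume settings
    resume = True
    ckpt   = 'model02.pth'  # checkpoint to load before continuing training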

Losses

The model was trained for 12 epochs with the configuration specified in config.py. (Plot: loss after 12 epochs.)

Testing

In config.py,

  • set ckpt to the path of the model to be loaded, e.g. ckpt = 'model02.pth'
  • set test_dir to the path that contains the noisy images that you need to denoise ('data/val/noisy' by default)
  • set test_bs to the desired batch size for the test set (1 by default)
python test.py
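Putting the testing options together, config.py might contain something like the following, using the defaults mentioned above:

    # config.py -- testing settings (defaults as described above)
    ckpt     = 'model02.pth'     # trained model to load
    test_dir = 'data/val/noisy'  # directory with the noisy images to denoise
    test_bs  = 1                 # batch size for the test set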

Once testing is done, the results will be saved in a directory named results.

Results (Noisy (Top) and Denoised (Bottom) Image Pairs)

  • res01.png
  • res02.png
  • res03.png
  • res04.png
  • res05.png
