
Texture Networks + Instance normalization: Feed-forward Synthesis of Textures and Stylized Images

In our paper we describe a faster way to generate textures and stylize images. It requires learning a feed-forward generator with the loss function proposed by Gatys et al. Once the model is trained, a texture sample or stylized image of any size can be generated instantly.

Instance Normalization: The Missing Ingredient for Fast Stylization presents a better architectural design for the generator network. By switching from batch normalization to instance normalization we make the learning process easier, which results in much better quality.

Prerequisites

Download VGG-19.

cd data/pretrained && bash download_models.sh && cd ../..

Stylization

Training

Basic example:

th train.lua -data <path to any image dataset>  -style_image path/to/img.jpg

The image dataset should be structured as in fb.resnet.torch, with train and val folders each containing subfolders for classes, as if you were doing classification. You can create dummy folders train/dummy/ and val/dummy/ and store all of the images in them. Only images from the train folder are used; change the code or rename the folders if you want to use the val folder instead. Any dataset works, for example MS COCO or ImageNet (use the validation split if using ImageNet). See the directory sketch below.
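A minimal sketch of the expected layout, assuming a dataset root called my_dataset (the directory name and image paths are placeholders):

mkdir -p my_dataset/train/dummy my_dataset/val/dummy
cp path/to/your/images/*.jpg my_dataset/train/dummy/
th train.lua -data my_dataset -style_image path/to/img.jpg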

To reproduce the results from the paper you will need to experiment with -image_size, -style_size, -style_layers, -content_layers and -style_weight.

Do not hesitate to set batch_size to 1; just keep in mind that a larger batch_size allows a larger learning_rate. An example command combining these options is shown below.
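For example, a tuning run might look like the following. The values here are illustrative assumptions, and the exact spelling of the batch-size option is an assumption as well; check the option definitions at the top of train.lua for what your checkout exposes.

# Illustrative values only -- tune them for your style image and GPU memory
th train.lua -data my_dataset -style_image path/to/img.jpg \
  -image_size 512 -style_size 512 -style_weight 10 -batch_size 4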

Testing

th test.lua -input_image path/to/image.jpg -model data/checkpoints/model.t7

Experiment with -image_size here as well; see the example below.
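For example (the size value is an illustrative assumption):

th test.lua -input_image path/to/image.jpg -model data/checkpoints/model.t7 -image_size 1024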

You can find a pretrained model here. It is not the model from the paper.

Generating textures

Coming soon.

Hardware

  • The code was tested on a 12 GB NVIDIA Titan X GPU under Ubuntu 14.04.
  • You may decrease batch_size and image_size if the model does not fit in your GPU memory.
  • The pretrained models do not need much memory for sampling.

Credits

The code is based on Justin Johnson's great code for artistic style.

The work was supported by Yandex and Skoltech.
