BézierSketch: A generative model for scalable vector sketches

Paper · Supplementary


The study of neural generative models of human sketches is a fascinating contemporary modeling problem due to the links between sketch image generation and the human drawing process. The landmark SketchRNN provided a breakthrough by sequentially generating sketches as a sequence of waypoints. However, this leads to low-resolution image generation and failure to model long sketches. In this paper we present BézierSketch, a novel generative model for fully vector sketches that are automatically scalable and high-resolution. To this end, we first introduce a novel inverse graphics approach to stroke embedding that trains an encoder to embed each stroke to its best-fit Bézier curve. This enables us to treat sketches as short sequences of parameterized strokes, and thus to train a recurrent sketch generator with greater capacity for longer sketches, while producing scalable high-resolution results. We report qualitative and quantitative results on the Quick, Draw! benchmark.
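
To give intuition for the stroke-embedding idea, here is a minimal Python sketch of what "best-fit Bézier curve" means for a stroke: fitting the control points of a degree-n Bézier curve to a polyline by linear least squares, assuming a chord-length parameterization. This is an illustrative classical baseline, not the paper's learned encoder, and fit_bezier and its arguments are hypothetical names:

    import numpy as np
    from scipy.special import comb

    def fit_bezier(stroke, degree=3):
        """Least-squares fit of Bézier control points to a 2D stroke.

        stroke: (N, 2) array of waypoints; returns (degree+1, 2) control points.
        """
        # Chord-length parameterization: t in [0, 1], growing with cumulative arc length.
        seg = np.linalg.norm(np.diff(stroke, axis=0), axis=1)
        t = np.concatenate([[0.0], np.cumsum(seg)])
        t /= t[-1]
        # Bernstein design matrix: B[i, j] = C(n, j) * t_i^j * (1 - t_i)^(n - j).
        n = degree
        B = np.stack([comb(n, j) * t**j * (1 - t)**(n - j) for j in range(n + 1)], axis=1)
        # Solve B @ P ≈ stroke for the control points P in the least-squares sense.
        P, *_ = np.linalg.lstsq(B, stroke, rcond=None)
        return P

    # Usage: fit a cubic Bézier to a noisy quarter-circle stroke.
    theta = np.linspace(0.0, np.pi / 2, 50)
    stroke = np.stack([np.cos(theta), np.sin(theta)], axis=1) + 0.01 * np.random.randn(50, 2)
    ctrl = fit_bezier(stroke, degree=3)  # (4, 2) control points

The paper's inverse-graphics encoder instead learns to regress such curve parameters directly from the stroke, amortizing this per-stroke fitting step; the least-squares solve above only serves as the classical reference for the fitting objective.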

Slides for my ECCV '20 talk

PS: Reusing any of these slides would require permission from the author.

Full talk at ECCV 2020


Want to cite this paper?

@InProceedings{das2020bziersketch,
    title = {BézierSketch: A generative model for scalable vector sketches},
    author = {Ayan Das and Yongxin Yang and Timothy Hospedales and Tao Xiang and Yi-Zhe Song},
    booktitle = {The European Conference on Computer Vision (ECCV)},
    year = {2020}
}