By-Example Synthesis of Vector Textures

Christopher Palazzolo, Oliver van Kaick, David Mould
Code · PDF · DOI

Abstract

Teaser image

We propose a new method for synthesizing an arbitrarily sized novel vector texture from a single raster exemplar. In an analysis phase, our method first segments the exemplar to extract primary textons, secondary textons, and a palette of background colors. It then clusters the primary textons into categories based on visual similarity, and computes a descriptor that captures each texton's neighborhood and inter-category relationships. In the synthesis phase, our method first constructs a gradient field from a set of control points carrying colors from the background palette. Next, it places primary textons guided by the descriptors, so as to replicate texton contexts similar to those in the exemplar. The method also places secondary textons to complement the background detail. We compare our method to previous work using a wide range of perception-based metrics, and show that we can synthesize textures directly in vector format with quality comparable to methods based on raster image synthesis.
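To make the analysis phase more concrete, here is a minimal sketch of two of its steps: grouping primary textons into categories and building a per-texton neighborhood descriptor. The function names, the plain k-means clustering, and the radius-based category histogram are our illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def cluster_textons(features, k, iters=50, seed=0):
    """Group texton feature vectors into k categories (plain k-means sketch)."""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), k, replace=False)]
    for _ in range(iters):
        # Assign each texton to its nearest category center.
        dists = np.linalg.norm(features[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each center as the mean of its assigned textons.
        for c in range(k):
            if np.any(labels == c):
                centers[c] = features[labels == c].mean(axis=0)
    return labels

def neighborhood_descriptor(positions, labels, k, radius):
    """For each texton, histogram the categories of its neighbors
    within `radius`, normalized to sum to 1 (or all zeros if isolated)."""
    desc = np.zeros((len(positions), k))
    for i, p in enumerate(positions):
        dist = np.linalg.norm(positions - p, axis=1)
        for j in np.flatnonzero((dist > 0) & (dist <= radius)):
            desc[i, labels[j]] += 1
        total = desc[i].sum()
        if total > 0:
            desc[i] /= total
    return desc
```

During synthesis, descriptors like these could be compared between a candidate placement and the exemplar to score how well the candidate reproduces the exemplar's texton context.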

Vector Interpolation

A virtual 3D art gallery showcasing some of our results.

More coming soon

WIP

Citation

@article{PALAZZOLO2025104224,
	title = {Breaking art: Synthesizing abstract expressionism through image rearrangement},
	journal = {Computers \& Graphics},
	volume = {129},
	pages = {104224},
	year = {2025},
	issn = {0097-8493},
	doi = {10.1016/j.cag.2025.104224},
	url = {https://www.sciencedirect.com/science/article/pii/S0097849325000652},
	author = {Christopher Palazzolo and Oliver {van Kaick} and David Mould},
	keywords = {Abstract art synthesis, Image generation, Image segmentation}
}

Acknowledgements

We thank the anonymous reviewers for their valuable feedback. This work was partially supported by the Natural Sciences and Engineering Research Council of Canada (NSERC) and Carleton University. We would also like to thank the members of the Graphics, Imaging, and Games Lab (GIGL) as well as our friends outside the lab for their suggestions and comments.