Neural Inverse Knitting: From Images to Manufacturing Instructions

Alexandre Kaspar*, Tae-Hyun Oh*, Liane Makatura, Petr Kellnhofer, Jacqueline Aslarus, and Wojciech Matusik
* Equal contribution

Illustration of our new problem of automatic machine instruction generation from a single image. On the top-left is an example instruction map, which is knitted into the physical artifact shown on its right. We propose a machine learning pipeline that solves the inverse problem by leveraging synthetic renderings of the instruction maps.


Motivated by the mass-customization potential recently opened up by whole-garment knitting machines, we introduce the new problem of automatic machine instruction generation from a single image of the desired physical product, which we apply to machine knitting. We propose to tackle this problem by directly learning to synthesize regular machine instructions from real images. We create a curated dataset of real samples paired with their instruction counterparts, and propose a novel way to augment it with synthetic images. We theoretically motivate our data mixing framework and show empirical results suggesting that making real images look more synthetic is beneficial in our problem setup. We make our dataset and code publicly available for reproducibility and to motivate further research related to manufacturing and program synthesis.
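The data mixing idea above can be sketched in code. This is a minimal illustrative sketch, not the authors' implementation: the `refine` function (which maps a real photograph toward the synthetic rendering domain) and `mix_batch` are hypothetical names we introduce here, and the actual pipeline trains learned networks for both refinement and instruction prediction.

```python
import random

def mix_batch(real_images, synthetic_images, refine):
    """Build a training batch that mixes refined real images with
    synthetic renderings.

    `refine` is assumed to translate a real photograph toward the
    synthetic rendering domain, following the paper's finding that
    making real images look more synthetic helps the downstream
    instruction-prediction network.
    """
    batch = []
    for img in real_images:
        # Real photos are pushed toward the synthetic domain first.
        batch.append(refine(img))
    for img in synthetic_images:
        # Synthetic renderings of instruction maps are used as-is.
        batch.append(img)
    random.shuffle(batch)
    return batch
```

In the full system, each image in such a batch would be paired with its ground-truth instruction map, and the mixed batches would supervise the image-to-instructions network.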


In submission.