Implicit Neural Representations for Generative Modeling of Living Cell Shapes

Year of publication 2022
Type Article in Proceedings
Conference International Conference on Medical Image Computing and Computer Assisted Intervention
MU Faculty or unit

Faculty of Informatics

Keywords cell shape modeling; neural networks; implicit neural representations; signed distance function; generative model; interpolation
Description Methods allowing the synthesis of realistic cell shapes could help generate training data sets to improve cell tracking and segmentation in biomedical images. Deep generative models for cell shape synthesis require a light-weight and flexible representation of the cell shape. However, commonly used voxel-based representations are unsuitable for high-resolution shape synthesis, and polygon meshes have limitations when modeling topology changes such as cell growth or mitosis. In this work, we propose to use level sets of signed distance functions (SDFs) to represent cell shapes. We optimize a neural network as an implicit neural representation of the SDF value at any point in a 3D+time domain. The model is conditioned on a latent code, thus allowing the synthesis of new and unseen shape sequences. We validate our approach quantitatively and qualitatively on C. elegans cells that grow and divide, and lung cancer cells with growing complex filopodial protrusions. Our results show that shape descriptors of synthetic cells resemble those of real cells, and that our model is able to generate topologically plausible sequences of complex cell shapes in 3D+time.
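The core idea described above can be sketched in a few lines: an MLP takes a 3D+time coordinate together with a latent shape code and outputs a scalar signed-distance value, so that the cell surface at any time is the zero level set of the learned function. The sketch below uses randomly initialized weights and illustrative layer sizes (latent dimension, hidden width, and activation choice are all assumptions, not the authors' trained architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 8   # assumed latent-code size (illustrative)
HIDDEN = 32      # assumed hidden width (illustrative)

# Random weights standing in for trained network parameters.
W1 = rng.normal(scale=0.5, size=(4 + LATENT_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(scale=0.5, size=(HIDDEN, HIDDEN))
b2 = np.zeros(HIDDEN)
W3 = rng.normal(scale=0.5, size=(HIDDEN, 1))
b3 = np.zeros(1)

def sdf(points_4d, latent):
    """Predict SDF values for an (N, 4) array of (x, y, z, t) coordinates,
    conditioned on a single latent code shared across the whole sequence."""
    n = points_4d.shape[0]
    # Concatenate the coordinate with the (broadcast) latent code.
    inp = np.concatenate([points_4d, np.tile(latent, (n, 1))], axis=1)
    h = np.maximum(inp @ W1 + b1, 0.0)   # ReLU hidden layers
    h = np.maximum(h @ W2 + b2, 0.0)
    return (h @ W3 + b3).ravel()         # one signed distance per query point

# The cell surface at time t is the zero level set {p : sdf(p, code) = 0};
# sampling or interpolating latent codes yields new, unseen shape sequences.
code = rng.normal(size=LATENT_DIM)
pts = rng.uniform(-1.0, 1.0, size=(5, 4))   # five random (x, y, z, t) samples
print(sdf(pts, code).shape)                 # one SDF value per sample
```

Because the representation is a continuous function rather than a voxel grid or mesh, it can be evaluated at arbitrary resolution, and topology changes such as mitosis appear naturally as the zero level set splitting over time.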