Software developed for the film industry is being adapted to predict how surgery on a particular person’s face will alter their appearance after the operation.

The software, which models the effect of the different incisions surgeons can make, is designed to help minimise the disfigurement some patients can suffer after a major operation.

“The system allows the user to see the results of a particular wound closure and edit the cutting path to explore different options,” says Steve Pieper, a computer scientist at the Brigham and Women’s Hospital’s Surgical Planning Laboratory in Boston who helped create it.

When surgeons remove a facial tumour, for example, they have to cut the skin to create flaps that they can pull back to reveal the tissue below. Because skin can bunch or stretch unpredictably, and surgery can interfere with the muscles and other soft tissue, it is difficult to predict what will happen after the skin flaps are rejoined during the operation.

Unique structure

Pieper began tackling this problem more than 10 years ago with a program that predicted how a human face might look after surgery. But this used a generic face, and did not take account of the unique structure of soft tissue beneath the surface of different individuals’ skin.

Now Pieper, together with colleagues from Digital Elite in Los Angeles, a company that specialises in facial modelling for the film industry, has produced software that solves this problem by basing its calculations on data from MRI scans of the patient undergoing surgery.

The scans show the structure of the epidermis, the dermis and the subcutaneous fat, the three layers closest to the skin surface. This is combined with a 3D scan of the skin surface to give the external shape of the face.

These layers have an important effect on the way the face looks and on the forces the skin is put under when it is cut and as it knits back together. MRI scans can give a good indication of the dimensions and physical properties of these layers, such as their stiffness, which can be used to predict the effects surgery might have.

Real time

The new technique gives surgeons a way of creating a virtual model of the face that includes these layers and models their physical properties. Using the data from the MRI scan, the software employs a widespread technique known as “finite element modelling” to divide each layer into thousands of three-dimensional elements.

These elements interact with each other according to a set of equations designed to reproduce the behaviour of real tissue in each of the layers. The software starts by calculating the deformation or movement of the elements directly affected when an incision is made. From this it works out the effect on successive elements right across the face.
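The propagation described above can be illustrated with a deliberately simplified sketch: a one-dimensional strip of skin modelled as elements joined by springs, relaxed iteratively until the deformation from a simulated incision spreads to equilibrium. All the names and values here are invented for illustration; the actual software uses full 3D finite element meshes built from the MRI data, not this toy model.

```python
import numpy as np

# Toy 1-D stand-in for the finite-element idea: a strip of skin
# "elements" connected by springs. Stiffness and geometry are
# invented for illustration, not taken from the real system.
n = 20                       # elements across the strip
relax = 0.5                  # relaxation factor per iteration
disp = np.zeros(n)           # displacement of each element

# A simulated "incision" pulls two adjacent elements apart;
# the wound edges are held fixed, like skin flaps under tension.
disp[9], disp[10] = -1.0, 1.0
fixed = {9, 10}

# Iterative relaxation: each free element moves toward the average
# of its neighbours, so the deformation propagates element by
# element across the strip, as the article describes for the face.
for _ in range(2000):
    new = disp.copy()
    for i in range(1, n - 1):
        if i in fixed:
            continue
        new[i] = disp[i] + relax * ((disp[i - 1] + disp[i + 1]) / 2 - disp[i])
    disp = new

# Elements near the cut deform most; the effect decays with distance.
```

At equilibrium the displacement falls off linearly between the wound edges and the fixed ends of the strip, a crude analogue of how the real solver spreads the effect of an incision across neighbouring tissue elements.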

By using graphics cards designed for computer games to generate the facial images, the team has been able to render the results of these calculations much faster than with previous versions of the program. The extra speed allows surgeons to view the effect of their incisions in real time as the model head rotates. Pieper plans to present his work at the Electronic Imaging Conference, to be held later this month in San Jose, California.

Court Cutting, a plastic surgeon at New York University who specialises in repairing cleft palates, is impressed by the team’s attempt, but says he will only use the model when it incorporates bone and muscle.

Pieper himself points out that his model does not yet take account of the differing elasticity of old and young skin, nor after-effects such as sagging skin that happen slowly. He plans to work on these features in future versions of the software.