A heart transplant recipient’s survival depends on dozens of variables, including the weight, gender, age, and blood type of both donor and recipient, as well as the ischemic time: the period during the transplant when the organ receives no blood flow.
To better understand transplant risk factors and improve patient outcomes, researchers at Lund University and Skåne University Hospital in Sweden use artificial neural networks (ANNs) to explore the complex nonlinear relationships among multiple variables. The ANN models are trained using donor and recipient data obtained from two global databases: the International Society for Heart and Lung Transplantation (ISHLT) registry and the Nordic Thoracic Transplantation Database (NTTD). The Lund researchers accelerated the training and simulation of their ANNs by using MATLAB®, Deep Learning Toolbox™, and MathWorks parallel computing products.
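In the shallow-network workflow that Deep Learning Toolbox provides, a model of this kind can be set up in a few lines. The sketch below is illustrative only, not the Lund group's actual code: the X and y placeholders stand in for registry-derived predictors (donor and recipient age, weight, blood-type match, ischemic time) and survival labels, and the single hidden layer of 10 neurons is an assumed configuration, not one reported by the researchers.

    % Minimal sketch with placeholder data, not the published model.
    % X holds one sample per column (8 hypothetical predictors x 500
    % samples); y holds binary survival labels.
    X = rand(8, 500);                    % placeholder registry predictors
    y = double(rand(1, 500) > 0.5);      % placeholder survival outcomes

    net = patternnet(10);                % one hidden layer, 10 neurons (assumed)
    net.divideParam.trainRatio = 0.70;   % train/validation/test split
    net.divideParam.valRatio   = 0.15;
    net.divideParam.testRatio  = 0.15;

    [net, tr] = train(net, X, y);        % backpropagation training
    yPred = net(X(:, tr.testInd));       % predictions on the held-out test set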
“Many of the techniques we use are computer-intensive and time-consuming,” says Dr. Johan Nilsson, Associate Professor in the Division of Cardiothoracic Surgery at Lund University. “We used Parallel Computing Toolbox with MATLAB Parallel Server to distribute the work on a 56-processor cluster. This enabled us to rapidly identify an optimal neural network configuration using MATLAB and Deep Learning Toolbox, train the network using data from the transplantation databases, and then run simulations to analyze risk factors and survival rates.”
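A configuration search of the kind Nilsson describes is straightforward to parallelize with Parallel Computing Toolbox. The sketch below is again a hedged illustration rather than the group's code: it evaluates an assumed range of hidden-layer sizes in a parfor loop, and 'MyClusterProfile' is a hypothetical MATLAB Parallel Server cluster profile name.

    % Illustrative sketch: parallel search over candidate network
    % configurations. Assumes X and y as in the previous sketch.
    if isempty(gcp('nocreate'))
        parpool('local');                % or parpool('MyClusterProfile')
    end

    hiddenSizes = 2:2:20;                % candidate sizes (assumed range)
    valErr = zeros(size(hiddenSizes));

    parfor k = 1:numel(hiddenSizes)
        net = patternnet(hiddenSizes(k));
        net.trainParam.showWindow = false;  % no training GUI on workers
        [net, tr] = train(net, X, y);
        valErr(k) = tr.best_vperf;          % validation error at best epoch
    end

    [~, best] = min(valErr);
    fprintf('Best configuration: %d hidden neurons\n', hiddenSizes(best));

For a single large network, train also accepts the 'useParallel','yes' option to spread one training run across workers; the loop above instead parallelizes across candidate configurations, which matches the search for an optimal network topology described in the quote.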