Computing a Cancer Cure: 7 Ways Supercomputers Help Scientists Understand and Treat the Disease

01/04/2018 11:36 am ET Updated Jan 05, 2018

There’s a more than 40 percent chance that you will be diagnosed with cancer at some point in your lifetime, and a one in five chance that it will be terminal.

This isn’t meant to depress you; it’s simply a fact of life in 21st-century America. Cancer, the second-leading cause of death in the U.S. after heart disease, kills more than 500,000 people per year, including about 2,000 children.

In 1971, President Richard Nixon signed the National Cancer Act, saying: “The time has come in America when the same kind of concentrated effort that split the atom and took man to the moon should be turned toward conquering this dread disease. Let us make a total national commitment to achieve this goal.”

And yet cases continued to rise.

In 2016, then Vice President Joe Biden made a similar declaration when he launched the Cancer Moonshot: “I know that we can help solidify a genuine global commitment to end cancer as we know it today — and inspire a new generation of scientists to pursue new discoveries and the bounds of human endeavor.”

Progress is occurring. Developments in diagnostics, medical imaging, treatments and basic knowledge are making a real impact on cancer fatalities.

“The mortality rate in cancer has been declining every year since about 2000, so we are doing something right,” said Warren Kibbe, former acting deputy director of the National Cancer Institute. “But it’s clear we need to understand more about basic biology.”

A host of questions remain unanswered, both about cancer’s underlying causes and about the best way to fight the disease.

To address these questions, scientists frequently turn to supercomputers — possibly the most technologically advanced, general-purpose, scientific instruments ever developed.

“Supercomputers are key to the Cancer Moonshot,” wrote then U.S. Secretary of Energy Ernest Moniz. “These exceptionally high-powered machines have the potential to greatly accelerate the development of cancer therapies by finding patterns in massive datasets too large for human analysis.”

The Texas Advanced Computing Center (TACC) at The University of Texas at Austin, which designs, builds and hosts several of the most powerful supercomputers in the world, helps the nation’s cancer researchers explore problems that they couldn’t otherwise tackle.

If scientists are the rocket in the cancer moonshot, computing power is the rocket fuel.

What follows are 7 ways supercomputers help oncologists, surgeons and computer scientists improve our fundamental understanding of cancer and our methods for diagnosing and treating the disease.

1. Chemotherapy and Drug Design

Özlem Demir, University of California, San Diego
The model of full-length tumor protein 53 (p53) bound to DNA. p53 plays a crucial role in multicellular organisms, conserving the stability of DNA by preventing mutations and thereby acting as a tumor suppressor. However, in approximately 50 percent of all human cancers, p53 is mutated and rendered inactive. Therefore, reactivation of mutant p53 using small molecules has been a long-sought-after anticancer therapeutic strategy. Here, the surface of each p53 monomer is depicted with a different color.

New drugs cost billions of dollars to develop and take decades to move into the marketplace, but supercomputers can help researchers identify promising new chemotherapeutic drugs faster and with less investment.

For instance, Shuxing Zhang, associate professor of experimental therapeutics at MD Anderson Cancer Center, used TACC’s Lonestar5 supercomputer to virtually screen more than 1,400 Food and Drug Administration-approved small molecule drugs to determine which had the features needed to bind and inhibit TNIK — an enzyme that plays a key role in cell signaling related to colon cancer.

He found that mebendazole, a drug that had been approved by the FDA to fight parasites, could effectively bind to TNIK and inhibit its activity.
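At its core, a virtual screen like this scores every compound in a library against the target and ranks the results. The sketch below shows only the shape of that final ranking step, with invented scores; the actual screen computed binding predictions for more than 1,400 compounds with docking software on Lonestar5.

```python
# Hypothetical sketch of the ranking step in a virtual screen: order a
# library of compounds by a stand-in docking score against a target,
# where a lower (more negative) score means tighter predicted binding.

def rank_candidates(docking_scores, top_n=3):
    """Return the top_n compound names with the most favorable (lowest) scores."""
    return sorted(docking_scores, key=docking_scores.get)[:top_n]

# Illustrative, made-up scores (kcal/mol); real screens compute these
# values with docking software across thousands of compounds.
scores = {
    "mebendazole": -9.2,
    "compound_A": -6.1,
    "compound_B": -7.8,
    "compound_C": -5.4,
}

print(rank_candidates(scores, top_n=2))  # most promising binders first
```

The payoff of screening already-approved drugs is that a hit like mebendazole has known safety data, shortening the path to the clinic.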

"Such advantages render the possibility of quickly translating the discovery into a clinical setting for cancer treatment in the near future," Zhang says.

Other teams use TACC supercomputers to model cancer-related proteins at the atomic level and study the evolutionary history of chemotherapeutic plants.

2. Immunotherapy

NIAID
Scanning electron micrograph of a human T lymphocyte (also called a T cell) from the immune system of a healthy donor. Immunotherapy fights cancer by supercharging the immune system's natural defenses (including T cells) or contributing additional immune elements that can help the body kill cancer cells.

Immunotherapy supercharges the body’s natural defenses to fight cancer. But not every immune therapy works the same on every patient. Differences in an individual's immune system may mean one treatment is more appropriate than another. Furthermore, tweaking a patient’s system through supplemental means might heighten the efficacy of certain treatments.

Researchers from Wake Forest School of Medicine and Zhejiang University in China developed a new mathematical model to represent the interactions between prostate tumors and common immunotherapies. They then incorporated data from animal studies into their models and used the Stampede supercomputer at TACC — a National Science Foundation (NSF)-funded system that was one of the fastest in the world — to perform millions of simulations to determine tumor responses to the treatments.

The researchers found that the depletion of T cells and the neutralization of the signaling protein interleukin-2 can have a stronger effect when combined with androgen deprivation therapy and vaccines.
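The general approach behind such models is to write the tumor and immune populations as coupled differential equations and step them forward in time, once per candidate treatment scenario. The toy below is not the Wake Forest/Zhejiang model; every rate constant is invented purely to show the mechanics.

```python
# A toy tumor-immune simulation: tumor cells T grow logistically and are
# killed by effector immune cells E, which are recruited at a constant
# rate and decay naturally. Simple forward-Euler time stepping.

def simulate(days, dt=0.01, r=0.3, K=1e9, k=1e-9, s=1e4, d=0.1):
    T, E = 1e6, 1e5          # initial tumor cells, immune cells (invented)
    for _ in range(int(days / dt)):
        dT = r * T * (1 - T / K) - k * T * E   # logistic growth minus immune kill
        dE = s - d * E                          # constant recruitment, natural decay
        T += dT * dt
        E += dE * dt
    return T, E

tumor, immune = simulate(days=30)
```

Running millions of such simulations, each with different parameters standing in for a patient or treatment combination, is exactly the kind of embarrassingly parallel workload a system like Stampede handles well.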

Says lead researcher Xiaobo Zhou: “TACC provides an important assistance for discovering clinically meaningful and actionable knowledge across highly heterogeneous biomedical big data sets.”

Other researchers are using TACC supercomputers to design better clinical trials and analyze genetic data related to immune proteins.

3. Radiation and Proton Therapy

Mayo Clinic, Hitachi
Proton beam therapy expands the cancer care capabilities of hospitals and clinics. In properly selected patients — especially children and young adults and those with cancers located close to critical organs and body structures — proton beam therapy is an advance over traditional radiotherapy. The proton beam facility at Mayo Clinic in Phoenix/Scottsdale, Arizona, opened in March 2016.

X-ray radiation is the most frequently used form of high-energy treatment, but an emerging treatment uses a beam of protons to destroy cancer cells, with minimal damage to surrounding tissues.

The pinpoint accuracy required by the proton beam also means that the device must be precisely calibrated and that human error (and even the patient’s breathing) must be taken into account.

Wei Liu, a researcher at Mayo Clinic, used TACC supercomputers to develop a model for treatment planning that is more accurate and better at sparing organs than current methods.

With more than 25,000 variables, generating a plan that can deal with possible errors and still deliver the proper dose to the tumor is a big problem.

"It's very computationally expensive to generate a plan in a reasonable timeframe,” Liu says. “Without a supercomputer, we can do nothing."
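Stripped of its scale, treatment planning is a weighted least-squares problem: choose beam intensities so the delivered dose matches the prescription at tumor voxels while staying low everywhere else. The sketch below shrinks Liu's 25,000-variable problem to three beams and four voxels, with all numbers invented, just to show the structure.

```python
# Toy treatment-planning sketch: pick nonnegative beam weights so the
# delivered dose matches the prescription at tumor voxels while sparing
# a nearby organ at risk. dose[i][j] is the dose voxel i receives per
# unit weight of beam j (all values invented for illustration).

dose = [
    [1.0, 0.2, 0.4],   # tumor voxel
    [0.8, 0.9, 0.3],   # tumor voxel
    [0.1, 0.1, 0.9],   # organ-at-risk voxel
    [0.0, 0.2, 0.1],   # healthy-tissue voxel
]
target = [1.0, 1.0, 0.0, 0.0]   # prescribed dose per voxel
penalty = [1.0, 1.0, 5.0, 1.0]  # organ-at-risk errors weighted heavily

def objective(weights):
    """Weighted squared error between delivered and prescribed dose."""
    total = 0.0
    for i, row in enumerate(dose):
        delivered = sum(a * w for a, w in zip(row, weights))
        total += penalty[i] * (delivered - target[i]) ** 2
    return total

# Crude coordinate-descent search over beam weights.
weights = [0.0, 0.0, 0.0]
for _ in range(200):
    for j in range(3):
        best = min((objective(weights[:j] + [w] + weights[j + 1:]), w)
                   for w in (weights[j] - 0.05, weights[j], weights[j] + 0.05)
                   if w >= 0)
        weights[j] = best[1]
```

At clinical scale, with tens of thousands of variables plus robustness terms for setup error and breathing motion, this optimization is what demands a supercomputer.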

4. Surgery

David Fuentes, MD Anderson Cancer Center
A volumetric finite element mesh is generated by intersecting a structured grid with the 3D surface as the boundary. The method is used to generate tumor models for more precise surgeries.

Surgery remains the most frequently used approach to treat cancer, but it is not without its complications. Removing too little of a tumor can lead to a relapse; too much — especially in a critical area like the brain — can harm the patient.

A pioneering project by researchers at The University of Texas at Austin (UT Austin) and MD Anderson Cancer Center used TACC’s advanced computing resources to perform minimally invasive laser treatment on a canine tumor without the intervention of a surgeon.

"The more data and images that can be acquired, the more confidence researchers and surgeons can have in planning surgical simulations," said David Fuentes, who leads the research.

Working in collaboration with Rice University scientists and Medtronic, Fuentes and his team are adapting the methods they developed for supercomputers into a portable system for operating rooms.

5. Genomics

Amelia Weber Hall, Iyer lab
A heat map showing differences in gene expression between primary tumors and cultured cell lines. Each row is a gene and each column is a tumor or cell sample. In the heat map, red indicates high expression and blue indicates low expression. NHA refers to normal human astrocytes, a star-shaped glial cell of the central nervous system.

The human genome consists of three billion base pairs, so identifying a single mutation by sight simply isn’t possible. Computers, on the other hand, are great at finding patterns in massive datasets and have been a boon to cancer researchers.
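The elementary operation underneath that pattern-finding is easy to show: line a sampled sequence up against a reference and report every position where the letters differ. The 40-base strings below are purely illustrative; real variant calling works across billions of bases, with sequencing-quality scores and statistical models layered on top.

```python
# Schematic variant finder: compare a toy sample sequence to a reference
# and list every (position, reference_base, sample_base) mismatch.

def find_variants(reference, sample):
    return [(i, ref, alt)
            for i, (ref, alt) in enumerate(zip(reference, sample))
            if ref != alt]

reference = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAGT"
sample    = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGACCGATAGT"

print(find_variants(reference, sample))  # a single C-to-A point mutation
```

Scaling this comparison to three billion base pairs per genome, across thousands of tumors in The Cancer Genome Atlas, is what turns it into a supercomputing problem.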

Researchers from the University of Texas at Austin and the National Cancer Institute are using TACC’s Stampede supercomputer to mine reams of data from The Cancer Genome Atlas to identify genetic variants and patient subtypes.

This has allowed them to identify a specific mutation in the protein forkhead box P1 (FOXP1), associated with an aggressive type of lymphoma that often correlates to poor therapeutic outcomes.

“This knowledge can be helpful in the development of more targeted therapies that seek to eliminate cancer at its origin,” says Vishy Iyer, one of the researchers on the project.

Genomic research using TACC’s supercomputers has also helped researchers identify cancer risk factors and classify how patients will respond to different types of treatments.

6. Patient-Specific Treatments

Lima et al. 2017, Hormuth et al. 2015
Model of tumor growth in a rat brain before radiation treatment (left) and after one session of radiotherapy (right). The different colors represent tumor cell concentration, with red being the highest. The treatment reduced the tumor mass substantially.

While some researchers use data mining to advance their understanding and treatment of cancer, scientists at the Center for Computational Oncology at UT Austin are mathematizing cancer to predict how cancer will progress in a specific individual.

Each factor involved in the tumor response — from how fast chemotherapeutic drugs reach the tumor to the degree to which cells signal each other to grow — is characterized by a mathematical equation that captures its essence. The researchers then use supercomputers to combine the formulas with specific data from patients.

“If you have a model that can recapitulate how tumors grow and respond to therapy, then it becomes a classic engineering optimization problem. ‘I have this much drug and this much time. What’s the best way to give it to minimize the number of tumor cells for the longest amount of time?’” says Thomas Yankeelov, head of the Center for Computational Oncology and director of Cancer Imaging Research in the LIVESTRONG Cancer Institutes of the Dell Medical School.
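Yankeelov's framing can be made concrete with a deliberately tiny example: fix a total drug budget, enumerate simple dosing schedules, and keep the one that leaves the fewest tumor cells. The growth and kill rates below are toy stand-ins, not the center's actual model.

```python
# Toy dosing-schedule optimization: given a fixed total drug budget over
# a few days, brute-force search for the per-day schedule that minimizes
# the final tumor burden. All rates are invented for illustration.

import itertools

def tumor_after(schedule, T0=1e6, growth=0.2, kill=0.4):
    """Simulate daily tumor size under a per-day dose schedule."""
    T = T0
    for dose in schedule:
        T *= (1 + growth)                # untreated growth each day
        T *= max(0.0, 1 - kill * dose)   # fractional kill from that day's dose
    return T

def best_schedule(total_dose=2.0, days=4, step=0.5):
    doses = [i * step for i in range(int(total_dose / step) + 1)]
    best = None
    for sched in itertools.product(doses, repeat=days):
        if abs(sum(sched) - total_dose) > 1e-9:
            continue  # respect the fixed drug budget
        final = tumor_after(sched)
        if best is None or final < best[1]:
            best = (sched, final)
    return best

schedule, remaining = best_schedule()
```

Brute force works here only because the example is four days and five dose levels; realistic patient-calibrated models need the numerical optimization machinery of a supercomputer, which is the point of the quote above.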

At Vanderbilt University, Yankeelov’s previous institution, his group was able to predict with 87 percent accuracy whether a breast cancer patient was going to respond to therapy. The group recently began a clinical study in Austin, Texas, to predict, after one treatment, how an individual’s cancer will progress. They will then use that prediction to plan the future course of treatment.

7. Cancer Diagnostics

An important factor in fighting cancer is the speed at which the disease can be diagnosed. In the future, it may be possible to diagnose cancer much earlier using more sensitive body scans, new types of DNA tests, and even nano-sensors working in the bloodstream.

Experimenting with these techniques in cancer patients (or healthy individuals) is risky. But scientists can test these techniques virtually using supercomputers to simulate the interactions of tissues and technology.

One promising diagnostic device scientists are studying is the nanopore — a tiny hole in a very thin membrane that can sequence DNA inside the body and detect signs of cancer as DNA molecules pass through.

Aleksei Aksimentiev, a professor of biological physics at the University of Illinois, Urbana-Champaign, has been using TACC’s supercomputers to engineer nanocarriers that can be used to detect and capture DNA molecules and force them through a nanopore.

“These new nano-mechanisms can guide the design of a new generation of nanopore sensors for genetic marker-based cancer diagnostics, which we believe will play an important role in precision oncology," said Li-Qun (Andrew) Gu, a collaborator on the project from the University of Missouri.

Bonus: Artificial Intelligence on the Case

Daniel Lobo, University of Maryland Baltimore County
A computational method reverse-engineered a signaling network able to recapitulate the level of conversion stochasticity of a series of pharmacological experiments.

A final, and truly radical, way that researchers are using high-performance computing for cancer research is through the application of machine and deep learning.

Daniel Lobo, an assistant professor of biology and computer science at the University of Maryland, Baltimore County, is using TACC’s systems to uncover the complex cellular communication networks that underlie cancer, and to design methods to disrupt them.

Lobo and collaborators recently uncovered the cellular control networks for pigmentation in tadpoles and reverse-engineered never-before-seen coloration. He is applying the method to cancer to find interventions that might stop metastasis in its tracks without damaging other cells.

“Traditional approaches like chemotherapy attack the cells that grow the most, but leaves cells that are signaling others to grow and that may be the most important,” Lobo says. “We’re using machine learning to find out the communication networks between these cells and hopefully to discover a treatment that can cause the tumor to collapse.”

Other researchers recently used TACC supercomputers and machine learning to distinguish between tumor types and differentiate benign from malignant tissues with nearly 90 percent accuracy, comparable to human radiologists.

“Getting a true understanding, given the complexity of the information, without some assistance from machine learning, is probably hopeless,” said Michael Levin, Lobo’s collaborator. “I think it’s inevitable that we use machine learning to enrich scientific and biomedical discovery.”

***

To learn more about the role of supercomputing in understanding cancer and advancing cancer treatments, explore the TACC Special Report.


