And it is here that Willcox introduces a concept central to her own research: the digital twin. One of the first examples of a digital twin is widely considered to have been developed by NASA during the Apollo program. During the Apollo 13 mission to the moon, when the spacecraft suffered a serious malfunction, NASA mission controllers fed data from the physically crippled spacecraft into simulators on the ground, rapidly adapting and modifying them to match the evolving conditions aboard the real spacecraft. They then used those data-adapted simulators to design the strategies that safely brought the astronauts home.
But current digital twin technology is almost unrecognizable compared with the modeling technique NASA first developed.
Willcox, a professor of aerospace engineering at UT Austin, has made digital twins a core focus of her research and has even built a small unmanned aircraft along with its corresponding virtual model.
“The digital twin [of the plane] is not a static representation. Instead, it is a living evolving representation. We can collect data from the sensing capabilities we have onboard the airplane. We can collect data from inspections made after each flight. And we can assimilate these data to update the digital twin, so that as my airplane ages and degrades and gets damaged and repaired over its lifetime, the digital twin is following it along.”
This is data assimilation writ large. To ensure the reliability and accuracy of a predictive tool for a system like an aircraft, where so much is at stake, you need a broad set of mathematical and computing skills. Enter computational science, a field that enables the combination of “predictive physics-based models with powerful machine learning methods, scalable computational methods to achieve data assimilation and decision making, and high performance computing,” said Willcox.
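The core idea behind the data assimilation Willcox describes can be sketched, in highly simplified form, as a Kalman-style update: a physics-based model predicts a quantity, a noisy sensor measures it, and the two are blended in proportion to their uncertainties. The numbers below are purely illustrative assumptions, not values from any real digital twin.

```python
# Minimal sketch of data assimilation: a scalar Kalman-filter update that
# blends a model prediction with a noisy sensor measurement.

def assimilate(x_pred, p_pred, z, r):
    """Combine prediction x_pred (variance p_pred) with measurement z (variance r)."""
    k = p_pred / (p_pred + r)          # Kalman gain: how much to trust the data
    x_new = x_pred + k * (z - x_pred)  # corrected state estimate
    p_new = (1.0 - k) * p_pred         # uncertainty shrinks after the update
    return x_new, p_new

# Hypothetical example: the model predicts a wing-strain value of 1.00
# (variance 0.04); a sensor reads 1.20 (variance 0.01).
x, p = assimilate(1.00, 0.04, 1.20, 0.01)
print(round(x, 2), round(p, 3))  # estimate moves toward the data: 1.16 0.008
```

Because the sensor here is more certain than the model, the gain is large and the updated estimate sits close to the measurement; with a noisier sensor, the model prediction would dominate instead.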
Now digital twin technology has moved beyond aerospace engineering to impact many other engineering disciplines, as well as applications across science and society.
Digital twins of buildings are used to improve energy efficiency. Bridges and other civil infrastructure have digital twins that enable virtual health monitoring and predictive maintenance. Digital twins of wind farms are optimizing efficiency and minimizing downtime.
“Across the natural world, we are seeing interest in digital twins of oil reservoirs, farms, forests, coastal areas, ice sheets, and even talk of a digital twin for all of planet earth – all to help guide sustainable decision-making.”
Even medical research is beginning to use digital twins to advance assessment, diagnosis, personalized treatment, and in silico drug testing.
Time to take a deep breath, though: this technology is highly complex and not quite ready to roll out just yet.
Notwithstanding the unprecedented amounts now being collected, data on its own is akin to a library containing millions of books with no titles.
Not only that, but data in many areas of interest remains sparse and at times difficult to interpret. “We talk about ‘big data’ and indeed we often have a large amount of data. But in most engineering, scientific and medical applications, the data are in fact very sparse. What’s more, the data are indirect and noisy. As an engineer, I can almost never actually measure what it is I want to know. I cannot cut open the aircraft wing to measure its health. Instead, I have a handful of sensors on the wing’s surface. And from these sensors I try to infer what is going on inside the wing.”
Similarly, a medical practitioner is almost always working with sparse, noisy and indirect measurements, trying to figure out what might be going on inside an organ they certainly cannot observe directly.
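The inference problem Willcox describes, recovering a hidden internal state from a handful of indirect surface sensors, is an inverse problem. A minimal sketch, with an entirely made-up linear forward model and made-up sensor values, shows why regularization is needed: with fewer sensors than unknowns, many internal states fit the data, and regularized least squares picks a plausible one.

```python
import numpy as np

# Hypothetical inverse problem: 3 unknown internal parameters, only 2 noisy
# surface sensors. The forward model H (internal state -> sensor readings)
# and all values are invented for illustration.
rng = np.random.default_rng(0)
x_true = np.array([1.0, 0.5, -0.2])             # internal state we cannot measure
H = np.array([[1.0, 0.4, 0.1],
              [0.2, 1.0, 0.5]])
z = H @ x_true + 0.01 * rng.standard_normal(2)  # sparse, indirect, noisy data

# Underdetermined system: Tikhonov-regularized least squares chooses the
# smallest-norm state consistent with the measurements.
lam = 0.1
x_est = np.linalg.solve(H.T @ H + lam * np.eye(3), H.T @ z)
print(x_est)
```

The estimate reproduces the sensor readings well even though the individual internal parameters are only loosely constrained, which is exactly the gap a physics-based digital twin helps close by supplying prior structure the data alone cannot.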
Modeling the multiscale nature of complex systems is another challenge Willcox highlighted. “On my airplane wing, changes in the microscopic properties of the material can have big effects at the full scale of the vehicle. In medical applications, we know that phenomena at the cell and molecular level interact across scales to have effects at the scale of a human being. And even with today’s supercomputers, models that would resolve all those scales are just too expensive.” This is where the research at the Oden Institute comes in, advancing these frontiers through a combination of improved modeling, advanced algorithms, scientific machine learning, and high performance computing.
Bringing the spirit of TED Talks to local communities, TEDx Talks are organized independently under a free license granted by TED. This grassroots initiative of local, self-organized events (where x = independently organized) has a mission to spread new ideas and connect communities eager to grow their impact on the world. The student-led TEDxUTAustin Conference, held March 5, 2022, has been running for five successful years at the Forty Acres. This year’s theme, “Blueprints,” revolved around the concept that “while Blueprints may appear fixed or rigid, with sharp angles and precise lines running across the page, in reality these are dynamic frameworks upon which masterpieces are derived.”
A video of Karen Willcox’s talk will be available to view in full on the Oden Institute website very soon.