ECP Industry Council Chair, Michael McQuade Addresses Audience at ECP Annual Meeting

Remarks Presented at the First ECP Annual Meeting

Dr. J. Michael McQuade

ECP Industry Council Chair

The Exascale Computing Project

Knoxville, TN

February 1, 2017

 

(Transcript of Discussion)

Why Are We Here?

Thank you very much for that very gracious introduction, and thank you for the chance to be part of tonight’s dinner.

And most importantly, I want to thank everyone here for the commitment you have all made to our Nation’s Exascale Computing Project. I hope in my remarks to explain why I think this project is critical for us right now.

First let me give you a little plan for my talk, which I’ve given the working title, “Why Are We Here?”

I’ll give a brief summary of something you all know already, the reasons and rationale we give for why a project like the ECP is important. I then want to propose that there is something more fundamental at work here, and that will lead to some remarks about facts and empiricism. I hope in today’s charged atmosphere that teaser will at least keep you awake for a while.

I’ll then make a few remarks about our own experience at UTC on how we integrate the leading edge of high performance computing into the way we bring new technology to our customers. Finally, I will come back to the subject of investments we make as a society to achieve broader national objectives. And, I’ll try to leave a little time at the end for dialogue.

I ask the question, “Why Are We Here?” specifically about the Exascale Computing Project, but I want to ask it more generally of what we do when we invest time and dollars in the pursuit of science. I’m not asking why you are at this meeting or at this dinner. What I’m really asking is, “Why are we creating exascale competency and capability?”

Many in this room have the legitimate answers and rationale we have used and continue to use to explain the need for the project and advocate for the funding necessary to accomplish it. We give answers such as pushing the boundaries of computation to tackle ever more complex problems in science and national security; creating the next breakthroughs in hardware, software and communications systems technology, progress that is necessary to drive exascale computing but also valuable in its own right as a catalyst for innovation; and taking the next steps in a process that has created economic growth for decades as we leverage the developments and successes of the leading edge of computing into systems that are economically delivered to the private sector. As my colleagues at the Council on Competitiveness, the organization that sponsors our HPC Advisory Committee, have said consistently for several years now, “To out-compute is to out-compete.”

It’s a big investment we are making here and the many reasons are mutually supporting. We are doing so at a point in time that is both auspicious and concerning. Without the concerted effort of the hundreds of women and men in this room, and the thousands not here who will also be critical to the success of the ECP, we will lose momentum in basic science, in national security and in the application of leadership computing to our global innovation and competitiveness. As the Secretary of Energy Advisory Board noted in its 2014 report:

The historical NNSA mission (simulation for stewardship), multiple industrial applications (e.g., oil and gas exploration and production, aerospace engineering, and medicinal chemistry (pharmaceuticals, protein structure, etc.)) and basic science all have applications that demonstrate real need and real deliverables from a significant performance increase in classical high performance computing at several orders of magnitude beyond the tens of petaflop performance delivered by today’s leadership machines.

Now, let me take just a few minutes to provide some perspective from United Technologies. I would like to give you a little background on how we see, and operationalize, this computing landscape across our company.

Many of you know us well, but for those who don’t, we’re a roughly $58B company pretty much equally weighted between the aerospace and the built infrastructure industries. We provide commercial and military jet engines and a very large percentage of everything else that goes on a plane – landing gear, wheels and brakes, electrical generation and distribution, air management, etc. We also make elevators and HVAC systems to allow people to live comfortably in high-rise and single-family buildings, especially in hot climates. And, in a technology-related business, we are the global leader in transportation refrigeration.

If you will allow me, I would like to go off-subject to speak about this last business for just a couple of minutes. I can find lots of ways to relate this slight riff to the needs for high performance modeling and simulation and the creation of new materials, but mostly I just want to use this pulpit to speak about an issue that deserves more visibility. It’s something about which I think everyone should know.

Each year we grow more than enough food to feed the world’s population, but still one billion people go to bed hungry every night. One-third or more of food never makes it from the farm to our fork. Insufficient methods of transport and storage are the primary cause in the developing world, while in the developed world waste is most often due to consumer preference, the disposal of excess food in our refrigerators or on our plates or food we deem undesirable for some reason, such as appearance.

The consequences of food waste are direct – as I just said, nearly one-seventh of the world’s population goes to bed hungry each night. But there is also another way of looking at this problem. Wasted food consumes energy, and the use of fossil-based energy generates carbon dioxide. Recent estimates by the UN Food and Agriculture Organization put the CO2 equivalent emissions generated to produce food that goes to waste at almost 3.5 gigatonnes annually. To put that into stark perspective, if we thought of food waste as a country, it would be the third largest emitter behind the United States and China.
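A quick back-of-the-envelope check makes the comparison concrete. The country figures below are rough, illustrative values (in gigatonnes of CO2-equivalent per year) drawn from public estimates circa the mid-2010s; they are assumptions for this sketch, not numbers from the talk:

```python
# Sanity check of the "food waste as a country" comparison.
# Emissions values are rough mid-2010s estimates (Gt CO2-eq/yr),
# included here only to illustrate the ranking.
annual_emissions = {
    "China": 10.5,
    "United States": 5.3,
    "Food waste (FAO estimate)": 3.5,
    "India": 2.5,
    "Russia": 1.7,
}

# Rank emitters from largest to smallest.
ranking = sorted(annual_emissions.items(), key=lambda kv: kv[1], reverse=True)
for rank, (name, gt) in enumerate(ranking, start=1):
    print(f"{rank}. {name}: {gt} Gt CO2-eq")
```

Under these assumed figures, food waste lands in third place, behind only China and the United States.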

UTC is working to improve transport refrigeration to help address the problem in the developing world. Design of effective cold-chain systems is accomplished through thermodynamic and component modeling, which relies on sufficient computational resources. By addressing this important issue, we will reduce not only world hunger but also the CO2 emissions associated with food waste.

OK, back to our main subject:

UTC’s businesses are global, and we provide solutions driven by megatrends that are impacting, and will continue to impact, our world for decades to come – urbanization, the growth of the middle class, and global mobility driving air travel.

Around the world, more than one million people a week are moving to cities. That’s the equivalent of two cities the size of Tokyo each year. By 2030, the world’s urban population will grow to 5 billion, with China and the rest of Asia leading the way.

Urbanization is also lifting people out of poverty and creating a rapidly expanding new middle class. By 2030, nearly 60 percent of the world’s population will be members of the middle class. That’s about 5 billion people.

Some economists estimate that middle-class consumption will more than double from $27 trillion today to $55 trillion by 2030.

To accommodate all these new residents, cities are investing in massive infrastructure projects: high-rise housing and office buildings, schools, hospitals, subway systems and airports.

The increase in the middle class will result in a sharp increase in air travel. Today, less than 20% of the world’s population has flown somewhere. That means more than five-and-a-half billion people have never been on an airplane.

By 2030, the number of commercial passengers will more than double – from 2.9 billion to 6.7 billion – and the number of commercial aircraft in service will grow from today’s 20,000 to some number approaching 40,000.

To respond to these megatrends, we need to focus on technology development to provide the best solutions for our customers. Our products are complex, and they are becoming more complex. In many cases, unlike a new smart phone or social media app, the decision to bring a new product family to market is a decision to make a decades-long, multi-billion-dollar investment. As one example, we have recently launched a new jet engine for the single-aisle commercial market, the market served by the Airbus A320 and the Boeing 737. While major product development started in 2008, we had been investing in the specific differentiating technology – a gear to go between the turbine and the fan – for over a decade before committing to development. We’ve already spent over $10B on this program and have another $10B to go as we expand our supply base and capitalize on the incredible growth this new engine is bringing us. Our success depends on our ability to use that investment to deliver truly breakthrough performance to our airline customers. And so far we’re doing that, with an engine that is 18% more fuel efficient, 75% quieter and reduces emissions by more than 50%.

I relay this because it speaks to exactly why UTC is so committed to the effort that brings all of you here this week. Jet engines are wickedly complex, exquisitely balancing thermodynamic, aero and chemical processes to provide performance without compromise. The engine must work every time; we don’t have the luxury of doing a beta release and then fixing things on the fly. To this task we bring computational tools inside the company, and in partnership with the National Laboratory computing infrastructure, to simulate and analyze every step of the way. Breakthroughs in understanding turbulent, lean combustion or environmental coatings or two-phase lubricant flow depend on access to the most productive supercomputers. Modeling and simulation allow us to reduce design cycles and optimize the number of physical tests we do, saving millions of dollars. Quite simply, we could not execute a modern jet engine program without high performance computing and without the Nation pushing the boundaries of HPC. Critical to our ability to expand our use of high performance computing is not just the hardware and software we can access at the Labs; it’s also the full ecosystem, including the applications experts with whom we partner.

So, we know all the reasons – innovation, technology development, engineering skills stewardship and economic competitiveness. But, my belief – and the answer to my question “Why Are We Here?” – is more basic. It is, “We are here to discover facts.” We use our most powerful computers to discover and reveal empirical facts about the real world, because as scientists and engineers this is what we do; this is what our theories depend on and what our engineers convert to useful stuff. We learn the facts – and we defend the facts – so that society as a whole can decide what to do.

One segment of society might choose to innovate based on newly revealed facts. An obvious example here is the creation of the semiconductor revolution that came as we developed a deep understanding of the facts associated with the classical and quantum behavior of electrons in materials, in the presence of electric fields.

Another segment takes the facts – what particles emerged with what momentum and with what probability – that we have ascertained from analyzing what our sensors and detectors delivered from collisions in the Large Hadron Collider, and uses those facts to fine tune our theories of the Standard Model. (Or, if you are perversely optimistic like I am, to give guidance to what lies beyond the Standard Model.) Note, by my definition, the facts are what happen in the detectors. What happens next is the human enterprise of speculation, innovation and hypothesizing.

In today’s new world order political atmosphere, some people believe, or want to believe, that they can define facts to be what they want them to be. Now, I suspect that most people in this room, fully rooted in the scientific enterprise, will agree with me that the concept of “fact” equates with our concept of empiricism, and while interpretation and extrapolation and supposition are all subject to different points of view, may have uncertainties, and may help create or stimulate debate, facts are not subject to our belief. They just are, because they can be confirmed by an appeal to evidence. David Wootton described it thus: “Facts are a linguistic device which ensures that experience trumps authority and reason.” Or, as David Hume said, “There is no reasoning…against matter of fact.”

And just to reiterate the crucial point, I am not saying that what we believe to be a fact can’t and doesn’t change. What is immutable is the fact itself. Our goal is to continue to accumulate empirical evidence to successively reveal, clarify and put boundaries on the fact.

It may surprise some of you that this is a relatively new – and very limiting – definition of fact. In his seminal book, The Invention of Science, Wootton traces the evolution of this modern definition from its Latin origin as the word for what we today would call “action” or “deed” – hence the phrase “after the fact” – through the 16th century to our use today: a truth that can be described and verified by empirical evidence.

The great transformation that occurred in the late 16th and early 17th centuries, the development of the scientific method, connecting hypothesis to experiment, would not have been possible without this clear definition of what a fact means. Prior to this moment there were other kinds of facts, that is, things that were taken as true and about which any theory had to offer an explanation. The three obvious competitors were received wisdom; appeal to repetitive experience, or what some call the “guild;” and majority belief. The barrier to creating science – the process by which we seek to explain, conjecture, and expand our ability to project – based on these alternate definitions is that they do not operate in a context in which the connection to observed reality is a necessary condition. As Chico Marx said, “Who you gonna believe, me or your lyin’ eyes?”

I’m dwelling on this topic because I want to draw a clear distinction between the core output from a project like the ECP – detailed facts extracted from processed data about how built infrastructure impacts weather, about how minuscule asymmetries impact shot efficiency in inertial confinement, about how signals and physical flows propagate in highly heterogeneous, deep sub-surface formations and about how natural and potentially human-created proteins fold – and the steps we take to do something with those facts.

Remember I have said that facts must be connected to repeatable, empirical evidence. I understand that the steps we take to validate and articulate a fact can be long, brought about by an accumulation of evidence and experimentation. As scientists and engineers, we speak of error bars and statistical validity. Our understanding of a fact can change; our definition of what a fact is cannot.

You see, as a society, we can and should prioritize and we can and should debate chosen pathways and consequences. Ethics and theology are human constructs and as such, when they lead to a decision it is expected, even necessary, that debate ensue. However, when we must make decisions, the success or failure of which will depend on facts, appealing to belief or “what I want reality to be” is an ultimately unsuccessful strategy.

  • Refusing to sail west because received wisdom held that the world was flat, rather than seeking empirical facts to test that assertion, delayed the discovery of the Americas by 150 years, even though the sea-faring nations of the time had sufficient technology and bravery to make the voyage.
  • Building a bridge out of materials that can’t support the required weight, because we want a more appealing structure and want the materials to behave in a certain way, will not make those materials do so.
  • Building in a coastal flood plain because we choose to believe, contrary to empirical evidence, that global temperature isn’t increasing will not cause sea levels to stop rising.

 

I’ve spent time on the subject of facts and the appropriate use of them because I believe we live in a time when some want to again believe that empiricism is only one of several methods to define facts, and, in this case, I might even use the word “truth.” To come full circle, the identification and verification of facts are at the core of the rationale for the Exascale Computing Project. And, each of us has an obligation to ensure that facts maintain their voice, maintain their unique capability to define and defend the world as it is, not as we want it to be. If we fail, as President Obama said, “…my mother used to tell me, reality has a way of catching up with you.”

You will do amazing things over the next decade, creating not just a new machine but a new ecosystem of hardware, operating systems and software applications to confront the Nation’s and the world’s biggest challenges. And when you are done, we will have a capability to explore new science, to extract new facts and to verify or refute things we already think we know. That is the scientific enterprise.

Through all of that, I ask that you remember the core reason why we are here – to provide a basis on which valid decisions can be made. I ask that you join me in ensuring that we continue to believe in a world that is knowable, a world that reveals itself to us empirically.

Our future depends on it.

Thank you.

 
