This article first appeared on HPCwire

I arrived in beautiful Barcelona on Saturday afternoon. It is warm, sunny, and oh so Spanish. I am greeted at my hotel with a glass of Cava to sip while I have a tour of the historic hotel. A short rest, walk around Barcelona, and a little bit of work until dinner at 8pm.

It’s Tuesday and off to PRACEdays 2017, part of the European HPC Summit Week. Tuesday starts with a welcome by Sergi Girona, EXDCI Coordinator, and Serge Bogaerts, Managing Director of PRACE aisbl, outlining the week of plenaries, keynotes, breakout sessions, BoFs, and poster sessions. There will be a lot to see and learn this week in Barcelona!


Anwar Osseyran was next with a detailed overview of PRACE achievements and the challenges ahead. PRACE prides itself on providing open access to the best HPC systems for European scientists. Its criterion: scientific excellence.

In the PRACE partnership, there are seven “Tier 0” systems (top systems available for international use), including the recent addition Piz Daint, added in 2016 at number 8 on the Top500 list. Together, these seven world-class systems offer over 60 petaflops of peak performance and have enabled 524 scientific projects.

Anwar grouped the challenges PRACE sees in adapting and modernizing HPC infrastructure into four quadrants:

  • European Open Science Cloud: enabling persistent access to data, a huge challenge that particularly affects health care
  • Strong HPC infrastructures for data processing
  • Adapting HPC solutions for cloud environments to make them easy and accessible for scientists
  • How to achieve exascale

As PRACE considers these challenges, the question of funding comes up. How will PRACE fund all their ambitions? If they can’t do it all, which technologies and applications should they focus on? As Anwar says, consider “mundane versus heavenly science. It’s about choices.”

On more than one occasion during his presentation, Anwar weighed collaboration among the communities against the benefits of competition, suggesting that competition among scientists produces better results. I would have thought that collaboration among the supercomputing centers would be more the norm – sharing resources and results, all contributing to better science.

As Anwar said in his closing statement: “It’s about finding a balance between traditional, disruptive, and fundamental science.”


The first keynote was by Minna Palmroth, who presented Understanding Near-Earth Space in Six Dimensions.

The thing I really love about events like this is the opportunity to learn more about the science, the big science problems, and things I’ve never thought about. Minna hit the mark in her presentation on near-space problems.

The Earth has radiation belts. Navigation and weather satellites sail in plasma around the Earth, traversing these belts. Two types of phenomena affect spacecraft and satellites: single event upsets (sudden faults, like a system failure, triggered by a single radiation strike), and the aging of the spacecraft due to the harsh radiation they experience in the belts.

The radiation belt situation is already extremely important, and it will become more so as the number of spacecraft grows. The challenge in a nutshell: simulating the belts and the ever increasing number of spacecraft within them requires a dense grid and complex calculations in multiple dimensions simultaneously (the six dimensions of the title: three in ordinary space and three in velocity space).

Minna is at the Department of Physics at the University of Helsinki, Finland. Her team is attacking parallelization at three levels (a minimal sketch follows the list):

  • Across multiple nodes using MPI
  • Across multiple cores within a node using OpenMP
  • Within cores using vectorization
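
To make the three levels concrete, here is a minimal hybrid-parallel sketch. It is illustrative only, not Vlasiator’s actual code; the block sizes, the update formula, and the build line are all assumptions. MPI splits the grid across nodes, OpenMP threads split each rank’s blocks across a node’s cores, and a SIMD pragma vectorizes the per-cell arithmetic.

```c
/* Toy hybrid-parallel grid update (illustrative, not Vlasiator's code).
 * Build/run, e.g.: mpicc -fopenmp -O2 hybrid.c && mpirun -n 4 ./a.out
 */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

#define NBLOCKS 1024   /* hypothetical grid blocks per rank */
#define BLOCK   4096   /* hypothetical cells per block */

int main(int argc, char **argv) {
    int rank, nranks;
    MPI_Init(&argc, &argv);                  /* level 1: across nodes */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    long ncells = (long)NBLOCKS * BLOCK;
    double *cell = malloc(ncells * sizeof *cell);
    for (long i = 0; i < ncells; i++) cell[i] = rank + 1.0;

    double local_sum = 0.0;
    /* level 2: OpenMP threads take whole blocks within this rank */
    #pragma omp parallel for reduction(+:local_sum)
    for (int b = 0; b < NBLOCKS; b++) {
        double *c = cell + (long)b * BLOCK;
        /* level 3: vectorize the per-cell update within each core */
        #pragma omp simd reduction(+:local_sum)
        for (int i = 0; i < BLOCK; i++) {
            c[i] = 0.5 * c[i] + 0.25;   /* stand-in for a real update */
            local_sum += c[i];
        }
    }

    /* combine per-rank results, as a halo exchange or reduction would */
    double global_sum = 0.0;
    MPI_Reduce(&local_sum, &global_sum, 1, MPI_DOUBLE, MPI_SUM, 0,
               MPI_COMM_WORLD);
    if (rank == 0)
        printf("global sum = %g across %d ranks\n", global_sum, nranks);

    free(cell);
    MPI_Finalize();
    return 0;
}
```

The appeal of this layered pattern is that each level scales independently: add nodes and the MPI ranks grow, add cores and the OpenMP threads grow, and wider vector units speed up the inner loop with no code changes.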

Their most recent development adds support for multiple ion species and an optimised boundary-conditions implementation, resulting in improved scaling. This has given them the processing power and speed to do the math needed for the near-space problems they have identified.

The simulations Minna shared of the solar wind and radiation belts as they hit the Earth’s atmosphere are fascinating. The solar wind creates a significant amount of heat that dissipates and spreads around the Earth’s atmosphere.

Vlasiator (http://vlasiator.fmi.fi/) is a newly developed large-scale space physics model. The goal is to model the entire near-Earth space, going far beyond existing large-scale plasma simulations: extending the modeling from today’s solar wind and radiation belt runs to space weather and spacecraft instrument optimization. Vlasiator has already been used to discover phenomena no one thought existed, and with continued improvements, such as adding machine learning, it will be an important tool for understanding space phenomena and for protecting spacecraft, technological systems, and human life in space.


The second keynote was by Telli van der Lei, who presented Using Big-Data Methodologies in the Chemical Industry.

The information Telli shared is not surprising – we have long known that modeling supply chains can produce positive results. Regardless, this is a topic that can’t be discussed enough, especially at a conference that is heavily research and academic (approximately 73% of the attendees). Talking about the business application of the science being modeled, and the improvements it enables, is a good thing: it takes science and computation developed in one place, used and enhanced in another, and demonstrates the results back to the first.

Telli is an academic now working in industry, as a Senior Scientist in Supply Chain and Process Modeling at DSM. Doing this modeling in industry, Telli professes, can be quite hard. In her presentation, she covered the industry issues DSM thinks about, the results they achieved with their supply chain modeling, and the challenges she sees going forward.

Among the industry issues, health is number one. I would argue that nearly every country in the world faces an aging population, healthcare demands, and questions of optimal food composition. After health comes nutrition: how to feed a growing population while urbanization clusters people together and shrinks farming space. Lastly, resource constraints on the materials available to feed into the supply chain are a major issue.

DSM uses computer modeling to simulate the supply chain from raw materials to manufacturing and from warehouse to client. By incorporating this modeling into their supply chain process, they have realized some amazing results: improved order correctness, reduced supply chain costs, lower inventory, and a more efficient, flexible, and responsive supply chain.
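
To give a flavor of what such a simulation looks like, here is a toy sketch of one small piece of a supply chain: a single warehouse under a reorder-point policy. It is a hypothetical illustration, not DSM’s model; the demand range, lead time, and policy parameters are all invented.

```c
/* Toy supply-chain sketch (hypothetical, not DSM's model): simulate a
 * single warehouse under a reorder-point policy and measure fill rate
 * and average inventory, the trade-off such models help quantify.
 */
#include <stdio.h>
#include <stdlib.h>

#define DAYS 365
#define LEAD_TIME 5        /* days from factory to warehouse (assumed) */
#define REORDER_POINT 120  /* reorder when stock + on-order falls here */
#define ORDER_QTY 200      /* units per replenishment order (assumed) */

int main(void) {
    int stock = 200, pipeline[LEAD_TIME] = {0};
    long demand_total = 0, demand_filled = 0, stock_days = 0;
    srand(42);  /* fixed seed so the toy run is reproducible */

    for (int day = 0; day < DAYS; day++) {
        stock += pipeline[day % LEAD_TIME];   /* receive arriving order */
        pipeline[day % LEAD_TIME] = 0;

        int demand = 10 + rand() % 21;        /* 10..30 units per day */
        int filled = demand < stock ? demand : stock;
        stock -= filled;
        demand_total += demand;
        demand_filled += filled;
        stock_days += stock;

        int on_order = 0;                     /* units already in transit */
        for (int i = 0; i < LEAD_TIME; i++) on_order += pipeline[i];
        if (stock + on_order <= REORDER_POINT)
            pipeline[day % LEAD_TIME] = ORDER_QTY; /* arrives in 5 days */
    }
    printf("fill rate: %.1f%%, avg stock: %ld units\n",
           100.0 * demand_filled / demand_total, stock_days / DAYS);
    return 0;
}
```

Sweeping REORDER_POINT and ORDER_QTY over many simulated years is exactly the kind of service-level-versus-inventory-cost study that, at industrial scale and across a whole network of plants and warehouses, benefits from HPC.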

Out of all this, DSM now has modeling and advanced analytics capabilities built on proven successes. They still see challenges: from a modeling perspective, how do you optimize the input, output, and runtime of existing models, or incorporate business choices into them? You can use HPC to simulate, but how are you convinced your results have value? As Telli said: “It’s not yay – here we go [with our results], it’s how you change your business.”


About the author: Kim McMahon is the CEO of McMahon Consulting, a full-service marketing firm with over 15 years of experience in HPC, Enterprise Technical Computing, and the high-end IT space, working with clients around the globe. Kim has performed sales and marketing for more years than she cares to count. She writes frequently on marketing, life, the world, and how they sometimes all come together.