Virtual reality and resilience: Shelter from the storms

Combining immersive reality with meteorological data can be a powerful tool in preparing for major storms. Visualization specialist Stephen Paul sets out six ways this technology can help.

When Hurricanes Harvey, Irma and Maria slammed into the United States and its territories, they brought with them unprecedented flooding and destruction. Weather forecasters accurately predicted the paths of these storms but could only estimate the impact of the resulting flooding and devastation.

What if we could see these impacts before they hit our neighborhoods? What precautions could we take to be prepared, prevent property damage and, above all, save lives?

Visualizations in the form of augmented or immersive reality are beginning to answer these questions. A quickly evolving field, visualization mixes creativity, scientific information and technology, putting people in the middle of the action and communicating essential information through visual simulations. We traditionally use visualization throughout the architecture and engineering industry to help public audiences understand the impact of new structures, like bridges or levees. But fast-emerging capabilities are boosting its use in communicating flooding impacts – even before storms hit – making visualization an important tool in resilience efforts. Here are six ways in which virtual reality will impact resilience.

Visualize this: Visualization is a powerful communications tool that takes varied forms. Virtual reality, for example, uses headsets and headphones to completely immerse users in a new place where they can explore their surroundings. Augmented reality – as the name implies – adds to reality: people wear headsets but can still see their physical surroundings, with images, objects and information layered on top. Visualization enables people to actually see the impacts of weather and storms, bringing science to the discussion in a powerful way.

Putting people at the scene: With advance notice, we can show specific scenes with different prescribed levels of floodwater by working with data from scientists and water experts. We can combine existing hardware, software and imagery so that, through immersive reality, people can view specific areas of flooding and have the sense of being at the scene. We can also provide them with a limited set of visual options showing how a specific area will look at various water levels. This immersive technology provides a vivid experience and clearly communicates the impacts of flooding on people and the environment. (See an example here: http://aecomviz.com/sc-flood/ )

There are limitations: Right now, the technology available to us allows us to create fixed-point visualizations for resilience. We tell the visual story by combining information from our engineers and scientists with simulation software. As an example, we recently created a demo for a federal agency using photo simulations. We selected three locations in the state of Kentucky that are vulnerable to flooding, recreated them as computer-generated images, and placed water in the model as we would place a 3D object in a traditional visualization. Working with the project manager, we selected the points to build up digitally, determining the level of water above ground for both a 100-year flood and a 500-year flood. Our immersive visualization allowed viewers to enter any of the three locations and view each at the specific flood levels.
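At each fixed viewpoint, the core calculation behind a scene like this is simple: water depth is the flood stage elevation minus the ground elevation, and a point above the stage stays dry. The sketch below illustrates the idea only – the viewpoint names and elevation values are invented for illustration, not taken from the Kentucky demo.

```python
# Sketch: water depth at fixed viewpoints for prescribed flood stages.
# All elevations are in feet above a common datum; values are hypothetical.

FLOOD_STAGES = {"100-year": 412.0, "500-year": 415.5}  # stage elevations

VIEWPOINTS = {                 # ground elevation at each fixed viewpoint
    "riverside_park": 409.2,
    "main_street": 413.8,
    "levee_road": 416.1,
}

def water_depth(ground_elev: float, stage_elev: float) -> float:
    """Depth of water above ground; zero if the point stays dry."""
    return max(0.0, stage_elev - ground_elev)

for name, ground in VIEWPOINTS.items():
    for label, stage in FLOOD_STAGES.items():
        depth = water_depth(ground, stage)
        print(f"{name}: {label} flood -> {depth:.1f} ft of water")
```

In a visualization pipeline, that depth value would drive where the water plane sits in the 3D model for each prescribed flood level.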

And challenges: Right now, the challenge is less in working around the technology as it is, and more in keeping up with, connecting and integrating new technology developments to advance our abilities. With different projects, data sets and client bases requiring different combinations of hardware, software and dynamic information, the challenge is getting everything plugged in together so it's seamless and accessible to stakeholders. The technology is being reinvented very rapidly; we're in a changing and exciting era, and part of our goal is to use the most up-to-date capabilities, integrate them with our software and use them to tell an effective story.

But the future is out there: Technology is a double-edged sword: while it currently limits our capabilities, it's also advancing swiftly – and we're depending on that advancement to further our resilience work. Specifically, as we become able to process data exponentially faster, we will have greater flexibility and more dynamic visualizations. We can provide fixed points and show the effects of a set flood level, but we don't yet have the flexibility to calculate and quickly render a visual that simulates any level on demand. There are too many moving parts for our current capability…for now. Technology is changing everything, and we expect to have this capability within the next five years – or sooner. That's not surprising when you consider that five years ago, the technology for much of what we do routinely now didn't exist.

We recently developed a model of Columbia, South Carolina, that shows the impacts of rising seawaters and flooding in inland areas for levels ranging from 100-year to 1,000-year floods. This latest model uses gaze navigation, where viewers in virtual reality look in a specific direction for a second to move to the next location. The visualization allows a full 360-degree view of a location with varying degrees of flooding across different flood levels. The technology is web-based and democratizes virtual reality by enabling these views across a variety of simple platforms. Viewers can also superimpose the image of the location without floodwaters over the images of different flood levels. This development is poised to be a new tool in understanding and protecting communities from the extreme weather resulting from climate change.
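Because arbitrary flood levels can't yet be simulated on demand, a web viewer like this serves whichever precomputed 360-degree scene is closest to what the user asks for. Here is a minimal sketch of that lookup; the return periods mirror the 100- to 1,000-year range described above, but the asset names are hypothetical, not the Columbia model's actual files.

```python
# Sketch: pick the nearest precomputed 360-degree flood panorama for a
# requested return period. Asset names are invented for illustration.

PRECOMPUTED = {        # return period (years) -> prebuilt panorama asset
    0: "columbia_dry.jpg",          # baseline scene without floodwaters
    100: "columbia_100yr.jpg",
    500: "columbia_500yr.jpg",
    1000: "columbia_1000yr.jpg",
}

def nearest_scene(requested_years: int) -> str:
    """Return the panorama whose return period is closest to the request."""
    best = min(PRECOMPUTED, key=lambda period: abs(period - requested_years))
    return PRECOMPUTED[best]
```

Keeping the dry baseline (`0` years) in the same table is what makes the superimpose-over-no-flood comparison described above a simple pair of lookups rather than a separate code path.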

And the next wave is a game changer: That idea of sharing information brings us to the next generation of visualization. Ideally, this next wave of technology will see us move into an era of more deeply integrated, broader and more dynamic data, as well as cohesive interactive programming and more accessible hardware options. Developments such as Apple's ARKit and the various retail versions of the Microsoft HoloLens will allow much more widespread use of immersive technology. Advanced hardware and software will be a game changer for resilience and visualization. We'll be able to combine floodwater data with all kinds of variables – including high water tables, city expansions and low elevations – to deliver augmented reality, virtual reality or other visualizations. We'll also be able to engage people and communicate the information on a human level, taking the abstractions of databases, heat maps and flood potentials and bringing them to life. Through immersive technology we'll be able to show people what their street could look like under varied conditions, bringing the consequences of hurricane flooding to life well before potential storms make landfall.

Virtual reality will communicate the potential impacts of severe weather and the preplanning efforts needed to mitigate them. This helps on two fronts: in the short term, by enabling governments to see and adopt effective means of mitigating storm impacts when a weather event is impending; and in the long run, by encouraging urban planning and design that takes development or expansion into account, providing contrasting illustrations of storm impacts with and without building over natural barriers such as wetlands.