By combining big data and VR, architects can significantly improve design

Phil Bernstein

Big data is already transforming the way architects design buildings, but the combined forces of big data and virtual reality (VR) will advance the architectural practice by leaps and bounds.

Consider how far architects have come - before even integrating VR in architecture - using data from sensors and crowdsourcing. A few years ago, the John F. Kennedy School of Government at Harvard University hired Sasaki Associates to run a master-planning exercise, gathering feedback from students and faculty about the campus. The first question the Sasaki team tackled was, "How do the students get into the building?"

In the old days, my former firm (Pelli Clarke Pelli Architects) would just hire some college kid to sit out there with a clicker and click, click, click every time someone entered the building. More recently, the Sasaki team used sensor data to track student movement in and out of buildings. They also asked Kennedy School students to draw diagrams of their paths through that section of Harvard's campus, to wit: "Where do you start, and where do you end up?"

In the end, they discovered that Kennedy School students almost never entered through the front door. To the Sasaki team's surprise, everybody cut through the loading dock.

It's reminiscent of how design and construction teams used to build Army bases. They would build all the structures, then wait for three to four weeks to see where everybody was walking. Anywhere the grass was worn, that's where they paved to make a path. While that was clever planning, it's primitive compared to Sasaki's current methods - and especially to what's coming up.

Data-infused virtual students

There's a lot of data out there for architects to absorb - so much that it can be tough to figure out where to start. By one estimate, the brain is inundated with 11 million bits of data per second, yet it can consciously process only around 50 bits at a time. But just as a photo is worth a thousand words, a VR experience is worth thousands of data points. Infuse data into an immersive VR environment, and designers will be able to take it in more effectively and efficiently.

What I suspect we'll see more of in the future is behavioral modeling - not through complex system simulations but using avatars (virtual people) with individual characteristics that drive how they behave. When preparing a campus master plan of the future, architects could have class schedules, diagrams of the 40 or so relevant campus buildings, and an avatar representing each student. The master-plan proposal could then be tested by running simulations of what people do at any time - day or night.
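To make the idea concrete, here is a minimal sketch of what schedule-driven avatar simulation could look like. The buildings, schedules, and cohort sizes are entirely made up, and names like Avatar and simulate_day are illustrative, not part of any real planning tool.

```python
# A minimal sketch of schedule-driven behavioral modeling with hypothetical data:
# each avatar follows a class schedule, and we count building occupancy per hour.
from dataclasses import dataclass, field

@dataclass
class Avatar:
    name: str
    schedule: dict = field(default_factory=dict)  # hour of day -> building
    default_location: str = "dorm"

    def location_at(self, hour: int) -> str:
        return self.schedule.get(hour, self.default_location)

def simulate_day(avatars, hours=range(8, 19)):
    """Return building occupancy counts for each simulated hour."""
    occupancy = {}
    for hour in hours:
        counts = {}
        for avatar in avatars:
            building = avatar.location_at(hour)
            counts[building] = counts.get(building, 0) + 1
        occupancy[hour] = counts
    return occupancy

# Hypothetical student body: two small cohorts with overlapping schedules.
avatars = [Avatar(f"student_{i}", {9: "english_hall", 11: "library", 14: "science_center"})
           for i in range(200)]
avatars += [Avatar(f"student_{i + 200}", {10: "science_center", 14: "english_hall"})
            for i in range(150)]

for hour, counts in simulate_day(avatars).items():
    print(hour, counts)
```

A real planning exercise would layer in travel time, open hours, and richer behavior, but even this toy version shows how a proposal can be tested hour by hour instead of argued from intuition.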

The opportunities possible with behavioral modeling are enormous. I remember working on a project for Wake Forest University, designing a shared building for the law and business schools. Both schools had a lot of the same needs - the number of classes, faculty offices, student activity offices, libraries, and so on. But if you took all of those individual needs for both schools and compiled them into one set of demand parameters, you'd end up with a building that's much bigger than it needs to be, because the two programs overlap so much.

Consulting with the fire marshal about exit requirements for that building, we argued that it wasn't possible for every person to be in all spaces of the building simultaneously. A student couldn't be in the classroom, the library, the courtyard, and the dining hall all at once - just as a faculty member couldn't be in the cafeteria, her office, and a classroom at the same time. We eventually agreed that the overall demand for stuff like exit stairs and even toilets could be computed according to traffic estimates, which resulted in a much more efficient building.
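The fire-marshal argument boils down to a small back-of-the-envelope calculation. The hourly head counts below are invented for illustration, but they show why sizing shared facilities for the peak simultaneous load beats summing the two schools' separate peaks.

```python
# Hypothetical hourly head counts in the shared building (hour -> occupants).
from collections import Counter

law_school =      {9: 220, 10: 260, 11: 300, 13: 180, 14: 240, 15: 150}
business_school = {9: 180, 10: 150, 11: 120, 13: 310, 14: 200, 15: 280}

# Naive sizing: assume both schools hit their individual peaks at once.
naive_capacity = max(law_school.values()) + max(business_school.values())

# Traffic-based sizing: add occupants hour by hour, then take the real peak.
combined = Counter(law_school)
combined.update(business_school)
peak_simultaneous = max(combined.values())

print(f"Sized for both schools' peaks independently: {naive_capacity} occupants")
print(f"Sized for actual peak simultaneous load:     {peak_simultaneous} occupants")
# Exit stairs, toilets, and similar shared services can then be computed
# from the smaller, realistic peak rather than the inflated sum.
```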

These sorts of data-based decisions will be much easier in the near future: If an architect wants to make her case to the fire marshal, she could just run the simulation in a virtual model and show them. I imagine that she'll be able to buy a "Midwestern liberal arts campus student body" of avatars, plop them into her simulation engine, and watch what they do all day long.

Then when the school says, "We need a new English building," the architect could start testing against the number of English majors, determine where people would have their previous class on campus based on actual schedules, and on and on. And that is a major shift from experience and intuition to actual performative analysis based on big data.
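As a rough illustration of that kind of schedule-driven analysis, the sketch below tallies where students are in the hour before each English class. The schedule format, names, and numbers are all hypothetical.

```python
# Invented schedule records: (student, hour, building).
from collections import Counter

schedule = [
    ("amy", 9, "chemistry_lab"), ("amy", 10, "english"),
    ("ben", 9, "gym"),           ("ben", 10, "english"),
    ("cal", 10, "english"),      # no prior class: arrives from off campus
    ("dee", 9, "chemistry_lab"), ("dee", 10, "english"),
]

by_student_hour = {(s, h): b for s, h, b in schedule}

# For every student arriving at an English class, find where they were the hour before.
origins = Counter(
    by_student_hour.get((s, h - 1), "off_campus")
    for s, h, b in schedule if b == "english"
)
print(origins)  # Counter({'chemistry_lab': 2, 'gym': 1, 'off_campus': 1})
```

Scaled up to a full registrar's dataset, the same tally would tell the architect which paths and entrances a new English building actually needs to serve.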

Here at Yale, we'll likely be doing the same thing soon, as the undergraduate population is scheduled to expand by 15% next year, with no new classrooms being built on campus.

In a health care setting

Another example of pairing the power of virtual reality and big data might be found in the design of intensive care units (ICUs) for a hospital. Any good health care firm has likely designed dozens of such units, and in theory, their designs have continually improved over the years. When I was doing this work 20 years ago, the first thing we'd do was sit down with the client and ask, "How many beds do we need, how's the building coming together, and what does the floor plan look like?" Then we'd sit down with the staff, look at floor plans, and iterate until we got a plan everybody thought would work. But once we built it, there'd be a million details and refinements left to resolve.

Contrast that with an ICU planning exercise that uses VR in the year 2022. Architects can gather usage data from 25 or 30 other ICUs so they know what the traffic patterns are. They'll have a digital model of how materials flow through the hospital and can build a 3D model of the ICU. Instead of the ICU staff looking at floor plans - which they can't read anyway - they can walk through a virtualized environment (using a VR headset and something like Autodesk Live), so they can actually see and use the space.

The staff will be able to move the lights around, open doors, and turn the dials on the instruments. And we can present that with high-resolution 3D rendering as if it's a realistic game, or paint that environment with data to show our clients, quantitatively, what's going on. They will quickly learn about the proposed space: How noisy is it? What's the temperature like? How much air is moving into this particular location? And most importantly, how can we deliver the best care here?
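Behind the scenes, "painting the environment with data" might look something like the sketch below: per-room sensor readings pooled from several ICUs are averaged into one number per room and metric, which a VR viewer could map to a color overlay. The data schema and values are assumptions for illustration, not from any real product.

```python
# Invented sensor log schema: (icu_id, room, metric, value).
from statistics import mean

readings = [
    ("icu_01", "bed_3", "noise_db", 58), ("icu_01", "bed_3", "temp_c", 22.5),
    ("icu_02", "bed_3", "noise_db", 63), ("icu_02", "bed_3", "temp_c", 23.1),
    ("icu_01", "nurse_station", "noise_db", 70),
    ("icu_02", "nurse_station", "noise_db", 74),
]

# Group readings by (room, metric) across all the contributing ICUs.
summary = {}
for _, room, metric, value in readings:
    summary.setdefault((room, metric), []).append(value)

# One number per room and metric that a viewer could shade onto the 3D model.
overlay = {key: round(mean(vals), 1) for key, vals in summary.items()}
print(overlay)
# {('bed_3', 'noise_db'): 60.5, ('bed_3', 'temp_c'): 22.8, ('nurse_station', 'noise_db'): 72.0}
```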

There's going to be much better data - both about what's happened in similar spaces and for simulating what's going to happen next. And as a result, architects will be able to make much better design decisions for future buildings.

Phil Bernstein is an Autodesk fellow, architect, and teacher of Professional Practice at Yale.