Defining the Digital Twin for Buildings

Houston, we have a problem.

Shortly after an oxygen tank exploded early in Apollo 13’s mission in April 1970, the astronauts found themselves fighting for their lives. The whole world held its breath as the now-famous rescue ensued. Engineers in Houston scrambled to sort through technical issues from hundreds of thousands of miles away.

The rescue was successful because back on Earth there was an identical copy of the spacecraft: Apollo 13’s twin. The team could quickly test solutions on the ground without adding risk to the astronauts in space.

Almost 50 years later, NASA uses the same strategy to understand and manage systems and machines across the solar system. Except today, the twins are virtual and there’s a fancy buzzword to describe them: Digital Twins. 

Read any marketing-driven article today and you’ll learn that digital twins are about to save the world from climate change, create world peace, and wipe our virtual and physical butts. According to Gartner, digital twins sat at the Peak of Inflated Expectations on the famous Hype Cycle for Emerging Technologies in 2018, headed toward the Trough of Disillusionment. Gartner estimated that digital twins are 5 to 10 years away from the “Plateau of Productivity,” where a technology becomes mainstream and fully operational.

Gartner defines a digital twin as:

A software design pattern that represents a physical object with the objective of understanding the asset’s state, responding to changes, improving business operations and adding value.

By that definition, virtually all organizations that manage physical assets have some sort of digital twin in operation. And it’s been that way for decades. For example, consider a single data point representing one aspect of one asset’s health. Every commercial building I’ve ever been inside has that covered.

Here lies a paradox: How can a technology be simultaneously proven and in use for decades AND be 5 to 10 years from maturity? That’s confusing.

I think it’s because we’re talking in circles. There’s little consensus on what a digital twin is, what it does, and how the latest ultra-hyped versions are any different than what we’ve been doing for years. This post is the first installment in my attempt to untangle that mess for myself, with the hope it will be useful for you as well. Since we’re mainly concerned with buildings, I think the history of digital tools for buildings is a great jumping-off point. 

The History of Digital Tools for Buildings 

I’ve worked in the buildings industry for almost 10 years now (woohoo!). Much of that time has been spent with digital representations of mechanical systems or whole building energy models. These types of tools, along with the others listed below, are mainstream in our industry. It turns out we’ve been inching closer to digital twins this whole time; the digital twin is simply the next step in the progression. To illustrate, let’s walk through these tools in the order I encountered them, starting at the beginning of my career.

(1) Energy Models

Whole building energy models—built in tools like eQUEST, OpenStudio, and Trane TRACE—are used in design and energy management practices to calculate the energy consumed in a building for every hour of the year.

With a digital version of the building, different alternatives for reaching a target energy use (e.g., net zero) can be simulated and tested before being implemented in real life: a better envelope, a different orientation to the sun, more distributed energy resources, a different HVAC design, and so on.

This makes it possible to test design alternatives in the context of the building’s specific characteristics and how its occupants actually use it. And knowing the savings and/or ROI for each alternative helps building owners and investors decide whether or not to pursue it.
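
To make that concrete, here’s a minimal sketch of the kind of comparison an energy model enables. Every number, the `alternatives` dictionary, and the flat electricity rate are assumptions for illustration only, not output from any real simulation.

```python
# A minimal sketch of comparing design alternatives from an energy model run.
# Every number and name here is illustrative; nothing comes from a real simulation.

ELECTRICITY_RATE = 0.12   # $/kWh, assumed blended utility rate

baseline_kwh = 1_250_000  # simulated annual energy use of the baseline design

# alternative name -> (simulated annual kWh, assumed incremental first cost in $)
alternatives = {
    "improved envelope":    (1_100_000, 180_000),
    "high-efficiency HVAC": (1_000_000, 250_000),
    "rooftop solar PV":     (900_000, 400_000),
}

for name, (kwh, first_cost) in alternatives.items():
    annual_savings = (baseline_kwh - kwh) * ELECTRICITY_RATE
    simple_payback = first_cost / annual_savings
    print(f"{name}: ${annual_savings:,.0f}/yr saved, {simple_payback:.1f}-year simple payback")
```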

(2) Building Automation Systems 

The building automation system (BAS) is what we use to operate and monitor our buildings. It provides streaming data, a user interface with graphical ways to explore the equipment, and, of course, control of the equipment itself. The BAS was the IoT for buildings before IoT was a thing.
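
As a rough illustration of those two halves of the job (monitoring and control), here’s a purely hypothetical sketch. `BASClient`, its point names, and its `read`/`write` methods are invented for this example; a real BAS speaks BACnet, Modbus, or a proprietary protocol.

```python
# Purely hypothetical sketch: a BAS exposes point values for monitoring and
# writable setpoints for control. BASClient is invented for illustration only.
import time

class BASClient:
    def __init__(self):
        # point name -> current value (a real BAS reads these from field devices)
        self.points = {"AHU-1 SupplyTemp": 55.2, "AHU-1 SupplyTempSP": 55.0}

    def read(self, point: str) -> float:        # the monitoring side
        return self.points[point]

    def write(self, point: str, value: float):  # the control side
        self.points[point] = value

bas = BASClient()
for _ in range(3):                              # a crude polling loop standing in for streaming data
    print("AHU-1 SupplyTemp:", bas.read("AHU-1 SupplyTemp"))
    time.sleep(1)
bas.write("AHU-1 SupplyTempSP", 57.0)           # command a new supply air temperature setpoint
```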

(3) Building Information Models (BIM) 

Building information models are three-dimensional representations of buildings that aid in the construction process. I’m not sure I could say it better than Jenana Roper did:

BIM is a small sub-set of a Digital Twin, frozen in time – typically during the design and construction phase. BIM is a finely tuned tool for more accurate design, collaboration, visualization, costing and construction sequencing phases of a building’s life. Its primary purpose is to design and construct a building, and post-construction, it serves to provide a digital record of a constructed asset. BIM is only focused on buildings – not people or processes. However, BIM is a small but very useful input into a Digital Twin, as it provides us with an accurate digital asset register and location data and is a great starting point for both a Smart Building and a Digital Twin.

(4) Energy Management Information Systems (EMIS)

An EMIS, or building analytics system, is another puzzle piece of the digital twin. As we’ve defined in the EMIS Framework, an EMIS is a system of devices, data services, and software applications that communicates with any building system or third-party data source to aggregate data and transform it into new capabilities that aid in the optimization of the building.

An EMIS typically isn’t one product or application, although it should perform like one. The EMIS stack is all of the devices, data services, and applications that meet the needs of the user. Depending on the EMIS implementation, the stack has several layers (sketched in code after the list):

  • The integration layer is responsible for managing communication between the systems and third-party data sources and the historian layer. It could include hardware and software, including drivers for protocol translation.
  • The historian layer stores time series data and associated metadata in one or more databases, providing those data on request to applications.
  • The application layer consists of all high-level analysis tools that rely on collected building performance data. Applications can perform any sort of analytics with the data, such as fault detection and diagnostics (FDD), advanced visualization, and machine learning for prediction. 
  • The control layer supports applications that have a need to affect the operation of building devices in an automated or semi-automated manner.
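
Here’s a rough, hedged sketch of those four layers as plain Python classes. The class and method names are mine, not from any particular EMIS product; the point is just to show how the layers hand data to each other.

```python
# Illustrative sketch of the four EMIS layers. Names are invented for this example.
from collections import defaultdict
from typing import Callable, List, Tuple

class Historian:
    """Historian layer: stores time series data and metadata, serves it on request."""
    def __init__(self):
        self.series = defaultdict(list)  # point name -> [(timestamp, value), ...]
        self.metadata = {}               # point name -> tags (equipment, units, location)

    def store(self, point: str, timestamp: float, value: float):
        self.series[point].append((timestamp, value))

    def query(self, point: str) -> List[Tuple[float, float]]:
        return self.series[point]

class IntegrationLayer:
    """Integration layer: protocol drivers moving data from building systems into the historian."""
    def __init__(self, historian: Historian):
        self.historian = historian

    def ingest(self, point: str, timestamp: float, value: float):
        self.historian.store(point, timestamp, value)

class Application:
    """Application layer: analytics (here, a trivial FDD rule) on top of historian data."""
    def __init__(self, historian: Historian, rule: Callable[[list], bool]):
        self.historian = historian
        self.rule = rule

    def fault_detected(self, point: str) -> bool:
        return self.rule(self.historian.query(point))

class ControlLayer:
    """Control layer: writes commands back to devices when an application calls for it."""
    def command(self, point: str, value: float):
        print(f"writing {value} to {point}")  # a real implementation goes back through the BAS

# Wire the stack together: integration feeds the historian, an application reads from it,
# and the control layer closes the loop.
historian = Historian()
IntegrationLayer(historian).ingest("AHU-1 SupplyTemp", 0.0, 62.0)
app = Application(historian, rule=lambda samples: any(value > 60 for _, value in samples))
if app.fault_detected("AHU-1 SupplyTemp"):
    ControlLayer().command("AHU-1 SupplyTempSP", 55.0)
```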

(5) Smart Buildings 

A smart building builds on the BAS and, in some ways, the EMIS. It integrates multiple previously siloed building systems into one platform and provides enhanced control functionality and engagement with occupants. Some have called these Building Engagement Platforms, Building Operating Systems, or IoT platforms. There are probably many more buzzwords grouped under the Smart Buildings umbrella.

A smart building platform is the “middleware” between devices, people, and software applications. It differs from the BAS in the scope of the data it connects with; ideally it connects everything digital: HVAC, lighting, plug loads, meters, access control, fire suppression, grid interaction, IoT, and so on. It differs from an EMIS in its focus on connections at the edge rather than in the cloud, on human engagement, and on control. Just like an EMIS, a smart building platform provides capabilities to users and occupants through applications.
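
One way to picture that middleware role is a tiny publish/subscribe broker routing events between systems that would otherwise never talk to each other. This is a hedged sketch; the topic names, systems, and payloads are made up.

```python
# Hedged sketch of a smart building platform as publish/subscribe middleware.
# Topics, systems, and payloads are illustrative only.
from collections import defaultdict
from typing import Any, Callable, Dict

class BuildingPlatform:
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic: str, callback: Callable[[Dict[str, Any]], None]):
        self.subscribers[topic].append(callback)

    def publish(self, topic: str, payload: Dict[str, Any]):
        for callback in self.subscribers[topic]:
            callback(payload)

platform = BuildingPlatform()

# An HVAC application reacts to badge-in events coming from the access control system.
platform.subscribe("access.badge_in", lambda event: print(f"precondition zone {event['zone']}"))

# The access control integration publishes an event; the platform routes it to subscribers.
platform.publish("access.badge_in", {"zone": "3W", "badge": "redacted"})
```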

Defining the Modern Digital Twin

What is a modern digital twin, then? What does it add to these historical versions? One way to think about it is that it combines all of the above.

Static models are enriched by live data, and the systems using live data are enriched by better models. For instance, an energy model would be more useful if it were calibrated to actual conditions in the building as it evolves over time. If that could happen automatically, even better.
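
As a toy example of what “calibrated to actual conditions” could look like, here’s a sketch that fits a single scaling factor to bring modeled monthly energy closer to metered energy. Real energy model calibration adjusts many inputs (schedules, loads, envelope assumptions), not one multiplier, and the numbers below are made up.

```python
# Toy calibration sketch: fit one scaling factor so modeled monthly kWh better
# matches metered kWh. All numbers are illustrative.
measured_kwh = [105_000, 98_000, 92_000, 88_000, 90_000, 101_000]    # utility meter data
modeled_kwh  = [118_000, 110_000, 101_000, 96_000, 99_000, 112_000]  # uncalibrated model output

# Least-squares scaling factor: minimizes sum((scale * modeled - measured)^2).
scale = sum(m * s for m, s in zip(modeled_kwh, measured_kwh)) / sum(m * m for m in modeled_kwh)
calibrated = [m * scale for m in modeled_kwh]

mean_error_before = sum(abs(m - s) for m, s in zip(modeled_kwh, measured_kwh)) / len(measured_kwh)
mean_error_after  = sum(abs(c - s) for c, s in zip(calibrated, measured_kwh)) / len(measured_kwh)
print(f"scale factor {scale:.3f}: mean monthly error {mean_error_before:,.0f} -> {mean_error_after:,.0f} kWh")
```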

Similarly, a BIM needs to transition from construction-focused to O&M-focused, meaning it needs static data (e.g., maintenance logs, O&M manuals, warranties) and live data from the BAS. Models and simulations are no longer operated in isolation from the physical world; there must be a connection between the physical and digital systems. That requires two-way data exchange and the inclusion of humans in the roles of occupants and operators, so the focus on the human experience inherent to the latest smart building approaches is vital for the digital twin as well.

With the combined functionality of all of these approaches, the digital twin is more intelligent and able to provide better analytics and control. It’s greater than the sum of its parts. We’ll get to that in more detail in the next installment.

Another way to think about it is to define what a modern digital twin needs to include (there’s a simple data-structure sketch after the list):

  1. A three-dimensional model—a traditional model that describes the building’s physical attributes, such as materials and spatial position 
  2. Static data—all of the details that add context, from O&M manuals to model numbers to human profiles 
  3. Streaming data—data from physical building systems and third-party applications like weather feeds 
  4. Computation—traditional analytics, now being supplemented with AI
  5. Human interaction—feedback and a user interface for occupants and operators
  6. A platform for applications—the middleware connecting everything together
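
Before moving on, here is the same list expressed as a simple, entirely illustrative data structure. The field names are mine; no standard digital twin schema is implied.

```python
# Illustrative only: the six ingredients above as one data structure.
from dataclasses import dataclass, field
from typing import Any, Callable, Dict, List

@dataclass
class DigitalTwin:
    geometry: Dict[str, Any]     # 1. three-dimensional model (e.g. geometry exported from BIM)
    static_data: Dict[str, Any]  # 2. O&M manuals, model numbers, human profiles
    streams: Dict[str, Any]      # 3. live data from the BAS, meters, weather feeds
    analytics: List[Callable]    # 4. computation: FDD rules, ML models, simulations
    interfaces: List[str]        # 5. human interaction: dashboards, occupant apps
    applications: Dict[str, Any] = field(default_factory=dict)  # 6. the application platform

    def is_complete(self) -> bool:
        """Per the argument below: drop any one ingredient and it isn't a modern digital twin."""
        return all([self.geometry, self.static_data, self.streams,
                    self.analytics, self.interfaces, self.applications])
```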

Drop any one of these and, in my opinion, you no longer have a modern digital twin. In the next installment, we’ll walk through what digital twins will do for building owners, why they will change the buildings industry, and my questions about the future of digital twins.


If this article resonated with you, I invite you to subscribe to my weekly(ish) newsletter on smart buildings and analytics. It will keep you up to date on the industry and on any new articles I write.


5 thoughts on “Defining the Digital Twin for Buildings”

  1. I believe your #6 item above is the most critical. A building should have a single Digital Twin defined by an industry standard. That eliminates duplicate data collection, tagging, and archiving. All value-added applications would then have exactly the same view of the building.

  2. James, I have a few thoughts/comments.

    We think it’s good to differentiate FDD/Analytics from Optimization. In doing data-driven or monitoring-based RCx, we find 40% of the energy savings come from finding and fixing the faults, and 60% from optimization. Most Analytics tools today are passive – they do not do optimizations or what DOE calls ASO.

    We think of trend dashboards as “human-in-the-loop FDD/analytics”. Good FDD/Analytics doesn’t require an expert to interpret the information. It does the detection and the diagnostics for you.

    1. Terry, thanks! I have analytics differentiated from optimization (control) in the EMIS Stack. Are you saying it should be differentiated in the list of requirements for a digital twin as well? Just want to make sure I’m following because I think you’re right.

  3. Terry & James, I think we must not confuse the key parts of a digital twin system with how it is used or how useful it is as a tool. As you wrote, a modern Digital Twin needs all of these items to advance us to the next evolution, so to speak, in Smart Buildings. Once we define and create those layers, the twin can be applied to uses such as automated system optimization. I’m sure the next installment will go into more detail, but the computation layer includes AFDD, energy model comparisons, simulations of system changes, etc. Optimization is often a result of these computations, whether it happens early to influence design, after the fact (RCx), or live (ASO) to maximize energy performance.
