For many people, the term ‘digital twin’ is an elusive one that carries a certain mystery. As consultants, we see it as our responsibility to help clients understand what a digital twin is and how it fits into their building asset strategy. For many building owners, a digital twin is typically not a priority for ‘today’, but rather a goal on the horizon to work toward. That is where we help our clients maximize their building strategy and identify the foundational pieces needed to reach either a specific use case or a digital model of their entire facility. It's about building towards an integrated future.


Making the case for the ROI

Many building owners and facility managers are still grappling with how to define—and justify—the return on investment for digital twins. Part of the challenge is that the concept is often presented as an aspirational “future state,” making it difficult to connect with current operational realities. As a result, digital twins can feel abstract or overly complex, rather than practical tools delivering measurable value today.

A more effective approach is to start with clearly defined, near-term use cases. For example, optimizing a chilled water system can demonstrate tangible energy and cost savings, sometimes in the range of 20–40%. By focusing on specific systems or combinations of systems, organizations can begin to see real returns and build confidence in the broader potential of digital twin strategies.

However, the true value of a digital twin lies in moving beyond isolated optimization toward a more integrated, holistic view of building performance. Rather than analyzing systems in silos, digital twins bring together multiple data points—such as building systems, occupant behavior, and external environmental conditions—to provide a more comprehensive understanding of how a facility operates. This integrated perspective is particularly important in complex environments like laboratories, where system interactions significantly impact both energy use and operational outcomes.

There is also some confusion in the market about what digital twins actually look like. They are often associated with highly detailed 3D models, which can create the impression that advanced visualization is a prerequisite. In reality, many organizations are not yet at the stage where 3D modeling adds value. The priority should instead be on data integration, actionable insights, and scalable use cases that support operational and decarbonization goals.

By starting small, focusing on high-impact applications, and building toward greater integration over time, organizations can unlock the practical benefits of digital twins while laying the groundwork for more advanced capabilities in the future.


Testing real-world options in a virtual model

For existing facilities, one of the most compelling advantages of a digital twin is the ability to visualize complex building data in a spatial context. By overlaying operational data—such as occupancy patterns and energy use—onto a building model, facility teams can better understand how spaces are actually being used. This can reveal insights that are difficult to capture through traditional data analysis alone, such as movement patterns, underutilized areas, or equipment that is operating more frequently than expected. In environments where performance and efficiency are closely tied to how spaces are used, this level of visibility can inform more targeted and effective interventions.

This capability is particularly valuable in laboratories, where continuous operation limits opportunities for real-world experimentation. Because these facilities must run 24/7 to maintain safety and environmental conditions, it is often not feasible to test new strategies directly in the live environment. Digital twins offer a way to address this challenge by enabling scenario testing in a virtual setting. By creating a digital representation that reflects real-world conditions, teams can explore the impact of potential changes—such as adjusting equipment schedules, modifying ventilation strategies, or introducing new technologies—without disrupting ongoing operations. These simulations allow organizations to evaluate sustainability measures in advance, reducing risk and increasing confidence in implementation.

A helpful way to think about this is as a working copy of the building: a parallel environment where changes can be tested, refined, and validated before being applied in practice. As digital twins continue to evolve, the ability to connect these virtual models with real-time data further enhances their value—providing a feedback loop that helps confirm whether implemented strategies are delivering the intended performance and decarbonization outcomes.


Where are we in the digital twin timeline?

While interest in digital twins continues to grow, industry adoption is still in the early stages. The underlying technologies largely exist, but many organizations are only beginning to explore how to apply them. Rather than viewing digital twins as a single solution to implement, it is more accurate—and more effective—to think in terms of a long-term strategy that evolves over time.

Digital twin development is often described along a maturity curve. At the foundational level, organizations may begin with elements such as 3D models or asset data—capabilities that, in many cases, already exist within design models or advanced building automation systems. The next step is less about adding new tools and more about connecting what is already in place: breaking down silos between systems, aggregating data, and enabling a more integrated view of building performance.


There are also practical barriers that can slow progress. In some cases, systems are not designed to communicate with one another, limiting the ability to integrate data. In others, the necessary data simply is not being captured—for example, due to gaps in metering or monitoring infrastructure. Without access to reliable, relevant data, even well-defined use cases can be difficult to implement.

Over time, these challenges are likely to diminish as technologies mature and industry practices evolve. For now, however, most organizations are focused on building the foundational capabilities that will enable more advanced digital twin applications in the future.

As maturity increases, this integration supports more advanced insights and, ultimately, more automated or predictive operations. However, reaching that level requires a strong foundation. Many organizations today are still at the early stages—developing strategies, identifying priority use cases, and piloting targeted applications.


Don’t forget your roadmap

For organizations starting from a clean slate, one of the most important, and most overlooked, steps is defining clear use cases from the outset. This begins at the highest level, with organizational goals. Whether the priority is operational efficiency, sustainability, or long-term asset performance, these objectives should guide every subsequent decision.

This top-down approach mirrors established BIM practices: start with what the organization is trying to achieve, then determine how assets will be managed, and from there define the data, systems, and equipment required to support those outcomes. When applied to digital twins, this ensures that technology decisions are driven by purpose—not the other way around.

A common misstep is to begin with a specific technology or vendor solution and attempt to build a strategy around it. While these tools may offer value, they are not always aligned with an organization’s specific needs or long-term goals. By contrast, starting with clearly defined outcomes allows organizations to identify the right systems and data requirements to support meaningful use cases.

A structured roadmap can help translate this strategy into action. Rather than approaching digital twins as a large, all-at-once investment, organizations can break the journey into manageable steps.

Taking this phased approach makes implementation more practical and cost-effective, while still aligning with a broader vision. By focusing on achievable milestones, organizations can demonstrate value early, build internal momentum, and progressively expand their digital twin capabilities in a way that supports long-term decarbonization and operational goals.


Identifying early wins

When it comes to achieving early returns, some of the most impactful opportunities are also the most fundamental: measuring performance and making that data usable. Many organizations have set ambitious decarbonization and ESG targets, but without reliable data, it is difficult to track progress or make informed decisions. Establishing the ability to monitor key metrics—such as energy use, system performance, and occupancy—is often the first and most valuable step in a digital twin strategy.

Once this foundation is in place, organizations can begin to identify inefficiencies and adjust operations accordingly. Even at the level of individual systems, improved visibility can lead to meaningful energy and cost savings. However, greater value emerges as data from multiple systems is brought together.

Integrating systems enables more coordinated and responsive building operations. For example, occupancy signals from one system—such as lighting—can be used to inform another, like ventilation, allowing building systems to respond dynamically to actual usage. While this type of integration may be more complex in specialized environments like laboratories, the principle remains the same: connecting data across systems creates opportunities for smarter, more efficient control strategies.
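
The cross-system principle above can be illustrated with a minimal sketch: occupancy inferred from one system (lighting) drives setpoints in another (ventilation). All names, signals, and thresholds here are hypothetical stand-ins for illustration, not a real building automation API.

```python
# Minimal sketch of cross-system integration. Names and thresholds are
# hypothetical illustrations, not a real BMS interface.
def ventilation_setpoint(lighting_on: bool, co2_ppm: float,
                         occupied_cfm: float = 1000.0,
                         standby_cfm: float = 300.0) -> float:
    """Return a target airflow based on occupancy signals from other systems."""
    # Treat either an active lighting zone or elevated CO2 as occupancy
    occupied = lighting_on or co2_ppm > 800
    return occupied_cfm if occupied else standby_cfm

print(ventilation_setpoint(lighting_on=False, co2_ppm=450))  # 300.0 (setback)
print(ventilation_setpoint(lighting_on=True, co2_ppm=450))   # 1000.0 (occupied)
```

The point of the sketch is the design choice, not the numbers: once data from two systems is available in one place, even a trivial rule lets the building respond to actual usage rather than a fixed schedule.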

Ultimately, the progression from measurement to integration supports a more holistic approach to building performance. By starting with accessible data and building toward more connected systems, organizations can unlock practical, high-impact improvements that advance both operational efficiency and sustainability goals.


Reducing risk and cost in existing labs with digital twins

For existing laboratory facilities, the path forward is not about wholesale replacement, but about making better use of what is already in place. Because these facilities are operational, they offer an immediate advantage: access to real performance data. This creates an opportunity to begin measuring how systems, equipment, and spaces are actually being used, and to use those insights to guide targeted improvements.

Digital twin strategies in this context often start with focused use cases. This might include analyzing how occupants move through a space, how frequently specific equipment is used, or how energy consumption varies across different lab environments. These insights can be applied at multiple levels—whether examining individual pieces of equipment or understanding how a lab interacts with the broader building systems.

One of the most valuable early applications in laboratories is predictive maintenance. Given the critical nature of lab operations, equipment failure is not only costly but can also compromise research and safety. By leveraging connected systems and real-time data, digital twins can help identify early warning signs of equipment issues, enabling proactive maintenance and reducing the risk of unexpected downtime.
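
As a toy illustration of the early-warning idea (not any specific vendor’s method), even a simple statistical check can flag a sensor reading that deviates sharply from its recent history:

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=10, z_threshold=3.0):
    """Flag indices where a reading deviates sharply from its trailing
    window -- a toy stand-in for predictive-maintenance early warnings."""
    flagged = []
    for i in range(window, len(readings)):
        trailing = readings[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        # Only flag when the window has variation and the deviation is large
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

# Steady (hypothetical) fan-motor temperatures, then a sudden excursion
temps = [40.1, 40.3, 39.9, 40.2, 40.0, 40.1, 40.4, 39.8, 40.2, 40.0, 47.5]
print(flag_anomalies(temps))  # [10]
```

Production systems would use richer models, but the workflow is the same: continuous data capture, a baseline of normal behavior, and an alert before the failure rather than after it.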

In parallel, digital twins support ongoing tracking of energy, carbon, and resource use, providing a clearer picture of performance over time. They also create a platform for testing potential upgrades—such as new equipment or operational changes—in a virtual environment before implementing them in the real world.

Importantly, these benefits can be realized incrementally. With a clear strategy in place, organizations can align capital planning with long-term digital twin goals. For example, when equipment is due for replacement, specifying systems that are capable of capturing and sharing data ensures that each investment contributes to a more connected, intelligent facility over time.

By taking a phased, data-driven approach, existing lab facilities can steadily improve performance, reduce operating costs, and advance decarbonization—without disrupting critical operations.


Should you start with a pilot project?

Pilot projects can serve as valuable entry points for moving digital twin initiatives forward. In many cases, facility teams are already aware of specific challenges or improvements they want to pursue. Targeted pilot projects—focused on a particular system or use case—allow organizations to demonstrate value quickly, without waiting for a fully developed, enterprise-wide strategy.

That said, pilots are most effective when they are guided by a broader vision. Understanding the long-term direction—what systems need to be connected, what data will be required, and how different use cases may evolve—helps ensure that early investments support future integration. Without this context, there is a risk of implementing solutions that are difficult to scale or connect later.

A thoughtful approach combines both elements: a clear roadmap informed by a comprehensive understanding of existing systems and data, alongside focused pilot projects that deliver early results. This enables organizations to build momentum while maintaining alignment with long-term goals.

Ultimately, digital twins are not a single project, but an evolving capability. By starting with what is available, prioritizing high-impact use cases, and building toward greater integration over time, organizations can create a strong foundation for more efficient, resilient, and lower-carbon laboratory environments.


Akira Jones, P.Eng., LEED AP

Division Director – Digital Services


akira.jones@hhangus.com


Krigh Bachmann, B.Des., BIM AP

Design Technology Leader


krigh.bachmann@hhangus.com

 
Vancouver
 
 

Key takeaways for Vancouver building owners and operators: 

  • Carbon is becoming a recurring operating cost.
    Starting in 2026, exceeding Vancouver’s greenhouse gas intensity limit triggers annual penalties, not one-time fines. By the early 2030s, carbon costs for many large buildings can resemble annual debt service without creating any asset value.
  • This is a fuel-system issue, not a building-quality issue.
    High-performing buildings still fail long-term targets if they rely on gas-fired heating.
  • Carbon payments can fund solutions.
    Predictable carbon costs can often be redirected to service financing for electrification and retrofit projects.
  • Early action preserves control.
    Waiting doesn’t reduce cost – it compresses risk. Owners who act early retain flexibility, access incentives, and sequence work on their terms. Delayed action concentrates capital, construction, and compliance risk into fewer years as 2040 approaches.

For many building owners, carbon regulation still feels distant: a matter to be addressed later, once requirements are clearer or costs feel more immediate. On the surface, today’s carbon-related expenses appear manageable. But that surface view is misleading.

Carbon risk in Vancouver’s building stock behaves like an iceberg. What is visible today is small. What lies beneath the surface is large, fixed, and already embedded in future operating costs.

This is a cash flow and asset planning argument, with carbon as the cost driver. It is about financial exposure that is already sitting on balance sheets, quietly compounding year over year.

Carbon risk is moving from ESG reporting to the balance sheet, and it is backloaded.

Waiting does not reduce the cost of transition. It reduces the number of options available when that cost becomes unavoidable.


Carbon performance is becoming an operating line item

Vancouver’s carbon regulations are clear and increasingly consequential. The City sets a carbon performance cap on large buildings, measured as greenhouse gas emissions per square metre each year. Starting in 2026[1], buildings must operate below that cap. If they exceed it, owners pay a fee of roughly $350 per tonne of CO₂e above the limit ($500 base permit fee, plus $350 per tonne CO₂e above the emissions limit, plus $100 per gigajoule (GJ) above the heat energy limit, where applicable). That fee is not a one-time fine. It is an annual operating cost, repeating every year a building remains non-compliant. And the cap does not stay flat; it tightens over time.

By 2040, the allowable emissions limit for office and retail drops to 0 kg CO₂e per m² per year. In practical terms, that means buildings must operate without carbon emissions from gas or district energy, effectively eliminating fossil gas heating.

Carbon performance is becoming a recurring operating expense—unless the emissions are removed.

For many owners, this shift has not yet fully registered. But it is already built into the operating future of Vancouver’s commercial real estate market.


A typical downtown tower and why it matters

Consider a common Vancouver asset: a 20,000 m2 mixed-use tower with office floors above retail at grade. This is not a worst-case building. It represents a typical, professionally operated downtown property.

Example:

GFA = 20,000 m²; actual GHGi = 32 kg CO₂e per m² per year (tail building case); limit GHGi = 25 kg CO₂e per m² per year.

tonnes over = (actual GHGi − limit GHGi) × GFA ÷ 1,000 = (32 − 25) × 20,000 ÷ 1,000 = 140 tonnes

fee = $500 (base permit fee) + $350 × 140 = $49,500 per year
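
For quick what-if checks, the fee arithmetic can be wrapped in a small helper. The $500 base, $350-per-tonne, and $100-per-GJ figures are the ones quoted in this article; the function itself is only an illustrative sketch.

```python
def annual_carbon_fee(gfa_m2, actual_ghgi, limit_ghgi,
                      base_fee=500.0, rate_per_tonne=350.0,
                      heat_overage_gj=0.0, rate_per_gj=100.0):
    """Estimate the annual fee for exceeding Vancouver's GHGi cap.

    GHGi values are in kg CO2e per m2 per year. Rates are the article's
    quoted figures: $500 base, $350/tonne over the emissions limit,
    $100/GJ over the heat energy limit where applicable.
    """
    tonnes_over = max(actual_ghgi - limit_ghgi, 0.0) * gfa_m2 / 1000.0
    if tonnes_over == 0.0 and heat_overage_gj == 0.0:
        return 0.0  # compliant: no fee
    return base_fee + rate_per_tonne * tonnes_over + rate_per_gj * heat_overage_gj

# The example tower: 20,000 m2 at 32 kg/m2 against a 25 kg/m2 limit
print(annual_carbon_fee(20_000, 32, 25))  # 49500.0
```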

Most buildings of this type rely on centralized mechanical systems, typically gas-fired boilers supplying heating throughout the building. This is standard practice across much of Vancouver’s existing office stock. In these buildings, operational emissions are dominated by gas-fired heating.

Key data point:
Buildings contribute nearly 60 percent of Vancouver’s total emissions. In large commercial buildings, boiler plants are usually the single largest source of on-site emissions.

This is not an efficiency problem. It’s a fuel problem.

That distinction matters. The financial exposure created by carbon regulation does not apply only to inefficient or poorly managed buildings. It applies directly to normal, well-operated assets whose core heating systems inherently produce carbon. This places hundreds of Vancouver buildings—across both institutional and private portfolios—on the same trajectory.


Carbon costs can escalate faster than rent

In the early years, carbon penalties tend to look small. That makes them easy to dismiss. But they repeat every year, and the regulatory caps tighten.

As allowable emissions decline, the same building generates larger overages and larger annual fees. By the early 2030s, projected carbon costs for many large buildings stop looking like nuisance fines and begin to resemble annual debt service.

By the early 2030s, carbon penalties can behave like a debt service payment without creating an asset.

The difference is stark. This “debt service” delivers no efficiency gain, no resilience, and no improvement in building value. It is a recurring liability with no upside. This is the financial inflection point where paying penalties becomes more expensive than fixing the problem.
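
A rough projection makes the escalation concrete. Note the assumptions: the cap here is taken to decline linearly from 25 kg CO₂e/m² to zero by 2040, and the building’s emissions stay flat at 32 kg CO₂e/m²; the City’s actual step-down schedule will differ, so this is illustrative only.

```python
def projected_fee(year, gfa=20_000, actual=32.0, base=500.0, rate=350.0):
    """Illustrative fee for the example tower, ASSUMING the 25 kg CO2e/m2
    cap declines linearly to 0 by 2040 and emissions stay flat."""
    limit = 25.0 * (2040 - year) / (2040 - 2026)   # assumed linear ramp
    tonnes_over = max(actual - limit, 0.0) * gfa / 1000.0
    return base + rate * tonnes_over if tonnes_over > 0 else 0.0

for year in (2026, 2030, 2035, 2040):
    print(year, round(projected_fee(year)))
```

Under these assumptions, the same building’s annual fee roughly quadruples between 2026 and 2040 without the building getting any worse: the cap moves, the fee compounds.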


Two paths: pay carbon or invest in the asset

At that point, owners are effectively choosing between two financial paths. The first is to pay carbon penalties annually—an operating expense that escalates, compounds, and produces no return.

The second is to invest in the building: electrifying systems, reducing emissions, and aligning the asset with long-term regulatory requirements.

Carbon payments are pure leakage. Retrofit capital converts cost into value.

Retrofit investments reduce future penalties, improve energy resilience, and align assets with lender, insurer, and tenant expectations. Electrification also creates optionality: access to incentives, flexibility as energy markets shift, and protection against tighter future limits.

The decision is not whether cost is coming. It is whether that cost is paid forever or invested once into the asset.


Why waiting can look rational (until it isn’t)

In the near term, deferring action can appear financially rational. Early carbon costs are relatively small, and standard financial analysis places less weight on costs that occur further in the future.

On paper, paying penalties for a few years can look cheaper than committing capital today. But that apparent advantage is fragile. It depends on carbon prices staying low, incentives being ignored, energy costs remaining stable, and financing conditions not improving.

Small changes in assumptions can completely erase the financial case for waiting.

As regulatory caps tighten, penalties escalate and retrofit scopes deepen. What once looked like a manageable operating cost begins to behave like a long-term liability. When compounding penalties, rising energy exposure, and the disruption of rushed retrofits are considered together, the “wait and see” approach often ends up costing more over the life of the asset.


The window is narrowing

In the near term, many buildings can manage compliance through optimization, controls tuning, recommissioning, and minor upgrades. For some buildings, this work may be enough to meet early limits, such as the 2026 cap, with relatively low cost and disruption.

But later-stage reductions are much steeper. By the time Vancouver approaches 2040, there is effectively no room left for gas heating. The remaining compliance pathway is dominated by electrification and system replacement, not incremental improvement.

Waiting does not reduce the scope of work. It compresses it.

Fuel switching, electrical upgrades, tenant coordination, incentive access, and contractor availability all still have to happen. When that work is compressed into fewer years, financial risk, construction risk, and operational risk rise together.


Many buildings will miss the 2040 target

Based on City of Vancouver benchmarking data, nearly every existing office tower reports non-zero operational emissions today. Under a 2040 requirement of zero, almost the entire current stock fails, not because buildings are inefficient, but because they still burn gas.

High-performing buildings still fail if they rely on combustion-based heating.

Optimization can reduce emissions, but it cannot eliminate them as long as gas remains in the plant. That is why the 2040 requirement represents a structural transition, not a tuning exercise. This is a system-wide fuel challenge, not a marginal performance issue.


Actions that buy time and flexibility

There are, however, low-risk actions owners can take now. Controls upgrades, recommissioning, HVAC optimization, sensor deployment, and operational tuning typically require modest capital and limited disruption. They reduce emissions and energy costs immediately while improving system visibility.

These are no-regrets moves that lower risk today and preserve options tomorrow.

These measures are bridges, not end-state solutions. Their value lies in buying time to allow deeper system transitions to be planned and executed deliberately.


Carbon costs can service capital

One of the most important mindset shifts for owners is reframing carbon penalties as predictable cash flow. Carbon costs are recurring, measurable, and tied directly to performance. If an owner does nothing, that cash flow goes to the City every year. But that same cash flow can often be redirected to service financing for retrofit and electrification projects.
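
A standard present-value-of-annuity calculation shows what that redirected cash flow could support. The 6% rate and 15-year term below are illustrative assumptions, not quoted financing terms.

```python
def serviceable_principal(annual_payment, rate=0.06, years=15):
    """Present value of an annuity: the loan principal that `annual_payment`
    can fully amortize at `rate` over `years`. Rate and term are
    illustrative assumptions, not quoted financing terms."""
    return annual_payment * (1 - (1 + rate) ** -years) / rate

# The example tower's ~$49,500/yr penalty, redirected for 15 years at 6%
print(round(serviceable_principal(49_500)))
```

Under these assumptions, a penalty stream in the tens of thousands of dollars per year can service several hundred thousand dollars of retrofit capital, and that is before incentives or the savings from avoided future penalties are layered in.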

The building keeps paying but now it’s paying down infrastructure, not penalties.

The question shifts from “How much will this cost us?” to “How do we make unavoidable costs work for the asset instead of against it?”


Incentives that change the math

BC incentives materially improve retrofit economics. Programs such as the Clean Buildings Tax Credit (a refundable 5% tax credit on qualifying retrofit expenditures, subject to program deadlines and certification), utility performance programs, and electrification funding reduce net capital cost, improve returns, and lower early-stage risk. When layered with avoided carbon penalties, these incentives can transform recurring operating costs into funding sources for higher-performing, lower-risk assets.
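
A toy tally shows how the layering works. The 5% refundable credit rate comes from the program described above; the capital cost and the ten-year avoided-penalty stream are hypothetical round numbers, and real analysis would discount future cash flows.

```python
# Hypothetical layering example (undiscounted, for illustration only)
capex = 1_500_000                 # assumed retrofit cost
credit = 0.05 * capex             # refundable 5% Clean Buildings Tax Credit
avoided_penalties = 49_500 * 10   # assumed ten years of avoided carbon fees
net_cost = capex - credit - avoided_penalties
print(net_cost)  # 930000.0
```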


What happens when owners wait

New York City’s Local Law 97 offers a clear preview. NYC sets a per-tonne penalty for exceedance, which is one reason owners felt it as an operating line item. Many owners delayed action. As limits tightened, that delay translated into higher penalties, compressed timelines, and competition for limited engineering and contractor capacity.


Waiting did not avoid cost. It concentrated it.

Vancouver is earlier in the same movie. The regulatory signals are clear, but the market is not yet saturated. Owners still have the ability to sequence work and retain control.


Four Questions Every Owner Should Ask:

  1. What is my building’s carbon exposure?
  2. When does that exposure spike?
  3. What actions buy time and flexibility?
  4. How do I finance control—not penalties?

These are practical questions that can be answered today—and they increasingly define asset resilience in Vancouver’s market.

Carbon risk in Vancouver’s buildings is not speculative. The limits are defined. The timelines are published. The financial mechanisms are already in place. That also means the risk is manageable.


Waiting doesn’t reduce cost. It reduces choice.

Avoiding the iceberg is not about reacting at the last moment. It is about adjusting course early, while there is still room to maneuver. The opportunity today is to move from compliance thinking to strategic asset planning—turning an inevitable transition into a controlled one.

[1] The first reporting for the 2026 data year is due June 1, 2027; operating permits start in 2027 for large office and retail buildings.

If you’d like to explore decarbonization strategies for your building assets, contact our specialists:


Mike Hassaballa, M.A.Sc., P.Eng.

Lead Consultant, Energy Infrastructure, Senior Engineer

mike.hassaballa@hhangus.com


Dayne Perry

Senior Manager, Commercial

dayne.perry@hhangus.com