
Articles & Publications

To explore how digital twins are defined and the overarching concepts around them, the DT Hub hosted a five-part talk series (available here).  
These talks were introduced by Sam Chorlton, chair of the Digital Twin Hub, who highlighted that digital twins are not a new concept but that the underlying technologies have now reached a point where they can have a meaningful impact. With the national digital twin (NDT) programme leveraging these now-mature technologies and principles, the talks explored how they could be utilized within the built environment.  In each case, a video from the speaker was used to spark an online discussion involving a mix of stakeholders and experts from across the value chain. 
This first series of talks included: 
Olivier Thereaux (ODI), Towards a Web of Digital Twins;
Brian Matthews (DAFNI), Meeting the Digital Twin Challenge;
Tanguy Coenen (IMEC), Urban Digital Twins;
Neil Thompson (Atkins), Twinfrastructure; and
Simon Evans (Arup), Digital Roundtable.

Towards a Web of Digital Twins
Beginning the digital twin talk series, Olivier Thereaux from the Open Data Institute (ODI) considered the parallels between the world wide web and the need to connect digital twins to form a national digital twin.  By first citing the Gemini Principles and establishing what a digital twin is, Olivier articulated the rationale for their adoption, explaining the concept of a digital twin as an approach to collecting data to inform decision making within an interactive cycle. 

Olivier provided further detail about the need for a digital twin both to share data and to receive data from external datasets (e.g. weather data) and other related digital twins.  To enable this exchange, he proposed the need for data infrastructure such as standards and common taxonomies.  As these connections develop, Olivier foresees the development of a “network of twins” that regularly exchange data.  By scaling these networks, a national digital twin could be achieved. 
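To make the idea of a “network of twins” a little more concrete, here is a minimal Python sketch of one way a twin might combine its own readings with an external dataset (weather, in this example) and publish a snapshot using an agreed shared vocabulary of field names. The endpoint URL, field names and taxonomy are illustrative assumptions, not part of any published standard.

```python
import urllib.request
import json

# Illustrative only: this URL and these field names are hypothetical placeholders,
# not a real service or an agreed taxonomy.
WEATHER_FEED = "https://example.org/weather/latest.json"   # external dataset
SHARED_FIELDS = {"asset_id", "timestamp", "air_temperature_c", "occupancy"}

def fetch_external_dataset(url: str) -> dict:
    """Pull an external dataset (e.g. weather) that the twin consumes."""
    with urllib.request.urlopen(url) as response:
        return json.load(response)

def publish_twin_snapshot(asset_id: str, readings: dict) -> str:
    """Expose this twin's data using only field names from the shared vocabulary,
    so other twins in the network can interpret it without bespoke mappings."""
    snapshot = {k: v for k, v in readings.items() if k in SHARED_FIELDS}
    snapshot["asset_id"] = asset_id
    return json.dumps(snapshot)

# Example: enrich local sensor readings with an external weather value,
# then publish a snapshot that other twins could subscribe to.
local_readings = {"timestamp": "2020-06-01T09:00:00Z", "occupancy": 42}
weather = {"air_temperature_c": 18.5}  # in practice: fetch_external_dataset(WEATHER_FEED)
local_readings.update(weather)
print(publish_twin_snapshot("bridge-042", local_readings))
```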
Responding to Olivier’s talk, DT Hub members and guests asked a wide range of questions, including on the adherence of technologies to standards, with Olivier confirming the existence of suitable standards and referring to the work done by W3C and others.  In addition, questions were posed around connecting twins that span multiple sectors and the need to ensure trust in data. 
The full Q&A discussion transcript can be found here. 
In addition, Olivier has also kindly produced an article on the topic of his talk, which can be found here. 
Meeting the Digital Twin Challenge 
Following Olivier, Dr. Brian Matthews from DAFNI presented on the DAFNI platform and the challenges related to developing an ecosystem of connected digital twins.  Citing Sir John Armitt and Data for the Public Good, Brian emphasized how data is now considered as important as concrete or steel in regard to UK national infrastructure.  Building on the digital twin definition given by Olivier, Brian proposed two types of digital twin:  
Reactive: a dynamic model with input from live (near real-time) data; and
Predictive: a static model with input from corporate systems.
Linking to the Gemini Principles, Brian acknowledged that a single digital twin is impossible; an ecosystem is required to achieve a national digital twin. Delving deeper, Brian looked at some of the associated technical challenges related to scaling and integration. He also talked about how the DAFNI platform can meet these challenges, by enabling connections between data and models, in support of the NDT programme. 
Responding to Brian’s talk, participants asked whether “historic” could be considered an additional digital twin type, with Brian confirming that historical data is considered within the proposed types.  Much of the discussion focused on data and datasets, including the exchange of data used by DAFNI, with Brian confirming the use of the standardized data catalogue vocabulary DCAT, which DAFNI are planning to publish. There were also questions to contextualize DAFNI within the NDT programme. 
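For readers unfamiliar with DCAT (the W3C Data Catalog Vocabulary mentioned above), the snippet below sketches what a catalogue entry for a dataset might look like, expressed as JSON-LD built from a plain Python dictionary. The dataset title, URLs and distribution details are invented for illustration and do not describe DAFNI’s actual catalogue.

```python
import json

# A minimal, illustrative DCAT dataset description in JSON-LD form.
# The identifiers and URLs below are placeholders, not real DAFNI records.
dataset_entry = {
    "@context": {
        "dcat": "http://www.w3.org/ns/dcat#",
        "dct": "http://purl.org/dc/terms/",
    },
    "@type": "dcat:Dataset",
    "dct:title": "River level sensor readings (example)",
    "dct:description": "Hourly river level observations used by a flood-risk twin.",
    "dct:license": "https://creativecommons.org/licenses/by/4.0/",
    "dcat:keyword": ["flooding", "sensors", "infrastructure"],
    "dcat:distribution": [{
        "@type": "dcat:Distribution",
        "dcat:mediaType": "text/csv",
        "dcat:downloadURL": "https://example.org/data/river-levels.csv",
    }],
}

print(json.dumps(dataset_entry, indent=2))
```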
The full Q&A discussion transcript can be found here. 
Urban Digital Twins 
Following Brian, Tanguy Coenen from IMEC presented on IMEC’s built environment digital twin (BuDI) as well as the idea of a city-scale digital twin.  Explaining BuDI’s role as a decision-making tool informed by near real-time data from sensors and IoT devices, Tanguy articulated how BuDI can support several use cases.  Tanguy also categorized digital twin use case types: 
Yesterday: historical;
Today: real-time; and
Tomorrow: predictive.
Considering current smart cities as a set of silos, Tanguy expressed a desire for interoperability and data connectivity between these disparate datasets to form an urban digital twin that can support both public and private asset collections. 
Responding to Tanguy’s talk, questions were asked about terminology and the relationship to the ISO smart cities initiatives, as well as the importance of standards around open data.  Tanguy confirmed IMEC’s desire to support and align with these efforts.  When asked about high-value use cases, Tanguy referred to people flow, air quality and flooding as key urban-scale use cases. 
The full Q&A discussion transcript can be found here. 
Twinfrastructure 
Continuing the digital twin talk series, Neil Thompson from Atkins introduced the Commons workstream and the Glossary, a key mechanism to enable a common language to support the NDT programme.  Neil described the Commons mission to build capability through an evidence-based approach, and drew several parallels between the Commons and the creation of the internet, including the use of open and agile methodologies. As thinking develops, Neil sees the Commons as the location for discussion and consensus gathering, supporting formal standardization once consensus has been achieved. 
Responding to Neil’s talk, questions were asked about where a similar approach to consensus building had taken place, with Neil referring to examples such as GitHub and Stack Overflow.  Questions were also asked about the Glossary’s relationship to existing resources, with Neil referring to its ability to record whether an entry is a “shared” term. 
The full Q&A discussion transcript can be found here. 
In addition, the Glossary that Neil referred to can be found here. 
Digital Roundtable 
Finally, to conclude the digital twin talk series, Simon Evans from Arup moderated a round table discussion between the previous speakers.  Brian, Tanguy, Neil and Simon provided their reflections and insight and answered questions from the audience. 
The round table dealt with a wide array of topics such as: 
What makes digital twins different for the built environment compared to other sectors? The roundtable agreed that the aspects that constitute a digital twin have long been present in the built environment, but that use of the term demonstrates an evolution of thinking: the need for data connectivity, an outcome focus, and an emphasis on data-driven decision making.
How will the NDT programme address security and interoperability challenges? The roundtable referred to the Information Management Framework Pathway and a future pathway related to security and security-mindedness.
How might a digital twin support social distancing? The roundtable gave examples of using hydrodynamic modelling and occupant monitoring via camera data to monitor and support social distancing policies.
The videos of each of the talks, as well as the round table discussion, can be found here. 
 
And there we have it.  This series of digital twin talks was developed to explore how digital twins are defined and the overarching concepts around them.  Thank you for contributing to the discussions.  Your level of engagement and willingness to share are what have made these talks a success. 
Please let me know what topics you would like future digital twin talks to address, whether you have any suggestions on how to improve these talks, or who you would like to hear a talk from in the future.
 
Read more...
In my first article, I explored the basic concept of digital twins. Fundamentally, they are a digital replica of a physical thing - a ‘twin’. But depending on maturity, this replica can range from a simple representation of a local component, all the way to a fully integrated and highly accurate model of an entire asset, facility or even a country, with each component dynamically linked to engineering, construction, and operational data.
This broad range of what a digital twin can be has made defining and understanding them extremely difficult, with disagreement on what level of maturity or features constitute a ‘true’ digital twin. Inflated market expectations, promising more than is currently achievable, have further complicated things.
In this second article (attachment below), I put forward a maturity spectrum in an attempt to offer more clarity and understanding. Undoubtedly there will be critics, but it has been tested extensively cross-industry and seems to offer a clear framework for simply articulating what a digital twin is at each element of maturity.
I welcome feedback as industry continues working to create a common definition.
 
Read more...
As everyone who works within the built environment sector knows, the essential starting point for any successful construction project is the establishment of a solid foundation. With that in mind, the Digital Twin Hub is thrilled to announce the publication of its first ever digital twin foundation guide: Digital Twins for the Built Environment. 
The Purpose 
The purpose of this guide is not to be exhaustive but to document, at a high level, knowledge and examples of Digital Twin use cases that have been shared through the development of the DT Hub and engagement with our early members. 
It is hoped that, by sharing this knowledge, all members of the DT Hub will benefit from a common understanding of foundational concepts and the ‘How, What and Why’ of Digital Twins, and that this shared knowledge will enable more meaningful discussions within the DT Hub. 
The Structure 
To provide a relatable structure, we have broken down the concepts into the different phases of the asset lifecycle. This should give a greater sense of clarity about how Digital Twins can be applied to support real business problems, illustrated with tangible examples. 
The Role of the Community 
The creation of this guide has demonstrated that there is complexity in distilling foundational concepts. For this publication we have focused on what we hope will benefit the community.  To maximise its value we must therefore develop, refine and iterate this guide in partnership with members. 
We actively encourage the community to provide feedback, both positive and negative. More importantly, we hope that as part of this feedback process the community will suggest potential alterations or amendments to continue increasing the value the document offers. 
DTHUb_NewbieGuide_May2020_(1).pdf
 
 
Read more...
Why This Theme?
DT Hub activities focus on a set of thematic areas (themes) that are based on shared opportunities and challenges for members. These themes are areas where collaboration can help members to gain greater understanding and make more progress towards realising the potential benefits of digital twins. 
This short introductory piece outlines the scope and approach for the third theme “Pathway to value”: 
Why is this theme important? Each of the members we spoke to raised concerns that their development of digital twins may be hindered without a clear ability to demonstrate value to others, including senior management and stakeholders.  While most member organizations have top-level support for digital twins, it is still difficult for them to progress from pilots towards larger-scale investments. 
Sharing examples of the value that is already being generated by digital twins, from other members and more widely, can increase support and accelerate adoption. In addition, there is a desire to consider and share thinking on steps along the roadmap towards greater value at greater scale.  
  
Scope 
This theme will facilitate discussions between members and other stakeholders on: 
Shared or common use cases, outcomes from existing digital twins and opportunities for future collaboration;
Costs and blockers; and
Strategic approaches and roadmaps for digital twins.
The goal is to build on work being done through the NDT programme and generate ideas and recommendations based on real-world experience from members and from the wider market.  This may influence the development of future tools to quantify value, as well as overall thinking on the roadmap towards a federated national digital twin. 
Engaging with this theme can help digital twin users and stakeholders start to address questions like: 
What use cases offer the greatest potential value?
How can I measure the value from digital twins, encompassing economic (profit), social (people) and environmental (planet) benefits?
How will my organisation benefit from the implementation of digital twins?
What are the blockers to realising value and how can we address these?
What are some of the steps on the roadmap towards greater value at greater scale?
What can I learn from other industries that are implementing digital twins at scale? 
Objectives 
The main objectives for this theme are then to: 
Map use cases within a pre-existing framework, and consider measures of value (we have started by mapping to people, planet, profit);
Identify potential blockers and possible approaches to address these;
Assess strategies/roadmaps from members and the wider market; and
Generate insights for members and feed learnings back to the wider NDT programme, including potential needs for any tools or frameworks. 
Get involved 
You can already start to get involved, including by:
Commenting on the posts in the dedicated space for this theme; and
Starting your own topic where you have ideas to share.
We want this theme to be driven by members’ views and priorities, so it would also be great if you would comment on this post, including on:
Where you are seeing initiatives that could help articulate the digital twin value proposition;
Use cases and in-house examples that might help inform this work;
Specific value pathway activities you may be working on related to use cases or value models; and
Any views that you have on what digital twin value pathways look like.
(DT Hub facilitation team)
 
Read more...
As the National Digital Twin (NDT) programme develops its thinking around the Commons, several resources to support the implementation of digital twins within the built environment will be developed.  The first of these, the Glossary, is readily available for members to engage with.  Further resources will likely include ontologies, schemas and other key data infrastructure elements required to enable the NDT. 
To ensure that these resources are fit for purpose, they need to align with the needs of DT Hub members by supporting their use cases.  As such, this article uses the output of the Theme 3 webinar to explore and begin to identify horizontal, high-value use cases for prioritization. 
The outcome of this work will be a community-driven document (draft under development here) to inform the Commons on which use cases should be considered a priority when developing resources. 

During the Theme 3 webinar, a total of 28 use cases were identified by members. 
Open Sharing of Data 
Data-sharing Hub 
Health and Safety 
Social Distancing 
Customer Satisfaction 
Behavioural Change 
National Security 
Traffic Management 
Incident Management 
Efficiency Monitoring 
Condition Monitoring 
Scenario Simulations 
Rapid Prototyping 
Asset Optimization 
Investment Optimization 
Preventative Maintenance 
Carbon Management 
Service Recovery 
Decision Support 
National Efficiency 
‘Live’ in-use Information 
Logistic / Transit Tracing 
Natural Environment Registration 
Pollution Monitoring 
Air Quality Monitoring 
 
Resilience Planning 
Resource Optimization 
Service Electrification 
 
This initial schedule demonstrates the breadth of value that a digital twin can facilitate.  However, this list can be refined, as some of these use cases: 
Overlap and can be consolidated through the use of more careful terminology.  For example, both Pollution Monitoring and Air Quality Monitoring were identified.  However, it is likely that the system, the sequence of actions, and any associated key performance indicators will be shared between these use cases.  They could therefore be consolidated under a single use case: Environmental Monitoring (a simple consolidation sketch is shown below).
May be specific to some members or some sectors.  For example, Customer Satisfaction Monitoring is a vital use case for DT Hub members who directly engage with a user base within a supplier market (for example, utility companies and universities).  However, many organizations manage assets and systems whose actors do not include a customer (for example, flood defence systems and natural assets).  Likewise, Service Electrification is a use case that is only applicable to assets and systems which currently rely on fossil fuels (for example, roads and railways).  As such, while Customer Satisfaction Monitoring and Service Electrification are vital use cases which must remain within the scope of the overall programme, they may not be appropriate for prioritization.
Are aspects, as opposed to stand-alone use cases.  For example, ‘Live’ In-use Information may be a requirement of several use cases, such as Traffic Management and National Security, but does not in itself constitute a sequence of actions within a system.
By identifying the use cases that are most common to DT Hub members, as well as eliminating duplicates, it is hoped that a refined schedule can be produced, limited to high-value, horizontal use cases.  Such a schedule will be valuable to: 
The NDT programme, to understand which use cases the IMF Pathway will need to support;
Asset owner/operators, to identify and articulate the value case for implementing digital twins; and
Suppliers, to demonstrate the validity of their software in realizing these values.
Furthermore, once a streamlined schedule has been developed, further research can be undertaken to identify the typical key performance indicators used to measure and monitor systems that support these use cases. 
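As a simple illustration of the consolidation step described above, the sketch below groups raw use cases under broader, horizontal headings using a plain mapping. The groupings shown are examples to prompt discussion, not an agreed refinement of the schedule.

```python
from collections import defaultdict

# Illustrative groupings only; the community-driven document will decide the real ones.
CONSOLIDATION = {
    "Pollution Monitoring": "Environmental Monitoring",
    "Air Quality Monitoring": "Environmental Monitoring",
    "Traffic Management": "Network Operations",
    "Incident Management": "Network Operations",
    "Preventative Maintenance": "Asset Performance",
    "Condition Monitoring": "Asset Performance",
}

def consolidate(use_cases):
    """Group raw use cases under broader horizontal headings."""
    grouped = defaultdict(list)
    for uc in use_cases:
        grouped[CONSOLIDATION.get(uc, uc)].append(uc)
    return dict(grouped)

raw = ["Pollution Monitoring", "Air Quality Monitoring", "Condition Monitoring",
       "Customer Satisfaction"]
print(consolidate(raw))
# Pollution and air quality fall under one heading; customer satisfaction stays as-is.
```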
 
And there we have it: useful use cases.  Of the 28 use cases identified, which do you think are the most horizontal? Which do you think are high-value (priority) use cases? Which do you think could be aggregated together? 
Please feel free to add your thoughts below, or, alternatively, comment directly on the draft community-driven document which will be progressively developed as member views are shared.  Feel free to comment on the content included and suggest how to refine the schedule. 

Read more...

Blogs

Breaking Barriers: Skills

During our research activities within the DT Hub, several barriers relating to the use of digital twins were identified.  This blog post, which looks at digital skills, reflects on skills as a barrier and considers related issues so that we can discuss how they may be addressed.

As organizations develop a wide array of datasets and supporting tools, a key concern has been the capability of the people using these resources to make decisions and take action.  To do so, these people need to be sufficiently skilled.
Industry reports, such as the Farmer Review, have consistently identified skills shortages as a key issue within the built environment.  The figure below, produced by the Construction Products Association (CPA), shows the proportion of firms that have had difficulty recruiting traditional trades.  For example, in the first quarter of 2017, over 60% of firms had difficulty recruiting bricklayers.
A cause of this shortage is the lack of training being provided by organizations within the built environment.  As shown in the figure below from the Farmer Review, workforces within the built environment are some of the least trained.  While an obvious solution may be simply to provide more training, the issue is compounded by the fact that we need to inject a new set of skills into the sector, increasing the amount of training required.

In 2018, the World Economic Forum produced its Future of Jobs Report.  It considered which skills are emerging and which are declining as a result of digital transformation, automation and the fourth industrial revolution.
These are highlighted in the table below.

Considering the results provided, the need for manual skills, as well as installation and maintenance skills, is declining rapidly.  As such, there is a risk that any immediate training to fill our skills gap may not be suitable for future employment needs.  As initiatives such as the Construction Innovation Hub and Active Building Centre consider Design for Manufacture and Assembly (DfMA) and other modern methods, perhaps the focus should be on which skills are needed for the future.
Digital twins, as representations of physical assets, processes or systems, will need to interface with the built environment professionals of the future.  The question, however, is in what capacity?  Let’s consider a scenario:
Cardiff University has a digital twin of its campus.  Within this twin, it has included sensors to record the usage and occupancy of lecture halls in order to assess space optimization.
For an estate manager to be able to use this twin, they may benefit from:
Software skills, to interface with the incoming data.  This software may not be part of their core asset management system, requiring additional knowledge and skills to use.
Analytical thinking, to allow them to test scenarios.  For example, to test what would happen to usage if a single building was changed to private rent for external customers, improving the university’s income generation.
Creative thinking, to allow them to consider new ideas.  For example, to use the timetable to place lectures that straddle lunch across campus, increasing foot traffic past the university lunch hall.
Intuitive thinking, to allow them to question outputs.  For example, to identify when a faulty sensor may have led to data discrepancies, or when an analysis programme has recommended impractical solutions due to its correlative nature, such as starting lectures at 6am to free up more rooms for private rent.
Ultimately, the reason for adopting digital twins will be to provide value for an organization and its wider ecosystem.  As such, problem-solving skills, critical thinking, systems analysis and analytical thinking will likely become core competencies.  For organizations with critical long-term planning requirements, future employees need to be taught these skills now so that they are appropriately competent for the future.
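To ground the scenario above, here is a minimal sketch of the kind of analysis an estate manager might run against occupancy data from such a twin. The readings, room names and 40% utilisation threshold are invented for illustration.

```python
# Hypothetical occupancy readings: (lecture_hall, hour, occupancy, capacity)
readings = [
    ("Hall A", 9, 25, 200), ("Hall A", 11, 180, 200),
    ("Hall B", 9, 15, 120), ("Hall B", 11, 20, 120),
]

def utilisation_by_hall(rows):
    """Average occupancy as a share of capacity, per lecture hall."""
    totals = {}
    for hall, _hour, occupancy, capacity in rows:
        used, cap, n = totals.get(hall, (0, 0, 0))
        totals[hall] = (used + occupancy, cap + capacity, n + 1)
    return {hall: used / cap for hall, (used, cap, _n) in totals.items()}

# "What if" question: which halls fall below a 40% utilisation threshold and
# could be candidates for private rent or timetable changes?
for hall, rate in utilisation_by_hall(readings).items():
    flag = "under-used" if rate < 0.40 else "in demand"
    print(f"{hall}: {rate:.0%} utilised ({flag})")
```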
 
And there we have it, breaking the barriers related to skills.  How relevant do you think the WEF top 10 growing skills will be for future consumers of digital twin content?  What skills do you consider to be core to future digital twin users?
 

Read more...
A lot of the early thinking on digital twins has been led by manufacturers. So, what do digital twins mean to them and what insights could this provide for the built environment?
This blog is the second in a series that looks at what we can learn from the development of digital twins in other sectors. It draws on key findings from a report by the High Value Manufacturing Catapult. This includes industry perspectives on:
The definition of digital twins;
Key components of digital twins; and
Types of twin and related high-level applications and value.
The report “Feasibility of an immersive digital twin: The definition of a digital twin and discussions around the benefit of immersion” looks partly at the potential for the use of immersive environments. But, in the main, it asks a range of questions about digital twins that should be of interest to this community. The findings in the report were based on an industry workshop and an online survey with around 150 respondents.
We’ve already seen that there are many views on what does or does not constitute a digital twin. Several options were given in the survey, and the most popular definition, resonating with 90% of respondents was:
A virtual replica of the physical asset which can be used to monitor and evaluate its performance
When it comes to key components of digital twins, the report suggests that these should include:
A model of the physical object or system, which provides context;
Connectivity between digital and physical assets, which transmits data in at least one direction; and
The ability to monitor the physical system in real time.
By contrast, in the built environment, digital twins may not always need to be “real-time”. However, looking at the overall document, the position appears to be more nuanced and dependent on the type of application. In which case, “real-time” could be interpreted as “right-time” or “timely”.
In addition, analytics, control and simulation are seen as optional or value-added components. Interestingly, 3D representations are seen by many as “nice to have” – though this will vary according to the type of application.
In a similar fashion to some of our discussions with DT Hub members, the report looks at several types of digital twin (it is difficult to think of all twins as being the same!). The types relate to the level of interactivity, control and prediction:
Supervisory or observational twins, that have a monitoring role, receiving and analysing data but that may not have direct feedback to the physical asset or system;
Interactive digital twins, that provide a degree of control over the physical things themselves; and
Predictive digital twins, that use simulations along with data from the physical objects or systems, as well as wider contextual data, to predict performance and optimise operations (e.g. to increase output from a wind farm by optimising the pitch of the blades).
These types of twin are presented as representing increasing levels of richness or complexity: interactive twins include all the elements of supervisory twins, and predictive twins incorporate the capabilities of all three types.
Not surprisingly, the range of feasible applications relates to the type of twin. Supervisory twins can be used to monitor processes and inform non-automated decisions. Interactive twins enable control, which can be remote from the shop floor or facility. Predictive twins, meanwhile, support predictive maintenance approaches and can help reduce downtime and improve productivity. More sophisticated twins – potentially combining data across twins – can provide insight into rapid introduction (and, I could imagine, customisation) of products or supply chains.
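As a minimal sketch of how the three types described in the report build on one another, the Python snippet below layers each level of capability on the previous one, using the report's wind farm blade-pitch illustration. The class and method names, and the toy pitch rule, are my own simplifications rather than anything from the report itself.

```python
class SupervisoryTwin:
    """Monitoring role: receives and analyses data, no feedback to the asset."""
    def __init__(self, model):
        self.model = model          # context: a model of the physical system
        self.latest = {}            # connectivity: data flowing in (one direction)

    def ingest(self, sensor_data: dict):
        self.latest.update(sensor_data)

    def report(self) -> dict:
        return dict(self.latest)

class InteractiveTwin(SupervisoryTwin):
    """Adds a degree of control over the physical thing itself."""
    def send_command(self, command: str):
        print(f"sending '{command}' to physical asset")

class PredictiveTwin(InteractiveTwin):
    """Adds simulation and prediction to optimise operations."""
    def optimise_blade_pitch(self, wind_speed: float) -> float:
        # Toy rule standing in for a physics-based simulation.
        pitch = max(0.0, 25.0 - wind_speed)
        self.send_command(f"set blade pitch to {pitch:.1f} degrees")
        return pitch

twin = PredictiveTwin(model="wind-farm layout")
twin.ingest({"wind_speed": 12.0})
twin.optimise_blade_pitch(twin.report()["wind_speed"])
```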
Another way of looking at this is to think about which existing processes or business systems could be replaced or complemented by digital twins. This has also come up in some of our discussions with DT Hub members and other built environment stakeholders – in the sense that investments in digital twins should either improve a specific business process/system or mean that it is no longer needed (otherwise DT investments could just mean extra costs). From the survey:
Over 80% of respondents felt that digital twins could complement or replace systems for monitoring or prediction (either simple models or discrete event simulation);
Around two-thirds felt the same for aspects related to analysis and control (trend analysis, remote interaction and prescriptive maintenance), with over half seeing a similar opportunity for next-generation product design; and
Remote monitoring and quality were seen as the areas with the greatest potential value.
Cost reduction in operations and New Product Development (NPD) also feature as areas of value generation, as well as costs related to warranty and servicing. The latter reflects increasing servitisation in manufacturing. This could also become more important in the built environment, with growing interest in gain-share type arrangements through asset lifecycles as well as increasing use of components that have been manufactured off-site.
It would be great if you would like to share your views on any of the points raised above. For example, do you think built environment twins need the same or different components to those described above? And can digital twins for applications like remote monitoring and quality management also deliver significant value in the built environment?

Read more...
Digital technologies are no longer considered tools to satisfy a need for efficiency, they are active agents in value creation and new value propositions [1].
The term “digital twin” has entered the regular vocabulary across a myriad of sectors.  It’s consistently used as an example of industry revolution and is considered fundamental to transformation, but the broad scope of the concept makes a common definition difficult. Yet it’s only once we understand and demystify the idea - and can see a path to making it reality - that we will start to realise the benefits.
Heavy promotion by technology and service providers has inflated expectations, with most focusing on what a digital twin can potentially achieve when fully implemented – which is like buying a unicorn, even if it were not currently cost-prohibitive. Few refer to the milestones along the journey, or to incremental, value-proving developments. This is evidenced, in part, by the fact that only 5% of enterprises have started implementing digital twins, and less than 1% of assets have one [2].
Over the course of three blogs, I will attempt to demystify the concept and break through the platitudes, answering the fundamental questions: What is a digital twin? What type of new skills and capabilities are required? Will a digital twin generate value? And will it support better decision making?
 
“Digital” in context
Digital twins are symptomatic of the broader trend toward digitalisation, which is having a profound effect on businesses and society. Widely cited as the “fourth industrial revolution” [3] or Industry 4.0 (broadly following: steam power (c1760-c1840), electricity (c1870-c1914) and microchips (c1970)), it’s characterized by a fusion of technologies that blur the lines between the physical, digital, and biological spheres – such as artificial intelligence, robotics, autonomous vehicles and Internet of Things (IoT).
Though the exact dates of the earlier revolutions are disputed, their timeframes were slower than the rapid pace and scale of today’s disruption, and still they saw companies and individuals that were slow or reluctant to embrace change being left behind.
The digital revolution is unique, and derives in part from a new ability to massively improve quality and productivity by converging technologies and sources of data within a collaborative framework, which inherently challenges the business and organisational models of the past. Not only this, but the online connection of all assets together (the Internet of Things) is the key enabler of the next phase of industrial development.
The complexity of assets, and cost of developing and operating them makes any promise of efficiency gains and improved performance immensely attractive. However, the reality of digital transformation to offer these rewards has too often fallen short. The failure comes from a rush to introduce digital technologies, products, and services without understanding the work processes in which they will be used, or the associated behaviours and joined up thinking required to make them effective.
While individual products and services have their place, significant gains in efficiency and productivity will only come by weaving a constellation of technologies together and connecting them with data sources, followed by supporting management and application of that data through project, asset and organisational developments.
Is data the “new oil” or the “new asbestos”? And how can industry start tangibly benefiting from the digital twin concept?
With data apparently the “new oil”, or maybe the “new asbestos”, and against a backdrop of digital transformation being viewed by many sceptics as a fashionable buzzword, how can industry start tangibly executing and harnessing the benefits of the digital twin concept?
 
Digital twin basics
Fundamentally, a digital twin is just a digital representation (model) of a physical thing - its ‘twin’; and therein lies the complexity of this industry agnostic concept. Other commonly used terms, such as Building Information Modelling (BIM), Building Lifecycle Management (BLM) and Product Lifecycle Management (PLM) represent similar concepts with some important distinctions, that are all part of the same theme of data generation and information management.
The term “digital twin” first appeared in 2010, developing from the conceptual evolution of PLM in 2002 [4]. Since then, its meaning has evolved from simply defining a PLM tool into an integral digital business decision assistant and an agent for new value and service creation [5]. Over time many have attempted to define the digital twin, but often these definitions focus on just a small part of the asset lifecycle, such as operations.  
“A digital twin can range from a simple 2D or 3D model with a basic level of detail, to a fully integrated model of an entire facility with each component dynamically linked to engineering, construction, and operational data”
A digital twin can range from a simple 2D or 3D model of a local component, with a basic level of detail, all the way to a fully integrated and highly accurate model of an asset, an entire facility, or even a country [6], with each component dynamically linked to engineering, construction, and operational data.
There is no single solution or platform used to provide a digital twin, just as there isn’t one CAD package used to create a drawing or 3D model. It’s a process and methodology, not a technology; a concept of leveraging experience-based wisdom by managing and manipulating a multitude of datasets.
While a fully developed digital model of a facility remains an objective, practically speaking, we are delivering only the “low hanging fruit” pieces of this concept for most facilities now.  These fractional elements, however, all point towards a common goal:  to contribute a value-added piece that is consistent with the overall concept of the digital twin.  As technology and techniques improve, we predict the convergence of the individual parts and the emergence of much more complete digital twins for industrial scale facilities, and ultimately entire countries.
“There is no single solution or platform used to provide a digital twin, just as there isn’t one CAD package used to create a drawing or 3D model”
The ultimate aim is to create a “single version of truth” for an asset, where all data can be accessed and viewed throughout the design-build-operate lifecycle. This is distinctly different to a “single source of truth”, as a digital twin is about using a constellation, or ecosystem, of technologies that work and connect.
The digital twin promises more effective asset design, project execution, and facility operations by dynamically integrating data and information throughout the asset lifecycle to achieve short and long-term efficiency and productivity gains.
As such, there is an intrinsic link between the digital twin and all the ‘technologies’ of the fourth industrial revolution, principally IoT, artificial intelligence and machine learning. As sensors further connect our physical world together, monitoring the state and condition, the digital twin can be considered the point of convergence of the internet-era technologies, and has been made possible by their maturity. For example, the reducing costs of storage, sensors and data capture, and the abundance of processing power and connectivity.
The digital twin is a data resource that can improve the design of a new facility, aid understanding of the condition of an existing asset, verify the as-built situation, run ‘what if’ simulations and scenarios, or provide a digital snapshot for future works. This vastly reduces the potential for errors and discontinuity present in more traditional methods of information management.
As asset owners pivot away from document silos and toward dynamic and integrated data systems, the digital twin should become an embedded part of the enterprise.  Like the financial or HR systems that we expect to be dynamic and accurate, the digital twin should represent a living as-built representation of the operating asset, standing ready at all times to deliver value to the business.
Each digital twin fits into the organisation’s overall digital ecosystem like a jigsaw, alongside potentially many other digital twins for different assets or systems. These can be ‘federated’ or connected via securely shared data - making interoperability and data governance key. In simple terms, this overall digital ecosystem consists of all the organisational and operational systems, providing a so-called ‘digital thread’.
Author: Simon Evans. Digital Energy Leader, Arup. Delivery Team Lead, National Digital Twin Programme
[1] Herterich, M. M., Eck, A., and Uebernickel, F. (2016). Exploring how digitized products enable industrial service innovation. 24th European Conference on Information Systems; 1–17.
[2] Gartner, Hype Cycle for O&G
[3] https://www.weforum.org/agenda/2016/01/digital-disruption-has-only-just-begun/
[4] Digital Twin: Manufacturing Excellence through Virtual Factory Replication. White Paper, pages 1 – 7
[5] Service business model innovation: the digital twin technology
[6] Centre of Digital Build Britain, The Gemini Principles
 
 
 
 
Read more...
Why this theme?
DT Hub activities focus on a set of thematic areas (themes) that are based on shared opportunities and challenges for members. These themes are areas where collaboration can help members to gain greater understanding and make more progress towards realising the potential benefits of digital twins.
This short introductory piece outlines the scope and approach for the second theme: Digital twin competencies:
To help organizations understand the competencies, skills and cultural considerations that can help them to successfully adopt digital twins, as well as fostering collaboration and making recommendations on the best way forward
Why did we identify this theme as a priority? Each of the members we spoke to raised concerns about digital competencies within the built environment.  To maximize the value that a digital twin can bring to an organization, the different actors who might undertake digital twin related activities must have the necessary knowledge, skill, and authority to do so.
In other words, without a sufficiently competent set of individuals to realize the benefits, digital twins will be under-utilized.  In recognition of the built environment’s challenges with training and upskilling its workforce as well as the concern raised by DT Hub members, this has become a prioritized theme.
Scope
This theme will build on work being done through the National Digital Twin (NDT) programme Enablers stream and will develop ideas and recommendations based on real-world experience from members and from the wider market.  In addition, this theme will also help to test some of the underlying principles of the NDT programme which may influence future digital competency frameworks.
Engaging with this theme can help digital twin owners start to address questions like:
Who within my organization should interact with our digital twins?
How will they interact with digital twins?
What knowledge and skills are needed to undertake these activities?
What skills gaps do I have, and what gaps are there in the built environment overall?
When looking to hire new staff who will interact with our digital twins, what core competencies should I be looking for?
What cultural orientation is helpful to successfully implement digital twins – and what can I learn from others?
Related to the first bullet above, work has been started by some members to identify the types of actors that may interface with their digital twins. Building on this work, we plan to discuss and agree a schedule of “personas” that broadly represent a suite of roles, and then build a profile of the competencies against each persona.
Objectives
The main objectives for this theme are then to:
Identify a schedule of personas that cover relevant roles for a notional organization, ensuring sufficient flexibility and scalability;
Understand (from examples) what digital twin related activities each persona is expected to undertake;
Use an industry-recognised system such as the European Qualifications Framework (EQF) to map knowledge, skill, and autonomy requirements to each of the relevant activities;
Feed learnings back to inform the development of the NDT programme Enablers work; and
Generate insight to potentially develop a digital twin competency framework.
NOTE: It is acknowledged that organizations such as CITB are working on digital competency frameworks, and it is hoped that engagement with such activities is done via the Enablers stream. 
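As a rough illustration of the persona-to-competency mapping described in the objectives above, the sketch below pairs example personas with activities and indicative EQF-style levels for knowledge, skill and autonomy. The personas, activities and levels are placeholders showing the shape of the mapping, not agreed content.

```python
# Placeholder personas and levels, purely to illustrate the structure of the mapping.
persona_competencies = {
    "Field Engineer": {
        "activities": ["read twin dashboards", "log asset condition updates"],
        "eqf": {"knowledge": 4, "skill": 4, "autonomy": 3},
    },
    "Asset Strategist": {
        "activities": ["run what-if scenarios", "interpret analytics outputs"],
        "eqf": {"knowledge": 6, "skill": 6, "autonomy": 6},
    },
}

def competency_gap(persona: str, current_level: int) -> int:
    """Gap between a persona's indicative EQF skill level and someone's current level."""
    required = persona_competencies[persona]["eqf"]["skill"]
    return max(0, required - current_level)

print(competency_gap("Asset Strategist", current_level=4))  # -> 2 levels to develop
```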
We’re already starting on the first set of activities for this theme and we are creating some content for you to dive into including:
A webinar to discuss the theme requirements, including a broad discussion around personas and competencies;
Topics and posts to kick off conversation within the hub (for example, a piece on competencies related to anonymizing data), as well as links to interesting external sources based on this research. We are adding these to a dedicated forum for theme 2; and
Research into interesting examples from other industries of approaches that consider digital competencies (coming soon – we will also add interesting links to the theme 2 forum).

What next?
There are lots of opportunities for you to get involved, including in activities to flesh out this theme. We want this to be driven by members’ views and priorities, so it would be great if you would comment on this post, including to tell us:
Where you are seeing initiatives that could benefit skills development in the digital twin area;
Use cases and in-house examples that might help inform this work;
Specific competency activities you may be working on related to building knowledge and skills; and
Any views that you have on what digital twin competencies look like.
 
 
 
Read more...
This blog was first produced following discussions with digital twin owners about the importance of learning more from other industries. It also relates to the first “theme” that we identified as a priority for the DT Hub, which looks at digital twin definitions and concepts. We hope you enjoy reading this piece and welcome your comments as well as your thoughts on other topics where you would like to hear more from us.
The idea of digital twins in space may seem like science fiction – or at least a long way removed from the day-to-day challenges of the built environment. But, in fact, the aerospace industry has been at the forefront of many of the technology innovations that have transformed other areas. Before Michael Grieves coined the term digital twin in 2002, NASA was using pairing technology to operate and repair remote systems in space.
Digital twins, in the aerospace sector, have since gone way beyond simulations. This is driven by a need to accurately reflect the actual condition of space craft and equipment and predict potential future issues. While the crew of Apollo 13 may have relied on a physical double as well as digital data, future space stations and trips beyond our atmosphere will be using digital twins to deliver the right kinds of insights, decision support and automation needed to achieve their missions.
Despite the great distances and the technological advancement of space technologies there are valuable parallels with industries back on earth. For example, digital twins of remote and autonomous vehicles (like the Mars Exploration Rover) could provide useful lessons for similar vehicles on earth, from robots in nuclear facilities and sub-sea environments, through to delivery vehicles in a logistics centre or drones on a building site.
More specifically, a 2012 paper co-authored by NASA provided several insights into the approach to digital twins in aerospace,  including the following definition:
A Digital Twin is an integrated multiphysics, multiscale, probabilistic simulation of an as-built vehicle or system that uses the best available physical models, sensor updates, fleet history, etc., to mirror the life of its corresponding flying twin
Digital twins could represent a significant shift away from a heuristic (i.e. past-experience based) approach to one using sophisticated modelling combined with real-life data. This shift impacts design and build, certification and ongoing operation. The drivers behind this change include a need to withstand more extreme conditions, increased loads and extended service life. (Imagine a manned trip to Mars, or one of the new commercial space ventures that call for vehicles to be used again and again).
The paper also looked at some of the needs and priority areas for digital twins, including:
more accurate prediction of potential materials failures; as well as
the condition of other systems in space vehicles, by connecting multiple models with data from the physical twin.
If digital twins can add value in the harshest environment imaginable, what applications could this have for the built environment? One example is the interesting parallel between assessing the risks of cracks and failures in long-life space vehicles and the long-term structural monitoring of bridges and other infrastructure. The required level of fidelity (i.e. the level of detail and accuracy), as well as the extent to which real-time data is needed, may vary considerably – but many of the same principles could apply. 
More widely, the authors of the paper felt that the parallels and benefits from developing digital twins for aerospace could extend across manufacturing, infrastructure and nanotechnology.
The ideas explored in the paper also go well beyond monitoring and towards automation. For complex space missions, vehicles may not be able to get external help and will need to be self-aware, with “real-time management of complex materials, structures and systems”. As the authors put it:
“If various best-physics (i.e., the most accurate, physically realistic and robust) models can be integrated with one another and with on-board sensor suites, they will form a basis for certification of vehicles by simulation and for real-time, continuous, health management of those vehicles during their missions. They will form the foundation of a Digital Twin.”
Such a digital twin could continuously forecast the health of vehicles and systems, predict system responses and mitigate damage by activating self-healing mechanisms or recommend in-flight changes to the mission profile.
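The sketch below is a toy illustration of that idea: a simple degradation model is repeatedly corrected with sensor updates, and the twin forecasts remaining life and recommends a mitigation when the forecast drops below a threshold. The crack-growth rule, observation values and 2,000-cycle threshold are invented for illustration; a real aerospace twin would integrate far richer physics.

```python
def forecast_remaining_cycles(crack_length_mm: float, growth_per_cycle_mm: float,
                              critical_length_mm: float = 5.0) -> float:
    """Toy stand-in for a physics model: cycles until a crack reaches a critical length."""
    if crack_length_mm >= critical_length_mm:
        return 0.0
    return (critical_length_mm - crack_length_mm) / growth_per_cycle_mm

# Illustrative sensor updates from the physical twin: (cycle, measured crack length in mm).
observations = [(0, 1.0), (500, 1.6), (1000, 2.5)]

crack, last_cycle = observations[0][1], observations[0][0]
for cycle, measured in observations[1:]:
    growth_rate = (measured - crack) / (cycle - last_cycle)   # correct the model with data
    crack, last_cycle = measured, cycle
    remaining = forecast_remaining_cycles(crack, growth_rate)
    if remaining < 2000:
        print(f"cycle {cycle}: ~{remaining:.0f} cycles left - recommend mission change")
    else:
        print(f"cycle {cycle}: ~{remaining:.0f} cycles left - continue monitoring")
```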
While the context may be very different, our discussions with DT Hub  members and others in the market suggest that built environment infrastructure owners and operators are aiming to achieve many of the same aspirations as NASA – from better prediction of potential issues through to actuation and self-healing.
Which space twin applications and ideas do you think we could apply to the built environment?
We would welcome your comments on this piece as well as your thoughts on other topics where you would like to hear more from us.
 
 
 

Read more...
Priority themes
The DT Hub is the place for early adopters to share good practice and learn from each other based on real-world experience. DT Hub activities focus on facilitating collaboration between members, to gain greater understanding and catalyse progress towards realising the potential benefits of digital twins. Discussions are grouped into a set of thematic areas (themes), the first three of which are:
Testing digital twin concepts: What are digital twins, what are the main building blocks for twins and how can we develop common understanding?
Digital twin competencies: What capabilities, skills and culture are needed to successfully implement twins?
Pathway to value: Building and sharing value cases; what is the roadmap to increased scale and greater value?
The idea is to combine good thinking from members’ strategies, use cases and projects, as well as learning from the wider market. This will generate guidance and recommendations to feed into the wider National Digital Twin (NDT) Programme.
Why start here?
The selection of the initial three themes reflects a desire to address “foundational” concepts and thinking on digital twins. That is, to take on some of the shared challenges to build common understanding and unblock or accelerate opportunities. The work on these themes can then help to tackle the:
Need for clarity on whether something can really be classed as a twin, and understanding of the key building blocks of a twin, to evaluate vendors’ claims and to enable greater consistency for future development of twins;
Desire to better understand the skills and cultural orientation needed as twins are rolled out operationally (e.g. what does a field engineer want to do with a twin, versus someone from an innovation team); and
Requirement to build fuller value cases (which include societal, environmental and financial benefits) as twins move from initial pilots towards wide-scale implementation.
The initial work on themes will also provide a foundation to address other key priorities based on the Gemini Principles (which set out proposed principles to guide the national digital twin and the information management framework that will enable it), such as data quality and security. The better we understand the scope and key building blocks of digital twins (through activities such as the discussions relating to theme 1), the easier it will be to consider which digital twin use cases will be most valuable, which will also help inform security-related and other requirements.
The three themes also relate to each other. Understanding more about the definitions, concepts and buildings blocks for digital twins (theme 1) will inform the skills, capabilities and organizational culture that may be required to successfully implement twins (theme 2). Equally, these themes will be vital to help each organization to shape clear strategies and roadmaps and to map out a pathway to value at scale (theme 3).
Focus on projects and use cases
A major focus of this community is to “learn by doing and progress through sharing”. Each theme will draw directly on the work that members are doing and planning related to digital twins. Members are invited to share insights from their work on digital twin strategies and applications. In particular, the more that you can share on your projects in the “DT (Digital Twin) Register” section of the portal the more we can all learn from each other.
The goal is also to share thinking on common use cases, and leverage these to test evolving thinking for each theme. We’re already starting to do that for theme 1, where examples of important digital twin use cases include:
Predictive maintenance, progressing to increased automation and even “self-healing” assets;
Efficiency and carbon reduction in logistics; and
Planning (from short-term operational needs through to longer-term strategy and resilience). 
How to get involved
There are lots of ways to get involved and contribute your ideas, or tap into the content and insights being generated for each theme including:
Telling us about your digital twins in the DT (Digital Twin) Register;
Joining a webinar to discuss the themes (we are sending out invitations for a theme 2 webinar on 24th March);
Reviewing materials from previous webinars on the “Resources” page;
Commenting on posts in the “Themes” pages, starting with the first theme on “Testing digital twin concepts”;
Think there’s something we should be talking about? Start a new topic or let us know through “Contact Us” at the bottom of each page; and
Joining a live conversation to dig into the themes in more detail (we’ll share more on these later). 
(From the DT Hub facilitation team).

Read more...
Context:
DT Hub activities focus on a set of thematic areas (themes) that are based on shared opportunities and challenges for members. These themes are areas where collaboration can help members to gain greater understanding and make progress towards realising the potential benefits of digital twins.
The first theme is called “Testing digital twin concepts”. The focus is on:
Helping DT Hub members increase common understanding of digital twin definitions and concepts, then test and refine this thinking in specific use cases and projects where there is potential to deliver significant value
 
Why start with this theme? Each of the members we spoke to felt that it is difficult to systematically plan and progress digital twins without a clear understanding of what digital twins are, when something is classified as a digital twin, and what are the key concepts or building blocks that lie behind these twins.
In other words, discussing digital twin concepts and definitions is a “foundational” activity that is needed to underpin future activities. Moreover, it is difficult to think about making digital twins more connected and interoperable, as stepping-stones towards a national digital twin, if the approach to each twin is inconsistent.
Scope:
This theme will build on work being done through the NDT Programme, including the Gemini Principles, and feed back ideas and recommendations based on real-world experience from members and from the wider market. 
There are other reasons why this work is needed. As with other major technology developments, from the internet of things to AI, a growing range of players will claim to have tech solutions for digital twins. There is a risk here of “twinwash”. If every sharp-looking 3D visualization is labelled as a “digital twin”, regardless of whether or not it bears any relation to real-world physical assets, this can create confusion and risks devaluing “real” twins.
Tackling this theme can help digital twin owners start to address questions like:
What makes my digital twin a twin?
What types of areas (e.g. related to data, models, analytics, control systems etc.) do I need to consider in creating digital twin strategies and planning for individual digital twin projects?
What can I learn from the approaches taken by others to defining and scoping digital twins, including how this relates to specific use cases?
How do I relate and connect multiple digital twins within my organization?
How does my twin (and the approach I’m taking to it) relate to other third-party twins? For example, how will a water pipeline twin connect with a highways or city twin?
Related to the first bullet above, at least some DT Hub members would like to see the creation of a “Turing test for twins”: an agreed set of criteria established as the minimum threshold for a twin to be considered a twin. At the same time, there is also a desire for flexibility – the scope of twins will vary according to the intended purpose and specific use case. For example, not all twins will involve real-time control and actuation.
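As a conversation starter on what such a test might look like, the sketch below encodes a handful of candidate criteria drawn from the discussions above: a represented physical asset, a data connection to it, and a stated purpose. These criteria are illustrative only; agreeing the real threshold is exactly what this theme is for.

```python
def looks_like_a_twin(candidate: dict) -> bool:
    """Very rough 'is this a digital twin?' check using illustrative criteria."""
    criteria = [
        candidate.get("represents_physical_asset", False),  # tied to something real
        candidate.get("connected_to_data", False),          # data flows from the asset
        bool(candidate.get("purpose")),                     # built for a stated purpose
    ]
    return all(criteria)

visualisation_only = {"represents_physical_asset": True, "connected_to_data": False,
                      "purpose": "marketing render"}
pipeline_twin = {"represents_physical_asset": True, "connected_to_data": True,
                 "purpose": "leak detection"}

print(looks_like_a_twin(visualisation_only))  # False - a 3D model alone is not a twin
print(looks_like_a_twin(pipeline_twin))       # True
```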
Objectives:
The main objectives for this theme are then to:
Provide insights on “good” approaches to describe and classify digital twins and their constituent elements, building on the Gemini Principles;
Understand (from examples) how other industries have advanced their digital twin journeys;
Apply this thinking to specific use cases in existing or planned founding member digital twins, in areas where there is the potential to deliver significant value;
Help DT Hub members to achieve greater consistency across their organizations and with supply chains and partners;
Develop an intuitive “test” for what constitutes a digital twin; and
Feed learnings back into the evolution of the “Commons” and the Gemini Principles. 
Activities
We’ve already started on the first set of activities for this theme and created some content for you to dive into including:
• A webinar to start to relate this to the Gemini Principles and to identify some initial use case priorities
• Research into interesting examples from other industries of approaches to defining and developing twins
• Creation of blog-style “conversation starters” (for example, insights from aerospace and manufacturing, as well as thoughts on approaches to defining twins), along with links to interesting external sources based on this research

We are adding these to a dedicated space for theme 1.
What next?
There are still plenty of opportunities for you to get involved, including activities to flesh out this theme. This includes an online “jam” – a virtual event that we’ll host on the DT Hub, dates to be confirmed.
We want this theme to be driven by members’ views and priorities, so it would be great if you could comment on this post, including on:
• Existing initiatives that could feed into this work
• Use cases that we should prioritise to test emerging thinking on digital twin concepts
• Specific digital twin projects you are working on
• Your views on what makes a digital twin a twin
Read more...
Our collective understanding of digital twins is still rather nascent.  To ensure that we operate from the same base of information, there is a need to periodically reflect on the concepts and principles we have outlined.  This blog post is one in a series which reflects on previously published concepts to consider whether our collective thinking has advanced.

As we develop the thinking, tools and resources relating to digital twins, a lot of discussion is taking place regarding their scope, scale and accuracy.  Within the Gemini Principles, it is stated that a digital twin is:
“a realistic digital representation of assets, processes or systems in the built or natural environment”
I want to reflect on this statement.  In particular, the use of “realistic”.
For something to be realistic, according to the Oxford English Dictionary, it must represent something in a way that is accurate and true to life.  For example, for something to be “photo-realistic” it must appear as if it were a photograph.
However, the Gemini Principles also state that a digital twin must represent physical reality at a level of accuracy suited to its purpose.  Interestingly, while undertaking discovery interviews with DT Hub members, we saw this tension between realism and purpose play out in practice.
Interview Insight
"Several members commented on how people in their organizations would try to extend the use of their digital twins beyond their intended purposes."
This was seen as both a positive and a negative outcome.  The positive being that members of these organizations saw the value in these digital twins and wanted to harness their insight.  The negative being that these digital twins either did not hold the information or, when the information was available, did not have the level of accuracy required for these extended purposes.  For these extended needs, these digital twins were not realistic.
Amongst DT Hub members there appears to be a shared view that digital twins are, fundamentally, purpose-driven.  Therefore, digital twins might not be “real” representations, but instead the “right” representation to support a purpose.
Consider an example.  An air traffic control system utilizes a “digital twin” of runways, aircraft and their flight paths, along with sensor information (e.g. weather and radar), to help prevent collisions and to organize and control the landing and departure of aircraft.  In this example, while real-time information and analytics are used, none of the physical elements (planes, control towers) have realistic representations; they instead use basic representations to support the air traffic controller.  Instinctively, an air traffic control system does everything we want a digital twin to do: it is a digital representation of physical assets which also incorporates sensor information, with the physical assets providing a live link back to the digital twin.  Given this, it should be fairly clear that an air traffic control system would be considered a digital twin.  However, this does not appear to be the case.
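To make the point concrete, here is a minimal sketch of what such a purpose-driven twin might hold. The class and field names are hypothetical and not drawn from any real air traffic control system; the point is simply that a deliberately basic, non-realistic representation can still be sufficient for its purpose (here, a crude separation check).

```python
# Hypothetical sketch: names and thresholds are illustrative assumptions,
# not taken from any real air traffic control system.
from dataclasses import dataclass


@dataclass
class AircraftState:
    callsign: str
    latitude_deg: float
    longitude_deg: float
    altitude_ft: float  # updated from live sensor data (e.g. radar)


def too_close(a: AircraftState, b: AircraftState,
              min_lateral_deg: float = 0.05,
              min_vertical_ft: float = 1000.0) -> bool:
    """Crude proximity check using only the minimal state held in the 'twin'."""
    laterally_separated = (abs(a.latitude_deg - b.latitude_deg) > min_lateral_deg or
                           abs(a.longitude_deg - b.longitude_deg) > min_lateral_deg)
    vertically_separated = abs(a.altitude_ft - b.altitude_ft) > min_vertical_ft
    return not (laterally_separated or vertically_separated)


# Two aircraft close in both position and altitude trigger an alert,
# even though neither is represented "realistically".
a1 = AircraftState("ABC123", 51.47, -0.46, 5000.0)
a2 = AircraftState("XYZ789", 51.48, -0.45, 5400.0)
print(too_close(a1, a2))  # True
```

The representation holds only what the purpose requires; adding photo-realistic geometry would not make the separation check any more useful.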

A poll was placed on Twitter asking “would you consider an air traffic control system a digital twin?”.  After 62 votes were cast, the result was exactly 50:50.  What does this tell us?  Perhaps public messaging on what a digital twin is isn’t sufficiently clear?  Perhaps the question was poorly worded?  Or perhaps, for some, the lack of a realistic representation is the reason they said no?  Unfortunately, the context behind each vote isn’t available.  At the very least, we can be sure that what we regard as a shared view is not shared by everyone.
In an age where many consider data to be the new oil, perhaps we should consider using our data sparingly.  So long as the data provided is sufficient for its intended purpose, a realistic representation may not always be required.
 
And there we have it: realism and its place within digital twins.  Do you believe that a digital twin has to be realistic?  Can something be a digital twin without being a realistic representation?  Had you voted in this poll, would you have considered an air traffic control system a digital twin?
 
 
Read more...
The first theme that we are addressing in the Hub is “Testing digital twin concepts”. This was identified as a key foundational element by the DT Hub members – to help increase understanding and inform the development of strategies and projects.
It will build on the Gemini Principles and generate recommendations to feed into the National Digital Twin (NDT) “Commons” stream.
The theme is summarised as “helping DT Hub members to increase common understanding related to digital twin definitions and concepts, and then to test and refine this thinking within specific use cases and projects where there is potential to deliver significant value.”
You can find out more on the objectives, activities and selection process for the theme in the attached document. 
DT Hub Theme 1 report 19 December 2019.pdf
 
Read more...
Historically, standards have often been (falsely!) perceived as being at odds with innovation.  In fact, standards have often played a pivotal role in the adoption of new innovations, because they establish a framework defining aspects such as common vocabularies, essential characteristics and good practice.  Once such a framework has been established, products and services that support it can be developed.

What happened with Building Information Modelling (BIM) is a great example of this.  After the PAS 1192 series was developed, UK competencies around BIM were catalysed, allowing the UK to export its leadership globally, as it continues to do.  To facilitate the same level of adoption for digital twins, a similar framework is needed. 
With work already underway at ISO to develop standards relating to digital twins, a roadmap for digital twins within the built environment is needed to ensure that such standards are developed in a holistic manner, formalizing the right content while allowing the sector to compete within these constraints. 
To that end, BSI has worked with CDBB to produce a digital twin standards roadmap for the built environment.  This roadmap considers which digital twin-specific standards are needed, as well as which supporting standards relating to the wider use of digital technologies within the built environment should be produced.  The roadmap was developed through the analysis of around 12,540 standards across a myriad of sectors.  The standards roadmap is attached below. 
The roadmap is due to be updated periodically, and comments and contributions are welcomed.  Please feel free to comment below or email DTHub@cdbb.cam.ac.uk.
HUB Version_DT Standards Roadmap_November 2020 (3).pdf
 
 
Read more...