
Articles & Publications

The DT toolkit is a simple guide to the things you need to think about on your digital twin journey. It arose from a request we’ve heard repeatedly from the DT Hub community: “how do I make the business case for a digital twin?” In response, through the Gemini programme of the National Digital Twin programme, we’ve been able to bring together people who have been through, or are going through, the digital twin journey from different perspectives (consulting, technology development, legal and academic) and who are willing to share their learning and expertise with the DT Hub community.
The team first met back in September 2020 to discuss how we might put together a toolkit for making the business case. We agreed it would need to focus on purpose, relevant examples, a roadmap and a usable business case template. We debated the use cases for articulating digital twin purpose and took this to a DT Hub workshop to garner community input into what is now a use case framework, one that is starting to resonate in meetings and presentations. We presented and discussed case studies of digital twins that have been developed or are being developed, which can be found on the DT Hub. We spent a long time talking through the steps organisations need to take to implement a digital twin, and as a result produced the roadmap you can find in the report. We talked about digital twin sophistication levels. And members of the team worked together to think through what a business case template might look like and what you would need to put together to get sign-off for your digital twin. This template is now freely available as a resource for you to download and use.
This DT toolkit report is a true collaboration of diverse minds working in the field of digital twins who are open to challenge and debate. The result is a toolkit that you can use to set you and your team on your digital twin journey. As with all journeys, the toolkit is now at its first pit-stop, and the toolkit team are going to use it with their clients and provide feedback on how to improve and fine-tune it. We invite you to do the same: read the toolkit report, try it out and tell us what you think.
We are very grateful for the passion and dedication the Toolkit Team have shown in putting the toolkit together. Working with limited resources, we have relied on our volunteers’ goodwill and their conviction that the work of the National Digital Twin programme is something they want to be involved in and contribute to. Drawing from across disciplines and different organisations, we’ve been really boosted by the support we’ve received from a team of people going through the digital twin journey and enthusiastic about sharing their experience and ideas with the wider community.
If you would like to share your learning and experience with the community and take part in the next iteration of the Toolkit, please reach out to us. We can all work together to make this a valuable community resource.
 
“The toolkit captures know-how and insights from people with experience of developing and using digital twins.  Steps are given that provide the reader with valuable guidance for justifying, building and exploiting twins, increasing value and reducing the risk of change.” Dr Peter van Manen, Frazer-Nash Consultancy
“Working collaboratively with people from a variety of industries and experiences, has been not only invaluable to the construct of the Toolkit but also, fun, inspiring and wholesome to participate in.” Peter Curtis, Sopra Steria
“Working on the DT toolkit has been an excellent way to socialise my thoughts and the Catapult’s work on Digital Twins, while increasing my understanding of DTs, through discussions with the other team members.” Ron Oren, Connected Places Catapult

Shared by the Community

Open Energy gets UK Government backing

Icebreaker One has won a major UK Research and Innovation competition for the Open Energy project, which aims to revolutionise the way data is shared across the energy sector to make sure the UK achieves its net-zero goals.
It means the project will receive £750k in UK Government funding to continue developing a standard that all organisations in the energy data ecosystem can use to search, share and access data. It’s also developing a prototype governance platform to make sure data is shared securely. 
Icebreaker One hosted a webinar on 16 February at 10am to share more information about its progress so far and plans for the future.
View launch webinar (16 February 2021)
View project summary briefing
Open Energy aims to transform the way organisations exchange the information they need to phase out fossil fuels and implement renewable energy technology. Icebreaker One is aiming to roll out the Open Energy standards, guides and recommendations across the energy sector over the next year.
Open Energy has been guided by industry advisory groups across the UK which include representatives from Ofgem, Scottish Power and SSE. It’s led by Gavin Starks, one of the key figures behind the Open Banking Standard that has revolutionised the banking sector over the past five years.
Icebreaker One worked with project partners Open Climate Fix, Raidiam and PassivSystems, to win the Modernising Energy Data Access (MEDA) competition, run by Innovate UK as part of the Industrial Strategy Prospering from the Energy Revolution programme. 
A summary of the MEDA Phase Two work is available here.
Gavin Starks, founder and CEO at Icebreaker One, said:
“We’re delighted to have this backing to continue developing the data infrastructure to help unlock access to data to deliver efficiency and innovation across the energy sector.

This will have a material impact on the UK’s ability to make the most of decentralised energy supply and consumption, help address the coming challenges of the transition to electric vehicles and catalyse the delivery of our net-zero targets.

Our work will help unlock data discovery by enabling energy data search and usage by delivering a trusted ecosystem for decentralised data sharing.”
Rob Saunders, Challenge Director, Prospering from the Energy Revolution at UKRI, said:
“The MEDA competition was designed to accelerate innovative ways for energy data to be open-sourced, organised and accessed, providing a platform for new technology, services and more agile regulation within the energy sector. 
“The Icebreaker One project showed exactly what can be achieved through collaborative thinking and will help create a framework for all stakeholders to share data further for the common benefit – and ultimately for the UK’s net-zero ambitions. We are looking forward to working with them closely as the project develops further.”
David Manning, Head of Data Management at SSE plc, said: “At SSE we recognise that becoming a data driven organisation is critical to our role in helping achieve a net zero world.”
“Readily accessible and trusted data will be essential to building the decarbonised energy system of the future; ensuring flexibility, customisation and personalisation for energy users, large and small. It’s exciting to see the progress being made in this space.”
https://energydata.org.uk/2021/02/03/open-energy-gets-uk-government-backing/

Shared by the Community

Digital Twins: the Tuesday blogs

“There are two things in life for which we are never truly prepared: twins.”
Josh Billings
We have thought a lot about Digital Twins in recent times and heard an awful lot more. But there is always room for new thoughts on any subject, hence this short series of articles. We want to share fresh views with the experts and with the uninitiated. And we’ll include a hidden gem each week.
We’ll speak in plain English. We won't talk about taxonomies, ontologies or system of systems. Instead we will look to the wisdom of Rumsfeld, Einstein, Gandhi and others to explore the wonderful world of twinning. And we’ll keep the number of words below 400 for most of the time. That’s just one page of your valuable time. We’ll post one every week for the next few weeks, starting today, and then stop (or maybe start talking about something else when we are done).
Here are the different episodes in the series:
1.  Known unknowns. Unlocking awareness, knowledge and action.
2.  Time and space. The relativity of structure, behaviour and certainty.
3.  Trusted friends. Authority, assurance and agency.
4.  A puppy isn’t just for Christmas. Long-term value.
5.  Greeks bearing gifts. Giving context.
6.  Back to the future. History, science and maths.
7.  Wisdom of the crowds. People matter.
So, settle back and read the first in the series. It shows us how Donald Rumsfeld has helped us unlock some of the hidden secrets of Digital Twins. And why we should seriously consider using them more.
 
Peter van Manen & Mark Stevens, Frazer-Nash Consultancy
As we head into a new year, it’s exciting to see both industry and government recognise and support the work being done around a National Digital Twin.
An important boost came at the end of 2020 with the publication of the Government’s National Infrastructure Strategy. It unveiled plans for a ‘radical improvement’ in the quality of infrastructure and included support for the adoption of the Information Management Framework and the National Digital Twin. It’s an encouraging sign of the Government’s ongoing support for the programme.
Further support came from HS1, who are on track to revolutionise the railway industry by developing a 5G Augmented Reality Digital Twin project. They plan to virtually replicate rail assets on the HS1 line by 2021. The technology will monitor the real-time performance of rail assets and allow for the swift detection and repair of faults. It will serve as a clear example to others of the many benefits of digital twins, and we’re delighted that they have teamed up with the NDTp to share their insights and experience.
The power of collaboration 
In our latest case study, we showcase the success of the ‘Colouring London’ project. The platform is a model for open databases on urban building stocks, and is specifically designed to provide data to support other building related digital twins, for example 4D procedural models of cities. 
The site has already received 4.17 million edits, an incredible 200,000 of which have been made directly by individual contributors – the remainder have been made by automated processes. It is a great example of collaboration between various bodies and demonstrates how sharing knowledge and data can have such a positive impact on the sustainability of our cities. 
Colouring London pre-dates the creation of the Gemini Principles, though the team have welcomed them as a valuable tool in describing their work. The platform’s clear purpose of serving the public good, its openness, quality and functionality offer a model to others also aiming to adhere to the Gemini Principles.
Other activities I would like to highlight are: 
DT Hub Progress. We now have 970 members and expect to reach 1000 by the end of the month. The update to the website has also been completed and we’ve had positive feedback on the improved accessibility of information. It is great to see more community generated content, in addition to other resources, such as the 49 articles and publications available. I’m particularly pleased to see 12 data sources listed next to the ‘share the data’ tab. It’s a promising start and I want to encourage others to add in their data too!  
Hub Together launch. In the first of our monthly ‘Hub Together’ town halls, we invited members to bring their lunches and voices to help shape the community over the coming year. Sam Chorlton, DT Hub Chair, and Tom Hughes, Delivery Lead, were there to answer questions on the plans for the community going forward and the influence the community can have on the Information Management Framework. Hub Together takes place on the third Wednesday of each month; the next one is on 17 February at noon.
Legal Roundtables Outcome Report. We have completed a series of four carefully scoped roundtables, bringing together nine leading lawyers across practice areas such as planning, IP, data protection and ethics. The overarching outcome was that, although there are legal challenges, there are no red flags for the IMF. The roundtables were led by Sarah Rock, principal associate in the construction and engineering team at Gowling, and it has been fantastic to have her input. An outcomes report has been published to provide further insight into the findings.
Launch of Community for "Data Value and Quality". This is a place to focus discussion around how our collective approach to data governance, value and quality must evolve. It provides a central point for the storage of resources that are relevant to each topic, and a forum for the open sharing of ideas, research and case studies.  
Progress of the Gemini programme. In our weekly Gemini call, which takes place every Tuesday from 10:30 to 11:00, we are regularly joined by over 50 individuals from across Government, industry and academia. It is open to all DT Hub members and is an opportunity to hear updates on the various NDTp streams. It is also a chance to invite attendees to collaborate on projects, such as the ‘Digital Twin Toolkit’. Already nine organisations have volunteered their time to support it and are currently preparing a DT Toolkit report to go out in sync with the Tech Digital Twin Report.
Happy new year! I’m sure that many of us are quite relieved to put much of 2020 behind us, but as I reflect back on the past year and our new goals for 2021, there is much to celebrate as well.
A year of surprises
There was of course, the huge impact of COVID on all our lives – like others across the country we had to move to home working, new ways of communicating and juggling home school with Zoom meetings. Yet throughout I was struck by how my team and the wider DT Hub community kept the momentum going and were determined to keep ‘moving things forward’. I’m really grateful for all their hard work and find it encouraging that things can progress and work so well in the virtual world.
Another surprise came earlier in the year after the launch of the DT Hub. It was clear to us that there needed to be a community of users sharing ideas and experience, but we were cautious as to what the response would be. The built environment has typically been quite siloed, without much engagement between sectors, so we knew it would be a challenge to break people out of those distinct sectors and work as a unified entity.
So it was a very welcome surprise to find our caution misplaced! We were expecting around 200 members by year end, but we have, in fact, got over 1000 members. It has really shown us that there is this huge amount of enthusiasm and momentum around digital twins, as well as an appetite for being a part of the conversation.
We’ve also been struck by the emotional investment of members towards getting this emerging field right. For example, the debate around establishing a common set of standards has been heated at times, but in a positive way. We’ve wanted to include different voices and make sure that everyone is being heard and that means differences in opinion. We believe that is a healthy environment to be in and we intend to keep driving the conversation in a constructive way.
Keeping the momentum going
This all gives us good cause to be optimistic going into 2021. We’re starting the year by opening the Hub up further to accommodate as many people as possible. We are extending the invitation to academia and the international community, as well as branching into other sectors such as Formula 1 and manufacturing.
To support this increase in numbers, we have revamped the DT Hub website. We’ve reflected on the feedback we’ve received and the refresh aims to make it more useable and accessible. As ever, we would love your opinion on what is working best for you and what you would like to see more of.
There will also be a shift in the Hub in terms of content. As you would expect from a new organisation, we have been directing much of the content to get the conversation started. We’re now at the point where we will move to enabling our members to share and drive that content.
At the heart of the DT Hub is its members. This is a place for members to share, discuss, network and learn, and although we have been driving much of the content up until now, we want to hand the reins over to you, enabling you to suggest the topics and themes you want to discuss and to explore the areas that will most benefit the community.
Launch of ‘Hub Together’ and Community Insights
Starting this year there will be a regular series of ‘Hub Togethers’, town hall style events where the reins will be very much in the hands of our members. This will be your chance to shape the conversation and grill us on any topics related to the DT Hub. We intend to make it as interactive as possible with flash up polls and the option to respond or add in comments to questions.
In conjunction with Hub Together, we will also be starting a series called ‘Community Insights’. Each month we’ll invite a different member or group from the community and interview them on the work they are doing in the digital twin space, as well as finding out a bit about their background and interests. We now have members from all infrastructure sectors and we think it will be fascinating to dig deeper into what is happening in each field.
In the first of our Community Insights, I’m excited to interview the CSIC research team who, in collaboration with Cambridge City Council, are developing a digital twin for Cambridge. I’m keen to find out all the lessons they learnt from the experience and what kind of impact they think it could have for the city.
We will also be continuing our work on standards this year. A lot of the foundational thinking has already been done, as well as some of the passionate discussions referenced earlier! This has set us on a clear path to what will be the first set of standards - a really momentous achievement, born out of a lot of collaboration. 
So there is plenty to get stuck into for 2021, and we would love our members to continue to get involved. Do please take a look around the refreshed website and start signing up to the various events on offer. I look forward to seeing you all this coming year.
 
Standards make everyday life work. They may decide the size or shape of a product or system. They can specify terms so that there are no misunderstandings. They are crucial for instilling confidence and consistency for both providers and users. This is why we have made the development of a set of standards a crucial component of our journey towards building a National Digital Twin.
In conversations we’ve had in the Digital Twin (DT) Hub and the wider Centre for Digital Built Britain (CDBB) community, there have been significant concerns about the costs involved in investing in digital twins. We believe that, to mitigate the risk and avoid the need to make changes down the line, standards are of vital importance. We need a shared foundation and framework to support the end goal of secure data exchange and interoperability.

We’ve made significant progress towards that goal and it’s exciting to be pioneers in establishing what will hopefully be a common language - guidelines that can be used, not just here in the UK, but globally.

To start with, we’ve needed to gain a thorough understanding of what the current standards landscape looks like, and CDBB commissioned the British Standards Institution (BSI) to do the research. Their initial scoping exercise is complete, and BSI and CDBB are now reviewing the results to identify if and where standards are needed to overcome a specific challenge or fulfil a purpose. We’ve also looked to other sectors to see if existing standards can be applied or modified to work in the built environment.
We are now in the process of creating a clear roadmap that prioritises standards to be developed. The document will be accompanied by a report to include the narrative, justification and rationale behind the roadmap. It will be presented through a series of thematic areas: Digital Twins, Data, ICT, Application, and Outcomes as well as multiple sub-topic themes, to help enable users to locate key standards.
The end goal is a very practical guide. It will cover everything from a shared vocabulary, to ensure consistent definitions of assets, to recommended data formats, use case methodology, a code of practice on information exchange and so on.

A vital part of the process is involving stakeholders, and we’re very grateful for all the feedback we’ve received so far. We have recently had the opportunity to share the latest review with DT Hub members as well as the wider digital twin community. Attendees of the recent workshop, hosted by BSI, had the opportunity both to critique and verify the findings and to share their views on priorities for standards to support successful digital twins in the built environment. This has been a valuable opportunity to really shape the direction of these important developments, as we can’t do it alone.

A great example of the impact standards can make comes from the early 1900s, when BSI developed a standard for tram gauges at a time when, in the UK alone, there were 75 different widths of gauge. They succeeded in reducing these to five recommended widths, which became the standards going forward and greatly boosted the industry’s fortunes by increasing compatibility between networks and rolling stock. As the British standard was adopted abroad, the UK tram market enjoyed more opportunities to trade and business flourished.

We hope to make a similar kind of impact – we want to see all developers of digital twins flourish and benefit from the advantages that sharing data and ideas can bring. But in order to do that successfully, the whole process needs to be underpinned by standards that have been formed out of thorough research and review and have the support and involvement of as many people as possible. We look forward to seeing you around the DT Hub!
Samuel Chorlton, Chair of the Digital Twin Hub
Strategic planning for life after Covid-19 brings an unprecedented opportunity to change the way we view and manage our infrastructure. Mark Enzer, from CDBB, makes the case for putting people first.
The current pandemic has been a powerful but unforgiving teacher.  It has demonstrated the importance of data and the power of digital models to derive insights from those data, to help us model outcomes, to guide the pulling of the levers to control “R” and to help us make better more-informed decisions.  Covid’s  disruptive impact across all sectors and societies has also revealed the interconnections and interdependencies between our economic and social infrastructure, highlighting the importance of creating resilient, sustainable and secure infrastructure systems upon which essential services depend.
So why change our view of infrastructure?
 We have created an amazing, complex machine on which we wholly depend. Without it, our lives would be immeasurably worse. Society would not survive. That machine is infrastructure – our built environment. However, we don’t appreciate the relationship between infrastructure and our wellbeing. Therefore, we don’t set objectives in terms of outcomes for people and society.
And although we understand each part of the built environment, we do not manage it as a whole. Therefore, we don’t know how to address its systemic vulnerabilities or make it work better.  If we envision, plan and manage infrastructure differently, we can make it what it should truly be: A platform for human flourishing.
Putting people first
The Centre for Digital Built Britain (CDBB) and the Centre for Smart Infrastructure and Construction (CSIC) have recently published ‘Flourishing systems’, which makes the case for a people-focused systems-based vision for infrastructure.  As we consider priorities following the Covid-19 outbreak, we have an opportunity to plot a new course that recognises the fundamental role of infrastructure in the social, economic and environmental outcomes that determine the quality of people’s lives.  To do this, we must see infrastructure as a complex, interconnected system of systems that must deliver continuous service to society.  Infrastructure is so much more than just a series of construction projects.
Adopting a system-of-systems approach makes it possible to address the great systemic challenges such as achieving net-zero carbon emissions, improving resilience and preparing for a circular economy.  It also unlocks the potential of digital transformation across the built environment.
How digitalisation delivers value
With the ongoing digital transformation of the infrastructure industry, we have the opportunity to deliver huge benefit for people – for whom infrastructure ultimately exists.  Digital transformation encompasses how we function as organisations, how we deliver new assets and how we operate, maintain and use existing assets.  Bringing digital and physical assets together creates cyber-physical systems – smart infrastructure.  Effectively, this is applying the fourth industrial revolution to infrastructure. Making better use of asset and systems data is central to this vision because better analysis of better data enables better decisions, producing better outcomes, which is the essential promise of the information age.
As part of this, we must recognise digital assets, such as data, information, algorithms and digital twins, as genuine ‘assets’, which have value and must be managed effectively and securely. In time, as data and digital assets become valued, data itself will be seen as infrastructure.
We are now at a point where the vision for effective digitalisation of the whole of the built environment is within reach.
Enabling secure, resilient data sharing
Managing complex interconnected systems requires the appropriate tools. CDBB’s National Digital Twin programme sets out a structured approach for effective information management across the system as a whole.  This approach is informed by ‘The Gemini Principles’ and is driven by the NIC’s ‘data for the public good’ report. The recent paper ‘Pathway Towards an Information Management Framework’  suggests an approach for the development of an Information Management Framework  to enable secure, resilient data sharing across the built environment.  It is this that will enable data connections between digital twins, which is at the heart of the concept of the ‘National Digital Twin’ – an ecosystem of connected digital twins.
All systems go
Taking a systems-based approach to our infrastructure will improve our ability to deliver desirable outcomes for people and society – around accessibility, inclusion, empowerment, resilience and wellbeing – not just for now but for generations to come. It will also better equip us to address the urgent global systemic challenge of climate change.  It’s time to see infrastructure differently – as a system of systems that provides a platform for human flourishing.
flourishing-systems_final_digital.pdf
 
During our research activities within the DT Hub, several barriers relating to the use of digital twins were identified.  This blog post is one of a series which reflects on each barrier and considers related issues so that we can discuss how they may be addressed.

As our members, and indeed other organisations active in the built environment, develop data and information about their assets, ensuring that this data can be used within other tools is a priority. To do so, the data needs to be interoperable: in brief, if data can be shared between systems, it is considered interoperable. Typically, this can be achieved in one of two ways:
1. Both systems use the same formal description (schema) to structure the data; or
2. One system transforms its data using an intermediate formal description (schema) to structure the data.

The simplest solution appears to be (1): have all systems create, use and maintain information using the same schema. This would mean that information could be used in its default (native) format and there would be no risk of data being lost or corrupted during its transformation. However, this isn’t practicable as, from a technical perspective, it is unlikely that the broad range of information needed to support every possible purpose could be captured against the same schema. In addition, public procurement directives require performance-based technical specifications as opposed to naming specific software. This means that an organisation may be challenged if it requires its supply chain to use a particular piece of software, as doing so would circumvent directives around competition and value for money.
As it is not possible to guarantee that the same schema will be used throughout, it is far more practicable to identify which established industry schema is most suitable for receiving data (option 2), depending on the purpose of using this information. In doing so, there is an added benefit: the information you receive may be open data.
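To illustrate option (2), here is a minimal sketch of transforming a record from one tool’s native structure into a shared exchange schema. All field names (`AssetTag`, `asset_id` and so on) are invented for illustration; they do not come from any real product or from an established schema such as IFC.

```python
# Hypothetical sketch: map a tool-specific record onto a common,
# openly documented exchange schema so another system can consume it.
# Every field name here is invented for illustration.

def to_open_schema(native_record: dict) -> dict:
    """Transform a proprietary-format record into the shared schema."""
    return {
        "asset_id": native_record["AssetTag"],
        "description": native_record.get("Desc", ""),
        "installed_on": native_record.get("InstallDate"),
    }

# A record as exported by a (hypothetical) proprietary tool:
native = {
    "AssetTag": "PUMP-0042",
    "Desc": "Chilled water pump",
    "InstallDate": "2018-05-01",
}

shared = to_open_schema(native)
print(shared["asset_id"])  # PUMP-0042
```

The point is that the receiving system only ever needs to understand the intermediate schema, not the internals of every tool that feeds it.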
Open data is often misused as a synonym for interoperability, but it matters for sharing for a specific reason.
Open data, in brief, is unrestricted data. When you use proprietary software and systems, the schema used to structure your data is hidden: as a user of that software, you are effectively given permission by the vendor to use that structure to view your information. For built environment assets this can be a problem, because the physical asset can outlast the software used to design and manage it. In 50 years, a tool that allows access to this information may not exist; perhaps sooner, given the cannibalistic nature of the software industry. Consider SketchUp, for example. Since its release in 2000, it has been owned by three different organisations: @Last Software, Google, and Trimble. The permission to use the SKP schema has changed hands several times. Who will produce software to view these files in 30 years’ time?
To ensure enduring access to asset information, either bespoke schemas need to be developed and maintained internally, or an established open schema needs to be used. However, while several open schemas are readily available (such as IFC, PDF, PNG and MQTT), they can raise concerns about access to, control of and abuse of the data within.
These concerns, thankfully, can be offset through control. Using open data structures, it is possible to ensure that only the information you wish to exchange is delivered. Proprietary structures, by contrast, can carry hidden information that cannot be controlled, potentially posing a larger risk than their open counterparts. Ironically, then, a “need-to-know” dataset is easier to produce with an open data approach.
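As a hypothetical sketch of that point: because an open structure is fully visible, a “need-to-know” export can simply whitelist the fields approved for exchange. The field names below are invented for illustration.

```python
# Sketch of a "need-to-know" export over an open data structure.
# Because every field is visible, we can whitelist exactly what
# leaves the organisation. Field names are hypothetical.

SHAREABLE_FIELDS = {"asset_id", "asset_type", "condition"}

def need_to_know(record: dict) -> dict:
    """Return only the fields approved for external exchange."""
    return {k: v for k, v in record.items() if k in SHAREABLE_FIELDS}

full_record = {
    "asset_id": "BR-017",
    "asset_type": "bridge",
    "condition": "good",
    "maintenance_cost": 125000,  # commercially sensitive: never exported
}

print(need_to_know(full_record))
# {'asset_id': 'BR-017', 'asset_type': 'bridge', 'condition': 'good'}
```

With a hidden proprietary format, no such audit of the exported fields is possible, which is exactly the risk described above.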
When considering which methodology to use, the benefits of open data typically outweigh its risks. The use of open data structures will not only unlock interoperability between digital twins within an organisation but will also be the mechanism that enables a secure National Digital Twin.
Access to appropriate data about our national infrastructure is currently held behind proprietary schemas. Let’s make Britain’s data open again!
 
We hope you enjoyed this short piece on breaking the barriers related to interoperability. What specific challenges have you faced relating to the implementation of interoperability? Do you consider open data in this context an opportunity or a threat? Would you prefer the National Digital Twin to be based on an open or a proprietary schema?

The 2020 DT Hub extension to the Smart Infrastructure Index survey explored organisational maturity towards digital twins and the National Digital Twin programme. It was completed by 18% of eligible DT Hub members. Analysis of the responses provides insight into maturity within the DT Hub, relationships between organisational maturity indicators and digital twin maturity, and where there may be risks and opportunities for advancing digital twin maturity within the built environment.  Watch Sam Chorlton and Tom Hughes’ interview on the survey results and what they mean for the DT Hub.
 
Is it? Or is it not?
For a few years now, parts of our sector, and indeed other sectors, have been researching, defining and promoting digital twins.  If we observe anything, it’s that chatter (including within the DT Hub) has been rife with debate over ‘what is/isn’t a digital twin’.
I’m no expert, and don’t yet claim to offer a resolution to clarify the topic, but I do think a discussion hosted within the DT Hub would be of use.  Such a discussion would provide greater clarity for those less involved in the definition process, who are nonetheless vitally important to the delivery of whatever a digital twin of the future is destined to be.
Let’s learn from BIM implementation
I wear many hats in my career and most of them are related to the implementation and ‘normalisation’ of BIM processes. As Vice Chair of the UK BIM Alliance and Chair of the UK & Ireland Chapter of buildingSMART International, I’m afforded a view of the sector from various levels of stakeholders and the challenges they face in an ever-changing world as they prepare to digitalise.  The silent majority are perhaps the key to unlocking the transformation to a digital sector, and it’s vital that the BIM message reaches them and connects in a meaningful way with each and every one of them. BIM implementation in the UK has been ongoing for over a decade, and my feeling is that there is at least another decade to go before we reach ‘business as usual’.  It’s exactly the same for Digital Twins.
All vocal parties involved here in the DT Hub seem keen to navigate more smoothly through the same sectoral challenges, and one of those, in a similar vein to BIM, is “is this a Digital Twin or not?”
Acknowledging that BIM in the UK has already been through the same sector engagement, we can see similar issues appearing now: the concept behind Digital Twins is being taken over by technology providers rather than sector stakeholders, and is subsequently being marketed in that way.  It’s by no means a UK-only challenge; the same discussions can be observed globally.
Hence, we’re rapidly on the way to Digital Twins being defined by technologies rather than by their use and value to us as people.  A human-centric approach to any digital transformation will almost always achieve greater adoption and, ultimately, ‘success’ than one led purely by technology. That is why the CDBB National Digital Twin Programme envisages the built environment as a system of systems, comprising economic infrastructure, social infrastructure and the natural environment.  The CDBB Gemini Principles neatly position Digital Twins in a way that forces one to consider the overall business need (the ‘why’) and all the potential societal benefits.
Other DT Hub discussions have touched on the possibility of a Turing-type test.  The original Turing test was created by Alan Turing to determine whether or not a machine was discernible from a human.  Whilst the test is valuable for determining artificial intelligence, it is evaluated by humans, which makes it challenging to ensure all evaluators are equal. Perhaps a technology-driven test that provides both a score and a ‘time taken’, introducing a level of competition between creators of Digital Twin systems, might help adoption.
 
So here’s the proposition... we hold a workshop (or two) to discuss and investigate the need for a test, the type of test, ‘what’ is being tested, what the thresholds might be, and anything else that’s relevant to the topic of ascertaining whether or not someone’s proposed Digital Twin is actually a Digital Twin.
I have three questions to start the discussion here in this thread...
1. Do you feel the need for a ‘test’ to determine whether or not a Digital Twin is a Digital Twin? Can we continue without a formal ‘test’, or should we actively seek to develop something absolute to filter out what we’ve been able to do for many years, focus on true Digital Twin solutions, and continue the search for the elusive Digital Twin unicorn?!
 
2. If we do need a test, will a simple yes/no suffice? Or does a ‘score’ have more longevity? If you ever saw the HBO series Silicon Valley, you may be familiar with the Weissman Score, a fictional test and score for file compression.  It enabled the fictional tech companies to demonstrate the success of their software and algorithms by testing their compression performance.  Would a similar test be suitable for our purposes, with a threshold for determining whether a proposed Digital Twin is a Digital Twin, and would it then cater for future digital developments and emerging technologies?
 
3. Finally, are you keen and able to join a virtual workshop?  
 



Digital Twins for Resilience

In the interview with Eleanor Voss, policy advisor to the NIC, we have begun to explore the NIC’s recommendations regarding the incorporation of resilience standards and their adoption by regulators. Eleanor provides a comprehensive overview of the areas of the study most pertinent to our members and gives us the opportunity to influence the route these recommendations may eventually take.
For those who are not aware, The NIC is an arm’s length body of the treasury. The Commission makes recommendations to government on economic infrastructure policy. If government accept the Commission’s recommendations, they become government policy. For example, in 2017, the NIC published the report, Data for the Public Good, which made recommendations to government on the opportunities that data, machine learning, AI and digital twins present for infrastructure planning, operation and resilience. Now, of course, following acceptance of many of the NIC’s recommendations in the report, the Centre for Digital Built Britain is taking forward much of this work through its National Digital Twin Programme. The Commission continues to play a role in steering this valuable programme.
In the past, the NIC has approached resilience at a sector level, for example water and energy, but with current environmental, population and technological changes, resilience has become a pressing issue requiring a cross-sector approach.  This is particularly highlighted by the interdependent nature of infrastructure. In 2018, the then Chancellor asked the NIC to undertake a cross-sector study and to recommend to government the policy measures needed to ensure the resilience of the energy, telecommunications, water and transport sectors.  A few weeks ago, the NIC published its study of infrastructure resilience – Anticipate, React, Recover: Resilient Infrastructure Systems. The report calls for government to set standards for the resilience of our infrastructure and to create a framework to ensure that these standards will be met now and in the future.  Today, of course, the impact of Covid-19 has meant that resilience is being discussed by everyone.
Resilience?
It is often said that, for an asset to satisfy its function effectively, it must be resilient. But what do we actually mean by that? During the interview, Eleanor describes it as infrastructure systems, engineering systems and organisational systems being able to:
anticipate, resist, absorb, recover, adapt and transform.
That places quite a high degree of responsibility on the ownership, operation and design of an asset or system. With the NIC report recommending that Government publish resilience standards and that regulators introduce these as new obligations on infrastructure operators by 2023, it is absolutely essential that we understand how this might affect us and what preparatory work we can do now to be ready.
Measure?
The first part of ensuring that an asset is resilient is measurement, and this is where we want to focus the discussion in the Hub. It is vital that we can feed back to the Commission on the feasibility of its recommendations, and the only way we can reasonably do this is by assessing the practicality and viability of first measuring resilience and then utilising the results. Within the interview, Eleanor draws out three key areas where your guidance would be beneficial.  These are:
1.   How do we identify the appropriate level of granularity for data and models such that they can support the measurement of resilience?
2.   Providing accurate simulations of complex systems such as infrastructure requires a realistic digital representation of the physical system. As this is the core aim of Digital Twins, how can we use Digital Twins in areas such as what-if scenario planning and assessing the circumstances which lead to loss of service?
3.   Dependencies and interdependencies: how can we use Digital Twins to understand and manage them?
 
Within the Hub we would like to encourage members to consider these questions from the perspective of the asset owner/operator they represent and allow us to provide useful feedback to the Commission. The Hub will be running this discussion until the end of August, when we will segue into looking at supporting adaptive planning.
 
 
We have embarked on several industrial ages, and long before the arrival of the digital age the spinning jenny, coal mines, steam engines and telegraph poles created the momentum for the economy we have today.
This platform of industrial progress has enabled our urbanisation, the travel between urban centres and ultimately, the digital connection between them to support a global industrialised commerce system.
The Internet of Computers
The Internet is a loose arrangement of connected but autonomous networks of devices. The devices act as hosts (or connected computers) for a specific purpose (a website host for example) and communicate through a protocol that ties together the interconnected set of networks and devices.
It is not only the backdrop of our new industrial age that makes the Internet fascinating; it is also the culture that emerged from its creation.
The ‘Request for Comments’ series, created by junior team members of the ARPANET project, enabled a loose, counter-hierarchical method for building consensus and developing standards.
That counter-culture was to have a profound impact on collaborators in internet-engineering circles. These collaborators maintained a meritocracy which was open and inclusive. Hacker culture was born from this, and ultimately so was the first host-to-host protocol, the Network Control Protocol.
Open source and hacking were founding behaviours within the culture of early internet engineers.
But the Internet was only the first step in our journey to today’s digital economy.
You have to keep in mind that computing in the 1960s was exclusive to national governments, the military and businesses. However, the proliferation of the telephone provided a vision of the future for connected computing.
In the 1970s, to meet the demand for connecting multiple computers together, Local Area Networks (LANs) were created. The demand for connectivity did not stop there, the Transmission Control Protocol and Internet Protocol (TCP/IP), opened LANs to connect to other LANs.
In 1981 there were only 200 interconnected computers. The vision of an interconnected community of people linked by purpose and interests, rather than proximity, was still practically a long way off.
Internet of Business
What about the dot-com-silicon-valley fairytale of rags to riches?
CERN, the European Organisation for Nuclear Research and home of the world’s largest particle physics laboratory, had a wicked problem to solve:
How can CERN map the complex relationships between the various people, programs and systems?
The answer was Enquire, a program that Tim Berners-Lee created to attempt to achieve that outcome at a local level. This effort eventually led to the creation of a World Wide Web of information. It was no longer merely about connecting computers to the Internet; it became our foundation for publishing the information on those computers to the world.
Despite the creation of hypertext markup language (HTML) and of connections through uniform resource locators (URLs) (the addresses we use to visit data on the web), there was little interest in the WWW at first. The initiative was twice shelved and worked on without any formal approval. Eventually, through the creation of a browser for the WWW, its benefits were realised.
The first ‘thing’ connected was a toaster in 1990.
In 1995 the state ownership of the Internet ended (where the fair use policy restricted commercialisation), unleashing the commercial opportunity of the Internet.
From connecting millions of computers to selling millions of products on eBay, the web rapidly went from a data sharing and discovery tool to a fully functioning marketplace.
Investors marvelled at one of the most remarkable initial public offerings in history (1995) when Netscape (an internet browser company), at just two years old, went public. Berners-Lee was vindicated: only a few years previously he had been begging uninterested students to develop web software.
By July 1997, there were 19.6 million connected computers, and Amazon had over 1 million customers.
No brief history of the web would be complete without a mention of Google. With a name that plays on the word ‘googol’ (a one followed by a hundred zeros), Page and Brin set out to make the WWW discoverable. Yahoo! offered to buy Google for $3bn; Google rejected the offer and eventually prompted the Oxford English Dictionary to add the verb ‘google’.
The end of the ’90s saw the first dot-com bubble burst; the NASDAQ had peaked at 500% higher than when Netscape made its IPO five years earlier. The market contraction was significant, and Alan Greenspan’s phrase ‘irrational exuberance’ captured the economic problem well.
The Internet of Media
While the first commercial wave of the Internet ended in a bust, hacking and the open-source movement were still active. Wikipedia demonstrated the power of open and collaborative systems: it had 20,000 articles in 2002 and grew to 2.5 million by 2009; today it contains 28 billion words across 52 million articles in 309 languages.
Web 2.0 took the plastic nature of digital information and extended the Internet into a platform for connecting people with rich media. The printing press, the compact disc and the physical statements from your bank could not match the plasticity of the Internet.
A simple example is Craigslist, a user-driven website that allows its users to buy and sell almost anything. It was started by Craig Newmark, who circulated e-mail newsletters among friends with notices of events in San Francisco. By utilising the Internet, it became a website with 20 billion page views a month!
It did not stop there. In 1996 the song ‘Until It Sleeps’ by Metallica became the first track to be illegally copied from CD and made available on the Internet as an MP3. It paved the way for a generation to think that music and other digital creative output should be easy to access and nearly free.
64% of teenagers in 2007 had created content to upload to the Internet.
Solving the problem of compression to enable media to be streamed over the Internet redefined the entertainment industry and shaped today's internet culture, which is now considered pop culture.
The Internet of Things
There are 20 billion devices connected to the Internet today, a shift that Hal Varian, Google’s Chief Economist, was already writing about in 2013. We have reached a moment where the website is almost obsolete: our interface with the WWW is increasingly streamed data delivered through services (like Netflix, or video games on Steam) or specific applications (like Facebook and TikTok on mobiles).
It is clear from this rolling history of the Internet that there is still room for its extensibility. The early founding students of ARPANET set a tone of openness and agility that led to connecting computers to computers, networks to networks, and toasters to other things.
That might sound like an obvious thing to say. However, I honestly believe we are still in the early stages of an internet that will converge vast networks of national infrastructure to the benefit of the citizen.

We must preserve the playful and collaborative nature found in internet culture.
Today, The Internet of the Built Environment
From connecting a toaster in 1990 to connecting our built environment, the Internet has been on a rapid journey, and that journey does not stop here.
What next for the Internet? More than data and databases, more than information management, it will help us understand our built and natural environments in new and profound ways.
The vision of the Internet enabling an interconnected community of people linked by purpose and interests, rather than proximity, is a reality today. The Flourishing Systems paper sets out today’s version of that vision.
That flourishing converging network of infrastructure systems is enabled by the National Digital Twin programme, and it draws some interesting parallels from the creation of the modern Internet.
The Commons is a place where we create the protocols needed to connect economic and social infrastructure digitally. A fundamental founding principle of the Commons is setting the behaviour of collaborators. We aim to capture the essence of open source and to collaborate openly with the members of the DT Hub.
With that cultural underpinning, the Commons is also like a zipper: the foundation makes the initial connection, and the slider (the Commons) moves along to connect the following elements together.

The Foundation Data Model and the Reference Data Libraries are like the TCP/IP and HTML frameworks. They form the protocols for connecting digital twins together and enable the built environment to communicate digitally.
This extension of the Internet is a platform for creativity and profound economic growth. Much like the founders of the Internet, who did not predict its impact on creative industries or the political power of empowered communities, we cannot know the future impact of this technology, but it will be profound.
Lastly, it is our only chance to adapt our built environment to operate in harmony with our natural environment.
The National Digital Twin Programme is standing at the beginning of a new wave of interconnectedness, and with open and inclusive collaboration, we will take the first step into a new future.
 
Read more...
Following input from DT Hub members into a community-driven document, we have reduced the number of use cases identified during the Pathway to Value Workshop from 28 down to 12:

- Open Sharing of Data
- Asset Registration
- Scenario Simulation
- Occupant/User Management
- Environmental Management
- Traffic Management
- Process Optimization
- Asset Management
- Carbon Management
- Resource Management
- Resilience Planning
- Risk Management

Using these use cases, we can begin to explore how the National Digital Twin (NDT) programme can support members of the DT Hub in realizing their value.  One way of doing so is by identifying which parts of these use cases need to be developed via the Commons Stream as part of the Information Management Framework (IMF).
The reasoning is that these 12 use cases are:

- Horizontal, meaning that they can be applied within several sectors and their respective industries; and
- High-value, meaning that they can achieve a return on investment.

Positively, these use cases have a strong synergy with a similar schedule presented by Bart Brink of Royal HaskoningDHV in a recent buildingSMART webinar on digital twins.

By identifying DT Hub members’ horizontal, high-value use cases, we hope that their associated tasks, key performance indicators and federation requirements can be recommended for prioritization as part of the development of the Information Management Framework (IMF).
At the beginning of June, CDBB released The Pathway Towards an Information Management Framework: A Commons for a Digital Built Britain, a report setting out the technical approach that will lead to the development of the National Digital Twin.  The report focuses on three key facets that will enable secure, resilient data sharing across the built environment:
- Reference Data Library.  A taxonomy describing a common set of classes to describe the built environment;
- Foundation Data Model.  An ontology outlining the relations between these classes and the properties of those classes; and
- Integration Architecture.  Exchange protocols to facilitate the sharing of information between digital twins, using these defined classes and relations.
Rather than being released as a complete resource, these facets will likely be developed organically as the NDT programme continues to follow its mantra.
As such, the key question isn’t “what should these facets include?” but “what should be included first?”.  We hope to answer this question using these horizontal, high-value use cases.
EXAMPLE:
“Environmental management”.  At the beginning of 2020, news reports focused on air pollution and its link with infrastructure.  In addition, the operators of many building assets may wish to monitor air quality because of its known impact on occupant performance.  As a use case that is associated with regulatory compliance and productivity, and applicable to a breadth of assets, Environmental Management may well be a horizontal, high-value use case.
To support such a use case, the:
- Reference Data Library.  May need to include classes such as Temperature, Wind Speed, Humidity, CO2 and PM2.5, as well as their associated units, to enable the consistent recording of this information;
- Foundation Data Model.  May need an ontology describing acceptable ranges and the relationship of air quality concepts to other classes such as Health and Productivity, depending on the function being monitored; and
- Integration Architecture.  May need to facilitate the sharing of information from sources such as other digital twins, as well as datasets from the Met Office and local governments.

Simply put, by identifying these horizontal, high-priority use cases, we may be able to begin accelerating the realization of their value by having the taxonomies, ontologies and protocols needed to facilitate them available at an earlier stage of the overall IMF development.
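To make the environmental management example concrete, here is a minimal sketch of how a reference data library entry and a foundation-model range check might fit together.  The class names, units and ranges below are entirely hypothetical illustrations, not the actual IMF artefacts:

```python
from dataclasses import dataclass

# Reference Data Library (hypothetical): common classes and their units,
# so every twin records "CO2" in ppm rather than each inventing its own label.
REFERENCE_DATA_LIBRARY = {
    "Temperature": "degC",
    "Wind Speed": "m/s",
    "Humidity": "%RH",
    "CO2": "ppm",
    "PM2.5": "ug/m3",
}

@dataclass
class Observation:
    quantity: str   # must be a class from the reference data library
    value: float

# Foundation Data Model (hypothetical): acceptable ranges relating air
# quality classes to concepts such as occupant health and productivity.
ACCEPTABLE_RANGES = {"CO2": (0.0, 1000.0), "PM2.5": (0.0, 25.0)}

def within_range(obs: Observation) -> bool:
    """Check an observation against the modelled acceptable range."""
    if obs.quantity not in REFERENCE_DATA_LIBRARY:
        raise ValueError(f"Unknown class: {obs.quantity}")
    low, high = ACCEPTABLE_RANGES.get(obs.quantity, (float("-inf"), float("inf")))
    return low <= obs.value <= high

print(within_range(Observation("CO2", 850.0)))
```

The point of the sketch is the division of labour: the library fixes shared vocabulary and units, the model fixes relationships and thresholds, and the integration architecture (not shown) would carry observations like these between twins and external datasets.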
And there we have it.  As DT Hub members begin to consider how the Information Management Framework may support their digital twin development, as well as the national digital twin, which use cases do you think are the most horizontal and high-value?  How might these facets support your ability to undertake these use cases?
Please feel free to add your thoughts below, or, alternatively, comment directly on the draft community-driven document which is, and will continue to be, progressively developed as member views are shared.
