
    Combining asset data into a connected digital twin can give asset owners across energy, water and telecoms networks a better understanding of the risk of extreme weather events caused by climate change, allowing them to take action. 
    “We cannot plan for a more resilient future in silos,” delegates heard at a showcase event this month exploring progress with CReDo – the Climate Resilience Demonstrator – whose second phase is led by Connected Places Catapult and the Digital Twin Hub. 
    Instead, a “whole system” approach is needed which considers the complex connections and interdependencies between different types of infrastructure essential to society’s functioning. 
    Over 450 people joined the hybrid event and heard Elliot Christou, CReDo Technical Lead at the Catapult, and Sarah Snelson, a Director specialising in public policy practice with Frontier Economics, explain the threats posed by a changing climate and the need to take action using sophisticated tools such as CReDo to limit disruption from future flooding. 
    CReDo is a climate change adaptation digital twin which brings together data across energy, water and telecoms networks to create a bird’s eye view of the infrastructure system. Connected Places Catapult has been working with Anglian Water, BT and Openreach, and UK Power Networks, who have brought their people and their data to the project to investigate how data can be shared across sectors and the benefits of doing so through increased climate resilience. 
    We heard how simulations can be run and data interrogated using the CReDo digital twin to allow users to understand more fully the vulnerabilities of their infrastructure networks to flooding. With the correct information to hand, asset owners can make more informed decisions to protect their assets before extreme weather events strike and cause failures to cascade across the system. 
    Scenarios can be created that demonstrate the impact of a range of different future flooding risks,  and show how the loss of one piece of the infrastructure jigsaw puzzle can disrupt other services. CReDo can then be used to coordinate and support decision making to allow the infrastructure system to be better protected and made more resilient. 
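    To make the cascade idea concrete, the sketch below shows how a flood-induced failure might propagate through a toy network of interdependent assets. The asset names and dependency links are invented for illustration only; they are not CReDo data or code.

```python
# Minimal sketch: propagating flood-induced failure through interdependent assets.
# Asset names and dependency links are illustrative only, not CReDo data.
from collections import deque

# Directed dependencies: if an asset fails, the assets that depend on it may fail too.
dependents = {
    "substation_A": ["water_pump_1", "telecom_cabinet_3"],
    "water_pump_1": ["treatment_works_2"],
    "telecom_cabinet_3": [],
    "treatment_works_2": [],
}

def cascade(initially_flooded):
    """Return every asset that fails, directly or via cascading dependencies."""
    failed = set(initially_flooded)
    queue = deque(initially_flooded)
    while queue:
        asset = queue.popleft()
        for downstream in dependents.get(asset, []):
            if downstream not in failed:
                failed.add(downstream)
                queue.append(downstream)
    return failed

print(cascade({"substation_A"}))
# {'substation_A', 'water_pump_1', 'telecom_cabinet_3', 'treatment_works_2'}
```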
    Using data to create “actionable insights” could therefore allow decisions to be made that “keep the lights on at a lower cost for the benefit of network operators and society” it was said. 
    Recent months have been focused on CReDo as a decision-making tool for asset operators. Going forward, the benefits for customers and wider society are set to be explored further. A report on progress with phase two of the project is to be published shortly. 
    Flooding threats made clear 
    The showcase event began with a powerful video featuring Baroness Brown of Cambridge (Professor Dame Julia King), Chair of the Adaptation Committee of the Climate Change Committee, outlining the rising occurrence of extreme weather, the need for infrastructure to be resilient to such events, and how the impact on society could be more serious if authorities are not better prepared. 
    “The climate change resilience of infrastructure networks is a challenge that is not yet well understood and is one that we need to address urgently,” she said. “Asset owners really need to know who they are dependent on,” she added, pointing out that if flooding were to affect an energy substation, for instance, the problem could cascade further. “Understanding risks in advance and how we can mitigate them is key.” 
     
     
    Speakers at the event included Sarah Hayes, Strategic Engagement Lead for CReDo, who explained that one vision for the digital twin is for asset owners to be able to assess the impact of future investment decisions, such as relocating or improving defences for a power substation. 
    While phase one of the CReDo project used a centralised database, phase two explores how to develop a distributed architecture to enable scalability across sectors, regions and organisations, she explained. “We are on a journey towards connected digital twins”. 
    Jethro Akroyd, Principal Engineer at CMCL Innovations, ran through the approach to developing the distributed architecture and explained how CReDo uses a common data structure to enable interoperability between the data sets from the asset owners. He walked the audience through a technical demonstration of the CReDo visualisation showing how the assets are connected and then impacted by flooding scenarios as failure cascades throughout the system. 
    Industry panel shares its insights 
    A panel discussion concluded proceedings, involving representatives from the asset owners involved in CReDo – Anglian Water, BT and UK Power Networks – together with representatives from across infrastructure and climate resilience, moderated by Arup’s global digital leader Simon Evans. 
    "I am hugely impressed by what I have heard,” remarked one of the panel. “What we are talking about is getting access to data". Another said: “We are increasingly seeing the impact of climate change, so energy and water networks definitely need to work more closely together." 
    One utility provider remarked that more frequent severe weather events caused by climate change were having a big impact on its fault rates. “We cannot protect everything all of the time, so the better we understand how systems are inter-related, the more we can help customers and create insights into the most sensible way to protect our network.” 
    National Infrastructure Commissioner Jim Hall commented that it was great to see the use of digital tools to help with the planning of resilient infrastructure. “This is a really exciting space,” he noted, “let’s not stop experimenting”. 
    Connected Places Catapult’s Ecosystem Director for Integrated Infrastructure, Chris Jones, described CReDo as a “great example of the Catapult ethos of bringing together infrastructure sectors, generating a conversation, identifying common ground and sparking innovation”. 
    “We have got ambitious plans to scale CReDo”, he added, “and we want you all to work with us to take this project forward.” 
    Learn more about CReDo 
    Get involved in our next phase 
    Contact us: credo@cp.catapult.org.uk 
     
    Article by Mike Walter, Connected Places Catapult
     
    Read more...
    We all speak the same language, don't we?
    Jonathan Eyre, Senior Technical Fellow in Digital Twins, AMRC and Digital Twin Lead, HVMC
    And what does industrial common language mean for data interoperability?
    In conversation we use language to express thoughts and our point of view, but are the same words being used and understood by everyone in the same way? At the moment, this is most certainly not the case for a term like digital twin, especially with use cases all being so different from each other. This misalignment can be manageable for a small group discussion, but for larger worldwide collaborations the question becomes: how can you trust, and perhaps even validate, that everyone is using the same language in the same way?
    These issues are already deeply embedded in current information systems; information is actively consumed across organisations, supply chains and even across human languages, all trying to exchange information without any loss of quality. Common acronyms can have different meanings: SME stands for both "small and medium-sized enterprise" and "subject matter expert", and even a small misunderstanding like this can cause major downstream issues. Ensuring language is being used in the same way by every end user is difficult, but not impossible, as we’ll discuss.
     
     
     
    So how do we create an industrial common language?
    We live in a complex world where manufactured goods are produced for other sectors (like the built environment), then transported around to let other sectors, like healthcare, provide services to society. A sprawling system of systems.
    The difficulty of creating consistency across this interconnectedness should not be understated. As with most things, there is prior work: “The pathway Towards an Information Management Framework” [1] and its supporting outputs detailed key principles and captured common language formally as “Industry Data Models & Reference Data”. The scale of the overall challenge is overwhelming for any individual; however, individually we don’t need to solve everything. Critically, the framework enables experts in their fields to create consistent language to support everyone in managing information quality all the way to the top.
    This is what the Apollo Protocol [2] is enabling: a method for convening forums to solve problems, establishing a consistent language for them and justifying 'why?' with evidence along the way. Language is ever-evolving, and creating an industrial language is no different, with ongoing effort required.
     

     
    I'm convinced, but what does it really give us? 
    Chiefly, data interoperability. It is often mentioned, somewhat superficially, as being crucial to enabling digital twins and cyber-physical systems; common language is a key step towards achieving it.
    The next step is producing reference data libraries (RDLs), which are a “particular common set of classes and the properties we will want to use to describe our digital twins” [1]. These define the underpinning common language structures that will enable information to be exchanged between information systems at the click of a button, without data loss.
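    As a rough illustration of what a common set of classes and properties can mean in practice, the sketch below shows a tiny, hypothetical reference data library and a check that two organisations could both run before exchanging records. The class and property names are invented and are not drawn from any published RDL.

```python
# Minimal sketch: a shared "reference data library" of classes and properties,
# plus a validator that two organisations could both run before exchanging data.
# Class and property names are invented for illustration only.
REFERENCE_DATA_LIBRARY = {
    "Pump": {"required": {"assetId": str, "ratedFlow_m3_per_h": float}},
    "Substation": {"required": {"assetId": str, "voltage_kV": float}},
}

def conforms(record: dict) -> bool:
    """Check that a record uses a shared class and carries its required properties."""
    spec = REFERENCE_DATA_LIBRARY.get(record.get("class"))
    if spec is None:
        return False
    return all(isinstance(record.get(name), typ) for name, typ in spec["required"].items())

# Two organisations describing assets against the same library:
print(conforms({"class": "Pump", "assetId": "P-001", "ratedFlow_m3_per_h": 120.0}))  # True
print(conforms({"class": "Pump", "assetId": "P-002"}))                               # False: missing property
```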
    Data interoperability and RDLs together provide a new layer to build upon for managing quality information and ensuring overall consistency. Importantly, none of this is technology (or vendor) dependent; it is simply a methodology for analysing the world, backed with evidence. This brings many advantages, above all much greater agility, enabling development in distributed environments and avoiding silos of information that are typically controlled by a single dictated source. Critically, this consistent but distributed approach enables continual, open extensions to improve and innovate the structuring of the language we build up, even across different data management systems in different ecosystems.
    So, what next?
    Consistent language is critical for data interoperability and requires input from everyone. Agreeing on language doesn’t mean that everyone needs to agree unanimously; creating understanding where we can, in our respective areas, will enable the success of the transformations we are all making. With this approach, other areas of opportunity naturally open up, such as mapping between reference data libraries, ultimately enabling us to solve the wicked problems we face together worldwide.
    The Apollo Protocol and its approach provide a way to convene forums to develop a unified common language for industrial data. If you are involved in initiatives and events that are also trying to enable interoperability and data sharing, then perhaps consider what you can do to establish consistent language as a starting point.
    Jonathan Eyre is a member of the DT Hub Advisory Board, Senior Technical Fellow for Digital Twins for the Advanced Manufacturing Research Centre and Digital Twin Lead for High Value Manufacturing Catapult. Contact Jonathan via the DT Hub. 
     
    Links:
    [1] The pathway Towards an Information Management Framework:
    https://www.cdbb.cam.ac.uk/what-we-did/national-digital-twin-programme/pathway-towards-information-management-framework
    [2] The Apollo Protocol: https://theiet.org/apollo-protocol
    Join the Apollo Protocol network discussions:
    https://digitaltwinhub.co.uk/networks/29-the-apollo-protocol/
    Read more...
    Data sharing between digital twins – can we show this in a simple way?
    Sarah Hayes, Strategic Engagement Lead for CReDo
    It’s great to hear about our digital twin projects because they’re exciting and innovative. Our use cases are different and varied because there are so many problems that digital twins can help us solve. In a recent radio interview [1], Lord Deben, Chair of the Committee on Climate Change, suggested that it should be legally required that every single government decision is made with climate change and sustainability in mind, and that each decision should be made more quickly than it currently is. We, the DT Hub community, know that connected digital twins are part of the answer, enabling quicker decisions that take account of more information so that every decision can be made with climate change and sustainability in mind, and it’s part of our duty to communicate this.
    But it’s also part of our job to explain well what digital twins are and how we develop them, not just what they can do for us. When I sat and listened to other presentations at the Utility Week conference last May in Birmingham, I started to wonder how others might become confused by the variety of ways we choose to describe our digital twin projects. Each project has a different diagram to represent how the data is brought together, what the controls over the data are and what the governance looks like. If we had a common diagram to describe these areas we could start to properly compare and contrast our approaches and better understand where bespoke approaches trump a common approach and vice versa.
    A group of data and digital twin experts have come together since the summer to talk about how our own projects tackle the thorny problems of data integration and access. How do we bring together data from different sources? Where do we put that data? And how do we ensure as much of it is as open as possible and data that needs to be secure stays so? We found that we use different names for the same things but after some discussion we can come to consensus on which names seem most appropriate. It’s not an exact science, but through discussion and working through examples together, we’re all making progress.
    We developed the data architecture wheel (with thanks to the Virtual Energy System team at National Grid ESO for sharing the original basis for this diagram) to show how data can be shared. Organisations have digital twins of their assets and may want to share some of their data and almost certainly need data from outside their organisation. We have different ways to share this data. We can share data on a point-to-point basis as below; I email you my file. But that won’t scale as multiple parties send multiple emails to each other (much like today?).

    Or we can develop a central database for open or shared data. We’ll need some access, security and quality protocols (the padlocks) to govern the database and we’ll need a way to agree that. And whilst central databases do have their place, one central database cannot become the national digital twin. And many databases will continue to silo our information, causing duplication, inefficiency and friction.

     
    So we can develop a distributed data sharing architecture with agreed common access, security and quality protocols (the padlocks). This allows organisations to retain control over their data, who accesses it and its quality.

    In reality, we know we’re going to get a bit of each. Can it be represented like this?

    We want your feedback, so let us know! Of course, these diagrams will best come to life when presented in the context of real projects, and that’s why we’re presenting them in the context of CReDo and the Virtual Energy System. Stay tuned to the Gemini Call and the CReDo team will be talking more about the distributed architecture being developed on 21 February.
    In order to ingest data into particular use cases or digital twin projects, it is necessary to use 1) a high-level data structure or model and 2) a more bespoke data structure tailored to the use case. A foundational or top-level ontology would lay the foundations for 1) and 2); this is the thinking behind the development of the Information Management Framework.
    Without an agreed top-level ontology at this stage in our journey, we can still make progress by sharing our common high-level data structures at the industry level and sharing our bespoke data structures at the use case and project level (which can be copied and adopted for similar use cases). But we just need to make sure we’re talking about the same thing and that we can share our learnings as we go.
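    As a rough sketch of the two layers described above, the example below pairs a hypothetical high-level, industry-wide structure with a bespoke extension for a single use case. The field names are invented for illustration only.

```python
# Sketch: a high-level, industry-wide structure plus a bespoke, use-case-specific
# extension of it. Field names are hypothetical and for illustration only.
from dataclasses import dataclass

@dataclass
class Asset:                 # 1) high-level structure shared across projects
    asset_id: str
    asset_type: str
    location: tuple          # (latitude, longitude)

@dataclass
class FloodModelledAsset(Asset):   # 2) bespoke structure for a flood-resilience use case
    flood_depth_threshold_m: float # depth at which the asset is assumed to fail
    recovery_time_h: float         # time to restore service after failure

pump = FloodModelledAsset("P-001", "pump", (52.2, 0.1),
                          flood_depth_threshold_m=0.5, recovery_time_h=12.0)
print(pump)
```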
    Using the same diagrams to point out differences of approach can help. I’m talking through these diagrams at the Gemini Call today and putting out a call to action to help us improve these diagrams and to join in our discussion. Can you use this diagram to represent your digital twin or data sharing project? It would be fantastic to see others using these diagrams to talk about their projects at the Gemini Calls. And can we start to develop shared rules that will enable distributed architectures to work across industries? Join the Data sharing architectures network on the DT Hub and share your feedback: Data sharing architectures - DT Hub Community (digitaltwinhub.co.uk). If you’d like to get more involved then please get in touch.
     
    Sarah Hayes is the Strategic Engagement Lead for CReDo, author of Data for the public good.
    sarah@digitaltwinnercouk.onmicrosoft.com
    Join the Gemini Call Tuesdays at 10:30-11:00 Gemini Call - DT Hub Community (digitaltwinhub.co.uk)
      [1] BBC Sounds - Rethink, Rethink Climate, Leadership
     
     
    Read more...
    A digital twin approach to embodied carbon calculations 
    Glen Worrall, Bentley Systems
    This article considers how to use the DT Toolkit roadmap to deliver a digital twin suitable for embodied carbon reporting. 
    The requirement to reduce everyone’s carbon footprint casts a wide net when you consider the many “carbon” interactions we have in any given day. 
    However, the latency of any transaction tends to limit the ability of carbon teams to influence the design or materials used, which can materially affect an asset’s carbon footprint. 
    The digital twin mindset means that we can ensure our digital model is as close to the physical asset as possible, but also that it can be interrogated quickly and easily. 
    Going digital is a common theme, but its effectiveness must be judged by progress against our targeted goals. The following process may be aligned with embodied carbon workflows but can just as easily be applied to any process that utilises the digital twin framework.
    Why...  and what is it for? 
    We require a digital twin to ensure the standardisation of embodied carbon reporting, which will be effective if we can reduce the embodied carbon calculation cycle from two weeks to near-instant. We also aim to make the report accessible to all project team members, to ensure they are aware of how their decisions can impact the asset’s embodied carbon. 
    The digital twin must be fed from current working practices and will remove the duplication of any data. This is an interesting problem that surfaces many times and generally conflicts with ISO 19650 processes,  i.e. while we want to access SHARED / PUBLISHED information we do not want to access WIP information and we do not want to access source information using a variety of tools. 
    There are many ways we can use standards to enable access to the information and while visualising the result should be seen as part of the enhancements, the federation of external sources is not as straightforward as one would imagine. 

    Carbon Calculators enable smart material selection
    What information do we need and what data do we have? 
    A simple question such as ‘how much concrete is in my model?’ unfortunately involves a very complex process to obtain a result, and one which costing, construction and carbon teams must answer on a regular basis. The definition of the term concrete is not as simple as it should be, with different grades, but standards such as UNICLASS assist and help us locate those elements that will materially affect our embodied carbon total. Even the standardisation of units between the teams that report environmental product declarations and the teams that build engineering models is challenging. However, there are many unit conversion libraries which allow us to work with tonnes, square metres or cubic yards.
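    The underlying arithmetic is straightforward once quantities and factors are expressed in consistent units. The sketch below shows the quantity-times-factor calculation with a simple volume-to-tonnes conversion; the densities and carbon factors are placeholders, not published EPD values.

```python
# Sketch: embodied carbon = quantity x carbon factor, with a simple unit conversion
# step so modelled quantities (m3) line up with factors reported per tonne.
# Densities and carbon factors below are placeholders, not published EPD values.
DENSITY_T_PER_M3 = {"concrete_C32_40": 2.4, "structural_steel": 7.85}
CARBON_FACTOR_KGCO2E_PER_T = {"concrete_C32_40": 120.0, "structural_steel": 1500.0}

def embodied_carbon_kgco2e(material: str, volume_m3: float) -> float:
    """Convert a modelled volume to tonnes, then apply the material's carbon factor."""
    tonnes = volume_m3 * DENSITY_T_PER_M3[material]
    return tonnes * CARBON_FACTOR_KGCO2E_PER_T[material]

quantities_m3 = {"concrete_C32_40": 310.0, "structural_steel": 12.5}   # taken off the model
total = sum(embodied_carbon_kgco2e(m, v) for m, v in quantities_m3.items())
print(f"Total embodied carbon: {total:,.0f} kgCO2e")
```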
     

    Standards such as Uniclass assist in developing industry processes 
    Who will do what? 
    This is interesting, as most teams think that digital twins will remove the task. However, there is still a trade-off to be made by carbon teams: are we locally sourcing or cost constrained? This is part of the project planning that does impact the ability to be carbon zero. As long as we make the task the sole focus and remove the requirement for finding data and presenting results to the project team, there should be a positive impact on the desired outcomes. We need the carbon teams to focus on effective material selection, not on being data wizards. 
    What does the data tell us? 
    The key here is that the results should be accurate, timely and effective. Improving the latency of carbon reporting needs to have an impact that sees the reduction of embodied carbon across all infrastructure. The data should show the reduction, but also how complete the calculation is. An ISO 19650 workflow may release data which is not suitable, i.e. is the volume of a steel vessel really what we want to track, or is it the volume of the shell of the vessel? This information must be transparent, especially when content is missing that will materially impact the global warming potential, i.e. what is the factor for the MEP systems in a building which may not be modelled for the next six months? 
     

    Information Accessibility should be a key value proposition 
    How are we doing? 
    Like all infrastructure projects, the project is an evolving twin. Further, as we move into construction, what processes are in place to ensure the as-built embodied carbon matches the as-designed embodied carbon? There are plenty of processes in place to ensure the as-built asset matches the engineering requirements, but where was a different material used, and how did the construction process impact the actual embodied carbon for an asset? What happens in 12 years’ time, or after the first maintenance window? This is information we need to ensure that carbon and whole-life costs align with our expectations. 
    Conclusion 
    Regardless of the result, a framework for our process and data requirements should lead us to a positive outcome. Further, the ability to identify the preferred outcomes allows us to identify those parts of the process that cannot meet requirements or can be improved. 
    Glen Worrall is a member of the DT Hub Community Council and Director of Digital Integration at Bentley Systems. Contact Glen via the DT Hub.
     
     
    Read more...
    The Digital Twin Journeys workstream has taken world leading research and turned it into accessible and useful information to enable those who are just starting out on their digital twin journeys to get ahead. We have learnt about more than just innovative technologies and their implementation, we have learnt about the type of thinking that makes this research ground-breaking. To take this research forwards and discover what your Minimum Viable Twin is, check out the infographic, the final summary of our workstream.
    Join Desmond and Mara as they embark on a journey of their own to develop a digital twin. As you follow them, you will learn about an approach to design thinking and iterative development that paves the way for effective digital twin prototyping.  
    Read the full infographic here.
    We have taken our journey through assessing the needs of users as they utilise our services. This enables the interventions that we make to be tailored to their needs, considering the ecosystem of services they rely on and the differing levels of access to these services.  
    We have learnt that care needs to be taken when selecting whether to create your own solution from scratch, buy something pre-existing or work with partners. The DeepDish project used well-established code to handle computer vision, and the sensors used in the Staffordshire bridges project were not custom made for it. In short, there is no need to reinvent the wheel.
    As digital twins were themselves first conceived by NASA as a way of managing assets in the most inaccessible place, space, so too have we learnt how we can manage inaccessible assets from space with the help of satellite telemetry. But we also discovered how important skilled data scientists are to making this technique accessible to industry. 
    We learned that digital twin prototypes can be used as a tool for their own continuous cycle of improvement, as each iteration teaches us how to better classify, refine and optimise the data we use in our decision-making. 
    The key to it all is the decisions that we make, the way that we change the world around us based upon the information that we have in front of us. We have learnt that working with decision makers is central to creating digital twins that improve outcomes for people and nature as part of a complex system of systems. We can provide these stakeholders with the information that they need to realise our collective vision for a digital built Britain.
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).
    Check out the rest of the outputs on the CDBB Digital Twin Journeys page.
    Read more...
    To asset owners and managers, understanding how people move through and use the built environment is a high priority, enabling better, more user-focused decisions. However, many of the methods for getting these insights can feel invasive to users. The latest output from Digital Twin Journeys looks at how a researcher at the University of Cambridge has solved this problem by teaching a computer to see. Watch the video to learn more.

    Working from the University of Cambridge Computer Laboratory, Matthew Danish is developing an innovative, low-cost sensor that tracks the movement of people through the built environment. DeepDish is based on open-source software and low-cost hardware, including a webcam and a Raspberry Pi. Using machine learning, Matthew first taught DeepDish to recognise pedestrians and track their journeys through the space, and then began training it to distinguish pedestrians from Cambridge’s many cyclists.
    One of the key innovations in Matthew’s technique is that no images of people are actually stored or processed outside of the camera. Instead, it is programmed to count and track people without capturing any identifying information or images. This means that DeepDish can map the paths of individuals using different mobility modes through space, without violating anyone’s privacy.
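    As a rough illustration of that pattern (not the DeepDish implementation itself), the sketch below uses OpenCV’s built-in pedestrian detector to process each webcam frame in memory, keep only a count, and discard the image.

```python
# Illustration only (not the DeepDish implementation): count people in each frame
# using OpenCV's built-in pedestrian detector, retain only the counts, and discard
# every frame so no images are stored.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)          # webcam
counts = []                        # the only thing we retain
while len(counts) < 100:           # sample 100 frames for this sketch
    ok, frame = cap.read()
    if not ok:
        break
    rects, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    counts.append(len(rects))      # store the count, not the image
    # 'frame' is overwritten on the next iteration and never written to disk
cap.release()
print("People detected per frame:", counts)
```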
    Matthew’s digital twin journey teaches us that technological solutions need not be expensive to tick multiple boxes, and a security- and privacy-minded approach to asset sensing can still deliver useful insights.
    To find out more about DeepDish, read about it here.
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).
    Read more...
    We all want the built environment to be safe and to last. However, minor movements over time from forces such as subsidence can impact how well our assets perform. It can also make connecting and modifying assets harder if they have shifted from the position in which they were built. If the assets are remote or hard to access, this makes tracking these small movements even more difficult.
    The latest instalment from the Digital Twin Journeys series is a video showing the construction and built environment sectors what they need to know about remote sensing and using satellite data, featuring the Construction Innovation Hub-funded research by the Satellites group based at the Universities of Cambridge and Leeds. 
    Using satellite imaging, we may be able to detect some of the tell-tale signs of infrastructure failure before they happen, keeping services running smoothly and our built environment performing as it was designed over its whole life. 

    You can read more from the Satellites project by visiting their research profile. 
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF). 
    Read more...
    Interviews with DT Hub Community Co-Chairs Ali Nicholl, IOTICS, Melissa Zanocco, Infrastructure Client Group and Mark Enzer, Director, CDBB and Head of the National Digital Twin programme
    As a new phase opens up for the Digital Twin Hub (DT Hub), we have relaunched our Hub Insights 'Live' series. 
    The introduction of the Digital Twin Hub Community Council brings with it opportunities for DT Hub members to share experiences and get involved in shaping the direction of the Hub. Community Council Co-Chairs @Ali Nicholl and @Melissa Zanocco outline their thoughts on a changing attitude to data sharing, the development of key projects such as the Climate Resilience Demonstrator (CReDo) which have grown out of DT Hub relationships, and how the Council will use its voice to help enable socio-technical change.
    'Learning by doing, progressing by sharing'
    In the third interview of this mini-series, @Mark Enzer, Director of CDBB and Head of the National Digital Twin programme, speaks about his passion for digital twins and connected digital twins, where it began for him, plus a look at the digital twin landscape and how co-ordination and collaboration will be key to taking the work forward.
    Mark talks about the exciting opportunities that will result from the transition of the DT Hub to an industry/Catapult partnership hosted at the Connected Places Catapult, the influence of the Centre for Digital Built Britain and the National Digital Twin programme, and the importance of a future strategy for the DT Hub focused on its membership - bringing the industry together to develop the roadmap for an ecosystem of connected digital twins.
    Watch the Hub Insights - New horizons videos here:
    Sam Chorlton interviews Ali Nicholl
    Tom Hughes interviews Melissa Zanocco
    Tom Hughes interviews Mark Enzer
     
    Read more...
    Motion sensors, CO₂ sensors and the like are considered to be benign forms of monitoring, since they don’t capture images or personal data about us as we move through the buildings we visit. Or at least, that’s what we want to believe. Guest blogger Professor Matthew Chalmers (University of Glasgow) helped develop a mobile game called About Us as part of the CDBB funded Project OAK. The game takes players through a mission using information from building sensors to help them achieve their aims — with a twist at the end. He writes about why we all need to engage with the ethics of data collection in smart built environments. 
    Mobile games are more than just entertainment. They can also teach powerful lessons by giving the player the ability to make decisions, and then showing them the consequences of those decisions. About Us features a simulated twin of a building in Cambridge, with strategically placed CO₂ sensors in public spaces (such as corridors), and raises ethical questions about the Internet of Things (IoT) in buildings. 
    The premise of the game is simple. While you complete a series of tasks around the building, you must avoid the characters who you don’t want to interact with (as they will lower your game score), and you should contact your helpers — characters who will boost your score. You can view a map of the building, and plan your avatar’s route to accomplish your tasks, based on which route you think is safest. On the map, you can watch the building’s sensors being triggered. By combining this anonymous sensor data with map details of which offices are located where, you can gather intelligence about the movements of particular characters. In this way, you can find your helpers and avoid annoying interactions. If you’ve avoided the bad characters and interacted with the good characters while completing your tasks, you win the game.  
    However, a twist comes after you have finished: the game shows you how much could be inferred about your game character, from the exact same sensors that you had been using to make inferences about other characters. Every task in the game exposes some sensitive data about the player’s avatar, and reinforces the player’s uncomfortable realisation that they have exploited apparently neutral data to find and avoid others. 
    What does this tell us about the ethics of digital twins? Our journeys through the built environment can reveal more than we intend them to, e.g. our movements, our routines, where we congregate, and where we go to avoid others. All this information could inadvertently be revealed by a building digital twin, even though the data used seems (at first glance) to be anonymous and impersonal. The game used CO₂ levels as an example of apparently impersonal data that, when combined with other information (local knowledge in this case), becomes more personal. More generally, data might be low risk when isolated within its originating context, but risk levels are higher given that data can be combined with other systems and other (possibly non-digital) forms of information.  
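    A toy example of that kind of inference: combine nameless corridor sensor triggers with local knowledge of whose office sits where, and the “anonymous” data starts to look personal. Everything in the sketch below is made up.

```python
# Toy illustration of the inference About Us demonstrates: "anonymous" corridor
# CO2 triggers, combined with knowledge of whose office is where, start to look
# personal. The sensor log and office map below are entirely made up.
office_next_to_sensor = {"corridor_2": "Dr A", "corridor_5": "Dr B"}

# Timestamped sensor triggers with no names attached:
triggers = [("09:02", "corridor_2"), ("09:04", "corridor_5"), ("12:31", "corridor_2")]

for time, sensor in triggers:
    likely_person = office_next_to_sensor.get(sensor, "unknown")
    print(f"{time}: movement near {sensor} -> plausibly {likely_person}")
```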
    The Gemini Principles set out the need for digital twins to be ethical and secure, but About Us demonstrates that this can be surprisingly difficult to ensure. Collecting data through digital twins provides aggregate insights — that’s why they’re so useful — but it also creates risks that need ongoing governance. It’s vitally important that citizens understand the double-edged problem of digital twins, so that citizens are more able to advocate for how they want the technology to be used, and not used, and for how governance should be implemented. 
    Gamification is now a well-established technique for understanding and changing user attitudes toward digital technology. About Us was designed to create a safe but challenging environment, in which players can explore an example of data that could be collected in distributed computing environments, the uses to which such data can be put, and the intelligence that can be gathered from resulting inferences. The ultimate purpose of Project OAK is to enable anyone concerned with how data is managed (e.g., data processors, data subjects, governance bodies) to build appropriate levels of trust in the data and in its processing. Only if we recognise the ethical and legal issues represented by digital twins can we start to give meaningful answers to questions about what good system design and good system governance look like in this domain. 
    Information about this project is available on their GitHub page.
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).  
    To join the conversation with others who are on their own digital twin journeys, join the Digital Twin Hub.
    Read more...
    Described in the Pathway to the Information Management Framework, the Integration Architecture is one of the three key technical components of the Information Management Framework (IMF), along with the Reference Data Library and the Foundation Data Model. It consists of the technology and protocols that will enable the managed sharing of data across the National Digital Twin (NDT).
    The IMF Integration Architecture (IA) team began designing and building the IA in April 2021. This blog gives an insight on its progress to date.

     
     

    Principles
    First, it is worth covering some of the key principles being used by the team to guide the design and build of the IA:
    Open Source: It is vital that the software and technology that drives the IA are not held in proprietary systems that raise barriers to entry and prevent community engagement and growth. The IA will be open source, allowing everyone to utilise the capability and drive it forward.
      Federated: The IA does not create a single monolithic twin. When Data Owners establish their NDT Node, the IA will allow them to publish details of data they want to share to an NDT data catalogue, and then other users can browse, select and subscribe to the data they need to build a twin that is relevant to their needs. This subscription is on a node-to-node basis, not via a central twin or data hub, and Owners can specify the access, use, or time constraints that they may wish to apply to that subscriber. Once subscribed, the IA takes care of authenticating users and updating and synchronising data between nodes.
      Data-driven access control: To build trust in the IA, Data Owners must be completely comfortable that they retain full control over who can access the data they share to the NDT. The IA will use an ABAC security model to allow owners to specify in fine-grained detail who can access their data, and permissions can be added or revoked very simply and transparently. This is implemented as data labels which accompany the data, providing instructions to receiving systems on how to protect the data.


      IMF Ontology Driven:  NDT Information needs to be accessed seamlessly. The NDT needs a common language so that data can be shared consistently, and this language is being described in the IMF Ontology and Foundation Data Model being developed by another element of the IMF team. The IA team are working with them closely to create capabilities that will automate conversion of incoming data to the ontology and transact it across the architecture without requiring further “data wrangling” by users.
      Simple Integration: To minimise the risk of implementation failure or poor engagement due to architectural incompatibility or high cost of implementation, the IA needs to be simple to integrate into client environments. The IA will use well understood architectural patterns and technologies (for example REST, GraphQL) to minimise local disruption when data owners create an NDT node, and ensure that once implemented the ongoing focus of owner activity is on where the value is – the data – rather than maintenance of the systems that support it.
      Cloud and On-Prem: An increasing number of organisations are moving operations to the cloud, but the IA team recognises that this may not be an option for everyone. Even when cloud strategies are adopted, the journey can be long and difficult, with hybridised options potentially being used in the medium to long term. The IA will support all these operating modes, ensuring that membership of the NDT does not negatively impact existing or emerging environment strategies.
      Open Standards: For similar reasons to making the IA open source, the IA team is committed to ensuring that data in the NDT IA are never locked in or held in inaccessible proprietary formats.
    What has the IA team been up to this year?
    The IMF chose to adopt the existing open-source Telicent CORE platform to handle the ingest, transformation and publishing of data to the IMF ontology within NDT nodes, and the focus has been on beginning to build and prove some of the additional technical elements required to make the cross-node transactional and security elements of the IA function. Key focus areas were:
    Creation of a federation capability to allow Asset Owners to publish, share and consume data across nodes
      Adding ABAC security to allow Asset Owners to specify fine-grained access to data (a simplified sketch of the data-labelling idea follows this list)
      Building a ‘Model Railway’ to create an end-to-end test bed for the NDT Integration Architecture, and prove out deployment in containers
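    As a simplified sketch of the data-labelling idea described above, the example below attaches an attribute-based label to a record and checks a user’s attributes against it before granting access. The label fields and policy check are invented for illustration; this is not the IA or Telicent CORE implementation.

```python
# Simplified sketch of attribute-based access control via data labels.
# Label fields and the policy check are invented for illustration; this is not
# the IA or Telicent CORE implementation.
from dataclasses import dataclass, field

@dataclass
class LabelledRecord:
    payload: dict
    label: dict = field(default_factory=dict)   # the label travels with the data

@dataclass
class User:
    organisation: str
    clearances: set

def can_access(user: User, record: LabelledRecord) -> bool:
    """Grant access only if the user's attributes satisfy the record's label."""
    allowed_orgs = record.label.get("allowed_organisations", set())
    needed = record.label.get("required_clearances", set())
    return user.organisation in allowed_orgs and needed <= user.clearances

record = LabelledRecord(
    payload={"asset": "substation_A", "flood_depth_m": 0.4},
    label={"allowed_organisations": {"AnglianWater", "UKPN"},
           "required_clearances": {"resilience"}},
)
print(can_access(User("UKPN", {"resilience"}), record))      # True
print(can_access(User("OtherCo", {"resilience"}), record))   # False
```

    Because the label accompanies the data itself, a receiving node can apply the same check locally, and permissions can be changed simply by updating the label.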


     
     
    Read more...
    Sensor technology has come a long way over the last 30 years, from the world’s first, bulky webcam at the University of Cambridge Computer Science Department to near ubiquitous networks of sleek sensors that can provide data at an unprecedented volume, velocity and quality. Today, sensors can even talk to each other to combine single points of data into useful insights about complex events. The new webcomic ‘Coffee Time’ by Dave Sheppard, part of the Digital Twin Journeys series, tells the story of this evolution and what it means for what we can learn about our built environment through smart sensors.  
    Starting with a simple problem – is there coffee in the lab’s kitchen? – researchers in the early 1990s set up the world’s first webcam to get the information they wanted. Today, people in the Computer Lab still want to know when the coffee is ready, but there are more ways to solve the problem, and new problems that can be solved, using smart sensors. Older sensors simply sent information from point A to point B, providing one type of data about one factor; that data then needed to be collated and analysed to get insights. Now smart sensors can share data with each other and generate insights more instantaneously. 
    The West Cambridge Digital Twin team at the computer lab have looked at how specific sequences of sensor events can be combined into an insight that translates actions in the physical world into carefully defined digital events. When someone makes coffee, for example, they might turn on a machine to grind the coffee beans, triggering a smart sensor in the grinder. Then they’d lift the pot to fill it with water, triggering a weight sensor pad beneath to record a change in weight. Then they would switch the coffee machine on, triggering a sensor between the plug and the outlet that senses that the machine is drawing power. Those events in close succession, in that order, would tell the smart sensor network when the coffee is ready. 
    These sequences of sensor triggers are known as complex events. Using this technique, smart sensors in the built environment can detect and react to events like changes in building occupancy, fires and security threats. One advantage of this approach is that expensive, specialist sensors may not be needed to detect rarer occurrences if existing sensors can be programmed to detect them. Another is that simple, off-the-shelf sensors can detect events they were never designed to. As the comic points out, however, it is important to programme the correct sequence, timing and location of sensor triggers, or you may draw the wrong conclusion from the data that’s available. 
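    A minimal sketch of that idea is shown below: a “coffee is brewing” complex event is detected when three simple triggers occur in the right order within a time window. The event names and timings are illustrative only.

```python
# Minimal sketch: detect a "coffee is brewing" complex event as an ordered
# sequence of simple sensor triggers occurring within a time window.
# Event names and timings are illustrative only.
PATTERN = ["grinder_on", "pot_lifted", "machine_drawing_power"]
WINDOW_S = 300   # all three triggers must occur within 5 minutes

def detect(events):
    """events: list of (timestamp_seconds, trigger_name), in time order."""
    idx, start = 0, None
    for t, name in events:
        if idx > 0 and t - start > WINDOW_S:
            idx, start = 0, None          # sequence timed out: start over
        if name == PATTERN[idx]:
            if idx == 0:
                start = t
            idx += 1
            if idx == len(PATTERN):
                return True               # full sequence seen, in order and in time
    return False

stream = [(0, "grinder_on"), (40, "pot_lifted"), (90, "machine_drawing_power")]
print(detect(stream))   # True: coffee is on its way
```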
    Something as simple as wanting to know if the coffee is ready led to the first implementation of the webcam. Digital twin journeys can have simple beginnings, with solving a simple problem with a solution that’s accessible to you, sparking off an evolution that can scale up to solve a wide range of problems in the future. 
    You can read and download the full webcomic here.
    You can read more from the West Cambridge Digital Twin project by visiting their research profile. 
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF). 
    Read more...
    By 2050, an estimated 4.1 million people will be affected by sight loss in the UK, making up a portion of the 14.1 million disabled people in the UK. How might digital twins create opportunities for better accessibility and navigability of the built environment for blind and partially sighted people? A new infographic presents a conception of how this might work in the future.
    In their work with the Moorfields Eye Hospital in London, the Smart Hospitals of the Future research team have explored how user-focused services based on connected digital twins might work. Starting from a user perspective, the team have investigated ways in which digital technology can support better services, and their ideas for a more accessible, seamless experience are captured in a new infographic. 
    In the infographic, service user Suhani accesses assistive technology for blind people on her mobile phone to navigate her journey to an appointment at an eye hospital. On the way, she is aided by interoperable, live data from various digital twins that seamlessly respond to changing circumstances. The digital twins are undetectable to Suhani, but nevertheless they help her meet her goal of safely and comfortably getting to her appointment. They also help her doctors meet their goals of giving Suhani the best care possible. The doctors at the eye hospital are relying on a wider ecosystem of digital twins beyond their own building digital twin to make sure this happens, as Suhani’s successful journey to the hospital is vital to ensuring they can provide her with care. 
    Physical assets, such as buildings and transport networks, are not the only things represented in this hypothetical ecosystem of connected digital twins. A vital component pictured here are digital twins of patients based on their medical data, and the team brings up questions about the social acceptability and security of digital twins of people, particularly vulnerable people. 
    No community is a monolith, and disabled communities are no exception. The research team acknowledges that more research is needed with the user community of Moorfields to understand the variety of needs across the service pathway that digital twins could support. As such, developers need to consider the range of users with different abilities and work with those users to design a truly inclusive ecosystem of digital twins. The work by the Smart Hospitals research team raises wider questions about the role of digital technology both in creating more physical accessibility in the built environment and in potentially creating more barriers to digital accessibility. It is not enough to create assistive technologies if not everyone can – or wants to – have access to those technologies.  
    ‘The role of digital technologies in exacerbating potentially digital inequalities is something that needs to be looked at from a policy perspective, both at the hospital level, but also more generally, from a government Department of Health perspective,’ says Dr Michael Barrett, the project’s principal investigator.  
    Dr Karl Prince, co-investigator, reflects that, ‘The traditional questions when it comes to this type of technology are raised as to: do they have access to equipment, and do they have the technical ability?’ The lesson is that you can build digital twins that create a better experience for people if you design digital systems from the perspective of an ecosystem of services, with input from users of that ecosystem.  
    Through exciting case studies, the project raises vital questions about digital ethics and the potentially transformative effects of digital twins on the physical built environment.
    To read the infographic in detail, click here.
    You can read more from the Smart Hospitals project by visiting their research profile page. 
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).  
    To join the conversation with others who are on their own digital twin journeys, join the Digital Twin Hub.
    Read more...
    Digital twins are not just a useful resource for understanding the here-and-now of built assets. If an asset changes condition or position over its lifecycle, historical data from remote sensors can make this change visible to asset managers through a digital twin. However, this means retaining and managing a potentially much larger data set in order to capture value across the whole life of an asset. In this blog post, Dr Sakthy Selvakumaran, an expert in remote sensing and monitoring, tells us about the importance of curation in the processing of high-volume built environment data.
    There are many sources of data in the built environment, in increasing volumes and with increasing accessibility. They include sensors added to existing structures – such as wireless fatigue sensors mounted on ageing steel bridges – or sensors attached to vehicles that use the assets. Sources also include sensing systems including fibre optics embedded in new structures to understand their capacity over the whole life of the asset. Even data not intended for the built environment can provide useful information; social media posts, geo-tagged photos and GPS from mobile phones can tell us about dynamic behaviours of assets in use.
    Remote sensing: a high-volume data resource
    My research group works with another data source – remote sensing – which includes satellite acquisitions, drone surveys and laser monitoring. There have been dramatic improvements in the spatial, spectral, temporal and radiometric resolution of the data gathered by satellites, which is providing an increasing volume of data to study structures at a global scale. While these techniques have historically been prohibitively expensive, the cost of remote sensing is dropping. For example, we have been able to access optical, radar and other forms of satellite data to track the dynamic behaviour of assets for free through the open access policy of the European Space Agency (ESA).
    The ESA Sentinel programme’s constellation of satellites fly over assets, bouncing radar off them and generating precise geospatial measurements every six days as they orbit the Earth. This growing data resource – not only of current data but of historical data – can help asset owners track changes in the position of their asset over its whole life. This process can even catch subsidence and other small positional shifts that may point to the need for maintenance, risk of structural instability, and other vital information, without the expense of embedding sensors in assets, particularly where they are difficult to access.
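    As a rough illustration of how such a time series might be screened, the sketch below fits a linear trend to synthetic displacement measurements taken every six days and flags the asset if the movement rate exceeds a threshold. None of the numbers are real InSAR data, and real processing involves far more than a straight-line fit.

```python
# Sketch: fit a linear trend to a satellite-derived displacement series sampled
# every six days, and flag the asset if movement exceeds a threshold rate.
# The measurements below are synthetic, not real InSAR data.
import numpy as np

days = np.arange(0, 180, 6)                                            # one acquisition every 6 days
displacement_mm = -0.04 * days + np.random.normal(0, 0.5, days.size)   # slow settlement + noise

rate_mm_per_day, _ = np.polyfit(days, displacement_mm, 1)
rate_mm_per_year = rate_mm_per_day * 365.25

THRESHOLD_MM_PER_YEAR = 10.0                                            # illustrative alert threshold
print(f"Estimated movement: {rate_mm_per_year:.1f} mm/year")
if abs(rate_mm_per_year) > THRESHOLD_MM_PER_YEAR:
    print("Flag for inspection: movement rate exceeds threshold")
```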
    Data curation
    One of the key insights I have gained in my work with the University of Cambridge’s Centre for Smart Infrastructure and Construction (CSIC) is that data curation is essential to capture the value from remote sensing and other data collection methods. High volumes of data are generated during the construction and operational management of assets. However, this data is often looked at only once before being deleted or archived, where it often becomes obsolete or inaccessible. This means that we are not getting the optimal financial return on our investment on that data, nor are we capturing its value in the broader sense.
    Combining data from different sources or compiling historical data can generate a lot of value, but the value is dependent on how it is stored and managed. Correct descriptions, security protocols and interoperability are important technical enablers. Social enablers include a culture of interdisciplinary collaboration, a common vision, and an understanding of the whole lifecycle of data. The crucial element that ensures we secure value from data is the consideration of how we store, structure and clean the data. We should be asking ourselves key questions as we develop data management processes, such as: ‘How will it stay up to date?’ ‘How will we ensure its quality?’ and ‘Who is responsible for managing it?’
    Interoperability and standardisation
    The more high-volume data sources are used to monitor the built environment, the more important it is that we curate our data to common standards – without these, we won’t even be able to compare apples with apples. For example, sometimes when I have compared data from different satellite providers, the same assets have different co-ordinates depending on the source of the data. Like manual ground surveying, remote measurements can be made relative to different points, many of which assume (rightly or wrongly) a non-moving, stationary point. Aligning our standards, especially for geospatial and time data, would enable researchers and practitioners to cross-check the accuracy of data from different sources, and give asset managers access to a broader picture of the performance of their assets.
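    One small, practical piece of that alignment is simply getting coordinates from different providers into the same reference system before comparing them. The sketch below does this with the pyproj library; the coordinates and the resulting offset are illustrative.

```python
# Sketch: re-project coordinates from two data providers into one common
# reference system (here, British National Grid) before comparing them.
# Coordinates are illustrative; requires pyproj (pip install pyproj).
from pyproj import Transformer

to_bng = Transformer.from_crs("EPSG:4326", "EPSG:27700", always_xy=True)

provider_a = (0.1218, 52.2053)     # lon, lat (WGS84) for the same asset
provider_b = (0.1220, 52.2054)     # slightly different position from another source

ax, ay = to_bng.transform(*provider_a)
bx, by = to_bng.transform(*provider_b)
offset_m = ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
print(f"Positions differ by {offset_m:.1f} m once in a common reference system")
```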
    Automated processing
    The ever increasing quantity of data prohibits manual analysis by human operators beyond the most basic tasks. Therefore, the only way to enable data processing at this large scale is automation, fusing together remote sensing data analysis with domain-specific contextual understanding. This is especially true when monitoring dynamic urban environments, and the potential risks and hazards in these contexts. Failure to react quickly is tantamount to not reacting at all, so automated processing enables asset owners to make timely changes to improve the resilience of their assets. Much more research and development is needed to increase the availability and reliability of automated data curation in this space.
    If we fail to curate and manage data about our assets, then we fail to recognise and extract value from it. Without good data curation, we won’t be able to develop digital twins that provide the added value of insights across the whole life of assets. Data management forms the basis for connected digital twins, big data analysis, models, data mining and other activities, which then provide the opportunity for further insights and better decisions, creating value for researchers, asset owners and the public alike.
     
    You can read more from the Satellites project by visiting their research profile.
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).
    For more on the Digital Twin Journeys projects, visit the project's homepage on the CDBB website.

    Read more...
    Our latest output from the Digital Twin Journeys series is a webcomic by David Sheppard. 'Now We Know' tells the story of a fictional building manager, Hank, who isn't sure how a building digital twin can help him in his work when the existing building management system tells him what he thinks he needs to know. 
    This same tension plays out around real-world digital twin development, as advocates point to the promise of perfect, right-time information to make better decisions, while others remain unconvinced of the value that digital twins can add. As the West Cambridge Digital Twin research team developed a prototype digital twin, they encountered this barrier, and found that working with the building-manager-as-expert to co-develop digital twin capability is the way to go. While they grounded iterations of the prototype in the building managers' present needs, they were also able to present the potential capability of the digital twin in ways that demonstrated its value. This is mirrored in the fictional narrative of the comic in the consultation between the Cambridge Digital Twin Team expert and the building manager, Hank.
    Involving end users, like building occupants and managers, in the design and development of digital twins will ensure that they meet real-world information needs. Both people and data bring value to the whole-life management of assets. Many uncertainties exist in the built environment, and in many cases when pure data-driven solutions get into trouble (e.g. through poor data curation or low data quality), expertise from asset managers can bolster automated and data-driven solutions. Therefore, incorporating the knowledge and expertise of the frontline managers is crucial to good decision-making in building operations. 
    The benefits of this hybrid approach work in the other direction as well. While the knowledge developed by building managers is often lost when people move on from the role, the digital twin enables the curation of data over time, making it possible to operate buildings beyond the tenure of individual staff members based on quality data.
    At present, the knowledge of experienced asset managers, in combination with existing building information, is greater than the insights that early-stage digital twins can offer. But that does not mean that the promise of digital twins is a false one. It simply means that there is still a long way to go to realise the vision of right-time, predictive information portrayed in the comic. Digital twin prototypes should therefore be developed in partnership with these experienced stakeholders.
    You can read more from the West Cambridge Digital Twin project by visiting their research profile, and find out about more Digital Twin Journeys on the project's homepage.
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).
    Read more...
    When we travel by train, we expect to arrive at our destination safely and on time. The safety and performance of its service network are therefore key priorities for Network Rail. Our latest video in the Digital Twin Journeys series tells the story of how researchers inherited two intensively instrumented bridges and are transforming that high-volume, high-velocity data into a digital twin showing the wear and pressures on the bridges, along with other information that can help the asset owner predict when maintenance will be required and meet those priorities.
    Remote monitoring has several benefits over using human inspectors alone. Sensors reduce the subjectivity of monitoring: factors such as light levels, weather and variations in alertness can change the assessments made by human inspectors. By monitoring the stresses on the bridge, sensors may also identify issues before visual inspection can detect them. A human inspector will still be sent to site to follow up on what the remote sensing has indicated, and engineers will of course still need to perform maintenance, but remote monitoring allows the asset owner to be smarter about how these human resources are deployed.
    One important insight for Network Rail comes from more accurate data about the loads the bridges are experiencing, and the research team has developed a combination of sensors to create a Bridge Weigh-In-Motion (B-WIM) technology. As shown in the video, tilt, bridge deformation and axle location sensors are combined to calculate the weight of passing trains. Because the accuracy of weight predictions is affected by changes in ambient humidity and temperature, sensors were added to detect these factors as well. Accelerometers were added to calculate rotational restraints at the boundary conditions, further improving the accuracy of weight predictions, and cameras were installed so that passing trains can be categorised by analysing the video footage.
    The digital twin of the Staffordshire Bridges centres on a physics-based model for conducting structural analysis and load-carrying capacity assessments. The site-specific information, such as realistic loading conditions obtained by the sensors, will be fed into the physics-based model to simulate the real structure and provide the outputs of interest. A digital twin replica of the structure will be able to provide bridge engineers with any parameter of interest anywhere on the structure, including in non-instrumented locations.
    All of the sensors on these bridges produce a high volume of data at high velocity. Without data curation we could easily be overwhelmed, but the research team is learning to manage the right data in ways that provide the right insights at the right time. Working with Network Rail, this project will demonstrate the use of real-time data analytics integrated with digital twins to provide useful information that helps engineers and asset managers schedule proactive maintenance programmes and optimise future designs, increasing safety and reliability across the whole portfolio of assets.
    You can read more from the Staffordshire Bridges project by visiting their research profile.
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).
    To see more from the Digital Twin Journeys series, see the homepage on the CDBB website.

    Read more...
    A new infographic, enabled by the Construction Innovation Hub, is published today to bring to life a prototype digital twin of the Institute for Manufacturing (IfM) on the West Cambridge campus. Xiang Xie and Henry Fenby-Taylor discuss the infographic and lessons learned from the project.
    The research team for the West Cambridge Digital Twin project has developed a digital twin that allows various formats of building data to function interoperably, enabling better insights and optimisation for asset managers and better value per whole-life pound.
    The graphic places the asset manager, as decision-maker, at the heart of this process, and illustrates that each iteration improves the classification and refinement of the data. It also highlights challenges and areas for future development, showing that digital twin development is an ongoing journey, not a finite destination.
    The process of drawing data from a variety of sources into a digital twin and transforming it into insights goes through an iterative cycle of:  
    - Sense/Ingest – use sensor arrays to collect data, or draw on pre-existing static data, e.g. a geometric model of the building
    - Classify – label, aggregate, sort and describe data
    - Refine – select what data is useful to the decision-maker at what times and filter it into an interface designed to provide insights
    - Decide – use insights to weigh up options and decide on further actions
    - Act/Optimise – feed changes and developments to the physical and digital twins to optimise both building performance and the effectiveness of the digital twin at supporting organisational goals.
    Buildings can draw data from static building models, quasi-dynamic building management systems and smart sensors, all with different data types, frequencies and formats. This means that a significant amount of time and resources is needed to manually search, query, verify and analyse building data scattered across different databases, and this process can lead to errors.
    The aim of the West Cambridge Digital Twin research facility project is to integrate data from these various sources and automate the classification and refinement for easier, more timely decision-making. In their case study, the team has created a digital twin based on a common data environment (CDE) that is able to integrate data from a variety of sources. The Industry Foundation Classes (IFC) schema is used to capture the building geometry information, categorising building zones and the components they contain. Meanwhile, a domain vocabulary and taxonomy describe how the components function together as a system to provide building services. 
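    As a rough illustration of the Classify and Refine stages described above, the snippet below labels a handful of invented building components against a toy taxonomy and filters them down to what one decision needs. The real project's IFC-based schema and domain vocabulary are far richer; none of these identifiers or mappings come from it.

```python
# Simplified, hypothetical illustration of the Classify and Refine stages:
# building components (as they might be extracted from an IFC model) are
# labelled against a small domain taxonomy and then filtered down to what a
# facilities manager needs for one decision.

COMPONENTS = [
    {"id": "AHU-01", "ifc_class": "IfcUnitaryEquipment", "zone": "Lecture Theatre 1"},
    {"id": "VAV-12", "ifc_class": "IfcDamper",           "zone": "Lecture Theatre 1"},
    {"id": "LUM-77", "ifc_class": "IfcLightFixture",     "zone": "Office 2.14"},
]

# Classify: map component classes onto building-service systems.
TAXONOMY = {
    "IfcUnitaryEquipment": "ventilation",
    "IfcDamper": "ventilation",
    "IfcLightFixture": "lighting",
}

def classify(components):
    """Attach a service-system label to each component."""
    return [dict(c, system=TAXONOMY.get(c["ifc_class"], "unclassified")) for c in components]

def refine(classified, zone, system):
    """Select only the components relevant to one decision, e.g. diagnosing
    poor ventilation in a particular zone."""
    return [c for c in classified if c["zone"] == zone and c["system"] == system]

if __name__ == "__main__":
    relevant = refine(classify(COMPONENTS), zone="Lecture Theatre 1", system="ventilation")
    print(relevant)  # the manager sees AHU-01 and VAV-12, not the whole model
```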
    The key to achieving this aim was understanding the need behind the building management processes already in place. This meant using the expertise and experience of the building manager to inform the design of a digital twin that was useful and usable within those processes. This points to digital twin development as a socio-technical project, involving culture change, collaboration and alignment with strategic aims, as well as technical problem solving.
    In the future, the team wants to develop twins that can enhance the environmental and economic performance of buildings. Further research is also needed to improve the automation at the Classify and Refine stages so they continue to get better at recognising what information is needed to achieve organisational goals. 
    You can read more from the West Cambridge Digital Twin project by visiting their research profile. 
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).  
    To see more from the Digital Twin Journeys series, see the homepage on the CDBB website.
    Read more...
    Digital twins enable asset owners to use better information at the right time to make better decisions. Exploring the early stages of a digital twin journey – understanding the information need – are Staffordshire Bridges researcher Dr Farhad Huseynov and Head of Information Management Henry Fenby-Taylor.
    Network Rail manages over 28,000 bridges, many of them more than 150 years old. The primary means of evaluating the condition of the bridges is through two assessment programmes: visual examination and Strength Capability Assessment. Every conceivable form of bridge construction is represented across Network Rail’s portfolio of assets, from simple stone slabs to large estuary crossings such as the Forth Bridge. Managing a portfolio of this diversity with frequent and extensive assessments is a considerable challenge.
    Condition monitoring
    The current process for condition monitoring involves visual examination by engineers and takes place every year, along with a more detailed examination every six years. The visual inspection provides a qualitative outcome and does not directly predict the bridge strength; it is conducted to keep a detailed record of visible changes that may indicate deterioration. The load-carrying capacity of bridges is evaluated every five years through a Strength Capability Assessment, conducted in three levels of detail:
    - Level 1 is the simplest, using safety assumptions known to be conservatively over-cautious (i.e. a 1-dimensional structural idealisation).
    - Level 2 involves refined analysis and better structural idealisation (i.e. a grillage model). This level may also include the use of data on material strength based on recent material tests.
    - Level 3 is the most sophisticated level of assessment, requiring bridge-specific traffic loading information based on a statistical model of the known traffic.
    Understanding the information and insights that asset owners require helps shape what data is needed and how frequently it should be collected – two essential factors in creating infrastructure that is genuinely smart. During discussions with Network Rail, the research team found that Level 3 assessment is only used in exceptional circumstances. This is because there is no active live train load monitoring system on the network, and hence no site-specific traffic loading information is available for the majority of bridges. Instead, bridges failing Level 2 assessment are typically put under weight and/or speed restrictions, reducing their ability to contribute to the network. This means that there is potentially huge value in providing Level 3 assessment at key sites with greater frequency.
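    Purely to illustrate why site-specific traffic data matters, the sketch below (not Network Rail logic) picks the highest assessment level the available information could support; the flags and rule are hypothetical.

```python
def assessment_level(has_refined_model: bool, has_site_traffic_data: bool) -> int:
    """Pick the highest Strength Capability Assessment level the available
    information supports (simplified, illustrative logic only)."""
    if has_site_traffic_data:
        return 3   # bridge-specific statistical traffic loading available
    if has_refined_model:
        return 2   # grillage model, possibly with material test data
    return 1       # conservative 1-D idealisation

# Without live load monitoring most bridges stop at Level 2:
print(assessment_level(has_refined_model=True, has_site_traffic_data=False))  # 2
```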
    Digital twins for condition assessment
    The Stafford Area Improvement Programme was set up to remove a bottleneck in the West Coast Main Line that resulted in high-speed trains being impeded by slower local passenger and goods trains. To increase network capacity and efficiency, a major upgrade of the line was undertaken, including the construction of 10 new bridges. Working with Atkins, Laing O’Rourke, Volker Rail and Network Rail, a research team drawn from the Centre for Smart Infrastructure and Construction (CSIC), the Centre for Digital Built Britain (CDBB) and the Laing O’Rourke (LOR) Centre for Construction Engineering and Technology at the University of Cambridge is developing a digital twin solution for effective condition monitoring.
    Two bridges in the scheme were built with a variety of sensors to create a prototype that would enable the team to understand their condition, performance and utilisation. Both bridges were densely instrumented with fibre optic sensors during construction, enabling the creation of a digital twin of the bridges in use. The digital twin’s objective is to provide an effective condition monitoring tool for asset and route managers, using the sensor array to generate data and derive insights.
    Identifying challenges and solutions
    Meetings were held with key stakeholders including route managers and infrastructure engineers at Network Rail to learn the main challenges they face in maintaining their bridge stock, and to discover what information they would ideally like to obtain from an effective condition monitoring tool. The team liaised closely with the key stakeholders throughout to make sure that they were developing valuable insights.
    Through discussions with Network Rail about the team’s work on the two instrumented bridges in the Staffordshire Bridges project, the following fundamental issues and expected outcomes were identified:
    - A better understanding of asset risks: how can these be predicted, and what precursors can be measured and detected?
    - A better understanding of individual asset behaviour
    - Development of sensor technology with a lifespan and maintenance requirement congruent with the assets being monitored
    - How structural capability can be calculated instantly on the receipt of new data from the field
    - Development of a holistic system for the overall health monitoring and prognosis of structural assets
    - Realistic traffic population data in the UK railway network (can this be predicted with sufficient accuracy for freight control and monitoring purposes?)
    To address these issues, the team instrumented one of the bridges with the following additional sensors, which, combined, produce a rich dataset:
    - Rangefinder sensors to obtain the axle locations
    - A humidity and temperature sensor to improve the accuracy of weight predictions against variations in ambient temperature
    - Accelerometers to calculate rotational restraints at the boundary conditions and therefore improve the accuracy of weight predictions
    - Cameras to categorise passing trains
    Data from these sensors feeds into a finite element model structural analysis digital twin that interprets the data and provides a range of insights about the performance of the bridge and the actual strain it has been put under.
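    The snippet below is a hypothetical sketch of how a single train-passage record might be assembled from those sensor streams before being handed to a structural model. The field names, correction factor and reference values are illustrative, and the project’s physics-based finite element model is not reproduced here.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical record assembled from the bridge's sensor streams for one
# train passage. Field names are illustrative; the project's real pipeline
# feeds a physics-based finite element model, which is not reproduced here.
@dataclass
class TrainPassage:
    timestamp: str
    axle_positions_m: List[float]     # from rangefinder sensors
    strain_microstrain: List[float]   # from fibre optic sensors
    rotations_rad: List[float]        # from accelerometers at the supports
    temperature_c: float              # for ambient compensation
    humidity_pct: float
    train_category: str               # from camera-based classification

def temperature_corrected_strain(passage: TrainPassage, coeff: float = 0.002) -> List[float]:
    """Apply a simple linear temperature correction to the measured strains.
    The coefficient is a placeholder, not a calibrated project value."""
    reference_c = 10.0
    delta = passage.temperature_c - reference_c
    return [s * (1.0 - coeff * delta) for s in passage.strain_microstrain]

def to_model_input(passage: TrainPassage) -> dict:
    """Bundle the corrected measurements into the structure a downstream
    structural-analysis model might consume."""
    return {
        "when": passage.timestamp,
        "axles_m": passage.axle_positions_m,
        "strain": temperature_corrected_strain(passage),
        "support_rotations": passage.rotations_rad,
        "category": passage.train_category,
    }
```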
    Applying insights to other bridges
    Significantly, information from the instrumented bridge sites is relevant to adjacent bridges on the same line. Having one bridge instrumented on a specific route would enable Level 3 assessment for other structures in their portfolio and those of other asset owners, including retaining walls, culverts, and other associated structures. Just as the new bridges relieved a service bottleneck, digital twins can resolve procedural and resource bottlenecks by enabling insights to be drawn about the condition of other assets that weren’t instrumented.
    This is a valuable insight for those developing their own digital twins: where trains cannot divert course, every bridge along the same stretch of track experiences the same loads from the same trains, so instrumenting one bridge yields insight into its neighbours. This will enable teams to implement sensor networks across their own assets efficiently.
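    A minimal sketch of that reasoning, with invented route and load data, might look like this:

```python
# Where there is no diversionary route, every bridge on the same stretch of
# track sees the same trains, so a load history measured at one instrumented
# bridge can be attributed to its non-instrumented neighbours.
# Route and load data below are invented for illustration.

ROUTE_SEGMENTS = {
    "segment_A": ["bridge_1 (instrumented)", "bridge_2", "bridge_3"],
}

MEASURED_LOADS = {
    # tonnes per passing train, measured at the instrumented bridge
    "bridge_1 (instrumented)": [412.0, 389.5, 401.2],
}

def inferred_load_history(segment: str) -> dict:
    """Attribute the measured load history to every bridge on the same segment."""
    bridges = ROUTE_SEGMENTS[segment]
    source = next(b for b in bridges if b in MEASURED_LOADS)
    return {b: MEASURED_LOADS[source] for b in bridges}

print(inferred_load_history("segment_A"))
```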
    One of the outcomes of the Staffordshire Bridges project is development towards a holistic approach for the overall health monitoring and prognosis of bridge stocks. Such changes improve workforce safety by reducing the requirement for costly site visits while maintaining a healthy bridge network.
    You can read more from the Staffordshire Bridges project by visiting their research profile.
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF). 
    To keep up with the Digital Twin Journeys project, check out the Digital Twin Journeys home page.
    Read more...
    Digital twins can help organisations achieve various goals. In some cases, the end goal is for buildings and infrastructure to last longer, use less energy and be safer. In others, it is enhancing the lives of people who interact with the built environment and its services. As highlighted by the Gemini Principles, these are not mutually exclusive aims, so wherever you are on your digital twin journey, it is important to consider other perspectives on the hybrid digital and physical systems you create. How will your digital twin fit into a wider ecosystem that provides services to all kinds of people? How will your asset’s performance impact the wider built environment and those who need to navigate it? Whose lives will be better if you share data securely and purposefully?
    In the first output from the Digital Twin Journeys series, the team working on the Smart Hospital of the Future research project, enabled by the Construction Innovation Hub, shared case studies from two smart hospitals and reflected on the innovations they saw during the COVID-19 pandemic. In this two-video mini-series, the research team shares insights about how existing digital maturity enabled these hospitals to respond to the pandemic in agile ways, transforming to a hybrid physical and digital model of care distributed across multiple sites. They also explored how individual asset digital twins fit into a wider landscape of ecosystem services, guiding how we approach interoperability to achieve better outcomes.


    These insights inform the way we think about the role of digital twins in the smart built environments of the future. Dr Nirit Pilosof reflects that, ‘Digital twin as a concept can promote the design of the new system, the design process of the built environment and the technologies, but also really help operate… the hybrid models looking at the physical and virtual environments together.’ If health care is enabled by connected digital twins, how could the design of hospitals – and whole cities – change? 
    In the videos, the team also discusses the limitations and ethics of services enabled by digital data and the use of digital technologies to improve staff safety, from isolated COVID wards to telemedicine. They frame service innovation as an iterative and collaborative process, informed by the needs of digital twin users, whether those are the asset owners and operators, or the people benefitting from the services they provide. 
    According to project co-lead Dr Michael Barrett, ‘The people who need to drive the change are the people who are providing the service.' After the COVID crisis, we can better recognise what we have learned from implementing digital services at scale, as more people than ever have relied on them. The team reflect that having the right people in the right roles enabled the smart hospitals in these cases to transform their services rapidly in response to the need. The same human and organisational infrastructure that is creating the smart hospital of the future is also needed to create the flexible, responsive built environments of the future.
    Digital Twin Journeys can start from the perspective of available technology, from a problem-solving perspective, or from the perspective of users experiencing a service ecosystem. The smart hospitals project demonstrates the value of the latter two approaches. Hospital staff were instrumental in shaping the digitally-enabled service innovation to keep them safe and offer better services on and offsite, but project co-lead Dr Karl Prince points out how people accessing those services have to navigate a variety of different services in the built environment to get there. As we begin to connect digital twins together, we need to consider not just our own needs but the needs of others that digital twins can address. 
    For more on this project, including links to their publications, see the team’s research profile on the CDBB website. Keep up with the Digital Twin Journeys series on the CDBB website or here on the Digital Twin Hub blog.
    Read more...
    This week marks the one-year anniversary of the National Digital Twin programme’s (NDTp) weekly Gemini Call – an online progress update from the NDTp with a feature spot for members of the Digital Twin Hub to showcase projects and share digital twin knowledge and experiences. DT Hub Chair, Sam Chorlton, tells us about the call, its beginnings and the latest move to the DT Hub.
    There’s no doubt that the Gemini Call has been a game-changing addition to the NDTp. Initiated by CDBB CReDo Lead Sarah Hayes, it was launched in September 2020 as part of the Gemini programme to inform our friends and followers about programme developments.
    In its early days, the call also played a major part in opening the dialogue for the creation and delivery of NDTp projects, notably the Digital Twin Toolkit project, which resulted in a report and template package to help organisations build a business case for a digital twin. (We’re excited that the template has since been downloaded nearly 1,000 times.)
    We could not have achieved the Toolkit project without the input of supporters across 17 DT Hub member organisations, and it was the members’ pro bono contributions and willingness to collaborate in this venture that enabled us to open up opportunities for knowledge sharing and discussions about digital twin journeys.
    By the community, for the community
    Today, the half-hour Gemini Call brings in around 60 participants each week, and over the year nearly 300 members have attended at least once. This year we have changed the agenda to allow for a feature focus by DT Hub member organisations to present digital twin projects or research, followed by a forum for Q&A. To date, there have been 16 digital twin presentations given by organisations worldwide. It is this free exchange of knowledge and open discussion between members of the community that is pushing progress on an international scale.
    Sarah Hayes gives her take on the year: “We’re thrilled with what has happened with the call and we are telling everyone to come and get involved. We have over 2,000 members from government, public and private industry sectors and academia, and there is so much we can all learn from one another. Right now, there is a groundswell of connected digital twin development, and the DT Hub community can access this first hand.”
    Gemini Call chair and Digital Energy leader at Arup, Simon Evans, said, “The call has been an excellent forum to bring industry together, whatever the experience or involvement with digital twins, and provide that regular knowledge-share and update on leading international digital twin developments.”
    The Gemini Call sits centre stage within the DT Hub community as a member-focused exchange to help organisations increase their digital twin knowhow - it is a focal point for the community as we experience and drive digital transformation. Come and join the conversation!
    Progressing by sharing challenge
    One year on, we set this challenge to our members: invite a guest from your network to the next Gemini Call so we can expand the discussion and break down the barriers to sharing data.
    Become a DT Hub member
    Sign up to join the Gemini Call
     

    Read more...
    An update from the Information Management Framework Team of the National Digital Twin programme
    The mission of the National Digital Twin programme (NDTp) is to enable the National Digital Twin (NDT), an ecosystem of connected digital twins, where high quality data is shared securely and effectively between organisations and across sectors. By connecting digital twins, we can reap the additional value that comes from shared data as opposed to isolated data: better information, leading to better decisions through a systems thinking approach, which in turn enable better outcomes for society, the economy and our environment.
    The NDTp’s approach to data sharing is ambitious: we are aiming for a step change in data integration, one where the meaning is captured sufficiently accurately that it can be shared unambiguously. Conscious that “data integration” may justifiably mean different things to different people, we would like to shed some light on our current thinking and present one of the tools we are currently developing to help us articulate the need for this step change. It is a scheme for assessing the level of digitalisation of data items based upon four classifiers: the extent of what is known, media, form, and semantics. The scheme entails the 8 levels below, which are likely to be fine-tuned as we continue to apply the scheme to assess real data sets:
    Levels of digitalisation: towards grounded semantics

    We trust that the first levels will resonate with your own experience of the subject:
    Extent: as it is not possible to represent what is unknown, the scheme starts by differentiating the known from the unknown. By looking into the information requirements of an organisation, “uncharted territories” may be uncovered, which will need to be mapped as part of the digitalisation journey.
      Media: information stored on paper (or only in brains) must be documented and stored in computer systems.
      Form: information held in electronic documents such as PDFs, Word documents, and most spreadsheets, needs to be made computer readable, i.e. moved to information being held as data, in databases and knowledge graphs for example.
    Semantics: the progression towards “grounded semantics”, and in particular the step from the “explicit” level to the “grounded” level, is where, we believe, the fundamental change of paradigm must occur. To set the context for this step, it is worth going back to some fundamental considerations about the foundational model for the Integration Architecture of the NDT.
    From a Point-to-Point model to a Hub and Spoke model empowered by grounded semantics

    A key challenge at present is how to share data effectively and efficiently. What tends to happen organically is that point-to-point interfaces are developed as requirements are identified between systems with different data models and perhaps different reference data. The problem is that this does not scale well. As more systems need to be connected, new interfaces are developed which share the same data with different systems, using different data models and reference data. There are also maintenance problems: when a system is updated, its interfaces are likely to need updating as well. This burden has been known to limit the effective sharing of data as well as imposing high costs.
    The alternative is a hub and spoke architecture. In this approach, each system has just one interface to the hub, which is defined by a single data model and reference data that all systems translate into and out of. It is important to note that although the hub could be a central system, it does not need to be: the hub can be virtual, with data shared over a messaging system according to the hub data model and reference data. This reduces costs significantly and means that data sharing can be achieved more efficiently and effectively. Nor is this novel: existing industry standard data models were developed to achieve exactly this. What is new is the requirement to share data across sectors, not just within a single sector, and to meet more demanding requirements.
    Thus, the National Digital Twin programme is developing a Foundation Data Model (a pan-industry, extensible data model), enabling information to be taken from any source and amendments to be made on a single node basis.
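    A small illustration, not NDTp code, of why the hub and spoke model scales: n systems need on the order of n(n-1)/2 point-to-point interfaces but only n adapters to a shared hub model, and each adapter simply translates local terms into the hub’s reference terms. The adapter, field names and mappings below are hypothetical.

```python
# Illustrative only: interface counts for point-to-point vs hub-and-spoke,
# plus a toy adapter that maps a system's local field names onto shared
# hub reference terms.

def pairwise_interfaces(n_systems: int) -> int:
    return n_systems * (n_systems - 1) // 2

def hub_adapters(n_systems: int) -> int:
    return n_systems  # one adapter per system, into and out of the hub model

for n in (3, 10, 50):
    print(n, pairwise_interfaces(n), hub_adapters(n))
# 3 systems:  3 interfaces vs 3 adapters
# 10 systems: 45 interfaces vs 10 adapters
# 50 systems: 1225 interfaces vs 50 adapters

def to_hub(record: dict, local_to_hub_terms: dict) -> dict:
    """Translate a system's local field names into the hub's reference terms."""
    return {local_to_hub_terms.get(k, k): v for k, v in record.items()}

print(to_hub({"substn_id": "S42", "volts": 11000},
             {"substn_id": "asset_identifier", "volts": "nominal_voltage_v"}))
```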
    But what would differentiate the NDT's common language - the Foundation Data Model - from existing industry data models?
    Our claim is that the missing piece in most existing industry data models, which have “explicit semantics”, is an ontological foundation, i.e. “grounded semantics”.
    Experience has shown us that although there is just one real world to model, there is more than one way to look at it, which gives rise to a variety of data models representing the same “things” differently and, eventually, to challenges for data integration. To tackle these, we recommend clarifying ontological commitments (see our first conclusions on the choice of a Top Level Ontology for the NDT’s Foundation Data Model) so that a clear, accurate and consistent view of “the things that exist and the rules that govern them” can be established. We believe that analysing datasets through this lens and semantically enriching them is a key step towards better data integration.
    As we begin to accompany organisations on their journey towards “grounded semantics”, we look forward to sharing more details of the learnings and emerging methodologies on the DT Hub. We hope this window into our current thinking, which is by no means definitive, has given you a good sense of where the positive disruption will come from. We are happy to see our claims challenged, so please do share your thoughts and ask questions.
    Read more...
    Creating Digital Twins can be like sailing in uncharted waters, so how do you handle it when unforeseen challenges rock the boat? Can you even predict what kinds of things will disrupt your journey? We’ve noticed in various conversations on the DT Hub that no matter what sort of Digital Twin you’re trying to set up or why, there is an incredibly wide range of potential disruptions. From technical to cultural, from resources to supply chains, almost every avenue is susceptible to producing a challenge somewhere. Many examples that we’ve already seen have only become apparent once the people developing Digital Twins are up against them in real time, so that’s why the DT Hub has launched this new activity, Defining Our Digital Twin Challenges!
     
    We would like to know about the challenges you’ve encountered on your DT journey in order to make the overall roadmap easier to follow. 
     
    The information you provide will help us to define our common challenges so we can start to solve them together. This series of thematic workshops, run by the DT Hub, will progress the conversation around the Digital Twin Journey and surface some of the challenges that organisations are still facing while embarking on it. Each Challenge will culminate in an Activity, where we will present the specific challenge areas you have brought to us to a select group in order to gather constructive feedback. The outcome of these workshops will be to share insights from inside and outside the community for the benefit of the community as a whole.
     
    You can use the activity Bring out your Digital Twin Challenges to explore your challenges with others, and our crowd facilitator, Joao, will be interacting with you to make sure you get the best experience possible. Joao is a former market researcher and court interpreter, and has been a brilliant member of our team for years as a 100%Open Associate. We look forward to your invaluable contributions, and in turn to the further development of the DT journey.

     

    Read more...
    Since its creation in 2018, the National Digital Twin programme (NDTp) has had three objectives: 
    Enable a National Digital Twin – an ecosystem of connected digital twins to foster better outcomes from our built environment 
    Deliver an Information Management Framework – ensure secure resilient data sharing and effective information management 
    Align a Digital Framework Task Group to provide coordination and alignment among key players. 
    In 2021, with the Digital Framework Task Group of senior leaders from industry, academia and government overseeing progress, the programme is at a point where key projects are being realised and support for its work is gathering momentum. Here is a summary of the latest developments. 
    The Digital Twin Hub community is now in excess of 2,000 members and its remit to create technical foundations and to provide a co-ordinated community in which to share expertise and knowhow on digital twins is being met with enthusiasm and support from a diverse range of participants across the UK and beyond.  
    This year is proving pivotal in terms of active engagement with our members to better understand their digital maturity and needs, especially through surveys, community activities and international summits. In parallel, key documents and resources are being published, including the Digital Twin Toolkit and an upcoming Collaborative Workshop to help companies make their business cases, and the Digital Twin Standards Roadmap, a culmination of work by the British Standards Institution (BSI), which enables a framework for information management and sets out our programme for the next few years. 
    Key to these activities is the willingness of members from both academic and industrial fields to share their own knowledge and experiences. The DT Hub is launching a new series titled Digital Twin Journeys to focus on academic research and lessons learned from digital twin projects focused on construction: buildings, infrastructure and industrial, and satellite applications. In parallel, we will engage with industry to run a consultation on our Flex 260 Standards as well as a second Smart Infrastructure Index (SII) Survey which tracks, in the first instance, digital and organisational maturity levels of asset owner and operator members.  
    At the end of August, we also announced the launch of three thematic workshops to address Digital Twin Roadblocks by progressing the conversation and surfacing the challenges faced by organisations while embarking on their digital twin journeys. The aim is for members to discuss experiences and to elicit the main challenges and blockers encountered in their programmes to date. These monthly workshops will commence at the end of September 2021.  
    Our work on the Information Management Framework, to allow the smooth adoption of digital twin technologies, has also gathered pace with the introduction of a methodology to divide the information management space into manageable segments. The 7 circles approach provides the building blocks for informed decision making and will deliver better information management and information sharing on a national scale. 
    The NDTp’s CReDo project will be running a webinar on 2 November 2021 to coincide with COP26 to give insight into our plans to develop a digital twin across water, energy and telecoms to improve resilience across the infrastructure system. CReDo – Climate Resilience Demonstrator – is applying an Information Management Framework approach to share data across water, energy and telecoms service providers, combined with hydrology and climate data from the Met Office, to help plan for and adapt to the cascading effects of increased flooding due to climate change. Registration for the webinar will be opening soon. 

    Read more...
    Matthew West, Technical Lead, National Digital Twin Programme, introduces a video on the 7 circles of Information Management and Process Model Information Requirements.
    Join Matthew and Al Cook, a member of the technical team of the NDTp and an expert in data integration activities and information security, as they take you through key elements of the Information Management Framework and detail a new approach to effective information management. 
    A video is available to view below, with a live Q&A session from 10:00 to 10:30 on Thursday 15 July 2021.
    Access to quality and well-managed information in organisations is key to support decision making and optimise outcomes at all levels. Decisions based on poor quality information, or no information at all, can significantly increase the risk of mistakes or even disasters.
    Systematically implementing information management ensures the ability to deliver the right information to the right decision-makers, at the right time. It is a critical success factor for the National Digital Twin (NDT), an ecosystem of connected Digital Twins where high-quality data is shared securely, on a massive scale to improve decision making across the UK.
    The “7 circles of Information Management”: developing the Information Management Framework
    The Information Management Framework (IMF), a collection of open, technical and non-technical standards, guidance and common resources, is intended to enable better information management and information sharing at a national scale and provide the building blocks to be (re)used by those wishing to be part of the NDT.

     
    The scope of the IMF is broad and the “7 circles diagram” that I introduce in the video below is a pragmatic way to divide the Information Management space into areas of concern that can be addressed separately as well as supporting each other. It is intended to help identify areas and associated NDTp deliverables that are of particular relevance to you.
    The technical aspects of the IMF may come to mind first. On top of “information transport” mechanisms, together with the authorisation and security protocols that ensure information can be accessed seamlessly, the NDT needs a language, an interlingua, so that data can be shared consistently and used to support decisions without requiring any further “data wrangling”. To develop this common language (the NDT’s ontology) the team is pursuing a principled approach, deeply grounded in mathematics and science, to ensure that it is as extensible and all-encompassing as possible. This is what the deepest circles of the 7 circles diagram are about.
    There is, however, much more to the Information Management Framework than the purely technical aspects, and as part of the highest circles of the 7 circles diagram, we are developing guidance on how to systematically improve information management so that producing data that meets the quality standards required to be part of the NDT becomes part of “business as usual”.
     
    A first step towards better information management: defining your information requirements
    This means that while the NDT’s ontology is being developed, steps can be taken to work towards better information management. Organisations need to reach a point of recognition that there is a need to address data quality in a way that enables improved decisions within their own business and with those they have data-based relationships with. And defining Information Requirements (the second circle in the stack) is a key starting point.
    Process Model based Information Requirements
    Too often, information requirements are incomplete or even absent in organisations, with the implication that if requirements are not identified and agreed, there is no reason to expect they will be met. As part of the second circle of the “7 circles diagram”, the team has released a paper outlining the proposed approach to developing information requirements, based on the analysis of process models. This is a novel approach, ensuring the systematic identification of the information needed (no more, no less) to support decisions and identifying where it is captured.
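    As a loose illustration of the idea (not the approach set out in the paper itself), one can imagine walking a process model in which each decision declares the information items it needs, and collecting those items together with where each one is captured. All names below are invented.

```python
# Hypothetical sketch of process-model-based information requirements:
# each activity names its decisions, each decision names the information
# items it needs and where they are expected to be captured. Walking the
# model yields the required information -- no more, no less.

PROCESS_MODEL = [
    {"activity": "Plan bridge maintenance",
     "decisions": [
         {"name": "Schedule inspection?",
          "needs": [("last_inspection_date", "asset register"),
                    ("current_load_restriction", "route database")]},
     ]},
    {"activity": "Approve track possession",
     "decisions": [
         {"name": "Grant possession window?",
          "needs": [("planned_timetable", "timetabling system"),
                    ("current_load_restriction", "route database")]},
     ]},
]

def information_requirements(process_model):
    """Collect the distinct information items the process needs, and where
    each one is captured."""
    requirements = {}
    for activity in process_model:
        for decision in activity["decisions"]:
            for item, source in decision["needs"]:
                requirements.setdefault(item, set()).add(source)
    return requirements

print(information_requirements(PROCESS_MODEL))
```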
    I encourage you to watch Al Cook’s presentation in the second part of the video to find out more about this approach.
    The team and I hope to share more detailed guidance on information management in the near future, helping you to assess your organisation’s current information management maturity, prioritise areas for obvious improvements in decision-making and start addressing them, so that mistakes can be avoided and better outcomes achieved.
    And as we continue to further develop the Information Management Framework, we look forward to accompanying you through the discovery of other circles among the 7 circles of Information Management.
    This video contains an introduction to the 7 circles of Information Management presented by Matthew West, followed by a presentation by Al Cook on a suggested approach to defining information requirements. Al and Matthew look forward to answering your questions and talking about next steps in a live Q&A session on the DT Hub on 15 July from 10:00 to 10:30.
     
     
    Read more...
    The vision of a National Digital Twin as an ecosystem of connected digital twins enabling better social and economic outcomes across the built environment continues to gain wide support. But to make it a reality, we need people with the right skills to put it into play.  
    “Collaborate on the rules and compete on the game” is a phrase we use to describe how we want connected digital twins to evolve. The sporting analogy carries over well into skills. We want the best teams to deliver on the National Digital Twin, not just a team of strikers or goalkeepers but diverse teams with a range of skillsets and capabilities. Diversity has to be at the heart of a skills strategy ensuring that the future workforce is more effective. 
    The skills & competency framework sets out the skills that are needed to manage information and work with data in the future. These aren’t just the hardcore technical skills such as data modelling and analytics, described in the framework as digital skills, but also business skills like transformational leadership, which recognises the benefits of getting information management right. 
    The capability enhancement programme sets out pathways for individuals and organisations to get the right skills in place depending upon aspirations both at the personal level and the organisational level. Have a go at the self-assessment questionnaire to assess what training might be helpful to you and take a look at the training register to find a suitable course. 
    The National Digital Twin is a long-term journey, and there is time to get the right skills in place to reach our destination. 
    Read more...
    The DT toolkit is a simple guide to the things you need to think about on your digital twin journey. It arose from a question we’ve often heard from the DT Hub community: how do I make the business case for a digital twin? In response, through the Gemini programme of the National Digital Twin programme, we’ve been able to bring together people who have been through, or are going through, the digital twin journey from different perspectives – consulting, technology development, legal and academic – and who are willing to share their learning and expertise with the DT Hub community.
    The team first met back in September 2020 to discuss how we might put together a toolkit for making the business case. We discussed how it would need to focus on purpose, relevant examples, a roadmap and a usable business case template. We debated the use cases for articulating digital twin purpose and took this to a DT Hub workshop to garner community input to what is now a use case framework which is starting to resonate in meetings and presentations. We presented and discussed case studies of digital twins that have been developed or are being developed which can be found on the DT Hub. We spent a long time talking through the steps organisations need to go through to implement a digital twin and as a result produced the roadmap which you can find in the report. We talked about digital twin sophistication levels. And members of the team worked together to really think through what a business case template might look like and what you would need to put together to get sign off for your digital twin. This template is now freely available as a resource for you to download and use.
    This DT toolkit report is a true collaboration of diverse minds working in the field of digital twins who are open to challenge and debate. The result is a toolkit that you can use to set you and your team on your digital twin journey. As with all journeys, the toolkit is now at its first pit-stop and the toolkit team are going to use it with their clients and provide feedback on how to improve and fine tune it. We invite you to do the same, read the toolkit report, try it out and tell us what you think.  
    We are very grateful for the passion and dedication that the Toolkit Team have shown towards putting the toolkit together. Working with limited resources, we have been reliant on our volunteers’ goodwill and conviction that the work of the National Digital Twin programme is something they want to be involved in and contribute to. Drawing from across disciplines and different organisations, we’ve been really boosted by the support we’ve received from a team of people going through the digital twin journey and enthusiastic about sharing their experience and ideas with the wider community.
     If you would like to share your learning and experience with the community and take part in the next iteration of the Toolkit, reach out to us. We can all work together to make this a valuable community resource. 
     
    “The toolkit captures know-how and insights from people with experience of developing and using digital twins.  Steps are given that provide the reader with valuable guidance for justifying, building and exploiting twins, increasing value and reducing the risk of change.” Dr Peter van Manen, Frazer-Nash Consultancy
    “Working collaboratively with people from a variety of industries and experiences, has been not only invaluable to the construct of the Toolkit but also, fun, inspiring and wholesome to participate in.” Peter Curtis, Sopra Steria
    “Working on the DT toolkit has been an excellent way to socialise my thoughts and the Catapult’s work on Digital Twins, while increasing my understanding of DTs, through discussions with the other team members.” Ron Oren – Connected Places Catapult
    Read more...