Articles & Publications

Publications

CReDo report 3: Assessing Asset Failure

Climate change is increasing the frequency with which UK infrastructure is threatened by extreme weather events. To explore the potential impact of future climate conditions, the CReDo project is developing a digital twin of key infrastructure networks. This digital twin can be used to help make decisions that better protect the networks in advance of extreme weather events, and ultimately to help inform a real-time response to such events. The novel feature of this tool is that it will provide the collaborating asset owners, and also crisis management teams, not only with assessments of how a weather-induced flooding incident in a future climate would affect the infrastructure and networks monitored by the individual asset owners, but also with the operability of assets owned by other companies, where the failure of those assets impinges on the functionality of their own. The highly interdependent nature of these infrastructure networks, such as telephone lines relying on power supplies being operational, means that reliably modelling the impact of an extreme weather event requires accounting for such connections. It is hoped that a shared appreciation, across the different actors, of the mutual threats described by the digital twin will encourage further coordination between the companies in their strategic plans to mitigate these increasing threats.
This report outlines just one component of this development. We demonstrate how it is possible to elicit from asset owners the probabilities that each of their assets might fail in a particular future flood scenario, one that takes into consideration the impact of climate change on extreme weather patterns. Taking these unfolding events, and working with teams of domain experts drawn from asset owners associated with the local power, water and telecommunication companies, our team demonstrates how to elicit probability distributions for the failure of each asset and its connections within the network. This information would then be fed to operational researchers, who can calculate the knock-on effect of each simulated future incident on the whole network. From a decision-analytic perspective, the digital twin would thus consist of connected digital twins representing the hydrology, the failure modes of assets, and the system in which the assets sit, with a decision support layer above this.
Visit the technical report page
 

Publications

CReDo report 4: Modelling System Impact

This paper describes the work done on the understanding of infrastructure interdependencies and impact on the overall system. The work on the model described in this report started in September 2021. Access to the data was given at the end of October 2021 and the technical work ran until mid-January 2022.
The work was led by Lars Schewe and primarily carried out by Mariel Reyes Salazar. The integration of the multiple different networks was carried out by Maksims Abalenkovs. We demonstrated that we can integrate data from a digital twin into component network models and connect these with an overarching coordinating algorithm. This allows us to propagate failures through the networks and then analyse the impacts on each of them. The observed runtimes for the test networks indicate that the implemented methods will work on realistic networks and that implementing more complex models is feasible in follow-up projects.
The technical work planned in the work package was to model each of the component networks, build models that allow failures to be propagated through each of them, and propose methods to propagate failures between them.
To structure the work, the team proposed three levels of detail for the network models and two levels for the integration. In addition, the objective functions for the underlying optimization problems were to be developed. Due to the unavailability of data and the short timescale, it was decided to focus on the first levels for all networks and for the integration. As no data was available to guide the definition of an objective function, this work was not undertaken.
The basic models were implemented in Python and tested on a small-scale model of part of a UK town. This demonstrated that the overall methodology is sound, that data from a digital twin can be transferred to more detailed network models, and that the results can be played back to the digital twin.
Visit the technical report page
 
Described in the Pathway to the Information Management Framework, the Integration Architecture is one of the three key technical components of the Information Management Framework (IMF), along with the Reference Data Library and the Foundation Data Model. It consists of the technology and protocols that will enable the managed sharing of data across the National Digital Twin (NDT).
The IMF Integration Architecture (IA) team began designing and building the IA in April 2021. This blog gives an insight into its progress to date.

 
 

Principles
First, it is worth covering some of the key principles being used by the team to guide the design and build of the IA:
Open Source: It is vital that the software and technology that drives the IA are not held in proprietary systems that raise barriers to entry and prevent community engagement and growth. The IA will be open source, allowing everyone to utilise the capability and drive it forward.
  Federated: The IA does not create a single monolithic twin. When Data Owners establish their NDT Node, the IA will allow them to publish details of the data they want to share to an NDT data catalogue; other users can then browse, select and subscribe to the data they need to build a twin that is relevant to their needs. This subscription is on a node-to-node basis, not via a central twin or data hub, and Owners can specify any access, use or time constraints they wish to apply to a subscriber. Once subscribed, the IA takes care of authenticating users and of updating and synchronising data between nodes.
  Data-driven access control: To build trust in the IA, Data Owners must be completely comfortable that they retain full control over who can access the data they share to the NDT. The IA will use an attribute-based access control (ABAC) security model, allowing owners to specify in fine-grained detail who can access their data; permissions can be added or revoked simply and transparently. This is implemented as data labels that accompany the data, providing instructions to receiving systems on how to protect it.
  IMF Ontology Driven: NDT information needs to be accessed seamlessly. The NDT needs a common language so that data can be shared consistently; this language is being described in the IMF Ontology and Foundation Data Model being developed by another element of the IMF team. The IA team is working closely with them to create capabilities that will automate conversion of incoming data to the ontology and transact it across the architecture without requiring further “data wrangling” by users.
  Simple Integration: To minimise the risk of implementation failure, or of poor engagement due to architectural incompatibility or a high cost of implementation, the IA needs to be simple to integrate into client environments. The IA will use well-understood architectural patterns and technologies (for example REST and GraphQL) to minimise local disruption when data owners create an NDT node, and to ensure that, once implemented, the ongoing focus of owner activity is on where the value is – the data – rather than on maintenance of the systems that support it.
  Cloud and On-Prem: An increasing number of organisations are moving operations to the cloud, but the IA team recognises that this may not be an option for everyone. Even when cloud strategies are adopted, the journey can be long and difficult, with hybrid options potentially in use over the medium to long term. The IA will support all these operating modes, ensuring that membership of the NDT does not negatively impact existing or emerging environment strategies.
  Open Standards: For the same reasons that the IA is being made open source, the IA team is committed to ensuring that data in the NDT IA are never locked in or held in inaccessible proprietary formats.
What has the IA team been up to this year?
The IMF chose to adopt the existing open-source Telicent CORE platform to handle the ingest, transformation and publishing of data to the IMF ontology within NDT nodes, and the focus has been on beginning to build and prove some of the additional technical elements required to make the cross-node transactional and security elements of the IA function. Key focus areas were:
Creation of a federation capability to allow Asset Owners to publish, share and consume data across nodes
  Adding ABAC security to allow Asset Owners to specify fine-grain access to data
  Building a ‘Model Railway’ to create an end-to-end test bed for the NDT Integration Architecture, and prove-out deployment in containers
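To make the data-label idea above concrete, here is a minimal, hypothetical sketch of attribute-based access control driven by labels that travel with the data. The field names (`allowed_orgs`, `allowed_roles`, `expires`) and the check logic are illustrative assumptions, not the actual NDT or Telicent CORE label schema.

```python
# Hypothetical sketch of data-label ABAC; field names are assumptions,
# not the real NDT label schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DataLabel:
    """Travels with the data; tells receiving systems how to protect it."""
    allowed_orgs: set
    allowed_roles: set
    expires: Optional[str] = None  # ISO-8601 date, or None for no expiry

def can_access(label: DataLabel, subject: dict, today: str = "2022-02-01") -> bool:
    """Attribute-based check: grant access only if the subject's attributes
    satisfy every constraint carried in the label."""
    if label.expires is not None and today >= label.expires:
        return False  # label has expired; receiving system must deny access
    return (subject.get("org") in label.allowed_orgs
            and subject.get("role") in label.allowed_roles)

label = DataLabel(allowed_orgs={"water-co"}, allowed_roles={"planner"},
                  expires="2023-01-01")
print(can_access(label, {"org": "water-co", "role": "planner"}))    # True
print(can_access(label, {"org": "telecom-co", "role": "planner"}))  # False
```

Because the label accompanies the data rather than living in a central policy store, a receiving node can enforce the owner's constraints locally, and revocation is just a change to the label.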
We may be drawing to the close of the current iteration of the NDTp, but there is still plenty of exciting and important work happening. It is encouraging to see just how hard everyone is working to ‘finish well’ and to ensure that all the learnings and progress we have made are packaged into a useful blueprint for others to use.
Some highlights to share: 
The DT Hub Community Council 
One of the goals we set for the DT Hub was for the community to lead the development and strategy going forward.  An important milestone on that journey has been the start of a Community Council, supported by a network of Community Champions. What struck me about the process of establishing the Council was the amount of feedback and applications we received from interested Hub members. It showed a level of enthusiasm and commitment that is very encouraging for the next stage of the DT Hub. 
We now have 12 motivated community representatives across different types of organisations, sizes and locations. Our first Council meeting was held at the end of January, joined by members from Australia and Sweden, as well as the UK. It was a great first meeting, and it was clear that there is a real desire to keep the momentum of the DT Hub going and to continue the ethos of sharing and collaboration. And I’m delighted to say that we will have Melissa Zanocco and Ali Nicholl as Co-chairs.
Progress on CReDo  
When CReDo was launched, alongside the CReDo film and demonstrator app, a wide range of publications covered the story. What has been interesting to me is that, long after the event, the project is still very much being referenced by journalists and key organisations. People are continuing to follow the progress and keep abreast of all the latest insights from the project.
It has moved the conversation forward on collaboration through connected digital twins, by delivering a tangible example that demonstrates the benefit to our everyday lives. It is being discussed and recognised as something important for critical infrastructure and also for governments. One of the key partners on the project, BT, highlighted the benefits of this work and digitalisation for BT.  
Over the coming weeks the team are putting together both technical and non-technical reports on CReDo to capture the lessons learned, what we could have done better, what we will do better going forward and recommendations for others.  
The team is planning a webinar on 2nd March 2022 to show how the climate resilience model has been realised using synthetic data sets. The event will be a talk-through of project methodologies and findings, insights and next steps. We’ve already had over 500 sign-ups and it’s great to see so much interest. Please sign up to take part.  
Smart Infrastructure Index results 
Developed specifically for the built environment and infrastructure industry, the Index provides a holistic view of digital maturity: from customer insights to digital twins; modern methods of construction to whole-life asset management. 
There were 57 responses to the 2021 Index, up from 21 in 2020. Whereas in 2020 these responses came exclusively from asset owners / operators, in 2021 the survey was sent to the wider DT Hub community. While this increased the reach of the survey, it also influenced the scoring.  
The overall digital maturity score for the DT Hub community was 37.3 in 2020 and decreased to 33.6 in 2021. When looking at scores for asset owners / operators only, the decrease in digital maturity score was still evident; however, it was far less significant, with an average score of 37.1 in 2021. For a further breakdown of the results, please see the report: results of the 2021 Index.
Article written by: Ilnaz Ashayeri - University of Wolverhampton | Jack Goulding - University of Wolverhampton
STELLAR provides new tools and business models to deliver affordable homes across the UK at the point of housing need. This concept centralises and optimises complex design, frame manufacturing and certification within a central 'hub', with 'spoke' factories engaging their expertise through the SME-driven supply chain. The approach originated in the airline industry in the 1950s, where it was used to optimise processes and logistics distribution. STELLAR takes this one step further by creating a bespoke offsite ‘hub and spoke’ model purposefully designed to deliver maximum efficiency savings and end-product value. This arrangement is demonstrated through three central tenets: 1) a 3D 'digital twin' factory planning tool; 2) a parametric modelling tool (to optimise house design); and 3) an OSM Should-Cost model (designed specifically for the offsite market).
 
STELLAR Digital Twin hub article.pdf

Shared by the Community

Introducing the Virtual Energy System

The energy industry has made impressive strides along the path to net-zero, while undergoing the transition to digitisation. Our next, shared step can be to capitalise on the potential of a more dynamic, joined-up and intelligent view of our entire energy system.
Great Britain’s energy system is experiencing two fundamental transitions in parallel.
First, the shift to net zero – something we’ve already made significant strides in. The decarbonisation of our sector is well underway, as is the planning for the changing demands on the sector as other industries also undergo this change in their own efforts to reach net zero. 
And second, digitisation. New technology and the prevalence of real-time data have already transformed many aspects of the energy industry, and there are a multitude of commercial projects that bring to life the concept of digital twins of specific IT systems.
An opportunity to come together
We now have an opportunity ahead of us; to bring these parallel transitions together to create something incalculably more powerful, that has the potential to help us take even greater strides towards net zero.
This is why we’re launching an industry-wide programme to develop the Virtual Energy System – a digital twin of Great Britain’s entire energy system.
We recognise it’s an ambitious goal.
But we also recognise that it could be a central tool, bringing together every individual element of our system to create a collective view which will give us more dynamic intelligence around all aspects of the energy industry.
The Virtual Energy System will also provide us all with a virtual environment to test, model and make more accurate forecasts – supporting commercial decision-making, while enabling us to understand the consumer impact of changes before we make them.
This ambition is not out of reach - many elements of the energy industry are already using individual digital twins. The next step on this journey is to work together to find a way to take these digital twins forward, in unison. A way in which we can connect these assets and encourage future development across the entire energy spectrum.
A tool created by our industry, for everyone
The key to the Virtual Energy System will be collaboration - this won’t be the ESO’s tool, but a tool available to our entire industry - a tool that we will all be able to tap into and derive learnings from, that will support future innovation and problem solving.
But we need to start somewhere. We are sharing the concept and throwing down the gauntlet. It will only become a reality if it is collaboratively designed and developed by the whole energy industry.
The ESO has set out its initial thinking on what a roadmap could look like, but we need our best and brightest minds to feed into this to shape its future. We know we won’t always reach a perfect consensus every time, but only through engagement and open collaboration will the full benefits be unlocked.
What’s next?
In December we brought the energy industry together with Ofgem and BEIS for a one-day conference. It was an opportunity to explore the proposed programme, and kickstart our feedback and engagement period. From this, we plan to form working groups to begin a deeper dive into the key areas of development that will underpin the entire development journey. To watch back the conference, contribute to our initial stakeholder feedback and view a brief outline on the suggested structure visit our website.
Get Involved and Hear More
Join us on Thursday 10th February, 1pm-2pm, for a brief introduction to our Common Framework Benchmarking Report ahead of its public release, followed by a workshop on the key socio-technical factors that could make up the common framework of the Virtual Energy System. There will be plenty of opportunity to discuss and ask questions: it will be an informal session where we can collaborate around the latest ideas.
Register to attend
 
You can also join us on the Gemini Call on 8th February for a short introduction before the full session. 
“The point of digital twins is to enable us to make better decisions that lead to better outcomes for people and nature. It is therefore important that the Digital Twin Hub is community driven, with input from people who will be using and benefitting from them. The setting up of the Digital Twin Hub Community Council is an important step for capturing that voice. I am looking forward to working with my Co-Chair, the Council and the Community to ensure that the Hub continues to support our needs as we work together towards a National Digital Twin.”  
Melissa Zanocco, Infrastructure Client Group  
 
Our primary aim at the DT (Digital Twin) Hub is to ensure the community is a vibrant, member-owned, resource-rich idea space, and in 2021 we took a number of steps to boost community engagement and conversation both inside and outside the web platform. One of these steps was to start a community led council. 
In November, we issued a call for members to form the new DT Hub Community Council. The call attracted a fantastic 70 responses, and we were truly encouraged by the application statements and words of support we received. Such was the enthusiasm that by December we had finalised our first 12 Community Council members and formed a large Community Champions network. 
The council will take an important advisory role in the future direction of the DT Hub. With their diverse skillsets and knowledge across industries and nations, both the council member and champions’ groups will be the eyes and ears of the community – giving it a strong voice as we develop our shared vision of an ecosystem of connected digital twins.  
The first Community Council meeting took place on 25 January 2022, sparking many discussions on ways to come together to further engagement opportunities for the wider membership.  
We are also delighted to announce that the council has appointed its co-chairs: Ali Nicholl from Iotics and Melissa Zanocco from the Infrastructure Client Group. 
Ali Nicholl said:
“I am excited to be part of the DT Hub’s Community Council and look forward to working with the council and the diverse community it represents. The DT Hub isn’t in the business of seeking consensus on a single point of view, a single application or a single standard. Instead, we see a community developing approaches where cooperation between twins, individuals, organisations and sectors can deliver the platforms for human flourishing that societies globally so desperately need. It’s in that spirit of cooperation that we have co-chairs, and I can’t wait to work together to support the community in achieving our shared vision.”
Ali and Melissa’s fellow council members, whose digital twin and digital transformation experience spans strategy, systems and standards through to people development, are Peter Burnett, Network Rail; John Erkoyuncu, Cranfield University; Polly Hudson, UCL/Alan Turing Institute; Paul May, John Lewis Partnership; Laura Mills, KPMG; Dan Rossiter, BSI (British Standards Institution) Group; Timothy Ståhle, Akademiska Hus (Sweden); Glenn Worrall, Bentley Systems UK; Amanda Wyzenbeek, Mott MacDonald (Australia); and Jamie Young, Wates.
We welcome the new DT Hub Community Council and our Community Champions and look forward to updating members on their progress in the coming months. 
You can find out more about the Council Members and contact the team via the Community Council page.
Why not join the Community Champions network to share your ideas for the DT Hub community?
We are facing a growing challenge in the UK in managing the assets we construct. New structures will need future maintenance and much of our existing infrastructure is ageing and performing beyond its design life and intended capacity. In order to get more out of our existing assets with minimum use of limited resource we need to better understand how they are performing. Climate crisis and extreme weather events bring additional strain to the condition and structural health of assets making assessing their condition increasingly important. There are logistical challenges too – visually inspecting remote and hard to access assets can be expensive and hazardous.
Many people don’t realise that the Earth’s surface is being continuously scanned by different satellite sensors, in different directions, day and night. While the proliferation of sensors and satellite technology has fuelled a revolution in the way we can monitor assets, the ideal is to use the different tools in the engineer’s toolbelt to find the right solution for each case.
We’re used to the idea of Google Earth, and many people in our sector are learning about the usefulness of maps and geographical information systems (GIS), with many open datasets provided by organisations like Ordnance Survey in the UK. What you see as satellite images on Google Earth is optical data: like taking pictures of the Earth’s surface over time and using our eyes to see the changes (or perhaps automating change detection through machine learning – but that’s another point). What many people working in the built environment do not realise is that there is a whole spectrum of other sensors that can show us beyond what our eyes can see.
Did you know that radar satellites continuously scan the earth, emitting and receiving back radar waves? These satellites do not rely on daylight to image and so we can collect radar measurements day and night, and even through clouds. Using different processing techniques, this data can be used to create 3D digital elevation models, map floods and measure millimetres of movement at the Earth’s surface – all from hundreds of kilometres up in space. And did you know there is free data available to track pollutants, monitor ground changes and track vegetation?
There is. In huge volumes. Petabytes of data are held in archives which allow us to look backwards in time as well as forwards. With all this opportunity, it can seem a bit daunting to know where to get started.
I have worked in the design, construction and maintenance sectors for over a decade, and I came back to academia to learn about the opportunities of satellite data from the German Aerospace Center and the Satellite Applications Catapult. I spent a PhD’s worth of time retraining in data analysis so that I could combine the latest analytical techniques with a civil engineer’s lens to better understand how we can unlock value from this data. I’ll save you the time and give a quick overview of what we can do in industry now, and share some learnings from talented researchers working on a Centre for Digital Built Britain (CDBB) project on satellite monitoring to support digital twin journeys.
Hope to see you next Tuesday 1st February at the Gemini Call for an introduction to the topic and some signposting on where you can go to find out more and make the most of such data for your own assets.
Sensor technology has come a long way over the last 30 years, from the world’s first, bulky webcam at the University of Cambridge Computer Science Department to near ubiquitous networks of sleek sensors that can provide data at an unprecedented volume, velocity and quality. Today, sensors can even talk to each other to combine single points of data into useful insights about complex events. The new webcomic ‘Coffee Time’ by Dave Sheppard, part of the Digital Twin Journeys series, tells the story of this evolution and what it means for what we can learn about our built environment through smart sensors.  
Starting with a simple problem – is there coffee in the lab’s kitchen? – researchers in the early 1990s set up the world’s first webcam to get the information they wanted. Today, people in the Computer Lab still want to know when the coffee is ready, but there are more ways to solve the problem, and new problems that can be solved, using smart sensors. Smart sensors don’t just send information from point A to point B, providing one type of data about one factor. That data needed to be collated and analysed to get insights. Now sensors can share data with each other and generate insights more instantaneously. 
The West Cambridge Digital Twin team at the computer lab have looked at how specific sequences of sensor events can be combined into an insight that translates actions in the physical world into carefully defined digital events. When someone makes coffee, for example, they might turn on a machine to grind the coffee beans, triggering a smart sensor in the grinder. Then they’d lift the pot to fill it with water, triggering a weight sensor pad beneath to record a change in weight. Then they would switch the coffee machine on, triggering a sensor between the plug and the outlet that senses that the machine is drawing power. Those events in close succession, in that order, would tell the smart sensor network when the coffee is ready. 
These sequences of sensor triggers are known as complex events. Using this technique, smart sensors in the built environment can detect and react to events like changes in building occupancy, fires and security threats. One advantage of this approach is that expensive, specialist sensors may not be needed to detect rarer occurrences if existing sensors can be programmed to detect them. Another is that simple, off-the-shelf sensors can detect events they were never designed to. As the comic points out, however, it is important to programme the correct sequence, timing and location of sensor triggers, or you may draw the wrong conclusion from the data that’s available. 
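The sequence-matching idea behind complex events can be sketched in a few lines. This is an illustrative assumption of how such a detector might work, using the comic's coffee example; it is not the West Cambridge team's actual implementation, and the sensor names and time window are invented for the sketch.

```python
# Illustrative complex-event detector: matches an ordered pattern of sensor
# triggers within a time window. Names and window are hypothetical.

COFFEE_READY = ["grinder_on", "pot_lifted", "machine_drawing_power"]
MAX_GAP = 120  # max seconds allowed between consecutive steps of the pattern

def detect(pattern, events, max_gap=MAX_GAP):
    """Return True if `events` [(timestamp_s, sensor_name), ...] contains the
    pattern in order, each step within max_gap seconds of the previous one."""
    idx, last_t = 0, None
    for t, name in sorted(events):
        if last_t is not None and t - last_t > max_gap:
            idx, last_t = 0, None  # too long since the last step: restart
        if name == pattern[idx]:
            idx, last_t = idx + 1, t
            if idx == len(pattern):
                return True  # whole sequence seen in order: coffee is ready
    return False

events = [(0, "grinder_on"), (30, "pot_lifted"), (55, "machine_drawing_power")]
print(detect(COFFEE_READY, events))  # True
print(detect(COFFEE_READY, [(0, "pot_lifted"), (10, "grinder_on")]))  # False
```

Note how the timing constraint matters: the same three triggers spread over an hour, or in the wrong order, would not count, which is exactly the pitfall the comic warns about.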
Something as simple as wanting to know if the coffee is ready led to the first implementation of the webcam. Digital twin journeys can have simple beginnings, with solving a simple problem with a solution that’s accessible to you, sparking off an evolution that can scale up to solve a wide range of problems in the future. 
You can read and download the full webcomic here.
You can read more from the West Cambridge Digital Twin project by visiting their research profile. 
This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF). 
AEC Information Containers based on ISO 21597, the so-called ICDD, are a great way to store data and relations. Widely discussed as a possible structure for any CDE (Common Data Environment), this standard was made for handing over project data or exchanging files of a heterogeneous nature in an open and stable container format, and it will therefore become the starting point of many digital twins.

The standard says: A container is a file that shall have an extension ".icdd" and shall comply with ISO/IEC 21320–1, also known as ZIP64.
Information deliveries are often a combination of drawings, information models, text documents, spreadsheets, photos, videos, audio files, etc. In this case, many scans and point clouds came on top. And while we have all the metadata datasets in our system, it is quite hard to hand this over to a client who might have another way of handling it. So we have now expressed all the specific relationships between information elements across those separate documents using links, because we believe this contributes significantly to the value of the information delivery.
We successfully handed over a retroBIM project from a nuclear facility in Germany. It was 661469018 KB as a ZIP; before zipping, it was around 8 TB! It has a whole archive going back to the 1960s: all models, all the point clouds the model was made from, and all documents produced from the models too. So, all in all, we have 2,338 documents.
We created an ICDD container that, when represented as an RDF graph (index & links), is composed of 29,762 unique entities, 37,897 literal values and 147,795 triples.
All this information is now transferred independently of any software application and is a great starting point for a digital twin. All sensor and live data can be added in the same way we connected documents with BIM elements. The only difference is that you do not store it in a ZIP file but rather run it in a graph database. This way you will have not only the most powerful and fastest twin, but also the most future-proof and extendable one you can possibly get.
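Since an ICDD container is just a ZIP64 archive with an RDF index, writing a minimal one is straightforward. The sketch below is a simplified assumption of the layout: the folder names ("Payload documents", "Payload triples") and the index structure follow ISO 21597-1 as we understand it, so check the standard before relying on them, and note the index content here is a bare placeholder rather than a conformant ContainerDescription.

```python
# Minimal sketch of an ICDD-style ".icdd" container (ISO 21597-1): a ZIP64
# archive holding an RDF index plus payload folders. Layout is an assumption.
import os
import tempfile
import zipfile

INDEX_RDF = """<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:ct="https://standards.iso.org/iso/21597/-1/ed-1/en/Container#">
  <ct:ContainerDescription rdf:about="">
    <ct:description>retroBIM handover example</ct:description>
  </ct:ContainerDescription>
</rdf:RDF>
"""

def write_icdd(path, documents):
    """documents: {archive_name: bytes} stored under 'Payload documents/'."""
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED, allowZip64=True) as z:
        z.writestr("index.rdf", INDEX_RDF)  # container index (RDF graph)
        z.writestr("Payload triples/links.rdf", INDEX_RDF)  # link files go here
        for name, data in documents.items():
            z.writestr(f"Payload documents/{name}", data)

path = os.path.join(tempfile.mkdtemp(), "handover.icdd")
write_icdd(path, {"drawing-001.pdf": b"%PDF-1.4 ..."})
with zipfile.ZipFile(path) as z:
    print(sorted(z.namelist()))
```

Loading `index.rdf` and the link files into a triple store, as described above, is then just a matter of parsing the RDF out of the archive instead of shipping the ZIP itself.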
 
BSI Flex 260 Built environment - Digital twins overview and general principles 
This work began with the Standards Roadmap developed by the British Standards Institution (BSI) to explore the existing standards landscape and define a route charting the subsequent standards opportunities. It will evolve with the development of standards within the BSI’s recommended framework for digital twins in the built environment.
We have chosen to test the BSI Flex approach to explore its applicability in the context of connected digital twins. It allows for iterative modification of the standard as common knowledge around digital twins develops, lessons are learned, and practical experience is gained across domains and geographies.
The consultation period for this Flex standard runs for six weeks until Monday 7 March 2022. Please see:
 
BSI Flex standard landing page
BSI Flex standard commenting page
Next week’s Gemini Call will include a presentation by Jack Ostrofsky, Head of Quality and Design at Southern Housing Group and Chair of BIM for Housing Associations.
BIM for Housing Associations (BIM4HAs) is a client-led and client-funded initiative set up in 2018 to accelerate the uptake of consistent, open-standards-based BIM processes across the Housing Association sector.
An urgent priority for this group is building and fire safety, particularly in the context of the development of a Golden Thread of Building Safety Information which is part of the Building Safety Bill which is expected to receive Royal Assent in 2022.
Understanding of BIM and Digital Twins in the residential housing sector is poor, yet as long-term owner-operators of built assets, housing associations are ideally placed to benefit from the efficiencies of BIM and Digital Twins.
In June 2021 BIM4HAs published a Toolkit of resources for housing associations aimed at assisting them in the process of adopting ‘Better Information Management’. The toolkit, which is free to use, translates the requirements of the National BIM Framework into accessible language and practical tools for housing associations.
Jack will describe an example of the challenge housing associations face in using structured data to manage their assets: the transfer of spatial information about buildings. Designers and contractors label buildings as ‘plots’, while development managers and asset managers in housing associations have their own naming conventions, which have evolved in a traditional and disjointed manner. As a result, the metadata links are severed at handover and a great deal of valuable, usable information is lost to the client.
Jack’s employer Southern Housing Group has developed a spatial hierarchy and property reference numbering system which was published in the BIM4HAs Toolkit in June. 
The spatial hierarchy and naming system links to commonly understood asset management language and informs Asset Information Requirements that housing associations can use to instruct development and refurbishment projects. This process enables contractors to provide usable metadata to housing associations and will form an essential part of the implementation of a Golden Thread of Building Safety Information. 
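The plot-to-property translation problem described above can be sketched as a small lookup against a spatial hierarchy. The structure and reference codes below are illustrative inventions, not the published SHG scheme:

```python
# Illustrative spatial hierarchy with a property reference numbering scheme.
# Codes and structure are hypothetical, not the actual SHG system.

HIERARCHY = {
    "SITE-001": {
        "BLOCK-A": ["PROP-0001", "PROP-0002"],
        "BLOCK-B": ["PROP-0003"],
    },
}

# Contractors label homes as 'plots' at handover; a maintained mapping
# table keeps the metadata link intact instead of severing it.
PLOT_TO_PROPERTY = {
    "Plot 1": "PROP-0001",
    "Plot 2": "PROP-0002",
    "Plot 3": "PROP-0003",
}

def resolve(plot_label):
    """Translate a contractor plot label into the asset-management
    reference and its position in the spatial hierarchy."""
    ref = PLOT_TO_PROPERTY[plot_label]
    for site, blocks in HIERARCHY.items():
        for block, props in blocks.items():
            if ref in props:
                return {"site": site, "block": block, "property": ref}
    raise KeyError(f"{ref} not placed in hierarchy")

print(resolve("Plot 3"))
```

Because the contractor's label and the association's reference are linked explicitly, metadata attached to either name remains reachable after handover.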
In a further development Southern Housing Group, working with members of the BIM4HAs community, have developed and are implementing an Asset Information Model based on the Gemini Principles and aligned with the other BIM4HAs work. This Model will be published for free, for anyone to use, by BIM4HAs as part of an update to the BIM4HAs Toolkit in February. 
Please join us on the Gemini Call on 25th January at 10.30 to hear about the spatial hierarchy work and put your questions to Jack.
Download the Spatial Hierarchy Document and ‘The Business Case for BIM’ Document from the links below. Both are part of the Toolkit.
The whole Toolkit can be downloaded for free from the National Housing Federation website here: housing.org.uk/BIM4HAs
 
BIM for Housing Associations Pt1 The Business Case for BIM.pdf
SHG Spatial Hierarchy UPRN Procedures.pdf
We are pleased to announce the publication of the Smart Infrastructure Index Digital Maturity Benchmarking report. 
Summary of responses 
This year, we received 57 responses from the DT Hub community as a whole, compared with 21 responses in 2020, all of which came from asset owners/operators. While this increased the reach of the survey, it also influenced the scoring.  
The overall digital maturity score for the DT Hub community was 37.3 in 2020, decreasing to 33.6 in 2021. Looking at scores for asset owners/operators only, a decrease was still evident but far less significant, with an average score of 37.1 in 2021.  
The overarching observation of this year’s Smart Infrastructure Index results is that, on average, the digital maturity score of the DT Hub community has decreased. However, the overall digital maturity of the DT Hub community’s member organisations has not necessarily dropped. Two key factors lead to this conclusion: first, the demographic of respondents has changed, with the survey being sent to vendors and academia as well as asset owners/operators; and second, the DT Hub community last year was much smaller than it is now, with far fewer organisations, which likely fell into the category of ‘early adopters’ of digital twins and of digital more generally.
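The demographic effect described above can be illustrated with a small weighted-average calculation. Only the totals come from the report (57 responses, an asset owner/operator average of 37.1, and an overall average of 33.6); the 30/27 split between the two respondent groups is an assumption chosen purely for illustration:

```python
def weighted_average(groups):
    """groups: list of (respondent_count, mean_score) pairs."""
    total = sum(n for n, _ in groups)
    return sum(n * s for n, s in groups) / total

# Hypothetical split: 30 asset owners/operators averaging 37.1 (as reported)
# and 27 newer respondents (vendors, academia) averaging roughly 29.7.
overall = weighted_average([(30, 37.1), (27, 29.7)])
print(round(overall, 1))  # -> 33.6
```

The incumbents' average barely moves, yet the blended score falls to the reported overall figure, which is the point the paragraph makes about the changed respondent mix.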
Analysis and recommendations to improve digital maturity
This report compares results from the 2021 Digital Twin question set with those from 2020, arranging observations and insights into subcategories, and then continues with an analysis of the core Smart Infrastructure Index questions. It concludes with specific recommendations to improve digital maturity scores across both these categories. 
About the Smart Infrastructure Index 
The Smart Infrastructure Index allows organisations to: 
Better understand their maturity in relation to both digital transformation and digital twins
Compare and contrast DT Hub members with broader Index metrics
Draw comparisons with the wider community
Understand progress in the last year
Identify future areas of focus.
The DT Hub version of the Smart Infrastructure Index includes core questions that assess digital maturity across the asset lifecycle and an extension focused on digital twins in the context of the National Digital Twin programme (NDTp).
Download the report
 
 
By 2050, an estimated 4.1 million people will be affected by sight loss in the UK, making up a portion of the 14.1 million disabled people in the UK. How might digital twins create opportunities for better accessibility and navigability of the built environment for blind and partially sighted people? A new infographic presents a conception of how this might work in the future.
In their work with the Moorfields Eye Hospital in London, the Smart Hospitals of the Future research team have explored how user-focused services based on connected digital twins might work. Starting from a user perspective, the team have investigated ways in which digital technology can support better services, and their ideas for a more accessible, seamless experience are captured in a new infographic. 
In the infographic, service user Suhani accesses assistive technology for blind people on her mobile phone to navigate her journey to an appointment at an eye hospital. On the way, she is aided by interoperable, live data from various digital twins that seamlessly respond to changing circumstances. The digital twins are undetectable to Suhani, but nevertheless they help her meet her goal of safely and comfortably getting to her appointment. They also help her doctors meet their goals of giving Suhani the best care possible. The doctors at the eye hospital are relying on a wider ecosystem of digital twins beyond their own building digital twin to make sure this happens, as Suhani’s successful journey to the hospital is vital to ensuring they can provide her with care. 
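In outline, an ecosystem like the one Suhani relies on would merge live updates from several independent twins into a single guidance stream. The sketch below is purely illustrative; the feed names and message shapes are invented:

```python
# Illustrative only: merging live updates from independent digital twins
# (transport, building, street works) into one accessible guidance feed.
from typing import Dict

def merge_guidance(feeds: Dict[str, dict]) -> list:
    """Turn the latest update from each twin into route guidance,
    most urgent first. Feed names and fields are hypothetical."""
    priority = {"alert": 0, "warning": 1, "info": 2}
    messages = [
        (priority.get(update.get("level", "info"), 2),
         f"[{source}] {update['text']}")
        for source, update in feeds.items()
    ]
    return [text for _, text in sorted(messages)]

latest = {
    "transport-twin": {"level": "warning",
                       "text": "Lift out of service; step-free route via exit B."},
    "building-twin": {"level": "info",
                      "text": "Clinic check-in desk moved to floor 2."},
    "streetworks-twin": {"level": "alert",
                         "text": "Pavement closed on Main Road; crossing diverted."},
}

for line in merge_guidance(latest):
    print(line)
```

The point of the sketch is interoperability: each twin publishes in its own right, and the assistive layer only needs a shared, minimal message shape to respond to changing circumstances.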
Physical assets, such as buildings and transport networks, are not the only things represented in this hypothetical ecosystem of connected digital twins. A vital component pictured here is the set of digital twins of patients based on their medical data, and the team raises questions about the social acceptability and security of digital twins of people, particularly vulnerable people. 
No community is a monolith, and disabled communities are no exception. The research team acknowledges that more research is needed with the user community of Moorfields to understand the variety of needs across the service pathway that digital twins could support. As such, developers need to consider the range of users with different abilities and work with those users to design a truly inclusive ecosystem of digital twins. The work by the Smart Hospitals research team raises wider questions about the role of digital technology in creating more physical accessibility in the built environment, but also in potentially creating more barriers to digital accessibility. It is not enough to create assistive technologies if not everyone can, or wants to, access those technologies.  
‘The role of digital technologies in potentially exacerbating digital inequalities is something that needs to be looked at from a policy perspective, both at the hospital level, but also more generally, from a government Department of Health perspective,’ says Dr Michael Barrett, the project’s principal investigator.  
Dr Karl Prince, co-investigator, reflects: ‘The traditional questions raised about this type of technology are: do they have access to the equipment, and do they have the technical ability?’ The lesson is that you can build digital twins that create a better experience for people if you design digital systems from the perspective of an ecosystem of services, with input from the users of that ecosystem.  
Through exciting case studies, the project raises vital questions about digital ethics and the potentially transformative effects of digital twins on the physical built environment.
To read the infographic in detail, click here.
You can read more from the Smart Hospitals project by visiting their research profile page. 
This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).  
To join the conversation with others who are on their own digital twin journeys, join the Digital Twin Hub.