
Interesting Question: 

What is one difficulty that you’ve encountered while trying to create a Digital Twin? 

 

Context: 

We’ve heard that creating a Digital Twin can be a bumpy road. Various challenges can get in the way no matter what sort of Digital Twin you’re trying to set up or why. We’ve noticed in various conversations on the DT Hub that these challenges span a wide range, from technical and cultural ones to those related to resources, supply chains and much more. We’d like to hear about your experiences, so please share them with us here. Just a few guidelines before you start:

 

One example at a time please - no lists! However, multiple posts are welcome

Please cite the industry you’re talking about

 

Please keep your posts pithy:

 

  • Give each post a title that sums up your blocker

  • Limit each post to 100 words or so, or supply a short summary at the top if you can’t.

  • Please include an image; it helps your post stand out.

 

We encourage you to like, or vote on, each other’s posts if you agree with them. Your facilitator Joao and the DT Hub / 100%Open are looking forward to reading your input.

 

Thank you.


 



Specifying the key decisions / interventions that your digital twin will make and the information needed to support them

A challenge that we, as the IMF team, would like to bring to the forefront:

The starting point of a digital twin (DT) project is an original issue or purpose, a use case or use cases the project needs to address.

Once the original purpose of the DT project is outlined, we have found that a challenging step for organisations is defining the information requirements: these ensure that the DT (or DTs) resulting from the project collects the right information, and information of the right quality, to support the decisions/interventions it must take/make in order to be fit for purpose.

We believe that the following process-model based methodology provides an efficient route to specifying these information requirements:

  • identify the core process(es), lifecycle processes (for instance periodic lifecycles like budgeting, asset lifecycles …) and common processes (procurement, recruitment …) involved in the use case. You will likely need a DT (or DTs) of these core and lifecycle processes (or phases of them) and/or of the assets involved.
  • develop the models of these processes to at least the level where you can identify the key decisions/interventions
    • specify the decisions/interventions that the DT will take/make
  • develop and document the requirements for the information needed to support these decisions/interventions
    • specify the processes that the DT will use to create/capture the required data

Processes across organisations within the same industry, and even across industries, bear many commonalities. We believe that organisations would greatly benefit from the provision of standard process models that could be tailored to their specific context, helping them to identify the right information requirements for their DT projects.
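As a purely illustrative sketch (not part of the IMF methodology itself), the structure below shows one hypothetical way a process model, its key decisions and the supporting information requirements could be captured so that the DT's information requirements can be read straight off the model. All class names, fields and example values are invented.

```python
# Purely illustrative sketch: capturing a process model, its key decisions/interventions,
# and the information needed to support them, so DT information requirements fall out of it.
from dataclasses import dataclass, field

@dataclass
class InformationRequirement:
    name: str        # what information is needed
    quality: str     # required accuracy / timeliness
    created_in: str  # process step where it is most cheaply created or captured

@dataclass
class Decision:
    description: str  # the decision/intervention the DT must support
    requirements: list[InformationRequirement] = field(default_factory=list)

@dataclass
class ProcessModel:
    name: str  # e.g. an asset lifecycle or a periodic budgeting lifecycle
    decisions: list[Decision] = field(default_factory=list)

    def information_requirements(self) -> list[InformationRequirement]:
        """All information requirements implied by the model's decisions."""
        return [r for d in self.decisions for r in d.requirements]

# Hypothetical example: a maintenance lifecycle with one key decision.
maintenance = ProcessModel(
    name="road maintenance lifecycle",
    decisions=[
        Decision(
            description="prioritise resurfacing works for the next budget period",
            requirements=[
                InformationRequirement("pavement condition index", "quarterly, +/-5%", "condition survey"),
                InformationRequirement("traffic volume", "monthly", "roadside sensors"),
            ],
        )
    ],
)
for req in maintenance.information_requirements():
    print(req.name, "|", req.quality, "|", req.created_in)
```

A standard model of this kind could then be tailored to a specific organisation by adding, removing or re-parameterising decisions and their information requirements.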

The challenge: establish standard process models to the level required to determine critical decisions for relevant infrastructure sectors and the data requirements for these decisions.

 


Hi @Anne GUINARD, thank you for sharing this great insight. It sounds like the process-model based methodology can work as an overall roadmap to identify and address roadblocks at different stages of the DT journey and help start off the DT project.

You mention the definition of the information requirements as a major challenging step for organisations - is this the blocker you want us to take to the workshop? Thank you!


Value doesn't materialise where the efforts are 

This is a fairly general challenge when trying to make data resources useful for purposes other than those they were created for.

Data is often created for specific purposes, and there is typically additional effort involved in changing or optimising the data, addressing IP, data protection and similar concerns, and then publishing it. If data does get published, it's likely that someone else will benefit. For data that is simple in structure, a by-product of other work and not sensitive, the barrier to making it Findable, Accessible, Interoperable and Reusable (FAIR) might be quite low; the more complex the datasets get, the greater the effort and therefore the higher the barrier.
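As a very rough illustration of the low-barrier end of that spectrum, a simple, non-sensitive dataset might only need a small machine-readable description published alongside it. The record below is a sketch only; the field names loosely follow common dataset-catalogue conventions rather than any specific standard, and all values are invented.

```python
# Purely illustrative: a minimal machine-readable record describing a simple dataset.
# Field names loosely follow common dataset-catalogue conventions; all values are invented.
import json

dataset_record = {
    "title": "Substation half-hourly load readings (example)",
    "description": "Aggregated, anonymised load per substation, 2020-2021.",
    "publisher": "Example Utility Ltd",
    "licence": "https://creativecommons.org/licenses/by/4.0/",
    "access_url": "https://example.org/data/substation-load.csv",
    "format": "text/csv",
    "keywords": ["electricity", "load", "substation"],
}

print(json.dumps(dataset_record, indent=2))
```

For complex or sensitive datasets the metadata, licensing and access arrangements grow quickly, which is exactly where the mismatch between effort and value bites.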

So, how can organisations and individuals be incentivised to publish digital twins, or data that contributes to digital twins, when others might reap the benefits? I think the answer is to build communities of data sharers, align them to a common goal and create a common understanding of future value in which everyone will benefit at some point.

This is possible but, drawing on experience with utility data, it is a really hard and lengthy process.




 


On 20/09/2021 at 19:03, CRT said:

Value doesn't materialise where the efforts are 


Hi @CRT, thank you for your post. Are you saying that the effort involved in producing and sharing DT data, together with the perception of its value (benefiting others), is the roadblock you would like to highlight? Thanks


Will do and thank you @Katie Walsh

I have just observed your learned contribution this morning on the Gemini Call [21.09.21]. Please note my recent post following my in-person contribution at Housing 2021, Manchester, a couple of weeks ago.

As a Chartered Surveyor, my principal focus is asset management [and legislative compliance] in a post-Grenfell world. 

https://www.diversecity-surveyors.com/single-post/housing2021-a-roaring-success-with-digital-platform-for-dcs

Principal blocker:

As with many things [to do with innovation/transformation and technology-led disruption] within and across the built environment and construction sector, the largest blocker is CULTURAL [as well as the need to understand, at an organisational level, their collective 'why'].

I have recently been commissioned to lead on a 'Leadership Programme' for the social housing sector; any thoughts on overcoming 'cultural reluctance', due principally to fear of the unknown, in an inherently risk-averse sector [and one that continues to waste billions of pounds per annum]?

 

 

 


On 21/09/2021 at 10:56, JoaoF said:

Hi  @CRT, thank you for your post. Are you saying that the effort involved in producing and sharing DT data, together with the perception of its value (benefiting others) is the roadblock you would like to highlight? Thanks

@JoaoF It's more that the generation of value and the effort needed to make this happen are often disconnected. If you actually share data, there won't necessarily be an immediate payback.


Here are some challenges that our researchers have brought up in developing digital twins, paraphrased by me, so if they are in error the fault is mine and I welcome corrections:

  • The value of digital twins lies in providing the right information at the right time, so a key challenge is determining the frequency and timeliness of data collection to provide useful, valuable insights to asset owners.
  • With satellites, InSAR and other earth observation technologies, a challenge is in processing the high volume of data needed to quality-check the measurements taken in a timely manner.
  • In creating a digital twin of a building, existing asset management processes have been established to take advantage of the knowledge of human asset managers and the data provided by building management systems. A key challenge is to develop digital twins that are capable of complementing these existing sources of knowledge and data by adding new value.
  • Computer vision can help identify events and behaviours in the built environment without capturing footage of people, making it more acceptable from a privacy perspective. One important challenge to address is giving machine learning algorithms a fully representative training dataset so that biases are not introduced into the resulting data.
  • Each sensor in a building or asset may only be able to detect one factor or phenomenon in isolation, but if multiple sensors become networked together in ‘smart’ ways, they may be able to detect ‘complex events’: events characterised by multiple phenomena happening in a specific order, time frame or physical orientation. Understanding how to combine sensors and human understanding into truly ‘smart’ buildings that can detect complex events and respond appropriately is a challenge (a toy sketch of this idea follows the list).
  • One promise of connected digital twins is seamless services provided to the public through digital technologies in the built environment. When designing a comprehensive service ecosystem enabled by connected digital twins, it is difficult to break down existing siloes: from a technical data sharing and interoperability standpoint; from a regulatory and geographical standpoint; and from the standpoint of existing processes and business models.
  • When designing services based on connected digital twins, it is important to acknowledge the inequalities in access to digital technology based on socio-economic, geographic, age, education, ability and other factors. Exclusion from services or inequality of service provision based on these factors is a major issue to consider in the governance and development of connected digital twins for the public good.
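As a toy sketch of the ‘complex event’ idea referred to above: a rule only fires when several simple sensor events occur in a given order within a time window. The sensor names, thresholds and the deliberately simplified (greedy) matching logic are all hypothetical.

```python
# Toy sketch of complex-event detection: several simple sensor events must occur
# in a given order within a time window for the complex event to be reported.
# Sensor names and the (greedy, simplified) matching logic are hypothetical.
from dataclasses import dataclass

@dataclass
class SensorEvent:
    sensor: str       # e.g. "occupancy_camera", "co2", "window_contact"
    value: float
    timestamp: float  # seconds since some reference time

def detect_complex_event(events, sequence, window_s=300.0):
    """Greedy check: do the sensors in `sequence` fire in order within window_s seconds?"""
    idx, start = 0, None
    for e in sorted(events, key=lambda ev: ev.timestamp):
        if idx < len(sequence) and e.sensor == sequence[idx]:
            if start is None:
                start = e.timestamp
            if e.timestamp - start > window_s:
                return False  # matched sequence took too long
            idx += 1
    return idx == len(sequence)

# e.g. room becomes occupied, then CO2 rises, within five minutes
events = [SensorEvent("occupancy_camera", 1.0, 0.0), SensorEvent("co2", 1200.0, 120.0)]
print(detect_complex_event(events, ["occupancy_camera", "co2"]))  # True
```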


@Kirsten Lamb Many thanks for all of these. We are encouraging one thought per post, but your list is fantastically clear and will help us a lot to kick-start the first Jam, so thank you for posting.

 

 


On 21/09/2021 at 12:36, CRT said:

@JoaoF It's more that the generation of value and the effort needed to make this happen are often disconnected. If you actually share data, there won't necessarily be an immediate payback.

Thank you for clarifying @CRT


We need integrated modelling of resources and infrastructure with coupled iteration, especially for energy: coupled digital twins.
The IEA's International Smart Grid Action Network (ISGAN) has done some coupled modelling, and I have experience with iterative generation-fuel optimisation models (albeit from the late 1970s).

The Future and Fast Actions and Strategy papers are attached. These and associated documents are linked at www.eleceffic.com.

Steve Browning Future and Fast Action.pdf Steve Browning Energy Strategy Mk XI.pdf
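To illustrate what coupled iteration between twins could look like in the simplest possible form, the sketch below solves two stand-in models in turn (a generation-dispatch model and a fuel-supply model), each using the other's latest output, until their shared variables stop changing. Both models and all numbers are invented placeholders, not real energy-system models.

```python
# Minimal sketch of coupled iteration between two stand-in models: each is solved
# using the other's latest output until the shared variables converge.
# The models and numbers are invented placeholders, not real energy-system models.

def dispatch_model(fuel_price):
    # Hypothetical: higher fuel price -> less fuel-fired generation dispatched (GWh).
    return max(0.0, 100.0 - 2.0 * fuel_price)

def fuel_supply_model(fuel_demand):
    # Hypothetical: higher fuel demand -> higher fuel price (per unit).
    return 10.0 + 0.1 * fuel_demand

def couple(tolerance=1e-6, max_iters=100):
    price = 10.0  # initial guess for the shared variable
    for _ in range(max_iters):
        generation = dispatch_model(price)
        new_price = fuel_supply_model(generation)
        if abs(new_price - price) < tolerance:
            return generation, new_price
        price = new_price
    raise RuntimeError("coupled iteration did not converge")

print(couple())  # converges to roughly (66.7 GWh, 16.7 per unit) with these toy models
```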



Thank you @Stephen Browning for reposting here! To clarify, are you saying that having integrated modelling of resources and infrastructure would be the solution? We are encouraging succinct posts, so, simply put, what would you say is the blocker preventing this from happening? Maybe you would like to give a title that sums up your blocker? Much appreciated, thanks.


Fractured/Lack of communication between the Digital and Physical

Industry: Defence

Not sure how much of a roadblock this is in the commercial world, but it is definitely a roadblock within a defence operational environment.

Assets that are provided to the MOD do not always have the capability to transfer data in real time; the lack of logistic communication has always been an issue when dealing with the A2 Echelons (Frontline) and further back down the Forward and Reverse Supply Chain, until good communication is established.

As part of the Design & Manufacturing Phase of a project, these requirements are often traded out because of cost and the known lack of ability to transfer this data. So even when operating where communications are good, there is no ability on the asset to automatically connect and transfer the data to the Digital.

HUMS (Health and Usage Monitoring System) data is a prime example of this information transfer roadblock. The platforms do have the ability to capture this data, but getting it off the platform and dealing with the different Security Classifications and aggregation of the data is another problem.

A lot of work has already been completed as part of the Logistic Coherence Information Architecture (LCIA) (Subject to Change) with regard to the data and what is required where.

Just a bit on the CADMID life cycle for Integrated Logistic Support (ILS) attached.

Hope this hits the mark and is an interesting discussion point 🙂

Regards

Rich

CADMID.PNG


John Lewis Partnership - Retail (and other areas)

For me the biggest blocker is priority. There is very little money in retail and we are working with very lean teams to deliver just the day-to-day work. If I go to a manager to ask for permission to set up a Digital Twin I will be told no, we don't have the time (FTE) or the money. I'll also get the same response from our internal IT team: there isn't the time (FTE) or money to spend on projects like this; we need to keep the wheels on the bus.

 

Plus I'm often told to stop talking about Star Wars stuff; we don't need this.


On 22/09/2021 at 11:12, Rich said:

Fractured/Lack of communication between the Digital and Physical


A roadblock to take to the workshop for sure @Rich, thank you very much for sharing! 


On 22/09/2021 at 15:33, Paul said:

John Lewis Partnership - Retail (and other areas)


Hi @Paul, thank you for sharing your insight on your experience in the retail sector.


Biggest blocker = quick fix

Let's do it the 'old' way to solve one of the immediate problems because it's quick and easy, despite the fact that it addresses none of the overarching or longer-term ambitions of a project and certainly does not allow any further growth in benefits.

This is our greatest challenge. It can result from tech teams being poorly briefed or not bought in to the overall vision of a DT project, so they simply see it as an integration problem to be solved with a crowbar.


On 23/09/2021 at 19:07, sophie.peachey@iotics.com said:

Biggest blocker = quick fix


Great, thank you for sharing your blocker @sophie.peachey@iotics.com - does it apply across the sectors? Thanks


On 20/09/2021 at 12:15, JoaoF said:


You mention the definition of the information requirements as a major challenging step for organisations - is this the blocker you want us to take to the workshop? Thank you!

Indeed @JoaoF, in the context of the first workshop we would like to raise the definition of information requirements as a key challenge. We believe that applying a process-model based methodology can help organisations overcome this challenge by offering a systematic approach to identifying the information requirements and when information is most cost-effectively created.


Roadblock #1

The value proposition. We still cannot (collectively) articulate the cold, hard cash value proposition to business leaders, in their language. If we had, the take-up would be universal. Too much academia and not enough business talk.


Roadblock #2

The information itself. The information contained in product data templates is only valuable to the manufacturers that populate it, and COBie has little or no value to the actual maintainer (detach yourself from the mantra and actually ask a spanner wielder rather than a manager or academic). Information costs money to gather, manage and disseminate, so to make this worthwhile each piece needs to be valuable to someone. The definitive data dictionary that defines what is valuable to the end users throughout the lifecycle does not exist. This needs to be rectified (otherwise everything else is pointless)!
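To illustrate the kind of thing being asked for, a hypothetical fragment of such a data dictionary might record, for each information item, who actually finds it valuable, at which lifecycle stage and from what source; anything with no named end user then becomes a candidate for not being gathered at all. All entries below are invented.

```python
# Hypothetical fragment of a lifecycle data dictionary: each information item records
# who finds it valuable, at which lifecycle stage, and where it comes from.
data_dictionary = [
    {
        "item": "pump impeller material",
        "valuable_to": ["maintainer", "spares procurement"],
        "lifecycle_stage": "operation & maintenance",
        "source": "manufacturer product data",
    },
    {
        "item": "fire rating of door set",
        "valuable_to": ["building safety manager"],
        "lifecycle_stage": "operation",
        "source": "as-built records",
    },
    {
        "item": "paint batch number",
        "valuable_to": [],  # no named end user -> candidate for not collecting at all
        "lifecycle_stage": "construction",
        "source": "site records",
    },
]

orphaned = [entry["item"] for entry in data_dictionary if not entry["valuable_to"]]
print("Collected but valuable to no one:", orphaned)
```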


On 05/10/2021 at 16:51, Katie Walsh said:

Many thanks @iain miskimmin, we will add these in. And you are on the invite list for Jam 2. 

Jam2?

 


On 05/10/2021 at 17:26, iain miskimmin said:

Jam2?

 

@iain miskimmin there is a follow-up workshop to focus on prioritising challenges. This will be held later this month. Katie will get in touch soon.

