
About This Network

A place to focus discussion around how our collective approach to data governance, value and quality must evolve. It provides a central point for the storage of resources that are relevant to each topic, and a forum for the open sharing of ideas, research and case studies. Please note that the content of this network (posts, shared files ...) is visible to guests of the DT Hub website (non-registered members).
  1. What's new in this network
  2. Another day, and another university explaining its digital twin strategy: https://bits-chips.nl/artikel/getting-smarter-with-digital-twins-together/ There is quite an interesting statement in the article, where Hamilton explains that the coupling is possible with the right software code: “When we understand what the data are and where they’re coming from, you can create programs and data pipelines to connect them properly with the right component in, for instance, Unity. But actually, you want to get rid of that programming step. What you need is enriched data that contains the necessary information to automatically find its way through the model and make the right connections without any programming. That shift is what’s needed and what we’re working on at TUE.” In other words, what we need is validated, enriched information that can automatically link through to the model with the right connections (a routing sketch along these lines follows the list below).
  3. https://envirotecmagazine.com/2022/10/16/comment-dreaming-the-possible-data-and-the-future-of-water/ The question of what frameworks are being used is certainly valid, but the ability to validate the data and deliver it to consumers still seems to sit in its own silo. Can we plug and play our data into analytical tools and ensure a smooth process for consumers rather than just creators? We find the challenge is not only schema/class alignment but also alignment of content, i.e. why is C30/PAV1 OK and C35/PAV1 not, and what do they mean? (A schema-versus-content sketch follows the list below.)
  4. Some interesting topics were covered in last night’s Rail Industry Association meeting: the first presentation was on carbon, then digital trains, but perhaps the Rail Data Marketplace will be of most interest to members of this network.
  5. Glen Worrall

    Cost of Bad Data

    Hi @DRossiter87, there is also a government article linking back to the framework (PS: your link seems to have gone stale). The magnitude of the datasets seems to make writing the rules as hard as creating the data, and the evolving nature of the Employer’s Information Requirements creates a dynamic validation scenario. The requirement to validate data sits not only at the schema level but also with the downstream consumers. I highlight some of the challenges with reviewing data in https://medium.com/@allowing_pullman_wasp_147/validation-of-property-instances-with-itwin-c04982134f52; assets such as panels, for example, prevent the rapid deployment of engineering content into operations. I have various threads running on how to validate, but they all seem to be asset-owner or project specific. The only government standard I seem to have come across is the Singapore CORENET system, which is very specific. I am interested in what others are doing to standardise data validation requirements (a data-driven rules sketch follows the list below).
  6. Peter El Hajj

    Cost of Bad Data

    Hey @DRossiter87 - thanks for sharing. I thought this was a good report. Since I saw your post here, I've been on a research exercise to find more references to the "cost of bad data". Have you come across other studies that assess or discuss that cost? I think it is helpful knowledge for assessing the value you can get from data; both the value of static data and the value of data flow between entities (like within an ecosystem of connected digital twins). The other reference for bad data cost I came across recently is: https://hbr.org/2016/09/bad-data-costs-the-u-s-3-trillion-per-year
  7. DRossiter87

    Cost of Bad Data

    Not sure if everyone is aware, but Autodesk have produced an interesting report highlighting the cost of bad data and how to harness good data: https://construction.autodesk.com/resources/fmi-construction-data-report/?utm_medium=press-release&utm_source=blog&utm_campaign=fmi2021&utm_region=global It includes some interesting figures about the global cost and its effect on things like rework.
  8. Hi All - in today's Gemini Call presentation I shared our new recommended practice (RP) for the assurance and qualification of digital twins. Within the session I referenced a derivative RP on data quality and took an action to share a link. Here's a link to our work - if you scroll to the bottom of the page you'll find a download link to our data quality RP, DNV-GL-RP-0497 - Data Quality Assessment Framework. Kind regards, Graham
  9. This forum is quiet, so I thought maybe some provocation: everyone talks about data quality, but who is actually making progress measuring and managing it? The DAMA six-dimension model has been around for some time, but talking to data managers I have yet to find universal acceptance or understanding of data quality criteria, their semantics and their application. If you can define them, how do you apply them? To the entity, to the attribute, to the relationship? How do you connect your specification with the myriad data stores and different physical implementations found in systems often spanning decades? How do you manage situations where different teams require different data quality criteria for the same data in different processes? How do you motivate staff to maintain data of high quality when there is no value to their immediate job? Different organisations have different needs, and even within the same organisation different disciplines require emphasis on different qualities. The operational team in a safety-critical business will have much more stringent requirements than their colleagues in marketing: one lost lead is not the same as one accident. (An attribute-level scoring sketch follows the list below.)
  10. I was recently introduced to the work on digital twins that the City of Wellington is involved in, and I wanted to share some links with the DT Hub community: Unlocking the Value of Data: Managing New Zealand’s Interconnected Infrastructure. Plus, check out these links too, which were shared with me by Sean Audain from Wellington City Council, who is leading the digital twin activity in the city: "We have been on this trip for a while - here is an article on our approach https://www.linkedin.com/pulse/towards-city-digital-twins-sean-audain/ - the main developments since it was written was a split between the city twin and the organisational twin - something that will be formalised in the forthcoming digital strategy. To give you an idea of progress in the visualisation layer, this is what the original looked like https://www.youtube.com/watch?v=IGRBB-9jjik&feature=youtu.be back in 2017 - the new engines we are testing now look like this https://vimeo.com/427237377 - there are a bunch of improvements in the open data and in the shared data systems." I asked Sean about the impact of the DT on city leaders' decision making. This is his response: "In our system we are open unless otherwise stated. We have used it as a VR experience with about 7,000 Wellingtonians in creating the City Resilience Strategy and Te Atakura - the Climate Change Response and Adaptation Plan. There are more discrete uses, such as the proposals for the Alcohol Bylaw - https://www.arcgis.com/apps/Cascade/index.html?appid=2c4280ab60fe4ec5aae49150a46315af - this was completed a couple of years ago and used part of the data sharing arrangements to make liquor crime data available to make decisions. I have the advantage of being a part of local government in getting civic buy-in. Every time our councillors are presented with this kind of information they want more."
  11. @David Willans of Anmut recently sent me this invitation and I thought I should share it here (with permission). On 24th February, 11am GMT, Anmut are running a webinar about data valuation. When we mention the term, people tend to think it’s about setting a price for monetisation. That is one benefit of doing valuation, but it’s a third-order benefit at best. The first- and second-order benefits are much more valuable and best described with two words: translation and focus. Translation Businesses are, in a simplified way, about choosing which assets and activities to allocate limited capital and resources to, to get the desired results. Data is just one of those assets, a powerful one because it enhances all the others by making decisions better, and it can identify unseen problems and new opportunities. These allocation decisions are made using money as a measure, a language if you will – invest £XXX in product / advertising / a new team / training, to get £XXXX in return. Data doesn’t fit with how a business allocates capital, which makes realising its value much harder. When you value it, ‘it’ being the different data assets in a business, data can be compared to other assets. It fits the way the business runs naturally. The second-order impact of this is culture change. Suddenly the business understands it has a sizeable portfolio of data assets (in our experience this is approximately 20-30% of the total business value) and, because businesses manage through money, the business starts to naturally manage data. One caveat though: for the translation effect to happen, the way data is valued matters. If it’s just a simple cost-based method, or linear, internal estimates of use-case value, the resulting valuation won’t be accurate and people won’t believe it, because the figures will be based on factors heavily influenced by internal politics and issues. Focus Capital allocation is a game of constrained choices, of where to focus. When a business’ portfolio of data assets is valued, it becomes very clear where to focus investment in data to move the needle – on the most valuable data assets. Again, this puts more pressure on the valuation method, because it has to be based on the ultimate source of value truth – the stakeholders for whom the organisation creates value. If you need to translate the value of data so the rest of the business gets it, or need clearer focus on how to create more measurable value from your data, this webinar will help. Find out more here or sign up
  12. Worth a look at Digital twins, data quality and digital skills (pbctoday.co.uk)
  13. Hi @Gary Todd (Famiio), there isn't one planned as yet, but it is possible that one may be set up once there is more information flowing here.
  14. Gary Todd (Famiio)

    Welcome to the Data Value and Quality network

    Will there be a scheduled call regarding Data Quality at some point, like there has been on the issue of Interoperability?
  15. We live in a world abundant in data and technology, and there are numerous ways to fake data of all kinds (think deepfakes). Envisioning a future where data outputs become as common as a PDF report, how do we build the critical-thinking skills that will allow data professionals to know when something doesn't look right, even though it may have already passed data quality and data audit checks? Just a thought at this point, but I would be interested in others' thoughts.
  16. “Data that is loved, tends to survive.” – Kurt Bollacker In our quest to transition ourselves from a nation that simply creates data to one where we understand and exploit its value for the betterment of society, we still have much to learn about what constitutes ‘quality’ in data. The National Digital Twin programme wants to explore how quality can be defined, and how we can begin to build the tenets and processes for high-quality data into the way we operate in our daily lives, our corporate environments, and our national institutions. This network has been created as a place to focus discussion on how our collective approach to data governance, value and quality must evolve. It provides a central point for the storage of resources that are relevant to each topic, and a forum for the open sharing of ideas, research and case studies. We will explore case studies, debate how we have learned (or not) from the mistakes of the past, and try to build consensus on what constitutes best-in-class practice for governance, quality and, ultimately, value. To help guide and shape the work being done, the voices of broader stakeholder groups, expert communities and organisations are invaluable. To this end, the NDTp is establishing this new network through the DT Hub. Who should join? This is an open group accessible to any member of the DT Hub. This is an actively developing area, and broad participation is encouraged from individuals of all backgrounds. Admin & Security This community will be supported by CDBB and the National Digital Twin programme through a network manager (James Harris), backed by the core NDTp team. Please note that, due to the open nature of the DT Hub, the community is not suitable for the discussion of sensitive or commercial information.
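On the enriched-data point in item 2: below is a minimal Python sketch, not the TU/e implementation, of what "data that finds its own way to the model" could look like. Each reading carries its own binding metadata, so one generic routing step replaces per-source pipeline code. All class, field and component names here are hypothetical.

```python
# Sketch: readings that describe their own target in the model, so a generic
# router can attach them without bespoke code per data source.
from dataclasses import dataclass, field

@dataclass
class EnrichedReading:
    value: float
    unit: str
    target_component: str   # identifier of the model element (e.g. a Unity object name)
    target_property: str    # property on that element to update

@dataclass
class ModelComponent:
    name: str
    properties: dict = field(default_factory=dict)

def route(readings, model):
    """Generic routing step: the metadata, not hand-written code, does the linking."""
    for r in readings:
        component = model.get(r.target_component)
        if component is None:
            print(f"Unmatched reading: no component named {r.target_component!r}")
            continue
        component.properties[r.target_property] = (r.value, r.unit)

model = {"Pump-01": ModelComponent("Pump-01")}
route([EnrichedReading(61.5, "degC", "Pump-01", "bearing_temperature")], model)
print(model["Pump-01"].properties)   # {'bearing_temperature': (61.5, 'degC')}
```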
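On the C30/PAV1 versus C35/PAV1 example in item 3: a small sketch of the difference between schema validation (is the value the right shape?) and content validation (is this combination actually acceptable to the downstream consumer?). The pattern and the allowed-combinations table are invented for illustration; in practice they would come from the client's specification.

```python
# Sketch: schema check (format only) versus content check (agreed combinations).
import re

SCHEMA_PATTERN = re.compile(r"^C\d{2}/PAV\d$")      # shape of the value only
ALLOWED_COMBINATIONS = {"C30/PAV1", "C32/PAV2"}     # hypothetical acceptance list

def validate(designation):
    issues = []
    if not SCHEMA_PATTERN.match(designation):
        issues.append("fails schema check (unexpected format)")
    elif designation not in ALLOWED_COMBINATIONS:
        issues.append("valid format, but not an accepted combination")
    return issues

for value in ("C30/PAV1", "C35/PAV1", "PAV1/C30"):
    print(value, validate(value) or "OK")
```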
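On the point in item 5 about the Employer's Information Requirements creating a dynamic, project-specific validation scenario: one way to cope is to express each requirement as data and apply it with a generic checker, so an evolving EIR changes only a rule file rather than code. The rule names, fields and thresholds below are hypothetical; this is a sketch of the pattern, not a recommended standard.

```python
# Sketch: project-specific requirements expressed as data, applied generically.
PROJECT_RULES = [
    {"field": "asset_id",       "check": "required"},
    {"field": "fire_rating",    "check": "required"},
    {"field": "panel_width_mm", "check": "range", "min": 300, "max": 3000},
]

def check_record(record, rules):
    findings = []
    for rule in rules:
        value = record.get(rule["field"])
        if rule["check"] == "required" and value in (None, ""):
            findings.append(f"{rule['field']}: missing")
        elif rule["check"] == "range" and value is not None:
            if not (rule["min"] <= value <= rule["max"]):
                findings.append(f"{rule['field']}: {value} outside {rule['min']}-{rule['max']}")
    return findings

record = {"asset_id": "PNL-0042", "panel_width_mm": 4200}
print(check_record(record, PROJECT_RULES))
# ['fire_rating: missing', 'panel_width_mm: 4200 outside 300-3000']
```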
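On the questions in item 9 about where quality criteria attach: a small sketch of applying two commonly cited dimensions (completeness and validity) at attribute level, with thresholds that can differ per consuming team, which is one way to express the operations-versus-marketing tension. The dataset, rules and thresholds are illustrative only.

```python
# Sketch: attribute-level completeness and validity, with per-team thresholds.
from datetime import date

records = [
    {"asset_id": "A1", "install_date": "2020-03-01"},
    {"asset_id": "A2", "install_date": ""},
    {"asset_id": "",   "install_date": "2021-13-40"},   # deliberately invalid date
]

def completeness(field):
    return sum(1 for r in records if r.get(field)) / len(records)

def validity(field, is_valid):
    present = [r[field] for r in records if r.get(field)]
    return sum(1 for v in present if is_valid(v)) / len(present) if present else 1.0

def iso_date(value):
    try:
        date.fromisoformat(value)
        return True
    except ValueError:
        return False

THRESHOLDS = {"operations": 0.99, "marketing": 0.80}   # different bars, same attribute

score = completeness("install_date")
for team, bar in THRESHOLDS.items():
    print(f"install_date completeness {score:.2f} "
          f"{'meets' if score >= bar else 'fails'} the {team} threshold ({bar})")
print("install_date validity:", round(validity("install_date", iso_date), 2))
```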