We are often overwhelmed by new technology buzzwords, but this one has been with us for some time and is regularly misunderstood. Nevertheless, I want to explore the ‘Digital Twin’ and how it’s allowing us to achieve perfection.
A Dynamic and Malleable Topology
I recall developing a Microsoft Windows device driver for a Windows 2000 PC back in 2001, for a company based in Scotland. The PC hosted a Windows CE Operating System (OS) and, for all intents and purposes, any hardware inserted into the PC host would be recognised by the CE device as something inserted into its own physical slot. Now, I fear I’m showing my age a little here. Anyway, a PCMCIA network card was inserted into the Windows host, through which the CE device would then be connected to the network – this, essentially, was my remit. However, I learned that this had never been done before and, compounding my challenge, generic Windows driver development at the time was mostly a black art, since there wasn’t much information available.
Nonetheless, in this instance, the physical asset, namely the Windows CE device, was represented in a digital environment, whilst the host presented the external world to it through the hardware connected to it. If you like, the Windows CE device was fooled into believing that it had physical assets directly connected to it. This was perhaps an early form of virtualisation – one thing believing it’s something, or somewhere, else. Nowadays, we witness similar technologies, such as Network Functions Virtualisation (NFV), where physical networking assets, for example, are represented within software which, in turn, provides a dynamic and malleable topology, allowing its users to shape and mould it ad hoc. Moreover, this adaptability affords enterprises reduced costs when upgrading or updating their ecosystems with new software or hardware.
Data Driven
In short, the Digital Twin is the ability to replicate and execute a physical asset in software. This replication is more than a ‘shadow’ of something that might be; rather, it is a fully-fledged and functional replication of what would be the manifestation of that physical asset in the real world. More often than not, a digital twin of your product, device or process can be tested routinely before the system is introduced into general consumption. For example, a car manufacturer can use a digital twin representation of its new electric vehicle prior to building it. The manufacturer would be capable of perfecting all aspects of its engineering, mechanical design, usability and even its aesthetics using a real-time database of what it has previously learned.
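To make the idea a little more concrete, here is a minimal sketch of what ‘replicating and executing a physical asset in software’ might look like for that electric vehicle example. Everything here is hypothetical – the class name, the battery figures and the consumption rate are illustrative assumptions, not a real manufacturer’s model:

```python
from dataclasses import dataclass


@dataclass
class BatteryTwin:
    """Hypothetical digital replica of an EV battery pack."""
    capacity_kwh: float       # design capacity (assumed figure)
    charge_kwh: float         # current state of charge
    efficiency: float = 0.92  # drivetrain efficiency, assumed from prior testing

    def simulate_drive(self, distance_km: float, kwh_per_km: float = 0.18) -> float:
        """Run the twin through a virtual journey; return the remaining charge."""
        drawn = distance_km * kwh_per_km / self.efficiency
        self.charge_kwh = max(0.0, self.charge_kwh - drawn)
        return self.charge_kwh


# Exercise the twin before any physical prototype exists.
twin = BatteryTwin(capacity_kwh=75.0, charge_kwh=75.0)
remaining = twin.simulate_drive(distance_km=100)
print(f"Charge after 100 km: {remaining:.1f} kWh")
```

The point is not the arithmetic, which is deliberately trivial, but that the twin can be driven through thousands of virtual journeys, fed with whatever the manufacturer has previously learned, long before a single car is built.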
Similarly, civil engineers can utilise a digital twin of a proposed bridge design prior to construction. Such engineers would be able to run numerous modelling scenarios, allowing them to understand how their design behaves under certain loads and, likewise, how the bridge performs in severe weather – to name just a few of the many requirements of bridge design.
With data, if we know the past and understand the present, can we predict the future?
With both examples, data is essential – I dare say, critical – to modelling and understanding how best to test your product. I’ll focus on the bridge model for a moment, as I saw a recent article explaining how, in Norway, a series of Internet of Things (IoT) sensors was used to help engineers understand how a bridge behaved under numerous conditions. The data that was harvested afforded the engineers the ability to predict repairs, maintain the bridge’s longevity and, ultimately, ensure it withstood the volumes and dynamics of unpredictable traffic and weather.
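A sketch of how such harvested sensor data might feed a maintenance decision: compare the recent average of strain readings against the design baseline and flag the bridge when drift exceeds a tolerance. The readings, baseline and thresholds below are all illustrative assumptions, not values from the Norwegian project:

```python
def needs_maintenance(strains, baseline, drift_limit=0.10, window=5):
    """Flag maintenance when the recent average strain drifts past the baseline.

    strains     -- time-ordered sensor readings (hypothetical units)
    baseline    -- expected strain from the original design model
    drift_limit -- fractional drift tolerated before flagging (assumed 10%)
    window      -- number of most recent readings to average
    """
    recent = strains[-window:]
    avg = sum(recent) / len(recent)
    return (avg - baseline) / baseline > drift_limit


# Illustrative readings showing a gradual upward drift in strain.
readings = [1.00, 1.01, 1.02, 1.08, 1.12, 1.15, 1.18]
print(needs_maintenance(readings, baseline=1.00))
```

A real deployment would, of course, involve far richer models, but even this toy rule shows the shape of the idea: the twin turns raw sensor history into an actionable prediction.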
Until next time…
Your digital twin remains a vacuous digital space with no tangible purpose if it doesn’t have the data needed to challenge it. Your data would comprise the past, that is, what has been previously understood and learned; it, of course, also needs to understand the present, that is, how the product currently behaves under the conditions that beset such products today. More importantly, by combining both past and present (data sets vis-à-vis machine and deep learning), we can predict the future.
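The ‘past plus present predicts the future’ claim can be sketched in its simplest possible form: fit a line to historical observations and extrapolate forward. The wear figures below are invented for illustration, and real digital twins would use far more sophisticated machine and deep learning models, but the principle is the same:

```python
def fit_line(xs, ys):
    """Ordinary least squares fit over historical data (the 'past')."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx


# Hypothetical yearly wear measurements observed so far.
years = [1, 2, 3, 4]
wear = [0.5, 1.0, 1.5, 2.0]
m, b = fit_line(years, wear)

# Extrapolate to year 6 (the 'future').
print(f"Predicted wear in year 6: {m * 6 + b:.1f}")
```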
We can leverage our experiences and current understanding to build better quality products. What drives this ability to holistically and accurately represent physical assets in software is good-quality data – data must drive our digital twin ideology and, as such, it will undoubtedly lead our digital representations of what we want to build towards perfection. This effort will reduce waste and provide a long-term sustainable and efficient future.
So, this is where a “thinking so far ahead,” Dr G, signs off.
For more about the Digital Twin, listen to our podcast with Keysight Technologies.