Tampered IoT Data – The “fake news” of the Industrial IoT

The rising amount of collected IoT data, the trend towards data-based decision-making and new data services increase the risk of being vulnerable to manipulation. Therefore, companies should implement countermeasures to prevent IoT data from being tampered with and build their data services and IoT platforms on a trustworthy foundation. Tributech offers a data notary service that eliminates the risk of tampered IoT data and can be easily integrated into your data service or platform.

The term “fake news” has been around for quite a while and describes deliberate misinformation intended to influence people’s decision-making and beliefs.

In both our professional and personal lives, we make decisions all the time and, of course, we make mistakes. History shows us that even the best leaders sometimes make the wrong decisions. With the rise of fake news, however, there is a risk that the number of wrong decisions will reach unprecedented levels, because fake news goes far beyond manipulating elections or exposing unwitting consumers to hidden advertising messages.

As the global volume of information grows exponentially, it becomes hard for people to verify sources and engage critically with the content they are presented with. This is why digital platforms are strongly criticized for their global reach and their lack of validation and reference checks of posted content.

Due to the rise of technological opportunities within the Industrial IoT, the amount of available information is also increasing exponentially, and more and more companies are moving towards data-based decision-making to manage their businesses.

Data-Driven Business at Risk

Therefore, companies should be aware of the trend of deliberate misinformation and the possible impact it can have once it begins to target IoT data. Tampered IoT data can cause tremendous harm to businesses, as it feeds interpretations, decisions and actions triggered within a company. It may even put the safety of people at risk when security mechanisms are triggered by manipulated data inputs.

So, every company that has already established a data service or an IoT platform, or is planning to do so, should be aware of these risks. We offer an out-of-the-box solution for ensuring the origin and integrity of IoT data, even across company boundaries. This creates trust from sensor to consumer and provides an additional layer of IoT security.

Tamper-Proof IoT Data enabling Trust from Sensor to Consumer

In order to keep deliberate misinformation out of your IoT platform or data service, you need an independent party at the data source that verifies the origin and integrity of the data. The principle is the same as with a notary in the physical world: a notary is an independent instance that verifies documents for your company.
As part of our DataSpace Kit, the DataSpace Agent operates as a data notary service that sits directly at the data source and verifies origin and integrity. To do so, the DataSpace Agent creates cryptographic proofs of the data, which are securely stored in a blockchain-based trust layer. This approach enables the verification of origin and integrity across systems and companies at any time, even for high-frequency data streams.
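
To make the idea more concrete, here is a minimal, illustrative sketch of what a data notary at the source could do: hash each reading, sign it with a device-specific key and hand the resulting proof over for anchoring in a trust layer. This is not Tributech’s actual API; the key, stream names and fields are assumptions for illustration only.

```python
import hashlib
import hmac
import json
import time

# Hypothetical device-specific secret; in practice this would live in secure hardware.
DEVICE_KEY = b"device-specific-secret"

def create_proof(stream_id: str, value: float, timestamp: float) -> dict:
    """Build a cryptographic proof (hash + MAC) for a single sensor reading."""
    payload = json.dumps(
        {"stream": stream_id, "value": value, "ts": timestamp}, sort_keys=True
    ).encode()
    digest = hashlib.sha256(payload).hexdigest()                           # integrity fingerprint
    signature = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()  # proof of origin
    return {"stream": stream_id, "ts": timestamp, "hash": digest, "sig": signature}

def anchor_proof(proof: dict) -> None:
    """Placeholder for publishing the proof to a blockchain-based trust layer."""
    print("anchoring proof", proof["hash"])

reading = {"stream_id": "welding-machine/quality", "value": 0.97, "timestamp": time.time()}
anchor_proof(create_proof(**reading))
```

Because only the proof, not the raw data, is anchored, any consumer can later verify a received value against it without the data itself ever leaving the companies involved.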

Why is this relevant for your IoT platform, data service or new business model? Let’s say you have two connected machines: one is located at your supplier’s site and the other one in your factory. One machine is welding two pieces of metal together and the other one is supplying the metal sheets. The assembling machine uses quality-related input from the welding machine to make specific adjustments during the assembly process. If the welding machine provides wrong information about the quality of the welded parts, this can affect the final product and its strength class. The resulting production stop and the increase in rejects or claims would have tremendous consequences for the business. Now imagine the impact in a fully automated factory. This is why it is necessary to integrate a data notary service into your system that verifies the origin and integrity of data, so that you can profit from the benefits of data services.


Working on a data service, new business model or IoT platform? Contact us for a first discussion as it might be at risk.

Covid-19: Behind the mask

Nothing new came to light, but considering what was already known earlier would have made a difference.

A few weeks after the outbreak of the Covid-19 pandemic, it makes sense to draw a first conclusion from what we have seen and heard from industry players, and we have tried to summarize some of our core findings. Basically, nothing new came up; everything was already part of previously published research and findings. However, the last few weeks and months demonstrated the difference between launching (digital) initiatives without a clear purpose and launching them because you really want to drive change in order to manage operations, build resilience to survive or even accelerate your business.

Cross-company perspective

The past few months have shown us the benefits of digitization on the shop floor and how relevant it is when trying to keep everything up and running in challenging times. For example, a high level of automation on the shop floor helped to keep production running with a minimum of employees on site, or even without any at all. Being able to access assembly lines at remote locations from headquarters or from home provided additional benefits for enterprises and supported their resilience.

However, the crisis also showed us the limitations of today’s approaches: the strength of a value or supply chain depends not only on the degree of digitization of a single member but on that of the whole chain. Therefore, in the future it will be necessary to think beyond your own company when digitally transforming businesses or value chains and to include suppliers, partners and customers in your thinking and projects.

Establish & accelerate your data-driven business

Offering value-adding services around your products or even establishing new business models helps to diversify your business. For example, platform- or subscription-based data services can decrease the volatility of a company’s revenue.
One especially interesting example in these challenging times is the pay-per-use approach. It can help a manufacturer establish an additional sales channel that does not impose a high investment barrier on the customer. The customer, in turn, gets maximum flexibility when scaling production up and down according to volatile demand.
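
As a rough illustration of how metered machine data could feed such a pay-per-use model, here is a small sketch; the record fields, base fee and per-unit price are purely hypothetical assumptions, not a real tariff.

```python
from dataclasses import dataclass

# All rates and fields below are assumptions for illustration only.
PRICE_PER_UNIT = 0.12   # hypothetical price per produced unit, in EUR
BASE_FEE = 150.0        # hypothetical fixed monthly fee, in EUR

@dataclass
class UsageRecord:
    machine_id: str
    units_produced: int  # metered output reported by the connected machine

def monthly_invoice(records: list[UsageRecord]) -> float:
    """Sum the metered usage and apply the pay-per-use tariff."""
    total_units = sum(r.units_produced for r in records)
    return BASE_FEE + total_units * PRICE_PER_UNIT

usage = [UsageRecord("press-01", 4200), UsageRecord("press-01", 3900)]
print(f"Invoice: {monthly_invoice(usage):.2f} EUR")
```

The calculation itself is trivial; the hard part is trusting the metered usage data that drives the invoice, which is exactly where verifiable data comes in.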

According to a Forbes study, 89% of the interviewed CIOs expect to have revenue-generating responsibilities, and two thirds of the CIOs in the manufacturing sector expect to head a profit center by 2025 and to create new products and services contributing to the bottom line.

However, establishing and accelerating your data-driven business in a scalable and global way comes with several challenges that we at Tributech aim to address. I would like to highlight two of them that will grow in importance. The first is the so-called “willingness to share”, which is necessary when scaling your data-driven service or business model across your customer base. The second is trust in shared data from sensor to consumer: being able to rely on shared data will be one of the key requirements for data-driven services and business models, no matter whether the data is used for decision-making, documentation, triggering payments or as a reference in legal agreements.

Managing data like corporate assets

As data is the necessary foundation for all of the learnings mentioned above, companies need to derive a “data asset strategy” from their overall corporate strategy in order to align the management and handling of data with the future value-creation process. This derived strategy will also help companies to better define data security and sharing policies, to allocate data-related budgets and investments, and to demonstrate a true transformation towards a data-driven future to all involved stakeholders.

At Tributech, we believe that a company’s strategy depends more and more on deriving value from data. The management, the board and also the investors should (and will) keep an eye on a new asset class – the data portfolio of the company.


Curious? Talk to us!

Alexander Sztatecsny – new COO of Tributech

“A new chapter? It’s more like an entire new book!” is how Alexander describes his recent career change.

Alexander Sztatecsny, who has spent most of his professional life working in different areas of digitalization across the value chain, has recently joined Tributech as COO and Managing Partner and will work with the team to take the business to the next level.
We interviewed Alexander to give you a short overview of his motives and his expectations of Tributech and its field of business.

1.) What excites you about Tributech?

There are many things which excite me about Tributech but let’s talk about the ones which strike me most.

First of all, it is the core idea and value that fuel our platform: our solution is technologically highly sophisticated, combining a couple of established technologies like hardware-based security, blockchain and peer-to-peer communication into a highly standardized and scalable solution, while serving a very simple but widely neglected need of enterprises around the globe: a selective, secure and auditable way to share data across organizations or processes!

It is the “Why hasn’t this been invented before?” moment, when you realize what kind of game changer this could be. The second aspect which excites me in particular is the team, which combines an unbelievable level of enthusiasm, thought leadership and technical expertise. So far it has been great fun and already very rewarding, especially considering the currently enforced way of working predominantly remotely, and I am really looking forward to building on this spirit!

 

2.) What is your long-term perspective about cross-company data exchange and transforming data to business assets?

I do see a very bright future for both topics…I guess I have to say that. 😉
Let’s be serious: there has been extensive research about the value and potential of breaking up (data) barriers along the value chain; I can only recommend, for example, the WEF study: https://www.weforum.org/whitepapers/share-to-gain-unlocking-data-value-in-manufacturing/. So in theory we should be aware that something needs to be done to address the corresponding issues, but in reality I still miss substantial progress in real-life scenarios beyond successful demonstrators in lab environments. From my perspective it will need:

  • more technology/platform companies building consortia, like Adobe, SAP and Microsoft with their “Open Data Initiative”, in order to address the topic from an application/platform perspective
  • large corporate front runners within the different verticals that guide their company ecosystems towards making the sharing of data more natural
  • technology which makes sure that interests like IP protection, transparency and security are addressed for all stakeholders.

When it comes to transforming data into business assets, there is for sure still a long way to go to cover all important aspects, especially from a rating, standardization and regulation perspective. But I guess we will all agree that one topic will be of absolute essence: being capable of proving the source and integrity of data!
To put it to an extreme: let’s ask ourselves what the value of data is if you cannot prove that it has not been tampered with or even artificially generated. Companies which (plan to) extend the monetization of data assets should make sure to address this and build the right governance models and technological foundation to serve both inter- and intra-company data exchange.
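
To make this tangible, a minimal, self-contained sketch of such an integrity check could look as follows; the stream name, values and anchoring mechanism are hypothetical and only illustrate the principle of comparing received data against a proof created at the source.

```python
import hashlib
import json

def fingerprint(record: dict) -> str:
    """Deterministic SHA-256 fingerprint of a data record."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

# Proof anchored at data-creation time, e.g. in a blockchain-based trust layer.
anchored_proof = fingerprint({"stream": "pump-07/temperature", "value": 63.4, "ts": 1718000000})

# Data as received by the consumer, possibly via several intermediaries.
received = {"stream": "pump-07/temperature", "value": 63.4, "ts": 1718000000}

if fingerprint(received) == anchored_proof:
    print("Integrity verified: the data matches the anchored proof.")
else:
    print("Warning: the data was modified after the proof was created.")
```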

   

3.) What impact do you see from COVID-19 on Tributech and the future of cross-company data sharing?

I have already been asked this quite often since I came on board at Tributech… I guess it is related to the fact that humans typically strive for a safe harbor in times of crisis rather than making big disruptive changes to their personal and business lives. So, from my personal perspective, I do not see any risk bigger than normal. The time has come to pick up opportunities which can sustainably shape the future in a positive way.
From a company perspective it is rather straightforward as well: we are focusing on specific aspects of the data economy, which, at the current stage, is still in its infancy compared to its overall potential. So you could say we are “ahead of the curve” and are working only with highly innovative companies who are leaders in their field. Leaders by nature do not tend to buckle under pressure; they have a clear vision on which they execute and can build the foundation for the time after any crisis. This head start will have a huge impact on the success or failure of future endeavors. So, as we continue to work with these leaders, stay focused and keep our competitive advantage, we are building the foundation for what might be a commodity in the future!

I am absolutely convinced that cross-company data sharing will get a massive boost and more attention from this crisis. Enterprises have already realized that being mature from an internal (OT) digitalization perspective, for example by running almost fully automated or autonomous factories, (could have) made the difference between shutting down a factory and keeping production up and running almost as usual. Connected assets and factories enabling remote operations and services are one side, but I guess what many enterprises have learned the hard way is their dependency on their direct peers in the value chain. What is the point of keeping my factories running if I have only limited insight into my supply chain on the one side and into end-customer demand on the other? Scalable and trusted cross-company data sharing will be key to creating more transparency along the value chain and increasing planning fidelity.

 


Curious? Talk to us!

Digital Twins: The 4 types and their characteristics

In one of our last blog articles, we focused on what a digital twin is and what it can be used for. Today we would like to go into more detail and highlight the four different types of digital twins and their applications.

In a nutshell, there are the following 4 types: 

  • Component Twins / Parts Twins
  • Asset Twins
  • System or Unit Twins
  • Process Twins

In essence, all these types of digital twins are the same: they represent an object or process virtually and help to predict key factors like remaining running time or foreseeable damage. Where the types differ, however, is in their area of application. Let us now go through all the types to get a more precise picture of the differences.

Component Twins / Parts Twins

As the name suggests, this is the twin of a single component within a larger system. Does that mean every screw in a car is virtually reproduced in order to make predictions about its service life? No, of course not; Component Twins are built for the real key components that have a direct impact on performance and functionality. A second area of application covers components that are not quite as important but are subject to constant high loads or shock-like stresses.

Asset Twins

Asset Twins are the next higher level of digital twins. They describe how individual components work together as an entire asset; a good example is an engine or a pump. Asset Twins can receive information from Component Twins or be a collection of Component Twins themselves. While Component Twins are more concerned with the stability and durability of individual parts, Asset Twins allow you to explore an entire asset. You can check how well individual parts work together and discover potential for improvement without having to take apart real engines or gearboxes. In this way you can, virtually and consequently also in reality, reduce mean time between failures, mean time to repair and fuel consumption while increasing factors like performance.

System or Unit Twins

System Twins, also known as Unit Twins, work on a higher level. They combine individual Asset Twins and give you the opportunity to check how individual assets work together, comparable to Asset Twins combining individual Component Twins. Let’s stick with our car example: one System Twin combines all the assets necessary for propulsion, another all the assets necessary for the electrical system, another all the assets needed for the bodywork, and so on.

For the sake of understanding, the example of a car factory is perhaps a bit simpler: here a System Twin brings together all the units necessary for the production of one component of the finished car. System Twins are also all about improving the collaboration between individual assets, so that the end result is maximum performance with minimum wear and tear or time consumption.

Process Twins

While a System Twin represents the production units for a single part of a car, a Process Twin represents an entire production facility and provides insight into the collaboration of all units. At this level, factors such as timing become important. In the closed loop of an entire process, individual units can also produce too quickly, leading to an excess of certain parts and thus to high storage costs or other logistical challenges. Only at this level does the full complexity of monitoring via digital twins become really clear, because a process only becomes functional and effective when all units, assets and components fulfill their purpose.
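
To summarize the four levels, here is a rough sketch of how they could be modeled as a composition hierarchy. The class names, fields and the simple health rule are illustrative assumptions, not a standard API: components make up assets, assets make up systems, and systems make up a process.

```python
from dataclasses import dataclass, field

@dataclass
class ComponentTwin:
    name: str
    wear_level: float = 0.0  # 0.0 = new, 1.0 = end of life

@dataclass
class AssetTwin:
    name: str
    components: list[ComponentTwin] = field(default_factory=list)

    def health(self) -> float:
        """Asset health derived from its components (simple worst-case rule)."""
        return 1.0 - max((c.wear_level for c in self.components), default=0.0)

@dataclass
class SystemTwin:
    name: str
    assets: list[AssetTwin] = field(default_factory=list)

@dataclass
class ProcessTwin:
    name: str
    systems: list[SystemTwin] = field(default_factory=list)

engine = AssetTwin("engine", [ComponentTwin("piston", 0.2), ComponentTwin("gasket", 0.6)])
drivetrain = SystemTwin("drivetrain", [engine])
factory = ProcessTwin("car-assembly", [drivetrain])
print(f"{engine.name} health: {engine.health():.2f}")
```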

The different Types of Digital Twins – A Conclusion

At the end of this article one thing should be clear: we talk about higher and lower levels of digital twins here, but in reality each level is equally important for a functioning process. To uncover optimization potential and sources of error in your company, you need to switch between the individual levels, zooming in and out, so to speak. At the end of the day, a single component can be as important as the interaction of all units. It is just as important to monitor the individual screws, which can bring a machine to a standstill, as it is to monitor the entire process at whose end there is a finished car.


Curious? Talk to us!

The digital twin – an introduction

Digital twins are one of the most important inventions of recent years, and yet many people still lack knowledge of what a digital twin is, what it can do and what opportunities it opens up for a broad variety of industries. For today’s blog article, we have set ourselves the goal of changing exactly that.

What is a digital twin?

Let’s start with the first and probably most important question: what is a digital twin anyway? Well, a digital twin is a virtual model of a process, a product or a service. For simplicity’s sake, we will refer to the represented object as the “original object” in the following text. The digital twin serves as a link between the real and the virtual world, although this is not 100% accurate, because a digital twin can also be an image of a purely digital product. Furthermore, for the creation of a digital twin it does not matter whether the depicted product already exists or will only exist in the future.

The digital twin can use real data from sensors to virtually simulate realistic working conditions or machine positions. In this way, work processes can be analyzed in advance, sources of error can be avoided and wear can be reduced. This results, for example, in less downtime and slower wear of the machines. Such digital twins, which can map the life cycle of products, processes or services, are becoming increasingly necessary in more complex industries.

An easy example:

Let’s say you have a battery pack that you can use to charge your smartphone and other devices via a USB cable. If this battery pack has a digital twin, you can simulate in advance how often you can charge your devices with one battery charge and how long your battery pack will last with your current usage. The digital twin may also be able to detect errors in your use of the device that would sooner or later significantly reduce its lifespan. Especially for insurance companies, this should become an interesting benefit in the future.
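
For a feel of the numbers involved, here is a toy calculation in the spirit of this example; the capacity and efficiency figures are assumed for illustration, not measured values.

```python
# All figures below are assumptions, not measurements.
PACK_CAPACITY_WH = 74.0       # e.g. a 20,000 mAh pack at 3.7 V
PHONE_CAPACITY_WH = 15.0      # e.g. a ~4,000 mAh phone battery
CONVERSION_EFFICIENCY = 0.85  # assumed losses in the USB conversion

charges = PACK_CAPACITY_WH * CONVERSION_EFFICIENCY / PHONE_CAPACITY_WH
print(f"Roughly {charges:.1f} full phone charges per pack charge")
```

A digital twin refines this kind of static estimate continuously, based on the actual charging behavior it observes.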

How does a digital twin work?

In addition to features such as a clearly assignable ID, a digital twin requires three different elements: the original object, the digital twin as a virtual object, and the information that links the two. The original object has sensors that measure the data most relevant for optimization. This data is forwarded to a system where it is processed and evaluated. On the basis of this data, the digital twin can simulate future values and emerging problems, among other things, so that the processes and services it represents can be improved. There are two ways to feed the digital twin with data: with real-time sensor data, but also manually, in order to bring human expertise into the calculations.
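
Putting these three elements together, a minimal sketch could look as follows; the class names, the temperature signal and the naive trend-based prediction are illustrative assumptions only, not a product interface.

```python
from dataclasses import dataclass, field

@dataclass
class SensorReading:
    timestamp: float
    temperature: float

@dataclass
class DigitalTwin:
    twin_id: str  # the clearly assignable ID mentioned above
    history: list[SensorReading] = field(default_factory=list)

    def ingest(self, reading: SensorReading) -> None:
        """Data link: readings from the original object update the virtual state."""
        self.history.append(reading)

    def predict_next_temperature(self) -> float:
        """Very simple 'simulation': extrapolate the trend of the last two readings."""
        if len(self.history) < 2:
            return self.history[-1].temperature if self.history else float("nan")
        previous, latest = self.history[-2], self.history[-1]
        return latest.temperature + (latest.temperature - previous.temperature)

twin = DigitalTwin("pump-07")
twin.ingest(SensorReading(0.0, 61.0))
twin.ingest(SensorReading(1.0, 63.5))
print(f"Predicted next temperature: {twin.predict_next_temperature():.1f} °C")
```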

What are the benefits of a digital twin?

The digital twin can help a company in every phase of an object’s life cycle. In the first phase, which is mainly about research and the “design” of an object, a digital twin can be used to illustrate the diverse effects of different decisions. This is interesting, for example, for Formula 1 or aircraft construction, where even small changes to the outer shell can have a huge impact on aerodynamics and thus on the speed and fuel consumption of a vehicle. After research comes production: here, too, digital twins help to work more efficiently, with higher quality standards and higher yields. In the usage phase, availability can be optimized.
In the fourth and final phase, a topic that every company currently has to deal with in terms of corporate image and public perception comes to the fore: the recyclability of the products. Digital twins can help you to identify and implement re- and upcycling potential. In addition, the digital twin can reveal individual weaknesses in your products, which can then be eliminated so that your object can be used for a longer period of time.

A brief outlook

In the coming weeks, we would like to continue and deepen our discussion of the digital twin on this blog. You will get to know best practices as well as realistic application possibilities for your company. We will also show you how we, the team at Tributech, can enhance your digital twins to further improve your processes. Stay tuned.


Curious? Talk to us!
