
IoB: Merging Man and Machine

by Su Aziz

The 11th WIEF Global Discourse had three experts discussing Internet of Bodies: Merging Man and Machine for an hour on 25 May 2021. Here is a summary of the virtual session.

Professor Mark Findlay, Professor of Law, Singapore Management University, Singapore
Dr Ariffin Kawaja, co-founder, StretchSkin Technologies, Singapore
Leigh Howard, deputy commissioner to Southeast Asia, Victoria Government, Australia

Bobby Varanasi, founding chairman and CEO, Matryzel Consulting Inc, United States (moderator)


Internet of Bodies (IoB) was narrowed down to four key areas during the 11th WIEF Global Discourse on IoB, starting with what it essentially means. ‘[Most] of you have a smartwatch and are monitoring your heart, pulse rate and more. However, that’s just the basis. What we’re able to accomplish in terms of machines interfacing with humans is more than just pacemakers and pancreas implants,’ explained the session’s moderator, Bobby Varanasi.

Secondly, Bobby said, ‘is to understand what applications are currently in progress and which are new. There’s more ambiguity than we’d like to think.’ Thirdly, to address the risks around data. ‘All the data that’s being generated by the human body and the marketplace: how are we able to integrate and manage it, and how do regulations play a role? Lastly, it’ll be about governance of regulations,’ he added before passing the ball to the three experts.

Defining IoB
Professor Mark Findlay: IoB is really a process where technology is attached, implanted, connected or assigned in some way to a physical human, and from that, data is collected. There are two areas of interest for all of us today. One is the technological side, the implant, which often seems to dominate a discussion – the idea of having humans that are half machine, which we’ve seen a lot in movies. For me, what’s more interesting is the data produced, how it would be used and whether the person it’s taken from fully understands the nature of that data and where it ends up. The central issue we’re facing is this: the data produced through bodies goes somewhere and is used by someone.

Dr Ariffin Kawaja: IoB is essentially an extension of the Internet of Things (IoT). My startup is currently working on rehab data. There are different phases of IoB which we’re seeing right now. The first generation is your Fitbit and smartwatches that detect your vital signs, wellness and rehab data. The second generation will be implants, digital pills approved by the US Food and Drug Administration (FDA) in the past three or four years, artificial pancreases for diabetic patients, prosthetics and the ability to detect our cognitive capabilities.

Leigh Howard: What may surprise people is the amount of investment the Victoria Government makes globally in medical research and clinical trials. It’s a significant piece of business, particularly as we see the cutting edge of that work going into Melbourne, which is very much a life sciences hub. Some of the advances are quite startling. We’re seeing a lot of work around diagnostics, and that’s the application, whether it’s a permanent fixture in a person or something that’s ingested, passes through the system and sends data back. What’s really amazing are the brain–computer interface technologies that are emerging, because they can help people who have disabilities or disadvantages, whether through physical diagnostics or augmentation.

Data: Convenience vs Privacy
Professor Mark: Due to the potential for mass data sharing, the tech we use to produce that data is less and less segmented, and more likely to intersect. From that come some great positives. For instance, the wonderful capacity to mix medical diagnostics with other aspects of welfare and wellbeing. Lifestyle, educational development, even things like poverty and socioeconomic constraints can be built into the bigger picture of why humans act the way they do and what their potential is. The downside is that the more data we have and the more we share it, the more responsible we’ve got to be. So, this is the really challenging balance between the excitement that access gives us in terms of what we can do and the very pressing responsibility to care for and treat that data with respect.

Dr Ariffin: Within a minute, you can have so much data but how do we actually manage this data? What kind of carbon footprint is actually needed to manage it? That in itself is a social issue.


Leigh: I look forward to the day when we’re all doing diagnostics at home and sending the information to a medical expert who can give us a read on it in real time, rather than physically going to the person who does the interpretation. That’s the basic utility for consumers, but with that data out there, no data is private and that’s the concern.

Professor Mark: Two issues have come out of the discussion: diagnostics that tell us what’s wrong with us and what we could do to make ourselves well, and the introduction of technology that changes us fundamentally. There’s a moral, philosophical question associated with this, which isn’t just about spiritualism or about various ways of seeing what’s right and wrong.

The question is, why do you want to own data? The usual answer in court is because I want to sell it or make money out of it. There’s another side to that argument: I want to protect its integrity, or I don’t want everyone using it and distorting it. But there’s a big debate going on now in data science about what data is. You’d think data scientists should have worked this out before, but they haven’t. What is data, and to what extent can it be monetised? Hence, the discussion about it in the courts.

Next, what is data to be used for? There are clearly two camps: one says data is for public good, the other says it’s for commercial purposes. My observation is that often in this discussion about ownership, the data subject is completely ignored. An interesting point is about end-user consent: many of the big data platforms believe that the data subject’s consent is sufficient to allow them to do whatever they like with the data. Any lawyer will tell you that if consent is a condition of accessing the platform, then it isn’t real consent. So, the reason I raise this is that if we’re talking about data, we have to talk about, first of all, what it is, how it can be contained, whom it can be used for and who can access it.

IoB Concerns
Leigh: As a public servant, one has to observe the highest standards of data privacy in pretty much every event, activity or work stream we undertake; observing those standards is a big part of our existence. A really important part of our existence and our work is where we’ll see the application of IoB.

What we’ve seen in the last few decades, though, is the rise of consumerism and entertainment as a significant force in modern society. So the other application of IoB, I daresay, will be around entertainment. I don’t think mobile phones, as external devices, will last forever. We have this really interesting trade-off because people are willing to put aside the enforceability of the consent they provide. They’re happy to do it, and do it willingly, seeing it as a trade-off for entertainment and convenience. So, I daresay that’s where we’re going to see mass-market adoption of IoB.


Dr Ariffin: Once the data is out of the body, it’s pretty much the same as any other data: the moment it’s on a cloud or on some server, it goes through the typical security processes. The risk is in transmission. If somebody has sensors, they can biohack into the system and read the data off the person, because you’re transmitting within a range of about five metres. Some work is being done to limit this to within 0.3 to 0.5 metres so that it’s not easily captured by unauthorised individuals. There’s a lot more work to be done on security for IoB, and investors are now looking at this upcoming area.

I spent about a year and a half collecting data in a hospital in Singapore, and people don’t ask questions. Education is really important. They should know what the data is about and how it’s being used. It should be articulated to them simply, so that their consent is given with understanding.

Professor Mark: I agree. Many patients don’t ask questions, partly because they don’t think they can. Most of us have gone into operations and been asked to sign consent forms, and the last thing in the world you want to think about is the conditional clauses. However, if you were confronted with the fact that your personalised data may, in fact, go beyond the doctor or the hospital into quite a variety of different places, then you’d be concerned. Look at the debate about AI-assisted surveillance: people are genuinely worried. They’re worried about what data is coming out of the track-and-trace device they’re wearing, who is receiving it, who is using it, what it’s going to show and how long it’s going to be around.

If we worry about hacking or the illicit accessing of data, the first question is: will the law do anything to help us? Basically, the relationship between law and information has been extremely vague. Where it is defined, it usually concerns misuse by governments of information they already have. The second point, which is very important, is that the law has lagged behind on the concept of what we do with information. We don’t work on health that much, but certainly in areas of mass data sharing beyond IoB there are areas of really significant concern.

A Holistic Approach
Professor Mark: There’s a big problem in countries like America, where there’s basically very little effective public health provision. If you’re not in a public health system, you’re essentially relying on the private sector to supply it, and we have to ask what the private sector’s motives are. Now, if their motives are genuine in terms of refining their risk data, then that’s a great step forward, because actuarial risk and insurance is one of the clumsiest areas of commercial enterprise.

The next question is: just because we have the tech, should we always use it for every purpose? There are arguments that tech designed to make people well shouldn’t then be used to make insurance companies richer or to give them more information. There’s obviously a call for it, but it could only be advanced as a general policy if the insurance industry put up a self-regulatory framework that people accepted. So, the regulation should come first, not last.

Dr Ariffin: There’s a startup in Singapore that was trying to work with the local insurance companies. It wanted to collect data and present it to health insurers, giving it some kind of purpose: not only informing premiums, but also funding vouchers to motivate patients to improve themselves. Unfortunately, the insurers haven’t taken it up. They’re looking at each other, wondering who will start the initiative. I recall a particular startup in London, known as ‘strip coins’, that basically encourages individuals to sign up and exchange what they earn for certain stocks on a stock exchange. An initiative like this would involve insurers in motivating people to, for instance, exercise. It’s a whole ecosystem, and you need buy-in from stakeholders: not only insurers, but patients, healthcare providers, government and family support. It has to be looked at as a holistic initiative, rather than just something for insurance companies.

Leigh: A big area to look at is everyday considerations. We’re not talking about data going into some sort of nefarious dark-web usage; we’re talking about front-facing corporate landscapes and day-to-day applications. Many of us now have apps from insurance companies that reward us for doing good things but also keep tabs on where we’re going and what we’re doing. So, there are other by-products of sending information to an insurance company.

Neural Implants: Synchronising the Brain and AI
The question is, would it enable an organisation to control human beings in the future?

Professor Mark: Climbing on the back of the research we’ve done about surveillance in the workplace, there’s nothing new in this. A Grab or delivery driver doesn’t have an implant, but he has an app on his phone that can trace him and make him do whatever they want him to do, or he doesn’t get to drive. Now, the interesting point is the residual fear of somebody out there using an implant to turn us all into robots. Although the question’s incredibly important, it becomes background noise if we do two things. First, clearly rehabilitate the narrative and help people understand the plus points as well as the things they can engage in. Second, be much more egalitarian about this and ensure that AI doesn’t just sit in America, Europe, Singapore and the rich parts of Malaysia, but gets out there so that people with genuine access as well as needs can have something to do with it.

This takes us back to my point about data being used for the greater good. Never forget, one of the great criticisms of the big pharmaceutical companies is that they’re in it for a different purpose. What we should take away from living through this COVID-19 pandemic is that if the tech is out there, let’s make it available to everybody.

Leigh: There’s more good than bad in this. One example of implants, from a university in Melbourne, allows people who suffer from nervous system diseases or paralysis to externally control devices. They can zoom and click, with about 90 per cent accuracy in terms of controlling prosthetic devices. It’s internet-enabled because it allows them to send signals from their brain to the devices they control. This betters their quality of life.


Last Words
Professor Mark: There are two things we’ve got to learn. One, we’ve got to use technology like this to better predict vulnerability, whether against global health risk or environmental risk. Tech is capable of doing this, and not just for one population of humans: knowing something about temperature, stress, this sort of thing is extremely important. Two, we often talk about the top end of uptake, and yet there’s a lot at the bottom end that would make a massive difference to diabetic populations in poor countries and to people who need basic heart prediction, such as cholesterol monitoring. The tech is there. It’s been there for 10 years, maybe, and yet it’s never actually been circulated among the populations that actually need it.

If we learn nothing else from this pandemic, one thing’s certain: we’re in this together. We can be as hard-nosed about this as we like, but if we don’t open up the possibilities of technology to those who can’t afford it, then the world is going to be a much lesser place.

Main photo by Comfreak from Pixabay.

2 Jun 2021