Is Your Personal Medical Data Being Sold? - Interview With Jay Glasgow
In 1996 the U.S. Government passed the Health Insurance Portability and Accountability Act, which was intended to serve three purposes; its main purpose, however, was to protect sensitive health information from being disclosed without the patient's consent or knowledge. Fast forward to 2023, to a post-pandemic world where, from the time you wake up until the time you go to bed, personal data about your health, finances, and everything else in your personal life is being monitored, as well as bought, sold, and exchanged by government, businesses, and big tech 24 hours a day, 7 days a week. James Egidio, host of the Medical Truth Podcast, interviews Jay Glasgow, CEO of Privacy.coop, and discusses how you can stop these entities in their tracks from compromising your personal data, and how his company can assist in compensating you. For previous and current podcast episodes and show transcripts, go to www.MedicalTruthPodcast.com
Meet The Host
Episode Transcript
James Egidio:
Hi, I'm James Egidio, your host of the Medical Truth Podcast, the podcast that tells the truth, the whole truth, and nothing but the truth about the American healthcare system. In 1996, the US government passed HIPAA, which stands for the Health Insurance Portability and Accountability Act. These laws were intended to serve three purposes; however, their main purpose was to protect sensitive health information from being disclosed without the patient's consent or knowledge. Today, in this brave new world, you cannot get into your vehicle or go to a hospital or doctor's office without having your private health information collected and exchanged for all to see, especially two and a half years later, in a post-pandemic world. My guest is in the business of protecting the public's personal health information and privacy from being breached for commercial purposes with his company, the Privacy Co-op. It's an honor and pleasure to have as a guest on the Medical Truth Podcast, Mr. Jay Glasgow.
Jay Glasgow:
Thank you very much. It’s a pleasure being here.
James Egidio:
So Jay, just tell the listeners and viewers of the Medical Truth Podcast a little bit about your company, and what are some of the ways, medically speaking, that Joe Public's medical information is being breached or even shared for commercial purposes?
Jay Glasgow:
Sure. First of all, we're the Privacy Co-op. We're a cooperative association; we're owned by our members, and your listeners can become members. We have a one-time fee of $25, and that's a lifetime membership, so what they're actually doing is buying a share in a co-op. Your listeners and viewers may be very familiar with co-ops; there are farmers' co-ops and electrical co-ops, and they may even be members of one. We're the first authorized agent that is a co-op, so we're owned by our members, and our strength is in a couple of pieces of paper. That's really what's critical to our entire conversation today. First of all, we have a piece of paper that is a membership agreement, where people agree that we will represent their information rights. You own your information rights, and we're going to go into that in some detail on this show. That basically gives you control over what people can and can't do with your data, and that's all data, right across the board. Whenever a company has a privacy policy, they're admitting that you own the rights to your information. It's a contract they make, along with their terms of use, and it's legally binding; it's stating that you own the rights to your information. So that's medical and otherwise. The other piece of paper we have is our affiliates agreement, where businesses agree to honor our terms and conditions for a change. So we turn the tables around. It's called a reciprocal agreement in legal parlance, and it basically says: hey, we want you to agree to our terms and conditions if you're going to use this data for secondary purposes. "Secondary purposes" is a big catchall phrase; some people parse it as meaning data sales, but we think of it as much broader than that. It's any data use in a privacy policy that says, we're going to use this data beyond its primary intended purpose.
So you ask a great question here, and that is: what's going on in the medical industry with this, specific to COVID tracing? I don't know if you have any follow-up questions on that, but the big thing is that HIPAA, which you commented on at the beginning, you view primarily as data privacy regulation. I don't know that everybody in the industry views it that way. They see it more as a health insurance portability act, making sure that you can move from health insurance to health insurance, or that your health insurance can move from hospital to hospital. Part of that is the data that moves along with it. But not all data that's collected, medically speaking, is covered under HIPAA. It's a very narrow strip of data that the regulations cover. So there's a lot of metadata, that is, data about who, what, when, where, and why the healthcare is being offered, that's not covered by HIPAA, right? Some of your audience might be surprised by that. So you have attorneys working for people in the medical industry, and they're very carefully parsing this out and saying, this specific data must be HIPAA compliant. Which, again, if you're not enforcing those regulations, I don't know if that's an amazing regulatory protection, but let's assume that they are. But then this other vast amount of data that they're collecting does not need to be HIPAA compliant. And so that data is largely being monetized. And we can dive deeper into any aspect of that you'd like.
James Egidio:
I go into a doctor's office, or I go to the hospital, let's say, and I'm in there for, let's say, back pain, or I'm there for whatever. So what kind of data gets collected and disseminated or shared in that situation?
Jay Glasgow:
Right now the doctor's going to have contracts in place to protect them. Insurance is going to require that anybody they deal with, medically speaking, with data, has contracts in place. So if you read their privacy policy, it's likely going to say: we're not going to share your data with anyone without your express, written permission, other than... and there are going to be several carve-outs, right? So you might see words like "our associates" or "services" and so on. Where it gets very muddy is that with these layered contracts, you have no visibility. You don't get to see the contracts that your healthcare provider signed with 37 other companies, right? And without being able to know who those companies are, without written requests, there's no way for you to go and see what their privacy policies are. So once they become custodians of data related to your case, you don't know how they're going to use it, right? So the doctor is saying, hey, I'm not going to do these things except for these carve-outs, but you have no visibility into those carve-outs. There's a really interesting legal case that happened a long time ago. It involved a cow called Rose the Second of Aberlone, and it was where one man sold a cow to another man. They put a contract together and said, here are the terms of the contract: I'm selling him my cow, and this is how much money it is. Now, neither person in the arrangement knew that the cow was pregnant. And that changed the value of the entire transaction for a lot of different reasons: the reason the buyer was buying the cow, the reason the seller was selling the cow. So there was a lawsuit over this. It went before a judge, I believe it was in Illinois. Now, everything bad comes out of Illinois when it comes to judgments. No, I'm just joking about that. In the Rose the Second of Aberlone case, the judge basically ruled the terms of this contract unconscionable.
That's the word that's used, because both parties could absolutely not understand the ramifications of the agreement. So he judged this contract to be void ab initio, which means void from the very start, and threw it out of court. So there was no contract, right? So what medical providers are facing is a really difficult task. If they're trying to do the right thing, they're told by their lawyers, you have to have a privacy policy. They craft it in a way to protect themselves; they're likely not interested in protecting you all that much. And so they're trying to push off risk and liability onto these other healthcare data collectors, and they're saying, hey, other than with our partners, we're not going to share this data with anybody else. And you have no visibility into that. So it's a bit like the pregnant cow that nobody knows is pregnant. You actually don't have any clue how that data's going to be used, and neither does your healthcare provider, right? Because the contracts that they're signing have other subcontracts, right? So it's impossible for anybody doing this today to know all the ways that this data is going to be used. So there's an argument to be made that all of these privacy policies could be ruled void ab initio, right? But within the context of what they think they're doing, they feel that they're ethically and legally sound. In doing so, they're passing this data on. Now, you're asking what kind of data. There's an interesting way of looking at this for your viewers. If I'm a reporter and I'm out at an outdoor cafe, and across the street I see an IRS office, a field office, right? And somebody famous comes along. I'm not going to mention any names; I don't want to pick on anybody. Somebody famous comes along, they go into the IRS office, and I see them go in. And then I see them come out, and they look very nervous, very scared.
They go next door to an ATM, they take out a big pile of cash, they go back into the IRS, and then they come back out crying, right? And they go the other way. Now, I've got a lot there that I can report on without risk of being guilty of libel. I can say that this person went into this office, that they looked very nervous, that they came out very upset. They went to their ATM, they took out a pile of cash, went back into the building, came out without it, and they were crying. And I could have my readers pull all sorts of inferences from that without any risk of slander or of libel. So in this case, you start to see what metadata is, right? Metadata is all the facts around the thing. I don't actually know specifically why they went into the IRS, but I know all about all the other stuff. When it comes to medical data, there's a lot of data about all the other stuff, and it may not be your exact hemoglobin count, right? It may not be your red blood cell count. You may not see that data go, but what you're getting is a ton of other information that feeds what are known as insights. And hospitals, and the very large organizations that facilitate medical activities, have discovered that this is incredibly valuable information to sell to companies that are creating artificial intelligence. They need to feed the data into them so that they can improve their number crunching, right? How they come up with their insights. And when they take money for it, the way they pass the information may not be just straight data going from one party to another. It might be that the other party is saying, I want to understand more about the people whose data you own. They're not paying to buy the data; they're paying for insights into the data. It's a little word play that a lot of people in Silicon Valley do so that they can very quickly say, we do not sell data. But they all do, right? That's exactly what they're all doing. Bottom line, they're selling data.
James Egidio:
Yeah. And what would be the reason for this data collection and this exchange of information? Let's say I'm a patient. I walk into, say, a hospital or doctor's office. What's the reason why they would want to collect this data, and what would they use it for?
Jay Glasgow:
Sure. And this might surprise some of your viewers. So what I want to do is answer that along a couple of different wild, crazy trajectories, where you might wonder where he's going with this; but when you put them all together, you start to understand it better. A lot of facilities have heat-seeking technology where they can track, by heat signature, who's walking down a hallway. It's a security device, and they can tell individuals apart. They don't know who you are by your heat pattern, but they can tell that this is the same person as they move through a building, right? So as you're moving through a department store, as you're moving through a hospital, as you're moving through a stadium, they can track individuals. They don't know who that person is, but they can say this is the same person that was just over here. They've moved over here, they've stopped in front of this particular advertisement, they paid attention to it. They read a poster, they spent time looking at something; something caught their attention. Now, if you're a pharmaceutical company and you have little pictures up in doctors' offices or in hallways, would you be interested in where particular people pause? In what catches their attention? If you spent a million dollars on an advertising campaign, you want to know how effective it is, right? So you're going to have certain colors, certain fonts, certain wording; you want to see more and more people pause in front of that messaging. Right. Now, you couple that with telcos, telecom companies that are selling big data, and they may go, hey, this security company wants to combine data with a telco, through a process called big data, and better identify who that person is by the cell phone they're carrying. Now, once they connect it to the person, they can then go look at the other data they might collect from the metadata of your visiting an office; there's an appointment calendar.
You told your doctor to send you this stuff through Gmail. If you really want to, you can take the data from Gmail, combine it with the telco information, combine it with the heat-seeking information, and now you start to get a little bit better picture: this person has this prognosis, this kind of treatment, they've got this kind of disease, this kind of problem. They entered the hospital, and they walked past this particular advertisement, and they stopped and looked at it for five minutes. That might be incredibly important for that pharmaceutical company, and all of a sudden you might just end up with a coupon coming through your text messaging saying, hey, here's a coupon for that medication you need. And a lot of people think, wow, they're listening in on my device, because I just said the name of that medicine, and a second later I started seeing banner ads. That does happen, right? But so do all of these other metadata collections that are going on through our hospitals, and it's a multi-billion-dollar industry.
James Egidio:
We had a conversation a while back, and you mentioned the exchange of information where, when you simply get into your vehicle, data gets collected. I'm sure a lot of people out there that are listening and watching don't know anything about that at all, but explain a little bit about that as well, because it is semi-related to the medical data that's being collected, which, like you had mentioned, includes body weight. But go ahead, I'll let you take it from there.
Jay Glasgow:
No, sure. It's an interesting story, because I was actually at a convention. Pharmaceuticals were represented, and we were doing our typical Privacy Co-op thing, so I was talking to people that were passing by, saying, hey, you need to pay attention to how your data's being used. And this guy stopped and he started talking to me. He said, what, how's my data being used? And I said, what kind of car do you drive? And he said, a Toyota. And I said, that's interesting, because Toyota collects your GPS information: where you drive and how fast you drive and all that kind of jazz. And they sell it. And he said, Jay, I'm an architect for Toyota, and I actually designed the dashboard computer system, and I'll tell you right now, we're doing a lot more with the data than just your GPS data. And I said, oh really? What are you doing? And he said, every time you get in your car, you have to be weighed to determine whether or not to activate the airbags, right? And I nodded. And he goes, yeah, we're selling that data. And I said, to whom? And he goes, your employer pays your health insurance, right? And I said, that's right. And he goes, we sell your weight trend analysis to your healthcare insurance company. And they collect that across all the employees. And this is something that Toyota started doing years and years ago. All the other automobile manufacturers figured out that there's a lot of money to be made in it, and they went, holy cow. And last year, or two years ago rather, they made 16.8 billion selling that trend analysis to buyers, right? Like healthcare insurance providers. So your insurance for your team at work might be going up and up if everybody's gaining weight, right? But the health insurance company doesn't explain that to your employer. They just say, hey, your rates are going up, and you have no idea why.
But that's why. And if you go and try to get a competitive rate, it's not really competitive, because all these insurance companies are buying the same data, right? So you just go, wow, I guess that is the rate, right? Meanwhile, there's a business right next door that might have a better weight trend analysis of its employees, and they're getting a better deal than you are for the same amount of coverage and for the same number of people, right? So this is a way that your viewers are getting screwed, right? But the manufacturer of the car they are driving is making money in the process. Now, your viewers aren't at a loss. There's something they can do, right? The car company tells you in their privacy policy: we're going to sell the data that we collect about you, unless you tell us not to. That's where you go to an authorized agent, like the Privacy Co-op. There are many others out there; it's a new industry that's starting to grow, and most of us are non-profits. And the idea is that our lawyers are just going to send Toyota, or whoever your car company is, a nice letter and say, hey, stop selling their data until you agree to our terms and conditions. So this is a win-win for the people who understand this. But James, there's a broader, foundational legal idea at play here, and we talked about it at the top of the conversation. When you go through life, you can't help but create data. You do it every second. Some philosophers say that the entire universe is made up of ever-expanding information, and depending on your religious view, that can be an intriguing perception, right? That this is just information that grows and grows, has no end, massive amounts of information, right? But pragmatically: you go through a grocery store line, you're generating data. You go through a turnstile in a football stadium, you're generating data, whether there's even a computer system hooked up or not.
Most of that data is meaningless to computers and humans, but when a human being looks at this data and perceives value in it, it instantly becomes information, right? So data plus meaning equals information, legally speaking, and you can't have information without concomitantly creating information rights, what the courts call rights equivalent to ownership. A good way for your viewers to get their brains wrapped around rights equivalent to ownership is to make it very simple: you can't own water, but you can own the rights to water, right? If I go into a court of law with a cup of water and try to argue that I own the water in the cup, I'm going to lose that argument. But if I go to the courts and say, hey, I've got a lake house and I have the rights to 15 hectares of water past my property line, that's an argument that will stand, and it's been around for hundreds of years. The exact same thing is true for your viewers for data rights, for information rights. You own the rights to the information contained within all the data you're creating, medical or otherwise. So what we really have to get down to is just people understanding, number one, that you own the rights to all of this data, and not all of it is covered by HIPAA compliance. So stop thinking that you're safe just because it's a medical procedure. And then understand that there are businesses that are going to monetize this, sometimes to your detriment. So it's a really good idea to start getting plugged in and understanding how these businesses are using your data. And it's not a complicated thing at all to tell businesses: stop. I just want you to stop using my data for all of these secondary purposes.
James Egidio:
Yeah. Where is most of the data being shared or intercepted in the medical industry?
Jay Glasgow:
I wish we had a single point of attack. Wouldn't it be grand if we could say, if we could just hit this one data broker with a cease and desist order, we would stop it all? But as I've laid out for your viewers, it's everywhere. So let's take this as a process for a patient that might have felt a lump in their breast, for example, right? They say, oh gosh, I have a lump in my breast. So now they're calling, they're probably checking with their insurance company: who do I go see about this? The whole time, their phone could be listening in. We know that's true. If your viewers want to learn more about that, you can go to settings on your phone. Depending on whether you've got an iPhone or an Android, there's a setting called Privacy right at the highest level within iPhone; within Android, I think you have to go to Settings, and underneath there's a sub-setting for Privacy. Go into Privacy and look for Microphone. Okay? When you find Microphone, on all phones today, go into it, and it will tell you how many apps are listening to you right now. And you might be stunned. Whenever I talk about this in a public forum, most people have an average of 30 to 35 apps on their phone that are listening 24/7, right? You can go in and shut them all off. But there's an interesting medical story that came out a year and a half ago: artificial intelligence was developed in hospitals for people that have tracheotomies, to read their lips when they can't speak. It's 85% accurate even when there's no sound coming out of your mouth. So if you say something like, I want water, and you can't voice it, the computer will be able to say, hey, they want some water. It's near a hundred percent accurate when it comes off of one of these cameras on the back of your phone. So even if you shut off all of the microphone settings, there are still cameras in the room that are reading your lips. Okay, so it's coming from there.
It's coming from the clinic that you've talked to. It's coming from the mammogram that you get. It comes from the people doing the lab results. It then comes from the fact that you now start looking for somebody that does a sonogram, to go even further into understanding what you have going on. It comes from the fact that you're now talking to an oncologist, possibly a surgeon. It comes from the fact that you've plugged into a particular medical group. Your insurance company is facilitating this, and they're all using, I don't know, Outlook, or Calendly, whatever the calendar system is for all of the different players involved with this. And they're saying, how do we get in touch with you? And you say, use my Gmail account. Gmail is now involved. You're driving back and forth in your car; by the way, your car does sell your GPS data. The number one buyers for that are automobile manufacturers that are building autonomous vehicles. They're paying for GPS data to train their artificial intelligence for driving cars, right? So you're generating data at a thousand points a day when it comes to a medical issue, and we haven't even gotten halfway through your process. And so let's say you want to start talking to your family and friends. You go to CaringBridge, you set up a CaringBridge account so you can start letting them know how the chemo is going, right? CaringBridge has another privacy policy, and it just goes on and on.
James Egidio:
Geez. It's like a multi-layered monster.
Jay Glasgow:
It is. It is. And the data brokers: last time we mapped it out, there are over 450 major data brokers. These data brokers are taking data from all of these different places and trading it like playing cards. You know, when you were kids and you had baseball cards, and you would start saying: got it, got it, need it, got it, need it, got it, need it. They've built algorithms that go through all of this data for the data brokers until they basically come up with a homogeneous set, and Google is right in the middle of it all, right? So is Apple. We like the acronym FAANG for remembering probably the companies you want to opt out of first, right? That's Facebook, Apple, Amazon, Netflix, and Google. And let's throw Twitter into the mix, right? That's FAANG plus a T, right? You want to opt out of those companies, because that is as close to an intersection point as you can get to frustrate this ecosystem. But some of your viewers are probably saying: who cares? I'm not doing anything wrong. Why do I worry about this? And what I tell people when they say that is simply this. That's true; let me put it in your words. Let's say you had a second bedroom, and you never used it for anything bad. Who cares if somebody sees your second bedroom? You have nothing to be ashamed of if somebody sees your second bedroom. But if six months from now you find out that Airbnb has been renting out the second bedroom in your house without your permission, would that bother you? Of course. They've been making money, and somebody's been sleeping in your second bedroom now for six months, and you didn't know it. You didn't have anything to be ashamed of, but would it bother you in any way? That's the way we need to view this data use that's going on in the medical industry. Businesses are using it, and sometimes for some very good reasons, right?
A lot of medical analysis, all of that trend analysis when it comes to breast cancer, has led to some amazing breakthroughs, right? So you might say, I want to allow this, but I do have a problem with somebody sleeping in my second bedroom and not paying me. So there's a way to fix this, and that's simply to opt out with the companies you're engaging, until they start cutting you in. That's called licensing your information rights. And you can do that through authorized agents like the Privacy Co-op, and as I've said, there are several out there.
James Egidio:
How long has this been going on? When did this really get ramped up?
Jay Glasgow:
That's a really good question. There's been fear of this since before HIPAA. You cited 1996; that was also the year of the Communications Decency Act. The Communications Decency Act set up a lot of things, including Section 230, which some of your viewers might know about: all the social media websites hide behind it and say, you can't sue us, because we're doing this service for society and we're protected. But it goes all the way back to FERPA in 1974, when data was pieces of paper in manila folders, right? So this has always been a concern: that people could use data from different places, even before the internet, and monetize it, and do it without your ability to understand it or take action. So it's been around for a long time. But if you ask me when it really kicked in, I would say this. Do you remember when there was a little company called Yahoo? Of course. And everybody had a webpage on Yahoo. Sure. And if you remember, it was very cluttered, right? You'd go to Yahoo and they'd show you: here's your weather, here's your sports, here's some shopping ideas, here's the news headlines, here's your horoscope. It just went on and on. It was, if you remember, a jumbled, crowded mess. And what Yahoo was trying to do is take the person that came to Netscape when the internet was brand new, in 1995. It had been around for quite a few years at that point, but everybody feels that was the genesis. You went to Netscape and you just looked at a little box, and you didn't know what to type in. So here comes Yahoo, and they're just showing you everything: go look at all this data. I don't know if you remember this, but right underneath the search box of Yahoo, it used to say, powered by Google. And everybody,
James Egidio:
I remember that.
Jay Glasgow:
I wonder, I wonder who that Google company is. That's funny. You just typed in a search, and it would magically bring up a lot of websites for you to look at. What stunned everybody was that about five years later, Yahoo was one of the largest companies in the world, and this little tiny search company, Google, came out and said, hey, we're going to offer a webpage, and it's just a search box. There's not all this jumbled mess on the screen; we're just going to give you a clean search box. They went back to Netscape. They said, you've figured out the internet, and that jumbled mess is bugging you. We're just going to give you a little search box. And from that day forward, Google gave you free email. They gave you a free phone number if you wanted it, Google Voice. They would give you free apps. And people were like, wow, all this stuff is free. This is great. I can have my own calendar, I can have my own this, my own that, and it's all free. It's all free. This is phenomenal. That's where it really started. When a company like Google says, we can spend billions on research and development, on software, apps, engineers, graphic artists, buildings around the world, and we don't have to charge you a penny, there's a phrase we like to say: if you're not paying for the product, you are the product.
James Egidio:
Wow. Wow. They did it in, I guess you could say, a sneaky way. And it became, like you said, a big data collection center, between Facebook, Google, and all the rest of them. Let me ask you, getting back a little bit into the medical thing: I think we spoke about this post-pandemic change as well, with data collection. It's been even more ramped up. You had mentioned post-pandemic; explain a little bit to the audience about that.
Jay Glasgow:
Yeah. Personally, I think there's a lot of memory suppression going on right now. And first of all, it's not a matter of belief: COVID was very traumatic, right? Trauma with a capital T, right? Talk to experts in military training who talk about the amount of intense training soldiers go through to anticipate the levels of adrenaline that get shocked into their system during warfare. We weren't prepared. I didn't go through bootcamp; you didn't go through bootcamp. There wasn't a bootcamp for COVID. Nobody prepared us. If anything, we'd been preparing our children to expect that disaster's going to happen any day now, and they've been hearing that since they were in kindergarten, for one reason or another. There's a lot of doom, and the future looks bleak. So if you're prepping a group of people to say the future's going to be very bad and there's precious little you can do about it, and then suddenly one day they come home from school, like the third week in January of 2020, and they start hearing newscasters that they trust say the end is here, the world is ending, there's a hit of adrenaline. And they got hit over and over again with this adrenaline. It's a very big trauma that everybody went through, and when that hits you, you're looking for any relief. You have to start looking for the light at the end of the tunnel and hope that it's not a train heading your way. And so what I started hearing at that time, and this was late February, we were just starting to hear people say: we're going to shut everything down. You're going to go home and you're going to stay indoors for 15 days to flatten the curve. I don't know if you remember that phrase. Yes, I remember: 15 days to flatten the curve. We came through the 15 days, and it turned into 21 days, and then it kept going, because here in Texas the total time of complete, government-mandated shutdown was nine full weeks. It was very short.
We're one of the shortest in the country. So you had people that were shut down nine months or longer, but for Texas, we were at nine weeks. Okay. If you go into this thinking 15 days and all of a sudden it's nine weeks, you start getting really, well, the adrenaline's really hitting, and you're paying a lot of attention, and you're calling up your Congress people. I don't know if your viewers remember doing that, but I do. Yeah. And saying, what's going on? At a certain point I started hearing this thing, we call it the three Ts. It started surfacing in the middle of that, and it was: listen, we're going to try to get back to normal, but to do it we need to test, track, and treat. And they'd say it just that quick, wouldn't they? We need to test, track, and treat. As soon as we get an ability to test, track, and treat. Test, track, and treat. Now, I started hearing this over and over again, and at first I was comforted by it. I was like, yes, we need to test, track, and treat. If we can do that, that's wonderful. And then, I guess it was the fact that I'm the CEO of the Privacy Co-op, eventually I said, hey, hang on. Wait a minute. What's that middle thing again? I get the test part, that's important. Let's look at the other side of that. Treat. Treat sounds good. I think we should treat people. That's a sandwich, right? So we've got test and treat, that's two pieces of bread. Love that bread. But what was the thing in the middle? Track? Did you say track? Was that the word you just used? What do you mean, track? What are you talking about? The Privacy Co-op started tracking the trackers, and we started running data for people to understand. Thirty-five states jumped out there and immediately hired contractors outside of their own state health systems to start doing COVID tracking, or COVID tracing, as it finally evolved into.
Apparently they caught on that track was a bad word. There was a newspaper article in the Houston Chronicle that actually said they're going to track you down, and they meant it in a positive way. And I was like, that just doesn't sound good. I don't want anybody tracking me down, right? But what they basically meant, James, was: if you test positive. First of all, if you tested for COVID, do you remember driving into a line somewhere to see if you had COVID? You'd have people with the swabs, and they'd shove it up your nose until your eyeball popped out. They would make you sign a form, and people were scared, James. People were genuinely fearful for their lives, and they would say, sign this. I remember taking the form and turning it over, and it was like four-sided, if you could do that with a piece of paper. No, it was a double piece of paper printed front and back, both sides. I don't know what's easy to read, 11-point font? I think this was like 0.05. I had to have a microscope to read this stuff. Okay. What I agreed to in Texas, and you had to agree to it to get the test, was not saying that you were going to hold the medical provider blameless if they screwed up your nose and popped your eyeball out. It was not that you would not hold the test kit company liable if their test got it wrong. It was all the ways they were going to use your data. It was massively jam-packed with this, and it was basically saying that in your state, your healthcare organization, as an effort to test, track, and treat, was going to turn this data over to collection agencies that were going to put this stuff together and find out all the people you'd come in contact with in the past two weeks. Okay, that may sound good.
But when we really started doing some hard sniffing on this and digging in, what we found out in the majority of states is that they hired a bunch of people at minimum wage to open up an Excel app on their own laptops, because they didn't have the machines to go around, and use Gmail to send the data back and forth. They would get your name and your phone number, and some of us got these calls that would say, hey, I'm with the Texas test, track, and treat group, or whatever they called themselves. Can you remember who you've come in contact with in the last two weeks? Because you just tested positive for COVID. And people were answering: I went to the PTA meeting last week. I talked to a Sally somebody. You want to talk about garbage data? This data was garbage. And if everybody's collecting this, if it's a bunch of people being paid minimum wage, and they're doing it in different ways, they're just writing "Sally, they went to PTA," they're putting it in a spreadsheet, they sent it through email. So it was a lot of stuff, a lot of good words that sounded good, and a lot of crap data work. One of my friends calls it data janitor work, where you have to hire a data janitor to come and try to figure out how to parse all of this stuff. But the real thing that's going on here is that you were giving permission for your medical records to be shared with whoever this organization was, and they sell your data for profit. So with all of the spreadsheet stuff, you talk about data leakage all over the place, it's frightful. It was terrible and it wasn't well run. And all of the statistics we saw showed that where states outsourced this to other companies, their numbers went up higher than in the states that tried to keep it within their own organizations. Alabama had maybe one of the best approaches. They relied on their own healthcare system, and when they needed more bodies, they went to UAB, the University of Alabama at Birmingham, which is a medical university.
And they hired people that were like doctors in training, medical students, and they understood how to take care of data and the science of it all. So Alabama had a really good go of this. Texas, oh my gosh, signed a contract for 300 million dollars with an outfit called MTX Group, which largely exists to sell data. I don't know why they won over competitive bids from companies like Cisco and AT&T, but there was a lot of hullabaloo made over it, a lot of stir. And we're not at the end of the story yet. I think the three-year contract is still going. But my understanding is they mitigated this, James. They came out with a puff piece about eight months ago where the CEO of MTX Group says, we didn't get near 300 million, we only made 65 million so far. And I was like, only 65 million to take our data off our hands, and then sell it, right? Yeah. It's just remarkable.
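[Editor's note: the "data janitor" problem Jay describes above, free-text contact notes typed into ad-hoc spreadsheets by different people in different formats, can be illustrated with a short sketch. All names, rows, and the CSV layout below are invented for illustration; this is not any state's actual tracing data or code.]

```python
import csv
import io
import re

# Hypothetical sample: three rows referring to the SAME caller, typed
# three different ways by different minimum-wage tracers (all invented).
RAW = """caller,notes
Jay G,"Sally somebody - PTA mtg last week"
J. Glasgow,"saw SALLY at the PTA; also grocery store??"
jay glasgow,"PTA meeting, talked to Sally"
"""

def normalize_name(name: str) -> str:
    """Crude cleanup: lowercase, drop punctuation, collapse spaces."""
    cleaned = re.sub(r"[^a-z ]", "", name.lower())
    return re.sub(r" +", " ", cleaned).strip()

def parse_contacts(raw_csv: str) -> dict[str, set[str]]:
    """Group free-text contact notes by a normalized caller key."""
    contacts: dict[str, set[str]] = {}
    for row in csv.DictReader(io.StringIO(raw_csv)):
        key = normalize_name(row["caller"])
        contacts.setdefault(key, set()).add(row["notes"].strip())
    return contacts

if __name__ == "__main__":
    grouped = parse_contacts(RAW)
    # Even after normalization, one person still splits into three keys:
    # "jay g", "j glasgow", "jay glasgow". The dedup problem remains,
    # which is exactly the "data janitor work" being described.
    for key, notes in sorted(grouped.items()):
        print(key, "->", len(notes), "note(s)")
```

Even this cleanup step leaves three distinct keys for one person, which is why free-text collection with no shared schema produces data that is expensive, or impossible, to repair after the fact.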
James Egidio:
It is. Because I hear all this stuff about a lot of nefarious motives for using this information. You hear a lot of the sentiment about what's coming out of China, for instance, and they're collecting this data through, let's say, TikTok, and they're collecting it through the phones, I can't remember the name of the app. And even off TikTok, like I said. And now you're getting into what they call social credit scores. And they're using it for tracing, like you said, with COVID, and for tracking. And if you're not compliant with getting, let's say, the vaccines, it's going to be counted against you. In your professional opinion, where do you think this is all going to lead in the end? There's got to be an end point for all this. What's the real end game? It's not just collecting data, I'm sure, right?
Jay Glasgow:
Yeah. At the risk of being called a nut, which, by the way, wouldn't be the first time I've been called a nut: we're already there. I just want to be very clear. What FANG has called 360 profiles have been in place, and I did a TED Talk on this five years ago. And there have been a number of documentaries. There's a famous one on Netflix that your viewers can go out and view, and that came out three or four years ago. So the notion that this is coming? No, it's already here. Let's be very clear. They already have a 360 profile on you, maybe multiple views into that profile. So with predictive analytics, they can come fairly close to knowing whether you're going to turn left or right at the next intersection you're driving through. Sure. So they have a pretty good idea who you are, what ailments you have, what motivates you, and what decisions you're going to make.
James Egidio:
I'm a pretty boring guy, though. So they're going to turn around and say, you know what? This guy's too boring for me, because I'm not much of a,
Jay Glasgow:
There's a lot of money in boring, James. There's a lot. See, this is what's remarkable. There was a great book that came out several years ago called Streaming, Sharing, Stealing. I recommend it to anybody who wants to read, and even now it's outdated, but it told the story of Netflix, when they decided to develop the TV show called House of Cards. It's a fascinating chapter to read in that book. You had a producer, an award-winning producer, with an award-winning writer and Kevin Spacey, an award-winning actor. That's three who had come together and said, yes, I will do this project. They took it network by network. Typically, what a network does is bid on a pilot and give so much of a contingency: if the pilot does well, we'll fund you for 10 million dollars, or whatever the amount is, to do the TV show, right? Then they went to Netflix, and the only reason they went to Netflix was for the aftermarket, so they could say, after it shows on CBS, NBC, ABC, or Fox, where can we take it to make some additional money? So they went to Netflix and did the same pitch. What was really interesting was that at Netflix, instead of being in a room full of executives who would sit there and go, that sounds like a good show to me, it was one person, the CEO of Netflix, and the CTO of their big data organization. Those were the only people in the room. And the whole time the guy was talking, the big data guy was on the keyboard entering data. He came back and he said, yeah, we'll fund it. And the offer was 200 percent higher than any network was offering to take the show. They were blown away. They were like, what are you doing? He said, it's because it's in the data. We know our viewers. And the producer, the writer of the show, in this book actually says he called Netflix back and said, we had an idea where we want the main character to kill a dog in the first 10 seconds of the show.
And they were expecting Netflix to go, no, you can't do that. Netflix goes, go for it. And he goes, really? And he goes, yeah, kill the dog. Really? He goes, yeah. He said, aren't you afraid people will turn it off? And he said, we will learn more from the people who turn it off than we do from the people who continue watching it.
James Egidio:
Wow. Wow.
Jay Glasgow:
There's money in boring. There's money in boring.
James Egidio:
I guess there is. All right. Have at it.
Jay Glasgow:
I have. But I think, to kind of bring this back to your viewers: now what, right? So now what? You already have this avatar in place, and businesses are monetizing it. What can I do? Let me put it to you this way. Everybody watching this is an actor that's engaging systems, and because you're the actor engaging systems, you're the one generating the data. And that goes back decades. All of that data, some of it is information, and there are information rights that were born concomitantly, and you own those rights. You still do. Yeah. This is like being a singer-songwriter, going into a studio two or three years ago, cutting a song, and then finding out that the studio is bootlegging it and selling it, and they're making the money. You come to an organization like ours, we're an authorized agency, right? Our attorneys send a cease and desist notice, just like BMI or ASCAP would do for a musician. We're basically saying, stop playing that tune until you start paying licensing fees to use that data. This is so critical, because you said, what's next, Jay? Where do you see this going? I see this going to a world where cryptocurrency is either going to be based upon the proof-ofs, if your viewers know those: proof of stake, proof of value, all the proof-ofs. They're basically saying that the value is derived from machine time. But if all of us stand up and start opting out of our data being used for secondary purposes, we can start saying that cryptocurrency, even cryptocurrency, is based upon agreements between two people, because before that Bitcoin was purchased or even minted, two people agreed on it being minted, right? That's human rights. That's human rights. So we start making even our cryptocurrency based upon human rights rather than proof-ofs. Instead of machine rights, we want it based upon human rights, right?
If we don't exercise our human rights, we're ceding them, and the cryptocurrency will continue to be based on machine rights, or machine time: processor time, hard drive space, energy utilization, all the things that go into how blockchain works. That's what the perceived value is. But without changing any of the technology, if we all start opting out of our data being used for secondary purposes, we send a shockwave across and we say, no, even cryptocurrency is based upon our rights, our information rights. It's a fundamental shift in the way people think, but it will bring about stability in those markets, and they're hungry for that, right? Yeah. So the crypto markets will embrace this, and that's a good thing for all of your viewers, and that's where we're going with this. We're basically going to a world where you will have an avatar. It's 360, meaning you will have a profile in cyberspace that can predict how you're going to behave, where you're going to behave, and what you're going to do next. That can be a terrible thing or a good thing, but it's a terrible thing if you're not exercising your rights. If you are, then even the cryptocurrency that's tied to that, and all the business deals that go on, and all the medical research and all the data and everything we've talked about today, will start to be under your own control and autonomy. It's an extension of what people in the industry call self-sovereign identity. And that's where we need to get to: the idea that you are exercising your rights and you have a say in these things.
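[Editor's note: the "machine time" point above, that proof-of-work blockchains derive value from raw computation (processor time and energy), can be shown with a toy sketch. This is a minimal teaching example, not the code of Bitcoin or any real chain; the function names and difficulty setting are invented for illustration.]

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> tuple[int, str]:
    """Toy proof-of-work: find a nonce whose SHA-256 hash starts with
    `difficulty` leading zeros. The only way to find it is brute force,
    so the "value" produced is literally spent machine time: each extra
    zero multiplies the expected hashing work by 16."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

if __name__ == "__main__":
    nonce, digest = mine("example block")
    print(f"nonce={nonce} hash={digest[:16]}...")
```

Nothing about the data itself makes the hash valuable; the scarcity comes entirely from the computation burned finding the nonce, which is the "machine rights" model Jay is contrasting with consent-based, human-rights-based value.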
James Egidio:
Yeah, I know. Because we see so much complacency with people, and they're so caught up in their day-to-day activity with their heads down in their phones, and they're not paying attention. And this collective complacency is allowing a lot of companies and governments to take advantage of people. In the end, it's all about control. And it starts in,
Jay Glasgow:
No, you're absolutely right. And it starts in the schools. The best illustration I can give your listeners and viewers right now is this: you go to work every day, you check into a cubicle, whether you realize it or not, and you are creating product all day long that's making money for somebody else. They're not paying you a salary, they're not paying you any benefits, they're not paying your insurance. At the end of the day, you clock out of your cyber cube and you go home, but guess what? You're still working for them. When you get home, you're still working for them. In fact, when you're asleep, you're working for them. You're still generating that data. So sure, you get some freebies out of it, but is that really worth the amount of effort you're putting in and how much money they're making off of you? This is starting in the schools now. They're collecting student data and they're selling it. That's happening actively right now, and it's to the detriment of our kids. Yeah. So the best way for us to get in there and start exercising our rights is to opt out, to start looking for ways you can say: no, I want to keep using your product, I want to keep buying from you, Facebook or whatever, but I don't want you using my data for secondary purposes. And we'll help you do that at the Privacy Co-op.
James Egidio:
Yeah, it's a proactive approach, for sure. My guest, Mr. Jay Glasgow. Thank you so much, Jay, for coming on to the Medical Truth Podcast. You can find Jay's company at privacy dot co-op, that's privacy.coop. Thank you so much again, sir. Appreciate it. Thank you. Thank you. All right, thanks. Appreciate it.