Guest: Tom Kemp, Author
On LinkedIn | https://www.linkedin.com/in/tomkemp/
On Twitter | https://twitter.com/TomKemp00
_____________________________
Host: Marco Ciappelli, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining Society Podcast
On ITSPmagazine | https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/marco-ciappelli
_____________________________
This Episode’s Sponsors
BlackCloak 👉 https://itspm.ag/itspbcweb
Bugcrowd 👉 https://itspm.ag/itspbgcweb
Devo 👉 https://itspm.ag/itspdvweb
Episode Introduction
Welcome to another episode of the Redefining Society Podcast, where your host Marco Ciappelli plunges into the complex, dynamic relationship between technology, privacy, and cybersecurity. This episode features Tom Kemp, a Silicon Valley entrepreneur and angel investor with investments across privacy, cybersecurity, and consumer- and enterprise-oriented companies.
Kemp's involvement isn't limited to business; he's been heavily engaged in public policy, using his expertise to shape laws like California's Proposition 24 (the California Privacy Rights Act) and the California Delete Act. He's even penned a book, Containing Big Tech, which examines the implications of the unchecked growth of tech giants for our privacy and security.
This compelling conversation traverses multiple layers of the privacy and cybersecurity sphere. Marco and Kemp address the game of "whack-a-mole" played by lawmakers trying to regulate a rapidly evolving industry. They discuss the groundbreaking GDPR law in Europe and ask whether the United States is lagging behind or catching up with international norms.
Kemp also delves into a sobering exploration of how our personal data is weaponized against us, leading to identity theft, cyber attacks, and even manipulation of our behavior. And perhaps most disturbingly, how the accumulation of our data can be used to sidestep our Fourth Amendment protections and to discriminate against us.
Marco and Kemp challenge you, the listener, to think deeply about how data is collected and used. They reveal the intricate webs woven by data brokers and tech companies that use "dark patterns" to lure consumers into sharing their information. The conversation scrutinizes the deceptive nature of agreements and asks the daunting question: can we really take back our privacy?
These thought-provoking insights are a wake-up call for consumers and businesses alike, inviting us to reflect on the inevitable risks that come alongside the undeniable benefits of technology in our society, our daily lives, and businesses. The Redefining Society Podcast dares to explore the boundaries of privacy and cybersecurity, challenging the status quo, and forging a path to a more secure future.
So tune in, soak in the insights, and join the conversation. Share this episode with your network and don't forget to subscribe to the Redefining Society Podcast. Let's redefine society together, ensuring privacy and cybersecurity for all.
About the Book
Technology is a gift and a curse. The five Big Tech companies―Meta, Apple, Amazon, Microsoft, and Google―have built innovative products that improve many aspects of our lives. But their intrusiveness and our dependence on them have created pressing threats to our civil rights, economy, and democracy.
Coming from an extensive background building Silicon Valley-based tech startups, Tom Kemp eloquently and precisely weaves together the threats posed by Big Tech:
- the overcollection and weaponization of our most sensitive data
- the problematic ways Big Tech uses AI to process and act upon our data
- the stifling of competition and entrepreneurship due to Big Tech's dominant market position
This richly detailed book exposes the consequences of Big Tech's digital surveillance, exploitative use of AI, and monopolistic and anticompetitive practices. It offers actionable solutions to these problems and a clear path forward for individuals and policymakers to advocate for change. By containing the excesses of Big Tech, we will ensure our civil rights are respected and preserved, our economy is competitive, and our democracy is protected.
_____________________________
Resources
Containing Big Tech: How to Protect Our Civil Rights, Economy, and Democracy (Book): https://www.tomkemp.ai/containing-big-tech
____________________________
To see and hear more Redefining Society stories on ITSPmagazine, visit:
https://www.itspmagazine.com/redefining-society-podcast
Watch the webcast version on-demand on YouTube: https://www.youtube.com/playlist?list=PLnYu0psdcllTUoWMGGQHlGVZA575VtGr9
Are you interested in sponsoring an ITSPmagazine Channel?
👉 https://www.itspmagazine.com/advertise-on-itspmagazine-podcast
Please note that this transcript was created using AI technology and may contain inaccuracies or deviations from the original audio file. The transcript is provided for informational purposes only and should not be relied upon as a substitute for the original recording as errors may exist. At this time we provide it “as it is” and we hope it can be useful for our audience.
_________________________________________
voiceover00:15
Welcome to the intersection of technology, cybersecurity, and society. Welcome to ITSPmagazine. Let's face it, the future is now. We're living in a connected cyber society, and we need to stop ignoring it or pretending that it's not affecting us. Join us as we explore how humanity arrived at this current state of digital reality, and what it means to live among so much technology and data. Knowledge is power. Now, more than ever.
sponsor message00:47
BlackCloak provides concierge cybersecurity protection to corporate executives and high-net-worth individuals to protect against hacking, reputational loss, financial loss, and the impact of a corporate data breach. Learn more at blackcloak.io. Bugcrowd's award-winning platform combines actionable, contextual intelligence with the skill and experience of the world's most elite hackers to help leading organizations identify and fix vulnerabilities, protect customers, and make the digitally connected world a safer place. Learn more at bugcrowd.com. Devo unlocks the full value of machine data for the world's most instrumented enterprises. The Devo data analytics platform addresses the explosion in volume of machine data and the crushing demands of algorithms and automation. Learn more at devo.com.
Marco Ciappelli01:52
And we are recording. I think this is number three today, so it's a busy day here at ITSPmagazine, in particular on the Redefining Society Podcast, which, as I guessed when I started this show, is not going to run out of topics: when you put society and technology together, and then you sprinkle a little bit of cybersecurity on top of that, you're going to talk about pretty much everything. So I'm glad for all the people listening, and mostly I'm glad for all the amazing guests that I can bring on the show. As usual, if you're listening to the audio version, you don't know who is on yet, but maybe you read the title; if you're watching the video, you see that Tom Kemp is with me. And I'm very honored, very honored, to have you on, Tom, and to have this conversation about the book that is about to go public. So let's talk about that. But let's talk about you first. Who is Tom Kemp?
Tom Kemp03:03
Well, first of all, Marco, thanks for having me on. It's an honor, it's a pleasure to be here. So, you know, I'm a Silicon Valley-based entrepreneur, and I co-founded, with great teammates, two companies, one that went public, and a cybersecurity company called Centrify that was acquired a few years ago. Since then, I've been doing angel investing, so I actually have 15 investments in various tech companies, some of which are in privacy, some of which are in cybersecurity, some focused on consumers, some on the enterprise. But over the last few years I've also been getting heavily into public policy. I feel I have some great expertise in real-world matters and see what enterprises and consumers are dealing with as it relates to privacy and cybersecurity. So I've been very active from a political perspective, trying to get laws passed. I worked on Proposition 24 in California, the California Privacy Rights Act, and just recently I co-authored a bill called the California Delete Act that would give consumers a single click to be able to say, hey, data brokers, delete my data. So kind of all of the above has eventually led me to write this book called Containing Big Tech.
Marco Ciappelli04:20
Which is a big story in a short title already, right? I don't even know where to start. So I like that you said you were involved in this Privacy Act. Of course, even if I've been living in the US for such a long time, I'm from Florence, Italy, so I'm European, and when I go over there for one reason or another, it's always interesting to see how the European community has been a little bit more aggressive than here in the US. GDPR, of course, has been kind of leading the way, so it's nice to see the US catch up. But I always feel like, are we just putting corks in the bottom of a boat that is actually taking on water? We do an act here and an act there just to try to protect privacy. And I don't want to sound negative, because I'm not, but I do worry: is it too late? And I think this has become the big question for you, because it sounds to me, from the title, that you probably think it's not?
Tom Kemp05:36
Well, I think, first of all, you're absolutely right. Europe has led the way as it relates to privacy, and they're still leading the way. For example, right now in Europe there's this thing called the AI Act, so they're much further ahead of the US; people call it the Brussels effect. And probably where consumer protection as it relates to privacy and cybersecurity is initially taking hold here is in California, so people call that the California effect as well. But yeah, oftentimes it's a game of whack-a-mole, right? Where you didn't really think that this would become a problem. I think the best example is just overall privacy. Obviously, for Google and Facebook, now called Meta, the vast majority of their revenue is advertising; they're advertising companies. And so we made an agreement, as consumers, with those companies, that we would give up our personal information and get all these great free services. And in the past it was about, hey, okay, we get some of these ads, and maybe some of these ads are annoying, sometimes the ads are actually helpful, et cetera. But what's really happened over the last five to ten years is that a lot of this data that has been collected about us can now be weaponized against us as well. And so we now need to start addressing that. What's been happening in the US is that it's been addressed at the state level, in kind of a patchwork manner. But unfortunately the United States does not have a federal equivalent of GDPR, which is kind of scary given the fact that, you know, Google went public 20-plus years ago, same thing with Meta, et cetera. And some of our existing privacy laws, such as HIPAA, have not been updated since the 90s. And of course there is an explosion of healthcare-related data of a very sensitive nature that is not covered by HIPAA, because it's on mobile apps. So we need to do something. It's happening at the state level; as of last year there were five states that had privacy laws, and now there are a couple more, but it's a huge patchwork. And at the end of the day, we do need to do something to keep up with the changes in technology, much like with protecting consumers with automobile safety: initially there wasn't any, but then they added, you know, seatbelts, wipers, etc., and no one complains anymore about automobile safety rules and whatnot. We're still at an early stage when it comes to consumer privacy in the US.
Marco Ciappelli08:29
That's a really good point, because sometimes, especially if you're in the cybersecurity industry, you feel like, oh, we've been talking about these things forever. But then you think about it, and it's not really forever; it's a relatively new industry still, as is our online life, our digital life, when you look at society. I just had a conversation a few days ago about how to prepare the next generation, how to educate them in using the devices that we have, and whether it's okay to use apps to follow them around. Maybe the parents are a little bit overprotective, and the question, when you look at it from a cybersecurity perspective, apart from the relationship in the family, is: where is that data going? Is that data possibly going to end up in bad hands and have many consequences? So, weaponizing the data: what do you mean by that?
Tom Kemp09:31
Yeah. So in the past, the data collection was about, you know, facilitating advertising. But I think what people realize is that the data could be used for other reasons, right? One way, especially with data collected by entities known as data brokers, that we don't have a direct relationship with, is that that data could actually be used to facilitate identity theft, right? Because basically all your answers to the personal questions that you have to use to identify who you are are known to data brokers: what high school you went to, what was your first car, what are your last five addresses, et cetera. So that's one aspect: the hackers are getting really smart about leveraging personal data to be able to either answer the questions, get in, and impersonate you, or to phish you, by going in and potentially attacking executives, figuring out what their personal accounts are, going into their Gmail, and then kind of using that as a hook into the enterprise. Other examples of the weaponization of information include getting to know you so well that they can actually start serving up ads that influence and change your behavior, et cetera. Some people fall into rabbit holes where they're served up certain information, or they may be prone, and they don't even realize it, to being manipulated, because they're leveraging that data. And then, for certain people, you can buy data through data brokers to get around subpoenas. You can basically get around the Fourth Amendment: hey, I want to know where this person visited, where they went, et cetera. Before, you would have to get a subpoena to find that information, by subpoenaing the phone companies, etc. But now you can go to a data broker, and it's pretty easy to figure out what someone's mobile advertising ID on their phone is and then correlate that with their location information. So you can completely track someone and basically go around the Fourth Amendment. And then finally, we are in an interesting time: we're basically in a post-abortion-rights America right now, where people can subpoena or get information from Google or from data brokers that may give insight into the reproductive choices that you may have made. And that's now criminalized where it wasn't in the past; certain uses of data couldn't be held against you criminally before, but that changed with Roe v. Wade going away with the Dobbs decision. So there are multiple aspects of how personal data can be weaponized. It first starts, of course, with identity theft, right? But then it goes to subverting the Fourth Amendment; it goes to potentially leveraging AI and that data feed to change or manipulate your behavior, maybe even in subtle manners; and then it eventually gets to the point where this type of data could be subpoenaed and used against you based on specific choices. And one last thing: the data could also be used to discriminate against you. For example, if you're maybe an awful employer, and you do not want to hire someone that's either pregnant or potentially has cancer, you can actually go through data broker information, because they collect, or they infer, that type of information.
And you can make your hiring decisions based on a third-party data purchase, seeing what type of data is associated with an individual before you hire them, and try to kind of work around some of the discrimination laws that we have in the US. So that's very possible as well.
Marco Ciappelli13:24
That's a handful right there. It reminds me of a quote from Cardinal Richelieu back in the 1600s, which says: give me a letter from the most honorable man, and I'll find in there something to incriminate him. Now, with the amount of these letters that we have, you know, air-quote letters, I mean, we're in trouble. Plus artificial intelligence, just to make it even more interesting. And who knows what...
Tom Kemp13:55
Yeah, and I think, also, just from a human perspective: in life we go through a process of trying to discover who we are, what we like, who our friends are; we may not be sure about our sexuality or religious beliefs, etc., right? So we do a lot of exploration. And in the past, maybe it was going to the library and getting books or talking with friends, but no one had a record of that, right? Now, as you go through that self-discovery process, everything's being recorded and captured, right? It's the equivalent of always being watched; it's like the TV show Big Brother, where you're constantly being recorded. And that does change you if you're aware of it: you become more self-conscious, and maybe you're not as interested in doing something. And look, I understand that a lot of people say, hey, I don't have anything to hide, right? But some people could actually use that for nefarious purposes, like what we talked about before, like identity theft, right, to phish you, or to use that to figure out your password or your security questions, etc. So it does have a negative impact on you as well. We need to move to a regime where users can take control of who they give their data to, and provide kind of governance beyond that; that's the definition of online privacy. And then, of course, security is about what happens if someone steals that information, right? But there's a difference: privacy does not equal cybersecurity. They kind of overlap in some areas, right? Privacy is about governance; security is about the integrity and protection of the information. But there is overlap, and we need to factor both in when we look at data associated with ourselves.
Marco Ciappelli15:56
Yeah, definitely a lot to think about. And when you started, and you mentioned the advertising model and, you know, digital marketing, you said the word agreement, meaning the consumer agrees to exchange the service for something, which is giving away their information. I think that nobody, really only a few, actually made that agreement intentionally, right? It's been more lawyered in, and then you become the product; you're not the user anymore. And so I don't know how you can get it back, even with education and everything. I mean, do you think that if we enact these laws, if we bring in the regulation... I always have this feeling that, sure, your tastes may change, who you are today is maybe not who you were, but they can still say: this was you when you were a student, and maybe you were drinking, you were having fun, you were doing something, and it's still you. So can we actually really take it back? Because I think that's the big question.
Tom Kemp17:11
Actually, I think you can. But you're absolutely right. Look, they give you these huge End User License Agreements; who the heck reads the Apple End User License Agreement? It's like six pages, et cetera; you just click, right? Or they put dark patterns up to kind of fool you, or make it really difficult. In fact, we even have situations today, and increasingly they're becoming more AI-oriented, where it may take two steps to sign up for a subscription but 10 or 20 steps to unsubscribe. We need legislation or regulation that says: if it takes three steps to sign up, you should be given the same number of steps to unsubscribe from a service, be it Amazon Prime or Dropbox or whatever. So there are some dark patterns. But there are actually solutions. First of all, to take control, and to ensure this doesn't happen, you do need a federal privacy law that gives you the right to delete, the right to say no, the right to know what data is being collected about you. The tech industry has actually started saying, okay, well, we'll actually start suggesting state laws, because we know it's going to be so darn difficult for consumers to exercise those rights. But there's actually a solution there. The first solution is something called Global Privacy Control. It's basically a signal that you can put on your browser, and it eventually needs to be put in place on mobile, that basically says: as I visit websites, I'm going to have a flag set, a signal, that says do not sell or share my information, or limit the use of any sensitive information that you collect. It's called GPC. California is trying to mandate it; industry is pushing back. But if we all of a sudden have a signal on our phone or on our browser that says don't collect, don't sell, that solves a lot of the problems. But those are the entities we interface with directly: Google, or Walmart, or whatever websites we visit or mobile apps we use. Then there's third-party data, which is collected by data brokers. Data brokers are entities that we don't have a business relationship or a direct relationship with, but they collect, behind the scenes, all this information: our credit card transactions, what websites we visit; they have SDKs tied into mobile apps that know what we do inside the app, etc. And so what we need for third-party data is the equivalent of the FTC's Do Not Call Registry, where you can just go to a website and say: delete my data, and do not track me moving forward, for those third parties as well. That's actually what I proposed in California with this thing called the California Delete Act, which would give consumers in California a single portal where they can just put in their email address or mailing address, hit click, et cetera. So we could do a lot. First, you need a federal privacy law, if you don't have one, or maybe you're fortunate to live in a state like California that has one. Second, you need the enforcement of Global Privacy Control, that signal or flag that says do not sell my data; that's just the baseline. And the third thing is we need the equivalent of the FTC's Do Not Call Registry, but in this case for data brokers.
But if we get those building blocks in place, wow, we could have significant control over how our data is used, by basically telling people: I just don't want my data to be sold or shared. So it's possible; it's there. And, going back to GDPR, there's a thing called cookie fatigue, right? Where on every website you constantly accept the cookies. And then if you select something: well, what about performance cookies? What about analytics cookies? What about marketing cookies? Like, oh god, okay, fine, just track me or whatever. But if we could just have a signal in our browser that says, boom, I don't want to be tracked, period, then you get out of the whole cookie business, the cookie fatigue, as well. And frankly, tech companies whose business is based on advertising like that fatigue, because people break down; we just need to make it easier for people. And that's why, based on my years of looking at cybersecurity and privacy, that's the first couple of chapters in my book, Containing Big Tech: because it turns out there could be some pretty easy solutions here. We did something similar with the telemarketers with Do Not Call. It's not perfect, but at least it's something; 250 million Americans have signed up for it. There are privacy laws, but you've just got to fix them so you don't get the cookie fatigue. So there are things you could do, and that's why I wanted to evangelize, based on my 20-plus years of real-world cybersecurity and privacy expertise.
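For readers who want to see concretely what the Global Privacy Control signal Kemp describes looks like on the wire: GPC-enabled browsers send a Sec-GPC: 1 request header (and expose navigator.globalPrivacyControl to page scripts). Below is a minimal sketch of a server honoring that signal; the Express framework, the port, and the exact opt-out behavior are illustrative assumptions, not part of the GPC specification.

```typescript
// Minimal sketch: honoring the Global Privacy Control (GPC) signal server-side.
// GPC-capable browsers send the "Sec-GPC: 1" request header on every request.
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Middleware: flag the request if the visitor has opted out via GPC.
function detectGpc(req: Request, res: Response, next: NextFunction) {
  res.locals.gpcOptOut = req.header("Sec-GPC") === "1";
  next();
}

app.use(detectGpc);

app.get("/", (_req: Request, res: Response) => {
  if (res.locals.gpcOptOut) {
    // Treat the signal as "do not sell or share my data": skip third-party
    // ad/analytics tags, set only essential cookies, and so on.
    res.send("GPC signal received: tracking disabled for this visit.");
  } else {
    res.send("No GPC signal received.");
  }
});

app.listen(3000);
```

The point of the sketch is Kemp's "baseline": one browser-level flag replaces per-site cookie banners, and a law like the CPRA can then make honoring the header mandatory rather than optional.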
Marco Ciappelli22:26
No, I agree with you, and it would be a really good thing, especially because then you really have an agreement where you can easily say yes or say no, and maybe, who knows, you get something back: if they know who I am and what my tastes are, and I go on Amazon, say, and they suggest some product, they're probably right; they know enough. So convenience versus privacy, that's the big thing. You bring AI into this conversation, I think, right after you talk about weaponizing the data. Everybody now is talking about it; I just got back from RSA Conference, and if you had a drink every time somebody said artificial intelligence, especially the generative kind, you were not going to last long. So how did you see this coming in? I mean, obviously there are a million topics around AI, from copyrights, to harvesting of information, to losing jobs, and so forth, but a lot of good things too; AI is a powerful tool. How does it play into your vision, in your book, in containing Big Tech?
Tom Kemp23:44
Yeah, well, look, I mean, AI can be incredibly positive. And by the way, it's also incredibly positive to have some global social media networks, etc.; I just think people need more say in how their data is being used. But in the case of AI, there are a lot of good things. If you drive a Tesla: the ability to identify, to tell you how close you are to a bumper or a sign, or whether something crossing the street is a person or a bike or something like that. And then, of course, from a medical perspective, the ability to do a better job of analyzing test results, etc., and increasing the chances. So I want to be very clear that AI can be a very positive thing. And the reality is that the majority of AI implementation is going to have to do with automated decision-making, just trying to provide better decision-making, better analysis, etc. A lot of the hype has to do with generative AI, right? But the majority of AI usage is going to be to help facilitate decision-making, et cetera. Specific to generative AI, I actually have a couple of ideas; like I said before, I think there are some simple solutions, just as with the privacy thing. One idea that I have is, for the large providers of generative AI, from a regulation perspective: you should be allowed to take text or a picture, go to their website, and ask them, did you generate this, etcetera. So I think we really need more transparency, for people to verify and validate whether they actually created the images, the text, et cetera, right? Another example of a potential solution is that when you're on a phone call or in a chat, you should actually have the right to say: hey, is this AI, or is this a human? Because oftentimes we just don't know, right? It's kind of like 411: in the middle of a chat, you can hit star 411, and it has to fess up if you're actually dealing with AI, and from there you should have the ability to request to talk to a human as well. And in fact GDPR has this: for certain automated decisions, you do have the right to appeal to a human. I think we also need to add that to AI overall. So I think there needs to be more transparency as it relates to AI, especially generative AI, to be able to ask if, you know, Bard or OpenAI's ChatGPT actually generated this. You could just create a simple law that says: if you have more than 10 million monthly users and you generate stuff, then your service needs a /verify page where people can upload something. And that would resolve a lot of the issues with kids and homework, right? I have a funny story: a family friend's daughter was a senior, and she was asked to write an essay in her high school about the best way one should get into college. And she wrote a very humorous essay that said: oh, I need to move to Montana, I need to learn the bassoon, just all these edge cases, so she would have a higher chance of getting into a great college. And her high school teacher accused her of using ChatGPT to write it, because the essay was too funny, too humorous, etc. Well, why not be able to take that essay?
The teacher could take that essay, just go to ChatGPT /verify, upload it, and say: did you generate this, and at what time? I think that would resolve a lot of the concerns and issues, by getting these providers to fess up about whether or not they generated something. So again, it turns out there are some pretty simple ways to go about doing this. We're all so worried about whether this is AI-generated? Well, we should be able to ask: did you generate this or not, yes or no? Seems pretty simple.
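No provider actually ships the "/verify" endpoint Kemp is proposing; it is a policy idea, not an existing API. Purely to make the idea concrete, here is a hypothetical sketch of how a provider might implement the lookup, with exact-match fingerprinting standing in for the much harder matching problem. Every name in it is illustrative.

```typescript
// Hypothetical sketch of the "/verify" provenance check proposed above.
// Idea: the provider records a fingerprint of each output it generates and
// lets anyone later ask "did you produce this text, and when?"
import { createHash } from "crypto";

interface GenerationRecord {
  generatedAt: string; // ISO timestamp of when the output was produced
}

// Stand-in for the provider's internal index of everything it generated.
const generationIndex = new Map<string, GenerationRecord>();

// Normalize then fingerprint a piece of text, so trivial whitespace or
// casing edits don't defeat the lookup. A real system would need fuzzy
// matching or watermarking; this sketch deliberately omits that.
function fingerprint(text: string): string {
  const normalized = text.toLowerCase().replace(/\s+/g, " ").trim();
  return createHash("sha256").update(normalized).digest("hex");
}

// Called by the provider whenever it returns a completion to a user.
function recordGeneration(output: string): void {
  generationIndex.set(fingerprint(output), {
    generatedAt: new Date().toISOString(),
  });
}

// The public "/verify" lookup: did we generate this, and at what time?
function verify(text: string): { generated: boolean; generatedAt?: string } {
  const record = generationIndex.get(fingerprint(text));
  return record
    ? { generated: true, generatedAt: record.generatedAt }
    : { generated: false };
}

// Example: the teacher's check from the essay anecdote.
recordGeneration("Move to Montana and learn the bassoon to get into Harvard.");
console.log(verify("Move to Montana and learn the bassoon to get into Harvard."));
// -> { generated: true, generatedAt: "..." }
console.log(verify("An essay the student actually wrote herself."));
// -> { generated: false }
```

The sketch also shows the obvious limitation of the idea: exact or near-exact fingerprints are defeated by paraphrasing, which is why proposals in this space usually pair a lookup service with output watermarking.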
Marco Ciappelli28:36
Yeah, and it also comes under the same roof, which is knowledge, and being able to decide: do I want to deal with an AI or not? Because in some cases, maybe you do want to.
Tom Kemp28:49
Yeah, absolutely. Like, who wants to spend half an hour waiting to talk to someone? And sometimes, based on the large data models that were fed to the AI: hey, I have an error with my refrigerator, I go to the online chatbot, I put in the error message, they ask me about ten different things, and they've got the entirety of all the issues that have ever come in, in theory, as opposed to waiting a half hour for some technician that may not know your model of refrigerator, etc. But...
Marco Ciappelli29:23
Or one that tells you to unplug the computer and plug it back in again...
Tom Kemp29:29
...and lose your session. But the funny thing is, as you're going through that process, sometimes they fool you; they don't tell you. They give you the "Hi, I'm Alice, I'm here to help you." And then you wait a minute: is this really Alice? Is this really a human? I should be able to type star 411, you know, give me the info, and that should return: Alice is an AI-generated software program; if you want to speak to a real human, press star 911, right? And you put that in the chatbot. I mean, so...
Marco Ciappelli30:04
How about it just says: hello, this is Alice, I am an AI? I can even guess...
Tom Kemp30:10
Exactly, you should be able to. Or here's another example: for websites that have over 10 million monthly average users, there should be, in the upper right-hand corner, a little nutrition label that tells you what percent of the page was generated by AI. You know: 82% of this was generated by AI. Okay, well, that's good, because, I mean, we have food nutrition labels, right? And I know Apple requires mobile application developers to put up privacy nutrition labels: what do you collect, and all that stuff. Frankly, we need that across the board; every major app vendor, whether they're on Apple or not, should have a privacy nutrition label in terms of what information they collect and how they use it, and it should be in a simple form, as opposed to this huge ten-page legalese. But we should also have an AI nutrition label, which simply tells us what percent of this was actually generated by humans versus AI. That would be very good to know for credibility, and then you could potentially weigh it: in search engines, you could use that as a factor in your ranking of the actual page itself, if you elect to do that from a search engine perspective. Or it could potentially help students or academia to know: hey, I'm quoting this article, and I want to know if I'm actually quoting a human or quoting AI, etc. So I think we need more transparency when it comes to AI, and that actually resolves a lot of the issues. Again, to me, everyone waves their hands and all that stuff, and I'm like, dudes: if you're worried about something being AI-generated, require Bard and ChatGPT from OpenAI to tell you if they generated it or not. That would address the whole college and high school issue of: is this really this kid's essay, who was being very cute and funny about saying, I'm going to move to Montana and learn the bassoon, and that's how I'm going to get into Harvard? Or was it actually ChatGPT that generated a humorous essay on how best to get into college?
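The star-411/star-911 idea above is likewise a proposal, not an existing standard. As a thought experiment, here is a hypothetical sketch of a chat handler implementing the two disclosure commands from the conversation; the command strings come from Kemp's description, and everything else is an illustrative assumption.

```typescript
// Hypothetical sketch of the "star 411" AI-disclosure rule for chatbots:
// the service must reveal whether the agent is AI (*411) and hand off to a
// human on request (*911).
type AgentKind = "ai" | "human";

interface ChatSession {
  agentName: string;
  agentKind: AgentKind;
}

function handleMessage(session: ChatSession, message: string): string {
  switch (message.trim()) {
    case "*411":
      // Mandatory disclosure: the service has to fess up.
      return session.agentKind === "ai"
        ? `${session.agentName} is an AI-generated software program. ` +
          `If you want to speak to a real human, type *911.`
        : `${session.agentName} is a human agent.`;
    case "*911":
      // Escalation: the GDPR-style right to appeal to a human.
      return "Transferring you to a human agent...";
    default:
      return `${session.agentName}: How can I help you with that?`;
  }
}

// Example: "Alice" turns out to be a bot.
const session: ChatSession = { agentName: "Alice", agentKind: "ai" };
console.log(handleMessage(session, "*411"));
console.log(handleMessage(session, "*911"));
```

The same pattern would extend naturally to Kemp's AI "nutrition label": a percent-AI-generated field attached to the page or the reply, disclosed on demand.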
Marco Ciappelli32:39
No, and you can have transparency there, as well as saying: hey, I did my research and I got some input from AI, but then I developed it on my own, so here it is, 20% AI and 80% my brain, right? As we start closing, I know that you have another relevant topic; the third part of your book talks about Big Tech and how you see it stifling competition. You're also an investor in tech, having had your own company before investing in others. Can you explain why you felt the need, on top of data weaponization, brokers, and AI, to add this chapter where you talk about the presence of this big shadow all over the tech world?
Tom Kemp33:37
Well, look, the reality is that the five largest tech players are monopolies, or duopolists, in certain digital markets. They may say, oh, well, if you add up libraries and card catalogs, we only have a 20% market share in search. Well, no: in internet search, Google is the dominant player, right? In mobile app stores and operating systems, it's a duopoly: Google and Apple. And the problem is that there are approximately ten digital markets, and these are huge markets that are basically kill zones for any startup trying to enter. These large enterprises have built marketplaces where they provide the marketplace, but they're also participating in the market, right? Take, for example, Amazon: they obviously provide an incredible e-commerce site, et cetera, but they're able to figure out what the hot products from third parties are, and with that data they can say, oh, we're going to come up with a knockoff of this. By owning the market, they're able to come out with AmazonBasics and other knockoffs and kind of kill off popular products. So they participate in the market, and they own the marketplace. Another example is the app stores, right? The app store providers know exactly which apps are popular. Or, in the case of Google with the search engine, they can see what the popular vertical searches are, come up with their own solution, and put their apps, or their search results specific to their own properties, at the top. And so it actually kills off companies, because they own the marketplace. And then, on top of that, they charge 30%: in the case of the mobile app stores, they take 30% of your revenue just to be in the marketplace and sell in the marketplace. So if you're Spotify trying to get into the audiobooks market, you have to pay that 30% tax to Apple, or to Google with Android, to offer an audiobook, but Apple or Google doesn't have to charge that extra 30%. It's almost insane to create a business that tries to sell mobile apps as its primary way of making money, because immediately 30% of your top line goes to the marketplace vendor. And if you look at Apple, they make billions of dollars in their App Store, but they only spend about a couple hundred million dollars to validate these apps and all that stuff. That's just unfair; it actually hurts the ecosystem, and it hurts innovation. Spotify cannot even advertise, in their iOS app, that you can buy audiobooks outside of the Apple app, and they have to charge more; same thing with Epic Games as well. So, to put a bow on this: the fact that they create the marketplace and they participate in the marketplace means they're both the umpire and the pitcher and batter. It's patently unfair.
Furthermore, it exacerbates the issues of privacy, because they've locked in the marketplace, so they don't have to compete on features and functions that actually add more privacy. I mean, Meta, in 2019, had some major hacks going on, but the problem is that people are not migrating off it, even after all the hacks that happened, because you're locked in. There's also the fact that these are not open ecosystems, and third parties can't interoperate with them, so people are locked in, and they have to suffer through weaker privacy because of that. So the monopoly issues not only stop competition, stifle innovation, and take a lot of money off the top, like in the Google ad system, because they sell the ads and they place the ads, and people have statistically shown, hey, online advertising is much more expensive because of that; it also exacerbates the situation with cybersecurity, and it exacerbates the situation as it relates to privacy, because there are no alternatives that force these companies to do a better job on privacy, because we're all locked in. So yeah, you can't ignore the fact that these are large monopolies. And if you really want to address some of the core concerns you have about data collection and AI, you also have to make the marketplace more competitive than it currently is.
Marco Ciappelli39:25
Wow, definitely a lot to think about. And of course we could go deeper into certain topics, and I'm sure you do that in the book, so I don't want to give away everything. But it's easy to say that this is the kind of podcast episode that I like to record, because of what's going to happen when people listen to this, which happened to me: it gives you a lot of thoughts, a lot of questions, and makes you think really hard about all our everyday things. I mean, I think about the fonts I use, the apps I use, and, you know, privacy; you made me think about GDPR. I talk to my dad sometimes, and he's in his 80s, and he's like: I can't go anywhere anymore, because there are always these cookies, and I already told them that I don't want them, but they ask me again. And I'm like: I'm sorry, unfortunately, that's what it is. But again, these are relevant topics for rights, for humanity, for the way we interact together, and definitely for the economy and, yeah, democracy. I mean, you touched on that: there are ways to weaponize data beyond "I'm stealing something, I'm doing something bad." Really, in a subtle way, you can manipulate, social-engineer, people, pretty much, to do things that maybe they wouldn't think they would do. And nobody really signed up for that.
Tom Kemp40:59
Yeah, at the end of the day, we have such a concentration of power in a few companies, right? And they've created amazing products, etc. But I don't think it's healthy to have such a concentration of power, where they have 4 billion users, half the world's population, and there are no viable alternatives to them. It's just not healthy for democracy. And in the past, in the US, there were the railroad trusts, and we broke them up, and then we put in regulation; the same thing with Standard Oil, and we made a more competitive economy. And the problem is that we don't have a federal privacy law, and the privacy laws that we do have, HIPAA, Gramm-Leach-Bliley, et cetera, were all done in the 90s, before the iPhone, before Google, before Meta. Now most healthcare data, as I said before, is kind of floating around in mobile apps; it's not with your doctor, et cetera. So we need to take something like HIPAA, for sensitive healthcare information, and update it for the modern digital world. So in this book I tried to explain what the problems are, tried to connect the dots between the overcollection of data, how AI uses it, and how the concentration of power makes these problems worse, but I also tried to provide some simple solutions, like I talked about before: Global Privacy Control; the Delete Act, which is the equivalent of a Do Not Call Registry; AI transparency. I'm like, dudes, there are some actually pretty simple things we can do to increase transparency and to facilitate our ability to control our data and how it's being used. We could actually start taking back control. And I'm not trying to hurt Big Tech or kill it or anything like that, but I think we can better regulate it, better contain the negative, and just have more of the positive things stand out. And that's based on me starting tech companies and doing cyber and privacy for 20 years. A lot of your listeners know that there is a cybersecurity problem; there's too much data, and they know it needs to be better protected. So why can't the policies and regulations reflect what your listeners know, which is that we need to do more?
Marco Ciappelli43:53
And we need to be more active. And again, the only way to do it is to understand what we're doing every day. So I'm sure this is a book for everyone that is interested in understanding this, and maybe an inspiration for entrepreneurs, as well as everyday users, and maybe regulators will understand that there are easy solutions, because in the end, you just need to want to do things, you know? If we all wanted peace, we would have peace; but we all want TV, and we all have TV. So, yeah, I think you wanted to add something; go for it.
Tom Kemp44:36
Look, yeah, I think the key thing, what I'm trying to do with the book, is first raise awareness, right, and connect the dots right there. And then, yeah, if I can influence people, that's great. That's what we're trying to do here.
Marco Ciappelli44:55
Well, I found this conversation to be very enlightening, and I'm sure the audience will think about it. There will be links in the podcast notes, and in the YouTube video if that's what you're watching, on how to get in touch with Tom, and there is his website. I know, Tom, we didn't touch on that, but I know you write a ton of blogs, a lot of content that I'm sure is still part of the thinking process that culminated in writing this book. So I invite everybody to check it out; there is the opportunity to get the book right there. And as far as I am concerned, Tom, thank you so much. And everybody, stay tuned to the Redefining Society Podcast, and, yeah, subscribe; there will still be more stories. Tom, thank you so much.
Tom Kemp45:46
Thank you very much as well, Marco.
sponsor message45:53
Devo unlocks the full value of machine data for the world's most instrumented enterprises. The Devo data analytics platform addresses the explosion in volume of machine data and the crushing demands of algorithms and automation. Learn more at devo.com. Bugcrowd's award-winning platform combines actionable, contextual intelligence with the skill and experience of the world's most elite hackers to help leading organizations identify and fix vulnerabilities, protect customers, and make the digitally connected world a safer place. Learn more at bugcrowd.com. BlackCloak provides concierge cybersecurity protection to corporate executives and high-net-worth individuals to protect against hacking, reputational loss, financial loss, and the impact of a corporate data breach. Learn more at blackcloak.io.
voiceover46:56
We hope you enjoyed this episode. If you learned something new and this podcast made you think, then share itspmagazine.com with your friends, family, and colleagues. If you represent a company and wish to associate your brand with our conversations, sponsor one or more of our podcast channels. We hope you will come back for more stories and follow us on our journey. You can always find us at the intersection of technology, cybersecurity, and society.