New iPhone Setting Stops Apps & Ads from Stalking You (App Tracking Transparency)

Apple App Tracking Transparency is Finally Here!

With the release of iOS 14.5, Apple has given us the most powerful privacy tool for users in many years – it’s called App Tracking Transparency (ATT). The update also includes a lot of features that have Apple product users very excited, like new Siri voices and being able to unlock your iPhone with Face ID even when wearing a mask—IF AND ONLY IF you have an Apple Watch.

But as a privacy advocate, the element that matters most to me is the App Tracking Transparency (ATT) feature. It means that apps like Facebook, Instagram and Google will no longer be able to track or gather your surfing habits on other apps or websites without getting your permission. For example, if you worked out on the Peloton app this morning, Facebook could buy that information and advertise exercise clothing to you based on your exercise type, size, weight and so on.

This is a serious blow to Facebook and other “free” services that depend on gathering your intimate personal and behavioral data to sell to their advertising clients. Of course, these services have never actually been free, as we have always been paying by giving them our information.

Specifically, the update changes how apps access the Identifier for Advertisers (IDFA), a unique random number assigned to each iPhone that allows advertisers and developers to track user behavior on that device. That behavior includes not only app usage but also web browsing habits that are often used to target advertisements to your psychographic profile. Apple says the change will provide transparency and give users an easier way to choose whether their data is tracked.

Needless to say, Facebook, Google and other big tech firms are not happy with the change. Facebook was so upset that it placed a full-page ad in The New York Times in December claiming the change would hurt small businesses, which it said could see sales drop by more than 60%. Facebook was unable to substantiate that claim, but its claim that the change will push developers toward in-app purchases or subscriptions to make up for lost revenue is most likely true.

What will this look like for you as a consumer?

Basically, whenever you open an app that wants to access the IDFA, you will see a pop-up notification asking for permission to track you across apps and websites owned by other companies, and you’ll be able to opt in or out by choosing “Allow Tracking” or “Ask App Not To Track.” Opting in to data collection, rather than having to opt out, finally catches up with data privacy regulations such as the EU’s GDPR. All software makers will be required to adopt it within a few months of the release.
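For the technically curious, here is a minimal sketch (in Swift, for an iOS 14.5+ app; the wrapper function name is my own) of what developers now have to do before touching the IDFA. The prompt wording comes from the app’s Info.plist, the buttons are supplied by iOS, and unless you tap “Allow Tracking,” the identifier comes back as all zeros.

```swift
import AppTrackingTransparency
import AdSupport

// Minimal sketch: ask the user for tracking permission, then read the IDFA
// only if they allowed it. The explanation string shown in the pop-up is the
// NSUserTrackingUsageDescription entry in the app's Info.plist.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Permission granted: the real device identifier is available.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking allowed, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // Permission refused (or never asked): the IDFA is all zeros.
            print("Tracking not allowed")
        @unknown default:
            print("Unknown tracking authorization status")
        }
    }
}
```

In other words, the choice really is yours: if you never tap “Allow Tracking,” the app has nothing device-wide to hand to advertisers.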

So it comes down to a question: Are you willing to pay for the extras apps provide in order to get a little more privacy?

John Sileo is a privacy keynote speaker, award-winning author and media personality as seen all over TV. He keynotes conferences virtually and in person around the world. John is the CEO of The Sileo Group, a business think tank based in Colorado.

Is WhatsApp Privacy a Big Fat Facebook Lie? What You Need to Know.


WhatsApp Privacy: Facebook’s New “Data Use” Policy

I have been getting a ton of questions about the privacy of the personal data you send through WhatsApp. Is Facebook, which owns WhatsApp, sharing everything you write, including all of your contacts, messages and behaviors? It’s not quite that simple, but neither is Facebook.

Facebook recently announced a new WhatsApp privacy policy, which created A LOT of confusion and user backlash. The changes caused such an uproar that Facebook ultimately decided to delay the release of the new WhatsApp privacy agreement from Feb. 8 to May 15 while it sorts things out. So let me give you a head start!

Behind all of this, WhatsApp is trying to break into the world of messaging for businesses (to compete with Slack and other programs). That way, when you communicate with a business, Facebook will see what you’re saying and use that information for advertising purposes.

Your Data That Can Be Accessed By Facebook

Facebook contends that your private messages will remain encrypted end-to-end, even from Facebook itself, but Facebook and WhatsApp will still have access to everything they’ve had access to since 2014:

  • Phone numbers being used
  • How often the app is opened
  • The operating system and resolution of the device screen
  • An estimate of your location at the time of use, based on your internet connection

Purportedly, Facebook won’t keep records of whom people are contacting in WhatsApp, and WhatsApp contacts aren’t shared with Facebook. Given Facebook’s miserable history with our personal privacy, I don’t actually believe it will limit information sharing to the degree it promises. I think this is one of those cases where the company will quietly violate our privacy until it is discovered, then ask forgiveness and lean on the fact that we have no legislation protecting us as consumers. But please be aware that if you use Facebook, you are already sharing a massive amount of information about yourself and your contacts; WhatsApp may just add another piece of data to your profile. Watch The Social Dilemma on Netflix if you’d like to learn more about how you are being used to power their profits.

Highly Private Messaging Alternatives to WhatsApp

So, while it is mostly a “cosmetic change” to the WhatsApp privacy policy, if you are uncomfortable using it, you may want to consider the following:

    • There are alternative messaging apps, including Signal and Telegram, both of which have seen huge new user sign-ups since the announcement. I personally use Apple Messages (daily communications) and Signal (highly confidential communications).
    • WhatsApp says it clearly labels conversations with businesses that use Facebook’s hosting services. Be on the lookout for those.
    • The feature that allows your shopping activity to be used to display related ads on Facebook and Instagram is optional and when you use it, WhatsApp “will tell you in the app how your data is being shared with Facebook.” Monitor it and opt out.
    • If you don’t want Facebook to target you with more ads based on your WhatsApp communication with businesses, just don’t use that feature.
    • Trust the WhatsApp messaging app as much as you trust Facebook, because ultimately, they are the same company.

John Sileo is a cybersecurity expert, privacy advocate, award-winning author and media personality as seen on 60 Minutes, Anderson Cooper and Fox & Friends. He keynotes conferences virtually and in person around the world. John is the CEO of The Sileo Group, a business think tank based in Colorado.

Telemedicine: Are Virtual Doctor Visits a Cyber & Privacy Risk?

The Trump administration has relaxed privacy requirements for telemedicine, or virtual doctor visits: medical staff treating patients over the phone and through video apps such as FaceTime, Zoom, Skype and Google Hangouts. The move raises the chances that hackers will be able to access patients’ highly sensitive medical data and use it, for example, to blackmail a patient into paying a ransom to keep their personal health information (PHI) private.

This relaxation of telemedicine privacy regulations is necessary: treating coronavirus patients in quick, safe, virtual ways is a more critical short-term priority than protecting the data. That may sound contradictory coming from the keyboard of a cybersecurity expert, but it exposes a misconception about how security works.

Security is not about eliminating all risk, because there is no such thing. Security is about prioritizing risk and controlling the most important operations first. Diagnosing and treating patients affected by Covid-19 is a higher priority than keeping every last transmission private.

Put simply, the life of a patient is more important than the patient’s data. With that in mind, protecting the data during transmission and when recordings are stored on the medical practice’s servers is still important.

  • Doctors should use audio/video services that provide end-to-end encryption between the patient and the medical office during all telemedicine visits
  • If the doctor’s office keeps a copy of the recording, it should be stored and backed up only on encrypted servers
  • Not all employees of the doctor’s office should have the same level of access to telemedicine recordings; all patient data should be protected with user-level access (see the sketch after this list)
  • Employees of the doctor’s office should be trained to repel social engineering attacks (mostly by phone and phishing email) that attempt to gain access to telemedicine recordings
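To make the “user-level access” bullet concrete, here is a deliberately simplified sketch in Swift. The type and role names are hypothetical (this is not any real electronic health record system); the point is simply that access to a recording should depend on both a staff member’s role and whether they actually treat that patient.

```swift
import Foundation

// Hypothetical roles inside a medical practice (illustrative only).
enum StaffRole {
    case physician, nurse, billing, frontDesk
}

struct StaffMember {
    let name: String
    let role: StaffRole
    let assignedPatients: Set<String>   // IDs of patients this clinician treats
}

// A telemedicine recording tagged with the patient it belongs to,
// stored only on an encrypted server.
struct TelemedicineRecording {
    let patientID: String
    let encryptedFileURL: URL
}

// User-level access: only clinicians actively treating the patient may open
// the recording; billing and front-desk staff never can.
func canAccess(_ staff: StaffMember, _ recording: TelemedicineRecording) -> Bool {
    switch staff.role {
    case .physician, .nurse:
        return staff.assignedPatients.contains(recording.patientID)
    case .billing, .frontDesk:
        return false
    }
}
```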

Telemedicine and virtual doctor visits are just one way the government is willing to accept increased risk during the pandemic. Many federal employees are also now working remotely, accessing sensitive data, often on personal computers that haven’t been properly protected by cybersecurity experts. That poses an even greater problem than putting patient data at risk, because nearly every government (and corporate) employee will be working remotely for the foreseeable future. I will address those concerns in an upcoming post.

In the meantime, stay safe in all ways possible.


About Cybersecurity Keynote Speaker John Sileo

John Sileo is the founder and CEO of The Sileo Group, a privacy and cybersecurity think tank, in Lakewood, Colorado, and an award-winning author, keynote speaker, and expert on technology, surveillance economy, cybersecurity and tech/life balance.

Private Eyes Are Watching You: What it Means to Live (and Be Watched) in the Surveillance Economy


What Is the Surveillance Economy?

How do you feel about the fact that Facebook knows your weight, your height, your blood pressure, the dates of your menstrual cycle, when you have sex and maybe even whether you got pregnant? Even when you’re not on Facebook, the company is still tracking you as you move across the internet. It knows what shape you’re in from the exercise patterns on your fitness device, when you open your Ring doorbell app and which videos you check out on YouTube — or more salacious sites.

Welcome to the surveillance economy — where our personal data and online activity are not only tracked but sold and used to manipulate us. As Shoshana Zuboff, who coined the term surveillance capitalism, recently wrote, “Surveillance capitalism begins by unilaterally staking a claim to private human experience as free raw material for translation into behavioral data. Our lives are rendered as data flows.” In other words, in the vast world of internet commerce, we are the producers and our digital exhaust is the product. 

It didn’t have to be this way. Back when the internet was in its infancy, the government could have regulated the tech companies but instead trusted them to regulate themselves. Over two decades later, we’re just learning about the massive amounts of personal data these tech giants have amassed, but it’s too late to put the genie back in the bottle. 

The game is rigged. We can’t live and compete and communicate without the technology, yet we forfeit all our rights to privacy if we take part. It’s a false choice. In fact, it’s no choice at all. You may delete Facebook and shop at the local mall instead of Amazon, but your TV, fridge, car and even your bed may still be sharing your private data. 

As for self-regulation, companies may pay lip service to a public that is increasingly fed up with the intrusiveness, but big tech and corporate America continue to quietly mine our data. And they have no incentive to reveal how much they’re learning about us. In fact, the more they share the knowledge, the lower their profits go. 

This is one of those distasteful situations where legislation and regulation are the only effective ways to balance the power. Because as individuals, we can’t compete with the knowledge and wallet of Google, Facebook and Amazon. David versus Goliath situations like this were the genesis of government in the first place. But in 2020, can we rely on the government to protect us? 

Unlikely. At least for now. For starters, federal government agencies and local law enforcement use the same technology (including facial recognition software) to collect data and track our every move. And unfortunately, those who make up the government are generally among the new knowledge class whose 401(k)s directly benefit from keeping quiet while the tech giants grow. Plus, there are some real benefits to ethical uses of the technology (think tracking terrorists), making regulation a difficult beast to tackle. But it’s well worth tackling anyway, just as we’ve done with nuclear submarines and airline safety.

In a recent Pew study, 62% of Americans said it was impossible to go through daily life without companies collecting data about them, and 81% said the risks of companies collecting data outweigh the benefits. The same number said they have little or no control over the data companies collect. 

At some stage, consumers will get fed up and want to take back control from the surveillance economy, and the pendulum will swing, as it already has in Europe, where citizens have a toolbox full of privacy tools to prevent internet tracking, including the right to be forgotten by businesses. Europe’s General Data Protection Regulation (GDPR) is a clear reminder that consumers do retain the power, but only if they choose to use it. It’s not inevitable that our every move and personal data are sold to the highest bidder. We’ve happily signed on, logged in and digitized our way to this point.

When consumers (that means you) are outraged enough, the government will be forced to step in. Unfortunately, at that point, the regulation is likely to be overly restrictive, and both sides will wish we’d come to some compromise before we wrecked the system. 

In the meantime, you have three basic choices: 

  1. Decrease your digital exhaust by eliminating or limiting the number of social media sites, devices and apps you use. (I know, I know. Not likely.)
  2. Change your privacy and security defaults on each device, app and website that collects your personal information. (More likely. But it takes a time investment and doesn’t fully solve privacy leakage.)
  3. Give in. Some people are willing to bet that a loss of privacy will never come back to haunt them. That’s exactly the level of complacency big tech companies have instilled in us using neuroscience for the past decade.  

Loss of privacy is a slippery slope, and it’s important to take the issue seriously before things get worse. Left unchecked, the private eyes watching your every move could go from tracking your exercise habits and sex life (as if that’s not creepy enough) to meddling with your ability to get health insurance or a mortgage. And suddenly it won’t seem so harmless anymore.


About Cybersecurity Keynote Speaker John Sileo

John Sileo is the founder and CEO of The Sileo Group, a privacy and cybersecurity think tank, in Lakewood, Colorado, and an award-winning author, keynote speaker, and expert on technology, surveillance economy, cybersecurity and tech/life balance.

 

Google Isn’t Just Buying Fitbit, They’re Tracking Your Donut Habit


Spinning Wildly on the Hamster Wheel of the Surveillance Economy

You’re heading to the gym for a workout when you decide to surprise your coworkers with a treat. You search for the nearest bagel shop on your Google Maps app. The app directs you to its closest advertiser, Donut Feel Good?, which is actually a donut shop just short of the bagel place. Your heart pounds from the joy of anticipation — your team will LOVE you (and the sugar rush).

Just as you’re leaving the donut place, your phone alerts you to a coupon at your favorite coffee shop. “Why not?” you think, as Google nudges your behavior just a bit more. As you bite into your first donut and bask in coworker glory, Google is busy sharing your lack of exercise and poor eating habits with your health insurance company, which also has an app on your phone.  

Welcome to the surveillance economy, where the product is your data.

Acquiring Fitbit Moves Google Out of Your Pocket and Into Your Body 

Thanks to Google’s purchase of Fitbit, Google doesn’t just know your location, your destination and your purchases, it now knows your resting heart rate and increased beats per minute as you anticipate that first donut bite. Google is at the forefront of the surveillance economy — making money by harvesting the digital exhaust we all emit just living our lives. 

Google already has reams of data on our internet searches (Google.com), location data (Maps and Android phones), emails and contacts (Gmail), home conversations and digital assistant searches (Google Home), video habits (YouTube), smart-home video footage and thermostat settings (Nest) and document contents (Docs, Sheets, etc.). The sheer volume of our digital exhaust that they’re collecting, analyzing and selling is phenomenal.

Combine that psychographic and behavioral data with the health data of 28 million Fitbit users, and Google can probably predict when you’ll need to use the toilet. 

Fitbit tracks what users eat, how much they weigh and exercise, the duration and quality of their sleep and their heart rate. With advanced devices, women can log menstrual cycles. Fitbit scales keep track of body mass index and what percentage of a user’s weight is fat. And the app (no device required) tracks all of that, plus blood sugar.  

It’s not a stretch of the imagination to think Fitbit and other health-tracking devices also know your sexual activity and heart irregularities by location (e.g., your heart rate goes up when you pass the Tesla dealership, a car you’ve always wanted). Google wants to get its hands on all that information, and if past behavior is any indicator, they want to sell access to it. 

As Reuters noted, much of Fitbit’s value “may now lie in its health data.”

Can We Trust How Google Uses Our Health Data? 

Regarding the sale, Fitbit said, “Consumer trust is paramount to Fitbit. Strong privacy and security guidelines have been part of Fitbit’s DNA since day one, and this will not change.” 

But can we trust that promise? This is a common tactic of data use policy scope creep: once we stop paying attention and want to start using our Fitbit again, the company will change its policies and start sharing customer data. They’ll notify us in a multipage email that links to a hundred-page policy that we’ll never read. Even if we do take the time to read it, are we going to be able to give up our Fitbit? We’ve seen this tactic play out again and again with Google, Facebook and a host of other companies.

Google put out its own statement, assuring customers the company would never sell personal information and that Fitbit health and wellness data would not be used in its advertising. The statement said Fitbit customers have the power to review, move or delete their data, but California is the only U.S. state that can require the company to do so by law — under the California Consumer Privacy Act (CCPA), set to go into effect next year.

Tellingly, Google stopped short of saying the data won’t be used for purposes other than advertising. Nor did they say they won’t categorize you into a genericized buyer’s profile (Overweight, Underfit & Obsessed with Donuts) that can be sold to their partners.

And advertisements are just the tip of the iceberg. Google can use the data for research and to develop health care products, which means it will have an enormous influence on the types of products that are developed, including pharmaceuticals. If that isn’t troubling to you, remember that Google (and big pharma) are in business to make money, not serve the public good. 

Google Has Demonstrated Repeatedly That It Can’t Be Trusted with Our Data

Just this week, we learned that Google has been quietly working with St. Louis-based Ascension, the second-largest health system in the U.S., collecting and aggregating the detailed health information of millions of Americans in 21 states. 

Code-named Project Nightingale, the secret collaboration began last year and, as the Wall Street Journal reported, “The data involved in the initiative encompasses lab results, doctor diagnoses and hospitalization records, among other categories, and amounts to a complete health history, including patient names and dates of birth.”

The Journal also reported that neither the doctors nor the patients involved were notified, and that at least 150 Google employees have access to the personal health data of tens of millions of patients. Google is reportedly using the data to develop software (driven by AI and machine learning) “that zeroes in on individual patients to suggest changes to their care.” It was originally reported that the arrangement was legal under a 1996 law that allows hospitals to share data with business partners without patients’ consent.

However, the day after the story broke, a federal inquiry was launched into Project Nightingale. The Office for Civil Rights in the Department of Health and Human Services is looking into whether HIPAA protections were fully implemented in accordance with the 1996 law.

Your Health Insurance Could Be at Stake

Likewise, Fitbit has been selling devices to employees through corporate wellness programs for years and has teamed up with health insurers, including United Healthcare, Humana and Blue Cross Blue Shield.

Even if individual data from Fitbit users isn’t shared, Google can use it to deduce all sorts of health trends. It’s also possible that “anonymous” information can be re-identified, meaning data can be matched with individual users. This sets up a scenario where we can be denied health care coverage or charged higher premiums based on data gathered on our eating or exercise habits. 
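How does re-identification work? Here is a deliberately simplified sketch in Swift with made-up type and field names and no real data: a handful of quasi-identifiers (ZIP code, birth year, sex) shared between an “anonymous” fitness data set and a public, named data set is often enough to put a name back on a record.

```swift
import Foundation

// Purely illustrative re-identification sketch; all types and fields are hypothetical.
struct AnonymizedFitnessRecord {
    let zip: String
    let birthYear: Int
    let sex: String
    let avgDailySteps: Int
}

struct PublicRecord {          // e.g., a voter roll or data-broker file that carries names
    let name: String
    let zip: String
    let birthYear: Int
    let sex: String
}

// Join the two data sets on the shared quasi-identifiers. When exactly one named
// person matches, the "anonymous" fitness record is no longer anonymous.
func reidentify(_ fitness: [AnonymizedFitnessRecord],
                against publicRecords: [PublicRecord]) -> [(name: String, steps: Int)] {
    return fitness.compactMap { record -> (name: String, steps: Int)? in
        let matches = publicRecords.filter {
            $0.zip == record.zip && $0.birthYear == record.birthYear && $0.sex == record.sex
        }
        return matches.count == 1 ? (name: matches[0].name, steps: record.avgDailySteps) : nil
    }
}
```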

Now couple that with data on what foods we buy, where we go on vacation and our most recent Google searches, and companies will not only be able to track our behavior, they’ll be able to predict it. This kind of digital profile makes a credit report look quaint by comparison.

Get Off the Hamster Wheel

For the time being, you control many of the inputs that fuel the surveillance economy. You can choose to take off your Fitbit. You can change the default privacy settings on your phone. You can delete apps that track your fitness and health, buy scales that don’t connect to the internet and opt out of information sharing for the apps and devices you must use. Your greatest tool in the fight for privacy is your intentional use of technology.

In other words, you do have a measure of control over your data. Donut Feel Good?


About Cybersecurity Keynote Speaker John Sileo

John Sileo is the founder and CEO of The Sileo Group, a privacy and cybersecurity think tank, in Lakewood, Colorado, and an award-winning author, keynote speaker, and expert on technology, cybersecurity and tech/life balance.