In my post on why mass surveillance is not normal, I referenced how the Wikipedia page for the Nothing to hide argument labels the argument as a “logical fallacy.” On October 19th, user Gratecznik edited the Wikipedia page to remove the “logical fallacy” text. I am here to prove that the “Nothing to hide” argument is indeed a logical fallacy and go through some arguments against it.

The “Nothing to hide” argument is an intuitive but misleading argument, stating that if a person has done nothing unethical, unlawful, immoral, etc., then there is no reason to hide any of their actions or information. However, this argument has been well covered already and debunked many times (here is one example).

Besides the cost of what it takes for someone to never hide anything, there are many reasons why a person may not want to share information about themselves, even if no misconduct has taken place. The “Nothing to hide” argument implicitly assumes that those you share your information with will handle it with care and never use it against you unfairly. Unfortunately, that is not how the real world currently works.

You don’t get to make the rules on what is and is not deemed unlawful. Something you do may be ethical or moral yet still unlawful, and it could cost you if you aren’t able to hide it. For example, whistleblowers try to expose government misconduct. That is an ethical and moral goal, but it does not align with government interests. Therefore, if the whistleblower is not able to hide their actions, they have reason to fear the government or other parties. The whistleblower has something to hide, even though it is not unethical or immoral.

You are likely not a whistleblower, so you have nothing to hide, right? As stated before, you don’t get to make the rules on what is and is not deemed unlawful. Anything you say or do could be used against you. Holding a certain religion or viewpoint may be legal now, but if it is outlawed one day, you will wish you had hidden it.

Just because you have nothing to hide doesn’t mean you are obligated to share everything. Privacy is a basic human right (at least until someone edits Wikipedia to say otherwise), so you shouldn’t be forced to trust whoever asks just because you have nothing to hide.

For completeness, here is a proof, using propositional calculus, that the “Nothing to hide” argument is a logical fallacy:

Let p be the proposition “I have nothing to hide”

Let q be the proposition “I should not be concerned about surveillance”

You can represent the “Nothing to hide” argument as follows:

p → q

I will provide a proof by counterexample. Suppose p is true, but q is false (i.e. “I have nothing to hide,” yet “I should still be concerned about surveillance”):

p ∧ ¬q

Someone may have nothing to hide yet still have reason to be concerned about the state of surveillance. Since that scenario is possible, p does not entail q, and the “Nothing to hide” argument is invalid (a logical fallacy).
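If you would rather check this mechanically, here is a small truth-table sketch (Python, purely illustrative; the variable names are mine, not part of the original proof) that enumerates every assignment of p and q and highlights the one that breaks the implication:

```python
from itertools import product

# p: "I have nothing to hide"
# q: "I should not be concerned about surveillance"
for p, q in product([True, False], repeat=2):
    implication = (not p) or q       # p → q, the "Nothing to hide" argument
    counterexample = p and (not q)   # p ∧ ¬q, the scenario used in the proof
    print(f"p={p!s:<5} q={q!s:<5}  p→q={implication!s:<5}  p∧¬q={counterexample}")

# The row p=True, q=False makes p → q false: "I have nothing to hide"
# does not entail "I should not be concerned about surveillance".
```

The printout simply shows that p → q is not a tautology, so concluding q from p alone is not a valid inference.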

I know someone is going to try to rip that proof apart. If anyone is an editor on Wikipedia, please revert the edit that removed the “logical fallacy” text, as it provides a very easy and direct way for people to cite that the “Nothing to hide” argument is a fallacy.

Thanks for reading!

- The 8232 Project

  • The 8232 Project@lemmy.ml (OP) · 18 days ago

    The issue I’ve run into with this approach is that people do have something to hide from you, just not from the government or large corporations. When they feel like one person in a huge pool, being watched seems less significant than being singled out by you.

    You could instead try something similar: “Why does the FBI need to know what color underwear you wear?” and so on, to help them realize that surveillance goes much deeper than they think, and that not everything is relevant information.

    • shield_87@lemmy.eco.br · 17 days ago

      yep. I use that argument sometimes, but it really depends on the person whether the “then give me your email/chat history/etc.” argument will work.

      and just like you said, people don’t want you reading it. They wouldn’t want to see you looking through their phone. But in the context of technology, it’s very abstract. Like, when Instagram’s chats weren’t encrypted, telling someone “Instagram can read them” may sound vague. They don’t imagine how, say, Meta employees could have access to it, or how software behind it could analyze their chats. That could all be happening, but “out of sight, out of mind!” really helps people tolerate those possibilities. The frontend, the chat interface, looks OK, so yeah.

      I don’t think I could explain it very well, so my bad (not a native speaker), but yeah, I feel like because we are not encouraged to think about how the software works behind the curtain, it’s easy to assume “well, I’m not a target, so why should I worry about doing this [privacy thing]?”