Picture this:
- You search Google for “laptop won’t turn on”
- Google now knows you have a broken laptop and can estimate how desperate you are to fix it.
- Because it knows how desperate you are, it can increase shop prices proportionally.
You are going to pay the maximum they can get you to pay.
That’s algorithmic pricing.
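Stripped down, the mechanism is almost trivially simple. A toy sketch, with invented names and numbers (real systems use far richer signals than a single score):

```python
# Toy illustration of intent-based dynamic pricing. The function name,
# the "desperation" score, and the 30% markup cap are all made up.

def quote_price(base_price: float, desperation: float) -> float:
    """Scale the price by an inferred 'desperation' score in [0, 1]."""
    max_markup = 0.30  # charge up to 30% more to a desperate buyer
    return round(base_price * (1 + max_markup * desperation), 2)

# A casual browser vs. someone whose laptop just died:
print(quote_price(100.0, 0.0))  # 100.0
print(quote_price(100.0, 1.0))  # 130.0
```

The hard part isn't this formula, it's estimating the score, and that's exactly what your search and behavioural data feeds.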
The more companies know about you, the better they can predict how desperate you are, and the more they can sell that prediction to other stores out there.
An internet-connected car knows far more about you than you realize. A smart TV knows what you like. Your Alexa knows when something is wrong in your home.
Privacy is much more than just sensitive data.
It’s about not giving leverage away.
Because algorithms will use it against you.
Be safe out there.
In a past life I wrote the software that did this.
It’s not just about charging more when you’re desperate. It’s also things like charging you less to keep you addicted, or to get you hooked in the first place. It exploits your emotions and behaviour to be effective. A small loss on you now can be a long-term gain for them.
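The "small loss now, long-term gain" side can be sketched just as simply. Again a toy, with invented names and numbers (real systems model lifetime value with far more inputs):

```python
# Toy illustration of loss-leader "hook" pricing: eat a loss today if
# the predicted future revenue from a retained customer outweighs it.
# Names and thresholds are hypothetical.

def should_discount(predicted_lifetime_value: float,
                    discount_cost: float) -> bool:
    """Offer a below-cost deal when expected future revenue exceeds the loss."""
    return predicted_lifetime_value > discount_cost

# A customer predicted to spend $500 over time vs. a $20 loss now:
print(should_discount(500.0, 20.0))  # True: worth hooking them
print(should_discount(10.0, 20.0))   # False: not worth the loss
```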
The data available back then was pretty minimal, effectively only the data we generated. But it was still enough to prey on your lizard brain. With data brokerage I’ve got no idea what level of evils we could have done.
Thanks for ‘coming out’ about it. Without doxing yourself too heavily, would you mind sharing more about the industry in particular, or how these practices were measured? Do you know if it was common (and when was this)?
I know for sure that we can’t trust companies to act in our best interests (if anything, it’s a hostile relationship), but I guess I’m curious about your inside perspective. Has that jaded you much at all?