Big Tech lacks Big Trust
I’m a science fiction fan. This is relevant because, like many other fans of the genre, I love technology. I see it as the answer to so many of our world’s problems, or even perceived problems.
Or, it can just make life a little nicer.
I remember when VHS made life easier because you could watch movies whenever you wanted, not just when you got lucky and caught them on TV somewhere. Then slimmer DVDs replaced those tapes and made storage easier, Blu-ray did the same at much higher quality, and now streaming means you don’t even have to have the movie in your possession.
But technology companies, known by some as “Big Tech,” have a problem that they could solve, only they don’t want to.
They’re untrustworthy.
Let’s start with something easy, namely search engines.
Google was a game-changer when it first hit the internet. For the first time, I could actually find content I was interested in instead of stuff with some vague connection that was actually pretty different. The company went from a simple search engine to a Silicon Valley giant.
There’s just a problem. They stack the deck with their search algorithm.
I’ve seen it before, but not quite as categorically as RedState notes in a recent piece.
On a sun-drenched February morning, Taylor Lorenz fixed her gaze on her target. Lorenz’s doll-like eyes stared at Libs of TikTok creator Chaya Raichik. Lorenz’s eyes were the only facial feature visible. Her mouth and nose were completely hidden under her ubiquitous and entirely useless black mask. Raichik was wearing a T-shirt with a 10 x 12-inch photo of Lorenz’s crying face emblazoned on it. Before Lorenz could open her muffled mouth, Raichik was trolling Lorenz with her shirt.
The interview was conducted at an open-air café in LA. Three years removed from COVID-mania, Lorenz was still wearing a cloth mask. Who got the “better” of the interview? Objectively, it was Raichik. Nonetheless, a “Google” search using: “Taylor Lorenz and Raichik and Interview” will yield almost entirely leftist takes, asserting that Lorenz nuked Raichik. The first 39 links are all leftist websites. My search term caused Apple to jump into my browser search with a popup. It suggested that I read the gay mag “The Advocate.”
First on Google’s list was Lorenz’s own YouTube channel. Google’s next five suggestions are “Poynter” (leftist publisher and owner of Politifact), The Daily Beast, Mediate, The Forward, and again, The Advocate.
One must scroll down a considerable amount before reaching the first site that suggested Raichik held her own. “The Post Millennial” was the first, in 40th place. Townhall and “RedState” were nowhere to be seen.
That isn’t a bug; it’s a feature.
It is.
It should be noted how rarely anything I write for Bearing Arms shows up when I search for gun-related topics (read: never).
Now, I’m not a programmer, but it seems to me that while it would be difficult to write an algorithm that automatically favors one version of a particular story over another based on the content itself, it’s trivial to boost certain sites over others. Particularly if you want to advance a particular narrative, such as the one favored by people who would think Taylor Lorenz could lay a beatdown on anyone.
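To illustrate just how trivial that kind of boosting would be, here’s a minimal sketch in Python. To be clear, every name, site, and weight below is made up for illustration; this is not Google’s actual code, just the general shape of the idea: relevance scores get multiplied by a per-domain weight before results are sorted.

```python
# Hypothetical per-domain weights. A weight above 1.0 lifts a site;
# a weight near 0 effectively buries it. All values are invented.
DOMAIN_BOOST = {
    "favored-outlet.example": 2.0,
    "disfavored-site.example": 0.1,
}

def rank(results):
    """Re-rank (domain, relevance_score) pairs by boosted score."""
    def boosted(result):
        domain, score = result
        # Sites not in the table keep a neutral weight of 1.0.
        return score * DOMAIN_BOOST.get(domain, 1.0)
    return sorted(results, key=boosted, reverse=True)

# On content relevance alone, the disfavored site would rank first.
results = [
    ("disfavored-site.example", 0.95),
    ("favored-outlet.example", 0.60),
    ("neutral-site.example", 0.70),
]

print(rank(results))
```

After boosting, the favored outlet jumps to the top and the most relevant result sinks to the bottom, without anyone ever writing code that understands the story itself.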
But this is only one example out of many.
For example, some readers might remember me writing about why there’s a problem with smart houses. Long story short: an Amazon driver said he heard a racist comment, so Amazon shut down the dude’s smart home, which ran on Alexa.
Now, I get that a service like Alexa or Google being the foundation for a smart home makes financial sense. No one has to wire literally everything and code up a custom version of Jarvis from the MCU.
But Amazon simply accepted what this driver thought he heard and shut the customer’s entire service down. They didn’t investigate, ask any questions, or even recognize that it wasn’t their place to dictate what people think. They could have told him he couldn’t have delivery direct from Amazon again if they were concerned about driver safety, but that’s not what they did. They cut him off entirely based on hearsay.
Amazon was trusted to run this dude’s house, and they proved they were unworthy of that trust.
See, the problem with Big Tech is that because they sell us services instead of products, they feel they can use those services as leverage to make us do their bidding. While Google’s search algorithm might help you find what you want if you search just right, if you’re not careful, it’ll push you toward the leftist narrative and try to tell you what to think. If you don’t follow Amazon’s version of morality, they’ll cut you off from using their service to run all the actual products you’ve bought.
These are, of course, just a handful of examples, but it doesn’t take much to find still more.
I’m not an early adopter of technology, but I love the idea of having all of these wonderful new things we keep seeing. The problem is that no one should trust these companies because they have proven they can’t be trusted.
Buy products, not services, where you can, but we also need alternative services where a service is the only viable option. We cannot let these corporate interests tell us what we’re to think or how we’re to act in our own homes, even if we want to be vile, filthy individuals.
Our rights exist. While these companies aren’t required to bow down to our whims, we shouldn’t be pressured to bow down to theirs, either.
We need companies we can trust. It would be a nice change of pace.
Tilting at Windmills is 100% reader-supported. If you enjoyed this article, please consider upgrading to a paid subscription for 15% off the first year or making a one-time donation here. Your support is greatly appreciated.