Pluralistic: 06 Apr 2021

Originally published at: https://pluralistic.net/2021/04/06/digital-phrenology/


Today's links



Podcasting How to Destroy Surveillance Capitalism (permalink)

This week on my podcast, the first part of a five (?) part serialized reading of my 2020 OneZero book HOW TO DESTROY SURVEILLANCE CAPITALISM, a book arguing that monopoly – not AI-based brainwashing – is the real way that tech controls our behavior.

https://onezero.medium.com/how-to-destroy-surveillance-capitalism-8135e6744d59

The book is available in paperback:

https://bookshop.org/books/how-to-destroy-surveillance-capitalism/9781736205907

and DRM-free ebook:

https://sowl.co/bm2F7c

and my local bookseller, Dark Delicacies, has signed stock that I'll drop by and personalize for you!

https://www.darkdel.com/store/p2024/Available_Now%3A__How_to_Destroy_Surveillance_Capitalism.html

Here's the podcast episode:

https://craphound.com/nonficbooks/destroy/2021/04/05/how-to-destroy-surveillance-capitalism-part-01/

and a direct link to the MP3 (hosting courtesy of the Internet Archive; they'll host your stuff for free, forever):

https://archive.org/download/Cory_Doctorow_Podcast_383/Cory_Doctorow_Podcast_383_-_How_To_Destroy_Surveillance_Capitalism_01.mp3

and here's the RSS feed for my podcast:

https://feeds.feedburner.com/doctorow_podcast



The real cancel culture (permalink)

"Cancel culture" – the prospect of permanent exclusion from your chosen profession due to some flaw – has been a fixture in blue-collar labor since the 1930s, as Nathan Newman writes in The American Prospect.

https://prospect.org/labor/how-workers-really-get-canceled-on-the-job/

In the 1930s, employers who wanted to keep labor "agitators" out of their shops adapted the WWI recruitment screening tools to identify "disgruntled" applicants who might organize their co-workers and form a union.

Over the years, this developed into a phrenological-industrial complex, with a huge industry of personality test companies that help employers – especially large employers of low-waged workers – exclude those they judged likely to demand better working conditions.

What began with large firms like Walmart and Marriott grew to consume much of the economy, with 80% of the Fortune 500 relying on tests from the $3+b/year phrenology industry, which is now all digital, incorporating machine learning for an all-algorithmic cancel culture.

The results of these tests get warehoused by giant "HR" companies like Kenexa (bought by IBM for $1.3b, holding 20m test results) and UKG (owned by private equity, with hundreds of millions of worker records).

The latest wrinkle includes junk-science "microexpression" analysis, with applicants being assessed by an algorithm that purports to be able to read their minds by examining minute cues from their faces – a discredited idea with no basis in science.

Indeed, the whole business of personality tests – and the more general field of psychographics, with touchstones like the "Big Five" personality traits – is more marketing hype than science; Nature calls it a "scant science."

https://www.nature.com/articles/d41586-018-03880-4

Which probably explains why job satisfaction – the thing that all this phrenology is supposed to improve – has remained static since 2000, despite vast spending on career-destroying, life-destroying digital palmistry.

https://www.inc.com/sonia-thompson/68-percent-of-employees-are-disengaged-but-there-i.html

So why do employers do it? Well, as is often the case with algorithmic decision-support tools, the most tangible benefit is empiricism-washing. Algorithms provide cover in the form of empirical facewash for illegal employment discrimination.

An employer's personality test can facilitate illegal discrimination against people with depression, for example, by asking whether "your moods are steady from day to day," and video-based screening can exclude people on the autism spectrum.

Personality assessment also provides cover for the ongoing use of disciplinary technology, such as the bossware that spies on your keystrokes and other online activity, which exploded during lockdown as "work from home" was transformed into "live at work."

Employers can claim the ongoing surveillance is there to help measure and improve job satisfaction, while the phrenology-industrial complex sales reps quietly promise that they'll catch and expel "disgruntled" workers – those apt to organize a union.

Workers won legal battles to ban workplace use of polygraphs, medical exams, genetic screening, credit reports, criminal background checks, and disclosure of social media passwords – but personality screening filled the void, allowing discrimination through the back-door.

Newman thinks the National Labor Relations Board has the authority to step in here and prohibit this kind of personality screening, both prior to hiring and on the job.

"If we are going to have a national debate about free speech in the workplace, stopping the use of personality tests to cancel 'disgruntled' workers should be front and center."

(Image: Wellcome Trust; Cryteria, CC BY, modified)



Ad-tech's algorithmic cruelty (permalink)

The wife of one of my elementary school teachers once delivered a full-term, stillborn baby. It was a great tragedy, but far worse came in the months and years that followed, as direct-marketers bombarded them with pitches that tracked the progress of their dead child.

College-savings plan ads, ads for baby food, annual birthday notices – the whole thing running on autopilot as marketers pursued the Procter & Gamble "lifecycle marketing" playbook that targets the turning points in customers' lives, like parenthood.

This got automated. In 2014, Eric Meyer coined the term "inadvertent algorithmic cruelty" to describe his experience of Facebook's "memories" feature, which bombarded him with pictures of his young daughter on the anniversary of her death.

http://meyerweb.com/eric/thoughts/2014/12/24/inadvertent-algorithmic-cruelty/

Meyer called it "inadvertent," but there's a strong argument to drop that and simply call it "algorithmic cruelty." Facebook should have known that promoting "high-engagement" posts would end up retraumatizing people on the anniversaries of the worst moments in their lives.

And if the company didn't realize it in 2014, they certainly knew about it after, and did not stop. In 2018, Patrick Gerard wrote about how Facebook commemorated his mother's death with a video of animated characters literally dancing on her grave.

https://twitter.com/PatrickGerard01/status/1031920228098355200

Algorithmic cruelty spread to other platforms: for example, Google's smart address book began adding women's stalkers to their speed-dials, sensing a high degree of mutual interactivity:

https://www.wired.com/story/the-problem-with-your-chatty-apps/

The problems of algorithmic cruelty – the predictable ghastliness of a fire-and-forget system of idiotic, automatic cheer – have long been a feature of science fiction.

Think of Bradbury's classic "There Will Come Soft Rains," where an empty house cheerfully greets its dead owners with their daily routine after a nuclear war has killed nearly every living thing.

https://www.btboces.org/Downloads/7_There%20Will%20Come%20Soft%20Rains%20by%20Ray%20Bradbury.pdf

Or David Marusek's pioneering, haunting story "The Wedding Album," about the AI avatars of a couple, created to commemorate their wedding day, outliving the couple and haunting virtual spaces for thousands of years:

https://gumroad.com/davidmarusek

Or Sarah Gailey's instant classic 2018 short story STET, which recounts a particularly horrific sort of algorithmic cruelty in the editorial notes on a scholarly paper about a self-driving car wreck:

https://firesidefiction.com/stet

None of these warnings were heeded. Indeed, algorithmic cruelty – incubated in primitive direct marketing, supercharged by social media – made the jump back to ad-tech, in a form that is thousands of times more virulent than its prehistoric paper-based ancestor.

Writing in Wired, Lauren Goode describes the ad-tech algorithmic cruelty trap she found herself in: eight years ago, she called off her wedding. Today, she is still bombarded with messages that track the progress of a marriage that never happened.

https://www.wired.com/story/weddings-social-media-apps-photos-memories-miscarriage-problem/

These are the product of the "memory monetization machine," which surfaces your old social-media breadcrumbs as inventory for spot-market advertising auctions: "This user got married eight years ago, who will pay me top dollar to show them an ad?"

Naturally, this has all the failure modes of social memory monetization – the dead children and parents, and commemorations of other traumas – but with ad-tech's nonconsensual, eternal torture: you can quit Facebook, but you can't control these background processes.

Goode quotes Kate Eichhorn, whose book THE END OF FORGETTING describes how this nonconsensual external memory system disrupts the "memory editing" that is key to overcoming trauma for the most marginalized among us:

https://www.hup.harvard.edu/catalog.php?isbn=9780674976696

Reading that, I was struck by the distance between the algorithmic cruelty of nonconsensual memory-surfacing, and my own powerful, hugely beneficial practice of combing through my own digital history, which is in a database under my control – my 20-year blog archive.

For a decade, I've started each day by looking at my posts from this day in the past – at first, it was #1yrago and #5yrsago – now I look back at #15yrsago and #20yrsago, and republish the elements that seem significant today. Here's yesterday's:

https://pluralistic.net/2021/04/05/zucks-oily-rags/#retro

I can't overstate how beneficial this is: tracking my own predictions, concerns and aspirations over time is an incredible tonic for anxiety, a tool to refine and improve my goals, an empirical, external check on my memories and feelings about where I am and where I've been.

It's like a subspecies of Cognitive Behavioral Therapy, the process of writing down your worries and aspirations, then revisiting them after the fact to refine your understanding of when your intuition leads you true…or astray.

The difference between what I do and algorithmic cruelty isn't technology – it's control. I'm in charge, not an unaccountable, nonconsensual algorithm.

As is often the case with tech issues, the important thing isn't what the tech does, it's who it does it to and for.

Indeed, thinking this through this morning made me realize how much I'd like to revisit my photos every day; I've got 20 years' worth of them stashed on Flickr, where I was literally one of the first users:

https://memex.craphound.com/2018/04/21/family-owned-smugmug-acquires-flickr-rescuing-it-from-the-sinking-post-yahoo-ship/

I tried it this morning, but Flickr's tools remain incredibly primitive thanks to years of neglect under Yahoo's ownership. Its new owners, Smugmug, have been making great strides, but they have a LOT of technology debt to pay off.

But having manually pulled up photos from this day 5, 10, 15 and 20 years ago, I was absolutely delighted. I would welcome a Flickr change that made it simple to see pics from a given date – maybe by editing the URL itself (currently a mess!):

https://www.flickr.com/search/?text=&min_taken_date=1586156400&max_taken_date=1586242799
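For the curious: those min_taken_date/max_taken_date parameters are just Unix timestamps bounding a single day, so you can build these URLs yourself. Here's a minimal sketch in Python – the function name and structure are mine, not anything Flickr ships, and it naively assumes local time and ignores edge cases like leap days:

```python
from datetime import datetime, timedelta

def flickr_on_this_day_url(years_ago, today=None):
    """Build a Flickr search URL for photos taken on this calendar day
    `years_ago` years in the past. min_taken_date/max_taken_date are
    Unix timestamps bounding that day, matching the format Flickr's
    search page uses. Note: this will raise ValueError if `today` is
    Feb 29 and the target year isn't a leap year."""
    today = today or datetime.now()
    # Local midnight on the same month/day, `years_ago` years back.
    target = today.replace(year=today.year - years_ago,
                           hour=0, minute=0, second=0, microsecond=0)
    start = int(target.timestamp())
    # Last second of that day: midnight of the next day, minus one.
    end = int((target + timedelta(days=1)).timestamp()) - 1
    return (f"https://www.flickr.com/search/?text="
            f"&min_taken_date={start}&max_taken_date={end}")
```

Calling it in a loop over (5, 10, 15, 20) would generate the four searches I pulled up by hand this morning.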

The point I'm trying to make here is that we shouldn't mistake the ability to revisit your past experiences and thoughts for algorithmic cruelty – the answer to this cruelty isn't to destroy our digital time-machines; it's to seize the means of computation.

(Image: Cryteria, CC BY, modified)



Folio Society publishes Philip K Dick's short fiction (permalink)

I love books. I have many, many thousands of books. I was a bookseller and a library worker. I write books. I am typing these words in my backyard hammock as the sun rises, and scattered around me on the ground are eleven books that I'm in the midst of reading.

I love books as objects for delivering type to my eyeballs; long-form reading is so much easier with print, despite my worsening visual disability. Reading on a screen is haunted by the omnipresent fact that one tap away is a TikTok video of a guy shoving a lemon up his nose.

But I also love books as artifacts: old pulps redolent of the moisture they've absorbed, bus transfers and pawn tickets hidden in their pages; beautiful first editions, unwieldy art-books with heavy, clay-coated stock. I just love books.

I have an especial soft spot for really fancy books, limited-edition hardcovers. Beehive Books has done astounding work producing super-deluxe slipcased, hardcover editions of public domain classics:

https://pluralistic.net/2021/02/17/reverse-centaur/#beehive

But long before Beehive arrived on the scene, The Folio Society was turning out these stupendous, mouth-watering editions of new and old literary classics. They're pricey, but if I ever spot one at a used bookstore (Burbank's Iliad Bookshop is a frequent source), I snap it up.

The latest from The Folio Society is a $745 (!), four-volume (!) slipcased edition of the complete short fiction (!) of Philip K Dick (!), introduced by Jonathan Lethem (!), limited to 750 hand-numbered copies (!).

https://www.foliosociety.com/usa/the-complete-short-stories.html

The books include 24 original illustrations by 24 artists, including Dave McKean, Georgia Hill, Hilary Clarcq and many, many others.

This set is too rich for my blood, to be honest, but just watching the videos describing the production was profoundly satisfying to me.

https://www.youtube.com/watch?v=YfLlxcY4F6w

https://www.youtube.com/watch?v=4wD6TeIkBFg



This day in history (permalink)

#10yrsago Chicken Little: what do you sell to an immortal, vat-bound quadrillionaire? https://www.tor.com/2011/04/06/chicken-little/?layout=print

#10yrsago Anya’s Ghost: sweet and scary ghost story about identity https://memex.craphound.com/2011/04/06/anyas-ghost-sweet-and-scary-ghost-story-about-identity/

#5yrsago MIT panel on the W3C’s decision to make DRM part of the Web’s “open” standards https://www.youtube.com/watch?v=e3kfXtXRgk0



Colophon (permalink)

Today's top sources: Tor.com (https://www.tor.com/).

Currently writing:

  • A cyberpunk noir thriller novel, "Red Team Blues." Yesterday's progress: 1001 words (51909 total).

Currently reading: Analogia by George Dyson.

Latest podcast: Past Performance is Not Indicative of Future Results https://craphound.com/news/2021/03/28/past-performance-is-not-indicative-of-future-results/

Upcoming appearances:

Recent appearances:

Latest book:

Upcoming books:

  • The Shakedown, with Rebecca Giblin, nonfiction/business/politics, Beacon Press 2022

This work licensed under a Creative Commons Attribution 4.0 license. That means you can use it any way you like, including commercially, provided that you attribute it to me, Cory Doctorow, and include a link to pluralistic.net.

https://creativecommons.org/licenses/by/4.0/

Quotations and images are not included in this license; they are included either under a limitation or exception to copyright, or on the basis of a separate license. Please exercise caution.


How to get Pluralistic:

Blog (no ads, tracking, or data-collection):

Pluralistic.net

Newsletter (no ads, tracking, or data-collection):

https://pluralistic.net/plura-list

Mastodon (no ads, tracking, or data-collection):

https://mamot.fr/web/accounts/303320

Twitter (mass-scale, unrestricted, third-party surveillance and advertising):

https://twitter.com/doctorow

Tumblr (mass-scale, unrestricted, third-party surveillance and advertising):

https://mostlysignssomeportents.tumblr.com/tagged/pluralistic

"When life gives you SARS, you make sarsaparilla" -Joey "Accordion Guy" DeVilla

First off, that is horrible, and I am so sorry.

I have no idea how school kids get through school today with their sanity intact, when they are carrying around a box all the time that sells to them, tells them they aren’t enough, and invites the world to constantly comment on their lives. I would have found it impossible.

Today’s theme for me was how we are letting the algorithms tell us how to live – how they are demanding we change for them – so that we now live our lives around what the algorithms demand, and not the other way around.

Or, basically what you wrote here, but replace machine with algorithm: https://pluralistic.net/2021/02/17/reverse-centaur/#reverse-centaur

Where is a person supposed to go these days to live a life that doesn’t demand a container?

And looking at places like LinkedIn, are we simply asking for our new software overlords to shape us as they see fit?

The first three pieces have much in common (“intentional algorithmic…” should replace “inadvertent”).
They all describe a perpetual mania to control, sell, profit, acquire, accumulate, and then repeat before the first cycle knows it’s started. It is not so much that life should not be like this, but that life is not like this – we have made it so.