Originally published at: Pluralistic: Big Tech’s “attention rents” (03 Nov 2023) – Pluralistic: Daily links from Cory Doctorow
Today's links
- Big Tech's "attention rents": Enshittification comes out of the barrel of an algorithm.
- Hey look at this: Delights to delectate.
- This day in history: 2003, 2008, 2013, 2018
- Colophon: Recent publications, upcoming/recent appearances, current writing projects, current reading
Big Tech's "attention rents" (permalink)
The thing is, any feed or search result is "algorithmic." "Just show me the things posted by people I follow in reverse-chronological order" is an algorithm. "Just show me products that have this SKU" is an algorithm. "Alphabetical sort" is an algorithm. "Random sort" is an algorithm.
Any process that involves more information than you can take in at a glance or digest in a moment needs some kind of sense-making. It needs to be put in some kind of order. There's always gonna be an algorithm.
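To make the point concrete, here is a minimal sketch in Python (the toy posts, names and timestamps are mine): even the "simple" orderings are algorithms in the plain sense of the word.

```python
from datetime import datetime

# A toy feed: each post is (author, timestamp, text).
posts = [
    ("alice", datetime(2023, 11, 1, 9, 0), "morning link roundup"),
    ("bob", datetime(2023, 11, 2, 14, 30), "new blog post"),
    ("carol", datetime(2023, 11, 2, 8, 15), "photo from the hike"),
]

# "Just show me posts from people I follow, newest first" is an algorithm:
reverse_chron = sorted(posts, key=lambda p: p[1], reverse=True)

# So is "alphabetical by author":
alphabetical = sorted(posts, key=lambda p: p[0])

for author, when, text in reverse_chron:
    print(when.isoformat(), author, text)
```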
But that's not what we mean by "the algorithm" (TM). When we talk about "the algorithm," we mean a system for ordering information that uses complex criteria that are not precisely known to us, and that can't be easily divined through an examination of the ordering.
There's an idea that a "good" algorithm is one that does not seek to deceive or harm us. When you search for a specific part number, you want exact matches for that search at the top of the results. It's fine if those results include third-party parts that are compatible with the part you're searching for, so long as they're clearly labeled. There's room for argument about how to order those results – do highly rated third-party parts go above the OEM part? How should the algorithm trade off price and quality?
It's hard to come up with an objective standard to resolve these fine-grained differences, but search technologists have tried. Think of Google: they have a patent on "long clicks." A "long click" is when you search for something and then don't search for it again for quite some time, the implication being that you've found what you were looking for. Google Search ads operate a "pay per click" model, and there's an argument that this aligns Google's ad division's interests with search quality: if the ad division only gets paid when you click a link, they will militate for placing ads that users want to click on.
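As a rough illustration of how such a quality signal could be computed from a search log, here is a sketch. The 24-hour threshold, the log format and the function name are my assumptions for illustration; Google's patented definition is more involved.

```python
from datetime import timedelta

# Window after which a non-repeated search is treated as satisfied.
# The 24-hour figure and the log format are assumptions for illustration.
LONG_CLICK_WINDOW = timedelta(hours=24)

def long_click_rate(search_log):
    """search_log: list of (user, query, timestamp) tuples, sorted by time.
    A search counts as a "long click" if the same user doesn't re-run the
    same query within the window, implying they found what they wanted."""
    if not search_log:
        return 0.0
    long_clicks = 0
    for i, (user, query, ts) in enumerate(search_log):
        repeated = any(
            u == user and q == query and ts < t <= ts + LONG_CLICK_WINDOW
            for u, q, t in search_log[i + 1:]
        )
        if not repeated:
            long_clicks += 1
    return long_clicks / len(search_log)
```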
Platforms are inextricably bound up in this algorithmic information sorting business. Platforms have emerged as the endemic form of internet-based business, which is ironic, because a platform is just an intermediary – a company that connects different groups to each other. The internet's great promise was "disintermediation" – getting rid of intermediaries. We did that, and then we got a whole bunch of new intermediaries.
Usually, those groups can be sorted into two buckets: "business customers" (drivers, merchants, advertisers, publishers, creative workers, etc) and "end users" (riders, shoppers, consumers, audiences, etc). Platforms also sometimes connect end users to each other: think of dating sites, or interest-based forums on Reddit. Either way, a platform's job is to make these connections, and that means platforms are always in the algorithm business.
Whether that's matching a driver and a rider, or an advertiser and a consumer, or a reader and a mix of content from social feeds they're subscribed to and other sources of information on the service, the platform has to make a call as to what you're going to see or do.
These choices are enormously consequential. In the theory of Surveillance Capitalism, these choices take on an almost supernatural quality, where "Big Data" can be used to guess your response to all the different ways of pitching an idea or product to you, in order to select the optimal pitch that bypasses your critical faculties and actually controls your actions, robbing you of "the right to a future tense."
I don't think much of this hypothesis. Every claim to mind control – from Rasputin to MK Ultra to neurolinguistic programming to pick-up artists – has turned out to be bullshit. Besides, you don't need to believe in mind control to explain the ways that algorithms shape our beliefs and actions. When a single company dominates the information landscape – say, when Google controls 90% of your searches – then Google's sorting can deprive you of access to information without you knowing it.
If every "locksmith" listed on Google Maps is a fake referral business, you might conclude that there are no more reputable storefront locksmiths in existence. What's more, this belief is a form of self-fulfilling prophecy: if Google Maps never shows anyone a real locksmith, all the real locksmiths will eventually go bust.
If you never see a social media update from a news source you follow, you might forget that the source exists, or assume they've gone under. If you see a flood of viral videos of smash-and-grab shoplifter gangs and never see a news story about wage theft, you might assume that the former is common and the latter is rare (in reality, shoplifting hasn't risen appreciably, while wage-theft is off the charts).
In the theory of Surveillance Capitalism, the algorithm was invented to make advertisers richer, and then went on to pervert the news (by incentivizing "clickbait") and finally destroyed our politics when its persuasive powers were hijacked by Steve Bannon, Cambridge Analytica, and QAnon grifters to turn millions of vulnerable people into swivel-eyed loons, racists and conspiratorialists.
As I've written, I think this theory gives the ad-tech sector both too much and too little credit, and draws an artificial line between ad-tech and other platform businesses that obscures the connection between all forms of platform decay, from Uber to HBO to Google Search to Twitter to Apple and beyond:
https://pluralistic.net/HowToDestroySurveillanceCapitalism
As a counter to Surveillance Capitalism, I've proposed a theory of platform decay called enshittification, which identifies how the market power of monopoly platforms, combined with the flexibility of digital tools, combined with regulatory capture, allows platforms to abuse both business-customers and end-users, by depriving them of alternatives, then "twiddling" the knobs that determine the rules of the platform without fearing sanction under privacy, labor or consumer protection law, and finally, blocking digital self-help measures like ad-blockers, alternative clients, scrapers, reverse engineering, jailbreaking, and other tech guerrilla warfare tactics:
https://pluralistic.net/2023/01/21/potemkin-ai/#hey-guys
One important distinction between Surveillance Capitalism and enshittification is that enshittification posits that the platform is bad for everyone. Surveillance Capitalism starts from the assumption that surveillance advertising is devastatingly effective (which explains how your racist Facebook uncles got turned into Jan 6 QAnons), and concludes that advertisers must be well-served by the surveillance system.
But advertisers – and other business customers – are very poorly served by platforms. Procter and Gamble reduced its annual surveillance advertising budget from $100m/year to $0/year and saw a 0% reduction in sales. The supposed laser-focused targeting and superhuman message refinement just don't work very well – first, because the tech companies are run by bullshitters whose marketing copy is nonsense, and second because these companies are monopolies who can abuse their customers without losing money.
The point of enshittification is to lock end-users to the platform, then use those locked-in users as bait for business customers, who will also become locked to the platform. Once everyone is holding everyone else hostage, the platform uses the flexibility of digital services to play a variety of algorithmic games to shift value from everyone to the business's shareholders. This flexibility is supercharged by the failure of regulators to enforce privacy, labor and consumer protection standards against the companies, and by these companies' ability to insist that regulators punish the end-users, competitors, tinkerers and other third parties who mod, reverse, hack or jailbreak their products and services to block their abuse.
Enshittification needs The Algorithm. When Uber wants to steal from its drivers, it can just commit old-fashioned wage theft, but eventually it will face the music for that kind of scam:
https://apnews.com/article/uber-lyft-new-york-city-wage-theft-9ae3f629cf32d3f2fb6c39b8ffcc6cc6
The best way to steal from drivers is with algorithmic wage discrimination. That's when Uber offers occasional, selective drivers higher rates than it gives to drivers who are fully locked to its platform and take every ride the app offers. The less selective a driver becomes, the lower the premium the app offers; if a driver starts refusing rides, the wage offer climbs again. This isn't the mind-control of Surveillance Capitalism, it's just fraud, shaving fractional pennies off your paycheck in the hopes that you won't notice. The goal is to get drivers to abandon the other side-hustles that allow them to be so choosy about when they drive Uber, and then, once the driver is fully committed, to crank the wage-dial down to the lowest possible setting:
https://pluralistic.net/2023/04/12/algorithmic-wage-discrimination/#fishers-of-men
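Here is a deliberately simplified sketch of that dynamic, under my own assumptions (made-up rates and a single "acceptance rate" input), to show the shape of "reward the choosy, squeeze the captive." It is not Uber's actual code.

```python
def offered_rate(base_rate, acceptance_rate):
    """Hypothetical per-ride offer as a function of how captive a driver is.

    acceptance_rate: fraction of recent ride offers the driver accepted.
    Choosy drivers (low acceptance) see a premium that lures them into
    driving more; fully locked-in drivers get the floor rate."""
    premium = 0.5 * (1.0 - acceptance_rate)  # up to +50% for the choosiest
    return round(base_rate * (1.0 + premium), 2)

# A driver with other side-hustles vs. one who takes every ride:
print(offered_rate(10.00, 0.20))  # 14.0  (fat offer for the choosy driver)
print(offered_rate(10.00, 0.95))  # 10.25 (floor rate for the captive one)
```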
This is the same game that Facebook played with publishers on the way to its enshittification: when Facebook began aggressively courting publishers, any short snippet republished from the publisher's website to a Facebook feed was likely to be recommended to large numbers of readers. Facebook offered publishers a vast traffic funnel that drove millions of readers to their sites.
But as publishers became more dependent on that traffic, Facebook's algorithm started downranking short excerpts in favor of medium-length ones, building slowly to full-text Facebook posts that were fully substitutive for the publisher's own web offerings. Like Uber's wage algorithm, Facebook's recommendation engine played its targets like fish on a line.
When publishers responded to declining reach for short excerpts by stepping back from Facebook, Facebook goosed the traffic for their existing posts, sending fresh floods of readers to the publisher's site. When the publisher returned to Facebook, the algorithm once again set to coaxing the publishers into posting ever-larger fractions of their work to Facebook, until, finally, the publisher was totally locked into Facebook. Facebook then started charging publishers for "boosting" – not just to be included in algorithmic recommendations, but to reach their own subscribers.
Enshittification is a modern, high-tech-enabled, monopolistic form of rent-seeking. Rent-seeking is a subtle and important idea from economics, one that is increasingly relevant to our modern economy. For economists, a "rent" is income you get from owning a "factor of production" – something that someone else needs to make or do something.
Rents are not "profits." Profit is income you get from making or doing something. Rent is income you get from owning something needed to make a profit. People who earn their income from rents are called rentiers. If you make your income from profits, you're a "capitalist."
Capitalists and rentiers are in irreconcilable combat with each other. A capitalist wants access to their factors of production at the lowest possible price, whereas rentiers want those prices to be as high as possible. A phone manufacturer wants to be able to make phones as cheaply as possible, while a patent-troll wants to own a patent that the phone manufacturer needs to license in order to make phones. The manufacturer is a capitalist; the troll is a rentier.
The troll might even decide that the best strategy for maximizing their rents is to exclusively license their patents to a single manufacturer and try to eliminate all other phones from the market. This will allow the chosen manufacturer to charge more and also allow the troll to get higher rents. Every capitalist except the chosen manufacturer loses. So do people who want to buy phones. Eventually, even the chosen manufacturer will lose, because the rentier can demand an ever-greater share of their profits in rent.
Digital technology enables all kinds of rent extraction. The more digitized an industry is, the more rent-seeking it becomes. Think of cars, which harvest your data, block third-party repair and parts, and force you to buy everything from acceleration to seat-heaters as a monthly subscription:
https://pluralistic.net/2023/07/24/rent-to-pwn/#kitt-is-a-demon
The cloud is especially prone to rent-seeking, as Yanis Varoufakis writes in his new book, Technofeudalism, where he explains how "cloudalists" have found ways to lock all kinds of productive enterprise into using cloud-based resources from which ever-increasing rents can be extracted:
https://pluralistic.net/2023/09/28/cloudalists/#cloud-capital
The endless malleability of digitization makes for endless variety in rent-seeking, and cataloging all the different forms of digital rent-extraction is a major project in this Age of Enshittification. "Algorithmic Attention Rents: A theory of digital platform market power," a new UCL Institute for Innovation and Public Purpose paper by Tim O'Reilly, Ilan Strauss and Mariana Mazzucato, pins down one of these forms:
The "attention rents" referenced in the paper's title are bait-and-switch scams in which a platform deliberately enshittifies its recommendations, search results or feeds to show you things that are not the thing you asked to see, expect to see, or want to see. They don't do this out of sadism! The point is to extract rent – from you (wasted time, suboptimal outcomes) and from business customers (extracting rents for "boosting," jumbling good results in among scammy or low-quality results).
The authors cite several examples of these attention rents. Much of the paper is given over to Amazon's so-called "advertising" product, a $31b/year program that charges sellers to have their products placed above the items that Amazon's own search engine predicts you will want to buy:
https://pluralistic.net/2022/11/28/enshittification/#relentless-payola
This is a form of gladiatorial combat that pits sellers against each other, forcing them to surrender an ever-larger share of their profits in rent to Amazon for pride of place. Amazon uses a variety of deceptive labels ("Highly Rated – Sponsored") to get you to click on these products, but most of all, they rely on two factors. First, Amazon has a long history of surfacing good results in response to queries, which makes buying whatever's at the top of a list a good bet. Second, there are just so many possible results that it takes a lot of work to sift through the probably-adequate stuff at the top of the listings and get to the actually-good stuff down below.
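Here is a toy sketch of that gladiatorial dynamic, with invented sellers, prices and bids: the placement fee is pure rent to the platform, and it comes straight out of the seller's margin. This illustrates the incentive structure, not Amazon's actual auction.

```python
# Hypothetical sellers of the same compatible part: (name, cost, placement_bid).
sellers = [
    ("OEM part", 8.00, 0.00),        # relies on organic relevance
    ("Third-party A", 6.50, 2.75),   # bids away margin for placement
    ("Third-party B", 6.50, 3.10),
]
retail_price = 12.00

# Sponsored slots go to the highest bidders; each bid is rent paid to the
# platform, and the seller's margin shrinks by exactly that amount.
for name, cost, bid in sorted(sellers, key=lambda s: s[2], reverse=True):
    margin = retail_price - cost - bid
    print(f"{name}: pays {bid:.2f} in placement rent, keeps {margin:.2f}")
```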
Amazon spent decades subsidizing its sellers' goods – an illegal practice known as "predatory pricing" that enforcers have increasingly turned a blind eye to since the Reagan administration. This has left it with few competitors:
https://pluralistic.net/2023/05/19/fake-it-till-you-make-it/#millennial-lifestyle-subsidy
The lack of competing retail outlets lets Amazon impose other rent-seeking conditions on its sellers. For example, Amazon has a "most favored nation" requirement that forces companies that raise their prices on Amazon to raise their prices everywhere else, which makes everything you buy more expensive, whether you buy it at Walmart, Target, a mom-and-pop store, or direct from the manufacturer:
https://pluralistic.net/2023/04/25/greedflation/#commissar-bezos
But everyone loses in this "two-sided market." Amazon used "junk ads" to juice its ad revenue: these are ads that are objectively bad matches for your search, like showing you a Seattle Seahawks jersey in response to a search for LA Lakers merch.
The more of these junk ads Amazon showed, the more revenue it got from sellers – and the more the person selling a Lakers jersey had to pay to show up at the top of your search, and the more they had to charge you to cover those ad expenses, and the more they had to charge for it everywhere else, too.
The authors describe this process as a transformation of "attention rents" (misdirecting your attention) into "pecuniary rents" (making money). That's important: despite decades of rhetoric about the "attention economy," attention isn't money. As I wrote in my enshittification essay:
You can't use attention as a medium of exchange. You can't use it as a store of value. You can't use it as a unit of account. Attention is like cryptocurrency: a worthless token that is only valuable to the extent that you can trick or coerce someone into parting with "fiat" currency in exchange for it. You have to "monetize" it – that is, you have to exchange the fake money for real money.
The authors come up with some clever techniques for quantifying the ways that this scam harms users. For example, they count the number of places that an advertised product rises in search results, relative to where it would show up in an "organic" search. These quantifications are instructive, but they're also a kind of subtweet at the judiciary.
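The displacement measure itself is simple to express: compare where each product sits in the ad-influenced page against where it would sit in a purely "organic" (relevance-only) ranking. A minimal sketch with hypothetical data:

```python
def rank_displacement(organic_ranking, displayed_ranking):
    """For each product, how many places the displayed page moved it,
    relative to the organic ordering. Positive = pushed up the page."""
    organic_pos = {item: i for i, item in enumerate(organic_ranking)}
    return {
        item: organic_pos[item] - i
        for i, item in enumerate(displayed_ranking)
        if item in organic_pos
    }

# Hypothetical example: a sponsored product jumps from 7th to 1st,
# shoving better organic matches down the page.
organic = ["A", "B", "C", "D", "E", "F", "G"]
displayed = ["G", "A", "B", "C", "D", "E", "F"]
print(rank_displacement(organic, displayed))  # {'G': 6, 'A': -1, 'B': -1, ...}
```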
In 2018, SCOTUS's ruling in American Express v Ohio changed antitrust law for two-sided markets by insisting that so long as one side of a two-sided market was better off as the result of anticompetitive actions, there was no antitrust violation:
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3346776
For platforms, that means that it's OK to screw over sellers, advertisers, performers and other business customers, so long as the end-users are better off: "Go ahead, cheat the Uber drivers, so long as you split the booty with Uber riders."
But in the absence of competition, regulation or self-help measures, platforms cheat everyone – that's the point of enshittification. The attention rents that Amazon's payola scheme extracts from shoppers translate into higher prices, worse goods, and lower profits for platform sellers. In other words, Amazon's conduct is so sleazy that it even threads the infinitesimal needle that the Supremes created in American Express.
Here's another algorithmic pecuniary rent: Amazon figured out which of its major rivals used an automated price-matching algorithm, and then cataloged which products they had in common with those sellers. Then, under a program called Project Nessie, Amazon jacked up the prices of those products, knowing that as soon as they raised the prices on Amazon, the prices would go up everywhere else, so Amazon wouldn't lose customers to cheaper alternatives. That scam made Amazon at least a billion dollars:
https://gizmodo.com/ftc-alleges-amazon-used-price-gouging-algorithm-1850986303
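Reduced to pseudocode, the alleged scheme is not sophisticated. This is my reconstruction from the FTC's public description, with invented names, fields and thresholds, not Amazon's actual system:

```python
def nessie_price(product, rivals, under_scrutiny=False):
    """One-step illustration of the alleged Project Nessie logic.

    If every rival is known to auto-match our price, a hike is "safe":
    the whole market follows us up, so shoppers can't defect to a
    cheaper store. The gouging dial goes back to zero whenever
    regulatory scrutiny rises."""
    if under_scrutiny:
        return product["price"]                      # lie low
    if rivals and all(r["auto_price_match"] for r in rivals):
        return round(product["price"] * 1.05, 2)     # hypothetical 5% hike
    return product["price"]                          # a cheaper rival exists
```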
This is a great example of how enshittification – rent-seeking on digital platforms – is different from analog rent-seeking. The speed and flexibility with which Amazon and its rivals altered their prices requires digitization. Digitization also let Amazon crank the price-gouging dial to zero whenever they worried that regulators were investigating the program.
So what do we do about it? After years of being made to look like fumblers and clowns by Big Tech, regulators and enforcers – and even lawmakers – have decided to get serious.
The neoliberal narrative of government helplessness and incompetence would have you believe that this will go nowhere. Governments aren't as powerful as giant corporations, and regulators aren't as smart as the supergeniuses of Big Tech. They don't stand a chance.
But that's a counsel of despair and a cheap trick. Weaker US governments have taken on stronger oligarchies and won – think of the defeat of JD Rockefeller and the breakup of Standard Oil in 1911. The people who pulled that off weren't wizards. They were just determined public servants, with political will behind them. There is a growing, forceful public will to end the reign of Big Tech, and there are some determined public servants surfing that will.
In this paper, the authors try to give those enforcers ammo to bring to court and to the public. For example, Amazon claims that its algorithm surfaces the products that make the public happy, without the need for competitive pressure to keep it sharp. But as the paper points out, the only successful new rival ecommerce platform – Tiktok – has found an audience for an entirely new category of goods: dupes, "lower-cost products that have the same or better features than higher cost branded products."
The authors also identify "dark patterns" that platforms use to trick users into consuming feeds that have a higher volume of things that the company profits from, and a lower volume of things that users want to see. For example, platforms routinely switch users from a "following" feed – consisting of things posted by people the user asked to hear from – to an algorithmic "For You" feed, filled with the things the company's shareholders wish the users had asked to see.
Calling this a "dark pattern" reveals just how hollow and self-aggrandizing that term is. "Dark pattern" usually means "fraud." If I ask to see posts from people I like, and you show me posts from people who'll pay you for my attention instead, that's not a sophisticated sleight of hand – it's just a scam. It's the social media equivalent of the eBay seller who sends you an iPhone box with a bunch of gravel inside it instead of an iPhone. Tech bros came up with "dark pattern" as a way of flattering themselves by draping themselves in the mantle of dopamine-hacking wizards, rather than unimaginative con-artists who use a computer to rip people off.
These For You algorithmic feeds aren't just a way to increase the load of sponsored posts in a feed – they're also part of the multi-sided ripoff of enshittified platforms. A For You feed allows platforms to trick publishers and performers into thinking that they are "good at the platform," which both convinces them to optimize their production for that platform, and also turns them into Judas Goats who conspicuously brag about how great the platform is for people like them, which brings their peers in, too.
In Veena Dubal's essential paper on algorithmic wage discrimination, she describes how Uber drivers whom the algorithm has favored with (temporary) high per-ride rates brag on driver forums about their skill with the app, bringing in other drivers who blame their lower wages on their failure to "use the app right":
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4331080
As I wrote in my enshittification essay:
If you go down to the midway at your county fair, you'll spot some poor sucker walking around all day with a giant teddy bear that they won by throwing three balls in a peach basket.
The peach-basket is a rigged game. The carny can use a hidden switch to force the balls to bounce out of the basket. No one wins a giant teddy bear unless the carny wants them to win it. Why did the carny let the sucker win the giant teddy bear? So that he'd carry it around all day, convincing other suckers to put down five bucks for their chance to win one:
https://boingboing.net/2006/08/27/rigged-carny-game.html
The carny allocated a giant teddy bear to that poor sucker the way that platforms allocate surpluses to key performers – as a convincer in a "Big Store" con, a way to rope in other suckers who'll make content for the platform, anchoring themselves and their audiences to it.
Platforms can't run the giant teddy-bear con unless there's a For You feed. Some platforms – like Tiktok – tempt users into a For You feed by making it as useful as possible, then salting it with doses of enshittification.
Other platforms use the (ugh) "dark pattern" of simply flipping your preference from a "following" feed to a "For You" feed. Either way, the platform can't let anyone keep the giant teddy-bear. Once you've tempted, say, sports bros into piling into the platform with the promise of millions of free eyeballs, you need to withdraw the algorithm's favor for their content so you can give it to, say, astrologers. Of course, the more locked-in the users are, the more shit you can pile into that feed without worrying about them going elsewhere, and the more giant teddy-bears you can give away to more business users so you can lock them in and start extracting rent.
For regulators, the possibility of a "good" algorithmic feed presents a serious challenge: when a feed is bad, how can a regulator tell if its low quality is due to the platform's incompetence at blocking spammers or guessing what users want, or whether it's because the platform is extracting rents?
The paper includes a suite of recommendations, including one that I really liked:
Regulators, working with cooperative industry players, would define reportable metrics based on those that are actually used by the platforms themselves to manage search, social media, e-commerce, and other algorithmic relevancy and recommendation engines.
In other words: find out how the companies themselves measure their performance. Find out what KPIs executives have to hit in order to earn their annual bonuses and use those to figure out what the company's performance is – ad load, ratio of organic clicks to ad clicks, average click-through on the first organic result, etc.
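Computing those numbers is trivial once a regulator can see the same logs the executives see. A minimal sketch, assuming a hypothetical log of one results page's impressions (all the field names here are mine):

```python
def results_page_metrics(impressions):
    """impressions: dicts like {"slot": 1, "is_ad": True, "clicked": False},
    describing one results page, with slot 1 at the top. Assumes the page
    contains at least one organic result."""
    ads = [i for i in impressions if i["is_ad"]]
    organic = [i for i in impressions if not i["is_ad"]]
    top_organic = min(organic, key=lambda i: i["slot"])
    return {
        "ad_load": len(ads) / len(impressions),
        "organic_to_ad_click_ratio":
            sum(i["clicked"] for i in organic) / max(sum(i["clicked"] for i in ads), 1),
        "first_organic_clicked": top_organic["clicked"],
    }
```

Averaging "first_organic_clicked" over many pages gives the average click-through on the first organic result; the other two fields map directly onto ad load and the organic-to-ad click ratio.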
They also recommend some hard rules, like reserving a portion of the top of the screen for "organic" search results, and requiring exact matches to show up as the top result.
I've proposed something similar, applicable across multiple kinds of digital businesses: an end-to-end principle for online services. The end-to-end principle is as old as the internet, and it decrees that the role of an intermediary should be to deliver data from willing senders to willing receivers as quickly and reliably as possible. When we apply this principle to your ISP, we call it Net Neutrality. For services, E2E would mean that if I subscribed to your feed, the service would have a duty to deliver it to me. If I hoisted your email out of my spam folder, none of your future emails should land there. If I search for your product and there's an exact match, that should be the top result:
https://www.eff.org/deeplinks/2023/04/platforms-decay-lets-put-users-first
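In code, the feed version of the rule is almost a one-liner, which is part of its appeal as a bright-line standard. A sketch with hypothetical post and follow structures:

```python
def build_feed(user, all_posts, follows):
    """End-to-end rule for feeds: every post from accounts the user chose
    to follow gets delivered, newest first. Ads or "For You" suggestions
    may only be appended afterwards; they must never displace what the
    willing receiver asked for."""
    wanted = [p for p in all_posts if p["author"] in follows[user]]
    return sorted(wanted, key=lambda p: p["timestamp"], reverse=True)
```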
One interesting wrinkle to framing platform degradation as a failure to connect willing senders and receivers is that it places a whole host of conduct within the regulatory remit of the FTC. Section 5 of the FTC Act contains a broad prohibition against "unfair and deceptive" practices:
https://pluralistic.net/2023/01/10/the-courage-to-govern/#whos-in-charge
That means that the FTC doesn't need any further authorization from Congress to enforce an end-to-end rule: they can simply propose and pass that rule, on the grounds that telling someone that you'll show them the feeds that they ask for and then not doing so is "unfair and deceptive."
Some of the other proposals in the paper also fit neatly into Section 5 powers, like a "sticky" feed preference. If I tell a service to show me a feed of the people I follow and they switch it to a For You feed, that's plainly unfair and deceptive.
All of this raises the question of what a post-Big-Tech feed would look like. In "How To Break Up Amazon" for The Sling, Peter Carstensen and Darren Bush sketch out some visions for this:
https://www.thesling.org/how-to-break-up-amazon/
They imagine a "condo" model for Amazon, where the sellers collectively own the Amazon storefront, a model similar to capacity rights on natural gas pipelines, or to patent pools. They see two different ways that search-result order could be determined in such a system:
"specific premium placement could go to those vendors that value the placement the most [with revenue] shared among the owners of the condo"
or
"leave it to owners themselves to create joint ventures to promote products"
Note that both of these proposals are compatible with an end-to-end rule and the other regulatory proposals in the paper. Indeed, all these policies are easier to enforce against weaker companies that can't afford to maintain the pretense that they are headquartered in some distant regulatory haven, or pay massive salaries to ex-regulators to work the refs on their behalf:
https://www.thesling.org/in-public-discourse-and-congress-revolvers-defend-amazons-monopoly/
The re-emergence of intermediaries on the internet after its initial rush of disintermediation tells us something important about how we relate to one another. Some authors might be up for directly selling books to their audiences, and some drivers might be up for creating their own taxi service, and some merchants might want to run their own storefronts, but there's plenty of people with something they want to offer us who don't have the will or skill to do it all. Not everyone wants to be a sysadmin, a security auditor, a payment processor, a software engineer, a CFO, a tax-preparer and everything else that goes into running a business. Some people just want to sell you a book. Or find a date. Or teach an online class.
Intermediation isn't intrinsically wicked. Intermediaries fall into pits of enshittification and other forms of rent-seeking when they aren't disciplined by competitors, by regulators, or by their own users' ability to block their bad conduct (with ad-blockers, say, or other self-help measures). We need intermediaries, and intermediaries don't have to turn into rent-seeking feudal warlords. That only happens if we let it happen.
(Image: Cryteria, CC BY 3.0, modified)
Hey look at this (permalink)
- Antitrust suit stalls JetBlue deal in bid to save low-cost Spirit https://www.csmonitor.com/Business/2023/1031/Antitrust-suit-stalls-JetBlue-deal-in-bid-to-save-low-cost-Spirit
- Bandcamp Union Files Unfair Labor Practice Charge Against Songtradr https://www.404media.co/bandcamp-union-files-unfair-labor-practice-charge-against-songtradr/
- Beholder Jack-o'-Lantern https://twitter.com/rangleme/status/1718822323439247735 (h/t Neatorama)
This day in history (permalink)
#20yrsago Mieville on Tolkien https://web.archive.org/web/20031025054631/https://www.panmacmillan.com/features/china/debate.htm
#15yrsago The Essential Groucho https://memex.craphound.com/2008/11/03/the-essential-groucho/
#10yrsago Toronto Mayor Rob Ford’s polls go up after he’s caught lying about crack-smoking video https://www.thestar.com/news/gta/city-hall/mayor-rob-fords-approval-rating-ticks-upward-with-news-of-crack-video/article_a70fbaa2-eb4c-53ac-b732-d1ed3ab14002.html
#10yrsago Inspired by Snowden, more NSA insiders are blowing the whistle https://abcnews.go.com/blogs/headlines/2013/10/more-nsa-leakers-followed-snowdens-footsteps-whistleblower-lawyer-says/
#10yrsago RIAA, BPI websites infringe copyright https://torrentfreak.com/riaa-and-bpi-use-pirated-code-on-their-websites-131102
#10yrsago UK legal aid proposal: bonuses for lawyers whose clients plead guilty https://www.theguardian.com/law/2013/nov/01/lawyers-higher-legal-aid-fees-early-guilty-plea
#5yrsago Swedish ISP punishes Elsevier for forcing it to block Sci-Hub by also blocking Elsevier https://torrentfreak.com/swedish-isp-protest-site-blocking-by-blocking-rightsholders-website-and-more-181102/
#5yrsago Disneyland’s laundry used “gamification” as an “electronic whip,” leading to worker stress and injuries https://aeon.co/essays/how-employers-have-gamified-work-for-maximum-profit
#5yrsago Facebook blames malicious browser plugins for leak of 81,000 users’ private messages and offer of account data for 120,000,000 users https://www.wired.com/story/hackers-posted-private-facebook-messages/
#5yrsago A human being at Facebook manually approved the idea of targeting ads to users interested in “white genocide” https://theintercept.com/2018/11/02/facebook-ads-white-supremacy-pittsburgh-shooting/
#5yrsago How to tax big tech https://www.taxresearch.org.uk/Blog/2018/10/30/how-to-tax-digital-companies/
#5yrsago America’s most notorious patent troll, now bankrupt, values its bullshit patents at $1 https://www.eff.org/deeplinks/2018/10/stupid-patent-month-how-34-patents-worth-1-led-hundreds-lawsuits
#5yrsago Australia’s 2015 copyright censorship system has failed, so they’re adding (lots) more censorship https://www.eff.org/deeplinks/2018/11/sopaau-australia-testbed-worlds-most-extreme-copyright-blocks
#5yrsago Senator Wyden proposes 20 prison sentences for CEOs who lie about data collection and protection https://www.vice.com/en/article/8xjwjz/sen-ron-wyden-introduces-bill-that-would-send-ceos-to-jail-for-violating-consumer-privacy
Colophon (permalink)
Today's top sources:
Currently writing:
- A Little Brother short story about DIY insulin. PLANNING
- Picks and Shovels, a Martin Hench noir thriller about the heroic era of the PC. FORTHCOMING TOR BOOKS JAN 2025
- The Bezzle, a Martin Hench noir thriller novel about the prison-tech industry. FORTHCOMING TOR BOOKS FEB 2024
- Vigilant, Little Brother short story about remote invigilation. FORTHCOMING ON TOR.COM
- Moral Hazard, a short story for MIT Tech Review's 12 Tomorrows. FIRST DRAFT COMPLETE, ACCEPTED FOR PUBLICATION
- Spill, a Little Brother short story about pipeline protests. FORTHCOMING ON TOR.COM
Latest podcast: The Canadian Miracle, Part 1 (https://craphound.com/news/2023/11/01/the-canadian-miracle-part-1/)
Upcoming appearances:
- Hackaday Supercon, Nov 4 (Pasadena) https://hackaday.io/superconference/
- Second Life Book Club, with Rebecca Giblin, Nov 8/17hPT https://www.youtube.com/watch?v=L07T82M_xzk
- The New Luddites Seizing the Means of Computation, with Brian Merchant (Hallway Track), Nov 9 https://www.verylittlegravitas.com/hallwaytrack
- CBC IDEAS, Nov 16 (Stratford, ON) https://www.eventbrite.ca/e/cbc-ideas-visionaries-in-conversation-tickets-729692809837
- Inspiring the Next Generation, Nov 16 (Stratford, ON) https://www.provocation.ca/upcoming-2023-events-stratford
- Gibson's Bookstore, Nov 18 (Concord, NH) https://www.gibsonsbookstore.com/event/doctorow-lost-cause
- Lost Cause at Simsbury Public Library, Nov 20 (Simsbury, CT) https://simsbury.librarycalendar.com/event/author-visit-cory-doctorow-29257
- Generation of Lost Causes, Nov 22 (Toronto) https://www.torontopubliclibrary.ca/detail.jsp?Entt=RDMEVT495758&R=EVT495758
- Who Is Watching Big Tech? Nov 27 (Toronto) https://www.torontopubliclibrary.ca/detail.jsp?Entt=RDMEVT496408&R=EVT496408
- The Lost Cause at The Strand (NYC), Nov 29 https://www.eventbrite.com/e/cory-doctorow-the-lost-cause-tickets-734958008187
- The Lost Cause at Flyleaf Books (Chapel Hill), Dec 7 https://www.flyleafbooks.com/doctorow-2023
Recent appearances:
- The Material Power That Rules Computation (This Machine Kills) https://soundcloud.com/thismachinekillspod/294-the-material-power-that-rules-computation-ft-cory-doctorow
- The Science in the Fiction https://www.buzzsprout.com/2201157/13896228-ep-14-cory-doctorow-on-the-lost-cause-red-team-blues-and-chokepoint-capitalism
- The Internet Con at the Internet Archive https://archive.org/details/the-internet-con
Latest books:
- "The Internet Con": A nonfiction book about interoperability and Big Tech (Verso) September 2023 (http://seizethemeansofcomputation.org). Signed copies at Book Soup (https://www.booksoup.com/book/9781804291245).
-
"Red Team Blues": "A grabby, compulsive thriller that will leave you knowing more about how the world works than you did before." Tor Books http://redteamblues.com. Signed copies at Dark Delicacies (US): and Forbidden Planet (UK): https://forbiddenplanet.com/385004-red-team-blues-signed-edition-hardcover/.
-
"Chokepoint Capitalism: How to Beat Big Tech, Tame Big Content, and Get Artists Paid, with Rebecca Giblin", on how to unrig the markets for creative labor, Beacon Press/Scribe 2022 https://chokepointcapitalism.com
-
"Attack Surface": The third Little Brother novel, a standalone technothriller for adults. The Washington Post called it "a political cyberthriller, vigorous, bold and savvy about the limits of revolution and resistance." Order signed, personalized copies from Dark Delicacies https://www.darkdel.com/store/p1840/Available_Now%3A_Attack_Surface.html
-
"How to Destroy Surveillance Capitalism": an anti-monopoly pamphlet analyzing the true harms of surveillance capitalism and proposing a solution. https://onezero.medium.com/how-to-destroy-surveillance-capitalism-8135e6744d59?sk=f6cd10e54e20a07d4c6d0f3ac011af6b) (signed copies: https://www.darkdel.com/store/p2024/Available_Now%3A__How_to_Destroy_Surveillance_Capitalism.html)
-
"Little Brother/Homeland": A reissue omnibus edition with a new introduction by Edward Snowden: https://us.macmillan.com/books/9781250774583; personalized/signed copies here: https://www.darkdel.com/store/p1750/July%3A__Little_Brother_%26_Homeland.html
-
"Poesy the Monster Slayer" a picture book about monsters, bedtime, gender, and kicking ass. Order here: https://us.macmillan.com/books/9781626723627. Get a personalized, signed copy here: https://www.darkdel.com/store/p2682/Corey_Doctorow%3A_Poesy_the_Monster_Slayer_HB.html#/.
Upcoming books:
- The Lost Cause: a post-Green New Deal eco-topian novel about truth and reconciliation with white nationalist militias, Tor Books, November 2023
- The Bezzle: a sequel to "Red Team Blues," about prison-tech and other grifts, Tor Books, February 2024
- Picks and Shovels: a sequel to "Red Team Blues," about the heroic era of the PC, Tor Books, February 2025
- Unauthorized Bread: a graphic novel adapted from my novella about refugees, toasters and DRM, FirstSecond, 2025
This work – excluding any serialized fiction – is licensed under a Creative Commons Attribution 4.0 license. That means you can use it any way you like, including commercially, provided that you attribute it to me, Cory Doctorow, and include a link to pluralistic.net.
https://creativecommons.org/licenses/by/4.0/
Quotations and images are not included in this license; they are included either under a limitation or exception to copyright, or on the basis of a separate license. Please exercise caution.
How to get Pluralistic:
Blog (no ads, tracking, or data-collection):
Newsletter (no ads, tracking, or data-collection):
https://pluralistic.net/plura-list
Mastodon (no ads, tracking, or data-collection):
Medium (no ads, paywalled):
Twitter (mass-scale, unrestricted, third-party surveillance and advertising):
Tumblr (mass-scale, unrestricted, third-party surveillance and advertising):
https://mostlysignssomeportents.tumblr.com/tagged/pluralistic
"When life gives you SARS, you make sarsaparilla" -Joey "Accordion Guy" DeVilla