I find it deeply disturbing that the tech industry represents 9% of the U.S. GDP and that only five Big Tech companies account for 25% of the S&P 500. Prior to Covid, most of the growth in the stock market came from Big Tech (not the Trump Administration…). Now, as the U.S. economy is all sorts of wacky, Big Tech is what is keeping the stock market’s chin above water. In the process, Big Tech is accounting for more and more of the stock market. ::gulp::
If capitalism and stock markets aren’t your thing, it’s easy to shrug your shoulders at this. But the stock market is infrastructural in profound (and disturbing) ways to American life. Professors: university endowments depend on the stock market staying strong. So do the few remaining pension plans (hiiii government workers!). The S&P 500 is also important for nearly all retirement plans and, much to our collective chagrin, the stability of the banking world itself. Economists tend to scare the heck out of me whenever they talk about how many things are connected to the “overall health of the economy,” which is increasingly dependent on a small number of Big Tech companies. And oh boy do those companies feel the heat to keep the economy chugging along.
Inside the tech industry, there’s another strange calculation. High-status employee compensation in tech is also tethered to the stock market. Because of talent wars, tech companies panic when their stocks fall, because stock makes up so much of their employees’ compensation that those employees have little incentive to stay. The talent wars have all sorts of other perverse incentives. For example, companies have little incentive to invest in training people for fear that they will go elsewhere. And it shouldn’t be surprising that tech companies conspired to wage-fix in an effort to cap the salaries of certain classes of workers so as to not be in a perennial talent war with each other.
Given how intense the talent wars have been in recent years, I can’t help but be fascinated by the mass layoffs happening now in tech. Over the last few months, companies have been coming forward with their tails between their legs, saying that they over-hired and that this is why they needed to lay people off. But did they all really do the exact same thing? Or is there more going on here?
In a classic text in sociology, Paul DiMaggio and Woody Powell mapped out an idea called “institutional isomorphism,” where they highlighted how corporations and other large institutional arrangements move in alignment with one another. They describe coercive, mimetic, and normative pressures. In other words, there are structural reasons why companies in entire sectors tend to do the same darn thing.
This might explain the collective over-hiring, but I can’t help but wonder if we’re also watching an inversion of the wage-fixing dynamic. By moving collectively, the tech companies are also putting a big pause on the talent wars (outside of a tiny number of very specific roles). Right now, there is widespread fear of job loss across the tech industry. In response, tech workers are staying put. They’re not going anywhere. Unless they’ve been forced out. (Random aside: will we see a massive influx of startups in a few years due to layoffs?)
I wonder if tech leaders think that this hovering threat of more and more layoffs will prompt workers into working harder, faster, more in line with the company’s goals. Fear is a motivator. Do tech leaders believe that’s effective? Moreover, is it? Are remaining workers helping build the value of these companies at faster rates that benefit the economy? Or is fear creating all sorts of externalities within these companies? I honestly don’t know. I’m waiting for the b-school research!
Amidst the chaos inside the tech industry, we have AI. AI is often described as the cause of the chaos, but I can’t help but wonder if it’s just the hook. AI offers all sorts of imaginaries. And imaginaries are necessary to keeping stock markets going up up up. People want to imagine that this new technology will transform society. They want to imagine that this new technology will strengthen the economy as a whole (even if a few companies have to die).
Many social scientists and historians are critics of AI for reasons that make total sense to me. Technologies have historically reified existing structural inequities, for example. However, the fear-mongering that intrigues me is that coming from within the technical AI community itself. The existential threat conversation is a topic of a different rant, but one aspect of it is relevant here.
Many in the AI tech community believe that self-coding AIs will code humans out of existence and make humans subordinate to AIs. This is fascinating on soooo many levels. The ahistorical failure to recognize how humans have repeatedly made other humans subordinate is obviously my first groan. Yet, more specific to this situation is the failure of extraordinarily high-status, high-net-worth individuals to reckon with how the tech industry has made people subordinate in a capitalistic context already.
Poke around a bit and these folks will talk about how programmers are doomed. And I can’t help but be fascinated by their angst. At the center of this existential threat is a threat to their own status, power, and domination. They’re afraid that they will become subordinate to the machine (or other political arrangements?). But they’re projecting this onto all of humanity without appreciating the ways in which so many people already feel subordinate to a machine, namely a particular arrangement of capital and power that is extraordinarily oppressive.
So I keep coming back to this question: How much of the computer science panic over an AI robot takeover is actually coming from an anxiety that their status, power, and wealth are under threat? (And, as such, their agency… But that’s a topic for another rant.)
I keep trying to turn over rocks and make sense of the hype-fear continuum of AI that’s unfolding, and what really stands out to me are the layers and layers of anxiety. Anxiety from tech workers about job precarity and existential risk. Anxiety from tech leaders about the competitiveness of their organizations. Anxieties from national security experts about geopolitical arrangements. Anxieties from climate scientists about the cost of the GPU fights surpassing that of crypto mining. Anxieties from economists and politicians about the fundamentals of the economy.
So I keep wondering… what are going to be the outcomes of an anxiety-driven social order sitting at the cornerstone of the economy, the social fabric, and the (geo)political arrangements? History is not comforting here. So help me out… How else should I be thinking of this arrangement? And what else is tangled up in this mess? Cuz more and more, I’m thinking that obsessing over AI is a strategic distraction more than an effective way of grappling with our sociotechnical reality.