The future of technological texture
9 October 2020
I’m going to use today’s post as a planning tool for a longer-form piece I’ve been meaning to write for a while. I’ve been thinking for some time about what the next phase of our engagement with technology will look like. I’m of the view that the next generation of platforms we engage meaningfully with will prioritize narrative building and texture over the current suite of tools that prioritize productivity and efficiency.
That’s the general premise, and the following are the specific points (in no particular order) I want to explore and unjumble as part of drawing out the thinking:
There’s a ceiling to how productive and efficient tools can make us. We are limited by certain biological constraints, and to progress past a particular level of efficiency, we will need some sort of technological augmentation.
The current suite of technologies and media we engage with are still relatively new, and many of us can remember a time before the internet and social media were ubiquitous. As we emerge from what has been a period of intense adjustment to this new paradigm, we will look for ways to exert more control over technology and work with, rather than against, it.
Our content, whether public or private, will shift from projecting the moment-to-moment to instead projecting longer and more curated narrative arcs.
Virtual and augmented reality will continue to develop, allowing us to bring more texture to our “digital” experiences (a term that will itself soon become obsolete). Our avatars and projections will become richer and more textured.
That’s a start. I’ll return to this in the coming weeks. Until then, have a great weekend!
The obvious is hard to spot
8 October 2020
I’ve been asking myself, and some people around me, what’s something that in six months’ time we’ll consider entirely obvious, but that today we either (a) don’t recognize or (b) aren’t willing to call out.
For something to be obvious then, and unclear now, a large change must occur, so the proposals below necessarily imply changes with broad implications. Because they are bold statements that imply large changes in perception, the inverse of the statement may also turn out to be true. I’ve therefore noted first what I think is the more likely shift on the issue, then the less likely position, and the implications of both.
The health of the United States: I discussed this with a friend today, and we agreed that it may be obvious in hindsight that the country had already started actually fracturing into its different states. I wrote here two weeks ago that the fracturing may have already begun, and the piece draws that thinking out further. But the inverse, that the country has been portrayed as significantly more fractured than it actually is by a divisive media, may also be true. There is a huge divergence of possible implications depending on which of these outcomes plays out. It’s also possible that the country just continues to muddle on through a quagmire of polarized but non-violent stagnation.
New York’s vibrancy: There were lots of “New York is dead” stories peddled in the middle of the year. I missed the worst of it here, and I understand that current activity levels are localized, but the city is absolutely alive at the moment. I can’t really tell if the city-is-dead narrative persists outside New York, but if it does, I think it will be obvious in hindsight that it was not true. New York, and midtown especially, is lagging other major US cities in terms of return to normal, and this is driven by the city’s reliance on mass transit, which is part of what’s made it such a powerhouse. But there’s nothing that suggests to me the city will not return to its pre-COVID vibrancy. The coming winter will provide another significant test by forcing people back inside, but should the city emerge in the spring in decent shape, look out. In this case, there’s no middle road; the costs of living here (financial and non-financial) are extreme, and if the calculus no longer makes sense… also look out.
China’s genocide: There is nothing non-obvious about this. China is committing genocide against the Uyghur Muslims of Xinjiang. The evidence appears unequivocal, showing mass incarceration, forced sterilizations and cultural obliteration. The question then, as the US Holocaust Museum put it, is how to connect “the solemn commitments of the past and a new atrocity unfolding before the world’s eyes”? How does this fit into the framework of this piece? Perhaps in six months we will say that it was obvious that China was eradicating an entire population, and the global community did nothing about it. In this case, there is no inverse, just continuing apathy.
The question in this piece, and the issues it raises, is one that I’m going to spend more time with over the coming weeks.
Career footnotes
7 October 2020
“It’s useful to focus on adding another zero to whatever you define as your success metric—money, status, impact on the world, or whatever. I am willing to take as much time as needed between projects to find my next thing. But I always want it to be a project that, if successful, will make the rest of my career look like a footnote.”
I was digging through old Evernote threads and came across one that highlighted parts of Sam Altman’s “How to be Successful”. I now remember why I saved it, as it’s overflowing with practical and useful mental models. And the footnote-model for career planning is spectacular.
I’m fortunate to meet with very successful and senior people on a regular basis through my work. One thing that Americans, or at least New Yorkers, tend to do in meetings is give a synopsis of their careers to date. Listening to people regurgitate their bios as if we’re sitting in a three-dimensional LinkedIn page is often pretty boring and unnecessary. But sometimes, somebody casually rolls through a career description that makes you sit up straighter and pay more attention.
Lofty titles and senior people can indeed be impressive; the people sitting atop large institutions do genuinely wield significant power and control, and oftentimes their intellect and manner match the responsibility that comes with their position. But they aren’t the people I find objectively impressive, as their “glow” is in reference to their position, something that’s largely extrinsic to them.
The people who make you take notice are the ones who rattle off a string of experiences, where each successive role or achievement sprouts from the previous in a non-linear fashion, with exponentially increasing scope or impact. Their story is impressive not for its constituent parts (which themselves are often standalone impressive), but instead for how it evolved.
The ability to move through a career in a non-linear fashion and fill roles of increasing seniority or breadth is indicative of a few valuable traits: curiosity (to be interested in things that don’t directly “concern” you), strong communication and networking skills (to curate a group of people that care about you and to then convince them to help you do something non-standard), and strong self-belief (to convince yourself you can do something you’re “not qualified” to do).
I’ve thought about this a lot, and Sam’s piece helped me think about it in a very simple way. If you aspire to non-linear career progression, then each successive role - should you make it there - will by definition make the immediately prior role look strangely out-of-place. Ambition can therefore relegate prior roles to footnotes in the growing narrative arc of your career.
Most importantly, this mental model provides a guide for future planning. If your ambition pushes you to aspire to a career of exponentially increasing returns (financial and non-financial), then you must seek out opportunities that make your current and prior roles appear as footnotes; if it doesn’t fit the “footnote” model as described here, perhaps you aren’t dreaming big enough. Or maybe you don’t actually want to.
The curation/creation paradox
6 October 2020
I read a piece today titled “Curators are the New Creators”, which touched on a number of topics I’ve been thinking about for a while, including the following:
“With more creators, more content, and more choice than ever before, consumers are now being consumed by a state of analysis paralysis. The real scarcity isn’t content anymore. It’s attention.”
Curation is a function that we perform (1) ourselves (by being deliberate about the type of content we expose ourselves to) and (2) by outsourcing (by including content curators in our content universe in (1)). The degree to which you believe you’re proficient in method (1) affects the composition of your content universe and can make you more or less reliant on method (2).
“Curation” in the sense we use it today is important when discussing products and methods for dealing with information overload, but it is actually a relatively timeless concept that, like many others, is undergoing a re-interpretation as the nature of information transfer changes. At its core, the object of any form of curation is “information”, and the process could be described as follows:
Curation = the transfer of information through a filter designed to create an output that supports a specific objective
The nature of these inputs has changed enormously over time, but the structure of curation is relatively consistent. Each input is itself a fascinating object of consideration, and worth covering in detail in a longer-form post. But the one area worth focusing on today is the specific objective of “curation”. What are we referring to when we use this term today?
The simplest answer, at least from my perspective, is the distillation of the news and opinion cycle into a digestible and trustworthy format. In my life this takes the form of (a) a Twitter feed comprised of people I respect who opine on topics of interest to me, (b) a daily checking of a small number of websites, and (c) the daily receipt of lots of newsletters.
Each one of these channels involves an element of curation, with e.g. (a) relying more on method (1) and (c) relying more on method (2). I curate myself and curate the curators. Within this structure, the external curators are really important, but are subject to replacement once I develop the ability to curate the subject matter myself (i.e. go direct to the source) or as my interests and preferences change.
I therefore don’t agree entirely with the premise that curators are the new creators, and would go so far as to say I believe the opposite: creators have never been more important and relevant. I entirely agree that attention is now significantly more scarce than content, but disagree that better externalized curation is the solution. The volume of information is increasing at an accelerating rate, and at some point, the act of curation will become less valuable as it will be impossible to survey the entirety of the relevant content, something that is central to effective curation. If the curator isn’t across everything, their curation is no better than mine.
What’s my solution? I’m not sure yet. I think externalized curation is critical to keeping our heads above water as concerned citizens and motivated professionals. But the acknowledgement that our attention is becoming more valuable leads me to believe that we should be increasingly deliberate in everything we do. Striving for more curation can imply an abdication of the requirement to think for ourselves. Just as we outsource curation, we also outsource the act of thinking.
We need curation. But we also need to acknowledge that perfecting the act of getting information has a limited payoff. It’s what we do with it that counts, and unless we acknowledge that obtaining information is not in and of itself a goal, we will continue to tread water in the rising sea of information.
Rabbit holes
5 October 2020
I’ve been exhausted today. There have been a number of things I’ve wanted to write about, but I’ve found it difficult to get anything going. Sitting on the couch tonight, contemplating my exhaustion, I was reminded of an Andrew Sullivan passage from a few weeks ago:
“My average screen time this past week was close to ten hours a day. Yes, a lot of that is work-related. But the idea that I have any real conscious life outside this virtual portal is delusional. And if you live in such a madhouse all the time, you will become mad. You don’t go down a rabbit-hole; your mind increasingly is the rabbit hole — rewired that way by algorithmic practice. And you cannot get out, unless you fight the algorithms to a draw, or manage to exert superhuman discipline and end social media use altogether.”
My first observation is that this was written three weeks ago. Then, Sullivan wrote of “a gut-churning anxiety” and “a sense that the system itself was buckling”. This was before Ruth Bader Ginsburg passed, before elements of Trump’s tax returns were exposed, before the first debate, and before Trump contracted COVID-19. You could say that Sullivan had no idea what was coming. Or you could say that while the specific form of chaos to come was somewhat unknown, chaos itself was guaranteed.
Nothing about the last few weeks has been surprising in a directional sense, but I’ve still been unable to pull myself back from the hour-by-hour horror show. Like my friend Oscar, I also spent large chunks of a gorgeous New York weekend “spiritually degrading myself on Twitter by reading about Trump’s disease”. Fast forward to Monday, and I still couldn’t help myself.
I also use Twitter for more productive purposes, like following people, writers and companies relevant to my work, interests or aspirations. Today, I came across a number of people and companies working on products and concepts I’ve been thinking about for some time. I furiously followed, subscribed and signed-up, chasing some elusive outcome, or at least a step “forward”, whatever that means. I wanted to engage, build, and work, but instead was spinning wheels mentally.
And that’s where I come back to Sullivan’s comment about rabbit holes. In both cases I was tumbling down rabbit holes while the world around me remained remarkably still; it’s my mind that’s untethered. Today and this weekend, my mind itself was the rabbit hole.
If we do not exercise at least some control, we are submitting our minds to be “re-wired by algorithmic practice”. I write and meditate daily to protect and strengthen the space between my ears, yet so willingly surrender it to Twitter’s endlessly updating feed. Josh Wolfe said it best in, ironically, a Twitter thread:
“Time + attention are the scarcest resource. Willingly paying it to others in an unscripted circus show––as a passive spectator, a captive audience member gasping and cheering––is a squandering abdication of your own ambition and the good you must put out. Get up, get out, get at it––and get away from the vortex.”
Get up, get out, get at it.
Photo by ASTERISK