K. Oskar Schulz and Roman C. Ugarte
ChatGPT can do CS50 problem sets, draft section discussion posts, and even help prepare for consulting case interviews. It also passed the bar and did better on most of its AP exams than the majority of students did — and it took all of them. Each week, it seems, brings a new achievement.
It’s easy to ask yourself: Why am I doing this work, then? What’s the point of learning anything?
In this new — and highly unfamiliar — environment, the true use of your knowledge will not be for producing answers. It will be for asking the right questions.
Emerging large language models (like ChatGPT) will make accessing local knowledge, or domain-specific pieces of information, trivially simple. This isn’t an entirely new phenomenon — the internet made the answers to all sorts of specific questions a few clicks away. What’s the capital of Serbia, again?
In a world with these sophisticated models, the core expression of human knowledge will be in deciding what to use these tools for. Without any knowledge, one would not be able to ask the right questions. Imagine you are trying to come up with a new research direction in physics. If you don’t know any physics, you wouldn’t know where to start. In order to ask the most interesting questions, you need to have some general knowledge.
This global knowledge, we’ll call it, comes from exploring a lot of different intellectual areas. Even if you don’t remember the specifics of that introductory psychology course you took freshman year, for example, you can probably retrace your steps to revisit a concept from that class later on. And that law school lecture you sat in on perhaps didn’t leave you with very practical knowledge, but now you have the vocabulary to dive deeper when you need to.
A lot of times, global knowledge — or an interdisciplinary education, as Harvard brands it — is not very practically useful. You might forget everything a few months later. But there will be occasional flashes of inspiration that come from these cross-discipline connections that simply wouldn’t have occurred otherwise. Knowing what you don’t know — in other words, knowing that some kind of knowledge exists — enables you to find it when you need it. Tools like ChatGPT will close this last mile in the journey for knowledge — you just need to know what to ask.
However, there will soon also be types of knowledge work that artificial intelligence can simply do better than most humans.
For a historical analogue, take chess. In 1997, Garry K. Kasparov, a world chess champion, faced off against Deep Blue, a chess supercomputer built by IBM, and, in the last game of their match, conceded defeat for the first time in his career. For many, this was a crucial blow to the supremacy of the human mind over computers.
And yet, chess did not die. Far from it — the community is more active than ever and, thanks to chess engines, today’s competitors are using computers to learn even more deeply about the game and reach new heights.
In a world where a human is always inferior to a computer at chess, what value does the game serve? When real-time translation systems are widely used, why bother learning another language? Why read a book if ChatGPT already knows everything about it?
It still makes sense to learn things that aren’t useful. The strenuous mental activity involved in playing chess has been suggested to create plasticity changes in the brain that could protect against dementia. There’s a vast literature on the perspective change that comes from multilingualism. And reading a book can teach argumentation and storytelling skills, even if you’ll forget the specifics a few months later.
In many cases — but not all — the process of learning these non-practical skills is the product.
Ira J. Glass, the host and producer of NPR’s This American Life, suggests that there are two elements to a great artist: taste and skill. Most creative people, he says, have great artistic taste; it’s the skill that they’re working on developing. Emerging artists start off with poor skill: They know what great art looks like but aren’t experienced enough to pull it off. With enough work on their craft, according to Glass, they can lessen this skill gap and finally start making the great art they’ve imagined.
But the development of new generative tools has collapsed this skill gap across a variety of domains. Amateur developers can whip up an iPhone app in a day or two. Digital artists can design entirely new worlds, in high resolution, in a matter of minutes. Novelists can write books in the span of weeks, instead of years.
It prompts us to ask: What skills do I have, honestly, that aren’t at risk of replacement by artificial intelligence? Probably very few.
However, the limiting factor here isn’t the quantity of output — we’re entering into an era of content abundance — it’s how much of it is actually good. And that’s a question that, at least for now, is deeply human. It’s what comes from having a well-developed taste — the one remaining element needed for a great artist in the age of AI.
Taste can take all types of forms. For a researcher, it might be how you decide on what question to investigate in the first place, not the act of running the regression. For a journalist, it’s the intuition you have for chasing a story and collecting the right information, not necessarily turning it into words. And for a university professor, it could be the ability to excite and mentor students, not necessarily teaching the subject material.
It’s hard to know where this is all headed. One thing is clear, though: This should change both what and how we choose to learn. If, at Harvard or elsewhere, you have the opportunity to expand your global knowledge, pick up useful — but perhaps not practical — skills, and invest in your taste, you should take it.
Source: The Harvard Crimson