Limn: Utopian Hacks

Compiled from: Utopian Hacks - Limn, Author: Götz Bachmann

In a lab in Oakland, a group of elite yet unconventional engineers is trying to reimagine what computers can do and what they should do. It is in this lab, located in Silicon Valley (or near it, depending on how you draw its boundaries), that my ethnography is based. The team gathers around an engineer named Bret Victor and is part of the Human Advancement Research Community (HARC) at YC Research, an industry-funded lab dedicated to open, foundational research. Calling the members of this group "hackers" is my choice rather than theirs: for these engineers, as for many others, "hack" is at best a word for experimental, makeshift work ("it's just a hack") or for using a technology for purposes other than those it was built for. It can also be a pejorative term, pointing to the consequences of amateurish, low-quality technical work. When the engineers I study describe their own work, "hacker" is therefore not one of the terms they would choose. I nevertheless want to argue that some of their practices resemble hacking, albeit in a different register. This article asks: How do these engineers hack the imaginaries of what technology is and what it could be?

I approach this question by analyzing what these engineers do as, for lack of a better term, "radical engineering." Radical engineers fundamentally challenge existing concepts of technology (here: digital media): its essential characteristics, its purposes, and its possible futures. Their radicalism should not be confused with political radicalism, with "disruptive" radicalism, or with the radicalism of particular engineering outcomes. It is a radicalism that places them outside of what counts as obvious, self-evident, time-tested, or desirable in engineering. Their stance is heretical enough that they often no longer call themselves "engineers," yet no other word quite replaces it. They may reach for terms such as "artist" or "designer" in Horst Rittel's sense, but both remain unstable and invite misunderstanding. After all, these are people trained in disciplines such as electrical engineering, mechanical engineering, computer science, or mathematics, and their work often requires solving highly complex technical problems.

Bret Victor's team is trying to establish a new medium. Getting there is not a matter of a sudden flash of inspiration but a sustained, stubborn process of working beyond what can currently be imagined. The lab takes existing technologies such as projectors, cameras, lasers, whiteboards, computers, and Go pieces and reconfigures them with new or historical ideas about programming paradigms, system design, and information design, as well as with a range of assumptions and visions about cognition, communication, social interaction, politics, and media. The team is building a series of operating systems for a spatial dynamic medium, each one drawing on the experience of building its predecessor, and each taking roughly two years to construct. The current operating system is named "Realtalk"; its predecessor was called "Hypercard in The World" (both names pay homage to historical, unconventional programming environments: Smalltalk from the 1970s and Hypercard from the 1980s). Developing such an operating system involves writing and rewriting code and declarations, a lot of conversation and even more moments of collective silence, spells of iteration and adjustment, the digestion of films, books, and numerous technical papers, and the building of dozens, in fact hundreds, of hardware and software prototypes.

Prototypes are everywhere in the lab, and new ones are added every week. One month, a visitor can point a laser at a book in the library and a projector will throw that book's contents onto the wall beside her. A few weeks later, you would see people jumping around on the floor playing a game of "laser socks," each trying to shine a laser onto the other's white socks. A few months later, a table turns into a pinball machine made of projected light, and videos of cats trail after every rectangle drawn on paper. At the moment, the group is experimenting with "small languages" in spatial media: domain-specific programming languages based on paper, pens, scissors, Go pieces, or string, all endowed with dynamic properties and thus able to directly steer computation or to visualize complexity. The point of all these prototypes is not dazzling technical complexity. Quite the opposite: the aim is simplicity and simplification. As a rule of thumb, the fewer lines of code a prototype involves, and the simpler those lines are, the more successful it is considered to be.
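To make that simplicity criterion concrete, here is a minimal, purely hypothetical sketch in Python of what the core of a "laser socks"-style prototype might amount to: a camera watches the play area, and a hit is counted whenever a bright laser dot becomes visible. This is not the lab's Realtalk code; the camera setup, color thresholds, and scoring rule are my own assumptions. The point is only how few, and how plain, the lines of such a prototype can be.

```python
# Hypothetical sketch of a "laser socks"-style prototype (not Realtalk code).
# Assumes a webcam aimed at the play area and a red laser pointer.
import cv2

cap = cv2.VideoCapture(0)                      # default webcam
score = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # A bright red laser dot: red hue band, strong saturation, high brightness.
    mask = cv2.inRange(hsv, (0, 120, 220), (10, 255, 255))
    if cv2.countNonZero(mask) > 5:             # dot visible this frame: count a hit
        score += 1
        print("hit!", score)
    cv2.imshow("laser", mask)
    if cv2.waitKey(30) == 27:                  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```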

Illustration by David Hellman (draft), imagining Dynamicland, the next iteration of the spatial dynamic medium, in 2017

As interesting as these prototypes are, they remain "working artifacts," setting "traps" for potentialities with their "illusion of self-movement." In Bret Victor's research group, the work of prototypes is to capture and exhibit the potential qualities of a new, spatial, dynamic medium. One of these desired qualities is simplicity, and the prototypes that demonstrate it are the ones usually selected as successful. Every two years or so the whole process yields a new operating system, which in turn enables a whole new generation of prototypes; these are usually (though not always) built on the capabilities of the current operating system while already probing the potential of the next one. The overall goal is a fundamental breakthrough on the order of the technological leap of the 1960s and early 1970s, when the fourfold introduction of microprocessors, personal computers, graphical user interfaces, and the internet fundamentally transformed computation by turning computers into a medium. Turning computation into a medium in the 1960s and 1970s meant confronting technology with technology: new computational capabilities were used to create a medium that conformed less to what people at the time thought computers "were" and more to what a dynamic version of paper might look like. In the work of Bret Victor's research group, this way of working against computation with computation becomes radical.

The patron of this endeavor, in spirit and in person, is Alan Kay, one of the most famous radical engineers and a key contributor to those breakthroughs in 1960s and 1970s computing that Bret Victor's team is now trying to match. Consider Alan Kay. In the 1960s he began working in the newly founded computer science department at the University of Utah, where he wrote what is arguably one of the boldest doctoral theses of all time, a wild technical dream of a new kind of computing. The thesis opens with the desperate cry of another radical engineer, Charles Babbage ("I wish these computations were done with steam"), and after 250 pages of reflections on "reactive engines" it culminates in the manual for a fictional "Flex Machine": the first iteration of a series of ideas that would later peak in Kay's vision of the "Dynabook" (1972). While working on this thesis, Kay became one of the young members of a research community funded by the Pentagon's Advanced Research Projects Agency (ARPA) through its Information Processing Techniques Office (IPTO), which was then taking the first steps toward building the ARPANET. In the early 1970s, after a stint of postdoctoral work with John McCarthy at Stanford University, Kay joined Bob Taylor's new research lab at Xerox PARC, where legendary engineers such as Lampson, Thacker, Metcalfe, and many others were building the Alto system, the first system to network stand-alone machines equipped with advanced graphical capabilities.

Once the first iteration of the Alto/Ethernet system began to run (it is crucial to understand it as a system rather than as a stand-alone computer), it gave Alan Kay a powerful playground. Kay went back to his 1960s analysis of SIMULA (then an obscure Norwegian programming language) and, together with Dan Ingalls and Adele Goldberg, developed Smalltalk, a hybrid of programming language, operating system, and children's toy. The first iteration of Smalltalk was an object-oriented experiment that aimed to model all programming from scratch on a distributed messaging system; later versions abandoned this, and after an initial phase of success Smalltalk eventually lost its dominance in object-oriented programming to languages such as C++ and Java. In the mid-1970s, however, the Alto/Ethernet/Smalltalk system became a hotbed of ideas about graphical user interfaces (GUIs) and of many applications that are now commonplace. The work of Kay and his "Learning Research Group" can thus be seen both as a lost holy grail of computing, cut short by the model of computing that capitalism cast into hardware and software, and as one of the key genealogical centers from which that model later emerged. It is precisely this double significance that makes this work so unique and interesting to this day.
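To give a rough sense of what "modeling all programming on a distributed messaging system" means, here is a small conceptual sketch in Python rather than Smalltalk. The `Obj` class and its `send` method are illustrative inventions, not Smalltalk-72's actual syntax or semantics; the point is only that objects do nothing except receive messages and react to them.

```python
# Conceptual sketch: computation as objects exchanging messages (Python, not Smalltalk).

class Obj:
    """An object that reacts only to messages sent to it."""
    def __init__(self, handlers):
        self.handlers = handlers                      # message name -> behavior

    def send(self, message, *args):
        handler = self.handlers.get(message)
        if handler is None:                           # loose echo of Smalltalk's doesNotUnderstand
            return self.handlers["doesNotUnderstand"](message)
        return handler(*args)

state = {"value": 0}
counter = Obj({
    "increment": lambda: state.update(value=state["value"] + 1),
    "value": lambda: state["value"],
    "doesNotUnderstand": lambda msg: f"unknown message: {msg}",
})

counter.send("increment")
counter.send("increment")
print(counter.send("value"))      # -> 2
print(counter.send("reset"))      # -> unknown message: reset
```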

A whiteboard in Bret Victor's lab filled with Alan Kay's papers

Alan Kay's contributions to the history of computing are the result of a radical break with the computing paradigms and imaginaries of his time. Kay took up unorthodox programming techniques pioneered by SIMULA, new visualization techniques developed by the Sutherland brothers, McCarthy's desire for "private computing" and Wes Clark's "lonely machines," the augmentation experiments of Doug Engelbart's group, new ideas about distributed networks, and more. Such techniques were not common in the emerging fields of software engineering and programming, but they were beginning to circulate in the elite engineering circles in which Alan Kay moved. Kay combined them with the pedagogical, psychological, and mathematical ideas of Maria Montessori, Seymour Papert, and Jerome Bruner, and charged them further with Marshall McLuhan's then-fashionable media theory. Kay also understood early on the implications of what Carver Mead called "Moore's Law": an exponential trajectory of ever smaller, faster, and cheaper computation, triggered by mass-produced integrated circuits and feeding back into technological development and the creation of new markets. Alan Kay recombined all of these ideas, desires, technologies, and opportunities. The result was an important contribution to a new and emerging sociotechnical imaginary of the computer as a digital medium, which is in many ways the one we have today. Alan Kay's work can therefore serve as a benchmark for radical engineering, one that allows us to criticize the current standstill, and perhaps decline, in the quality of most of today's imaginaries of technology.

But is it really that easy? Is radical engineering merely the result of a bit of mixing? Obviously it is a more complex process. One of the most compelling descriptions of this process comes from another legendary radical engineer already mentioned: Doug Engelbart. In 1962, a few years before Alan Kay began his career, Engelbart devised a project for his own research group at the Stanford Research Institute, funded by the U.S. Air Force, aimed at redesigning the "H-LAM/T," the "Human using Language, Artifacts, Methodology, in which he is Trained." This H-LAM/T was always already half machine, and as such it could enter into a continuous process of "augmenting human intellect." Engelbart, together with his collaborator William English, believed that the latter could be achieved through a process of "bootstrapping." In Silicon Valley this term can mean many things, from booting systems to launching startups, but in the context of their work, "bootstrapping" is "…an interesting [recursive] assignment of developing tools and techniques to make it more effective at carrying out its assignment. Its tangible product is a developing augmentation system to provide increased capability for developing and studying augmentation systems." Like Moore's Law, this is a dream of exponential progress arising from nonlinear, self-executing feedback. Could it be any more Californian?

To make sure that Engelbart and English's description is not merely a cybernetic daydream, we need to remind ourselves that they are not talking only about technological artifacts. Simply building prototypes in order to create more prototypes is not a wise move in radical engineering: prototypes tend to break down once put to use, so a toolkit made of prototypes is of limited help in developing further prototypes. "Bootstrapping" as a process can therefore only work if we assume it to be part of a larger process in which "tools and techniques" develop over a longer period alongside social structures and local knowledge. The process is recursive, much like the "recursive publics" Chris Kelty describes in free software communities: in both cases, developers create the sociotechnical infrastructures through which they communicate and collaborate, and these then spread into other areas of life. Kelty shows that this recursive effect is not simply the magical outcome of self-reinforcing positive feedback. Recursive processes rest on politics, resources, qualified people, care, and guidance. In short, they require continuous production.

Bootstrapping can thus vary in scope and direction. As ambitious as Engelbart and English's project sounds, in the 1960s they still believed that bootstrapping within a research group would yield the desired effects. Alan Kay's Learning Research Group expanded this setting in the 1970s through pedagogy and McLuhan's media theory: by bringing in children, they aimed at recursive effects reaching beyond the laboratory, with the long-term goal of drawing the whole world into a bootstrapping-like process. Bret Victor and his research group approach bootstrapping as a many-layered onion. Which kinds of people should become part of it, and when, is the subject of intense internal discussion. Once the group launches "Dynamicland," it will enter the next phase. In the meantime, bootstrapping takes many forms. Prototypes enter the bootstrapping process in many roles: as pointers, tentacles, searches, improvisational repetitions, scaffolding, operating systems, blocking, performances, imagined test cases, demonstrations, and so on. The larger bootstrapping process thus incorporates a wealth of prototyping techniques. Together, they generate in the lab the feeling of sitting inside a brain. The lab as a whole, with its walls, tables, whiteboards, roof, machines, and the people who inhabit it, serves as the first demonstration of an alternative medium.

Details from the HARC lab: the top image shows Alan Kay in white jeans; the bottom image shows Engelbart's 1962 paper, which Bret Victor pasted to a wall in San Francisco's Mission District

Building a series of operating systems, one after another, may require a great deal of engineering work in the traditional sense: writing a kernel in C, for example, or a process host in Haskell. Yet the overall effort is emphatically not technology-driven. In the future spatial medium, computation is supposed to recede. Computation will play the role of infrastructure: just as books need light without mimicking the logic of light, the medium can draw on the computational possibilities provided by the underlying operating systems when necessary, but it should not be driven by them. Instead, the spatial dynamic medium should be driven by the qualities of the medium itself, and those qualities should in turn drive the technology. What the qualities of this medium are is exactly what awaits exploration through the bootstrapping process. In the group's words, both the medium and the way they produce it are "from the future." This future is not a given; it depends on the medium the group imagines, and thus on the qualities of the medium that the group explores, selects, and enacts. On the one hand, technology gives rise to a new medium, which is imagined to shape the future; on the other hand, the future is imagined to shape the new medium, which in turn should drive the technology.

While most of the group's work consists of making things, thinking is part of the work as well. Thinking enables the engineers to understand what the prototyping work reveals; it also directs the lab's work, inspires its endeavors, and plays a part in securing funding. So far, the whole process has produced a series of interconnected and constantly evolving ideas and goals. One cluster of ideas searches for new ways to represent and understand complex systems. A second cluster aims at gaining new kinds of knowledge by overcoming the limitations of contemporary media (such as the limitations of screens, which produce forms of knowledge that are hard to comprehend, like trillions of lines of code written on and then stared at through a screen). A third cluster explores new forms of expressing time, and a fourth explores new ways of bringing physical qualities more fully into spatial media systems. All of these clusters lead to goals and hypotheses about moving more seamlessly up and down the "ladder of abstraction." As if answering the media-theoretical thinking of Nietzsche, McLuhan, or Kittler with engineering means, a larger goal is to make possible new ideas that, owing to the shortcomings of contemporary media, have so far remained "unimaginable." Enhanced, embodied forms of cognition and better ways of generating ideas together might heal the loneliness and suffering that are so often part of deep thinking. Taken together, as one internal email put it, all of this might "prevent the world from splitting apart."

One way to grasp what is happening here is to frame all of this as another form of "hacking." When you "hack," you can hack something apart or hack something together. Hacking apart can be seen as a practice that grows out of refusing to accept prior black-boxing. Transferred to the realm of radical engineering, hacking apart means refusing to accept the black boxes of current technological paradigms, such as screen-based computing, or of ready-made futures such as "smart cities," "smart homes," or "the Internet of Things." Instead, you pry such black boxes open and dissect them: inside you find assumptions about what counts as technological success and about future technological development that fit particular versions of social order, often coupled with unsavory commercial opportunities. The black boxes probably also contain ideas about the different kinds of engineers, programmers, designers, managers, and so on who are supposed to build all this. If you hack these apart, you can inspect the pieces, discard many of them, bend others, add some elements from elsewhere, and grow a few yourself. You study different, often historical technological paradigms, other ideas about what is technically possible (and when), different ideas about social order, the good life, and the problems worth solving, other books worth reading, different uses of the power of media, and different views of which kinds of people, professional or not, should be responsible for all of this. If you are lucky, you have the conditions and the capabilities to do this in a long, nonlinear process, also known as bootstrapping, in which you go through many iterations of hacking apart and hacking together. Along the way you create fundamentally different ideas of what technology should do and could do, give those ideas shape through a range of means and practices, and demonstrate to yourself and to others that some utopias may not be so far away. This is what radical engineers do.

While they take considerable pains to escape the fantasies of technological solutionism, they do not abandon the engineering approach of solving problems by building things; what they have developed might be called "radical media solutionism," though their own attitude toward the latter is ambivalent. To avoid misunderstanding: neither I nor the engineers I study believe that a genuine future can be pieced together solely by a group of engineers in Palo Alto or Oakland. But I do believe that radical engineers like Engelbart, Kay, or Victor's research group, from their specific and highly privileged positions, add something crucial to the complex forces that push us toward the future. My ongoing fieldwork has made me curious about what is produced here, and many visitors to the lab agree that the first results to have "arrived" are indeed astonishing. If we follow the group's own self-understanding, their technologies are like hacks: merely provisional stand-ins for something greater that may arrive one day. Radical engineers would also be the first to admit that such provisional solutions, if development stalls and they concretize prematurely, can become hacks in the pejorative sense. According to their own stories, that is exactly what happened forty years ago, when prototypes left the lab too early and entered the worlds of Apple, IBM, and Microsoft, resulting in a multitude of poor decisions that now have people staring at smartphones.

In such stories, radical engineers may indulge in a retrospective "what could have been," mixed with a trace of distinction from "normal" engineers. And even where they keep their distance from Silicon Valley's startup culture, their seal against the "Californian Ideology" may not always be watertight; in fact, they may provide the Silicon Valley mainstream with much-needed heretical solutions. Nevertheless, these radical engineers are potential allies for anyone who wants to break open the liberal, authoritarian, and disempowering fantasies that Silicon Valley usually offers us, be it a "bullshit internet" or a "garbage internet." The conceptual poverty of most of the futures currently on offer in Silicon Valley can certainly be made visible from the perspective of critical theory, from the standpoint of social movements, or through the analysis of political economy. But if we measure Silicon Valley against the utopias of radical engineering, its intellectual timidity becomes apparent as well, a timidity that the disruption it causes merely masks.

Alan Kay in a Japanese comic created by Mari Yamazaki

About the author: Götz Bachmann is a professor of digital culture at the Institute of Culture and Aesthetics of Digital Media at Leuphana University in Germany and the convener of its Bachelor's program in Digital Media. He is currently also a visiting scholar at Stanford University. As an ethnographer he has conducted fieldwork among warehouse workers, salespeople, and cashiers in Germany, as well as among Nico Chuu (devoted users of the video platform Nico Nico Douga) in Japan. He has also written the German children's comic series KNAX.
