Claus-Michael Schlesinger, a cultural scientist and research assistant with the Digital Humanities Department at the University of Stuttgart’s Institute of Literatures (ILW), is researching the history of information aesthetics and also deals with the question of whether computers will write novels one day.
Literature, and texts that convey information on several levels, are often assumed to be the exclusive preserve of the intellectual capacities of the human brain. Schlesinger, however, would agree with this assumption only to a limited extent: he knows that computers are more than capable of writing meaningful texts. Moreover, such texts are not even an innovation of our time, in which artificial intelligence plays an increasingly dominant role in everyday life. Stuttgart, for instance, was not only a center of technical development in the early days of the computer, but was also at the forefront of the astonishing use of computer technology in the arts.
“NOT EVERY GLANCE IS NEAR AND NO VILLAGE IS LATE.” This sentence was output at the University of Stuttgart in 1959 by a Zuse Z22, one of the first serially produced vacuum-tube computers in Germany. Theo Lutz, a native of Esslingen, computer scientist, and long-time professor at the Esslingen University of Applied Sciences, had written an algorithm and fed the computer 16 subjects and 16 predicates from Franz Kafka's “The Castle”, together with certain logical constants and operators. The result was one of the first attempts at literature with the aid of a computer: several million possible sentence pairs like the one cited above.
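Lutz's scheme can be illustrated with a short sketch. This is not his original program: the quantifiers, connectives, and most of the word lists below are illustrative placeholders; only GLANCE/NEAR and VILLAGE/LATE are attested by the example sentence above.

```python
import random

# Placeholder word lists standing in for Lutz's 16 subjects and
# 16 predicates drawn from Kafka's "The Castle". Only GLANCE, NEAR,
# VILLAGE, and LATE appear in the documented output sentence.
SUBJECTS = ["GLANCE", "VILLAGE", "CASTLE", "STRANGER"]
PREDICATES = ["NEAR", "LATE", "OPEN", "SILENT"]
QUANTIFIERS = ["EVERY", "NOT EVERY", "A", "NO"]   # logical constants
CONNECTIVES = ["AND", "OR"]                        # logical operators

def clause(rng: random.Random) -> str:
    """One elementary clause: <quantifier> <subject> IS <predicate>."""
    return (f"{rng.choice(QUANTIFIERS)} {rng.choice(SUBJECTS)} "
            f"IS {rng.choice(PREDICATES)}")

def sentence(rng: random.Random) -> str:
    """Two clauses joined by a connective, as in Lutz's sentence pairs."""
    return f"{clause(rng)} {rng.choice(CONNECTIVES)} {clause(rng)}"

rng = random.Random(1959)  # seeded only for reproducibility
print(sentence(rng))
```

Even with full word lists of 16 subjects and 16 predicates, the combinatorics of quantifiers, word choices, and connectives yield the millions of possible sentence pairs the article mentions.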
“Lutz used so-called combinatorial text generators,” Claus-Michael Schlesinger explains. In another experiment, the program generated random mathematical propositions, which Lutz checked against a “truth matrix”, the idea being that the more detailed the matrix, the better the program would be able to judge for itself which sentences are actually true. “That was a very early proposal for what is now called artificial intelligence,” says Schlesinger. Although this idea was developed almost 60 years ago, primarily as a theoretical concept, and has never been put into practice, a direct line can nevertheless be drawn from Lutz's approaches to artificial intelligence as we know it today. Schlesinger joined the Digital Humanities Department as a postdoc in 2016 to continue his work on the history of information aesthetics. “It's a great pleasure for me to work here, because Stuttgart was a hotspot in the interplay between cybernetics and the humanities.”
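The "truth matrix" idea can also be sketched: a lookup table records which subject/predicate pairings are considered true, and generated clauses are kept only if the matrix licenses them. Every entry below is invented for illustration; Lutz's actual matrix and its encoding are not described in the source.

```python
# Hypothetical truth matrix: (subject, predicate) -> truth value.
# All entries are illustrative, not Lutz's original data.
TRUTH_MATRIX = {
    ("VILLAGE", "NEAR"): True,
    ("VILLAGE", "LATE"): False,
    ("GLANCE", "NEAR"): True,
    ("GLANCE", "SILENT"): False,
}

def is_true(subject: str, predicate: str) -> bool:
    # Pairings absent from the matrix default to False; the more
    # detailed the matrix, the more propositions it can judge.
    return TRUTH_MATRIX.get((subject, predicate), False)

# Filter a batch of randomly generated propositions.
candidates = [("VILLAGE", "NEAR"), ("GLANCE", "SILENT"), ("CASTLE", "OPEN")]
accepted = [(s, p) for s, p in candidates if is_true(s, p)]
print(accepted)  # only the pairings the matrix marks as true survive
```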
Among other things, Schlesinger's work involves not only evaluating texts created at that time, but also, to some extent, rediscovering them, because in many cases these pioneering literary efforts survive only as paper printouts. Schlesinger received a box of previously unpublished texts from Gerhard Stickel, the long-time director of the Institute for German Language in Mannheim. Among them was the program code that Stickel had written in the 1960s, as a research assistant at the German Computer Centre in Darmstadt, to create his “autopoems” on an IBM 7090 machine.
In the beginning was the mainframe computer
The concept of information aesthetics was coined by the philosopher Max Bense, who taught at the University of Stuttgart from 1949. Bense's approach was to use the tools of information theory and the mainframe computers available at the time to find aesthetic forms of expression. This may seem unremarkable these days, when computers are often inseparably involved in the creation of music or visual art. But, according to Schlesinger, “the humanities in the 1960s were completely different and had nothing to do with information theory and cybernetics; they were primarily driven by interpretations based on the intrinsic qualities of a given work.” The information aesthetics approach, on the other hand, was far ahead of its time, as Schlesinger confirms.
Schlesinger dismisses the concern that computers might one day be able to independently write literarily challenging novels. Grammatically correct sentences are relatively easy to learn, but when it comes to semantics, which operates at the level of a text's content, computers are largely clueless. “You'd have to feed them extremely large amounts of text for them to learn semantics.” And even then, the actual literary level, the poetics of the text, would still be missing.
So much experimentation was carried out in Bense's and Lutz's day that texts of genuine literary quality were actually created. Today, however, it is mainly the theoretical foundations that have survived. The fact that aesthetic questions began to play a role in the development of complex systems also led to today's machines that write social media commentaries as bots or reformulate sports and stock market results into journalistic reports. “My impression is that a lot is happening with regard to such utility texts because of the potential industrial applications,” says Schlesinger. Once again, he goes on to say, much experimentation is currently being done on “digital literature”, and a small community of fans of the so-called uncanny effect, i.e. the somewhat eerie and mysterious quality of computer-generated literature, certainly exists.