Synthesis is appropriation

In contemporary society, we make use of the notion that things may be synthetic. Thus we may speak of synthetic biology, “synthesizers” (synthetic sound), synthetic textiles, and so on. Such things are supposed to be artificial, not coming from “nature”.

However, the Greek root of the word synthesis actually seems to refer to the conjoining of pre-existing things, rather than something being purely man-made. But what does it mean to be purely man-made?

Furniture, bricks, bottles, roads and bread are all made in some sense; they are the result of human methods, tools and craft applied to some substrate. But they do not ever lose the character of the original substrate, and usually this is the point – we would like to see the veins of wood in fine furniture, and when we eat bread, we would like to ingest the energy, minerals and other substances that are accumulated in grains of wheat.

Products like liquid nitrogen or pure chlorine, created in laboratories, are perhaps the ones most readily called “synthetic”, or the ones that would most readily form the basis for something synthetic. This is owing to their apparent lack of specific character or particularity, such as the veins of wood or the minerals in wheat. On the other hand, it is apparent that they possess such non-character only from a point of reference that takes atoms as the lowest level. If we take into consideration ideas from string theory or quantum mechanics, the bottom level most likely shifts, and the pure chlorine no longer seems so homogeneous.

Accordingly, if we follow this line of thought to the end, as long as we have not established the bottom or ground level of nature – and it is questionable if we ever shall – all manufacture, all making and synthesis, is only a rearrangement of pre-existing specificity. Our crafts leave traces in the world, such as objects with specific properties, but do not ever bring something into existence from nothing.

Synthesis is appropriation: making is taking.

Rice fields and rain


Humans primarily live in a world of beings, each of which has meaning. Meaningful beings appear to us interconnected, referencing practices and other beings in a referential totality. Buttons suggest pushing, chairs suggest sitting, a tractor suggests farming. A (Japanese) rice paddy may suggest the heavy labour that goes into the rice harvest each year, the tools and equipment that go with it, as well as the gradual depopulation of the village, since the young ones prefer a different line of work elsewhere. It may be part of the site and locus of an entire set of concerns and an outlook on life.

The world of beings is the one that is most immediate to us, and a world of molecules, atoms, energy or recorded data, useful as it may be, is something much further away. In each case it must be derived and renewed through the use of a growing, complex apparatus of equipment and practices, and a body of concepts, such as the traditions of physics or mathematics. Yet nobody would dispute that these worlds – the world of beings and the calculated world – are interrelated. In some cases they are even deeply intertwined.

But how can we reconcile the calculated world with the world of beings? How exactly do they influence each other? And if the calculated world is expanding aggressively, thanks to the spread of computational machinery and its servants, is the world of beings being pushed back? Receding? Are we abandoning it, since it is no longer good enough for us? Refusing to touch it, other than with thick gloves?

The calculated world concerns itself with propositions, true facts, formal models, records. A conceptual basis is needed to codify and engage with it. A record is formed when an observation is made, and the observer writes down what was observed. Initially, it retains an intimate connection with the world (of beings). The record is interpreted in light of the world and allowed to have its interplay with other beings. The observation “it rained heavily this week” is allowed to mean something in the context of farming, in the context of a possible worry about floods, or as a comment on an underwhelming holiday. Depending on who the reader is and what their concerns are, all these meanings can be grasped. The record may thus alter the reader’s outlook in a way similar to what direct experience of the rainfall would do.

At this level, the only facts we may record are that it rained or did not rain, and whether the rain was heavy or light. But given that we have some notion of space or time, as human beings do, repetition becomes possible. Scales for measuring time and space can be constructed. The rainfall can now be 27 or 45 mm. We are now further away from the world of farming, floods and holidays – “45 mm” of rain needs to be interpreted in order to be assigned any meaning. It has been stripped of most of the world where it originated. The number 45 references only the calculable repetition of an act of measurement. Enabled by the notions of space and time, it already tries to soar above any specific place or time to become something composable, calculable, nonspecific. Abstraction spreads its wings and flaps them gently to see if they will hold.

And so on, all the way up to probability distributions, financial securities, 27 “likes” in a day on social media, and particle physics. At each level of the hierarchy, even when we purport to move “downward” into the “fundamentals” of things, layers of meaning are shed and a pyramid of proverbial ivory soars to the sky.

Spatial and temporal observations depend on measurement on linear scales, such as a stopwatch or a ruler. Such scales are first constructed through repeated alignment of some object with another object. Such repeated alignment depends on counting, which in turn depends on the most basic and most impoverished judgment: whether something is true or false, does or does not accord. Thus something can have the length of 5 feet or the duration of 3 hourglasses: it accords with the basic unit a certain number of times. This accordance is the heavily filtered projection of a being through another. The side of a plot of land is measured, in the most basic case, by viewing the land through a human foot – how many steps or feet suffice to get from one side to the other? Even though the foot is actually able to reveal many particularities of the land being measured – its firmness, its dampness, its warmth – the only record that this attitude cares to make is whether or not spatial distance accords, and how many times in succession it will accord. All kinds of measurement devices, all quantitative record making, follow this basic principle. Thus, the calculable facts are obtained by a severe discarding of a wealth of impressions. This severity is obvious to those who are being trained to judge quantitatively for the first time, but it is soon internalised and accepted as a necessity. Today, these are precisely the facts we are accustomed to calling scientific and objective.
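To make this filtering concrete, here is a minimal sketch in Python (all names invented for illustration) of what such a measurement keeps: of everything the foot could reveal about the land, only a yes/no accordance survives, counted.

    # A toy model of measurement as repeated accordance (illustrative only).

    def sense(ground):
        """One step's worth of impressions: the foot notices much..."""
        return {
            "firmness": ground.get("firmness", 0.5),
            "dampness": ground.get("dampness", 0.5),
            "warmth": ground.get("warmth", 0.5),
            "accords_with_one_foot": True,  # ...but only this is kept
        }

    def measure_in_feet(plot):
        """Walk the plot edge to edge; the record is just a count."""
        count = 0
        for ground in plot:  # one patch of land per step
            impressions = sense(ground)
            if impressions["accords_with_one_foot"]:  # true/false judgment
                count += 1
            # firmness, dampness and warmth are discarded without record
        return count

    plot = [{"firmness": 0.7, "dampness": 0.2}] * 12  # twelve steps of land
    print(measure_in_feet(plot), "feet")  # -> 12 feet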

But the accordance of beings with distance or time is, of course, very far from the only thing we can perceive about them. The being emits particular shapes, configurations, spectra that make impressions on us and on other beings. Thus it is that we may perceive any kind of similarity – for example the notion that two faces resemble each other, that a dog resembles its owner, or that a constellation of stars looks like a warrior. We delight in this particularity, which in a way is the superfluous or excess substance of beings – it is not necessary for their perception but it forms and adds to it. Thus the stranger I met is the stranger with a yellow shirt and not merely the stranger. He can also be the stranger with a yellow shirt and unkempt hair, or the stranger with a yellow shirt and unkempt hair and a confident smile, and so on – any number of details may be recorded, any number of concepts may be brought into the description. These details are not synthetic or arbitrary. But they are also not independent of the one who observes. They depend both on a richness that belongs to the being under observation, and on the observer’s ability to form judgments and concepts, to see metaphorically, creatively and truthfully.

Such impressions, which carry a different and perhaps more immediate kind of truth than the truth that we derive from calculations and records, may now have become second class citizens in the calculated world that grows all around us.

Reading shelf, September 2016

Currently reading:

C.G. Jung: Nietzsche’s Zarathustra (vol. 2) (seminar notes)

J.G. Ballard: Empire of the Sun (fiction)

Eric Hobsbawm: Age of Extremes (nonfiction)


Just finished:

J.G. Ballard: Extreme Metaphors (interviews)

Ursula K. Le Guin: The Earthsea Quartet (fiction)

AI and the politics of perception

Elon Musk, entrepreneur of some renown, believes that the sudden eruption of a very powerful artificial intelligence is one of the greatest threats facing mankind. “Control of a super powerful AI by a small number of humans is the most proximate concern”, he tweets. He’s not alone among Silicon Valley personalities in having this concern. To reduce the risks, he has funded the OpenAI initiative, which aims to develop AI technologies in such a way that they can be distributed more evenly in society. Musk is very capable, but is he right in this case?

The idea is closely related to the notion of a technological singularity, as promoted by, for example, Kurzweil. In some forms, the idea of a singularity resembles a God complex. In C.G. Jung’s view, as soon as the idea of God is expelled (for example by saying that God is dead), God appears as a projection somewhere. This is because the archetype or idea of God is a basic feature of the (western, at least) psyche that is not so easily dispensed with. Jung directs this criticism at Nietzsche in his Zarathustra seminar. (Musk’s fear is somewhat more realistic and, yes, more proximate than Kurzweil’s idea, since what is feared is a constellation of humans and technology, something we already have.)

But if Kurzweil’s singularity is a God complex, then the idea of the imminent dominance of uncontrollable AI, about to creep up on us out of some dark corner, more closely resembles a demon myth.

Such a demon myth may not be useful in itself for understanding and solving social problems, but its existence may point to a real problem. Perhaps what it points to is the gradual embedding of algorithms deeply into our culture, down to our basic forms of perception and interaction. We have in effect already merged with machines. Google and Facebook are becoming standard tools for information finding, socialising, getting answers to questions, communicating, navigating. The super-AI is already here, and it has taken the form of human cognition filtered and modulated by algorithms.

It seems fair to be somewhat suspicious – as many are – of fiat currency, on the grounds that a small number of people control the money supply, and thus the value of everybody’s savings. On similar grounds, we need to debate the hidden algorithms and the pre-digested information that we now use to interface with the world around us almost daily – algorithms controlled by a small number of people, and generally not available for perusal even on request, since they are trade secrets. Has it ever been so easy to change so many people’s perception at once?

Here again, as is often the case, nothing is truly new. Maybe we are simply seeing a tendency that started with the printing press and the monotheistic church, taken to its ultimate conclusion. In any case I would paraphrase Musk’s worry as follows: control of collective perception by a small number of humans is the most proximate concern. How we should address this concern is not immediately obvious.

The minimal genome of Craig Venter’s Syn3.0

The J. Craig Venter Institute has published a paper detailing the genome of their new Syn3.0 synthetic organism. The major accomplishment was to construct a viable cell with a synthetic, extremely small genome: only 473 genes and about 531 kbp.

Even though it is considered to be fully “synthetic”, this genome is not built from scratch. Instead, the starting point is the Mycoplasma mycoides bacterium, from which genes and regions are deleted to produce something that is much smaller, but still viable. This means that even this fully synthetic genome still contains regions and functionalities that are not fully understood. M. mycoides was also the basis for JCVI’s Syn1.0, which was produced in 2010, but the genome of Syn3.0 is the smallest so far – “smaller than that of any autonomously replicating cell found in nature”. Syn3.0 should be a very valuable starting point for developing an explicit understanding of the basic gene frameworks needed by any cell for its survival – the “operating system of the cell” in the words of the authors.

Since so many genes are still basically not understood, the authors could not rely entirely on logic and common sense when choosing which genes to remove. They used an approach that introduced random transposon mutations into the starting organism, and then checked which mutants were viable and which were not. This allowed them to classify genes as essential, inessential or quasi-essential (!). The deletion of essential genes would cause the cell to simply die. The deletion of quasi-essential genes would not kill it, but would dramatically slow its replication rate, severely crippling it. The final Syn3.0 organism has a doubling time of about 3 hours.
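As a rough sketch of this logic (not JCVI’s actual pipeline – the data format and the growth-rate cutoff below are invented for illustration), the classification might look like this in Python:

    # Illustrative sketch of the essential / quasi-essential / inessential
    # classification. The threshold and data are invented, not JCVI's.

    def classify_gene(viable_when_disrupted, relative_growth_rate):
        """Classify a gene from what happens when it is disrupted.

        viable_when_disrupted: did cells survive disruption of this gene?
        relative_growth_rate: mutant growth relative to the parent strain
        (1.0 = unimpaired), for viable mutants.
        """
        if not viable_when_disrupted:
            return "essential"           # deletion simply kills the cell
        if relative_growth_rate < 0.5:   # invented cutoff: "severely crippled"
            return "quasi-essential"     # viable, but replication slows badly
        return "inessential"             # a candidate for deletion

    # Hypothetical disruption results: (gene, survived?, relative growth)
    observations = [
        ("replication_a", False, 0.0),
        ("transport_x", True, 0.3),
        ("unknown_153", True, 0.9),
    ]
    for gene, survived, growth in observations:
        print(gene, "->", classify_gene(survived, growth))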

Some of the points I took away from this readable and interesting paper were:

Synthetic biology methods are starting to resemble software development methods. The authors describe a design-build-test (DBT) cycle that involves several nontrivial methods, such as in silico design, oligonucleotide synthesis, yeast cloning, insertion into the bacteria, testing, and then (perhaps) sequencing to go back to the computers and figure out what went wrong or what went well. Thus, a feedback loop between the cells and the in silico design space is set up.
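In caricature, the loop is close to a software build pipeline. The Python sketch below is only a stand-in: every function is a stub for a nontrivial lab or in silico process, and none of it is JCVI’s actual tooling.

    # A caricature of the design-build-test (DBT) cycle. Each stub stands
    # in for a nontrivial step described in the paper.

    from dataclasses import dataclass

    @dataclass
    class TestResult:
        viable: bool
        doubling_time_hours: float

    def build(design):
        """Stands in for oligo synthesis, yeast cloning and transplantation."""
        return design  # pretend the build is faithful

    def test(cell):
        """Stands in for growing the culture and measuring it."""
        return TestResult(viable=len(cell) > 400, doubling_time_hours=3.0)

    def redesign(design, result):
        """Stands in for sequencing and in silico redesign."""
        return design + ["restored_gene"]  # e.g. put back a needed gene

    def dbt_cycle(design, max_rounds=10):
        for _ in range(max_rounds):
            cell = build(design)               # design -> build
            result = test(cell)                # build -> test
            if result.viable:
                return design, result          # the feedback loop closes
            design = redesign(design, result)  # test -> (re)design
        raise RuntimeError("no viable design found")

    design, result = dbt_cycle(["gene_%d" % i for i in range(473)])
    print(len(design), "genes; doubling time", result.doubling_time_hours, "h")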

A very small genome needs a very tightly controlled environment to survive. The medium (nutrient solution) that Syn3.0 lives in apparently contains almost all the nutrients and raw materials it could possibly need from its environment. This means that many genes that would normally be useful for overcoming adverse conditions, perhaps for synthesising nutrients that are not available from the environment, are now redundant and can be removed. So when thinking about genome design, it seems we really have to think about how everything relates to a specific environment.

The mechanics of getting a synthetic genome into a living cell are still complex. A huge amount of wet-lab (and, presumably, dry-lab) work is still needed to get the genome from the computer into something viable in a cell culture. However, things are going much faster than in 2010, and it’s interesting to think about where this field might be in 2021.