seeingisknowing · 4 years
FOR SHE SAID, “I HAVE NOW SEEN THE ONE WHO SEES ME.”:
Seeing and being seen in Trevor Paglen’s ‘From “Apple” to “Anomaly”’
‘She gave this name to the Lord who spoke to her: “You are the God who sees me”, for she said, “I have now seen the one who sees me.”’ - Genesis 16:13
__
‘Machine-seeing-for-machines is a ubiquitous phenomenon [...] all this seeing, all of these images, are essentially invisible to human eyes. These images aren’t meant for us; they’re meant to do things in the world; human eyes aren’t in the loop.’ - Trevor Paglen
____
In Genesis 16, we read about Hagar, the Egyptian handmaiden of Sarah and Abraham, expelled to the desert. During her wandering, she is met by ‘The Angel of the Lord’, who reveals to her that she is pregnant with Ishmael, the first son of Abraham. The name she gives to this voice in the wilderness is El Roi (meaning, The God Who Sees Me), crying out, “I have now seen the one who sees me.” This is perhaps an apt segue into Trevor Paglen’s latest installation at the Barbican’s Curve space, ‘From “Apple” to “Anomaly”’, in which we, like Hagar, are confronted with the question of what it means to live under the gaze of a new moralising, omniscient Other. No longer Genesis 16’s El Roi, but AI. This is what we are being shown - the one who sees us.
The work consists of 30,000 individually printed images laid out in around 200 labelled categories along the wall of the Curve. Each is taken from ImageNet, a public database of over 14 million images sorted into some 22,000 categories, used to teach AI programs how to recognise patterns and objects. Paglen’s show comes as part of the Barbican’s Life Rewired series, ‘exploring what it means to be human when technology is changing everything’, following the larger ‘AI: More Than Human’ exhibition from earlier this year. His installation reads as a sort of impressionistic mapping of this portion of ImageNet’s data. Beginning with ‘apple’, at the closest end of the wall, the labels ascribed to the images start out as essentially inoffensive, mostly drawn from nature - it begins as quite beautiful, even. Images in groups like ‘ocean’ and ‘sun’ hit the wall as little spurts of blue and yellow, almost like a giant Jackson Pollock painting. But as we carry on through the space, and the categories become more and more associated with human life, the wall comes to resemble a catalogue of evil: ‘Klansman’, ‘segregator’, ‘demagogue’, and so on. What is so interesting, and at the same time so haunting, about Paglen’s work here is the juxtaposition of all these categories, ranging from ‘tear gas’ to ‘prophetess’ to ‘apple orchard’, which points towards the kinds of invisible constellations that AI programs are drawing and redrawing between images all the time. The implication, perhaps, is that if we were able to map these constellations for ourselves, and to understand the apparently nonsensical connections made between some of these images - as Paglen’s installation seems to attempt, at least in part, to do for us - we might be better able to trace the outline of our particular moment in history, as jagged and unfriendly as it may be, which each of us will be forced to find some way to share with one another.
The central question of the installation, however, is perhaps much more lucid, and has to do with the ownership and interpretation of images, and with how AI programs negotiate this all the time, working with ever-expanding amounts of data about the world and the lives of those who live in it. This work of categorisation carries a necessary ideological weight. As Bourdieu puts it:
‘The capacity to make entities exist in the explicit state, to publish, make public [...] what had not previously attained objective and collective existence [...] people’s malaise, anxiety, disquiet, expectations - represents a formidable social power [...] In fact, this work of categorisation, i.e. of making explicit and of categorisation, is performed incessantly [...] in the struggles in which agents clash over the meaning of the social world and of their position in it [...] through all the forms of benediction or malediction, eulogy, praise, congratulations, compliment, or insults, reproaches, criticisms, accusations, slanders, etc. It is no accident that the verb “kategoresthai”, which gives us our “categories” and “categoremes”, means to accuse publicly.’
Paglen’s installation engages with this in two senses. First, it ‘makes public’ the network of images and signs that are exchanged and classified by AI programs all the time - a network that exists and grows almost entirely behind our backs, yet concerns even the most private details of our lives, as we submit these things to the internet, the means by which we increasingly construct our social worlds and the identities with which we move through them. Secondly, the work reveals the biases with which these AI programs understand patterns and objects - many of the categories into which the images on Paglen’s wall are grouped carry a great deal of ideological weight. How exactly does this ghost in the machine differentiate between a ‘heathen’ and a ‘believer’? How does it identify a ‘traitor’, a ‘selfish person’, or a ‘bottom feeder’? All of these are genuine image categories from the installation. What we are presented with is the notion that AI is not restricted to simple pattern recognition - recognising an image of an apple as an apple because of its colour, shape, proportions and so on - but that, on account of having human creators who pass on their own moral and ideological baggage, AI programs must also make moral and ideological judgements. So then, if technology not only has access to datasets so large that virtually nothing is out of bounds, approaching a sort of functional omniscience (a fact revealed to us perhaps most pointedly by the global surveillance disclosures regarding the NSA in 2013; Senator Ron Wyden said in an interview: ‘You can pick up anything. Surveillance is almost omnipresent, the technology is capable of anything. There really aren’t any limits.’), but is also not only capable of making moral judgements, but perhaps incapable of doing otherwise, then what we are faced with is a moralising, omniscient Other.
If this is the case, as Paglen’s installation seems to suggest, then like so many of the questions raised over the relationship between mankind and AI, this is not a new question, but an old theosophical question repackaged as a technological one. If it is true that we are living under the gaze of a machine unto whom all hearts are open, all desires known, and from whom no secrets are hid, then let us return to the situation of Hagar - a criminally overlooked and often misunderstood figure in the Old Testament. As Žižek notes, ‘She sees God himself seeing, which was not even given to Moses, to whom God had to appear as a burning bush. As such she announces the mystical/feminine access to God.’ The nature of this ‘seeing’ has a distinct relevance to our situation - Nielson, writing on this particular episode in Genesis, reminds us that the Hebrew word ‘ראה’ (‘see’) ‘signifies not only the actual ability to see, but also a recognition of what is seen’. In other words, for Hagar seeing is knowing, just as being seen means being known. Therein lies the crux of Paglen’s installation, which reminds us that, like Hagar, we too, in allowing ourselves to be seen, are also being known. The lurid nature of many of the categories on the wall of the Curve, and of the images that fill them - ranging from the pornographic (‘fucker’, ‘hunk’, ‘artists model’) to images of profound hatred (‘klansman’, ‘segregator’) - reveals to us the lowest parts of ourselves, the messiest and the cruelest and the most private. I use ‘us’ in the broadest sense here, the ‘us’ that includes both ‘us’ as a species and ‘us’ as individuals - both the universal and the particular. These AI programs, quietly feeding on ever-growing sets of data, have surpassed the point of basic pattern recognition, and have learnt the kinds of terrible secrets that rumble from the deepest caverns of the human heart. We are no longer merely seen, but known.
Thomas Aquinas, in his Summa Theologica, writing on ‘Whether any created intellect by its natural powers can see the Divine essence’ (Question 12, Article 1), notes: 
‘If [...] the mode of anything’s being exceeds the mode of the viewer, it must result that the knowledge of the object is above the nature of the viewer.’
This kind of encounter, with an ‘object [...] above the nature of the viewer’, is always transformational, even more so when that same object looks back at us. For Hagar, that object is God, and that transformation is a matter of awe, and a matter of faith. The question is, in what way will we be transformed by our own encounters with AI, as a very different ‘object above the nature of the viewer’ (in the sense that it has the capacity to work with more data than any of us can comprehend), as it returns our gaze. The hope must be that AI, in collecting all this data about us and about our lives, can serve as a kind of mirror, through which we might learn to better see the whole scope of our shared situation. The fact, revealed to us in Paglen’s installation, that these programs come loaded with moralistic and ideological biases, far from undermining this effort, is what lends this project any kind of potential, because it means that what is shown to us by AI in this way is not just the nature of the world as it is, but the nature of the biases through which we have chosen to see it. In Minima Moralia, Theodor Adorno writes:
‘Perspectives must be produced which set the world beside itself [...] alienated from itself, revealing its cracks and fissures, as needy and distorted as it will one day appear in the messianic light.’
Perhaps this is the best AI can do for us, and what Paglen’s installation can do for us: not only reflecting back at us the most inconvenient parts of ourselves, but also the lies we might have constructed to hide from them, or to hide them from us. If AI has any kind of liberational potential, it relies on our ability to raise our encounters with it to the level of the mystical and transformational; not in seeking to understand it, but in questioning how it might understand us, and, by seeing ourselves in the third person in this way, hoping to see ourselves as we are - to see the world we move through, and have constructed for ourselves, as it is, remembering always that, as it was for Hagar, so it remains for us: seeing is knowing.