#once again a popular idea of the medieval world turns out to be early modern
finnlongman · 1 year
Text
I love how discussion of "medieval" fantasy novels had me half convinced the divine right of kings was a medieval concept even though I've never come across it in medieval literature, and then I start doing some actual research and discover we can blame that one on James I and the seventeenth century.
Edited to add: I turned off reblogs for a reason, lads. I realise there's a lot more nuance to the history of this phrase than I conveyed here and that versions of this concept have existed in different places. I was talking about a very specific manifestation of it in a very specific (English) context, in terms of how it gets used in popular understandings of the past – nothing else, and purely as a curiosity for myself, not a history lesson or discussion starter. If I could also turn off replies on this post, I would do so. Please stop telling me about the use of the concept elsewhere and during other periods, I am a) aware and b) not actually interested at this time. I have made that abundantly clear in my comments on the post.
175 notes
wolffyluna · 9 months
Note
So... sapphic steppe atrocities and death game streamers? :O
I'm going to start with Death Game Streamers, because that's a bit easier to explain succinctly.
Death Game Streamers
20 minutes into the future, full dive VR exists. You can feel pain, feel the touch of objects, and generally really fuck about with the physics engine. One of the most popular games to play and watch is a game that gives you a lot of creative tools to build and make things, including other games. Think if Minecraft, Garry's Mod, and Second Life had a weird VR baby.
And of course, there's lots of streamers for this game, and a big audience too, for every plausible way to play this game.
There's a big subculture for Hunger Games-esque games, where if you die in the game-- you don't die in real life. That would be absurd. But dying hurts like hell, because audiences for that really love it when the pain sim is turned up to high.
And the story is about the blossoming romance of two streamers invited to such a game-- and the way it goes Very Weird because it is happening while they have a constant audience and can't easily communicate about it-- between a pvp sweat and someone who is usually a builder and infamously bad at pvp. ("Hey, that sounds like--" NO IT DOESN'T. SHUT.)
Sapphic Steppe Atrocities
I love it, I have so much worldbuilding for it. It doesn't have a first draft yet. The plot is broken and I have to fix it.
A long time ago, humanity developed faster-than-light travel. You could travel to other worlds! But you couldn't travel back in your lifetime. Still, plenty of people sailed out to the stars, and colonised new planets. And this mostly went fine!
Except, a significant minority of planets had something, somewhere Go Wrong, and they lost the science fictional technology and ended up varyingly in an Ancient/Medieval/Early Modern equilibrium.
Like the poor planet of Unnamed, where the use of Mega Smallpox as a bioweapon kind of took out everyone who knew how to run the power plants and the pharmaceutical plants, and also a lot of the survivors came from one of the several High Control Low Tech religious groups that moved to the planet? It's pretty medieval, in both technology and society (because well. a lot of the way modern society can be so modern is because we have consistently plentiful food and energy and access to birth control, etc.)
Faster faster-than-light travel has been invented, and so now all the human planets can be in contact again! Yay! ...and now people need to work out how to deal with The Bad Planets.
Staying out of the ring, we have: The Space UN! Who have regulations against giving technology or assistance to any society that is not a democracy, so no one, like, gives a tyrant a nuke.
In the blue corner, we have: Future! A representative of a semi-criminal group that believes that "no, what the fuck, the people living in medieval dictatorships deserve medical assistance, and we are giving it?" Her nom de crime is from a quote from Thomas Jefferson to Edward Jenner about how "future nations will know [how cool smallpox vaccination is]" because she is so strongly on Team We Made This Fucker Extinct Once and We Will Do It Again. This planet is her first assignment, and it is Going Poorly!
In the red corner, we have: Sabina! A representative of Pax Imperia, which believes that these planets can be fixed by Just Authority With The Power To Enforce Itself Through Violence, and is as imperialist as its name implies! Sabina is a romanaboo who on all levels except physical has a Greek statue as an icon, and who annoyingly put a lot of her points into Charisma.
Thrown into the ring against their will, we have: Alit and Ngaya! Two warrior women from steppe pastoralist groups at war. They had to flee a battle into the enemies-to-lovers wilderness, thanks to Pax Imperia interfering with laser guns. Their respective cultures have very different opinions about women fighting. Ngaya has tied herself to an idea of the Ideal Warrior that is eating her alive. Alit is going "'on a journey of revenge, dig two graves?' Oh, no, we're going to be needing a lot more graves than that."
And it's a lot about how technology shapes society, and about gender, and about [wiggles hands] how to reshape a world full of suffering. And Alit and Ngaya are going to kiss, damnit. (And maybe Future will get involved, too, everyone has two hands.) (There are also plans for a fucked up Sabina/Ngaya flirtation, because they are Worryingly Similar People.)
8 notes
yamayuandadu · 3 years
Text
The Two (or more) Ishtars or A Certain Scandalous Easter Claim Proved to be The Worship of Reverend Alexander Hislop
Once upon a time the official facebook page of Richard Dawkins' foundation posted a graphic according to which the holiday of Easter is just a rebranded celebration of the Mesopotamian mythology superstar Ishtar, arguing that the evidence is contained in its very name. As everyone knows, Dawkins is an online talking head notable for discussing his non-belief in such a euphoric way that it might turn off even the most staunch secularists, and for appearing in some reasonably funny memes about half a decade ago. Bizarrely enough, however, the same claim can often be found among the crowds dedicated to crystal healing, Robert Graves' mythology fanfiction, indigo children and similar dubiously esoteric content. What's yet more surprising is that once in a while it shows up among a certain subset of fundamentalist Christians, chiefly the types who believe giants are real (and, of course, satanic), the world is ruled by a secret group of Moloch worshipers, and fossils were planted by the devil to lead the sheeple astray from the truth about earth being 6000 years old, tops. Of course, to anyone even just vaguely familiar with Christianity whose primary language isn't English this claim rightfully seems completely baffling – after all it's evident in most languages that the name of the holiday celebrating Jesus' resurrection, and many associated customs, are derived from the earlier Jewish Pascha (Passover), which has nothing to do with Ishtar other than having its origin in the Middle East. Why would the purported association only be evident in English and not in Aramaic, Greek, Latin, Spanish, virtually any language other than English and its close relatives – languages which generally didn't have anything to do with Mesopotamia or early christianity? Read on to find out what sort of sources let this eclectic selection of characters arrive at the same baffling conclusion, why they are hilariously wrong, and – most importantly – where you can actually find a variety of Ishtars (or at least reasonably Ishtar-like figures) under different names instead.
The story of baffling Easter claims begins in Scotland in the 19th century. A core activity of theologians in many faiths through history was (and sometimes still is) finding alleged proof of purported “idolatry” or other “impure” practices among ideological opponents, even those from within the same religion – and a certain Presbyterian minister, Alexander Hislop, was no stranger to this traditional pastime. Like many Protestants in this period, he had an axe to grind with the catholic church – though not for the reasons many people are not particularly fond of this institution nowadays. What Hislop wanted to prove was much more esoteric – he believed that it was the Babylon known from the Book of Revelations. Complete with the beast with seven heads, blasphemous names and other such paraphernalia, of course. This wasn't a new claim – catholicism was equated with the New Testament Babylon for as long as Protestantism was a thing (and earlier catholicism itself regarded other religions as representing it). What set Hislop apart from dozens of other similar attempts was that he fancied himself a scholar of history and relied on the brand new accounts of excavations in what was once the core sphere of influence of the Assyrian empire (present-day Iraq and Syria), supplemented by various Greek and Roman classics – though also by his own ideas, generally varying from baseless to completely unhinged. Hislop compiled his claims in the book The Two Babylons or The Papal Worship Proved to be the Worship of Nimrod and His Wife. You can find it on archive.org if you want to torment yourself and read the entire thing – please do not give clicks directly to any fundie sites hosting it though.

What does the history of Easter and Ishtar look like according to Hislop? Everything started with Semiramis, who according to his vision was a historical figure and a contemporary of Noah's sons, here also entirely historical. Semiramis is either entirely fictional or a distorted Greek and Roman account of the 9th century BC Assyrian queen Shammuramat, who ruled as a regent for a few years after the death of her husband Shamshi Adad V – an interesting piece of historical trivia, but arguably not really a historical milestone, and by the standards of Mesopotamian history she's hardly a truly ancient figure. Hislop didn't even rely on the primary sources dealing with the legend of Semiramis though, but on their medieval christian interpretations, which cast her in the role of an adulterer first and foremost due to the association of ancient Mesopotamia with any and all vices.
Hislop claims that Semiramis was both the Whore of Babylon from the Book of Revelations and the first idolater, instituting worship of herself as a goddess. This goddess, he argues, was Astarte (a combination of two flimsy claims – a Roman claim that Semiramis' name means “dove” and the now generally distrusted assumption that Phoenician Astarte had the same symbols as Greek Aphrodite) and thus Ishtar, but he also denotes her as a mother goddess – which goes against everything modern research has to say about Ishtar, of course. However, shoddy scholarship relying on few sources was the norm at the time, and Hislop on top of that was driven by religious zeal. In further passages, he identified this “universal mother” with Phrygian Cybele, Greek Rhea and Athena, Egyptian Isis, Taoist Xi Wangmu (sic) and many more, pretty much at random, arguing all of them were aspects of the nefarious Semiramis cult which infected all corners of the globe. He believed that she was venerated alongside a son-consort, derived from Semiramis' even more fictional husband Ninus (a mythical founder of Assyria according to Greek authors, absent from any Mesopotamian sources; his name was derived from Nineveh, not from any word for son like Hislop claims), whom he identifies with biblical Nimrod (likewise not a historical figure, probably a distorted reflection of the god Ninurta). Note the similarity with certain ideas perpetrated by Frazer's Golden Bough and his later fans like Jung, Graves and many neopagan authors – pseudohistory, regardless of ideological background, has a very small canon of genuinely original claims. Ishtar was finally introduced to Britain by “druids” (note once again the similarity to the baffling integration of random Greek, Egyptian or Mesopotamian deities into Graves-derived systems of fraudulent trivia about “universal mother goddesses” often using an inaccurate version of Celtic myths as framework). This eventually led to the creation of the holiday of Easter. Pascha doesn't come up in the book at all, as far as I can tell.

All of this is basically just buildup for the book's core shocking reveal: catholic veneration of Mary and depictions of Mary with infant Jesus in particular are actually the worship of Semiramis and her son-consort Ninus, and only the truly faithful can reveal this evil purpose of religious art. At least so claims Hislop. This bizarre idea is laughable, but it remains disturbingly persistent – do you remember the Chick Tracts memes from a few years ago, for example? These comics were in part inspired by Hislop's work. Many fundamentalist christian communities appear to hold his confabulations in high esteem up to this day – and many people who by design see themselves as a countercultural opposition to christianity independently gleefully embrace them, seemingly ignorant of their origin.

While there are many articles debunking Hislop's claim about Easter, few of them try to show how truly incomprehensibly bad his book is as a whole – hopefully the following examples will be sufficient to illustrate this point:

-Zoroaster is connected to Moloch because of the Zoroastrian holy fire – and Moloch is, of course, Ninus. Note that while a few Greek authors believed Zoroaster to be the “king of Bactria” whom mythical accounts presented as a contemporary of Ninus, the two were regarded as enemies – Hislop doesn't even follow the pseudohistory he uses as proof!
-Zoroaster is also Tammuz. Tammuz is, of course, yet another aspect of Ninus.
-demonic character is ascribed to relics of the historical Buddha; also he's Osiris. And Ninus.
-an incredibly racist passage explains why the biblical Nimrod (identified with – you guessed it – Ninus) might be regarded as “ugly and deformed” like Hephaestus and thus identical to him (no, it makes no sense in context either) – Hislop thinks he was black (that's not the word he uses, naturally), which to him is the same thing.
-Attis is a deification of sin itself
-the pope represents Dagon (incorrectly interpreted as a fish god in the 19th century)
-Baal and Bel are two unrelated words – this is meant to justify the historicity of the Tower of Babel by asserting it was built by Ninus, who was identical to Bel (in reality a title of Marduk); Bel, according to Hislop, means “the confounder (of languages)” rather than “lord”
-the term “cannibal” comes from a made-up term for priests of Baal (Ninus) who according to Hislop ate children. In reality it's a Spanish corruption of the endonym of one of the first tribes encountered by the Spanish conquerors in America, and was not a word used in antiquity – also, as I discussed in my Baal post, the worship of Baal did not involve cannibalism. This specific claim of Hislop's is popular with the adherents of prophetic doomsday cult slash wannabe terrorist group QAnon today, and shows up on their “redpilling” graphics.
-Ninus was also Cronos; Cronos' name therefore meant “horned one” in reference to Mesopotamian bull/horned crown iconography, and many superficially similar gods from all over the world were the same as him – note the similarity to Margaret Murray's obsession with her made-up idea of worldwide worship of a “horned god” (later incorporated into Wicca).
-Phaeton, Orpheus and Aesculapius are the same figure and analogous to Lucifer (and in turn to Ninus)
-giants are real and they're satanists (or were, I think Hislop argues they're dead already). They are (were?) also servants of Ninus.
-as an all-around charming individual, Hislop made sure to include a plethora of comments decrying the practices of various groups at random as digressions while presenting his ridiculous theories – so, while learning about the forbidden history of Easter, one can also learn why the author thinks the Yezidi are satanists, for example
-last but not least, the very sign of the cross is not truly christian but constitutes the worship of Tammuz, aka Ninus (slowly losing track of how many figures were regarded as one and the same as him by Hislop).

Based on the summary above it's safe to say that Hislop's claim is incorrect – and, arguably, malevolent (and as such deserves scrutiny, not further possibilities for spreading). However, this doesn't answer the question of where the name of Easter actually comes from. As I noted in the beginning, in English (and also German) it's a bit of an oddity – it actually was derived from a preexisting pagan term, at least if we are to believe the word of the monk Bede, who in the 8th century wrote that the term is a derivative of “Eosturmonath,” i.e. “month of Eostre” – according to him a goddess. There are no known inscriptions mentioning such a goddess from the British Isles or beyond, though researchers involved in reconstructing the proto-indo-european language assume that “Eostre” would logically be a derivative of the same term as the name of the Greek Eos and of the vedic Ushas, and the Austriahenae goddesses from Roman inscriptions from present-day Germany – i.e. a word simply referring to dawn, and by extension to a goddess embodying it. This is a sound, well-researched theory, so while early medieval chroniclers sometimes cannot be trusted, I see no reason to doubt Bede's account.
While Ushas is a prominent goddess in the Vedas, Eos was rather marginal in Greek religion (see her Theoi entry for details), and it's hard to tell to what degree Bede's Eostre was similar to either of them beyond plausibly being a personification of dawn. Of course, the hypothetical proto-indo-european dawn goddess all of these could be derived from would have next to nothing to do with Ishtar. While the history of the name of Easter (though not the celebration itself) is undeniably interesting, I suppose it lacks the elements which make the fake Ishtar claim a viral hit – the connection is indirect, and an equivalent of the Greek Eos isn't exactly exciting (Eos herself is, let's be honest, remembered at best as an obscure part of the Odyssey), while Ishtar is understood by many as a “wicked” sex goddess (a simplification, to put it very lightly), which adds a scandalous, sacrilegious dimension to the baffling lie, explaining its appeal to Dawkins' fans, arguably. As demonstrated above, Hislop's theories are false and adapting them for any new context – be it christian, atheist or neopagan – won't change that, but are there any genuine examples of, well, “hidden Ishtars”? If that's the part of the summary which caught your attention, rejoice – there are plenty of these to be found in Bronze Age texts. I'd go as far as saying that most ancient middle eastern cultures from that era felt compelled to include an Ishtar ersatz in their pantheons. Due to the popularity of the original Ishtar, she was almost a class of figures rather than a single figure – a situation almost comparable to modern franchising, when you think about it. The following figures can be undeniably regarded as “Ishtar-like” in some capacity or even as outright analogs:
Astarte (or Ashtart, to go with a more accurate transcription of the oldest recorded version of the name) – the most direct counterpart of Ishtar there is: a cognate of her own name. Simply put, Astarte is the “Levantine” equivalent of the “Mesopotamian” Ishtar. In the city of Mari, the names were pretty much used interchangeably, and some god lists equate them, though Astarte had a fair share of distinct traits. In Ugaritic mythology, which forms the core of our understanding of the western Semitic deities, she was a warrior and hunter (though it's possible that in addition to conventional weapons she was also skilled at wielding curses), and was usually grouped with Anat. Both of them were regarded as the allies of Baal, and assist him against his enemies in various myths. They also were envisioned to spend a lot of time together – one ritual calls upon them as a pair from distant lands where they're hunting together, while a fragmentary myth depicts both of them arriving in the household of the head god El and taking pity on Yarikh, the moon god, seemingly treated as a pariah. Astarte's close relation to Baal is illustrated by her epithet, “face of Baal” or “of the name of Baal.” They were often regarded as a couple, and even late Hellenic sources preserve a traditional belief that Astarte and “Adados” (Baal) ruled together as a pair. In some documents from Ugarit concerned with what we would call foreign policy today they were invoked together as the most prominent deities. It's therefore possible that she had some role related to human politics. She was regarded as exceptionally beautiful, and some texts favorably describe mortal women's appearance by comparing them to Astarte. In later times she was regarded as a goddess of love, but it's unclear if that was a significant aspect of her in the Bronze Age. It's equally unclear if she shared Ishtar's astral character – in Canaan there were seemingly entirely separate dawn and dusk deities. Despite claims you might see online, Astarte was not the same as the mother goddess Asherah. In the Baal cycle they actually belong to the opposing camps. Additionally, the names are only superficially similar (one starts with an aleph, the other with an ayin) and have different etymologies. Also, that famous sculpture of a very blatantly Minoan potnia theron? Ugaritic in origin, but not a depiction of either Astarte or Asherah.
The Egyptians, due to extensive contact with Canaan and various Syrian states in the second half of the Bronze Age, adapted Astarte (and by extension Anat) into their own pantheon. Like in Ugarit, her warrior character was emphasized. An Egyptian innovation was depicting her as a cavalry goddess of sorts – associated with mounted combat and chariots. In Egypt, Ptah, the head god of Memphis and divine craftsman, was regarded as her father. In most texts, Astarte is part of Seth's inner circle of associates – however, in this context Seth wasn't the slayer of Osiris, but a heroic storm god similar to Baal. The so-called Astarte papyrus presents an account of a myth eerily similar to the Ugaritic battle between Baal and Yam – starring Seth as the hero, with Astarte in a supporting role resembling that played by Shaushka, another Ishtar analog, in the Hittite song of Hedammu, which will be discussed below.
Shaushka – a Hurrian and Hittite goddess whose name means “the magnificent one” in the Hurrian language. Hurrian was widely spoken in ancient Mesopotamia and Anatolia (and in northernmost parts of the Levant – up to one fifth of personal names from Ugaritic documents were Hurrian iirc), but has no descendants today and its relation to any extant languages is uncertain. In Hittite texts she was often referred to with an “akkadogram” denoting Ishtar's name (or its Sumerian equivalent) instead of a phonetic spelling of her own (there was an analogous practice regarding the sun gods), while in Egyptian and Syrian texts there are a few references to “Ishtar Hurri” – “Ishtar of the Hurrians” – who is argued by researchers to be one and the same as Shaushka. Despite Shaushka's Hurrian name and her prominence in myths popular both among Hittites and Hurrians, her main cult center was the Assyrian city of Nineveh, associated with Ishtar herself as well, and there were relatively few temples dedicated to her in the core Hittite sphere of influence in Anatolia. Curiously, both the oldest reference to Shaushka and the oldest reference to the city of Nineveh come from the same text, stating that a sheep was sacrificed to her there. While most of her roles overlap with Ishtar's (she too was associated with sex, warfare and fertility), there are two distinct features of Shaushka that set her apart as unique. One is the fact she was perceived in part as a masculine deity, despite being consistently described as a woman – in the famous Yazılıkaya reliefs she appears twice, both among gods and goddesses. In Alalakh she was depicted in outfits combining elements of male and female clothing. Similar fashion preferences were at times attributed to Ninshubur, the attendant of Ishtar's Sumerian forerunner Inanna – though in that case they were likely the result of conflation of Ninshubur with the male messenger deity Papsukkal, while in the case of Shaushka the dual nature seems to be inherent to her (I haven't seen any in-depth study of this matter yet, sadly, so I can't really tell confidently which modern term in my opinion describes Shaushka's character the best). Her two attendants, the musician goddesses Ninatta and Kulitta, do not share it. Shaushka's other unique niche is her role in exorcisms and incantations, and by extension with curing various diseases – this role outlived her cult itself, as late Assyrian inscriptions still associated the “Ishtar of Nineveh” (at times viewed as separate from the regular Ishtar) with healing. It can be argued that even her sexual aspect was connected to healing, as she was invoked to cure impotence. The most significant myth in which she appears is the cycle dedicated to documenting the storm god's (Teshub for the Hurrians, Tarhunna for the Hittites) rise to power. Shaushka is depicted as his sister and arguably most reliable ally, and plays a prominent role in two sections in particular – the Song of Hedammu and the Song of Ullikummi. In the former, she seemingly comes up with an elaborate plan to defeat a new enemy of her brother – the sea monster Hedammu – by performing a seductive dance and song montage (with her attendants as a support act) and offering an elixir to him. The exact result is uncertain, but Hedammu evidently ends up vanquished. In the latter, she attempts to use the same gambit against yet another new foe, the “diorite man” Ullikummi – however, since he is unfeeling like a rock, she fails; some translators see this passage as comedic.
However, elsewhere in the Song, the storm god's main enemy Kumarbi and his minions view Shaushka as a formidable warrior, and in the early installment of the cycle, the Song of LAMMA, she seemingly partakes in a fight. In another myth, known only from a few fragments and compared to the Sumerian text “Inanna and the huluppu tree,” Shaushka takes care of “Ḫašarri” – a personification of olive oil, or a sentient olive tree. It seems that she has to protect this bizarre entity from various threats. While Shaushka lived on in Mesopotamia as “Ishtar of Nineveh,” this was far from the only “variant” of Ishtar in her homeland.
[Image: the kudurru of king Melishipak, showing Nanaya seated and Ishtar/Inanna as a star]
Nanaya was another such goddess. A few Sumerian hymns mention her alongside Inanna, the Sumerian equivalent of Ishtar, who was by the time of Sargon of Akkad virtually impossible to separate from her. As one composition puts it, Nanaya was “properly educated by holy Inana” and “counselled by holy Inana.” Initially she was most likely a part of Inanna's circle of deities in her cult center, Uruk, though due to shared character they eventually blurred together to a large degree. Just like Inanna/Ishtar, Nanaya was a goddess of love, described as beautiful and romantically and sexually active, and she too had an astral character. She was even celebrated during the same holidays as Inanna. Some researchers go as far as to suggest Nanaya was only ever Inanna/Ishtar in her astral aspect alone and not a separate goddess. However, there is also evidence of her, Inanna and the sky god An being regarded as a trinity of distinct tutelary deities in Uruk. Additionally, king Melishipak's kudurru shown above depicts both Nanaya (seated) and Ishtar/Inanna (as a star). Something peculiar to Nanaya was her later association with the scribe god Nabu. Sometimes Nabu's consort was the goddess Tashmetu instead, but I can't find any summary explaining potential differences between them – it seems just like Nanaya, she was a goddess of love, including its physical aspects. Regardless of the name used to describe Nabu's wife, she was regarded as a sage and scribe like him – this arguably gives her a distinct identity she lacked in her early role as part of Inanna's circle.

As the above examples demonstrate, the popularity of the “Ishtar type” was exceptional in the Bronze Age – but is it odd from a modern perspective? The myths dedicated to her are still quite fun to read today – much like any hero of ancient imagination she has a plethora of adversaries, a complex love life (not to mention many figures not intended to be read as her lovers originally but described in such terms that it's easy to see them this way today – including other women), a penchant for reckless behavior – and most importantly a consistent, easy to summarize character. She shouldn't be a part of modern mass consciousness only because of false 19th century claims detached from her actual character (both those from Hislop's works and “secular” claims about her purported “real” character based on flimsy reasoning and shoddy sources) – isn't a female character who is allowed to act about the same way as male mythical figures do without being condemned for it pretty much what many modern mythology retellings try to create?

Further reading:

On Astarte:
-entry in the Iconography of Deities and Demons in Ancient Near East database by Izak Cornelius
-‛Athtart in Late Bronze Age Syrian Texts by Mark S. Smith
-ʿAthtartu’s Incantations and the Use of Divine Names as Weapons by Theodore J. Lewis
-The Other Version of the Story of the Storm-god’s Combat with the Sea in the Light of Egyptian, Ugaritic, and Hurro-Hittite Texts by Noga Ayali-Darshan
-for a summary of evidence that Astarte has nothing to do with Asherah, see A Reassessment of Asherah With Further Considerations of the Goddess by Steve A. Wiggins

On Shaushka:
-Adapting Mesopotamian Myth in Hurro-Hittite Rituals at Hattuša: IŠTAR, the Underworld, and the Legendary Kings by Mary R. Bachvarova
-Ishtar seduces the Sea-serpent. A new join in the epic of Ḫedammu (KUB 36, 56 + 95) and its meaning for the battle between Baal and Yam in Ugaritic tradition by Meindert Dijkstra
-Ištar of Nineveh Reconsidered by Gary Beckman
-Shaushka, the Traveling Goddess by Graciela Gestoso Singer
-Hittite Myths by Harry A. Hoffner Jr.
-The Hurritic Myth about Šaušga of Nineveh and Ḫašarri (CTH 776.2) by Meindert Dijkstra
-The West Hurrian Pantheon and its Background by Alfonso Archi

On Nanaya:
-entry in Brill’s New Pauly by Thomas Richter
-entry from the Ancient Mesopotamian Gods and Goddesses project by Ruth Horry
-A tigi to Nanaya for Ishbi-Erra from The Electronic Text Corpus of Sumerian Literature
-A balbale to Inana as Nanaya from The Electronic Text Corpus of Sumerian Literature
-More Light on Nanaya by Michael P. Streck and Nathan Wasserman
-More on the Nature and History of the Goddess Nanaya by Piotr Steinkeller

A few introductory Ishtar/Inanna myths:
-Inanna's descent to the netherworld
-Inanna and the huluppu tree
-Inanna and Enki
-Enki and the world order
-Inanna and Ebih
-Dumuzid and Enkimdu
97 notes
dwellordream · 3 years
Text
“…I should hardly need to say by now that the idea that there is an intellectual downturn in early medieval Europe (or indeed medieval Europe more broadly) is a part of a specific imperial colonialist historiography which seeks to argue that any point when Europe wasn’t violently subjugating the world around it was necessarily a bad time. To this way of thinking, when the Roman Empire goes around turning everyone into slaves and violently oppressing anyone it can get its hands on, things are good, because also some amphorae are traded across the Mediterranean; but when there isn’t one giant state oppressing everyone things are bad because fewer amphorae.
This is obviously a stupid and racist position which presumes that the nice things which rich Romans enjoyed (slaves and hegemony) were available to everyone, and also requires us to just ignore the fact that slaves are people. Rome wasn’t a very nice time for the great majority of individuals, and the medieval period had plenty of nice things for the average person – you just got fooled by a later medieval advertising campaign for art and a bunch of people who wanted to do slavery in the modern period. Accepting the idea that Europe did suck in the medieval period is automatically subscribing to this racist and imperialist version of history. In order for a society to be good and have worthwhile things it doesn’t need to be constantly attacking other cultures and enslaving people. Look inside yourself if you think that is true.
Another reason why this falls down as an argument is also that the whole “Europe as an isolated not trading enemy in opposition to the Arab world which had nice things and was gloriously well-connected” thing is not how things happened. If, for example, we look at trade routes in the earlier medieval period as a starting point we see that is in no way the case. We do see a drop off in international shipping when the Roman Empire collapses.
This is because the Empire itself used to ship goods along with moving troops in its fleets of tax-funded vessels. This existed alongside independent trading, which also moved stuff like olive oil from the Iberian peninsula or amphorae out of what is now Tunisia. Once there is no longer a state propped up by taxation doing shipping itself, shipping across the Mediterranean also slumps. That does not mean that it stops.
While we see a decline in movement, the key here is that we see a decline, not a total cessation. Movement very much continued throughout the early medieval period, and we have ample pot-shard based evidence to back that up. Yes certainly many people shifted to making their own pottery, but rich people could still get their hands on the good stuff if they wanted to.
You know when European shipping in the Mediterranean really slowed down? After the Muslim conquests. Where there had been a lively shipping economy suddenly there were a bunch of real bad ass guys who had carte blanche to intercept the ship of any infidel they could find. Oh and if you could take some of their land while you were at it, that would be great. All of this was made possible because, famously, the Umayyad conquest of Hispania went really well, felling the Visigothic kingdom on the Iberian Peninsula and turning all those olive orchards over to Muslim rule.
In quick succession, you then see the establishment of the Emirate of Ifriqiya on the North-African coast, as well as the Emirate of Sicily on, well, Sicily. In other words, a lot of the Western Mediterranean just wasn’t Christian any longer, so it’s kinda weird to blame Europeans for not maintaining trade routes there. You can’t simultaneously demand that Europeans trade more with the Muslim world while ignoring the fact that the Muslim world was also a part of Europe, and very much interested in dominating any extant trade routes.
This narrative also completely ignores the fact that there was thriving trade which existed all through this period. We have plenty of records on port tolls and taxes which tell the story of luxury goods crisscrossing the continent and across the Mediterranean, regardless of who was doing what. Walrus ivory and amber from the Baltic coast ends up at the Eastern Roman court in Constantinople.
Furs, honey, and elephant ivory popped up basically anywhere anyone had the gold to trade for it. Oh and gold, which largely came from Africa, was around the shop too. Indonesian spices like pepper and nutmeg featured happily in European cuisines, and lapis lazuli from Afghanistan was being ground into ultramarine. You want luxury goods? They were there, because trade was still happening. It just wasn’t happening on an imperial scale – an undertaking which I will again remind you takes a whole lot of slaves to maintain. The idea that Europeans were an unwashed and unrefined mass in opposition to the glories of life in the Arabic world just doesn’t hold up to scrutiny.
The backward post-Roman Europe versus glories of the East narrative also very helpfully ignores the fact that one of the glories of the East was the still extant Eastern Rome – with its afore-mentioned capital in Constantinople. (You may also know it as Byzantium, but we are trying to be precise here.) Of course, Eastern Rome was one of the big losers in the whole Muslim conquests thing, losing its extremely valuable territory in Egypt, which accounted for a huge amount of its tax revenue. It also very famously lost the near east more generally.
Having said all that, it was still a major maritime power, owning territory on the Italian Peninsula in what is now Calabria and Apulia. Constantinople was still very much about that Roman life in the medieval period, with a keen popular interest in Chariot racing, a lively trade with the near East and Western Christendom, and even what could be seen as a sort of pre-modern welfare state, ensuring that its citizens in cities always had enough grain to eat. If we want to pretend that everything was bad and gloomy in medieval Europe compared to the Arab world because Rome collapsed, how then do we account for the fact that it was actually still going at that time, and trading just fine?
Obviously then, narratives of trade stopping totally in medieval Europe are incorrect and overwrought, but why would I say that buying into them supports a colonialist narrative? The answer to that is saying that Europe didn’t have anything nice, as opposed to a flourishing Arab world is a way of justifying the violent incursions on the part of Europeans into the Middle East.
These arguments usually hinge on the idea that before the Crusades, Europe was a disgusting place full of people who didn’t bathe and nothing but unsalted porridge to eat. All of that changes, in theory, with increased contact to the Middle East with the establishment of the Crusader States in the middle east. The theory goes that it wasn’t until Europeans were able to carve their own ports out along the coast of the Levant that anything nice got into Europe at all. Without Europeans at Jaffa, there would be no spices, oranges, or rice in Europe. Hell, without all that religious violence maybe Europeans never would have anything nice ever!
That is not only factually incorrect, but it is a way of justifying what amounted to centuries of attempts to violently subjugate the Holy Land. Sure, all that violence was unseemly – but access to the Silk Road! It also amounts to a convenient justification for modern imperial and colonial violence. Well Europe was a terrible hell hole! What choice did they have other than to sail around the globe, enslave huge swathes of people, do a spot of genocide and begin to extract all possible value from any native people! After all, everything they had before they started in on the colonising in earnest was bad.
None of this is either historically correct, or acceptable. We can, and should, point out the major advancements that Muslim society presented to the world. There is absolutely no doubt that there was a tonne of interesting stuff going on in the near East, and I in no way dispute that assertion. What is incorrect is the idea that medieval Europe was cut off from that brilliance, a backward hole where there was no trade, no spices, no intellectual culture.
Europe and South Western Asia have always been connected, and indeed the term “Arabic World” very much includes huge swathes of Europe at various points during the medieval period. If you want to say medieval Europe is a sad foil to the Muslim kingdoms, how do you account for the several European Caliphates? If you want to say that without the Roman Empire Europe lost everything bright and worthwhile, how do you explain the still up and running Eastern Roman Empire? If you want to say that without post-Crusades trade there never would have been meaningful trade in Europe how do you explain all the fucking trading?
The desire that many have to defend the medieval Arab world and its culture in the medieval period is laudable. I am in no way here to argue that it didn’t have a lot of good stuff going on. However, pretending that all of this had nothing to do with the European world and trade, or that the only place where intellectual advancement was happening was the Arab world is simply incorrect.
The medieval world was complex, interconnected, and very much a part of an on-going scholastic tradition. To argue that without violent force Europe would have languished as a dull afterthought is to argue for imperial colonialism. Medieval Europe was a vibrant and well-connected place, and it could have continued to be so without all of the slavery and genocide. Europeans didn’t need to rape and pillage their way through the world to learn and grow. They just did it because they could.
Pro-imperialist historiography is the air that we breathe here in the decaying carcasses of the modern Imperium. I am extremely sympathetic to the urge to celebrate non-white cultures, and I spend quite a lot of time doing so myself. However, to argue that this was happening without any contact with Europe, and that Europeans cannot think or enjoy luxuries without also being involved in a violent imperial enterprise is extremely dangerous.
I know that the people who make this argument think they are being enlightened, but they are still making a pro-imperial argument when they trot out tired myths about the medieval period. We don’t undo the colonial historiography by agreeing with it. We need to write our own history which admits that every world culture has something useful and beautiful to offer us all, and that a better world can be achieved without the subjugation of others.”
- Eleanor Janega, “On colonial mindsets and the myth of medieval Europe in isolation from the Muslim world”
7 notes
qqueenofhades · 6 years
Text
Medieval Magic Week: Witchcraft in Early Medieval Europe
Apologies for not getting to this last week, but I will try to be at least semi-reliable about posting these. If you missed it: I’m teaching a class on magic and the supernatural in the Middle Ages this semester, and since the Tumblr people also wanted to be learned, I am here attempting to learn them by giving a sort of virtual seminar.
Last week was the introduction, where we covered overall concepts like the difference between magic, religion, and science (is there one?), who did magic benefit (depends on who you ask), was magic a good or a bad thing in the medieval world (once again, It’s All Relative) and who was practicing it. We also brought in ideas like the gendering of supernatural power (is magic a feminine or a masculine practice, and does this play into larger gendered concepts in society?) and did some basic myth-busting about the medieval era. No, not everybody was super religious and mind-controlled by the church. No, they were not all poor farmers. No, not every woman was Silent, Raped, and Repressed. Magic was a common and folkloric practice on some level, but it was also the concern of educated and literate ‘worldly’ observers. We can’t write magic off as the medieval era simply ‘not knowing any better,’ or having no more sophisticated epistemology than rudimentary superstition. These people navigated thousands of miles without any kind of modern technology, built amazing cathedrals requiring hugely complex mathematical and engineering skill, wrote and translated books, treatises, and texts, and engaged with many different fields of knowledge and areas of interest. They subjected their miracle stories to critical vetting and were concerned with proving the evidentiary truth of their claims. We cannot dismiss magic as them having no alternative explanation or way of thinking about the world, or being sheltered naïve rustics.
This week, we looked at some primary sources discussing ‘witchcraft’ beliefs in early medieval Europe, which for our purposes is about 500—eh we’ll say 1000 C.E. We also thought about some questions to pose to these texts. Where did belief in witchcraft – best known for early modern witch hunts – come from? How did it survive through centuries of cultural Christianisation? Why was it viewed as useful or as threatening? Scholars have tended to argue for a generic mystical ‘shamanism’ in pre-Christian Europe, which isn’t very helpful (basically, it means ‘we don’t have enough evidence, so fuck if we know!’). They have also assumed that these were ‘superstitions’ or ‘relics’ of pagan belief in an otherwise Christian culture, which is likewise not helpful. We don’t have time to get into the whole debate, but yes, you can imagine the kind of narratives and assumptions that Western historiography has produced around this.
At this point, Europe was slowly, but by no means monolithically, becoming Christian, which meant a vast remaking of traditional culture. There was never a point where beliefs and practices stopped point-blank being pagan and became Christian instead; they were always hybrid, and they were always subject to discussion and debate. Obviously, people don’t stop doing things they have done a particular way for centuries overnight. (Once again, this is where we remind people that the medieval church was not the Borg and had absolutely no power to automatically assimilate anyone.) Our first text, the ‘Corrector sive medicus,’ which is the nineteenth chapter of Burchard of Worms’ Decretum, demonstrates this. The Decretum is a collection of ecclesiastical law, dating from early eleventh-century Germany. This is well after Germany was officially ‘Christianised,’ and after the foundation of the Holy Roman Empire as an explicitly Christian polity (usually dated from Charlemagne’s coronation on 25 December 800; this was the major organising political unit for medieval Germany and the Carolingians were intensely obsessed with divine approval). And yet! Burchard is still extremely concerned with the prevalence of ‘magical’ or ‘pagan’ beliefs in his diocese, which means people were still doing them.
The Corrector is a handbook setting out the proper length of penances to do (by fasting on bread and water) for a variety of transgressions. It can seem ridiculously nitpicky and overbearing in its determination to prescribe lengthy penances for magical offenses, which are mixed in among punishments for real crimes: robbery, theft, arson, adultery, etc. This might seem to lend legitimacy to the ‘killjoy medieval church oppressing the people’ narrative, except the punishments for sexual sins are actually much lighter than in earlier Celtic law codes. If you ‘shame a woman’ with your thoughts, it’s five days of penance if you’re married, two if you aren’t, but if you consult an oracle or take part in element worship or use charms or incantations, it could be up to two years.
Overall, the Corrector gives us the impression that eleventh-century German society was a lot more worried about whether you were secretly cursing your neighbour with pagan sorcery, rather than who you’re bonking, even though sexual morality is obviously still a concern, and this reflected the effort of trying to explicitly and completely Christianise a society that remained deeply attached to its traditional beliefs and practices. (There’s also a section about women going out at night and running naked with ‘Diana, Goddess of the Pagans’, which sounds awesome sign me up.) Thus there is here, as there will certainly be later, a gendered element to magic. Women could be witches, enchantresses, sorceresses, or other possible threats, and have to be closely watched. Nonetheless, there’s no organised societal persecution of them. Formal witch hunts and witch trials are decidedly a post-Renaissance phenomenon (cue rant about how terrible the Renaissance was for women). So as much as we stereotype the medieval world as supposedly being intolerant and repressive of women, witch hunts weren’t yet a thing, and many educated women, such as Trota of Salerno, had professional careers in medicine.
The solution to this problem of magical misuse is not to stop or destroy magic, since everyone believes in it, but to change who is legitimately allowed to access it. Valerie Flint’s article, ‘The Early Medieval Medicus, the Saint – and the Enchanter’ discusses the renegotiation of this ability. Essentially, there were three categories of ‘healer’ figure in the early Middle Ages: 1) the saint, whose miraculous power was explicitly Christian; 2) the ‘medicus’ or doctor, who used herbal or medical treatment, and 3) the ‘enchanter’, who used pagan magical power. According to the ecclesiastical authors, the saint is obviously the best option, and believing in/appealing to this figure will give you cures beyond the medicus’ ability, as a reward for your faith. The medicus tries his best and has good intentions, but is limited in his effectiveness and serves in some way as the saint’s ‘fall guy’. Or: Anything the Doctor Can (Or Can’t) Do, The Saint Can Do Better. But the doctor has enough social authority and respected knowledge to make it a significant victory when the saint’s power supersedes him.
On the other hand, the ‘enchanter’ is basically all bad. He (or often, she) makes the same claim to supernatural power as the saint, but the power is misused at best and actively malicious and uncontrollably destructive at worst. You are likely to be far worse off after having consulted the enchanter than if you did nothing at all. Both the saint and the enchanter are purveyors of ‘magical’ power, but only the saint has any legitimate claim (again, according to our church authors, whose views are different from those of the people) to using it. The saint’s power comes from God and Jesus Christ, the privileged or ‘true’ source of supernatural ability, while the enchanter is drawing on destructive and incorrect pagan beliefs and making the situation worse. The medicus is a benign and well-intentioned, if not always effective, option for healing, but the enchanter is No Good Very Bad Terrible.
The fact that ecclesiastical authors have to go so hard against magic, however, is proof of the long-running popularity of its practitioners. The general public is apparently still too prone to consult an enchanter rather than turn to the church to solve their problems. The church doesn’t want to eradicate these practices entirely, but insists that people call upon God/Christ as the authority in doing them, rather than whatever local or folkloric belief has been the case until now. It’s not destroying magic, but repurposing and redefining it. What has previously been the unholy domain of the pagan is now proof of the ultimate authority of Christianity. If you’re doing it right, it’s no longer pagan sorcery, but religious miracles or devotion.
Overall: what role does witchcraft play in early medieval Europe? The answer, of course, is ‘it’s complicated.’ We’re talking about a dynamic, large-scale transformation and hybridising of culture and society, as Christian religion and society became more prevalent over long-rooted pagan or traditional beliefs. However, these beliefs arguably never fully vanished, and were remade, renamed, and allowed to stay, without any apparent sense of contradiction on the part of the people practicing them. Ecclesiastical authorities were extremely concerned to identify and remove these ‘pagan’ elements, of course, but the general public’s relationship with them was always more nuanced. When dealing with medieval texts about magic, we have a tendency to prioritise those that deal with a definably historical person, event, or place, whereas clearly mythological stories referring to supernatural creatures or encounters are viewed as ‘less important’ or as the realm of historical fiction or legend. This is a mistake, since these texts are still encoding and transmitting important cultural referents, depictions of the role of magic in society, and the way in which medieval people saw it as a helpful or hurtful force. We have to work with the sources we have, of course, but we also have to be especially aware of our critical assumptions and prejudices in doing so.
It should be noted that medieval authors were very concerned with proving the veracity of their miracle narratives; they did not expect their audiences to believe them just because they said so. This is displayed for example in the work of two famous early medieval historians, Gregory of Tours (c.538—594) and the Venerable Bede (672/3—735). Both Gregory’s History of the Franks and Bede’s Ecclesiastical History of the English People contain a high proportion of miracle stories, and both of them are at pains to explain to the reader why they have found these narratives reliable: they knew the individual in question personally, or they heard the story from a sober man of good character, or several trusted witnesses attested to it, or so forth. Trying to recover the actual historicity of reported ‘miracle’ healings is close to impossible, and we should resist the cynical modern impulse to say that none of them happened and Gregory and Bede are just exaggerating for religious effect. We’re talking about some kind of experienced or believed-in phenomena, of whatever type, and obviously in a pre-modern society, your options for healthcare are fairly limited. It might be worth appealing to your local saint to do you a solid. So to just dismiss this experience from our modern perspective, with who knows how much evidence lost, in an entirely different cultural context, is not helpful either. There’s a lot of sneering ‘look at these unenlightened religious zealots’ under-and-overtones in popular conceptions of the medieval era, and smugly feeling ourselves intellectually superior to them isn’t going to get us very far.
Next week: Ideas about the afterlife, heaven, hell, the development of purgatory, the kind of creatures that lived in these realms, and their representation in art, culture, and literature.
Further Reading:
Alver, B.G., and T. Selberg, ‘Folk Medicine As Part of a Larger Complex Concept,’ Arv, 43 (1987), 21–44.
Barry, J., and O. Davies, eds., Witchcraft Historiography (Basingstoke: Palgrave, 2007)
Collins, D., ‘Magic in the Middle Ages: History and Historiography’, History Compass, 9 (2011), 410–22.
Flint, V.I.J, ‘A Magical Universe,’ in A Social History of England, 1200-1500, ed. by R. Horrox and W. Mark Ormrod (Cambridge: Cambridge University Press, 2006), 340–55.
Hall, A., ‘The Contemporary Evidence for Early Medieval Witchcraft Beliefs’, RMN Newsletter, 3 (2011), 6-11.
Jolly, K.L., Popular Religion in Late Saxon England: Elf Charms in Context (Chapel Hill: University of North Carolina Press, 1996)
Kieckhefer, R., Magic in the Middle Ages (Cambridge: Cambridge University Press, 2000)
Maxwell-Stuart, P.G., The Occult in Mediaeval Europe (Basingstoke: Palgrave, 2005)
Storms, G., Anglo-Saxon Magic (The Hague: M. Nijhoff, 1947)
Tangherlini, T., ‘From Trolls to Turks: Continuity and Change in Danish Legend Tradition’, Scandinavian Studies, 67 (1995), 32–62.
45 notes
fyrapartnersearch · 5 years
Text
Long term M/M roleplays
Hey! I’m Kat, and looking for some more roleplays. I’m in my early twenties, so no worries there, and I’m in the GMT+3 time-zone, though I tend to be up at odd hours and I’m often online.
I mainly want to roleplay M/M right now. I tend to write multi-para and more often than not my replies are 700+ words. I don’t mind shorter replies, as long as I get at least a few good sized paragraphs and correct spelling/grammar. Mistakes happen and that’s fine, but I don’t want to read something with no punctuation and that’s nothing but mistakes.
Also please read this whole thing before sending me a message!! I ask you don’t just send “wanna RP?”, because I won’t know unless you tell me something you had in mind: a plot, an idea of mine you liked, anything really.
As for smut, I adore it. I don’t want to write only smut for now, but anything from 20/80 to 80/20 on plot/smut ratio is good for me. Just tell me if you want more plot or smut.
In M/M smut I prefer playing a submissive/bottom character. I can play a dominant character, but I don’t enjoy it so I wish you’ll be willing to play an exclusively dominant role.
What I like:
- Medieval/historical settings (especially ancient Egypt/Rome/Greece)
- Forbidden love
- Arranged marriage
- Lots and lots of drama and dark themes
- But also fluffy scenes and cute/happy moments
- Mpreg (not a must at all, if you’re not into it)
- Supernatural beings (werewolves, vampires, demons, gods etc.)
- Omega verse
- Role reversal (such as, for example, a bully getting himself in a situation where the bullied has complete control)
I’m rather reserved when it comes to modern-day settings, but I can do those as well if there’s a lot of action and drama involved. I prefer a plot-heavy story in a modern setting though.
Pairings I’d like to try: (Dom/sub)
- Warlord/prince
- King or prince/prince in an arranged marriage setting
- Pirate or thief/nobility
- Samurai/geisha
- Nobility/prostitute
- Servant/nobility or royalty
- Guard/nobility or royalty
- Bullied/Bully
- Nerd/popular student
- Stepbrothers
- Demon/angel
- Poor guy/rich guy in a Victorian era/early 20th century setting
- Mage/human (yes, I have just finished rewatching all the Harry Potter movies lol)
- Werewolf/human
- Professor/student
And many more fun things, but I can’t remember everything off the top of my head. Feel free to suggest anything, really. I’m also very much into playing femboys/crossdressing characters, though if that’s not your thing I can do other kinds of characters as well. I know it’s a concern for many with these kinds of characters, so I promise my characters are never the “I-can-do-nothing-on-my-own-and-will-cry-at-the-drop-of-a-hat-and-whine-the-rest-of-the-time” blushing virgin, maiden-in-distress types. No need to worry about that.
I am busy a lot, so I might not always have time to reply every day or even every other day, but I try to be as active as I can. Feel free to poke me if it takes more than a week or so though.
A few plot ideas: (MC = my character, YC = your character)
1) Inspired by the TV show “Lucifer”. YC is the Devil himself, ruler of Hell, the first fallen angel. He has grown tired of the same old tortured souls and fires of damnation scenery though and decides it’s time to visit Earth for a bit. A notorious playboy, seducer to sin, the owner of one of the hottest nightclubs in town is the image he creates for himself among humans. He grants wishes in exchange for favours and soon enough everyone knows of him. MC is a struggling student, or someone who has just graduated and can’t get on in life, and as a last resort goes to see YC. YC takes an immediate liking to him, and initial fascination quickly turns into something more… human. Love, maybe? Suddenly YC has to make a choice of whether or not he’ll reveal who he truly is to the innocent human he has fallen for.
2) Once upon a time MC and YC were lovers, young and oh, so in love. They were happy together, planning their future, until one day YC disappeared without a trace and MC never saw him again. Until 10 years later: YC has inherited a large fortune from his uncle, who had no family of his own, and one lonely evening he heads to a brothel to ease his longing for company. There, much to his shock, he is reunited with MC, who is a shell of what he once was. The bubbly, social human, always so full of life, has turned into someone with a haunted look in his eyes and a deep distrust for other people. Not able to leave MC there, YC buys him from the brothel and takes him home. Now he needs to decide what to do with him. (Historical setting)
3) (Omega verse, preferably mpreg included) MC is a rare kind of a shifter, an omega desired by many. He was born in a different kind of a prison: to a man who breeds only the best omegas for the royalty. He and the other omegas he lives with have never seen the outside world. They are kept safe behind locks in the innermost monastery on the castle grounds, where there’s no chance of them getting out on their own. They are given to the harems of the royal family, or occasionally bought by the wealthiest of the wealthy. But MC wants more, he wants independence and a life of his own, rather than a life dedicated to fawning over an alpha with an ego big enough as it is. YC is an alpha who has made a considerable contribution to the kingdom (could be anything from being an honored soldier to being a famous artist, whatever you come up with) and who is being gifted one of these rare omegas by the royal family themselves. On his visit to the monastery to choose one of them, he takes a liking to MC, the spiteful little thing who can’t seem to know when to shut up and who won’t bat his lashes and swoon at everything YC does. It seems like MC will be getting a new home.
4) MC is a shifter (species can be discussed and decided on later) who has been separated from his pack and survived alone for a while now. He gets caught in the middle of a fight between YC’s pack and YC’s rival pack, and after - possibly accidentally - saving YC’s life he is accepted into the pack. Some time passes, YC and MC grow closer, the suspicions some had about MC fade and MC feels he’s starting to belong in a pack again, when he finds out his old pack has merged with YC’s rival pack. Now he’ll have to choose whether he is loyal to his new, or his old pack. (I would prefer this had mpreg, but again not a necessity)
5) YC is a shapeshifter, the leader of a clan of dragons. Dragons have long ago been thought extinct, but the truth is there are still some clans left. The problem is, with the dominant personalities of dragons, it’s quite difficult to find a mate or a breeding partner. YC thinks to look for the solution outside the clan, to make humans their child-bearers. He picks MC as the first test subject. (Includes Mpreg)
6) Two countries have been at war since the beginning of time, as long as anyone can remember. All boys who come of age must join the army and go to war. MC knows he could not survive the war - he’s never touched a sword in his life, never hurt anyone. He’s not physically strong nor does he have any knowledge of fighting. His family has already died because of the war, leaving him alone on a small farm. So, to avoid having to go, just before coming of age MC started disguising himself as a woman. For some years it has worked out well, he’s lived his life peacefully on his little farm, until the enemy forces take the city just outside of which MC lives. YC is a high ranking officer (or the king) who takes an interest in MC, thinking he is a woman. Now MC must figure out what to do with the peculiar situation he finds himself in.
7) (A rare futuristic plotline I’ve been dying to do since I watched Black Mirror: Nosedive) People want good ratings on their pictures, on their posts and videos - on themselves. Everyone has a technological chip inserted into their eyes when they’re born that lets them see how other people are rated. Only the “best” humans in society are rated 9 or higher overall - 5 or lower makes life Hell on Earth for a person. Anyone can rate anyone on their phones every time they interact in person. One’s rating has a tremendous impact on their lives; whether they get the job they want, whether they can apply to a certain school, even whether or not they can buy a house in a certain neighborhood… This system makes creating deep relationships nearly impossible, because people are too afraid to show who they really are for fear of being rated badly. MC is the youngest son of a well-off family, an ideal family where everyone is rated 8.9 or higher, loved by most people. YC is from a family who have never much cared about the system. They are decently rated, but they don’t seem to care - what they care about is the honesty and real human relationships that are so hard to find nowadays. When MC and YC meet, MC is intrigued, but YC thinks MC is an empty shell only after numbers just like everyone else. Eventually, feelings start to develop between the two, but there are many problems to overcome, especially in their society.

That’s it for now. I’m always happy to hear any ideas you might have as well, and all the ideas above can be modified or changed up a little!
Anyway, hope to hear from you soon!!
2 notes · View notes
tired-writeblr · 6 years
Text
My first chapter
Sup dudes!
Some of you seem interested in my current wip, so I thought ‘what the heck, let them read some of it.’ Please bear in mind that it is the very first draft, and is by no means even near perfect, but I think it has some moments that shine through, and I hope you enjoy!
It was a relatively average night for a country village. There was a spot of rain, but the kind of rain that struggles to make a person even slightly damp, rain so light it’s almost as if it apologises for each little drop that hits. “Oops, I’m awfully sorry,” the rain might say, “I really hope I didn’t make a mark.” It’s awfully polite rain. The village, though small, had everything a person could need (as long as that person was a medieval peasant). It had some stalls to purchase goods, farms to work, a blacksmith's shop, and the two most important buildings that would remain a vital necessity for every Christian town, city and village for centuries: a place to worship, and a place to get drunk afterwards.
Candlelight could be seen glowing in the church, even though it was in fact completely empty. The tavern, on the other hand, was packed. Of course, this shouldn’t be a surprise; drinking is a lot more fun than praying, even the priest and monks agreed. Hell, at a time when even drinking the water would probably kill you, getting drunk was one of the few pleasures people had. And since the water was likely to give you a minor case of death, it was much safer to drink wine and mead, and so getting drunk was just a daily fact of life. The tavern was quite large, with plenty of wooden stools and wooden tables, most of which were occupied by drunken men and women (and some drunken children). The owner, a large bald man with a crooked nose, light brown skin, and a very welcoming smile, was behind the counter serving people drinks, whilst his two daughters, Camilla and Magdalena, were running about carrying food and collecting the tankards. Camilla was a large woman, with her father's smile, a broad nose, long black hair in a bun, and brown eyes. Magdalena was thin, and had her father's crooked nose, but unlike her sister had brown hair in a long plait. Both were beautiful in their own way, and both were often the victims of unwanted advances from some of the non-local male patrons, which often didn't end well, as Magdalena had a hell of a right hook, and Camilla often used it as an opportunity to pick the man's pocket.
The tavern was often a noisy place. That night was no exception. At one table a drunken coachman was telling tales no sober person would believe, but the men and women at his table were not sober and took him at his word. At the bar itself sat a large drunken monk with a big walrus moustache. He was one of those people that would be incredibly forgettable if it weren't for one single feature. For this monk it was his moustache. It was so memorable that people simply called him Friar Moustache, which he believed to be a term of endearment, but was in fact because not a soul in the village knew his actual name, not even the priest (who was at this point sat next to Friar Moustache resting his head on the bar, drunkenly mumbling incoherently). Friar Moustache was leading a choir of drunken men singing a popular drinking song. There were a lot of harrumphs and hos, and a great deal of crude language and descriptions of various lewd acts. The only one more enthusiastic about the song than Friar Moustache was an old man, possibly in his early to mid sixties, known to the villagers as Ser Malcolm the White. He looked a bit like a mid-sized bear. Well, more accurately, a mid-sized, shaved, pink, alcoholic bear wearing an almost shoulder-length curly white wig, with a scruffy white goatee, a wrinkled face, and tired eyes. His accent was surprisingly similar to the modern Glaswegian accent. He had once been a knight who fought for glory and honour and a place in the history books, but he never won any of those things. All he did achieve was reaching a ripe old age, and now the only fight he had was the one to get out of bed each morning, which was getting harder every day.
On a table near the back of the tavern sat a young man just holding a tankard. His skin was pale, his eyes wide and tired-looking. He gazed ahead of him as if he were staring into the abyss itself. This young man was an unfortunate peasant by the name of Glenn, and earlier that day he had died, which, as it usually does for most people, was causing him a great deal of distress. Now, many may think ‘well, he doesn’t seem that dead, he seems pretty alive.’ And those who do think that would be correct. He was in fact very much alive.
***
“Don’t worry, I got this,” Glenn had said to the huntsman, as the boar began charging and he attempted to pull back the bowstring on his longbow. He most certainly did not. You see, longbows require a great deal of upper body strength, which weedy little Glenn didn’t actually possess. Why he had been given a bow by his father, it’s hard to tell. Perhaps his father hated him, which actually seems quite likely; he did have several more capable siblings. He managed to pull the bowstring back only a little before releasing, causing the arrow to travel only a couple of feet in a downward arc until it landed on the ground in front of him, seconds before the boar collided with him, knocking him to the ground. It would have actually been a little funny if he weren’t about to die. The huntsman tried to stab the beast, but he missed, and the boar itself narrowly missed him. He immediately decided the best course of action was to run away before he was killed horribly. The beast chased him off a little before turning back towards Glenn. By this point he had managed to get back up, but his head was still spinning, and he was very unsteady on his feet.
The boar looked more like a monster than anything else now. It looked almost the size of a cow, with huge sword-length tusks on either side of its incredibly large snout. Of course, it was not in fact that size, or even especially monstrous. It was an average boar, but in his panicked and dizzy state, his imagination had gone mad. It didn't help that he had never actually seen a living boar this close before, so he had no memory to compare it to. He attempted to stagger away, with little success. He stumbled just as the beast charged at him again, and this time was immediately gored by the creature’s tusks. It was a rather unpleasant sight: huge gashes in the poor man’s flesh from the beast’s tusks. Spaghetti sauce or blood gushed out of the wound, covering his shirt. It was probably blood. Either way, it would stain. The world around him began to dim, and the last thing he saw was the bloody beast wandering off back into the forest.
Okay, so it wasn’t the last thing he saw. Not long after, he awoke to find himself still in the forest, and caught a glimpse of the beast’s backside as it wandered off. For a second he froze and held his breath, but when he was sure the boar wasn’t going to charge again, he sat up, and touched his side. He found two large, deep gashes from the boar's tusks on his right-hand side that should have killed him as far as he was aware, but there was no blood. He stood up, and looked back to where he had been lying. His eyes widened.
“Holy mother of god!” he screamed, on the edge of tears. Lying there, at his feet, was him. Well, more accurately, his body. Even more accurately, his very bloody body, with the exact same wounds he had. He stood there, staring at his own corpse for a while, sobbing in a very gross, ugly fashion.
He was disturbed from his silent mourning by the sounds of loud slurping. He turned to see a skeleton in a large black hooded cloak, and bright blue fluffy bunny slippers, drinking something from a ceramic mug covered in little colourful fish. The being was reading a newspaper (of course, Glenn had no idea what a newspaper was, as they wouldn’t be a thing for a few more centuries; he was also mostly illiterate, so it just looked like a piece of paper with squiggles on it – which is all any newspaper or book is, really) and hadn’t noticed him. He coughed a little to get the being’s attention, with no success. Whatever they were reading in the paper, they were engrossed in it. The being took another large, loud sip from his fish mug, and spoke. “Hmm, four down, five letters, unpleasantly bitter,” said the being in an almost ethereal, otherworldly voice. The being reached to put their mug down on a table that wasn’t there. The mug fell to the ground, and smashed. The being looked up from his paper, and down at the broken mug, then looked at Glenn, then back at the mug, then back to Glenn.
Now, without an actual face the being couldn’t really provide any facial expression that would suggest just how annoyed they were, but they were incredibly annoyed, and would have scowled at Glenn if possible, which it wasn't (no eyebrows). They were so annoyed that they gave off this feeling of deep, intense annoyance that even the dimmest of people could pick up.
“Oh, great,” said the being sarcastically, “another dead mortal, just what I wanted.” Glenn shuffled awkwardly and didn’t say anything. He tried to avoid making eye contact. He didn’t want to make the skeletal being even angrier by saying something stupid. It did not work.
“I was happily doing my crossword, drinking my coffee, but you just had to die, didn’t you?” continued the being, slowly becoming less sarcastic, and more openly angry about having been disturbed. “Bloody mortals, I hate this damned job.” At this, Glenn was confused.
“What job?” he inquired.
“Oh, for goodness’ sake, are you really that dim? Must I explain everything?” replied the being.
Glenn shrugged and nodded awkwardly.
"It might help a bit" he said.
The being groaned at this and would have grimaced if they could have.
“Very well. I am Death, claimer of souls, destroyer of worlds, and you died,” said Death reluctantly. “I’m here for your soul, blah, blah, blah, take you to the afterlife and all that crap so you can be judged by some jumped-up little prick.” Glenn just stood there, slightly stunned by the fact that he was talking to Death, but also a little underwhelmed. He expected more from Death, though he couldn’t tell you exactly what he expected. He definitely would have preferred someone nicer.
“That it?” he said after a few moments of silence.
“I’ve been doing this for a while, buddy, and honestly I can’t be arsed with this,” replied Death tiredly. They stood in silence for a few minutes. Glenn wasn’t sure what to say to an immortal cosmic entity. Death was beginning to think they should have listened to their mother and become a butcher (though in a way, being the grim reaper isn’t all that much different to being a butcher; at least, that was what they had said to her).
“So, mister Death, sir,” began Glenn, ending the awkward silence.
“Now listen here, mate,” said Death, interrupting the recently dead person. “I am a skeletal cosmic freaking entity that exists outside of space and time, I really do not have the time for the restrictive genders of you mortals.”
“Oh, right, sorry,” responded the recently deceased Glenn, “you could be a bit nicer about it though, I have just died!” He gestured to his still-warm body, which was lying in a pool of his own blood (or spaghetti sauce, though probably blood), and was being pecked at by a bird that looked a bit like a raven, though since Glenn knew nothing about birds, especially ravens, he wasn’t entirely certain.
“Mate, shut up,” said Death. “Damned mortals!”
“But what now though?” asked Glenn, ignoring Death. “Do I go with you? Or am I stuck here?”
“Honestly, I don’t care, mate, do what you want,” replied Death exasperatedly. “I just want to go back to my crossword, but now I have to deal with all the sodding paperwork!”
“Could you just let me go back to being alive?”
“Not likely, I mean look,” Death said as he pointed at the corpse being pecked at by what may or may not have been a raven. “You are pretty obviously dead.”
“Oh, right,” responded Glenn gloomily. “I understand.”
“Although,” began Death craftily.
“Although what?”
“You could just be mostly dead.”
“How can I be mostly dead?” asked Glenn, confused by the whole situation.
“Well, you personally are obviously properly dead, but sometimes people are a little bit alive, and in those circumstances, I can let them go back to being alive.”
“Okay!” responded Glenn excitedly.
“And thankfully there is no paperwork because you were alive,” continued Death happily, using his skeletal fingers to do air quotes around the word alive, “plus I don’t have to deal with you anymore, so go on back.” Glenn nodded and followed Death’s orders. He lay down on top of his body, and closed his eyes. He took a deep breath, and then winced in pain. His eyes shot open, and he sat up, covered in his own blood, shirt ruined, glad about not having to be dead, but understandably still rather shaken by the whole experience.
“Oh, by the way, don’t die again anytime soon, because if you do I’ll make you regret it,” said Death threateningly before grabbing his newspaper and disappearing.
***
"Helloo, anyone home" said a woman's voice startling Glenn a bit, causing him to drop him his empty tankard. It was Camilla.
"Ah bollocks" exclaimed Glenn
"Watch your language Glenn" responded Camilla feigning offence
"Sorry, I was someplace else" explained Glenn
"No worries sweetie" she said reassuringly "is everything okay? You look like death." Glenn reached for his side. His shirt was still a little damp with his spaghetti sauce, I mean, blood. It was probably some sort of health and safety violation for him to be in the tavern, but they didn't have health and safety, which explains a great many things about the period, like why there were so many things that could end your life prematurely.
"Its...err...I'm fine?" he replied, though it came out as if he were asking a question.
"Oh, that's great sweetie" said Camilla, completely uninterested, she wasn't really paying attention. The tavern was busy and Glenn was one of those people who you could easily forget about. She grabbed his tankard and got back to work.
The singing had all but come to an end; even Ser Malcolm had stopped. The only one still singing, if you could call slurring most of the words and forgetting the rest singing, was Friar Moustache. He was swaying a little on his stool and swinging his arm about, seemingly forgetting he was still holding a half-full mug of mead. His big finish came, and he leant back on his stool and toppled over, flinging his mug into the air, which quickly came crashing down onto the head of another drunken patron.
"oi, Wheresh me drink gone?" slurred Friar Moustache "were in me han!"
He struggled to get back up onto his feet. Camilla walked quickly over to see what the commotion, and bent down. "Let me help you Friar" said Camilla. He smiled at her a great big stupid drunken grin.
"Yur a riight goodun" he replied taking her hand and letting her pull him up.
"You need to go home Friar" said the owner in a thick Lancashire accent from behind the counter "You've had a bit much mate."
“Iamsickofyourshit,” Moustache said, his words tumbling from his mouth in a rush of barely distinguishable syllables. The owner nodded to his daughter, and a couple of his larger, more sober patrons, who grabbed the drunken holy man, and tried to escort him calmly out.
“Gerroff me!” he said as he wobbled. “I’m ash sober ash ‘m gonna git. And there nuffink - wait wait wait - nuffink you can do ‘boutit.” He shook free of their grasp, and ambled back to the bar without so much as a hiccup in their direction. The owner was much less polite after the first attempt.
"Just carry him out," he ordered a couple of patrons.
"Gerroff! I'm a man o' cloth," objected Friar Moustache. "I'ma have words with God if ya don't gerroff." They ignored him, and carried him through the tavern, whilst the other patrons simply ignored what was pretty average for a Sunday evening.
They carried him through the door and dropped him on the ground. "Sorry, Friar," said one of the men who had been carrying him. The friar rolled over and struggled to get up, but refused any offer of help from those who had just chucked him out.
"Itsh fine, gerroff," he said. "I can do it meself." The men looked at one another, shrugged, and went back inside. The friar climbed back onto his feet and stumbled forward. He grabbed a wooden hitching post for support. He clung there, slack-jawed and slumped over, for a long time before he began staggering away from the tavern towards the church. He was planning to have a bit of holy wine before heading to bed. It was dark, and the polite rain had become proper rain. He was drunkenly mumbling angrily to himself about having been thrown out of the tavern. He was insistent that he wasn't that drunk, even though he was barely able to stand or string a sentence together.
As he approached the midway point between the tavern and the church, he noticed a very bright, almost blinding light out of the corner of his eye. He turned, squinted, and walked towards the light.
"Whasis? Whas goin on?" he exclaimed, though still slurring his words. "Lord, is tha you?"
Friar Moustache walked into the light, and fell backwards with a loud 'oof'.
"Watch where you're going, drunk prick!" yelled a feminine voice, coming from the light, as it seemed to float round the friar and wander towards the edge of the village. Moustache sat there for a minute, his mouth agape, shocked. After a few minutes of watching the light float away, he drunkenly climbed up onto his feet, looked towards the church, then at the tavern, then at the church again, made the sign of the cross, then staggered back towards the tavern.
6 notes · View notes
monicadeola · 4 years
Link
When the call came to say my mother had died, I was working on a jigsaw of Joan Miró’s painting The Tilled Field (1923-24). Like many others, I turned to jigsaws at the start of the pandemic as a way to manage stress, and symbolically reimpose order on a chaotic world. We take our consolations where we can and, as I continued with the puzzle in the days after mum’s death, its tactile qualities, the spicy smell of ink and card, and the small satisfactions of placing each piece where it belonged, grounded me when the world was in bits – both outside and within.
The Tilled Field is an elementally life-affirming painting. A view of Miró’s family farm in Mont-roig del Camp in Catalonia, it conjures a surreal collection of human, animal and vegetable forms, deconstructed and stylised, and heavily symbolic. Drawing on references from medieval Spanish tapestry to Catalan ceramics and cave paintings, the image is earthy, visceral and definitively a rural scene. Still, there’s something disquieting about the painting, as if it had emerged from a dream or the recesses of an unquiet mind. A tree grows a human ear and an eye; a cloud formation is also a weathervane; a piebald mare swishes her tail as her foal suckles at her teat. At its heart sits a tumbledown farmhouse straight from a dark folk tale. The smoke from its chimney suggests occupation, but the plaster walls are cracked and crumbling back to earth.
Since her diagnosis of dementia 15 years ago, my mother, too, had been disintegrating, as it were, piece by piece. At each of my fortnightly visits, some further part of her seemed to have newly dropped away, leaving gaps so raw and cruel that I sometimes had to remind myself to focus on what remained. COVID-19 put a stop to my visiting the nursing home where she spent the final decade of her life. We tried FaceTime ‘get togethers’ but my mother was blind as well as in late-stage dementia, so these felt like one-way affairs – mum’s eyes half-closed, her face unresponsive, her body giving every impression of lifelessness. At the time of her death, I hadn’t seen her for four months, and her image had begun to fade in my mind.
Having a meaningful exchange with my mother involved delving into our shared narrative archive even as it shrank. In this way, we relived and remade the story of our life. We dipped toffee apples for bonfire night, rode donkeys on Llandudno beach, searched for the screech owl in the forest near my childhood home. Sometimes, my mother added to these memories as if they were lucid dreams she could shape at will. Meeting her where she was meant I had to map out the changing landscape of her dementia. Only there could we truly be together.
Three-year-olds work by trial and error, but four-year-olds use the information in the picture to help them complete the puzzle
If maps are representations of a larger reality, then jigsaws are maps too. Indeed, they began life this way, as ‘dissected maps’. Invented by the British cartographer John Spilsbury in the 1760s, the earliest puzzles were designed to make geography lessons more fun for schoolchildren and, no doubt, inculcate them early into the cult of empire. They remained classroom aids until the 1800s, when their manufacture was made cheaper by lithographic printing techniques, the invention of plywood and the treadle jigsaw. Over the 19th century, what began as hand-coloured maps became printed images of monarchs and biblical illustrations, and by the fin de siècle, when the ideas of Freud, Darwin, Nietzsche and the ‘New Woman’ threatened to fragment the old reality entirely, jigsaws had become popular family entertainments.
Like childhood itself, the early dissected maps arrived without any paper picture to act as a guide. The puzzle historian Anne Williams notes that, in 1908, Parker Bros changed the game by adding a print of the complete image to the box. With uncertainty about the destination reduced, the path grew more enticing. By the early 1930s, with the Great Depression beginning to bite, sales of jigsaws in the United States topped 10 million a week. Enthusiasts queued at newsagents for new deliveries, much as modern lockdown puzzlers scoured the internet and traded in secondhand puzzles.
While there is evidence to suggest that jigsaws help older people retain visuospatial memory, a recent study led by the psychologist Martin Doherty at the University of East Anglia in the UK is the first to investigate how children use their understanding of pictures to complete jigsaw puzzles. The study found that three-year-olds work by trial and error, but four-year-olds use the information in the picture to help them complete the puzzle. Such an understanding of the language of pictorial representation is the foundation of the uniquely human ability to draw and create art.
It’s often said that old age is a second childhood. The similarity of the two states – the child immersed in their magic kingdom, the old person in their memory palace – isn’t lost on artists, scientists and thinkers. As the child emerges from the void, accumulating experience, making connections between things and people, so the old person divests themselves, or has taken from them, those same connections, before they return to the emptiness of nonexistence.
When cracks first began to appear in my mother’s memory, she frantically touched them up in a colour that never quite matched. Once touch-ups became insufficient, she began a programme of wholesale renovation in the form of confabulated memories, extending and reworking experiences that, had they been real, wouldn’t have passed building regulations. Though by now immobile, she’d insist that she had taken a long walk by the seaside, or run across my brother in a pear orchard, or just returned from holiday. The further her disease advanced, the less robust her attempts at repair became, as the supply of materials with which to build them dwindled. She once told me that her mind was falling to bits, which is what happens to everything and everyone eventually. We live with entropy. Yet how hard we resist it. Much of the human project is taken up with holding together things that will, eventually and inevitably, fall apart. Witnessing my mother labouring to put her brain back together was intensely moving. Her courage and resistance were flags planted in the territory of the living, and they deepened my love for her as she grew more frail. The lesson I learned is that it’s not memory that makes us human but meaning-making. That’s where the beauty and poignancy of human life is played out.
Slotting a familiar piece into its rightful place can feel almost as rewarding as returning a lost child to her mother
Art is a system of meaning-making too and, in the months since mum’s death, I have deepened my understanding of how it operates by ‘dissecting’ the map that is The Tilled Field. To complete the jigsaw of an artwork is to observe the artist’s work in a way that’s almost impossible to do in a gallery. You get to know it intimately, becoming familiar with every turn of the brush, each minute gradation of colour and tone. You develop an eye for certain patterns. Particularly ‘helpful’ or intriguing jigsaw pieces, that are vital sources of information, data points along the route to completion, take on the character of old friends. Slotting a familiar piece into its rightful place can feel almost as rewarding as returning a lost child to her mother. Over the weeks it takes me to complete The Tilled Field, its elements and some essence of the artist take up residence inside me, becoming, as the psychoanalyst Melanie Klein might have said, introjected internal objects.
This kind of dynamic encounter of projection and introjection with the world of people and objects is how Klein imagines the way an infant struggles to construct an integrated ego. If we’re lucky, Klein suggests, we develop from fragments of desire and need, frustrated or met, into coherent selves able to meet our own desires and needs. Whispering seductively in our ears all the while is Thanatos, the death instinct, willing us back towards the comforting psychic disintegration of not-feeling and unbeing. For Klein, coming into being is an existential battle. For some of us the drama returns, as it did for my mother, in the long, slow process of leaving life behind.
Klein’s one-time disciple Donald Winnicott had something interesting to say about becoming that seems important to me, standing as I am in the shadow of my mother’s death. For him, the mother is at the heart of everything – her willingness to hold, handle and ‘present objects’ to her baby, to lend him her ego for his own use, enabling him to see himself as a coherent being, separate from her (and thus able to support a relationship with her). Only through her can he become whole and real. In the language of jigsaws, good-enough mothering is the guide-image that the infant requires to allow him to build an integrated self from the bits and pieces of his needs, his developing internal world and his body.
When, later, bereavement leaves us once more in pieces, when the mother who birthed us is no longer here, how do we put ourselves back together again? Where is the guide-picture to help us map loss when the world itself seems to be coming apart, exposing the insufficiency of the old rubrics for living?
The attachment theorist John Bowlby described mourning as a form of separation anxiety, akin to that felt by a child lost in a crowd. There is panic, disorientation, a shattering of reality. Freud thought that, in order to grieve healthily, we must sever our bonds with the dead, and establish new ones with the living. But even if that were desirable, cutting ties with the mother through whom one becomes a self seems to ask the impossible. Dennis Klass, an expert on bereavement, suggests a more compassionate model. In his view, there’s no ‘closure’, no turning away from the dead. The bereaved person doesn’t let go, but retains their bond with the dead by negotiating and renegotiating the meaning of their loss. This is the neverending task of grief, and it’s not without its consolations. My relationship with my mother remains alive for me, not simply as a fragment of the guide-picture I conjure of my life, but as a vibrant and evolving aspect of my internal world. When I speak to her, I’m addressing neither a ghost nor a memory, but the real mother who exists inside me, as all the versions of herself I ever knew. Death notwithstanding, our relationship continues to evolve.
And so back to The Tilled Field, and the making and remaking implicit in its creation – and also in my recreation of it as a jigsaw. To the decrepit farmhouse, the smoke rising from a cheerful fire, and to my image of my mother and me, warming our hands beside flames that, like us, are born and reformed in destruction and renewal. In The Tilled Field inside me, my mother and I talk quietly about our lives, or don’t talk but simply go-on-being, together, while beyond the crumbling walls, real life teems, strange and brilliant, as if in a dream.
- Melanie McGrath
0 notes
suzanneshannon · 4 years
Text
Chapter 4: Search
Previously in web history…
After an influx of rapid browser development following the creation of the web, Mosaic becomes the popular choice. Recognizing the commercial potential of the web, a team at O’Reilly builds GNN, the first commercial website. With something to browse with, and something to browse for, more and more people begin to turn to the web. Many create small, personal sites of their own. The best the web has to offer becomes almost impossible to find.
eBay had had enough of these spiders. They were fending them off by the thousands. Their servers buzzed with nonstop activity; a relentless stream of trespassers. One aggressor, however, towered above the rest. Bidder’s Edge, which billed itself as an auction aggregator, would routinely crawl the pages of eBay to extract its content and list it on its own site alongside other auction listings.
The famed auction site had unsuccessfully tried blocking Bidder’s Edge in the past. Like an elaborate game of Whac-A-Mole, they would restrict the IP address of a Bidder’s Edge server, only to be breached once again by a proxy server with a new one. Technology had failed. Litigation was next.
eBay filed suit against Bidder’s Edge in December of 1999, citing a handful of causes. That included “an ancient trespass theory known to legal scholars as trespass to chattels, basically a trespass or interference with real property — objects, animals, or, in this case, servers.” eBay, in other words, was arguing that Bidder’s Edge was trespassing — in the most medieval sense of that word — on their servers. In order for it to constitute trespass to chattels, eBay had to prove that the trespassers were causing harm. That their servers were buckling under the load, they argued, was evidence of that harm.
eBay in 1999
Judge Ronald M. Whyte found that last bit compelling. Quite a bit of back and forth followed, in one of the strangest lawsuits of a new era that included the phrase “rude robots” entering the official court record. These robots — as opposed to the “polite” ones — ignored eBay’s requests to block spidering on their sites, and made every attempt to circumvent counter measures. They were, by the judge’s estimation, trespassing. Whyte granted an injunction to stop Bidder’s Edge from crawling eBay until it was all sorted out.
Several appeals and countersuits and counter-appeals later, the matter was settled. Bidder’s Edge paid eBay an undisclosed amount and promptly shut their doors. eBay had won this particular battle. They had gotten rid of the robots. But the actual war was already lost. The robots — rude or otherwise — were already here.
If not for Stanford University, web search may have been lost. It is the birthplace of Yahoo!, Google and Excite. It ran the servers that ran the code that ran the first search engines. The founders of both Yahoo! and Google are alumni. But many of the most prominent players in search were not in the computer science department. They were in the symbolic systems program.
Symbolic systems was created at Stanford in 1985 as a study of the “relationship between natural and artificial systems that represent, process, and act on information.” Its interdisciplinary approach is rooted at the intersection of several fields: linguistics, mathematics, semiotics, psychology, philosophy, and computer science.
These are the same fields of study one would find at the heart of artificial intelligence research in the second half of the 20ᵗʰ century. But this isn’t A.I. in its modern smart home manifestation; it is the more classical notion conceived by computer scientists as a roadmap to the future of computing technology. It is the understanding of machines as a way to augment the human mind. That parallel is not by accident. One of the most important areas of study at the symbolic systems program is artificial intelligence.
Numbered among the alumni of the program are several of the founders of Excite and Srinija Srinivasan, the fourth employee at Yahoo!. Her work in artificial intelligence led to a position at the ambitious A.I. research lab Cyc right out of college.
Marissa Mayer, an early employee at Google and, later, Yahoo!’s CEO, also drew on A.I. research during her time in the symbolic systems program. Her groundbreaking thesis project used natural language processing to help its users find the best flights through a simple conversation with a computer. “You look at how people learn, how people reason, and ask a computer to do the same things. It’s like studying the brain without the gore,” she would later say of the program.
Marissa Mayer in 1999
Search on the web stems from this one program at one institution at one brief moment in time. Not everyone involved in search engines studied that program — the founders of both Yahoo! and Google, for instance, were graduate students of computer science. But the ideology of search is deeply rooted in the tradition of artificial intelligence. The goal of search, after all, is to extract from the brain a question, and use machines to provide a suitable answer.
At Yahoo!, the principles of artificial intelligence acted as a guide, but it would be aided by human perspective. Web crawlers, like Excite, would bear the burden of users’ queries and attempt to map websites programmatically to provide intelligent results.
However, it would be at Google that A.I. would become an explicitly stated goal. Steven Levy, who wrote the authoritative book on the history of Google, In the Plex, describes Google as a “vehicle to realize the dream of artificial intelligence in augmenting humanity.” Founders Larry Page and Sergey Brin would mention A.I. constantly. They even brought it up in their first press conference.
The difference would be a matter of approach. A tension that would come to dominate search for half a decade. The directory versus the crawler. The precision of human influence versus the completeness of machines. Surfers would be on one side and, on the other, spiders. Only one would survive.
The first spiders were crude. They felt around in the dark until they found the edge of the web. Then they returned home. Sometimes they gathered little bits of information about the websites they crawled. In the beginning, they gathered nothing at all.
One of the earliest web crawlers was developed at MIT by Matthew Gray. He used his World Wide Wanderer to go and find every website on the web. He wasn’t interested in the content of those sites, he merely wanted to count them up. In the summer of 1993, the first time he sent his crawler out, it got to 130. A year later, it would count 3,000. By 1995, that number grew to just shy of 30,000.
Like many of his peers in the search engine business, Gray was a disciple of information retrieval, a subset of computer science dedicated to knowledge sharing. In practice, information retrieval often involves a robot (also known as “spiders, crawlers, wanderers, and worms”) that crawls through digital documents and programmatically collects their contents. They are then parsed and stored in a centralized “index,” a shortcut that eliminates the need to go and crawl every document each time a search is made. Keeping that index up to date is a constant struggle, and robots need to be vigilant, going back out and re-crawling information on a near-constant basis.
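To make that loop concrete, here is a minimal, hypothetical sketch in Python of the pattern described above: fetch a page, pull out its links and words, file the words into an inverted index, then follow the links to new pages. The names and parameters (crawl, search, max_pages, delay_seconds) are inventions for illustration; none of this is how Jumpstation, WebCrawler, or any other engine of the era was actually built, and real crawlers ran in compiled languages at vastly larger scale.

```python
# A toy crawl-and-index loop, for illustration only. It uses just the
# Python standard library and makes no claim to historical accuracy.
from collections import defaultdict, deque
from html.parser import HTMLParser
from urllib.parse import urldefrag, urljoin
from urllib.request import urlopen
import re
import time


class LinkAndTextParser(HTMLParser):
    """Collects href links and visible text from a single HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.text_parts.append(data)


def crawl(seed_urls, max_pages=25, delay_seconds=1.0):
    """Breadth-first crawl that returns an inverted index: word -> set of URLs."""
    index = defaultdict(set)
    frontier = deque(seed_urls)
    seen = set(seed_urls)
    fetched = 0

    while frontier and fetched < max_pages:
        url = frontier.popleft()
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # unreachable page: skip it and move on
        fetched += 1

        parser = LinkAndTextParser()
        parser.feed(html)

        # Tokenize the visible text and record which page each word came from.
        for word in re.findall(r"[a-z0-9]+", " ".join(parser.text_parts).lower()):
            index[word].add(url)

        # Follow outgoing links that have not been queued yet.
        for href in parser.links:
            absolute, _fragment = urldefrag(urljoin(url, href))
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                frontier.append(absolute)

        time.sleep(delay_seconds)  # a crude nod to being a 'polite' robot

    return index


def search(index, query):
    """Return pages containing every query word (set intersection, no ranking)."""
    matches = [index.get(word, set()) for word in query.lower().split()]
    return set.intersection(*matches) if matches else set()
```

Run over a few seed URLs, search(index, "stanford search") would return only the crawled pages containing both words, which is the basic index-then-intersect shape the text describes; everything the real engines fought over (ranking, freshness, politeness, and scale) is left out of the sketch.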
The World Wide Web posed a problematic puzzle. Rather than a predictable set of documents, a theoretically infinite number of websites could live on the web. These needed to be stored in a central index — which would somehow be kept up to date. And most importantly, the content of those sites needed to be connected to whatever somebody wanted to search, on the fly and in seconds. The challenge proved irresistible for some information retrieval researchers and academics. People like Jonathan Fletcher.
Fletcher, a former graduate and IT employee at the University of Stirling in Scotland, didn’t like how hard it was to find websites. At the time, people relied on manual lists, like the WWW Virtual Library maintained at CERN, or Mosaic’s list of “What’s New” that they updated daily. Fletcher wanted to handle it differently. “With a degree in computing science and an idea that there had to be a better way, I decided to write something that would go and look for me.”
He built Jumpstation in 1993, one of the earliest examples of a searchable index. His crawler would go out, following as many links as it could, and bring them back to a searchable, centralized database. Then it would start over. To solve for the issue of the web’s limitless vastness, Fletcher began by crawling only the titles and some metadata from each webpage. That kept his index relatively small, but it also restricted search to the titles of pages.
Fletcher was not alone. After tinkering for several months, WebCrawler launched in April of 1994 out of the University of Washington. It holds the distinction of being the first search engine to crawl entire webpages and make them searchable. By November of that year, WebCrawler had served 1 million queries. At Carnegie Mellon, Michael Mauldin released his own spider-based search engine variant named for the Latin translation of wolf spider, Lycos. By 1995, it had indexed over a million webpages.
Search didn’t stay in universities long. Search engines had a unique utility for wayward web users on the hunt for the perfect site. Many users started their web sessions on a search engine. Netscape Navigator — the number one browser for new web users — connected users directly to search engines on their homepage. Getting listed by Netscape meant eyeballs. And eyeballs meant lucrative advertising deals.
In the second half of the 1990’s, a number of major players entered the search engine market. InfoSeek, initially a paid search option, was picked up by Disney, and soon became the default search engine for Netscape. AOL swooped in and purchased WebCrawler as part of a bold strategy to remain competitive on the web. Lycos was purchased by a venture capitalist who transformed it into a fully commercial enterprise.
Excite.com, another crawler started by Stanford alumni and a rising star in the search engine game for its depth and accuracy of results, was offered three million dollars not long after they launched. Its six co-founders lined up two couches, one across from another, and talked it out all night. They decided to stick with the product and bring in a new CEO. There would be many more millions to be made.
Excite in 1996
AltaVista, already a bit late to the game at the end of 1995, was created by the Digital Equipment Corporation. It was initially built to demonstrate the processing power of DEC computers. They quickly realized that their multithreaded crawler was able to index websites at a far quicker rate than their competitors. AltaVista would routinely deploy its crawlers — what one researcher referred to as a “brood of spiders” — to index thousands of sites at a time.
As a result, AltaVista was able to index virtually the entire web, nearly 10 million webpages at launch. By the following year, in 1996, they’d be indexing over 100 million. Because of the efficiency and performance of their machines, AltaVista was able to solve the scalability problem. Unlike some of their predecessors, they were able to make the full content of websites searchable, and they re-crawled sites every few weeks, a much more rapid pace than early competitors, who could take months to update their index. They set the standard for the depth and scope of web crawlers.
AltaVista in 1996
Never fully at rest, AltaVista used its search engine as a tool for innovation, experimenting with natural language processing, translation tools, and multi-lingual search. They were often ahead of their time, offering video and image search years before that would come to be an expected feature.
Those spiders that had not been swept up in the fervor couldn’t keep up. The universities hosting the first search engines were not at all pleased to see their internet connections bloated with traffic that wasn’t even related to the university. Most universities forced the first experimental search engines, like Jumpstation, to shut down. Except, that is, at Stanford.
Stanford’s history with technological innovation begins in the second half of the 20th century. The university was, at that point, teetering on the edge of becoming a second-tier institution. They had been losing ground and lucrative contracts to their competitors on the East Coast. Harvard and MIT became the sites of a groundswell of research in the wake of World War II. Stanford was being left behind.
In 1951, in a bid to reverse course on their downward trajectory, Dean of Engineering Frederick Terman brokered a deal with the city of Palo Alto. Stanford University agreed to annex 700 acres of land for a new industrial park that upstart companies in California could use. Stanford would get proximity to energetic innovation. The businesses that chose to move there would gain unique access to the Stanford student body for use on their product development. And the city of Palo Alto would get an influx of new taxes.
Hewlett-Packard was one of the first companies to move in. They ushered in a new era of computing-focused industry that would soon be known as Silicon Valley. The Stanford Research Park (later renamed Stanford Industrial Park) would eventually host Xerox during a time of rapid success and experimentation. Facebook would spend their nascent years there, growing into the behemoth it would become. At the center of it all was Stanford.
The research park transformed the university from one of stagnation to a site of entrepreneurship and cutting-edge technology. It put them at the heart of the tech industry. Stanford would embed itself — both logistically and financially — in the crucial technological developments of the second half of the 20ᵗʰ century, including the internet and the World Wide Web.
The potential success of Yahoo!, therefore, did not go unnoticed.
Jerry Yang and David Filo were not supposed to be working on Yahoo!. They were, however, supposed to be working together. They had met years ago, when David was Jerry’s teaching assistant in the Stanford computer science program. Yang eventually joined Filo as a graduate student and — after building a strong rapport — they soon found themselves working on a project together.
As they crammed themselves into a university trailer to begin working through their doctoral project, their relationship became what Yang has often described as perfectly balanced. “We’re both extremely tolerant of each other, but extremely critical of everything else. We’re both extremely stubborn, but very unstubborn when it comes to just understanding where we need to go. We give each other the space we need, but also help each other when we need it.”
In 1994, Filo showed Yang the web. In just a single moment, their focus shifted. They pushed their intended computer science thesis to the side, procrastinating on it by immersing themselves into the depths of the World Wide Web. Days turned into weeks which turned into months of surfing the web and trading links. The two eventually decided to combine their lists in a single place, a website hosted on their Stanford internet connection. It was called Jerry and David’s Guide to the World Wide Web, launched first to Stanford students in 1993 and then to the world in January of 1994. As catchy as that name wasn’t, the idea (and traffic) took off as friends shared with other friends.
Jerry and David’s Guide was a directory. Like the virtual library started at CERN, Yang and Filo organized websites into various categories that they made up on the fly. Some of these categories had strange or salacious names. Others were exactly what you might expect. When one category got too big, they split it apart. It was ad-hoc and clumsy, but not without charm. Through their classifications, Yang and Filo had given their site a personality. Their personality. In later years, Yang would commonly refer to this as the “voice of Yahoo!”
That voice became a guide — as the site’s original name suggested — for new users of the web. Their web crawling competitors were far more adept at the art of indexing millions of sites at a time. Yang and Filo’s site featured only a small subset of the web. But it was, at least by their estimation, the best of what the web had to offer. It was the cool web. It was also a web far easier to navigate than ever before.
Jerry Yang (left) and David Filo (right) in 1995 (Yahoo, via Flickr)
At the end of 1994, Yang and Filo renamed their site to Yahoo! (an awkward forced acronym for Yet Another Hierarchical Officious Oracle). By then, they were getting almost a hundred thousand hits a day, sometimes temporarily taking down Stanford’s internet in the process. Most other universities would have closed down the site and told them to get back to work. But not Stanford. Stanford had spent decades preparing for on-campus businesses just like this one. They kept the server running, and encouraged its creators to stake their own path in Silicon Valley.
Throughout 1994, Netscape had included Yahoo! in their browser. There was a button in the toolbar labeled “Net Directory” that linked directly to Yahoo!. Marc Andreessen, believing in the site’s future, agreed to host their website on Netscape’s servers until they were able to get on steady ground.
Yahoo! homepage in Netscape Navigator, circa 1994
Yang and Filo rolled up their sleeves, and began talking to investors. It wouldn’t take long. By the spring of 1996, they would have a new CEO and hold their own record-setting IPO, outstripping even their gracious host, Netscape. By then, they became the most popular destination on the web by a wide margin.
In the meantime, the web had grown far beyond the grasp of two friends swapping links. They had managed to categorize tens of thousands of sites, but there were hundreds of thousands more to crawl. “I picture Jerry Yang as Charlie Chaplin in Modern Times,” one journalist described, “confronted with an endless stream of new work that is only increasing in speed.” The task of organizing sites would have to go to somebody else. Yang and Filo found help in a fellow Stanford alumna, someone they had met years ago while studying abroad together in Japan, Srinija Srinivasan, a graduate of the symbolic systems program. Many of the earliest hires at Yahoo! were given slightly absurd titles that always ended in “Yahoo.” Yang and Filo went by Chief Yahoos. Srinivasan’s job title was Ontological Yahoo.
That is a deliberate and precise job title, and it was not selected by accident. Ontology is the study of being, an attempt to break the world into its component parts. It has manifested in many traditions throughout history and the world, but it is most closely associated with the followers of Socrates, in the work of Plato, and later in the groundbreaking text Metaphysics, written by Aristotle. Ontology asks the question “What exists?” and uses it as a thought experiment to construct an ideology of being and essence.
As computers blinked into existence, ontology found a new meaning in the emerging field of artificial intelligence. It was adapted to fit the more formal hierarchical categorizations required for a machine to see the world; to think about the world. Ontology became a fundamental way to describe the way intelligent machines break things down into categories and share knowledge.
The dueling definitions of the ontology of metaphysics and computer science would have been familiar to Srinija Srinivasan from her time at Stanford. The combination of philosophy and artificial intelligence in her studies gave her a unique perspective on hierarchical classifications. It was this experience that she brought to her first job after college at the Cyc Project, an artificial intelligence research lab with a bold project: to teach a computer common sense.
Srinija Srinivasan (Getty Images/James D. Wilson)
At Yahoo!, her task was no less bold. When someone looked for something on the site, they didn’t want back a random list of relevant results. They wanted the result they were actually thinking about, but didn’t quite know how to describe. Yahoo! had to — in a manner of seconds — figure out what its users really wanted. Much like her work in artificial intelligence, Srinivasan needed to teach Yahoo! how to think about a query and infer the right results.
To do that, she would need to expand the voice of Yahoo! to thousands more websites in dozens of categories and sub-categories without losing the point of view established by Jerry and David. She would need to scale that perspective. “This is not a perfunctory file-keeping exercise. This is defining the nature of being,” she once said of her project. “Categories and classifications are the basis for each of our worldviews.”
At a steady pace, she mapped an ontology of human experience onto the site. She began breaking up the makeshift categories she inherited from the site’s creators, re-constituting them into more concrete and findable indexes. She created new categories and destroyed old ones. She sub-divided existing subjects into new, more precise ones. She began cross-linking results so that they could live within multiple categories. Within a few months she had overhauled the site with a fresh hierarchy.
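The article doesn’t spell out how such a cross-linked directory might have been structured, but a rough sketch helps make the idea concrete. The category paths and the example site below are invented for illustration; this is only a plausible model, not Yahoo!’s actual system.

```python
# A minimal sketch of a human-curated directory in which a single site can be
# cross-linked into several categories at once. Categories and the site URL
# are hypothetical, invented purely for illustration.
from collections import defaultdict

class Directory:
    def __init__(self):
        # category path (tuple of labels) -> set of site URLs filed under it
        self.categories = defaultdict(set)
        # site URL -> set of category paths it appears under
        self.sites = defaultdict(set)

    def add(self, url, *paths):
        """File one site under one or more category paths."""
        for path in paths:
            self.categories[path].add(url)
            self.sites[url].add(path)

    def listing(self, path):
        """Everything filed under a single category."""
        return sorted(self.categories[path])

d = Directory()
# The same site can live in more than one branch of the hierarchy.
d.add("http://example-film-zine.com",
      ("Arts", "Movies", "Reviews"),
      ("News & Media", "Magazines"))
print(d.listing(("Arts", "Movies", "Reviews")))
print(d.sites["http://example-film-zine.com"])
```

The point of the two-way mapping is that a single listing can be reached from more than one branch of the hierarchy without being duplicated, which is what cross-linking results amounts to.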
That hierarchical ontology, however, was merely a guideline. The strength of Yahoo!’s expansion lay in the 50 or so content managers she had hired in the meantime. They were known as surfers. Their job was to surf the web — and organize it.
Each surfer was coached in the methodology of Yahoo! but was left with a surprising amount of editorial freedom. They cultivated the directory with their own interests, meticulously deliberating over websites and where they belonged. Each decision could be strenuous, and there were missteps and incorrectly categorized items along the way. But by allowing individual personality to dictate hierarchical choices, Yahoo! retained its voice.
They gathered as many sites as they could, adding hundreds each day. Yahoo! surfers did not reveal everything on the web to their site’s visitors. They showed them what was cool. And that meant everything to users grasping for the very first time what the web could do.
At the end of 1995, the Yahoo! staff was watching their traffic closely. Huddled around consoles, employees would check their logs again and again, looking for a drop in visitors. Yahoo! had been the destination for the “Internet Directory” button on Netscape for years. It had been the source of their growth and traffic. Netscape had made the decision, at the last minute (and seemingly at random), to drop Yahoo!, replacing them with the new kids on the block, Excite.com. Best case scenario: a manageable drop. Worst case: the demise of Yahoo!.
But the drop never came. A day went by, and then another. And then a week. And then a few weeks. And Yahoo! remained the most popular website. Tim Brady, one of Yahoo!’s first employees, describes the moment with earnest surprise. “It was like the floor was pulled out in a matter of two days, and we were still standing. We were looking around, waiting for things to collapse in a lot of ways. And we were just like, I guess we’re on our own now.”
Netscape wouldn’t keep their directory button exclusive for long. By 1996, they would begin allowing other search engines to be listed on their browser’s “search” feature. A user could click a button and a drop-down of options would appear, for a fee. Yahoo! bought themselves back into the drop-down. They were joined by four other search engines, Lycos, InfoSeek, Excite, and AltaVista.
By that time, Yahoo! was the unrivaled leader. It had transformed its first mover advantage into a new strategy, one bolstered by a successful IPO and an influx of new investment. Yahoo! wanted to be much more than a simple search engine. Their site’s transformation would eventually be called a portal. It was a central location for every possible need on the web. Through a number of product expansions and aggressive acquisitions, Yahoo! released a new suite of branded digital products. Need to send an email? Try Yahoo! Mail. Looking to create a website? There’s Yahoo! Geocities. Want to track your schedule? Use Yahoo! Calendar. And on and on the list went.
Yahoo! in 1996
Competitors rushed to fill the vacuum of the #2 slot. In April of 1996, Yahoo!, Lycos and Excite all went public to soaring stock prices. Infoseek had their initial offering only a few months later. Big deals collided with bold blueprints for the future. Excite began positioning itself as a more vibrant alternative to Yahoo! with more accurate search results from a larger slice of the web. Lycos, meanwhile, all but abandoned the search engine that had brought them initial success to chase after the portal-based game plan that had been a windfall for Yahoo!.
The media dubbed the competition the “portal wars,” a fleeting moment in web history when millions of dollars poured into a single strategy. To be the biggest, best, centralized portal for web surfers. Any service that offered users a destination on the web was thrown into the arena. Nothing short of the future of the web (and a billion dollar advertising industry) was at stake.
In some ways, though, the portal wars were over before they started. When Excite announced a gigantic merger with @Home, an Internet Service Provider, to combine their services, not everyone thought it was a wise move. “AOL and Yahoo! were already in the lead,” one investor and cable industry veteran noted, “and there was no room for a number three portal.” AOL had just enough muscle and influence to elbow their way into the #2 slot, nipping at the heels of Yahoo!. Everyone else would have to go toe-to-toe with Goliath. None were ever able to pull it off.
Battling their way to market dominance, most search engines had simply lost track of search. Buried somewhere next to your email and stock ticker and sports feed was, in most cases, a second rate search engine you could use to find things — only not often and not well. That’s why it was so refreshing when another search engine out of Stanford launched with just a single search box and two buttons, its bright and multicolored logo plastered across the top.
A few short years after it launched, Google was on the shortlist of most popular sites. In an interview with PBS Newshour in 2002, co-founder Larry Page described their long-term vision. “And, actually, the ultimate search engine, which would understand, you know, exactly what you wanted when you typed in a query, and it would give you the exact right thing back, in computer science we call that artificial intelligence.”
Google could have started anywhere. It could have started with anything. One employee recalls an early conversation with the site’s founders where he was told “we are not really interested in search. We are making an A.I.” Larry Page and Sergey Brin, the creators of Google, were not trying to create the web’s greatest search engine. They were trying to create the web’s most intelligent website. Search was only their most logical starting point.
Imprecise and clumsy, the spider-based search engines of 1996 faced an uphill battle. AltaVista had proved that the entirety of the web, tens of millions of webpages, could be indexed. But unless you knew your way around a few boolean logic commands, it was hard to get the computer to return the right results. The robots were not yet ready to infer, in Page’s words, “exactly what you wanted.”
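To see why that was a problem, here is a toy illustration of boolean-style matching, the kind of query early spider-based engines leaned on. The pages and the query are made up for this sketch; the point is only that a boolean filter says whether a page matches, not which match is best.

```python
# A toy boolean matcher: it returns the set of pages satisfying the query,
# but it has no notion of ranking among them. Pages and query are hypothetical.
pages = {
    "page1": "stanford graduate students build search engine",
    "page2": "search the web with boolean operators",
    "page3": "stanford campus map and directions",
}

def matches(text, all_of=(), none_of=()):
    words = set(text.split())
    return all(w in words for w in all_of) and not any(w in words for w in none_of)

# Roughly equivalent to the query: stanford AND search NOT map
results = [name for name, text in pages.items()
           if matches(text, all_of=("stanford", "search"), none_of=("map",))]
print(results)  # an unordered list of hits, with no sense of which is most relevant
```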
Yahoo! had filled in these cracks of technology with their surfers. The surfers were able to course-correct the computers, designing their directory piece by piece rather than relying on an algorithm. Yahoo! became an arbiter of a certain kind of online chic; tastemakers reimagined for the information age. The surfers of Yahoo! set trends that would last for years. Your site would live or die by their hand. Machines couldn’t do that work on their own. If you wanted your machines to be intelligent, you needed people to guide them.
Page and Brin disagreed. They believed that computers could handle the problem just fine. And they aimed to prove it.
That unflappable confidence would come to define Google far more than their “don’t be evil” motto. In the beginning, their laser-focus on designing a different future for the web would leave them blind to the day-to-day grind of the present. On not one, but two occasions, checks made out to the company for hundreds of thousands of dollars were left in desk drawers or car trunks until somebody finally made the time to deposit them. And they often did things differently. Google’s offices, for instance, were built to simulate a college dorm, an environment the founders felt most conducive to big ideas.
Google would eventually build a literal empire on top of a sophisticated, world-class infrastructure of their own design, fueled by the most elaborate and complex (and arguably invasive) advertising mechanism ever built. There are few companies that loom as large as Google. This one, like others, started at Stanford.
Even among the most renowned artificial intelligence experts, Terry Winograd, a computer scientist and Stanford professor, stands out in the crowd. He was also Larry Page’s advisor and mentor when he was a graduate student in the computer science department. Winograd has often recalled the unorthodox and unique proposals he would receive from Page for his thesis project, some of which involved “space tethers or solar kites.” “It was science fiction more than computer science,” he would later remark.
But for all of his fanciful flights of imagination, Page always returned to the World Wide Web. He found its hyperlink structure mesmerizing. Its one-way links — a crucial ingredient in the web’s success — had led to a colossal proliferation of new websites. In 1996, when Page first began looking at the web, there were tens of thousands of sites being added every week. The master stroke of the web was to enable links that only traveled in one direction. That allowed the web to be decentralized, but without a central database tracking links, it was nearly impossible to collect a list of all of the sites that linked to a particular webpage. Page wanted to build a graph of who was linking to who; an index he could use to cross-reference related websites.
Page understood that the hyperlink was a digital analog to academic citations. A key indicator of the value of a particular academic paper is the number of times it has been cited. If a paper is cited often (by other high quality papers), it is easier to vouch for its reliability. The web works the same way. The more often your site is linked to (what’s known as a backlink), the more dependable and accurate it is likely to be.
Theoretically, you can determine the value of a website by adding up all of the other websites that link to it. That’s only one layer though. If 100 sites link back to you, but each of them has only ever been linked to one time, that’s far less valuable than if five sites that each have been linked to 100 times link back to you. So it’s not simply how many links you have, but the quality of those links. If you take both of those dimensions and aggregate sites using backlinks as a criteria, you can very quickly start to assemble a list of sites ordered by quality.
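A back-of-the-envelope sketch makes the difference between those two dimensions concrete. The “authority” numbers below are invented, standing in for how often each linking site is itself linked to; this is only an illustration of the intuition, not the actual algorithm.

```python
# Contrast a raw backlink count with a quality-weighted score.
def naive_score(backlinks):
    return len(backlinks)  # just count the links

def weighted_score(backlinks):
    return sum(authority for _, authority in backlinks)  # weigh each link by its source

# Site A: 100 links, each from a site that has only ever been linked to once.
site_a = [("tiny-blog-%d" % i, 1) for i in range(100)]
# Site B: 5 links, each from a site that has been linked to 100 times.
site_b = [("major-hub-%d" % i, 100) for i in range(5)]

print(naive_score(site_a), naive_score(site_b))        # 100 vs 5  -> A looks better
print(weighted_score(site_a), weighted_score(site_b))  # 100 vs 500 -> B wins once quality counts
```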
John Battelle describes the technical challenge facing Page in his own retelling of the Google story, The Search.
Page realized that a raw count of links to a page would be a useful guide to that page’s rank. He also saw that each link needed its own ranking, based on the link count of its originating page. But such an approach creates a difficult and recursive mathematical challenge — you not only have to count a particular page’s links, you also have to count the links attached to the links. The math gets complicated rather quickly.
Fortunately, Page already knew a math prodigy. Sergey Brin had proven his brilliance to the world a number of times before he began a doctoral program in the Stanford computer science department. Brin and Page had crossed paths on several occasions, a relationship that began on rocky ground but grew towards mutual respect. The mathematical puzzle at the center of Page’s idea was far too enticing for Brin to pass up.
He got to work on a solution. “Basically we convert the entire Web into a big equation, with several hundred million variables,” he would later explain, “which are the page ranks of all the Web pages, and billions of terms, which are the links. And we’re able to solve that equation.” Scott Hassan, the seldom talked about third co-founder of Google who developed their first web crawler, summed it up a bit more concisely, describing Google’s algorithm as an attempt to “surf the web backward!”
The result was PageRank — as in Larry Page, not webpage. Brin, Page, and Hassan developed an algorithm that could trace the backlinks of a site to determine the quality of a particular webpage. The higher the value of a site’s backlinks, the higher up the rankings it climbed. They had discovered what so many others had missed. If you trained a machine on the right source — backlinks — you could get remarkable results.
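A minimal sketch of that recursive idea, assuming a tiny four-page link graph and the commonly cited damping factor of 0.85 (both illustrative choices, not Google’s actual data or tuning), might look like this:

```python
# A page's score is fed by the scores of the pages linking to it, so the
# scores have to be solved iteratively rather than in a single pass.
links = {            # page -> pages it links out to (a made-up toy graph)
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # Sum the share of rank passed along by every page that links to p.
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```

Because each page’s score depends on the scores of the pages linking to it, the loop has to run until the values settle, which is exactly the recursive difficulty Battelle describes above.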
It was only after they began matching their rankings to search queries that they realized PageRank fit best in a search engine. They called their search engine Google. It was launched on Stanford’s internet connection in August of 1996.
Google in 1998
Google solved the relevancy problem that had plagued online search since its earliest days. Crawlers like Lycos, AltaVista and Excite were able to provide a list of webpages that matched a particular search. They just weren’t able to sort them right, so you had to go digging to find the result you wanted. Google’s rankings were immediately relevant. The first page of your search usually had what you needed. They were so confident in their results they added an “I’m Feeling Lucky” button which took users directly to the first result for their search.
Google’s growth in their early days was not unlike Yahoo!’s in theirs. They spread through word of mouth, from friends to friends of friends. By 1997, they had grown big enough to put a strain on the Stanford network, something Yang and Filo had done only a couple of years earlier. Stanford once again recognized possibility. It did not push Google off their servers. Instead, Stanford’s advisors pushed Page and Brin in a commercial direction.
Initially, the founders sought to sell or license their algorithm to other search engines. They took meetings with Yahoo!, Infoseek and Excite. No one could see the value. They were focused on portals. In a move that would soon sound absurd, they each passed up the opportunity to buy Google for a million dollars or less, and Page and Brin could not find a partner that recognized their vision.
One Stanford faculty member was able to connect them with a few investors, including Jeff Bezos and David Cheriton (which got them those first few checks that sat in a desk drawer for weeks). They formally incorporated in September of 1998, moving into a friend’s garage, bringing a few early employees along, including symbolic systems alumna Marissa Mayer.
Larry Page (left) and Sergey Brin (right) started Google in a friend’s garage.
Even backed by a million dollar investment, the Google founders maintained a philosophy of frugality, simplicity, and swiftness. Despite occasional urging from their investors, they resisted the portal strategy and remained focused on search. They continued tweaking their algorithm and working on the accuracy of their results. They focused on their machines. They wanted to take the words that someone searched for and turn them into something actually meaningful. If you weren’t able to find the thing you were looking for in the top three results, Google had failed.
Google was followed by a cloud of hype and positive buzz in the press. Writing in Newsweek, Steven Levy described Google as a “high-tech version of the Oracle of Delphi, positioning everyone a mouse click away from the answers to the most arcane questions — and delivering simple answers so efficiently that the process becomes addictive.” It was around this time that “googling” — a verb form of the site synonymous with search — entered the common vernacular. The portal wars were still raging, but Google was poking its head up as a calm, precise alternative to the noise.
At the end of 1998, they were serving up ten thousand searches a day. A year later, that would jump to seven million a day. But quietly, behind the scenes, they began assembling the pieces of an empire.
As the web grew, technologists and journalists predicted the end of Google; they would never be able to keep up. But they did, outlasting a dying roster of competitors. In 2001, Excite went bankrupt, Lycos closed down, and Disney suspended Infoseek. Google climbed up and replaced them. It wouldn’t be until 2006 that Google would finally overtake Yahoo! as the number one website. But by then, the company would transform into something else entirely.
After securing another round of investment in 1999, Google moved into their new headquarters and brought on an army of new employees. The list of fresh recruits included former engineers at AltaVista, and leading artificial intelligence expert Peter Norvig. Google put an unprecedented focus on advancements in technology. Better servers. Faster spiders. Bigger indexes. The engineers inside Google invented a web infrastructure that had, up to that point, been only theoretical.
They trained their machines on new things, and new products. But regardless of the application, translation or email or pay-per-click advertising, they rested on the same premise. Machines can augment and re-imagine human intelligence, and they can do it at limitless scale. Google took the value proposition of artificial intelligence and brought it into the mainstream.
In 2001, Page and Brin brought in Silicon Valley veteran Eric Schmidt to run things as their CEO, a role he would occupy for a decade. He would oversee the company during its time of greatest growth and innovation. Google employee #4 Heather Cairns recalls his first days on the job. “He did this sort of public address with the company and he said, ‘I want you to know who your real competition is.’ He said, ‘It’s Microsoft.’ And everyone went, What?”
Bill Gates would later say, “In the search engine business, Google blew away the early innovators, just blew them away.” There would come a time when Google and Microsoft would come face to face. Eric Schmidt was correct about where Google was going. But it would take years for Microsoft to recognize Google as a threat. In the second half of the 1990’s, they were too busy looking in their rearview mirror at another Silicon Valley upstart that had swept the digital world. Microsoft’s coming war with Netscape would subsume the web for over half a decade.
The post Chapter 4: Search appeared first on CSS-Tricks.
arty-agatha-blog · 7 years
Text
WHAT IS ART?!?
In this blog post I will essentially be explaining and exploring what the word art means and represents, when the concept was founded, and how we now look at art differently compared to how it was once looked upon.
The dictionary definition of art states ‘the expression or application of human creative skill and imagination, typically in a visual form such as painting or sculpture, producing works to be appreciated primarily for their beauty or emotional power’. This is what art is best known for in the 20th century, whereas in the past it has been seen in many different lights, encompassing different styles of art and techniques. Art has many different meanings and can be molded into different art forms such as music and acting. If we use the term ‘fine art’, however, we instantly think of beautifully skilled paintings, drawings and sculpture; but as today's world evolves, artists have experimented and created new forms of art, such as animation and more graphical creations, which are not seen as detailed, traditionally styled work, and it is questioned whether or not they should be classed as ‘fine art’. But who is it that decides what is classed as art?
Art tends to be a reflection of the social, economic and historical happenings of its era; without these, art would not be a representation of the culture of its time. A major part of art history is the era of the Renaissance. Renaissance art is the painting and sculpture produced in the period of European history known as the Renaissance; this detailed fine art found its distinctive style in Italy in the early 1400s (artwork including Leonardo's vision of the Last Supper, and the ‘Creation of Adam’ painted by Michelangelo), and it also helped influence philosophy, science and music. Renaissance art affected the whole of Europe, influencing artists to develop their techniques and styles of artwork; this marked the passage from the medieval period to the very early stages of what we now perceive as ‘modern art’. The Renaissance reached its height in the late 15th and early 16th centuries, with the work of the famous Italian masters including Da Vinci, Raphael and Michelangelo.
In retrospect, the term ‘artist’ has also changed over the years and developed a new meaning, just as art has. In the 20th century an artist is usually expected to have imagination, originality and their own distinctive take on self-expression. We as artists have the freedom to create whatever we want without boundaries, using a wide range of materials and surfaces to express ourselves on. This hasn't always been the case: painting was the main source of imagery before photography burst onto the scene in the 1830s, so people paid artists to capture a moment and to create whatever the buyer requested. This meant that the artist had very little influence or creativity over what they were going to paint; usually the customer would be very precise, dictating the colours they would have to use, the position of their muse and so on. In the book ‘Art and its Histories’, Steve Edwards states (page 91): ‘The predominant way of envisaging the artist in our culture is a special kind of person who expresses their inborn genius and talent’. This shows that you need to look deeper into a painting and try to work out what the artist is trying to display and express in their work.
Art is constantly changing and expanding every day, and has been for many years. I feel that people today are more open to the idea of expression and change, which makes it a lot easier for artists to change the perception of the term ‘fine art’. This, I think, is due to the youth of today, because we are a lot more accepting of new ideas and change, and we are constantly changing the way we communicate with the world; this has been shown through many things such as fashion and music. Fashion and music are constantly changing, which I think influences the art scene to be more expressive and adventurous; fashion, music and art are all classed as art forms and all reflect a personal identity, an outlet to really express yourself as an individual, which is why there are some really articulate and artistic pieces of artwork, especially in the early 20th century.
As an individual, I love the more expressive and abstract style of artwork, and I have always been attracted to it more so than the classic, detailed ‘fine art’ paintings; I love the bold, expressive colours used and how the artwork links to the day-to-day life we live right now. I think more people are turning to the modern side of art because it is a huge part of today's galleries and exhibitions, but trends are forever changing and growing, and I believe people will tend to head towards whatever is popular at the time. However, I do think some people's opinions will not change and they will always see classical art as ‘real’ art compared to the more expressive and abstract side, because newer art may appear to require less skill, the art scene is constantly changing, and art produced thousands of years ago preserves the time frame it was created in, which can never be reproduced. Some people also find it easier to relate to classical paintings, which are a lot easier to make sense of, compared to more abstract art, because they cannot tell what it is about and what it relates to, and may need prompting by looking up what it represents.
To conclude my blog post: art is seen as different forms of expression and is created by artists, whether it be painting, drawing or sculpture. The term art is very broad and can be seen in many different lights depending on your own personal view. It openly connects different art forms together, letting people be expressive, and reveals the different processes by which the art world has evolved and revolutionised over the centuries.
thedrunkaffogato · 7 years
Text
Cooking is Chemistry 3: The slow burn
There are many reasons that thinking of cooking as chemistry might initially seem off the mark. Chemistry is often shown off to children as flashy and immediate to try and keep their attention. Cooking, on the other hand, is often seen as a daily drudgery.
Part of it being seen more often as craft than art has to do with how much time and effort it often requires just to make something you can eat. Trying to hurry it along into an instant reaction, like dropping baking soda in vinegar, usually leads, unfortunately, to equally explosive results.
This bored view of the kitchen has shaped cooking. It’s part of why culinary knowledge is actually so often in the hands of people that the work was pushed off to - women, people of color, and the working class. People that society views as not having much to contribute have long been expected to devote their time to this supposedly unfulfilling work.
Setting aside the tangled question of what qualifies as craft or art, whether you learn to cook in a diner or a premier culinary school you are absorbing the knowledge distilled by centuries of other people's patience. Any time you cook you are setting aside some of your own time as well, adding to that tally, and hopefully expanding the corpus with that dedication.
I think the omelette, especially given the debates surrounding it in many cooking circles, is the perfect example of where these issues all come together. My favorite type to make and to eat is one that requires continuous effort and rapt attention. When done right, however, it honors all the effort before it.
Eggs are arguably the most versatile ingredient in cooking. They come neatly divided into protein-based whites and a yellow bulb of rich fats. You can separate them or use both. Because they cook at comparatively low temperatures but don't burn until fairly high ones, there's a broad range of ways to make and serve them. They are fairly mild when it comes to flavor, so spices, salt, sour acids, bitter sides, and even sweeteners can be used alongside them.
The problem with this flexibility is that there's so much you can do with them that it can be paralyzing. Omelettes are often presented as uniquely complicated, but I think they're actually a very useful place to start because they take a simple approach towards most choices with the eggs. You don't separate the yolks and whites but simply cook the whole without too much agitation, especially at the end. Those easy options are essentially what define an omelette.
Omelettes are a contentious topic in the cooking world because like most other egg dishes they have such porous borders. They originated in medieval Catalonia, France, and what would become Italy, as close cousins to frittatas and quiches - still visible in the making of so-called Spanish "omelettes" to this day.
Unlike those more cake-like constructions, however, the idea of what makes an omelette is right in its name - a long-lost reference to the peasant Latin word “lamella”, a plate. In short, a flat surface of cooked egg. Medieval recipes make it sound as though these early omelettes were pretty extensively cooked before probably being folded over on themselves.
Modern omelettes come in diverse shapes and textures, but are all derived from that same tradition. The popular standard in both American and British cooking schools today calls for agitating the eggs while they're still fluid in the pan. This distributes them as much as possible while they begin to cook without turning them into separate pieces.
The openly admitted goal behind that is to make sure they cook as little as possible, leaving still-liquid proteins scattered throughout the omelette, to create an extremely soft texture. One of the other effects of this is that they also usually cook faster - partly because of the technique but partly because of the standards involved for what counts as “cooked”.
Few chefs discuss, however, that this is at odds with the original peasant tradition of slow and thorough cooking. Fewer still admit that the more extensively cooked versions remain popular in less costly restaurants, even though they take more time. Personally, I prefer omelettes slightly more solid than this modern professional style in any case. Perhaps that's just me, but maybe there’s some wisdom in this enduring popular tendency.
That being said, modern culinary schools are interested in avoiding overcooking eggs for a reason. The entire point of an omelette is to create a so-thin-it-folds egg dish, not the thickness of a quiche or frittata. A lot of half-pan omelettes in today’s diners are deliberately larger, to securely hold more and more filler ingredients. Those take longer and longer to cook all the way through, however.
Maybe you prefer omelettes on either end of the spectrum, but I think the best lie in the middle - solid but still creamy, able to be flavored by the ingredients it holds but not expanded into a quasi-quiche to hold them. So, here's my solution to how to make an ideal omelette, balanced between these extremes.
The first thing I think everyone should do when cooking an omelette in a new kitchen or with a new pan is to do a quick test. Often I will, as in this case, just use one egg to see how it cooks under these conditions.
What seems to virtually always work well is to heat the pan with both butter and oil in it on the lowest possible setting while preparing the eggs. The longer the cooking medium and pan itself are on the heat, the more evenly it distributes through them. Keeping it low keeps the oil and butter from burning or browning, however.
Since your goal is to create a thin and evenly cooked sheet of eggs (remember it’s a lamella!), having it only cover some of the pan thinly is better than covering the whole thickly.
With a flexible spatula, I recommend circling the rim the eggs form in the pan. This technique makes sure there's a mixture of oil and butter beneath the eggs (as they will stick to even supposedly non-stick pans). Ideally, it will also distribute some of the hot oils to the top of the eggs, helping them catch up on cooking. Once the eggs are freely unattached in the butter and oil, you can begin rolling them from one side of the pan to the other.
After you have them gathered up in one tight roll, I recommend experimenting with moving them within the pan or pouring the oil and butter over them for a few more seconds. The last bits of them that are still a bit liquid will firm up in this position, making a more secure omelette.
I don't think this is my finest one-egg omelette by far, as I poured the eggs into the pan a bit oddly. Whatever this one’s flaws, it's often easier with just one egg to perfect the technique and also account for what could be improved. If you think of cooking as a craft or art, this is practicing. If you think of it as chemistry, this is a trial experiment. In either case, it's essential.
However you think of it, what this shows is that one egg is most definitely not enough to create a full "plate" for the filling. The pan I'm using is about 10.5 inches in diameter at its top and tapers down to about 8.5 at its base. In all but the smallest of pans, I would recommend at least two large eggs to have enough of their proteins to create a stable omelette with filling.
When starting to use a filling there are a number of considerations you need to make. Here's what I started with yesterday.
I trimmed those broccolini stalks down somewhat, removing their leaves (which will cook too quickly) and the thicker parts of their stems (which will cook too slowly). Their buds, flowers, and smallest stems will cook fairly evenly, along with some ginger cut to be similarly fine. The rest I’ll save for other dishes this week.
With those prepared, I put them to the side. As before, I heated up the pan on the lowest possible setting with a very liberal amount of oil and butter while mixing the yolks and whites. Then, in they went.
Resist the temptation to raise the temperature - you need the time the lower temperature offers you.
As before, they don't quite fill up the whole of the pan, but that's not a problem. The bigger concern is whether they'll stick to the pan, which will affect the shape and increase the risk of burning. Once again, lightly pushing from below allows the butter and oil to go beneath them, making sure they're free floating in it. It also usually disturbs their surface, making sure they're covered in oil and butter from above as well.
It's important not to let too much of that mixture of butter and oil touch the top of the omelette, however, because you don't want it to solidify quite yet.
I added the chopped broccolini and ginger to the top of the omelette before it solidified, allowing these ingredients to sink slightly into the eggs.
Adding them in this way allows them to become incorporated into the omelette without directly touching the heat or appearing on what will become the surface of the omelette. When you mix them in with the raw eggs, it can end quite badly with cheese melting out of or mushrooms discoloring what will become the visible outside of the omelette.
At that point, the omelette had nearly hardened around those fillings. Its base was both already that solid and had been fully detached from the pan. It was ready to roll.
Once again, I let it sit a bit longer to set and solidify a bit more.
Now, after all that, you'll notice most of the oil and butter used in cooking is still in the pan. You want an omelette to be completely immersed in them to cook properly, but that means a lot is left behind. Even if you're quite careful, usually a few stray pieces of filling ingredients and bits of egg will be sitting in it too. That’s a very likely fire hazard if you reuse it.
What I find works best is to add some already cooked rice to catch those pieces and absorb some of the butter and oil. It only takes a small amount, so it's perfect as a side to the omelette. It quickly soaks that up, especially once the heat is turned up so that it can be done before the omelette gets too cold.
The resulting dish is the product of careful work over a not insignificant amount of time. Cooking requires practice with all of your tools at hand - your ingredients, your cookware, your kitchen - to understand how they work. It requires a bit of diligence in this specific lesson too.
To cook an omelette in this manner, you have to work with low heats and carefully interact with a pliable substance. Especially at first, you will likely have some difficulties. I know I did. The hard work pays dividends, however, when you can more and more easily prompt the reactions you want.
With this dish particularly, the quiet commitment never disappears entirely. The moments when you first pour your eggs into hot oil and butter are ones of frantic yet prolonged movement to make sure all goes well. There's always room to improve and new variations to explore. Perfection isn't a result, it's a process. Hopefully you can find the liberation in that, that less than ideal results are momentary frustrations you can learn from if you respond with care and contemplation.
By doing that with ingredients as humble as eggs, you can find a carefully calibrated heir to two omelette traditions. It's one thick enough to easily hold filling and yet pleasantly light, creamy from its lighter cooking yet stable and solid.
It's the product of generations of work, arguably all the way back to the first person to capture a wild chicken. So much of it is unsung.
Yesterday's breakfast was an omelette with broccolini and fresh ginger filling dusted with garam masala, fried rice with sriracha, and cinnamon-infused coffee with milk.
bluewatsons · 4 years
Conversation
Vincent Bevins, What Was the Child, The New Inquiry (January 17, 2018)
Vincent Bevins: What questions have you tried to answer about childhood? You point to the ways our changing public understanding of childhood serves specific purposes in our society. You also highlight what childhood’s shifting appearances in pop music can tell us about society.
Paul Rekret: Children perform an emotional role--Pleasure is taken in them, and this is usually grounded in a nostalgia for what one remembers fondly as a better time in one’s own life. Shulamith Firestone, from whom I’ve taken the book’s title, says this is natural. Given that life in capitalism is mostly drudgery, it’s therapeutic to imagine a period of relative pleasure and comfort. Children are also invested with hopes for the future, and this means they must be carefully guarded, trained, and controlled. But I also wanted to understand these feelings in their historical contexts. And it turns out they’ve changed quite fundamentally over time, in relation to the particular needs of capital and labor markets. Who counts as a child and how they ought to be treated have shifted quite dramatically since the emergence of capitalism. And, as it turns out, pop music has been a pretty good marker of changing experiences of childhood, while the changing ways childhood is represented in pop tell us a lot about music as well. Even more importantly, both pop music and childhood are understood and experienced, at least historically, through their segregation from waged labor. We associate both music and childhood with leisure and play. In that sense at least, they’re closely intertwined.
Vincent Bevins: Pop music is only really, what, 70 years old? Began in about the 1950s in the U.S.? What was going on with material human activity at the time?
Paul Rekret: Well, there are a lot of different ways we could define popular music, simply as music that is popular, for starters, and I’m not pretending to offer a holistic history. The particular thread through which I trace it starts with histories of work song which examine the way that industrialization ejected folk songs from work--Machinery was simply too loud to sing over, and factory bosses often banned singing as a means of disciplining workers. At that point, music becomes something one consumes in one’s leisure time, or, a bit later, professional singing is broadcast at you by the radio or TV in the evenings and weekends. So in this sense, the history of pop music rests with its isolation from work. Pop is not-work. This brings us to the idealized image of the postwar nuclear family. By that point, a clear set of borders had long emerged delineating the world of work and non-work, and the latter includes music. Obviously, this border is partly an illusion--School is mostly tedious work, domestic labor is lonely and hard, it’s also an undeniably gendered and deeply racialized ideal. But the point is that I want to locate pop within a longer history where music, play, and indeed women and children, are mainly excluded from the world of waged work and become associated, in part, with a world of leisure.
Vincent Bevins: You imply a few times that after children were removed from the labor market (relatively recently), our idea of childhood often served as a kind of inverse image of modern working life. How did we create our idea of childhood?
Paul Rekret: The ideal of childhood as we know it goes back to early modernity, as an emergent bourgeois social class drew on the notion of innocence partly as a way of liberating themselves from the authority of medieval superstitions and feudal hierarchy and also anchoring modern notions of progress to human biography. If you imagine children as a sort of blank slate, then their moral qualities are no longer viewed as innate but the product of development. So that implies that reasonable people ought to be free to make their own decisions and, also, that humanity as a whole is the agent and product of progress. Children become a living, walking, representation of history in miniature. But the notion of innocence, a property of children as well as colonial subjects, also legitimated continued authority over and careful management of development. The ideal of innocence was not really extended to working-class children. They were forced into work; this was viewed as a means of disciplining them, of inculcating bourgeois values in them. Not unlike in the present, in the 18th century working-class children hanging out in the streets all day were viewed with deep disquiet and fear. It’s only in the late 19th and early 20th century that, for a number of reasons (men’s resistance to children’s downward pressure on wages, the angst over autonomy won by working children, the need for a better-trained workforce), children are legally prohibited from most forms of waged work and schooling becomes compulsory. It’s at this moment that, as sociologist Viviana Zelizer argues, children become economically worthless. Young people become an economic liability, but the future they promise can offer parents a source of meaning for the toil they suffer. So I think it’s really only in the late 19th or early 20th century that childhood as we know it becomes generalized. I also think that ideal is once again in crisis today.
Vincent Bevins: What does all of this have to do with this pure Victorian idea of childhood you mention in the book that became so popular in 1960s rock? Where did that come from?
Paul Rekret: I actually think it’s more an Edwardian image of childhood that pervades the visual imagery of psychedelia, penny-farthings, village fairs, circuses, all the stuff that’s quite prominent in Sgt. Pepper’s–era Beatles for instance. I think the ’60s counterculture deployed a deeply romantic image of childhood as part of its broader demand for social renewal and change. This sort of ye-olde-circus imagery suited the ’60s perfectly. It stood in for the playful wonder, amusement, and sensuous intoxication with the world that the music sought to communicate. These are old tropes of renewal or revolt where children are concerned; one finds them in various forms in the long romantic tradition that runs from Rousseau to Walter Benjamin. Children question ceaselessly, their knowledge is tactile, and in this sense they have a redemptive quality. This not only suited the cultural politics of the epoch quite well but was also closely reflected in the music. Psych rock demanded immersion of the audience; songs were more like roving jams than tight, coherent pop melodies, and they had that sort of washing-over-you feeling we continue to associate with psychedelia. This was all part of a broader ethos of self-transformation and discovery. Childhood stood in for it particularly well.
Vincent Bevins: You say that in a few instances, young children are almost used to sneak dangerous Black music into the mainstream. Which understanding of childhood allows this to happen? How does it to happen?
Paul Rekret: Yeah, the Jackson 5 are the most well-known instance of this. Berry Gordy apparently quite cynically wanted to use Michael as a means of getting Motown Records, often associated with Black militancy, into white living rooms. But this isn’t the only example. If I remember this right, “Pass the Dutchie” was the first No. 1 reggae single in the UK and throughout Europe. All of this works because we see children as innocent, as vulnerable, and as passive. They’re not viewed as a threat. That said, one pervasive theme in the history of childhood is an anxiety over when exactly children are no longer children when they’ve become intentional actors responsible for their own actions and so no longer innocent. Or more broadly, who exactly counts as a child? The moment they’re not able or willing to be passive and vulnerable the child becomes the object of all kinds of forms of repression, from workhouses to juvenile prisons, curfews, or anti-gang ordinances. These can get pretty preposterous. Do you remember those sonic mosquito things that shops were putting in their doorways in London to deter children from loitering? At Turnpike Lane station they play classical music over the PA to deter loitering and antisocial behavior. What kind of deluded view of young people do you have to have to think that playing some Bach is going to send them scurrying off? Obviously, this accounting of childhood is deeply stratified by class and race. Poor or working-class kids, young people of color, are overwhelmingly more likely to be viewed as a threat, as poorly developed, or as adults and in need of carceral discipline and training, but it affects all children.
Vincent Bevins: You point out how much we love to publicize the narrative of the childhood star’s public breaking down as their innocence is lost. Britney’s shaved head is the most memorable example. What purpose does highlighting that narrative serve? Who gains?
Paul Rekret: I’d speculate that those sorts of breakdowns relate to a democratic need to see celebrities fail. But in the case of children, it has a more specific set of coordinates too. On the one hand, the child star is deeply seductive. They publicly perform and thus reproduce the ideal of innocence contemporary society cherishes. But we’re also aware on some level that these children are actors, that is to say, that they are working and that the innocence they put on display is affected. That means that these working child performers also represent a threat to our ideal of innocence. When we finally see images of them drunk, high, shaving their heads, or whatever, I think it serves for their audience as a sort of confirmation that they have sinned against innocence. In this way we can keep on consuming the child star’s posed vulnerability while being morally rewarded with images of their downfall. Best of both worlds, really.
Vincent Bevins: You write that for neoliberal subjects, there is no distinction between work and leisure. How did that collapse affect our public presentation of the child in contemporary music?
Paul Rekret: This is the central argument I want to make: if childhood as we know it emerges where children are segregated from waged work and are imbued with all these emotional qualities I’ve been talking about, then the current epoch of capitalism’s crisis seems to imply a deep crisis for the ideal of childhood too. Not only are work and play increasingly difficult to distinguish today, but the markers of childhood and adulthood are increasingly mixed up. Permanent employment, perhaps the ultimate marker of adulthood, grows increasingly unattainable for greater and greater portions of the population. Meanwhile, we’re asked to keep learning and “growing” throughout the period of our increasingly precarious attachment to work. It seems that as the capitalist wage relation undergoes fundamental mutations since the early 1980s, so too does the ideal of childhood become increasingly difficult to hold on to. I think this is what’s going on in so much of the popular music in recent decades. I think it’s particularly prevalent in what’s come to be known as “toytown techno”: anthemic rave tunes from the early- to mid-’90s that drew heavily on samples from public service ads and ’70s kids’ TV shows. These reflect a sort of mocking derision of the ideal of childhood by a generation increasingly abandoned by the welfare state. Something similar is going on in a lot of hard rock. I’m thinking of a song like Metallica’s “Enter Sandman.” I wonder if it’s also what underlies the themes of pathological childhoods, abuse, suicide, depression, in a lot of grunge around the same time. It’s even more pronounced in a lot of hip-hop, though in rap music the tone is much more confrontational. There’s often a playful ridiculing of the ideal of childhood, especially where children sing choruses to tracks like Trick Daddy’s “I’m a Thug,” “Hard Knock Life,” Gucci Mane’s “Lemonade,” and countless others.
Vincent Bevins: Do children exist?
Paul Rekret: If you mean is there a period in our biological and cognitive transformation that we associate with “childhood”? Then sure, yes obviously. But, at risk of profound cliché, I’d say it’s also a construct. Where does childhood stop? Why are we segregated from it? Why can’t we also play, explore, wonder? Not only does the ideal of childhood exclude the rest of us from its categories, but that very exclusion sets the rest of us up as complete and autonomous. That seems pretty problematic to me. Not only does it imply we are independent and demand we not be vulnerable or that we only play at allotted times and places, but it also legitimates our absolute authority over all those people we consider to be children. The most influential history of childhood is Philippe Ariès’s. He famously argues that there seemed to be no meaningful markers differentiating children in medieval society. They were simply, as he famously says, “miniature adults.” Ariès was writing in the early ’60s, though, and since then a lot of historians have complicated this view. But I think his basic point still stands, which is that childhood was far more of something like a spectrum wherein one was more or less vulnerable, more or less capable of undertaking particular tasks. But it wasn’t this totally separate sphere of life, as it is conceived today. It didn’t require special forms of dress, literature, a wholly unique culture. Those are all much more recent inventions. In any case, I think that childhood as we know it is a characteristic of capitalism. Obviously, there are points in our lives where we’re more or less vulnerable, but those aren’t necessarily reducible to age, and they ought not to justify unquestionable power over us, regardless of our age.
Vincent Bevins: Might we guess how the adult/child distinction might change or break down in a different postcapitalist or postneoliberal system, even a utopian world?
Paul Rekret: The utopian horizon against which the book is set is one where human activity is not determined by the compulsion to earn money to survive. So one way to think about this is as a world where labor and play are not separate. It would also be a world where our understandings of the differences between adults and children wouldn’t make any sense. I don’t really know what that world would look like, but at the very least, I know it would imply that play wouldn’t only be an activity engaged in by very young people or be isolated from other activity. I see the adult/child binary collapsing in the current era, but for nefarious reasons that have to do with collapsing profits and declining levels of employment. I nonetheless want to look at this crisis to see what might be learned from it about the contingency of these categories. But I should also add that as the crisis continues and there are less and less social resources to care for children, one sees an increasingly venomous attack upon them. There’s an attempt to exclude many young people from the category of childhood, by trying them as adults for instance, and, in doing so, preserving an ideal of innocence ever further from reality. It’s genuinely really hard to be a child today.
presta-hero · 6 years
Text
Essay for Love: Ancient Approach
I promised my grandma and grandpa that I would stay with them for a while in the summer months, after I finish my school term. When providing a definition of love, we find it appropriate to turn to an explanatory dictionary. For modern love essays, dictionaries give us distressing warnings.
The philosophical and anthropological problem of love emerged long ago, in ancient times. So here I am, moving along an old graveled road, breathing in the fresh rural air and looking forward to meeting my dear relatives. The idea that every other love grows out of love towards yourself would later start gaining popularity. In this place, a hundred miles from where I live, I finally feel at home.
The philosophy of love has been seen differently at every stage of human progress, depending on the historical and national environment. Many works about love point out that an individual was considered a part of society back then, which is why personal interests were strongly bound to public ones.
The final objective of love was to gain immortality, either by worshipping Aphrodite Pandemos and giving birth to children, or by worshipping Aphrodite Urania and pursuing intellectual work. Nonetheless, classical antiquity brings us precisely this knowledge about love.
We suggest one more scheme of classification, based on the specifics of a relationship between two subjects. My grandpa and grandma break into joy, and I come closer to embrace my nearest and dearest. What is more important, love changed its direction from a specific person to the whole of humanity. I can make out their house in the distance.
The Renaissance brings together all of the previous experience and gives birth to numerous treatises, in which humanism occupies the most important place, while love is treated as a simple human feeling. This is probably one of the most fascinating and common subjects in the history of philosophy. PurEssay offers exclusively quality writing from professionals in this field.
I will really need to remind my grandparents again to change the place where they keep the key. I cannot tell you how proud my grandparents are of their family home. Now it is time to go inside and meet my grandmother and grandfather. There exist many points of view on the moment when love first appeared.
As humans developed psychologically, love acquired new, complicated features and took on new forms and theoretical examination. Theirs is a two-storied, regal building cloaked in the distinctive atmosphere of the previous century. I enter a large living room, where my grandfather lies on a sofa, reading his newspaper, and my grandmother waters her house plants.
In the earliest period, human feelings were mainly connected with instincts, and love was not a distinct phenomenon. At the same time, they find a way to take care of their garden: a beautiful spot, full of fragrant flowers and branchy trees that cast their shadows on wooden benches. I like how the place stays cool in summer, resembling an authentic heaven on earth in a world of hellish heat.
My great-grandfather built this house for his family many decades ago, and he did his work conscientiously. A selfish person will not acknowledge the value of the other and is eager to satisfy only his or her own needs. Let us first of all try to classify our understanding of love and find out what kinds of it we can name.
As I approach closer, I can already make out smart windows, painted white. While my grandparents potter about inside or work out on the patio, the dog always manages to greet me before them. Then he recognizes me, my dear Oscar, and starts wagging his tail, running towards me. The ancient Greeks’ first concern was studying the world, and only after that came childbirth.
Oscar is actually the first one to fulfill me in this article. Really, no matter what hot here this time in the year. Really like originated due to socialization about instincts: intuition of self-preservation, which in such a way united people, and reproductive : instinct, combined with maternal sensations, which established certain sentimental closeness.
Any middle ages essay regarding love declares that enjoy towards The almighty is the exclusively true love, although chastity could be the only advantage. Almost every thinker one or more times tried to define or discuss love. Essay for Love: Ancient Approach
In that essay at love, PurEssay team will tell you how the perception of this thought evolved after a while. Unpretentious love the motivation to give up your own interests for the sake of your loved one. I fiddle together with heavy hair, soft and even pleasant to the touch; however , I find myself pity with the creature that must be suffering from tremendous heat less than this membrane of fashion.
Seeing that women were not fit meant for philosophy, relationship only preoccupied their partners from believing. In addition to Desire, there were a tad bit more gods of love in Artistic mythology exactly who played essential roles additionally. The origin of love presents a very disputable subject. This type of enjoy is ideal since it leads to balance and a normal state connected with human internal.
The house seems fabulous and unreal, as though it steered clear of from a fairy story and tried to get misplaced among different houses on the village yet failed. Of course, she has a great deal of them, the two inside and out of doors. The love in between, which usually presents often the golden imply, implying in which both people today present same value in a very relationship and tend to be eager to survive for each additional.
This matter is still topical even in the modern world, and now you will get to it. Foundation of love has got several ways to classifying idea. Our creating company finds it necessary to seek out some info in the tradition of early Greece. This process is usually indefinite for the reason that human trend still keeps going.
Your backyard is clean, only a floral population raise most of their buds into the sun, and also to the stones, as if looking for life-giving elements. I can odour freshly manufactured pastry someplace in the kitchen, along with a warm feeling of coziness covers my coronary heart. Physical violence and by simply seem to suppress love in the modern tradition.
But 2 men could achieve far more together, purpose Aristotle assumed that enjoy was comparable to friendship. Whenever i enter the front side gate, your pet dog looks at all of us suspiciously as well as barks once or twice. It truly is one often the presents of which my grandfather prepared personally when I had been little. Each epoch had their philosophers, research psychologists, sociologists together with other scientific brains who anxious themselves using love.
When i deeply inhale local atmosphere, and I stench a hint with flowers, quite possibly, my grandmother’s asters, nevertheless they may as well be any other think about because Me not very good during them. Love is put into the low, impolite Aphrodite Pandemos and the incredible Aphrodite Urania. Neo-Platonism develops, very, with its fans relying on Plato’s theories.
After I invest some time with the canine, I look into it for this grandparents, still I cannot find them any where outside. Industrial population is focused at consumerism; large production and even new technological know-how lead to a far more rationalized way of life. But only attitude could predetermine what kind of enjoy it was. Modern day lexicographers clarify love being a feeling of passion, but also because attraction this provokes sexual desire.
These kinds of prognoses will be dangerous to our society given that mechanized romances between people speak of inhumanity. This destination gets cozier every year. I wide open a heavy oak door along with a little crucial hidden below the welcome f?da. Some analysts believe that innovations in love would be rational together with deprived with emotions.
That is why many countries assist organizations and even movements of which promote standard family valuations. Our services make your school life simple and easy productive. The item depended on readers’ values, sociable patterns, perception of themselves and various other human beings.
This type of concepts like sympathy, benevolence and likely-hood became topical oils and element of that period. As I the actual building, the atmosphere around myself becomes awesome, and I like feeling precisely how my body relax down, as well. The v?ldigt bra Eros appears on phase; he is a strong unpredictable and even demanding the almighty, who evokes fear on other people perhaps even other gods.
Some people for their essays regarding love recommend dividing that into increased and decreased. In our short go about appreciate, we will keep to the opinion which love sprang out together with people because a people always expected communication plus close romances with very much the same beings. In the event ancient Greeks followed the principle of beauty, medieval modern society switched to principle for morality.
So , here, PurEssay provided the definition of love, enumerated it has the main models and explained to you what sort of notion of affection has evolved eventually. Later this attitude fades away, and the great start having interested in their very own feelings together with psychological factors. What I can tell for certain is that the stench is simply terrific, even though hardly ever perceptible.
These cases give agricultural soil towards the so-called purchaser love, as their characteristic characteristics are limited by low erectile culture and also lack of meekness. When it comes to Christianity, it gravely condemns human relationships outside photographer and care for corporal pastimes as low in addition to sinful. Our writers can create a identical essay in your academic demands or to create a work on a certain topic.
In Old love totally conformed that will religion. I am excited as always because immediately I am heading to my grandparents’ country bachelor’s pad. Other researchers single out unique variations of love, subject to its recipient: sexual like, paternal really enjoy, patriotic enjoy, love regarding God etc . The house is dealt with with a thick layer of light blue coloration that was clearly more saturated a year ago, still has disappeared due to the very hot sun with this territory.
My spouse and i still really like swinging there with a reserve in my present, pretending which drift off and then rise in remote countries. A single tree possibly even has a hand-made swing hanging from it. People start expressing out this emotion among ordre and even will fear it all.
It results emergence of lyric finery, which results in being the leading branch of poetry. The grandparents are generally constantly perfecting changing the interior of the house.
The post Essay for Love: Ancient Approach appeared first on Presta Hero.
0 notes
fyrapartnersearch · 5 years
Text
Looking for long-term M/M roleplays
Hey! I'm Kat, and I'm looking for some more roleplays. I'm in my early twenties, so no worries there, and I'm in the GMT+3 time-zone, though I tend to be up at odd hours and I'm often online. I mainly want to roleplay M/M right now.

I tend to write multi-para, and more often than not my replies are 700+ words. I don't mind shorter replies, as long as I get at least a few good-sized paragraphs and correct spelling/grammar. Mistakes happen and that's fine, but I don't want to read something with no punctuation that's nothing but mistakes. Also, please read this whole thing before sending me a message!! I ask that you don't just send "wanna RP?", because I won't know unless you tell me something you had in mind: a plot, an idea of mine you liked, anything really.

As for smut, I adore it. I don't want to write only smut for now, but anything from 20/80 to 80/20 on plot/smut ratio is good for me. Just tell me if you want more plot or smut. In M/M smut I prefer playing a submissive/bottom character. I can play a dominant character, but I don't enjoy it, so I hope you'll be willing to play an exclusively dominant role.

What I like:
- Medieval/historical settings (especially ancient Egypt/Rome/Greece)
- Forbidden love
- Arranged marriage
- Lots and lots of drama and dark themes
- But also fluffy scenes and cute/happy moments
- Mpreg (not a must at all, if you're not into it)
- Supernatural beings (werewolves, vampires, demons, gods etc.)
- Omega verse
- Role reversal (such as, for example, a bully getting himself in a situation where the bullied has complete control)

I'm rather reserved when it comes to modern day settings, but I can do those as well if there's a lot of action and drama involved. I prefer a plot-heavy story in a modern setting though.

Pairings I'd like to try (Dom/sub):
- Warlord/prince
- King or prince/prince in an arranged marriage setting
- Pirate or thief/nobility
- Samurai/geisha
- Nobility/prostitute
- Servant/nobility or royalty
- Guard/nobility or royalty
- Bullied/bully
- Nerd/popular student
- Stepbrothers
- Demon/angel
- Poor guy/rich guy in a Victorian era/early 20th century setting
- Mage/human (yes, I have just finished rewatching all the Harry Potter movies lol)
- Werewolf/human
- Professor/student

And many more fun things, but I can't remember everything off the top of my head. Feel free to suggest anything, really. I'm also very much into playing femboys/crossdressing characters, though if that's not your thing I can do other kinds of characters as well. I know it's a concern for many with these kinds of characters, so I'll promise my characters are never the "I-can-do-nothing-on-my-own-and-will-cry-at-the-drop-of-a-hat-and-whine-the-rest-of-the-time" blushing virgin, maiden in distress types. No need to worry about that.

I am busy a lot, so I might not always have time to reply every day or even every other day, but I try to be as active as I can. Feel free to poke me if it takes more than a week or so though.

A few plot ideas (MC = my character, YC = your character):

1) Inspired by the TV show "Lucifer". YC is the Devil himself, ruler of Hell, the first fallen angel. He has grown tired of the same old tortured souls and fires-of-damnation scenery, though, and decides it's time to visit Earth for a bit. A notorious playboy, seducer to sin, the owner of one of the hottest nightclubs in town is the image he creates for himself among humans. He grants wishes in exchange for favours, and soon enough everyone knows of him. MC is a struggling student, or someone who has just graduated and can't get on in life, and as a last resort goes to see YC. YC takes an immediate liking to him, and initial fascination quickly turns into something more... human. Love, maybe? Suddenly YC has to make a choice of whether or not he'll reveal who he truly is to the innocent human he has fallen for.

2) Once upon a time MC and YC were lovers, young and oh, so in love. They were happy together, planning their future, until one day YC disappeared without a trace and MC never saw him again. Until 10 years later: YC has inherited a large fortune from his uncle who had no family of his own, and one lonely evening he heads to a brothel to ease his longing for company. There, much to his shock, he is reunited with MC, who is a shell of what he once was. The bubbly, social human, always so full of life, has turned into someone with a haunted look in his eyes and a deep distrust of other people. Not able to leave MC there, YC buys him from the brothel and takes him home. Now he needs to decide what to do with him. (Historical setting)

3) (Omega verse, preferably mpreg included) MC is a rare kind of shifter, an omega desired by many. He was born into a different kind of prison: to a man who breeds only the best omegas for the royalty. He and the other omegas he lives with have never seen the outside world. They are kept safe behind locks in the innermost monastery on the castle grounds, where there's no chance of them getting out on their own. They are given to the harems of the royal family, or occasionally bought by the wealthiest of the wealthy. But MC wants more; he wants independence and a life of his own, rather than a life dedicated to fawning over an alpha with an ego big enough as it is. YC is an alpha who has made a considerable contribution to the kingdom (could be anything from being an honored soldier to being a famous artist, whatever you come up with) and who is being gifted one of these rare omegas by the royal family themselves. On his visit to the monastery to choose one of them, he takes a liking to MC, the spiteful little thing who can't seem to know when to shut up and who won't bat his lashes and swoon at everything YC does. It seems like MC will be getting a new home.

4) MC is a shifter (species can be discussed and decided on later) who has been separated from his pack and survived alone for a while now. He gets caught in the middle of a fight between YC's pack and YC's rival pack, and after - possibly accidentally - saving YC's life he is accepted into the pack. Some time passes, YC and MC grow closer, the suspicions some had about MC fade and MC feels he's starting to belong in a pack again, when he finds out his old pack has merged with YC's rival pack. Now he'll have to choose whether he is loyal to his new, or his old pack. (I would prefer this had mpreg, but again not a necessity)

5) YC is a shapeshifter, the leader of a clan of dragons. Dragons have long ago been thought extinct, but the truth is there are still some clans left. The problem is, with the dominant personalities of dragons, it's quite difficult to find a mate or a breeding partner. YC thinks to look for the solution outside the clan, to make humans their child-bearers. He picks MC as the first test subject. (Includes Mpreg)

6) Two countries have been at war since the beginning of time, as long as anyone can remember. All boys who come of age must join the army and go to war. MC knows he could not survive the war - he's never touched a sword in his life, never hurt anyone. He's not physically strong, nor does he have any knowledge of fighting. His family has already died because of the war, leaving him alone on a small farm. So, to avoid having to go, just before coming of age MC started disguising himself as a woman. For some years it has worked out well, and he's lived his life peacefully on his little farm, until the enemy forces take the city just outside of which MC lives. YC is a high-ranking officer (or the king) who takes an interest in MC, thinking he is a woman. Now MC must figure out what to do with the peculiar situation he finds himself in.

7) (A rare futuristic plotline I've been dying to do since I watched Black Mirror: Nosedive) People want good ratings on their pictures, on their posts and videos - on themselves. Everyone has a technological chip inserted into their eyes when they're born that lets them see how other people are rated. Only the "best" humans in society are rated 9 or higher overall - 5 or lower makes life Hell on Earth for a person. Anyone can rate anyone on their phones every time they interact in person. One's rating has a tremendous impact on their life: whether they get the job they want, whether they can apply to a certain school, even whether or not they can buy a house in a certain neighborhood... This system makes creating deep relationships nearly impossible, because people are too afraid to show who they really are in fear of being rated badly. MC is the youngest son of a well-off family, an ideal family where everyone is rated 8.9 or higher, loved by most people. YC is from a family who have never much cared about the system. They are decently rated, but they don't seem to care - what they care about is the honesty and real human relationships that are so hard to find nowadays. When MC and YC meet, MC is intrigued, but YC thinks MC is an empty shell only after numbers just like everyone else. Eventually, feelings start to develop between the two, but there are many problems to overcome, especially in their society.

That's it for now. I'm always happy to hear any ideas you might have as well, and all the ideas above can be modified or changed up a little!

My e-mail: [email protected]

Anyway, hope to hear from you soon!!
3 notes · View notes
topmixtrends · 7 years
Link
TERRY EAGLETON once remarked that regarding religion as an attempt to offer a scientific explanation of the world is rather like seeing ballet as a botched attempt to run for a bus. Eagleton gestures toward a confusion that often afflicts those who advocate for an essential conflict between science and religion — the assumption that the two enterprises are competing for the same explanatory territory. Were this to be true, conflict between them would indeed be pretty much inevitable. An alternative view holds science and religion to be essentially independent operations, concerned with quite different questions. On this model, conflict is unlikely. Equally, dialogue would be unnecessary, perhaps even impossible?
My initial expectation, on approaching Yves Gingras’s Science and Religion: An Impossible Dialogue, translated by Peter Keating, was that he might be undertaking an analysis of the conditions that would make a conversation between science and religion possible, and concluding that the requisite conditions could not be met. Something like this has been maintained by others, including the 19th-century churchman John Henry Newman, whose stance (as noted by Gingras) was that “Theology and Science, whether in their respective ideas, or again in their own actual fields, on the whole, are incommunicable, incapable of collision, and needing, at most to be connected, never to be reconciled.” An updated analysis along these lines would have been a welcome intervention into a field in which the merits of dialogue are often taken for granted. But the initial promise of the book’s title went unfulfilled. Gingras instead adopts an alternative and somewhat puzzling configuration: dialogue is impossible, and conflict inevitable.
With this curious combination in mind we turn to the two stated aims of the book: first, to explain how the issue of the relations between science and religion, along with calls for a dialogue between them, came to be a significant topic of discussion in the 1980s; second, to analyze the historical relations between science and religion as institutions in the Western world since the 17th century.
For Gingras, what connects the two tasks is the work of historians of science over the past 40 years, and the consensus among them that there is no overarching pattern to past science-religion relations — neither perpetual harmony, nor unremitting conflict, only complexity. Indeed, historians of science-religion relations now routinely speak of “the conflict myth,” a distant and discredited historiography that arose in the 19th century. Gingras seeks to challenge this consensus and reinstate the older conflict model. A focus on institutions, he believes, will reveal the underlying pattern of conflict that our present-day historians all seem to have overlooked in their faddish insistence on complexities of history.
As for the increasing profile of the idea of science-religion dialogue since the 1980s, this, too, is laid at the feet of the same historians, whose insistence on the complexity of past science-religion relations is linked to present calls for dialogue. Connecting both, and adding a further institutional dimension, is the activity of the Templeton Foundation, which invests considerable funds in the promotion of dialogue between science and religion. A central claim of Gingras’s book is that the Foundation has bankrolled “the new ‘industry’ of the history of the relations between science and religion.”
Gingras’s revisionary history of religious conflict begins in the 13th century, with ecclesiastical attempts to control what was taught, and by whom, in the universities. The most celebrated instance of putative conflict took place at the University of Paris in 1277, when bishop Stephen Tempier issued a condemnation of 219 propositions in theology and philosophy. The facts of the case are much as Gingras records them, but already we encounter the difficulty that few of the relevant propositions concern what we might identify as “science.” Moreover, if we take the institutional perspective that Gingras recommends, it is surely significant that it was the Church that founded the medieval universities in the first place. Tension between universities and clerics, and turf wars between faculties, are undeniable and to some extent inevitable. What is not obvious is that such tensions can be shoehorned into a simple science versus religion story, since “religion” and “science” were typically represented by individuals and institutions on both sides of the controversy.
This becomes even more apparent with Gingras’s recounting of the fact that the Paris condemnations were followed in England by a similar, if shorter, list of condemnations promulgated by Robert Kilwardby, then Archbishop of Canterbury. Kilwardby, it turns out, was one of the most acute philosophical minds of the time and an enthusiastic early adopter of aspects of Aristotelian thought. He wrote influential commentaries on the Greek philosopher’s logical writings and authored a popular encyclopedic introduction to the sciences. It is clear, moreover, that at least some of the condemned propositions were philosophical errors, and that in these instances the intention was to correct what we might call “scientific” misconceptions. Again, we can say that “science” was represented on both sides of the controversy. Heavy-handed use of such formal prohibitions might be an affront to modern sensibilities, but it is a mistake to regard them as emblematic of a religiously motivated hostility to science.
Another complication with the medieval story comes from the fact that opposition to Aristotelian philosophy would later become a hallmark of scientific innovation. One of the prohibitions (listed by Gingras) concerns the existence of a vacuum, since this was something that God could instantiate if he so wished. It has been plausibly suggested that such prohibitions liberated Christian thinkers from too slavish an adherence to Aristotle, and that the theological emphasis on the unrestrained power of God that lay behind a number of the prohibitions promoted counterfactual thinking and hypothetical reasoning. Pierre Duhem, one of the pioneers of the history of medieval science, went so far as to say that the 1277 condemnations mark the beginning of modern science. This claim has been contested by other historians, but it cannot be summarily dismissed as a case of “spurious reasoning.”
More generally, on a number of occasions, Gingras makes much of prohibitions and book censorship on the assumption that this is a sign of an enduring battle between science and religion, or at least between the institutions that stand in for them. But this reading results from a failure to understand the universality of regimes of censorship and their ultimate goal. Legislative restrictions placed on the expression of religious, political, moral — and, in a small minority of cases, scientific — views might have served to maintain the power of particular institutions, but their goal was also the preservation of social order. It is patently clear, moreover, that religious views were far more likely to be subjected to the coercive powers of the state (and, in those cases where it could exercise temporal power, the Church) than were scientific ones. The most determined and courageous instances of resistance to such attempts at control, overwhelmingly, were religiously motivated. The history of censorship, then, does not pick out anything distinctive about science and religion, since “religion” itself was the most common target of censorship.
This brings us to the Galileo affair, which makes a predictable appearance as a set piece. The basic details of the story are well known, and again Gingras does a creditable job of reconstructing them. Galileo was warned by the Inquisition in 1616 not to teach or defend the heliocentric hypothesis first propounded by Copernicus over 70 years before. Following the publication, in 1632, of an insufficiently ambiguous defense of Copernicanism, Galileo was placed on trial, and in the following year he was found guilty of vehement suspicion of heresy and ordered to recant. He did so and remained under house arrest until his death almost 10 years later.
This looks like an open and shut case of science versus religion. But there are complications. For a start, Galileo’s theory lacked proof, and his argument for the Earth’s motion based on a theory about the tides was simply wrong. Not only that, but the absence of observable stellar parallax provided apparently unassailable evidence against the motion of the Earth. The planetary model of Tycho Brahe, which had the planets orbiting the sun, and the sun orbiting a stationary Earth, offered a good compromise solution, and accounted for at least some of Galileo’s telescopic observations without the physical difficulties of putting the Earth into motion. In short, at this time there was no consensus in the scientific community about whether Galileo was right, and good reasons for thinking he was wrong. For its part, the Church was well informed on the relative merits of the various systems, and its support for the Tychonic model in the later 17th century was scientifically defensible.
Turning from science to religion, it may seem obvious that in this controversy the Inquisition will stand in for “religion.” But again, recall that the Inquisition was founded in 12th-century France to combat heresy, that its scope expanded following the Protestant Reformation, and that its most notorious activities on the Iberian Peninsula were directed against Jewish and Muslim converts. Considered in this light, the existence of the Inquisition better reflects conflict within religion, and not between “religion” in abstract and something else. Cathars, Waldensians, Protestants, Jews, and Muslims would quite understandably not consider the Inquisition to be representative of “religion” in some general sense, and neither should we.
Matters become even more complicated when we consider other institutions that were part of the Catholic Church. Mention has already been made of the medieval universities, which were the chief sites of scientific activity in the Latin Middle Ages. Subsequently, the Collegio Romano, founded in 1551, provided considerable institutional support for the sciences conducted by members of the Jesuit order, with a particular focus on astronomy and mathematics. The present-day Vatican Observatory, which traces its origins back to the Roman College, bears further witness to the Catholic Church’s sponsorship of astronomical research. In fact, between the 12th and 18th centuries the Catholic Church’s material and moral support for the study of astronomy was unmatched by any other institution. In light of this, the unfortunate prosecution of Galileo is beginning to look like the exception rather than the rule. Affording emblematic status to the Galileo affair is a little like proposing, on the basis of the Athenians’ equally notorious trial and execution of Socrates, that the ancient Greeks were implacably opposed to philosophy.
Gingras’s rehearsal of well-known historical episodes thus turns up nothing new, and his focus on institutions simply reinforces what historians of science have been saying all along: the historical picture is complicated, and while we can construct tensions that are analogous to our modern “science and religion,” conflict is neither inevitable nor does it constitute an enduring pattern.
Moving into the present, and continuing the theme of a focus on the role of institutions, Gingras advances the bold proposal that the historians’ insistence on the complexity of past science-religion relations can be attributed in large measure to the activities of the Templeton Foundation. This foundation is a charitable organization with six funding areas, one of which is “Science and the Big Questions.” A search of the Foundation’s database reveals a number of grants under this rubric that are indeed devoted to the topic of dialogue between science and religion. (Full disclosure: I have been the recipient of Templeton funding, although none of my books on the historical relations between science and religion have been supported by them.) This looks promising for the second main argument of the book. However, Gingras’s key claim is not that the Foundation has sponsored dialogue between science and religion — which, given its stated mission, is a dead giveaway — but that it “has also played a major role in foisting the theme of a ‘dialogue’ between science and religion onto the history of science [emphasis added].” There is nothing obvious about that claim, and in fact it turns out to be well wide of the mark.
Historians of science tend to cling to the old-fashioned idea that effects come after their causes. The canonical works that first began to dismantle the idea of a perennial conflict between science and religion — God and Nature (1986) edited by David Lindberg and Ronald Numbers, and John Hedley Brooke’s classic Science and Religion: Some Historical Perspectives (1991) — were written before the Templeton Foundation’s funding activities had begun to have an impact in the 1990s (the Foundation itself was not constituted until 1987). Earlier still was James R. Moore’s The Post-Darwinian Controversies (1979). This book was instrumental in identifying the 19th-century progenitors of the conflict thesis, conclusively laying bare its deficiencies, and showing how religious opposition to Darwinism had been greatly exaggerated. The insinuation that authors such as David Lindberg or John Hedley Brooke might have penned their books, written in the early 1990s, in the hope of winning a modest prize that was only instituted in 1996 (and only partly supported by the Templeton Foundation) is unfortunate.
Had Gingras pursued his investigations with a little more diligence, he would also have discovered that the Templeton Foundation explicitly excludes historical research from the list of activities that it supports — much to the chagrin of at least a few historians. One of its websites expresses it unequivocally: the Foundation “typically does NOT fund […] historical projects, unless such a project is done for the sake of clarifying some non-historical question that falls within the scope of eligible topics.” Not only were the key works that first exploded the conflict myth written before the Foundation began its activities, but also, given its stated policy, there could hardly have been a subsequent myth-busting “industry” conducted by historians of science and bankrolled by Templeton funds.
It is true that, against the run of play, a handful of books dealing with historical topics have received some support from the Templeton Foundation. Gingras lists a grand total of five titles, of which four are edited collections. The acknowledgments in these four collections suggest that modest funds were expended not to support primary historical research, but to convene a meeting of editors and contributors in order to ensure thematic coherence and uniformity of style in the final collection. Some of these works make no reference at all to dialogue between science and religion. Of those that mention it, none advocate it.
There are, almost certainly, numerous other books and articles that have received support from the Templeton Foundation and that do explicitly advocate dialogue between science and religion. Determining their number, authorship, disciplinary orientation, and reception would have been a good place to start for a researcher seeking to establish a connection between the activities of the Templeton Foundation and a rise in the prominence of the idea of science-religion dialogue. Why Gingras did not pursue that course of action remains a mystery to me.
His book, however, is not without merits. It is well written, has a clearly stated thesis, and is informed by a considerable amount of historical research. I admire the author’s courage in taking on a whole subfield of intellectual inquiry. But the volume falls well short of establishing any of its central claims, falters on key issues of historical interpretation, and ultimately fails to deliver on the promise of its title.
This is a shame. The thesis of an “impossible dialogue” is underrepresented in the literature and is worthy of more attention. In pursuing it by other means Gingras could have found common cause with figures such as John Henry Newman, and indeed with theologians who share his doubts about the value of natural theology, albeit for different reasons. But the idea of an impossible dialogue is not well served by the simple rehearsal of a discredited conflict narrative. As for the effectiveness of the Templeton Foundation in fostering dialogue, a more careful and fine-grained analysis is called for here, rather than gestures toward correlations and the deployment of evidence that barely rises above the anecdotal. Again, Professor Gingras could find common cause, in this case with an institution that would share his interest in establishing the effectiveness of its own activities. Perhaps for his next project he might consider applying for a grant.
¤
Peter Harrison is an Australian Laureate Fellow and director of the Institute for Advanced Studies at the University of Queensland.
The post From Conflict to Dialogue and All the Way Back appeared first on Los Angeles Review of Books.
from Los Angeles Review of Books http://ift.tt/2BIRpcP
0 notes
suzanneshannon · 4 years
Text
Chapter 4: Search
Previously in web history…
After an influx of rapid browser development following the creation of the web, Mosaic becomes the popular choice. Recognizing the commercial potential of the web, a team at O’Reilly builds GNN, the first commercial website. With something to browse with, and something to browse for, more and more people begin to turn to the web. Many create small, personal sites of their own. The best the web has to offer becomes almost impossible to find.
eBay had had enough of these spiders. They were fending them off by the thousands. Their servers buzzed with nonstop activity; a relentless stream of trespassers. One aggressor, however, towered above the rest. Bidder’s Edge, which billed itself as an auction aggregator, would routinely crawl the pages of eBay to extract its content and list it on its own site alongside other auction listings.
The famed auction site had unsuccessfully tried blocking Bidder’s Edge in the past. Like an elaborate game of Whac-A-Mole, they would restrict the IP address of a Bidder’s Edge server, only to be breached once again by a proxy server with a new one. Technology had failed. Litigation was next.
eBay filed suit against Bidder’s Edge in December of 1999, citing a handful of causes. That included “an ancient trespass theory known to legal scholars as trespass to chattels, basically a trespass or interference with real property — objects, animals, or, in this case, servers.” eBay, in other words, was arguing that Bidder’s Edge was trespassing — in the most medieval sense of that word — on their servers. In order for it to constitute trespass to chattels, eBay had to prove that the trespassers were causing harm. That their servers were buckling under the load, they argued, was evidence of that harm.
eBay in 1999
Judge Ronald M. Whyte found that last bit compelling. Quite a bit of back and forth followed, in one of the strangest lawsuits of a new era that included the phrase “rude robots” entering the official court record. These robots — as opposed to the “polite” ones — ignored eBay’s requests to block spidering on their sites, and made every attempt to circumvent counter measures. They were, by the judge’s estimation, trespassing. Whyte granted an injunction to stop Bidder’s Edge from crawling eBay until it was all sorted out.
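The line between "polite" and "rude" robots was, and still is, a voluntary one. Below is a minimal sketch in Python, assuming a hypothetical auction site and crawler name, of how a polite robot consults a site's robots.txt (the robots exclusion convention dating to 1994) before fetching anything:

```python
# A polite crawler checks robots.txt before fetching a page; a rude one skips
# this step entirely. The domain and user-agent below are invented examples.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://auction-site.example.com/robots.txt")
rp.read()  # download and parse the site's crawling rules

url = "https://auction-site.example.com/listings/antique-clock"
if rp.can_fetch("PoliteBot/1.0", url):
    print("Allowed to crawl:", url)
else:
    # Honoring the refusal is what makes a robot "polite"; nothing
    # technically prevents a crawler from fetching the page anyway.
    print("Site asks crawlers to skip:", url)
```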
Several appeals and countersuits and counter-appeals later, the matter was settled. Bidder’s Edge paid eBay an undisclosed amount and promptly shut their doors. eBay had won this particular battle. They had gotten rid of the robots. But the actual war was already lost. The robots — rude or otherwise — were already here.
If not for Stanford University, web search may have been lost. It is the birthplace of Yahoo!, Google and Excite. It ran the servers that ran the code that ran the first search engines. The founders of both Yahoo! and Google are alumni. But many of the most prominent players in search were not in the computer science department. They were in the symbolic systems program.
Symbolic systems was created at Stanford in 1985 as a study of the “relationship between natural and artificial systems that represent, process, and act on information.” Its interdisciplinary approach is rooted at the intersection of several fields: linguistics, mathematics, semiotics, psychology, philosophy, and computer science.
These are the same fields of study one would find at the heart of artificial intelligence research in the second half of the 20ᵗʰ century. But this isn’t A.I. in its modern smart home manifestation; it’s the more classical notion conceived by computer scientists as a roadmap to the future of computing technology. It is the understanding of machines as a way to augment the human mind. That parallel is not by accident. One of the most important areas of study at the symbolic systems program is artificial intelligence.
Numbered among the alumni of the program are several of the founders of Excite and Srinija Srinivasan, the fourth employee at Yahoo!. Her work in artificial intelligence led to a position at the ambitious A.I. research lab Cyc right out of college.
Marissa Mayer, an early employee at Google and, later, Yahoo!’s CEO, also drew on A.I. research during her time in the symbolic systems program. Her groundbreaking thesis project used natural language processing to help its users find the best flights through a simple conversation with a computer. “You look at how people learn, how people reason, and ask a computer to do the same things. It’s like studying the brain without the gore,” she would later say of the program.
Marissa Mayer in 1999
Search on the web stems from this one program at one institution at one brief moment in time. Not everyone involved in search engines studied that program — the founders of both Yahoo! and Google, for instance, were graduate students of computer science. But the ideology of search is deeply rooted in the tradition of artificial intelligence. The goal of search, after all, is to extract from the brain a question, and use machines to provide a suitable answer.
At Yahoo!, the principles of artificial intelligence acted as a guide, but it would be aided by human perspective. Web crawlers, like Excite, would bear the burden of users’ queries and attempt to map websites programmatically to provide intelligent results.
However, it would be at Google that A.I. would become an explicitly stated goal. Steven Levy, who wrote the authoritative book on the history of Google, In the Plex, describes Google as a “vehicle to realize the dream of artificial intelligence in augmenting humanity.” Founders Larry Page and Sergey Brin would mention A.I. constantly. They even brought it up in their first press conference.
The difference would be a matter of approach. A tension that would come to dominate search for half a decade. The directory versus the crawler. The precision of human influence versus the completeness of machines. Surfers would be on one side and, on the other, spiders. Only one would survive.
The first spiders were crude. They felt around in the dark until they found the edge of the web. Then they returned home. Sometimes they gathered little bits of information about the websites they crawled. In the beginning, they gathered nothing at all.
One of the earliest web crawlers was developed at MIT by Matthew Gray. He used his World Wide Wanderer to go and find every website on the web. He wasn’t interested in the content of those sites, he merely wanted to count them up. In the summer of 1993, the first time he sent his crawler out, it got to 130. A year later, it would count 3,000. By 1995, that number grew to just shy of 30,000.
Like many of his peers in the search engine business, Gray was a disciple of information retrieval, a subset of computer science dedicated to knowledge sharing. In practice, information retrieval often involves a robot (also known as “spiders, crawlers, wanderers, and worms”) that crawls through digital documents and programmatically collects their contents. They are then parsed and stored in a centralized “index,” a shortcut that eliminates the need to go and crawl every document each time a search is made. Keeping that index up to date is a constant struggle, and robots need to be vigilant; going back out and re-crawling information on a near constant basis.
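In code, that centralized index is usually an inverted index: a map from each word to the documents that contain it. Here is a minimal sketch of the crawl-parse-index-query loop described above, with invented documents standing in for crawled pages:

```python
# Build a tiny inverted index, then answer queries from the index rather
# than re-reading every document. The documents here are invented examples.
from collections import defaultdict

documents = {
    "stanford.edu": "stanford university research and teaching",
    "gnn.com": "the global network navigator lists the best of the web",
}

index = defaultdict(set)  # word -> ids of documents containing that word

def build_index(docs):
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)

def search(query):
    # Return only documents that contain every word in the query.
    word_sets = [index.get(word, set()) for word in query.lower().split()]
    return set.intersection(*word_sets) if word_sets else set()

build_index(documents)
print(search("stanford research"))  # {'stanford.edu'}
```

Keeping that index fresh is the hard part: the build step has to run again every time a robot re-crawls a page.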
The World Wide Web posed a problematic puzzle. Rather than a predictable set of documents, a theoretically infinite number of websites could live on the web. These needed to be stored in a central index — which would somehow be kept up to date. And most importantly, the content of those sites needed to be connected to whatever somebody wanted to search, on the fly and in seconds. The challenge proved irresistible for some information retrieval researchers and academics. People like Jonathan Fletcher.
Fletcher, a former graduate and IT employee at the University of Stirling in Scotland, didn’t like how hard it was to find websites. At the time, people relied on manual lists, like the WWW Virtual Library maintained at CERN, or Mosaic’s list of “What’s New” sites that they updated daily. Fletcher wanted to handle it differently. “With a degree in computing science and an idea that there had to be a better way, I decided to write something that would go and look for me.”
He built Jumpstation in 1993, one of the earliest examples of a searchable index. His crawler would go out, following as many links as it could, and bring them back to a searchable, centralized database. Then it would start over. To solve for the issue of the web’s limitless vastness, Fletcher began by crawling only the titles and some metadata from each webpage. That kept his index relatively small, but it also restricted search to the titles of pages.
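A rough illustration of that tradeoff, indexing only what appears between a page's title tags instead of its full text, might look like the sketch below. This is a reconstruction of the idea, not Jumpstation's actual code:

```python
# Extract just the <title> of a page so that only the title, not the full
# body text, goes into the search index.
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

page = "<html><head><title>JumpStation Demo Page</title></head><body>lots of text</body></html>"
extractor = TitleExtractor()
extractor.feed(page)
print(extractor.title)  # only this string would be indexed
```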
Fletcher was not alone. After tinkering for several months, WebCrawler launched in April of 1994 out of the University of Washington. It holds the distinction of being the first search engine to crawl entire webpages and make them searchable. By November of that year, WebCrawler had served 1 million queries. At Carnegie Mellon, Michael Mauldin released his own spider-based search engine variant named for the Latin translation of wolf spider, Lycos. By 1995, it had indexed over a million webpages.
Search didn’t stay in universities long. Search engines had a unique utility for wayward web users on the hunt for the perfect site. Many users started their web sessions on a search engine. Netscape Navigator — the number one browser for new web users — connected users directly to search engines on their homepage. Getting listed by Netscape meant eyeballs. And eyeballs meant lucrative advertising deals.
In the second half of the 1990’s, a number of major players entered the search engine market. InfoSeek, initially a paid search option, was picked up by Disney, and soon became the default search engine for Netscape. AOL swooped in and purchased WebCrawler as part of a bold strategy to remain competitive on the web. Lycos was purchased by a venture capitalist who transformed it into a fully commercial enterprise.
Excite.com, another crawler started by Stanford alumni and a rising star in the search engine game for its depth and accuracy of results, was offered three million dollars not long after they launched. Its six co-founders lined up two couches, one across from another, and talked it out all night. They decided to stick with the product and bring in a new CEO. There would be many more millions to be made.
Excite in 1996
AltaVista, already a bit late to the game at the end of 1995, was created by the Digital Equipment Corporation. It was initially built to demonstrate the processing power of DEC computers. They quickly realized that their multithreaded crawler was able to index websites at a far quicker rate than their competitors. AltaVista would routinely deploy its crawlers — what one researcher referred to as a “brood of spiders” — to index thousands of sites at a time.
As a result, AltaVista was able to index virtually the entire web, nearly 10 million webpages at launch. By the following year, in 1996, they’d be indexing over 100 million. Because of the efficiency and performance of their machines, AltaVista was able to solve the scalability problem. Unlike some of their predecessors, they were able to make the full content of websites searchable, and they re-crawled sites every few weeks, a much more rapid pace than early competitors, who could take months to update their index. They set the standard for the depth and scope of web crawlers.
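The benefit of a multithreaded crawler is easy to see in miniature: many fetches proceed at once instead of one after another. Here is a toy sketch, with stand-in URLs and a simulated network delay rather than real downloads:

```python
# Fetch many pages concurrently with a pool of worker threads. The URLs are
# placeholders and the sleep stands in for network latency.
import time
from concurrent.futures import ThreadPoolExecutor

urls = [f"https://site-{n}.example.com/" for n in range(100)]

def fetch(url):
    time.sleep(0.1)  # pretend this is a network round trip
    return url, "<html>...</html>"

start = time.time()
with ThreadPoolExecutor(max_workers=20) as pool:
    pages = list(pool.map(fetch, urls))
print(f"Fetched {len(pages)} pages in {time.time() - start:.1f}s")
# One at a time this would take about 10 seconds; with 20 workers, roughly half a second.
```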
AltaVista in 1996
Never fully at rest, AltaVista used its search engine as a tool for innovation, experimenting with natural language processing, translation tools, and multi-lingual search. They were often ahead of their time, offering video and image search years before that would come to be an expected feature.
Those spiders that had not been swept up in the fervor couldn’t keep up. The universities hosting the first search engines were not at all pleased to see their internet connections bloated with traffic that wasn’t even related to the university. Most universities forced the first experimental search engines, like Jumpstation, to shut down. Except, that is, at Stanford.
Stanford’s history with technological innovation begins in the second half of the 20th century. The university was, at that point, teetering on the edge of becoming a second-tier institution. They had been losing ground and lucrative contracts to their competitors on the East Coast. Harvard and MIT became the sites of a groundswell of research in the wake of World War II. Stanford was being left behind.
In 1951, in a bid to reverse course on their downward trajectory, Dean of Engineering Frederick Terman brokered a deal with the city of Palo Alto. Stanford University agreed to annex 700 acres of land for a new industrial park that upstart companies in California could use. Stanford would get proximity to energetic innovation. The businesses that chose to move there would gain unique access to the Stanford student body for use on their product development. And the city of Palo Alto would get an influx of new taxes.
Hewlett-Packard was one of the first companies to move in. They ushered in a new era of computing-focused industry that would soon be known as Silicon Valley. The Stanford Industrial Park (later renamed the Stanford Research Park) would eventually host Xerox during a time of rapid success and experimentation. Facebook would spend their nascent years there, growing into the behemoth it would become. At the center of it all was Stanford.
The research park transformed the university from one of stagnation to a site of entrepreneurship and cutting-edge technology. It put them at the heart of the tech industry. Stanford would embed itself — both logistically and financially — in the crucial technological developments of the second half of the 20ᵗʰ century, including the internet and the World Wide Web.
The potential success of Yahoo!, therefore, did not go unnoticed.
Jerry Yang and David Filo were not supposed to be working on Yahoo!. They were, however, supposed to be working together. They had met years ago, when David was Jerry’s teaching assistant in the Stanford computer science program. Yang eventually joined Filo as a graduate student and — after building a strong rapport — they soon found themselves working on a project together.
As they crammed themselves into a university trailer to begin working through their doctoral project, their relationship become what Yang has often described as perfectly balanced. “We’re both extremely tolerant of each other, but extremely critical of everything else. We’re both extremely stubborn, but very unstubborn when it comes to just understanding where we need to go. We give each other the space we need, but also help each other when we need it.”
In 1994, Filo showed Yang the web. In just a single moment, their focus shifted. They pushed their intended computer science thesis to the side, procrastinating on it by immersing themselves into the depths of the World Wide Web. Days turned into weeks which turned into months of surfing the web and trading links. The two eventually decided to combine their lists in a single place, a website hosted on their Stanford internet connection. It was called Jerry and David’s Guide to the World Wide Web, launched first to Stanford students in 1993 and then to the world in January of 1994. As catchy as that name wasn’t, the idea (and traffic) took off as friends shared with other friends.
Jerry and David’s Guide was a directory. Like the virtual library started at CERN, Yang and Filo organized websites into various categories that they made up on the fly. Some of these categories had strange or salacious names. Others were exactly what you might expect. When one category got too big, they split it apart. It was ad-hoc and clumsy, but not without charm. Through their classifications, Yang and Filo had given their site a personality. Their personality. In later years, Yang would commonly refer to this as the “voice of Yahoo!”
That voice became a guide — as the site’s original name suggested — for new users of the web. Their web crawling competitors were far more adept at the art of indexing millions of sites at a time. Yang and Filo’s site featured only a small subset of the web. But it was, at least by their estimation, the best of what the web had to offer. It was the cool web. It was also a web far easier to navigate than ever before.
Jerry Yang (left) and David Filo (right) in 1995 (Yahoo, via Flickr)
At the end of 1994, Yang and Filo renamed their site to Yahoo! (an awkward forced acronym for Yet Another Hierarchical Officious Oracle). By then, they were getting almost a hundred thousand hits a day, sometimes temporarily taking down Stanford’s internet in the process. Most other universities would have closed down the site and told them to get back to work. But not Stanford. Stanford had spent decades preparing for on-campus businesses just like this one. They kept the server running, and encouraged its creators to stake their own path in Silicon Valley.
Throughout 1994, Netscape had included Yahoo! in their browser. There was a button in the toolbar labeled “Net Directory” that linked directly to Yahoo!. Marc Andreessen, believing in the site’s future, agreed to host their website on Netscape’s servers until they were able to get on steady ground.
Yahoo! homepage in Netscape Navigator, circa 1994
Yang and Filo rolled up their sleeves, and began talking to investors. It wouldn’t take long. By the spring of 1996, they would have a new CEO and hold their own record-setting IPO, outstripping even their gracious host, Netscape. By then, they became the most popular destination on the web by a wide margin.
In the meantime, the web had grown far beyond the grasp of two friends swapping links. They had managed to categorize tens of thousands of sites, but there were hundreds of thousands more to crawl. “I picture Jerry Yang as Charlie Chaplin in Modern Times,” one journalist described, “confronted with an endless stream of new work that is only increasing in speed.” The task of organizing sites would have to go to somebody else. Yang and Filo found help in a fellow Stanford alum, someone they had met years ago while studying abroad together in Japan: Srinija Srinivasan, a graduate of the symbolic systems program. Many of the earliest hires at Yahoo! were given slightly absurd titles that always ended in “Yahoo.” Yang and Filo went by Chief Yahoos. Srinivasan’s job title was Ontological Yahoo.
That is a deliberate and precise job title, and it was not selected by accident. Ontology is the study of being, an attempt to break the world into its component parts. It has manifested in many traditions throughout history and the world, but it is most closely associated with the followers of Socrates, in the work of Plato, and later in the groundbreaking text Metaphysics, written by Aristotle. Ontology asks the question “What exists?” and uses it as a thought experiment to construct an ideology of being and essence.
As computers blinked into existence, ontology found a new meaning in the emerging field of artificial intelligence. It was adapted to fit the more formal hierarchical categorizations required for a machine to see the world; to think about the world. Ontology became a fundamental way to describe the way intelligent machines break things down into categories and share knowledge.
The dueling definitions of the ontology of metaphysics and computer science would have been familiar to Srinija Srinivasan from her time at Stanford. The combination of philosophy and artificial intelligence in her studies gave her a unique perspective on hierarchical classifications. It was this experience that she brought to her first job after college at the Cyc Project, an artificial intelligence research lab with a bold project: to teach a computer common sense.
Srinija Srinivasan (Getty Images/James D. Wilson)
At Yahoo!, her task was no less bold. When someone looked for something on the site, they didn’t want back a random list of relevant results. They wanted the result they were actually thinking about, but didn’t quite know how to describe. Yahoo! had to — in a manner of seconds — figure out what its users really wanted. Much like her work in artificial intelligence, Srinivasan needed to teach Yahoo! how to think about a query and infer the right results.
To do that, she would need to expand the voice of Yahoo! to thousands of more websites in dozens of categories and sub-categories without losing the point of view established by Jerry and David. She would need to scale that perspective. “This is not a perfunctory file-keeping exercise. This is defining the nature of being,” she once said of her project. “Categories and classifications are the basis for each of our worldviews.”
At a steady pace, she mapped an ontology of human experience onto the site. She began breaking up the makeshift categories she inherited from the site’s creators, re-constituting them into more concrete and findable indexes. She created new categories and destroyed old ones. She sub-divided existing subjects into new, more precise ones. She began cross-linking results so that they could live within multiple categories. Within a few months she had overhauled the site with a fresh hierarchy.
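Underneath the editorial judgment, the data structure is simple: a hierarchy of named category paths, with a single site allowed to appear under more than one of them. A small sketch, with invented categories and sites, of what that cross-linked directory might look like:

```python
# A hand-curated directory: category paths map to lists of sites, and
# cross-linking means one site can live under several paths at once.
# Categories and sites here are invented examples.
directory = {
    "Recreation/Travel": ["virtual-tourist.example.com"],
    "Regional/Countries/Japan": ["virtual-tourist.example.com", "japan-guide.example.com"],
    "Science/Astronomy": ["skywatch.example.com"],
}

def categories_for(site):
    # Every branch of the hierarchy where the site has been filed.
    return [path for path, sites in directory.items() if site in sites]

print(categories_for("virtual-tourist.example.com"))
# ['Recreation/Travel', 'Regional/Countries/Japan']
```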
That hierarchical ontology, however, was merely a guideline. The strength of Yahoo!’s expansion lay in the 50 or so content managers she had hired in the meantime. They were known as surfers. Their job was to surf the web — and organize it.
Each surfer was coached in the methodology of Yahoo! but was left with a surprising amount of editorial freedom. They cultivated the directory with their own interests, meticulously deliberating over websites and where they belonged. Each decision could be strenuous, and there were missteps and incorrectly categorized items along the way. But by allowing individual personality to dictate hierarchical choices, Yahoo! retained its voice.
They gathered as many sites as they could, adding hundreds each day. Yahoo! surfers did not reveal everything on the web to their site’s visitors. They showed them what was cool. And that meant everything to users grasping for the very first time what the web could do.
At the end of 1995, the Yahoo! staff was watching their traffic closely. Huddled around consoles, employees would check their logs again and again, looking for a drop in visitors. Yahoo! had been the destination for the “Internet Directory” button on Netscape for years. It had been the source of their growth and traffic. Netscape had made the decision, at the last minute (and seemingly at random), to drop Yahoo!, replacing them with the new kids on the block, Excite.com. Best case scenario: a manageable drop. Worst case: the demise of Yahoo!.
But the drop never came. A day went by, and then another. And then a week. And then a few weeks. And Yahoo! remained the most popular website. Tim Brady, one of Yahoo!’s first employees, describes the moment with earnest surprise. “It was like the floor was pulled out in a matter of two days, and we were still standing. We were looking around, waiting for things to collapse in a lot of ways. And we were just like, I guess we’re on our own now.”
Netscape wouldn’t keep their directory button exclusive for long. By 1996, they would begin allowing other search engines to be listed on their browser’s “search” feature: for a fee, a search engine could appear in a drop-down of options a user could choose from. Yahoo! bought themselves back into the drop-down. They were joined by four other search engines: Lycos, InfoSeek, Excite, and AltaVista.
By that time, Yahoo! was the unrivaled leader. It had transformed its first-mover advantage into a new strategy, one bolstered by a successful IPO and an influx of new investment. Yahoo! wanted to be much more than a simple search engine. Their site’s transformation would eventually be called a portal. It was a central location for every possible need on the web. Through a number of product expansions and aggressive acquisitions, Yahoo! released a new suite of branded digital products. Need to send an email? Try Yahoo! Mail. Looking to create a website? There’s Yahoo! Geocities. Want to track your schedule? Use Yahoo! Calendar. And on and on the list went.
[Image: Yahoo! in 1996]
Competitors rushed to fill the vacuum of the #2 slot. In April of 1996, Yahoo!, Lycos and Excite all went public to soaring stock prices. Infoseek had their initial offering only a few months later. Big deals collided with bold blueprints for the future. Excite began positioning itself as a more vibrant alternative to Yahoo! with more accurate search results from a larger slice of the web. Lycos, meanwhile, all but abandoned the search engine that had brought them initial success to chase after the portal-based game plan that had been a windfall for Yahoo!.
The media dubbed the competition the “portal wars,” a fleeting moment in web history when millions of dollars poured into a single strategy: to be the biggest, best, most centralized portal for web surfers. Any service that offered users a destination on the web was thrown into the arena. Nothing short of the future of the web (and a billion-dollar advertising industry) was at stake.
In some ways, though, the portal wars were over before they started. When Excite announced a gigantic merger with @Home, an Internet Service Provider, to combine their services, not everyone thought it was a wise move. “AOL and Yahoo! were already in the lead,” one investor and cable industry veteran noted, “and there was no room for a number three portal.” AOL had just enough muscle and influence to elbow their way into the #2 slot, nipping at the heels of Yahoo!. Everyone else would have to go toe-to-toe with Goliath. None were ever able to pull it off.
Battling their way to market dominance, most search engines had simply lost track of search. Buried somewhere next to your email and stock ticker and sports feed was, in most cases, a second-rate search engine you could use to find things — only not often and not well. That’s why it was so refreshing when another search engine out of Stanford launched with just a single search box and two buttons, its bright and multicolored logo plastered across the top.
A few short years after it launched, Google was on the shortlist of most popular sites. In an interview with PBS Newshour in 2002, co-founder Larry Page described their long-term vision. “And, actually, the ultimate search engine, which would understand, you know, exactly what you wanted when you typed in a query, and it would give you the exact right thing back, in computer science we call that artificial intelligence.”
Google could have started anywhere. It could have started with anything. One employee recalls an early conversation with the site’s founders where he was told “we are not really interested in search. We are making an A.I.” Larry Page and Sergey Brin, the creators of Google, were not trying to create the web’s greatest search engine. They were trying to create the web’s most intelligent website. Search was only their most logical starting point.
Imprecise and clumsy, the spider-based search engines of 1996 faced an uphill battle. AltaVista had proved that the entirety of the web, tens of millions of webpages, could be indexed. But unless you knew your way around a few boolean logic commands, it was hard to get the computer to return the right results. The robots were not yet ready to infer, in Page’s words, “exactly what you wanted.”
Yahoo! had filled in these cracks of technology with their surfers. The surfers were able to course-correct the computers, designing their directory piece by piece rather than relying on an algorithm. Yahoo! became an arbiter of a certain kind of online chic; tastemakers reimagined for the information age. The surfers of Yahoo! set trends that would last for years. Your site would live or die by their hand. Machines couldn’t do that work on their own. If you wanted your machines to be intelligent, you needed people to guide them.
Page and Brin disagreed. They believed that computers could handle the problem just fine. And they aimed to prove it.
That unflappable confidence would come to define Google far more than their “don’t be evil” motto. In the beginning, their laser focus on designing a different future for the web would leave them blind to the day-to-day grind of the present. On not one but two occasions, checks made out to the company for hundreds of thousands of dollars were left in desk drawers or car trunks until somebody finally made the time to deposit them. And they often did things differently. Google’s offices, for instance, were built to simulate a college dorm, an environment the founders felt was most conducive to big ideas.
Google would eventually build a literal empire on top of a sophisticated, world-class infrastructure of their own design, fueled by the most elaborate and complex (and arguably invasive) advertising mechanism ever built. There are few companies that loom as large as Google. This one, like others, started at Stanford.
Even among the most renowned artificial intelligence experts, Terry Winograd, a computer scientist and Stanford professor, stands out in the crowd. He was also Larry Page’s advisor and mentor when Page was a graduate student in the computer science department. Winograd has often recalled the unorthodox and unique proposals he would receive from Page for his thesis project, some of which involved “space tethers or solar kites.” “It was science fiction more than computer science,” he would later remark.
But for all of his fanciful flights of imagination, Page always returned to the World Wide Web. He found its hyperlink structure mesmerizing. Its one-way links — a crucial ingredient in the web’s success — had led to a colossal proliferation of new websites. In 1996, when Page first began looking at the web, there were tens of thousands of sites being added every week. The master stroke of the web was to enable links that only traveled in one direction. That allowed the web to be decentralized, but without a central database tracking links, it was nearly impossible to collect a list of all of the sites that linked to a particular webpage. Page wanted to build a graph of who was linking to who; an index he could use to cross-reference related websites.
Page understood that the hyperlink was a digital analog to academic citations. A key indicator of the value of a particular academic paper is the number of times it has been cited. If a paper is cited often (by other high-quality papers), it is easier to vouch for its reliability. The web works the same way. The more often your site is linked to (what’s known as a backlink), the more dependable and accurate it is likely to be.
Theoretically, you can determine the value of a website by adding up all of the other websites that link to it. That’s only one layer, though. If 100 sites link back to you, but each of them has only ever been linked to one time, that’s far less valuable than if five sites that each have been linked to 100 times link back to you. So it’s not simply how many links you have, but the quality of those links. If you take both of those dimensions and aggregate sites using backlinks as a criterion, you can very quickly start to assemble a list of sites ordered by quality.
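To make that arithmetic concrete, here is a tiny sketch (in Python, with invented numbers) of the “one layer” version of the idea, where each backlink is weighted by how many inbound links its source has:

```python
# Toy illustration of weighting backlinks by the popularity of their source.
# All numbers are invented; this shows only the single-layer version of the idea.

def weighted_backlink_score(source_link_counts):
    """source_link_counts: inbound link count of each page that links to you."""
    return sum(source_link_counts)

# 100 sites link to page A, but each of them has only one inbound link itself.
page_a = weighted_backlink_score([1] * 100)   # 100

# Only 5 sites link to page B, but each of those has 100 inbound links.
page_b = weighted_backlink_score([100] * 5)   # 500

print(page_a, page_b)  # page B wins despite having far fewer backlinks
```

Of course, those weights are themselves just link counts, which is exactly the recursion described next.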
John Battelle describes the technical challenge facing Page in his own retelling of the Google story, The Search.
Page realized that a raw count of links to a page would be a useful guide to that page’s rank. He also saw that each link needed its own ranking, based on the link count of its originating page. But such an approach creates a difficult and recursive mathematical challenge — you not only have to count a particular page’s links, you also have to count the links attached to the links. The math gets complicated rather quickly.
Fortunately, Page already knew a math prodigy. Sergey Brin had proven his brilliance to the world a number of times before he began a doctoral program in the Stanford computer science department. Brin and Page had crossed paths on several occasions, a relationship that began on rocky ground but grew towards mutual respect. The mathematical puzzle at the center of Page’s idea was far too enticing for Brin to pass up.
He got to work on a solution. “Basically we convert the entire Web into a big equation, with several hundred million variables,” he would later explain, “which are the page ranks of all the Web pages, and billions of terms, which are the links. And we’re able to solve that equation.” Scott Hassan, the seldom talked about third co-founder of Google who developed their first web crawler, summed it up a bit more concisely, describing Google’s algorithm as an attempt to “surf the web backward!”
The result was PageRank — as in Larry Page, not webpage. Brin, Page, and Hassan developed an algorithm that could trace backlinks of a site to determine the quality of a particular webpage. The higher the value of a site’s backlinks, the higher up the rankings it climbed. They had discovered what so many others had missed. If you trained a machine on the right source — backlinks — you could get remarkable results.
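The core of the computation can be sketched as a standard power iteration over a link graph. The tiny graph, damping factor, and iteration count below are illustrative choices, not anything from Google’s production system; it is a minimal model of the fixed-point idea Brin describes, not the real implementation.

```python
# A minimal power-iteration PageRank over a tiny, invented link graph.
# links[page] is the set of pages that `page` links out to.
links = {
    "a": {"b", "c"},
    "b": {"c"},
    "c": {"a"},
    "d": {"c"},
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}

    for _ in range(iterations):
        new_rank = {}
        for page in pages:
            # Each page gets a small baseline share, plus a slice of the rank
            # of every page that links to it, split across that page's
            # outgoing links.
            incoming = sum(
                rank[other] / len(links[other])
                for other in pages
                if page in links[other]
            )
            new_rank[page] = (1 - damping) / n + damping * incoming
        rank = new_rank
    return rank

print(pagerank(links))  # "c" ends up with the highest rank in this toy graph
```

At web scale the same fixed point becomes Brin’s “big equation, with several hundred million variables,” but the structure of the calculation is unchanged.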
It was only later, once they began matching their rankings to search queries, that they realized PageRank fit best in a search engine. They called their search engine Google. It was launched on Stanford’s internet connection in August of 1996.
[Image: Google in 1998]
Google solved the relevancy problem that had plagued online search since its earliest days. Crawlers like Lycos, AltaVista and Excite were able to provide a list of webpages that matched a particular search. They just weren’t able to sort them right, so you had to go digging to find the result you wanted. Google’s rankings were immediately relevant. The first page of your search usually had what you needed. They were so confident in their results they added an ��I’m Feeling Lucky” button which took users directly to the first result for their search.
Google’s growth in their early days was not unlike Yahoo!’s in theirs. They spread through word of mouth, from friends to friends of friends. By 1997, they had grown big enough to put a strain on the Stanford network, something Yang and Filo had done only a couple of years earlier. Stanford once again recognized the possibility. It did not push Google off its servers. Instead, Stanford’s advisors pushed Page and Brin in a commercial direction.
Initially, the founders sought to sell or license their algorithm to other search engines. They took meetings with Yahoo!, Infoseek and Excite. No one could see the value. They were focused on portals. In a move that would soon sound absurd, they each passed up the opportunity to buy Google for a million dollars or less, and Page and Brin could not find a partner that recognized their vision.
One Stanford faculty member was able to connect them with a few investors, including Jeff Bezos and David Cheriton (which got them those first few checks that sat in a desk drawer for weeks). They formally incorporated in September of 1998, moving into a friend’s garage and bringing a few early employees along, including Symbolic Systems alumna Marissa Mayer.
[Image: Larry Page (left) and Sergey Brin (right) started Google in a friend’s garage.]
Even backed by a million-dollar investment, the Google founders maintained a philosophy of frugality, simplicity, and swiftness. Despite occasional urging from their investors, they resisted the portal strategy and remained focused on search. They continued tweaking their algorithm and working on the accuracy of their results. They focused on their machines. They wanted to take the words that someone searched for and turn them into something actually meaningful. If you weren’t able to find the thing you were looking for in the top three results, Google had failed.
Google was followed by a cloud of hype and positive buzz in the press. Writing in Newsweek, Steven Levy described Google as a “high-tech version of the Oracle of Delphi, positioning everyone a mouse click away from the answers to the most arcane questions — and delivering simple answers so efficiently that the process becomes addictive.” It was around this time that “googling” — a verb form of the site synonymous with search — entered the common vernacular. The portal wars were still raging, but Google was poking its head up as a calm, precise alternative to the noise.
At the end of 1998, they were serving up ten thousand searches a day. A year later, that would jump to seven million a day. But quietly, behind the scenes, they began assembling the pieces of an empire.
As the web grew, technologists and journalists predicted the end of Google; they would never be able to keep up. But they did, outlasting a dying roster of competitors. In 2001, Excite went bankrupt, Lycos closed down, and Disney suspended Infoseek. Google climbed up and replaced them. It wouldn’t be until 2006 that Google would finally overtake Yahoo! as the number one website. But by then, the company would transform into something else entirely.
After securing another round of investment in 1999, Google moved into their new headquarters and brought on an army of new employees. The list of fresh recruits included former AltaVista engineers and leading artificial intelligence expert Peter Norvig. Google put an unprecedented focus on advancements in technology. Better servers. Faster spiders. Bigger indexes. The engineers inside Google invented a web infrastructure that had, up to that point, been only theoretical.
They trained their machines on new problems and new products. But regardless of the application — translation or email or pay-per-click advertising — each rested on the same premise. Machines can augment and re-imagine human intelligence, and they can do it at limitless scale. Google took the value proposition of artificial intelligence and brought it into the mainstream.
In 2001, Page and Brin brought in Silicon Valley veteran Eric Schmidt to run things as their CEO, a role he would occupy for a decade. He would oversee the company during its time of greatest growth and innovation. Google employee #4 Heather Cairns recalls his first days on the job. “He did this sort of public address with the company and he said, ‘I want you to know who your real competition is.’ He said, ‘It’s Microsoft.’ And everyone went, What?”
Bill Gates would later say, “In the search engine business, Google blew away the early innovators, just blew them away.” There would come a time when Google and Microsoft would come face to face. Eric Schmidt was correct about where Google was going. But it would take years for Microsoft to recognize Google as a threat. In the second half of the 1990s, they were too busy looking in their rearview mirror at another Silicon Valley upstart that had swept the digital world. Microsoft’s coming war with Netscape would subsume the web for over half a decade.