#the majority of our universal knowledge is assumption-based
coockie8 · 2 months
Note
why do you hate "humans and earth are weird" posts and stuff?
Look, you can do whatever you want in fiction, and at the end of the day I really don't care. It's just that like 70-80% of our problems stem from humans thinking we're uniquely special in some way, when we're really probably not, like at all, so I'm just not a fan of that trope.
0809sysblings · 8 months
Text
Amane, indoctrination, and gaslighting
and why voting Amane innocent would be the best course of action
I've been wanting to write a big post on Amane talking about indoctrination and such. Because I see takes sometimes that make it clear the person doesn't really... Get It.
Most of what I'll be explaining comes from my personal experiences growing up.
Additionally, most of what I say when it comes to outcomes (i.e. "If x happens, Amane will do y") will be based on the assumption that realism, not entertainment, is prioritized in the writing and that there are no major holes in our knowledge of what's going on. Theoretically anything could happen since this is a fictional scenario and we don't know everything when it comes to the world, the cases, and the characters. Not to mention my situation was nowhere near as extreme as hers. So although I probably have a better understanding of it than most people, I definitely can't claim that I know what she's gone through.
Personal anecdotes I add to better support my points will be in the small font (this!) since I don't want them to distract from the main text and so that they can be easily skipped for those who may be worried about being triggered. But if anyone needs plain text descriptions, I'll happily provide them!
!! TW for child abuse, religious abuse, and cults !!
I recommend skipping my personal anecdotes if more detailed discussions about these topics are a trigger for you.
At the heart of "good" (read: successful) indoctrination is gaslighting.
Since gaslighting has been one of the many psychology terms completely watered down and distorted by the internet, I will define it just so we're all on the same page!
Gaslighting is a form of psychological manipulation used to make the victim question their own sanity, sense of reality, or power of reasoning.
Basically, you can't trust yourself. You can't trust your thoughts, your feelings, your interpretations, etc. You become completely reliant on other people (usually specific people who are the ones doing the gaslighting) to figure out what's real/true or not.
Toxic/extremist religious groups like to take gaslighting a step further, though. Not only do they make it so you cannot trust yourself to judge what is right or wrong, they may also teach you that what feels wrong is actually right. You can see where this can start to cause some issues lol.
Anything your gut may tell you that contradicts what the group/cult leaders tell you—"this is wrong!", "this is bad!", "I don't want to do this..."—must be ignored. Because those feelings and thoughts, according to the leaders, are actually the sinful part of you trying to lead the good and faithful part of you astray. They make you question yourself to make sure you never question them.
They will figuratively or literally beat this into you until your first instinct is no longer to listen to your gut and do what it says, but to dismiss it and do what it's telling you not to do. Existing becomes a chronic power struggle between your unconscious mind and your conscious mind. Unfortunately, the fact that you're struggling often then gets used against you as proof that you need to follow their teachings. Because if you're unhappy, then you must be doing something wrong. You just need to have a little more faith, dedicate a little more time to the religion/group, go a little harder into your duties... Only then will you feel better—feel more enlightened.
An integral part in making all this work is isolation. If you don't somehow isolate the members, they may figure out that they're being manipulated and abused.
Now, isolation doesn't always mean purely physical isolation (though Amane is being isolated physically to at least some extent). Psychological isolation is almost as powerful. An almost universal psychological isolation tactic used by extremist groups and cults is the "Us vs Them" mentality. We can see this being very prominent with Amane. A lot of the things she talks about with regard to the cult involve an Us-vs-Them dynamic. There is "Us", the cult, and "Them", everyone else.
Personally, we were taught that those who weren't believers of our religion were out to get us or would, at the very least, get us hurt/killed somehow. We were told many people wanted us dead just for being believers. You had to be careful and watch out when interacting with non-believers; you couldn't trust them. God was constantly testing you via others, and you had to make sure you stayed faithful.
This in particular is why, whether you vote guilty or innocent, the vote itself will not actually do anything to change her beliefs. Voting her guilty will not make her start to feel bad and then question her beliefs. Voting her innocent will not make her listen to us and then question her beliefs. If we make her have any doubt about the cult, that's just proof to her that what we're telling her is wrong and is just another "trial" from God for her to overcome. So, changing her beliefs should not be a factor considered when voting, since it's completely irrelevant. Everything can be twisted to support the cult. That's just how it works.
I don't think any amount of punishment will make Amane "come to her senses". I mean... what could we possibly do to her that she hasn't already had to endure? Punishment will likely only escalate things even more. Not to mention that having a bit of a fascination with martyrdom isn't all that uncommon in those who have been religiously abused and indoctrinated. The threat of punishment may only serve to motivate her to double down on her beliefs and behavior. Not to say she wants and likes punishment. It's obvious she's both scared of punishment and wants it to stop. After all, that's most likely the motive behind the murder.
Even before I was Amane's age, I was already fantasizing about being a martyr. A part of me almost wanted to be killed for my religion and community. It was seen as something extremely admirable. The ultimate sacrifice, if you will. We were taught that if given the choice between saving yourself by denying your faith or letting yourself be hurt/killed by standing your ground, you should choose the latter. Of course, I also did not want that to happen at all. It scared me shitless. But we weren't allowed to be scared about that stuff. It was seen as questioning God and the religious authorities, which was completely taboo. So I had no choice but to "want" it.
Isolating Amane is the worst possible thing we could do to her. No one gets better from being isolated, and this goes double for people living in abusive environments. She's been isolated her whole life. The best thing for her would be spending time with the other prisoners without restrictions. The more time she spends around people who have no connection to the cult, the better. Trying to argue with those in cults about why they're wrong and why they are in a cult (because most don't even recognize they're in a cult due to the gaslighting, indoctrination, and stigma) will almost always backfire. The best thing to do is to just be there for them to have someone to interact with who is not a cult member.
The only reason I left the extremist religious community I grew up in was because I made a friend who was not affiliated with it. I don't think I would've been able to see that the conditions I was living in were Not Very Good without that friend. He didn't even really do anything to actively help me. Just learning more about the real world through him was enough to make me start looking closer at my life.
To vote her guilty would be to continue isolating her. Not just physically, since guilty prisoners get restrictions put on them, but psychologically, in a way that's inescapable. Innocent vs Guilty is just another Us vs Them dynamic.
I fear that, if she ends up guilty this trial, she will likely be voted guilty again in trial 3. Her aggression will probably only escalate as she feels herself becoming more and more cornered. And since I know many people are voting her guilty solely to make sure she doesn't hurt Shidou or other prisoners, I can only imagine what the voting will look like for her in trial 3 once she's forced to become even more aggressive to protect herself.
And tbh... I can't imagine that having a prisoner with 3 guilty verdicts will make for all that interesting of a story for them. Not that it would be boring, per se. But having variety would, in my opinion, be the most interesting and entertaining! So, if nothing else I've said has been able to sway those who vote her guilty, then think about the entertainment factor!
Please vote this severely traumatized 12 y/o girl innocent. We can give her so many secret cakes to eat.
haggishlyhagging · 10 months
Text
The universal and irrational belief that there is a "base element" in femaleness reflects "man's underlying fear and dread of women" to which Karen Horney referred, pointing out that it is remarkable that so little attention is paid to this phenomenon. More and more evidence of this fear, dread, and loathing is being unearthed by feminist scholars every day, revealing a universal misogynism which, in all major cultures in recorded patriarchal history, has permeated the thought of seemingly "rational" and civilized "great men"—"saints," philosophers, poets, doctors, scientists, statesmen, sociologists, revolutionaries, novelists. A quasi-infinite catalog could be compiled of quotes from the male leaders of "civilization" revealing this universal dread—expressed sometimes as loathing, sometimes as belittling ridicule, sometimes as patronizing contempt.
What has not received enough attention, however, is the silence about women's history. I do not refer primarily to the "Great Silence" concerning the acts of women under patriarchy, the failure to record or even to acknowledge the creative activity of great women and talented women. However, this is extremely significant and should be attended to. A typical case was Thomas More's brilliant daughter, Margaret. Men simply refused to believe that she was the author of her own writings. It was supposed that certainly she could not have done it without the help of a man. There were the women authors (e.g., George Eliot, George Sand, the Brontës) who could only get acceptance for their writings by disguising their sex under the pen name of a man. A reasonably talented woman today need only reflect honestly upon her own personal history in order to understand how the dynamics of wiping women out of history operate. Women who give cogent arguments concerning the oppression of women before male audiences repeatedly hear reports that "they were not able to defend their position." Words such as "flip," "slick," or "polemic" are used to describe carefully researched feminist writings. I point to this phenomenon of the wiping out of women's contributions within the context of patriarchal history, because it means that we must consciously develop a new sense of pride and confidence, with full knowledge of these mechanisms and of the fact that we cannot believe the history books that tell us implicitly that women are nothing. I point to it also because we have to overcome the hyper-cautiousness (not to be confused with striving for accuracy) that keeps us from strongly affirming our own history and thereby re-creating history.
I refer to the silence about women's historical existence since the dawn of patriarchy also because this opens the way to overcoming another "Great Silence," that is, concerning the increasing indications that there was a universally matriarchal world which prevailed before the descent into hierarchical dominion by males. Having experienced the obliterating process in our own histories and having come to recognize its dynamics within patriarchal history (which is pseudo-history to the degree that it has failed to acknowledge women), we have a basis for suspecting that the same dynamics operate to belittle and wipe out arguments for and evidence of the matriarchal period. Erich Fromm wrote:
The violence of the antagonism against the theory of matriarchy arouses the suspicion that it is . . . based on an emotional prejudice against an assumption so foreign to the thinking and feeling of our patriarchal culture.
While of itself such violence of antagonism obviously does not prove that the position so despised is correct, the very force of the attacks should arouse suspicions about the source of the opposition. It is important not to become super-cautious and hesitant in looking at the evidence offered for ancient matriarchy. It is essential to be aware that we have been conditioned to fear proposing any theory that supports feminism.
The writings supporting the matriarchal theory produced many decades ago are receiving new attention. These early contributions included the works of Bachofen (Das Mutterrecht, 1861), Lewis Henry Morgan (Ancient Society, 1877), Robert Briffault (The Mothers, 1927), and Jane Harrison (Prolegomena to the Study of Greek Religion, 1903). They point not only to the existence of universal matriarchy, but also to evidence that it was basically a very different kind of society from patriarchal culture, being egalitarian rather than hierarchical and authoritarian. Bachofen claimed that matriarchal culture recognized but one purpose in life: human felicity. The scholarly proponents of the matriarchal theory maintain that this kind of culture was not bent on the conquest of nature or of other human beings. In brief: It was not patriarchy spelled with an "m." This is an important point, since many who are antagonistic to women's liberation ignorantly and unimaginatively insist that the result will be the same kind of society with women "on top." "On top" thinking, imagining, and acting is essentially patriarchal.
Elizabeth Gould Davis points out that recent archaeological discoveries support these early theories to a remarkable extent. She shows that archaeologists have tended to write of their discoveries that women were predominant in each of their places of research as if this must be a unique case. She maintains that "all together these archaeological finds prove that feminine preeminence was a universal, and not a localized, phenomenon." Davis further comments upon detailed reports that have been made on three prehistoric towns in Anatolia: Mersin, Hacilar, and Catal Huyuk. She concludes that "in all of them the message is clear and unequivocal: ancient society was gynocratic and its deity was feminine." There is an accumulation of evidence, then, in support of Bachofen's theory of our gynocentric origins, and for the primary worship of a female deity.
-Mary Daly, Beyond God the Father: Toward a Philosophy of Women’s Liberation
sonic-wildfire · 8 months
Note
hey there! i saw your post about british vs american people and as a working-class british person who is now based in the us, i wanted to add my two cents. i think something a lot of americans don't realise about brits is that our class system is so deep-rooted in our society that seemingly trivial things such as accents are in fact an integral part of it. accents in britain are not indicators of where you live or where you come from, but whether you are "rich" or "not rich". people with, say, east london or liverpudlian accents find it much harder to gain social status or high-paying jobs regardless of their education or financial situation because the assumption is always that they are poor and stupid. people who come from working-class backgrounds are mercilessly mocked and at worst actively discriminated against for their accents in traditionally middle-to-upper-class areas such as universities, and end up changing their accents to blend in, so it's really a self fulfilling cycle. slang such as "innit" or pronunciations such as "chewsday" are inherently working-class, and as such, people mocking them will be perceived as classism by most working-class brits. what you're seeing is not a country full of snobs unable to take criticism, but people who have been ridiculed for their accents their whole lives, and have the ability to strike back without major consequence. the problem here isn't either of you, it's the lack of knowledge surrounding each other's cultures, as a seemingly innocuous comment from you can be seen as a direct attack to someone else regardless of intent. this was very long and rambly but i hope it clears some things up! top of the mornin' to ya, guv'na, and maybe next time we can come together and roast the shit out of the english upper class :)
as a final note, id like to say that obviously none of this justifies the disproportionate response you describe in your post, and i hope that both of our fucked up systems improve soon and no-one innocent gets hurt in the process (<- this was very poorly articulated but i hope you understood what i was trying to say!)
I will admit, you have given me a new perspective on the matter of accents. I had figured there might be some form of class differences at play here, but I hadn’t realized just how deep those differences are rooted over in the UK.
In all honesty, taking a jab at accents was certainly not the best example to use, to put it lightly, and I apologize for that.
Thank you for taking the time to write this out; a friend of the working class is a friend of mine :)
gothhabiba · 1 year
Note
about the scientific metaphors. this is slightly different but still highly related in my mind. I have a whole textbook that just deals with common misconceptions about high school level physics. I have to use this book frequently, so the ideas from your post seem very mundane to me, I don't know what that person's problems are. Anyways.
The vast majority of the misconceptions in the book arise because of differences between the everyday usage of words and the way the same words are used in a scientific context. And the book is very careful to stress that choosing language is incredibly important if you want to communicate scientific discoveries etc. without causing more harm than good and inducing these and other misconceptions.
i don't know if this is also from the book or just something our professor told us, but these misconceptions are persistent independent of the level of education you attain/receive in physics! you just get better at suppressing them in favour of the more correct explanations you learn at university etc. but the initial impulse is to describe physics phenomena in colloquial, intuitive but wrong ways, even if you know better, more correct explanations.
Yeah, physics is a large part of what I was thinking of when I said that other fields also make extensive use of metaphor!
It makes sense that metaphor of this kind would be intuitively reached for and only consciously suppressed (if one is attentive and rigorous enough to suppress it, that is).
Taylor and Dewsbury write about Lakoff and Johnson’s "theory of conceptual metaphor," which
posits that the nature of human cognition is metaphorical, and that all knowledge emerges as a result of embodied physical and social experiences. Under this view, metaphors are not mere linguistic embellishments. Rather, they are foundations for thought processes and conceptual understandings that function to map meaning from one knowledge and/or perceptual domain to another. When attempting to make sense of abstract, intangible phenomena, we draw from embodied experiences and look to concrete entities to serve as cognitive representatives.
Niebert and Gropengießer use this theory of conceptual metaphor to argue that scientific metaphor is just as conceptual (rather than 'merely' linguistic) and embodied as that in everyday language:
In recent years, researchers have become aware of the experiential grounding of scientific thought. Accordingly, research has shown that metaphorical mappings between experience-based source domains and abstract target domains are omnipresent in everyday and scientific language. The theory of conceptual metaphor explains these findings based on the assumption that understanding is embodied. Embodied understanding arises from recurrent bodily and social experience with our environment. As our perception is adapted to a medium-scale dimension, our embodied conceptions originate from this mesocosmic scale.
So of course scientists wouldn't be magically immune to or separate from this kind of thinking.
At a glance it looks like there's a decent body of scholarly work discussing the potential origins, uses, and pitfalls of scientific conceptual metaphor for understanding, learning, research, and scicomm.
jcmarchi · 1 month
Text
Jay Dawani is Co-founder & CEO of Lemurian Labs – Interview Series
Jay Dawani is Co-founder & CEO of Lemurian Labs. Lemurian Labs is on a mission to deliver affordable, accessible, and efficient AI computers, driven by the belief that AI should not be a luxury but a tool accessible to everyone. The founding team at Lemurian Labs combines expertise in AI, compilers, numerical algorithms, and computer architecture, united by a single purpose: to reimagine accelerated computing.
Can you walk us through your background and what got you into AI to begin with?
Absolutely. I’d been programming since I was 12, building my own games and such, but I actually got into AI when I was 15 because of a friend of my father’s who was into computers. He fed my curiosity and gave me books to read, such as Von Neumann’s ‘The Computer and the Brain’, Minsky’s ‘Perceptrons’, and Russell and Norvig’s ‘AI: A Modern Approach’. These books influenced my thinking a lot, and it felt almost obvious then that AI was going to be transformative and I just had to be a part of this field.
When it came time for university I really wanted to study AI, but I didn’t find any universities offering that, so I decided to major in applied mathematics instead. A little while after I got to university, I heard about AlexNet’s results on ImageNet, which was really exciting. At that time I had a now-or-never moment and went full bore into reading every paper and book I could get my hands on related to neural networks, and sought out all the leaders in the field to learn from them, because how often do you get to be there at the birth of a new industry and learn from its pioneers?
Very quickly I realized I don’t enjoy research, but I do enjoy solving problems and building AI enabled products. That led me to working on autonomous cars and robots, AI for material discovery, generative models for multi-physics simulations, AI based simulators for training professional racecar drivers and helping with car setups, space robots, algorithmic trading, and much more. 
Now, having done all that, I’m trying to rein in the cost of AI training and deployments, because that will be the greatest hurdle we face on our path to enabling a world where every person and company can have access to and benefit from AI in the most economical way possible.
Many companies working in accelerated computing have founders that have built careers in semiconductors and infrastructure. How do you think your past experience in AI and mathematics impacts your ability to understand the market and compete effectively?
I actually think not coming from the industry gives me the benefit of having the outsider advantage. I have found it to be the case quite often that not having knowledge of industry norms or conventional wisdoms gives one the freedom to explore more freely and go deeper than most others would because you’re unencumbered by biases. 
I have the freedom to ask ‘dumber’ questions and test assumptions in a way that most others wouldn’t because a lot of things are accepted truths. In the past two years I’ve had several conversations with folks within the industry where they are very dogmatic about something but they can’t tell me the provenance of the idea, which I find very puzzling. I like to understand why certain choices were made, and what assumptions or conditions were there at that time and if they still hold. 
Coming from an AI background, I tend to take a software view: looking at where the workloads are today and all the possible ways they may change over time, and modeling the entire ML pipeline for training and inference to understand the bottlenecks, which tells me where the opportunities to deliver value are. And because I come from a mathematical background, I like to model things to get as close to truth as I can, and have that guide me. For example, we have built models to calculate system performance for total cost of ownership, so we can measure the benefit we can bring to customers with software and/or hardware and better understand our constraints and the different knobs available to us, along with dozens of other models for various things. We are very data driven, and we use the insights from these models to guide our efforts and tradeoffs.
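A model like the one he describes can be sketched at a very coarse level. Everything below is a hypothetical illustration: the function name, parameters, and every figure are my assumptions, not Lemurian's actual model.

```python
# Coarse total-cost-of-ownership sketch: amortized hardware plus energy.
# All names and figures are illustrative assumptions.

def tco_per_year(num_accelerators, price_per_unit, amortization_years,
                 watts_per_unit, utilization, power_cost_per_kwh):
    """Rough annual cost: amortized hardware spend + energy bill."""
    hardware = num_accelerators * price_per_unit / amortization_years
    hours = 365 * 24 * utilization
    energy_kwh = num_accelerators * watts_per_unit / 1000 * hours
    return hardware + energy_kwh * power_cost_per_kwh

# e.g. 1,000 accelerators at $25k each amortized over 4 years,
# 700 W apiece at 60% utilization, $0.10/kWh:
cost = tco_per_year(1000, 25_000, 4, 700, 0.6, 0.10)
print(f"${cost:,.0f} per year")  # → $6,617,920 per year
```

A real model would also fold in networking, cooling overhead, facility and staffing costs, and failure rates; the point is just that such a model puts software and hardware improvements on one comparable axis.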
It seems like progress in AI has primarily come from scaling, which requires exponentially more compute and energy. It seems like we’re in an arms race with every company trying to build the biggest model, and there appears to be no end in sight. Do you think there is a way out of this?
There are always ways. Scaling has proven extremely useful, and I don’t think we’ve seen the end yet. We will very soon see models being trained with a cost of at least a billion dollars. If you want to be a leader in generative AI and create bleeding edge foundation models you’ll need to be spending at least a few billion a year on compute. Now, there are natural limits to scaling, such as being able to construct a large enough dataset for a model of that size, getting access to people with the right know-how, and getting access to enough compute. 
Continued scaling of model size is inevitable, but we also can’t turn the entire earth’s surface into a planet-sized supercomputer to train and serve LLMs, for obvious reasons. To get this under control we have several knobs we can play with: better datasets, new model architectures, new training methods, better compilers, algorithmic improvements and exploitations, better computer architectures, and so on. If we do all that, there’s roughly three orders of magnitude of improvement to be found. That’s the best way out.
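The "roughly three orders of magnitude" figure is a compounding argument: each knob contributes a modest factor, and independent factors multiply. A sketch with purely made-up per-knob factors (these numbers are my assumptions, not measurements from the interview):

```python
from math import prod

# Hypothetical per-knob speedup factors -- illustrative assumptions only.
knobs = {
    "better datasets": 1.5,
    "new model architectures": 3.0,
    "new training methods": 2.0,
    "better compilers": 2.0,
    "algorithmic improvements": 4.0,
    "better computer architectures": 10.0,
}

# Independent multiplicative gains compound.
total = prod(knobs.values())
print(f"combined improvement: {total:.0f}x")  # → combined improvement: 720x
```

A handful of 1.5x-10x gains lands within shouting distance of 1000x, which is the shape of the claim being made.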
You are a believer in first principles thinking, how does this mold your mindset for how you are running Lemurian Labs?
We definitely employ a lot of first principles thinking at Lemurian. I have always found conventional wisdom misleading because that knowledge was formed at a certain point in time when certain assumptions held, but things always change and you need to retest assumptions often, especially when living in such a fast paced world. 
I often find myself asking questions like “this seems like a really good idea, but why might this not work”, or “what needs to be true in order for this to work”, or “what do we know that are absolute truths, and what are the assumptions we’re making, and why?”, or “why do we believe this particular approach is the best way to solve this problem”. The goal is to invalidate and kill off ideas as quickly and cheaply as possible. We want to try and maximize the number of things we’re trying out at any given point in time. It’s about being obsessed with the problem that needs to be solved, and not being overly opinionated about what technology is best. Too many folks focus overly on the technology; they end up misunderstanding customers’ problems and missing the transitions happening in the industry, which can invalidate their approach and leave them unable to adapt to the new state of the world.
But first principles thinking isn’t all that useful by itself. We tend to pair it with backcasting, which basically means imagining an ideal or desired future outcome and working backwards to identify the different steps or actions needed to realize it. This ensures we converge on a meaningful solution that is not only innovative but also grounded in reality. It doesn’t make sense to spend time coming up with the perfect solution only to realize it’s not feasible to build because of a variety of real world constraints such as resources, time, regulation, or building a seemingly perfect solution but later on finding out you’ve made it too hard for customers to adopt.
Every now and then we find ourselves in a situation where we need to make a decision but have no data, and in this scenario we employ minimum testable hypotheses which give us a signal as to whether or not something makes sense to pursue with the least amount of energy expenditure. 
All of this combined gives us agility and rapid iteration cycles to de-risk items quickly, and it has helped us adjust strategies with high confidence and make a lot of progress on very hard problems in a very short amount of time.
Initially, you were focused on edge AI, what caused you to refocus and pivot to cloud computing?
We started with edge AI because at that time I was very focused on trying to solve a very particular problem that I had faced in trying to usher in a world of general purpose autonomous robotics. Autonomous robotics holds the promise of being the biggest platform shift in our collective history, and it seemed like we had everything needed to build a foundation model for robotics but we were missing the ideal inference chip with the right balance of throughput, latency, energy efficiency, and programmability to run said foundation model on.
I wasn’t thinking about the datacenter at this time because there were more than enough companies focusing there and I expected they would figure it out. We designed a really powerful architecture for this application space and were getting ready to tape it out, and then it became abundantly clear that the world had changed and the problem truly was in the datacenter. The rate at which LLMs were scaling and consuming compute far outstripped the pace of progress in computing, and when you factor in adoption it starts to paint a worrying picture.
It felt like this is where we should be focusing our efforts, to bring down the energy cost of AI in datacenters as much as possible without imposing restrictions on where and how AI should evolve. And so, we got to work on solving this problem. 
Can you share the genesis story of Co-Founding Lemurian Labs?
The story starts in early 2018. I was working on training a foundation model for general purpose autonomy along with a model for generative multiphysics simulation to train the agent in and fine-tune it for different applications, and some other things to help scale into multi-agent environments. But very quickly I exhausted the amount of compute I had, and I estimated needing more than 20,000 V100 GPUs. I tried to raise enough to get access to the compute but the market wasn’t ready for that kind of scale just yet. It did however get me thinking about the deployment side of things and I sat down to calculate how much performance I would need for serving this model in the target environments and I realized there was no chip in existence that could get me there. 
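A back-of-envelope version of that kind of GPU-count estimate might look like the following. Every number here is an assumption chosen purely for illustration; the actual workload behind his 20,000-V100 figure isn't given.

```python
# Naive GPU-count estimate: total training FLOPs divided by effective
# per-GPU throughput over the available time budget. All figures are
# illustrative assumptions.

def gpus_needed(total_flops, per_gpu_flops, utilization, days):
    seconds = days * 24 * 3600
    effective = per_gpu_flops * utilization   # realized, not peak, throughput
    return total_flops / (effective * seconds)

# e.g. a 6e24-FLOP run on V100s (~125 TFLOP/s peak in mixed precision)
# at 25% realized utilization, with a 90-day training budget:
n = gpus_needed(6e24, 125e12, 0.25, 90)
print(f"~{n:,.0f} GPUs")  # → ~24,691 GPUs
```

Realized utilization and the FLOP budget dominate the answer, which is why estimates like this swing so widely in practice.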
A couple of years later, in 2020, I met up with Vassil – my eventual cofounder – to catch up, and I shared the challenges I went through in building a foundation model for autonomy. He suggested building an inference chip that could run the foundation model, and shared that he had been thinking a lot about number formats, and that better representations would help not only in making neural networks retain accuracy at lower bit-widths but also in creating more powerful architectures. 
It was an intriguing idea but was way out of my wheelhouse. But it wouldn’t leave me, which drove me to spending months and months learning the intricacies of computer architecture, instruction sets, runtimes, compilers, and programming models. Eventually, building a semiconductor company started to make sense and I had formed a thesis around what the problem was and how to go about it. And, then towards the end of the year we started Lemurian. 
You’ve spoken previously about the need to tackle software first when building hardware, could you elaborate on your views of why the hardware problem is first and foremost a software problem?
What a lot of people don’t realize is that the software side of semiconductors is much harder than the hardware itself. Building a computer architecture that customers can actually use and benefit from is a full-stack problem, and if you don’t have that understanding and preparedness going in, you’ll end up with a beautiful-looking architecture that is very performant and efficient but totally unusable by developers, and usability is what actually matters.
There are other benefits to taking a software first approach as well, of course, such as faster time to market. This is crucial in today’s fast moving world where being too bullish on an architecture or feature could mean you miss the market entirely. 
Not taking a software-first view generally means you haven’t derisked the things that matter for product adoption, you can’t respond to changes in the market (for example, when workloads evolve in unexpected ways), and you end up with underutilized hardware. None of these are good outcomes. That’s a big reason why we care a lot about being software-centric, and why our view is that you can’t be a semiconductor company without really being a software company.
Can you discuss your immediate software stack goals?
When we were designing our architecture and thinking about the forward-looking roadmap and where the opportunities were to bring more performance and energy efficiency, it became very clear that we were going to see a lot more heterogeneity, which was going to create a lot of issues in software. And we don’t just need to be able to productively program heterogeneous architectures; we have to deal with them at datacenter scale, which is a challenge the likes of which we haven’t encountered before.
This got us concerned, because the last time we went through a major transition was when the industry moved from single-core to multi-core architectures, and at that time it took 10 years to get the software working and people using it. We can’t afford to wait 10 years to figure out software for heterogeneity at scale; it has to be sorted out now. And so, we got to work on understanding the problem and what needs to exist in order for this software stack to exist.
We are currently engaging with a lot of the leading semiconductor companies and hyperscalers/cloud service providers and will be releasing our software stack in the next 12 months. It is a unified programming model with a compiler and runtime capable of targeting any kind of architecture, orchestrating work across clusters composed of different kinds of hardware, and scaling from a single node to a thousand-node cluster for the highest possible performance.
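The stack itself isn’t public, so the following is only a toy sketch, with invented names, of the general idea a unified runtime for heterogeneous hardware embodies: one portable operation, several per-device implementations, and a placement policy that chooses among them.

```python
from dataclasses import dataclass

# Toy sketch of the idea behind a unified runtime for heterogeneous
# clusters. Every name here is invented for illustration; this is not
# Lemurian's actual API.

@dataclass
class Device:
    name: str
    kind: str            # e.g. "gpu", "cpu", "npu"
    free_mem_gb: float

class Runtime:
    def __init__(self, devices):
        self.devices = devices
        self.kernels = {}            # (op, device kind) -> implementation

    def register(self, op, kind, fn):
        self.kernels[(op, kind)] = fn

    def run(self, op, x, mem_needed_gb):
        # Greedy placement: first device with an implementation and enough
        # memory. A production runtime would also model kernel cost, data
        # movement, and cluster-level scheduling.
        for d in self.devices:
            fn = self.kernels.get((op, d.kind))
            if fn is not None and d.free_mem_gb >= mem_needed_gb:
                return d.name, fn(x)
        raise RuntimeError(f"no viable placement for {op!r}")

rt = Runtime([Device("gpu0", "gpu", 2.0), Device("cpu0", "cpu", 64.0)])
rt.register("scale", "gpu", lambda xs: [2 * v for v in xs])
rt.register("scale", "cpu", lambda xs: [2 * v for v in xs])

placed, out = rt.run("scale", [1, 2, 3], mem_needed_gb=16)
print(placed, out)       # gpu0 lacks memory, so this falls back to cpu0
```

The design point the sketch tries to capture is that the developer describes the operation once, and the decision of where it runs is deferred to the runtime, which is what makes the same program portable across different mixes of hardware.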
Thank you for the great interview. Readers who wish to learn more should visit Lemurian Labs.
healthup · 1 year
Text
Gay Test - What Does It Mean To Be Asexual?
There is a vast, lovely universe of identities to explore within the ranges of sexual orientations and gender identities.
For some people, knowing who they are and how they identify comes naturally. Others, though, may embark on a lifelong quest for knowledge based on their own growth and experiences.
But an “are you gay” test can help people learn about their sexual orientation.
Our capacity to develop new terminology that more accurately describes how we feel about who we are, who we are attracted to, and how we experience attraction has increased as our language surrounding identity continues to change. For the asexual population, this is especially true.
What is asexuality?
Asexuality is an umbrella term for anyone who experiences little to no sexual attraction toward other people of any gender.
Asexuality can vary in a lot of ways depending on our relationships and how we define our levels of sexual attraction, romantic attraction and aesthetic attraction.
It’s important to note that sexual attraction and romantic attraction are two different things, even though modern society often conflates the two.
Sexual attraction can be defined as our desire to touch another person in an intimate way. We can be sexually attracted to someone without ever experiencing romantic feelings.
Romantic attraction, then, can be defined as our desire to have a deep, affectionate connection with someone else.
Romantic attraction can exist without us ever experiencing sexual attraction and it relies solely on establishing that close, emotional attachment.
You can be aromantic, which means you may experience little to no romantic attraction to other people of any gender.
Then, there’s aesthetic attraction, which can be defined as appreciation for someone’s beauty or appearance without having sexual or romantic attraction.
For people who identify as asexual, they may experience heightened levels of romantic or aesthetic attraction, but little to no sexual attraction to the people in their lives.
“It looks different from person to person,” says Dr. Rhodes. “For some people, it means they have a romantic partner, but perhaps that relationship doesn’t include sexual contact. Someone who identifies as asexual might engage in sex as more of a way to show that they care, for example, rather than something they’re invested in.”
There are a lot of misconceptions about asexuality. It’s often been wrongly confused with loss of libido.
One major misconception is that asexuality is a medical condition on its own that can be cured or treated.
Others like to think asexuality is like celibacy or abstinence. And then there’s the dangerous presumption that someone might identify as asexual because they’ve experienced some kind of sexual trauma or physical assault.
But Dr. Rhodes warns that these misconceptions are generally untrue and can be quite harmful.
“Associating asexuality with trauma does get harmful because, for many people, this is something that’s been a process of discovery rather than something that was inflicted upon them,” says Dr. Rhodes.
“The assumption that someone’s identity is a result of trauma can really take away the agency and hard work that person may have done to accept themselves for being asexual, especially when we live in a society that really values sexual and romantic attraction.”
And unlike celibacy or abstinence — which are temporary decisions based on one’s circumstances or beliefs — asexuality is an orientation, an identity and a state of being. It’s not a choice.
Types of asexuality
Like all sexual orientations that exist on a spectrum, asexuality can be fluid. Someone who is asexual may experience varying degrees of sexual attraction throughout their lifetime and it can vary from relationship to relationship.
Someone may use the umbrella term “asexual” as their defining identity. There are also several subgroups or categories of identities that have been created to better define the various degrees someone might experience their asexual identity.
The following list is not all-inclusive, but these are some asexual identities to be aware of:
Aceflux
As a sexual orientation on the asexual spectrum, this identity is defined as someone whose orientation changes over time but generally stays on the asexual spectrum. For example, if you usually have no desire for sexual activity, but there are days or weeks where you do desire sexual activity, this might be an identity you relate to.
“Asexuality can change over time as people know themselves better and get a better picture of what relationships look like,” says Dr. Rhodes. “It’s something that can be in flux for a lot of people.”
Demisexual
If you’re demisexual, you may only find someone sexually attractive if you’ve developed an emotional or romantic connection with them. If you don’t have that emotional or romantic connection, you typically don’t experience sexual attraction to others. For many people who identify as demisexual, they may hold off on participating in sexual activity until they’ve developed that emotional and romantic connection.
Fraysexual
Someone who is fraysexual (or ignotasexual) may experience sexual attraction with someone at first, but then that sexual attraction fades over time once they develop an emotional bond. This identity can be seen as the opposite of demisexual.
Graysexual
As asexuality is a spectrum, graysexuals fall into that gray area that exists between wanting and not wanting sexual activity. You may identify as graysexual if you experience limited sexual attraction on an infrequent basis. When you do experience sexual attraction, it may not be strong or intense enough to act upon it.
Lithosexual
This sexual orientation refers to people who may experience sexual attraction to others but don’t want those feelings reciprocated. You may be uncomfortable at the thought of someone finding you sexually attractive and you may lose your feelings of sexual attraction if you discover those feelings are mutual. For these reasons, you may not seek out sexual relationships.
jelleace11 · 1 year
Text
Blog #3
THE ENTREPRENEURIAL MIND DISCUSSION
“A major reason why startups fail is because they design their initial product based on assumptions. They assume that people care enough about the problem to pay for a solution.” – Laura Holton
                It’s been a while since we had our discussion in our Entrepreneurship class. One piece of information worth recalling from that classroom discussion was when our instructor, Ma’am Rhea, introduced us to the term MINIMUM VIABLE PRODUCT (MVP). Basically, it is an early form of a product that entrepreneurs then develop further, assessing its potential through customer or user feedback. I was also amazed at how some ventures begin from a simple step, including Zappos, where Nick Swinmurn created a website offering the absolute best selection of shoes in terms of brands, styles, colors, sizes, and widths after recognizing the pain point of not being able to find the shoes you wanted in stores. Ma’am Rhea gave us a simple example to visualize how it works: she showed how one should begin with a simple skateboard and let it gradually develop into a car, rather than slowly purchasing or acquiring individual car parts. This simple explanation helped me understand that an MVP launches a product with meaningful utility in the early phases of a startup and then expands its functionality.
THE LECTURE SERIES
Last April 19, 2023, I attended the lecture series program facilitated by the Business Administration Department faculty at our University Convention Center, which aims to give us students an opportunity to hear insights and viewpoints from the invited speakers, drawn from their various experiences in industry and academia.
LECTURE SERIES # 1: Intellectual Property Rights & Technology Transfer
Dr. Gamaleila A. Dumangcas
“Intellectual Property is the oil of the 21st Century. All the richest men have made their money out of intellectual property.” – Mark Getty
                Intellectual property is a creation of the mind. This includes inventions, literary and artistic works, symbols, names, images, and designs. It’s therefore essential to secure protection for intellectual property, both to be able to transform it into a more tangible product and to secure the “first to file” basis. Protecting it requires attentive measures, since owning it is like owning land property, which demands regulation, valuation, taking ownership, and monetizing transfers. It’s vital to register your intellectual property before disclosing it, to prevent competitors from copying your trademark, technology, or product, and to avoid wasteful investment in research. That’s why acquiring intellectual property rights, such as patents, trademarks, copyrights, and trade secrets, is an important move: they are protected in law, giving creators protection for their original works, inventions, and ideas. Furthermore, technology transfer is also an essential process, allowing knowledge developed at one organization to pass to another, with the purpose of commercializing it in order to manage distribution channels and identify potential markets.
LECTURE SERIES #2: Incubating Innovation: Exploring Start-Up Opportunities in the Philippines
                Unfortunately, I wasn’t able to listen throughout the said lecture since I had to process my clearance. However, my short participation in the lecture gave me a little background on how it works. Business incubators provide support and resources to start-ups and to anyone starting a business. I believe these programs really give start-ups at least a good starting foundation; it’s like a toddler being assisted by a parent through its first steps.
LECTURE SERIES #3: RAISE
Ma'am Keren
                More than a lecture, the speaker showed us videos related to some start-up programs and seminars happening in the Philippines, as well as her various experiences and the seminars she had attended. I was really eager to take notes on some of the videos she presented because I planned to watch and search for them on my way home.
                A start-up is a company working to solve a problem where the solution is not obvious and success is not guaranteed. She also shared how one of her business ideas, Alima Home Healthcare for Every Juan − a booking app for nursing which aims to provide healthcare for rural areas − failed because of the pandemic. I was really amazed by her dedication and her ability to cope with the failure. I also love how she said that “it all starts with the IDEA.” I had never really considered becoming an entrepreneur, since my heart was already set on being a nephrologist; however, learning that her course was actually BS Nursing really amazed me. It made me realize that anyone can become an entrepreneur, and it all starts with the idea, especially now that there are a lot of programs supporting new start-ups, including the start-up grant fund, the step up program, the Philippine start-up challenge, etc.
secludeduser · 2 years
Text
Right and wrong
How do we know the difference? We can’t just overpower a person’s opinions or thoughts by saying they are wrong. What gives us the right to do so? For centuries, many people made assumptions and others believed they were correct. Claudius Ptolemy, an Egyptian mathematician and astronomer during the era of the Roman Empire, believed that the sun revolved around the earth, and that the earth was the centre of our solar system. He created the geocentric model, and people believed in it for over 1,500 years. But he was later proved wrong by Nicolaus Copernicus, a Renaissance polymath expert in the subjects of mathematics and astronomy. All those years, people had believed the words of Ptolemy and assumed them to be true. And now, of course, we can prove with evidence that Copernicus’ theory is scientifically correct.
Before and during WW2, Adolf Hitler created an entire army through words of hatred. He manipulated the majority of the German nation into believing that they were better, more worthy of living, than Jews (mainly) but also other minority groups. Why do we now say that this is wrong? What proof do we have that they are just as worthy of living as other people?
Well, our decisions about good and bad come from law, justice and morality. Some of the most important and fundamental moral principles seem to be universally held by all people in all cultures and do not change over time. Humans have gained this sense of sympathy through evolution and living in societies for centuries, which makes us different from self-isolating animals. It is our instinct.
Going back to our topic, we may say that there are certain things that everyone agrees are right. These have undeniable proof. They are the laws of nature. We cannot deny these. We know for a fact that gravity exists. What was the theory of gravity? The Newtonian theory of gravity is based on an assumed force acting between all pairs of bodies. We know that fire is hot to the touch and ice is cold. We know that when the north poles of two magnets are held near each other, they repel. We know why birds can fly but pigs cannot. These are all what we call basic knowledge, common sense. It is all logical to us.
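The Newtonian claim above can be made concrete with the standard inverse-square calculation. The figures below are rounded textbook values for the Earth and Moon, so the result is only approximate.

```python
# Making the Newtonian claim above concrete: the assumed force between
# every pair of bodies is F = G * m1 * m2 / r^2. The constants are
# rounded textbook values for the Earth-Moon pair.

G = 6.674e-11        # gravitational constant, N * m^2 / kg^2
m_earth = 5.972e24   # mass of the Earth, kg
m_moon = 7.348e22    # mass of the Moon, kg
r = 3.844e8          # mean Earth-Moon distance, m

F = G * m_earth * m_moon / r**2
print(f"{F:.3e} N")  # about 2e20 newtons of mutual attraction
```

Doubling the distance would quarter the force, which is exactly the kind of testable, repeatable prediction that separates a law of nature from an assumption.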
But what about the multiverse theory? Let’s say we travel out of our solar system, beyond our galaxy and out of the cosmic web. What will we reach? Our universe is said to be endless, but what if it is not? What if we could reach its end? At our current level of knowledge we would never be able to reach it, since it’s expanding much too quickly for us to catch up. But if we were to exit it, what would we discover? The multiverse theory posits that there may be an infinity of other universes, each with its own laws of physics. There, the knowledge we have collected from our universe would be useless. It would be wrong.
Everything I have said up to this point is mere assumption and hypothesis. Ifs and maybes. But that doesn’t necessarily mean it is incorrect. We simply do not know. And there is no answer to right or wrong, good or bad. Just as there is no true self of a person. There is no one version of me. No correct version; I am seen differently by every person. Everyone has a different perception of me, including myself. So there is no true me, for a single correct version doesn’t exist. It is all the different forms of me that together create the person I am.
According to scientific evidence and common knowledge, I can state that I have the ability to think. The ability to think cannot be restricted, which makes it such a wonderful thing.
scarfhedge3 · 2 years
Text
20 Fun Facts About Fake Diploma
For unmatched top-quality fake diplomas, degrees and transcripts, Diploma Makers has exactly what you’re looking for, with a knowledgeable team and an appreciation for authenticity. Now you can bring that sense of achievement and ambition to your home or office with a realistic fake diploma or fake transcripts.

With our custom-designed in-house fake diploma and fake transcript layouts, you choose the template of your choice, and we will tailor the information to fit your needs. If you are aiming for something closer to the real school or college diploma layout, you may want to consider our perfect-match option. With the perfect-match styles, instead of picking from one of the standard in-house fake diploma templates, we start with a layout that was based off the real institution’s diploma.

* The term “perfect-match layouts” specifically refers to text styling matching that of the school or university you pick. Wherever possible we will reproduce the school seals, crests, emblems, and so on to look as close as we possibly can within the limits of the law, but perfect match does not always include copyrighted artwork.

This professional look is achieved by integrating gold foil seals directly onto diploma-quality paper. We also add printed, raised seals to ensure an authentic look and feel!

Phonydegree is the best diploma provider on the internet, with a team containing the best diploma makers. With decades of accumulated experience, our skilled design team specializes in replicating diplomas and degrees for countless GED, high school, college and university programs from around the world with stunning accuracy. Many of our satisfied customers were in desperate need of replacing their lost or damaged diploma, and FAST. That’s where we come in and save the day!
If you owe your college money, your school closed, documents were lost, you wish to surprise your family and friends, you need a copy of a diploma for office display, you were unable to finish the program, or technology has left your records outdated and inaccessible, WE CAN HELP YOU! We are the champion of diploma replica and replacement services, and the fact that you chose us proves you’re a champion, too! You can rest assured knowing that your needs will be met and your expectations exceeded by our remarkable product, superior quality, and top-notch customer service staff.

10 REASONS WHY WE’RE THE BEST DIPLOMA COMPANY:

- The diploma you receive is a true duplication of the original degree from your chosen school. We remove the need to scroll through generic designs in hopes of finding a document that looks similar; we already have the reproduction!
- We adhere to all legal and technical constraints when it comes to the document credentials you receive. We guarantee you will not receive a certificate with trademarked or forged information.
- Professional transcripts that match your college, graduation information, and degree and major or diploma program. Our researchers have done all the work, so you don’t need to. All courses are program-specific and correspond to the GPA you’ve provided. It doesn’t get better than that!
- FREE SAMPLES! Sure, anyone can say they have a mock diploma, but we can prove it. We’re pleased to provide you with a sample of the design before you order.
- We go the extra mile to supply topographic 3-D seals for added authenticity. Simply select the raised-seal option on the order form, and we’ll take care of the rest. You’ll witness firsthand why we have the best diploma press.
- Are you a perfectionist like us? If so, we have you covered with a pre-printed email proof, so you are the one in control. During this process, you can make any changes you choose before your document is printed and shipped. Let us know on the order form, and we’ll have a high-quality printable PDF in your email within 24 hours. We have the fastest turnaround in the diploma copy industry.
- Order tracking is included! Not only will you be able to track your order once it ships, but you’ll get updates during the design and printing process. We’ll never leave you in the dark, and you’ll know the status every step of the way.
- FAST SHIPPING! We can provide you with next-day diplomas and even same-day diplomas!
- Our passion for quality is second to none. It is important to have a diploma, and we understand how important this is to you. From the actual diploma parchment paper to raised seals and discreet shipping, your positive experience is critical. We guarantee our creations, and we will work with you one-on-one to ensure your satisfaction.
continuations · 3 years
Text
The World After Capital in 64 Theses
Over the weekend I tweeted out a summary of my book The World After Capital in 64 theses. Here they are in one place:
The Industrial Age is 20+ years past its expiration date, following a long decline that started in the 1970s.
Mainstream politicians have propped up the Industrial Age through incremental reforms that are simply pushing out the inevitable collapse.
The lack of a positive vision for what comes after the Industrial Age has created a narrative vacuum exploited by nihilist forces such as Trump and ISIS.
The failure to enact radical changes is based on vastly underestimating the importance of digital technology, which is not simply another set of Industrial Age machines.
Digital technology has two unique characteristics not found in any prior human technology: zero marginal cost and universality of computation.
Our existing approaches to regulation of markets, dissemination of information, education and more are based on the no longer valid assumption of positive marginal cost.
Our beliefs about the role of labor in production and work as a source of purpose are incompatible with the ability of computers to carry out ever more sophisticated computations (and to do so ultimately at zero marginal cost).
Digital technology represents as profound a shift in human capabilities as the invention of agriculture and the discovery of science, each of which resulted in a new age for humanity.
The two prior transitions, from the Forager Age to the Agrarian Age and from the Agrarian Age to the Industrial Age resulted in humanity changing almost everything about how individuals live and societies function, including changes in religion.
Inventing the next age, will require nothing short of changing everything yet again.
We can, if we make the right choices now, set ourselves on a path to the Knowledge Age which will allow humanity to overcome the climate crisis and to broadly enjoy the benefits of automation.
Choosing a path into the future requires understanding the nature of the transition we are facing and coming to terms with what it means to be human.
New technology enlarges the “space of the possible,” which then contains both good and bad outcomes. This has been true starting from the earliest human technology: fire can be used to cook and heat, but also to wage war.
Technological breakthroughs shift the binding constraint. For foraging tribes it was food. For agrarian societies it was arable land. Industrial countries were constrained by how much physical capital (machines, factories, railroads, etc.) they could produce.
Today humanity is no longer constrained by capital, but by attention.
We are facing a crisis of attention. We are not paying enough attention to profound challenges, such as “what is our purpose?” and “how do we overcome the climate crisis?”
Attention is to time as velocity is to speed: attention is what we direct our minds to during a time period. We cannot go back and change what we paid attention to. If we are poorly prepared for a crisis it is because of how we have allocated our attention in the past.
We have enough capital to meet our individual and collective needs, as long as we are clear about the difference between needs and wants.
Our needs can be met despite the population explosion because of the amazing technological progress we have made and because population growth is slowing down everywhere with peak population in sight.
Industrial Age society, however, has intentionally led us down a path of confusing our unlimited wants with our modest needs, as well as specific solutions (e.g. individually owned cars) with needs (e.g. transportation).
The confusion of wants with needs keeps much of our attention trapped in the “job loop”: we work so that we can buy goods and services, which are produced by other people also working.
The job loop was once beneficial, when combined with markets and entrepreneurship, it resulted in much of the innovation that we now take for granted.
Now, however, we can and should apply as much automation as we can muster to free human attention from the “job loop” so that it can participate in the “knowledge loop” instead: learn, create, and share.
Digital technology can be used to vastly accelerate the knowledge loop, as can be seen from early successes, such as Wikipedia and open access scientific publications.
Much of digital technology is being used to pull human attention into systems such as Facebook, Twitter and others that engage in the business of reselling attention, commonly known as advertising. Most of what is advertised furthers wants and reinforces the job loop.
The success of market-based capitalism is that capital is no longer our binding constraint. But markets cannot be used for allocating attention due to missing prices.
Prices do not and cannot exist for what we most need to pay attention to. Price formation requires supply and demand, which don't exist for finding purpose in life, overcoming the climate crisis, conducting fundamental research, or engineering an asteroid defense.
We must use the capabilities of digital technology so that we can freely allocate human attention.
We can do so by enhancing economic, information, and psychological freedom.
Economic freedom means allowing people to opt out of the job loop by providing them with a universal basic income (UBI).
Informational freedom means empowering people to control computation and thus information access, creation and sharing.
Psychological freedom means developing mindfulness practices that allow people to direct their attention in the face of a myriad distractions.
UBI is affordable today exactly because we have digital technology that allows us to drive down the cost of producing goods and services through automation.
UBI is the cornerstone of a new social contract for the Knowledge Age, much as pensions and health insurance were for the Industrial Age.
Paid jobs are not a source of purpose for humans in and of themselves. Doing something meaningful is. We will never run out of meaningful things to do.
We need one global internet without artificial geographic boundaries or fast and slow lanes for different types of content.
Copyright and patent laws must be curtailed to facilitate easier creation and sharing of derivative works.
Large systems such as Facebook, Amazon, Google, etc. must be mandated to be fully programmable to diminish their power and permit innovation to take place on top of the capabilities they have created.
In the long run, privacy is incompatible with technological progress. Providing strong privacy assurances can only be accomplished via controlled computation. Due to entropy, innovation will always grow our ability to destroy faster than our ability to build.
We must put more effort into protecting individuals from what can happen to them if their data winds up leaked, rather than trying to protect the data at the expense of innovation and transparency.
Our brains evolved in an environment where seeing a cat meant there was a cat. Now the internet can show us an infinity of cats. We can thus be forever distracted.
It is easier for us to form snap judgments and have quick emotional reactions than to engage our critical thinking facilities.
Our attention is readily hijacked by systems designed to exploit these evolutionarily engrained features of our brains.
We can use mindfulness practices, such as conscious breathing or meditation to take back and maintain control of our attention.
As we increase economic, informational and psychological freedom, we also require values that guide our actions and the allocation of our attention.
We should embrace a renewed humanism as the source of our values.
There is an objective basis for humanism. Only humans have developed knowledge in the form of books and works of art that transcend both time and space.
Knowledge is the source of humanity’s great power. And with great power comes great responsibility.
Humans need to support each other in solidarity, irrespective of such differences as gender, race or nationality.
We are all unique, and we should celebrate these differences. They are beautiful and an integral part of our humanity.
Because only humans have the power of knowledge, we are responsible for other species. For example, we are responsible for whales, rather than the other way round.
When we see something that could be improved, we need to have the ability to express that. Individuals, companies and societies that do not allow criticism become stagnant and will ultimately fail.
Beyond criticism, the major mode for improvement is to create new ideas, products and art. Without ongoing innovation, systems become stagnant and start to decay.
We need to believe that problems can be solved, that progress can be achieved. Without optimism we will stop trying, and problems like the climate crisis will go unsolved, threatening human extinction.
If we succeed with the transition to the Knowledge Age, we can tackle extraordinary opportunities ahead for humanity, such as restoring wildlife habitats here on earth and exploring space.
We can and should each contribute to leaving the Industrial Age behind and bringing about the Knowledge Age.
We start by developing our own mindfulness practice and helping others do so.
We tackle the climate crisis through activism demanding government regulation, through research into new solutions, and through entrepreneurship deploying working technologies.
We defend democracy from attempts to push towards authoritarian forms of government.
We foster decentralization through supporting localism, building up mutual aid, participating in decentralized systems (crypto and otherwise).
We promote humanism and live in accordance with humanist values.
We recognize that we are on the threshold of both transhumans (augmented humans) and neohumans (robots and artificial intelligences).
We continue on our epic human journey while marveling at (and worrying about) our aloneness in the universe.
We act boldly and with urgency, because humanity’s future depends on a successful transition to the Knowledge Age.
Tumblr media
1K notes · View notes
mannlibrary · 2 years
Text
Seed Saver
Tumblr media
Wild Apples of Middle Asia, produced by J.S. Lawson from field drawings made by Nikolai Vavilov (1 of 6 plates donated to Cornell professor of horticulture  Richard Wellington by Nikolai Vavilov during the 6th International Genetics Conference held in Geneva, N.Y. in 1932; For an online look at the complete illustration set, please visit the Biodiversity Heritage Library at bit.ly/vavilov-apples). 
The rare and distinctive collections vault at Mann Library houses hand-colored prints of wild apples and pears that tell an extraordinary story. The wild fruit specimens depicted were collected in Central Asia and the Caucasus by the Russian plant scientist Nikolai Vavilov (1887-1943) and his team of botanist colleagues over the course of extensive plant-finding expeditions during the early decades of the 1900s. These illustrations provide a full-color glimpse of the intrepid work undertaken by a pioneering life scientist to advance food security in Russia and the world beyond. Sadly, Nikolai Vavilov’s brilliant career was cut brutally short. January 26th marks the anniversary of Vavilov’s death at fifty-five in a Soviet prison in Saratov, Russia. We post this piece today in profound esteem for his inspiring legacy. 
Tumblr media
During his all-too-brief life, Nikolai Vavilov advanced the knowledge of genetics and plant science in innumerable ways—in both lab and field.  His work on genetic homology led to the formulation of scientific law, while his expeditions to remote parts of the world both founded the largest seed bank of his time and traced the origins of numerous crop species.
Tumblr media
Original Russian edition of N.I.Vavilov’s Studies on the Origin of Cultivated Plants, Leningrad, 1926, donated to Cornell University by the author. The work was widely acclaimed by the world’s plant scientists for establishing the geographical origins of major food crops. 
Born in 1887 just west of Moscow, Vavilov became an international star of Soviet science in the 1920s. He traveled widely, not just to collect specimens but to collaborate and share ideas with fellow scientists around the world. In 1932 Vavilov came to Ithaca to participate in the 6th International Congress on Genetics, hosted at Cornell University.
Tumblr media Tumblr media
The Sixth International Congress on Genetics, Cornell University, 1932; excerpt shows Nikolai Vavilov among his international colleagues.
He wasn't here simply to lecture, though; afterwards, writing for a Soviet audience, he made an effort to bring what he had learned in Ithaca back to the USSR. An excellent example is what Vavilov wrote about Dr. Barbara McClintock, who was researching maize genetics at Cornell:
"For our understanding of chromosomes, the star exhibit was the work of the young assistant to Cornell University, Miss McClintock. She displayed her remarkable preparations of corn chromosomes, allowing one to see the chromosomes' internal structure. This new technique made it possible to visually distinguish chromosomes, and to identify discrete chromosomal regions. Her astute understanding of the genetic background of corn could be seen by looking at the maps she created. These maps charted the distribution of genes over cytological scans, making it possible to connect the internal structure of the chromosome with the external phenomena of the genes. Miss McClintock was able to capture the conjugation of non-homologous chromosomes, and map the chromosomal sectioning of corn. This discovery is of huge significance, as all modern conceptions of genetics are based on the assumption that only homologous chromosomes conjugate."
(Written by N. I. Vavilov for the All-Soviet Academy of Agricultural and Rural Sciences in the name of Lenin: Institute of Applied Botany of USSR, available online in Russian at the All-Russian Research Institute of Plant Genetics. Translated to English by Anya Osatuke)
Tumblr media
Unfortunately, the 1932 Genetics Congress in Ithaca, N.Y. was the last the world outside of Russia would see of Nikolai Vavilov. By the 1930s, Vavilov’s line of research in plant genetics had fallen deeply out of favor with the Stalin regime, which sought to purge Soviet scientific institutes of any scholarship that conflicted with the principles of Lysenkoism, a doctrine of the Soviet era (since debunked) that rejected Mendelian genetics and asserted an inexorable passing along of environmentally influenced traits from all organisms to their offspring. Increasingly pressured by the Stalin government and hedged out by ideological pseudo-science, Vavilov was barred from travel until his arrest in 1940. Three years later, he perished in prison, likely of starvation. 
By the early 1950s, the political tides in the Soviet Union were turning. Soviet science under the regime of Nikita Khrushchev slowly abandoned Lysenkoism. In 1987, a celebration marked the official rehabilitation of Vavilov and his scientific contributions in the eyes of the Soviet government. A postage stamp was issued in Vavilov’s honor, and a collection of his papers on food crop origins was published, with an English translation produced five years later and reissued in 1997. Today, Russia’s premier institute for genetics research bears Vavilov’s name: the Vavilov Institute of General Genetics in Moscow. And the institute that Vavilov founded over a century ago, the Institute for Plant Genetic Resources in St. Petersburg, which keeps a collection of seeds of thousands of food plant varieties and has an extraordinary story of its own to tell, continues to be a world-renowned center for the preservation of agrobiodiversity. 
Tumblr media
The important story of Nikolai Vavilov is discussed in more detail in the exhibit Cultivating Silence: Nikolai Vavilov and the Suppression of Science in the Modern Era, which opened in the Mann Library lobby in October 2021 and can now also be viewed at exhibits.library.cornell.edu/cultivating-silence.
21 notes · View notes
the-river-person · 3 years
Text
Secrets of the Deltarune
Tumblr media
Okay, so I was taking a closer look at the Deltarune and I started to notice some really weird things. It’s a symbol for the Kingdom of Monsters, right? Wrong. Gerson tells us “That's the Delta Rune, the emblem of our kingdom. The Kingdom... Of Monsters.” Okay, so it's the same thing, right? Nope. I looked up emblem and its distinction from symbol. A symbol represents an idea, a process, or a physical entity, while an emblem is often an abstraction that represents a concept like a moral truth or an allegory. And when it is used for a person, it is usually a king, a saint, or even a deity. An emblem crystallizes in concrete, visual terms some abstraction: a deity, a tribe or nation, or a virtue or vice, and can be worn as an identifier, say as a patch on clothing or armor, or carried on a flag or banner or shield. So what does it matter? Well, Gerson even tells us why. “That emblem actually predates written history. The original meaning has been lost to time...” Hold up. Predates written history? The beginning of written history is approximately 5,500 years ago, somewhere around 3400 B.C.E. That's a long time. And the prophecy that goes with the symbol talks about the Underground going empty, so it can only really be as old as The War Between Humans and Monsters. But... when was that? The game doesn't tell us the exact dates. Well, we have a couple of clues. At the beginning of the game we have a little cut-scene of the war and then a bit where we see a human going up the mountain only to fall down into the Underground. Most players assume that this is you, beginning your adventure. Except it's not. Later in the game, when you SAVE Asriel in the True Pacifist Route, we're shown another cutscene with the exact same human figure in EXACTLY the same position, being helped by a very young Asriel and the silhouette of Toriel. It's Chara, not Frisk. So our date of 201X (2010-2019) takes place long before Frisk even arrives. We don't know how long before. 
That really doesn't help with when they were first thrown down there, though. So I took a look at the images before that, of the war. The first image shows a human who is very different from the later pictures. Both the make of the spear and the animal-hide-like clothing suggest that it's probably stone age. The text gives us a very general “Long ago” when describing how both races ruled the earth together. In the next two images we're shown the actual war. The crowd of humans has various things like torches and spears. Those diamond-type spears are very similar to Roman pilums. The human figure with a sword was interesting, though. He bears a mantle (cape or cloak) and sports a sword. Though there's not much detail, we can still identify the general time period of the sword. The size isn't big enough for a proper claymore or longsword, or even a hand-and-a-half sword. Since our figure appears to be moving forward, and we can guess that it's not in a friendly manner given the context, yet is still holding the sword in one hand instead of two, it's probably a one-handed broadsword. It also has a cruciform (cross-shaped) hilt that is slightly curved. The blade is quite wide with what appear to be straight edges (based on two images with limited detail). And it has a very narrow ricasso, an unsharpened length of blade just above the guard or handle. Ricassos were used all throughout history, but they're pretty notable for the Early Medieval Period in Europe. And the rest of the sword (blade type, length, crossguard, and method of use) is very reminiscent of a medieval knightly arming sword, the prominent type of sword in that period from the 10th to 13th centuries. So I had to take a closer look at my spears. Turns out, they actually more closely resemble a medieval cavalry lance or javelin. And many javelins have their root in the style of the Roman pilums, including the sometimes diamond-shaped tips. 
The sword and mantle of the figure suggest heavily that he's a knight, and backed up by the spear carriers we can guess that it's the Early Medieval Period, possibly the beginning of the Romanesque Period. So that would place the war roughly a thousand and at least ten years before Chara fell into the Underground in 201X. Asgore was certainly alive back then. In the Genocide Route Gerson says “Long ago, ASGORE and I agreed that escaping would be pointless... Since once we left, humans would just kill us.” and in the Post-Pacifist Route, when you go back to talk to everyone, he'll say this when talking about Undyne: “I used to be a hero myself, back in the old days. Gerson, the Hammer of Justice.” He even talks about how Undyne would follow him around when he was beating up bad guys and try to help, by enthusiastically attacking people at random, such as the mailman. This tells us that Gerson and Asgore are as old as the original war and both had been part of the battle. And both lived long enough to survive till now. Gerson is quite old-looking, while Asgore is not. He explains this by saying that Boss Monsters don't age unless they have children, and then they age as their children grow; otherwise they'll be the same age forever. But Undyne doesn't appear to be old. And I started to wonder how long normal monsters lived in comparison to Boss Monsters. A long time, for sure. From the Undertale 5th Anniversary Alarm Clock Dialogue we can learn that Asgore once knew a character called Rudy (who also appears in the Deltarune game), whom he met at Hotland University and who appeared to be generally the same age as Asgore. Since it takes place in Hotland, we know that it was already after they were underground, that Asgore was King and was already doing his Santa Claus thing, and that Asgore was trying to find ways to occupy his time aside from actually ruling. In the dialogue he tells us that Rudy began to look older than him. “I was there for it all. His Youth, his Marriage, his Fatherhood. 
Then, suddenly, one day... he fell down. ... Rudy... I... was never able to show you the sun.” Monsters can live a long time. But Boss Monsters, as long as they don't have a child, can live nearly forever as long as they aren't killed. Based on that, Undyne is probably quite young, and Gerson is incredibly old even for a Monster, and yet only recently has he stopped charging around fighting bad guys. Since Undyne was with him, those bad guys were in the Underground, and his distinction of her attacking not-so-bad folk like the mailman means that he was probably in an official capacity to fight crime, such as a guardsman, or maybe captain of the Royal Guard. So. Even though there's plenty of time for a prophecy to spring up naturally, we have a number of Monsters who have actually lived that long and who would be more than happy to correct mistakes and assumptions. Gerson is quite elderly and is a tad forgetful, but he still knows much. Characters such as Toriel and Asgore are still hale and hearty, and both have witnessed so much. Though we know very little about the character, Elder Puzzler is also implied to be quite aged and knows a great deal about the “Puzzling Roots” of Monster History. You're probably wondering what all of this is leading to. Well, with these characters in place to maintain knowledge of history in the populace, we have an Underground which created a prophecy AFTER it was trapped there, which leads me to conclude that when the prophecy was created, it must have been referencing something older than the War of Monsters and Humans.
“The original meaning has been lost to time... All we know is that the triangles symbolize us monsters below, and the winged circle above symbolizes... Somethin' else. Most people say it's the 'angel,' from the prophecy...” ‘Angel’. This is when we hear about the angel. We see the Deltarune on Toriel's clothing and on the Ruins door, as well as behind Gerson himself. The thing he mentions clearly has wings of some kind, surrounding a ball (note to self: look into a possible connection between the mythical ball artifact from the piano room and the Deltarune emblem). It looks a little like the fairy from the Zelda series. Those “triangles” are the Greek letter Delta. That letter has a lot of connections and meanings to it. A river delta is shaped like the letter, which is how it got its name. There are a number of math and science connections. But the two connections you'd be interested in are these. First, a Delta chord is another name for a major seventh chord in music; the soundtrack of Undertale uses these chords to do fantastic things with the tone and aesthetic of its leitmotifs, changing them from a happy or hopeful tune to a dark and despairing one without actually changing the melody. Second, in a subfield of set theory, a branch of mathematics and philosophical logic, it is used to calculate and examine the conditions under which one or the other player of a game has a winning strategy, and the consequences of the existence of such strategies. The games studied in set theory are usually Gale–Stewart games: two-player games of perfect information (each player, when making any decision, is perfectly informed of all the events that have previously occurred, including the "initialization event" of the game, e.g. the starting hands of each player in a card game) in which the players make an infinite sequence of moves and there are no draws. But why is one of them turned upside down? I started looking things up again. Turns out there is such a symbol. 
The Nabla symbol is the Greek letter Delta inverted so that it appears upside down. Its name comes from the Phoenician harp shape, though it's also called the “Del”. A musical connection is exactly what Toby would do. But its main use is in mathematics, where it is a notation representing three different operators which make equations infinitely easier to write. These equations are all concerned with what is called physical mathematics. That is... mathematics that calculates and has to do with measuring the physical world. Why is that relevant? Well, the difference between humans and monsters is that humans have physical bodies while monsters are made primarily of magic. I also discovered that the Delta symbol for the ancient Greeks was sometimes used as an abbreviation for the word δύση, which meant the west in the compass points. West, westerly, sunset, twilight, nightfall, dusk, darkness, decline, end of a day. All this symbolism for a couple of triangles. There are entire books devoted to them. And he calls the whole symbol, deltas and angel alike, the Delta RUNE. What's a rune? Well, a rune is a letter, but specifically a letter from the writing of one of the Germanic languages before the adoption of the Latin alphabet. Interestingly... the Greek letter Delta does NOT qualify as a rune. In any stretch of the word. I searched for hours. What I DID find was the etymological origin of the word Rune. It comes from a Proto-Germanic word “rūnō”, which means something along the lines of “whisper, mystery, secret, secret conversation, letter”. Interesting. So since it's paired up with the Delta... it could be taken to mean “The Secret of the Delta” or “The Delta's Secret”. If we make a few assumptions we might even get something like “The Secret of the West” or “The Mystery of the Twilight” or numerous other variations that have different connotations. It's conjecture, certainly, and possibly a few stretches. 
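As a brief aside on those three operators: they are the gradient, divergence, and curl of standard vector calculus (supplied here for context; the post itself doesn't name them). Applied to a scalar field f or a vector field F:

```latex
\nabla f = \left( \frac{\partial f}{\partial x},\ \frac{\partial f}{\partial y},\ \frac{\partial f}{\partial z} \right)
\qquad
\nabla \cdot \mathbf{F} = \frac{\partial F_x}{\partial x} + \frac{\partial F_y}{\partial y} + \frac{\partial F_z}{\partial z}
\qquad
\nabla \times \mathbf{F} = \left( \frac{\partial F_z}{\partial y} - \frac{\partial F_y}{\partial z},\ \frac{\partial F_x}{\partial z} - \frac{\partial F_z}{\partial x},\ \frac{\partial F_y}{\partial x} - \frac{\partial F_x}{\partial y} \right)
```

All three measure how a physical quantity varies through space, which is why the nabla shows up everywhere in "physical mathematics."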
But it is certainly there to think about. My thoughts centered around the positioning of the letters: the idea that the one facing up represents Humanity, and the two ordinary Deltas are Monsters, with the Angel above them all. Or rather, SOMETHING above them all. We have no proof that the idea of an Angel existed before the Underground's prophecy. I like to think it did, because usually that sort of thing draws on previously existing beliefs and ideas. For all we know, the symbol could represent an abstract idea that governed both monsters and humans, like “Kill or be killed” or “Do unto others as you would have others do unto you” or other basic idiomatic ideologies of that sort. Other than the realization that the Deltarune is older than the prophecy and the Underground, I didn't come away with a concrete idea of what the emblem actually means. Just a lot of theories and connective ideas. But there's certainly a lot to be found. I don't really know how much thought Toby actually put into this, but he's quite well known for secrets within secrets. So it's possible he knew all this going in. If he's anything like me, and I am notorious for writing this sort of twisting references within references within references into my stories, then he's probably at least aware of an existing connection. It's quite probable that the Deltarune is exactly what Gerson tells us: an emblematic set of symbols that has been used to represent the continuing Kingdom of Monsters since before written history. But as he says... it's so old that it might have had a different meaning originally, whatever idea the Monsters wanted to remember, wanted to uphold enough to use it for their royal family and their kingdom, a reminder. Of something, or someone.
76 notes · View notes
arcticdementor · 3 years
Link
Imagine that the US was competing in a space race with some third world country, say Zambia, for whatever reason. Americans of course would have orders of magnitude more money to throw at the problem, and the most respected aerospace engineers in the world, with degrees from the best universities and publications in the top journals. Zambia would have none of this. What should our reaction be if, after a decade, Zambia had made more progress?
Obviously, it would call into question the entire field of aerospace engineering. What good were all those Google Scholar pages filled with thousands of citations, all the knowledge gained from our labs and universities, if Western science gets outcompeted by the third world?
For all that has been said about Afghanistan, no one has noticed that this is precisely what just happened to political science. The American-led coalition had countless experts with backgrounds pertaining to every part of the mission on their side: people who had done their dissertations on topics like state building, terrorism, military-civilian relations, and gender in the military. General David Petraeus, who helped sell Obama on the troop surge that made everything in Afghanistan worse, earned a PhD from Princeton and was supposedly an expert in “counterinsurgency theory.” Ashraf Ghani, the just deposed president of the country, has a PhD in anthropology from Columbia and is the co-author of a book literally called Fixing Failed States. This was his territory. It’s as if Wernher von Braun had been given all the resources in the world to run a space program and had been beaten to the moon by an African witch doctor.
Phil Tetlock’s work on experts is one of those things that gets a lot of attention, but still manages to be underrated. In his 2005 Expert Political Judgment: How Good Is It? How Can We Know?, he found that the forecasting abilities of subject-matter experts were no better than educated laymen when it came to predicting geopolitical events and economic outcomes. As Bryan Caplan points out, we shouldn’t exaggerate the results here and provide too much fodder for populists; the questions asked were chosen for their difficulty, and the experts were being compared to laymen who nonetheless had met some threshold of education and competence.
At the same time, we shouldn’t put too little emphasis on the results either. They show that “expertise” as we understand it is largely fake. Should you listen to epidemiologists or economists when it comes to COVID-19? Conventional wisdom says “trust the experts.” The lesson of Tetlock (and the Afghanistan War) is that while you certainly shouldn’t be getting all your information from your uncle’s Facebook Wall, there is no reason to start with a strong prior that people with medical degrees know more than any intelligent person who honestly looks at the available data.
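Mechanically, a Tetlock-style comparison comes down to scoring probabilistic forecasts against what actually happened; the standard metric is the Brier score. A minimal sketch (the probabilities below are invented for illustration and are not Tetlock's data):

```python
def brier_score(forecasts, outcomes):
    """Mean squared gap between forecast probabilities and 0/1 outcomes.
    Lower is better; always guessing 50/50 scores 0.25."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical probabilities each group assigned to five events,
# and whether each event actually happened (1) or not (0).
expert = [0.9, 0.7, 0.2, 0.6, 0.3]
layman = [0.8, 0.6, 0.4, 0.5, 0.4]
happened = [1, 0, 0, 1, 0]

print(brier_score(expert, happened))  # experts' average score (lower = better)
print(brier_score(layman, happened))  # laymen's average score
```

Tetlock's finding, in these terms, was that the two groups' averages across thousands of such questions came out roughly equal.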
I think one of the most interesting articles of the COVID era was a piece called “Beware of Facts Man” by Annie Lowrey, published in The Atlantic.
The reaction to this piece was something along the lines of “ha ha, look at this liberal who hates facts.” But there’s a serious argument under the snark, and it’s that you should trust credentials over Facts Man and his amateurish takes. In recent days, a 2019 paper on “Epistemic Trespassing” has been making the rounds on Twitter. The theory that specialization is important is not on its face absurd, and probably strikes most people as natural. In the hard sciences and other places where social desirability bias and partisanship have less of a role to play, it’s probably a safe assumption. In fact, academia is in many ways premised on the idea, as we have experts in “labor economics,” “state capacity,” “epidemiology,” etc. instead of just having a world where we select the smartest people and tell them to work on the most important questions.
But what Tetlock did was test this hypothesis directly in the social sciences, and he found that subject-matter experts and Facts Man basically tied.
Interestingly, one of the best defenses of “Facts Man” during the COVID era was written by Annie Lowrey’s husband, Ezra Klein. His April 2021 piece in The New York Times showed how economist Alex Tabarrok had consistently disagreed with the medical establishment throughout the pandemic, and was always right. You have the “Credentials vs. Facts Man” debate within one elite media couple. If this was a movie they would’ve switched the genders, but since this is real life, stereotypes are confirmed and the husband and wife take the positions you would expect.
In the end, I don’t think my dissertation contributed much to human knowledge, making it no different than the vast majority of dissertations that have been written throughout history. The main reason is that most of the time public opinion doesn’t really matter in foreign policy. People generally aren’t paying attention, and the vast majority of decisions are made out of public sight. How many Americans know or care that North Macedonia and Montenegro joined NATO in the last few years? Most of the time, elites do what they want, influenced by their own ideological commitments and powerful lobby groups. In times of crisis, when people do pay attention, they can be manipulated pretty easily by the media or other partisan sources.
If public opinion doesn’t matter in foreign policy, why is there so much study of public opinion and foreign policy? There’s a saying in academia that “instead of measuring what we value, we value what we can measure.” It’s easy to do public opinion polls and survey experiments, as you can derive a hypothesis, get an answer, and make it look sciency in charts and graphs. To show that your results have relevance to the real world, you cite some papers that supposedly find that public opinion matters, maybe including one based on a regression showing that under very specific conditions foreign policy determined the results of an election, and maybe it’s well done and maybe not, but again, as long as you put the words together and the citations in the right format nobody has time to check any of this. The people conducting peer review on your work will be those who have already decided to study the topic, so you couldn’t find a more biased referee if you tried.
Thus, to be an IR scholar, the two main options are you can either use statistical methods that don’t work, or actually find answers to questions, but those questions are so narrow that they have no real world impact or relevance. A smaller portion of academics in the field just produce postmodern-generator style garbage, hence “feminist theories of IR.” You can also build game theoretic models that, like the statistical work in the field, are based on a thousand assumptions that are probably false and no one will ever check. The older tradition of Kennan and Mearsheimer is better and more accessible than what has come lately, but the field is moving away from that and, like a lot of things, towards scientism and identity politics.
At some point, I decided that if I wanted to study and understand important questions, and do so in a way that was accessible to others, I’d have a better chance outside of the academy. Sometimes people thinking about an academic career reach out to me, and ask for advice. For people who want to go into the social sciences, I always tell them not to do it. If you have something to say, take it to Substack, or CSPI, or whatever. If it’s actually important and interesting enough to get anyone’s attention, you’ll be able to find funding.
If you think your topic of interest is too esoteric to find an audience, know that my friend Razib Khan, who writes about the Mongol empire, Y-chromosomes and haplotypes and such, makes a living doing this. If you want to be an experimental physicist, this advice probably doesn’t apply, and you need lab mates, major funding sources, etc. If you just want to collect and analyze data in a way that can be done without institutional support, run away from the university system.
The main problem with academia is not just the political bias, although that’s another reason to do something else with your life. It’s the entire concept of specialization, which holds that you need some secret tools or methods to understand what we call “political science” or “sociology,” and that these fields have boundaries between them that should be respected in the first place. Quantitative methods are helpful and can be applied widely, but in learning stats there are steep diminishing returns.
Outside of political science, are there other fields that have their own equivalents of “African witch doctor beats von Braun to the moon” or “the Taliban beats the State Department and the Pentagon” facts to explain? Yes, and here are just a few examples.
Consider criminology. More people are studying how to keep us safe from other humans than at any other point in history. But here’s the US murder rate between 1960 and 2018, not including the large uptick since then.
Tumblr media
So basically, after a rough couple of decades, we’re back to where we were in 1960. But we’re actually much worse, because improvements in medical technology are keeping alive a lot of people who would’ve died 60 years ago. One paper from 2002 says that the murder rate would be 5 times higher if not for medical developments since 1960. I don’t know how much to trust this, but it’s surely true that we’ve made some medical progress since that time, and doctors have been getting a lot of experience from all the shooting victims they have treated over the decades. Moreover, we’re much richer than we were in 1960, and I’m sure spending on public safety has increased. With all that, we are now about tied with where we were six decades ago, a massive failure.
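To make the "actually much worse" point concrete, here is the arithmetic implied by that 2002 paper's estimate. The rate figures are rough round numbers per 100,000, supplied for illustration rather than taken from the post:

```python
# Approximate US homicide rates per 100,000 (illustrative round numbers).
rate_1960 = 5.1
rate_2018 = 5.0
medical_factor = 5  # the 2002 paper's estimate: the rate would be ~5x higher
                    # without post-1960 advances in trauma medicine

# What the recent rate would look like with 1960-era medicine:
counterfactual_2018 = rate_2018 * medical_factor
print(counterfactual_2018)              # counterfactual rate per 100,000
print(counterfactual_2018 / rate_1960)  # roughly how many times the 1960 rate
```

Under these assumptions, "tying" 1960 on paper means running roughly five times its underlying level of lethal violence.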
What about psychology? As of 2016, there were 106,000 licensed psychologists in the US. I wish I could find data to compare to previous eras, but I don’t think anyone will argue against the idea that we have more mental health professionals and research psychologists than ever before. Are we getting mentally healthier? Here’s suicides in the US from 1981 to 2016
What about education? I’ll just defer to Freddie deBoer’s recent post on the topic, and Scott Alexander on how absurd the whole thing is.
Maybe there have been larger cultural and economic forces that it would be unfair to blame criminology, psychology, and education for. Despite no evidence that we're getting better at fighting crime, curing mental problems, or educating children, maybe other things have happened that have outweighed our gains in knowledge. Perhaps the experts are holding up the world on their shoulders, and if we hadn't produced so many specialists over the years, thrown so much money at them, and gotten them to produce so many peer-reviewed papers, we'd see Middle Ages levels of violence all across the country and no longer even be able to teach children to read. Like an Ayn Rand novel, if you just replaced the business tycoons with those whose work has withstood peer review.
Or you can just assume that expertise in these fields is fake. Even if there are some people doing good work, either they are outnumbered by those adding nothing or even subtracting from what we know, or our newly gained understanding is not being translated into better policies. Considering the extent to which government relies on experts, if the experts with power are doing things that are not defensible given the consensus in their fields, the larger community should make this known and shun those who are getting the policy questions so wrong. As in the case of the Afghanistan War, this has not happened, and those who fail in the policy world are still well regarded in their larger intellectual community.
Those opposed to cancel culture have taken up the mantle of “intellectual diversity” as a heuristic, but there’s nothing valuable about the concept itself. When I look at the people I’ve come to trust, they are diverse on some measures, but extremely homogenous on others. IQ and sensitivity to cost-benefit considerations seem to me to be unambiguous goods in figuring out what is true or what should be done in a policy area. You don’t add much to your understanding of the world by finding those with low IQs who can’t do cost-benefit analysis and adding them to the conversation.
One of the clearest examples of bias in academia and how intellectual diversity can make the conversation better is the work of Lee Jussim on stereotypes. Basically, a bunch of liberal academics went around saying “Conservatives believe in differences between groups, isn’t that terrible!” Lee Jussim, as someone who is relatively moderate, came along and said “Hey, let’s check to see whether they’re true!” This story is now used to make the case for intellectual diversity in the social sciences.
Yet it seems to me that isn’t the real lesson here. Imagine if, instead of Jussim coming forward and asking whether stereotypes are accurate, Osama bin Laden had decided to become a psychologist. He’d say “The problem with your research on stereotypes is that you do not praise Allah the all merciful at the beginning of all your papers.” If you added more feminist voices, they’d say something like “This research is problematic because it’s all done by men.” Neither of these perspectives contributes all that much. You’ve made the conversation more diverse, but dumber. The problem with psychology was a very specific one, in that liberals are particularly bad at recognizing obvious facts about race and sex. So yes, in that case the field could use more conservatives, not “more intellectual diversity,” which could just as easily make the field worse as make it better. And just because political psychology could use more conservative representation when discussing stereotypes doesn’t mean those on the right always add to the discussion rather than subtract from it. As many religious Republicans oppose the idea of evolution, we don’t need the “conservative” position to come and help add a new perspective to biology.
The upshot is intellectual diversity is a red herring, usually a thinly-veiled plea for more conservatives. Nobody is arguing for more Islamists, Nazis, or flat earthers in academia, and for good reason. People should just be honest about the ways in which liberals are wrong and leave it at that.
The failure in Afghanistan was mind-boggling. Perhaps never in the history of warfare had there been such a resource disparity between two sides, and the US-backed government couldn’t even last through the end of the American withdrawal. One can choose to understand this failure through a broad or narrow lens. Does it only tell us something about one particular war or is it a larger indictment of American foreign policy?
The main argument of this essay is that we're not thinking big enough. The American loss should be seen as a complete discrediting of the academic understanding of "expertise," with its reliance on narrowly focused peer-reviewed publications and subject matter knowledge as the way to understand the world. Although I don't develop the argument here, I think I could make the case that expertise isn't just fake, it actually makes you worse off because it gives you a higher level of certainty in your own wishful thinking. The Taliban probably did better by focusing their intellectual energies on interpreting the Holy Quran and taking a pragmatic approach to how they fought the war rather than proceeding with a prepackaged theory of how to engage in nation building, which for the West conveniently involved importing its own institutions.
A discussion of the practical implications of all this, or how we move from a world of specialization to one with better elites, is also for another day. For now, I’ll just emphasize that for those thinking of choosing an academic career to make universities or the peer review system function better, my advice is don’t. The conversation is much more interesting, meaningful, and oriented towards finding truth here on the outside.
11 notes · View notes
gregorsarnsa · 2 years
Text
ok so well it started from a chem lecture in class 11th i think? yeah so there's this thing called Heisenberg's uncertainty principle. it basically states that if you're observing an electron and u know its velocity in space, then u cannot know its position in space. and if u know the position, the exact x, y and z coordinates of the electron at a time, then u can never know the exact velocity of the electron.
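the trade-off actually has a hard number attached (Δx·Δp ≥ ħ/2). here's a rough python sketch of it; the 0.1 nm position uncertainty is just a number i'm assuming for illustration, roughly the width of an atom:

```python
# heisenberg's uncertainty principle: delta_x * delta_p >= hbar / 2
# pin down an electron's position and see how fuzzy its velocity has to get.

hbar = 1.054571817e-34         # reduced planck constant, J*s
m_electron = 9.1093837015e-31  # electron mass, kg

delta_x = 1e-10  # assumed position uncertainty: 0.1 nm, about an atom's width

min_delta_p = hbar / (2 * delta_x)      # smallest allowed momentum uncertainty
min_delta_v = min_delta_p / m_electron  # corresponding velocity uncertainty

print(f"momentum uncertainty >= {min_delta_p:.2e} kg*m/s")
print(f"velocity uncertainty >= {min_delta_v:.2e} m/s")  # hundreds of km/s
```

so the better u pin down where the electron is, the wilder the spread in how fast it's going gets, which is exactly the "no you cannot know this for sure" baked right into the theorem.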
anyway so just the fact that. science. a subject based on calculations and precision and maths is so fucking vague that THEOREMS have to state that "no you cannot know this for sure" just got me thinking. like i considered 2-3 examples and like arts is vague? but so is science? because your version of truth in science is defined by what you observe, hear and feel, and whether it matches what other people and the majority of humans in general observe. but then again who said that what humans observe is in fact the absolute truth? like humans as always tend to be overly selfish and they consider themselves and their own observations to be the truth? but that's just stupid bc inherently there are no facts and there are only assumptions. any sentence and all theorems are based on assumptions. like even 1+1 = 2 is based on the assumption that the world works on a mathematical scale governed by humans and that just coincides with almost all of human knowledge over the years. like everything is an assumption that is believed by a group of people and that just makes it? the truth? not convinced here but ok and well there are groups that believe that humans have to be in such a mindset to progress?? but then that's ignoring the fact that humanity is random. the universe, for all we know, is a fluke and it has just one law, and that's randomness. the universe pushes itself to randomness, not chaos or discipline and it's just wild. that there is a tremendous amount of randomness that has led to everything being the way it is this moment in time [m considering time to be linear rn and us as matter based individuals who wake up everyday with no guide as to how they want to spend their day/knowledge of how they're going to spend their day] and that's just wild?
bc one moment after this, we might just get sucked into a black hole and we'd never even k n o w like the principle of randomness is what amazes me sometimes and it's all due to randomness, you going "oooooooooooh" over a cluster of nebulae, hormones finding it pretty etc etc etc and also. life. 
i remember this one paragraph in class 10th bio that defined what protoplasm is and how we exist and think and all. it just said that we are the same carbon, oxygen and nitrogen as the soil but we are in flux? and im just imagining the amount of randomness that had to be in use for this special mix of protoplasmic cell to be created years ago and then, more so, the amount of randomness that had to take place EXACTLY as it was for us to be us today. and this. is when i do not consider us as masters of time. we're still considering time to be a governing factor and not a governable factor from the viewpoint of humans imagine if. time was n o t linear. imagine if time was random speeding up and slowing down and still and still and a millennium in the blink of an eye we would, theoretically, be able to control randomness if we surpass time or light velocity but again. sigh. we are matter based organisms that unfortunately cannot work think or observe if our matter gets annihilated by einstein’s theory and that's kind of the basis for god in almost all religions? like people could never explain randomness and why it is how it is and they just started calling this randomness as what people call the Creator or God.
love my brain bc i have an exam in 5 days and brain's just like. "ok. time to Discover The Secrets Of The Universe"
2 notes · View notes
anguyen06 · 3 years
Text
WGST Blog Post #6 (due 09/30)
What are the ways the presence of sex (female and male) creates problems in the gaming world?
Most of the time, in games, there is an objective and the satisfaction of winning. Skill and knowledge of how the game operates are normally how players achieve the objective or win the game. It did not matter whether you were female or male. However, in some games, whenever the presence of sex is introduced, it is used to explain why a certain player is skilled or unskilled. For instance, when a player loses the round or plays out a bad strategy, the reaction differs depending on whether the player is male or female. When the player is male, these types of plays are brushed off. When the player is female, the player receives backlash and harsh assessment for their actions. It is similar to the situation of Pokémon Go, discussed in the Fickle reading. It started off as an adventure game that any person could participate in. As players went to more extreme lengths to catch Pokémon, one's race started to become a problem (Fickle, 2019). In both scenarios, women and people of color had more difficulty playing or advancing in the game without belittlement or verbal abuse. It goes to show the difference between how one group is treated when playing and how another group is.
Why is it harmful to categorize people by physical appearance/traits?
One of the ways people distinguish individuals is through physical traits shared within a community. In some cases, this can be useful, such as when trying to apprehend a burglar. However, categorizing people by physical appearance can be harmful when it places people in a narrow box of how people of different races or ethnicities should look. In the "Race in Cyberspace" reading, Kolko brought up the point that "...the systems of racial categorization that permeate our world are derived from culture, not nature" (Kolko, 2000). What makes up a person's ethnicity is not just physical appearance but also history and heritage. It is harmful to categorize people based on physical appearance because if one does not have some of those physical characteristics, does it mean that the individual is not that ethnicity? This could lead to concerns like identity crises, insecurities about oneself, etc. The reverse also holds: if someone meets all the criteria, should people automatically assume that person is of that ethnicity? With assumptions made, people may criticize someone for cultural appropriation, unaware of the fact that that person is of that ethnicity. For example, if an individual who does not have stereotypically East Asian features posts a picture with chopsticks in their hair, they may receive hate online for cultural appropriation. That is why it is important not to categorize people into certain race/ethnicity groups based on appearances, in order to avoid generalization and confusion.
What are the consequences of letting people get away with racist comments/(borderline) discriminatory actions?
In the gaming community, there are many ways that people express anger and frustration; one of them is by yelling or typing out racial slurs. Even if it is not directed towards one person or meant to harm another individual, it still has major effects. When it becomes intentional, one effect is that the game loses its appeal to certain gamers. People may stop playing the game due to the major toll on their mental health. Some people may not appreciate the overuse of generalization and generic stereotyping, and may have to end their gaming time early. These are short-term effects of allowing people to make these types of comments. The long-term effects come from ignoring the severity of these actions. In "The Revenge of the Yellowfaced Cyborg Terminator" reading, Ow mentions how the "yellow-faced cyborg" was a mixture of Asian cultures. When criticized as cultural appropriation, it was played off as a fun concept of mixing Asian cultures (Ow, 2003). Downplaying the problem is what allows racism to continue to exist. By not speaking out on this or by ignoring the issue, people are led to believe that it is okay to say racial slurs, because there are no consequences.
Are there ways to combat online racism?
Other than speaking up against racism, whether through posts or by commenting under racist posts, there are common ways people are penalized for engaging in racist activity. One way is to report an individual's post or account. Reporting a social media account can get the user banned and/or suspended from accessing that account. Reporting a gamer's account, specifically on Valorant, puts an extended timer on their account that prevents them from queuing into a game, or completely bans the user from accessing that account. Another way racism has been combated is through community efforts. Skai Jackson, an American actress, took it upon herself to take action against racism by personally notifying the schools of the people who had said or done racist things. Her large following also took action, and responses were received from each of these schools. Combating racism is difficult, especially when dealing with users who believe they can hide behind their aliases and screens. However, it is extremely important to take action and to not be complicit in these behaviors.
Fickle, T. (2019). The Race Card: From Gaming Technologies to Model Minorities. New York University Press.
Kolko, B., Nakamura, L., & Rodman, G. (2000). Race in Cyberspace. Taylor & Francis Group.
Ow, J. A. (2003). The Revenge of the Yellowfaced Cyborg Terminator: The Rape of Digital Geishas and the Colonization of Cyber-Coolies in 3D Realms’ Shadow Warrior. In Asian America.Net: Ethnicity, Nationalism, and Cyberspace (pp. 256–257). https://doi.org/10.4324/9780203957349
6 notes · View notes