All things are impermanent. This is the law of life and extinction. When both life and extinction perish, Nirvana will be bliss.
the reason behind the range of the spectrum our eyes see evolved at the bottom of the sea
https://www.threads.com/@wealth.value/post/DMWjiSwuu3C?xmt=AQF0wU0xJwZJflA45su1at0tVt4YUWjxTThJzJ4W25z-SA
“Moral landmines” are actions or strategies that look extremely good under one moral theory but could be deeply wrong, catastrophic, or immoral under others.
Effective Altruists use the concept of moral uncertainty to avoid these landmines—because if you’re wrong about your moral theory, stepping on one could do real damage.
💥 Examples of Moral Landmines
1. Sacrificing the Few for the Many
"Kill one to save five."
Utilitarianism: May approve, if it maximizes net utility.
Deontology: Absolutely forbidden: it violates rights and treats a person as a mere means.
Virtue ethics: May corrupt the character of the person who does the killing.
🧠 Landmine Risk: Justifying unethical actions “for the greater good” can lead to atrocities, slippery slopes, or loss of trust in institutions.
2. "Earning to Give" in Unethical Industries
"Make billions in crypto or weapons manufacturing to donate more."
Utilitarianism: May justify if the net result is good (e.g., saving millions).
Deontology: Profiting from harm or manipulation is intrinsically wrong.
Virtue ethics: Can erode character and social responsibility.
🧠 Landmine Risk: The means corrupt the agent and risk legitimizing harmful systems; Sam Bankman-Fried is a prime example.
3. Neglecting Present Suffering for Speculative Futures
"We should prioritize AI safety over malaria nets."
Longtermist utilitarianism: Future lives vastly outweigh today's.
Deontology: We have stronger duties to actual, living humans.
Care ethics: Compassion demands focus on real suffering, not hypotheticals.
🧠 Landmine Risk: Perceived callousness or neglect, and moral miscalculation due to Pascal’s Mugging—acting on tiny probabilities of vast payoffs.
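The arithmetic behind Pascal's Mugging can be made concrete with a toy expected-value comparison. All numbers below are invented for illustration, not actual cost-effectiveness estimates:

```python
# Toy expected-value comparison behind "Pascal's Mugging": a tiny probability
# of a vast payoff can dominate a near-certain, modest payoff on paper.
# All numbers here are hypothetical.

p_nets, v_nets = 0.99, 1_000   # near-certain, modest good (e.g. malaria nets)
p_mug,  v_mug  = 1e-9, 1e15    # one-in-a-billion shot at a vast speculative good

ev_nets = p_nets * v_nets      # ~990
ev_mug  = p_mug * v_mug        # ~1,000,000

# The speculative bet "wins" the expected-value comparison by three orders of
# magnitude, even though it almost certainly accomplishes nothing.
print(ev_nets, ev_mug)
```

This is why naive expected-value maximization can be "mugged": the payoff can always be inflated faster than the probability shrinks.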
4. Moral Offsets
"I can do something bad as long as I do more good elsewhere."
E.g., lie, pollute, or manipulate markets if you donate enough to compensate.
Utilitarianism: Might allow it.
Deontology: Never permits doing wrong for good ends.
Virtue ethics: Encourages vice, not virtue.
🧠 Landmine Risk: Rationalizing harmful behavior leads to corruption or moral license.
5. Using Deception or Non-Consent
"We don’t have to tell people we’re experimenting if it increases impact."
E.g., conducting psychological experiments or behavioral nudges without informed consent.
Utilitarianism: May justify it as efficient.
Deontology: Violates respect for persons.
Virtue ethics: Undermines trust and integrity.
🧠 Landmine Risk: Erosion of ethical standards and legitimacy of EA initiatives.
6. Prioritizing Sentient AIs over Humans
"We must protect future digital minds over current human suffering."
Longtermism/digital sentience ethics: May assign moral weight to AIs.
Commonsense morality: Sees this as absurd or offensive.
Care ethics: Prioritizes empathy and real human relationships.
🧠 Landmine Risk: Public backlash, loss of moral clarity, or deprioritizing real moral obligations.
7. Animal Welfare Extremes
"Stopping wild animal suffering may require altering ecosystems."
Some EAs have discussed sterilizing predators or geoengineering nature.
Utilitarianism: May justify it based on suffering reduction.
Environmental ethics: Sees it as hubris and interference with nature.
Virtue ethics: Could view it as lacking respect for the natural order.
🧠 Landmine Risk: Ecological disaster, moral overreach, public revulsion.
8. Neglecting Relational or Local Duties
"I owe the same to a stranger in Malawi as to my mother."
Utilitarianism: All moral patients are equal.
Confucian, virtue, or care ethics: Duties to kin and community are morally real and important.
🧠 Landmine Risk: Alienation, erosion of family/community ties, appearing cold or inhuman.
⚠️ Why These Matter
Public trust in EA can erode when ideas seem too abstract, cold, or extreme.
Personal corruption can result when people pursue high-impact strategies without ethical “brakes.”
Reputational and strategic failure arises when EA ideas are perceived as alien or dangerous.
✅ How EA Tries to Avoid These
Emphasizing moral pluralism and guardrails.
Promoting ethical integrity, not just outcome-maximization.
Investing in community norms of humility, caution, and reflection.
Encouraging moral “portfolio thinking”—balancing present and future, certainty and speculation.
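This kind of portfolio thinking is often formalized as maximizing expected choice-worthiness: score each action under several moral theories, weight each score by your credence in that theory, and compare. A minimal sketch, with hypothetical credences and scores:

```python
# Expected choice-worthiness under moral uncertainty: weight each theory's
# verdict by your credence in that theory. Credences and scores below are
# hypothetical illustrations, not canonical EA numbers.

credences = {"utilitarian": 0.5, "deontology": 0.3, "virtue": 0.2}

# choice-worthiness of each action under each theory (higher = better)
scores = {
    "kill_one_to_save_five": {"utilitarian": 4, "deontology": -10, "virtue": -5},
    "refuse_to_kill":        {"utilitarian": -4, "deontology": 5,  "virtue": 3},
}

def expected_choiceworthiness(action):
    return sum(credences[t] * scores[action][t] for t in credences)

best = max(scores, key=expected_choiceworthiness)
print(best)  # the "moral landmine" loses once credence is spread across theories
```

The point of the sketch: an action that looks best under one theory (here, utilitarianism) can be badly outscored once you admit non-trivial credence in theories under which it is catastrophic.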
“Both in fighting and in everyday life you should be determined though calm. Meet the situation without tenseness yet not recklessly, your spirit settled yet unbiased. Even when your spirit is calm do not let your body relax, and when your body is relaxed do not let your spirit slacken. Do not let your spirit be influenced by your body, or your body be influenced by your spirit.”
― Miyamoto Musashi
empathically, my soul molds itself to the narrator of every story I read
I memorize the lessons not visually, not orally, but through feeling
a thousand lives in moments reside in me
the oracle is nothing more than the hypercube where we all dwell
my personality humbly yielded to the wisdom of 500 epochs after the birth of letters
beyond, beyond, go, completely