#eli pariser
ismsetcetera · 2 months
Text
From "The Filter Bubble" by Eli Pariser
Penguin Press, 2011
0 notes
gudguy1a · 2 years
Text
C’mon Americans – Mid-term Elections ARE as Important as Presidential Elections -- VOTE…!
TL;DR / BLUF (because this is a L-O-N-G paper; the last few pages are content from another paper): VOTE! Get out and VOTE! You CAN, you SHOULD, you MUST get out and vote if you truly value America and the continuation of ‘better’ values. We cannot just vote in Presidential elections and THINK that is all it takes. It is NOT!!! The mid-term races are the arenas where the least or worse…
View On WordPress
0 notes
publicatiosui · 11 months
Text
The faster the system learns from you, the more likely it is that you can get trapped in a kind of identity cascade, in which a small initial action—clicking on a link about gardening or anarchy or Ozzy Osbourne—indicates that you’re a person who likes those kinds of things. This in turn supplies you with more information on the topic, which you’re more inclined to click on because the topic has now been primed for you. Especially once the second click has occurred, your brain is in on the act as well. Our brains act to reduce cognitive dissonance in a strange but compelling kind of unlogic—“Why would I have done x if I weren’t a person who does x—therefore I must be a person who does x.” Each click you take in this loop is another action to self-justify—“Boy, I guess I just really love ‘Crazy Train.’” When you use a recursive process that feeds on itself, Cohler tells me, “you’re going to end up down a deep and narrow path.” The reverb drowns out the tune. If identity loops aren’t counteracted through randomness and serendipity, you could end up stuck in the foothills of your identity, far away from the high peaks in the distance.
Eli Pariser, The Filter Bubble
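The loop Pariser describes (click, reinforce, recommend, click again) can be sketched as a toy rich-get-richer simulation. This is purely illustrative; the topics, starting weights, and boost factor are invented for the example and don't come from the book:

```python
import random

def recommend(weights, rng):
    """Pick a topic with probability proportional to its current weight."""
    topics = list(weights)
    r = rng.random() * sum(weights.values())
    acc = 0.0
    for topic in topics:
        acc += weights[topic]
        if r < acc:
            return topic
    return topics[-1]

def simulate(topics, clicks, boost=2.0, seed=0):
    """Each click multiplies that topic's weight, so the next
    recommendation is even more likely to repeat it."""
    rng = random.Random(seed)
    weights = {t: 1.0 for t in topics}
    for _ in range(clicks):
        choice = recommend(weights, rng)
        weights[choice] *= boost  # the loop feeds on itself
    return weights

weights = simulate(["gardening", "anarchy", "ozzy"], clicks=40)
top = max(weights, key=weights.get)
print(f"dominant topic: {top}, share: {weights[top] / sum(weights.values()):.0%}")
```

After a few dozen clicks one topic dominates the distribution almost completely: the "deep and narrow path" emerges from nothing but a feedback loop, with no editor deciding anything.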
10 notes · View notes
xxxpu55yslay3rxxx · 8 months
Text
If you wanna know my lore and how I act on the internet, then read "The Filter Bubble" by Eli Pariser:
A very influential book that dictated a good chunk of my personality in the 2010s.
I say it influenced my personality as if it were a self-help book, but it's actually a discussion and study of how the control and moderation of the internet affect our views. The author shows how, with personalized search and data moderation, tech giants and media sources can shape our views and trap us in a bubble, hence the title. A book ahead of its time, though not anymore, since what it described is happening now.
An amazing-ass book that I'd recommend to anyone who so much as glances at my account.
Here, have a libgen link to the E-book files: https://www.libgen.is/search.php?req=The+filter+Bubble&lg_topic=libgen&open=0&view=simple&res=25&phrase=1&column=def
2 notes · View notes
isearchgoood · 15 days
Text
The arrest of Telegram CEO Pavel Durov — and why you should care | Eli Pariser
https://www.ted.com/talks/eli_pariser_the_arrest_of_telegram_ceo_pavel_durov_and_why_you_should_care?rss=172BB350-0205&utm_source=dlvr.it&utm_medium=tumblr
0 notes
endquire · 20 days
Text
You Should Care that Telegram CEO Pavel Durov Was Just Arrested. Eli Par...
0 notes
Text
Front Porch users’ satisfaction shows how careful moderation and prioritizing civility over engagement can lead to a vastly different experience of social media, said Eli Pariser, co-director of New_ Public and author of “The Filter Bubble.”
0 notes
leaacquaviva · 6 months
Text
SESSION #13 – Digital deception: learning to stay informed on digital media.
Social media is, more and more, the preferred way for the new generation to keep up with the latest news. But is it really possible to stay informed on social networks when they track us constantly?
As we know, on social media, algorithms are designed to serve us content we are interested in. In doing so, however, they lock us into filter bubbles. A filter bubble is built up from our interactions and reactions to content on social media, enclosing us in a bubble of information we will like.
As a result, we are no longer confronted with opinions that diverge from our own, which reinforces our existing ideas and limits our ability to form sound judgments. According to Eli Pariser, who theorized the concept, filter bubbles place us in a state of "intellectual and cultural isolation" (Roué, 2023).
Moreover, misinformation is abundant there. While the immediacy of social media and the diversity of information it offers are attractive, you have to watch out for fake news so as not to get lost.
This matters all the more given today's news. With the current wars, people are desperate to stay informed, yet the information shared on social media is not only instantaneous but also overwhelming in volume. In this information overload, it is easy to get lost and to believe posts designed for disinformation or propaganda. During military operations, this also creates a "fog of war" for people on the ground (Meloche-Holubowski, 2023).
Likewise, in an electoral context, disinformation is all the more virulent because each party tries to win over the public and recruit new supporters. In such a situation there is an "information war," since everyone wants to sway public opinion in their favor (Meloche-Holubowski, 2023).
So when we get our news from social media, we must be aware of the risks and of the techniques used, and it is also important to always verify sources. We should not settle for what the algorithms serve us, but open ourselves to other opinions and to debate.
0 notes
vivvvbar · 7 months
Text
"For a time, it seemed that the Internet was going to entirely redemocratize society... Local governments would become more transparent and accountable to their citizens. And yet the era of civic connection I dreamed about hasn’t come. Democracy requires citizens to see things from another’s point of view, but instead we’re more and more enclosed in our own bubbles. Democracy requires a reliance on shared facts; instead we’re being offered parallel but separate universes."
A quote by Eli Pariser regarding the internet and how it is affecting what we read and how we think.
0 notes
ismsetcetera · 2 months
Text
From "The Filter Bubble" by Eli Pariser
Penguin Press, 2011
1 note · View note
radicalbotanicals · 9 months
Text
Popping the Filter Bubble – Radicalisation tool or Just Misrepresented?
For this entry, I'm playing devil's advocate. I want to approach this from a different perspective rather than just harp on the usual narrative that filter bubbles contribute to a hive-mind mentality at the extreme end of the argument, or simply deny people access to diverse content that could give them better-informed opinions. Personally, I believe people who want to live in their own echo chamber will deliberately construct spaces for themselves that they can comfortably exist in. You cannot force someone to be understanding or empathetic toward opinions that differ from their own if they do not want to be. I think we are all guilty of this to some degree, even if only on a minor scale. For example, I will purposely ignore any incendiary comments or criticisms of bands I enjoy, and I have essentially barred myself from the corners of the internet where I know I would see such content. Doing so has brought much peace to my life, particularly as a metalhead in quite an elitist music community.
To dive deeper into this issue on a more serious note: can you be radicalised by a filter bubble? Michael Wolfowicz, David Weisburd, and Badi Hasisi would say not, in their paper "Examining the interactive effects of the filter bubble and the echo chamber on radicalization". They determined that, in fact, evidence suggests algorithmic selection's potential effect on radicalization would be small relative to the effects of self-selection, which supports my opening statement above. They point to several challenges in trying to establish a link between filter bubbles, echo chambers, and their radicalising power. Studies in this field have thus far been "relegated primarily to the macro-level, with little if any individual-level study". Another issue comes from attempting to measure 'echo chamberness', which is noted to often "fail to capture the type of network structure characteristics described by the respective propositions" (Wolfowicz et al., 2021). Given that the terms were coined by someone outside the sphere of academia (Eli Pariser, to be specific) as terminology for the tech world, this has created a "conceptual ambiguity…which has challenged researchers with developing their own way of testing the frameworks' hypotheses" (Wolfowicz et al., 2021). All of this is to say there has been no definitive proof that filter bubbles are some mystical, powerful machine that can mind-control any unsuspecting consumer into believing whatever the algorithm wants them to.
There is in fact even positive evidence to suggest filter bubbles can create safe spaces where disenfranchised or "at risk" groups can connect and find community. In "Filter Bubbles? Also Protector Bubbles! Folk Theories of Zhihu Algorithms Among Chinese Gay Men" (Zhao, 2023), Zhao examines the popular Chinese social media app Zhihu, where the information barriers built by filter bubbles "are believed to shield gay male users from outsiders; the recommendation algorithms are also perceived to recognize scattered gay male users and to provide them with access to various gay communities; and thus, algorithm-driven exclusive networks are believed to have been established" (Zhao, 2023).
(Source: KRAsia.) Sometimes filter bubbles can be good!
Zhao brings up the notion that "it is always emphasized that information barriers hinder the interaction and flow of views and polarize people's thinking. However, the informants in this study offer a queering interpretation of information barriers." One of the participants in the study, known as 13, mentioned that "the information barrier of Zhihu is so well constructed and will never lie to you. It should be said that absolutely no straight man or woman will find this [gay] space; for those who have never been exposed to this [gay] topic, the possibility of entering is too low" (Zhao, 2023). It is very likely that diverse identities, such as those of the LGBTQ+ community, are not considered when filter bubbles are spoken about in a negative context. "Visibility" in these spaces is certainly taken for granted, but the shielding power filter bubbles hold for individuals who may be persecuted cannot be denied: such users can continue to exist in these spaces with potentially less danger to their person simply for being who they are. It is a double-edged sword, but I think we need to move past the conversation that it is merely the work of algorithms that results in the radicalisation of dangerous or extremist groups; it is in fact a much larger societal issue that we should be addressing, though perhaps we are not ready for that conversation just yet.
Bibliography
Wolfowicz, M., Weisburd, D. and Hasisi, B. (2021) ‘Examining the interactive effects of the filter bubble and the echo chamber on radicalization’, Journal of Experimental Criminology, 19(1), pp. 119–141. doi:10.1007/s11292-021-09471-0.
Zhao, L. (2023) ‘Filter bubbles? Also protector bubbles! Folk theories of Zhihu algorithms among Chinese gay men’, Social Media + Society, 9(2), p. 205630512311686. doi:10.1177/20563051231168647.
0 notes
xxxpu55yslay3rxxx · 8 months
Text
Bitch you guise been ready for this!!
Yep
I got work to do. I'm now in busy mode!!!
Posting here was fun, but eventually I was gonna have to do something worky lol.
That doesn't mean I'll cut stuff off completely; it's just that I'll be on here less.
Oh, and btw, you guys should read "The Filter Bubble" by Eli Pariser. Gonna reblog it one more time lol
0 notes
heloirgan24 · 11 months
Text
SESSION #10 – From recommendation algorithms to intuitive AI: the fascinating story of algorithmic evolution
In a digital ecosystem that is constantly evolving, the rapid progress of algorithms raises many challenges and questions, particularly for the sociology of uses. According to Aurélien Jean, a doctor of science and entrepreneur trained at ENS, Mines Tech and MIT, an algorithm is "a resolution method made up of so-called hierarchical steps."
Algorithms have moved from simple recommendations to artificial intelligence resting mainly on deep learning. At first, recommendation algorithms relied on simple models to analyze data and suggest choices suited to our preferences. How these systems worked was relatively easy to understand. With the advent of deep learning, however, artificial intelligence keeps advancing across many fields thanks to complex, automated learning systems.
The evolution of algorithms can be illustrated with social networks. They use deep learning to personalize content for each user. This hyper-personalization, however, can produce a "filter bubble," as theorized by Eli Pariser: users end up exposed to content that confirms their existing beliefs, limiting their understanding of the outside world. This algorithmic opacity raises many concerns, such as information manipulation and data privacy.
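This hyper-personalization can be reduced to two steps: update a profile from clicks, then rank content by that profile. The sketch below is a hypothetical minimal model (the topics, scores, and update rule are invented for illustration, not how any real platform works):

```python
def update_profile(profile, clicked_topic, rate=0.3):
    """Nudge interest scores toward the topic the user just clicked."""
    for topic in profile:
        target = 1.0 if topic == clicked_topic else 0.0
        profile[topic] += rate * (target - profile[topic])

def feed(items, profile, k=2):
    """Rank items by the user's interest in their topic and keep the top k:
    content matching past clicks crowds everything else out."""
    return sorted(items, key=lambda it: profile[it["topic"]], reverse=True)[:k]

profile = {"sport": 0.5, "opinion_a": 0.5, "opinion_b": 0.5}
items = [
    {"topic": "sport", "title": "match recap"},
    {"topic": "opinion_a", "title": "editorial A"},
    {"topic": "opinion_b", "title": "editorial B"},  # the diverging view
]

for _ in range(5):  # five clicks on the same kind of content
    update_profile(profile, "opinion_a")

top = feed(items, profile, k=1)  # the diverging editorial no longer surfaces
print(top[0]["title"])
```

After only a handful of identical clicks, the top of the feed contains nothing but the topic already clicked: exactly the narrowing Pariser calls the filter bubble, produced by a ranking rule with no editorial intent at all.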
This technological progress has introduced a degree of complexity, creating what is called the "black box." In the sociology of uses, this notion helps us understand how digital technologies shape our behavior and our perception of our societies. The "black box" raises unavoidable questions for the sociology of uses, because it obscures our understanding of the algorithms that influence our decisions and social interactions. This opacity must be countered with transparency and ethics, for instance to avoid algorithmic bias.
To conclude, the transition from recommendation algorithms to deep-learning AI has revealed the growing complexity of the models in use. These technological advances offer considerable benefits; nevertheless, we must remain vigilant about the social and individual impact of these opaque systems. We should therefore encourage transparency and an understanding of how they work, in order to promote an ethical and informed use of technologies that have become indispensable in our daily lives.
Sources:
https://www.cairn.info/revue-du-crieur-2018-3-page-68.htm
0 notes
ismsetcetera · 3 months
Text
From the introduction of "The Filter Bubble" by Eli Pariser
Penguin Press, 2011
1 note · View note