#responsibledata
sergeyhay · 6 years ago
Text
Privacy is a form of freedom. Limited privacy is a threat to one's freedom. Today is a great day to reflect on how much freedom we unintentionally put at risk, how we can change it, and whether we are the stewards of freedom that we ought to be. #DataPrivacyDay #ResponsibleData
— Sergey H. (@SergeyHay) January 28, 2019
from Twitter https://twitter.com/SergeyHay January 28, 2019 at 06:56PM via IFTTT
kaythaney · 7 years ago
Link
Important @wilbanks point at #sageassembly, reflecting on #Ostrom - the most dangerous systems prevent participants from shaping the design of the system. True of #dataprotection regulations, which give correction rights, but no ability to change systems. #GDPR #responsibledata
— Sean McDonald (@McDapper) April 19, 2018
the-compiler · 9 years ago
Text
Data Ethics - investing wisely in data at scale
September 2016
David Robinson and Miranda Bogen at Upturn have mapped ways in which “data at scale” may pose risks to beneficiaries and funders. (By “data at scale,” they mean collecting, storing, analysing or using digital information in ways that have recently become possible.)
They found that grantmakers didn’t know enough about the benefits and risks of using data because of: the rapid growth in the use of technology; grantmakers and beneficiaries’ lack of data science expertise; limited training and resources; and a lack of clear guidelines for tackling new risks.
They identify three main challenges in data-oriented grantmaking:
1. The collection and use of data from the public sector isn’t carefully regulated, meaning that such data can now be used in a growing variety of potentially harmful ways.
2. Although there are many benefits related to the use of data at scale in decision-making, the use of automation may also risk reinforcing longstanding social biases.
3. Expertise for using data at scale is concentrated in certain large companies and government organisations. The nonprofit sector and academic researchers need to be prepared to harness these powerful new methods and shape how they are used across society by investing in training and fostering expertise.
They suggest that major foundations have a unique role to play in shaping norms around how this data is treated, and include a set of key questions for program staff and grantees to consider in data-intensive work. Finally, they recommend that funders incorporate data ethics into the grantmaking process and create data ethics checklists for grantees and program staff. For a report with similar themes, based on research conducted in 2013-14, see this link.
kaythaney · 7 years ago
Link
This is how I feel about public interest groups struggling with digitization. It's not easy, but it's an honor to struggle together. #responsibledata #commisaid #ict4d #civictech https://t.co/cGlSI6gp3c
— Sean McDonald (@McDapper) January 27, 2018
kaythaney · 8 years ago
Link
And if you were interested in data ethics/#responsibledata discussed at #SIF17, we’d love to see you on this list: https://t.co/dnDvKLvk50
— Zara Rahman (@zararah) May 18, 2017
kaythaney · 8 years ago
Link
Lots to dig into = Responsible data considerations for open source investigation in human rights https://t.co/4FMBW4vJB6 #responsibledata http://pic.twitter.com/yKUTpQQg43
— vanessa rhinesmith (@vrhinesmith) May 17, 2017
the-compiler · 9 years ago
Text
Monitoring Conflict to Reduce Violence: Evidence from a Satellite Intervention in Darfur
Download paper here
Grant M. Gordon's research investigates the impact of Amnesty International USA's "Eyes on Darfur", the first satellite intervention that aimed to reduce violence in the Darfur region of Sudan. It highlights some important points about the unintended consequences of using technology in projects that monitor human rights violations or election irregularities.
Gordon estimated the impact of Eyes on Darfur using a dataset from the US State Department's Humanitarian Information Unit on village populations in Darfur and the years in which they experienced conflict between 2003 and 2010, as well as interviews with staff who worked at Amnesty during the programme.
What did he find? Areas monitored by Eyes on Darfur were associated with a 15 to 20 percentage point increase in violence compared with unmonitored areas. This wasn't limited to the period when Amnesty was monitoring the areas: violence continued there in subsequent years. (As Gordon points out, this could be because Amnesty didn't inform the Government of Sudan that monitoring had stopped - but in any case, it doesn't diminish the overall trend.)
Monitoring also didn't seem simply to displace violence elsewhere - but neither did it seem to protect the areas where it was implemented: there was no increase or decrease in violence in villages neighbouring those monitored by Eyes on Darfur.
Gordon suggests some reasons why this might be the case (including that Eyes on Darfur may have offered the Government of Sudan a low-cost way of assessing whether the Janjaweed had performed the tasks they had been assigned), but suggests that it was most likely that the Government was attempting to retaliate against Amnesty, dissuade it from monitoring areas and deter other human rights organizations from engaging in the same behaviour. (The Government of Sudan had prohibited most organizations from operating in Darfur, targeted their workers or made it very costly for them to operate - but could not otherwise halt the satellite monitoring.)
Retaliating against Amnesty could only be done by targeting the communities they aimed to protect.
So, what does this mean for human rights organisations? It's important to note that Amnesty started by monitoring a small number of villages because they were concerned about this possibility, and that they decided not to expand the program after getting anecdotal evidence that it had increased violence. More generally, Gordon suggests that the results show that satellite monitoring did have an impact, and that they:
speak to the underlying potential for limited external interventions conducted by human rights organizations to change the strategic calculus of actors involved in genocide, even if they cause violence.
He then suggests "two scope conditions under which advocacy-driven monitoring may fail":
- "Human rights interventions may be more successful in the early days of conflict when actors are still invested in their reputations and believe negotiated settlements are still in sight." [During the period when Eyes on Darfur was in place, the Sudanese Government was already set on its path.]
- Monitoring might be less likely to reduce violence in contexts where "actors have strong beliefs and are unlikely to care about international audience costs that fall short of inciting a military intervention."
the-compiler · 9 years ago
Text
Mapping and Comparing Responsible Data Approaches
Download paper here [PDF]
by Jos Berens, Ulrich Mans, and Stefaan Verhulst
This paper, released in July 2016, looks at the scope and reach of various policies which address elements of responsible data. Interestingly, only one of the 17 policies chosen for inclusion here is actually labelled as a “responsible data policy” (from Oxfam) - the others are labelled as addressing Data Protection, Privacy, or Data Sharing. 
The paper goes through various elements of these policies, looking at similarities and differences between them. The takeaways from this analysis centre around implementation of policies (leadership, accountability, feedback loops), accessibility (clear language), and learning (what works, what doesn’t, and iterating.) 
The paper seems to be aimed at those working in the humanitarian space rather than more broadly, but has useful and interesting lessons for people looking at creating privacy/data sharing policies. 
the-compiler · 9 years ago
Text
Mapping Refugee Media Journeys
The "Mapping Refugee Media Journeys" project, by the Open University and France Médias Monde, looks at how Syrian and Iraqi refugees use technology and suggests ways of using this to help the most vulnerable refugees.
It’s based on interviews with more than 50 Syrian and Iraqi refugees in France and the UK, analysis of refugee social networks (Facebook and Twitter), and interviews with the European Commission, international media and NGOs.
Here are some of the key points we picked out:
Unsurprisingly, responsible data was a big concern: 
Refugees will not share personal information online, preferring to remain anonymous for fear of reprisals, surveillance, detention and/ or deportation. 
The digital tools and resources that help, guide and comfort refugees are also used to exploit, monitor and track them.
For example, researchers found that interviewees followed news on WhatsApp mainly because they trusted that the service (unlike Twitter and Facebook) wasn't under surveillance.
Refugees also rarely used resources produced by national or state-funded organisations, and were often driven towards "unofficial, potentially dangerous and exploitative resources" as a result.
The report gives practical recommendations to deal with this (informed by Aspiration Tech guidelines and Oxfam's Responsible Data policy, among others). They include designing tools that don't ask refugees to disclose any information about themselves, and recommending that any information sources include warnings about the known risks of financial exploitation by taxi/private drivers or smuggling networks.
Highlighting the daily changing nature of the situation in border areas, the researchers criticised the plethora of hackathons to create tools, saying:
any resource must be frequently updated in order to avoid it doing more harm than good with misinformation....there is a real danger that quick tech fix initiatives are not viable or sustainable. A sustainable resource of the kind that international news organisations might provide would offer a more viable alternative.
They argue that any digital outreach project should be "highly personal" and ensure that "trusted individuals" have a continuous physical and digital presence at key sites. In particular, they recommend drawing on sustainable existing networks (such as by encouraging refugees to ask for help and counseling from local NGO staff) and understanding shifting local conditions through on-the-ground experience. They also found that, despite the range of tools available, there was little content available in key areas, notably "relevant high quality legal information...and sources of information about language learning facilities".
Download paper here
the-compiler · 10 years ago
Text
The Ethics of Big Data: Current and Foreseeable Issues in Biomedical Contexts
By Brent Daniel Mittelstadt and Luciano Floridi
This was published in May 2015, but brings together many of the issues that we consider under the umbrella of 'responsible data': privacy, anonymisation, ethics, consent, surveillance, and more. Biomedical data raises many responsible data concerns because health data is so sensitive, and this paper goes into a lot of detail on current and upcoming issues. The authors make clear that generalisations about Big Data aren't very useful for thinking through ethical concerns, and many of the issues raised here are very pertinent for other sensitive datasets, too.
Read more here
the-compiler · 10 years ago
Text
Applying Humanitarian Principles to Current Uses of Information Communication Technologies
This paper by Nathaniel Raymond and Brittany Card argues that we need minimum standards for mobile network access and coverage; for providing ICT services to vulnerable people; for providing early warning to people at risk; for involving communities in ICT programme design; and for assessing the data accessibility needs of particular populations.
Download paper
The goal of this paper is to identify and address current gaps, challenges and opportunities that face the humanitarian sector as it seeks to apply traditional humanitarian principles to the increasingly central role information communication technologies (ICTs) play in 21st Century humanitarian operations. While much has been written about the roles ICTs may play in support of humanitarian action, there is an absence of literature addressing how core humanitarian principles should guide, limit, and shape the use of these technologies in practice.
the-compiler · 10 years ago
Text
Investigating the Computer Security Practices and Needs of Journalists
by Susan E. McGregor, Polina Charters, Tobin Holliday and Franziska Roesner. Read report (PDF)
“Though journalists are often cited as potential users of computer security technologies, their practices and mental models have not been deeply studied by the academic computer security community. Such an understanding, however, is critical to developing technical solutions that can address the real needs of journalists and integrate into their existing practices. We seek to provide that insight in this paper, by investigating the general and computer security practices of 15 journalists in the U.S. and France via in-depth, semi-structured interviews”