• Migrants sue German state over mobile phone searches

    In Germany, three migrants from Syria, Afghanistan and Cameroon are suing the state for accessing personal data on their mobile phones. A civil rights group taking part in the action says the phone searches are a serious invasion of privacy.

    Mohammad A., a 29-year-old Syrian, was recognized as a refugee in Germany in 2015. Four years later, the German Federal Office for Migration and Refugees (BAMF) reviewed his case – without giving a specific reason. During the review, it carried out an evaluation of his smartphone.

    “Suddenly the #BAMF employee told me to hand over my mobile phone and unlock it,” said Mohammad A. in a statement published by the Berlin-based Society for Civil Rights (GFF). “I didn’t know what was happening. Nothing was explained to me. But I was afraid of being deported. So I gave him the mobile phone. It felt like I was handing over my whole life.”

    Under a law passed in 2017, German authorities can examine the mobile phones of asylum seekers who are unable to present a valid passport on arrival, in order to verify information provided regarding identity. But the GFF, which filed the lawsuits together with the three refugees, says this represents “a particularly serious and extensive encroachment on the privacy of those affected.”

    Law fails to uncover false information

    The law permitting phone searches was meant to prevent “asylum abuse”. As many of those who arrive in Germany after fleeing their home countries cannot present a valid passport, it was seen as an effective way to detect fraudulent claims. However, the GFF says that despite thousands of such mobile phone searches, hardly any have uncovered false information.

    The GFF also argues that asylum authorities do not ensure that core areas of the asylum seekers’ rights are protected. “The BAMF is disregarding the strict constitutional rules by which the state must abide when accessing personal data,” Lea Beckmann from the GFF told Reuters.

    According to the news agency, a spokesman for BAMF said it was aware that checking mobile data was an intrusion and every case was determined by strict rules. “A mobile phone is often the only, or a very important, source to establish the identity and nationality of people entering Germany without a passport or identification documents,” he said.

    Privacy, transparency concerns

    The GFF argues that BAMF should use mobile phone readouts only as a last resort, and that there are other, less drastic means of clarifying doubts about identity. Mobile phone readouts are also extremely error-prone, the organization claims.

    The BAMF has also been criticized over a lack of transparency. For example, according to the GFF, little is known about how the software used to read and analyze the information obtained from phones actually works.
    Similarly, Reuters reports, the World Refugee Council has warned that consent for data collection is rarely sought and refugees often do not know how their data is used.

    Mohammad A.’s case is pending before a local court in the northwestern German city of Hanover. The case of an Afghan woman aged about 37 was lodged in Berlin and that of a 25-year-old woman from Cameroon, in the southwestern city of Stuttgart. The GFF hopes that the cases will lead to a constitutional review of the legal basis for mobile phone data evaluation.


    #smartphone #données #Allemagne #justice #asile #migrations #réfugiés #surveillance #données_personnelles #téléphone_portable #identité #identification #procédure_d'asile #nationalité

    ping @etraces @karine4 @kg

    https://seenthis.net/messages/853006 via CDB_77

  • Don’t assume technology is racially neutral

    Without adequate and effective safeguards, the increasing reliance on technology in law enforcement risks reinforcing existing prejudices against racialised communities, writes Karen Taylor.


    Within the European Union, police and law enforcement are increasingly using new technologies to support their work. Yet little consideration is given to the potential misuse of these technologies and their impact on racialised communities.

    When the everyday experience of racialised policing and ethnic profiling is already causing significant physical, emotional and social harm, how much will these new developments further harm people of colour in Europe?

    With racialised communities already over-policed and under-protected, resorting to data-driven policing may further entrench existing discriminatory practices, such as racial profiling and the construction of ‘suspicious’ communities.

    This was highlighted in a new report published by the European Network Against Racism (ENAR) and the Open Society Justice Initiative.

    Using systems to profile, survey and provide a logic for discrimination is not new; what is new is the sense of neutrality afforded to data-driven policing.

    The ENAR report shows that law enforcement agencies present technology as ‘race’ neutral and independent of bias. However, such claims overlook the evidence of discriminatory policing against racialised minority and migrant communities throughout Europe.

    European criminal justice systems police minority groups according to the myths and stereotypes about the level of ‘risk’ they pose rather than the reality.

    This means racialised communities will feel a disproportionate impact from new technologies used for identification, surveillance and analysis – such as crime analytics, mobile fingerprinting scanners, social media monitoring and mobile phone extraction – as they are already over-policed.

    For example, in the UK, social media is used to track ‘gang-associated individuals’ within the ‘Gangs Matrix’. If a person shares content on social media that references a gang name or certain colours, flags or attire linked to a gang, they may be added to this database, according to research by Amnesty International.

    Given the racialisation of gangs, it is likely that such technology will be deployed for use against racialised people and groups.

    Another technology, automatic number plate recognition (ANPR) cameras, raises concerns that cars can be ‘marked’, leading to increased stop and search.

    The Brandenburg police in Germany used the example of looking for “motorhomes or caravans with Polish license plates” in a recent leaked internal evaluation of the system.

    Searching for license plates of a particular nationality and looking for ‘motorhomes or caravans’ suggests a discriminatory focus on Travellers or Roma.

    Similarly, mobile fingerprint technology enables police to check against existing databases (including immigration records), and it disproportionately affects racialised communities, given the racial disparity of those stopped and searched.

    Another way in which new technology negatively impacts racialised communities is that many algorithmically driven identification technologies, such as automated facial recognition, disproportionately misidentify people from black and other minority ethnic groups – and, in particular, black and brown women.

    This means that police are more likely to wrongfully stop, question and potentially arrest them.

    Finally, predictive policing systems are likely to present geographic areas and communities with a high proportion of minority ethnic people as ‘risky’ and subsequently make them a focus for police attention.

    Research shows that data-driven technologies that inform predictive policing increased levels of arrest for racialised communities by 30 percent. Indeed, place-based predictive tools take data from police records generated by over-policing certain communities.

    Forecasting is based on the higher rates of police intervention in those areas, suggesting police should further prioritise those areas.

    We often – rightly – discuss the ethical implications of new technologies and the current lack of public scrutiny and accountability. Yet we also urgently need to consider how they affect and target racialised communities.

    The European Commission will present a proposal on Artificial Intelligence within 100 days of taking office. This is an opportunity for the European Parliament to put safeguards in place to ensure that the use of AI does not have any harmful or discriminatory impact.

    In particular, it is important to consider how the use of such technologies will impact racialised communities, so often overlooked in these discussions. MEPs should also ensure that any data-driven technologies are not designed or used in a way that targets racialised communities.

    The use of such data has wide-ranging implications for racialised communities, not just in policing but also in counterterrorism and immigration control.

    Governments and policymakers need to develop processes for holding law enforcement agencies and technology companies to account for the consequences and effects of technology-driven policing.

    This should include implementing safeguards to ensure such technologies do not target racialised as well as other already over-policed communities.

    Technology is not neutral or objective; unless safeguards are put in place, it will exacerbate racial, ethnic and religious disparities in European justice systems.


    #neutralité #technologie #discriminations #racisme #xénophobie #police #profilage_ethnique #profilage #données #risques #surveillance #identification #big-data #smartphone #réseaux_sociaux #Gangs_Matrix #automatic_number_plate_recognition (#ANPR) #Système_de_reconnaissance_automatique_des_plaques_minéralogiques #plaque_d'immatriculation #Roms #algorythmes #contrôles_policiers


    To download the report:

    ping @cede @karine4 @isskein @etraces @davduf

    https://seenthis.net/messages/816511 via CDB_77

  • #Mouton_2.0 - La puce à l’oreille

    #film #documentaire #puces #RFID

    Film website:

    The post-war #modernisation of #agriculture, carried out in the name of science and progress, did not impose itself without resistance. Sheep farming (#élevage), until then spared, is beginning to feel the first tremors of a drive toward #industrialisation.

    Recently, a new rule has obliged sheep farmers to electronically chip their animals. They must now fit a #puce_RFID (RFID chip), a veritable little electronic snitch, to identify their animals in place of the usual ear tag or tattoo. Behind the RFID chip, its computers and its machines, a whole world is dying: that of the #paysannerie (peasantry).

    In the machine world, the animal is no more than a meat factory and the farmer a mere operative in the service of industry. Yet some of them are resisting all of this…

    #technologie #identification #surveillance
    cc @odilon

    https://seenthis.net/messages/626195 via CDB_77

  • Fighting slavery from space

    The University launched its ‘Slavery from Space’ project last month (May), and has been looking for volunteers to help researchers trawl through hundreds of satellite images to identify and mark brick kilns. This work is focusing on India, where the Global Slavery Index estimates there are over 18 million slaves.


    #cartographie #visualisation #google #esclavage #résistance #travail #exploitation #briques #fabriques_de_briques #identification #crowd_sourcing #recherche #cartographie_participative #dénonciation

    Here to see the #carte (though I didn’t really manage to see the whole map):
    cc @reka @fil

    https://seenthis.net/messages/621084 via CDB_77

  • Ghostbusters et le sens de l’empathie - Mon blog sur l’écologie politique


    It is important, in this regard, that men can find it normal to identify with female characters, not only for cinematic reasons but also to develop some empathy for the fate of others. Whether that other is Black, Muslim or female. So that the violence done to them is as intolerable as the violence done to my own kind, those with whom I identify. And if #cinéma has a role to play, it is to offer everyone the opportunity to identify with female characters: deep, likeable characters who have rich interactions with their peers. Because for now, the picture is not great. Not only does #Ghostbusters pass the #Bechdel test (which tells us about the social dimension of its female characters), but these characters also have the same right as their male counterparts to be ridiculous, to be overweight, and to show that they are past forty.

    #empathie #identification #femmes

    https://seenthis.net/messages/510300 via Aude V

  • World Bank, Accenture Call for Universal ID - FindBiometrics

    In a new report issued in collaboration with Accenture, the World Bank is calling on governments to work together to implement standardized, cost-effective identity management solutions.

    A report synopsis notes that about 1.8 billion adults around the world currently lack any kind of official documentation. That can exclude those individuals from access to essential services, and can also cause serious difficulties when it comes to trans-border identification.

    That problem is one that Accenture has been tackling in collaboration with the United Nations High Commissioner for Refugees, which has been issuing Accenture-developed biometric identity cards to populations of displaced persons in refugee camps in Thailand, South Sudan, and elsewhere. The ID cards are important for helping to ensure that refugees can have access to services, and for keeping track of refugee populations.

    note the argument: “that can exclude…” (the other causes of exclusion are promoted by Accenture).

    #totalitarisme #fichage #identification #pour_ton_bien

    http://seenthis.net/messages/465386 via Fil