Training exercises that we created: Part 1

Women’s safety apps



How do you get a roomful of people to examine the implications of privacy, data policies and app design, and get them all excited about it?

We were faced with this challenge as we conducted a range of workshops as part of our research project Gendering Surveillance. The following exercise, which builds on our case study of women’s safety apps in India, was well received by different groups - from school-going girls to developers and designers. We are sharing it here for use by anyone who might find it useful to replicate.

Time: 90 minutes

Objectives
  • To be able to critically assess whether applications meant to enhance women’s safety are useful, and to identify their shortcomings and challenges
  • To learn to identify the different fields of information available about apps in the Play Store and on developers’ websites, and to develop a sense of what those fields might indicate
  • To be able to evaluate applications on their privacy policies, and to develop a sense of which fields would be empowering to a user
Materials needed
  • Smartphones running Android, and an Internet connection
  • Pre-printed pages of the applications’ Play Store listings.
  • The apps we most often use are VithU, Himmat, Nirbhaya: Be Fearless, Raksha, Eyewatch Women, My Safetipin, Damini and Pari - selected for the range of actors involved in their creation, and for the range of issues that emerge. You can find more apps used in India in this database, developed for our research.
  • Pre-printed set of guiding questions on (i) Features (ii) Data policies
  • Pen, paper
Format and steps
  • Divide participants into small groups of 3-5
  • Select the women’s safety apps you’d like to discuss. You will need half as many apps as the number of groups, such that for every app, one group will work on ‘Features’ and one group will work on ‘Data Policies’. If participants seem keen and there’s sufficient time, you could assign more than one app per group - we’ve found that this often enriches the discussion.
  • Give each group either the set of questions on ‘Features’ or on ‘Data Policies’ (see below) and ask them to answer the questions for each app they will be discussing. No two groups should have the same questions *and* app, so that the report-back is not boring!
  • Small group discussion for 30 minutes
Groups working on ‘Features’ will discuss
  • Does the ‘type’ of feature (like geofencing, tracking upon alert, heatmaps, walk-me-home etc.) enhance or diminish the user’s autonomy?
  • Who has a say in initiating geofencing, walk-me-home?
  • Who are the actors/emergency contacts the alert gets sent to? What are the implications for the user and their power relations with these actors?
  • Through what medium is the alert sent?
  • What control does the user have vis-à-vis their emergency contacts? Are there risks associated with additional information, like audio and video recordings, being sent upon alerts?
  • What might be the vulnerabilities of putting information in ‘networked publics’, i.e. within reach of third parties (for example, in the case of some apps, an alert is posted on the user’s Facebook profile)?
Groups working on ‘Data Policies’ will discuss
  • What permissions does the app ask for? Does it ask for fields of information that are not required for the app to function? [see app’s page on the Play Store]
  • Does the app have a privacy policy? [see app’s page on the Play Store, or the developer’s website if available]
  • What is the business model of the app? [see developer profile; whether app is free or priced]
  • Does the app give users meaningful control over the data collected? [see Privacy Policy or Terms of Service]
  • Examples of fields of inquiry:
  • Does it mention restrictions placed on data use?
  • Does it mention what is considered appropriate data use?
  • Can users opt out of particular uses of data?
  • Can users access data after it is collected?
  • Can users edit their data for accuracy?
  • Can users decide to delete all information?

Each group then presents in plenary the important points about the app they were looking at. If their discussion went beyond the questions you provided, that’s great!

Remarks and tips
  • Identify beforehand points that you want to see emerging in the discussions, so that you can chip in if they don’t come up organically. Our research on safety apps, and the infographics embedded in it, might be able to give you some ideas!
  • Note that even if an app’s features are impressive, it is still worthwhile to look at how it fares on data policies, and vice versa.
  • Select apps such that a wide range of features is covered, along with a representative mix of good and bad practices.