Despite the frustration and anxiety that users reported, they also felt it was important to keep using the app to better manage their own safety (P2, P9, P1, P14). The users we interviewed were not unaware of the negative impacts of using Citizen, but felt beholden to the application. Since downloading Citizen, P14 described having a constant urge to know “what’s really going on” including checking whether a place he is in is “secure.” P2 shared that she felt “beholden to these sound alerts that instill panic. It’s like Pavlov’s dog: you hear the bell and you have a reaction; it’s visceral...I feel like a slave to it but it’s the only way I’m going to be able to control my safety as much as I can.” Others agreed—P9 voiced that she has gone back and forth on whether or not to delete the app because it induces anxiety, but decided not to get rid of it because it provided her with valuable information.
4.2.1 Interface Analysis: Encouraging the Use of Lucrative Features Which Promise Protection.
Citizen offers users three features for their protection, their loved ones’ protection, and their community’s protection: Citizen Protect, Safety Network, and Live Broadcast. These three features are also profitable, helping the company gain users’ money, data, and attention.
Citizen Protect is Citizen's premium feature, launched in 2021. The feature offers users the option to contact Citizen employees, known as Protect Agents, who can monitor the user's surroundings, contact first responders when situations escalate, alert users' emergency contacts, and create new incidents on behalf of a user to alert nearby users of the app. Citizen Protect is promoted as a tool that brings people together to watch out for each other [27]. In-app advertisements give the example of a Protect Agent creating an incident to alert nearby users about a missing pet and the nearby users responding en masse (see Figure 4c). Researchers found this vision of mobilizing users reminiscent of Citizen's prior avatar as the Vigilante app. Although the app is free, we found that Citizen aggressively advertises its premium features with the use of deceptive design patterns. For example, the Citizen Protect feature is advertised twice to new users during the sign-up process. In the latter instance, researchers noted a hidden "Skip" button that made it particularly challenging to bypass the advertisement, an example of a deceptive design pattern called obstruction [74]. The most egregious deceptive design pattern, however, is a floating button to sign up for Citizen Protect which is overlaid on each screen, constantly visible as users scroll through videos and notifications, many of which do not present any threat to users' safety but heighten fear nonetheless (see Figure 4e). We saw this as an example of a misdirection deceptive design pattern: a button that supports Citizen in translating heightened awareness and anxiety about safety into purchases. Users can purchase an individual or a family plan.
Citizen also encourages users to monitor their friends' and family's safety by adding contacts to their Safety Network. To take advantage of this feature, Citizen requires users to share their entire contact list with the app (see Figure 3a). There is no option to add contacts individually, an example of a forced action deceptive design pattern [74] because it creates a false dependency. If users choose to share their contacts, Citizen will alert all contacts who are existing Citizen users that their friend has joined the app, without informing the user. This alert encourages contacts to add the user to their Safety Network and share location data with the user (see Figure 3e). We saw this as an example of publish, a privacy deceptive design pattern [20] in which information about an individual is shared without their consent or knowledge. This deceptive design pattern has the potential to exponentially grow Citizen's user base. Researchers also discovered that the app collected data about the user without their knowledge, including the user's heart rate and their mobile device's battery life. Battery life information was shared with friends on the Safety Network without consent. These are examples of privacy deceptive design patterns which obscure what data is being collected and how [20].
The app describes Live Broadcast as a feature that allows users to create and share videos in order to "spread awareness of safety incidents with your community in real-time." Citizen nudges users with verbal cues and displays the number of nearby users (who would presumably see the live video) (see Figure 1c, Figure 2c). We see this as an example of a social investment deceptive design pattern because it encourages the use of the app through social metrics such as the potential number of reactions, comments, and views on user-uploaded videos [79]. Researchers also documented one instance where users were prompted with the notification: "Go Live. 600 feet away. Hit-and-Run Collision. Tap to update your community" (see Figure 1e). The research team found this notification particularly challenging to reconcile with the app's mission to support user safety [27]. User-generated broadcasts were used to capture and engage users' attention. For example, one researcher received an alert that there was a "live video at the scene", encouraging them to view a video of an overturned car after a collision. Each video was also overlaid with users' comments, reactions, and a pulsating share button to encourage users to share the video via text or social media.
4.2.2 User Experience: A Heightened Need for Safety Requires Action.
Sensitized to the risks around them, users engaged Citizen's features for protection in two ways, and many also took matters into their own hands, responding individually in a variety of ways.
While we did not speak to any participants who had used Citizen Protect or Live Broadcast, and thus could not evaluate the influence of the obstruction, misdirection, or social investment deceptive design patterns, we did speak to four participants who added friends to their Safety Networks (P1, P3, P4, P6). P6 mentioned that he has a very diverse group of friends and, given the racially charged political climate, appreciated the ability to make sure they were safe. P3 similarly appreciated being able to track her family members' locations. P1 downloaded the Citizen app when her friend invited her to join her Safety Network as a result of the publish deceptive design pattern. While P1 valued the information she received from the app, she decided to turn on "Ghost Mode" because alerts about incidents near P1 were causing her friend undue stress and anxiety.
We observed how some participants, taking advantage of the information on Citizen, began engaging in detective work. A Citizen post helped P14, an undergraduate student, raise awareness about his missing friend. Other students on his campus also used the app, and P14 found that the comment section provided useful and comforting information when his friend went missing. Some participants viewed incidents on Citizen and cross-referenced that information on other platforms to get more context (P6, P4, P1, P9). P9, for example, was able to collect more information about a neighbor's missing car using Citizen and Facebook, while P4 was able to locate a Nextdoor neighbor's missing mail by cross-referencing information from Citizen.
Others did not feel as comfortable relying on Citizen because they worried about sharing location data with the app (P9, P12, P15, P11). P11 changed his settings so that he was only sharing his location while using the app because he assumed Citizen had to make money, and must be doing something with his data that he was unaware of. P12 lives in an apartment complex where she knows there is gang activity. However, she admitted that she no longer feels comfortable calling 911 because she worries identifiable information might be leaked onto Citizen. She said, "I can't believe I question now calling 911... it made me think, like, who has access to 911 recordings now?" Although users did not seem to be aware of specific deceptive design patterns, the lack of transparency about Citizen's privacy policy, due to design decisions such as the obscure deceptive design pattern, disempowered users from taking actions that might protect their safety.
In addition to relying on Citizen, many participants took matters into their own hands and began carrying tasers (P9), guns (P12), knives (P2), and mace (P9, P2), and investing in new home security systems (P9, P12, P7). Others began avoiding certain sub-populations perceived as dangerous. A small group of participants shared that their use of the app led to an increased fear of individuals who are homeless (P1), mentally ill (P2), Black teenagers (P2), and "Black men" (P4). P12 felt that she sees so many crime-related incidents with so little context that her mind can't help but draw conclusions about who is committing these crimes. P1 reflected that:
"Before I downloaded Citizen when I would see homeless people in the park I wouldn’t think anything of it, you know they’re there sleeping, this is a soft relatively private place for you to lay your head tonight, and I would go on my way. Since downloading Citizen, I will leave a little more space, and I will look in those bushes a little more like, ‘is there, someone that could potentially be right there waiting to pounce?’"
For P11, Citizen brought to light the city’s “vagrancy problem” and the sense that more police activity and local leadership is needed.
Almost every participant began avoiding certain areas of the city that they perceived as dangerous. Participants mentioned changing the routes they drove (P8), the routes they walked at night (P2, P6, P9, P11, P4), and the businesses they frequented (P9, P11). Based on the incidents that participants viewed on the app, they began to create mental models of "hot pockets" (P6) in the city to avoid. P8, for example, said that after seeing the same street names again and again, she began avoiding those areas. Similarly, P11 described how he used Citizen to figure out if he should "avoid that section of town" for the day. Furthermore, these mental models persisted beyond the use of the app itself. P4, for example, no longer attends the Castleberry art walk because she now associates that neighborhood with crime, and P2 said she no longer goes out for walks alone after six pm. For others, the data from the app has influenced long-term decisions like where to buy a house (P7, P8) and whether it makes sense to move to another state altogether (P10, P2). The areas that participants mentioned as "hot pockets" of crime include Castleberry Hill, home to one of the highest concentrations of Black-owned land and businesses in the country, and Mechanicsville, where the vibrant and predominantly Black community of the 1950s has since diminished, largely due to misguided urban renewal [1, 5].