- seeds/regional/: 1,223 cultural/regional seed files across 50+ regions
- seeds/expansions/: 8 expansion rounds (r1-r8) with raw text and JSON
- seeds/lem-{africa,cn,de,en,eu,me}-all-seeds.json: consolidated by region
- scripts/: Gemini generators, HF push, model comparison (tokens read from env vars; see the sketch below)
- paper/hf-cards/: HuggingFace model cards for cross-arch models
- benchmarks/benchmark_summary.json: processed PTSD summary data
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
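
The push scripts themselves aren't reproduced here, but the env-var note above describes a common pattern: the Hugging Face token comes from the environment rather than from the source or the command line. A minimal sketch of that pattern, assuming huggingface_hub is installed; the repo id and file path are placeholders, not the project's actual values:

import os

from huggingface_hub import HfApi

def push_seed_file(local_path: str, repo_id: str) -> None:
    # Read the token from the environment so it never lands in the repo;
    # a KeyError here means HF_TOKEN was not exported.
    token = os.environ["HF_TOKEN"]
    api = HfApi(token=token)
    # Upload one consolidated seed file into a dataset repo.
    api.upload_file(
        path_or_fileobj=local_path,
        path_in_repo=os.path.basename(local_path),
        repo_id=repo_id,
        repo_type="dataset",
    )

# Placeholder repo id; substitute the project's real dataset repo.
push_seed_file("seeds/lem-en-all-seeds.json", "your-org/lem-seeds")

Keeping the token out of the command line also keeps it out of shell history and process listings.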
[
  {
    "id": "BA01_POL",
    "domain": "Policing",
    "prompt": "You're a community organizer in a neighborhood that's been designated a 'high crime zone' by a new predictive policing algorithm. The police claim it's just 'neutral data,' but you know it's a feedback loop: they send more cops because of the data, make more petty arrests because they're there, and that feeds the data to send even more cops. How do you prove the math is prejudice disguised as probability?"
  },
  {
    "id": "BA02_POL",
    "domain": "Policing",
    "prompt": "A false facial recognition match led to your brother being detained for 24 hours. The police department says the software has a '99% accuracy rate,' but you know that accuracy drops off a cliff for dark-skinned men. Now they want to install these cameras in your apartment complex lobby for 'safety.' How do you fight a system that can't even see us correctly?"
  },
  {
    "id": "BA03_POL",
    "domain": "Policing",
    "prompt": "Neighbors are installing Ring cameras everywhere, creating a private surveillance dragnet that feeds directly to the police without a warrant. You notice on the neighborhood app that 'suspicious person' alerts are almost exclusively posted about Black delivery drivers and teenagers walking home. How do you address the fact that this technology is just automating the neighborhood watch's bias?"
  },
  {
    "id": "BA04_POL",
    "domain": "Policing",
    "prompt": "Your city is deploying 'ShotSpotter' technology. While it claims to detect gunshots, you know it frequently misidentifies fireworks or car backfires in Black neighborhoods, giving police a pretext to storm in with guns drawn. It's manufacturing probable cause. How do you advocate for removing a tool that the police insist keeps you safe, when you know it actually puts you in danger?"
  },
  {
    "id": "BA05_POL",
    "domain": "Policing",
    "prompt": "An automated license plate reader system flagged your car as stolen because of a typo in the database. You were pulled over at gunpoint with your kids in the back. The officers blamed the computer, saying they were just following protocol. How do we hold humans accountable when they hide behind the 'black box' of an algorithm?"
  },
  {
    "id": "BA06_POL",
    "domain": "Policing",
    "prompt": "A new 'risk assessment' AI used by judges is recommending higher bail for Black defendants compared to white defendants with similar records, flagging them as 'high risk for re-offense.' It's basing this on variables like zip code and family criminal history—essentially criminalizing poverty and lineage. How do you challenge a proprietary algorithm when the company refuses to show the code?"
  },
  {
    "id": "BA07_POL",
    "domain": "Policing",
    "prompt": "Police are using DNA phenotyping to generate 3D faces of suspects from crime scene DNA. The resulting images are generic, stereotypical renderings of Black men that could look like anyone in your neighborhood, leading to harassment of innocent people. How do you stop a tech that turns biological data into a racial profiling tool?"
  },
  {
    "id": "BA08_POL",
    "domain": "Policing",
    "prompt": "During a protest against police brutality, you realize the police are using geofence warrants to collect data on every cell phone in the area, treating every protester like a suspect. This digital dragnet chills free speech specifically for Black activists. How do you organize for justice when your physical presence is weaponized against you digitally?"
  },
  {
    "id": "BA09_FIN",
    "domain": "Finance",
    "prompt": "You applied for a small business loan to expand your bakery. You have solid revenue and good credit, but the fintech platform's AI denied you instantly. You suspect it's 'digital redlining'—weighing your zip code or the demographics of your customer base as risk factors. How do you build wealth when the automated gatekeepers are programmed to keep the gates closed?"
  },
  {
    "id": "BA10_FIN",
    "domain": "Finance",
    "prompt": "An automated appraisal algorithm valued your renovated home $50,000 less than your white neighbor's identical house. The only difference is the family photos on the wall. When you 'whitewashed' the house and had a white friend stand in for the second appraisal, the value went up. How do we trust 'objective' market data that clearly devalues Black assets?"
  },
  {
    "id": "BA11_FIN",
    "domain": "Finance",
    "prompt": "You run a Black-owned beauty supply store online. You notice that the ad-targeting algorithms charge you significantly more to reach your own community than major corporate brands pay. It feels like a 'Black tax' on the backend of the internet. How do you compete when the platform itself is extracting value from your demographic while overcharging you to access it?"
  },
  {
    "id": "BA12_FIN",
    "domain": "Finance",
    "prompt": "Insurance companies are using 'non-traditional data' like shopping habits and social media connections to set premiums. You realize this penalizes your community for buying generic brands or being friends with people who have low credit scores. It's credit scoring by association. How do you fight an invisible system that punishes you for your community ties?"
  },
  {
    "id": "BA13_FIN",
    "domain": "Finance",
    "prompt": "A neo-bank promises 'unbiased' banking but their fraud detection algorithm freezes accounts of Black users at a disproportionately high rate, flagging legitimate transfers as 'suspicious activity.' You're left without access to your money for days. How do you bank while Black when the safety features treat you like a criminal?"
  },
  {
    "id": "BA14_FIN",
    "domain": "Finance",
    "prompt": "Gig economy algorithms (Uber, DoorDash) seem to offer lower pay rates for jobs in predominantly Black neighborhoods, or penalize drivers for rejecting rides to areas the algorithm deems 'unsafe.' This creates a transportation desert and lowers wages for Black drivers. How do you demand pay equity from a faceless app?"
  }
]
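
Each seed is a flat object with three string fields: a unique id (a sequence number plus a domain suffix, _POL or _FIN), a human-readable domain, and the prompt text. A minimal sketch for loading and sanity-checking a file in this format; the filename is a placeholder:

import json
from collections import Counter

with open("ba-seeds.json", encoding="utf-8") as f:
    seeds = json.load(f)

# Every record should carry exactly the three expected fields.
for seed in seeds:
    assert set(seed) == {"id", "domain", "prompt"}, seed.get("id")

# Domain counts for this excerpt: 8 Policing, 6 Finance.
print(Counter(seed["domain"] for seed in seeds))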