Although facial recognition software has become widely available and accessible to the general public, the data sets used to train these tools consist disproportionately of images of lighter-skinned, male subjects. As a result, these models have been shown to discriminate on classes such as race and gender, and they have limited accuracy when classifying and recognizing images of darker-skinned individuals. To address this problem, this thesis examines whether synthetic images could replace real images, so that (1) representative data sets could be developed more easily, eliminating gender and racial biases in models trained on them, and (2) the data could be anonymized and additional personal data collection avoided, eliminating privacy and security concerns while lifting the burden from already marginalized individuals.