Artificial Intelligence and the Operationalization of SESTA-FOSTA on Social Media Platforms.
Details
- Material type
- Dissertation (foreign)
- Main entry, personal name
- Barreto, Renata.
- Title and statement of responsibility
- Artificial Intelligence and the Operationalization of SESTA-FOSTA on Social Media Platforms.
- Publication, distribution, etc.
- [S.l.] : University of California, Berkeley, 2025
- Publication, distribution, etc.
- Ann Arbor : ProQuest Dissertations & Theses, 2025
- Physical description
- 132 p.
- General note
- Source: Dissertations Abstracts International, Volume: 87-04, Section: B.
- General note
- Advisor: Obasogie, Osagie; Lee, Taeku.
- Dissertation note
- Thesis (Ph.D.)--University of California, Berkeley, 2025.
- Summary, etc. note
- This dissertation investigates how AI content moderation systems reflect and intensify sociotechnical inequalities, particularly in the wake of U.S. regulatory shifts like SESTA-FOSTA. Through three interdisciplinary studies, it traces the entanglement of law, algorithmic enforcement, and marginalized users' online experiences. The first paper audits the open_nsfw image classifier, using an Internet Archive dataset and embedding-based computer vision to reveal systematic gender- and sexuality-based bias. It shows how regulatory pressure led Tumblr to adopt a model that disproportionately flagged queer, femme, and artistic content, embedding mainstream moral norms into machine learning systems. The second paper analyzes a large survey of Instagram creators (primarily BIPOC, LGBTQ+, and disabled) collected with Salty. It documents disparities in perceived takedowns, shadowbanning, and suspensions, especially post-SESTA-FOSTA, and shows how users interpret and resist opaque algorithmic governance. The third paper provides a sociolegal analysis of SESTA-FOSTA's impact on CDA 230, mapping case law and demonstrating how courts have struggled with its ambiguity and symbolic scope. Using theories of legal endogeneity and delegated enforcement, it critiques how the law outsources regulatory power to platforms, often harming vulnerable users. Together, these studies expose the mechanics and consequences of embedding law into code, and they call for external audits, community-informed regulation, and governance frameworks that address harm without further marginalization.
- Subject added entry, topical term
- Sociology.
- Subject added entry, topical term
- Computer science.
- Subject added entry, topical term
- Law.
- Subject added entry, topical term
- Web studies.
- Uncontrolled index term
- AI ethics
- Uncontrolled index term
- AI safety
- Uncontrolled index term
- Algorithmic bias
- Uncontrolled index term
- Content moderation
- Uncontrolled index term
- Platform governance
- Uncontrolled index term
- Sociotechnical systems
- Added entry, corporate name
- University of California, Berkeley. Jurisprudence & Social Policy.
- Host item entry
- Dissertations Abstracts International. 87-04B.
- Electronic location and access
- View full text
MARC
■008260219s2025 us ||||||||||||||c||eng d
■001000017359348
■00520260202105106
■006m o d
■007cr#unu||||||||
■020 ▼a9798297601192
■035 ▼a(MiAaPQ)AAI32236627
■040 ▼aMiAaPQ▼cMiAaPQ
■0820 ▼a301
■1001 ▼aBarreto, Renata.
■24510▼aArtificial Intelligence and the Operationalization of SESTA-FOSTA on Social Media Platforms.
■260 ▼a[S.l.]▼bUniversity of California, Berkeley. ▼c2025
■260 1▼aAnn Arbor▼bProQuest Dissertations & Theses▼c2025
■300 ▼a132 p.
■500 ▼aSource: Dissertations Abstracts International, Volume: 87-04, Section: B.
■500 ▼aAdvisor: Obasogie, Osagie; Lee, Taeku.
■5021 ▼aThesis (Ph.D.)--University of California, Berkeley, 2025.
■520 ▼aThis dissertation investigates how AI content moderation systems reflect and intensify sociotechnical inequalities, particularly in the wake of U.S. regulatory shifts like SESTA-FOSTA. Through three interdisciplinary studies, it traces the entanglement of law, algorithmic enforcement, and marginalized users' online experiences. The first paper audits the open_nsfw image classifier, using an Internet Archive dataset and embedding-based computer vision to reveal systematic gender- and sexuality-based bias. It shows how regulatory pressure led Tumblr to adopt a model that disproportionately flagged queer, femme, and artistic content, embedding mainstream moral norms into machine learning systems. The second paper analyzes a large survey of Instagram creators (primarily BIPOC, LGBTQ+, and disabled) collected with Salty. It documents disparities in perceived takedowns, shadowbanning, and suspensions, especially post-SESTA-FOSTA, and shows how users interpret and resist opaque algorithmic governance. The third paper provides a sociolegal analysis of SESTA-FOSTA's impact on CDA 230, mapping case law and demonstrating how courts have struggled with its ambiguity and symbolic scope. Using theories of legal endogeneity and delegated enforcement, it critiques how the law outsources regulatory power to platforms, often harming vulnerable users. Together, these studies expose the mechanics and consequences of embedding law into code, and they call for external audits, community-informed regulation, and governance frameworks that address harm without further marginalization.
■590 ▼aSchool code: 0028.
■650 4▼aSociology.
■650 4▼aComputer science.
■650 4▼aLaw.
■650 4▼aWeb studies.
■653 ▼aAI ethics
■653 ▼aAI safety
■653 ▼aAlgorithmic bias
■653 ▼aContent moderation
■653 ▼aPlatform governance
■653 ▼aSociotechnical systems
■690 ▼a0626
■690 ▼a0984
■690 ▼a0398
■690 ▼a0800
■690 ▼a0646
■71020▼aUniversity of California, Berkeley▼bJurisprudence & Social Policy.
■7730 ▼tDissertations Abstracts International▼g87-04B.
■790 ▼a0028
■791 ▼aPh.D.
■792 ▼a2025
■793 ▼aEnglish
■85640▼uhttp://www.riss.kr/pdu/ddodLink.do?id=T17359348▼nKERIS▼zThe full text of this material is provided by the Korea Education and Research Information Service (KERIS).


