Frontiers in Social Science features new research in the flagship journals of the Social Science Research Council’s founding disciplinary associations. Every month we publish a new selection of articles from the most recent issues of these journals, marking the rapid advance of the frontiers of social and behavioral science.

A minimum wage floor increased wages but reduced hours worked

In a randomized trial conducted on an online job platform, imposing a minimum wage on firms increased wages but reduced hours worked, with no net effects on workers’ total earnings.

Author(s)
John J. Horton
Journal
American Economic Review
Citation
Horton, John J. 2025. "Price Floors and Employer Preferences: Evidence from a Minimum Wage Experiment." American Economic Review 115 (1): 117–46. https://doi.org/10.1257/aer.20170637.
Abstract

Firms posting job openings in an online labor market were randomly assigned minimum hourly wages. When facing a minimum wage, fewer firms hired, but those they did hire paid higher wages. Hours worked fell substantially. Treated firms shifted to hiring more productive workers. Using the platform's imposition of a market-wide minimum wage after the experiment, I find that many of the experimental results also hold in equilibrium, including the substitution towards more productive workers. However, there was also a large reduction in the number of jobs posted for which the minimum wage would likely bind.

Little support for content moderation on social media platforms

In a series of survey experiments, exposure to uncivil or intolerant social media posts did not lead to majority support for content moderation by social media platforms.

Author(s)
Franziska Pradel, Jan Zilinsky, Spyros Kosmidis, and Yannis Theocharis
Journal
American Political Science Review
Citation
Pradel, Franziska, Jan Zilinsky, Spyros Kosmidis, and Yannis Theocharis. 2024. "Toxic Speech and Limited Demand for Content Moderation on Social Media." American Political Science Review 118 (4): 1895–1912. https://doi.org/10.1017/S000305542300134X.
Abstract

When is speech on social media toxic enough to warrant content moderation? Platforms impose limits on what can be posted online, but also rely on users’ reports of potentially harmful content. Yet we know little about what users consider inadmissible to public discourse and what measures they wish to see implemented. Building on past work, we conceptualize three variants of toxic speech: incivility, intolerance, and violent threats. We present results from two studies with pre-registered randomized experiments (Study 1, N=5,130; Study 2, N=3,734) to examine how these variants causally affect users’ content moderation preferences. We find that while both the severity of toxicity and the target of the attack matter, the demand for content moderation of toxic speech is limited. We discuss implications for the study of toxicity and content moderation as an emerging area of research in political science with critical implications for platforms, policymakers, and democracy more broadly.
