Executive Summary

For decades, the social sciences have generated knowledge vital to guiding public policy, informing business, and understanding and improving the human condition. But today, the social sciences face serious threats. From dwindling federal funding to public mistrust in institutions to widespread skepticism about data, the infrastructure supporting the social sciences is shifting in ways that threaten to undercut research and knowledge production.

How can we secure social knowledge for future generations?

This question has guided the Social Science Research Council’s Task Force. Following eighteen months of consultation with key players as well as internal deliberation, we have identified long-term developments and present threats that have created challenges for the social sciences but also opened unique opportunities. And we have generated recommendations to address these issues.

Our core finding focuses on the urgent need for new partnerships and collaborations among several key players: the federal government, academic institutions, donor organizations, and the private sector. Several decades ago, these institutions had clear zones of responsibility in producing social knowledge, with the federal government providing the largest portion of funding for basic research. Today, private companies represent an increasingly large share not just of research and funding, but also of the production of data that informs the social sciences, from smartphone usage to social media patterns.

In addition, today’s social scientists face unprecedented demands for accountability, speedy publication, and generation of novel results. These pressures have emerged from the fragmented institutional foundation that undergirds research. That foundation needs a redesign in order for the social sciences to continue helping our communities address problems ranging from income inequality to education reform.

To build a better future, we identify five areas of action: Funding, Data, Ethics, Research Quality, and Research Training. In each area, our recommendations range from enlarging corporate-academic pilot programs to improving social science training in digital literacy.

A consistent theme is that none of the measures, if taken unilaterally, can generate optimal outcomes. Instead, we have issued a call to forge a new research compact to harness the potential of the social sciences for improving human lives. That compact depends on partnerships, and we urge the key players in the construction of social science knowledge—including universities, government, foundations, and corporations—to act swiftly. With the right realignments, the security of social knowledge lies within our reach.


Introduction

Two recent events have highlighted the challenges facing the social sciences. First, the Cambridge Analytica scandal revealed to millions of Facebook users that an outside firm had gained access to their personal data and used it for partisan political purposes by working with a social researcher. Second, United States government officials recently determined that the 2020 census will include an unreliable citizenship question, which will generate an inaccurate count of the nation’s population. The question has not been field tested and threatens to impoverish research and public policy decisions that require high-quality census information.[1]

[1] The Task Force endorses and supports current efforts to protect the integrity of the census and the federal statistical system from political interference. See, for example, the August 2018 letter from the National Academies Task Force on the 2020 census, https://www.nap.edu/read/25215/chapter/1#3, and the statement organized by the Consortium of Social Science Associations, https://www.cossa.org/2018/08/07/cossa-and-25-science-organizations-call-for-removal-of-census-citizenship-question/.

The politicization and misuse of social knowledge[2] in instances such as these have provoked anxieties in many sectors and raised concerns about disregard for the basic norms of research.[3]

[2] For the purpose of this report, social knowledge refers to understandings of human behavior and social structures generated by professional researchers and scientists using advanced training, technical skills, and critical reasoning to push the frontiers of their fields and to promote the public good.

[3] The multiple challenges recently facing the role of scientists and the use of science in the policymaking process of the Environmental Protection Agency (EPA) now encompass social science. In June 2018, the Environmental Economics Advisory Committee, a component of the EPA’s Science Advisory Board, was dismantled by the agency, effectively removing a key site for independent and nonpartisan social science to speak directly to environmental policy. Kevin Boyle and Matthew Kotchen, “Retreat on Economics at the EPA,” Science 361, no. 6404 (2018): 729.

The To Secure Knowledge Task Force sees such serious threats as symptoms of large-scale technological, political, and social transformations.

The Social Science Research Council (SSRC) created the Task Force in April 2017 to examine both present pressures and opportunities. In their discussions, members of the Task Force took particular notice of the long-term developments, described below, that have produced deep challenges for the social sciences. Focusing on new possibilities to advance social knowledge, this report calls for a new “research compact”—which would emerge from reimagined collaboration among researchers, institutions, policymakers, and the private sector—to improve the pursuit of social knowledge and its potential to contribute to the common good.

Though this report focuses principally on the United States, the issues it addresses resonate in other parts of the world as well. The Task Force consulted many constituencies and conducted intensive internal discussions over the course of eighteen months (see the appendix regarding the consultation process). Its goal is to impel conversation and action, including creative experimentation, by leaders and organizations concerned with the future of social knowledge. These include academic and administrative leaders in higher education, as well as colleagues in government, business, the nonprofit world, and the media.

The report emphasizes both protective and proactive dimensions of “securing knowledge.” We seek to safeguard social knowledge by upholding the essential principles that must be in place to produce excellent research and inform public action. Concurrently, we want to seize possibilities to enhance social knowledge and its contributions to improving the human condition, and thus promote an expansive vision for the social sciences.[4]

[4] Of course, we are aware that “seizing possibilities” can also be done in ways that are more or less altruistic. Our intention here is to encourage the collective leveraging of knowledge across sectors and for the good of the many rather than the few.

Both tasks are urgent and timely. While views of social science are more politicized than in the past, social science has never been more important, both inside and outside the academy. From public health and climate change to issues concerning inequality and social media, knowledge generated by social scientists can shed light on many of today’s challenges. Social science also provides methods for solving problems—from organ donor matching systems to internet search algorithms.[5] If this promise is to be secured, new partnerships must be crafted among academics, governments, corporate leaders, philanthropic organizations, and NGOs.

[5] See the multiple examples of the important impact of social science research recounted in National Academies of Sciences, The Value of Social, Behavioral and Economic Sciences to National Priorities: A Report to the National Science Foundation, National Academies Press, 2017, as well as the “Why Social Science?” series curated by the Consortium of Social Science Associations (COSSA).

The challenge of building a new research compact among these actors lies at the center of this report and its recommendations. It chronicles changes to the post–World War II institutional framework that helped launch today’s social science, explores the current research ecosystem and how it has been destabilized, and proceeds to identify five key areas for thoughtful intervention:

  • Funding
  • Data
  • Ethics and Data Integrity
  • Research Quality
  • Research Training

The report also draws attention to ongoing and potential new efforts in each of these five areas to protect and advance knowledge by identifying prospects for partnerships and innovation.


A Knowledge System in Flux

The United States possesses a vibrant system for social science research. Built during the second half of the twentieth century, it connects universities, government institutions, firms, and foundations. For decades, partnerships among these institutions supported principles at the core of the research enterprise—openness, accessibility, transparency, and quality. Central too was the notion that research findings are both authoritative and provisional: research remains open to challenge from potentially better ideas and encourages debate when new evidence and arguments arise.

But the underpinnings of this system are shifting, and some of its foundations have come under stress.

The Digital Revolution

We are experiencing a revolution in media and technology that reverberates at every level of research. Just as the print revolution transformed scientific inquiry, the contemporary digital revolution is changing the meaning of empirical research, modes for producing knowledge, relationships among researchers, and practices for preserving information.[6]

[6] While this report does not address it directly, the digital age profoundly affects access to research results and the need to archive data securely and accessibly. Important experiments in open access are underway, although a significant amount of scholarship—and likely an even higher percentage of publications in the most selective journals—remains behind paywalls and is not available or affordable to many scholars, as well as practitioners and the general public. Archival and preservation needs are too often neglected. What is worth saving for future generations, and how do we save it? A sixteenth-century book can still be taken down from the shelf of a library and read without further ado, whereas a CD has a physical lifetime of a mere thirty years. Digitization requires scientific archiving to be reinvented, both conceptually and materially.

The possibilities for new kinds of research abound, as do the perils. The digital revolution enables new forms of knowledge production, such as algorithmic searches of huge troves of data, and it is remaking research practices such as the controlled experiment, the statistical survey, and archiving. Several decades ago, much of the world’s social science data was generated by universities, government agencies, or international organizations such as the Organisation for Economic Co-operation and Development (OECD). Today, information technologies have caused an exponential increase in the quantity of socially relevant data, much of which has become the intellectual property of companies.

The institutional arrangements that were conducive to the production of social knowledge in the mid-twentieth century are no longer adequate to harness the potential of social science in the digital era. And they have left us with a host of new dangers and questions, such as how to protect the privacy of human subjects when so much of what we do and think is now recorded as part of everyday life.

Accountability Crisis

In addition to these technological transformations, social science also plays a dramatically different role in public life outside the academy than it did several decades ago. Many citizens want more information but, at the same time, are more skeptical of the value of social science findings and more insistent on accountability, often understood as short-term results. Researchers face pressure from governments, nonprofit organizations, the media, and the public at large to allow broad access to data and results, as well as to provide evidence of relevance and impact. The demands for social scientists to produce a “return on investment” often impose short-term timeframes on researchers with long-term questions about issues of great complexity.[7]

[7] Kenneth Prewitt, “Is Knowledge Good for the Public?,” Social Research 84, no. 3 (2017): xxi–xxxv.

Even as the pressures on social science research have increased, the public status of research is tangled in a broader cultural and political moment in which public skepticism clouds interpretations of robust science. In some sectors, public doubt has undermined the role of evidence in arbitrating disagreements and guiding policy.[8] Broader trends such as the spread of misinformation and attacks on mainstream institutions have further complicated the role of social science in today’s world. In this context, the Task Force praises efforts that shed light on the sociotechnical mechanisms that undergird today’s misinformation ecology.[9]

[8] See the Public Face of Science Initiative, Perceptions of Science in America (Cambridge, MA: American Academy of Arts & Sciences, 2018); Gordon W. Gauchat, “The Political Context of Science in the United States: Public Acceptance of Evidence-Based Policy and Science Funding,” Social Forces 94, no. 2 (2015): 723–46, http://dx.doi.org/10.1093/sf/sov040; and Gauchat, “The Politicization of Science in the Public Sphere: A Study of Public Trust in Science in the US, 1974–2010,” American Sociological Review 77, no. 2 (2012): 167–87, http://dx.doi.org/10.1177/0003122412438225.

[9] This includes the work of the Social Media and Political Participation Lab at New York University and the Media Manipulation research team at the Data and Society Research Institute.

Social scientists across a range of fields have responded to this political moment by stepping up self-scrutiny through movements for transparency, open access, deep querying of ethical practices, and commitment to public-facing scholarship. The Task Force applauds these trends and encourages continued innovation in these areas. At the same time, we argue that a new research compact between science, government, the private sector, and the public must be crafted, one that encourages risk taking as well as risk management, collaboration, and independent, long-term inquiry.

Institutional Transformations

With the revolution in information technologies and changes in accountability expectations, the roles of institutions at the core of social science research have shifted in the last fifty years. The institutional infrastructure supporting social science has always been diverse, but historically there was a clear leader in research funding. Since at least the mid-twentieth century, the US government has funded the lion’s share of research. Private universities and foundations contributed resources on a lesser scale, and private firms often conducted applied research. The funding patterns for each actor emerged in response to a range of institutional priorities: government focused on research in the service of the common good, universities and foundations sponsored research on a longer timescale and with a public mission, and commercial firms typically explored consumer understanding, buying behavior, and product development.

In recent years, this ecosystem has changed in ways that are deeply consequential for efforts to secure knowledge. New players are involved in social knowledge production, and engagement in various kinds of research activity is more widely distributed.

Government: Whereas governments once focused primarily on data collection—gathering vast amounts of survey and administrative data—today’s policymakers are also involved in social experiments, including “nudge units” that sponsor or conduct behavioral research to improve governance.[10]

[10] The Obama administration’s Social and Behavioral Sciences Team represents one example of such a program.

NGOs: The network of nonprofits that relied on and generated social knowledge was once small. Today, advocacy associations and think tanks engage with social science on issues ranging from human rights and public health to education reform and racial inequality.

Media: Large media firms now employ data journalists who conduct research and analysis and generate accessible accounts of social life.

Philanthropies: Historically vital sources of support for social science research, philanthropies increasingly conduct research in-house or commission research that focuses more on short-term payoffs than on long-term knowledge innovation and capacity building.

Corporations: Most dramatically, private industry has destabilized the social knowledge system. Corporations, of course, have always invested in social science research, but today the core product of many of the most successful businesses is social information. Today’s companies own exabytes of social data, most of it concerning human behavior, and many also control the tools to gather and analyze that data. As a consequence, private-sector norms of flexibility, efficiency, and profitability take their place alongside traditional scholarly ones of reliability, critical scrutiny, and openness.

***

These developments—the digital revolution, today’s accountability crisis, and institutional transformations—are conjoined, and the speed with which they have unfolded creates both opportunities and serious risks. Far more purposeful collaboration across institutions will be required to secure knowledge.


Key Areas for Collaboration

Funding

Securing knowledge requires substantial financial investment to create the conditions for scientific innovation and to support new research. For much of the post–World War II era in the United States, the federal government played a central role in funding basic research, most notably with the creation of the National Science Foundation (NSF) as the principal nonmilitary research funding agency. The NSF’s mandate originally excluded the social and behavioral sciences, and even after its purview widened, social science has continued to face periodic challenges in lobbying for government support.[11]

[11] In Congress, regular attacks have taken place against the NSF’s division focused on social science research, the Directorate for Social, Behavioral, and Economic (SBE) Sciences.

More recently, the federal share of total research expenditures has declined, while support from the private sector has grown. In the 1970s, the government accounted for over 70 percent of basic research funding, but that figure has fallen below 50 percent in recent years.[12] And the trend seems unlikely to reverse in the near term. How can we preserve the public benefits associated with federal funding on a terrain where the private sector claims increasing prominence?

[12] For data and discussion of the change, see the American Association for the Advancement of Science, “Federal R&D Budget Overview,” http://www.aaas.org/program/rd-budget-and-policy-program; and Jeffrey Mervis, “Data Check: U.S. Government Share of Basic Research Funding Falls Below 50%,” Science (March 9, 2017). Data on how the percentages may have shifted between public and private spending for social science research specifically is not available. According to NSF data, federal funding for social science was slightly lower in 2016 (in current dollars) than in the late 1970s. See “Research by Science and Engineering Discipline,” American Association for the Advancement of Science website, last updated July 2018, https://www.aaas.org/page/research-science-and-engineering-discipline.

Changes in funding patterns raise deep questions about how social science research can continue to generate benefits for all portions of society, rather than just profit-generating segments. Federal funding provides long-term investments in activities whose outcomes cannot be fully identified in advance.[13] By contrast, the business sector is obligated to maximize returns to shareholders and pursue short-term priorities. These are not typically incentives for making long-term investments in knowledge, including research that might mitigate unanticipated harms.[14] Historically, private foundations have complemented government support by investing in long-term horizons and building research capacity. Today, by contrast, many foundations have shifted priorities toward short-term project goals and an increased emphasis on observable indicators of impact.

[13] For more information on the wide-ranging consequences of federally funded research for socially beneficial goals, see the National Academies, The Value of Social, Behavioral and Economic Sciences to National Priorities; and the Committee on Criteria for Federal Support of Research and Development, Allocating Federal Funds for Science and Technology (Washington, DC: National Academies Press, 1995).

[14] Joseph Stiglitz has observed: “Knowledge can be viewed as a public good, and the private provision of a public good is essentially never optimal.” Stiglitz, “Leaders and Followers: Perspectives on the Nordic Model and the Economics of Innovation,” NBER Working Paper 20493 (September 2014).

The new funding ecosystem—marked by the private sector’s growing ascendancy in research matters, shifting priorities of government, and new mandates among funding organizations—will not necessarily result in dire consequences for the social sciences. Indeed, the public good has captured the attention of business and corporate enterprises, as demonstrated by the Business Council and the Council on Competitiveness, and by a broader interest in corporate social responsibility (CSR) practices.[15] Nonetheless, the irreplaceable role of government funding requires a recalibration by all components of the knowledge system to ensure that the funding of independent, long-term, and socially beneficial research continues to be a priority for the state and other sectors.

[15] Consider the emphasis the Council on Competitiveness places on the education of citizens in a changing world. The organization asserts: “From technology to trade skills, there is no issue on which Council members are more united than in their desire for progress in building a talented, diverse workforce.” Council on Competitiveness, 2017 Clarion Call, 9. Moreover, “American consumers and companies are increasingly reliant on a globally engaged economy for their jobs and standards of living,” 12.

Recommendations: Funding

  • Promote and deepen efforts to advocate for social science support by such organizations as the American Association for the Advancement of Science, the Consortium of Social Science Associations, and, primarily focused on the United Kingdom, the Campaign for Social Science.
  • Design and implement new models for public-private research funding partnerships that include government, the private sector, the academy, and philanthropy. A start will be made at an SSRC-sponsored convening of research stakeholders in the spring of 2019.

Data

Today’s information technologies have transformed the way social scientists gather, share, process, and even conceive of data. While the data revolution creates exciting possibilities for social science research, it also raises challenges, particularly for data access. Historically, the federal government has played a leading role in gathering and coordinating access to data. Today, the greater institutional complexity of data collection and access requires a new research compact that encompasses both governmental and private-sector data and the creation of a “data commons” through which data may be shared.

The Federal System

Along with state and municipal governments, the federal government maintains complex and largely uncoordinated systems for gathering data on multiple aspects of the lives of US citizens. Data arise from both administrative processes and systematic surveys, and the ways in which government collaborates with social scientists have changed in recent years.[16]

[16] For a recent overview, see the January 2018 special issue of the Annals of the American Academy of Political and Social Science, especially the introduction by Andrew Reamer and Julia Lane, “A Roadmap to a Nationwide Data Infrastructure for Evidence-Based Policy Making,” Annals of the American Academy of Political and Social Science 675, no. 1 (2018): 28–35.

Government-academic partnerships have historically been anchored in the design of survey and sampling techniques and in sharing census and other data. In particular, the census became a model for government-academic social research cooperation (see box 1). Short-term political pressures, cost-reduction demands, calls for greater data granularity, and declining response rates to surveys have put pressure on these collaborative arrangements. Amid these stresses, data sharing between the government and academia, as well as future collaborations that enable independent research using privately held data, must continue to safeguard several key norms, including privacy protections, baselines for measuring trends, academic freedom, and a commitment to improving the collection of social data for public benefit.

The overall framework for government-academy data sharing requires a conscientious re-examination. The Task Force endorses the Commission on Evidence-Based Policymaking’s recommendation to create a National Secure Data Service (NSDS) to link databases across the federal system. Such a platform would make data more accessible to researchers and enhance privacy protections.[17] The NSDS could become a “state-of-the-art resource for improving government’s capacity to use the data it already collects,” according to the commission’s 2017 report.

[17] Commission on Evidence-Based Policymaking, The Promise of Evidence-Based Policymaking (Washington, DC: Commission on Evidence-Based Policymaking, 2017), https://www.cep.gov/cep-final-report.html.

Box 1: The Census as a Historical Model

The oldest and most central part of the US social knowledge system has been the decennial census, mandated in the US Constitution.[18] The census has served as a model for academic-government collaboration, with social scientists working to improve methods of data collection, ensure quality, and protect the system from political interference. Indeed, the American Statistical Association (ASA) was founded on the eve of the 1840 census with the express purpose of improving its technical features in order to prevent the manipulation of the census in the service of race science, as had occurred in prior decades.

[18] Following the creation of the census, a steady expansion in the government’s data collection agencies, initiatives, and programs has taken place—the thirteen principal federal statistical agencies alone spend approximately $6 billion annually on data collection. There are at least another hundred or so statistical offices scattered across the government. Indeed, the statistical system of the US government is itself a diverse institutional ecology.

Later, the Census Bureau began turning to university experts, professional associations, and, by the end of the twentieth century, think tanks and private contractors. These relationships have proven mutually beneficial; the influence of the social sciences increased the likelihood that census and other government survey data would be suitable for testing hypotheses about issues ranging from employment mobility to educational reform.

Today’s scholars continue this engagement by offering new methods for incorporating administrative and other kinds of data into the census and analyzing them. They also support the census by advocating for sufficient funding, analytic rigor, and autonomy from the political process in the lead-up to the 2020 census.

Private Data

From credit card transactions to internet search patterns to smartphone usage, commercial companies are producing new forms of social data. This data creates exciting possibilities for social science research, but it also introduces new questions about profit motives, intellectual property rights, and other issues stemming from its commercial nature.[19]

[19] There are also important differences in corporate versus academic approaches to data. Research using digital data from the commercial sector tends to be theory poor and data rich; social science is traditionally the inverse: theory rich and data poor. Further, in terms of analytical techniques, the technology sector boasts a range of computational innovations focused on prediction of social behavior and interaction. By contrast, social science historically focuses on causal mechanisms and interpretive understanding. On the latter point, see Jake M. Hofman, Amit Sharma, and Duncan Watts, “Prediction and Explanation in Social Systems,” Science 355, no. 6324 (2017): 486–488.

Aligning the fundamental motives of government and academia has created challenges for navigating data-sharing partnerships, but the growing role of the commercial sector poses even greater complexities. The rewards to all parties are clear: the new forms of data greatly enrich the possibilities for social science. The challenge involves creating the right knowledge infrastructure to generate social benefits from this new data.[20]

[20] The Task Force acknowledges the prescient calls in this direction made by scholars in the past decade. See Duncan Watts, “A Twenty-First Century Science,” Nature 445 (2007): 489; and David Lazer et al., “Life in the Network: The Coming Age of Computational Social Science,” Science 323, no. 5915 (2009): 721–23.

Several promising pilot programs demonstrate possibilities for private-sector collaboration with government and academic researchers (see box 2). The Task Force endorses the creation of a data commons, hosted by a third-party institution, to link public and private data. The proposal, outlined by scholars Robert Groves and Adam Neufeld, would make data available temporarily through an intermediary institution, rather than create a permanent data depository.[21] The initiative would require strict privacy guidelines that would still allow independent researchers, as well as industry and the public sector, to access critical data on particular public issues.

[21] Groves and Neufeld, Accelerating the Sharing of Data across Sectors to Advance the Common Good (Washington, DC: Georgetown University Beeck Center for Social Impact and Innovation, 2017).

Such ambitious initiatives face significant obstacles, but they are essential to secure knowledge in ways that distribute both benefits and responsibilities across the institutional infrastructure. Along these lines, the corporate world has already created a model for inculcating public values in the business models of profit-generating firms: corporate social responsibility (CSR) initiatives. Just as CSR has helped articulate and realize corporate commitments to do good for society, an initiative focused on identifying best practices for data sharing could help ensure that the public will benefit from the potential of today’s information technologies.[22]

[22] See, for example, Shamina Singh, “A Call to Action on Data Philanthropy,” Mastercard Center for Inclusive Growth, October 4, 2016, https://mastercardcenter.org/action/call-action-data-philanthropy/.

Box 2: Pilot Programs in Public-Private Collaboration

Several projects are taking important steps toward developing collaborative relationships among commercial actors, scholars, and policymakers.

For years, a focus of the Alfred P. Sloan Foundation has been lowering the transaction costs incurred when researchers wish to study government, company, or other administrative data that was not originally collected for research purposes. As part of this effort, the Sloan Foundation has established a number of sector-specific intermediaries called Administrative Data Research Facilities (ADRFs), which are being organized into an open network.[23]

[23] More about the Administrative Data Research Facilities Network is available at https://www.adrf.upenn.edu/.

Another experimental project, the SSRC’s Social Data Initiative[24]—a partnership with Social Science One, LLC, and a consortium of nonprofit foundations—will facilitate social scientists’ access to Facebook data under conditions that protect privacy and intellectual property, while allowing scholarly independence in data analysis and publication venue.[25] Peer-review competitions administered by the SSRC will identify researchers who will examine questions about social media’s influence on democracy and elections.

[24] More about SSRC’s Social Data Initiative is available at https://www.ssrc.org/programs/view/social-data-initiative/.

[25] See Gary King and Nate Persily, “A New Model for Industry-Academic Partnerships” (working paper), updated August 10, 2018, https://gking.harvard.edu/files/gking/files/partnerships.pdf.

These initiatives could provide models for technology companies—and the private sector more broadly—to make their data available for research on issues that address public problems.

Recommendations: Data

  • Encourage the expansion of new experiments in data sharing, such as the Sloan Foundation’s data facilities network and the SSRC’s Social Data Initiative.
  • Ensure cumulative learning from experimental private-public collaborations by conducting and sharing evaluations of pilot projects such as those of the ADRFs and the SSRC.
  • Draw on “data philanthropy” and similar frameworks to develop best practices for access to privately held data, similar to corporate social responsibility commitments.
  • Promote the development of a data commons, both in terms of creating a national data service for government-held data and a multisource platform linking private-sector data.

Ethics and Data Integrity

Today’s information technologies and institutional changes raise new questions about the ethics and norms that guide social science inquiry. The Task Force has focused on two particular areas that would benefit from collaboration and the creation of a new compact to guide social knowledge creation: data integrity and ethics governing research protocols.

Data Integrity

Securing knowledge requires not only expanding the accessibility of data but also protecting it from corruption, manipulation, and exploitation for partisan, fraudulent, or repressive ends. Online data hosting facilitates collaboration among a larger range of social scientists, but greater accessibility comes with risks. The more data that accumulates, the more vulnerable it becomes to outside actors with wide-ranging motives, from individual hackers and criminal networks to businesses and national governments. Even more worrisome, cyber breaches will increasingly seek not just to steal but to alter data.[26]

[26] “Future cyber operations will almost certainly include an increased emphasis on changing or manipulating data to compromise its integrity (i.e., accuracy and reliability) to affect decision-making, reduce trust in systems, or cause adverse physical effects.” James R. Clapper, Director of National Intelligence, “Statement for the Record: Worldwide Threat Assessment of the US Intelligence Community before the Senate Armed Services Committee,” February 6, 2016, https://www.dni.gov/files/documents/SASC_Unclassified_2016_ATA_SFR_FINAL.pdf.

Owing to recent advances in computer technology, we are entering a world in which technology blurs distinctions between images, voices, and videos created in the moment and those altered after the fact. Already, artificial intelligence has enabled the creation of realistic photographs of nonexistent people. Computer scientists can engineer voices that sound exactly like specific, real individuals. Fake video is on the horizon. If a picture is worth a thousand words, what will happen when policymakers and social scientists cannot authenticate photographs, audio recordings, or videos?

No single party can resolve the challenges in data authentication or unilaterally protect the integrity of information. Indeed, the problems are not just technical; they also raise questions about the accessibility of information, the tradeoffs of data protection, and other ethical matters that require technology companies to work with academics and policymakers.

Protecting Human Subjects

For decades, research on human subjects has been governed according to the norms of “informed consent”: individuals who participate in studies must consent to the terms of participation. However, the model—realized through the creation of institutional review boards (IRBs)—fails to address many challenges facing today’s social science research. While the IRB process focuses on important but fairly narrow dimensions of informed consent (see box 3), a broader consideration of ethics is needed to guide the next generation of research on human subjects in the academy and other sectors.

One challenge of the traditional model is that informed consent fails to address power dynamics between researcher and subject, the potential for unintended uses and consequences of research findings, and ethical questions inherent in new forms of social data. For example, unequal power dynamics between researchers and vulnerable populations complicate simple calculations of consent. Further, new technologies raise deep concerns over privacy with respect to the collection and analysis of large amounts of personal data. Even though users must consent—in narrow technical terms—to whatever conditions the technologies and companies require, a deeper understanding of “consent” is needed to guide appropriate uses of data.[27]

[27] The permanence of data is also an issue. Does informed consent granted today imply consent for unanticipated future uses of the data?

The growing use and sophistication of experimental methods—in which customers, platform users, and others become research subjects—raise further questions about consent and ethics. If social scientists are to fulfill their hopes of accessing new data to work on social issues of public importance, a wider and more complex discussion of ethical research practices is required.

The Task Force encourages new initiatives to rethink the ethical compact that governs social science research. More broadly, it calls for collaboration between university researchers, IRBs, federal regulators, and private companies to create ethical guidelines and privacy protections appropriate to contemporary social science.

Box 3: Traditional Ethics Review and the IRB

In the recent past, federal regulations (typically referred to as the “Common Rule”) created the principal mechanism for guiding research ethics and protecting human research subjects in the sciences: institutional review boards (IRBs). The IRB has traditionally focused on the principle of informed consent, which has served as a useful model for biomedical research. However, social science researchers have at times been frustrated by IRBs’ narrow interpretation of informed consent, given that it fails to consider research contexts that social scientists frequently encounter.

Recent proposals have suggested changes to the Common Rule that would simplify compliance for many social scientists.[28] While these steps are encouraging, they are occurring alongside the emergence of deeper challenges to ethical research practices. The current IRB structure has at times been both overly cumbersome and too limited to promote ethical research conduct, protect the privacy and confidentiality of human subjects, and address the nature of those subjects’ relation to the research itself.

[28] These changes, informed by collaborative discussions between academic researchers and the federal government, appear to introduce more flexibility in the ways that diverse kinds of social science research practice are evaluated in terms of compliance. Consortium of Social Science Associations, “Common Rule Agencies Release Proposal for 6-Month Delay of Revisions, Optional Implementation of ‘Burden-Reducing’ Provisions; Comments Sought for 30 Days,” COSSA Washington Update, April 25, 2018, http://www.cossa.org/2018/04/25/common-rule-agencies-release-proposal-for-6-month-delay-of-revisions/.

New initiatives have begun to address some of these issues. For example, Pervasive Data Ethics for Computational Research (PERVADE), a project funded by the National Science Foundation, is developing new ethical research practices related to privacy concerns.[29]

[29] More about PERVADE is available at https://pervade.umd.edu.

Recommendations: Ethics and Data Integrity

  • Construct cross-sectoral partnerships to address threats to data integrity and to explore the use of data credibility forensics.
  • Convene social researchers, ethicists, and other constituencies to develop best practices in research ethics.
  • Ensure cumulative learning of experiments in understanding and designing new ethical codes and practices, building on the SSRC’s work with PERVADE as well as other collaborations.

Research Quality

The peer-review system for evaluating and disseminating scholarly research has been an indispensable component of the knowledge infrastructure. But it has come under strain in recent years due to the digital revolution, changes within the publishing industry, and intensified publication pressures from within the academy. Working together, academics, scholarly societies, and publishers, along with funding organizations, urgently need to develop mechanisms to uphold quality while fashioning criteria and incentives that could better serve both scholars and the public.

Blind peer review by scholars and editors continues to be the gold standard for research evaluation, but the system is far from perfect. As an example, a recent study found that women researchers are less likely to be selected as reviewers. Gender and other biases in the peer-review process have negative implications for scholars’ professional networks and trajectories, as well as for setting research priorities.[30] The pressure of scale is stressing scholars’ capacity to curate and identify excellence through peer assessment. The number of researchers worldwide is rising at a rate of about 4 percent per year.[31] In this context, many new venues for scholarship—including outright fraudulent publications and others with low standards—have appeared to meet the increased demand for publishing outlets.

[30] See, for example, Jory Lerback and Brooks Hansen, “Journals Invite Too Few Women to Referee,” Nature 541, no. 7638 (January 26, 2017): 455–57, https://doi.org/10.1038/541455a; and Cat Ferguson, Adam Marcus, and Ivan Oransky, “Publishing: The Peer-Review Scam,” Nature 515, no. 7528 (November 27, 2014): 480–82, https://doi.org/10.1038/515480a.

[31] United Nations Educational, Scientific and Cultural Organization, UNESCO Science Report: Towards 2030 (Paris: UNESCO Publishing, 2015), http://unesdoc.unesco.org/images/0023/002354/235406e.pdf.

Many high-quality journals and academic presses have already opened discussions about criteria for publishing decisions. A central question involves how to value and elevate research results beyond those perceived as most original, counterintuitive, or intriguing, which the current system privileges. These criteria can provide incentives for cherry-picking results, thereby misrepresenting the actual state of knowledge on a particular research topic. Such priorities also undervalue the importance of confirming results and null findings.[32] This phenomenon threatens serious damage, both when unreliable findings are acted upon and to public trust in the research enterprise.

[32] Recent discussions about the lack of replication in psychology and other fields are, at least in part, tied to only positive “breakthrough” results being submitted to journals, because these are what get published.

The increasing quantity of submissions has another consequence: the slow pace of reviewing and publishing papers. Reviewing delays not only slow down learning and debate among scholars, but they also frustrate funders and potential “consumers” of knowledge. The Task Force encourages new experiments that seek to increase the availability of social science findings and to rethink the criteria and processes for reviewing quality in the social sciences (see box 4). Given the importance of the private sector in social research, these experiments should include company-based researchers in efforts to develop transparent standards, processes, and norms for research quality and methods.

Box 4: Pilot Projects in Publishing

Several promising initiatives have experimented with accessible publishing strategies. The Public Library of Science (PLOS) has pioneered a project that establishes a bar for accuracy and basic quality in order to make more work available freely and quickly. The American Sociological Association’s new journal Socius has adopted a version of this access model while publishing work, including early discoveries, “that is empirically accurate and theoretically novel.”[33]

[33] Lisa A. Keister and James Moody, “Academic Publishing: New Rules, New Opportunities,” Parameters, January 11, 2017, http://parameters.ssrc.org/2017/01/academic-publishing-new-rules-new-opportunities/.

Another model, of which perhaps the best-known example is the working paper series of the National Bureau of Economic Research (NBER), involves posting research papers prior to publication.[34] This approach allows other scholars to comment on the work without compromising authors’ publication possibilities in journals that accept this process as common practice.

[34] See, for example, the National Bureau of Economic Research Working Paper Series, http://www.nber.org/papers/.

Recommendations: Research Quality

  • Increase systematic understanding of the current peer-review process by developing quantitative and qualitative studies of evaluation strategies for books and journals.
  • Convene academic publishers and scholarly societies to rethink peer review, with a special focus on issues related to publication timelines and criteria.
  • Encourage the adoption of more pre-publication outlets to facilitate learning and sharing of ideas.

Research Training

Another dimension of social science infrastructure concerns advanced research training, which has not sufficiently adapted to the ways in which social science skills intersect with labor markets. While employers seek new skills for changing work environments, academia’s models for preparing researchers have remained relatively static.

As the private sector has increased its share of funding for social research, the need for researchers with relevant skills has grown. At the same time, states’ funding for public higher education has decreased dramatically, and private universities continue to trim budgets for permanent research faculty positions. As a result, newly trained social scientists have access to a markedly lower number of full-time, tenure-track positions. Part-time teaching positions marked by low pay and few benefits, in which women and minority scholars are overrepresented, have become far more prominent, and expectations for faculty in these positions rarely emphasize research.[35]

[35] Decreased support for public higher education has a range of negative implications for the social science labor market. For example, research shows that multifaceted teams are the most successful teams across sectors, but contracting budgets at the institutions that educate the largest and most diverse student populations risk depriving social science of this intellectual talent. See Scott E. Page, The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools, and Societies (Princeton, NJ: Princeton University Press, 2008).

At the same time, computational skills have never been so highly valued in the job market. Much of the knowledge economy depends on socially relevant data, and information technology firms seek out job candidates with such experience. The nonprofit sector and government similarly look for employees with quantitative and other research skills, while design and marketing firms have increasingly identified the importance of social science methods such as ethnography.[36]

[36] A recent National Academies report shows that, in 2015, just under 50 percent of social science PhD holders worked in educational institutions (mostly higher education), 32 percent in the for-profit sector, and 10 percent at nonprofits. A smaller percentage of social science PhDs are employed in for-profit businesses than doctorate holders in other STEM fields, but a higher percentage of social scientists are self-employed or work for nonprofits. National Academies of Sciences, Engineering, and Medicine, Graduate STEM Education for the 21st Century (Washington, DC: National Academies Press, 2018), 100, table 5-2.

Despite the demand for computational abilities across a number of sectors, digital literacy of various kinds is sometimes peripheral to academic training in the social sciences. Of course, training varies by discipline, and some social science fields, such as economics, provide these skills as core components of doctoral training. But in some disciplines, digital literacy is not central to graduate work, and students must seek out ways to get such experience. Other skills critical to today’s labor market include teamwork abilities, project management, staff oversight, and the ability to translate research findings for broader audiences. But few graduate training programs prioritize exposure to these skills.[37]

[37] Lab-based disciplines such as psychology do provide elements of these kinds of experiences, but for the most part doctoral work in the social sciences is a solitary endeavor in terms of dissertation research and writing. The difficulties involved in adding even further demands on stressed-out graduate students (and their faculty advisors, many of whom are not experts themselves in such “soft skills”) should not be underestimated.

Beyond the skills mismatch, a second structural problem has left research training out of sync with employment markets: the way doctoral programs prepare students for career paths. A common complaint among graduate students and potential employers is that many doctoral programs mold students for only one career track—tenure-track faculty positions at research-intensive universities. Some programs have begun to adapt to the new realities of today’s labor market by promoting training that opens students to a range of different professional paths. One challenge in many disciplines is that departments lack information about the array of potential careers available to social science PhDs (see box 5).

Greater collaboration between the private sector, academia, and professional organizations could improve the alignment of research training with the skills needed for careers in a range of relevant fields. The Task Force recognizes that the difficulty of adding demands on already stressed graduate students and overworked advisors should not be underestimated. Nonetheless, improving mentoring at all stages in the research training pipeline can help integrate technical, teamwork, administrative, and translational expertise into graduate social science research programs.

Box 5: Where Are the Jobs?

Many academic departments have already recognized the need to prepare doctoral students for nonacademic career tracks. However, implementing this goal has proven more challenging. One obstacle is simply knowing where graduates eventually find employment.

  • The American Historical Association has provided guidance to history departments in this area; it recently published a comprehensive study of where PhDs in history work by sector and institutional type.[38]

[38] “Where Historians Work: An Interactive Database of History PhD Outcomes,” American Historical Association, 2017, https://www.historians.org/wherehistorianswork.
  • The American Council of Learned Societies’ Public Fellows program,[39] with support from the Andrew W. Mellon Foundation, provides the opportunity for early postdocs to receive two-year positions in government and private nonprofit agencies to build careers outside the academy.

[39] “Mellon/ACLS Public Fellows Program,” American Council of Learned Societies, https://www.acls.org/programs/publicfellows/.
  • The American Association for the Advancement of Science’s yearlong Science and Technology Policy Fellows program places social scientists in the three branches of government, providing a gateway to nonacademic careers.

More purposeful effort is needed to prepare students for career paths in and out of the academy, and this will entail greater engagement between higher education, professional associations, and prospective employers.

Recommendations: Research Training

  • Convene academic and private-sector leaders to design mechanisms for incorporating digital literacy, collaboration, and other skills into research training at the doctoral level, building on a range of current efforts.[40]

[40] For digital training, one example is the SSRC’s Digital Literacy project, which integrates digital methods—data analytics, visualization, computational modeling, and network analysis—with field-specific knowledge.
  • Conduct additional research into the professional locations of PhDs in the social sciences, using the recent American Historical Association study as a model.
  • Mobilize disciplinary associations to develop innovative ways to encourage social scientists working outside of the academy to remain engaged in professional research communities.

Conclusion: Toward a New Compact for the Social Sciences

The To Secure Knowledge Task Force has emphasized the urgent need to create a new institutional infrastructure supporting social science research and the importance of cross-sectoral collaboration to meet this challenge. The Task Force calls for the forging of a new “research compact” to identify a set of shared understandings and expectations that build trust in and support for today’s scientific enterprise.

Social science’s connection to people’s lives and its relevance to contemporary social issues create profound possibilities for social knowledge. But too often, the role and significance of social science are not fully understood.

We encourage social scientists to take up the challenge of communicating with research partners and the public at large about the rigorous processes of knowledge creation and the conditions under which social knowledge can be used in policy and practice.[41]

[41] See Prewitt, “Is Knowledge Good for the Public?,” and Kenneth Prewitt, Thomas A. Schwandt, and Miron L. Straf, eds., Using Science as Evidence in Public Policy (Washington, DC: National Academies Press, 2012). For more examples of the challenges of social scientists communicating about “accountable” work, see also the National Academies, The Value of Social, Behavioral, and Economic Sciences to National Priorities.

We are confident that crafting a new research compact can help harness the current potential of the social sciences to improve human lives. The most central component of a new research compact involves forging collaborations across all parts of the institutional infrastructure that produce, use, and care about social knowledge. The Task Force recommends that the SSRC help create these new connections and partnerships by building on current experiments that show the potential to reinvent a robust and effective research compact. Notwithstanding persisting assaults on the conditions necessary to protect knowledge, we are convinced that key pathways and means to secure knowledge lie within our grasp.

Recommendations

Funding

  • Promote and deepen efforts to advocate for social science support by such organizations as the American Association for the Advancement of Science, the Consortium of Social Science Associations, and, primarily focused on the United Kingdom, the Campaign for Social Science.
  • Design and implement new models for public-private research funding partnerships that include government, the private sector, the academy, and philanthropy. A start will be made at an SSRC-sponsored convening of research stakeholders in the spring of 2019.

Data

  • Encourage the expansion of new experiments in data sharing, such as the Sloan Foundation’s data facilities network and the SSRC’s Social Data Initiative.
  • Ensure cumulative learning from experimental private-public collaborations by conducting and sharing evaluations of pilot projects such as those of the ADRFs and the SSRC.
  • Draw on “data philanthropy” and similar frameworks to develop best practices for access to privately held data, similar to corporate social responsibility commitments.
  • Promote the development of a data commons, both in terms of creating a national data service for government-held data and a multisource platform linking private-sector data.

Ethics and Data Integrity

  • Construct cross-sectoral partnerships to address threats to data integrity and to explore the use of data credibility forensics.
  • Convene social researchers, ethicists, and other constituencies to develop best practices in research ethics.
  • Ensure cumulative learning of experiments in understanding and designing new ethical codes and practices, building on the SSRC’s work with PERVADE as well as other collaborations.

Research Quality

  • Increase systematic understanding of the current peer-review process by developing quantitative and qualitative studies of evaluation strategies for books and journals.
  • Convene academic publishers and scholarly societies to rethink peer review, with a special focus on issues related to publication timelines and criteria.
  • Encourage the adoption of more pre-publication outlets to facilitate learning and sharing of ideas.

Research Training

  • Convene academic and private-sector leaders to design mechanisms for incorporating digital literacy, collaboration, and other skills into research training at the doctoral level, building on a range of current efforts.
  • Conduct additional research into the professional locations of PhDs in the social sciences, using the recent American Historical Association study as a model.
  • Mobilize disciplinary associations to develop innovative ways to encourage social scientists working outside of the academy to remain engaged in professional research communities.

Appendix: Task Force Process

April 2017
Task Force Appointments and Public Launch

May 2017
Preliminary discussion with the American Academy of Political and Social Science’s Social Science Leadership group

June 2017
First gathering of Task Force: outline of the report developed

October 2017
Annotated outline produced and reviewed; presented at College and University Fund Conference

November 2017
Initial presentation with Executive Directors of social science disciplinary associations

November and December 2017
Individual conversations with Task Force members

December 15, 2017
Full meeting of Task Force to discuss structure of report and writing assignments

Spring 2018
Submission of Task Force members’ written contributions

April 2018
Subsequent presentation with Executive Directors of social science disciplinary associations

May 2018
Full report discussed with the American Academy of Political and Social Science’s Social Science Leadership group

June and July 2018
Comments on draft received from members and a wide range of invited reviewers

Acknowledgments

The SSRC gratefully acknowledges the Future of Scholarly Knowledge project at Columbia University, led by Professor Kenneth Prewitt and funded by SAGE Publishing, for its support of the work of this Task Force. We would also like to thank the following colleagues for their thoughtful comments and guidance in the writing of this report:

Danielle Allen, James Bryant Conant University Professor and Director of the Edmond J. Safra Center for Ethics, Harvard University

danah boyd, Principal Researcher, Microsoft Research; Founder, Data & Society; SSRC Board of Directors

Jonathan Cole, John Mitchell Mason Professor, Columbia University Law School

Sandra Dawson, KPMG Professor Emeritus of Management Studies, University of Cambridge; SSRC Board of Directors

Andrew Delbanco, Alexander Hamilton Professor of American Studies, Columbia University; President, Teagle Foundation

John H. Evans, Professor of Sociology, Associate Dean of Social Sciences, Co-Director, Institute for Practical Ethics, University of California, San Diego

Jonathan F. Fanton, President, American Academy of Arts and Sciences; SSRC Visiting Committee

Daniel L. Goroff, Vice President and Program Director, Alfred P. Sloan Foundation

Robert Groves, Provost, Georgetown University

Karen Hanson, Provost, University of Minnesota

Howard Kurtzman, Acting Executive Director, American Psychological Association Science Directorate

Naomi Lamoreaux, Stanley B. Resor Professor of Economics and History at Yale University; Research Associate, National Bureau of Economic Research; SSRC Board of Directors

Margaret Levi, Director, Center for Advanced Study in the Behavioral Sciences and Professor of Political Science, Stanford University; SSRC Board of Directors

Sara Miller McCune, Co-Founder and Chair of SAGE Publications; SSRC Board of Directors and Chair, SSRC Visiting Committee

Wendy Naus, Executive Director, Consortium of Social Science Associations

Walter Powell, Professor of Sociology, Organizational Behavior, Management Science and Engineering, and Communication, Faculty Co-Director, Stanford Center on Philanthropy and Civil Society, Stanford University; SSRC Board of Directors

Danilyn Rutherford, President, Wenner-Gren Foundation for Anthropological Research

James Shulman, Vice President and Chief Operating Officer, American Council of Learned Societies

Duncan Watts, Principal Researcher, Microsoft Research

Steven C. Wheatley, Senior Adviser, American Council of Learned Societies

Marina Whitman, Professor Emerita of Business Administration and Public Policy, University of Michigan; SSRC Visiting Committee

Pauline Yu, President, American Council of Learned Societies

We also thank the SSRC Board of Directors, representatives of the SSRC’s College and University Fund, and the executive directors of the social science disciplinary associations for their contributions to the development of this report. Participants at the American Academy of Political and Social Science’s assemblies of social science organizations and members of the PERVADE (Pervasive Data Ethics for Computational Research) project team provided valuable feedback. Mary Bridges provided editorial support and Erika Olbey oversaw the design. SSRC staff members Ron Kassimir, Jason Rhody, Kate Grantz, Vina Tran, Clare McGranahan, Beth Post, Dewey Blanton, and Rajat Singh were central to the administration of the Task Force’s deliberation process and the development of the report.

  • 1
    The Task Force endorses and supports current efforts to protect the integrity of the census and the federal statistical system from political interference. See, for example, the August 2018 letter from the National Academies Task Force on the 2020 census, https://www.nap.edu/read/25215/chapter/1#3, and the statement organized by the Consortium of Social Science Associations, https://www.cossa.org/2018/08/07/cossa-and-25-science-organizations-call-for-removal-of-census-citizenship-question/.
  • 2
    For the purpose of this report, social knowledge refers to understandings of human behavior and social structures generated by professional researchers and scientists using advanced training, technical skills, and critical reasoning to push the frontiers of their fields and to promote the public good.
  • 3
    The multiple challenges recently facing the role of scientists and the use of science in the policymaking process of the Environmental Protection Agency (EPA) now encompass social science. In June 2018, the Environmental Economics Advisory Committee, a component of the EPA’s Science Advisory Board, was dismantled by the agency, effectively removing a key site for independent and nonpartisan social science to speak directly to environmental policy. Kevin Boyle and Matthew Kotchen, “Retreat on Economics at the EPA,” Science 361, no. 6404 (2018): 729.
  • 4
    Of course, we are aware that “seizing possibilities” can also be done in ways that are more or less altruistic. Our intention here is to encourage the collective leveraging of knowledge across sectors and for the good of the many rather than the few.
  • 5
    See the multiple examples of the important impact of social science research recounted in National Academies of Sciences, The Value of Social, Behavioral and Economic Sciences to National Priorities: A Report to the National Science Foundation, National Academies Press, 2017, as well as the “Why Social Science?” series curated by the Consortium of Social Science Associations (COSSA).
  • 6
    While this report does not address it directly, the digital age profoundly affects access to research results and the need to archive data securely and accessibly. Important experiments in open access are underway, although a significant amount of scholarship—and likely an even higher percentage of publications in the most selective journals—remains behind paywalls, unavailable or unaffordable to many scholars, practitioners, and members of the general public. Archival and preservation needs are too often neglected. What is worth saving for future generations, and how should it be preserved? A sixteenth-century book can still be taken down from a library shelf and read without further ado, whereas a CD has a physical lifetime of a mere thirty years. Digitization requires scientific archiving to be reinvented, both conceptually and materially.
  • 7
    Kenneth Prewitt, “Is Knowledge Good for the Public?”, Social Research 84, no. 3 (2017): xxi–xxxv.
  • 8
    See the Public Face of Science Initiative, Perceptions of Science in America (Cambridge, MA: American Academy of Arts & Sciences, 2018); Gordon W. Gauchat, “The Political Context of Science in the United States: Public Acceptance of Evidence-Based Policy and Science Funding,” Social Forces 94, no. 2 (2015): 723–46, http://dx.doi.org/10.1093/sf/sov040; and Gauchat, “The Politicization of Science in the Public Sphere: A Study of Public Trust in Science in the US, 1974–2010,” American Sociological Review 77, no. 2 (2012): 167–87, http://dx.doi.org/10.1177/0003122412438225.
  • 9
    This includes the work of the Social Media and Political Participation Lab at New York University and the Media Manipulation research team at the Data and Society Research Institute.
  • 10
    The Obama administration’s Social and Behavioral Sciences Team represents one example of such a program.
  • 11
    In Congress, regular attacks have taken place against the NSF’s division focused on social science research, the Directorate for Social, Behavioral, and Economic (SBE) Sciences.
  • 12
    For data and discussion of the change, see the American Association for the Advancement of Science, “Federal R&D Budget Overview,” http://www.aaas.org/program/rd-budget-and-policy-program; and Jeffrey Mervis, “Data Check: U.S. Government Share of Basic Research Funding Falls Below 50%,” Science (March 9, 2017). Data on how the percentages may have shifted between public and private spending for social science research specifically are not available. According to NSF data, federal funding for social science was slightly lower in 2016 (in current dollars) than in the late 1970s. See “Research by Science and Engineering Discipline,” American Association for the Advancement of Science website, last updated July 2018, https://www.aaas.org/page/research-science-and-engineering-discipline.
  • 13
    For more information on the wide-ranging consequences of federally funded research for socially beneficial goals, see the National Academies, The Value of Social, Behavioral and Economic Sciences to National Priorities; and the Committee on Criteria for Federal Support of Research and Development, Allocating Federal Funds for Science and Technology (Washington, DC: National Academies Press, 1995).
  • 14
    Joseph Stiglitz has observed: “Knowledge can be viewed as a public good, and the private provision of a public good is essentially never optimal.” Stiglitz, “Leaders and Followers: Perspectives on the Nordic Model and the Economics of Innovation,” NBER Working Paper 20493 (September 2014).
  • 15
    Consider the emphasis the Council on Competitiveness places on the education of citizens in a changing world. The organization asserts: “From technology to trade skills, there is no issue on which Council members are more united than in their desire for progress in building a talented, diverse workforce.” Council on Competitiveness, 2017 Clarion Call, 9. Moreover, “American consumers and companies are increasingly reliant on a globally engaged economy for their jobs and standards of living,” 12.
  • 16
    For a recent overview, see the January 2018 special issue of the Annals of the American Academy of Political and Social Science, especially the introduction by Andrew Reamer and Julia Lane, “A Roadmap to a Nationwide Data Infrastructure for Evidence-Based Policy Making,” Annals of the American Academy of Political and Social Science 675, no. 1 (2018): 28–35.
  • 17
    Commission on Evidence-Based Policymaking, The Promise of Evidence-Based Policymaking (Washington, DC: Commission on Evidence-Based Policymaking, 2017), https://www.cep.gov/cep-final-report.html.
  • 18
    Since the creation of the census, the government’s data collection agencies, initiatives, and programs have steadily expanded—the thirteen principal federal statistical agencies alone spend approximately $6 billion annually on data collection. At least another hundred or so statistical offices are scattered across the government. Indeed, the statistical system of the US government is itself a diverse institutional ecology.
  • 19
    There are also important differences in corporate versus academic approaches to data. Commercial-sector research tends to be theory poor and data rich; social science is traditionally the inverse, theory rich and data poor. Further, in terms of analytical techniques, the technology sector boasts a range of computational innovations focused on predicting social behavior and interaction, whereas social science has historically focused on causal mechanisms and interpretive understanding. On the latter point, see Jake M. Hofman, Amit Sharma, and Duncan Watts, “Prediction and Explanation in Social Systems,” Science 355, no. 6324 (2017): 486–88.
  • 20
    The Task Force acknowledges the prescient calls in this direction made by scholars in the past decade. See Duncan Watts, “A Twenty-First Century Science,” Nature 445 (2007): 489; and David Lazer et al., “Life in the Network: The Coming Age of Computational Social Science,” Science 323, no. 5915 (2009): 721–23.
  • 21
    Groves and Neufeld, Accelerating the Sharing of Data across Sectors to Advance the Common Good (Washington, DC: Georgetown University Beeck Center for Social Impact and Innovation, 2017).
  • 22
    See, for example, Shamina Singh, “A Call to Action on Data Philanthropy,” Mastercard Center for Inclusive Growth, October 4, 2016, https://mastercardcenter.org/action/call-action-data-philanthropy/.
  • 23
    More about the Administrative Data Research Facilities Network is available at https://www.adrf.upenn.edu/.
  • 24
    More about SSRC’s Social Data Initiative is available at https://www.ssrc.org/programs/view/social-data-initiative/.
  • 25
    See Gary King and Nate Persily, “A New Model for Industry-Academic Partnerships” (working paper), updated August 10, 2018, https://gking.harvard.edu/files/gking/files/partnerships.pdf.
  • 26
    “Future cyber operations will almost certainly include an increased emphasis on changing or manipulating data to compromise its integrity (i.e., accuracy and reliability) to affect decision-making, reduce trust in systems, or cause adverse physical effects.” James R. Clapper, Director of National Intelligence, “Statement for the Record: Worldwide Threat Assessment of the US Intelligence Community before the Senate Armed Services Committee,” February 6, 2016, https://www.dni.gov/files/documents/SASC_Unclassified_2016_ATA_SFR_FINAL.pdf.
  • 27
    The permanence of data is also an issue. Does informed consent granted today imply consent for unanticipated future uses of the data?
  • 28
    These changes, informed by collaborative discussions between academic researchers and the federal government, appear to introduce more flexibility in the ways that diverse kinds of social science research practice are evaluated in terms of compliance. Consortium of Social Science Associations, “Common Rule Agencies Release Proposal for 6-Month Delay of Revisions, Optional Implementation of ‘Burden-Reducing’ Provisions; Comments Sought for 30 Days,” COSSA Washington Update, April 25, 2018, http://www.cossa.org/2018/04/25/common-rule-agencies-release-proposal-for-6-month-delay-of-revisions/.
  • 29
    More about PERVADE is available at https://pervade.umd.edu.
  • 30
    See, for example, Jory Lerback and Brooks Hansen, “Journals Invite Too Few Women to Referee,” Nature 541, no. 7638 (January 26, 2017): 455–57, https://doi.org/10.1038/541455a; and Cat Ferguson, Adam Marcus, and Ivan Oransky, “Publishing: The Peer-Review Scam,” Nature 515, no. 7528 (November 27, 2014): 480–82, https://doi.org/10.1038/515480a.
  • 31
    United Nations Educational, Scientific and Cultural Organization, UNESCO Science Report: Towards 2030 (Paris: UNESCO Publishing, 2015), http://unesdoc.unesco.org/images/0023/002354/235406e.pdf.
  • 32
    Recent discussions about the lack of replication in psychology and other fields are, at least in part, tied to the fact that only positive “breakthrough” results are submitted to journals, because these are what get published. This phenomenon can cause serious damage both when unreplicable findings are acted upon and to public trust in the research enterprise.
  • 33
    Lisa A. Keister and James Moody, “Academic Publishing: New Rules, New Opportunities,” Parameters, January 11, 2017, http://parameters.ssrc.org/2017/01/academic-publishing-new-rules-new-opportunities/.
  • 34
    See, for example, the National Bureau of Economic Research Working Paper Series, http://www.nber.org/papers/.
  • 35
    Decreased support for public higher education has a range of negative implications for the social science labor market. For example, research shows that multifaceted teams are the most successful teams across sectors, but contracting budgets at the institutions that educate the largest and most diverse student populations risk depriving social science of this intellectual talent. See Scott E. Page, The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools, and Societies (Princeton, NJ: Princeton University Press, 2008).
  • 36
    A recent National Academies report shows that, in 2015, just under 50 percent of social science PhD holders worked in educational institutions (mostly higher education), 32 percent in the for-profit sector, and 10 percent at nonprofits. A smaller percentage of social science PhDs are employed in for-profit businesses than doctorate holders in other STEM fields, but a higher percentage of social scientists are self-employed or work for nonprofits. National Academies of Sciences, Engineering, and Medicine, Graduate STEM Education for the 21st Century (Washington, DC: National Academies Press, 2018), 100, table 5-2.
  • 37
    Lab-based disciplines such as psychology do provide elements of these kinds of experiences, but for the most part doctoral work in the social sciences is a solitary endeavor in terms of dissertation research and writing. The difficulty of adding even further demands on stressed-out graduate students (and their faculty advisors, many of whom are not themselves experts in such “soft skills”) should not be underestimated.
  • 38
    “Where Historians Work: An Interactive Database of History PhD Outcomes,” American Historical Association, 2017, https://www.historians.org/wherehistorianswork.
  • 39
    “Mellon/ACLS Public Fellows Program,” American Council of Learned Societies, https://www.acls.org/programs/publicfellows/.
  • 40
    For digital training, one example is the SSRC’s Digital Literacy project, which integrates digital methods—data analytics, visualization, computational modeling, and network analysis—with field-specific knowledge.
  • 41
    See Prewitt, “Is Knowledge Good for the Public?”, and Kenneth Prewitt, Thomas A. Schwandt, and Miron L. Straf, eds., Using Science as Evidence in Public Policy (Washington, DC: National Academies Press, 2012). For more examples of the challenges of social scientists communicating about “accountable” work, see also the National Academies, The Value of Social, Behavioral, and Economic Sciences to National Priorities.