Open science

The six principles of open science[1]

Open science is the movement to make scientific research, data and dissemination accessible to all levels of an inquiring society, amateur or professional. It encompasses practices such as publishing open research, campaigning for open access, encouraging scientists to practice open notebook science, and generally making it easier to publish and communicate scientific knowledge. The European-funded project Facilitate Open Science Training for European Research (FOSTER)[2] has developed an open science taxonomy[3] as an attempt to map the open science field.

Open science began in the 17th century with the advent of the academic journal, when the societal demand for access to scientific knowledge reached a point where it became necessary for groups of scientists to share resources[4] with each other so that they could collectively do their work.[5] In modern times there is debate about the extent to which scientific information should be shared.[6] The conflict is between the desire of scientists to have access to shared resources versus the desire of individual entities to profit when other entities partake of their resources.[7] Additionally, the status of open access and resources that are available for its promotion are likely to differ from one field of academic inquiry to another, as the OpenScience blog[8] indicates.

Background

Science is broadly understood as collecting, analyzing, publishing, reanalyzing, critiquing, and reusing data. Proponents of open science identify a number of barriers that impede or dissuade the broad dissemination of scientific data.[9] These include financial paywalls of for-profit research publishers, restrictions on usage applied by publishers of data, poor formatting of data or use of proprietary software that makes it difficult to re-purpose, and cultural reluctance to publish data for fears of losing control of how the information is used.[9][10]

Open Science Taxonomy[11]

According to the FOSTER taxonomy,[3] open science can often include aspects of open access, open data, and the open source movement, since modern science requires software in order to process data and information.[12][13] Open research computation also addresses the problem of reproducibility of scientific results.

Different Types of "Open Science"

The term "open science" does not have any one fixed definition or operationalization. On the one hand, it has been referred to as a "puzzling phenomenon".[14] On the other hand, the term has been used to encapsulate a series of principles that aim to foster scientific growth and its complementary access to the public. Two influential sociologists, Benedikt Fecher and Sascha Friesike, have created multiple "schools of thought" that describe the different interpretations of the term.[15]

According to Fecher and Friesike ‘Open Science’ is an umbrella term for various assumptions about the development and dissemination of knowledge. To show the term’s multitudinous perceptions, they differentiate between five Open Science schools of thought:

Infrastructure School

The infrastructure school is founded on the assumption that "efficient" research depends on the availability of tools and applications. Therefore, the "goal" of the school is to promote the creation of openly available platforms, tools, and services for scientists. Hence, the infrastructure school is concerned with the technical infrastructure that promotes the development of emerging and developing research practices through the use of the internet, including the use of software and applications, in addition to conventional computing networks. In that sense, the infrastructure school regards open science as a technological challenge. The infrastructure school is tied closely to the notion of "cyberscience", which describes the trend of applying information and communication technologies to scientific research, a trend that has fostered the school's development. Specific elements of this development include increasing collaboration and interaction between scientists, as well as the development of "open-source science" practices. The sociologists discuss two central trends in the Infrastructure school:

1. Distributed computing: This trend encapsulates practices that outsource complex, process-heavy scientific computing to a network of volunteer computers around the world. The example that the sociologists cite in their paper is the Open Science Grid, which enables the development of large-scale projects that require high-volume data management and processing through a distributed computer network. Moreover, the grid provides the necessary tools that scientists can use to facilitate this process.[16]
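The division of labor described above (splitting a large dataset into chunks, processing them on separate workers, and combining the partial results) can be sketched in miniature on a single machine. This is a hypothetical illustration using Python's standard multiprocessing module as a stand-in for a real grid scheduler such as the Open Science Grid's tooling; analyze_chunk is an invented placeholder task.

```python
from multiprocessing import Pool

def analyze_chunk(chunk):
    # Placeholder for a process-heavy scientific computation on one chunk.
    return sum(x * x for x in chunk)

def split(data, n_parts):
    # Divide a dataset into n_parts roughly equal chunks for independent processing.
    k, m = divmod(len(data), n_parts)
    return [data[i * k + min(i, m):(i + 1) * k + min(i + 1, m)]
            for i in range(n_parts)]

if __name__ == "__main__":
    data = list(range(100_000))
    chunks = split(data, 4)
    # Farm the chunks out to a pool of local worker processes; a volunteer
    # computing grid distributes them to remote machines instead.
    with Pool(processes=4) as pool:
        partial_results = pool.map(analyze_chunk, chunks)
    # Combining partial results is the step a grid scheduler performs centrally.
    print(sum(partial_results))
```

The same split/compute/combine pattern underlies volunteer computing frameworks, with scheduling, fault tolerance, and result verification layered on top.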

2. Social and Collaboration Networks for Scientists: This trend encapsulates the development of software that makes interaction with other researchers and scientific collaborations much easier than traditional, non-digital practices. Specifically, the trend is focused on implementing newer Web 2.0 tools to facilitate research-related activities on the internet. De Roure and colleagues (2008)[17] list four key capabilities which they believe compose a Social Virtual Research Environment (SVRE).

Measurement School

The measurement school, in the view of the authors, deals with developing alternative methods to determine scientific impact. This school acknowledges that measurements of scientific impact are crucial to a researcher's reputation, funding opportunities, and career development. Hence, the authors argue that any discourse about Open Science pivots around developing a robust measure of scientific impact in the digital age. The authors then discuss three key currents of previous literature that indicate support for the measurement school.

Hence, this school argues that there are faster impact measurement technologies that can account for a range of publication types as well as social media coverage of a scientific contribution to arrive at a complete evaluation of how impactful that contribution was. The gist of the argument for this school is that hidden uses like reading, bookmarking, sharing, discussing, and rating are traceable activities, and these traces can and should be used to develop a newer measure of scientific impact. The umbrella term for this new type of impact measurement is altmetrics, coined in a 2011 article by Priem et al.[18] Notably, the authors discuss evidence that altmetrics differ from traditional webometrics, which are slow and unstructured. Altmetrics are proposed to rely upon a greater set of measures that account for tweets, blogs, discussions, and bookmarks. The authors claim that the existing literature has often proposed that altmetrics should also encapsulate the scientific process, and measure the process of research and collaboration to create an overall metric. However, the authors are explicit in their assessment that few papers offer methodological details as to how to accomplish this. The authors use this and the general dearth of evidence to conclude that research in the area of altmetrics is still in its infancy.
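The idea that traceable activities can be combined into a single impact measure can be sketched as a weighted sum of activity counts. The weights below are invented for illustration only; they are not values proposed by Priem et al. or used by any real altmetrics provider.

```python
# Hypothetical weights for each traceable activity; real altmetrics
# providers calibrate such weights empirically, and these numbers are
# assumptions made for this sketch only.
WEIGHTS = {"tweets": 0.25, "blog_posts": 1.0, "bookmarks": 0.5, "citations": 3.0}

def altmetric_score(traces):
    # Combine counts of traceable activities (reading, bookmarking, sharing,
    # discussing) into one aggregate impact score; unknown activity types
    # contribute nothing.
    return sum(WEIGHTS.get(kind, 0.0) * count for kind, count in traces.items())

print(altmetric_score({"tweets": 40, "blog_posts": 2, "citations": 5}))  # prints 27.0
```

The open question the authors raise, how to fold the research *process* itself into such a metric, amounts to deciding which further activities to trace and how to weight them.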

Public School

According to the authors, the central concern of the school is to make science accessible to a wider audience. The inherent assumption of this school, as described by the authors, is that newer communication technologies such as Web 2.0 allow scientists to open up the research process and also allow scientists to better prepare their "products of research" for interested non-experts. Hence, the school is characterized by two broad streams: one argues for access to the research process for the masses, whereas the other argues for increased public access to the products of science.

Democratic School

The democratic school concerns itself with the concept of access to knowledge. As opposed to focusing on the accessibility of research and its understandability, advocates of this school focus on public access to the products of research. The central concern of the school is with the legal and other obstacles that hinder public access to research publications and scientific data. The authors argue that proponents of this school assert that any research product should be freely available. The authors argue that the underlying notion of this school is that everyone has the same, equal right of access to knowledge, especially in the instances of state-funded experiments and data. The authors categorize two central currents that characterize this school: Open Access and Open Data.

Pragmatic School

The pragmatic school considers Open Science as the possibility to make knowledge creation and dissemination more efficient by increasing collaboration throughout the research process. Proponents argue that science could be optimized by modularizing the process and opening up the scientific value chain. ‘Open’ in this sense follows very much the concept of open innovation.[19] Tacke, for instance, transfers the outside-in (including external knowledge in the production process) and inside-out (spillovers from the formerly closed production process) principles to science.[20] Web 2.0 is considered a set of helpful tools that can foster collaboration (sometimes also referred to as Science 2.0). Further, citizen science is seen as a form of collaboration that includes knowledge and information from non-scientists. Fecher and Friesike describe data sharing as an example of the pragmatic school, as it enables researchers to use other researchers’ data to pursue new research questions or to conduct data-driven replications.

History

The widespread adoption of the institution of the scientific journal marks the beginning of the modern concept of open science. Before this time, societies pressured scientists into secretive behaviors.

Before journals

Before the advent of scientific journals, scientists had little to gain and much to lose by publicizing scientific discoveries.[21] Many scientists, including Galileo, Kepler, Isaac Newton, Christiaan Huygens, and Robert Hooke, made claim to their discoveries by describing them in papers coded in anagrams or cyphers and then distributing the coded text.[21] Their intent was to develop their discovery into something from which they could profit, then reveal their discovery to prove ownership when they were prepared to make a claim on it.[21]

The system of not publicizing discoveries caused problems because discoveries were not shared quickly and because it sometimes was difficult for the discoverer to prove priority. Newton and Gottfried Leibniz both claimed priority in discovering calculus.[21] Newton said that he wrote about calculus in the 1660s and 1670s, but did not publish until 1693.[21] Leibniz published "Nova Methodus pro Maximis et Minimis", a treatise on calculus, in 1684. Debates over priority are inherent in systems where science is not published openly, and this was problematic for scientists who wanted to benefit from priority.

These cases are representative of a system of aristocratic patronage in which scientists received funding to develop either immediately useful things or to entertain.[5] In this sense, funding of science gave prestige to the patron in the same way that funding of artists, writers, architects, and philosophers did.[5] Because of this, scientists were under pressure to satisfy the desires of their patrons, and discouraged from being open with research which would bring prestige to persons other than their patrons.[5]

Emergence of academies and journals

Eventually the individual patronage system ceased to provide the scientific output which society began to demand.[5] Single patrons could not sufficiently fund scientists, who had unstable careers and needed consistent funding.[5] The development which changed this was a trend to pool research by multiple scientists into an academy funded by multiple patrons.[5] In 1660 England established the Royal Society and in 1666 the French established the French Academy of Sciences.[5] Between the 1660s and 1793, governments gave official recognition to 70 other scientific organizations modeled after those two academies.[5][22] In 1665, Henry Oldenburg became the editor of Philosophical Transactions of the Royal Society, the first academic journal devoted to science, and the foundation for the growth of scientific publishing.[23] By 1699 there were 30 scientific journals; by 1790 there were 1052.[24] Since then publishing has expanded at even greater rates.[25]

The first popular science periodical of its kind was published in 1872 under a name that remains a modern portal for science journalism: Popular Science. The magazine claims to have documented the invention of the telephone, the phonograph, the electric light, and the onset of automobile technology. The magazine goes so far as to claim that the "history of Popular Science is a true reflection of humankind's progress over the past 129+ years".[26] Discussions of popular science writing most often center their arguments on some type of "science boom". A recent historiographic account of popular science traces mentions of the term "science boom" to Daniel Greenberg's Science and Government Reports in 1979, which posited that "scientific magazines are bursting out all over". Similarly, this account discusses the publication Time and its 1980 cover story on Carl Sagan as propagating the claim that popular science had "turned into enthusiasm".[27] Crucially, this secondary account asks the important question of what was considered popular "science" to begin with. The paper claims that any account of how popular science writing bridged the gap between the informed masses and the expert scientists must first consider who was considered a scientist to begin with.

Collaboration among academies

In modern times many academies have pressured researchers at publicly funded universities and research institutions to engage in a mix of sharing research and making some technological developments proprietary.[7] Some research products have the potential to generate commercial revenue, and in hope of capitalizing on these products, many research institutions withhold information and technology which otherwise would lead to overall scientific advancement if other research institutions had access to these resources.[7] It is difficult to predict the potential payouts of technology or to assess the costs of withholding it, but there is general agreement that the benefit to any single institution of holding technology is not as great as the cost of withholding it from all other research institutions.[7]

Politics

In many countries, governments fund some science research. Scientists often publish the results of their research by writing articles and donating them to be published in scholarly journals, which frequently are commercial. Public entities such as universities and libraries subscribe to these journals. Michael Eisen, a founder of the Public Library of Science, has described this system by saying that "taxpayers who already paid for the research would have to pay again to read the results."[28]

In December 2011, some United States legislators introduced a bill called the Research Works Act, which would prohibit federal agencies from issuing grants with any provision requiring that articles reporting on taxpayer-funded research be published for free to the public online.[29] Darrell Issa, a co-sponsor of the bill, explained the bill by saying that "Publicly funded research is and must continue to be absolutely available to the public. We must also protect the value added to publicly funded research by the private sector and ensure that there is still an active commercial and non-profit research community."[30] One response to this bill was protests from various researchers; among them was a boycott of commercial publisher Elsevier called The Cost of Knowledge.[31]

The Dutch Presidency of the Council of the European Union called for action in April 2016 to migrate European Commission funded research to Open Science. European Commissioner Carlos Moedas introduced the Open Science Cloud at the Open Science Conference in Amsterdam on April 4–5.[32] The Amsterdam Call for Action on Open Science, a living document outlining concrete actions for the European community to move to Open Science, was also presented at this meeting.

Arguments against open science

The open sharing of research data is not widely practiced

Arguments against open science tend to advance several concerns. These include the potential for some scholars to capitalize on data that other scholars have worked hard to collect, without collecting data themselves; the potential for less qualified individuals to misuse open data; and arguments that novel data are more critical than reproducing or replicating older findings.[33][34]

Too much unsorted information overwhelms scientists.

Some scientists find inspiration in their own thoughts by restricting the amount of information they get from others.[35] Alexander Grothendieck has been cited as a scientist who wanted to learn with restricted influence when he said that he wanted to "reach out in (his) own way to the things (he) wished to learn, rather than relying on the notions of consensus."[36]

Potential misuse.

In 2009 scientists' email regarding climate research was stolen, starting the Climatic Research Unit email controversy. In 2011, Dutch researchers announced their intention to publish a research paper in the journal Science describing the creation of a strain of H5N1 influenza which can be easily passed between ferrets, the mammals which most closely mimic the human response to the flu.[37] The announcement triggered a controversy in both political[38] and scientific[39] circles about the ethical implications of publishing scientific data which could be used to create biological weapons. These events are examples of how science data could potentially be misused.[40] Scientists have collaboratively agreed to limit their own fields of inquiry on occasions such as the Asilomar conference on recombinant DNA in 1975,[41]:111 and a proposed 2015 worldwide moratorium on a human-genome-editing technique.[42]

The public will misunderstand science data.

In 2009 NASA launched the Kepler spacecraft and promised that they would release collected data in June 2010. Later they decided to postpone release so that their scientists could look at it first. Their rationale was that non-scientists might unintentionally misinterpret the data, and NASA scientists thought it would be preferable for them to be familiar with the data in advance so that they could report on it accurately.[43]

Increasing the scale of science will make verification of any discovery more difficult.

When more people report data, and perhaps more data of lower quality, it will take longer for anyone to consider all of the data before drawing any conclusion.[44]

Low-Quality Science

Post-publication peer review, a staple of open science, has been criticized as promoting the production of a high volume of lower-quality papers.[45] Specifically, critics assert that because quality is not guaranteed by preprint servers, the veracity of papers will be difficult for individual readers to assess. This will lead to rippling effects of false science, akin to the recent epidemic of false news, propagated with ease on social media websites.[46] Common solutions to this problem have been cited as adaptations of a new format in which everything is allowed to be published but a subsequent filter-curator model is imposed to ensure that some basic quality standards are met by all publications.[47]

Arguments for open science

A number of scholars across disciplines have advanced various arguments in favor of open science. These generally focus on the perceived value of open science in improving the transparency and validity of research, as well as on public ownership of science, particularly that which is publicly funded. For example, in January 2014 J. Christopher Bare published a comprehensive "Guide to Open Science".[48] Likewise, in January 2017, a group of scholars known for advocating open science published a "manifesto" for open science in the journal Nature.[49]

Open access publication of research reports and data allows for rigorous peer-review

An article published by a team of NASA astrobiologists in 2010 in Science reported a bacterium known as GFAJ-1 that could purportedly metabolize arsenic (unlike any previously known species of lifeform).[50] This finding, along with NASA's claim that the paper "will impact the search for evidence of extraterrestrial life", met with criticism within the scientific community. Much of the scientific commentary and critique around this issue took place in public forums, most notably on Twitter, where hundreds of scientists and non-scientists created a hashtag community around the hashtag #arseniclife.[51] University of British Columbia astrobiologist Rosie Redfield, one of the most vocal critics of the NASA team's research, also submitted a draft of a research report of a study that she and colleagues conducted which contradicted the NASA team's findings; the draft report appeared in arXiv,[52] an open-research repository, and Redfield called in her lab's research blog for peer review both of their research and of the NASA team's original paper.[53]

Science is publicly funded so all results of the research should be publicly available[54]

Public funding of research has long been cited as one of the primary reasons for providing open access to research articles.[55] Since there is significant value in other parts of the research, such as code, data, protocols, and research proposals, a similar argument is made that because these are publicly funded, they should be publicly available under a Creative Commons licence.

Open Science will make science more reproducible and transparent

Increasingly the reproducibility of science is being questioned and the term "reproducibility crisis" has been coined.[56] Open Science approaches are proposed as one way to help increase the reproducibility of work.[57]

Open Science has more impact

There are several components to impact in research, many of which are hotly debated.[58] However, under traditional scientific metrics, parts of open science such as open access and open data have been shown to outperform their traditional closed counterparts.[59][60]

Open Science Will Help Answer Uniquely Complex Questions

Recent arguments in favor of Open Science have maintained that Open Science is a necessary tool to begin answering immensely complex questions, such as the neural basis of consciousness.[61] The typical argument holds that these types of investigations are too complex to be carried out by any one individual, and therefore they must rely on a network of open scientists to be accomplished. By its very nature, this kind of investigation also makes "open science" into "big science".[62]

Organizations and projects of open science

Big scientific projects are more likely to practice open science than small projects.[63] Different projects conduct, advocate, develop tools for, or fund open science, and many organizations run multiple projects. For example, the Allen Institute for Brain Science[64] conducts numerous open science projects while the Center for Open Science has projects to conduct, advocate, and create tools for open science.

Organizations have extremely diverse sizes and structures. The Open Knowledge Foundation (OKF) is a global organization sharing large data catalogs, running face to face conferences, and supporting open source software projects. In contrast, Blue Obelisk is an informal group of chemists and associated cheminformatics projects. The tableau of organizations is dynamic with some organizations becoming defunct, e.g., Science Commons, and new organizations trying to grow, e.g., the Self-Journal of Science.[65] Common organizing forces include the knowledge domain, type of service provided, and even geography, e.g., OCSDNet's[66] concentration on the developing world.

Conducting open science projects

Many open science projects focus on gathering and coordinating encyclopedic collections of large amounts of organized data. The Allen Brain Atlas maps gene expression in human and mouse brains; the Encyclopedia of Life documents all terrestrial species; the Galaxy Zoo classifies galaxies; the International HapMap Project maps the haplotypes of the human genome; and the Sloan Digital Sky Survey regularizes and publishes data sets from many sources. All these projects accrete information provided by many different researchers with different standards of curation and contribution.

Other projects are organized around completion of projects that require extensive collaboration. For example, OpenWorm seeks to make a cellular level simulation of a roundworm, a multidisciplinary project. The Polymath Project seeks to solve difficult mathematical problems by enabling faster communications within the discipline of mathematics. The Collaborative Replications and Education project recruits undergraduate students as citizen scientists by offering funding. Each project defines its needs for contributors and collaboration.

Advocating open science

Numerous documents, organizations, and social movements advocate wider adoption of open science. Statements of principles include the Budapest Open Access Initiative from a December 2001 conference[67] and the Panton Principles. New statements are constantly developed, such as the Amsterdam Call for Action on Open Science, presented to the Dutch Presidency of the Council of the European Union in late May 2016. These statements often try to regularize licenses and disclosure for data and scientific literature.

Other advocates concentrate on educating scientists about appropriate open science software tools. Education is available as training seminars, e.g., the Software Sustainability Institute's Software Carpentry project; as domain-specific training materials, e.g., the Software Sustainability Institute's Data Carpentry project; and as materials for teaching graduate classes, e.g., the Open Science Training Initiative. Many organizations also provide education in the general principles of open science.

Publishing open science

Replacing the current scientific publishing model is one goal of open science. High costs to access literature gave rise to protests such as The Cost of Knowledge and to sharing papers without publisher consent, e.g., Sci-hub and ICanHazPDF. New organizations are experimenting with the open access model: the Public Library of Science, or PLOS, is creating a library of open access journals and scientific literature; F1000Research provides open publishing and open peer review for the life sciences; figshare archives and shares images, readings, and other data; and arXiv provides electronic preprints across many fields, as do many individual journals. Other publishing experiments include delayed and hybrid models.

Software of open science

A variety of computer resources support open science. These include software like the Open Science Framework from the Center for Open Science to manage project information, data archiving and team coordination; distributed computing services like Ibercivis to utilize unused CPU time for computationally intensive tasks; and services like Experiment.com to provide crowdsourced funding for research projects.

Preprint Servers

Preprint servers come in many varieties, but the standard traits across them are stable: they seek to create a quick, free, open access, open source mode of communicating scientific knowledge to the public. Preprint servers act as a venue to quickly disseminate research that has typically not yet been accepted by a formal, peer-reviewed venue of publication. Also typical of preprint servers is their lack of a peer-review process; preprint servers usually have some type of quality check in place to ensure a minimum standard of publication, but this mechanism is not the same as a peer-review mechanism. Some preprint servers have explicitly partnered with the broader open science movement.[68] Preprint servers typically imitate the resources offered by a full journal,[69] as they seek to make all posted articles available on Google Scholar, while also collecting data about citations. Especially distinctive is the feature of making articles available to the public through social media sharing, which does not require potential readers to create new accounts with the preprint servers. The case for preprint servers is often made based on the slow pace of conventional publication formats.[70] The motivation to start SocArXiv, an open-access preprint server for social science research, is the claim that valuable research being published in traditional venues often takes several months to years to get published, which slows down the process of science significantly. Another argument made in favor of preprint servers like SocArXiv is the quality and quickness of feedback offered to scientists on their pre-published work.[71] The founders of SocArXiv claim that their platform allows researchers to gain easy feedback from their colleagues on the platform, thereby allowing scientists to develop their work to the highest possible quality before formal publication and circulation.
The founders of SocArXiv further claim that their platform affords authors the greatest level of flexibility in updating and editing their work to ensure that the latest version is available for rapid dissemination. The founders claim that this is not traditionally the case with formal journals, which institute formal procedures for making updates to published articles. Perhaps the strongest advantage of some preprint servers is their seamless compatibility with open science software such as the Open Science Framework. The founders of SocArXiv claim that their preprint server connects all aspects of the research life cycle in OSF with the article being published on the preprint server. According to the founders, this allows for greater transparency and minimal work on the authors' part.[68]

Plagiarism and Pre-Print Servers

The most common criticism of preprint servers is their proclivity to foster a culture of plagiarism. For example, the popular physics preprint venue arXiv had to withdraw 22 papers when it came to light that they were plagiarized. In June 2002, a high-energy physicist in Japan was contacted by Ramy Naboulsi, a non-institutionally affiliated mathematical physicist. Naboulsi asked the physicist to upload his papers to arXiv, as he was not able to do so himself because of his lack of an institutional affiliation. The papers were later found to have been copied from the proceedings of a physics conference.[72] Preprint servers are increasingly developing measures to circumvent this plagiarism problem. In developing nations like India and China, where plagiarism is common, explicit measures are being taken to combat it.[73] These measures usually involve creating some type of central repository for all available preprints, allowing the use of traditional plagiarism-detecting algorithms to detect the fraud. Nonetheless, this is a pressing issue in the discussion of preprint servers, and consequently for Open Science.
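The central-repository approach described above can be sketched with a classic text-similarity technique: compare word shingles of a submission against each stored preprint and flag high overlap. This is a minimal illustration of one plausible detection method, not the algorithm used by arXiv or any particular repository; the threshold value is an arbitrary assumption.

```python
def shingles(text, k=5):
    # Normalize a text into the set of its overlapping k-word shingles.
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def jaccard(a, b):
    # Jaccard similarity of two shingle sets: 1.0 means identical sets.
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def flag_plagiarism(submission, repository, threshold=0.5):
    # Return IDs of stored texts whose shingle overlap with the submission
    # meets the (arbitrarily chosen) threshold.
    sub = shingles(submission)
    return [doc_id for doc_id, text in repository.items()
            if jaccard(sub, shingles(text)) >= threshold]
```

A production system would scale this with techniques such as MinHash signatures over a much larger corpus, but the shingle-and-compare structure is the same.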

See also

References

  1. Was ist Open Science? ["What is Open Science?"], online, 23 June 2014, from OpenScience ASAP.
  2. "FOSTER". Retrieved 12 August 2015.
  3. Nancy Pontika; Petr Knoth; Matteo Cancellieri; Samuel Pearce (2015). "Fostering Open Science to Research using a Taxonomy and an eLearning Portal". Retrieved 12 August 2015.
  4. Machado, J. "Open data and open science". In Albagli, Maciel & Abdo. "Open Science, Open Questions", 2015
  5. David, P. A. (2004). "Understanding the emergence of 'open science' institutions: Functionalist economics in historical context". Industrial and Corporate Change. 13 (4): 571–589. doi:10.1093/icc/dth023.
  6. Nielsen 2011, p. 198-202.
  7. David, Paul A. (March 2004). "Can "Open Science" be Protected from the Evolving Regime of IPR Protections?". Journal of Institutional and Theoretical Economics. Mohr Siebeck GmbH & Co. KG. 160 (1). JSTOR 40752435.
  8. "OpenScience". De Gruyter Open.
  9. Molloy, J. C. (2011). "The Open Knowledge Foundation: Open Data Means Better Science". PLoS Biology. 9 (12): e1001195. PMC 3232214. PMID 22162946. doi:10.1371/journal.pbio.1001195.
  10. Bosman, Jeroen (2017-03-26). "Defining Open Science Definitions". I&M / I&O 2.0. Retrieved 2017-03-27.
  11. online 02.09.2015
  12. Glyn Moody (26 October 2011). "Open Source, Open Science, Open Source Science". Retrieved 3 January 2012.
  13. Rocchini, D.; Neteler, M. (2012). "Let the four freedoms paradigm apply to ecology". Trends in Ecology & Evolution. 27 (6): 310–311. doi:10.1016/j.tree.2012.03.009.
  14. David, P. A. (2008). The historical origins of ‘Open Science’: An essay on patronage, reputation and common agency contracting in the scientific revolution. Capitalism and Society, 3(2), 5.
  15. Fecher, Benedikt; Friesike, Sascha (2014). "Open Science: One Term, Five Schools of Thought". Opening Science. doi:10.1007/978-3-319-00026-8_2.
  16. Altunay, M., et al. (2010). A science-driven production Cyberinfrastructure—the Open Science grid. Journal of Grid Computing, 9(2), 201–218. doi:10.1007/s10723-010-9176-6.
  17. De Roure, D., et al. (2008). myExperiment: defining the social virtual research environment. In IEEE (pp. 182–189). Available at: http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=4736756.
  18. Priem, J., et al. (2011). Uncovering impacts: CitedIn and total-impact, two new tools for gathering altmetrics (pp. 9–11). In iConference 2012. Available at: http://jasonpriem.org/selfarchived/two-altmetrics-tools.pdf
  19. Friesike, S., et al. (2015). Opening science: towards an agenda of open science in academia and industry. The Journal of Technology Transfer, 40(4), 581-601. DOI: 10.1007/s10961-014-9375-6.
  20. Tacke, O., 2010. Open Science 2.0: How Research and Education Can Benefit from Open Innovation and Web 2.0. In T. J. Bastiaens, U. Baumöl, & B. J. Krämer, eds. On Collective Intelligence. Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 37–48.
  21. Nielsen 2011, p. 172-175.
  22. McClellan III, James E. (1985). Science reorganized : scientific societies in the eighteenth century. New York: Columbia University Press. ISBN 978-0-231-05996-1.
  23. Groen 2007, p. 215-216.
  24. Kronick 1976, p. 78.
  25. Price 1986.
  26. http://www.popsci.com/scitech/article/2002-07/history-popular-science
  27. Lewenstein, Bruce V. "Was there really a popular science “boom”?." Science, Technology, & Human Values 12.2 (1987): 29-41.
  28. Eisen, Michael (10 January 2012). "Research Bought, Then Paid For". The New York Times. New York: NYTC. ISSN 0362-4331. Retrieved 12 February 2012.
  29. Howard, Jennifer (22 January 2012). "Who Gets to See Published Research?". The Chronicle of Higher Education. Retrieved 12 February 2012.
  30. Rosen, Rebecca J. (5 January 2012). "Why Is Open-Internet Champion Darrell Issa Supporting an Attack on Open Science? - Rebecca J. Rosen". The Atlantic. Retrieved 12 February 2012.
  31. Dobbs, David (30 January 2012). "Testify: The Open-Science Movement Catches Fire". wired.com. Retrieved 12 February 2012.
  32. Van Calmthout, Martijn (5 April 2016). "EU wil dat onderzoekers gegevens meer gaan delen in eigen datacloud". De Volkskrant. Retrieved 8 April 2016.
  33. Osborne, Robin (2013-07-08). "Why open access makes no sense". The Guardian. ISSN 0261-3077. Retrieved 2017-01-11.
  34. Eveleth, Rose. "Free Access to Science Research Doesn't Benefit Everyone". The Atlantic. Retrieved 2017-01-11.
  35. Nielsen 2011, p. 198.
  36. Smolin, Lee (2006). The trouble with physics : the rise of string theory, the fall of a science, and what comes next (1st Mariner Books ed.). Boston: Houghton Mifflin. ISBN 978-0-618-55105-7.
  37. Enserink, Martin (November 23, 2011). "Scientists Brace for Media Storm Around Controversial Flu Studies". Retrieved April 19, 2012.
  38. Malakoff, David (March 4, 2012). "Senior U.S. Lawmaker Leaps Into H5N1 Flu Controversy". Science Insider - AAAS.ORG. Retrieved April 19, 2012.
  39. Cohen, Jon (January 25, 2012). "A Central Researcher in the H5N1 Flu Debate Breaks His Silence". Science Insider - AAAS.ORG. Retrieved April 19, 2012.
  40. Nielsen 2011, p. 200.
  41. Crotty, Shane (2003). Ahead of the curve : David Baltimore's life in science. Berkeley, Calif.: University of California Press. ISBN 9780520239043. Retrieved 23 May 2015.
  42. Wade, Nicholas (March 19, 2015). "Scientists Seek Ban on Method of Editing the Human Genome". The New York Times. Retrieved 25 May 2015.
  43. Nielsen 2011, p. 201.
  44. Nielsen 2011, p. 202.
  45. http://ronininstitute.org/open-science-and-its-discontents/1383/
  46. http://www.npr.org/tags/502124007/fake-news
  47. https://thewinnower.com/
  48. http://digitheadslabnotebook.blogspot.co.uk/2014/01/guide-to-open-science.html
  49. Munafò, Marcus R.; Nosek, Brian A.; Bishop, Dorothy V. M.; Button, Katherine S.; Chambers, Christopher D.; Sert, Nathalie Percie du; Simonsohn, Uri; Wagenmakers, Eric-Jan; Ware, Jennifer J. (2017-01-10). "A manifesto for reproducible science". Nature Human Behaviour. 1 (1). ISSN 2397-3374. doi:10.1038/s41562-016-0021.
  50. Wolfe-Simon, Felisa; Blum, Jodi Switzer; Kulp, Thomas R.; Gordon, Gwyneth W.; Hoeft, Shelley E.; Pett-Ridge, Jennifer; Stolz, John F.; Webb, Samuel M.; et al. (2 December 2010). "A bacterium that can grow by using arsenic instead of phosphorus". Science. 332 (6034): 1163–1166. PMID 21127214. doi:10.1126/science.1197258. Retrieved 2014-07-20.
  51. Zimmer, Carl (May 27, 2011). "The Discovery of Arsenic-Based Twitter". Slate.com. Retrieved April 19, 2012.
  52. M. L. Reaves; S. Sinha; J. D. Rabinowitz; L. Kruglyak; R. J. Redfield (January 31, 2012). "Absence of arsenate in DNA from arsenate-grown GFAJ-1 cells". arXiv:1201.6643.
  53. Redfield, Rosie (February 1, 2012). "Open peer review of our arseniclife submission please". RRResearch - the Redfield Lab, University of British Columbia. Retrieved April 19, 2012.
  54. "Academic Publishing: Survey of funders supports the benign Open Access outcome priced into shares" (PDF). HSBC. Retrieved 2015-10-22.
  55. Albert, Karen M. (2006-07-01). "Open access: implications for scholarly publishing and medical libraries". Journal of the Medical Library Association. 94 (3): 253–262. ISSN 1536-5050. PMC 1525322. PMID 16888657.
  56. Couchman, John R. (2014-01-01). "Peer Review and Reproducibility. Crisis or Time for Course Correction?". Journal of Histochemistry and Cytochemistry. 62 (1): 9–10. ISSN 0022-1554. PMC 3873808. PMID 24217925. doi:10.1369/0022155413513462.
  57. Collaboration, Open Science (2012-11-01). "An Open, Large-Scale, Collaborative Effort to Estimate the Reproducibility of Psychological Science". Perspectives on Psychological Science. 7 (6): 657–660. ISSN 1745-6916. PMID 26168127. doi:10.1177/1745691612462588.
  58. "Specials : Nature". www.nature.com. Retrieved 2015-10-22.
  59. Piwowar, Heather A.; Day, Roger S.; Fridsma, Douglas B. (2007-03-21). "Sharing Detailed Research Data Is Associated with Increased Citation Rate". PLoS ONE. 2 (3): e308. PMC 1817752. PMID 17375194. doi:10.1371/journal.pone.0000308.
  60. Swan, Alma. "The Open Access citation advantage: Studies and results to date." (2010).
  61. https://www.youtube.com/watch?v=OXFB-SDCSX8
  62. https://www.braininitiative.nih.gov/
  63. Nielsen 2011, p. 109.
  64. Allen, Paul (30 November 2011). "Why We Chose 'Open Science'". Wall Street Journal. Retrieved 6 January 2012.
  65. http://www.sjscience.org
  66. http://www.ocsdnet.org
  67. Noble, Ivan (14 February 2002). "Boost for research paper access". BBC News. London: BBC. Retrieved 12 February 2012.
  68. https://socopen.org/2016/07/09/announcing-the-development-of-socarxiv-an-open-social-science-archive/
  69. Tierney, H. L., Hammond, P., Nordlander, P., & Weiss, P. S. (2012). Prior Publication: Extended Abstracts, Proceedings Articles, Preprint Servers, and the Like.
  70. Moed, H. F. (2007). The effect of “open access” on citation impact: An analysis of ArXiv's Condensed matter section. Journal of the American Society for Information Science and Technology, 58(13), 2047-2054.
  71. Binfield, P. (2014). Novel scholarly journal concepts. In Opening science (pp. 155-163). Springer International Publishing.
  72. https://www.nature.com/nature/journal/v426/n6962/full/426007a.html
  73. Chaddah, P. (2016). On the need for a National Preprint Repository. Proceedings of the Indian National Science Academy, 82(4), 1167-1170.

Sources

This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.