Open Access: a remedy against bad science
(4 December 2012 – https://access.okfn.org/2012/12/04/open-access-a-remedy-against-bad-science/)

Who has never been in the situation of having a set of data where some points just didn't seem to fit? A simple adjustment of the numbers, or omission of the strange ones, could solve the problem. Or so you would think. I certainly have been in such a situation more than once, and looking back, I am glad that I left the data unchanged. On at least one occasion my "pet" preconceived theory proved to be wrong, and the 'strange' data I had found corresponded very well with another concept that I hadn't thought of at the time.
There has been a lot of attention in the media recently for cases of scientific fraud. Pharmaceutical companies are under fire for scientific misconduct (the Tamiflu story), and in the Netherlands the proven cases of fraud by Stapel (social psychology), Smeesters (psychology) and Poldermans (medicine/cardiology) have resulted in official investigations into the details of the malpractice by these scientists. All this has led to a sharp decline in the trust that people used to have in science (Flawed Science: The fraudulent research practices of social psychologist Diederik Stapel, report of the LND committees (Levelt, Noort, Drenth)). A report with recommendations for preventing scientific fraud, called "Sharpening policy after Stapel", was published by four Dutch social psychologists: Paul van Lange (Amsterdam), Bram Buunk (Groningen), Naomi Ellemers (Leiden) and Daniel Wigboldus (Nijmegen). One of the report's main recommendations is to share raw data and have them permanently stored, safely and accessibly for everyone. The issue of scientific misconduct is by no means restricted to the Netherlands, nor to specific fields of research; other countries have similar stories in other scientific fields. For example, a committee like the LND committees mentioned above recently presented the outcome of an investigation into the scientific publications of Eric Smart of the University of Kentucky in the field of food science. And then there is the essay by John Ioannidis, "Why most published research findings are false", in which he gives six reasons for the poor quality of (medical) science.

In this article I propose that in almost all of the instances where scientific misconduct was found, open access to articles AND raw data would either have prevented the fraud altogether, or at the very least would have caused it to be exposed much more rapidly than has been the case. Especially in the field of medical research, such a change can literally save lives.
To illustrate this point I want to make a distinction between different forms of 'Bad Science'. On the author's side we can have selective publishing (omitting data that do not fit one's theory), non-reproducibility, data manipulation and, at the far end of the spectrum, even data fabrication. On the publishers' side we have publication bias (preferential publishing of positive results or of data that confirm an existing theory), fake peer review, and reviewers or editors pushing authors to make a 'clear story' by omitting data (effectively resulting in selective publishing!).
PUBLICATION BIAS. The strategy of publishers to preferentially publish the most exciting stories, and stories in support of a new finding (publication bias), contributes to selective publishing and sloppy science. Under heavy pressure to publish their (exciting) results, researchers take less care than would be advisable when they submit their research to highly ranked journals. Small wonder, then, that so-called high impact journals also show very high retraction rates. Publication bias is also a real problem when validating scientific findings. Published results are often unrepresentative of the true outcome of the many similar experiments that were not selected for publication. For example, an empirical evaluation of the 49 most-cited papers on the effectiveness of medical interventions, published in highly visible journals in 1990–2004, showed that a quarter of the randomised trials and five of six non-randomised studies had already been contradicted or found to have been exaggerated by 2005 (see: why current publication practices may distort science). At the same time negative findings tend to be dismissed. In the case of efficacy studies for a new drug, two positive studies are sufficient for registration with the FDA, while cases have been reported in which the number of submitted negative studies was as high as 18 (see: selective publication of anti-depressant trials and its influence on apparent efficacy). I don't think I have to spell out the consequences this has for patient health.
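The statistical mechanism behind this distortion is easy to demonstrate. The following minimal Python simulation is my own illustration, not taken from any of the studies cited above, and the true effect size, sample size and significance threshold in it are arbitrary assumptions. It shows that when only "significant" results get published, the published record systematically overstates the true effect:

```python
import random
import statistics

random.seed(1)

def run_study(true_effect=0.1, n=50):
    """Simulate one small study; return the observed mean effect and
    whether it crossed an (illustrative) z > 1.96 significance bar."""
    samples = [random.gauss(true_effect, 1.0) for _ in range(n)]
    observed = statistics.mean(samples)
    std_err = statistics.stdev(samples) / n ** 0.5
    return observed, observed / std_err > 1.96

results = [run_study() for _ in range(2000)]
published = [effect for effect, significant in results if significant]

print("true effect:                 0.10")
print(f"mean over all studies:       {statistics.mean(e for e, _ in results):.3f}")
print(f"mean over 'published' only:  {statistics.mean(published):.3f}")
```

Running this, the average over all simulated studies stays close to the true effect, while the average over the "published" subset comes out roughly three times too high: exactly the pattern the empirical evaluations above describe.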
QUALITY CONTROL. In cases where scientists commit fraud, the main control mechanism in the current situation consists of peer review and comments from colleagues who have read the article(s). This control sometimes suffices, but in many cases peer reviewers don't have, or don't take, the time to look at the actual content of an article in detail, let alone at the raw data. Often these data are not even available, or have inexplicably been lost somewhere along the way. Another complication is that, because of the enormous growth in the number of journals and in total scientific output, it has become increasingly difficult to do proper quality checks of all articles in the form of peer review. And a rigorous study of malpractice like the one done by the LND committees in the case of social psychologist D. Stapel shows how much time (one and a half years) it can take to check on just one scientist. This underpins the notion that it will be impossible to check all suspicious articles in this way. The solution, in my view, can be found in open access publishing. Making information available to virtually everybody automatically entails a control mechanism for scientific quality, something like 'crowd-sourced peer review'. To state it more simply: the more people there are who can take a look at the complete data, the more likely it is that inconsistencies will be quickly spotted.
THE CASE FOR OPEN ACCESS. When articles and data are published open access, this fact alone discourages scientific misconduct. Making the complete article, including the raw data, available to a very large audience has exactly that effect: one can be sure that if there is something wrong with the article, someone out there will spot it. The same mechanism is responsible for a major advantage of open access: scientific information made available open access reaches such a large audience that there will always be someone who can and will improve on the ideas described in a publication. At the recent Berlin10 conference in Stellenbosch this so-called "proximity paradox" was brilliantly explained by Elliot Maxwell. He described the effect with a single sentence, echoing Eric Raymond's "Linus's Law": "With enough eyeballs, all bugs are shallow", meaning that with enough dissemination any problem can be solved. Tim Gowers, a fervent proponent of open access, has exploited this in his now famous Polymath Project: sharing a very difficult mathematical problem with as many people as possible solved the problem in a fraction of the time that any other approach would have taken. The company Innocentive.com exploits the same effect by broadcasting a problem to be solved and offering a fixed monetary reward to anyone who provides the solution. In this manner the "wisdom of the crowd" offers a way to keep science and scientists on track, while at the same time stimulating a new way of doing science: speeding it up, promoting the pursuit of new research, increasing innovation potential through contributions from unforeseen sources, and accelerating the translation from discovery to practical solution (E. Maxwell). And the crowd can only be wise with Open Access to information. Another effect of open access on the quality of science is that it effectively reduces duplicative and dead-end scientific projects. And last but not least, open access facilitates accountability for research funding and helps focus on real priorities.
THE ROLE OF OPEN ACCESS JOURNALS. While peer review remains indispensable for publishing good science, open access enables other forms of peer review than those traditionally in use. Open access articles can be peer-reviewed by many more people. Post-publication peer review will certainly prove to be an effective control mechanism for the quality of scientific articles, and for the detection of scientific misconduct. But pre-publication peer review can be improved as well. BioMed Central recently adopted a new system of peer review, Peerage of Science, which works with a 'pool' of potential peer reviewers much larger than the small number of reviewers that journals usually have on call. This will speed up the process of peer review, and reducing the burden on individual reviewers will probably also improve the quality of the peer review process itself. Another very important point concerning the prevention of fraud is the open access publication of the underlying raw data together with the article. A number of initiatives exist in this area. Figshare and Dryad facilitate the storage and linking of raw research data, and journals are slowly starting to move towards the publication of raw data linked to the article. The National Science Foundation and other funders now accept data as first-class research output. However, we still have a long way to go. Although 44 of the 50 journals examined had a data-sharing policy in place, a survey in PLoS ONE (Public availability of published research data in high-impact journals) concluded that research data had been made available online for only 47 of the 500 scientific publications that appeared in these journals in 2009. Despite all the efforts described above, this situation has probably not changed substantially.
CONCLUSION. Implementation of open access, inclusive of full access to raw research data, would minimize the possibilities for scientific fraud, which can be anything from biased presentation to the fabrication of data or the dismissal of negative results. It would most certainly change the way science is done. Having the data available, open access, for the whole world to see and check, will provide a very strong incentive for scientists to publish good science. In my view it will prove very difficult indeed to present faulty data in an open access / open science system and actually get away with it.
The Impact Factor: Past its Expiry Date
(22 June 2012 – https://access.okfn.org/2012/06/22/the-impact-factor-past-its-expiry-date/)

Until very recently the only ways to measure the quality of a scientific article were pre-publication peer review and post-publication citation rates.

Citation rates are still commonly used for assessing the quality of individual scientists and of individual scientific journals. In the latter case the measuring tool, the impact factor (IF), is thought to represent the chance of high citation rates when publishing your work in High Impact journals. High citation rates for articles are thus taken to mean high quality of the underlying science. In reality the impact factor has been shown to correlate poorly with actual citation rates (http://arxiv.org/abs/1205.4328, http://blogs.lse.ac.uk/impactofsocialsciences/2012/06/08/demise-impact-factor-relationship-citation-1970s/). In fact, it correlates rather well with recorded retraction rates of published papers (http://blogs.lse.ac.uk/impactofsocialsciences/2011/12/19/impact-factor-citations-retractions/).

This effectively undermines the assumed relationship between impact factor and the quality of science.
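A back-of-the-envelope simulation makes clear why this is so. Citation distributions within a journal are heavily skewed: most articles gather few citations while a handful collect very many, so a journal-level average says little about any individual paper. In the Python sketch below, the lognormal model and its parameters are my own assumptions, chosen purely for illustration and not fitted to real data:

```python
import random

random.seed(42)

# Model the citation counts of 500 articles in one journal as a
# skewed (lognormal) distribution -- an assumption for illustration.
articles = [int(random.lognormvariate(1.0, 1.2)) for _ in range(500)]

mean_citations = sum(articles) / len(articles)            # what the IF reflects
median_citations = sorted(articles)[len(articles) // 2]   # a typical article

print(f"journal mean ('impact factor'):    {mean_citations:.1f}")
print(f"median article citations:          {median_citations}")
print(f"articles cited less than the mean: "
      f"{sum(c < mean_citations for c in articles) / len(articles):.0%}")
```

The mean, which is what the impact factor reflects, is dragged upward by a few heavily cited papers and ends up well above what the typical article in the same journal achieves.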

The use of the impact factor has also had another side-effect: it has led to the preservation of a publishing system in which authors sustain existing high impact Toll Access journals by publishing their work there, simply because these journals are labeled high impact. For these reasons and more (see below) I will argue that the impact factor is long past its (imaginary) expiry date and should urgently be replaced by a new system combining altmetrics and citation rates. To distinguish this system from the old one I would like to suggest a completely new name: the Relevance Index.

Open access publishing has been instrumental in the imminent demise of the Toll-access high impact journals.

Today many high quality open access journals are publishing an increasing number of highly cited and high quality scientific articles.

Although open access has been shown to increase citation rates, we should not make the mistake of wanting to continue using the impact factor. The reason for this is simple: open access opens the way to far better methods for assessing scientific quality.

For starters, many more people will be able to read scientific articles, and therefore post-publication peer review can replace the biased pre-publication peer review system. In addition to actual citation rates, the relevance of articles in an open access system can be measured by monitoring social media usage, download statistics, the quality of accompanying data, external links to articles, etc. In contrast with the system measuring a journal impact factor, this system, called altmetrics, focuses on the article level. The field of altmetrics is under heavy development and has attracted much interest over the past few years. So much so, that this year's altmetrics12 conference (#altmetrics12), taking place in Chicago this month, has attracted a record number of visitors. The conference can be followed via a live stream on Twitter (#altmetrics12, @altmetrics12).

Apart from the fact that open access is enabling the development of quality assessment tools better than the impact factor and citation counts, open access in itself leads to better quality science through at least three separate mechanisms:

1) by counteracting the publication bias in the current publication system; 2) by discouraging selective publishing on the part of the author; 3) by minimizing scientific fraud through publication of the underlying data. Let me explain.

1) Counteracting the publication bias in the current publication system. The current publication system has evolved in such a way that the more spectacular or unusual results are, the greater the chance that they will be accepted for publication in leading scientific journals. The same goes for publications confirming such findings, while negative findings tend to be dismissed. In the case of efficacy studies for a new drug, two positive studies are sufficient for registration with the FDA, while cases have been reported in which the number of submitted negative studies was as high as 18 (see: selective publication of anti-depressant trials and its influence on apparent efficacy). This publication bias is a real problem when validating scientific findings. Published results are very often unrepresentative of the true outcome of the many similar experiments that were not selected for publication. For example, an empirical evaluation of the 49 most-cited papers on the effectiveness of medical interventions, published in highly visible journals in 1990–2004, showed that a quarter of the randomized trials and five of six non-randomized studies had already been contradicted or found to have been exaggerated by 2005 (see: why current publication practices may distort science, and references therein).

The strategy of publishers to preferentially publish the most exciting stories, and stories in support of a new finding, is linked to creating a status based on selectivity. This selectivity is then defended with the argument of limited print space. But selectivity is in fact used for something else entirely. In economic terms it is a way for publishers to turn a commodity (scientific information) whose future value is uncertain into a scarce product. This is the well-known commercial process of 'branding', whereby a product with no clear intrinsic value gains value through restricted access and artificial exclusivity. In the case of scientific publications this value then translates into status for the journal and for the scientists publishing in it. The most astonishing part of the story, however, is that publishers get their product (scientific information), which has largely been produced with public funding, for free, and succeed in selling it back to the public with the aid of commercial 'branding'. Seen in this light, publication bias is a by-product of commercial branding. Open Access would put an end to these practices. It would give free access to information to the people who have already paid for it. At the same time, implementation of open access publishing would counteract the publication bias imposed by publishers, and possibly also by stakeholders like pharmaceutical companies, because the grand total of papers published in such a system would be more representative of the actual work done in the field. In the field of malaria research, for example, the effect would be amplified by an increase in the number of relevant publications from researchers in the developing world. All this would lead to better science.

2) Discouraging selective publishing on the part of the author. The post-publication peer review made possible by open access (discussed in another post: click here) would also contribute to better science, because it would provide a control mechanism against selective publishing on the part of the author of a scientific publication.

3) Minimizing scientific fraud by publication of the underlying data. An important but often overlooked aspect of scientific publishing is the availability of the original data behind the actual science. For Open Access to really work, access should not be restricted to the mere content of published articles in scientific journals. Access to the raw data behind the articles is equally important, because validation of a publication is not easy without access to the real data. Although 44 of the 50 journals examined had a data-sharing policy in place, a recent survey in PLoS ONE (Public availability of published research data in high-impact journals) concluded that research data had been made available online for only 47 of the 500 scientific publications that appeared in these journals in 2009. Implementation of an Open Access publication system inclusive of full access to raw research data would offer the further advantage of minimizing the possibilities for scientific fraud, which can be anything from biased presentation to the fabrication of data.

Open Access is the future of scientific publishing, and as this future is near, the impact factor and Toll Access journals will soon become relics of the past.

In my view the impact factor has been flawed from the beginning, and the sooner we make the transition to open access and new forms of metrics, the better: better for science, better for citizens, better for business, better for countries and better for society as a whole.


Open Access: Not just a matter for scientists
(17 May 2012 – https://access.okfn.org/2012/05/17/open-access-not-just-a-matter-for-scientists/)

Eric Johnson is an engineering professional working as a patent facilitator for a multinational company. One of his jobs is to find information and "connect the dots" related to competitors' intellectual property, in order to develop research strategies for his company. He is also a survivor of multiple occurrences of testicular cancer, who used the medical literature to research his condition and inform his treatment. He says: "I do not believe I would be alive today if it were not for the information that can only be accessed by the layman (patient) in online sources".


This is just one of many stories on the website WhoNeedsAccess, where scientists and non-scientists speak out about their need for access to information: information that is often inaccessible without expensive subscriptions to scientific journals or payments of €20–30 per publication. The website is an initiative of Mike Taylor, a scientist, open access advocate and member of the @ccess Initiative, a group dedicated to promoting open access to scientific publications and data for everyone, scientists and non-scientists alike.

The requirement of open access to information rests on the following three principles:

  1. Access to information is a fundamental right (similar to the right to clean air, clean water, medical care and education).
  2. The accumulated knowledge of mankind is owned by everyone; it cannot and should not be claimed or shielded from access by individuals, organizations, firms or governments.
  3. Knowledge by itself has no intrinsic value; it only derives value from being shared with as many people as possible.

The dissemination of knowledge on a large scale only became possible through the distribution of books and journals by publishers among a growing group of (highly) educated people. Before the introduction of the Internet in the 1990s, publishers had built up a monopoly on the production and distribution of knowledge through printed scientific journals and books. Publishing companies justified the increasing costs of subscriptions to scientific journals by pointing to increasing production and distribution costs. Scientists and research institutions had no choice but to pay. After the introduction of the internet costs fell significantly, and modern digital reproduction and distribution have made these costs almost negligible today. The publishers, however, have continued to increase their prices and to shield most publications from free access on the internet. Because of this, scientists, institutes and other knowledge seekers continue to pay large sums to publishers for a now essentially redundant service.

The reason for this is emotional rather than scientific. Major scientific publishers have been able to maintain a monopoly on the dissemination of scientific knowledge because a growing number of authors, with a growing number of publications, have felt a compelling need to be published in a very limited number of so-called High Impact Factor journals. These journals are renowned and sought after because renowned scientists voluntarily continue to publish there. And these journals are largely owned by large multinational publishing companies.

So what is in fact happening is that scientists are publishing in these journals because they THINK they HAVE TO, because OTHERS DO SO, and because scientific committees continue to JUDGE SCIENTISTS by their NUMBER OF PUBLICATIONS IN HIGH-IMPACT JOURNALS, which ARE high impact BECAUSE scientists CONTINUE to publish their best work there. The result is a vicious circle that seems hard to break.

In this way publishers have succeeded in creating near-ideal market conditions for themselves: a product that is delivered for free (by scientists), a quality assurance system that is delivered for free (peer review by scientists), and an absurdly high price for access to information that is determined entirely by those same publishers. It is entirely unjustified that after publication one has to pay again for access to the results of the research, as much of that research has already been paid for with public funds.

How profitable scientific publishing can be is illustrated by the following figures. In 2011 Elsevier charged $7,089 for a subscription to Theoretical Computer Science (source: American Mathematical Society). That same year Elsevier made a profit of £768 million on a turnover of £2.1 billion, a margin of 37.3%; 78% of those sales came from subscriptions to scientific journals. Compare this with Apple's 2011 margin of 24%, the highest in that company's history. Another example: over the last six years, average prices for access to online content from two large scientific publishers have increased by an incredible 145%.

For a long time it seemed that publishers could continue this highly lucrative business without too much trouble. That is, until 21 January this year, when Tim Gowers, Professor of Mathematics at Cambridge University, appealed to his colleagues to boycott Elsevier, one of the largest scientific publishers. That call was so successful that the list currently counts over 11,000 signatures. More and more people seem finally to realize that something can be done against the extremely high cost of subscriptions to scientific journals and the inaccessibility of scientific information, namely NOT PUBLISHING [in these journals] and a MANDATORY REQUIREMENT FOR OPEN ACCESS.

The call from Tim Gowers has launched what the English press is already calling the Academic Spring. For example, Harvard, one of the richest universities in the world with a total budget of $31.7 billion, decided to cancel "too expensive" journal subscriptions because it could no longer afford them, at the same time asking its professors to publish, more or less mandatorily, in open access journals in order to "help increase the reputation of these scientific journals". In England, the Minister of Science David Willetts announced at a conference with the United Kingdom Publishers Association that all publicly funded research should be published as open access; the government has called on Jimmy Wales, one of the founders of Wikipedia, for help in this process. The Wellcome Trust had already issued an announcement to the same effect for the research it funds. The World Bank announced last month that all its existing and new publications, reports and documents will be open access by July 2012. And Neelie Kroes, EU Commissioner for the Digital Agenda, said on May 4, 2012, in a speech on 'Internet Freedom' at the re:publica conference in Berlin, that "entire industries that were once based on monitoring and blocking could now be rebuilt on the basis of customer friendliness, sharing and interaction." A clearer reference to the scientific publishers can hardly be imagined. The EU has now issued a directive whereby all research funded from its €80 billion budget must be published open access from 2014 onwards. And very recently the Access2Research initiative has launched a campaign for open access through a petition to the White House. The action has yielded over 10,000 signatures in slightly more than two days and will probably reach the required 25,000 signatures long before the deadline of June 19, 2012.


In the Netherlands, NWO (the Dutch Research Organisation) has for a number of years been engaged in promoting open access to scientific publications. Last year €5 million in funding was made available for adapting existing, or creating new, open access journals. One of the new journals that will receive funding is the Malaria World Journal, an online open access journal for malaria research from the Netherlands-based MalariaWorld internet platform.

Advantages and disadvantages of open access

The benefits of open access are economic, social and scientific in nature:

Economic advantages. The Committee for Economic Development (CED) in America has concluded that the benefits of the introduction of open access at the NIH have outweighed the costs many times over. And in Australia it was found that open access to the information held by the Bureau of Statistics cost $4.6 million in investments and yielded $25 million in benefits. In England, the Open Access Implementation Group has determined that the public sector has already saved £28.6 million through open access, and that each 5% increase in open access publications will save the public sector an additional £1.7 million.

Social advantages. Because access to information is key to development, innovation and prosperity, open access also has significant social implications. Good information for citizens, politicians, businessmen and others forms the basis of a functioning democracy. This is certainly of great importance for non-democratic countries, for example in Africa. Open access to information for patients can literally save lives. Open access to information for nurses will increase knowledge and motivation and contribute to better care. And there are countless other examples.

Advantages for science. The same CED mentioned above reported that its research also clearly showed that the research process itself was considerably accelerated by immediate and free access to results. Commercial applications followed more quickly and there were fewer dead-end research projects. The quality of research was in fact found to be markedly improved by open access, probably because of much more feedback and monitoring by fellow researchers. In addition, open access created many more opportunities for innovation, because more people, and especially more people from different backgrounds, tried to solve the same problem. Open access also provides a solution to the so-called "local search phenomenon", in which solutions become less optimal as the group of people searching is smaller in number and less diverse.

Disadvantages of open access. The only real disadvantage of open access to information lies mainly on the side of the old-school publishers that do not want to change their old business models. In addition, those who do change will have to accept significantly lower revenues. Poor quality of publications is often claimed to be a major disadvantage of open access; according to the critics this poor quality would be the result of less selection and a lack of peer review. The facts show otherwise. The extreme selection practiced by the High Impact journals leads to a large number of manuscripts being offered for publication and then withdrawn by the authors, causing very relevant publications to sink and end up scattered across small journals. Furthermore, open access and peer review are two different things. Many open access journals (e.g. PLoS and BioMed Central journals such as PLoS ONE and Malaria Journal) even have a more extensive peer review process, because they do both pre- and post-publication peer review of their publications. Moreover, because many more people share knowledge, other new metrics can be and are being used for assessing the quality of scientific publications, such as the number of downloads, social media messages, number of pageviews and post-publication peer review. These methods have the advantage that they take into account the reality of knowledge dissemination via the World Wide Web, and especially when combined with the conventional citation index (the H-index), they are a better indicator of the importance of a study than the citation index alone.

A chronology of open access

1991 – arXiv.org, the first open access repository
2000 – BioMed Central, the first open access publisher
2001 – Wikipedia
2002 – Budapest Open Access definition
2003 – Bethesda definition of Open Access
2003 – Berlin open access definition
2003 – Open access publisher PLoS
2008 – Combined BBB definition of open access
2012 – Academic Spring:

  • Tim Gowers: The Cost of Knowledge list, boycotting Elsevier
  • Organisations and governments requiring open access:
    • Wellcome Trust
    • UK Science Minister
    • World Bank
    • EU Horizon 2020 programme
  • Access2Research petition to the White House
The Access principle revisited: open access and the Knowledge Commons
(2 May 2012 – https://access.okfn.org/2012/05/02/the-access-principle-revisited-open-access-and-the-knowledge-commons/)

In a recent article Stevan Harnad, one of the most outspoken supporters of open access, writes that peer access has priority over public access. After noting that "for some fields of research — especially health-relevant research — public access is a strong reason for public funders to mandate providing public access", he goes on to give a number of reasons why, for other areas, public access is less important.

His reasoning goes as follows: "most peer-reviewed research reports themselves are neither understandable nor of direct interest to the general public as reading matter", and hence, "for most research, 'public access to publicly funded research' is not reason enough for providing OA, nor for mandating that OA be provided".

I strongly object to this representation of the "facts". There is no reason at all to think that science is too difficult for non-scientists. There is all the more reason to believe that non-scientists can and will contribute to science in unexpected ways, and that on top of this they are themselves in need of information. Thomas Insel, director of the National Institute of Mental Health, recently described two examples of citizen science reported at the Sage Bionetworks Commons Congress in San Francisco: Meredith, a 15-year-old high-school student from San Diego, wrote this year's breakthrough paper on modeling global epidemics, and an 11-year-old boy from upstate New York solved a problem in protein folding using the computer game Foldit. Another argument in favor of open access for everyone was given by Robbert Dijkgraaf, president of the Royal Netherlands Academy of Arts and Sciences, who said that science should be open because the more open science is, the more it will attract future generations to become scientists. In addition, we should not forget that the non-scientist public is a mixed group consisting of entrepreneurs, students, lawyers, politicians, health professionals and many others. For them, access to information is no less vital than it is for scientists.

The heart of the matter is that science is not, nor should it be, the exclusive domain of scientists. Science (and, for that matter, every other knowledge-generating human activity) is a Knowledge Commons. The core of the open access debate stems from the following three principles: 1) access to knowledge is a human right; 2) the combined human knowledge can never be the property of any single person or organization: it is a human heritage; 3) knowledge in itself is worthless: it only becomes valuable when it can be freely shared and used. These three principles are closely linked to the definition of intellectual freedom given by the American Library Association:

Intellectual Freedom is the right of every individual to both seek and receive information from all points of view without restriction. It provides for free access to all expressions of ideas through which any and all sides of a question, cause or movement may be explored. Intellectual freedom encompasses the freedom to hold, receive and disseminate ideas.

There have been times when science was the domain of a select group. However, the invention of printing and the development of publishing businesses have, at least initially, done a great deal to make science accessible to the public. And the time when universities were knowledge temples with exclusive rights to knowledge definitively came to an end when, in 1597, Gresham College in London became the first open university, where access to lectures was free for everyone. This Gresham College later gave rise to the Royal Society. Thus openness in science started centuries ago, but it has not yet led to open science.

What has gone wrong? Mainly this: openness has been severely compromised by the monopolization of knowledge by scientific publishers over a large part of the 20th century. The erosion of knowledge as a commons, in the form of restricted access to information, has done great harm to science. But it seems almost certain that this is going to end very soon. Just as the invention and widespread use of the internet, specifically designed for sharing [information], has been instrumental in the de-monopolization of software (open source software), books (free ebooks), education (open courseware) and more, it will be instrumental in the liberation of science (open access publications and data). Scientists have been slow to adapt to the realities and opportunities of the internet, while the general public has been quick to accept and use its sharing principle. However, with a growing realization [of these opportunities] among scientists, the movement for open access and open science is gaining momentum by the day. The Cost of Knowledge, a list of scientists who have signed a petition to boycott Elsevier, one of the major global players in scientific publishing, already has more than 10,000 signatories. The time has clearly come for a change, and because the internet cannot be controlled by publishers, scientists and citizens now have a unique opportunity to communicate and share their knowledge freely through it and, through open access publishing, to finally start the new era of networked open science.

Game Over for old-school publishers
(24 April 2012 – https://access.okfn.org/2012/04/24/game-over-for-old-school-publishers/)

The announcement by Harvard, or more accurately by Harvard's Faculty Advisory Council, must surely have come as a titanic blow to old-school publishers like Elsevier.

These publishers may still have been thinking that the whole open access thing would blow over, but the unexpected move by Harvard, one of the richest and most prestigious universities in the world, must have told them that it is game over for their very profitable scientific publishing model.

In their announcement to Faculty Members in all Schools, Faculties, and Units, the Council stated that

“[we] write to communicate an untenable situation facing the Harvard Library. Many large journal publishers have made the scholarly communication environment fiscally unsustainable and academically restrictive. This situation is exacerbated by efforts of certain publishers (called “providers”) to acquire, bundle, and increase the pricing on journals”.


Although this introduction points to a strong economic motive for this move against restrictive publishers, the Council goes on to propose nine options that in its view could solve the problem, all of which would prove disastrous for those publishers.

For me it is especially the second option that deals the final blow: "Consider submitting articles to open-access journals, or to ones that have reasonable, sustainable subscription costs; move prestige to open access (F)".

When scientists take this suggestion seriously, they will expose the Achilles heel of many of the renowned high impact journals: their status is upheld by (excellent) scientists who assume they have to publish their work there. And these scientists feel they have no choice, because their peers publish there. This closed-loop system can only work as long as everybody believes they have to publish in these journals for that reason. The moment scientists start turning away from this idea and move towards open access journals, the same mechanism will cause others to start publishing there as well. And that moment seems not far away.

A major factor still forcing many (early-career) scientists to publish in established high impact journals is the belief of scientific committees dealing with appointments, tenure and grant approvals that publications in precisely those journals matter most. For any lasting change to take place, it will be essential that these committees change their attitude as well. And they may even do so on the basis of facts. There are two major persistent myths about open access: 1) that the quality of the science published is inferior to conventionally published science, and 2) that open access publishing is unsustainable. Both myths are effectively neutralized by the very successful business models of PLoS and BioMed Central, which publish peer-reviewed open access journals with high impact factors and whose publication volumes already surpass those of many conventionally published journals. Other open access publishers are following suit.

With all the events surrounding open access, we shouldn't lose sight of the main reason for wanting open access in the first place: progress in science critically depends on the free and unrestricted sharing of knowledge. This sharing will most probably take place in open science communities using open access information sources. The future of science has no place for restrictive systems, and certainly not for old-school scientific publishing. And because the future has just begun, it really is Game Over for Elsevier and similar publishing businesses.

The next revolution in Science: Open Access will open new ways to measure scientific output
(19 April 2012 – https://access.okfn.org/2012/04/19/the-next-revolution-in-science-open-access-will-open-new-ways-to-measure-scientific-output/)

Open Access will not only change the way science is done, it will also change the way science is judged. The way scientific output is measured today centers on citations. Essentially, at the author level this means the number of publications and the citations of an author's articles (author-level metrics). At the journal level, it means the average number of citations that articles published in that journal have received in a given time period (journal-level metrics).

For author-level metrics the Author Citation Index has now been replaced by the H-Index, introduced in 2005 by J.E. Hirsch. Here the criterion is the largest number h of an author's articles that have each received at least h citations at a fixed date. For journal-level metrics, the Journal Citation Report (JCR) is a database of all citations in more than 5,000 journals: about 15 million citations from 1 million source items per year. From this the journal Impact Factor (JIF) is derived: the number of citations in the current year to items published in the previous two years (the numerator) divided by the number of substantive articles and reviews published in those same two years (the denominator). It effectively represents the average number of citations per year that one can expect to receive by publishing one's work in a specific journal.
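Both metrics are simple enough to compute in a few lines. The Python sketch below implements them exactly as defined above; the citation counts fed in at the bottom are hypothetical, for illustration only.

```python
def h_index(citations):
    """H-Index (Hirsch, 2005): the largest h such that h of the
    author's papers have received at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)

def journal_impact_factor(citations_to_prev_two_years, items_prev_two_years):
    """JIF for year Y: citations received in Y by items published in
    Y-1 and Y-2, divided by the number of citable items from Y-1 and Y-2."""
    return citations_to_prev_two_years / items_prev_two_years

# Hypothetical numbers, for illustration only.
print(h_index([10, 8, 5, 4, 3]))        # -> 4 (four papers have >= 4 citations)
print(journal_impact_factor(450, 150))  # -> 3.0
```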

Although the JIF is meant for large numbers of publications, it is often used in the evaluation of individual scientists as well. Granting agencies and university committees, for instance, often substitute the number of articles an author has published in high impact journals for the actual citation counts. The introduction of the H-Index has diminished the use of the JIF for individual scientists, but the practice has yet to disappear. Apart from this, the JIF has other flaws. Imagine a journal publishing only reviews: such a journal would evidently get a high impact factor, but the real impact of its papers on the field would clearly be much smaller than that of original research papers. An easy way around this problem is to apply the H-Index methodology to journals, which is precisely what Google Scholar Metrics does. Because Google has only offered this since 1 April 2012, it is too early to tell whether it will become a widely accepted method for journal-level metrics.

The H-Index, Google Scholar Metrics and the JIF are all rather good indicators of scientific quality. In measuring real-world impact, however, they are seriously flawed. Think for a moment of how impact is felt for whatever random topic you can think of. Every one of us will consider the publication itself, but probably also downloads, pageviews, blogs, comments, Twitter, and different kinds of media and social network activity (Google+, Facebook), among other things. In other words, all activities that can be measured by "talking" through social media and other online channels can be used to give a more realistic impression of the real impact of a given research article. Since talking about articles depends on actually being able to read them, this is where open access comes into play. The proposed kind of article-level metrics only makes sense when many people are able to discuss the actual content of published articles, which in turn is only possible when articles are open access. The optimal condition for using altmetrics would be for all articles to be published open access, but even with the current growth of open access papers the method is already starting to make sense.

A number of article-level metrics services are currently in the start-up phase. Altmetric is a small London-based start-up focused on making article-level metrics easy. It does this by watching social media sites, newspapers and magazines for any mentions of scholarly articles. The result is an "altmetric" score: a quantitative measure of the quality and quantity of attention that a scholarly article has received. The altmetric score is also implemented in Utopia Docs, a PDF reader which links an article to a wealth of other online resources, such as CrossRef (the DOI registration agency), Mendeley (a network for scientists), Dryad (a data repository), SciBite (tools for drug discovery), SHERPA (a database of OA policies and copyright) and more. A disadvantage of Utopia Docs may be that it focuses on the PDF format instead of an open format; the system also seems rather slow. PLoS likewise uses article-level metrics to qualify articles, attaching comprehensive information about the usage and reach of published articles to the articles themselves, so that the entire academic community can assess their value. Unlike the services above, PLoS provides a complete score built on a combination of altmetrics, citation analysis, post-publication peer review, pageviews, downloads and other criteria. Finally, Total-Impact also makes extensive use of the analysis of social media and other online statistics, providing a tool to measure the total impact of a given collection of scientific articles, datasets and other outputs. Its focus on collections represents yet another approach to the problem of evaluating scientific output.
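To make concrete what "a quantitative measure of the quality and quantity of attention" might look like, here is a minimal sketch of a weighted composite score. The signal names, the weights and the simple linear combination are all my own assumptions for illustration; they are not the actual algorithms of Altmetric, PLoS or Total-Impact, which select and weight their sources far more carefully.

```python
# A toy composite "attention score" in the spirit of the services above.
# Signal names and weights are invented for illustration only.
WEIGHTS = {
    "news_mentions": 8.0,   # a news story counts for more...
    "blog_posts": 5.0,
    "tweets": 1.0,          # ...than a single tweet
    "downloads": 0.05,
    "pageviews": 0.01,
}

def attention_score(signals):
    """Weighted sum of the attention signals recorded for one article."""
    return sum(WEIGHTS.get(name, 0.0) * count for name, count in signals.items())

article = {"news_mentions": 2, "blog_posts": 3, "tweets": 40, "downloads": 1200}
print(attention_score(article))  # 2*8 + 3*5 + 40*1 + 1200*0.05 = 131.0
```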

The overview above is probably far from complete, so please feel free to add other possibilities in your comments to this post. I do think, however, that it accurately reflects the fact that the field of bibliometrics is moving fast, and that Open Access will provide the key to the development and implementation of better ways to evaluate scientific output. Compared with current practices, all of which are based on citations alone, the inclusion of altmetrics, online usage statistics and post-publication peer review in an open access world will represent a true revolution in the way science is perceived by all, scientists included.

Point of No-Return for Open Access
(11 April 2012 – https://access.okfn.org/2012/04/11/point-of-no-return-for-open-access/)

The Open Access movement is gaining momentum by the day. Just a short while ago you could still hear voices telling us that this was all just hype and would go away. Recent events, however, have proved them wrong.

The call for action by Tim Gowers may have marked a point of no return for the open access movement. It almost seemed as if scientists suddenly and collectively came to realize the absurdity of a situation they had taken for granted for all too long. Looking back, it really seems absurd that scientists have provided publishers with their work for free, have reviewed that work for free, and that publishers in return have charged the scientists and others for access to the information.

There has been a lot of media attention for open access recently. The Guardian put open access on its front page in an article on the Wellcome Trust's move in favor of open access. Sir Mark Walport, director of the Wellcome Trust, said that his organisation "would soon adopt a more robust approach with the scientists it funds, to ensure that results are freely available to the public within six months of first publication".

Another major event has been the announcement by the World Bank that it will "become the first major international organization to require open access under copyright licensing from Creative Commons—a non-profit organization whose copyright licenses are designed to accommodate the expanded access to information afforded by the Internet". On April 10, 2012 the World Bank launched a repository as a one-stop shop for most of the Bank's research outputs and knowledge products, providing free and unrestricted access to students, libraries, government officials and anyone interested in the Bank's knowledge. Additional material, including foreign-language editions and links to datasets, will be added in the coming year. This move is especially significant since the Bank is not just making its work free of charge, but also free for use and reuse.

But with the increased media attention comes the danger that we lose sight of what is meant by the term 'open access'. With everyone talking about 'open access' as if it were a single, clearly defined term, it has become more urgent than ever to have clarity on this issue. This was one of the reasons for starting the @ccess Initiative, where this blog is posted. Because open access can range from somewhat restricted (free reading only) to completely unrestricted (completely free for use and reuse), we have proposed the term @ccess for free and unrestricted access to information in accordance with the BBB definition.

Another reason for the @ccess Initiative, and a matter of increasing importance, is the EASE of access to information. As more and more information becomes available through open access, the difficulty of finding the right information will also increase. The use of a great number of institutional repositories can only work when all these repositories are adequately cross-linked and working together, a near-impossible task to accomplish. A better option would be to reduce the number of repositories by limiting them to big (inter)national organisations like the WHO, the World Bank, the FAO and others.

Another option still, and one I personally favor, can run in parallel with the option above: the storage and management of information within specialized scientific communities, as I have described in another blog and on the @ccess communities page of this website. To give an example, and the one we are actually working on: together with MalariaWorld we are developing a comprehensive database of malaria-related publications. At the same time we will ask researchers to deposit their manuscripts and data in an open access repository linked to the database. The database will also link to open access articles, and for restricted-access publications we will seek to get as many manuscripts as possible deposited in the database as well. The community will eventually provide open access to all information, provide a platform for collaboration and information exchange, and serve as a communication hub for everyone seeking information on, or working on, malaria. Other communities can be formed using this model. In this way we would move towards a system of interlinked scientific communities with easy access to pertinent information through those communities. This model would also maximize the chances for scientific collaboration and innovation. The combination of open access and the participation of scientists and citizens in the scientific enterprise will change the way science is done. Networked scientific communities will have far better chances to tackle the world's toughest problems, not least because open access would give people in developing countries equal opportunities to profit from, and contribute to, science. To quote Peter Suber: "What you want is for everybody to have access to everything, and not just the north making its research available to the south or vice versa. Open access should be a two-way street." The proposed structure for scientific @ccess communities would be perfectly suited to this task.
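To sketch the idea, here is a minimal example of the kind of record such a community database might hold. The schema is entirely hypothetical, my own invention rather than MalariaWorld's actual design; the point is simply that a single record can tie together an article's metadata, its open access status and any repository deposit of the manuscript and data.

```python
from dataclasses import dataclass, field

@dataclass
class PublicationRecord:
    """One entry in a hypothetical community publication index."""
    doi: str
    title: str
    year: int
    open_access: bool
    article_url: str = ""        # link to the OA article, if any
    deposit_url: str = ""        # deposited manuscript/data in the linked repository
    datasets: list = field(default_factory=list)

# A restricted-access paper still becomes findable, and readable via the
# deposited author manuscript linked from the community database.
record = PublicationRecord(
    doi="10.0000/example-doi",   # placeholder identifier
    title="An example malaria study",
    year=2012,
    open_access=False,
    deposit_url="https://repository.example.org/handle/1234",
)
print(record.deposit_url)
```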

Comment on the RCUK draft Policy on Open Access
(5 April 2012 – https://access.okfn.org/2012/04/05/comment-on-the-rcuk-draft-policy-on-open-access/)

Today I submitted my comments on the RCUK's proposed policy on access to research outputs. I am posting these comments publicly here.


Summary

I am very happy to see these proposed changes in the RCUK's open access policy, especially the policy on text and data mining, described as:

Specifically stating that Open Access includes unrestricted use of manual and automated text and data mining tools; and unrestricted reuse of content with proper attribution


I do have strong objections to the acceptance of delayed open access as a valid form of open access. This may be a compromise to get (certain) publishers to accept the policy; however, there are enough open access publishers that do not impose an embargo, and I don't see why we (scientists) should give in to the wishes of a specific group of publishers. For me, any embargo obstructs the advancement of science and the timely sharing of knowledge and should thus not be part of open access. I would personally also welcome it if you referred in your open access definition to the Budapest or BBB definition, as we do on the website of the @ccess Initiative, of which I am a member. Finally, I would like to see more collaboration and cooperation with the EU Digital Agenda, which in my view is on the same course as the RCUK.

A few more comments and suggestions on some of the proposed changes are listed below.

(2) What do the Research Councils mean by Open Access?

Search for and re-use the content of published papers both manually and using automated tools (such as those for text and data mining) without putting restrictions on the amount of data, provided that any such reuse is subject to proper attribution.

(3) How is a Scholarly Research Paper made Open Access?

but in practice the Research Councils will accept that access may be restricted to comply with an embargo period imposed by the publisher

An embargo period is not acceptable.

(4) What do journals need to do to be compliant with Research Council policy on Open Access?

a) This may require payment of an ‘Article Processing Charge’ to the publisher

I recommend adding a note on what constitutes an acceptable charge, because this should not be left open-ended.

(5) What Research Outputs will be covered by Research Council Policy on Access to Research Outputs and where should they be published?

No comment

(6) When should a paper become Open Access?

In future, Research Councils will no longer be willing to support publisher embargoes of longer than six or twelve months from the date of publication, depending on the Research Council

Delayed open access is not acceptable (see the summary above).

(7) How is Open Access paid for?

Research Council grant funding may be used to support payment of Article Processing Charges to publishers

I think that the policy of paying for open access papers from grants is a good one. I would, however, impose limits on what counts as an acceptable APC (see my comment under (4)).

(8) Acknowledgement of funding sources and access to the underlying research materials

The underlying research materials do not necessarily have to be made Open Access, however details of relevant access policies must be included

In my opinion, the underlying research materials cannot be seen as separate from the results of research; they should fall under the same rules and should be open access. In fact, the underlying data are vital for judging the quality of the research. I do recognize the need for exceptions for some datasets, such as patient medical data, but for these cases a list of exceptions would be sufficient.

(9) Implementation and compliance

No comments, except to note my agreement.


Dr. Tom Olijhoek
Independent researcher
Consultant, Dutch Malaria Foundation
Associate Editor, MalariaWorld Journal
Group Coordinator, @ccess Initiative
1336GB Almere
The Netherlands
tomolijhoek@malariastichting.nl
@greboun


The new RCUK draft Open Access mandate https://access.okfn.org/2012/03/27/the-new-rcuk-draft-open-access-mandate/ https://access.okfn.org/2012/03/27/the-new-rcuk-draft-open-access-mandate/#comments Tue, 27 Mar 2012 13:17:45 +0000 https://access.okfn.org/?p=291 This blog post was sent to me by Ross Mounce for publication on @ccess.


Research Councils UK (RCUK), a partnership of seven core UK research funding bodies (AHRC, BBSRC, EPSRC, ESRC, MRC, NERC, and STFC), has recently released a very welcome draft policy document detailing its proposed Open Access mandate for all research it helps fund.

The new proposed policies include (quoting from the draft):

  • Peer reviewed research papers which result from research that is wholly or partially funded by the Research Councils must be published in journals which are compliant with Research Council policy on Open Access.
  • Research papers which result from research that is wholly or partially funded by the Research Councils should ideally be made Open Access on publication, and must be made Open Access after no longer than the Research Councils’ maximum acceptable embargo period. [6 months for all except AHRC & ESRC, for which 12 months is the maximum delay permitted].
  • …researchers are strongly encouraged to publish their work in compliance with the policy as soon as possible. [emphasis mine]


As a researcher funded by BBSRC myself, I’m thrilled to read this document.

It shows a clear understanding of the issues, including explicit statements on the need for different types of access, both manual AND automated:

The existing policy will be clarified by specifically stating that Open Access includes unrestricted use of manual and automated text and data mining tools. Also, that it allows unrestricted re-use of content with proper attribution – as defined by the Creative Commons CC-BY licence

But as a strong supporter of the Panton Principles for Open Data in Science and the Science Code Manifesto, I’m a little disappointed that the policy improvements with respect to data and code access are comparatively minor. Such underlying research materials need only be ‘accessible’, with few further stipulations as to how. AFAIK this allows researchers to make their data available via pigeon-transport (only) on Betamax tapes, 10 years after the data was generated, if there is no ‘best practice’ standard in one’s field.

The BBSRC’s data sharing policy, for example, seems to favour cost-effectiveness over transparency: “It should also be cost effective and the data shared should be of the highest quality.” It also, maddeningly, seems to give researchers ownership of data, even though the data was obtained using BBSRC funding: “Ownership of the data generated from the research that BBSRC funds resides with the investigators and their institutions.” This seems rather devoid of logic to me: if taxpayers paid for this data to be created, surely they should have some ownership of it? Finally, “Where best practice does not exist, release of data within three years of its generation is suggested.” Three years, huh? And that’s only a suggestion! Does anyone actually check that data is made available after those three years? I suspect not.

Admittedly, it would be hard to create a good one-size-fits-all policy, and policing it would cost more money, but I do feel that data and code sharing policies could be tightened up in places, to enable more frictionless sharing, re-using and building on previous research outputs.

So, all in all, this is a great step in the right direction towards Open Scholarship, particularly for BBB-compliant Open Access.

Related reactions and comments that are well worth reading include posts by Casey Bergman, Peter Suber, and Richard Van Noorden.


Scientific social networks are the future of science https://access.okfn.org/2012/03/20/scientific-social-networks-are-the-future-of-science/ https://access.okfn.org/2012/03/20/scientific-social-networks-are-the-future-of-science/#comments Tue, 20 Mar 2012 09:10:05 +0000 https://access.okfn.org/?p=283
“@ccess to knowledge is a fundamental human right” – Peter Murray-Rust

Since ancient times, information has been passed on orally or on paper. In terms of information technology, information on paper is compartmentalized: finding information is often synonymous with finding the right book or publication. Books mostly give a more or less complete picture, and any links to other works are often there just for reference purposes. This is fundamentally different from information on the internet, which is hyperlinked by nature. Rather than finding information, it is the filtering of relevant information that is hard to do on the internet (David Weinberger, Too Big to Know).

An enormous amount of information is stored on the net. Open access to this knowledge is critical if it is to be shared between individuals and groups. But sharing alone is not enough. Knowledge only becomes useful when we can distinguish between relevant and less relevant information, when we can discuss aspects of the information, when we can annotate and improve on ideas, and when we can devise new approaches and collaborate online. This is what I mean by “open science”: scientists having free and unrestricted access to information and using interactive media to collaborate online.

For scientific research, this means that open access to publications is necessary to create opportunities for sharing, and that the social interaction of scientists and citizens in online scientific communities is necessary to both filter the information and do something (useful) with it. 

Online scientific communities come in different flavours. I want to divide them here into two categories: broad-interest communities and specialist communities. Examples of the former are found with open access publishers like Frontiers, BioMed Central, PLoS and InTech, with LinkedIn, and with Google+, all of which provide platforms for online (scientific) communication and interactivity (blogs, forums, interest groups). Independent academic community platforms like Mendeley, academia.edu, Connotea, ResearchGate, MyScienceWork and UnitedAcademics additionally provide online reference management facilities, including real-time annotation, repositories, shared document editing and collaborative project management. The filtering of publications in all these cases is mostly done by group members, using specially developed tools like Papercritic. GoingOn, a new type of academic social network, has been developed specifically for students, as most students are no longer attuned to older forms of information such as printed handbooks and online portals.

Examples of the second group, the specialized online communities, are MalariaWorld (for malaria scientists), the OAD (the Open Access Directory, for those interested in Open Access) and BiomedExperts, but many more exist. MalariaWorld is a community of more than 7,000 malaria scientists who receive weekly email updates on new malaria publications. Providing interactivity online is an important aspect of the site.

Because the students of today are the scientists of tomorrow, I am convinced that open science communities are the future of science and that they will continue to grow, incorporating existing social media like Twitter, YouTube and others in the process. One of the aims of the @ccess initiative, therefore, is the formation of @ccess specialist communities where scientists and citizens can share, discuss and collaborate on information in defined areas of interest. We have chosen a few areas to begin with. As a first step towards this goal we needed at least one motivated specialist community with information to share: MalariaWorld volunteered to be that first @ccess community. In collaboration with the @ccess initiative they will soon be providing comprehensive open access information on malaria (publications and data) using innovative bibliographic databases, self-archiving, a dedicated open access MalariaWorld Journal (self-publishing) and, ultimately, novel tools for impact assessment. We plan for many more communities to be modeled on this example.

In my view, the future of science will ultimately depend on the formation of many such interconnected scientific communities covering all possible areas. Making optimal use of the internet and social media, scientists and citizens within and between these communities will collaborate to produce more useful knowledge than ever before, and to store, maintain and provide information for those who seek it. Especially for medical scientists in the developing world, these communities will provide vehicles for innovation, health improvement and development in their respective countries. Following this line of thought, the only hope of winning the battle against malaria, AIDS, neglected diseases and other tropical infections lies in free access to and sharing of information, and in joining forces by way of social media and open science communities. It is for these reasons that a research community like MalariaWorld is our best hope in the ongoing battle against malaria.

The concept of scientific networks and their role in changing the way we do science has been described best by Michael Nielsen in his book “Reinventing Discovery”, by David Weinberger in “Too Big to Know”, and by Cameron Neylon in an excellent blog post on network enabled research. I owe a special thank you to these three authors for sharing their thoughts.

