18.11.2015

Sci journals thomson reuters xenith

Through Web of Science you can search for papers published in SCI(E)-, SSCI-, and A&HCI-indexed journals: papers in SCI(E)-indexed journals are found by selecting the Science Citation Index Expanded, papers in SSCI-indexed journals by selecting the Social Sciences Citation Index, and papers in A&HCI-indexed journals by selecting the Arts & Humanities Citation Index.
Life Sciences is a weekly peer-reviewed scientific journal covering research on the molecular, cellular, and physiological mechanisms of pharmacotherapy. Orange circles represent fields, with larger, darker circles indicating larger field size as measured by Eigenfactor® score. This map of the social sciences was created using the same methods described above on the set of social science journals listed in the Journal Citation Reports.
We created this map by clustering 1.7 million articles in the field of computer science, using the map equation approach to analyze data from Microsoft Academic Search. The two maps above show the flow of citations between the fields of Ecology and Evolution and the field of Medicine. This map of the sciences was created by clustering journal-level citation data from the 2004 edition of the Thomson-Reuters Journal Citation Reports, using the Hierarchical Map Equation methodology (Bergstrom and Rosvall 2010).
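The clustering behind these maps can be illustrated on a toy citation graph. The sketch below is only a rough stand-in: the published maps were produced with the hierarchical map equation (Infomap), whereas here a generic modularity-based community detection from networkx is used, and the journal labels and citation counts are entirely hypothetical.

```python
# Illustrative sketch only: the maps described above were produced with the
# hierarchical map equation (Infomap); here a generic modularity-based
# community detection from networkx stands in for that method, applied to a
# tiny, made-up journal citation graph. Journal labels and citation counts
# are hypothetical.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# (citing_journal, cited_journal, number_of_citations)
citations = [
    ("EcolJ", "EvolJ", 120), ("EvolJ", "EcolJ", 95),
    ("EcolJ", "AmNatJ", 80), ("EvolJ", "AmNatJ", 60),
    ("MedJ1", "MedJ2", 400), ("MedJ2", "MedJ1", 350),
    ("MedJ1", "MedJ3", 300), ("EvolJ", "MedJ1", 5),  # weak cross-field link
]

# Aggregate directed citations into an undirected, weighted graph
# (a simplification; the map equation works on the directed citation flow).
U = nx.Graph()
for citing, cited, count in citations:
    if U.has_edge(citing, cited):
        U[citing][cited]["weight"] += count
    else:
        U.add_edge(citing, cited, weight=count)

modules = greedy_modularity_communities(U, weight="weight")
for i, module in enumerate(modules, start=1):
    print(f"Module {i}: {sorted(module)}")
# Expected outcome: the ecology/evolution journals and the medical journals
# end up in separate modules, mirroring the field structure in the maps.
```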
This map, constructed using journal-level citation data from the Journal Citation Reports, highlights the role of neuroeconomics in bringing together research in economics and in neuroscience. The top 20 institutions over all fields consist of 14 US-based universities, three UK-based universities, and one each in Canada, Japan, and Germany. Leading the way, as it does for nine of the 22 fields, is Harvard University, with 95,291 papers cited a total of 2,597,786 times up until the end of April 2009.
Ranking at #3 is Johns Hopkins University with 54,022 papers cited a total of 1,222,166 times.
Coming in at #4 is the University of Washington, with 54,198 papers cited a total of 1,147,283 times. The first of five California universities on the list, Stanford University, ranks at #5 with 48,846 papers with a total of 1,138,795 cites. The second Californian university ranks at #6: the University of California, Los Angeles, with 55,237 papers cited 1,077,069 times.
Edward Wright, featured in the Special Topic on Cosmic Microwave Background Radiation, is from UCLA's Division of Astronomy and Astrophysics. Other UCLA researchers who have been featured in ScienceWatch include Alexander Young, Adam Aron, and Stephen Marder.
The remaining California institutions are the University of California, Berkeley at #8 (46,984 papers cited 945,817 times), the University of California, San Francisco at #9 (36,106 papers cited 939,302 times) and the University of California, San Diego at #12 (40,789 papers cited 899,832 times).
Sandwiched between all these California institutions is the University of Michigan at #7, with 54,612 papers cited a total of 948,621 times. Rounding out the top 10 is the University of Pennsylvania, with 46,235 papers cited a total of 931,399 times. The remaining institutions on the list include the sole entries from Japan (the University of Tokyo at #11) and Canada (the University of Toronto at #13) and the three UK-based institutions (UCL at #14, the University of Cambridge at #18, and the University of Oxford at #19).
The data and citation records included in this report are from Thomson Reuters Web of Science™.
Weekly Byte: Australian Institutions: High Impact in Organic Chemistry, 2009-2013. Listed by average citations per paper, among Australian institutions that published at least 75 papers in Thomson Reuters-indexed Organic Chemistry journals between 2009 and 2013.
Global Research Reports: Research Fronts 2013. When scientists cite each other's work, they, sometimes unknowingly, link related or identical topics within their scientific research.
Using data from the Thomson Reuters 2004 Journal Citation Reports (JCR), we partitioned 6,128 journals connected by 6,434,916 citations into 88 modules. These maps use the map equation on journal-level citation data from the Journal Citation Reports. At the highest level, scholarship splits into four domains: the life sciences, the social sciences, the earth sciences, and the physical sciences.
These institutions are the top 20 out of a pool of 4,050 institutions comprising the top 1% ranked by total citation count over all fields. These institutions all produce a high volume of papers resulting in extremely high citation counts: the top six institutions have over one million citations to their credit, and cite counts for the remaining 14 are all well over half a million. This institution, with its many component facilities, is a heavy hitter in the physical sciences, with the fields of Physics, Chemistry, and Space Science among its top-cited fields.
In the Special Topic on Epigenetics, Johns Hopkins was the top-ranked institution, and two of its top researchers, Stephen Baylin and John Herman, ranked at #1 and #2, respectively, in this topic.
The #2 ranked researcher in the field of Mathematics, Robert Tibshirani, hails from Stanford, and is a pioneer in microarray research. By far the largest contribution to this cite count comes from Michigan's work in Clinical Medicine—their record in this field includes 16,314 papers with 324,701 total citations. Four US institutions make up the rest of the list: Columbia University at #15, Yale University at #16, MIT at #17, and the University of Wisconsin at #20. These "invisible colleges" identify emerging trends and specialty areas—providing a distinct advantage for world policymakers tasked with furthering research in the face of limited resources. An arrow from field A to field B indicates citation traffic from A to B, with larger, darker arrows indicating higher citation volume.
For visual simplicity, we show only the most important links, namely those that a random surfer traverses at least once in 5,000 steps, and the modules that are connected by these links. All direct and two-step links between evolution and medicine are highlighted in light blue; direct citations from evolution to medicine in 2004 are shown in dark blue. The physical sciences are further subdivided into a chemistry and physics cluster, and a mathematics and engineering cluster. In 2010, we observe 195 citations from economics journals to neuroscience journals, and 74 citations from neuroscience journals to economics journals. In fact, the Max Planck Society is the top-ranking institution overall in Physics and Space Science, and ranks at #2 overall in Chemistry. Other prominent Johns Hopkins researchers featured in ScienceWatch include Bert Vogelstein, Rafael Irizarry, Valina Dawson, David Ginty, Frederick Nucifora, Keerti Shah, and Charles Bennett. David Donoho, another high-ranking Mathematics researcher, has also been featured in ScienceWatch.


Distinguished Penn researchers who have been featured in ScienceWatch include Mitchell Lazer, Mirjam Cvetic, Mauro Guillen, and Wafik El-Deiry.
The assessment of journals is of particular interest to South African authorities because the country's universities are partially funded according to the number of publications they produce in accredited journals, such as the Thomson Reuters-indexed journals. Pioneering Alzheimer's researcher Dennis Selkoe hails from this institution and has spoken with ScienceWatch on numerous occasions about his research.
Ferenc Krausz and Manfred Reetz have both spoken with ScienceWatch about their highly cited work. Stanford was also among the top five institutions in the Special Topic on High Temperature Superconductors. The 2004 map was originally published in Stearns and Nesse (2008) Evolutionary Applications 1:28-48. Other top scorers for Harvard include Stanley Korsmeyer, Martha Shenton, Ronald Kessler, and top diabetes researchers David Nathan and JoAnn Manson. Our objective here is to report the performance of the country's journals during 2009 and 2010 according to a number of metrics. Assessments of journals are of interest to a number of stakeholders, from scientists and librarians to research administrators, editors, policy analysts and policymakers, and for a variety of reasons. Librarians would like to keep the most reputed journals available within their budget constraints.
Research administrators use journal assessments in their evaluations of academics for recruitment, promotion and funding reasons. Editors are interested to know the relative performance of their journal in comparison with competitor journals.
Finally, policymakers have to monitor the quality of journals because they use published articles as indicators of the success of the research system, for the identification of priorities, for funding and for other similar reasons. The importance of the issue arises from the fact that higher education institutions receive financial support from the government for their research activities. These institutions receive financial support according to the number of publications their staff members produce in predetermined journals. In expert opinion assessments, experts such as well-known researchers and deans of faculties are asked to assess particular journals; the collected opinions are then aggregated and a relative statement can be made. Citations are the formal acknowledgement of intellectual debt to previously published research.
The impact factor of a journal is a measure of the frequency with which the average article in that journal has been cited in a particular year.
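Concretely, the widely used two-year impact factor for year Y divides the citations received in Y by items the journal published in Y-1 and Y-2 by the number of citable items it published in those two years. The following is a minimal sketch of that calculation; the journal and all counts in it are made up for illustration.

```python
# Minimal sketch of the standard two-year journal impact factor:
# citations received in year Y to items published in years Y-1 and Y-2,
# divided by the number of citable items published in Y-1 and Y-2.
# The journal and all counts below are made up for illustration.

def impact_factor(citations_to_prev_two_years: int,
                  citable_items_prev_two_years: int) -> float:
    """JIF(Y) = cites in Y to items from Y-1, Y-2 / citable items in Y-1, Y-2."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Hypothetical journal: 80 + 70 citable items published in 2008-2009,
# cited 180 times during 2010.
jif_2010 = impact_factor(citations_to_prev_two_years=180,
                         citable_items_prev_two_years=80 + 70)
print(f"2010 impact factor: {jif_2010:.3f}")  # -> 1.200
```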
Despite the continuous debate about the validity of Garfield's journal impact factor for identifying a journal's standing,[4,5] citation analysis has historically prevailed. The criteria for inclusion in the indices include an adequate flow of articles to the journal, publication on time (timeliness) and peer review of the articles submitted to the journal. A high impact factor means that the journal has not only qualified for inclusion in the indices, but also that researchers often cite its articles. In 2000, the South African government terminated its direct financial support to research journals,[6] and only the South African Journal of Science and Water SA continued to receive financial support. An investigation in 2005[6] showed that the termination of government involvement in the affairs of the journals had, on average, a beneficial effect on the impact factors of the journals. This strategic framework recommends, among other things, the periodic peer review of the country's journals and a change in the publishing approach towards an open-access model. Finally, during 2008, Thomson Reuters substantially increased its coverage of South African journals. The number of journals indexed in the Science Citation Index increased from 17 in 2002 to 29 in 2009 - an increase of 70%. The coverage of social sciences journals in the Social Sciences Citation Index showed an even more substantial increase: from 4 in 2002 to 16 in 2009 - a fourfold increase. Firstly, we compare the performance of the journals indexed in Thomson Reuters' Journal Citation Reports® (JCR) during 2002 with their performance during 2009 and 2010. The year 2002 was chosen because it follows the government's termination of financial support to what used to be called 'national journals'. The relevance of the investigation is underlined by the fact that South African researchers do not actively engage in, or publish, research on the assessment of scientific journals. Comparing the journals' performance during 1996 (before the termination of funding) and during 2002, it was stated that[6]: 'The South African journals are identified as performing better without government interference imposed by the constraints attached to financial support offered to them. Evidently the editors and the editorial boards have been able to support their journals better without the interference of the bureaucracy.' The recommendations included the adoption of best practice by editors and publishers in the country, the undertaking of an external peer review and quality audit of all research journals in 5-year cycles, and the adoption of an open-access publishing model enhancing the visibility and accessibility of the country's research.
The panels carrying out the reviews comprise six to eight experts, at least half of whom are not directly drawn from the disciplinary areas concerned. A pilot site for SciELO SA, hosted initially on the SciELO Brazil site, has been established and has been live since 1 June 2009. It is expected that almost 200 South African journals will eventually be available on the platform. According to a press announcement by Thomson Reuters,[13] 'The newly identified collection contains journals that typically target a regional rather than international audience by approaching subjects from a local perspective or focusing on particular topics of regional interest'. The press release emphasised: 'Although selection criteria for a regional journal are fundamentally the same as for an international journal, the importance of the regional journal is measured in terms of the specificity of its content rather than in its citation impact.' Other countries that experienced a large increase in the number of journals indexed were Brazil, with 132 additional journals, Australia with 52, Germany with 50, Chile with 45, Spain with 44, and Poland and Mexico each with 43 additional journals. From the African continent, Nigeria's and Kenya's collections each increased by one journal. Another difference between the impact factor and the Eigenfactor metrics is that whilst the impact factor has a 1-year census period and uses the previous 2 years as its target window, the Eigenfactor metrics have a 1-year census period and use the previous 5 years as their target window.
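The Eigenfactor metrics mentioned above rest on a PageRank-style calculation over the journal citation network rather than a simple citations-per-article ratio. The sketch below conveys only the core idea and is not the official Eigenfactor algorithm: the teleportation and article-count weighting are simplified, and the small citation matrix is made up for illustration.

```python
# Rough sketch of the idea behind the Eigenfactor metric: journals are scored
# by the steady state of a random walk on the citation network (citations made
# in a one-year census period to articles from the previous five years, with
# journal self-citations excluded). This is NOT the official algorithm: the
# teleportation and article-count weighting are simplified, and the 3x3
# citation matrix below is made up for illustration.
import numpy as np

# C[i, j] = citations from journal j (citing) to journal i (cited);
# the zero diagonal reflects the exclusion of self-citations.
C = np.array([
    [0.0, 30.0, 10.0],
    [20.0, 0.0, 40.0],
    [5.0, 15.0, 0.0],
])

H = C / C.sum(axis=0)            # column-normalised transition matrix
alpha = 0.85                     # damping factor, as in PageRank
n = C.shape[0]
pi = np.full(n, 1.0 / n)         # start from the uniform distribution

for _ in range(200):             # power iteration towards the stationary vector
    pi = alpha * (H @ pi) + (1 - alpha) / n

scores = 100 * pi / pi.sum()     # express as percentages, Eigenfactor-style
print(np.round(scores, 2))
```

Because the census-period citations are drawn from a five-year window and self-citations are removed before the matrix is normalised, the Eigenfactor and the two-year impact factor can rank the same set of journals quite differently.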
The composition of the editorial boards is an indicator of the internationalisation of the journal.
It should be clarified that the editorial boards are both indicators of quality and inputs in the process of publishing a journal.


For example, researchers are selective about how they spend their time and prefer to be associated with 'top' journals.
On the other hand, international researchers introduce standards and approaches to the peer review of articles that may improve the journals concerned.
The ranking into quartiles was undertaken in order to take into account the variation in citation practices among the various scientific disciplines. Of the 17 South African journals in the JCR, 4 journals declined in terms of quartiles from 2002 to 2010. Only one journal - the African Journal of Marine Science - improved its performance and moved from the third quartile to the second quartile. The South African Journal of Geology moved definitively into the second quartile of the relevant disciplinary journals during 2009, while it was exactly on the boundary between the second and third quartiles during 2002.
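Quartile ranking of this kind places each journal within its own JCR subject category by impact factor, so that Q1 means the top 25% of the category and Q4 the bottom 25%. Below is a small sketch of such an assignment; the journal names, categories and impact factors are entirely hypothetical.

```python
# Sketch of quartile ranking within JCR subject categories: each journal is
# ranked by impact factor against the other journals in its own category,
# so that Q1 is the top 25% of the category and Q4 the bottom 25%.
# Journal names, categories and impact factors are hypothetical.
import numpy as np
import pandas as pd

journals = pd.DataFrame({
    "journal":       ["Mar A", "Mar B", "Mar C", "Mar D",
                      "Geo A", "Geo B", "Geo C", "Geo D"],
    "category":      ["Marine Science"] * 4 + ["Geology"] * 4,
    "impact_factor": [2.1, 1.4, 0.9, 0.3, 3.0, 1.1, 0.8, 0.2],
})

# Rank within each category (1 = highest impact factor) ...
ranks = journals.groupby("category")["impact_factor"].rank(
    ascending=False, method="first")
sizes = journals.groupby("category")["impact_factor"].transform("size")

# ... and convert the rank to the quarter of the category it falls into.
journals["quartile"] = "Q" + np.ceil(4 * ranks / sizes).astype(int).astype(str)
print(journals.sort_values(["category", "quartile"]))
```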
It can be argued that this variability in the Geology journal's quartile position is the result of changes in the management structure of the journal: prior to 2009 the journal had a full-time editor, and from 2009 it moved to a model with a part-time editor assisted by an editorial board. With the exception of African Invertebrates, which is positioned in the second quartile of the relevant journals with an impact factor of 1.216, all other journals fall within the fourth quartile of their categories. In this test, two sample means are compared to determine whether they come from the same population (i.e. whether there is no difference between the two population means). As the p-value is larger than 0.05 (the level of significance), we cannot reject the null hypothesis that the two sets of journals come from the same population. In the Social Sciences Citation Index, South Africa is represented by 16 journals during 2009, whereas only 4 journals were indexed in 2002. Only four journals fall in the third quartile of the lists of the relevant journals in the index; all others fall in the fourth quartile.
The results again indicate that all journals were from the same population in terms of their impact factors.
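The comparison described above, whether the pre-existing and the newly added journals share the same mean impact factor, is the kind of question a two-sample test of means answers. The text does not name the exact test used, so the sketch below simply shows an independent two-sample t-test (Welch's variant, which does not assume equal variances) on made-up impact factor values.

```python
# Sketch of the comparison described above: do the pre-existing and the newly
# added journals come from the same population in terms of impact factor?
# The text does not name the exact test, so an independent two-sample t-test
# (Welch's variant, not assuming equal variances) is shown here.
# All impact factor values are made up for illustration.
from scipy import stats

pre_existing_if = [1.2, 0.8, 0.6, 1.5, 0.9, 0.4, 1.1]   # hypothetical
newly_added_if  = [0.7, 1.0, 0.5, 1.3, 0.6, 0.9]        # hypothetical

t_stat, p_value = stats.ttest_ind(pre_existing_if, newly_added_if,
                                  equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")

# A p-value above 0.05 means we cannot reject the null hypothesis that the
# two groups share the same mean impact factor.
if p_value > 0.05:
    print("No significant difference between the two groups at the 5% level.")
```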
The SAMJ: South African Medical Journal has the highest Eigenfactor score, followed by the South African Journal of Botany and the South African Journal of Science. The International Sportmed Journal and the South African Journal on Human Rights have more foreign than local academics on their boards. As the structure of the editorial board reveals, at least partially, the international character and the quality of a journal, this situation is a policy concern.
Table 8 shows the number of journals indexed in the JCR from South Africa and other selected countries. In the social sciences, South Africa has even more indexed journals than countries like Japan and China, which is probably related to language issues. On the African continent, Nigeria follows South Africa with eight and two journals in the sciences and social sciences, respectively. The dominance of plant and animal journals is the result of the country's wealth of flora and fauna resources. Inclusion in the citation indices is important internationally as an indicator of journal visibility. Universities in South Africa receive a government subsidy according to a funding formula in which one of the components is the number of research publications.
Universities currently receive more than ZAR120 000 (approximately USD12 000) for each publication that their staff members and students publish in qualifying journals. During the most recent period, the majority of the South African indexed journals belong to the third and fourth quartiles in terms of impact factor. Journals in the tail of the Thomson Reuters ranking are at risk of being dropped from the citation indices. Furthermore, as researchers prefer to submit their articles to high-impact journals, the journals in the tail run the danger of not receiving an adequate number of quality articles and hence will either have to reduce their quality standards or cease to exist. As many as 20 journals do not have any foreign researchers or academics on their editorial boards.
As international gatekeepers can transmit international standards and practices to local journals, and because their presence may increase the prestige of a journal, the issue should receive attention from the relevant authorities. It should be emphasised that international researchers on editorial boards alleviate the shortcomings of peer review in scientifically small countries like South Africa. It has been argued that in scientifically small countries a small number of researchers work in the same field; they know each other personally and are socially tied to each other and to the social community surrounding them.
Their approach of coupling scientometric assessments with peer reviews can provide further evidence of the validity of the above findings. The addition of journals to the indices increased the coverage of the various countries' scientific articles but created discontinuities in the time-series data.
Even though Thomson Reuters[13] stated that 'the importance of the [inclusion of] regional journal is measured in terms of the specificity of its content rather than in its citation impact', our investigation shows that the newly added journals were of the same quality, in terms of impact factor, as the pre-existing ones. Similarly, we thank two anonymous referees of the South African Journal of Science for their comments.
The seductive power of academic journal rankings: Challenges of searching for the otherwise.
Perspectives on botanical research publications in South Africa: an assessment of five local journals from 1988 to 2002, a period of transition and transformation.
National code of best practice in editorial discretion and peer review for South African scholarly journals.
The effects of a two stage publication process on the journal impact factor: A case study on the interactive open access journal Atmospheric Chemistry and Physics.
An impact of Croatian journals measured by citation analysis from SCI expanded database in time span 1975-2001.



