There are many factors that you need to consider when assessing the quality of a journal.
Which databases index the journal? For example, is it indexed in Web of Science and/or Scopus? And what impact factors or citation analysis does the journal have in Web of Science and/or Scopus? How is the journal rated according to the Publication Forum (Julkaisufoorumi, Jufo) and/or Academic Journal Guide (AJG)?
See Journal ranking indicators below.
Peer review is a quality control measure applied to research before it is published in a journal. UlrichsWeb contains general information about journals, and you can check it to find out whether a journal's articles undergo peer review before publication.
First, search for the journal in the list of e-journals in Hanna. Then click the SFX icon and choose ulrichsweb.com. If the line “Refereed” is marked “Yes” on the journal’s information page in UlrichsWeb, the journal uses peer reviewers in its publication process.
What are the journal’s circulation and coverage? How frequently is it published? How quickly does the journal move from submission to publication? What are its acceptance/rejection rates?
The information can be found on the journal’s or publisher’s website. Some journals have "received" and "accepted" dates on the first page of their articles.
Journals are also categorized by publication type according to the Ministry of Education and Culture.
The Journal Impact Factor (JIF) is the original journal ranking indicator for journals in science and social science disciplines, but others are now also available which attempt to take account of variations between subject areas and time periods.
Hanken uses a combination of impact factors in Journal Citation Reports (JCR), the Publication Forum (Jufo) ratings, Source Normalized Impact per Paper (SNIP), and the Academic Journal Guide (AJG) when assessing the quality of a journal.
The Journal Impact Factor (JIF) or Impact Factor (IF) of an academic journal is a measure reflecting the yearly average number of citations to recent articles published in that journal. The impact factor was devised by Eugene Garfield, the founder of the Institute for Scientific Information.
JIF uses citation data from the Web of Science database, with full JIF information in Journal Citation Reports (JCR). Impact factors have been calculated yearly since 1975 for journals listed in Journal Citation Reports and are updated each June. The journals with the highest impact factors reach values of about 50, while most journals remain below 1.
Calculation: the number of times articles published in the previous two years were cited during the year in question is divided by the number of citable items published in those two years.
For example, 2013 impact factor of a journal would be calculated as follows:
200 = Total number of times articles published in 2011 and 2012 were cited by indexed journals during 2013
73 = Total number of "citable items" published in 2011 and 2012
2013 impact factor = 2.74 (200/73).
This means that the journal’s articles from 2011 and 2012 were cited approximately 2.74 times on average during 2013.
Note that "citable items" for this calculation are usually articles, reviews, proceedings, or notes. JIF eliminates front matter (e.g. editorials, news, letters to editors) in its calculations.
There is also a five-year impact factor, which is a more reliable indicator in many subject areas. Articles from the years 2012, 2011, 2010, 2009, and 2008 are included in the five-year impact factor for 2013.
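The calculation above can be expressed as a small sketch. The function name and figures are illustrative (taken from the worked example, not from Journal Citation Reports):

```python
# Minimal sketch of the two-year Journal Impact Factor calculation,
# using the hypothetical figures from the example above.

def impact_factor(citations_this_year: int, citable_items: int) -> float:
    """Citations received this year to items published in the previous
    two years, divided by the number of citable items (articles,
    reviews, proceedings, notes) published in those two years."""
    return citations_this_year / citable_items

# 2013 JIF: 200 citations in 2013 to 2011-2012 articles,
# 73 citable items published in 2011-2012.
jif_2013 = impact_factor(200, 73)
print(round(jif_2013, 2))  # 2.74
```

The five-year impact factor follows the same pattern, only with citations to, and citable items from, the previous five years.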
Tutorials for JCR: Journal Citation Reports: Learn the Basics.
Note that the JIF can easily be skewed, for example when a single article is very highly cited (see "Metrics: journal's impact factor skewed by a single paper", a 2010 paper in Nature). Journal rankings evaluate the performance of a journal as a whole but say nothing about the impact or quality of individual articles: most articles receive few citations even when published in a highly ranked journal. The impact factor is therefore often misused as a measure of journal quality.
The impact factors of journals in different fields should not be compared with each other.
The national Publication Forum (in Finnish Julkaisufoorumi, Jufo) rating aims to serve as the quality indicator of the whole scientific publication production of universities in Finland. Following the Norwegian and Danish examples, the Publication Forum system is based on quality classification of scientific publication channels – journals, publication series and book publishers – in all research fields.
The publication channels are divided into three levels (1 = basic; 2 = leading; 3 = top) by 23 field-specific expert panels. Other identified publication channels that do not reach level 1 are listed as 0, and channels not yet assessed by the expert panels are marked with –.
On the Publication channel search page, you can check which level a journal, publication series, conference or publisher has.
Publication Forum operates within the Federation of Finnish Learned Societies (TSV). The first Publication Forum rating was completed in 2011. From 2015, the rating has been in use within the Ministry of Education and Culture funding model, as well as in the internal funding model at Hanken beginning from the year 2013 (Hanken Foundation’s Publication Rewards).
CWTS Journal Indicators provides free access to bibliometric indicators on scientific journals. The indicators have been calculated by Leiden University’s Centre for Science and Technology Studies (CWTS) based on the Scopus bibliographic database produced by Elsevier. Indicators are available for over 20,000 journals indexed in the Scopus database.
A key indicator offered by CWTS Journal Indicators is the Source Normalized Impact per Paper (SNIP), which measures the average citation impact of a journal’s publications.
CWTS Journal Indicators also provides stability intervals that indicate the reliability of the SNIP value of a journal. More information on the indicators offered by CWTS Journal Indicators is available on the Methodology page.
Unlike the well-known journal impact factor, SNIP as a field-normalized metric corrects for differences in citation practices between scientific fields, thereby allowing for more accurate between-field comparisons of citation impact. It is helpful especially for researchers publishing in multidisciplinary fields.
SNIP thus allows journals from different subject areas and research fields to be compared. It weights citations according to the total number of citations in a subject field, taking into account the citation patterns of different disciplines: a single citation is given a higher value in subject areas where citations are fewer, and vice versa.
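The normalization idea can be sketched as follows. This is a deliberately simplified illustration: the field names and numbers are hypothetical, and the real CWTS methodology (described on their Methodology page) is considerably more involved.

```python
# Simplified sketch of the idea behind SNIP: a journal's raw
# citations per paper are divided by its field's citation potential,
# so one citation counts for more in fields where citing is rare.
# All values below are hypothetical, for illustration only.

def snip_like(citations_per_paper: float, field_citation_potential: float) -> float:
    # Normalize raw impact by how heavily the field cites on average.
    return citations_per_paper / field_citation_potential

# Two hypothetical journals with the same raw citations per paper:
math_journal = snip_like(2.0, field_citation_potential=1.0)    # low-citation field
biomed_journal = snip_like(2.0, field_citation_potential=4.0)  # high-citation field
print(math_journal, biomed_journal)  # 2.0 0.5
```

With identical raw impact, the journal in the low-citation field ends up with the higher normalized value, which is exactly the between-field correction SNIP aims for.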
The Chartered Association of Business Schools (CABS) in the UK releases the Academic Journal Guide (AJG), also known as the “ABS list”, a ranking of business journals. It is the successor of the often criticized Academic Journal Quality Guide. The ABS ranking has become the guiding journal ranking across management disciplines in the UK and, despite heavy criticism, has been adopted by many business schools in other countries as well.
The AJG is a guide to the range and quality of journals in which business and management academics publish their research. Its purpose is "to give both emerging and established scholars greater clarity as to which journals to aim for, and where the best work in their field tends to be clustered."
The AJG uses five ratings: 4*, 4, 3, 2, and 1. See Ratings explained.
The Guide can be freely accessed after you have registered and logged in.