Bibliometrics and research evaluation: uses and abuses (Record no. 73477)

000 -LEADER
fixed length control field 04137nam a2200541 i 4500
001 - CONTROL NUMBER
control field 7845173
005 - DATE AND TIME OF LATEST TRANSACTION
control field 20220712204859.0
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION
fixed length control field 170316s2016 mau ob 001 eng d
020 ## - INTERNATIONAL STANDARD BOOK NUMBER
ISBN 9780262337656
-- electronic bk.
020 ## - INTERNATIONAL STANDARD BOOK NUMBER
-- electronic bk.
020 ## - INTERNATIONAL STANDARD BOOK NUMBER
-- hardcover
041 1# - LANGUAGE CODE
Language code of text/sound track or separate title eng
100 1# - AUTHOR NAME
Author Gingras, Yves,
245 10 - TITLE STATEMENT
Title Bibliometrics and research evaluation :
Sub Title uses and abuses /
300 ## - PHYSICAL DESCRIPTION
Number of Pages 1 PDF (xii, 119 pages).
490 1# - SERIES STATEMENT
Series statement History and foundations of information science
505 0# - FORMATTED CONTENTS NOTE
Remark 2 The origins of bibliometrics -- What bibliometrics teach us about the dynamics of science -- The proliferation of research evaluations -- The evaluation of research evaluation -- Conclusion: the universities' new clothes.
520 3# - SUMMARY, ETC.
Summary, etc "The research evaluation market is booming. "Ranking," "metrics," "h-index," and "impact factors" are reigning buzzwords. Government and research administrators want to evaluate everything -- teachers, professors, training programs, universities -- using quantitative indicators. Among the tools used to measure "research excellence," bibliometrics -- aggregate data on publications and citations -- has become dominant. Bibliometrics is hailed as an "objective" measure of research quality, a quantitative measure more useful than "subjective" and intuitive evaluation methods such as peer review that have been used since scientific papers were first published in the seventeenth century. In this book, Yves Gingras offers a spirited argument against an unquestioning reliance on bibliometrics as an indicator of research quality. Gingras shows that bibliometric rankings have no real scientific validity, rarely measuring what they pretend to. Although the study of publication and citation patterns, at the proper scales, can yield insights on the global dynamics of science over time, ill-defined quantitative indicators often generate perverse and unintended effects on the direction of research. Moreover, abuse of bibliometrics occurs when data is manipulated to boost rankings. Gingras looks at the politics of evaluation and argues that using numbers can be a way to control scientists and diminish their autonomy in the evaluation process. Proposing precise criteria for establishing the validity of indicators at a given scale of analysis, Gingras questions why universities are so eager to let invalid indicators influence their research strategy."
650 #0 - SUBJECT ADDED ENTRY--SUBJECT 1
-- Bibliometrics.
650 #0 - SUBJECT ADDED ENTRY--SUBJECT 1
-- Research
General subdivision Evaluation.
650 #0 - SUBJECT ADDED ENTRY--SUBJECT 1
-- Education, Higher
General subdivision Research
-- Evaluation.
650 #0 - SUBJECT ADDED ENTRY--SUBJECT 1
-- Universities and colleges
General subdivision Research
-- Evaluation.
856 42 - ELECTRONIC LOCATION AND ACCESS
Uniform Resource Identifier https://ieeexplore.ieee.org/xpl/bkabstractplus.jsp?bkn=7845173
942 ## - ADDED ENTRY ELEMENTS (KOHA)
Koha item type eBooks
264 #1 - PRODUCTION, PUBLICATION, DISTRIBUTION, MANUFACTURE, AND COPYRIGHT NOTICE
-- Cambridge, Massachusetts :
-- The MIT Press,
-- [2016]
264 #2 - PRODUCTION, PUBLICATION, DISTRIBUTION, MANUFACTURE, AND COPYRIGHT NOTICE
-- [Piscataway, New Jersey] :
-- IEEE Xplore,
-- [2016]
336 ## - CONTENT TYPE
-- text
-- rdacontent
337 ## - MEDIA TYPE
-- electronic
-- isbdmedia
338 ## - CARRIER TYPE
-- online resource
-- rdacarrier
588 0# - SOURCE OF DESCRIPTION NOTE
-- Print version record.

No items available.