Showing posts with label h-index. Show all posts

Wednesday, December 29, 2021

Publications, Metrics and Reputation

Here is an email I sent to my students about publications and reputation, based on a recent paper that just appeared online. I have removed names to protect parties that might prefer to remain anonymous.

Dear all,

A conversation about publications often comes up between graduate students and their research mentors.  I know that we have talked about this multiple times.  A valid concern of many students is the strength of their publication record, which future academic employers will scrutinize.  It is easy to count publications or compute metrics such as the h-index, but an individual’s reputation is built on substance, not simplistic numbers.

First, the research itself must be interesting and useful to others.  There are many papers that wow me even if I never cite them in my own research, winning my highest respect for those who create such gems.  You should work hard and enjoy the process of making new discoveries, and then hold yourselves to the highest standards for the work that you produce.  This is what will open doors to future employment.

The email below from one of my colleagues serves as an example of the reputation that you should seek to build over time.  I hope that this kind of feedback motivates you to persevere through the next phase of your work.  I certainly look forward to all the new insights that we will gain.

To conclude, I congratulate you for your contributions to this work.  I know that some of you were frustrated having to rebuild experiments, repeat measurements, and rewrite the manuscript an endless number of times as we found errors in the calculations and problems with the apparatus.  But in the end, I am proud of the final product, which I believe will be of use to others.

Happy New Year!

 

Best,

Mark G. Kuzyk

Regents Professor of Physics

Washington State University

Pullman, WA 99164-2814

 

Phone: 509-335-4672

Fax: 509-335-7816

 

Web Page: www.NLOsource.com

 

From: [Colleague]
Sent: Wednesday, December 29, 2021 9:11 AM
To: Kuzyk, Mark G <kuz@wsu.edu>; Mark G. Kuzyk <mgk.wsu@gmail.com>
Subject: Fwd: [Applied Sciences] Manuscript ID: applsci-1500266; doi: 10.3390/app12010315. Paper has been published.

Dear Mark,

I forwarded this new paper to my group members. You never cease to amaze me with the thoroughness and rigor of your research. What an amazing piece this last report is! We have a lot to learn from you, indeed.

I will enjoy reading the paper. I hope we can meet up at some point to continue our discussions. [My senior student] will be graduating in January and he wants to pursue a career in the corporate world. I have another student who is bright and enthusiastic, and we can consider sending him to your lab if things with the pandemic get better.

Take care of your health,

[Colleague]

---------- Forwarded message ---------
From: Applied Sciences Editorial Office <applsci@mdpi.com>
Date: Wed, Dec 29, 2021 at 5:02 PM
Subject: [Applied Sciences] Manuscript ID: applsci-1500266; doi: 10.3390/app12010315. Paper has been published.
To: Colleague
Cc: Applied Sciences Editorial Office <applsci@mdpi.com>, Keira Wang <keira.wang@mdpi.com>

Dear [Professor],


We are pleased to inform you that "Photothermal and Reorientational
Contributions to the Photomechanical Response of DR1 Azo Dye-Doped PMMA
Fibers" by Zoya Ghorbanishiadeh, Bojun Zhou, Morteza Sheibani Karkhaneh,
Rebecca Oehler, Mark G. Kuzyk * has been published in Applied Sciences as
part of the Special Issue Composite and Smart Materials: Theory, Methods and
Applications and is available online:

Abstract: https://www.mdpi.com/2076-3417/12/1/315
HTML Version: https://www.mdpi.com/2076-3417/12/1/315/htm
PDF Version: https://www.mdpi.com/2076-3417/12/1/315/pdf
Special Issue: https://www.mdpi.com/journal/applsci/special_issues/composite_smart_materials


Wednesday, December 28, 2011

Annual Review of Faculty

I will once again be doing annual reviews of our faculty. Here I share my views on the process and describe the metrics I use to make evaluations.

There are three general categories: (1) teaching, (2) research, and (3) service to the university and the profession. The annual review is meant to give faculty feedback about their performance for a particular year, and it is one factor in determining raises. However, given budget cuts, there will be no raises in the foreseeable future.

While a faculty member may have a long and distinguished career, it is possible to have up-and-down years for the simple reason that the metrics fluctuate; one year a faculty member may have 10 excellent papers, followed by a year with none. However, the annual reviews will not fluctuate as much as the metrics, given that such fluctuations are inevitable and usually not a sign of a problem.

Teaching is evaluated based on several pieces of information. The syllabus shows how the course is organized, and faculty usually provide a narrative on new ideas that they explored to enhance learning outcomes, as well as student feedback in the form of course evaluations.

I am strictly opposed to standardized teaching evaluation forms because they often lead to meaningless comparisons between professors in the form of numbers that each student assigns using very different criteria. I marvel at the well-reasoned forms that each of our professors produces and the unique information they solicit. Since each faculty member may be using different approaches, a single number assigned by a student does not tell a meaningful story. Thus, I usually read every comment made by each student - even in classes with 250 students. It takes lots of time, but it provides quality information with which I can get a better sense of how well our courses are working.

As a case in point, several years ago I scanned through a bunch of evaluations of a young professor who was teaching a large section with hundreds of students and noticed two things. The mean was slightly below average (i.e., the typical student chose average or below average when comparing this professor to others at our university), and the distribution was bimodal, with a quarter of the students giving him the highest marks and three quarters giving him below-average ratings.

I gained great insights from the correlations between the numbers and the narratives. The typical student who gave a low ranking said the course was "unfair" because it was too hard. The students giving him higher marks stated that the course was challenging but the professor spent lots of time explaining complex concepts to the students during class as well as during office hours. In addition, this professor scheduled non-mandatory help sessions so that students could ask questions and get extra help.

A picture emerged of a dedicated teacher who expected excellence but was willing to expend a great deal of additional effort to make sure that the students learned. While his average course evaluation was lower than many of the other faculty in our department, I judged him to be a more dedicated teacher than those whose evaluations were high and the comments uniformly positive about the fact that the class was "fun" and "easy."
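The statistical point of this anecdote - that a single mean can conceal a bimodal distribution - can be sketched with hypothetical numbers (the 1-to-5 scale and the 200-student class size are my own illustration, not the actual data):

```python
from collections import Counter

# Hypothetical ratings on a 1-to-5 scale (3 = "average compared to other
# professors"): of 200 students, a quarter give the top mark and three
# quarters rate the professor below average.
ratings = [5] * 50 + [2] * 150

mean = sum(ratings) / len(ratings)
counts = Counter(ratings)

print(mean)    # 2.75 - "slightly below average"
print(counts)  # the frequency table reveals the bimodal shape the mean hides
```

The mean alone suggests a mediocre teacher; only the full distribution, read alongside the written comments, tells the real story.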

Research is more difficult to judge. The fact that a funding source is willing to pay top dollar to a given faculty member is a good indication that this individual and his/her work is considered useful or interesting. Similarly, papers that make it through the peer review process at good journals have convinced an editor and a few reviewers of correctness and importance. If a piece of work gets many citations, then it shows that people are reading the papers and finding them useful. Thus, a blend of these factors provides a good indicator of research productivity and quality.

However, these indicators can have the opposite meaning. For example, a paper may be cited many times as an example of bad science. Or, an individual with lots of citations may be a technician who provides specialized samples to many research groups. While such a person is contributing to the science by providing samples, the number of citations may not be a sign that the work is particularly interesting or creative.

Lots of funding is not always a sign of good science. A company may grant big bucks to a researcher to test a trivial property of a product. In contrast, a small grant from the National Science Foundation for work by a single PI who is challenging our perceptions of the nature of space-time would carry much more weight in my mind.

I try to consider all these factors together when evaluating the research productivity of a faculty member, which includes learning a little bit about their research. I may look at the h-index and/or the number of publications, citations per dollar spent, or other ratios to develop an impression of research quality. Faculty members who attain Fellow status in a professional society earn such honors through significant lifetime contributions to their fields, so fellowship in a society factors into the annual review. In the end, I make a value judgement that may or may not be in line with what others think. I accept that the process is far from perfect.

Finally, there is service. Every faculty member is expected to contribute to the operations of the department and the university. Faculty members sit on tenure committees, thesis examination committees, search committees, and do all kinds of thankless tasks such as recruiting students, writing reports for the administration, and doing self studies that support our claims that we are a high quality department relative to our peers. I look for not only quantity of service, but the impact that it has on our department and university.

Service to the profession takes the form of membership on program committees for conferences, being on editorial boards of journals, reviewing papers for journals or proposals for funding agencies, organizing conferences, acting as an external examiner at other universities, and serving on panels that take advantage of an individual's scientific expertise. An active scientist does this kind of service as a matter of daily routine. I thus expect to see a substantial professional service component from each faculty member.

To make a final assessment, all of these factors are taken into account. A positive annual review requires significant accomplishments in at least two of the three major areas, with emphasis on research and education. What makes my job especially difficult is that our department is very strong. All faculty members are doing high-quality research, are well known in their fields, are well funded and are advising undergraduate/graduate students.

I have a few weeks break until I need to get around to this unpleasant task, which takes 5 full days to complete. For now, I am enjoying my break doing some new physics, which includes having completed an interesting calculation that was motivated by my proofing of one of Shoresh's manuscripts. Before the intensity of the new semester begins, I need to proof and submit a few more manuscripts for publication, as well as review a few more papers.

This past year has been a local maximum with a record 12 publications in refereed journals (almost 10% of my lifetime 123 publications), not to mention a bunch of invited and contributed talks. Now I will need to get back to work to maintain this momentum. But not until I take a day off to enjoy family, friends, and this wonderful place we call home.

Tuesday, November 16, 2010

Illogical Scientists

I know that I am often illogical about lots of things; it's a trait that makes us human. However, this does not excuse sloppy thinking by scientists. Scientists often complain about the innumeracy of the general population, but we are among the most egregious offenders, attaching undue meaning to numbers - such as our reliance on the h-index to quantify the performance of an individual researcher with a single value.

The h-index is a measure based on citations to a researcher's work. The Wikipedia page on the h-index gives a description of how it is computed, what it purportedly measures, and criticisms of its use. No one would disagree that there are many factors that affect the h-index, making it a highly inaccurate metric. So, why do so many people use it? I believe that it is sheer intellectual laziness. What could be simpler than using a single number for ranking a group of individuals?
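For concreteness, the computation the Wikipedia page describes works as follows: a researcher's h-index is the largest number h such that h of their papers each have at least h citations. A minimal sketch in Python (the sample citation counts are hypothetical):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    # Rank papers from most to least cited; the h-index is the last rank
    # at which the citation count still meets or exceeds the rank.
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Two very different records can share the same h-index:
print(h_index([25, 8, 5, 3, 3]))    # 3 - one highly cited paper, a few minor ones
print(h_index([4, 4, 3, 3, 0, 0]))  # 3 - uniformly modest citations
```

The example illustrates one of the standard criticisms: the single number discards the shape of the citation distribution, so a landmark paper and a string of modest ones can be indistinguishable.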

One of my beefs with the h-index is that it favors scientists who manage large research groups that publish lots of papers and produce lots of PhDs, who then get jobs doing similar work and generate more citations for their research adviser. Is it healthy to reward good managers of science for being good scientists? Perhaps.

Scientists are well aware of the importance of their own h-index on career advancement. It is all too common for reviewers to point out that some important references are missing in a manuscript. As you might guess, reviewers often push for their own publications to be cited. Placing such a high degree of importance on a single indicator has a perverse effect on the scientific enterprise.

Perhaps I am selfish, but I live to think and to produce new ideas. It makes me happy. I want to be in the trenches, scribbling equations with my mechanical pencil on smooth white sheets of paper and tinkering in the lab with neat equipment that I have designed and built with my own hands - all in the company of students who are excitedly doing the same. I understand that there needs to be a balance between managing a lab and doing the work. A PI who spends all the time working in the lab is not passing along his or her expertise to others. It is also inefficient to be involved with all the mundane activities that go along with doing research. More ideas can be investigated when the students have a large degree of autonomy, so while difficult, I try to keep a healthy distance.

On the other extreme is the professor who writes lots of huge grant proposals and uses his or her mega funding to hire an army of postdocs who generate a large chunk of the ideas, advise the graduate students, and implement the work. Such an individual is doing a service to society by concentrating a critical mass of intellectual capital to solve important problems. It is an efficient division of labor where grantsmanship brings in the funds and the most capable people do the work.

I want to make it clear that I am not criticizing the big research groups. They provide an important service to the scientific community by producing the next generation of scientists and adding substantially to the knowledge base. Given that some of my work is amenable to large-scale science, I often question my decision to limit the size of my research group. Am I letting down my employer by bringing in a few hundred thousand dollars of funding per year rather than millions? My decision to operate on a smaller scale is based on my conviction that my science produces value to both the university and to society. Arguably, my group produces more publications and more citations per research dollar than many others. Imagine if a researcher's productivity were measured in citations per dollar of funding.

If truth be told, I would be dissatisfied working in a field with hundreds of researchers each working on a tiny piece of a puzzle. Rather, I derive satisfaction from thinking about things from a unique and broad perspective. As a result, I have far fewer colleagues doing similar work, and my students will have fewer job opportunities - unless our work leads to a breakthrough. But I justify our work by my conviction that it has the potential to make a large impact in the long term.

However, I am concerned that the emphasis on large research groups garnering huge research grants will squeeze out the smaller groups, where breakthroughs in new thinking are usually generated. Given the present-day economic challenges, researchers with larger grants will undoubtedly be held in even higher regard by their cash-strapped administration.

A second perversion of the scientific enterprise is recognition for simply publishing in a prestigious journal. The quality of a paper should be judged on its own merits, not by its venue. The vicious cycle is reinforced by "me too" papers that cite high-visibility journals to imply that the author's work is in an important research area.

Consider the work of Victor Veselago, who in 1967 investigated the properties of materials with a negative refractive index. His work got little attention (and few citations for decades) until the recent explosion of research on metamaterials, results of which are routinely published in Nature and in Science with sleek color graphics and exciting titles. Behind the present-day superstars are lesser-known brethren such as Veselago, who lay the foundations for future revolutions.

Recently, the program director of an agency that funds my work sent me the following email,

I am collecting information to prepare for my upcoming internal review and I am collecting interesting accomplishments for highlights. This request does not substitute the formal annual progress report specified in the grant document that is due at your grant anniversary date of each year. Please provide a (no more than) one page summary of the significant accomplishments and publications (such as Science or Nature publications) in your program...

Clearly, this program director deems it necessary to justify to his superiors his funding choices not necessarily based on the quality of the work, but on where the work is published. What I find more distressing is that scientists in increasing numbers believe that a publication in these high profile journals carries greater importance.

Many years ago, one of my colleagues suggested that I package my work in a way that would make it publishable in Science or Nature. I did not choose science as a career to focus on marketing. Rather, I am motivated on a daily basis by the promise of understanding something new about nature and sharing that understanding with others who are similarly driven. Nothing beats the eager faces in the classroom, arguing with me about my rendition of nature's hidden treasures; nor the excited chatter of my graduate students as they report new results or insights. Even day-to-day difficulties bring to light the pleasures of seeking the truth. At the end of the day, the negative aspects of the scientific culture fade into the background, where they belong. But on occasion, my bliss is interrupted by that unpleasant pang of realization that scientists hold positions that are not supported by reason. We should know better. Shame on us.