Research Methods and Peer Review Advice
Reading time: 3 minutes
Which level of research rigor and result maturity is required to satisfy paper reviewers? I have shared tips on how to write for reviewability and how to review before. Here are some more.
Background and Motivation
I have been reviewing quite a bit for journals and magazines[^1] as well as conferences and workshops[^2] for more than ten years now. Like many of my peers, I get frustrated by the quality of the submissions (and some of the reviews, too) more often than necessary.
Quite a few highly valuable resources exist that teach (or remind) us what makes good research and good papers. I compile some of my favorites below.
Producing Results: Research Methods
For advice on applied/empirical research projects and academic writing, start here:
- How to do research by S. Miksch from TU Vienna is a rich collection of pointers.
- In her ICSE 2003 mini tutorial “Writing Good Software Engineering Research Papers”, M. Shaw explains which validation activities to plan and execute, depending on the type of research (among other topics).
- This presentation by I. Malavolta has a similar theme in the context of software architecture and lists several empirical strategies such as controlled experiments, case studies, surveys, and action research.[^3]
- The Design Science Methodology (DSM) and supporting tutorial presentations by Roel Wieringa give even deeper advice. Try his problem framing template and the question taxonomy.
- A Technical Report from the SEI has information on how to conduct surveys.
- H. Erdogmus, a former editor-in-chief of IEEE Software, advises how to write high-quality papers (for IEEE Software, but his advice is also applicable to other venues).
- Finally, a page by J. W. Chinneck teaches you how to organize a thesis.
My blog post calling for submissions to IEEE Software Insights has tips and tricks for preparing industry experience reports.
Consuming Results: Peer Reviews
In her January/February 2021 editorial “Protecting the Health and Longevity of the Peer-Review Process”, Ipek Ozkaya, the current editor-in-chief of IEEE Software, reminded us that reviewing conference and journal submissions is important (and challenging) volunteer work. She also proposed a “Reviewers’ Oath”, committing reviewers to deliver reviews that are concrete, actionable, relevant, and timely; she further suggested that those who submit papers should also review, and recommend aspiring members of their networks, to grow and sustain the community.
Here are some more pointers and advice:
- Follow the recommendations from the ICSA 2021 organizers to make your reviews helpful both for the PC and for the authors.
- Resist the temptation to point the authors to your own publications directly or exclusively. Some publishers, journals, and conference PC chairs ban this explicitly; others leave this decision to the reviewers. Personally, I consider blunt self-citations in reviews unethical in most cases. A generalization is more adequate: “a large body of work on the topics of [keyword list] has been published at [conferences, journals] during [time range]. Please indicate how your work differs from these existing works.” You can select the values for the three [placeholders] to be specific enough that some of your papers will be in the result set, but also those of other researchers in the named areas. 😉
- Adhere to the FAIR principles for research data (and software).
- This post by David Shepherd explains how to review industry experience reports.
Looking Back and Forth
You can find the shorthand markup notation that I use when reviewing (and taking meeting minutes) in another post on this blog. General technical writing advice is also available.
Let’s get going with our next paper now, and come back to these hints while writing — and when the reviews come in. 😃
– Olaf (a.k.a. socadk)
[^1]: Such as IEEE Software, Transactions on Software Engineering (TSE), Transactions on Services Computing (TSC), IEEE Computer Society; Transactions on Software Engineering and Methodology (ToSEM), ACM Computing Surveys; Journal of Information and Software Technology (IST), Journal of Systems and Software (JSS), ICT Express, Elsevier; Journal of Software Evolution and Process, Journal of Universal Computer Science, Springer Computing.
[^2]: ICSA 2020, 2019, 2018 (Technical Track, Engineering/Practice Track), ICSA 2017 (NEMI, Tools), ECSA 2020 (PC member, industry co-chair), 2019, 2018, 2017, 2016, 2015, 2014 (industry co-chair), ETHICOMP 2020, Modelsward 2020, ICSE SEET 2018, Technical Debt (2019, 2018), CBSE 2016 (program co-chair), QoSA (2015), WICSA (since 2009), ICSOC (since 2018), ESOCC/ECOWS conference series (since 2008), WS-REST 2018, Microservices (since 2017; program co-chair 2020); SEI SATURN 2017, 2016, 2015, 2014, 2013 (co-chair), 2012, 2011; SummerSoC (since 2015); OOP 2012 to 2016, 2019, 2020; FSE/ESEC 2013.
[^3]: Not all papers need formal proofs or statistical evidence, but reviewers certainly want to be convinced that the presented work works (in the lab and, even more importantly for applied research, in practice).