
Experimentation in Software Engineering, pp. 45–54

Systematic Literature Reviews

  • Claes Wohlin,
  • Per Runeson,
  • Martin Höst,
  • Magnus C. Ohlsson,
  • Björn Regnell &
  • Anders Wesslén

First Online: 01 January 2012


Systematic literature reviews are conducted to “identify, analyse and interpret all available evidence related to a specific research question” [96]. Since such a review aims to give a complete, comprehensive and valid picture of the existing evidence, the identification, analysis and interpretation must all be conducted in a scientific and rigorous way. To achieve this goal, Kitchenham and Charters have adapted guidelines for systematic literature reviews, primarily from medicine, evaluated them [24] and updated them accordingly [96]. These guidelines, structured according to a three-step process for planning, conducting and reporting the review, are summarized below.
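The identification step typically means querying several digital libraries and merging the hits before study selection begins. The sketch below is purely illustrative (it is not part of the chapter or of the Kitchenham and Charters guidelines) and assumes a simple record format with 'title', 'doi' and 'source' fields; it removes duplicate candidate studies returned by more than one source.

```python
import re

def normalize_title(title):
    """Lowercase and collapse punctuation/whitespace so near-identical titles match."""
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

def deduplicate(records):
    """Merge candidate primary studies retrieved from several digital libraries.

    Each record is assumed to be a dict with a 'title' and, optionally, a
    'doi' and a 'source'. A study is kept once, keyed by DOI when present
    and by normalized title otherwise.
    """
    seen, unique = set(), []
    for rec in records:
        key = rec.get("doi") or normalize_title(rec["title"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

hits = [
    {"title": "Systematic literature reviews in software engineering",
     "doi": "10.1016/j.infsof.2008.09.009", "source": "ScienceDirect"},
    {"title": "Systematic Literature Reviews in Software Engineering.",
     "doi": "10.1016/j.infsof.2008.09.009", "source": "Scopus"},
    {"title": "Strength of evidence in systematic reviews in software engineering",
     "source": "ACM Digital Library"},
]
print(len(deduplicate(hits)))  # 2 unique candidate studies
```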

  • Software Engineering
  • Primary Study
  • Systematic Literature Review
  • Data Extraction Form
  • Narrative Synthesis



Anastas, J.W., MacDonald, M.L.: Research Design for the Social Work and the Human Services, 2nd edn. Columbia University Press, New York (2000)


Andersson, C., Runeson, P.: A spiral process model for case studies on software quality monitoring – method and metrics. Softw. Process: Improv. Pract. 12 (2), 125–140 (2007). doi:  10.1002/spip.311

Andrews, A.A., Pradhan, A.S.: Ethical issues in empirical software engineering: the limits of policy. Empir. Softw. Eng. 6 (2), 105–110 (2001)

American Psychological Association: Ethical principles of psychologists and code of conduct. Am. Psychol. 47 , 1597–1611 (1992)

Avison, D., Baskerville, R., Myers, M.: Controlling action research projects. Inf. Technol. People 14 (1), 28–45 (2001). doi:  10.1108/09593840110384762 http://www.emeraldinsight.com/10.1108/09593840110384762

Babbie, E.R.: Survey Research Methods. Wadsworth, Belmont (1990)

Basili, V.R.: Quantitative evaluation of software engineering methodology. In: Proceedings of the First Pan Pacific Computer Conference, vol. 1, pp. 379–398. Australian Computer Society, Melbourne (1985)

Basili, V.R.: Software development: a paradigm for the future. In: Proceedings of the 13th Annual International Computer Software and Applications Conference, COMPSAC’89, Orlando, pp. 471–485. IEEE Computer Society Press, Washington (1989)

Basili, V.R.: The experimental paradigm in software engineering. In: H.D. Rombach, V.R. Basili, R.W. Selby (eds.) Experimental Software Engineering Issues: Critical Assessment and Future Directives. Lecture Notes in Computer Science, vol. 706. Springer, Berlin Heidelberg (1993)

Basili, V.R.: Evolving and packaging reading technologies. J. Syst. Softw. 38 (1), 3–12 (1997)

Basili, V.R., Weiss, D.M.: A methodology for collecting valid software engineering data. IEEE Trans. Softw. Eng. 10 (6), 728–737 (1984)

Basili, V.R., Selby, R.W.: Comparing the effectiveness of software testing strategies. IEEE Trans. Softw. Eng. 13 (12), 1278–1298 (1987)

Basili, V.R., Rombach, H.D.: The TAME project: towards improvement-oriented software environments. IEEE Trans. Softw. Eng. 14 (6), 758–773 (1988)

Basili, V.R., Green, S.: Software process evaluation at the SEL. IEEE Softw. 11 (4), pp. 58–66 (1994)

Basili, V.R., Selby, R.W., Hutchens, D.H.: Experimentation in software engineering. IEEE Trans. Softw. Eng. 12 (7), 733–743 (1986)

Basili, V.R., Caldiera, G., Rombach, H.D.: Experience factory. In: J.J. Marciniak (ed.) Encyclopedia of Software Engineering, pp. 469–476. Wiley, New York (1994)

Basili, V.R., Caldiera, G., Rombach, H.D.: Goal Question Metrics paradigm. In: J.J. Marciniak (ed.) Encyclopedia of Software Engineering, pp. 528–532. Wiley (1994)

Basili, V.R., Green, S., Laitenberger, O., Lanubile, F., Shull, F., Sørumgård, S., Zelkowitz, M.V.: The empirical investigation of perspective-based reading. Empir. Soft. Eng. 1 (2), 133–164 (1996)

Basili, V.R., Green, S., Laitenberger, O., Lanubile, F., Shull, F., Sørumgård, S., Zelkowitz, M.V.: Lab package for the empirical investigation of perspective-based reading. Technical report, University of Maryland (1998). http://www.cs.umd.edu/projects/SoftEng/ESEG/manual/pbr_package/manual.html

Basili, V.R., Shull, F., Lanubile, F.: Building knowledge through families of experiments. IEEE Trans. Softw. Eng. 25 (4), 456–473 (1999)

Baskerville, R.L., Wood-Harper, A.T.: A critical perspective on action research as a method for information systems research. J. Inf. Technol. 11 (3), 235–246 (1996). doi:  10.1080/026839696345289

Benbasat, I., Goldstein, D.K., Mead, M.: The case research strategy in studies of information systems. MIS Q. 11 (3), 369 (1987). doi: 10.2307/248684

Bergman, B., Klefsjö, B.: Quality from Customer Needs to Customer Satisfaction. Studentlitteratur, Lund (2010)

Brereton, P., Kitchenham, B.A., Budgen, D., Turner, M., Khalil, M.: Lessons from applying the systematic literature review process within the software engineering domain. J. Syst. Softw. 80 (4), 571–583 (2007). doi: 10.1016/j.jss.2006.07.009

Brereton, P., Kitchenham, B.A., Budgen, D.: Using a protocol template for case study planning. In: Proceedings of the 12th International Conference on Evaluation and Assessment in Software Engineering. University of Bari, Italy (2008)

Briand, L.C., Differding, C.M., Rombach, H.D.: Practical guidelines for measurement-based process improvement. Softw. Process: Improv. Pract. 2 (4), 253–280 (1996)

Briand, L.C., El Emam, K., Morasca, S.: On the application of measurement theory in software engineering. Empir. Softw. Eng. 1 (1), 61–88 (1996)

Briand, L.C., Bunse, C., Daly, J.W.: A controlled experiment for evaluating quality guidelines on the maintainability of object-oriented designs. IEEE Trans. Softw. Eng. 27 (6), 513–530 (2001)

British Psychological Society: Ethical principles for conducting research with human participants. Psychologist 6 (1), 33–35 (1993)

Budgen, D., Kitchenham, B.A., Charters, S., Turner, M., Brereton, P., Linkman, S.: Presenting software engineering results using structured abstracts: a randomised experiment. Empir. Softw. Eng. 13 , 435–468 (2008). doi: 10.1007/s10664-008-9075-7

Budgen, D., Burn, A.J., Kitchenham, B.A.: Reporting computing projects through structured abstracts: a quasi-experiment. Empir. Softw. Eng. 16 (2), 244–277 (2011). doi: 10.1007/s10664-010-9139-3

Campbell, D.T., Stanley, J.C.: Experimental and Quasi-experimental Designs for Research. Houghton Mifflin Company, Boston (1963)

Chrissis, M.B., Konrad, M., Shrum, S.: CMMI(R): Guidelines for process integration and product improvement. Technical report, SEI (2003)

Ciolkowski, M., Differding, C.M., Laitenberger, O., Münch, J.: Empirical investigation of perspective-based reading: A replicated experiment. Technical report, 97-13, ISERN (1997)

Coad, P., Yourdon, E.: Object-Oriented Design, 1st edn. Prentice-Hall, Englewood (1991)

Cohen, J.: Weighted kappa: nominal scale agreement with provision for scaled disagreement or partial credit. Psychol. Bull. 70 , 213–220 (1968)

Cook, T.D., Campbell, D.T.: Quasi-experimentation – Design and Analysis Issues for Field Settings. Houghton Mifflin Company, Boston (1979)

Corbin, J., Strauss, A.: Basics of Qualitative Research, 3rd edn. SAGE, Los Angeles (2008)

Cruzes, D.S., Dybå, T.: Research synthesis in software engineering: a tertiary study. Inf. Softw. Technol. 53 (5), 440–455 (2011). doi: 10.1016/j.infsof.2011.01.004

Dalkey, N., Helmer, O.: An experimental application of the delphi method to the use of experts. Manag. Sci. 9 (3), 458–467 (1963)

DeMarco, T.: Controlling Software Projects. Yourdon Press, New York (1982)

Deming, W.E.: Out of the Crisis. MIT Center for Advanced Engineering Study, MIT Press, Cambridge, MA (1986)

Dieste, O., Grimán, A., Juristo, N.: Developing search strategies for detecting relevant experiments. Empir. Softw. Eng. 14 , 513–539 (2009). http://dx.doi.org/10.1007/s10664-008-9091-7

Dittrich, Y., Rönkkö, K., Eriksson, J., Hansson, C., Lindeberg, O.: Cooperative method development. Empir. Softw. Eng. 13 (3), 231–260 (2007). doi: 10.1007/s10664-007-9057-1

Doolan, E.P.: Experiences with Fagan’s inspection method. Softw. Pract. Exp. 22 (2), 173–182 (1992)

Dybå, T., Dingsøyr, T.: Empirical studies of agile software development: a systematic review. Inf. Softw. Technol. 50 (9–10), 833–859 (2008). doi: 10.1016/j.infsof.2008.01.006

Dybå, T., Dingsøyr, T.: Strength of evidence in systematic reviews in software engineering. In: Proceedings of the 2nd ACM-IEEE International Symposium on Empirical Software Engineering and Measurement, ESEM ’08, Kaiserslautern, pp. 178–187. ACM, New York (2008). doi:  http://doi.acm.org/10.1145/1414004.1414034

Dybå, T., Kitchenham, B.A., Jørgensen, M.: Evidence-based software engineering for practitioners. IEEE Softw. 22 , 58–65 (2005). doi:  http://doi.ieeecomputersociety.org/10.1109/MS.2005.6

Dybå, T., Kampenes, V.B., Sjøberg, D.I.K.: A systematic review of statistical power in software engineering experiments. Inf. Softw. Technol. 48 (8), 745–755 (2006). doi: 10.1016/j.infsof.2005.08.009

Easterbrook, S., Singer, J., Storey, M.-A., Damian, D.: Selecting empirical methods for software engineering research. In: F. Shull, J. Singer, D.I. Sjøberg (eds.) Guide to Advanced Empirical Software Engineering. Springer, London (2008)

Eick, S.G., Loader, C.R., Long, M.D., Votta, L.G., Vander Wiel, S.A.: Estimating software fault content before coding. In: Proceedings of the 14th International Conference on Software Engineering, Melbourne, pp. 59–65. ACM Press, New York (1992)

Eisenhardt, K.M.: Building theories from case study research. Acad. Manag. Rev. 14 (4), 532 (1989). doi: 10.2307/258557

Endres, A., Rombach, H.D.: A Handbook of Software and Systems Engineering – Empirical Observations, Laws and Theories. Pearson Addison-Wesley, Harlow/New York (2003)

Fagan, M.E.: Design and code inspections to reduce errors in program development. IBM Syst. J. 15 (3), 182–211 (1976)

Fenton, N.: Software measurement: A necessary scientific basis. IEEE Trans. Softw. Eng. 20 (3), 199–206 (1994)

Fenton, N., Pfleeger, S.L.: Software Metrics: A Rigorous and Practical Approach, 2nd edn. International Thomson Computer Press, London (1996)

Fenton, N., Pfleeger, S.L., Glass, R.: Science and substance: A challenge to software engineers. IEEE Softw. 11 , 86–95 (1994)

Fink, A.: The Survey Handbook, 2nd edn. SAGE, Thousand Oaks/London (2003)

Flyvbjerg, B.: Five misunderstandings about case-study research. In: Qualitative Research Practice, concise paperback edn., pp. 390–404. SAGE, London (2007)

Frigge, M., Hoaglin, D.C., Iglewicz, B.: Some implementations of the boxplot. Am. Stat. 43 (1), 50–54 (1989)

Fusaro, P., Lanubile, F., Visaggio, G.: A replicated experiment to assess requirements inspection techniques. Empir. Softw. Eng. 2 (1), 39–57 (1997)

Glass, R.L.: The software research crisis. IEEE Softw. 11 , 42–47 (1994)

Glass, R.L., Vessey, I., Ramesh, V.: Research in software engineering: An analysis of the literature. Inf. Softw. Technol. 44 (8), 491–506 (2002). doi: 10.1016/S0950-5849(02)00049-6

Gómez, O.S., Juristo, N., Vegas, S.: Replication types in experimental disciplines. In: Proceedings of the 4th ACM-IEEE International Symposium on Empirical Software Engineering and Measurement, Bolzano-Bozen (2010)

Gorschek, T., Wohlin, C.: Requirements abstraction model. Requir. Eng. 11 , 79–101 (2006). doi: 10.1007/s00766-005-0020-7

Gorschek, T., Garre, P., Larsson, S., Wohlin, C.: A model for technology transfer in practice. IEEE Softw. 23 (6), 88–95 (2006)

Gorschek, T., Garre, P., Larsson, S., Wohlin, C.: Industry evaluation of the requirements abstraction model. Requir. Eng. 12 , 163–190 (2007). doi: 10.1007/s00766-007-0047-z

Grady, R.B., Caswell, D.L.: Software Metrics: Establishing a Company-Wide Program. Prentice-Hall, Englewood (1994)

Grant, E.E., Sackman, H.: An exploratory investigation of programmer performance under on-line and off-line conditions. IEEE Trans. Human Factor Electron. HFE-8 (1), 33–48 (1967)

Gregor, S.: The nature of theory in information systems. MIS Q. 30 (3), 491–506 (2006)

Hall, T., Flynn, V.: Ethical issues in software engineering research: a survey of current practice. Empir. Softw. Eng. 6 , 305–317 (2001)

Hannay, J.E., Sjøberg, D.I.K., Dybå, T.: A systematic review of theory use in software engineering experiments. IEEE Trans. Softw. Eng. 33 (2), 87–107 (2007). doi: 10.1109/TSE.2007.12

Hannay, J.E., Dybå, T., Arisholm, E., Sjøberg, D.I.K.: The effectiveness of pair programming: a meta-analysis. Inf. Softw. Technol. 51 (7), 1110–1122 (2009). doi: 10.1016/j.infsof.2009.02.001

Hayes, W.: Research synthesis in software engineering: a case for meta-analysis. In: Proceedings of the 6th International Software Metrics Symposium, Boca Raton, pp. 143–151 (1999)

Hetzel, B.: Making Software Measurement Work: Building an Effective Measurement Program. Wiley, New York (1993)

Hevner, A.R., March, S.T., Park, J., Ram, S.: Design science in information systems research. MIS Q. 28 (1), 75–105 (2004)

Höst, M., Regnell, B., Wohlin, C.: Using students as subjects – a comparative study of students and professionals in lead-time impact assessment. Empir. Softw. Eng. 5 (3), 201–214 (2000)

Höst, M., Wohlin, C., Thelin, T.: Experimental context classification: Incentives and experience of subjects. In: Proceedings of the 27th International Conference on Software Engineering, St. Louis, pp. 470–478 (2005)

Höst, M., Runeson, P.: Checklists for software engineering case study research. In: Proceedings of the 1st International Symposium on Empirical Software Engineering and Measurement, Madrid, pp. 479–481 (2007)

Hove, S.E., Anda, B.: Experiences from conducting semi-structured interviews in empirical software engineering research. In: Proceedings of the 11th IEEE International Software Metrics Symposium, pp. 1–10. IEEE Computer Society Press, Los Alamitos (2005)

Humphrey, W.S.: Managing the Software Process. Addison-Wesley, Reading (1989)

Humphrey, W.S.: A Discipline for Software Engineering. Addison Wesley, Reading (1995)

Humphrey, W.S.: Introduction to the Personal Software Process. Addison Wesley, Reading (1997)

IEEE: IEEE standard glossary of software engineering terminology. Technical Report, IEEE Std 610.12-1990, IEEE (1990)

Iversen, J.H., Mathiassen, L., Nielsen, P.A.: Managing risk in software process improvement: an action research approach. MIS Q. 28 (3), 395–433 (2004)

Jedlitschka, A., Pfahl, D.: Reporting guidelines for controlled experiments in software engineering. In: Proceedings of the 4th International Symposium on Empirical Software Engineering, Noosa Heads, pp. 95–104 (2005)

Johnson, P.M., Tjahjono, D.: Does every inspection really need a meeting? Empir. Softw. Eng. 3 (1), 9–35 (1998)

Juristo, N., Moreno, A.M.: Basics of Software Engineering Experimentation. Springer, Kluwer Academic Publishers, Boston (2001)

Juristo, N., Vegas, S.: The role of non-exact replications in software engineering experiments. Empir. Softw. Eng. 16 , 295–324 (2011). doi: 10.1007/s10664-010-9141-9

Kachigan, S.K.: Statistical Analysis: An Interdisciplinary Introduction to Univariate and Multivariate Methods. Radius Press, New York (1986)

Kachigan, S.K.: Multivariate Statistical Analysis: A Conceptual Introduction, 2nd edn. Radius Press, New York (1991)

Kampenes, V.B., Dybå, T., Hannay, J.E., Sjøberg, D.I.K.: A systematic review of effect size in software engineering experiments. Inf. Softw. Technol. 49 (11–12), 1073–1086 (2007). doi: 10.1016/j.infsof.2007.02.015

Karahasanović, A., Anda, B., Arisholm, E., Hove, S.E., Jørgensen, M., Sjøberg, D., Welland, R.: Collecting feedback during software engineering experiments. Empir. Softw. Eng. 10 (2), 113–147 (2005). doi: 10.1007/s10664-004-6189-4. http://www.springerlink.com/index/10.1007/s10664-004-6189-4

Karlström, D., Runeson, P., Wohlin, C.: Aggregating viewpoints for strategic software process improvement. IEE Proc. Softw. 149 (5), 143–152 (2002). doi: 10.1049/ip-sen:20020696

Kitchenham, B.A.: The role of replications in empirical software engineering – a word of warning. Empir. Softw. Eng. 13 , 219–221 (2008). doi: 10.1007/s10664-008-9061-0

Kitchenham, B.A., Charters, S.: Guidelines for performing systematic literature reviews in software engineering (version 2.3). Technical Report, EBSE Technical Report EBSE-2007-01, Keele University and Durham University (2007)

Kitchenham, B.A., Pickard, L.M., Pfleeger, S.L.: Case studies for method and tool evaluation. IEEE Softw. 12 (4), 52–62 (1995)

Kitchenham, B.A., Pfleeger, S.L., Pickard, L.M., Jones, P.W., Hoaglin, D.C., El Emam, K., Rosenberg, J.: Preliminary guidelines for empirical research in software engineering. IEEE Trans. Softw. Eng. 28 (8), 721–734 (2002). doi: 10.1109/TSE.2002.1027796. http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=1027796

Kitchenham, B., Fry, J., Linkman, S.G.: The case against cross-over designs in software engineering. In: Proceedings of the 11th International Workshop on Software Technology and Engineering Practice, Amsterdam, pp. 65–67. IEEE Computer Society, Los Alamitos (2003)

Kitchenham, B.A., Dybå, T., Jørgensen, M.: Evidence-based software engineering. In: Proceedings of the 26th International Conference on Software Engineering, Edinburgh, pp. 273–281 (2004)

Kitchenham, B.A., Al-Khilidar, H., Babar, M.A., Berry, M., Cox, K., Keung, J., Kurniawati, F., Staples, M., Zhang, H., Zhu, L.: Evaluating guidelines for reporting empirical software engineering studies. Empir. Softw. Eng. 13 (1), 97–121 (2007). doi: 10.1007/s10664-007-9053-5. http://www.springerlink.com/index/10.1007/s10664-007-9053-5

Kitchenham, B.A., Jeffery, D.R., Connaughton, C.: Misleading metrics and unsound analyses. IEEE Softw. 24 , 73–78 (2007). doi: 10.1109/MS.2007.49

Kitchenham, B.A., Brereton, P., Budgen, D., Turner, M., Bailey, J., Linkman, S.G.: Systematic literature reviews in software engineering – a systematic literature review. Inf. Softw. Technol. 51 (1), 7–15 (2009). doi: 10.1016/j.infsof.2008.09.009. http://www.dx.doi.org/10.1016/j.infsof.2008.09.009

Kitchenham, B.A., Pretorius, R., Budgen, D., Brereton, P., Turner, M., Niazi, M., Linkman, S.: Systematic literature reviews in software engineering – a tertiary study. Inf. Softw. Technol.  52 (8), 792–805 (2010). doi: 10.1016/j.infsof.2010.03.006

Kitchenham, B.A., Sjøberg, D.I.K., Brereton, P., Budgen, D., Dybå, T., Höst, M., Pfahl, D., Runeson, P.: Can we evaluate the quality of software engineering experiments? In: Proceedings of the 4th ACM-IEEE International Symposium on Empirical Software Engineering and Measurement. ACM, Bolzano/Bozen (2010)

Kitchenham, B.A., Budgen, D., Brereton, P.: Using mapping studies as the basis for further research – a participant-observer case study. Inf. Softw. Technol. 53 (6), 638–651 (2011). doi: 10.1016/j.infsof.2010.12.011

Laitenberger, O., Atkinson, C., Schlich, M., El Emam, K.: An experimental comparison of reading techniques for defect detection in UML design documents. J. Syst. Softw. 53 (2), 183–204 (2000)

Larsson, R.: Case survey methodology: quantitative analysis of patterns across case studies. Acad. Manag. J. 36 (6), 1515–1546 (1993)

Lee, A.S.: A scientific methodology for MIS case studies. MIS Q. 13 (1), 33 (1989). doi: 10.2307/248698. http://www.jstor.org/stable/248698?origin=crossref

Lehman, M.M.: Program, life-cycles and the laws of software evolution. Proc. IEEE 68 (9), 1060–1076 (1980)

Lethbridge, T.C., Sim, S.E., Singer, J.: Studying software engineers: data collection techniques for software field studies. Empir. Softw. Eng. 10 , 311–341 (2005)

Linger, R.: Cleanroom process model. IEEE Softw. pp. 50–58 (1994)

Linkman, S., Rombach, H.D.: Experimentation as a vehicle for software technology transfer – a family of software reading techniques. Inf. Softw. Technol. 39 (11), 777–780 (1997)

Lucas, W.A.: The case survey method: aggregating case experience. Technical Report, R-1515-RC, The RAND Corporation, Santa Monica (1974)

Lucas, H.C., Kaplan, R.B.: A structured programming experiment. Comput. J. 19 (2), 136–138 (1976)

Lyu, M.R. (ed.): Handbook of Software Reliability Engineering. McGraw-Hill, New York (1996)

Maldonado, J.C., Carver, J., Shull, F., Fabbri, S., Dória, E., Martimiano, L., Mendonça, M., Basili, V.: Perspective-based reading: a replicated experiment focused on individual reviewer effectiveness. Empir. Softw. Eng. 11 , 119–142 (2006). doi:  10.1007/s10664-006-5967-6

Manly, B.F.J.: Multivariate Statistical Methods: A Primer, 2nd edn. Chapman and Hall, London (1994)

Marascuilo, L.A., Serlin, R.C.: Statistical Methods for the Social and Behavioral Sciences. W. H. Freeman and Company, New York (1988)

Miller, J.: Estimating the number of remaining defects after inspection. Softw. Test. Verif. Reliab. 9 (4), 167–189 (1999)

Miller, J.: Applying meta-analytical procedures to software engineering experiments. J. Syst. Softw. 54 (1), 29–39 (2000)

Miller, J.: Statistical significance testing: a panacea for software technology experiments? J. Syst. Softw. 73 , 183–192 (2004). doi:  http://dx.doi.org/10.1016/j.jss.2003.12.019

Miller, J.: Replicating software engineering experiments: a poisoned chalice or the holy grail. Inf. Softw. Technol. 47 (4), 233–244 (2005)

Miller, J., Wood, M., Roper, M.: Further experiences with scenarios and checklists. Empir. Softw. Eng. 3 (1), 37–64 (1998)

Montgomery, D.C.: Design and Analysis of Experiments, 5th edn. Wiley, New York (2000)

Myers, G.J.: A controlled experiment in program testing and code walkthroughs/inspections. Commun. ACM 21 , 760–768 (1978). doi:  http://doi.acm.org/10.1145/359588.359602

Noblit, G.W., Hare, R.D.: Meta-Ethnography: Synthesizing Qualitative Studies. Sage Publications, Newbury Park (1988)

Ohlsson, M.C., Wohlin, C.: A project effort estimation study. Inf. Softw. Technol. 40 (14), 831–839 (1998)

Owen, S., Brereton, P., Budgen, D.: Protocol analysis: a neglected practice. Commun. ACM 49 (2), 117–122 (2006). doi: 10.1145/1113034.1113039

Paulk, M.C., Curtis, B., Chrissis, M.B., Weber, C.V.: Capability maturity model for software. Technical Report, CMU/SEI-93-TR-24, Software Engineering Institute, Pittsburgh (1993)

Petersen, K., Feldt, R., Mujtaba, S., Mattsson, M.: Systematic mapping studies in software engineering. In: Proceedings of the 12th International Conference on Evaluation and Assessment in Software Engineering, Electronic Workshops in Computing (eWIC). BCS, University of Bari, Italy (2008)

Petersen, K., Wohlin, C.: Context in industrial software engineering research. In: Proceedings of the 3rd ACM-IEEE International Symposium on Empirical Software Engineering and Measurement, Lake Buena Vista, pp. 401–404 (2009)

Pfleeger, S.L.: Experimental design and analysis in software engineering, parts 1–5. ACM SIGSOFT Softw. Eng. Notes 19 (4), 16–20; 20 (1), 22–26; 20 (2), 14–16; 20 (3), 13–15; 20 (1994)

Pfleeger, S.L., Atlee, J.M.: Software Engineering: Theory and Practice, 4th edn. Pearson Prentice-Hall, Upper Saddle River (2009)

Pickard, L.M., Kitchenham, B.A., Jones, P.W.: Combining empirical results in software engineering. Inf. Softw. Technol. 40 (14), 811–821 (1998). doi: 10.1016/S0950-5849(98)00101-3

Porter, A.A., Votta, L.G.: An experiment to assess different defect detection methods for software requirements inspections. In: Proceedings of the 16th International Conference on Software Engineering, Sorrento, pp. 103–112 (1994)

Porter, A.A., Votta, L.G.: Comparing detection methods for software requirements inspection: a replicated experiment. IEEE Trans. Softw. Eng. 21 (6), 563–575 (1995)

Porter, A.A., Votta, L.G.: Comparing detection methods for software requirements inspections: a replication using professional subjects. Empir. Softw. Eng. 3 (4), 355–380 (1998)

Porter, A.A., Siy, H.P., Toman, C.A., Votta, L.G.: An experiment to assess the cost-benefits of code inspections in large scale software development. IEEE Trans. Softw. Eng. 23 (6), 329–346 (1997)

Potts, C.: Software engineering research revisited. IEEE Softw. pp. 19–28 (1993)

Rainer, A.W.: The longitudinal, chronological case study research strategy: a definition, and an example from IBM Hursley Park. Inf. Softw. Technol. 53 (7), 730–746 (2011)

Robinson, H., Segal, J., Sharp, H.: Ethnographically-informed empirical studies of software practice. Inf. Softw. Technol. 49 (6), 540–551 (2007). doi: 10.1016/j.infsof.2007.02.007

Robson, C.: Real World Research: A Resource for Social Scientists and Practitioners-Researchers, 1st edn. Blackwell, Oxford/Cambridge (1993)

Robson, C.: Real World Research: A Resource for Social Scientists and Practitioners-Researchers, 2nd edn. Blackwell, Oxford/Madden (2002)

Runeson, P., Skoglund, M.: Reference-based search strategies in systematic reviews. In: Proceedings of the 13th International Conference on Empirical Assessment and Evaluation in Software Engineering. Electronic Workshops in Computing (eWIC). BCS, Durham University, UK (2009)

Runeson, P., Höst, M., Rainer, A.W., Regnell, B.: Case Study Research in Software Engineering. Guidelines and Examples. Wiley, Hoboken (2012)

Sandahl, K., Blomkvist, O., Karlsson, J., Krysander, C., Lindvall, M., Ohlsson, N.: An extended replication of an experiment for assessing methods for software requirements. Empir. Softw. Eng. 3 (4), 381–406 (1998)

Seaman, C.B.: Qualitative methods in empirical studies of software engineering. IEEE Trans. Softw. Eng. 25 (4), 557–572 (1999)

Selby, R.W., Basili, V.R., Baker, F.T.: Cleanroom software development: An empirical evaluation. IEEE Trans. Softw. Eng. 13 (9), 1027–1037 (1987)

Shepperd, M.: Foundations of Software Measurement. Prentice-Hall, London/New York (1995)

Shneiderman, B., Mayer, R., McKay, D., Heller, P.: Experimental investigations of the utility of detailed flowcharts in programming. Commun. ACM 20 , 373–381 (1977). doi: 10.1145/359605.359610

Shull, F.: Developing techniques for using software documents: a series of empirical studies. Ph.D. thesis, Computer Science Department, University of Maryland, USA (1998)

Shull, F., Basili, V.R., Carver, J., Maldonado, J.C., Travassos, G.H., Mendonça, M.G., Fabbri, S.: Replicating software engineering experiments: addressing the tacit knowledge problem. In: Proceedings of the 1st International Symposium on Empirical Software Engineering, Nara, pp. 7–16 (2002)

Shull, F., Mendonça, M.G., Basili, V.R., Carver, J., Maldonado, J.C., Fabbri, S., Travassos, G.H., Ferreira, M.C.: Knowledge-sharing issues in experimental software engineering. Empir. Softw. Eng. 9 , 111–137 (2004). doi: 10.1023/B:EMSE.0000013516.80487.33

Shull, F., Carver, J., Vegas, S., Juristo, N.: The role of replications in empirical software engineering. Empir. Softw. Eng. 13 , 211–218 (2008). doi: 10.1007/s10664-008-9060-1

Sieber, J.E.: Protecting research subjects, employees and researchers: implications for software engineering. Empir. Softw. Eng. 6 (4), 329–341 (2001)

Siegel, S., Castellan, J.: Nonparametric Statistics for the Behavioral Sciences, 2nd edn. McGraw-Hill International Editions, New York (1988)

Singer, J., Vinson, N.G.: Why and how research ethics matters to you. Yes, you! Empir. Softw. Eng. 6 , 287–290 (2001). doi: 10.1023/A:1011998412776

Singer, J., Vinson, N.G.: Ethical issues in empirical studies of software engineering. IEEE Trans. Softw. Eng. 28 (12), 1171–1180 (2002). doi: 10.1109/TSE.2002.1158289. http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=1158289

Singh, S.: Fermat’s Last Theorem. Fourth Estate, London (1997)

Sjøberg, D.I.K., Hannay, J.E., Hansen, O., Kampenes, V.B., Karahasanovic, A., Liborg, N.-K., Rekdal, A.C.: A survey of controlled experiments in software engineering. IEEE Trans. Softw. Eng. 31 (9), 733–753 (2005). doi: 10.1109/TSE.2005.97. http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=1514443

Sjøberg, D.I.K., Dybå, T., Anda, B., Hannay, J.E.: Building theories in software engineering. In: Shull, F., Singer, J., Sjøberg D. (eds.) Guide to Advanced Empirical Software Engineering. Springer, London (2008)

Sommerville, I.: Software Engineering, 9th edn. Addison-Wesley, Wokingham, England/ Reading (2010)

Sørumgård, S.: Verification of process conformance in empirical studies of software development. Ph.D. thesis, The Norwegian University of Science and Technology, Department of Computer and Information Science, Norway (1997)

Stake, R.E.: The Art of Case Study Research. SAGE Publications, Thousand Oaks (1995)

Staples, M., Niazi, M.: Experiences using systematic review guidelines. J. Syst. Softw. 80 (9), 1425–1437 (2007). doi: 10.1016/j.jss.2006.09.046

Thelin, T., Runeson, P.: Capture-recapture estimations for perspective-based reading – a simulated experiment. In: Proceedings of the 1st International Conference on Product Focused Software Process Improvement (PROFES), Oulu, pp. 182–200 (1999)

Thelin, T., Runeson, P., Wohlin, C.: An experimental comparison of usage-based and checklist-based reading. IEEE Trans. Softw. Eng. 29 (8), 687–704 (2003). doi: 10.1109/TSE.2003.1223644

Tichy, W.F.: Should computer scientists experiment more? IEEE Comput. 31 (5), 32–39 (1998)

Tichy, W.F., Lukowicz, P., Prechelt, L., Heinz, E.A.: Experimental evaluation in computer science: a quantitative study. J. Syst. Softw. 28 (1), 9–18 (1995)

Trochim, W.M.K.: The Research Methods Knowledge Base, 2nd edn. Cornell Custom Publishing, Cornell University, Ithaca (1999)

van Solingen, R., Berghout, E.: The Goal/Question/Metric Method: A Practical Guide for Quality Improvement and Software Development. McGraw-Hill International, London/Chicago (1999)

Verner, J.M., Sampson, J., Tosic, V., Abu Bakar, N.A., Kitchenham, B.A.: Guidelines for industrially-based multiple case studies in software engineering. In: Third International Conference on Research Challenges in Information Science, Fez, pp. 313–324 (2009)

Vinson, N.G., Singer, J.: A practical guide to ethical research involving humans. In: Shull, F., Singer, J., Sjøberg, D. (eds.) Guide to Advanced Empirical Software Engineering. Springer, London (2008)

Votta, L.G.: Does every inspection need a meeting? In: Proceedings of the ACM SIGSOFT Symposium on Foundations of Software Engineering, ACM Software Engineering Notes, vol. 18, pp. 107–114. ACM Press, New York (1993)

Wallace, C., Cook, C., Summet, J., Burnett, M.: Human centric computing languages and environments. In: Proceedings of Symposia on Human Centric Computing Languages and Environments, Arlington, pp. 63–65 (2002)

Wohlin, C., Gustavsson, A., Höst, M., Mattsson, C.: A framework for technology introduction in software organizations. In: Proceedings of the Conference on Software Process Improvement, Brighton, pp. 167–176 (1996)

Wohlin, C., Runeson, P., Höst, M., Ohlsson, M.C., Regnell, B., Wesslén, A.: Experimentation in Software Engineering: An Introduction. Kluwer, Boston (2000)

Wohlin, C., Aurum, A., Angelis, L., Phillips, L., Dittrich, Y., Gorschek, T., Grahn, H., Henningsson, K., Kågström, S., Low, G., Rovegård, P., Tomaszewski, P., van Toorn, C., Winter, J.: Success factors powering industry-academia collaboration in software research. IEEE Softw. (PrePrints) (2011). doi: 10.1109/MS.2011.92

Yin, R.K.: Case Study Research Design and Methods, 4th edn. Sage Publications, Beverly Hills (2009)

Zelkowitz, M.V., Wallace, D.R.: Experimental models for validating technology. IEEE Comput. 31 (5), 23–31 (1998)

Zendler, A.: A preliminary software engineering theory as investigated by published experiments. Empir. Softw. Eng. 6 , 161–180 (2001). doi:  http://dx.doi.org/10.1023/A:1011489321999


Author information

Authors and Affiliations

School of Computing, Blekinge Institute of Technology, Karlskrona, Sweden

Claes Wohlin

Department of Computer Science, Lund University, Lund, Sweden

Per Runeson, Martin Höst & Björn Regnell

System Verification Sweden AB, Malmö, Sweden

Magnus C. Ohlsson

ST-Ericsson AB, Lund, Sweden

Anders Wesslén



Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Wohlin, C., Runeson, P., Höst, M., Ohlsson, M.C., Regnell, B., Wesslén, A. (2012). Systematic Literature Reviews. In: Experimentation in Software Engineering. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-29044-2_4


DOI: https://doi.org/10.1007/978-3-642-29044-2_4

Published: 02 May 2012

Publisher Name: Springer, Berlin, Heidelberg

Print ISBN: 978-3-642-29043-5

Online ISBN: 978-3-642-29044-2

eBook Packages: Computer Science, Computer Science (R0)


Systematic literature reviews in software engineering – A systematic literature review


A Systematic Literature Review (SLR), also referred to as a systematic review, is considered one of the key research methodologies of Evidence-Based Software Engineering (EBSE). Systematic reviews have been gaining significant attention from software engineering researchers since Kitchenham, Dybå and Jørgensen's seminal paper on EBSE published at ICSE 2004. Software Engineering (SE) researchers have been conducting and reporting more and more SLRs on diverse topics such as agile software development, regression testing, process modeling, variability management, cost estimation, organizational motivators for CMM-based process improvement, and statistical power. Researchers have also reported best practices and experiences of conducting and reporting systematic reviews. In addition, techniques for designing strategies for assessing the quality of the reported primary studies included in a systematic review have been proposed. Moreover, there has been at least one tertiary...

Related Papers


Affan Yasin

Context: The Internet has become a vital channel for disseminating and accessing scientific literature for both academic and industrial research needs. Nowadays, everyone has wide access to scientific literature repositories, which comprise both “white” and “grey” literature. “Grey” literature, as opposed to “white” literature, is non-peer-reviewed scientific information that is not available through commercial information sources such as IEEE or ACM. A large number of software engineering researchers are undertaking systematic literature reviews (SLRs) to investigate empirical evidence in software engineering. The key reason to include grey literature during information synthesis is to minimize the risk of publication bias. Using state-of-the-art non-commercial databases that index information, researchers can ease the rigorous process of searching for empirical studies in SLRs. This study examines the evidence of grey literature used during synthesis in systematic literature reviews. Objectives: The goals of this thesis work are: 1. To identify the extent of the usage of grey literature in synthesis during systematic literature reviews. 2. To investigate whether non-commercial information sources, primarily Google Scholar, are sufficient for retrieving primary studies for SLRs. Methods: The work consists of a systematic literature review of SLRs and is thus a tertiary study and meta-analysis. The systematic literature review covered 138 SLRs published from 2003 until June 2012. The article sources used are IEEE Xplore, ACM Digital Library, SpringerLink and Science Direct. Results: For each of the selected article sources (ACM, IEEE Xplore, SpringerLink and Science Direct), we present results describing the extent of the usage of grey literature. The qualitative results discuss various strategies for systematic evaluation of grey literature during a systematic literature review. The quantitative results comprise charts and tables showing the extent of grey literature usage. The results from the analysis of the Google Scholar database describe the total number of primary studies that we were able to find using only Google Scholar. Conclusion: From the analysis of 138 systematic literature reviews (SLRs), we conclude that the evidence of grey literature in SLRs is around 9%. The percentage of grey literature sources used in the information synthesis sections of SLRs is around 93.2%. We were able to retrieve around 96% of primary studies using the Google Scholar database. We conclude that Google Scholar can be a choice for retrieving published studies; however, it lacks detailed search options to target a wider pool of articles. We also conclude that grey literature is widely available in this age of information. We have provided guidelines in the form of strategies for systematic evaluation of grey literature.


Lianping Chen

Systematic Literature Reviews and Systematic Mapping Studies are relatively new forms of secondary studies in software engineering. Identifying relevant papers from various Electronic Data Sources (EDS) is one of the key activities of conducting these kinds of studies. Hence, the selection of EDS for searching the potentially relevant papers is an important decision, which can affect a study’s coverage of relevant papers. Researchers usually select EDS mainly based on personal knowledge, experience, and preferences and/or recommendations by other researchers. We believe that building an evidence-based understanding of EDS can enable researchers to make more informed decisions about the selection of EDS. This paper reports our initial effort towards this end. We propose an initial set of metrics for characterizing the EDS from the perspective of the needs of secondary studies. We explain the usage and benefits of the proposed metrics using the data gathered from two secondary studies. We also tried to synthesize the data from the two studies and that from literature to provide initial evidence-based heuristics for EDS selection.





MAXQDA: The All-in-one Literature Review Software

MAXQDA is the best choice for a comprehensive literature review. It works with a wide range of data types and offers powerful tools for literature review, such as reference management, qualitative analysis, vocabulary and text analysis tools, and more.


As your all-in-one literature review software, MAXQDA can be used to manage your entire research project. Easily import data from texts, interviews, focus groups, PDFs, web pages, spreadsheets, articles, e-books, and even social media data. Connect the reference management system of your choice with MAXQDA to easily import bibliographic data. Organize your data in groups, link relevant quotes to each other, keep track of your literature summaries, and share and compare work with your team members. Your project file stays flexible and you can expand and refine your category system as you go to suit your research.

Developed by and for researchers – since 1989


Having used several qualitative data analysis software programs, there is no doubt in my mind that MAXQDA has advantages over all the others. In addition to its remarkable analytical features for harnessing data, MAXQDA’s stellar customer service, online tutorials, and global learning community make it a user friendly and top-notch product.

Sally S. Cohen – NYU Rory Meyers College of Nursing

Literature Review is Faster and Smarter with MAXQDA


Easily import your literature review data

With a literature review software like MAXQDA, you can easily import bibliographic data from reference management programs for your literature review. MAXQDA can work with all reference management programs that can export their databases in RIS-format which is a standard format for bibliographic information. Like MAXQDA, these reference managers use project files, containing all collected bibliographic information, such as author, title, links to websites, keywords, abstracts, and other information. In addition, you can easily import the corresponding full texts. Upon import, all documents will be automatically pre-coded to facilitate your literature review at a later stage.
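As a rough illustration of what such an RIS import involves (a sketch of the standard RIS tag conventions, not MAXQDA's actual import code), the snippet below parses a minimal RIS export into records; the field handling and the sample entry are assumptions made for illustration.

```python
def parse_ris(text):
    """Parse RIS-formatted bibliographic data into a list of record dicts.

    RIS entries consist of 'TAG  - value' lines; 'TY' opens a record and
    'ER' closes it. Repeated tags (e.g. AU for authors) are collected
    into lists.
    """
    records, current = [], None
    for line in text.splitlines():
        if len(line) < 5 or line[2:5] != "  -":
            continue  # skip blank or malformed lines
        tag, value = line[:2], line[6:].strip()
        if tag == "TY":
            current = {"type": value}
        elif tag == "ER" and current is not None:
            records.append(current)
            current = None
        elif current is not None:
            current.setdefault(tag, []).append(value)
    return records

sample = """TY  - JOUR
AU  - Kitchenham, B.
AU  - Charters, S.
TI  - Guidelines for performing systematic literature reviews in software engineering
PY  - 2007
ER  -
"""
for rec in parse_ris(sample):
    print(rec["TI"][0], rec["AU"])
```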

Capture your ideas while analyzing your literature

Great ideas will often occur to you while you’re doing your literature review. Using MAXQDA as your literature review software, you can create memos to store your ideas, such as research questions and objectives, or you can use memos for paraphrasing passages into your own words. By attaching memos like post-it notes to text passages, texts, document groups, images, audio/video clips, and of course codes, you can easily retrieve them at a later stage. Particularly useful for literature reviews are free memos written during the course of work from which passages can be copied and inserted into the final text.


Find concepts important to your literature review

When generating a literature review you might need to analyze a large amount of text. Luckily MAXQDA as the #1 literature review software offers Text Search tools that allow you to explore your documents without reading or coding them first. Automatically search for keywords (or dictionaries of keywords), such as important concepts for your literature review, and automatically code them with just a few clicks. Document variables that were automatically created during the import of your bibliographic information can be used for searching and retrieving certain text segments. MAXQDA’s powerful Coding Query allows you to analyze the combination of activated codes in different ways.
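As a back-of-the-envelope illustration of dictionary-based auto-coding (a generic sketch, not MAXQDA functionality; the code names and keywords are made up), the snippet below assigns a code to every text segment that contains one of that code's keywords:

```python
# Hypothetical keyword dictionaries mapping a code name to its search terms.
dictionaries = {
    "evidence-based SE": ["evidence-based", "EBSE"],
    "systematic review": ["systematic literature review", "SLR", "meta-analysis"],
}

def auto_code(segments, dictionaries):
    """Return {code: [segments]} for every segment containing one of the code's keywords."""
    coded = {code: [] for code in dictionaries}
    for segment in segments:
        lowered = segment.lower()
        for code, keywords in dictionaries.items():
            if any(kw.lower() in lowered for kw in keywords):
                coded[code].append(segment)
    return coded

segments = [
    "Kitchenham et al. introduced evidence-based software engineering in 2004.",
    "The SLR identified 53 primary studies on agile development.",
]
for code, hits in auto_code(segments, dictionaries).items():
    print(code, "->", len(hits), "segment(s)")
```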

Aggregate your literature review

When conducting a literature review you can easily get lost. But with MAXQDA as your literature review software, you will never lose track of the bigger picture. Among other tools, MAXQDA’s overview and summary tables are especially useful for aggregating your literature review results. MAXQDA offers overview tables for almost everything: codes, memos, coded segments, links, and so on. With MAXQDA literature review tools you can create compressed summaries of sources that can be effectively compared and represented, and with just one click you can easily export your overview and summary tables and integrate them into your literature review report.


Powerful and easy-to-use literature review tools

Quantitative aspects can also be relevant when conducting a literature review analysis. Using MAXQDA as your literature review software enables you to employ a vast range of procedures for the quantitative evaluation of your material. You can sort sources according to document variables, compare amounts with frequency tables and charts, and much more. Make sure you don’t miss the word frequency tools of MAXQDA’s add-on module for quantitative content analysis. Included are tools for visual text exploration, content analysis, vocabulary analysis, dictionary-based analysis, and more that facilitate the quantitative analysis of terms and their semantic contexts.

Visualize your literature review

As an all-in-one literature review software, MAXQDA offers a variety of visual tools that are tailor-made for qualitative research and literature reviews. Create stunning visualizations to analyze your material. Of course, you can export your visualizations in various formats to enrich your literature review analysis report. Work with word clouds to explore the central themes of a text and key terms that are used, create charts to easily compare the occurrences of concepts and important keywords, or make use of the graphical representation possibilities of MAXMaps, which in particular permit the creation of concept maps. Thanks to the interactive connection between your visualizations and your MAXQDA data, you’ll never lose sight of the big picture.


AI Assist: literature review software meets AI

AI Assist – your virtual research assistant – supports your literature review with various tools. AI Assist simplifies your work by automatically analyzing and summarizing elements of your research project and by generating suggestions for subcodes. No matter which AI tool you use – you can customize your results to suit your needs.

Free tutorials and guides on literature review

MAXQDA offers a variety of free learning resources for literature review, making it easy for both beginners and advanced users to learn how to use the software. From free video tutorials and webinars to step-by-step guides and sample projects, these resources provide a wealth of information to help you understand the features and functionality of MAXQDA for literature review. For beginners, the software’s user-friendly interface and comprehensive help center make it easy to get started with your data analysis, while advanced users will appreciate the detailed guides and tutorials that cover more complex features and techniques. Whether you’re just starting out or are an experienced researcher, MAXQDA’s free learning resources will help you get the most out of your literature review.



FAQ: Literature Review Software

Literature review software is a tool designed to help researchers efficiently manage and analyze the existing body of literature relevant to their research topic. MAXQDA, a versatile qualitative data analysis tool, can be instrumental in this process.

Literature review software, like MAXQDA, typically includes features such as data import and organization, coding and categorization, advanced search capabilities, data visualization tools, and collaboration features. These features facilitate the systematic review and analysis of relevant literature.

Literature review software, including MAXQDA, can assist in qualitative data interpretation by enabling researchers to organize, code, and categorize relevant literature. This organized data can then be analyzed to identify trends, patterns, and themes, helping researchers draw meaningful insights from the literature they’ve reviewed.

Yes, literature review software like MAXQDA is suitable for researchers of all levels of experience. It offers user-friendly interfaces and extensive support resources, making it accessible to beginners while providing advanced features that cater to the needs of experienced researchers.

Getting started with literature review software, such as MAXQDA, typically involves downloading and installing the software, importing your relevant literature, and exploring the available features. Many software providers offer tutorials and documentation to help users get started quickly.

For students, MAXQDA can be an excellent literature review software choice. Its user-friendly interface, comprehensive feature set, and educational discounts make it a valuable tool for students conducting literature reviews as part of their academic research.

MAXQDA is available for both Windows and Mac users, making it a suitable choice for Mac users looking for literature review software. It offers a consistent and feature-rich experience on Mac operating systems.

When it comes to literature review software, MAXQDA is widely regarded as one of the best choices. Its robust feature set, user-friendly interface, and versatility make it a top pick for researchers conducting literature reviews.

Yes, literature reviews can be conducted without software. However, using literature review software like MAXQDA can significantly streamline and enhance the process by providing tools for efficient data management, analysis, and visualization.


