Towards Identification and Classification of Core and Threshold Concepts in Methodology Education in Computing
Matti Tedre, Danny Brash, and
Sirkku Männikkö-Barbutiu
Department of Computer and Systems Sciences
Stockholm University
Sweden
matti|danny|sirkku@dsv.su.se
Johannes Cronjé
Faculty of Informatics and Design
Cape Peninsula University of Technology
South Africa
cronjej@cput.ac.za
ABSTRACT
Research methodology is a quintessential component of science, but methods differ greatly between sciences. In computing, methods are borrowed from many fields, which causes difficulties for methodology education in computing. In our methodology courses in computing, we have observed a number of core and threshold concepts that affect students’ success. This essay describes a work in progress towards understanding those core and threshold concepts in methodology education in computing, classified along two dimensions. We classify methodological concepts in terms of standard elements of research design in students’ projects in computing, and in terms of their centrality and difficulty. We present examples of three types of troublesome knowledge concerning methodology: the strangeness and complexity of methodological concepts, misimpressions from everyday experience, and reasonable but mistaken expectations.
Categories and Subject Descriptors
K.3.2 [Computers and Education]: Computer and Information Science Education—computer science education, curriculum
General Terms
Human Factors
Keywords
Methodology, Computer science education, Computing curricula, Threshold concepts, Troublesome knowledge
1. INTRODUCTION
Methodology is a quintessential component of scientific research, and dictionary definitions of ‘scientific’ refer to methodologically sound, rigorous activities. As different kinds of
research employ different methods, methodology education differs between academic disciplines. In the natural sciences students learn things like empirical laboratory work and statistical methods, while in the social sciences various qualitative and quantitative methods are part of the curriculum.
However, in computing, methodology has historically been an overlooked activity. Reviews of methodology in computing have pointed out deficiencies in many fields of computing, including information systems [31], computer science [23], and software engineering [33]. Some critics have argued that methodological terminology is sloppily used in computing [34], and experimentation terminology is used differently in computing than in many other fields [28]. Others have argued that precautions against bias are often not taken [7, 8]. Yet others have argued that computing researchers just act like scientists but do not really do science [17].
The science of computing is so broad that different fields emphasize different kinds of research methodology education. In the field of information systems, which has social science methodology at its core, calls for stringent adherence to recognized research methodology can be found [10, 11]. Theoretical computer science employs deductive reasoning and analytical methods, although those are rarely explicitly described in research papers [10]. Human-computer interaction uses a variety of methods from various traditions, including the behavioural and social sciences. In systems development, engineering methods and techniques are common. In recent years, methodological debates concerning systems development have led to an increased use of design science [15] as a methodological framework.
No quick improvement can be expected in research practice as long as methodology is not properly addressed in computing education. It has been argued that instead of methodology education, students learn methodological skills through mentoring and model learning—such as examining previous research reports [9]. Curriculum guidelines in computing are thin on methodology courses. The ACM/IEEE curriculum report CC2001 [6] does not include a single course on methodology. Some of computing’s field-specific curriculum guidelines—such as CS2008 (for computer science) and GSwE2009 (for software engineering)—offer little in terms of methodology. Other curricula—such as IS2010 for information systems and CE2004 for computer engineering—are more concerned with methodology education in their fields.
Education in computing fields is not just an academic issue. ICT has a great impact on the economy and the functioning of society, and when ICT systems fail, modern societies may grind to a halt. ICT’s crucial role in industry and society creates various kinds of pressure on university education in computing. Computing education has to balance legislation regarding higher education, academic goals, and the needs of industry. In legislative terms, methodological knowledge and skills are, in many countries, decreed by the national agency for higher education; in our case, Sweden, methodology education is bound by legislation already at the B.Sc. degree level [26]. In terms of academic goals, computing education must prepare students for the possibility of pursuing research careers, which puts further pressure on methodology education. In terms of industry, computing education must supply a skilled workforce for work that is often research-intensive R&D (research and development).
This essay is based on the experiences of methodology educators from three universities: Cape Peninsula University of Technology, South Africa; Tumaini University, Tanzania; and Stockholm University, Sweden. The essay uses Stockholm University’s computer science department, DSV, as an example. As is typical of educational programs in computing, methodology education did not exist as a curricular subject for the first 35 years after DSV was founded in 1966. The main focus of education and research at the department was artifact development. Over the years the department grew steadily in size and scope, encompassing an increasingly broad range of computing fields. That diversification encouraged the department to begin teaching research methodology in 2001, after which various initiatives have been introduced to improve the quality and quantity of methodology education.
Concerned with students’ methodological understanding and skills, DSV launched a department-wide initiative to improve methodology education. A number of courses were established on three levels—B.Sc., M.Sc., and Ph.D.—and their coherence with the broader aims of education at the department was monitored. The emphasis on methodology in thesis grading was strengthened. A department-wide, three-level quality control system for theses was introduced (supervisor, reviewer, and examiner). However, during the implementation of some changes we identified a number of troublesome concepts for students: concepts that were pivotal for understanding the topic but that were conceptually difficult, counter-intuitive, or alien to students [18]. Such threshold concepts need to be properly addressed, because the learning of other concepts requires understanding them [18, 20]. The issues with methodology education were, and are, complicated by the widely differing backgrounds of DSV’s students: at the start of their studies some students have extensive methodological training, while others have very little.
This essay identifies troublesome concepts in methodology courses in computing, and proposes a distinction between core concepts [1] and threshold concepts [18, 20] in methodology courses in computing. The essay also presents examples of three sources of trouble concerning methodological concepts: misimpressions from everyday experience, the strangeness and complexity of methodological concepts, and reasonable but mistaken expectations [20]. The essay paves the way for further research and educational interventions concerning threshold concepts in methodology education in computing.
2. TROUBLESOME KNOWLEDGE
Threshold concepts and troublesome knowledge are topics of interest in educational research [19]. Perkins [20] discussed five kinds of “troublesome knowledge”: ritual, inert, alien, tacit, and conceptually difficult knowledge. In computing education, many threshold concepts in methodology courses are conceptually difficult knowledge. Perkins wrote that difficulties with conceptually difficult knowledge may arise from misimpressions from everyday experience, the strangeness and complexity of scientific concepts, and reasonable but mistaken expectations. All those can be readily identified in methodology education in computing fields.
Some concepts in computing methodology education are muddied by misimpressions from everyday experience. Take, for instance, “hypothesis”: the everyday use of the word refers to a supposition or assumption, and that casual meaning may carry over to terms like “null hypothesis,” which in methodological language should be testable and neutral. Similarly, the casual meanings of terms like “factor”, “random”, and “significant” sometimes carry over into research reports. Other concepts may appear strange and complex. Take ontological categories, for instance: that the fact that 8 bits make a byte is a mind-dependent fact [29] may sound strange (how can a fact be mind-dependent?). Yet other concepts are confusing due to reasonable but mistaken expectations. Take, for instance, statistics in research: beyond a certain sample size, population size has a negligible effect on confidence—for example, statistically speaking, a random sample of 500 respondents is equally useful for the 150 thousand users of a specialized software product as it is for the 150 million users of a social networking website.
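To make that expectation concrete, here is a minimal worked sketch (ours, not from the original text; the 0.5 proportion and 95% confidence level are illustrative assumptions) that computes the margin of error for a random sample of 500 under the two population sizes mentioned above, using the finite population correction:

    # A minimal sketch: the margin of error of a random sample of n = 500
    # is nearly identical for populations of 150 thousand and 150 million.
    import math

    def margin_of_error(n, N, p=0.5, z=1.96):
        se = math.sqrt(p * (1 - p) / n)     # standard error of a proportion
        fpc = math.sqrt((N - n) / (N - 1))  # finite population correction
        return z * se * fpc                 # 95% margin of error

    print(margin_of_error(500, 150_000))      # ~0.0438
    print(margin_of_error(500, 150_000_000))  # ~0.0438

Both margins of error round to about 4.4 percentage points; the population size enters only through the finite population correction, which is already close to 1 for a population of 150 thousand.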
Alongside individual methodological concepts, equal difficulty lies in what Meyer and Land [19] called each field’s “ways of thinking and practicing” and what Perkins [20, p.42] called “epistemes”—“manners of justifying, explaining, solving problems, conducting enquiries, and designing and validating various kinds of products or outcomes”. Epistemes are perhaps the main expected learning outcome of methodology courses, but much of an episteme lies hidden beneath the surface of publications; research reports provide students only limited windows onto the epistemes of the field. Perkins [20, p.42] warned that without epistemes, students in science and mathematics may display ritualized routines instead of genuine disciplinary enquiry and problem solving. Insofar as knowledge is ritualized and tacit, learning how to conduct scientific work can be seen as a lonely rite of passage in which little guidance is provided aside from mentor-apprentice relationships. However, as both the student body and computing as a discipline become increasingly heterogeneous, there is an increasing need to introduce methodology, an essential element of disciplinary ways of thinking and practicing, into curricula as well.
In computing, the context of knowledge creation is separated from the context of justification. It is excessively difficult to learn the disciplinary ways of thinking and practicing from research reports, as they present only the context of justification. For example, similar to mathematics, in various theoretical branches of computing the research report presents a proof but offers little insight into the actual process that led to the proof: which heuristics were applied, what dead ends were traversed, and, in general, how exactly the problem was approached (see, e.g., [21]). In experiment-based branches of computing, few traces of the context of discovery can be found in research reports: what really drove the researcher, what bad theories and bad hypotheses were filtered out, what guesses, intuitions, and approximations went into the model-building, and so forth. In engineering types of computing research, things like product specifications offer little information about exactly how and why those specifications emerged. Methodology education should be able to open the black box of research reports and let students understand how knowledge is created.
Table 1: Examples of Learning Objectives at DSV: Problem Description

B.Sc. Degree
  Knowledge and Understanding. Students should:
    - be aware of similar research problems that have been studied in recent years in the same domain;
    - understand what types of situations qualify for ‘research problems,’ on the basis of previous research.
  Competence and Skills. Students should be able to:
    - formulate a research problem in such a way that it deals with applied or theoretical problems that actually need solving and fill a gap of knowledge within the domain;
    - ground a research problem in scientific literature or justify a design problem.
  Judgment and Approach. Students should be able to:
    - evaluate the degree of relevance of particular research problems for filling gaps in knowledge in a particular domain;
    - assess the ethical and societal dimensions of a chosen research problem.

M.Sc. Degree
  Knowledge and Understanding. Students should:
    - understand similar research problems within a domain.
  Competence and Skills. Students should be able to:
    - distinguish and identify different types of problems, such as a lack of knowledge, artifacts or processes that require improvement, a need for evaluation, or the need for development of new artifacts or processes.
  Judgment and Approach. Students should be able to:
    - assess and discuss the implications of dealing with particular and varying research problems in varying circumstances and under changing conditions.
3. CORE CONCEPTS
A major difficulty in methodology education in computing comes from the vast amount of new, conceptually difficult knowledge. Terminology is often dense and unrelated to other computing courses, and core concepts and threshold concepts can be hard to identify. In order to clarify the conceptual complexity, we present the core concepts and learning objectives for each stage of a typical student research project. Different concepts are needed at different levels. This section outlines the vast amount of conceptually difficult knowledge, often new to students, which methodology education in computing has to deliver. The section also uses those concepts to construct examples of learning objectives for a methodology course (Table 1 and Table 2).
Research Problem
Each research study in computing is a response to a problem, ranging from scholarly problems to evaluation problems to design and development problems [24]. Computing deals with a range of different kinds of problems, and different types of problems are rooted in different intellectual traditions [30]. Understanding what different kinds of problems entail is difficult as it stands; each kind of problem also introduces its own set of concepts, and each points to a different research trajectory. The problem statement plays a fundamental role by describing the rationale for the research, grounding the study within a specific domain, and driving the research project. Table 1 presents example learning outcomes concerning research problem statements.
Core Concepts: Gap of knowledge, Research problem,
Topic, Problem types, Literature review, Background
Aims, Objectives, and Questions
Proceeding from a research problem to research questions might not be conceptually challenging, but it introduces terminological complexity, and terminology should be in agreement throughout the project. A research aim, or purpose, when it exists, corresponds to the research problem and is often expressed in verbs that define the type of research and resonate well with the problem type: verbs such as ‘explore’, ‘describe’, ‘develop’, and ‘evaluate’ [5]. Other common conceptualizations of research aims include exploratory, explanatory, descriptive, and emancipatory research [16, 32], or observation, description, prediction, and experimentation (e.g., [14, 22]). In some studies, particularly in evaluation and development, objectives present specific statements about how the research aim is going to be met, expressed in strong active verbs such as ‘classify’, ‘define’, ‘implement’, and ‘test’ [27]. If aims, objectives, research questions, and subquestions are all present, they should form a tight fit. In some studies, such as some design research projects, there are no explicit questions but aims and objectives instead (cf. [15]), and sometimes research is defined through research acts [24, pp.32–34]. Table 2 presents example learning outcomes concerning research aims, objectives, and questions.
Core Concepts: Empirical research, Analytical / theoretical research, Desk study, Research acts, Question, Subquestion, Aim, Objective, Explanation, Demonstration, Evaluation, Description, Prediction, Exploration
Research design
The next sets of concepts arise from research designs, which vary between fields of science and between qualitative, quantitative, and mixed-methods types of research [4]. In exploratory types of research, designs are often descriptive, while in confirmatory types of research, experimental and correlational designs are common. Strategies of inquiry [4, pp.11–15] are common tools for a high-level overview and organization of research projects—and for improving alignment and congruence, which refer to how the parts of a research study logically follow from each other and fit together. The different types of research design introduce a large number of complex but central concepts, which should be introduced from the B.Sc. level upwards.
Core Concepts: Research design, Strategies of inquiry, Resource planning, Focus, Scope, Delimiters, Qualifiers, Hypothesis, Experiment, Independent and dependent variables, Construct, Category, Intervention, Control group, Experimental group, Survey, Phenomenology, Case study, Alignment, Congruence
Theoretical and conceptual frameworks
Theory can play various roles in research, but understanding those roles poses conceptual difficulties for students. In quantitative research, theories may provide, for instance, testable variables, models, or frameworks [4, Ch.3]. In qualitative research, theories may provide, for instance, a “vocabulary” for phenomena and processes or a “lens” for interpreting results. Instead of a theoretical framework, many studies employ conceptual frameworks; both require properly done literature reviews [25]—although small-scale practice projects can rarely be literature driven. Theoretical frameworks are tightly interwoven with the aims of research, as well as with different philosophical assumptions, yet students can be very selective with some of those concepts [12, 29].
Table 2: Examples of Learning Objectives: Research Aims, Objectives, and Questions

B.Sc. Degree
  Knowledge and Understanding. Students should:
    - understand what constitutes a research question and how it is derived from, and related to, a research problem;
    - be familiar with different types of research questions.
  Competence and Skills. Students should be able to:
    - search for and pinpoint relevant scientific literature to enable the formulation of research questions.
  Judgment and Approach. Students should be able to:
    - evaluate the potential implications of answering particular research questions for the researcher, for potential beneficiaries of research, and society in general.

M.Sc. Degree
  Knowledge and Understanding. Students should:
    - understand the feasibility of answering particular research questions;
    - understand the potential and expected scientific, technical, and social impacts of answering the research questions;
    - be familiar with the different types of answers that different types of research questions require, such as descriptions, explanations, artifacts, correlations, causal relations, comparisons, identification, and definition.
  Competence and Skills. Students should be able to:
    - define aims of research, and derive measurable objectives that respond to the research aims;
    - refine and divide research questions into concrete, coherent, and researchable components that become subquestions;
    - formulate research questions in a variety of ways that allow the focus to be shifted towards different perspectives, depending on context and timing of the research as well as on different stakeholders.
  Judgment and Approach. Students should be able to:
    - evaluate the need for particular questions to be answered;
    - determine the limitations and utility of answering particular research questions.
Core Concepts: Theoretical and conceptual framework, Ontology (e.g., mind-dependence, mind-independence), Epistemology (e.g., subjectivity, intersubjectivity), Model, Theory, Realism, Relativism, Postpositivism, Critical theory, Constructivism
Information needs and data collection instruments
Answering an empirical research question requires a properly executed collection of data that is of the right type and comes from the right source. Consequently, getting the data collection instruments right is crucial for a successful research project. Students often have preconceptions of the terms ‘research data’ and ‘methodology’ due to the widespread casual use of those terms. A number of concepts are related to data collection, and confusion is common between, for instance, primary and secondary data, and between qualitative and quantitative data. In empirical testing of computer systems, inputs, outputs, measuring instruments, parameters, and databases are not always obvious and require specific attention. Simulation, emulation, and benchmarking are often used in experimental strategies for design research in computing [13].
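As a sketch of what making those elements explicit can look like, the following hypothetical benchmarking harness (the function under test, input sizes, and repetition count are all illustrative assumptions, not from the original text) names its inputs, parameters, measuring instrument, and measured output in comments:

    # A hypothetical benchmarking sketch with explicit inputs, parameters,
    # measuring instrument, and measured outputs.
    import random
    import statistics
    import time

    def benchmark(fn, sizes=(1_000, 10_000), repeats=5, seed=42):
        rng = random.Random(seed)                    # parameter: seed, for reproducible inputs
        for n in sizes:                              # parameter: input size
            data = [rng.random() for _ in range(n)]  # input: pseudo-random floats
            times = []
            for _ in range(repeats):                 # parameter: number of repetitions
                start = time.perf_counter()          # measuring instrument: wall clock
                fn(list(data))                       # fresh copy, so every run sees the same input
                times.append(time.perf_counter() - start)
            # measured output: median wall-clock time for each input size
            print(f"n={n}: median {statistics.median(times):.6f} s")

    benchmark(sorted)  # the system under test here is Python's built-in sort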
Core Concepts: Data types, Data formats, Qualitative data, Quantitative data, Data collection method, Discrete, Continuous, Primary data, Secondary data, Demographic data, Computer-generated data, Questionnaire, Interview, Observation, Input, Output, Parameter, Simulation, Emulation, Benchmarking
Selection of information sources
Towards the end of the research project, retaining conceptual and terminological coherence becomes increasingly difficult. Depending on the type of research, a number of concepts are associated with sampling, the selection of informants, or the choice of data sources. In quantitative research, ‘randomness’ is one of the more difficult concepts, as the term is often confused with its everyday meaning. In qualitative research, purposeful sampling strategies (such as maximum variation, theory-based, or deviant case selection [3]) are often not described to the extent they should be. Data collection through social networks is notoriously problematic to describe in traditional sampling terms. Laboratory and field testing of computer systems often lacks a proper description and justification of the setup, parameter choices, measured outputs and variables, and the choice of competing systems [2, 7, 8].
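The gap between the everyday and the methodological senses of ‘random’ can be illustrated in a few lines; in this hypothetical sketch (the population of user IDs is an illustrative assumption), only the first selection is a probability sample:

    # A hypothetical sketch: probability sampling versus systematic and
    # convenience selection from the same population.
    import random

    population = list(range(10_000))                # e.g., anonymized user IDs

    random_sample = random.sample(population, 100)  # each member has an equal, known chance
    systematic_sample = population[::100]           # every 100th member of the list
    convenience_sample = population[:100]           # whoever happens to be easiest to reach

Only the random sample gives every member a known, equal probability of inclusion; the systematic sample depends on how the list happens to be ordered, and the convenience sample supports no statistical inference to the population at all.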
Core Concepts: Selection of informants, Sampling strategies, Probability and nonprobability sampling, Sample, Population, Representative and purposeful sampling, Stratified and comprehensive sampling, Exploratory sample, Random / systematic / convenience sampling, Data saturation, Social networks, Automatic data collection, Log files
Research ethics
The relevant concepts of research ethics in methodology education range from those that should be introduced to students in their first year, like plagiarism, to those that become relevant much later in students’ work, such as authorship of journal articles and conflicts of interest. Most importantly, those computing research studies that use human participants require proper permits and procedures.
Core Concepts: Research ethics, Informed consent, Data protection, Security, Anonymity, Research permits, Plagiarism, Authorship, Intellectual property, Conflicts of interest, Disclosure / non-disclosure
Data analysis and discussion
Analysis of data presents a set of concepts tightly interrelated with the previous sets, as well as a number of specific concepts. Qualitative and quantitative data and methods of analysis form a fourfold table, in which each quadrant introduces a specific set of concepts, principles, and processes [24]. Some of the harder concepts include, for instance, the distinction between correlation and causality, the differences between types of measurement (nominal, ordinal, interval, ratio), and the differences between reporting, analyzing, and discussing. The term “analysis” is sometimes used by students in its everyday sense—meaning the provision of a detailed or complex opinion and reflection. That is, students may provide opinions instead of a rigorous, methodologically sound analysis.
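The distinction between correlation and causality can be demonstrated with simulated data. In the sketch below (ours; the variables and noise levels are artificial), a hidden confounder drives two variables that have no causal link to each other, yet they correlate strongly:

    # A simulated illustration: x and y correlate only because a hidden
    # confounder z drives both; neither causes the other.
    import random
    import statistics  # statistics.correlation requires Python 3.10+

    random.seed(1)
    z = [random.gauss(0, 1) for _ in range(1000)]  # hidden confounder
    x = [zi + random.gauss(0, 0.5) for zi in z]    # x depends only on z
    y = [zi + random.gauss(0, 0.5) for zi in z]    # y depends only on z

    print(statistics.correlation(x, y))  # about 0.8: strong correlation, no causation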
Core Concepts: Data analysis method, Qualitative and quantitative analysis, Inference, Descriptive statistics, Inferential statistics, Transcript of data, Types of measurement, Midpoints, Spread of data, Statistical significance, Coding, Memoing, Classifying, Rich description, Data visualization
Table 3: A Survey of Concepts from Basic to Advanced (* marks a candidate for threshold concept)

Research problem
  Basic: Background, Research problem*, Topic
  Intermediate: Problem types, Systematic literature review
  Advanced: Gap of knowledge*

Aims, Objectives, Questions
  Basic: Empirical research*, Research aim, Research question, Description, Exploration
  Intermediate: Subquestion, Objective, Demonstration, Evaluation
  Advanced: Analytical / theoretical research, Desk study, Prediction, Explanation, Research acts

Research design
  Basic: Case study, Research design, Delimiters, Focus
  Intermediate: Strategies of inquiry, Resource planning, Qualifiers, Scope, Context, Congruence, Survey, Phenomenology, Category
  Advanced: Intervention, Hypothesis, Experiment*, Independent and dependent variables, Control group, Experimental group, Alignment, Construct

Theoretical frameworks
  Basic: Theory*, Concept
  Intermediate: Model, Theoretical framework
  Advanced: Conceptual framework

Research philosophy
  Intermediate: Epistemology*, Subjectivity, Intersubjectivity
  Advanced: Ontology, Mind-dependence, Mind-independence, Realism, Relativism, Postpositivism, Critical theory, Constructivism

Data collection
  Basic: Data types, Data formats, Qualitative data*, Quantitative data*, Data collection method, Interview, Observation, Primary data*
  Intermediate: Discrete, Continuous, Secondary data, Computer-generated data, Social networks, Questionnaire, Input, Output, Parameter
  Advanced: Demographic data, Simulation, Emulation, Benchmarking

Sources of information
  Basic: Selection of informants, Exploratory sample, Convenience sampling, Purposeful sampling strategies
  Intermediate: Nonprobability sampling, Sample, Population, Systematic sampling, Data saturation, Stratified sampling, Comprehensive sampling, Automatic data collection, Log files
  Advanced: Probability sampling*, Representative sampling

Research ethics
  Basic: Research ethics, Plagiarism, Informed consent*, Anonymity
  Intermediate: Data protection, Security
  Advanced: Research permits, Disclosure, Non-disclosure, Authorship, IPR, Conflicts of interest

Data analysis
  Basic: Data analysis method, Qualitative analysis, Coding, Memoing, Classifying, Transcript of data
  Intermediate: Quantitative analysis, Descriptive statistics, Types of measurement, Midpoints, Spread of data, Statistical significance, Data visualization
  Advanced: Inference, Inferential statistics*, Hypothesis testing

Quality measures
  Basic: Rigor, Confirmability, Trustworthiness, Credibility, Transferability, Dependability, Peer checks, Member checks, Bracketing
  Intermediate: Reproducibility, Measurability, Bias*, Generalizability, Reliability, Validity
  Advanced: Triangulation, Testability

Reporting
  Basic: Abstract, Citation, Bibliography, Paraphrasing, Verbatim quote, Tenses, Rich description*, Practical contribution
  Intermediate: Congruence, Facts, Interpretations, Layers of data
  Advanced: Theoretical contribution, IMRAD, Style manual
Quality measures
Some of the most difficult new concepts for students concern the self-evaluative aspects of a research study. Validity and reliability—the “twin pillars of science”—can be broken into a complex variety of concepts, and the four criteria of qualitative research—credibility, confirmability, transferability, and dependability [12]—each have their own nuances. The number of possible sources of bias, such as design, sampling, and nonresponse bias, is equally large, and those concepts are often difficult to apply to research studies.
Core Concepts: Rigor, Reproducibility, Testability, Triangulation, Measurability, Bias, Generalizability, Reliability, Validity, Confirmability, Trustworthiness, Credibility, Transferability, Dependability, Member checks, Peer checks, Bracketing
Reporting
Compiling the pieces of research into a coherent, well-aligned report requires yet another set of conceptual and technical elements: those related to technical and academic writing. Congruence (the compatibility and consistency of research elements) is a measure of conceptual and terminological rigor in a research report. Proper quoting, paraphrasing, and citing require an understanding of why literature references are necessary. Academic writing, too, has a great impact on the readability and credibility of a report.
Core Concepts: Abstract, Citation, Bibliography, Paraphrasing, Verbatim quote, Congruence, Tenses, Facts, Interpretations, Rich description, Theoretical and practical contribution, Layers of data, IMRAD, Style manual
4. DISCUSSION
The amount of new, conceptually difficult knowledge makes methodology courses hard for computing students, and it makes course design a challenge for educators. A conceptual analysis of course contents, the explication of learning objectives, and the identification of threshold concepts play a key role in developing methodology courses. This essay presents a set of core concepts in methodology education in computing and proposes candidates for threshold concepts (Table 3). The essay also gives examples of learning objectives for methodology education in computing (Tables 1 and 2). Our proposals, however, are only applicable to DSV’s social and human-oriented computing research, and they are analytical rather than empirically grounded. The next steps in our project are to empirically evaluate the perceived difficulty of each of those concepts, to analytically establish the connections between the concepts, to survey the extent to which the same difficulties are found in other computing institutions, and to propose educational interventions for resolving the difficulties with those concepts.
5. REFERENCES
[1] J. Biggs and C. Tang. Teaching for Quality Learning
at University: What the Student Does. Open
University Press, New York, NY, USA, 4th edition,
2011.
[2] J. Carreira and J. G. Silva. Computer science and the
Pygmalion effect. Computer, 31(2):116–117, 1998.
[3] J. W. Creswell. Qualitative Inquiry and Research
Design: Choosing Among Five Approaches. Sage
Publications, Thousand Oaks, CA, USA, 3rd edition,
2007.
[4] J. W. Creswell. Research Design: Qualitative,
Quantitative, and Mixed Methods Approaches. Sage
Publications, Thousand Oaks, CA, USA, 3rd edition,
2009.
[5] J. C. Cronjé. The ABC (aim, belief, concern) instant
research question generator. Unpublished Manuscript,
2012.
[6] P. J. Denning, C. Chang, and CC2001 Joint Task
Force. Computing curricula 2001: Computer science
volume, March 2001.
[7] D. G. Feitelson. Experimental computer science: The
need for a cultural change. Unpublished Manuscript,
December 3, 2006, 2006.
[8] P. Fletcher. The role of experiments in computer
science. Journal of Systems and Software,
30(1–2):161–163, 1995.
[9] R. L. Glass. A structure-based critique of
contemporary computing research. Journal of Systems
and Software, 28(1):3–7, 1995.
[10] R. L. Glass, V. Ramesh, and I. Vessey. An analysis of
research in computing disciplines. Communications of
the ACM, 47(6):89–94, 2004.
[11] M. Goldweber, J. Impagliazzo, I. A. Bogoiavlenski,
A. G. Clear, G. Davies, H. Flack, J. P. Myers, and
R. Rasala. Historical perspectives on the computing
curriculum. SIGCUE Outlook, 25(4):94–111, 1997.
[12] E. G. Guba and Y. S. Lincoln. Competing paradigms
in qualitative research. In N. K. Denzin and Y. S.
Lincoln, editors, Handbook of Qualitative Research,
pages 105–117. SAGE, London, UK, 1994.
[13] J. Gustedt, E. Jeannot, and M. Quinson. Experimental
methodologies for large-scale systems: A survey.
Parallel Processing Letters, 19(3):399–418, 2009.
[14] C. G. Hempel. Aspects of Scientific Explanation And
Other Essays in the Philosophy of Science. The Free
Press, New York, NY, USA, 1965.
[15] P. Johannesson and E. Perjons. A design science
primer. Unpublished Manuscript, February 25 2012.
[16] C. Marshall and G. B. Rossman. Designing
Qualitative Research. Sage Publications, Thousand
Oaks, CA, USA, 4th edition, 2006.
[17] G. McKee. Computer science or simply ’computics’? The Open Channel. Computer, 28(12):136, 1995.
[18] J. H. F. Meyer and R. Land. Threshold Concepts and
Troublesome Knowledge: Linkages to Ways of
Thinking and Practicing within the Disciplines.
Number 4 in Occasional Reports. ETL Project,
Universities of Edinburgh, Coventry and Durham,
2003.
[19] J. H. F. Meyer and R. Land. Threshold concepts and
troublesome knowledge: An introduction. In J. H. F.
Meyer and R. Land, editors, Overcoming Barriers to
Student Understanding: Threshold Concepts and
Troublesome Knowledge, pages 3–18. Routledge,
London, UK, 2006.
[20] D. Perkins. Constructivism and troublesome
knowledge. In J. H. F. Meyer and R. Land, editors,
Overcoming Barriers to Student Understanding:
Threshold Concepts and Troublesome Knowledge,
pages 33–47. Routledge, London, UK, 2006.
[21] G. Pólya. How to Solve It. Penguin Books Ltd.,
London, UK, 2nd edition, 1957.
[22] K. Popper. The Logic of Scientific Discovery.
Routledge, London, UK, 1959.
[23] V. Ramesh, R. L. Glass, and I. Vessey. Research in
computer science: An empirical study. The Journal of
Systems and Software, 70(1–2):165–176, 2004.
[24] J. J. Randolph. Multidisciplinary Methods in
Educational Technology Research and Development.
HAMK University of Applied Sciences, Hämeenlinna, Finland, 2008.
[25] J. J. Randolph. A guide to writing the dissertation
literature review. Practical Assessment, Research &
Evaluation, 14(13):1–13, 2009.
[26] Swedish National Agency for Higher Education.
National qualifications framework. Technical Report
12-5202-10, Högskoleverket, May 2011.
[27] M. Tedre. Methodology education in computing:
Towards a congruent design approach. In Proceedings
of ACM Computer Science Education (SIGCSE) 2013
Conference, pages 159–164, Denver, CO, USA, March
6–9 2013. ACM.
[28] M. Tedre and N. Moisseinen. Experiments in
computing: A survey. The Scientific World Journal,
2014(# 549398):1–11, 2014.
[29] M. Tedre and J. Pajunen. An easy approach to
epistemology and ontology in computing theses. In
Proceedings of the 13th Koli Calling International
Conference on Computing Education Research, Koli
Calling ’13, pages 97–104, New York, NY, USA, 2013.
ACM.
[30] M. Tedre and E. Sutinen. Three traditions of
computing: What educators should know. Computer
Science Education, 18(3):153–170, 2008.
[31] I. Vessey, V. Ramesh, and R. L. Glass. Research in
information systems: An empirical study of diversity
in the discipline and its journals. Journal of
Management Information Systems, 19(2):129–174,
2002.
[32] G. H. von Wright. Explanation and Understanding.
Routledge & Kegan Paul, London, UK, 1971.
[33] M. V. Zelkowitz and D. R. Wallace. Experimental
validation in software engineering. Information and
Software Technology, 39(11):735–743, 1997.
[34] M. V. Zelkowitz and D. R. Wallace. Experimental
models for validating technology. Computer,
31(5):23–31, 1998.