
Original Research 


Aziz et al., 2018; 2(3):114–123.

International Journal of Medicine in Developing Countries

A qualitative evaluation of alignment between intended and emergent components in a graduate program through instructors

Muhammad Owais Aziz1, Muhammad A. Siddiqui2*

Correspondence to: Muhammad A. Siddiqui

*Department of Research, Saskatchewan Health Authority, 23rd Ave, WRC, Regina, SK, Canada.

Email: muhammad.siddiqui@saskhealthauthority.ca

Full list of author information is available at the end of the article.

Received: 29 June 2018 | Accepted: 02 September 2018


ABSTRACT

Objectives:

This program evaluation was undertaken to gauge the alignment between learning outcomes (LOs), activities, and assessments at the course level and, simultaneously, to identify the outcomes, activities, and assessments that were not specified in program documents but are nonetheless part of the program; these programmatic processes and outcomes are termed “emergent” in this evaluation.


Methods:

This evaluation was accomplished by thematic analysis of the collected data using a case study approach, in the following steps: first, identifying and involving the key stakeholders to enhance utility and credibility; second, engaging the participants, who were the instructors of the program; third, collecting data through program documents and participant interviews; and finally, analyzing the data in six steps using qualitative data analysis software.


Results:

The courses’ LOs were found to be well aligned with the activities and assessments, and the progressive nature of the program had resulted in emergent components.


Conclusions:

Consequently, the evaluation generated a summary report detailing the findings and recommendations, including that emergent components should be continuously evaluated to maintain alignment among the course components, and that formal documentation of emergent LOs, activities, and assessments can further indicate the achievements of the courses. Furthermore, this evaluation has made the program design explicit, which will provide the foundation for future monitoring and impact evaluations of this program.


Keywords:

Program evaluation, qualitative evaluation, course alignment, intended components, emergent components.

Introduction

Informal evaluation is a concept of daily life based on individual knowledge, likes, dislikes, and circumstances [1]. Formal evaluation, on the other hand, involves a scientific approach essential for improving processes and enhancing outcomes [1,3]. Additionally, to conduct a standard evaluation, the domains for investigation must be identified [3,4]. Owen and Rogers [1] have classified the components that can be evaluated into four domains for a comprehensive program evaluation: program policies, processes, products, and individuals [2]. However, in consensus with the program stakeholders, this evaluation was intended to evaluate the processes and products (outcomes) at the course level of the program.

Furthermore, outcome-based models for academic program evaluations may focus only on the intended outcome(s) of a program [2,5]. Concurrent with this, however, is the identification of non-intended outcomes, emphasizing the need to explore “what else is happening” (process) and “what else are the outcomes” in a program, alongside the intended or documented “happenings” and “outcomes”. Thus, another goal of this study was to identify undocumented (emergent) processes and outcomes.

Here, the processes of a program include activities and assessments. The evaluation further explored the developing program logic through documentation, interviews, and observation; it identifies new goals (emergent outcomes), describes the instructors’ perceptions of the program’s progress, and addresses their concerns [1,6,7].

The graduate program under evaluation commenced in 2012 as a part-time program to accommodate students who want to enhance their knowledge and skills while simultaneously pursuing their professional careers. It has been offered through synchronous and asynchronous delivery methods via distance learning and face-to-face sessions. During the development of the program, the course goals, activities, and assessments were aligned. However, owing to the nascence of the field, the complexity of the classroom, and advancements in technology, program and course outcomes and processes were gradually altered to accommodate the needs of students, thereby necessitating an evaluation to maintain alignment in the program.

Therefore, in June 2016, an evaluation of this graduate program resulted in a summary report addressing questions such as: What are the intended learning outcomes (ILOs), activities, and assessments? What are the undocumented characteristics of the courses, called emergent LOs (eLOs), activities, and assessments in this evaluation? And how are the intended and emergent LOs, activities, and assessments aligned? The report also encompassed suggestions for further enhancement of the program. It was anticipated that the findings of the evaluation would make explicit the program design and the consistency between intended and implemented program components. This documentation of program design will provide a basic framework on which future evaluations can be performed.

Material and Methods

Brief conceptualization of the theories of evaluation

The two concepts examined in this program evaluation are “course alignment” and “emergent outcomes”, which are briefly described in the following sections.

Theoretical overview of alignment

In a student-centered curriculum, course alignment addresses issues such as what a teacher is going to deliver, what a student will learn, how the activities should be designed, and how to evaluate students’ learning [8]. To design an aligned course, the course ILOs are documented so that a student understands what he/she is to learn; simultaneously, the instructor sets expectations about the standard of the student’s learning. An ILO comprises three essential components: a statement indicating expected learning, a specific verb, and a content area [9,10]. The verb, indicative of the level of learning, is based on the instructor’s preferences and the course requirements [10–12]. For example, “After completion of the lecture, students will be able to (statement) describe (verb) the position of the heart in the chest cavity (content).”
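To make this three-part structure concrete, the following minimal Python sketch models an ILO as a simple record and assembles the example sentence above; the class and field names are illustrative and not part of the program’s documentation.

from dataclasses import dataclass

@dataclass
class ILO:
    statement: str  # expected learning, e.g., "students will be able to"
    verb: str       # indicates the level of learning, e.g., "describe"
    content: str    # the content area being learned

    def render(self) -> str:
        # Assemble the full outcome sentence from its three components.
        return f"{self.statement} {self.verb} {self.content}."

example = ILO(
    statement="After completion of the lecture, students will be able to",
    verb="describe",
    content="the position of the heart in the chest cavity",
)
print(example.render())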

After drafting the course ILO(s), the teaching and learning activities of a course should be designed in alignment with the ILO(s). If an instructor selects a verb targeting a higher level of learning, such as hypothesize, create, or invent, then the activities must be designed accordingly. Multiple activities can be structured to achieve an ILO; for example, a lecture, structured reading, a practical experiment, memorization, or idea generation. In another scenario, the instructor may present a real-life problem to the students, who then reach a solution through research and group discussions. During the activity, continuous formative feedback plays an important role in the students’ learning and encourages them to become self-regulated learners [10,13–15].

In the final step, the course assessments are developed. These may be formative, summative, direct, or indirect, but they should be directly aligned with the ILOs and be indicative of the student’s learning [8–10]. An assessment can evaluate one or more ILOs, and a student can be assessed using multiple tools, such as multiple-choice questions, best-answer questions, short answers, and many more. According to Biggs and Tang [10], assessing a course activity (CA) is the best way to evaluate a student’s learning; they give the example of a case-based learning activity in which the instructors assess the intended outcomes on the basis of the procedure the students followed to solve the problem. Furthermore, an instructor needs to consider the relative importance of each ILO and weight each accordingly.

The application of these pedagogical principles of course alignment promotes deeper learning for the students and further ensures the dynamic, logical, and effective nature of a course [12].

Theoretical overview of emergent outcomes

Critics of an aligned curriculum argue that adherence to ILOs may suppress the emergent aspects of a course [16]. In student-centered learning, students are provided with an opportunity to learn based on their previous learning and experiences under a predesigned pedagogical structure. In this program, the eLOs and activities are integral to the courses and are indicative of the dynamic nature of the program. The instructors continuously modify their courses, and if an activity proves beneficial, they make it a component of their curriculum to better accommodate the needs of the students.

A classroom is a complex environment in which students from multiple backgrounds, experiences, and professions work together, resulting in variable LOs. The actual LOs differ from those intended because the interaction between a teacher and students varies for each student. A CA may only target and benefit a specific type of learner, certain topics may not stimulate the same excitement and motivation in all students, and the teaching and learning environment may not be conducive for all types of learners. These factors can influence the LOs of a student and require timely interventions and guidance from the instructor to keep the students focused on the ILO(s). Even after such interventions, emergent outcomes are to be expected and have become an integral component of a curriculum [17,18]. Students in this program are from diverse backgrounds with distinguished experiences and learning styles. The assumption that the same process will end with the same results is linear and simplistic; educational interventions are influenced by multiple factors, which make them unpredictable [2,12,19]. The unplanned LOs are called “eLOs” in this graduate program evaluation [20,21]. The eLOs can be separate from or embedded within the original goals and can be classified as desirable (positive) or undesirable (negative) (Figure 1) [17].

The process of this program evaluation was accomplished in several steps: (a) identifying and involving the key stakeholders, (b) involving the participants, (c) collecting the data, and (d) analyzing and disseminating the data.

Figure 1. Classification of emergent learning outcomes as desirable (positive) or undesirable (negative).

Stakeholders

Involvement of the key stakeholder(s) is vital for the utility and credibility of an evaluation, as they can directly influence the functions of a program [22,23]. In general, anyone who is affected by a program can be considered a stakeholder, including faculty, students, administrative and support staff, managers, directors, and policymakers. In identifying the key stakeholders, the overall stakeholders can be categorized into two broad groups: (a) those directly affected and (b) those indirectly affected by the program. Group 1 can be further classified into four clusters: program decision makers, administrators, financiers, and clients [24]. Because those who occupy group 2 are only indirectly affected, an evaluation cannot address the needs of these stakeholders. Within group 1, the stakeholders who can influence the processes and outcomes of a program are the key stakeholders. It is, therefore, of paramount importance to identify the key stakeholders to ensure the utility and credibility of an evaluation [22,23]. The key stakeholders for this evaluation were identified as the Assistant Professor and Associate Director of this program. These individuals remained actively involved in the development of the program and were a vital part of the faculty. Their influence and involvement in the program administration have enhanced the utility of this evaluation [22,23].

Following discussion with the key stakeholders, the goals and process of this evaluation were finalized. It was anticipated that the results would assist the stakeholders in further enhancing the alignment between outcomes, activities, and assessments in the courses and the program, and that the evaluation would enhance the quality of this well-respected and innovative program. After the stakeholder secured ethical approval from the Health Science Research Ethics Board, the evaluators designed the evaluation plan and submitted it to the key stakeholders for approval.

Study participants

When the plan for the program evaluation was approved by the key stakeholders, the faculty was introduced to the evaluation through a presentation, which described the importance of the evaluation and encouraged their participation. During the presentation, the evaluators were introduced, the model of the evaluation was described, and ethical issues were addressed. In addition, the faculty was provided an opportunity to ask questions and express opinions about the evaluation. Those faculty members who did not attend the presentation were emailed a video, which introduced them to the program evaluation and included the email addresses of the evaluating team and the ethics board, so that queries could be appropriately addressed.

Data collection

The data collected for this study included administrative documents, published articles related to the program, the 2015 course syllabi, and interviews with the course instructors. Those who volunteered to participate are referred to as participant(s) throughout this evaluation. After correspondence with the participants, the interviews were conducted during June and July of 2016 by one evaluator. These interviews comprised semi-structured, open-ended questions that provided an opportunity for the interviewer to manage the conversation and explore areas of interest in more detail [25,26]. The interviews focused on three themes: ILOs, activities, and assessments in the courses. The same protocol was followed for each interview [26]. A total of 24 interviews were recorded, with an average duration of 27 minutes, and were then transcribed by a professional transcription company.

Data analysis

Thematic analysis of the collected data was performed using a case study approach, following six steps [27,28].

Step 1: One hundred and fifty-six codes were manually generated from the written documents and segregated into three categories: ILOs, CAs, and course assessment techniques (CATs) (Table 1).

Step 2: The interviews were coded using closed coding into three categories (LOs, activities, and assessments) with qualitative data analysis software (ATLAS.ti). These codes were further classified as emergent or aligned after comparison with the ILO, CA, and CAT codes from Step 1; a minimal sketch of this classification follows the step list. The aligned codes were placed in the groups of aligned LOs (aLOs), aligned CAs (aCAs), and aligned CATs (aCATs). The emergent codes were categorized as eLOs, emergent CAs (eCAs), and emergent CATs (eCATs). A total of 175 codes were distributed into categories, as illustrated in Table 2.

Step 3: Codes of the ILOs from Step 1 were compared with the codes of the aLOs of Step 2 and the matching codes of ILOs and aLOs were combined into the ILO themes [27]. Likewise, the CA codes were compared with the aCA codes, and the CAT codes with the aCAT codes (Figure 2).

Step 4: All themes were numerically identified. These identifiers represent the course number and the sequence in which the themes and codes were documented in the evaluation; for example, theme ILO 21 refers to the first ILO documented for course 2.

Step 5: Alignment was evaluated between the themes from Step 4. All ILO and eLO themes were compared with the CA, eCA, CAT, and eCAT themes to understand the relationships between them.

Step 6: The findings and results of the analysis were documented after repeating the same six-step process for each course.
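As a minimal sketch of the classification in Steps 2–4, assuming the codes have already been extracted from the 2015 syllabi (documented) and the interview transcripts, the following Python fragment splits interview codes into aligned and emergent sets and builds the Step 4 identifiers; the function names and example codes are illustrative, not drawn from the actual ATLAS.ti workflow.

def classify_codes(documented: set, interview: set):
    # Codes matching a documented ILO/CA/CAT code are "aligned";
    # the remainder are undocumented, hence "emergent".
    aligned = interview & documented
    emergent = interview - documented
    return aligned, emergent

def theme_id(prefix: str, course: int, sequence: int) -> str:
    # Step 4 identifier: e.g., theme_id("ILO", 2, 1) -> "ILO 21",
    # the first ILO documented for course 2.
    return f"{prefix} {course}{sequence}"

# Illustrative inputs only; the real codes came from documents and transcripts.
documented_los = {"understand healthcare quality", "orientation to program resources"}
interview_los = {"understand healthcare quality", "work in interprofessional groups"}

aligned, emergent = classify_codes(documented_los, interview_los)
print("aLOs:", aligned)       # {'understand healthcare quality'}
print("eLOs:", emergent)      # {'work in interprofessional groups'}
print(theme_id("ILO", 2, 1))  # ILO 21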

The respective results for each course were initially shared with the course directors, who agreed with the findings and appreciated the effort to document the emergent components of their respective courses. Finally, these findings were shared with the key stakeholders through a presentation and a written summary report.

Table 1. Distribution of codes.

Course number ILO CA CAT
1 14 5 4
2 9 12 4
3 7 12 5
4 5 1 2
5 9 15 7
6 8 6 3
7 4 9 0
8 4 8 3
Total 60 68 28

Table 2. Codes from step 2.

Code category aLOs aCAs aCATs eLOs eCAs eCATs
Number of codes 43 60 9 41 21 1

Table 3. Course alignment.

LOs     CAs                                            CATs
ILO 01  CA 01, 02, 03, 06, 07, 08; eCA 01, 03, 06, 07  CAT 02, 03, 04
ILO 02  CA 05; eCA 06                                  CAT 01, 02
ILO 03  CA 05; eCA 05                                  CAT 03, 04
eLO 01  CA 01, 02, 03, 06, 07, 08; eCA 01, 03, 06, 07  CAT 02, 03, 04
eLO 02  CA 07; eCA 01                                  None
eLO 03  CA 05, 06, 08                                  CAT 02, 03, 04
eLO 04  CA 03                                          None

Figure 2. Steps of data analysis.

Results

This graduate program is a part-time blended program consisting of eight compulsory courses and one elective course. Random code numbers are used throughout this paper, and only the amended results of one course are documented, to preserve the identity of the courses and of the individuals associated with the courses and the program. In the program, five courses are strictly online, whereas three courses are delivered face-to-face in the classroom. Of the face-to-face courses, two are offered in the home country while an elective course is offered abroad. Additionally, the students submit a research thesis as a requirement of the program.

The evaluated course is an introductory 1-week intensive on-campus course designed to orient students to the program (ILO 02). The students develop a basic understanding of healthcare quality (ILO 01). This course also provides an opportunity for the students to become familiar with the program’s teaching faculty, staff, and their classmates (eCA 05). Prior to starting the course, the students submit a paper (eCA 04) outlining their understanding of healthcare quality. During the in-class sessions, the students give a brief 5-minute presentation worth 30% of their course mark (CAT 01), in which they highlight their rationale for joining the program (CA 02). The students provide peer feedback, worth another 30% of their course mark, on the presentations, focusing on new ideas, suggestions, and scientific reviews (CAT 02). The instructors ensure all students can post to the learning management system (ILO 02). The peer feedback is based on a literature review (CA 06), which provides the students a deeper insight into the theories of healthcare quality (ILO 01).

The students systematically use tools to measure healthcare quality standards. The students are also informed about organizational culture, economic cost, and the impact of technology (ILO 03) in modern healthcare through structured readings (CA 01) and lectures by faculty (eCA 06) and guest speakers (eCA 07). The instructors also conduct interactive sessions (eCA 01) and provide the students with an opportunity to apply the concepts of healthcare quality (eLO 01). Additionally, a formal debate session is organized (CA 03), in which the students learn how to communicate their point of view effectively (eLO 04). During these sessions, the students are divided into interprofessional groups (eLO 02) so that they can learn from each other’s experience.

Moreover, the students are also taken on tours of the library (CA 05) and hospitals (CA 04). The hospital tours provide students with a medical background an opportunity to observe healthcare setups in different hospitals, and through the library tour, the students familiarize themselves with a resource (eLO 03) that they utilize throughout the program for course and research purposes. Following the tours, the students discuss their learning experience and write an annotated bibliography (CA 08). This activity includes a minimum of five articles on the topics of the group presentation (CA 07) and accounts for 20% of their course mark (CAT 03). For the final assessment, the students are divided into interprofessional groups (eLO 02), and they prepare and deliver a 15-minute presentation on the topic of their choice for 20% of their mark (CAT 04).

What is happening in the course?

The following is a list of the ILOs, activities, and assessments which were documented.

Intended Learning Outcomes (ILOs):

ILO 01: Understand the basic concept of quality in healthcare

ILO 02: Orientation to the resources of the program (software and library)

ILO 03: Orientation with the history of quality in healthcare

Course Activities (CA):

CA 01: Structured reading

CA 02: Students presentations

CA 03: Debate sessions

CA 04: Tour of the hospital

CA 05: Tour of the library

CA 06: Literature review

CA 07: Group presentations

CA 08: Annotated bibliography

Course Assessment Techniques (CATs):

CAT 01: 5-minute student presentation

CAT 02: Posting to online learning management system

CAT 03: Annotated bibliography

CAT 04: 15-minute group presentation

What else is happening in the course?

The following is a list of the eLOs, activities, and assessments, i.e., the undocumented characteristics of the program.

Emergent Learning Outcomes (eLOs):

eLO 01: Apply, analyze and compare the concepts of healthcare quality

eLO 02: Learn to work in interprofessional groups

eLO 03: Learn to conduct basic research

eLO 04: Learn argumentation skills

Emergent Course Activities (eCA):

eCA 01: Interactive sessions

eCA 02: Assignment with the learning activities

eCA 03: Hands-on activities

eCA 04: Writing a paper (prior to arrival at course location) (not evaluated)

eCA 05: Meet and greet, students and the faculty

eCA 06: Faculty presentations

eCA 07: Guest speaker

Emergent Course Assessment Techniques (eCATs):

none

Alignment between learning outcomes, course activities, and assessment

Comparison between the intended and emergent LOs, CAs, and assessments yielded the following results. Table 3 indicates which LOs are aligned with which activities and assessments.
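As a hedged illustration of how such a mapping can be audited programmatically, the following Python sketch encodes Table 3 as a dictionary and flags outcomes with no associated assessment technique (in this course, eLO 02 and eLO 04 appear in Table 3 with no CAT); this representation is ours, not part of the evaluation’s toolkit.

# Table 3 encoded as {outcome: (activities, assessments)}; structure is illustrative.
alignment = {
    "ILO 01": (["CA 01", "CA 02", "CA 03", "CA 06", "CA 07", "CA 08",
                "eCA 01", "eCA 03", "eCA 06", "eCA 07"],
               ["CAT 02", "CAT 03", "CAT 04"]),
    "ILO 02": (["CA 05", "eCA 06"], ["CAT 01", "CAT 02"]),
    "ILO 03": (["CA 05", "eCA 05"], ["CAT 03", "CAT 04"]),
    "eLO 01": (["CA 01", "CA 02", "CA 03", "CA 06", "CA 07", "CA 08",
                "eCA 01", "eCA 03", "eCA 06", "eCA 07"],
               ["CAT 02", "CAT 03", "CAT 04"]),
    "eLO 02": (["CA 07", "eCA 01"], []),
    "eLO 03": (["CA 05", "CA 06", "CA 08"], ["CAT 02", "CAT 03", "CAT 04"]),
    "eLO 04": (["CA 03"], []),
}

# Flag any outcome that is taught through activities but never assessed.
for outcome, (activities, assessments) in alignment.items():
    if not assessments:
        print(f"{outcome}: {len(activities)} activities, no assessment technique")
# -> eLO 02 and eLO 04 are taught but not formally assessed.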

Discussion

Historically, in an evaluation, a systematic assessment is conducted of the processes and outcomes of a program. An academic program evaluation provides valuable information to the stakeholders about the achievements of the program and suggests areas for improvement. The intent of this form of evaluation is to further develop and improve program quality [29–31].

A program evaluation can be performed to assess outcomes utilizing Kirkpatrick’s four-level model [31]. This model identifies four criteria for a program evaluation: reaction, learning, behavior, and results. The reaction criterion is the perception students have of the program. The learning criterion is the measurement of learning through assessments and through the demonstration of skills learned throughout the program. The behavior criterion is measured by observing learner performance during the process, while the results criterion is based on the end result of the program, observed through the benefits to the individual, institute, and society [29,30]. Most outcome-based models for academic program evaluations focus only on the intended outcome(s) of a program [2,5]. However, concurrent with this is the identification of non-intended outcomes, emphasizing the need to explore “what else is happening” and “what else are the outcomes” in a program, alongside the intended or documented happenings and outcomes. Thus, one of the primary goals of this evaluation was to identify both intended and unintended (emergent) processes and outcomes.

In June 2016, an evaluation of this graduate program was initiated to develop a better understanding of the alignment between LOs, assessments, and instructions at the course level. In addition, the evaluation identified programmatic functions and qualities that were not specified in the documentation but were identified through an interview narrative. These functions and qualities are referred to as “emergent” in this evaluation report.

Subsequently, a summary report built on this inquiry-based information was submitted to the program stakeholders with the intention of assisting in the further optimization of programmatic functions. The report finds that the LOs of the courses are well aligned with the activities and assessments, and recommends that formal documentation of the emergent components in the curriculum would further indicate the achievements of the students. The findings of this evaluation will assist the program’s stakeholders in generating, modifying, and implementing policies and processes that will amplify the quality of this program. Additionally, the analysis of the program’s logic and design makes the program more plausible to policymakers. Moreover, building on this evaluation, monitoring and impact evaluations can be performed to track the progress and identify the impact of the program, respectively.

Emergent characteristics of the program

The courses in this program are developed based on the students’ needs and are planned to modify the skills, behaviors, and knowledge of the students [32,33]. Developing the courses involves more than organizing content; it also requires planning the sequence(s) of activities to facilitate the students’ learning. In addition, the content, activities, and assessments of a course were modified according to the needs of the students, which makes the courses dynamic entities [33]. Generally, the combination of the instructors’ willingness to allow for an emergent curriculum, the nascence of the healthcare quality field, and evolving concepts, perspectives, and practices has led to a very fluid curriculum in this program. Additionally, technological advances, such as changes in electronic medical records, have catalyzed the speed and extent to which theoretical advances can be implemented.

Considering the importance of emergent characteristics in this program, the program undergoes an annual internal review complemented by an external evaluation every 4 years to accommodate the pace of change and monitor programmatic progress. The instructors incorporate the positive emergent LOs as components of their courses, and the negative emergent LOs are addressed proactively. The instructors should be encouraged to continue to innovate, as constant change is an integral part of a modern curriculum. The program implements a process to manage the emergent components of its courses. Increased use of online learning resources can free up valuable faculty time [18], thereby providing an opportunity to manage the emergent components more effectively.

Assessments

A good quality assessment enhances students’ learning. The qualities of a good assessment include coherence with the ILO(s), consistency, equivalency, feasibility, educational and catalytic effects, and acceptability to the stakeholders [34,35]. Coherence, or validity, ensures the assessment is related to the ILO being assessed; consistency means that under a similar situation the results will remain the same; equivalency means the same assessment will bear similar results if administered at a different location; feasibility ensures the practical and realistic approach of an assessment; educational effects motivate the students toward academic gain; catalytic effects promote future learning; and acceptability ensures the credibility of the assessment [34].

Although direct assessments based on standard rubrics are a major component of this program, the instructors also employ indirect assessments to collect peer feedback and promote group learning. Simultaneously, summative assessments remain an important component of all the courses. Additionally, the instructors ask questions and solve problems with the students and continuously provide feedback, which reflects formative assessment. The instructors in the program reserve formal documentation for the direct, indirect, and summative assessments but utilize multiple formative assessments within their curriculum without formal documentation.

Conclusion

This graduate program evaluation identified key elements that will amplify the quality of this well-respected and innovative program. The progressive nature of the program has resulted in emergent outcomes and processes, which need to be continuously monitored and adjusted to preserve alignment between the program components. Additionally, the evaluation lays the foundation on which the program can be monitored and impact evaluations can be performed [36]. Eventually, the program’s stakeholders will utilize this information for further advancement of the program’s quality.

Future research/limitations

Finally, to extend this evaluation, the involvement of students would provide a more complete and comprehensive picture of the program’s outcomes and processes. This evaluation took only the perspective of the instructors, who might believe that an outcome is being achieved when, in reality, the students are not reaching those milestones.

Besides identifying what is happening and what else is happening in the program, an evaluator can also assess what was intended but is not happening. Identifying what is not happening, or which goals are not being achieved, can provide the stakeholders an opportunity to reevaluate the significance of an intended process or outcome and modify the curriculum accordingly, and will portray a complete picture of the program’s achievements.

Acknowledgment

The authors are thankful to Dr. Leslie W. MacKenzie and Dr. Rylan Egan who monitored and facilitated this evaluation to keep it unbiased.


List of Abbreviations

aLOs Aligned learning outcomes
aCAs Aligned course activities
aCATs Aligned course assessment techniques
CAs Course activities
CATs Course assessment techniques
eLOs Emergent learning outcomes
eCAs Emergent course activities
eCATs Emergent course assessment techniques
ILOs Intended learning outcome(s)
LOs Learning outcome(s)

Funding

None.


Declaration of conflicting interests

None.


Author details

Muhammad Owais Aziz1, Muhammad A Siddiqui2

  1. St Lawrence College, 2 St. Lawrence Drive, Cornwall, ON, Canada
  2. Department of Research, Saskatchewan Health Authority, Regina, SK, Canada

References

  1. Owen JM, Rogers P. Program evaluation: forms and approaches. 3rd ed. Sage, London; 2010.
  2. Haji F, Morin MP, Parker K. Rethinking programme evaluation in health professions education: beyond ‘did it work?’ Med Educ 2013; 47(4):342–51. Epub 2013/03/16. https://doi.org/10.1111/medu.12091
  3. Rossi PH, Lipsey MW, Freeman HE. Evaluation: a systematic approach [Online]. Sage; 2003 [cited 2016 Mar 18]. Available from: https://books.google.ca/books?hl=en&lr=&id=QF9WBAAAQBAJ&oi=fnd&pg=PR9&dq=Rossi,+P.+H.,+Lipsey,+M.+W.,+%26+Freeman,+H.+E.+(2003).+Evaluation:+A+systematic+approach.+Sage+publications&ots=9zsa0EbKG_&sig=w5BnLUGOSNCq6wsz6Nw2M7XkSG0
  4. Yarbrough DB, Shulha LM, Hopson RK, Caruthers F. The program evaluation standards: a guide for evaluators and evaluation users [Online]. Sage; 2010 [cited 2016 Jan 21]. Available from: https://books.google.ca/books?hl=en&lr=&id=81kXBAAAQBAJ&oi=fnd&pg=PR1&dq=The+program+evaluation+standards:+A+guide+for+evaluators+and+evaluation+users.+Sage+Publications.&ots=8BvRr8LFK1&sig=sUhZoUAYuaURrD5UHk5FlhhJYFA
  5. Scriven M. Prose and cons about goal-free evaluation. Eval Pract 1991; 12(1):55–63.
  6. Cooksy LJ. Program evaluation: forms and approaches (3rd ed.), by John M. Owen. New York: Guilford Press, 2006. Am J Eval [Online]. 2008; 29(1):108–12. [cited 2016 Mar 17]. Available from: http://aje.sagepub.com/content/early/2008/01/08/1098214007313387
  7. Rutman L, Wholey JS. Planning useful evaluations: evaluability assessment [Online]. Sage, Beverly Hills, CA, 1980 [cited 2016 Mar 17].
  8. Biggs J. Enhancing teaching through constructive alignment. High Educ 1996; 32(3):347–64.
  9. Ascough RS. Learning (about) outcomes: how the focus on assessment can help overall course design. Can J High Educ 2011; 41(2):44.
  10. Biggs J, Tang CY. Applying constructive alignment to outcomes-based teaching and learning. In: Training Material for “Quality Teaching for Learning in Higher Education” Workshop for Master Trainers, Ministry of Higher Education, Kuala Lumpur [Online]. 2010, pp 23–25 [cited 2016 Feb 15]. Available from: https://intranet.tudelft.nl/fileadmin/Files/medewerkersportal/TBM/Onderwijsdag_2014/What-is-ConstructiveAlignment.pdf
  11. Biggs JB, Collis KF. Evaluating the quality of learning: the SOLO taxonomy (Structure of the Observed Learning Outcome) [Online]. Academic Press; 2014 [cited 2016 Jan 19]. Available from: https://books.google.ca/books?hl=en&lr=&id=xUO0BQAAQBAJ&oi=fnd&pg=PP1&dq=Evaluating+the+quality+of+learning:+The+SOLO+taxonomy+(Structure+of+the+Observed+Learning+Outcome).+&ots=aooudYPtKe&sig=pBDFPCgPst-ULcIsI9tOkhqhbzI
  12. Wang X, Su Y, Cheung S, Wong E, Kwong T. An exploration of Biggs’ constructive alignment in course design and its impact on students’ learning approaches. Assess Eval High Educ 2013; 38(4):477–91.
  13. Biggs JB. Teaching for quality learning at university: what the student does [Online]. McGraw-Hill Education (UK); 2011 [cited 2016 Feb 15]. Available from: https://books.google.ca/books?hl=en&lr=&id=VC1FBgAAQBAJ&oi=fnd&pg=PP1&dq=Biggs,+J.+B.+(2011).+Teaching+for+quality+learning+at+university:+What+the+student+does.+McGraw-Hill+Education+(UK).&ots=E6AOpF9GLn&sig=Iu95gVx-RED1iO45B3Ov-IGKjdQ
  14. Lilly J, Richter UM, Rivera-Macias B. Using feedback to promote learning: student and tutor perspectives. Pract Res High Educ 2010; 4(1):30–40.
  15. Nicol DJ, Macfarlane-Dick D. Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Stud High Educ 2006; 31(2):199–218.
  16. Scott I. The learning outcome in higher education: time to think again? Worcest J Learn Teach 2011 [Online] [cited 2016 Mar 18]. Available from: http://eprints.worc.ac.uk/1241/
  17. Hussey T, Smith P. The uses of learning outcomes. Teach High Educ 2003; 8(3):357–68.
  18. Williams R, Karousou R, Mackness J. Emergent learning and learning ecologies in web 2.0. Int Rev Res Open Distrib Learn 2011; 12(3):39–59.
  19. Regehr G. It’s NOT rocket science: rethinking our metaphors for research in health professions education: it’s NOT rocket science. Med Educ 2010; 44(1):31–9.
  20. Megginson D. Planned and emergent learning: a framework and a method. Exec Dev 1994; 7(6):29–32.
  21. Megginson D. Planned and emergent learning consequences for development. Manag Learn 1996; 27(4):411–28.
  22. Patton MQ. Essentials of utilization-focused evaluation [Online]. Sage; 2011 [cited 2016 Jan 21]. Available from: https://books.google.ca/books?hl=en&lr=&id=BaMgAQAAQBAJ&oi=fnd&pg=PR1&dq=).+Essentials+of+utilization-focused+evaluation.+&ots=LmhfEmrNoK&sig=LnjP9duRwlwBmsNTWwFDL4SCLpM
  23. Sweet L. Essentials of utilization-focused evaluation [Book Review]. Eval J Australas 2015; 15(1):48.
  24. Mathison S. Encyclopedia of evaluation [Online]. Sage; 2004 [cited 2016 Mar 18]. Available from: https://books.google.ca/books?hl=en&lr=&id=iyV1AwAAQBAJ&oi=fnd&pg=PP1&dq=Encyclopedia+of+Evaluation.&ots=IJYHa6iPl3&sig=r34-xRC0dDBMMtB4JkU9uuGaE3Q
  25. Dicicco-Bloom B, Crabtree BF. The qualitative research interview. Med Educ 2006; 40(4):314–21. Epub 2006/04/01. https://doi.org/10.1111/j.1365-2929.2006.02418.x. PubMed PMID: 16573666
  26. Jacob S, Furgerson S. Writing interview protocols and conducting interviews: tips for students new to the field of qualitative research. Qual Rep 2012; 17(42):1–10.
  27. Creswell JW. Qualitative inquiry and research design: choosing among five approaches [Online]. Sage; 2012 [cited 2016 Jan 21]. Available from: https://books.google.ca/books?hl=en&lr=&id=Ykruxor10cYC&oi=fnd&pg=PR1&dq=Qualitative+inquiry+and+research+design:+Choosing+amoung+five+approaches.+Sage.&ots=4bm3nJk5Bp&sig=3YOBxviCZez1DZJMlbHsC58iTHc
  28. McMillan JH, Schumacher S. Research in education: evidence-based inquiry. Pearson Higher Ed, Boston, MA; 2014.
  29. Kirkpatrick DL. Techniques for evaluating training. Train Dev J 1979; 33(6):78–92.
  30. Kirkpatrick DL. Techniques for evaluating training programs. Class Writ Instr Technol 1996; 1(192):119.
  31. Praslova L. Adaptation of Kirkpatrick’s four level model of training criteria to assessment of learning outcomes and program evaluation in higher education. Educ Assess Eval Account 2010; 22(3):215–25.
  32. Cullen R, Harris M, Hill RR. The learner-centered curriculum: design and implementation. John Wiley & Sons, San Francisco, CA; 2012. p 272.
  33. Lattuca LR. Shaping the college curriculum: Academic plans in context [Online]. John Wiley & Sons, San Francisco, CA; 2011 [cited 2016 Jan 19]. Available from: https://books.google.ca/books?hl=en&lr=&id=vFYTp9cze2kC&oi=fnd&pg=PR13&dq=Shaping+the+college+curriculum:+Academic+plans+in+context.&ots=Zrw-lonqwZ&sig=jwYNlQlqWJVa1aPSF57DmtqB_0w
  34. Norcini J, Anderson B, Bollela V, Burch V, Costa MJ, Duvivier R, et al. Criteria for good assessment: consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach 2011; 33(3):206–14. Epub 2011/02/25. https://doi.org/10.3109/0142159X.2011.551559. PubMed PMID: 21345060
  35. Van Der Vleuten CP. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ Theory Pract 1996; 1(1):41–67. Epub 1996/01/01. https://doi.org/10.1007/bf00596229. PubMed PMID: 24178994
  36. Nielsen SB, Hunter DEK. Challenges to and forms of complementarity between performance management and evaluation. New Dir Eval 2013; 2013(137):115–23.

