This card highlights key concepts from the 2020 ASEE paper "Comparison of Entrepreneurial Mindset Course Learning Objectives: Evaluating Consistency and Clarity", presented in the Entrepreneurship & Engineering Innovation Division. It describes how three sets of learning objectives for entrepreneurial mindset (EM) content were evaluated against each other, and how and why you should develop learning objectives that can be used for direct assessment.

Motivation and Background
- No standard set of learning objectives exists for EM material beyond the KEEN Student Outcomes (KSO), which are not designed for the purpose of direct assessment
- Many institutions created their own frameworks of learning objectives for assessing EM content in courses
- A literature review, including a study by Brunhaver et al., identified that many universities were having difficulty assessing EM in their courses using the existing methodologies (student surveys, examinations, classroom observation, and review of written course content)
Compare three existing sets of EM learning objectives (Figure 1) on two criteria:
- Consistency: do they measure content similarly?
- Clarity: can they be used for direct assessment purposes?
|KEEN Student Outcomes (KSO); 18 total |
|Expanded set of outcomes developed by Ohio Northern University (eKSO) |
|EML Objectives developed by The Ohio State University (EMLO); three efficacy levels |

Figure 1

Intended Audience
The approach in this paper will benefit those looking to establish a common set of EM learning objectives that can be used to measure the EM content of their courses. A clear and measurable set of objectives could make the process of integrating entrepreneurship into engineering courses more accessible and consistent for a broad audience of engineering students.

Methodology
- Three courses used to compare the three frameworks
- Spanned an entire college career
- First-year engineering course
- Third-year technical course
- Senior capstone design course
- Courses were used to compare the frameworks in a relative way
- Each course was evaluated using the three frameworks
- 9 total datasets generated (Figure 2)
- Courses were evaluated based on only written content (syllabi, assignments, student work) similar to the materials required for an evaluation by ABET
- Researchers used a qualitative document analysis to assign numerical values to each course
- Numerical scores were grouped by each of the 6 C’s
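The aggregation step described above can be sketched in a few lines. This is a hypothetical illustration only: the objective names, category mapping, and scores below are invented for the example and are not the study's rubric or data.

```python
from collections import defaultdict

def score_by_category(objective_scores, objective_to_category):
    """Sum per-objective scores into per-category ("C") totals."""
    totals = defaultdict(int)
    for objective, score in objective_scores.items():
        totals[objective_to_category[objective]] += score
    return dict(totals)

# One dataset: a single course evaluated under a single framework.
# Mapping of each framework objective to its parent "C" category (invented).
mapping = {"obj1": "Curiosity", "obj2": "Curiosity", "obj3": "Character"}
# Scores assigned to the course for each objective via document analysis (invented).
scores = {"obj1": 2, "obj2": 0, "obj3": 3}

print(score_by_category(scores, mapping))
# {'Curiosity': 2, 'Character': 3}
```

Repeating this for each of the three courses under each of the three frameworks yields the nine datasets compared in the paper.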
Figure 2: the nine datasets produced by evaluating each of the three courses (first-year, 3rd-year technical, senior capstone) under each of the three frameworks
The numerical results for the three frameworks were then compared for each course; for example, datasets 1, 2, and 3 from Figure 2 were compared using a visual analysis.

Results

Consistency
The results of this investigation indicate that the three sets of learning objectives are measuring different things: there was no consistent visual pattern in the data when comparing the three frameworks for any given course. For example, the Senior Capstone course scored very high in the "Character" category under eKSO and EMLO, but very low under KSO. A different pattern emerged for the other two courses in the "Character" category, however, so no single framework consistently produced higher scores. This means that a course measured as having high EM content at one university may score very low in EM at another, simply because their learning objectives differ.
This is despite the fact that all three frameworks are based on the same basic outcomes, the 6 C's.

Clarity
The clarity of the frameworks was measured by their suitability for direct assessment, and many objectives were found not to be useful for this purpose. Although all of the objectives in each of the three sets represent important EM values and skills, they were not specific or measurable enough for direct assessment using the paper's chosen methodology. Examples of objectives that were difficult to use for this purpose follow.
- eKSO objective #50
- Researchers found it difficult to imagine a course in which students would not meet commitments (due dates, class attendance, and assessments)
- Defining a threshold would make this objective measurable
"demonstrate constant curiosity"
- KSO objective
- Needs a way to measure if students are constantly curious
- Difficult to use this objective for direct assessment purposes using our methodology
"listen to others to advance ideas and gather input"
- EMLO objective
- Difficult to measure with the course content in this study
- Recommend more specificity and a language change, e.g. "Continuously gather feedback from stakeholders"

Recommendations

Why make learning objectives that can be used for direct assessment?
- No additional work of generating a survey, exam, or independent observation of courses
- Less biased than student surveys
- Scalable and repeatable across courses and semesters

What is the next step for educators?
- Utilize and possibly modify existing EM learning objectives from various sources if the idea of direct assessment is useful to you
- Generate a condensed set of learning objectives from some of the existing sets and narrow them based on those that are important to stakeholders
This paper was presented in the Entrepreneurship & Engineering Innovation Division in "Session R224·ENT Division Technical Session: Assessment Tools and Practices".