Instructors and students try a new approach to teaching and course evaluation

Wednesday, October 16th, 2013

Teaching and Course Evaluation Project

Last year the Vice-President, Academic, established a Teaching and Course Evaluation Project (TCEP) under the direction of Corinne Pitre-Hayes. The project team was charged with the task of recommending a replacement for the 30-year-old TCE instrument and evaluation process used by many SFU academic units.

This summer, following consultation with faculty members and a review of the extensive research literature on TCE, the project team conducted a proof of concept (PoC) to test an evaluation model incorporating some of its key findings. The PoC involved 14 faculty members and more than 1,300 students in 18 courses (including seminars, lecture courses and distance education courses). It differed from the approach commonly used at SFU in three important ways:

  1. The PoC used an online survey platform rather than paper questionnaires. Students received an email containing a survey link roughly two weeks before the end of their course(s), followed by several reminders. The survey was open for approximately two weeks.
  2. The PoC employed a flexible, customized approach to generate questions tailored to each course. Questions were selected at four levels: institutional (eight questions common to all course evaluations); Faculty (up to four questions common to all evaluations within a Faculty); departmental (up to four questions common to all evaluations within a department or school); and instructor (up to four questions determined by the instructor for an individual course). To ensure consistency, questions that used a rating scale were drawn from a question bank developed by the University of Toronto. (A schematic sketch of this tiered structure follows the list.)
  3. The PoC provided results in the form of tailored online or downloadable reports for specific audiences, including instructors, students and administrators. Responses to instructor questions were seen only by the instructor, who could optionally share them with students. The online data collection system allowed for the inclusion of some demographic information (for example, gender, age ranges and grade averages of respondents), but the team was very careful to preserve student anonymity.
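
The following is a minimal, purely illustrative sketch (in Python) of how a course evaluation's question set might be assembled from such a tiered question bank, with the per-level limits described above. The level names, data layout, and function are assumptions made for illustration only; they are not drawn from the PoC's actual survey platform.

    # Hypothetical sketch: assemble one course's evaluation questions from the
    # four levels described above, enforcing the stated per-level limits.
    LEVEL_LIMITS = {
        "institutional": 8,  # fixed set common to all course evaluations
        "faculty": 4,        # up to four per Faculty
        "departmental": 4,   # up to four per department or school
        "instructor": 4,     # up to four chosen by the instructor
    }

    def assemble_questions(selections):
        """Concatenate questions level by level, enforcing each level's cap."""
        questions = []
        for level, limit in LEVEL_LIMITS.items():
            chosen = selections.get(level, [])
            if len(chosen) > limit:
                raise ValueError(f"{level} allows at most {limit} questions")
            questions.extend(chosen)
        return questions

    # Example: a hypothetical course drawing on three of the four levels.
    survey = assemble_questions({
        "institutional": [f"Core question {i}" for i in range(1, 9)],
        "faculty": ["The course developed my analytical skills."],
        "instructor": ["Was the weekly lab useful?"],
    })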

The student evaluation period ended in early August, and reports were distributed in early October. Among the initial observations:

  • The survey completion rate for all courses, including distance education courses, was 72 percent; for lecture courses and seminars, the completion rate was 82 percent, a result that Pitre-Hayes says is well above typical response rates for online evaluations. One reason for the strong response may have been active promotion of the evaluation process by participating instructors.
  • The PoC collected more data than SFU’s current system and made results available more quickly and in a more contextualized way.

Pitre-Hayes is cautious about drawing detailed conclusions until she completes a review of the PoC and evaluation results with faculty participants in late October. However, she does express satisfaction with the flexibility the PoC model offers for both creation of questions and reporting of results.

The project team hopes to prepare a final report in time for the December Senate meeting.

Five more years of “accomplished and principled leadership”

Friday, February 1st, 2013

President Andrew Petter announced this morning (February 1, 2013) that Jon Driver will be reappointed for a second five-year term as Vice-President, Academic, and Provost, beginning on September 1, 2013. Driver has been a strong proponent of teaching and learning initiatives at SFU, including the WebCT Replacement Project, the Teaching and Course Evaluation Project, and the Learning Outcomes and Assessment Project. Here is the President’s full announcement:

To the Campus Community:

I am pleased to report that, following the unanimous advice of the Search Committee, the Board of Governors has approved my recommendation to reappoint Dr. Jonathan Driver as Vice-President Academic and Provost for a second five-year term commencing September 1, 2013.

Dr. Driver’s tenure as VP Academic has been characterized by accomplished and principled leadership and a collegial and collaborative approach. He has fostered a vibrant and inclusive culture to support teaching, learning and research at SFU, and has overseen the implementation of important new initiatives. He has also encouraged a deeper sense of institutional commitment and engagement amongst faculty, staff and students, and ably represents SFU in the external community.

Dr. Driver is an exceptional academic leader who has the necessary skills and attributes to position SFU for continued success in the future. I look forward to working with him in the years ahead.

I am grateful to the members of the Search Committee and to all members of the community who assisted the committee in its deliberations.

Jon Driver talks about the next five years

Thursday, January 3rd, 2013

Jon Driver, VP Academic

At the end of November, Jon Driver, Vice-President, Academic, sat down for a conversation about his academic vision for the next five years. He emphasized the importance he attaches to teaching within the university culture, addressed the subject of learning outcomes, and talked about some key priorities in the next Academic Plan. Here are excerpts from his comments.

What is the relationship between teaching and research?

There’s a real connection between teaching and research, a two-way connection. First of all, when people are active in research, that makes their teaching more interesting. But also, we can take our research methods and apply them to try and understand our teaching.

How would you compare the place of teaching vis-à-vis research at SFU?

We value teaching and research equally. We try to evaluate faculty members equally based on their teaching and their research. And so it would be nice if we had a culture in which people talked about “What have I done new in teaching” as well as “What have I done new in research.”

What can be done to make good teaching a priority?

On research we’re very good at being able to say, “Your performance is not as good as it should be”, or “your performance is satisfactory”, or “your performance is really good” … On teaching, we tend to say either “Your performance is not as good as it could be” or “It’s satisfactory.” And we stop at satisfactory … In fact, I just reviewed all of the guidelines that every department in the university has for how they evaluate their colleagues during salary review and during tenure and promotion, and what I found was that most – not all, but most – of the departments talk about satisfactory teaching, and then they talk about what you should be doing if your teaching isn’t satisfactory, but they don’t talk about how they’re going to measure outstanding teaching and how they’re going to reward outstanding teaching. And so, having reviewed all of these documents, I’m going to go back to the departments and make some suggestions about what methods they could use to identify the outstanding teachers and reward them through the salary review process or through the tenure and promotion process.

Are there other obstacles to good teaching?

The issue that would be raised by many people is the time issue. If I don’t get rewarded for being a really good teacher and I’ve got a lot of pressure to do research, why would I focus on teaching and learning? One of my answers to that is [that] I wouldn’t expect people to be doing this continually, but … that maybe every few years they would … spend a semester working primarily on some changes in their teaching or getting some new skills around teaching.

You’ve raised the issue of learning outcomes. Why do you think they are important?

I think one of the most important reasons for stating [learning] outcomes and trying to assess them is to communicate to students. Students want to know what the purpose of the course is. They want to know, “How does this course that I’m about to take fit into my overall major?” … The second thing relating to students is that you can explain to them how the evaluation that you’re using relates back to the outcomes … I think for students it’s really important that they get that sense of why they’re in a course and why they are doing the things that they’re being asked to do … The other component of learning outcomes is partly about ensuring that we’re getting the results that we think we’re getting. And one of the ways to do that is to define what you want students to get out of your course and then to try to assess that.

Some people see this process as a threat to academic freedom.

My attitude is [that learning outcomes are] up to the department. I don’t want to tell people what to do. We’re still in the process [of considering learning outcomes] here, but I think one of the outcomes of this project ought to be that assessing how well you are doing should not be a function of my office. It should be your own colleagues who [do that] in the context of external review.

But there do seem to be concerns about a loss of control.

The other problem is that we do have some external bodies that accredit our programs. So, for example, the Canadian Engineering Accreditation Board accredits our Engineering programs, and they have very specific requirements around learning outcomes that are much more narrowly defined than what I would expect a department to do, and there’s a tendency for what happens in Engineering to be cited as though this is what it’s going to be like for everybody.

In May you released an Academic Plan for 2013–2018. How would you compare that plan to the previous one?

I’ve tried to put less detail into the upcoming plan and I’m trying to encourage academic departments to come up with their own [approaches for] the way they would like to handle some of the goals of the program … My aim with the Academic Plan is to have some general goals and then encourage departments to find ways that they can meet those goals.

Is the new plan a continuation of the previous plan or does it represent a shift?

I think it’s more a continuation. The plan that we’re just wrapping up now certainly had a focus on the undergraduate student experience. I think perhaps the current plan has got more of a focus on teaching as a component of the undergraduate experience. It references some projects that we have actually already started – like the support we can give to students for whom English is not the first language, that’s mentioned, the learning outcomes [initiative] is mentioned very specifically, getting to a better system for evaluating teaching is mentioned – so there is some reference to ongoing projects, and there probably is more reference generally to teaching and learning rather than the overall student experience.

How do you see teaching contributing to the undergraduate experience in the next few years?

What I would like to see in terms of support for students in the classroom is, firstly, that a more supportive environment is created by having a greater range of teaching practices, so that the way in which teaching is done matches the learning outcomes better … We know that students or people generally don’t learn by sitting in a classroom and having people tell them things. They learn by doing things … We need to worry less about the content and speaking the content of a discipline to students, and we need to worry more about them getting the major principles and the theories and the methods and somehow experiencing that.

The Experiential Education Project concluded that there is breadth but not depth of experiential learning opportunities at SFU. Do we need to do more?

I don’t think there’s anything wrong with having a large number of courses with a smallish component of experiential [education]. What we really need to do is select some areas of the university for a deeper experiential education, and when I say select, I don’t mean I would select them. I mean people could self-select … I think we just need to encourage departments to approach it strategically, to identify an area within their curriculum where they think a deeper experiential component would be really valuable to students and to try to build those areas first.

Related links:

Academic Plan 2013–2018

Report of the Learning Outcomes and Assessment Working Group (draft report)

The State of Course Based Experiential Education at SFU (report)

What’s in the Learning Outcomes and Assessment draft report?

Thursday, December 13th, 2012

Learning outcomes and assessment

In fall 2011, Jon Driver, Vice-President, Academic (VPA), appointed the Learning Outcomes and Assessment Working Group (LOAWG) to “consider whether and how SFU might move to a process of defining and assessing learning outcomes for academic programs.” The group was chaired by Paul Budra, Associate Dean, Faculty of Arts and Social Sciences, and included representatives from various Faculties as well as the VPA’s office, Institutional Research and Planning (IRP), and the Teaching and Learning Centre (TLC). It released a draft report on November 1 and requested feedback and comments by December 7. The final report and a response from the VPA will go to Senate some time in the new year.

Here are some excerpts from the document.

Why an LOA approach?

“Increasingly universities are facing competition from alternative education institutions such as professional schools, which often publicize student learning outcomes data … Given this reality, universities must take the initiative to proactively identify and solve perceived shortcomings in their teaching and learning systems, rather than have these identified arbitrarily by outsiders who may have their own ‘solutions’ in mind.”

“The onus is on Canadian universities to act now in assuring the public that students are achieving expected learning outcomes. Recognizing that quality assurance in the postsecondary education sector is a growing global concern, Ontario has recently established a Quality Assurance Framework … Earlier this year, Quebec publicized official recommendations to review and ‘adjust’ its universities’ quality assurance mechanisms … Also this year, British Columbia solicited public opinion on enhancing quality assurance at its own educational institutions and issued a Quality Assurance Consultation Document stating that provincial quality assurance processes must ‘adapt’ in order to remain current with international standards, better market BC’s postsecondary institutions, and reassure employers and students alike that a university education is relevant to their needs.”

“Research shows that university students respond favourably when clearly articulated learning outcomes are built into their programs, courses and assignments. Dr. Richard S. Ascough of Queen’s University explains how ‘the pressure to articulate [learning] outcomes is not simply a top-down process; it also arises from our students themselves … [who] want and often demand a clear idea of the return on investment of a given activity.’”

Crucial definitions

“Learning Outcome – A ‘learning outcome’ is an area of knowledge, practical skill, area of professional development, attitude, higher-order thinking skill, etc., that an instructor expects students to develop, learn, or master during a course or program. Learning outcomes are observable and measurable by quantitative or qualitative assessment models. For some examples see the Eberly Center for Teaching Excellence website at Carnegie Mellon University.

“Teaching Goal – A ‘teaching goal’ is anything that an instructor or a program coordinator intends that students will learn in their course or program (note: in this report it is assumed that instructors will retain autonomy to determine the pedagogical approach they will use to meet the learning outcomes, as well as the autonomy to teach material falling outside defined outcomes).

“Program – A ‘program’ is a set of coherent curricular requirements leading to a Senate-approved academic credential (e.g., bachelor or graduate degree, certificate or diploma, etc.). The definitive list of such programs is included in the SFU Calendar.”

Principles for investigating and making recommendations concerning LOA

“1. The primary purpose of learning outcomes and assessment processes is to communicate transparently the purposes of all degree, program and course requirements.

“2. As per its Strategic Vision, SFU is committed to academic and intellectual freedom. Learning outcomes for courses and programs will be developed and determined at the local academic unit level and will reflect local disciplinary cultures. These will be aligned with enduring institutional goals, values, and principles as articulated in the SFU Strategic Vision.

“…”

Current LOA use in academic units

“Over the summer of 2012 all academic units at SFU were asked to complete an online survey of their current practices (if any) surrounding LOA … Out of 457 programs solicited by the survey, a total of 273 were completed … 8% of respondents reported that they are already accredited by an external accrediting body (in most cases by a disciplinary professional body) while a further 4% reported that they are currently seeking accreditation by an external accrediting body … A handful of programs indicated they are currently implementing some form of learning outcomes and/or assessment process for some or all of their undergraduate and/or graduate programs or courses. 18% reported having program-level learning outcomes and 64% within this group said they are also assessing students to determine whether these outcomes are being achieved …”

Current curricular assessment processes

“We have found no consistent curricular assessment process used across SFU’s academic units. Curriculum committees meet anywhere from annually to bi-weekly. The units that are currently employing some form of learning outcomes assessment have different approaches to administering them: some have formal meetings of curriculum committees, some use a Dean’s or Chair’s advisory council, and others use departmental retreats … For the majority of SFU programs, creating and adopting meaningful course and program learning outcomes, and undertaking assessment of those, will be a significant change.”

Appropriate processes for SFU

“Given the number of academic programs at SFU and their diversity, it is not clear that there can or should be one approach for LOA across the institution. SFU’s academic units have different structures, histories, faculty complements, pedagogies, and needs. Therefore LOA processes at SFU need to reflect the University’s decentralized culture as well as commitment to academic freedom and integrity.

“Nonetheless … any approach adopted for SFU should have the following components:

“1. Learning outcomes are made explicit to students in course outlines and other materials, course assignments, and other assessments.

“…”

“In summary, the infrastructure to support implementation of LOA processes at SFU already exists. However, the work of the multiple existing units which comprise that infrastructure needs to be identified and coordinated …

“Given the breadth of disciplines, forcing a single approach on units would be unlikely to yield the best possible results vis-à-vis student learning and course/program improvement. The choice of approach taken to the development and articulation of course- and program-level learning outcomes, along with corresponding assessment practices, should be within the purview of the individual Faculties, schools and departments. Academic units should have the option to retain LOA data that is generated locally. When cyclical processes (such as departmental reviews) require it, University-level evaluation of academic programs’ LOA activities can and should be performed using aggregated data from the units …”

Pending Senate approval

“If Senate approves the LOA initiative, we propose that:

“1. As new courses and programs are developed, learning outcomes will be brought forward to SCUS [Senate Committee on Undergraduate Studies] and SGSC [Senate Graduate Studies Committee].

“2. Learning outcomes will have to be developed gradually and systematically for existing courses and programs.

“3. … We propose that academic units have learning outcomes for all of their courses and programs in place by their next regularly scheduled external review, beginning in spring 2014. Units that have an external review scheduled for 2014 will be expected to include reference to learning outcomes and assessment in self-study documents prepared in fall 2013, but will not be expected to be able to comprehensively respond to this in their review of curricula.

“4. A mechanism by which University-wide LOA affairs are facilitated and supported should be set up or identified.

“5. Assessment approaches must be integrated into academic units. These will vary from discipline to discipline … The most important assessment will take place during the regular external reviews that all academic units presently undergo every seven years, and subject to a mid-term review report between full reviews.”

Providing support

“To perform … the implementation of LOA at SFU, we recommend consideration of the following options.

“Option 1: Align and Enhance Capacities of Existing Resources

“The first option proposed for consideration is to expand the services currently embedded in the TLC, and IRP, with coordination to ensure continuous improvement and alignment with institutional goals …

“Option 2: Add Capacity to the Office of the VPA with a New Unit

“The second option is to establish a compact LOA unit that, at least during SFU’s period of transition to an ongoing process of LOA, would report to the VPA …

“Option 3: Blended or Evolving Services

“The third option would be to begin with the services currently in place, with additional responsibilities and resources assigned in each. Over time, as needs evolve and/or service gaps are identified, a new unit can be developed …”

Recommendations

“1. Programs that do not already have processes built into part of their disciplinary accreditation adapt either or both of the approaches to learning outcomes described above (program- to course-level outcomes, or course- to program-level outcomes).

“2. Academic units will continue to complete the learning outcomes sections of the new course and program proposal forms used by SCUS and SGSC.

“3. Academic units have the option to retain within the unit assessment data that is generated locally. When cyclical processes require it, University-level analysis related to learning outcomes assessment will use aggregated data received in report form from the units, which will be provided with standardized templates for the purpose.

“4. The cycle of regular assessment of learning outcomes should be built into the external review cycle, beginning with units externally reviewed in spring 2014. LOA will become part of the regular process of external reviews, incorporated into the self-study documents as part of curricular review as of fall 2013 and subsequently every seven years. Curricular review, including comments on the assessment of learning outcomes, will also form part of the external review mid-term report.

“5. The VPA will establish enhanced supports for LOA via one of three options: enhanced capacities in the existing units of the TLC and IRP; added capacity to the office of the VPA via a compact unit responsible for LOA that would be in place by summer 2013 with a mandate as described above; or an evolving blend of these options.”

Related links:

LOAWG draft report (PDF)

LOAWG web page

SFU moves toward a more effective teaching and course evaluation system

Thursday, November 29th, 2012

Corinne Pitre-Hayes

At SFU, the process used for student evaluation of courses and instructors hasn’t changed substantially in 30 years. Corinne Pitre-Hayes (above) is on her way to ending that dubious streak.

As leader of SFU’s Teaching and Course Evaluation Project (TCEP), Pitre-Hayes has a straightforward assignment: to recommend a replacement for the instrument (the survey form) used by the university for student evaluations of teaching and courses and to develop a best-practice guide for using and interpreting the evaluation data.

The assignment was handed to Pitre-Hayes and her team by Jon Driver, Vice-President, Academic, in December 2011 in response to recommendations by the Senate Committee on University Teaching and Learning and the Task Force on Teaching and Learning.

Valid concerns

Pitre-Hayes is aware that many members of the academic community view student evaluations – both the data gathered and the way it is used – with skepticism, and she readily enumerates the sources of concern, including doubts about reliability and validity, suspicions about bias, and worries about academic freedom. Her response, in a word, is research.

“There’s a lot of evidence in the research about the concerns that most people talk about,” she says. “These things have been researched for more than 50 years.”

She cites the common concern that evaluation results will be used inappropriately for tenure and promotion decisions. “Such decisions should not be made on the basis of teaching and course evaluations alone. That’s a key finding that surfaces repeatedly in the research. The results should be combined with other evaluative processes.”

More useful feedback

But for Pitre-Hayes, providing a better instrument and best-practice guide is only “square one.” What really excites her is the possibility of enabling faculty members and instructors to make greater formative use of the evaluation data.

“There’s this enormous opportunity that relates to teaching and learning,” she says. “We have a bunch of data here that could be incredibly useful to instructors and that we could be making constructive use of.” It’s a message she has been spreading at community consultations with administrators and faculty members in various Faculties, beginning with Education in May.

“I would like to plant the seeds for that shift [in thinking]. The key will be putting infrastructure in place that enables this to happen.”

Pitre-Hayes imagines a tool that will give instructors more control and flexibility: “I can envision instructors potentially using the instrument and the system for the purpose of getting specific student feedback regularly or on an ad hoc basis at various points of the year so that they can experiment with things in advance, during, and at the end of the course.”

The vision of an evaluation tool that responds to the requirements of individual instructors and departments will shape the recommendations of her project team: “The instrument needs the flexibility to be fine-tuned so that it’s useful for a wide variety of courses with a range of formats.”

It’s all part of her effort to move in the direction of formative uses of student evaluations in a way that she hopes instructors themselves will embrace.

Related links:

Go to the TCEP website >>

Peter Wolf will talk about improving curriculum through learning outcomes

Tuesday, October 30th, 2012

Special presentation: Fostering Continuous Improvement of Curriculum in Higher Education

This event has been cancelled.

Learning Outcomes and Assessment website: www.sfu.ca/vpacademic/committees_taskforces/LOAWG.html

Peter Wolf

When Senate approved principles to guide the consideration of learning outcomes and assessment processes at SFU in June, a commitment to community consultation and involvement was high on the list. Peter Wolf’s upcoming presentation on “Fostering Continuous Improvement of Curriculum in Higher Education,” scheduled for November 1 at SFU Burnaby, is one result of that commitment.

Wolf is director of the Centre for Open Learning and Educational Support at the University of Guelph and has been involved with the institution’s Teaching Support Services since 1998. He will share his experience in the implementation of learning outcomes in a Canadian post-secondary setting, his knowledge of the learning outcomes and assessment landscape in Ontario, and his expertise in the area of academic program assessment.

More about Peter Wolf

Peter Wolf has gained considerable recognition for his work in educational development in higher education. In 2006, he was a co-recipient of the University of Guelph Provost’s Award for Innovation for designing, developing, and implementing a pioneering new distance education course. In 2008, he won the University of Guelph President’s Exemplary Staff Award for Innovative Leadership in recognition of his role as a proponent of educational development and reform and for improving the educational experience at Guelph, providing consultative services, and becoming a trusted guide and leader to faculty, staff, and students. His involvement in educational development in higher education spans more than 20 years, and he has an academic background in adult education. He was a Visiting Scholar at the University of Victoria in 2010, is a member of the board of directors for the Society for Teaching and Learning in Higher Education (2008–present), and is co-editor of and contributor to New Directions for Teaching and Learning (issue 112, “Curriculum Development in Higher Education: Faculty-Driven Processes and Practices,” 2008).

Related links:

Peter Wolf’s web portfolio >>

University of Guelph Centre for Open Learning and Educational Support >>

Experiential education at SFU: wide but not deep

Tuesday, July 17th, 2012

Roughly 32% of credit courses at SFU incorporate some form of experiential education, but in most cases the experiential component is limited, and “deeply immersive, highly engaging experiences are few in number and largely inaccessible to the majority of the student body.” That’s the conclusion of a summary report released by the Experiential Education Project (EEP) in early July.

The project team, led by Jennifer McRae and Deanna Rogers under the direction of an advisory committee drawn from the SFU community and chaired by Sarah Dench (director, university curriculum and institutional liaison), was commissioned in late 2010 by Jon Driver, vice-president, academic, to compile an inventory of course-based experiential education opportunities at SFU. The initiative grew out of similar surveys conducted previously within the Faculty of Environment and the Faculty of Arts and Social Sciences. It was, in part, a response to the Task Force on Teaching and Learning (2010), which called for the university to “provide more opportunities for … learning that extends beyond the classroom.” It also fits neatly within SFU’s “engaging the world” theme, particularly its call to engage students and communities, and complements the work of SFU’s long-standing and internationally recognized co-operative education program and Work Integrated Learning unit.

What is experiential education?

McRae explains that the project employed a deliberately broad definition of experiential education in order to capture as much experiential activity as possible. Specifically, she and her team used the following definition:

The strategic, active engagement of students in opportunities to learn through doing, and reflection on those activities, which empowers them to apply their theoretical knowledge to practical endeavours in a multitude of settings inside and outside of the classroom.

The survey methodology was also flexible. Potentially experiential courses were identified from course outlines, and the course descriptions were supplemented where possible by faculty interviews and other follow-up activities. Courses were classified as experiential if they included at least one of six broad “practices”:

  • Reflective experiences (e.g., journal writing)
  • Field experiences (e.g., field trips or field work for labs)
  • Creative project experiences (e.g., blogging, video production, and portfolio-based work)
  • Community experiences (e.g., internships and community-based research)
  • Collaborative experiences (e.g., learner-directed environments and inter-institutional or inter-disciplinary activities)
  • Problem-based experiences (e.g., simulations, case studies, and real-world problem solving)

Ultimately, 3,774 undergraduate and graduate courses were identified, 2,684 were reviewed, and 1,213 were classified as “experiential courses.” However, many of the experiential courses qualified on the basis of a “single practice descriptor,” and the report readily acknowledges that “a majority of these single-experience courses would likely not be captured by a second review under a tighter definition.”

What does the report recommend?

Nevertheless, the results provide a starting point for understanding the current landscape. The report presents evidence to suggest that students and instructors have a strong interest in experiential education, and it offers two sets of recommendations for promoting experiential education at SFU. The first set outlines steps for “aligning the course-based curriculum with the [university's] strategic vision”:

  • Review and arrive at an institutional definition of experiential education
  • Develop infrastructure and support mechanisms for community-based experiences
  • Develop an internal teaching exchange program (e.g., the Honeycomb program)
  • Continue and preserve Teaching and Learning Development Grants and the Honeycomb Retreat

The second set of recommendations identifies steps for “increasing access to course-based experiential education”:

  • Use lectures for content and tutorials for process and experience
  • Continue support for innovation in experiential course delivery (e.g., City Studio, Semester in Innovation, The Change Lab)
  • Make the experiential opportunities that do exist more visible to students and the broader community
  • Focus on developing first-year engaged experiences
  • Award more credits for certain experiences

Jon Driver, in his Foreword to the report, interprets the findings as a call to action:

The analysis of a broad range of student experiences at SFU challenges academic administrators to find out what undergraduate students want from their education and to provide support for those instructors who take on the difficult task of creating intensely experiential learning. The report challenges instructors to think less about the content of a course and more about how students learn. And students themselves are challenged to become more engaged with their education.

The full report is available on the Experiential Education website and blog.

Video: Presentation on learning outcomes draws a crowd

Friday, July 13th, 2012

Dr. Kathi A. Ketcheson of Portland State University emphasized the value of learning outcomes in improving the student learning experience.

What is the value of learning outcomes, and how can they be implemented effectively? In a presentation that drew a large audience to the Halpern Centre on June 28, Dr. Kathi A. Ketcheson of Portland State University (PSU) shared lessons from the implementation process at her university. Ketcheson is a research professor and director of the Office of Institutional Research and Planning at PSU and frequently acts as a consultant on learning outcomes and assessment to other schools. Her remarks provided both an introduction to, and an illustration of, the use of learning outcomes and assessment to improve the student experience. The presentation and the following question-and-answer session were webcast and are now available on video through the Teaching and Learning Centre.

In the first part of her talk, Ketcheson used the PSU experience to address a number of themes:

  • Overview of assessment: What is it? Why do it? Who owns it?
  • Conditions for good assessment
  • Some faculty concerns, including worries about academic freedom and increased workload
  • Trigger words: Accountability, compliance, measurement, reporting, data-driven, drill down, transparency
  • Assessment for improvement
  • How does assessment improve learning?
  • Scholarship of assessment
  • Components of program assessment
  • Defining outcomes

Later, she addressed practical issues such as how to begin defining learning outcomes at the program level and how to encourage participation by faculty members. The presentation was followed by a lively and wide-ranging discussion of issues and questions raised by audience members.

Ketcheson’s appearance was arranged by SFU’s Learning Outcomes and Assessment Working Group (LOAWG), which has been asked by Jon Driver, vice-president, academic, and provost, to produce recommendations for the implementation of learning outcomes at SFU. For more information about the LOAWG project and the theme of learning outcomes, follow these links:

LOAWG website: www.sfu.ca/vpacademic/committees_taskforces/LOAWG.html
Accreditation and the Northwest Commission on Colleges and Universities: www.sfu.ca/vpacademic/accreditation/nwccu.html
Ketcheson presentation: tlcentre.sfu.ca/archive/lecture-series/2012/2012-06-28_mecs_1007133-h/

A special presentation on learning outcomes by Dr. Kathi Ketcheson

Friday, June 22nd, 2012

The topic of learning outcomes and assessment is especially relevant at SFU for two reasons: first, the current Academic Plan includes a call to “define learning outcomes for each course and programme” as part of its emphasis on achieving a “high-quality student experience”; and second, the Northwest Commission on Colleges and Universities (NWCCU) accreditation process in which SFU is currently engaged requires the articulation of clear learning outcomes. That’s the context for an upcoming presentation on “Learning Outcomes and Assessment: Challenges and Opportunities” by Dr. Kathi A. Ketcheson, a research professor and director of Institutional Research and Planning at Portland State University.

Dr. Ketcheson has published and presented widely on the topic of institutional portfolios in accreditation, assessment, and accountability, and she frequently serves as a consultant to higher-education institutions on program evaluation and assessment.

Her appearance at SFU comes at the invitation of the Learning Outcomes and Assessment Working Group, a team established by Jon Driver, vice-president, academic, to recommend principles and processes for establishing and assessing learning outcomes. It forms part of the working group’s commitment to community consultation and involvement. If you are interested in hearing her presentation, plan to attend either in person or via webcast.

When: Thursday, June 28, 2012, 1:30 pm–3:30 pm
Where: Halpern Centre 126, SFU Burnaby
Registration: No cost. Registration is not required.

Online: A live webcast will be accessible via this page.

For more information on the Learning Outcomes and Assessment Project, visit the project website.

New associate vice-president, academic, will start in September

Friday, March 23rd, 2012

Gordon Myers

Gordon Myers will begin his term as associate vice-president, academic, in September.

Jon Driver, vice-president, academic, announced the appointment of a new associate vice-president, academic, today (March 23). Here is the text of his message:

I am very pleased to announce that the Board of Governors has approved the appointment of Dr. Gordon Myers as associate vice-president, academic (AVPA), for a five-year term commencing September 1, 2012. Chair or associate chair of the Department of Economics for more than half of his 12 years at SFU, and currently serving on both Senate and the Board of Governors, Gordon Myers brings to the complex AVPA portfolio an excellent administrative background and a self-professed passion for the role of universities in society.

Gordon Myers joined the Department of Economics at SFU in 1999. He received his BA from Queen’s University and his MA and PhD from McMaster University (1990). Before joining SFU, he was an assistant professor at the University of Western Ontario and an associate professor at the University of Waterloo. He has been an academic visitor at the University of Essex (England) and the University of Bonn (Germany).