
Sarah DeMark, Ph.D.

Vice President, Program Development | Western Governors University

Salt Lake City, UT, UNITED STATES

Sarah DeMark joined nonprofit Western Governors University (WGU) in September 2014 and serves as Vice President of Academic Programs.

Biography

Sarah DeMark serves as Vice President of Academic Programs at Western Governors University, where she has worked since September 2014. She helps keep WGU at the forefront of competency-based education by directing the university’s academic portfolio strategy as well as program design and development, overseeing more than 60 degree programs, 600 courses, and 1,000 assessments.

Before joining WGU, DeMark spent more than 15 years at leading IT companies in a range of leadership roles, where she oversaw the strategy and execution of the design, development, and deployment of innovative, large-scale curriculum and assessment portfolios. Earlier, she was an independent consultant working with state and local school districts, as well as with The College Board on SAT and AP program evaluation.

DeMark also served on ANSI’s Personnel Certification Accreditation Committee, which validates whether certification programs adhere to established standards.

Areas of Expertise (4)

Innovative Learning and Assessment Models

Competency-based Education

Strategic & Analytical Leadership

Driver of Learning & Business Outcomes

Education (2)

Arizona State University: Ph.D., Educational Psychology 2002

Vanderbilt University: B.A., Education, Psychology 1995

Affiliations (5)

  • American Educational Research Association (AERA)
  • American Psychological Association (APA)
  • American Statistical Association (ASA)
  • Association of Test Publishers (ATP)
  • IT Certification Council (ITCC)

Selected Media Appearances (2)

'One-and-done' model of higher ed is in the past

Education Dive (online)

2018-03-05

Sarah DeMark, vice president of program development at Western Governors University, told Education Dive that given how quickly jobs are changing, there is a need for higher ed to think more broadly about what students are coming to campus to achieve. “No longer is it this one-and-done model where you get your bachelor’s degree and you’re set for your career,” she said, noting a need to think of the traditional college or university as “more of a continuing education model.”

What can data do for students?

Education Dive (online)

2018-01-29

Experts say newer learning analytics are in the beginning stages of predicting when and where students will have problems, creating space for educators to intervene before a student falls behind. “We also leverage data to do real-time intervention with students. If we see that a student might be struggling in a particular area, then a mentor will see that, and they’ll know that that’s a student they need to reach out to,” said Sarah DeMark, Western Governors University vice president of program development.

Event Appearances (1)

WGU: 20 Years of Experience Bringing Competency-Based Education to Scale

ELI Annual Meeting, New Orleans, Louisiana

2018-01-20

Selected Articles (4)

Western Governors University's assessment quality rubrics: high standards for assessments in CBE programs

The Journal of Competency-Based Education

Sarah DeMark

2016-08-09

All students deserve high-quality assessments that are balanced and reliable. For CBE programs in particular, assessments serve a key function in enabling students to demonstrate their competence and thereby advance through a curriculum at their own pace.

Design Rationale for a Complex Performance Assessment

International Journal of Testing

David M. Williamson, Malcolm Bauer, Linda S. Steinberg, Robert J. Mislevy, John T. Behrens & Sarah F. DeMark

2009

In computer-based interactive environments meant to support learning, students must bring a wide range of relevant knowledge, skills, and abilities to bear jointly as they solve meaningful problems in a learning domain. To function effectively as an assessment, a computer system must additionally be able to evoke and interpret observable evidence about targeted knowledge in a manner that is principled, defensible, and suited to the purpose at hand (e.g., licensure, achievement testing, coached practice). This article describes the foundations for the design of an interactive computer-based assessment of design, implementation, and troubleshooting in the domain of computer networking. The application is a prototype for assessing these skills as part of an instructional program, as interim practice tests and as chapter or end-of-course assessments. An Evidence Centered Design (ECD) framework was used to guide the work. An important part of this work is a cognitive task analysis designed (a) to tap the knowledge computer network specialists and students use when they design and troubleshoot networks and (b) to elicit behaviors that manifest this knowledge. After summarizing its results, we discuss implications of this analysis, as well as information gathered through other methods of domain analysis, for designing psychometric models, automated scoring algorithms, and task frameworks and for the capabilities required for the delivery of this example of a complex computer-based interactive assessment.

Using Statistical Natural Language Processing for Understanding Complex Responses to Free-Response Tasks

International Journal of Testing

Sarah F. DeMark & John T. Behrens

2009

Whereas great advances have been made in the statistical sophistication of assessments in terms of evidence accumulation and task selection, relatively little statistical work has explored the possibility of applying statistical techniques to data for the purposes of determining appropriate domain understanding and generating task-level scoring rules. Now that complex tasks are becoming increasingly prevalent, the inattention to item-level scoring is becoming more problematic. This study utilizes exploratory techniques to examine the differences between experts and novices in command usage and troubleshooting strategies in the field of computer networking. Participants were students and instructors of the Cisco Networking Academy Program as well as experts from the field of networking. Each participant was asked to perform troubleshooting tasks, and a log of their actions was recorded. Log files containing all commands that participants entered while completing the troubleshooting tasks were analyzed using techniques of Statistical Natural Language Processing (SNLP). Results indicated that experts and novices differed in the types of commands that were used as well as in the sequence of those commands. Moreover, some patterns of examinee response that were found were entirely unexpected, leading to a rethinking of the appropriate conceptualization of the domain and the tasks. Previous assumptions about expert-novice differences were shown to be faulty, along with previously constructed scoring rules based on those assumptions. Comprehensive research in the application of statistical techniques to the understanding of domains and the validation of scoring rules is recommended.

The Seven C's of Comprehensive Online Assessment: Lessons Learned from 36 Million Classroom Assessments in the Cisco Networking Academy Program

Online Assessment and Measurement: Case Studies from Higher Education, K-12 and Corporate

John T. Behrens (Cisco Systems, USA), Tara A. Collison (Cisco Systems, USA) and Sarah DeMark (Cisco Systems, USA)

2006

During the last 6 years, the Cisco Networking Academy™ Program has delivered online curricula and over 36 million online assessments to support instructors and schools teaching computer networking skills to students. This chapter describes the context of this work and lessons learned from this endeavor. Through discussions with stakeholders concerning the central aspects of the Cisco Networking Academy Program assessment activities, seven themes have evolved, each starting with the letter C: claims, collaboration, complexity, contextualization, computation, communication, and coordination. These themes address many aspects of assessment, including design, development, delivery, and the management of assessment resources, which are all necessary to ensure a quality assessment program.
