Evaluation and Metrics Planning Documents

= Overview =

This page collects planning, brainstorming, and other process documents related to the Open.Michigan evaluation project (fall 2010-winter 2011), which initiated an ongoing strategic plan to better define and scope Open.Michigan's activities as they relate to the needs and interests of the University of Michigan community. These documents were initially created and used in fall 2010 and should be considered drafts.

For a timeline, overview and finalized documents pertaining to this project, see the Evaluation Metrics Project and the Strategic Plan and Objectives pages.

= Anticipated Outcomes =

We expect to measure our impact through the following methods and outcomes, combining qualitative and quantitative data. We plan to measure not only access to, but also awareness, use, and support of Open.Michigan and OER more generally. In communities where we have deep connections (for example, the Medical School and the School of Information) and collaborations (the African Health OER Network), we expect to see a tangible impact of U-M created OER on teaching and learning practices.

Production of OER
Method: Quantitative data analysis (tallies of internal data and data scraping)

Open.Michigan

OER


 * number of courses published by Open.Michigan
   * total
   * by school/department
   * pre-published
 * number of open resources/websites/textbooks (Can we include MERLOT content in this?)
 * number of tools built to help facilitate "open" [needs explanation]
 * (For media content, see Use: Blip.tv)
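The "tallies of internal data" named in the method above could be scripted rather than counted by hand. A minimal Python sketch, assuming a hypothetical CSV export with `school` and `status` columns (the column names and sample values are illustrative, not an actual Open.Michigan data format):

```python
import csv
import io
from collections import Counter

def tally_courses(csv_text):
    """Tally published courses: total count and counts by school/department."""
    by_school = Counter()
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["status"] == "published":  # skip pre-published entries
            by_school[row["school"]] += 1
    return sum(by_school.values()), dict(by_school)

# Hypothetical internal export; values are assumptions for illustration only.
sample = """school,status
Medical School,published
Medical School,published
School of Information,published
Dentistry,pre-published
"""

total, per_school = tally_courses(sample)
# total is the overall published count; per_school gives the by-school breakdown
```

The same per-school breakdown could feed the "by school/department" and "pre-published" tallies by grouping on the `status` column as well.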

Participation


 * number of participating faculty
   * creating OER
   * using OER
 * number of dScribes (active and alumni)
 * percentage of University as a whole

U-M


 * number of open projects outside of our office, see: Open.Michigan Projects
 * number of deposits in institutional archive (Deep Blue) [follow up needed: what types of deposits are we looking for?]

Investment in and Perception of OER at U-M
Method: Qualitative and quantitative data analysis
Data Source: CTools OCW survey and Open.Michigan surveys

Resources allocated to Open.Michigan


 * monetary
 * infrastructure
 * endorsement: see Dean statements and interviews [may also want to develop surveys approaching this]

Number of collaborating institutions/organizations, see: Health OER Collaborations, University Research Corridor, Global dScribes Google Group

U-M Community Perception

see: Survey topics by target audience


 * Document faculty committed to openness [this has not been done yet]
 * number of signatures on open education declaration (OER oath, pledge)
 * number of signatures on open textbooks statement

Learning Resulting from OER
Method: Qualitative data analysis

How do we want to measure this? Surveys? Interviews? Focus groups? Are there any published learning outcomes on the results of using OER in teaching settings?

Access and Use of OER
Method: Quantitative and qualitative data analysis
Questions to ask: What is being used? (quant) How is it being used? (qual) Who is using it? (quant/qual)

Open.Michigan Website and Content Repositories

Access


 * Number of referatories listing our content (include page rank in this analysis?)
 * Number of resources peer-reviewed in places like MedEdPortal (is this where this goes?)
 * Growth over time
 * User locations and access (where are our users coming from?) [remove staff data for local population]
 * Page visits versus content
 * Number of resources versus access (focus on Medical and SI)
 * De-emphasize but capture:
   * visits: duration, number, content accessed, number of page views

[number of resources "discovered"]

Use


 * Number of orgs that use OERca
 * Number of orgs that use OERbit
 * Number of U-M faculty and students publishing in open access journals
 * Number of outside participants in our "open" courses
 * Number of U-M students demanding OER or OCW (CTools survey)

Data Source: iTunes U

Access: number of courses available

Use: number of downloads, likes

Data Source: Blip.tv (including Vimeo channel and videos)

Access: number of resources

Use: number of views, comments, uploads, "likes"

Channel: subscribers, channel views, upload views

Education about OER
Method: Qualitative data analysis (develop user surveys for users of our services or beneficiaries of our training and education sessions)


 * number of people trained in "open"
 * number of orgs that adopt our processes, tools, or instruments of practice
 * number of presentations/talks/seminars
 * number of policies influenced at U-M (for open) (out of scope, how do we define and measure this?)
 * Data from previous dScribe surveys

Analyze exit surveys: data about motives for dScribing, impressions of the process, impact of the work on academic life; number of volunteers.

To gauge: creation of OER (process evaluation); immersion within community; learning outcomes after training and involvement with creating OER; use of OER.

= Survey Topics by Target =

Strategy: create a template survey that can be tailored to each survey target that progressively addresses issues of 1) awareness (Open.Michigan; OER); 2) support; 3) use/reuse and motivations. Note: some qualitative data from the quantitative section (such as user comments on YouTube, etc.) will be used to measure these three factors as well.

Survey Targets

Faculty, Students, Administration, Staff

Departments and Units: UMMS, SI, Nursing, Public Policy and/or Public Health, School of Ed, Dentistry, Library, College of Engineering, LSA

Faculty
[Has OER impacted your teaching methods?]


Awareness of OER and OM


 * Awareness of OA journals and Open Textbooks?
 * Awareness of OER?
 * Awareness of Open.Michigan specifically?

Investment in idea of “open”


 * What is faculty opinion of/interest in OER?
 * Have faculty created OER?
 * Would faculty be interested in creating OER?
   * If yes but they have not, what has prevented them from doing so?
   * If no, why not?
 * How strongly does faculty believe in importance of open?
 * Why does/does not faculty support open?

Opinions of effects of OER on various issues, personal and other?


 * Tenure?
 * Reputation?
 * Financial/royalty concerns?
 * Reputation with OA journals?
 * What is “right” thing to do?
 * What is in the best interest of education/spreading knowledge/research?

Publishing (or re-publishing) materials as OER


 * Course materials
 * Syllabi

Use and Reuse of OER


 * Class materials
 * Syllabi
 * Curricula
 * To improve teaching
 * For personal growth

Students
[Has OER impacted your learning methods?]

Awareness of OER and OM


 * Awareness of Open Textbooks?
 * Awareness of OER?
 * Awareness of Open.Michigan specifically?

Investment in idea of “open”


 * What is student opinion of/interest in OER?
 * Have students created OER/dScribed?
 * Would students be interested in creating OER/dScribing?
   * If yes but they have not, what has prevented them from doing so?
   * If no, why not?
 * How strongly do students believe in importance of open?
 * Why do/do not students support open?

Creation of OER Have students created OER or reused OER for anything in school?

Use and Reuse of OER


 * In class
 * Do you know if you have had professors use OER?
 * If so, what kinds of OER materials were used?
 * What would you think of your professor using OER?
 * For personal growth external to class

Administration

Awareness of OER and OM


 * Awareness of OA journals and Open Textbooks?
 * Awareness of OCW?
 * Awareness of OER?
 * Awareness of Open.Michigan specifically?
 * Awareness of the number/quality of schools already offering OA, OER, and OCW?

Investment in idea of “open”


 * What is admin opinion of/interest in OER?
   * As it affects instructors?
   * As it affects students?
   * As it affects the image of the school?
 * Opinions of instructors creating OER and using OER in classrooms?
   * Is there a distinction in opinion between the two?
 * How strongly does admin believe in importance of open?
 * Why does/does not admin support open?
 * Opinions of effects of OER on various issues, personal and other?
   * Effects on school enrollment/image
   * What is “right” thing to do?
   * What is in the best interest of education/spreading knowledge/research?

Creation of OER


 * Curricula?

Use/Reuse of OER


 * To produce/modify curricula?
 * To improve teaching?
 * For personal growth among admin?

= Strategic Questions and Considerations  =

What is the scope of our measurements? We should focus on the U-M community and our local and regional impact.

How do these things fit together (the progression of open to closed to open transfer of knowledge in the education system)? When should something be not adaptable?

How do we define these items?


 * number of non-UM courses/resources/websites we've influenced to become open
 * % of faculty demanding to open their course content (what does this measure? Use or Education?)
 * number of open data repositories (what does this mean?)
 * Documenting innovations in "open"

Is this out of our scope?


 * number of public domain books scanned for Google Books project (Hathi Trust)

Domain of Copyright Office?

 * legal
 * policy
 * technical
 * process
 * influence of other org's legal/policy decisions

Barriers

 * Fears that lead to making content not adaptable (e.g. technological and cultural constraints):
   * misinterpreted information
   * incomplete package of information
   * technology not there/too costly
   * attribution loss/commoditization [faculty need to go beyond this idea of controlling their knowledge objects]
 * [history > shift in culture to preserve and protect; advancement of technology > new obstacles yet new opportunities to share -- by encouraging sharing of knowledge (knowledge transfer) [getting back to the root idea of knowledge institutions] we are making the University more effective.] *Source?* philosophical root of education (new knowledge production through transfer and sharing) > current commercialization and control of knowledge production/silo > future of sharing/adaptability afforded by technology
 * [technology = easy sharing of knowledge] [want to take advantage of this] [give up this bit of control and share more!] [because: the current preserve/protect mentality defeats/limits knowledge sharing; to increase knowledge sharing we need to shift the culture of this sharing]

Rationale
Misinterpretation and incompleteness can happen in any publishing form, so adaptable digital educational resources don’t necessarily increase this possibility. What is the basis of transparency? Does this change? [OM is working toward a future when technology will facilitate this knowledge sharing more.]

Knowledge Sharing

In what ways are we sharing? What improvements can we make to this? [best way of sharing is completely adaptable knowledge sharing and use]

Identify the types of knowledge sharing where transformation/use/transparency fits: moving between types of knowledge sharing and making content adaptable (addressing fears). Show types of knowledge sharing and address/dispel myths and fears about our way of sharing knowledge (OER). Acknowledge the different types of knowledge (e.g. peer-reviewed work; raw data; guides; instruction; theories) and how they are shared; not all of these are shared the same way, and open sharing encourages a lifecycle of knowledge production.

Other: [Transparency to use to transformation might be a progression, but doesn't need to be.]