Deans & Chairs Resources

IDEA Deans and Chairs Webinars

Watch the October 3, 2017 recording of the webinar by Tara Kai from the IDEA Center.

On July 5, 2017, Tara Kai from the IDEA Center provided an overview of the types of reports available to Chairs and Deans. Watch the recorded webinar.


IDEA General Guidelines

An important note: It is highly recommended that you visit your IDEA “dashboard” regularly. Most, if not all, of the information you need can be found when you log in to the dashboard, so it is typically the most reliable place to get correct, current information.

How St. Thomas information is used to set up IDEA course surveys:

  • Each term, a master document is sent to the unit/department IDEA Coordinator after the 10th day of the term, and that person is asked to review all information, especially the following:
    • Course start & end dates
    • Which form, long or short, should be used (the long form is the default if this is not specified)
    • Cross-listed courses and courses that should not be evaluated

Faculty should communicate with their IDEA coordinator about the above items for their courses so that the correct information is entered into the master document. Chairs can let faculty know who their IDEA coordinator is.

  • Course data from Banner is imported into IDEA, powered by Campus Labs.
  • The St. Thomas IDEA Administrator then uses the master document and the data from Banner to create Administrations, or “survey groups”.
    • Survey groups are groups of courses that share the same end date and form choice.
  • Once a course is added to a survey group, the OSF (Objective Selection Form) is available to be filled out by the instructor. An email will be sent out to instructors once this happens.
  • Surveys are typically open for the 10 business days before finals week. If a term does not have a set finals week, the survey will be open for the 10 business days prior to the last day of class.
  • If an OSF is not completed before the survey closes, then all selections for relevant course objectives will default to “Important.”

Email Communication:

Typically, all emails are set to send at 6:00 a.m. CST.

To faculty:

  1. OSF Available: Faculty will receive an email once their courses have been added to an Administration and the OSF is available to be filled out. This is generally a week or two after the 10th day of the term.
  2. Survey opening in 1 week:
    • For the Fall and Spring terms, faculty will receive an email to alert them that their course surveys are opening in one week.
    • For JTerm & Summer, faculty may not receive this email due to timing constraints, but we will do our best to send this communication.
  3. Survey open: An email will be sent the morning the survey opens.
  4. Survey closing: An email will be sent the morning of the day the survey closes; surveys normally close at 11:59 p.m.
    • This is the last opportunity to complete the OSF.
  5. Reminder to do OSF: One day prior to the close of their survey, an email will go out to all faculty who have not completed their OSF.
  6. Faculty reports available: An email will be sent when faculty reports are available, which is typically three business days after grades are due.

To Students:

  1. Survey is open: Sent the morning the survey opens.
  2. Reminder to do survey: Normally sent 5 business days into the survey window.
  3. Survey closing: A reminder that one or more of the student's surveys are closing that day.

To Deans/Chairs:

  1. Faculty reports available: A manual email will be sent from the IDEA Administrator on the day faculty reports become available to view, or the day prior.

Accessing Reports

If you are a dean or department chair, you will have access to faculty results as a "Report Administrator" through the Campus Labs dashboard.

Where to begin:

Getting and interpreting my reports:

FAQ and Q & A from Webinars

  • Will the Longitudinal/Trend Analysis (Fall 2017 release) be available to download in PDF or Excel?

In PDF: Although Trend Analysis is meant to be an interactive experience, we understand that there will be requests to save or print the report. Three options will be available, though they may not arrive until after release: browser printing, browser save-to-PDF, and, last but not least, a screenshot.

  • Will the Distributed Control allow Department Heads to view their USR results and filter:
    • Course by course: In the first release, before the fall setup season, user privileges by course may not be ready in time, but they will be available for Spring 2018.
    • Full-time/Part-time: Unfortunately, an option to filter the USR by faculty role is not on the roadmap.
  • What is the "minimum threshold" class size for a report to be generated?

By default in Campus Labs, at least three responses, or responses from 100% of enrolled students, are required to generate a report for faculty or an administrator to see. For example, a class of 2 students would have its report released if both students responded: even though that is fewer than three responses, it represents 100% of the class.
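
The default release rule amounts to a simple check. The short sketch below is illustrative only; it is not Campus Labs' actual implementation, and the function and parameter names are hypothetical:

```python
# Illustrative sketch of the default report-release rule described above.
def report_is_released(responses: int, enrolled: int) -> bool:
    """Release a report once there are at least three responses,
    or once every enrolled student has responded."""
    return responses >= 3 or (enrolled > 0 and responses == enrolled)

print(report_is_released(2, 2))   # True: 2 of 2 students responded (100% of the class)
print(report_is_released(2, 10))  # False: fewer than 3 responses and not 100% of the class
```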

  • Do you have any data around graduate vs undergraduate courses?

Since its inception, IDEA has always been used by instructors teaching either undergraduate or graduate courses. Our experience is that most instructors find the learning objectives and teaching methods appropriate regardless of course level or delivery method. Our research shows that differences among course levels have mostly to do with student self-reported variables: motivation to take the course, typical work habits, and background preparation. Graduate students tend to report higher motivation, better typical work habits, and greater background knowledge relative to undergraduates. IDEA controls for these variables in its adjusted scores, which makes using one form practical. Once we control for these individual student variables, the course-level differences largely disappear. Statistically, it is more precise to control via adjusted scores than to partition students into groups, which can lead to grouping error. (In truth, some undergraduates are more motivated, have better work habits, and have better background preparation than some graduate students.) To read more about motivation levels (i.e., desire to take the course), which are very comparable among graduate students, lower-division students in the major, and upper-division students in the major, see IDEA Technical Report No. 18.

  • How can we differentiate between departments/units that are using the Learning Essentials (LE) instrument and departments that are using the Diagnostic Instrument (DI) when we use the "Teaching Methods Priorities" tab (the first tab in the Unit Summary Report [USR]) to develop professional development workshops?

"Teaching Methods Priorities" tab draws data from the DI only since an LE instrument does not ask students any questions about the teaching methods. For units who use the DI - the teaching methods in the "Teaching Methods Priorities" tab are correlated with student achievement of learning objectives that were selected by more than two-thirds of course sections in your unit as Important or Essential. When the system notices that some teaching methods are used infrequently in your unit compared to the IDEA database, it recommends to "increase use of these teaching methods."

  • How many semesters should we wait until we get a "good sample size" before we look at our faculty data longitudinally?

At least 3 semesters. 


The OSF

If faculty forget to select objectives on the OSF, they will receive an email reminding them to log in to the Campus Labs Faculty Dashboard to select the learning objectives for the course(s) they are teaching that semester. If a faculty member fails to complete the OSF by the end of the evaluation period, all thirteen learning objectives will default to "Important."

  • Can faculty change their OSF? Faculty members are able to make changes to their OSF for as long as the administration is open.

Preparing Students

Research and best practice consistently show that the single greatest influence on increasing participation in student ratings surveys is faculty expressing and demonstrating that the results matter and are used to make meaningful change. The next most influential factor is setting aside time in class to complete the surveys, regardless of delivery modality. Emulate the "captive audience" nature of in-class paper ratings by asking students to log in and complete your IDEA survey during class. Students can use their mobile phones, laptops, or desktop computers in a lab.

Response rates for online surveys are a legitimate concern, and there are a number of ways that faculty can improve response rates for online course evaluations. 

What Students See

How students access their evaluation(s)

How many questions will appear on the screen or mobile phone before students have to click Next?

  • 19 for the Diagnostic Feedback
  • 13 for the Learning Essentials