Deans & Chairs Resources
IDEA Dean and Chairs Webinars
October 3, 2017: recording of a webinar by Tara Kai from the IDEA Center.
On July 5, 2017, Tara Kai from the IDEA Center provided an overview of the types of reports available to chairs and deans. Watch the recorded webinar.
If you are a dean or department chair, you will have access to faculty results as a "Report Administrator" through the Campus Labs dashboard.
Where to begin:
Getting and interpreting my reports:
- How can Deans and Chairs see results for their colleges/departments?
- Summative and Formative Reports for IDEA Campuses
- Reporting Tools + IDEA (Video)
- How the mean is calculated
FAQ and Q & A from Webinars
- Will the Longitudinal/Trend Analysis (Fall 2017 release) be available to download in PDF or Excel?
In PDF: Although Trend Analysis is meant to be an interactive experience, we understand that there will be requests to save or print the report. Three options will be available, though possibly not until after release: browser printing, browser save-to-PDF, and screenshots.
- Will Distributed Control allow department heads to view their USR results and filter:
- Course by course: In the first release, before the fall setup season, user privileges by course may not be ready in time, but they will be available for Spring 2018.
- Full-time/part-time: Unfortunately, an option to filter the USR by faculty role is not on the road map.
- What is the "minimum threshold" class size for a report to be generated?
By default in Campus Labs, a report is generated for faculty or an administrator to see only when at least three responses are received, or when 100% of enrolled students have responded. So a class of 2 students would be released if both responded: even though it has fewer than three responses, 2 responses is 100% of that class.
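The default release rule above can be sketched as a small check. This is only an illustration of the stated rule; the function and parameter names are hypothetical, not part of any Campus Labs API.

```python
def report_released(responses: int, enrolled: int) -> bool:
    """Illustrative sketch of the default threshold rule: a report is
    released when at least three students respond, or when 100% of the
    enrolled students respond (which covers classes of one or two)."""
    if enrolled <= 0 or responses > enrolled:
        return False
    return responses >= 3 or responses == enrolled

# A class of 2 with both responding is released;
# a class of 5 with only 2 responses is not.
```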
- Do you have any data around graduate vs undergraduate courses?
Since its inception, IDEA has always been used by instructors teaching either undergraduate or graduate courses. Our experience is that most instructors find the learning objectives and teaching methods appropriate regardless of course level or delivery method.

Our research shows that differences among course levels have mostly to do with student self-reported variables: motivation to take the course, typical work habits, and background preparation. Graduate students tend to report higher motivation, better typical work habits, and greater background knowledge relative to undergraduates. IDEA controls for these variables in its adjusted scores, which makes using one form practical. Once we control for these individual student variables, the course level differences largely disappear. Statistically, it is more precise to control via adjusted scores than to partition students into groups, which can lead to grouping error. (In truth, some undergraduates are more motivated, have better work habits, and better background preparation than some graduate students.)

To read more about the motivation levels (i.e., desire to take the course), which are very comparable between graduate students, lower-division students in the major, and upper-division students in the major, see Technical Report No. 18.
- How can we differentiate between departments/units who are using the Learning Essentials (LE) instrument and departments who are using the Diagnostic Instrument (DI) when we use the "Teaching Methods Priorities" tab (the first tab in the Unit Summary Report [USR]) to develop professional development workshops?
The "Teaching Methods Priorities" tab draws data from the DI only, since the LE instrument does not ask students any questions about teaching methods. For units using the DI, the teaching methods in the "Teaching Methods Priorities" tab are those correlated with student achievement of the learning objectives that were selected as Important or Essential by more than two-thirds of the course sections in your unit. When the system notices that some of these teaching methods are used infrequently in your unit compared to the IDEA database, it recommends that you "increase use of these teaching methods."
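The two-thirds selection rule described above can be illustrated with a short sketch. This is only a reading of the stated rule, not the actual Unit Summary Report computation; the data shapes and function name are assumptions.

```python
from collections import Counter

def priority_objectives(sections):
    """Hypothetical sketch: return the learning objectives rated
    Important or Essential by more than two-thirds of a unit's
    course sections (each section maps objective -> rating)."""
    counts = Counter()
    for section in sections:
        for objective, rating in section.items():
            if rating in ("Important", "Essential"):
                counts[objective] += 1
    threshold = (2 / 3) * len(sections)
    return {obj for obj, n in counts.items() if n > threshold}

sections = [
    {"Obj1": "Essential", "Obj2": "Minor"},
    {"Obj1": "Important", "Obj2": "Important"},
    {"Obj1": "Essential", "Obj2": "Minor"},
]
# Obj1 is selected by 3 of 3 sections (more than two-thirds),
# Obj2 by only 1 of 3, so only Obj1 qualifies.
```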
- How many semesters should we wait until we get a "good sample size" before we look at our faculty data longitudinally?
At least 3 semesters.
- What are the national standards that IDEA aligns with?
- NESSI alignment - NAEP-Education Statistics Services Institute (NESSI, formerly ESSI)
- Alignment with AAC&U Leap Student Learning Outcomes
- Alignment with HLC Standards
- Domains of Learning—IDEA Learning Objectives Map
- Alignment with Degree Qualification Profile (DQP)
- How IDEA Teaching Methods measure best practice teaching philosophies
If faculty forget to select objectives on the OSF, they will receive an email reminding them to log into the Campus Labs Faculty Dashboard and select the learning objectives for the course(s) they are teaching that semester. If a faculty member fails to complete the OSF by the end of the evaluation period, all thirteen learning objectives will default to Important.
- Can faculty change their OSF? Faculty members can make changes to their OSF for as long as the administration is open.
Research and best practice consistently show that the single greatest influence on increasing participation in student ratings surveys is faculty expressing and demonstrating that the results matter and are used to make meaningful change. The next most influential factor is setting aside time in class to complete the surveys, regardless of delivery modality. Emulate the "captive audience" nature of in-class paper ratings by asking students to log in and complete your IDEA survey during class; students can use their mobile phones, laptops, or desktop computers in a lab.
Response rates for online surveys are a legitimate concern, and there are a number of ways faculty can improve response rates for online course evaluations.
What Students See
How many questions will appear on the screen/mobile phone before students have to click on next?
- 19 for the Diagnostic Feedback
- 13 for the Learning Essentials