Compliance Certification Report


Comprehensive Standard 3.3.1.1    


Judgment of Compliance    


Rationale for Judgment of Compliance    

Athens State University (ASU) offers 31 academic programs approved by the Alabama Commission on Higher Education (ACHE) and the State Board of Education (SBE), each leading to one of three types of baccalaureate degrees.

The ASU Outcomes Assessment System defines educational (academic) programs as those organizations responsible for curriculum planning, development, and implementation, with direct authority over instructional content and delivery. Only degree programs within the Colleges of Arts and Sciences, Business, and Education fall under this category. Academic standards and learning outcomes for all courses and programs are the same for distance learning and for traditional instructional delivery, both on and off campus. Due to the interrelationships among their respective academic disciplines and their national accreditation status, the College of Business (accredited by the Association of Collegiate Business Schools and Programs) and the College of Education (accredited by the National Council for Accreditation of Teacher Education) conduct college-wide outcomes assessment. The College of Arts and Sciences, given the variety of its academic offerings, conducts an individual outcomes assessment for each of its degree programs.

All educational programs offered at ASU have formulated and measured expected learning outcomes based on the College mission and program purpose as determined by the faculty. In addition to skills unique to a specific degree program, all educational programs have identified a set of common learning outcomes consistent with University Goal 1. Beyond student learning outcomes, the Institutional Effectiveness Matrix reflects additional outcomes associated with the administration of educational programs (mostly under the responsibility of College Deans and the Vice President for Academic Affairs), including accreditation, enrollment, curriculum, instructional delivery, faculty resources, assessment activities, program review, academic advising, and community service/outreach.

Learning outcomes are measured through both evidence-based and indirect methods. A variety of assessment instruments are used to collect quantitative and qualitative data on students’ demonstrated success in achieving the knowledge, skills, and abilities (KSA) or other competencies acquired through the curriculum. Exit and field exams, senior research projects, and student portfolios are the most frequently used evidence-based methods of assessment, typically applied in a capstone course. In addition, targeted assignments assessing specific skills are used in designated courses throughout the curriculum. Both the PRAXIS II Content Knowledge Test and the Alabama Prospective Teacher Test (APTT) are also part of the assessment methodology used by the College of Education. Students’ self-assessment of their entering and exiting competencies in 18 KSAs is also measured via the Graduating Senior Exit Survey at the time of graduation.

Evidence of outcomes assessment activities, including the formulation and measurement of outcomes and the actions planned as a result of the evaluation process, is documented in each program’s Consolidated Annual Assessment Plan (AAP, AAR, AP), including actual data from academic instruments and surveys, and in the program’s assessment compliance certification (ACC). As reported in the Institutional Effectiveness Continuous Improvement Report (2007-08, 2008-09), the compliance of academic programs with the assessment cycle, as required by the University, was 100% in both 2007-08 and 2008-09. All 31 programs have submitted the 2009-10 AAP, located in AMOS, and are currently in the data collection phase of the assessment cycle. Assessment findings for this academic year will be reported in September 2010. Three new programs in the College of Business were approved in Fall 2009, so no data are available for them at this time.

Evidence of the use of assessment results in operational improvements is reported in the Institutional Effectiveness: Programs’ Use of Assessment Findings for Continuous Improvement report, which categorizes the actions planned or taken by each program. Examples of planned or completed changes resulting from assessment findings are also reported in the Institutional Effectiveness Continuous Improvement Report.

The following examples demonstrate how academic programs have used assessment findings to improve student learning outcomes.

College of Business (COB) (4 majors):

For the purpose of the most recent ACBSP self-study, exit exam results from 2001 through 2005 were compiled and analyzed. A total of 54 exam sessions were included in the analysis. The data indicated a mean score of approximately 73 and a positive score trend over time. However, there was consensus among the faculty that student weaknesses would be better addressed earlier in the curriculum. To gain additional perspective and allow the faculty an earlier focus on student weaknesses, the capstone post-test began to be used as a pre-test for first-semester business students beginning in 2006. In 2008-09, 92.3% of business students showed improvement in knowledge of business subject matter from pre-test to post-test, as assessed by the capstone exam. These results continue to guide the COB in setting and revising goals for the core curriculum. Refinements in the wording of questions continue to take place, and curricular and course changes have been adopted as a result of analysis of student scores on the assessment exam.

In addition, the COB has made changes in two of the core courses, Principles of Management (MG 346) and Managerial Communications (MG 320), as a direct result of outcomes assessment. A survey sent to local businesses in 2005 indicated that students needed improvement in their leadership skills. Accordingly, the leadership skill component previously addressed in MG 320 was moved to MG 346, a more appropriate course in which to examine the concept and allow students to work on the skill. Furthermore, the COB added a seminar course (MG 480), taken in the last semester, that allows students to pull together all the skills gained from the professional core curriculum common to all majors. The course was implemented in fall 2009, and to date 100% of students have met the criteria set by the College.

College of Education (COE) (5 majors):

Since the fall of 2004, the College of Education has held an Assessment Retreat annually. For the first two years, the retreat addressed the creation of rubrics and assessment methods for the nine COE student learning outcomes. Beginning in the fall of 2006 and continuing to the present, an annual assessment manual has been created that includes senior methods and culminating portfolios, the principals’ survey, Professional Education Personnel Evaluation (PEPE) assessments for interns and for first-year teachers, and Praxis Content Knowledge Test scores. Each year at the COE Assessment Retreat, these assessment results are discussed in detail. If results indicate a need for improvement, improvement strategies are proposed and, if necessary, approved by the COE faculty.

Portfolios are critical in the assessment of student learning outcomes in the COE. Since the inception of the use of portfolios, students have selected specific artifacts that demonstrate each outcome. As early as the fall of 2006, it was found that some portfolios were unacceptable for the following reasons: no rationales were written for the selected artifacts, artifacts were placed haphazardly in the portfolio, and there was a general lack of knowledge about artifact selection. To improve the quality of portfolios, the COE faculty determined at its fall 2006 Assessment Retreat that students in ED 301 and ED 302 (introductory education courses) would be provided a list of suggested artifacts, samples of well-written rationales for artifact selection, and guiding questions for determining under which outcome an artifact should be placed. Tegrity sessions in Blackboard were created to demonstrate this entire process. Since 2006, students’ portfolio assessment scores have increased significantly: the target outcome of one hundred (100) percent was achieved for all nine student learning outcomes on the Senior Methods Portfolio and the Culminating Portfolio in both 2007-08 and 2008-09.

Another assessment method used by the COE is the Principals’ Survey, which assesses the nine student learning outcomes of beginning teachers. Historically, classroom management has been the outcome showing the greatest need for improvement: the highest percentage of unacceptable ratings by principals (ranging from 3% to 9%) occurred for the desired outcome of being a capable classroom manager. These data were also supported by PEPE scores during internship and from principals for first-year teachers. Over the last year, classroom management scores for interns and for first-year teachers have shown significant improvement, as indicated in the Consolidated Annual Assessment Plan.

A third assessment method used by the COE is the Praxis Content Knowledge Tests. The Praxis tests, which are specific to each major, must be passed prior to internship; teacher candidates in Alabama cannot seek certification unless the appropriate Praxis Content Knowledge Test is passed. After the first several administrations of the Praxis tests, it became apparent that students needed preparation for them; therefore, beginning in the fall of 2007, COE faculty began conducting study and test-preparation sessions prior to each administration. Furthermore, faculty who teach methods courses discuss the content knowledge necessary to pass the Praxis tests. One hundred (100) percent of COE students have earned passing scores on the Praxis Content Knowledge Tests for the last two years.

These are just three examples of how the College of Education has utilized assessment data to make programmatic improvements.

Computer Science:

An analysis of the 2007-08 Computer Science exit exam results revealed lower overall scores. Only 59.4% of the students scored at or above the 75th percentile, roughly twenty percentage points below the expected outcome of 80%. Further item analysis of the exam identified weaknesses in topic areas related to programming techniques and applications traceable to CS 317 and CS 318. Aggregated scores from capstone course exams (CS 452) demonstrated that 73% of the students achieved acceptable levels of performance, further confirming that the exit exam scores were negatively affected by weaknesses in basic programming. Consequently, the program decided to focus its attention on strengthening the content of the programming courses (CS 317 and CS 318). Faculty added material with a strong emphasis on programming and debugging techniques and applications, and a revised course/lab syllabus was used in 2008-09. Assessment of learning outcomes in CS 318 during 2009 revealed that 90% of the students sampled (n=100, N=423, where N is enrollment in all CS/CIS/CN classes) scored 75% or above on assignments such as programs, tests, and problem-solving tasks. In addition, students in CS 318 overwhelmingly reached the target level on programming assignments, exceeding expectations by 10 percentage points. Although it is too early to determine whether improvements in the programming areas will raise exit exam scores once this group of students reaches the capstone course, the faculty continues to monitor student progress in basic programming skills as the foundation for continuous improvement.

Assessment of General Education Requirements

The assessment of general education outcomes of entering students is based on the results of the ETS® Proficiency Profile Test (formerly MAPP), designed to measure a common set of skills: critical thinking, reading, writing, and mathematics. The test, implemented in spring 2010, will allow the University to assess the effectiveness of the general education background of incoming students in order to develop teaching and learning strategies that address identified weaknesses. Due to its recent implementation, no data are available at this time. Additional information on the assessment of general education requirements is presented in the narratives for CR 2.7.4 and CS 3.5.1.

Prior to the implementation of the Proficiency Profile Test, these competencies were evaluated individually at the program level through each College’s respective assessment processes. For instance, critical thinking, writing, and math skill levels are captured through pre-test exams given in a required introductory management course in the College of Business, while the College of Education uses the Alabama Prospective Teacher Test (APTT), as mandated by the State Department of Education. Since the same skills remain learning outcomes at the degree program level, each College tracks student performance as students advance through the curriculum. Educational support resources such as the Math Lab and the Writing Center have been available to students as needed.

Assessment of Distance Learning

Since standards and expected learning outcomes for all academic programs are the same for distance learning (DL) and traditional instruction, both on and off campus, assessment data for DL is captured through the institutional outcomes assessment process used by all academic programs. In addition to learning outcomes, the Institutional Effectiveness Matrix reflects program-operational and service delivery outcomes associated with the support of DL programs under the Center for Instructional Technology.

Learning Outcomes

Prior to enrollment, the University provides all incoming students with the opportunity to assess their own readiness for a distance learning environment through the “Should I Take a Distance Learning Course?” survey. This voluntary self-assessment instrument, administered online prior to the student’s first enrollment in a distance learning course, captures a set of skills and lifestyle factors associated with the likelihood of success in a distance education environment. Results for 2009 indicate that approximately 95% of the students who took the survey (n=1202) scored above 5 points, a score suggesting that the student has the basic skills to complete an online course successfully.

Course and Program Level Learning Outcomes: As noted above, because standards and expected learning outcomes are the same for DL and traditional instruction, assessment data for DL is primarily captured through the institutional outcomes assessment process, as reflected in the Consolidated Annual Assessment Plans. Starting in 2009, some academic programs initiated comparative analyses of learning outcomes between instructional formats in selected courses. The methodology varied by College, since there is no standardized university-wide methodology for strictly DL assessment. Preliminary data produced mixed results, and faculty deliberations concerning reliable methods to compare outcomes between the two formats continue.

To support an overall and integrated assessment strategy for distance learning, the University has created a new senior administrative position, the Associate Vice President for Academic Affairs. Among other duties, this administrator will supervise the implementation of a systematic assessment and strategic planning regimen for all aspects of the distance learning academic program. All campus online delivery systems are evaluated every semester, with the Blackboard Course Content Evaluation form serving as the basic mode of analysis.

ASU has also adopted a number of technology-based tools to address differences between traditional classroom delivery and technology-based course delivery. The institution uses Tegrity, an asynchronous delivery tool that allows faculty to embed voice and video into online courses; the platform also allows students to view a whiteboard showing the instructor working problems and writing on the board, similar to the experience in a traditional course. The University also uses Wimba, which supports both synchronous and asynchronous methods of course delivery. Synchronous classes, which are fully interactive in “real time,” can be recorded and made available in an asynchronous format for replay by students. The institution strives to make the online experience as close to a traditional classroom experience as possible. Additional information on technology used in distance learning courses can be found in the narrative for Comprehensive Standard 3.4.12.

Indirect methods include comparisons of students’ self-assessments of eighteen entering and exiting competencies (KSA) captured via the Graduating Senior Exit Survey (GSES). Results from 2009 indicate comparable competency levels for DL (n=583) and Non-DL (n=342) students. Because this assessment was initiated in Spring 2008, the reliability of the data cannot yet be fully established, and additional data are required.

Modality Assessment (Student Academic Profile): The University also compares course grade distributions and retention rates by delivery format. Although course grades are not by themselves an adequate method of assessment, they provide a general view of student success in coursework completion. Grade comparisons between online and traditional courses for academic years 2005 through 2009 indicate comparable performance between DL and Non-DL students (within two percentage points) for grades of B, C, D, I, and W, with differences of more than two percentage points for grades of A and F.

Student Retention: Comparative data on retention rates by delivery format from Fall 2006 to Fall 2009 indicate that DL students have lower retention rates than Non-DL students. However, the same analysis shows a 14% increase in the retention rate of DL students over that period, compared to a 4% increase for Non-DL students. Interpretations of the overall retention rate of DL students must consider external factors other than the delivery format itself. This is of particular relevance to the University given that the student profile indicates that the majority of ASU students attend school part-time, work full-time, and have family obligations, all circumstances commonly associated with lower retention in either a distance learning or a traditional environment.

While distance learning retention is a national challenge, the University has attempted to provide imaginative programs to resolve any technology-based obstacles that might affect retention. Accordingly, resources have been put in place to address the needs of students enrolled in distance learning courses. A separate student support technology office under the Office of the Vice President for Academic Affairs addresses student technology issues related to the course delivery systems, as distinct from issues related to faculty use of technology. A Student Orientation, available online, introduces students to all aspects of the institution. SMARTHINKING, an online tutorial service, provides subject area tutors free of charge to ASU students. Presidium, a 24-hour technology service desk, assists online students who have problems accessing the course management system. The institution continues to assess how this aspect of its academic program can be improved, including through regular assessment of support services related to distance learning.

Program Operational Outcomes

In addition to the learning outcomes (KSA) assessment conducted at the course and program level, the University evaluates the overall quality of its academic programs as it relates to curriculum and courses, faculty, technology and learning resources, and support services for both DL and campus-based instruction.

Teaching and Learning Experience: Data from the DL Faculty and Student Surveys capture the perspectives of both faculty and students regarding the teaching and learning experience. Both surveys were instrumental in the implementation of the Distance Learning Program during its initial and developmental stage. Currently, both surveys are being revised to reflect a new stage in the life cycle of Distance Learning.

Course Content and Teaching Effectiveness: College Deans and/or Department Chairs evaluate course content and the management of instructional technology, by instructor and course format, as one of several components of each instructor’s overall evaluation. Samples of completed Blackboard Course Content Evaluation Forms demonstrate that DL and campus-based courses compare favorably in course content and in instructors’ use of technology to support both formats. In addition, student evaluations of individual courses and instructors, captured through the Faculty Course Evaluation (FCE) at the end of each term, are used to assist faculty in making their own plans for improvement. FCE data from 2008-09 indicated high student ratings of overall course quality (mean=4.27) and instructors’ teaching effectiveness (mean=4.21), although DL student ratings were slightly lower (means=4.23 and 4.17) than those of Non-DL students (means=4.38 and 4.36).

DL Faculty Professional Development Assessment: The Faculty Instructional Technology Survey (FITS), conducted annually by the Center for Instructional Technology (CIT), captures information on faculty usage of technology, training, and the support services available to facilitate teaching and learning. Data captured through the CIT’s internal records document faculty participation in technology-based training activities. Data from the 2009 survey (n=26) indicated 96% faculty participation in at least one technology-related training or webinar session, and nearly 96 percent of respondents agreed or strongly agreed that faculty instructional technology training is available upon request.

Service Delivery Outcomes Assessment

Student and Faculty Technology Support: A variety of online surveys capture data on the availability, accessibility, and quality of the resources supporting distance learning for students and faculty. Results from three specific questions in the 2008 and 2009 GSES indicate an increase in student satisfaction/agreement regarding instructors’ use of web-based course management tools, information technology course support, and technology support assistance related to access and training. Satisfaction with the Academic Environment, University Life, and Student Services is slightly higher for DL than for Non-DL students, as captured in the 2008-09 GSES. Specific questions from the FITS indicate that 80% of faculty respondents agree that there are adequate support services for academic use of technology and that technical support assistance is available upon request.



Documentation    

