Compliance Certification Report


Comprehensive Standard 3.3.1.3    


Judgment of Compliance    


Rationale for Judgment of Compliance    

University-wide, sixteen organizational units, identified for their impact on institutional effectiveness, provide academic and non-academic support services to students and faculty. The ASU Outcomes Assessment System defines student support services in two categories based on their support of academic or non-academic activities.

Student Support Services-Academic/Extracurricular comprises those support functions with an academic component not directly connected to curriculum requirements. Organizations in this category are: the ASU Library, the Accounting Lab, Admissions and Records, the Center for Instructional Technology (CIT), Academic Advising (Faculty), the Math Lab, Testing Services, the Transfer Advising Center, and the Writing Center.

Student Support Services-Non-Academic comprises those support functions with no direct academic component that are nonetheless considered relevant in assisting students in pursuing their educational goals. Organizations in this category are: Career Services, Counseling Services, Disability Services, Recruitment, Student Activities, Student Financial Services, and Veterans Affairs.

All student support services have formulated and measured expected outcomes at two operational levels: program operational and service delivery. The Institutional Effectiveness Matrix reflects all performance indicators associated with the administration of a particular student service or function supporting academic and non-academic activities and includes management of available resources to support the organizational function. Some organizations supporting academic functions have also included student learning outcomes external to the curriculum. For instance, the Library uses the Information Literacy Test to capture students’ knowledge and skills in finding, evaluating, and organizing information sources. Results from this test are included in the Library’s Consolidated Assessment Plan.

Evidence-based methods to measure the effectiveness of academic support functions (program operational outcomes) include program/office internal records or log systems and third-party audits and certifications, if applicable. Data gathering methods and techniques used to measure service delivery outcomes are based on satisfaction or quality ratings of services captured via organization-specific Point-of-Service (POS) surveys and the Graduating Senior Exit Survey. Information obtained through the assessment process is integrated into the organization’s next planning and budgeting cycle.

Evidence of all outcomes assessment activities, including their identification and measurement and the actions planned as a result of the evaluation process, is presented in the organization’s Consolidated Annual Assessment Plan (AAP, AAR, AP), along with actual data from internal records, audits/certifications, and surveys and the organization’s assessment compliance certification (ACC).

As reported in the Institutional Effectiveness Continuous Improvement Report (2007-08, 2008-09), academic/extracurricular support services organizations’ compliance with the assessment cycle, as required by the University, was 29% in 2007-08 and 67% in 2008-09. Three organizations that did not conduct formal assessment activities in 2007-08 and 2008-09 have submitted AAPs for 2009-10, which are now available in AMOS. Compliance with the assessment cycle for non-academic student support services was 100% in both 2007-08 and 2008-09. All have submitted the AAP for 2009-10.

Evidence of the use of assessment results in operational improvements is reported in the Institutional Effectiveness: Programs’ Use of Assessment Findings for Continuous Improvement report, which categorizes the actions planned/taken by each administrative unit. Examples of planned or executed changes resulting from assessment findings are also reported in the Institutional Effectiveness Continuous Improvement Report.

All Consolidated Annual Assessment Plans are available to reviewers through this link.

The following examples demonstrate how academic and non-academic student support organizations have used their results to effect continuous improvement.

Athens State University Library

Driven by strong student demand for online courses and subsequent enhancements of technological resources, the Athens State University Library has taken assertive action to ensure that students participating in distance-learning courses have access to adequate and appropriate learning resources. Information from surveys and focus groups indicated the need for an increased focus on library instruction and easier access to and navigation of the web page. Based on these results, the Library made a commitment to enhancing the instructional component of the Library web site. Librarians made an effort to promote the embedded librarian service more proactively. A Library tab was added to the Blackboard course management system to identify the Library more clearly in that key online learning resource. In addition, Library staff uploaded all electronic journal holdings into the catalog.

The Library also made additional improvements following a usability analysis of its web page conducted by students enrolled in a computer software class. After discussing the results with the Library staff, several actions were taken. In the migration to the new Library catalog, the Library focused on designing a clear, simple, easy-to-use interface. In addition, it strengthened the instructional component of the Library web page, and the Library staff began the creation of a visible FAQ section on the web page.

As a result of these efforts, students have easier access to the University Library and also the virtual library resources within the State of Alabama. Library services available to distance students include the holdings that are accessed through the online library catalog, approximately 40,000 electronic books, online databases, full-text journals available online, research assistance, and inter-library loan. Distance learning students may access the virtual library by using an access login and password that are provided by the Library. Results from the Graduating Senior Exit Survey conducted in 2008-09 indicated that 89.49% of students are highly satisfied with the ASU Library.

Math Lab

Prior to the implementation of the ASU Outcomes Assessment System, data for the Math Lab were mostly anecdotal, with no real capability to track performance. However, there appeared to be consensus that student attendance was low, resources were not optimal, and operations were not appropriately supervised. Taking advantage of the assessment process, in 2007-08 the Math faculty began deliberations regarding the Lab, formulated expected outcomes, and initiated the development of internal operational procedures and documentation to allow the systematic collection of information on which to base decisions. Although the Math Lab achieved 80% student satisfaction with its services by the end of 2007-08, based on data from the Graduating Senior Exit Survey (GSES), concerns about the operations of the Lab remained. Furthermore, the faculty felt that the general nature of the GSES did not provide enough information to make performance determinations. Consequently, the Point-of-Service Survey for the Math Lab was implemented in 2008-09. In the Spring of 2009, the Department Chair appointed a Math professor as Lab Director. Following the appointment of the Director, the upgrade of instructional materials, the development of a log system, new operational procedures, and promotional strategies aimed at students and faculty, the Math Lab had realized considerable improvement by the end of the 2008-09 academic year.

During the Spring and Summer of 2009, 179 students from 10 different majors and 14 different courses requested and received tutorial services from the Math Lab. Through expanded Lab hours including evenings, Lab tutors provided an estimated 176.79 hours of tutorial assistance during the Spring 2009 semester and 165.49 hours during the 10-week Summer 2009 session. Mean scores of student satisfaction (possible maximum=5) obtained via the GSES increased from 3.86 in 2007-08 to 4.17 in 2008-09. Mean scores from the POS survey during 2008-09 ranged from 3.95 to 4.40 out of a possible maximum of 5.0. The highest score, 4.40, was for the Math Lab’s ability to provide needed help, followed by “qualified tutors” (4.30), “accessibility to learning resources” (4.15), and “lab facilities” (3.95). Overall experience with the Math Lab was positive, with a mean score of 4.25.

Transfer Advising Center

The Transfer Advising Center (TAC) was established in 2006 in response to inefficiencies identified in the transfer process. Prior to its establishment, the evaluation and application of transfer credits was done individually by faculty in each College, who may or may not have been familiar with the statewide general education requirements and articulation agreement stipulated by the State of Alabama. As a result, there were inconsistencies in the evaluation and acceptance of transferred credits in general education courses, creating problems for students at the time of registration. This problem was of particular relevance to the University, given that, as the only upper-division baccalaureate degree-granting institution in the Alabama Community College System (ACCS), Athens State’s primary mission is to serve the needs of ACCS transfer students. In addition, there was a gap between the time a student was accepted to the University and the assignment of a faculty advisor to follow up with major-specific programs of study. With the establishment of the TAC, trained counselors are centralized at a single location, and newly admitted students are provided with an initial program of study based on their choice of major and are assigned a faculty advisor. A further improvement in 2008 and 2009 was the implementation of individual online program guides for all three colleges, which has reduced the processing time of transfer credit evaluations by 2%. In 2008, internal records indicated that TAC counselors assisted approximately 90% of all new degree-seeking transfer students by assigning a faculty advisor immediately upon admission. Further analysis of internal records indicates that cross-training and frequent staff meetings produce consistency in the counseling of new students regarding general education requirements.

Comments obtained from the TAC Point-of-Service Survey indicate that students who seek the services of the TAC encounter fewer problems than those who do not utilize the TAC. Data also indicate high quality ratings for timeliness and helpfulness of information, with mean scores of 4.62 and 4.73, respectively, and overall satisfaction at 4.66 (maximum score=5.0). Results from the 2008-2009 GSES indicate that 86.75% of students were satisfied or somewhat satisfied with transfer advice and the appointment of a faculty advisor, and 82.84% with the guidance received at the time of initial admission.

Office of Admissions and Records

Following results of assessment findings, the Office of Admissions and Records has taken assertive actions regarding business processes and development of technology-based applications in the admissions, registration, and graduation processes. These actions have resulted in improvements in system procedures, data integrity, and expanded use of online resources, significantly enhancing the effectiveness and efficiency of the office.

During the last two years, in response to student demand, the Office of Admissions and Records expanded the availability of online services to distance learning and campus-based students via the ASU website. As a result, students have the capability to: (1) apply, track their application, and register online; (2) view and print their student transcripts online; (3) download the forms needed to complete 21 available transactions (such as Change of Major, Change of Address, Readmission, Graduation Application, etc.); and (4) view the institution's policies via the online or hard copy catalog.

Eighty percent of new applicants registered online in 2008-09 compared to 32% in 2007-08, an increase of 48 percentage points, or 150%. The proportion of students who used online registration rather than registering in-house grew 8% from Spring 2008 to Summer 2009. This growth is significant since most of the efforts regarding enhanced availability of online resources were initially targeted at newly admitted students. Continuing growth in overall student use of these capabilities is expected. In addition, online requests for student transcripts through the Transcript Ordering Service from the National Student Clearinghouse doubled between 2008 and 2009, from 4.9% to 9.4% of total transcripts requested.
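The distinction between the absolute change (percentage points) and the relative change (percent) in the online-registration figures above can be verified with a short illustrative calculation; the figures are those reported above, and the variable names are ours:

```python
# Illustrative check of the online-registration figures reported above.
rate_2007_08 = 32.0  # % of new applicants registered online in 2007-08
rate_2008_09 = 80.0  # % of new applicants registered online in 2008-09

# Absolute change, expressed in percentage points
point_change = rate_2008_09 - rate_2007_08  # 80 - 32 = 48 percentage points

# Relative change, expressed as a percent of the earlier rate
relative_change = (rate_2008_09 - rate_2007_08) / rate_2007_08 * 100  # 48/32 = 150%

print(point_change, relative_change)  # 48.0 150.0
```

Both figures describe the same growth; the report states them together because each answers a different question (how much of the applicant pool shifted, versus how fast the online rate grew).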

Business processes and documentation were reviewed and changed to improve the Office’s ability to fulfill its functional responsibility of admitting students and delivering its services. The average processing time for a new application for admission was typically 3 days. A review of the process indicated that waiting on a signature page from new students was causing the delay. With no major impact on the integrity of the process, the signature page was removed, resulting in a 67% reduction in processing time, from 3 days to 1 day.

An Admissions and Records Service Log was implemented in April 2008 that allows the Office to track its productivity through improved documentation of day-to-day operations. Through this enhanced ability to document heavy workloads, the Office was able to identify needed resources and was better prepared to justify a budget request for an additional staff member and professional development funds for current staff.

Through better technology-based testing mechanisms, improved business processes, and staff training, the Office increased the accuracy of student records and its ability to identify and correct errors at the source. As a result, errors in student records were reduced by 29%, from 131 identified errors in July 2007 to 93 errors in July 2009.



Documentation    

