University-wide, there are twelve organizational units, identified for their impact on institutional effectiveness, that provide administrative support to the overall operations of the University and are not directly related to any academic component as defined in the ASU Outcomes Assessment System. The organizations in this category are: Auxiliary Services; Business Office; Campus Security; Human Resources; Information Technology; Physical Plant & Maintenance; Printing and Publications; Alumni Association; ASU Foundation; Public Relations & Marketing; the Office of Institutional Planning, Research and Assessment (OIPRA); and the ASU Off-Campus Sites.
All administrative/support organizational units have formulated and measured expected outcomes at two operational levels: program operational and service delivery. The Institutional Effectiveness Matrix reflects all performance indicators associated with the administration of a particular service or function to support continuing operations and includes management of available resources to support the organizational function.
Evidence-based methods used to measure the effectiveness of administrative support functions (program operational outcomes) include program/office internal records or log systems and, where applicable, third-party audits and certifications. Data-gathering methods and techniques used to measure service delivery outcomes are based on satisfaction or quality ratings of services captured via organization-specific Point-of-Service (POS) surveys and the Graduating Senior Exit Survey (GSES). Information obtained through the assessment process is integrated into the organization’s next planning and budgeting cycle.
Evidence of all outcomes assessment activities, including their identification and measurement and the actions planned as a result of the evaluation process, is presented in the organization’s assessment documents (AAP, AAR, AP), including actual data from internal records, audits/certifications, surveys, and the organization’s Assessment Compliance Certification (ACC).
As reported in the Institutional Effectiveness Continuous Improvement Report (2007-08, 2008-09), compliance with the assessment cycle required by the University (sample: Business Office ACC) among all administrative and non-academic student support services units was 92% in both 2007-08 and 2008-09. The one organization that did not conduct formal assessment activities in 2008-09 has submitted its AAP for 2009-10 and is now officially in the Assessment Management Online System (AMOS).
Evidence of the use of assessment results in operational improvements is reported in the Institutional Effectiveness: Programs’ Use of Assessment Findings for Continuous Improvement report, which categorizes the actions planned/taken by each administrative unit. Examples of planned or executed changes resulting from assessment findings are also reported in the Institutional Effectiveness Continuous Improvement Report.
The following examples demonstrate how administrative support organizations have used their results to bring about improvement in their operations.
Business Office:
In 2005-2006, the Business Office was experiencing severe problems with payment-processing down-time and after-hours monitoring of the payment server, compromising the efficiency of both student and University business transactions. To address these problems, the Business Office, in coordination with the Information Technology Department, began exploring new technology to provide more reliable services for students and related entities while ensuring that regulatory guidelines were followed. After careful analysis, contracting with an external host for the payment server was determined to be the most effective and cost-efficient solution. The University has since successfully migrated to external hosting of its payment server for tuition and fees and for applications; the online applications were successfully implemented in November 2008. Although exact percentages could not be determined, monitoring time is estimated to have been reduced by 95% to 97%, and total down-time for payment processing was less than 5%. Further analysis indicates that down-time required for routine maintenance and upgrades represents less than 1% of total time, and unscheduled down-time for payment processing was estimated at less than 1%. After-hours monitoring by University personnel has been reduced by almost 100%. Furthermore, economies of scale were realized through the enhanced ability to send e-bills, which improves the timeliness of billing for student receivables and, in many instances, eliminates the need for mailed bills, lowering paper and mailing costs. Internal records indicate that postage and paper costs were reduced by 5.5% and 18.6%, respectively. Student payment options have been expanded with the addition of Discover, American Express, and e-Checks. Results from the 2008-09 Graduating Senior Exit Survey indicate that 94.96% of students are highly or somewhat satisfied with the billing/fee procedures.
Campus Security:
Although Athens State University has historically been considered a safe place to learn, work, and visit, as evidenced by its Clery Act Reports, the institution is very proactive in executing risk-management strategies. Following an analysis of incident reports that revealed a minor increase in crime (one burglary case) during 2008, the University immediately added more surveillance equipment strategically located throughout the campus. As a preemptive measure against the possibility of natural or man-made disasters, the University implemented the e-Campus® emergency notification system. Continued monitoring of this system identified performance and cost inefficiencies, and the University promptly replaced it with SchoolCast, a system that provides equal or better services and capabilities at a lower cost. SchoolCast communicates swiftly and effectively with all employees and students at once, via mobile, desk, or home phone, email, and the Web, regarding important information and impending emergencies that may require immediate action. Over ninety-six percent (96.21%) of 2008-09 graduating seniors reported “feeling safe on campus.”
Information Technology Department:
In response to increasing student demand for distance education and related services, and the need for expanded uses of information to support both administrative and academic environments, the Information Technology (IT) Department has taken assertive actions to improve its infrastructure capabilities and support services. On the administrative side, IT upgraded the local area network infrastructure, upgraded Internet connectivity, initiated wireless Internet service for the main campus, set up off-site redundancy/backup for mission-critical resources, and segmented the main campus local area network for better traffic flow and efficiency. In addition, IT expanded online tools to support registration and payment, increased student and faculty e-mail mailboxes to 10 gigabytes of storage (an increase of approximately 200%), and implemented a friendlier helpdesk interface for faculty, staff, and students. On the academic support side, IT completed the global implementation of Tegrity software, which instructors use to create audio/video lessons for streaming to their traditional-lecture and distance learning students; installed DriveShield software in several of the labs to better manage, control, and protect the configuration settings of student-accessed computers, preventing unauthorized configuration changes and software installations on laboratory machines; and added Smart-Classroom hardware (computers, projectors, document cameras, WACOM tablet monitors, and associated mounts and wiring), extending Smart-Classroom technology to the remaining traditional classrooms on campus. In 2008-09, student satisfaction with technology course support; classroom, lab, and studio facilities; and technical support in matters of access, training, and information was 82.16%, 86.24%, and 83.33%, respectively.
Office of Institutional Planning, Research, and Assessment:
In response to requests from faculty and staff, OIPRA reviewed the data-analysis and reporting protocols for all surveys conducted throughout the University. Following implementation of the new protocols, OIPRA began reporting survey results on a yearly basis (as opposed to by individual term) and expanded the analysis of the GSES to include segmentation by major and course schedule. Data users’ initial response has been positive, indicating that the new reporting format facilitates the interpretation and use of survey assessment data and speeds up the completion of the Annual Assessment Report. Satisfaction with OIPRA’s services was 95% in 2009.