OFFICE OF TECHNOLOGY SERVICES (OTS)

Core Technologies and Security

Software and Hardware Testing Process

Before a new application, operating system, computer system or peripheral can be fully supported by OTS, it must first pass a rigorous testing procedure. Only once it has been thoroughly tested can we recommend it for campus use. OTS has developed the following methodology based upon experience and industry standards.

Software Testing

The first step in testing software is a review of its documentation. What are the new features? What features are discontinued? What has changed or been updated? What are comparable features in other products?

Next, we analyze the answers to these questions and weigh them against other factors. If a new version adds many new and useful features over its predecessor, is the cost of upgrading worth the additional functionality? Will the additional features (or the loss of previous features) create functional or support problems that make the switch undesirable?

If it is decided that the upgrade cost is worth the additional functionality, then the application must be tested for interoperability. (Interoperability may still be a show-stopper, but application functionality should be the initial, primary focus of testing.) How the application interacts with other standard applications and the operating system is difficult to test, since there are thousands of variables and scenarios. Our testing procedure runs the application through its paces on a standard configuration system via a testing utility that performs the tasks a basic user would perform and logs the results. For example, if testing a word processor, the utility creates a new document, opens an existing document, makes changes and prints, among other tasks. Typical functions that require interaction with another application are also tested.
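
The testing utility itself is not named on this page, so the following is only a minimal sketch, in Python, of the general shape such a harness might take: each scripted user task runs in sequence, and the result is logged for later review. All function and file names here are hypothetical.

    import logging

    logging.basicConfig(
        filename="app_test.log",
        level=logging.INFO,
        format="%(asctime)s %(levelname)s %(message)s",
    )

    # Hypothetical task stubs: a real harness would drive the application
    # under test, e.g. through an OS-level automation layer.
    def create_new_document():
        pass

    def open_existing_document():
        pass

    def edit_and_print_document():
        pass

    def run_task_suite(tasks):
        """Run each scripted user task and log a PASS/FAIL result."""
        results = {}
        for name, task in tasks:
            try:
                task()
                results[name] = "PASS"
                logging.info("%s: PASS", name)
            except Exception as exc:  # one failure should not stop the suite
                results[name] = "FAIL: %s" % exc
                logging.error("%s: FAIL (%s)", name, exc)
        return results

    if __name__ == "__main__":
        suite = [
            ("create new document", create_new_document),
            ("open existing document", open_existing_document),
            ("edit and print", edit_and_print_document),
        ]
        print(run_task_suite(suite))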

The application must also be tested against the campus network and any applicable enterprise systems. Nearly every application today has some sort of network functionality, even one as basic as a word processor. In the highly network-managed environment here at Towson, it is extremely important that applications interact well with TowsonU, SMS, Exchange, LDAP and other enterprise systems.
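
As a small illustration, a first interoperability check might simply confirm that the test machine can reach each enterprise service at all. The Python sketch below does only that; the hostnames and ports are placeholders, not Towson's actual endpoints.

    import socket

    # Placeholder endpoints -- real hostnames and ports would come from
    # OTS network documentation, not from this sketch.
    SERVICES = {
        "LDAP": ("ldap.example.edu", 389),
        "Exchange": ("mail.example.edu", 443),
    }

    def check_reachability(services, timeout=3.0):
        """Report whether each service endpoint accepts a TCP connection."""
        for name, (host, port) in services.items():
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    print("%s: reachable at %s:%d" % (name, host, port))
            except OSError as exc:
                print("%s: unreachable (%s)" % (name, exc))

    if __name__ == "__main__":
        check_reachability(SERVICES)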

Once the application has undergone a closed environment test, it must then be tested in a limited user environment, often referred to as a pilot project. A small group of users volunteer, or are asked, to test the software in production for a preset amount of time. The application rollout plan is then modified based upon their experience and feedback.

If at any point in this testing process the application is deemed not worthwhile (due to cost, insufficient functionality, too many bugs, etc.), its classification is changed from "testing" to "unsupported."
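
This page names only the "testing" and "unsupported" classifications explicitly (with "fully supported" implied earlier), so the sketch below simply makes that lifecycle concrete; the class and function names are hypothetical.

    from enum import Enum

    class SupportStatus(Enum):
        TESTING = "testing"
        SUPPORTED = "supported"      # i.e., "fully supported by OTS"
        UNSUPPORTED = "unsupported"

    def fail_testing(status):
        """An application that fails any stage of testing moves straight
        from 'testing' to 'unsupported'."""
        if status is SupportStatus.TESTING:
            return SupportStatus.UNSUPPORTED
        return status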

This entire process should take between three and six months to complete, depending upon the complexity of the application and/or its implications for the campus. Sometimes uncontrollable variables delay testing, such as the release of an update by the manufacturer, a necessary third-party utility or back-end system, or budgetary concerns. The start of the testing process is also not always clearly defined: OTS often acquires beta versions with which some of the testing can be done, sometimes accelerating the process (though the entire testing process must be completed with the final version).

Hardware Testing

The hardware testing methodology is very similar to the software testing process, but with fewer steps. Except in extremely rare circumstances, hardware does not undergo a pilot project, and OTS relies upon the manufacturers to have rigorously tested the basic functions of the hardware.

Prior to even receiving a piece of hardware for testing, OTS evaluates the device based upon experience, industry standards and the needs of the campus. Some of the common variables considered when evaluating hardware are listed below; a sketch of how such criteria might be weighed follows the list.

  • Desktops: Cost; memory size, speed and type; hard drive capacity and speed; processor speed required by tier-one applications, both currently and as projected over the next three years; monitor size; keyboard and mouse functionality; chassis size and functionality; parts warranty; maintenance record; acoustic properties; and available I/O ports.
  • Laptops: Same variables as the desktops, as well as docking solutions; battery life; size and weight; spindle size (number of slots for peripherals); wireless network functionality; and portability.
  • Printers: Cost; speed; quality; acoustic properties; ease of cartridge replacement; cost of cartridge and other parts; warranty; maintenance record; and I/O interface.
  • PDAs: Cost; memory; operating system; screen quality and resolution; size and weight; synchronization capabilities; wireless functionality; multimedia capabilities; and expandability.
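
One common way to weigh criteria like these is a weighted scoring matrix. The Python sketch below is purely illustrative: the criteria, weights and ratings are invented for the example and are not OTS's actual rubric.

    # Illustrative weighted-score comparison; weights and ratings are invented.
    WEIGHTS = {
        "cost": 0.25,
        "performance": 0.25,
        "warranty": 0.15,
        "maintenance_record": 0.15,
        "acoustics": 0.10,
        "io_ports": 0.10,
    }

    CANDIDATES = {
        "Laptop A": {"cost": 7, "performance": 8, "warranty": 9,
                     "maintenance_record": 8, "acoustics": 6, "io_ports": 7},
        "Laptop B": {"cost": 8, "performance": 7, "warranty": 7,
                     "maintenance_record": 9, "acoustics": 8, "io_ports": 6},
    }

    def weighted_score(ratings, weights):
        """Combine 0-10 ratings into a single comparable score."""
        return sum(weights[c] * ratings[c] for c in weights)

    for name, ratings in sorted(CANDIDATES.items()):
        print("%s: %.2f" % (name, weighted_score(ratings, WEIGHTS)))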

A primary concern in evaluating hardware is to keep parity between comparable devices (Macintosh versus Windows, Palm versus Pocket PC) and to avoid duplicating devices. For example, the specifications for the recommended Dell laptop and the Apple laptop are kept as comparable as possible. However, we will not recommend two Dell laptops that are extremely similar (e.g., the Latitude C600 and Latitude C400). There must be a compelling reason to recommend more than one device in the same category.
