Informal Science Education II
EEL 5881 - Software Engineering - Fall 2000
|Version 0.0||September 15, 2000||G. H. Walton||Templater|
|Version 1.0||November 13, 2000||Michael Wales||Initial document|
Team Name: Miles Computer Engineering
Contents of this Document
Overall Objective for Software Test Activity
SECTION 1: Introduction
SECTION 2: Description of Test Environment
Most of the testing will be performed in the same environment in which the code is developed, since that is the most convenient for us. The software will also be tested on both a Linux and an SGI machine. Because we are using Java, we do not anticipate any differences in the way the program operates under another OS, but we would like to document any abnormalities should they occur. The environments we are using for development are:
The testing will be performed by the developers. Whenever possible, Jeff and Mike will test the modules developed by the other group member to get as great a variety during testing as possible. The testing performed during the code's development will, of course, be done by the person creating the module.
SECTION 3: Stopping Criteria
Most of our testing will be performed as the code is developed. The developer will create a short section of code, recompile the affected classes, and then perform an exhaustive test on that module to check it for errors. If each method is thoroughly checked as it is created, we should eliminate most of the bugs that would otherwise be found during the final testing phase.
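As a minimal sketch of this per-module checking, the hand-rolled harness below exhaustively exercises one small module as it is written. The State class and its name-length rule are illustrative assumptions, not the project's actual API.

```java
// Hypothetical sketch of checking a single module as it is developed.
// The State class and the 32-character name limit are assumptions.
public class StateModuleTest {
    // A minimal stand-in for the simulator's State class.
    static class State {
        private final String name;
        State(String name) {
            if (name == null || name.length() == 0 || name.length() > 32) {
                throw new IllegalArgumentException("bad state name");
            }
            this.name = name;
        }
        String getName() { return name; }
    }

    // Returns true if constructing a State with this name is rejected.
    static boolean isRejected(String name) {
        try { new State(name); return false; }
        catch (IllegalArgumentException e) { return true; }
    }

    public static void main(String[] args) {
        // Exercise the valid case and every invalid case of the method.
        assert !isRejected("Alive");   // valid name is accepted
        assert isRejected("");         // empty name is rejected
        assert isRejected(null);       // null is rejected
        assert isRejected(new String(new char[40]).replace('\0', 'x')); // too long
        System.out.println("all module checks passed");
    }
}
```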
When the final testing is performed, we will focus on correctness and stability. If all testing goes well, we expect each of us to spend about two hours testing. Any minor errors will be corrected at that time, while larger problems will be documented for later fixing. If there are more than three or four large problems, testing will be suspended until we can fix them and start the tests over.
To test for correctness, we will compare our simulation against some of the simulations that can be downloaded from the Internet. Many simple Life simulations are available for free; we can configure our simulation identically to a downloaded one and compare how closely the results match.
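One way to make this comparison concrete is to run a known pattern through a reference implementation of the standard Conway Game of Life rule and check our simulator's grid against it cell for cell. The sketch below implements the reference rule; our simulator's actual grid API is an assumption here.

```java
// Hedged sketch: comparing simulator output against a reference
// Game of Life step. step() implements the standard Conway rule
// (dead cells beyond the grid edge).
public class LifeComparison {
    // Advance a grid by one generation under Conway's rules.
    static boolean[][] step(boolean[][] g) {
        int rows = g.length, cols = g[0].length;
        boolean[][] next = new boolean[rows][cols];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                int n = 0;  // count live neighbors
                for (int dr = -1; dr <= 1; dr++)
                    for (int dc = -1; dc <= 1; dc++) {
                        if (dr == 0 && dc == 0) continue;
                        int rr = r + dr, cc = c + dc;
                        if (rr >= 0 && rr < rows && cc >= 0 && cc < cols
                                && g[rr][cc]) n++;
                    }
                next[r][c] = g[r][c] ? (n == 2 || n == 3) : (n == 3);
            }
        }
        return next;
    }

    // True if two grids agree cell for cell.
    static boolean sameGrid(boolean[][] a, boolean[][] b) {
        return java.util.Arrays.deepEquals(a, b);
    }

    public static void main(String[] args) {
        // A "blinker": three live cells in a row oscillate with period 2,
        // so after two steps the grid should match its starting state.
        boolean[][] g = new boolean[5][5];
        g[2][1] = g[2][2] = g[2][3] = true;
        boolean[][] after2 = step(step(g));
        System.out.println(sameGrid(g, after2) ? "match" : "mismatch");
    }
}
```

The same `sameGrid` check could be pointed at a grid exported from a downloaded simulation instead of the local `step()`.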
Our product will not be delivered to our customer if we feel that it is not correct, since a program is of little value if it does not perform its task correctly. We do not anticipate delivering the product with any bugs or glitches, but that decision can only be made after testing, once we know whether such problems exist.
SECTION 4: Description of Individual Test Cases
|Test Objective||To test the functionality and robustness of the state creation command in the state and transition editing menu.|
|Test Description||The functionality will be tested by correctly creating a state and its corresponding image. The robustness will be determined by supplying invalid inputs. The invalid inputs we will test are: an incorrect file name, an excessively long state name, no state name, loading a picture with incorrect dimensions, and loading a picture with corrupt contents. We will also load an image of each supported file type.|
|Test Conditions||The test will be performed in the state and transition editing menu.|
|Expected Results||The test should create a correct state and corresponding image when a valid image file is loaded. This will be verified by checking that the created state appears back in the main simulation interface. Failed cases should produce messages informing the user of the problem with the suspect input. The program should not crash in any of these commonly occurring cases.|
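The image-related failure cases above boil down to returning a dialog-ready message instead of crashing. The sketch below shows one hypothetical shape for that check; the 32x32 size requirement and the `validateImage` name are assumptions for illustration, not the project's real constants.

```java
// Hypothetical sketch of the image checks the state-creation test
// exercises. The 32x32 requirement is an assumed constant.
public class ImageCheck {
    static final int REQUIRED_W = 32, REQUIRED_H = 32;

    // Returns null if the image is usable; otherwise an error message
    // suitable for showing the user in a dialog box.
    static String validateImage(int width, int height) {
        if (width <= 0 || height <= 0)
            return "Image file appears to be corrupt or empty.";
        if (width != REQUIRED_W || height != REQUIRED_H)
            return "Image must be " + REQUIRED_W + "x" + REQUIRED_H + " pixels.";
        return null;  // accepted
    }

    public static void main(String[] args) {
        System.out.println(validateImage(32, 32)); // accepted (prints null)
        System.out.println(validateImage(64, 64)); // wrong dimensions
        System.out.println(validateImage(0, 0));   // corrupt/empty contents
    }
}
```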
|Test Objective||To make sure the program correctly verifies and saves the data entered into the text fields of the simulation state and transition editing window.|
|Test Description||The tester will load the program, enter the simulation state and transition editing window, and enter some parameters into the text fields. All text fields but one will be filled with valid input, and that one will be incorrect. The tester will place invalid data into each text field in turn to make sure every field is tested and none is overlooked.|
|Test Conditions||Any program conditions will be acceptable for this test. If any abnormal behavior occurs, the conditions of the program will be recorded so we can track down the source of the error.|
|Expected Results||After each invalid input, the user will press OK and the program will show a dialog box explaining the source of the error. The tester will make sure that the dialog box correctly points to the source of the invalid input and that the program has no negative side effects other than not accepting the input parameter.|
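The field-by-field checking this test describes can be sketched as a validator that examines every field and names the first offending one in its message. The field names and the non-negative-integer rule below are assumptions for illustration.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hedged sketch of field-by-field validation: every field is checked
// and the error message names the offending field, matching what the
// tester verifies in the dialog box. Field names/rules are assumed.
public class FieldValidation {
    // Returns null when all fields parse as non-negative integers;
    // otherwise a dialog-ready message naming the bad field.
    static String validate(Map<String, String> fields) {
        for (Map.Entry<String, String> e : fields.entrySet()) {
            try {
                if (Integer.parseInt(e.getValue()) < 0)
                    return "Field '" + e.getKey() + "' must be non-negative.";
            } catch (NumberFormatException ex) {
                return "Field '" + e.getKey() + "' must be a number.";
            }
        }
        return null;  // all fields accepted
    }

    public static void main(String[] args) {
        Map<String, String> f = new LinkedHashMap<>();
        f.put("grid width", "20");
        f.put("grid height", "abc");  // the single invalid entry
        System.out.println(validate(f));
        // prints: Field 'grid height' must be a number.
    }
}
```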
|Test Objective||To verify the correct operation of the user interface relating to the state selection on the state and transition editing window.|
|Test Description||The tester will attempt to delete, move a state up, and move a state down without any states existing. The tester will also test the same functions with only one state present in the current simulation. The tester will also test the functions when there is no state selected at all.|
|Test Conditions||As stated above, the simulation will only have zero or one state in the simulation.|
|Expected Results||In all of these cases, the simulator should not respond at all to the user's commands.|
|Test Objective||To test the ability to load and save states and transitions.|
|Test Description||The tester will repeatedly load a variety of states and transitions while making sure everything appears to load correctly. The tester will also load a saved set that contains zero states.|
|Test Conditions||The software will be tested in a variety of conditions, including a fresh start and after many simulations have been loaded and reloaded.|
|Expected Results||The simulator should load and save each set of states and transitions. The simulator should prompt the user if the file being loaded is invalid.|
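A load/save check like this is essentially a round-trip test: what is saved must come back unchanged, including the zero-state case. The simulator's real file format is unknown, so the sketch below assumes a simple one-state-name-per-line format purely for illustration.

```java
import java.io.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hedged sketch of the load/save round-trip check. The one-name-per-line
// file format here is an assumption, not the simulator's real format.
public class SaveLoadRoundTrip {
    // Write each state name on its own line.
    static void save(List<String> states, File f) throws IOException {
        try (PrintWriter out = new PrintWriter(new FileWriter(f))) {
            for (String s : states) out.println(s);
        }
    }

    // Read the state names back, one per line.
    static List<String> load(File f) throws IOException {
        List<String> states = new ArrayList<>();
        try (BufferedReader in = new BufferedReader(new FileReader(f))) {
            String line;
            while ((line = in.readLine()) != null) states.add(line);
        }
        return states;
    }

    public static void main(String[] args) throws IOException {
        File f = File.createTempFile("states", ".txt");
        f.deleteOnExit();

        List<String> original = Arrays.asList("Alive", "Dead", "Burning");
        save(original, f);
        System.out.println(load(f).equals(original) ? "round trip ok" : "mismatch");

        // The zero-state case from the test: an empty set must survive too.
        save(Arrays.<String>asList(), f);
        System.out.println(load(f).isEmpty() ? "empty set ok" : "mismatch");
    }
}
```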
|Test Objective||To test the functionality and robustness of the command that opens a saved simulation.|
|Test Description||The functionality will be tested by attempting to open a file which is a proper saved simulation. The robustness will be determined by attempting to open a file which is not a valid simulation archive.|
|Test Conditions||The test conditions will be under normal operating conditions with the program having just been started, and we will also run the same test after another simulation has already been run.|
|Expected Results||The test should load the correct file, given that it is a valid simulation archive. If the archive is invalid, the program should display an error dialog box.|
|Test Objective||To test the functionality and robustness of the command that saves a simulation.|
|Test Description||The functionality will be tested by correctly creating a simulation and attempting to save it. The robustness will be determined by the tester attempting to save over an existing file.|
|Test Conditions||The test conditions will be under normal operating parameters (a simulation has just finished running).|
|Expected Results||The test should save the proper data to a valid archive which can be read back into the simulator.|
|Test Objective||To test functionality of the entire program.|
|Test Description||This test will start by loading a simulation file which was created in a previous test. After this is complete, the tester will examine the state transitions window to confirm that the proper data was loaded. The tester will also check the main window to see if the proper initial conditions were loaded. Next, the tester will add additional initial conditions. The program will then be run until the tester pauses the simulation. Once the simulation is paused, the tester will save the simulation and close it.|
|Test Conditions||The test conditions will be under initial operating state (program has just been started).|
|Expected Results||The program should perform this test without any errors. The appearance of the ending simulation window will depend on the input file created in the earlier test.|
|Test Objective||To test the overall long-term stability and robustness of the product.|
|Test Description||The tester will perform all of the above tests while keeping the program running the whole time. We would like to see the program run for several hours without failure, including long runs under the Linux and SGI operating systems.|
|Test Conditions||As many different test conditions as possible.|
|Expected Results||We do not want the program to crash; we are simply testing its long-term stability. Any problems we encounter during this testing will probably be the hardest types of problems to fix.|
Template created by G. Walton ( GWalton@mail.ucf.edu ) on March 28, 1999 and last modified on August 15, 2000.
This page last modified by Michael Wales ( Mag7Rule@aol.com ) on November 13, 2000.