Systems Analysis and Design Project Stage 3 New System Specifications And Prototype

Based on Harvard Business School case 9-692-015 Manzana Insurance – Fruitvale Branch

Presented to Professor Michael Palley
Stevens Institute of Technology
MGT 772 SB: Analysis and Development of Information Systems

By Team 3

Table of Contents

System Overview
    Description
    Policy Request Types
    Background
    Current Organizational Problems and Goals for the New System
File Design
    Bachman Diagrams in Third Normal Form (3NF)
System Risks, Controls and Audit Trails
    Overview of Control Risks
    Controls and Audit Trails Used to Limit Control Risks
        Controlled Access to the System
        Separation of Function
        Integrity Constraints and Data Filters
        Concurrency Control
        Encryption
        Access Logs and Audit Trails
        Contingency Site Plans
        Backup and Archive Plan
        Documentation
    System-Specific Controls
        Request Processing Monitor
Functional Prototype
    Human Factors Considered in Screen Designs
    Disclosure Statement
    Sample Screen Shots
        Request Entry Screen 1
        Request Entry Screen 2
        Overall Work Queue
        Departmental Work Queue
        Branch Manager Menu
        Return Record Screen
        Property Risk Evaluation Screen
Organization Design Considerations
    Existing Organization Chart
    Recommended Changes
Implementation Plan
    Software Quality Assurance Plan
    Training Plan
    System Implementation and Conversion Plan
Team Leader's Attendance/Contribution Report
Appendix A: Exhibits
    Exhibit 1: Department Productivity
    Exhibit 2: Processing Time
    Exhibit 3: Comparison of Branch Profits to RUN and RERUN Premiums
    Exhibit 4: Comparison of Manzana Processing Times by Department
Appendix B: Software Quality Assurance Test Plan


Case basis: Manzana Insurance - Fruitvale Branch (Harvard Business School case 9-692-015)

System Overview

Description

The system being studied encompasses the operations workflow and processing of property insurance policies at the Fruitvale branch of Manzana Insurance. The system processes four types of policy requests and relies on the distribution of work among various departments within the branch. The new system will attempt to create operational efficiencies and streamline the processing workflow.

Policy Request Types

• RUN – Request for underwriting – processing of a new policy
• RERUN – Request for renewal – processing of a policy renewal
• RAIN – Request for additional insurance – processing of a policy endorsement
• RAP – Request for a price – processing of a price quote

Background

The Fruitvale branch has become Manzana's worst performing branch, and Manzana is being severely outperformed by a competitor, Golden Gate Casualty, in Fruitvale's territory. The turnaround time (TAT) for processing a request had grown to six days at a time when Golden Gate began guaranteeing one-day TAT. This important measure of service could be a large factor in Fruitvale's loss of business. Improving the system should result in operational efficiencies, improved service, and ideally an increase in business.

Current Organizational Problems and Goals for the New System

The organizational problems and goals were derived from a study of the facts provided in the case and supported by the organizational and financial analysis submitted during Stage 1. They are presented here in rank order.

Problem 1: Average turnaround time (TAT) is too lengthy.

Turnaround time is an important indicator of service quality used by originating agents to help customers choose an insurance company. Fruitvale's average TAT for processing a request had grown to six days when Golden Gate began guaranteeing one-day TAT. In addition, Manzana already processes requests faster than the 95% Standard Completion Time (SCT) used to calculate TAT (Exhibit 4). This causes Manzana to overstate TAT and hurts its chances to win business.


Goal: Modify the system to decrease turnaround time.

The new system will automate and streamline certain tasks and track the processing status of requests. These changes should create efficiencies that lead to a shorter turnaround time. In addition, TAT calculations will be periodically reviewed and revised to ensure that new processing efficiencies are accurately reflected.

Problem 2: Large number of late renewals.

RERUNs are currently held until the last day before their due date. Agents expect a renewed contract offer before the expiration of the old policy. Late renewals result in a large renewal loss rate, representing a significant loss of business.

Goal: Modify the system to decrease the number of late renewals.

The new system will deliver RERUNs to the originating agents with sufficient time before expiration. The prioritization of processing requests will be controlled by the system using a modified FIFO approach. Processing will be monitored so that alerts are generated when processing falls behind schedule.

Problem 3: Large backlog of policies.

An inconsistent priority system among departments causes downstream problems. For example, the Rating Department is unable to address its backlog because of consistently late RERUNs from the Underwriting Department.

Goal: Modify the system to decrease the backlog of policies.

The new system will automatically calculate processing due dates and assign processing priorities to each request that enters the system (a sketch of this due-date and priority assignment appears at the end of this section). A new computerized workflow will help standardize and enforce prioritization.

Problem 4: Organization structure and operations workflow are not optimal.

The current inefficient workflow is based on a non-optimal distribution and utilization of staff.

Goal: Align staffing with the needs of the new system.

A structural reorganization will facilitate implementation of the new system. The suggested changes are discussed in the Organization Design section below.

Problem 5: Departments do not adhere to documented procedures.

Different departments seem to follow their own interpretation of procedures and company policies.

Goal: Document and standardize procedures.

Updated procedures will be developed along with the new system. Training will deliver documentation and emphasize the importance of adhering to it. For efficiencies to be realized, proper procedures must be observed and maintained.


Problem 6: Salaries and bonuses are not aligned with profitability.

The "Salary Plus" program rewards senior underwriters and division managers for each new policy written. As a result, certain employees place increased emphasis on new policies and price quotes at the expense of renewals. Exhibits 2 and 3 show that renewal business is more profitable than new business.

Goal: Provide management with a profitability analysis of new and renewal policies.

A profitability analysis report will show that a new compensation structure should be developed to place emphasis on profitability rather than on a specific type of policy.
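The following minimal sketch illustrates the due-date and modified FIFO prioritization described in Goals 2 and 3. The lead times, field names, and ordering rule are illustrative assumptions, not specifications from the case.

```python
from datetime import date, timedelta
from typing import Optional

# Hypothetical target lead times per request type, in days; the actual
# values would be tuned by the branch.
TARGET_LEAD_DAYS = {"RUN": 3, "RAP": 2, "RAIN": 3, "RERUN": 5}

def due_date(req_type: str, received: date,
             expiration: Optional[date] = None) -> date:
    """Assign a processing due date when a request enters the system.

    RERUNs are anchored to the old policy's expiration so they are
    released well before it, instead of being held until the last day.
    """
    if req_type == "RERUN" and expiration is not None:
        return expiration - timedelta(days=TARGET_LEAD_DAYS["RERUN"])
    return received + timedelta(days=TARGET_LEAD_DAYS[req_type])

def queue_order(requests):
    """Modified FIFO: order by due date first, then by arrival time."""
    return sorted(requests, key=lambda r: (r["due"], r["received"]))

# Example: a RERUN whose policy expires soon jumps ahead of an older RUN.
reqs = [
    {"id": 1, "type": "RUN", "received": date(1991, 6, 1),
     "due": due_date("RUN", date(1991, 6, 1))},
    {"id": 2, "type": "RERUN", "received": date(1991, 6, 2),
     "due": due_date("RERUN", date(1991, 6, 2), expiration=date(1991, 6, 5))},
]
print([r["id"] for r in queue_order(reqs)])  # -> [2, 1]
```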


File Design

Bachman Diagrams in Third Normal Form (3NF)

[The Bachman diagrams appear as figures in the original document and are not reproduced here.]


System Risks, Controls and Audit Trails

Overview of Control Risks

In the new system, a greater dependence on technology magnifies certain existing risks and introduces some new ones. One of the most significant changes throughout the entire system is the move from a paper-based workflow to a computer-based workflow. Once the system is fully implemented, the business will find it difficult to operate when the system is unavailable.

Concentration of data in the new system introduces another risk that is closely related to the dependence on technology. The data will be stored primarily in one integrated database. There is a risk of data loss and data contamination from a variety of sources, including software errors, user errors, physical damage, and security breaches.

Software application design decisions often have the unintended consequence of introducing a wide variety of risks into a system. These risks include the improper handling of system and data conditions such as user input error, concurrent data access, and inappropriate system access. The new system is being designed with all the appropriate considerations from a university master's-level course on the analysis and development of information systems. Of particular importance is the overall consideration of fraud and security threats. There are a number of such concerns stemming from internal and external sources. Employees are considered the biggest threat to the system due to the possibility of covert attacks and negligence. A smaller yet still important consideration is the threat of covert attacks by non-employees. The use of force to cause physical damage is also considered a threat to the system.

Controls and Audit Trails Used to Limit Control Risks

Controlled Access to the System

The application will reside on a centralized application server and the database on a centralized database server. The servers will be housed in a secure data center (or computer room) with appropriate security, power, and climate control systems.

Each user will have a login ID and password to access the system. Access Control Lists will be used to control access privileges within the system. The privilege to read and/or write specific types of data will be determined by a review of each department's functional responsibilities. Users from each department will then be granted default minimal access, following the principle of least privilege: the level of privilege necessary for the job and no more. Exceptions to the standard Access Control Lists will be reviewed and granted in accordance with a documented procedure.

The computer operating systems will also be configured following the principle of least privilege. Specifically, users will not have the ability to modify or install system components or unauthorized programs. Only designated System Administrators


will have administrative access to the operating systems. Additionally, procedures will be put in place to keep system security patches and anti-virus software up to date.

Separation of Function

The basic standard is that any security-relevant event should take place in front of witnesses. This concept will be implemented by ensuring that no request can flow through the system without being processed by multiple departments. The Branch Manager will also specify a small group of individuals who have the authority to give final approval to a policy before it is issued.

Integrity Constraints and Data Filters

The use of integrity constraints and data filters will serve to control a variety of risks, including user input error and certain attempts at committing fraud. Static integrity constraints will be implemented in the form of range checks for all appropriate numeric fields, such as the insurance policy coverage amount. When a coverage amount exceeds a pre-determined limit, a notification is sent to the Branch Manager or his designate. Subsequently, an override authorization can be provided or a security investigation can be initiated (see the sketch following the Contingency Site Plans subsection below).

Concurrency Control

Users cannot have more than one active login at any given time. Furthermore, multiple transactions will not be permitted to access the same data at the same time. Locking will be performed at the level of the processing job, so that only one user can work on a particular processing job at a time. This approach will help prevent contamination of data.

Encryption

All confidential data will be encrypted in the database. Only the production application program will have the ability to decrypt data before presenting it to the user. The development and testing databases will use a different encryption key. This design ensures that direct access to the database does not reveal confidential data. The use of encryption will also prevent confidential data from being inappropriately retrieved from backup tapes.

Access Logs and Audit Trails

All system logins are recorded in system security logs. All application activity is also recorded for history and audit purposes. For example, the Bachman diagrams above show that the activity of every user who works on a processing job is recorded in the Activity file. Procedures and/or automated tools will be put in place to periodically review the logs to identify suspicious activity. Activity such as excessive failed logins will be reviewed to determine the cause, such as an attempted security breach or a forgotten password.

Contingency Site Plans

It is recommended that an alternative geographic site be set up and maintained to house contingency servers and workstations. The contingency site must be able to support the business if necessary. A plan to populate production data into the contingency site will also need to be developed. The most desirable solution involves real-time data replication over a wide area network link.
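As a concrete illustration of the range check described under Integrity Constraints and Data Filters, the sketch below validates a coverage amount and escalates over-limit requests. The dollar limit, return values, and notification hook are assumptions for illustration, not figures from the case.

```python
# Hypothetical ceiling on a single policy's coverage, in dollars; the real
# limit would be set by branch management.
COVERAGE_LIMIT = 5_000_000

def validate_coverage(amount: float, notify_branch_manager) -> str:
    """Range-check a coverage amount before it is accepted into the system."""
    if amount <= 0:
        # Data filter: reject malformed input outright.
        raise ValueError("Coverage amount must be a positive number.")
    if amount > COVERAGE_LIMIT:
        # Over-limit requests are held, not rejected: the Branch Manager
        # (or designate) is notified and can authorize an override or
        # initiate a security investigation.
        notify_branch_manager(
            f"Coverage ${amount:,.0f} exceeds limit ${COVERAGE_LIMIT:,}")
        return "pending-override"
    return "accepted"

print(validate_coverage(250_000, print))    # -> accepted
print(validate_coverage(9_000_000, print))  # notifies, then pending-override
```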


Backup and Archive Plan

Full nightly database backups will be performed. An outsourced company will retrieve the backup tapes on a daily basis for off-site storage. The daily backup tapes will be returned and reused after 30 days. The last backup of the month will be designated as an archival backup to be stored for the amount of time required by law.

Documentation

Documentation will be provided for all system components, operating procedures, maintenance procedures and audit procedures. The documentation itself must also be maintained.

System-Specific Controls

Request Processing Monitor

The system will monitor the processing status of all requests. Alerts will be generated when a request's processing time exceeds a pre-determined threshold. Department managers and the Branch Manager will receive alerts at different threshold levels. This monitor will help detect processing delays within specific departments and processing delays caused by certain employees.
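A minimal sketch of the monitor's alert logic follows. The threshold values and role names are illustrative assumptions; in the real system the elapsed times would come from the Activity file.

```python
# Alert thresholds in elapsed working hours; both values are assumed
# here and would be tuned by the branch.
DEPT_ALERT_HOURS = 24
BRANCH_ALERT_HOURS = 48

def monitor(in_process_requests):
    """Yield (recipient, request_id) alerts for requests past a threshold."""
    for r in in_process_requests:
        if r["elapsed_hours"] > BRANCH_ALERT_HOURS:
            yield ("branch_manager", r["id"])        # escalated alert
        elif r["elapsed_hours"] > DEPT_ALERT_HOURS:
            yield (r["dept"] + "_manager", r["id"])  # department-level alert

queue = [
    {"id": 101, "dept": "rating", "elapsed_hours": 30},
    {"id": 102, "dept": "underwriting", "elapsed_hours": 60},
    {"id": 103, "dept": "distribution", "elapsed_hours": 5},
]
for recipient, request_id in monitor(queue):
    print(recipient, request_id)  # rating_manager 101, branch_manager 102
```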


Functional Prototype

Human Factors Considered in Screen Designs

The prototype is still in the early stages of development. Additional features will be added along with a more pleasing screen design. Nonetheless, the current prototype incorporates several human factors into its design, including the following:

1. Screen designs are clean and simple. Items on input screens are displayed in a manner consistent with the user's mental model.
2. To provide visual coherency, all related menu choices are grouped together and displayed in a natural order. Department-specific menus have been created to ensure that users do not see features that are unknown or unnecessary.
3. Flow of control has been carefully considered so that screens are presented in the natural workflow order.
4. Color-coding is used in the work queues to visually group requests by processing status.
5. The system displays on-screen messages during back-end processing so that the user is informed prior to a system pause between screens. The user is also informed when an action completes successfully.
6. Users are alerted to potentially damaging actions, such as rejecting an insurance request. A reason for the rejection is required along with a secondary confirmation to proceed.
7. User input errors are greeted with friendly error messages that provide positive feedback in plain English. Error recovery then allows the user to correct the mistake.
8. All terminology is that of the users. Care has been taken to avoid developer jargon.
9. All screens were created with mixed-case lettering, a design feature that keeps people calmer and allows them to react better.

Disclosure Statement

All screen designs shown in the prototype are original work produced by Team 3.


Sample Screen Shots

[Each screen below appears as an image in the original document.]

Request Entry Screen 1


Request Entry Screen 2


Overall Work Queue


Departmental Work Queue


Branch Manager Menu


Return Record Screen


Property Risk Evaluation Screen


Organization Design Considerations

Existing Organization Chart

Manzana Insurance (Fruitvale Branch)
- Branch Manager (John Lombard)
  - Assistant Manager (Bill Pippin)
  - Underwriting (Bob Melrose)
    - Underwriting Teams (3)
      - Team 1: Underwriter 1, Tech Assistant 1
      - Team 2: Underwriter 2, Tech Assistant 2
      - Team 3: Underwriter 3, Tech Assistant 3
    - Distribution Clerks (4)
  - Rating (Rick Ramirez)
    - Raters (8)
  - Policy Writing (Phyllis Chen)
    - Policy Writers (5)
      - Record Clerks (3)
      - Copying Clerk
      - Mail Clerk

Recommended Changes

Manzana Insurance (Fruitvale Branch)
- Branch Manager (John Lombard)
  - Assistant Manager (Bill Pippin)
  - Underwriting (Bob Melrose)
    - Underwriting Teams (3)
      - Team 1: Underwriter 1, Tech Assistant 1
      - Team 2: Underwriter 2, Tech Assistant 2
      - Team 3: Underwriter 3, Tech Assistant 3
  - Rating (Rick Ramirez)
    - Raters (6)
  - Administration (Phyllis Chen)
    - Policy Writers (3)
    - Distribution Clerks (4)
    - Administrative Clerks (4)

The current inefficient workflow is based on a non-optimal distribution and utilization of staff. A structural reorganization will facilitate implementation of the new system and allow it to operate more efficiently. The first two recommended changes directly address two concerns of Manzana's senior vice president for underwriting operations.


1. Eliminate the regional alignment of the underwriting staff. There is little evidence in the case that any benefit is derived from organizing the underwriting staff along geographic lines. To the contrary, there is evidence that it may be detrimental: a process flow based on territorial alignment creates unnecessary operating bottlenecks. An analysis of the three underwriting teams indicates that Team 1 processed more policies and had a much higher productivity level than Team 2 and Team 3 (Exhibit 1).

2. Reduce staff in Rating and Policy Writing. The new system will create operating efficiencies that allow for some minor staff reductions, which will improve branch profitability. Exhibit 1 shows that the Rating and Policy Writing departments are operating at 76% and 64% productivity, respectively. The case states that rating had become "purely mechanical," and the amount of time taken to write a policy had decreased significantly in recent years. In addition, the automation of assembling policies and pricing risk will further reduce processing time. The analysis supports a reduction of rating staff by 24% and policy writing staff by 36%, matching the idle capacity above. Rounded to whole positions (0.24 × 8 ≈ 2 raters; 0.36 × 5 ≈ 2 policy writers), this translates to a staff reduction of two employees in each department, as reflected in the recommended organization chart above.

3. Combine administrative jobs into a restructured Administration department. Policy Writers, Distribution Clerks, Record Clerks and Copy/Mail Clerks all perform administrative duties, many of which are related. Cross-training can be provided to allow administrative personnel to function in multiple roles, creating operational efficiencies.

4. Allow Administrative Clerks to report directly to the department manager. In the existing system, administrative clerks report to Policy Writers. In the new system, all clerks will report directly to the Administration department manager. This change should encourage them to be more proactive and give them a better chance to take on additional responsibilities. The manager will also be in a better position to oversee the work of all employees.

Implementation Plan

Software Quality Assurance Plan

An outline of the Software Quality Assurance (QA) Plan is shown in Appendix B. It is important to note that the Software QA Plan depends on the information provided in the release notes from the developer/vendor and on the requirements provided by the relevant stakeholders. Since this information is necessary to build a test plan, it will be included in Section 1, an overview of the application requirements, and Section 2, an interview of the


developer on potential problem areas. The subsequent sections of the test plan have been divided into several related but independent tests.

The testing plan can be divided into two aspects. The first is the actual quality assurance of the application: verification that the application installs and functions properly. The installation verification will focus on unit testing on the appropriate hardware models (Section 3) and compatibility testing on the relevant software platforms (Section 6). The functionality testing will be approached from two angles as well. The QA Tester will push live historical data through the application to make sure that functionality under the usual conditions is satisfied. The historical data can be requests that have been processed within the last two weeks. It has been calculated that there has been an average of 40+ requests per week, which equates to a test of 80+ data feeds. This should be sufficient for the verification of live data. The tester will need to make sure that the information calculated by the system coincides with the information that was calculated in the historical requests (see the sketch below).

For the second part of functionality testing, the QA Tester will push through fictional or artificial data. These test cases will be incorporated in Exploratory and Performance Testing, Sections 7 and 8 respectively. The purpose is to prove that the system works under duress. Sample test cases include testing the limits of data fields; ensuring that illegal entries cannot be submitted (e.g., numeric data in alpha fields); and testing possible concurrency issues with multiple users accessing a single processing job.

The second aspect of the QA process will be verification that the application fulfills user requirements in terms of usability and process flow. An application that is not intuitive to the user will face higher levels of resistance. To accomplish this, the user will be involved in the portion of the QA process called User Acceptance Testing (UAT). The focus of UAT is to ensure that the user likes what he sees and that the process flow of the application makes operational sense. Users will be able to give feedback during training and during the piloting of the application.

The final sections of the attached test plan focus on putting the results together in a meaningful way for management, fellow testers, and stakeholders. A summary (Section 9) will detail the major findings and potential problems identified during testing, allowing management to understand the issues and risks of deploying the application. This section should provide a good level of detail. The following section (Section 10) will be a compilation of all the issues and bugs found during testing, and serves as a "lessons learned" for future testers. These are problems identified in the current test phase so that subsequent testers will know where the earlier problem areas were. An issue caught once should never occur again. The final section (Section 11) will be the tester's recommendation based on his findings. For management concerned with only a green light or red light, this is the most important section. If the application is given a red light, the tester can reference information in the Summary section.
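The sketch below shows one way the historical replay could be automated, assuming the archived requests are exported to a CSV file. The field names and the placeholder premium calculation are hypothetical; the real check would call the new system's rating routine.

```python
import csv

def compute_premium(row) -> float:
    # Placeholder for the new system's rating calculation; both field
    # names here are assumptions for illustration.
    return float(row["base_rate"]) * float(row["risk_rating"])

def replay(archive_csv: str, tolerance: float = 0.01):
    """Re-rate each archived request and compare to the recorded premium."""
    mismatches = []
    with open(archive_csv, newline="") as f:
        for row in csv.DictReader(f):
            expected = float(row["historical_premium"])
            actual = compute_premium(row)
            if abs(actual - expected) > tolerance:
                mismatches.append((row["request_id"], expected, actual))
    # An empty list means every historical feed reproduced its recorded result.
    return mismatches
```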


Training Plan

Manzana employees are computer literate, and the changes we propose for Manzana Insurance are enhancements to their existing system. The majority of the staff therefore already has a working knowledge of the existing system, and it is expected that they will openly accept the system revisions. The training will focus on the changes made in the new system and how they tie into the existing system.

The training will be broken into two components. First, users will receive a classroom overview that uses a prototype of the user interface and shows how data flows from screen to screen. Next, users will receive hands-on training using a tutorial walkthrough that allows them to enter data and navigate through the system. The test data will be drawn from historical Manzana policy data so results can be verified. This will prepare users to perform user acceptance testing (UAT).

To limit processing disruptions, the first part of the training will be conducted as working lunch sessions over two business days. One half of the underwriters, raters, and policy writers will attend separate sessions, and one half of the record and distribution clerks will attend simultaneous sessions in two separate rooms.

The second part of the training will be broken into two three-hour sessions starting at 3:00 PM. Each session will accommodate one half of the branch: underwriters, raters, policy writers, and record and distribution clerks. In this way they will be able to see the end-to-end processing for each of the insurance products. The policy data will be made up of test data specifically constructed for this course. It will test evaluation and risk measurement skills against decisions made at other insurance companies. In this final session, completion times will be captured and compared against the current standard completion times and the completion times of other Manzana branches. This will provide a baseline for measuring future productivity after the system has been fully implemented and the staff is well versed in the new system.

Finally, a dedicated support line will be offered for two weeks after the install date. Manzana staff can call this number between 8:00 AM and 6:00 PM (PT) and get answers specific to their application. After the two-week period, Manzana can call the standard help desk line. Each person trained will be given a system documentation manual containing a written narrative of the system, the data flow diagrams, Bachman diagrams, and data flow dictionaries.


System Implementation and Conversion Plan

The process conversion methodology of choice is the pilot. Pilot conversion was chosen because the operational process does not change significantly under the new system and because a relatively small population of users, fewer than 25, will be affected by the change. The similarities in operational workflow before and after implementation will limit confusion among the users, and the small size of the branch makes it easy to flip the switch and convert the remaining users upon successful pilot testing. The focus of the pilot is user acceptance testing on a small percentage of the population. The plan is to have one Distribution Clerk, one Underwriting Team, one Rating employee, and one Policy Writing employee work on the new system during the pilot. The goal of the pilot is to help work out stability issues in the application, to train personnel to work with the system in an operational setting, and to verify that the system meets operational requirements. By limiting the exposure of the new system to a small percentage of the population, the firm hedges its risk in case the application does not satisfy requirements and takes longer to roll out. At the same time, even if there is an issue with the system, it will be easy for users to revert to the original system if necessary.

Another key point that needs to be addressed is the data conversion plan. It has been decided that the conversion of historical data into the new system is necessary to ensure that the system functions to specification. The system is responsible for calculating premiums based on risk ratings assigned by the underwriting team, so the historical information required for this calculation is historical risk, rating, and pricing premium data. Since this information is currently held in the existing system's "policy premium file" and "branch policy archive file," manual data entry will not be necessary. A scripted process will migrate the data from the old files to the new merged "branch policy archive file" in the proposed system (a sketch appears below). The key question of how much historical data is necessary to compile the new "branch policy archive file" is also answered by scripting: the migration script will retain all of the historical information from the old system. Again, the bulk of the work will not be manual data entry; it will be testing that the calculations of the new system are legitimate. This has already been detailed in the Software QA Plan, where testers push live historical data to verify accuracy and functionality.

One file that will still serve a purpose in the new system is the "standard page file." Insurance policies will still use the standard page information from this file. The issue will be the integration of the file into the new application, in terms of ensuring that the system can access the existing file information. Testing of this functionality will also be included in the Software QA test plan under functionality testing with live historical data.

One last key point is that the function of the record clerk is not being removed. Record clerks, now known as administrative clerks, will be responsible for archiving a hard copy of each policy after approval. The system will begin archiving soft copies of requests once the pilot begins. Upon confirmation of a policy, the administrative clerk will maintain a hard copy for audit and control purposes. Since the paper trail will not be entirely removed in the new system, it was decided not to spend personnel time computerizing all of the historical information.
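A minimal sketch of the scripted migration follows, assuming the two old files are exported as CSV and joined on a policy number. The file layouts, field names, and join key are assumptions for illustration; the real script would follow the file design above.

```python
import csv

def migrate(premium_csv: str, archive_csv: str, out_csv: str) -> None:
    """Merge the old premium and archive files into the new archive file."""
    # Index the old policy premium file by policy number.
    with open(premium_csv, newline="") as f:
        premiums = {row["policy_number"]: row for row in csv.DictReader(f)}

    with open(archive_csv, newline="") as f, \
         open(out_csv, "w", newline="") as out:
        reader = csv.DictReader(f)
        fields = list(reader.fieldnames) + ["premium", "risk_rating"]
        writer = csv.DictWriter(out, fieldnames=fields)
        writer.writeheader()
        for row in reader:
            p = premiums.get(row["policy_number"], {})
            row["premium"] = p.get("premium", "")
            row["risk_rating"] = p.get("risk_rating", "")
            writer.writerow(row)  # every historical record is retained
```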

Team Leader's Attendance/Contribution Report

All team members attended all scheduled meetings and contributed equally to the project.


Appendix A: Exhibits

Exhibit 1: Department Productivity

| Department | RUN | RAP | RAIN | RERUN | Total |
|---|---|---|---|---|---|
| Distribution (4 staff) | 11% | 42% | 9% | 27% | 89% |
| Underwriting Team 1 (2 staff per team) | 13% | 54% | 8% | 22% | 97% |
| Underwriting Team 2 (2 staff per team) | 8% | 36% | 5% | 29% | 78% |
| Underwriting Team 3 (2 staff per team) | 7% | 37% | 5% | 21% | 70% |
| Total Underwriting | 9% | 42% | 6% | 24% | 82% |
| Rating (8 staff) | 6% | 27% | 7% | 36% | 76% |
| Policy Writing (5 staff) | 9% | n/a | 16% | 39% | 64% |

Note: The "Mean" processing time was used to calculate department productivity. Based on policies processed in the first half of 1991.

Exhibit 2: Processing Time

| Processing time (minutes) | Distribution | Underwriting | Rating | Writing | Total |
|---|---|---|---|---|---|
| RUN | 68.5 | 43.6 | 75.5 | 71.0 | 258.6 |
| RERUN | 28.0 | 18.7 | 75.5 | 50.1 | 172.3 |
| Difference | | | | | 86.3 |

A new policy (RUN, 258.6 minutes) takes 86.3 minutes (50%) longer to process than a renewal (RERUN, 172.3 minutes); equivalently, a RERUN requires 33% less processing time than a RUN. New business is also less profitable. Manzana should not prioritize new business over renewal business.


Exhibit 3: Comparison of Branch Profits to RUN and RERUN Premiums

[Line chart: Comparison of Branch Profits to Gross Premiums (in thousands), quarterly from March 1989 through June 1991. Left axis: Branch Profit, ($500) to $2,500; right axis: RUN and RERUN Premiums, $6,000 to $7,400.]

• The black line denotes a change in operating policy to emphasize new policy growth by awarding a $300 bonus for each new policy written.
• Clearly, this change in policy had an impact on branch profits.


Exhibit 4: Comparison of Manzana Processing Times by Department

Estimated processing times in minutes.

| Underwriting * | RUN | RAP | RAIN | RERUN |
|---|---|---|---|---|
| Min | 0.57 | 2.00 | 0.17 | 1.40 |
| Max | 199.67 | 131.67 | 137.00 | 240.10 |
| Mean | 14.53 | 12.67 | 7.53 | 6.23 |
| # of teams | 3 | 3 | 3 | 3 |
| Actual time | 14.55 | 12.68 | 7.54 | 6.24 |
| Actual time to process * | 14.55 | 12.68 | 7.54 | 6.24 |
| 95% SCT (used in existing system) | 35.73 | 29.17 | 16.47 | 20.93 |
| Better/(worse) than 95% SCT | 21.18 | 16.49 | 8.92 | 14.69 |

* Completion time per underwriting team. Actual time assumes no excess capacity.

| Distribution * | RUN | RAP | RAIN | RERUN |
|---|---|---|---|---|
| Min | 7.63 | 7.88 | 6.75 | 5.13 |
| Max | 35.50 | 31.00 | 71.50 | 69.00 |
| Mean | 17.13 | 12.50 | 10.88 | 7.00 |
| # of people | 4 | 4 | 4 | 4 |
| Actual time per person | 17.17 | 11.41 | 9.93 | 6.39 |
| Actual time to process * | 17.17 | 11.41 | 9.93 | 6.39 |
| 95% SCT (used in existing system) | 32.03 | 26.95 | 17.03 | 10.80 |
| Better/(worse) than 95% SCT | 14.86 | 15.54 | 7.09 | 4.41 |

* Completion time per person. Actual time assumes no excess capacity.

| Rating * | RUN | RAP | RAIN | RERUN |
|---|---|---|---|---|
| Min | 0.88 | 1.00 | 1.88 | 0.88 |
| Max | 58.13 | 52.13 | 54.88 | 58.13 |
| Mean | 9.44 | 8.09 | 8.19 | 9.44 |
| # of people | 8 | 8 | 8 | 8 |
| Actual time per person | 9.41 | 7.58 | 7.68 | 8.85 |
| Actual time to process * | 9.41 | 7.58 | 7.68 | 8.85 |
| 95% SCT (used in existing system) | 14.04 | 11.09 | 11.18 | 11.53 |
| Better/(worse) than 95% SCT | 4.63 | 3.50 | 3.50 | 2.67 |

* Completion time per person. Actual time assumes no excess capacity.

| Policy Writing * | RUN | RAP | RAIN | RERUN |
|---|---|---|---|---|
| Min | 7.90 | n/a | 6.00 | 7.80 |
| Max | 74.20 | n/a | 55.10 | 74.10 |
| Mean | 14.20 | n/a | 10.80 | 10.02 |
| # of people | 5 | n/a | 5 | 5 |
| Actual time per person | 15.99 | n/a | 10.79 | 10.01 |
| Actual time to process * | 15.99 | n/a | 10.79 | 10.01 |
| 95% SCT (used in existing system) | 17.86 | n/a | 14.42 | 13.40 |
| Better/(worse) than 95% SCT | 1.87 | n/a | 3.63 | 3.39 |

* Completion time per person. Actual time assumes no excess capacity.
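As a quick arithmetic check of this exhibit's point, summing the four departments' per-request RUN times shows that the 95% SCT basis nearly doubles the actual processing estimate (queueing delays are ignored in both sums):

```python
# Per-request RUN processing minutes from Exhibit 4 (actual time assumes
# no excess capacity).
actual = {"distribution": 17.17, "underwriting": 14.55,
          "rating": 9.41, "writing": 15.99}
sct_95 = {"distribution": 32.03, "underwriting": 35.73,
          "rating": 14.04, "writing": 17.86}

print(f"actual total:  {sum(actual.values()):.2f} minutes")   # 57.12
print(f"95% SCT total: {sum(sct_95.values()):.2f} minutes")   # 99.66
```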


Appendix B: Software Quality Assurance Test Plan

Software QA Test Plan
Manzana Insurance Processing System

Copyright 2004 Manzana Insurance, Inc. This document contains material confidential to Manzana. The contents are protected by specific employee and contractor agreements regarding confidentiality. Reproduction for distribution outside Manzana is prohibited without express written permission. All rights reserved.


TABLE OF CONTENTS

1 Overview
  1.1 Overview
  1.2 Installation Requirements and Guidelines
  1.3 Application Enhancements & Fixes
  1.4 Known Issues / Problems / Limitations
2 Risk Assessment
3 Test Environments
  3.1 Hardware Test Environments
    3.1.1 Hardware Test Environment 1
    3.1.2 Hardware Test Environment 2
    3.1.3 Hardware Test Environment 3
    3.1.4 Hardware Test Environment 4
  3.2 Software Base Test Environments
    3.2.1 Software Base Test Environments (SBTE)
4 Installation Verification
  4.1 Test Cases
5 Functionality Testing
  5.1 Test Cases
6 Compatibility Testing
  6.1 Test Cases
7 Exploratory Testing
  7.1 Test Cases
8 Performance Testing
  8.1 Test Cases
9 Summary
10 Issues/Bugs
11 Recommendations


1 Overview

Test Coordinator:
Tester(s):
Vendor / Developer:
Developer Version:
Test Period:
Contact Name:
Phone Number:

1.1 Overview

Quality Assurance Test Plan for the Manzana Insurance Processing System (MIPS).

1.2 Installation Requirements and Guidelines

1.3 Application Enhancements & Fixes

1.4 Known Issues / Problems / Limitations

2 Risk Assessment

Software Risk Analysis and Assessment. The risk assessment considers the following questions:

• Which functionality is most important to the project's intended purpose?
• Which functionality is most visible to the user?
• Which functionality has the largest impact on the user?
• Which aspects of the application are most important to the user?
• Which parts of the code are most complex?
• Which parts of the application were developed in rush or panic mode?
• Which aspects of similar/related previous projects caused problems?
• Which parts of the requirements and design are unclear or poorly thought out?
• What do the developers think are the highest-risk aspects of the application?
• What kinds of problems would cause the worst impact?
• What kinds of problems would cause the most help desk complaints?

3 Test Environments

3.1 Hardware Test Environments

The purpose of this section is to detail the units tested, or that will need to be tested, during UAT. This provides a hardware baseline on which the application works without issues.

3.1.1 Hardware Test Environment 1

Workstation Model: Compaq Deskpro EN
CPU: Pentium III 533 MHz
Memory: 128 MB

3.1.2 Hardware Test Environment 2

Laptop Model: IBM ThinkPad T30
CPU: Pentium III 1 GHz
Memory: 512 MB

3.1.3 Hardware Test Environment 3

Laptop Model: IBM ThinkPad T20
CPU: Pentium III 833 MHz
Memory: 256 MB

3.1.4 Hardware Test Environment 4

Laptop Model: Dell GX150
CPU: Pentium III 833 MHz
Memory: 512 MB

3.2 Software Base Test Environments

The purpose of this section is to detail the software that will be tested in conjunction with the MIPS application, including the OS and service pack level. This is necessary so that administrators can identify a baseline on which the application works without issues.


3.2.1 Software Base Test Environments (SBTE)

| SBTE | OS | Service Pack | Hardware Platform | Applications |
|---|---|---|---|---|
| 1 | NT4 | SP6a | Desktop | Acrobat Reader 5, Acrobat Writer 5, Microsoft Office 97, Internet Explorer 5.01, McAfee Virus Scan |
| 2 | W2k | SP4 | Desktop | Acrobat Reader 6, Acrobat Writer 6, Microsoft Office 2000, Internet Explorer 6, McAfee Virus Scan |
| 3 | W2k | SP4 | Laptop | Acrobat Reader 5, Acrobat Writer 6, Microsoft Office 2000, Internet Explorer 5.5, McAfee Virus Scan |
| 4 | XP | SP1 | Desktop | Acrobat Reader 6, Acrobat Writer 6, Microsoft Office XP, Internet Explorer 6, McAfee Virus Scan |

4 Installation Verification

4.1 Test Cases

These are the test actions to verify installation of the MIPS application on the multiple hardware and OS platforms described in Section 3. The UA Tester will follow the usual input steps to verify that the process flow of the installation makes sense and that information flows between installation modules.

| Test No. | Action | Expected Result | Actual Result | (P) or (F) |
|---|---|---|---|---|
| 1 | Start installation - run 'setup.exe' | Welcome Dialog Box appears | | |
| 2a | Welcome Dialog Box - test 'Next' button | End User License Dialog Box appears | | |
| 2b | Welcome Dialog Box - test 'Cancel' button | Do You Want to Exit Dialog Box appears | | |
| 3a | Do You Want to Exit Dialog Box - test 'Yes' button | Exit installation. No residual data on system | | |
| 3b | Do You Want to Exit Dialog Box - test 'No' button | Continue installation. Prior Dialog Box appears | | |
| 4a | End User License Dialog Box - test 'I Agree' button | Choose Destination Dialog Box appears | | |
| 4b | End User License Dialog Box - test 'Cancel' button | Do You Want to Exit Dialog Box appears | | |
| 5a | Choose Destination Dialog Box - test Entry Field 1 (enter installation folder location in entry field) | | | |

Etc.

5 Functionality Testing

5.1 Test Cases: Live (Historical) Data

These are the test actions to verify functionality of the MIPS application. Initially, the input will be live (historical) data to ensure that the application performs well under regular scenarios. The sample size will be about a month's worth of historical input.

| Test No. | Action | Expected Result | Actual Result | (P) or (F) |
|---|---|---|---|---|
| 1 | Run Programs > Applications > MIPS v1.0 icon | MIPS application launches. Login Screen appears | | |
| 2a | Login Screen - enter valid login and password information in data entry fields | Data entry fields are large enough to accommodate login information | | |
| 2b | Login Screen - enter Distribution login information and test 'Login' button | Distribution Menu appears | | |
| 2c | Login Screen - click 'Exit' button | Application exits | | |
| 3a | Distribution Menu - test 'View Entire Processing Queue' button | Entire Processing Queue appears | | |
| 3b | Distribution Menu - test 'New Request' button | Distribution Clerk Entry Page 1 appears | | |
| 4a | Distribution Clerk Entry Page 1 - input historical client information | Data entry fields have adequate character spaces for client information | | |
| 4b | Distribution Clerk Entry Page 1 - test 'Proceed' button | Distribution Clerk Entry Page 2 appears | | |
| 4c | Distribution Clerk Entry Page 1 - test 'Cancel' button | New request cancelled. Standard Queue appears. No residual data in database | | |
| 5a | Distribution Clerk Entry Page 2 - input historical client information | Data entry fields have adequate character spaces for client information | | |
| 5b | Distribution Clerk Entry Page 2 - test 'Submit' button | New Request Review Screen appears | | |
| 5c | Distribution Clerk Entry Page 2 - test 'Cancel' button | New request cancelled. Standard Queue appears. No residual data in database | | |
| 6a | New Request Review Screen - verify accuracy of data on review screen | Data reflects input from Entry Pages 1 and 2 | | |
| 6b | New Request Review Screen - test 'Submit' button | Potential due date calculated and displayed. Data entered into database. Request enters Underwriting Dept queue (active department and processing status change) | | |
| 6c | New Request Review Screen - test 'Cancel' button | New request cancelled. Standard Queue appears. No residual data in database | | |

Etc.

6 Compatibility Testing

6.1 Test Cases

Compatibility testing will focus on installation and usage of the MIPS application concurrently with other horizontal applications. Horizontal applications are defined as programs that are used throughout the firm on almost every computer. These programs are defined in the Software Base Test Environments (SBTE) in Section 3.2.

| Test No. | Action | Expected Result | Actual Result | (P) or (F) |
|---|---|---|---|---|
| 1 | Execute and use a Microsoft Office application in conjunction with MIPS | No error messages. No performance issues | | |
| 2 | Execute and use the Acrobat Reader application in conjunction with MIPS | No error messages. No performance issues | | |
| 3 | Test installation on Windows NT 4.0 - follow steps in Section 4 | Pass all Section 4 test cases | | |
| 4 | Test installation on Windows 2000 - follow steps in Section 4 | Pass all Section 4 test cases | | |
| 5 | Test installation on Windows XP - follow steps in Section 4 | Pass all Section 4 test cases | | |

Etc.

7 Exploratory Testing

Exploratory testing is done outside the realm of the standard current production environment, to take into account possible future conflict scenarios. This section includes tests for concurrency control, the audit controls implemented, etc.

Fictional (artificial) data: the second part of functionality testing will involve stress testing. Artificial data will be created to ensure that there are no fault points in the data input fields or process flow.

7.1 Test Cases

| Test No. | Action | Expected Result | Actual Result | (P) or (F) |
|---|---|---|---|---|
| 1 | Test invalid login attempts | Invalid login credentials will not access an account. Multiple invalid login attempts result in a User ID lock | | |
| 2 | Test multiple instances of user login | Users cannot log in simultaneously from multiple locations | | |
| 3 | Test multiple user access to the same active request. Each request can be accessed by only one person at a time; other users' attempts will be locked out | Only one user access at a time per record. Attempts to access a locked file produce a message stating the reason code (which user is currently accessing it, etc.) | | |

Etc.

8 Performance Testing

All performance logging is to be performed before, during, and after installation, with markers to represent event changes where available. Pre- and post-installation logging should last no less than 5 minutes and no more than 10 minutes.

8.1 Test Cases

| Test No. | Action | Expected Result | Actual Result | (P) or (F) |
|---|---|---|---|---|
| P1 | Perfmon trace of Available Bytes (Memory) | Any change during installation is returned to the pre-installation state | | |
| P2 | Perfmon trace of Pages/sec (Memory) | Any change during installation is returned to the pre-installation state | | |
| P3 | Perfmon trace of % CPU Utilization | Any change during installation is returned to the pre-installation state | | |
| P4 | Perfmon trace of % Disk Utilization | Any change during installation is returned to the pre-installation state | | |
| P5 | Perfmon trace of % Network Utilization | Any change during installation is returned to the pre-installation state | | |

9 Summary

The tester will include a summary of his or her findings. All potential problems must be reported in detail in this section.

10 Issues/Bugs

All issues and bugs in the system must be reported and documented in detail in this section. This will allow all interested parties to understand the changes that have taken place. This section will also serve as a "lessons learned" so that future testers of the application will know to test for these cases as well.

11 Recommendations

Finally, this is the recommendation section for the User Acceptance Tester. It is essentially a sign-off or rejection of the application. Have all the issues and potential problems (Sections 9 and 10) been addressed?
