IBM Rational ClearQuest Web Version 7.1 Performance Report (Windows)
This report measures the performance and scalability of IBM Rational ClearQuest Web (CQ Web)
7.1 and compares it to the 7.0.1 release.
Version 7.1 of Rational ClearQuest and Rational ClearCase introduces the Change Management
Server (CM Server), which provides server-side support for wide area network (WAN) interfaces
to Rational ClearQuest and Rational ClearCase. CM Server is a unified application server that
combines the IBM WebSphere Application Server and IBM HTTP Server to provide Web
support for Rational ClearQuest Web (CQ Web) and Rational ClearCase Remote Client (CCRC).
CM Server leverages the performance, security and scalability of WebSphere Application Server
version 6.1. For more information about the CM Server architecture, deployment scenarios, and
administration, see the Rational ClearQuest 7.1 information center.
NOTE: Any performance data contained herein was measured in a controlled environment.
Therefore, the results obtained in other operating environments might vary significantly. Users of
this document should verify the applicable data for their specific environment.
Test Environment and Configuration
A series of tests was conducted using IBM Rational Performance Tester (RPT) to measure CQ
Web performance and scalability for the 7.0.1 and 7.1 releases. All tests were performed in a test
network environment with a 100 millisecond round-trip latency between the simulated browser
clients and the CQ Web server. Network connectivity between the CQ Web server and the
database server was 100 Mbps, full-duplex Ethernet with no latency (see Figure 1).
The CQ Web server was configured without SSL encryption in a single-machine configuration
containing all the CQ Web server components. Both the 7.0.1 and the 7.1 versions of CQ Web
were installed on identical hardware. There were no customizations to any of the default CQ Web
or CM Server settings. The CQ schema used for testing is of moderate complexity.
Figure 1: Test configuration diagram. Simulated browser clients (RPT) connect over the 100 ms
WAN link to the ClearQuest Web server running CM Server, which connects over the LAN to the
database server.
CQW0710PRW
1
Rev. 1.0
Hardware Configuration:
Test driver: IBM ThinkCentre with two dual 3 GHz Pentium 4 CPUs, 2.5 GB memory.
Operating system: Microsoft Windows XP Professional Version 2002 Service Pack 2.
CQ Web server: IBM IntelliStation Z Pro with two dual 2.40 GHz Intel Xeon CPUs, 4 GB memory,
Physical Address Extension enabled.
Operating system: Microsoft Windows 2003 Server Enterprise Edition with Service Pack 2.
Database server: Dell Server PE2650 with two dual 2.4 GHz Intel Xeon CPUs, 2 GB memory.
Operating system: Microsoft Windows 2000 Service Pack 4.
Database software: Microsoft SQL Server 2000 Service Pack 4.
Transaction Details
IBM Rational Performance Tester (RPT) 7.0.2 was used to simulate virtual CQ Web users. For
single-user benchmarks, transactions were executed by an RPT script that iterated 50 times as
quickly as possible. The response times for each transaction were then averaged and compared.
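The single-user method described above (50 back-to-back iterations, then averaging) can be sketched as a simple timing loop. This is an illustrative sketch, not RPT's actual API; the transaction body is a hypothetical stand-in for a real CQ Web request.

```python
import time
import statistics

def benchmark(transaction, iterations=50):
    """Run one transaction repeatedly, back to back, and average the
    wall-clock response times, mirroring the single-user method."""
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        transaction()  # e.g. submit a "Find defect" request
        samples.append(time.perf_counter() - start)
    return statistics.mean(samples)

# Hypothetical stand-in for a CQ Web transaction.
def find_defect():
    time.sleep(0.01)  # placeholder for the real HTTP round trip

avg = benchmark(find_defect)
print(f"average response time: {avg:.3f} s")
```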
In the environment described above, RPT measured the CQ Web response times for the
transactions in Table 1:
Use Case Tested: Description
Login: Time from submitting the login request to fully loading the workspace
Create new defect (initiate): Wait for the blank form to display
Create new defect (commit): Commit (save) the defect after filling in the form
Find defect: Find an existing defect
Modify defect (initiate): Wait for the form to display in edit mode
Modify defect (commit): Commit (save) the defect after making modifications
Run saved query (100 records): Query for 100 defects
Load defect from result set: View a defect returned in a query result set; wait for the defect form to display
Logout: Log out of the application
Table 1: Rational ClearQuest Web transactions
Figure 2 shows the relative distribution of the transactions in Table 1 that were used for multi-user testing.
Figure 2: Rational ClearQuest Web transaction mix. Login 6%, Logout 6%, Create New Defect
(Initiate) 12%, Create New Defect (Commit) 12%, Find Defect 16%, Modify Defect (Initiate) 6%,
Modify Defect (Commit) 6%, Run Query (100 records) 12%, Load Record from Result Set 12%,
Other Transactions 12%.
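The mix in Figure 2 can be expressed as a weighted sampler, the way a load generator might pick each virtual user's next transaction. This is an illustrative sketch under that assumption, not RPT configuration.

```python
import random

# Transaction mix from Figure 2 (percent of all transactions).
MIX = {
    "Login": 6, "Logout": 6,
    "Create New Defect (Initiate)": 12, "Create New Defect (Commit)": 12,
    "Find Defect": 16,
    "Modify Defect (Initiate)": 6, "Modify Defect (Commit)": 6,
    "Run Query (100 records)": 12, "Load Record from Result Set": 12,
    "Other Transactions": 12,
}

def next_transaction(rng=random):
    """Pick the next transaction for a virtual user, weighted by the mix."""
    names = list(MIX)
    return rng.choices(names, weights=[MIX[n] for n in names], k=1)[0]

# Sanity check: the percentages cover the whole workload.
assert sum(MIX.values()) == 100
```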
7.1 ClearQuest Web Architecture
ClearQuest Web 7.1 uses Web 2.0 and Ajax (Asynchronous JavaScript and XML) technologies.
Key advantages of these technologies include automatic pre-fetching of data, data caching in the
browser layer, and the ability to execute concurrent tasks (run several queries at once, commit a
defect while finding another, and so on). Consequently, our 7.1 single-user benchmarks track the
first instance of a transaction when no data is called from the browser’s cache (first request) and
the second and subsequent instances of a transaction when static data is called from the browser’s
cache (second request).
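A minimal sketch of the first-request/second-request distinction: static form data fetched on the first request is served from a browser-side cache afterwards, so the second request skips the server round trip. The function, data, and timings here are invented for illustration.

```python
import time

_cache = {}  # stands in for the browser-layer cache

def get_form_data(record_type):
    """Return form data, hitting the 'server' only on a cache miss."""
    if record_type in _cache:       # second and later requests
        return _cache[record_type]
    time.sleep(0.05)                # placeholder for the server round trip
    data = {"type": record_type, "fields": ["Headline", "Severity"]}
    _cache[record_type] = data
    return data

t0 = time.perf_counter(); get_form_data("Defect"); first = time.perf_counter() - t0
t0 = time.perf_counter(); get_form_data("Defect"); second = time.perf_counter() - t0
# The second request avoids the round trip, so it completes much faster.
```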
Findings
These test results are specific to the product software, test configuration, workload and
environment used. Product performance in other environments or conditions might be different
compared to these results.
Single-user benchmark (SUB)
Figure 3 shows the single-user benchmark (SUB) for CQ Web response times under the test
conditions described above. For the CQ Web 7.1 release, response times for the first and second
request were measured.
Figure 3: Rational ClearQuest Web 7.1 compared with ClearQuest Web 7.0.1 single-user
benchmark. The chart plots response time in seconds (0.00 to 2.00) for three series (7.0.1,
7.1 first request, 7.1 second request) across the transactions: Login, Create New Defect
(initiate), Create New Defect (commit), Find Defect, Modify Defect (initiate), Modify Defect
(commit), Execute Saved Query (100 records), Load Record from Result Set, and Logout.
Multi-user performance and scalability
During the development cycle, iterative performance testing tracked overall progress and revealed
product areas to target for performance improvement. Multi-user scalability tests were run at a
per-user transaction rate averaging 24 transactions per hour (15 transactions per hour is a typical
developer rate). Response times for the 150-user run are averaged across 40 continuous hours of
testing.
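The stated per-user rate implies the aggregate load on the server; a quick arithmetic check:

```python
# Back-of-the-envelope load arithmetic for the multi-user runs.
users = 150
per_user_tph = 24                      # transactions per hour per user

seconds_between = 3600 / per_user_tph  # average pacing per virtual user
aggregate_tph = users * per_user_tph   # total offered load on CM Server
aggregate_tps = aggregate_tph / 3600

print(seconds_between)   # 150.0 s between transactions for each user
print(aggregate_tph)     # 3600 transactions per hour in total
print(aggregate_tps)     # 1.0 transaction per second arriving at the server
```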
Figure 4 compares CQ Web response times for single-user and 150-user workloads. The results
show that CQ Web transactions scale well as user load increases on a CM Server.
Figure 4: Rational ClearQuest Web 7.1 scalability: averaged transaction response times with
1 and 150 simulated users averaging 24 transactions per hour per user. The chart plots
response time in seconds (0.00 to 2.50) for the 1-user and 150-user series across the
transactions: Login, Create New Defect (Initiate), Create New Defect (Commit), Find Defect,
Modify Defect (Initiate), Modify Defect (Commit), Run Query (100 records), Load Record from
Result Set, and Logout.
Conclusions
Based on the test results, CQ Web 7.1 response times maintain close parity to CQ Web 7.0.1
while providing a richer client experience. Under moderate load, CQ Web 7.1 response times
increase only marginally.
At the first instance of most CQ Web 7.1 transactions, the browser receives preferences and form
data that will be used for subsequent user transactions, such as viewing and modifying records.
Table 2 compares the login transaction between CQ Web 7.1 and CQ Web 7.0.1 and shows that
the CQ Web 7.1 login performs more actions than did CQ Web 7.0.1.
7.0.1 Login Sequence: Request Welcome Page; Request Login Page; Login to Requested User
Schema; Get Restricted User Properties; Get Default Record Type; Get Workspace Nodes.
7.1 Login Sequence: Request Welcome Page; Request Login Page; Login to Requested User
Schema; Get Restricted User Properties; Get Preferences; Get Default Record Type; Get
Workspace Nodes; Set Preferences; Get CQ Form Data.
Table 2: Comparison of login between Rational ClearQuest Web 7.0.1 and ClearQuest Web 7.1
To optimize CM Server scalability under a large user load, performance tuning may be required.
For hardware planning, consider server systems with at least four CPU cores (the same as with
7.0.1) and at least 4 GB of memory for CM Server deployments on the Windows platform. See the
ClearQuest 7.1 Minimum Hardware Requirements.
Legal Notices
Copyright IBM Corporation 2009. All Rights Reserved.
Dell and Precision are registered trademarks of Dell Inc. in the United States, other countries, or both.
Java is a registered trademark of Sun Microsystems Incorporated in the United States, other countries, or both.
IBM, Rational, ClearCase, ClearQuest, Rational Performance Tester, and System x are trademarks of International Business Machines Corporation in the United States, other countries, or both.
Intel, Pentium, and Xeon are registered trademarks of Intel Corporation or its subsidiaries in the United States, other countries, or both.
FreeBSD is a registered trademark of The FreeBSD Foundation in the United States, other countries, or both.
Windows is a registered trademark of Microsoft Corporation in the United States, other countries, or both.
Other company, product, or service names may be the trademarks or service marks of others.