eValid -- Automated Web Quality Solution
Browser-Based, Client-Side, Functional Testing & Validation,
Load & Performance Tuning, Page Timing, Website Analysis,
and Rich Internet Application Monitoring.
eValid -- URL vs. HTTP Relative Performance Comparison
Main Idea:
The time it takes to download a page to a user's browser is often
much less than the total of the times it takes to download all of the
page's components individually.
The difference arises because the browser is multi-threaded and
therefore does much of the download work in parallel.
Summary
The table below compares the measured empty-cache download times
for various website top pages
against the total empty-cache download time of the same page's
set of component URLs when loaded serially.
The process used to collect this data was the following:
- Navigate to the website.
- Make a single-URL recording, script.evs.
- Play it back against an empty cache and note the byte count
  and the total download time.
- Set eValid up to create a URL.script.evs automatically
  (using the URL Trace Creation feature)
  and rerun the playback.
  This process creates the derived URL.script.evs file,
  which contains a GetURL command for each separate component of the page.
- Load the derived URL trace script, play it back,
  and note the total download time.
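The serial-versus-parallel effect this procedure measures can be sketched outside eValid. The fragment below is a minimal Python illustration (not part of eValid); simulated per-component delays stand in for real HTTP fetches. It compares the summed serial time against the wall-clock time when the same work runs on multiple threads, the way a browser does:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Simulated per-component download times in seconds; stand-ins for
# the individual GetURL fetches a real page would require.
COMPONENT_DELAYS = [0.2, 0.3, 0.1, 0.2, 0.2]

def fetch(delay):
    """Stand-in for downloading one page component."""
    time.sleep(delay)
    return delay

# Serial: total time is the sum of the individual download times,
# analogous to playing back the derived URL trace script.
start = time.perf_counter()
for d in COMPONENT_DELAYS:
    fetch(d)
serial_elapsed = time.perf_counter() - start

# Parallel: a multi-threaded browser overlaps the downloads, so the
# wall-clock time approaches the longest single component.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=5) as pool:
    list(pool.map(fetch, COMPONENT_DELAYS))
parallel_elapsed = time.perf_counter() - start

print(f"serial   {serial_elapsed:.2f}s")
print(f"parallel {parallel_elapsed:.2f}s")
print(f"ratio    {serial_elapsed / parallel_elapsed:.2f}")
```

Here the printed ratio plays the same role as the Ratio column in the table below: serial total divided by parallel wall-clock time.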
Results
Here are the collected results from a sample of typical websites.
#  | Page Byte Count | Website               | Single URL Download Time (sec) | # GetURLs | Total GetURLs Download Time (sec) | Ratio
---|-----------------|-----------------------|--------------------------------|-----------|-----------------------------------|------
1  | 1,020,240       | www.cnn.com           | 7.484                          | 187       | 53.063                            | 7.090
2  | 493,418         | www.latitude28.com    | 4.656                          | 19        | 9.265                             | 1.980
3  | 669,578         | www.ibm.com           | 10.422                         | 89        | 26.297                            | 2.520
4  | 152,027         | www.cityofchicago.org | 6.734                          | 32        | 17.531                            | 2.750
5  | 256,354         | www.wikipedia.org     | 4.078                          | 29        | 14.328                            | 3.513
6  | 471,921         | www.slashdot.org      | 4.765                          | 59        | 21.726                            | 4.568
7  | 85,568          | www.inventblog.com    | 3.516                          | 23        | 10.350                            | 2.955
8  | 265,535         | www.wittmanhart.com   | 2.813                          | 25        | 7.593                             | 2.699
9  | 274,968         | www.e-valid.com       | 6.953                          | 20        | 12.672                            | 1.823
10 | 208,363         | book.google.com       | 4.031                          | 35        | 16.890                            | 4.190
11 | 624,939         | www.skicb.com         | 12.984                         | 58        | 35.313                            | 2.719
12 | 720,357         | www.keynote.com       | 5.875                          | 67        | 23.343                            | 3.803
13 | 119,536         | www.groundwork.it     | 7.453                          | 48        | 15.609                            | 2.094
Observations & Conclusions
Here are some observations about the above data.
- The number of component URLs -- from 20 to nearly 200 --
is typical of the way websites are composed.
Remember, each separate image is treated as a distinct URL.
- The ratio of the download times is a rough approximation
to the "parallelism" of the full-browser download process.
The eValid browser -- like the underlying IE browser --
runs multiple independent threads that download individual
components of a page.
If the ratio is 3:1, this implies that, on average,
three threads were running while the browser downloaded the page.
(The actual number of threads active at any one time varies with
the page size and the sizes of the individual component URLs.)
- What the user sees is the Single URL Download Time.
However, there are monitoring and performance measurement solutions
that attempt to approximate browser behavior (end-user experience
measurement) by downloading the parent URL and then, in sequence,
all of the component URLs.
This approach measures cumulative download time, not parallel download time.
- The system overhead figures (the time eValid takes to make the
detailed measurements) for the above data appear to be
< 1% of the total time.
- The error introduced by serial measurement (the Ratio, above)
appears to be largest for the largest pages.
- On some of these sites the page size and the number of component elements vary over time,
so running the same test several times yields slightly different results.
The data above appear to be representative.
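Under the interpretation above, the Ratio column is simply the serial GetURL total divided by the single-URL wall-clock time. As a worked check (a minimal Python sketch; the numbers come straight from row 1 of the table):

```python
# Row 1 of the table: www.cnn.com
single_url_time = 7.484     # single-URL (parallel) download time, seconds
total_geturl_time = 53.063  # serial sum of the 187 component downloads

# Approximate average parallelism of the browser download.
ratio = total_geturl_time / single_url_time
print(f"{ratio:.3f}")  # prints 7.090, matching the Ratio column
```

The same division reproduces the Ratio entry for every row, which is why the column can be read as a rough estimate of how many download threads were active on average.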