- Device Simulation/Emulation
The eValid SetUserAgent command sets the User Agent String (UAS) so that browser activity is reported to the server as if it were being requested from the specified device. eValid performs this simulation/emulation without needing the actual device hardware or software. The server's responses to client-side requests are the same as they would be when talking to the actual mobile device.
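For illustration only, and not in eValid's own script syntax, the following Python sketch shows the same idea: sending a request with a device-specific User Agent String (here an assumed iPhone UAS and an example URL) so that the server returns device-specific content.

    # Not eValid script syntax; a conceptual sketch using Python's requests library.
    import requests

    # Example iPhone user-agent string (assumed); substitute the UAS of the device to emulate.
    IPHONE_UAS = ("Mozilla/5.0 (iPhone; CPU iPhone OS 16_0 like Mac OS X) "
                  "AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.0 Mobile/15E148 Safari/604.1")

    # The server sees the request as coming from an iPhone and, if it adapts
    # content per device, returns the mobile version of the page.
    response = requests.get("https://www.example.com/", headers={"User-Agent": IPHONE_UAS})
    print(len(response.content), "bytes of device-specific content")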
- Content Validation
Once a test script is running well, you need to validate that the eValid-rendered output matches what you expect on the actual device. Here is a Rendering Comparison for Common Devices vs. Device Emulator. As you can see from this comparison -- setting aside pixel-density factors -- the rendering by eValid is identical to that of the mobile device. So you can easily confirm -- and automatically validate -- that eValid is rendering your mobile application correctly.
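One way to automate such a rendering check, sketched below in Python with the Pillow library, is to compare a screenshot captured on the real device against a screenshot of the eValid-rendered page; the file names and tolerance here are assumptions, and pixel-density differences may require normalization first.

    # Sketch of an automated rendering comparison (file names are placeholders).
    from PIL import Image, ImageChops

    def renders_match(device_png, emulated_png, tolerance=0):
        """Return True if the two screenshots differ by at most `tolerance` per channel."""
        a = Image.open(device_png).convert("RGB")
        b = Image.open(emulated_png).convert("RGB")
        if a.size != b.size:
            b = b.resize(a.size)  # crude normalization for pixel-density differences
        diff = ImageChops.difference(a, b)
        # getextrema() returns (min, max) per channel; check the largest max difference.
        return max(high for _, high in diff.getextrema()) <= tolerance

    print(renders_match("iphone_actual.png", "evalid_emulated.png"))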
- Functional/Performance Testing
To illustrate how this works, we ran an experiment that applied a simple functional test to 100 Mobile Devices and recorded the download byte counts and times. The data is shown here, with some Selected Screenshots of how the server content varied as a function of the device simulated/emulated. Note how much the downloaded byte counts vary across devices. An interesting secondary aspect is the asymmetric ratio of Download Bytes vs. Download Time, which illustrates that some servers are not well prepared to deliver mobile content to every kind of device.
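The measurement itself is simple to reproduce in outline. The Python sketch below loops over a hypothetical table of device user-agent strings, fetches one page per device, and records bytes and elapsed time; only two devices are shown, whereas the experiment above used 100.

    # Sketch of the byte-count/time survey; device UAS values and URL are placeholders.
    import time
    import requests

    DEVICE_UAS = {
        "iPhone":  "<full iPhone user-agent string>",
        "Android": "<full Android user-agent string>",
    }
    URL = "https://www.example.com/"

    for device, uas in DEVICE_UAS.items():
        start = time.perf_counter()
        resp = requests.get(URL, headers={"User-Agent": uas})
        elapsed = time.perf_counter() - start
        # Both byte count and download time typically vary per device, because
        # many servers tailor their content to the reported user agent.
        print(f"{device:10s}  {len(resp.content):8d} bytes  {elapsed:6.2f} s")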
- Server Loading
eValid functional tests extend naturally to a server-loading context, with multiple eValid instances (which we call "Browser Users" or BUs) working in parallel to create high load levels. This is illustrated by an experiment that drives a mobile application from a simulated smartphone up to 1,000 BUs -- up to 1,000 simulated users. The resulting Chart of Derived Internal Response Times shows how the server involved began to have response-time problems above ~200 simultaneous users.
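The shape of that experiment -- ramping the number of simultaneous users and recording response times at each level -- can be sketched as follows. This is not how eValid BUs work internally (each BU is a full browser instance); the Python threads here merely stand in for parallel users against an assumed target URL.

    # Rough load-ramp sketch; one thread approximates one "browser user" (BU).
    import time
    import statistics
    from concurrent.futures import ThreadPoolExecutor
    import requests

    URL = "https://www.example.com/"
    UAS = "<smartphone user-agent string>"

    def one_user(_):
        start = time.perf_counter()
        requests.get(URL, headers={"User-Agent": UAS})
        return time.perf_counter() - start

    for bus in (10, 50, 100, 200, 500, 1000):   # ramp of simulated browser users
        with ThreadPoolExecutor(max_workers=bus) as pool:
            times = list(pool.map(one_user, range(bus)))
        # A response-time knee, like the one seen above ~200 users, shows up
        # as the median climbing sharply at some point in the ramp.
        print(f"{bus:5d} BUs  median response {statistics.median(times):.2f} s")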
- Mobile Device Monitoring
To see how pages perform over time, we set up a demo of six monitoring objects. Three objects represent using a particular mobile device to view three popular e-commerce sites in sequence, and three more objects show how the three sites respond to each of the devices. The devices used in this demo are a smartphone (an iPhone), a tablet computer (an iPad), and a standard desktop computer (a PC). The monitoring interval is 15 minutes, and web access is over a standard 5 Mbps DSL connection. The results can be seen on an active eValidator Monitoring Portal using the account name mobile and the password mobile. In all six examples, the variation in response as a function of time of day is quite intriguing.
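The structure of that monitoring demo -- three device profiles cycling over three sites on a fixed interval -- can be sketched as a simple periodic loop; the user-agent strings and site URLs below are placeholders, and a real deployment would store the timings rather than print them.

    # Sketch of the monitoring loop: 3 devices x 3 sites, repeated every 15 minutes.
    import time
    import requests

    DEVICES = {
        "iPhone": "<iPhone user-agent string>",
        "iPad":   "<iPad user-agent string>",
        "PC":     "<desktop user-agent string>",
    }
    SITES = ["https://site-one.example", "https://site-two.example", "https://site-three.example"]
    INTERVAL = 15 * 60   # seconds between monitoring passes, as in the demo

    while True:
        for device, uas in DEVICES.items():
            for site in SITES:
                start = time.perf_counter()
                requests.get(site, headers={"User-Agent": uas})
                print(device, site, f"{time.perf_counter() - start:.2f} s")
        time.sleep(INTERVAL)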