Firefox Performance Testing: A Python framework for Windows
Contents
- 1 Project Name
- 2 Project Description
- 3 Project Leader(s)
- 4 Project Contributor(s)
- 5 Project Details
- 6 Project Problems and Solutions
- 6.1 Problem: Firefox doesn't know how to open this address, because the protocol (c) isn't associated with any programs
- 6.2 Solution: Firefox doesn't know how to open this address, because the protocol (c) isn't associated with any programs
- 6.3 Problem: ZeroDivisionError: integer division or modulo by zero
- 6.4 Solution: ZeroDivisionError: integer division or modulo by zero
- 6.5 Problem: This page should close Firefox. If it does not, please make sure that the dom.allow_scripts_to_close_windows preference is set to true in about:config
- 6.6 Solution: This page should close Firefox. If it does not, please make sure that the dom.allow_scripts_to_close_windows preference is set to true in about:config
- 7 Project News
- 7.1 Saturday, September 23, 2006
- 7.2 Sunday, September 24, 2006
- 7.3 Monday, September 25, 2006
- 7.4 Friday, September 29, 2006
- 7.5 Sunday, October 1, 2006
- 7.6 Wednesday, October 4, 2006
- 7.7 Friday, October 6, 2006
- 7.8 Wednesday, October 11, 2006
- 7.9 Thursday, October 12, 2006
- 7.10 Friday, 20 Oct 2006
- 7.11 Tuesday, 31 Oct 2006
- 7.12 Tuesday, 21 Nov 2006
- 7.13 Wednesday, 29 Nov 2006
- 7.14 Sunday, 3 Dec 2006
- 7.15 Sunday, 10 Dec 2006
- 7.16 Wednesday, 13 Dec 2006
- 8 Project References
- 9 Project Events
Project Name
Firefox Performance Testing: A Python framework for Windows
Project Description
The goal of this project is to:
- get the current framework up and running to help work with others
- get the framework running in an automated fashion
- help with the creation and execution of new tests
- upgrade the framework to work with a Mozilla graph server
- work with the Mozilla community and contribute to an open source project
From this project, you will:
- learn python
- learn about white box testing methodologies
- work with an open source community
- more generally learn about the functioning of QA in an open source community
This will benefit you in the future: when presented with a new program, you'll be able to give an idea of how to approach testing, provide adequate coverage, and offer some metric of program stability and functionality
Note: This is NOT the typical mundane black box testing
Project Leader(s)
Project Contributor(s)
Dean Woodside (dean)
- Submitted an sh file that automates the tedious performance testing framework configuration
Alice Nodelman
- Discussed the things that need to be fixed to improve and strengthen the framework
- Gave suggestions on the new Performance Testing framework
Ben Hearsum (bhearsum)
- Set up the VM for performance testing
- Helped get me started with the debugging process for report.py, run_tests.py and ts.py
Michael Lau (mylau)
- Added comments on the documentation for setting up the Performance Testing framework on Windows
- Tested the new and improved Performance Testing framework (version 1)
- Gave constructive feedback on documentation (version 1)
Eva Or (eor)
- Tested the new and improved Performance Testing framework (version 1)
- Gave constructive feedback on new documentation (version 1)
David Hamp Gonsalves (inveigle)
- Gave pointers on flushing the output buffer
- Helped with some grammar and sentence structuring for documentation (version 1)
- Tested and gave constructive feedback on the framework (version 1)
- Looked into a batch file for automating configuration, gave pointers
- Tested and commented on the new framework (version 2)
Tom Aratyn (mystic)
- Introduced Closures in Python
In-Class Contributors
Please let me know if I missed you. I've only listed the people from whom I've received comments. If you participated but aren't listed as an in-class contributor, please list your comments here
- Mark D'Souza (mdsouza)
- Sherman Fernandes (sjfern)
- Aditya Nanda Kuswanto (vipers101)
- Dave Manley (seneManley)
- Colin Guy (Guiness)
- Mohamed Attar (mojo)
- Man Choi Kwan (mckwan)
- Mark Paruzel (RealMarkP or FakeMarkP)
- Jeff Mossop (JBmossop)
- Melissa Peh (melz)
- Paul Yanchun Gu (gpaul)
- Vanessa Miranda (vanessa)
- Philip Vitorino (philly)
- Paul St-Denis (pstdenis)
- Mohammad Tirtashi (moe)
- Cesar Oliveira (cesar)
- Andrew Smith (andrew)
- Ben Hearsum (bhearsum)
- Erin Davey (davey_girl)
- Tom Aratyn (mystic)
- David Hamp
- Dean Woodside (dean)
Project Details
Improved Documentation
Latest
New Firefox Performance Testing Documentation
First Attempt
Performance Testing Setup Configuration Documentation
Details
This is different from Tinderbox. Two major differences are:
- First, it doesn't build, it just runs the performance test given a path to the executable. This is helpful if you're testing the performance of an extension or a build from another server. (You could build on a fast server, and then run performance tests on a machine with low memory).
- Second, it measures performance characteristics while it's running the pageload tests - you can track CPU speed, memory, or any of the other counters listed here (a sketch of reading one such counter follows this list).
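For illustration, here is a minimal sketch of sampling such a counter with the win32pdh module from pywin32; the counter path, sample count, and interval below are examples, not the framework's actual settings:

```python
# Sample the "% Processor Time" counter a few times using pywin32's win32pdh.
# This is only an illustration; the framework keeps its own list of counters.
import time
import win32pdh

counter_path = win32pdh.MakeCounterPath(
    (None, "Processor", "_Total", None, -1, "% Processor Time"))

query = win32pdh.OpenQuery()
counter = win32pdh.AddCounter(query, counter_path)

win32pdh.CollectQueryData(query)      # first sample primes the counter
for _ in range(5):
    time.sleep(1)
    win32pdh.CollectQueryData(query)
    _, value = win32pdh.GetFormattedCounterValue(counter, win32pdh.PDH_FMT_DOUBLE)
    print("CPU busy: %.1f%%" % value)

win32pdh.RemoveCounter(counter)
win32pdh.CloseQuery(query)
```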
Web logs
- Mozilla Quality Assurance and Testing Blog for Mozilla Firefox and Thunderbird
- Mozilla Quality Assurance Performance Testing
Submission
Still in progress!
While it's sizzling, try out the Firefox Performance Testing Framework (version 2) via the Quick Start section (3 easy steps) of the New Firefox Performance Testing Documentation.
Artifact | Details | Links |
---|---|---|
run_tests.py | | |
perfconfig.sh | This script is used to automate the tedious Firefox Performance Testing configuration | |
Progress bar class | A minimal sketch of the idea appears below this table | |
paths.py | | |
Firefox Performance Testing Documentation (Version 1) | Firefox Performance Testing Framework (refer to Readme.txt) | |
Firefox Performance Testing Documentation (Version 2) | Goals achieved | |
Status Documentation | | |
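The progress bar class itself isn't reproduced on this page; the sketch below only illustrates the general idea of a console progress bar (the class name and interface are assumptions, not the submitted code):

```python
import sys

class ProgressBar:
    """Very small console progress bar: call update(done) as work completes."""
    def __init__(self, total, width=40):
        self.total = total
        self.width = width

    def update(self, done):
        filled = self.width * done // self.total
        bar = '#' * filled + '-' * (self.width - filled)
        sys.stdout.write('\r[%s] %d/%d' % (bar, done, self.total))
        sys.stdout.flush()
```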
Extended Progress Chart (Version 2)
Task | Details | Priority | Contributors | Status |
---|---|---|---|---|
Read all student comments about the framework and improve it | | High | | 100% completed |
Drill and narrow down the student comments to improve the framework | | High | | In progress |
Fix Performance Testing Setup Configuration Documentation and framework configuration | | High | | New status: New Firefox Performance Testing Documentation. Will start on this after drilling down the student comments. In progress |
Pre-requisites: Python, Cygwin and .dll installation guidelines | | Medium | | New Firefox Performance Testing Framework comments - 100% done |
Configuring environment specifications in documentation | BASE_PROFILE_DIR | Medium | | I have nothing to work with besides the README.TXT file by Annie Sullivan, so I have to consult Alice Nodelman in this area. In progress |
Automate environment configuration | | Medium | | Files: 100% completed |
Porting framework to other OSes | | Medium | | 0% completed |
Progress (Version 1)
Task | Details | Priority | Contributors | Status |
---|---|---|---|---|
Performance Testing Setup Configuration Documentation | | High | | 90% completed - needs to be reviewed and tested out |
Study performance testing framework | The framework has to be strengthened and improved. A discussion with Alice Nodelman is planned about what could be done to make the framework stronger. | High | | Tested the framework and went through its code. Made a list of the framework's weaknesses, planned various resolutions, and established what has to be done. 100% completed |
Configuration checker | The configuration checker will check whether all the configuration is done before running the performance tests. The checker lives in run_tests.py. It can only be completed once the yaml file validator and paths.py validator are done. | High | | Started and ongoing |
yaml file validator | In run_tests.py: the yaml file validator is weak. It only checks for certain items in the file and will crash if those items are missing or have no value. It doesn't check for unexpected values and doesn't give the user a clue that their yaml file has a problem. | High | | Changed the validator to check whether items exist before storing their values. If one of the items doesn't exist, the program terminates and tells the user that the yaml file has to be fixed (a rough sketch of this approach appears below this chart). 100% completed (Alice reviewed) |
paths.py validator | Currently run_tests.py doesn't validate the paths.py file. If the user misses a path or configures a directory badly, the program crashes with the ZeroDivisionError traceback shown under Project Problems and Solutions below. run_tests.py has to be changed to validate the paths.py file. | High | | Most updated progress; Older progress. The expected extension_perf_testing directory layout is shown below this chart. 100% completed |
Get all the configuration in one place | The framework is currently very confusing and the configuration is all over the place! This has to be fixed, but as long as good documentation is provided it is not the main priority. | Medium | | Changed the directory structure |
Set up and test out the current Performance Testing framework using the provided documentation (not mine) to discover more flaws in either the framework or the documentation, and to take a more user-oriented approach when improving both | More input on the current Performance Testing framework is welcome to help improve it. | Medium | | Ongoing |
Get class to test out the Performance Testing framework | The improved framework has to be tested to get constructive feedback from users of the new Performance Testing Setup Configuration Documentation. | Low | | 100% completed |
Expected extension_perf_testing directory layout:
```
extension_perf_testing (dir)
|__ base_profile (dir)
    |__ bookmarkbackups (dir)
    |   |__ .html files
    |__ Cache (dir)
    |__ .bak, .html, .ini, .dat, .txt, .rdf, .mfl files (most important file - perf.js)
```
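The reworked yaml validator itself isn't shown on this page; the snippet below is only a rough sketch of the approach described in the chart (the required key names are made up for illustration):

```python
# Load the yaml config file and stop with a clear message if a required
# key is missing or empty, instead of crashing later in the run.
import sys
import yaml

REQUIRED_KEYS = ['firefox', 'init_url', 'tests']   # example names only

def load_config(filename):
    config = yaml.load(open(filename))
    if not isinstance(config, dict):
        sys.exit("%s does not look like a valid config file" % filename)
    for key in REQUIRED_KEYS:
        if key not in config or config[key] in (None, ''):
            sys.exit("%s is missing a value for '%s' - fix the yaml file and re-run"
                     % (filename, key))
    return config
```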
Comments on the README.TXT Documentation (By Mike Lau)
Getting Started
- I was not sure what to do after unzipping the win32.zip file.
Comments on The README.TXT File
- The README.TXT file is hard to read in notepad. There's horizontal scrolling after opening the file.
- The file should have output samples to show the user what output to expect and to ensure they are on the right track.
- The pre-requisites part of the README.TXT file is missing the minimum disk space that must be available to complete the install.
- Following the procedures for installation was difficult. The document should number the procedures and have sample outputs to ensure the user is on the right track.
- Some of the folders listed in the Directory Structure were missing from the setup procedures. The Directory Structure should be used as a guide to ensure users have the right folders in place. Also, there should be an image to represent the directory structure. Some of the folders that needed to be created on top of following the provided procedures were: base_profile, extension_perf_testing, extension_perf_reports
- The Setup part was hard to follow. Most of the steps were not intuitive.
- Step 4 of the procedures was unclear. I was not sure what kind of YAML config file to create. The document should tell the user explicitly what the file needs to be called, or tell the user they can name it however they wish. It should also show an example filename.
Running The Application
After following the setup procedures, I typed the following command:
c:\> run_tests.py config.YAML
I got the following error message in the command prompt and a popup window (Liz Chak - solution to this problem):
Project Problems and Solutions
Problem: Firefox doesn't know how to open this address, because the protocol (c) isn't associated with any programs
If you didn't configure the paths.py paths correctly, you may run into this problem when you run the Performance Testing Framework:
Solution: Firefox doesn't know how to open this address, because the protocol (c) isn't associated with any programs
In paths.py, the INIT_URL, TS_URL and TP_URL paths have to be local file URLs (file:///c:/...), not file paths:
"""The path to the file url to load when initializing a new profile""" INIT_URL = 'file:///c:/project/mozilla/testing/performance/win32/initialize.html' """The path to the file url to load for startup test (Ts)""" TS_URL = 'file:///c:/project/mozilla/testing/performance/win32/startup_test/startup_test.html?begin=' """The path to the file url to load for page load test (Tp)""" TP_URL = 'file:///c:/project/mozilla/testing/performance/win32/page_load_test/cycler.html'
Problem: ZeroDivisionError: integer division or modulo by zero
```
Traceback (most recent call last):
  File "C:\proj\mozilla\testing\performance\win32\run_tests.py", line 129, in ?
    test_file(sys.argv[i])
  File "C:\proj\mozilla\testing\performance\win32\run_tests.py", line 122, in test_file
    TP_RESOLUTION)
  File "C:\proj\mozilla\testing\performance\win32\report.py", line 152, in GenerateReport
    mean = mean / len(ts_times[i])
ZeroDivisionError: integer division or modulo by zero
```
Solution: ZeroDivisionError: integer division or modulo by zero
Check that there are contents in the base_profile directory that you have set for BASE_PROFILE_DIR in paths.py:
- By default in paths.py: BASE_PROFILE_DIR = r'C:\extension_perf_testing\base_profile'
- BASE_PROFILE_DIR can be a different path; it doesn't have to match the one above.
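As a quick sanity check before running the tests, something like the snippet below can confirm the directory isn't empty (this is only an illustration, not part of the framework; the path matches the default above):

```python
# Refuse to run if BASE_PROFILE_DIR is missing or empty, since an empty
# profile directory leads to the ZeroDivisionError described below.
import os
import sys

BASE_PROFILE_DIR = r'C:\extension_perf_testing\base_profile'

if not os.path.isdir(BASE_PROFILE_DIR) or not os.listdir(BASE_PROFILE_DIR):
    sys.exit("BASE_PROFILE_DIR is missing or empty: %s\n"
             "Copy the contents of testing\\performance\\win32\\base_profile "
             "into it before running run_tests.py." % BASE_PROFILE_DIR)
```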
Problem: This page should close Firefox. If it does not, please make sure that the dom.allow_scripts_to_close_windows preference is set to true in about:config
- You have to change the dom.allow_scripts_to_close_windows preference to true
Solution: This page should close Firefox. If it does not, please make sure that the dom.allow_scripts_to_close_windows preference is set to true in about:config
- Type in about:config in the address bar
- Scroll down and look for dom.allow_scripts_to_close_windows
- Double click on it to set it to true
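Alternatively, the preference can be written straight into the test profile instead of clicking through about:config. The snippet below is only an illustration and the profile path is an example; the user_pref line is the standard Firefox preference syntax:

```python
# Append the preference to the profile's user.js so Firefox picks it up
# the next time that profile is loaded.
import os

profile_dir = r'C:\extension_perf_testing\base_profile'   # example path
prefs = open(os.path.join(profile_dir, 'user.js'), 'a')
prefs.write('user_pref("dom.allow_scripts_to_close_windows", true);\n')
prefs.close()
```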
Project News
Saturday, September 23, 2006
Performance tests didn't run successfully.
- There weren't any results generated in the extension_perf_testing\base_profile and extension_perf_reports folders.
- Output after the performance tests were run:
```
Traceback (most recent call last):
  File "C:\proj\mozilla\testing\performance\win32\run_tests.py", line 129, in ?
    test_file(sys.argv[i])
  File "C:\proj\mozilla\testing\performance\win32\run_tests.py", line 122, in test_file
    TP_RESOLUTION)
  File "C:\proj\mozilla\testing\performance\win32\report.py", line 152, in GenerateReport
    mean = mean / len(ts_times[i])
ZeroDivisionError: integer division or modulo by zero
```
Sunday, September 24, 2006
Gained a further understanding of the approach to testing with the Python framework.
Monday, September 25, 2006
elichak will be working on a resolution with Alice to get the results generated in the extension_perf_testing\base_profile and extension_perf_reports folders.
Friday, September 29, 2006
elichak re-configured the environment of the machine to run the tests again. Cleaned up old files to do a clean test. Reinstalled Cygwin (replaced Make 3.80 with Make 3.81) and updated the testing files through CVS.
Sunday, October 1, 2006
Alice has successfully run the tests. The Zero Division Error didn't occur again after she updated her test files. There were results generated in the extension_perf_testing\base_profile and extension_perf_reports folders. elichak attempted to run the test with Alice's code but the Zero Division Error still occurred on her machine.
Wednesday, October 4, 2006
Elichak consulted Robcee about the Zero Division Error and he suggested a few things, like debugging the script. Elichak found that the value of ts_time in the report.py file is empty but couldn't determine why ts_time isn't being assigned. According to Alice, she didn't debug the scripts and only had to update the files to make them work.
Friday, October 6, 2006
Ben set up the VM for elichak to run her performance testing in that environment.
Wednesday, October 11, 2006
- elichak configured the environment in the VM for her testing. The tests still gave the same results as before:
- Zero Division Error at lines 122 and 129 in run_tests.py and line 153 in report.py
- 2 files in the extension_perf_reports dir are generated but there are no graphs
- elichak also changed the TS_NUM_RUNS, TP_NUM_CYCLES, TP_RESOLUTION values to 1 in run_tests.py to shorten the cycles of the performance testing for the purpose of debugging the scripts.
- The error occurs in report.py because ts_time is empty; therefore, this fails:
```python
for ts_time in ts_times[i]:
    mean += float(ts_time)
mean = mean / len(ts_times[i])
```
- Ben assisted elichak with the debugging process. elichak and Ben hacked deeper down into the scripts.
- We speculate that what prevents the value of ts_time from being generated is in ffprocess.py: RunProcessAndWaitForOutput always returns None at line 232
return (None, True)
- Further debugging by elichak is in progress
Thursday, October 12, 2006
Work completed
The Zero Division Error is solved. It turns out that it was just a configuration problem. The documentation for setting up the environment is rather subtle and needs a rework.
Solution
The contents of C:\proj\mozilla\testing\performance\win32\base_profile should also be in the C:\extension_perf_testing\base_profile directory.
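A one-off helper along these lines can do the copy (the paths are the ones mentioned above; the snippet is only an illustration, not part of the framework):

```python
# Mirror the checked-out base_profile into the directory the framework reads.
# Warning: this removes the destination first so stale files don't linger.
import os
import shutil

SRC = r'C:\proj\mozilla\testing\performance\win32\base_profile'
DST = r'C:\extension_perf_testing\base_profile'

if os.path.isdir(DST):
    shutil.rmtree(DST)
shutil.copytree(SRC, DST)
```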
All work for this project is done on the VM, hera.senecac.on.ca
Work in progress
- Trying out a few things in the framework to find out which direction I would like to take it: building new tests, improving existing ones, strengthening the framework itself, or porting it to other OSes
- Revise the Firefox Performance Testing documentation
Friday, 20 Oct 2006
Last week, elichak decided to work on automating the setup of the environment and the performance testing. The environment and performance testing setup is currently all over the place and is tedious for a developer to complete.
The automation will entail:
Generating directories, dropping files into directories, installing libraries, options to configure the performance tests, etc.
Tuesday, 31 Oct 2006
Alice and Liz had a meeting and established the key things that need to be done:
- ease configuration of python framework
- too many config files to edit
- have to know whole framework to configure it
- not flexible
- tedious
- too many directories to create
- too many extra libraries to load
- a lot of dependencies!
- things have to be copied to special directories
- bad configurations don't cause errors!
How do we fix this?
- configuration checker
- yaml file validator
- paths.py validator (a rough sketch follows this list)
  - checking the paths for existence
  - notify the user if a path doesn't exist and ask whether they want it created
  - checking whether the directories have contents
- next steps
  - get all the configuration in one place! (paths.py, config.yml, constants)
  - have to run both ts and tp at the same time
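As a rough sketch of what the paths.py validator could look like (the function name and argument are made up for illustration; this is not the framework's actual code):

```python
# Check each configured directory: offer to create missing ones and warn
# about empty ones, instead of letting the tests crash later.
import os

def validate_paths(paths):
    """paths: dict mapping a setting name (e.g. 'BASE_PROFILE_DIR') to a directory."""
    for name, path in paths.items():
        if not os.path.isdir(path):
            answer = raw_input("%s does not exist (%s). Create it? [y/N] " % (name, path))
            if answer.strip().lower().startswith('y'):
                os.makedirs(path)
            else:
                raise SystemExit("Fix %s in paths.py and run the tests again." % name)
        elif not os.listdir(path):
            print("Warning: %s (%s) exists but is empty." % (name, path))
```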
Tuesday, 21 Nov 2006
Refer to the progress chart: Performance Testing Framework progress chart
Wednesday, 29 Nov 2006
In-class Performance Testing Framework Configuration
- DPS909 students contributed by testing the framework
- Students have listed constructive feedback and many great suggestions to help improve the setup of the framework
Sunday, 3 Dec 2006
- Dean Woodside has submitted an sh script for this. It works great but needs to be touched up.
- Liz Chak will fix up the sh script
- Doesn't make a reports folder
- Still runs the whole script even if it couldn't download the pre-requisites; it should terminate instead
- Missed the paths.py file
- Requires touch-ups, but everything else is good
Sunday, 10 Dec 2006
- Narrowing down the things that need to be done before 13 December 2006
- Refer to Extended Progress Chart
Wednesday, 13 Dec 2006
- Created new documentation
- Finished my parts for the Performance Testing Framework
- Refer to submission
The current Firefox Performance Testing Framework is effective and efficient.
Things To-do (Immediately)
- Clean up this page - almost done
- Document source - almost done
- Fix up the Performance Testing Documentation
Project References
- I recommend these sites to learn about python:
Project Events
In class Firefox Performance Testing
Before you begin
- You have to be on a Windows operating system
- Take a deep breath and go through the Performance Testing Setup Configuration Documentation
- Make a page under Comments on the Firefox Performance Testing Framework (see below) to list out your comments on the framework
- The performance testing is only successful if you see generated results in the reports directory (you'll understand what this means once you start)
- Ask Liz Chak if you have any questions
Things to look out for
- Whether the documentation is easy to follow
- Did you stumble upon any difficulties? If so, state them
- List any system errors (with the error message and a brief description of what you think caused them)
Comments on the Firefox Performance Testing Framework
Instructions:
- Make a page starting with your name/nick and ending with "perf comments" (to ensure uniqueness)
- Example: Liz Chak perf comments
List of comments:
- Mark D'Souza perf comments
- Sherman Fernandes perf comments
- Aditya Nanda Kuswanto perf comments
- Richard Chu perf comments
- David Manley perf comments
- Djhamp-g perf comments
- Colin Guy perf comments
- Mohamed Attar perf comments
- Man Choi Kwan perf comments
- Mark Paruzel pref comments
- Jeff Mossop perf comments
- Melissa Peh perf comments
- Paul Yanchun Gu comments
- Vanessa Miranda comments
- Philip Vitorino perf comments
- Paul St-Denis Moe Bagheri perf comments
- Cesar Oliveira perf comments
- Ben Hearsum
- Tom Aratyn
- Erin Davey
- Dean Woodside
Bon Echo Community Test Day
- Friday, October 06, 2006, from 7am - 5pm PDT