Firefox Performance Testing : A Python framework for Windows

Project Name

Firefox Performance Testing : A Python framework for Windows


Project Description

The goal of this project is to:

  • get the current framework up and running to help work with others
  • get the framework running in an automated fashion
  • help with the creation and execution of new tests
  • work to upgrade the framework to work with a mozilla graph server
  • work with the mozilla community and contribute to an open source project


From this project, you will:

  • learn python
  • learn about white box testing methodologies
  • work with an open source community
  • more generally learn about the functioning of QA in an open source community


This will benefit you in the future: when presented with a new program, you'll be able to outline how to approach testing it, give adequate coverage, and provide some metric of program stability and functionality.


Note: This is NOT the typical mundane black box testing


Project Leader(s)


Project Contributor(s)

Dean Woodside (dean)

  • Submitted an sh file that automates the tedious performance testing framework configuration

Alice Nodelman

  • Discussed the things that need to be fixed to improve and strengthen the framework
  • Gave suggestions on the new Performance Testing framework

Ben Hearsum (bhearsum)

  • Set up the VM for performance testing
  • Helped get me started with the debugging process for report.py, run_tests.py and ts.py

Michael Lau (mylau)

  • Added comments on the documentation for setting up the Performance Testing framework for Windows
  • Tested the new and improved Performance Testing framework (version 1)
  • Gave constructive feedback on documentation (version 1)

Eva Or (eor)

  • Tested the new and improved Performance Testing framework (version 1)
  • Gave constructive feedback on new documentation (version 1)

David Hamp Gonsalves (inveigle)

  • Gave pointers on flushing the buffer
  • Helped with some grammar and sentence structuring for documentation (version 1)
  • Tested and gave constructive feedback on the framework (version 1)
  • Looked into a batch file for automating configuration, gave pointers
  • Tested and commented on the new framework (version 2)

Rob Campbell (robcee)

  • Gave me a number of python tips :)

Tom Aratyn (mystic)

  • Introduced Closures in Python


In-Class Contributors

Please let me know if I missed you. I've only listed the people I've received comments from. If you participated but aren't listed as an in-class contributor, please list your comments here.

Project Details

Improved Documentation

Latest

New Firefox Performance Testing Documentation

First Attempt

Performance Testing Setup Configuration Documentation

Details

This is different from Tinderbox. Two major differences are:

  • First, it doesn't build; it just runs the performance test given a path to the executable. This is helpful if you're testing the performance of an extension or a build from another server. (You could build on a fast server, and then run performance tests on a machine with low memory.)
  • Second, it measures performance characteristics while it's running the pageload tests--you can track cpu speed, memory, or any of the other counters listed here (a small sketch of counter polling follows).
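As a rough illustration of what polling a Windows counter looks like, here is a minimal sketch using the win32pdh module from the Python Win32 Extensions; the counter name and one-second interval are arbitrary examples, not the framework's actual code.

import time
import win32pdh  # part of the Python Win32 Extensions (pywin32)

# Sample the "% Processor Time" counter once, roughly the way a counter
# can be polled while the pageload test is running.
path = win32pdh.MakeCounterPath((None, 'Processor', '_Total', None, -1, '% Processor Time'))
query = win32pdh.OpenQuery()
counter = win32pdh.AddCounter(query, path)

win32pdh.CollectQueryData(query)
time.sleep(1)                      # rate counters need two collections
win32pdh.CollectQueryData(query)

_, value = win32pdh.GetFormattedCounterValue(counter, win32pdh.PDH_FMT_DOUBLE)
print('%% Processor Time: %.1f' % value)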


Web logs

Submission

Still in progress!
While it's sizzling, try out the Firefox Performance Testing Framework (version 2) via the Quick Start section (3 easy steps) of the New Firefox Performance Testing Documentation.

Artifacts (details and links):

run_tests.py
  • Python file used to execute the Firefox Performance Testing
  • Some of the things it does:
    • Validates files and directories needed for Performance Testing
    • Displays informative messages to allow users to fix any invalidity in files or directories
    • Displays an informative progress/status bar that informs users how far into the testing they are
  • Things that Alice and I discussed are reflected in this file:
    • Configuration checker
      • yaml file validator
      • paths.py validator
        • checking the paths for existence
          • notify the user if a path doesn't exist and ask if they want it created
        • checking if the directories have contents
  • Link: run_tests.py

perfconfig.sh
  • This script is used to automate the tedious Firefox Performance Testing Configuration
  • Main Contributor: Dean Woodside (dean)
  • With this script:
    • Configuration of the python framework is eased
  • Link: perfconfig.sh

Progress bar class
  • Modified open source code to be used for this project
  • Displays a progress/status bar so users will know at which stage of the Performance Testing they are (roughly, by percentage; a rough sketch follows)
  • Link: pb.py
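pb.py itself is linked above; purely to illustrate the idea, here is a minimal console progress bar sketch (a hypothetical stand-in, not the project's pb.py):

import sys

class ProgressBar(object):
    """Minimal console progress/status bar: call update(done) as work completes."""

    def __init__(self, total, width=40):
        self.total = total
        self.width = width

    def update(self, done):
        filled = self.width * done // self.total
        percent = 100 * done // self.total
        bar = '=' * filled + ' ' * (self.width - filled)
        sys.stdout.write('\r[%s] %3d%%' % (bar, percent))
        sys.stdout.flush()              # flush so the bar redraws in place
        if done >= self.total:
            sys.stdout.write('\n')

# Example: ten units of work
pb = ProgressBar(total=10)
for i in range(1, 11):
    pb.update(i)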

paths.py
  • This is the file that contains all the paths that are needed for the Performance Testing
  • Fixed the paths and modified some documentation
  • Link: paths.py

Firefox Performance Testing Documentation (Version 1)
  • This was the first documentation written to improve on the documentation provided with the initial Firefox Performance Testing Framework (refer to Readme.txt)
  • It was deemed too elaborate and needed to be more concise
  • Needed work
  • Users didn't like the configuration steps they had to take
  • Link: Performance Testing Setup Configuration Documentation

Firefox Performance Testing Documentation (Version 2)

Goals achieved:

  • Effective
    • Easy-to-use
    • Free from confusion
    • Informative and helpful
  • Efficient
    • Takes almost no time to set up the framework
    • Captures most user errors through validation, saving the user the time and frustration of fixing an error while running the framework
      • Informs the user of directory/file invalidity, an invalid config file, etc.

Additional:

  • Users don't have to know the whole framework to configure it
    • Documentation helps
  • Users are not left staring at the console wondering what is happening when they run the Performance Tests
    • Progress bar and informative messages
  • Users don't have to waste hours/days/weeks debugging the code to find out why they are having problems configuring the framework
    • I spent 2 weeks configuring the bloody framework

  • Link: New Firefox Performance Testing Documentation

Status Documentation
  • Here are the progress charts for this project
  • They are divided into 3:
    • Performance Testing Framework First Version
    • Performance Testing Framework Second Version
    • More details of things to fix for the framework

Firefox Performance Testing Framework Directory Structure
  • Here is an overview of the Firefox Performance Testing Framework Directory Structure.
  • The directory structure has been revised
  • There were too many extra libraries to load, which caused too many dependencies
  • The directory structure was difficult to keep track of due to redundant directories and extra libraries
  • All configuration is in one place!
    • config.yaml
    • paths.py
    • constants
  • Link: Performance Testing Framework Directory Structure

Screen shots
  • Here is a screen shot of the program while it is running a performance test:

Firefox Performance Testing running

  • Link: Not Applicable

Next Steps
  • Am I done? Not quite
  • Link: Next Big Steps

Reflections
  • Some thoughts
  • Link: Reflections

Next Steps

Task list (details, priority, contributors, status):

Task: Porting framework to other OSes
  • What good is a performance testing framework if it only runs on Windows??
  • This has been a pending task for a while.
  • Since several students made a point that it should be ported to other OSes, I should look into that
  • Students who use OS X, Unix OSes, etc. couldn't test this framework
  Priority: High
  Status: 0% completed

Task: Run both ts and tp at the same time
  • Will have a discussion with Alice Nodelman in regards to this
  Priority: Medium
  Status: 0% completed

Task: Create and execute new tests
  • New test case(s) for performance
  • Will have a discussion with Alice Nodelman in regards to this
  Priority: Medium
  Status: 0% completed

Extended Progress Chart (Version 2)

Task list (details, priority, contributors, status):

Task: Read all student comments about the framework and improve it
  • On Nov 29th, students of DPS909 contributed by testing the framework
  • They listed constructive feedback and many great suggestions to help improve the setup of the framework
  Priority: High
  Notes:
  • I've read some comments but I need some time to drill down to the ones that will help improve the framework.
  • Nevertheless, all the comments are really REALLY helpful in improving this framework.
  Status: 100% completed

Task: Drill and narrow down the student comments to improve the framework
  Priority: High
  Notes:
  • Working on some of the comments that help make the framework stronger and more robust
  • Some tasks might be redundant. Nonetheless, I'm clear about what should be fixed
  Status: In progress
Task: Fix Performance Testing Setup Configuration Documentation and framework configuration
  • From the student comments about the framework, I have gathered that the documentation has to be fixed, as some parts are vague and some important areas are not highlighted.
  • Some items (more details):
    • Make the 'Create extension_perf_reports directory' part bold
      • Maybe ask the user if they want it generated if it doesn't exist (will be done in the framework)
      • Should it be created from the sh script??
    • config.yaml file
      • If the user chooses to test with no preferences or extensions, should mention commenting out the lines beneath them.
    • Additional details regarding Framework Prerequisites would reduce confusion.
      • Comment from student: There isn't any harm in "dumbing it down" a level.
  Priority: High
  New status: New Firefox Performance Testing Documentation
  Older status: Will start on this after drilling down the student comments
  Status: In progress
Task: Pre-requisites: Python, Cygwin and .dll Installation Guidelines
  • Order of installations
  • Cygwin
    • Students were confused as to whether they should keep the default Cygwin setup/packages, as the documentation didn't outline that.
    • They inquired which Cygwin packages are required
  • Concerns about where the msvcp71.dll file should be placed after it's downloaded
  • A few students got the error: ImportError: No module named win32pdh
    • They missed installing the Python Win32 Extensions - the sh script handles it now
  • Now that the sh script handles the pre-requisite installation, are the following necessary?
    • Some files seem to have dependencies on one another. Need to point out whether some files have to be installed in order.
    • Some files have the same installation GUI, which makes it hard to track down which have been installed and which have not. Some screenshots may help.
    • For the Cygwin installation, should note that the default package setting is sufficient. Remove all doubts.
  Priority: Medium
  Notes: New Firefox Performance Testing Framework comments
  Status: 100% done
Task: Configuring Environment Specifications in documentation

BASE_PROFILE_DIR

  • No instructions on what exactly to place in the Hostperm.1 file.
  • The Hostperm.1 file is autogenerated and warns against editing it.
  • Mention how to modify the Hostperm.1 file.
  Priority: Medium
  Notes: I have nothing to work with besides the README.TXT file by Annie Sullivan. Therefore, I have to consult Alice Nodelman in this area.
  Status: In progress
Task: Automate environment configuration
  • One of the student comments was to write a bat/sh file to automate the configuration of the pre-requisites for the framework
  Priority: Medium
  Notes:
  • Liz Chak will fix up the sh script
    • Doesn't make a reports folder (should this be in the sh script??)
    • Still runs the whole script even if it couldn't download the pre-requisites; should terminate instead
    • Missed the paths.py file
    • Requires touch-ups, but everything else is good
    • Proper documentation
  Files:
  Status: 100% completed

Task: Porting framework to other OSes
  • This has been a pending task for a while.
  • Since several students made a point that it should be ported to other OSes, I should look into that
  • Students who use OS X, Unix OSes, etc. couldn't test this framework
  Priority: Medium
  Status: 0% completed

Progress (Version 1)

Task list (details, priority, contributors, status):

Task: Performance Testing Setup Configuration Documentation
  • The current setup configuration documentation is in text files and is very hard to follow.
  • From my experience, I missed a few configuration steps because the documents were all over the place and a tad confusing.
  Priority: High
  Newer status: The DPS909 class tested the documentation and framework out.
  • Needed rework
  Older status:
  • Improving the current documentation so that it's easier to follow
  • Making sure that all the configuration documents are in one place
  • This is done along with the code base work I'm doing
  Status: 100% completed
Task: Study performance testing framework

The framework has to be strengthened and improved. A discussion with Alice Nodelman is planned to go over things that could be done to make the framework stronger.

  Priority: High
  Contributors:
  • Liz Chak
  • Alice Nodelman
    • Discussion on what needs to be done with the framework
  • Ben Hearsum
    • Set up the VM for performance testing
    • Helped with the debugging process for report.py, run_tests.py and ts.py
  Status:

Tested the framework and went through the code in the framework. Made a list of the weaknesses of the framework and planned various resolutions.

We have established that the following has to be done:

  • ease configuration of the python framework
    • too many config files to edit
      • have to know the whole framework to configure it
      • not flexible
    • tedious
    • too many directories to create
    • too many extra libraries to load
      • a lot of dependencies!
    • things have to be copied to special directories
    • bad configurations don't cause errors!

100% completed
Task: Configuration checker

The configuration checker will check if all the configuration is done before running the performance testing. The checker is in run_tests.py and it entails:

  • yaml file validator
  • paths.py validator

This can only be done when the yaml file validator and paths.py validator are completed.

  Priority: High
  Status: Started and ongoing
Task: yaml file validator

In run_tests.py:

The validator of the yaml file is weak. It only checks for certain items in the file and will crash if those items are not there or if those items don't have any value. It doesn't check for unexpected values and doesn't give the user a clue that their yaml file has a problem.

  Priority: High

Changed the validator to check if items exist before storing the value. If one of the items doesn't exist, the program will terminate and it will let the user know that the yaml file has to be fixed.

The yaml validator works in the following manner (a sketch follows this entry):

  • It takes in any config file; it doesn't matter if it's not a .yaml file
  • Goes through each item in the file and looks for specific items (filename, title, "profile name" - could be anything, firefox, preferences, extensions)
  • As it checks each item, it prints the progress to the console
  • As soon as one item fails, it terminates the app and asks the user to fix the config file

Status: 100% completed (Alice reviewed)
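The actual changes live in the run_tests.py linked above. As a minimal sketch of the approach, assuming PyYAML is available and using the item names listed (the helper name validate_config is made up for illustration):

import sys
import yaml  # PyYAML, assumed to be installed

# Items the validator looks for; the "profile name" entry can be anything,
# so only the fixed item names are listed here.
REQUIRED_ITEMS = ['filename', 'title', 'firefox', 'preferences', 'extensions']

def validate_config(path):
    """Check that the config file has the expected items before the tests run."""
    with open(path) as f:
        config = yaml.safe_load(f)
    if not isinstance(config, dict):
        print('%s does not look like a valid config file' % path)
        sys.exit(1)
    for item in REQUIRED_ITEMS:
        print('Checking config item: %s' % item)   # progress printed to the console
        if item not in config or config[item] is None:
            print('Item "%s" is missing or empty - please fix %s' % (item, path))
            sys.exit(1)                             # terminate as soon as one item fails
    return config

if __name__ == '__main__':
    validate_config(sys.argv[1])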
Task: paths.py validator

Currently the run_tests.py file doesn't validate the paths.py file. If the user misses a path or misconfigures a directory, the program will crash and give this error:

Traceback (most recent call last):
  File "C:\proj\mozilla\testing\performance\win32\run_tests.py", line 129, in ?
    test_file(sys.argv[i])
  File "C:\proj\mozilla\testing\performance\win32\run_tests.py", line 122, in test_file
    TP_RESOLUTION)
  File "C:\proj\mozilla\testing\performance\win32\report.py", line 152, in GenerateReport
    mean = mean / len(ts_times[i])
ZeroDivisionError: integer division or modulo by zero

The following has to be done in the run_tests.py file to validate the paths.py file:

  • check the paths for existence
  • notify the user if a path doesn't exist and ask if they want it created
  • check if the directories have contents

  Priority: High

Most updated progress (a rough sketch of these checks follows this entry)

  • Working state
    • In run_tests.py, it checks all the directory paths in paths.py (paths.BASE_PROFILE_DIR, paths.REPORTS_DIR)
    • Prints to the console each time it checks, so the developer knows which dir it's checking
    • If a directory doesn't exist, the app will terminate and will let the developer know which directory has to be made
    • Alternatively, change the path in paths.py to point to the right path
    • Initially I had it set to make the directories if they don't exist; however, Alice advised that I should let the developer do it himself/herself
    • Checks if base_profile contents exist!
    • Got some suggestions from Rob Campbell, going to work with his suggestions
    • Checks file urls
  • Things left to do
    • Check if base_profile contents exist - DONE
      • Alice advised that I should check contents in general - no specific files or directories, because they can change depending on what the developer is testing
    • Check files in (file:///c:/) format - DONE
      • create a temporary variable for the local pathname, convert the path to the os path and then check
      • split and join
      • urlparse
    • Work on checking Cygwin paths
      • split and join


Older progress

  • I have fixed run_tests.py to check if the user's directories exist on their system, and it prompts them to make the directories.
  • I'm currently working on checking if the following directories exist:
    • extension_perf_reports
      • The graphs and results will be generated in this folder
    • extension_perf_testing directory and base_profile directory.
      • There are several levels in the directory. Here is the basic outline of the directory structure:
extension_perf_testing(dir)
        |
        |
    base_profile (dir)
        |
        |__ bookmarkbackups (dir)
        |         |
        |         |__ .html files
        |
        |__ Cache (dir)
        |
        |__ .bak, .html, .ini, 
            .dat, .txt, .rdf, 
            .mfl files 
            (most important file - perf.js) 
  • I haven't gone through a thorough discussion with Alice on which files should be validated in the base_profile dir. From what I've gathered from the other discussions we had, the perf.js file will crash the program if it's non-existent.
  • I have fixed the program to check for the existence of the base_profile dir, and it also checks if the bookmarkbackups and Cache dirs and the perf.js file exist.

Status: 100% completed
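The real checks live in the run_tests.py linked above; the following is only a rough sketch of the directory and file-url checks described in this entry, written against Python 2's urlparse module (matching the framework's era). The helper names check_dir and check_file_url are made up for illustration.

import os
import sys
import urlparse  # Python 2 standard library; urllib.parse in Python 3

import paths     # the framework's own paths.py (run from the win32 directory)

def check_dir(name, path):
    """Terminate with a clear message if a configured directory is missing or empty."""
    print('Checking %s (%s) ...' % (name, path))
    if not os.path.isdir(path):
        print('%s does not exist - create it, or fix the path in paths.py' % path)
        sys.exit(1)
    if not os.listdir(path):
        print('%s is empty - copy the base profile contents into it' % path)
        sys.exit(1)

def check_file_url(name, url):
    """Convert a file:///c:/ style url to a local path and confirm the file exists."""
    print('Checking %s (%s) ...' % (name, url))
    local = urlparse.urlparse(url).path.lstrip('/')   # '/c:/...' -> 'c:/...'
    if not os.path.isfile(os.path.normpath(local)):
        print('%s does not point to an existing file' % url)
        sys.exit(1)

check_dir('BASE_PROFILE_DIR', paths.BASE_PROFILE_DIR)
check_dir('REPORTS_DIR', paths.REPORTS_DIR)
check_file_url('INIT_URL', paths.INIT_URL)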
Task: Get all the configuration in one place

The framework is currently very confusing and the configuration is all over the place! This has to be fixed, but it's not the main priority:

  • paths.py, config.yml, constants
  • have to run both ts and tp at the same time

As long as good documentation is provided, this is not a main priority.

  Priority: Medium
  Notes: Changed the structure of the directory
  Status: 100% completed - needs to be reviewed
Task: Set up and test the current Performance Testing framework using the provided documentation (not mine) to discover more flaws in either the framework or the documentation. This will help me take a more user-oriented approach when improving the framework and documentation.

More input is welcome on the current Performance Testing framework to help improve it.

  Priority: Medium
  Status: ongoing
Task: Get the class to test out the Performance Testing framework

The improved framework has to be tested to get constructive feedback from users following the new Performance Testing Setup Configuration Documentation.

  Priority: Low
  Contributors:
  • Liz Chak
  • Eva Or & Mike Lau
  • Whole class!
    • Set up the framework using the new documentation - things to look out for: whether it's user-oriented and easy to set up
    • Gave constructive feedback on the documentation
  Status: 100% completed

Comments on the README.TXT Documentation (By Mike Lau)

Getting Started

  • I was not sure what to do after unzipping the win32.zip file.


Comments on The README.TXT File

  • The README.TXT file is hard to read in Notepad. There's horizontal scrolling after opening the file.
  • The file should have output samples to show the user what output to expect and to ensure they are on the right track.
  • The pre-requisites part of the README.TXT file is missing the minimum disk space needed to complete the install.
  • Following the procedures for installation was difficult. The document should number the procedures and have sample outputs to ensure the user is on the right track.
  • Some of the folders listed in the Directory Structure were missing from the setup procedures. The Directory Structure should be used as a guide to ensure users have the right folders in place. Also, there should be an image to represent the directory structure. Some of the folders which needed to be created on top of following the procedures provided were: base_profile, extension_perf_testing, extension_perf_reports
  • The Setup part was hard to follow. Most of the steps were not intuitive.
  • Step 4 of the procedures was unclear. I was not sure what kind of YAML config file had to be created. The document should tell the user explicitly what the file needs to be called, or tell them they can call it whatever they wish. It should also show an example filename.


Running The Application

After following the setup procedures, I typed the following command:

c:\> run_tests.py config.YAML

And I got the following error message in the command prompt and a popup window ( Liz Chak - Solution to this problem):

Sc1.JPG

Project Problems and Solutions

Problem: Firefox doesn't know how to open this address, because the protocol (c) isn't associated with any programs

If you didn't configure the paths.py paths correctly, you may run into this problem when you run the Performance Testing Framework:

Sc1.JPG

Solution: Firefox doesn't know how to open this address, because the protocol (c) isn't associated with any programs

In paths.py, the paths for INIT_URL, TS_URL and TP_URL have to be local file urls, not file paths (file:///c:/), as shown here (a conversion sketch follows the example):

"""The path to the file url to load when initializing a new profile"""
INIT_URL = 'file:///c:/project/mozilla/testing/performance/win32/initialize.html'

"""The path to the file url to load for startup test (Ts)"""
TS_URL = 'file:///c:/project/mozilla/testing/performance/win32/startup_test/startup_test.html?begin='

"""The path to the file url to load for page load test (Tp)"""
TP_URL = 'file:///c:/project/mozilla/testing/performance/win32/page_load_test/cycler.html'
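If you only have the Windows path, here is one small sketch (standard library only, not part of the framework) for producing the file url form:

import nturl2path

def to_file_url(win_path):
    # nturl2path.pathname2url(r'c:\foo\bar.html') returns '///C:/foo/bar.html'
    return 'file:' + nturl2path.pathname2url(win_path)

print(to_file_url(r'c:\project\mozilla\testing\performance\win32\initialize.html'))
# -> file:///C:/project/mozilla/testing/performance/win32/initialize.html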

Problem: ZeroDivisionError: integer division or modulo by zero

Traceback (most recent call last):
  File "C:\proj\mozilla\testing\performance\win32\run_tests.py", line 129, in ?
    test_file(sys.argv[i])
  File "C:\proj\mozilla\testing\performance\win32\run_tests.py", line 122, in test_file
    TP_RESOLUTION)
  File "C:\proj\mozilla\testing\performance\win32\report.py", line 152, in GenerateReport
    mean = mean / len(ts_times[i])
ZeroDivisionError: integer division or modulo by zero

Solution: ZeroDivisionError: integer division or modulo by zero

Check that there are contents in the base_profile directory that you have set for BASE_PROFILE_DIR in paths.py (a quick check is sketched below):

  • By default in paths.py: BASE_PROFILE_DIR = r'C:\extension_perf_testing\base_profile'
  • The BASE_PROFILE_DIR could be a different path; it doesn't have to be the one above.
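A quick way to run this check from a python prompt, assuming paths.py is importable from the framework directory:

import os
import paths  # the framework's paths.py

# True only if the directory exists and has contents
print(os.path.isdir(paths.BASE_PROFILE_DIR) and bool(os.listdir(paths.BASE_PROFILE_DIR)))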

Problem: This page should close Firefox. If it does not, please make sure that the dom.allow_scripts_to_close_windows preference is set to true in about:config

  • You have to change the dom.allow_scripts_to_close_windows preference to true

Solution: This page should close Firefox. If it does not, please make sure that the dom.allow_scripts_to_close_windows preference is set to true in about:config

  • Type in about:config in the address bar
  • Scroll down and look for dom.allow_scripts_to_close_windows
  • Double click on it to set it to true
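To avoid flipping the pref by hand for every new base profile, it can also be appended to the base profile's user.js, which Firefox applies on startup. This is a sketch that assumes the BASE_PROFILE_DIR from paths.py is the profile being used:

import os
import paths  # the framework's paths.py

# Append the pref so every test profile copied from the base profile picks it up.
with open(os.path.join(paths.BASE_PROFILE_DIR, 'user.js'), 'a') as prefs:
    prefs.write('user_pref("dom.allow_scripts_to_close_windows", true);\n')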

Directory Structure of Framework

Overview of Structure

A glance at the Framework File Structure (CVS files not included):

      win32
        |
        |
        |__ base_profile (dir)
        |     |
        |     |__ bookmarkbackups (dir)
        |     |    |
        |     |    |__ .html files
        |     |
        |     |__ Cache (dir)
        |     |
        |     |__ .bak, .html, .ini, .dat, .txt, .js, .rdf, .mfl files  
        |             
        |
        |__ page_load_test(dir)
        |     |
        |     |__ base(dir)
        |     |    |
        |     |    |__ other dirs and .html files
        |     |
        |     |__ cycler.html & report.html
        |
        |
        |__ startup_test
        |     |
        |     |__ startup_test.html
        |
        |
        |__ extension_perf_reports (dir for generated reports)
        |
        |
        |__ run_tests.py, paths.py, config.yaml and other .py, .html files
   
  • NOTE: Content in base_profile dir may vary
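If any of the directories referenced by paths.py are missing on a fresh setup, a small helper like the following can create them. This is a hypothetical convenience only; run_tests.py itself reports missing directories rather than creating them.

import os
import paths  # the framework's paths.py

# Create any missing directories referenced by paths.py.
for d in (paths.BASE_PROFILE_DIR, paths.REPORTS_DIR):
    if not os.path.isdir(d):
        os.makedirs(d)
        print('created %s' % d)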


The following is written by Annie Sullivan (annie.sullivan@gmail.com):

base_profile/

  • This directory contains the base profile used for testing.
  • A copy of this profile is made for each testing profile, and extensions or prefs are added according to the test_configs array in run_tests.py.
  • For the page load test to run correctly, the hostperm.1 file must be set to allow scheme:file uris to open in new windows, and the pref to force a window to open in a tab must not be set.
  • The dom.allow_scripts_to_close_windows pref should also be set to true. The browser.shell.checkDefaultBrowser pref should be set to false.


page_load_test/

  • This directory contains the JavaScript files and html data files for the page load test.
  • The page load test opens a new window and cycles through loading each html file, timing each load.


startup_test/

  • This directory contains the JavaScript to run the startup test.
  • It measures how long it takes Firefox to start up.


extension_perf_report/

  • This directory is where the generated report will go into.
  • You may specify another directory to substitute this directory, but make sure that the paths.py is changed to point to it.


run_tests.py, paths.py

  • These files should be configured to run the test on different machines, with different extensions or preferences. See setup above.


Project News

Saturday, September 23, 2006

Performance tests didn't run successfully.

  • There weren't any results generated in the extension_perf_testing\base_profile and extension_perf_reports folders.
  • Output after the performance tests were run:
Traceback (most recent call last):
  File "C:\proj\mozilla\testing\performance\win32\run_tests.py", line 129, in ?
    test_file(sys.argv[i])
  File "C:\proj\mozilla\testing\performance\win32\run_tests.py", line 122, in test_file
    TP_RESOLUTION)
  File "C:\proj\mozilla\testing\performance\win32\report.py", line 152, in GenerateReport
    mean = mean / len(ts_times[i])
ZeroDivisionError: integer division or modulo by zero

Sunday, September 24, 2006

Understand further the approach to testing with the Python framework


Monday, September 25, 2006

elichak will be working on a resolution with alice to get the results generated in the extension_perf_testing\base_profile and extension_perf_reports folders.


Friday, September 29, 2006

elichak re-configured the environment of the machine to run the tests again. Cleaned up old files to do a clean test. Reinstalled Cygwin (replaced Make 3.80 with Make 3.81) and updated the testing files through CVS.


Sunday, October 1, 2006

Alice has successfully run the tests. The Zero Division error didn't occur again after she updated her test files. There were results generated in the extension_perf_testing\base_profile and extension_perf_reports folders. elichak attempted to run the test with Alice's code but the Zero Division Error still occurred on her machine.


Wednesday, October 4, 2006

Elichak consulted Robcee about the Zero Division Error and he suggested a few things, like debugging the script. Elichak found that the value of ts_time in the report.py file is empty but couldn't determine why the value of ts_time isn't assigned. According to Alice, she didn't debug the scripts and only had to update the files to make them work.


Friday, October 6, 2006

Ben set up the VM for elichak to run her performance testing in that environment.


Wednesday, October 11, 2006

  • elichak configured the environment in the VM for her testing. The tests still gave the same results as before:
  • Zero Division Error at lines 122 and 129 in run_tests.py and line 153 in report.py
  • 2 files in the extension_perf_reports dir are generated but there are no graphs
  • elichak also changed the TS_NUM_RUNS, TP_NUM_CYCLES, TP_RESOLUTION values to 1 in run_tests.py to shorten the cycles of the performance testing for the purpose of debugging the scripts.
  • The error occurs in report.py because ts_times[i] is empty, so this fails:
for ts_time in ts_times[i]:
  mean += float(ts_time)
mean = mean / len(ts_times[i])
  • We speculate that what prevents the ts_time values from being generated is in ffprocess.py: RunProcessAndWaitForOutput always returns None at line 232
    return (None, True)
  • Further debugging by elichak is in progress (a defensive sketch of the failing calculation follows this list)
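For reference, a defensive version of that calculation (a sketch only; the real fix turned out to be the profile configuration, not report.py) would guard against an empty list:

def safe_mean(times):
    """Average a list of timing strings, avoiding ZeroDivisionError on empty runs."""
    if not times:
        return 0.0   # or flag that the run produced no results
    return sum(float(t) for t in times) / len(times)

print(safe_mean(['105', '98', '101']))  # 101.33...
print(safe_mean([]))                    # 0.0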


Thursday, October 12, 2006

Work completed

The Zero Division Error is solved. Turns out that it was just a configuration problem. The documentation to set up the environment was rather subtle and needs a re-work.

Solution

The contents of C:\proj\mozilla\testing\performance\win32\base_profile should also be in the C:\extension_perf_testing\base_profile dir.

All work for this project is done on the VM, hera.senecac.on.ca

Work in progress

elichak

  • Trying out a few things in the framework to find out which direction I would like to take with the framework: either building new tests, improving existing ones, strengthening the framework itself, or porting it to other OS's
  • Revise the Firefox Performance Testing documentation

Friday, 20 Oct 2006

Last week, elichak decided to work on automating the setup of the environment and the performance testing. The performance testing and environment setup are currently all over the place and are tedious for the developer to set up.

The automation will entail:

Generating directories, dropping files into directories, installing libraries, options to configure the performance tests, etc.

Tuesday, 31 Oct 2006

Alice and Liz had a meeting and established the key things that need to be done:

  • ease configuration of the python framework
    • too many config files to edit
      • have to know the whole framework to configure it
      • not flexible
    • tedious
    • too many directories to create
    • too many extra libraries to load
      • a lot of dependencies!
    • things have to be copied to special directories
    • bad configurations don't cause errors!

How do we fix this?

  • configuration checker
    • yaml file validator
    • paths.py validator
      • checking the paths for existence
      • notify the user if a path doesn't exist and ask if they want it created
      • checking if the directories have contents
  • next steps
    • get all the configuration in one place!
      • paths.py, config.yml, constants
      • have to run both ts and tp at the same time

Tuesday, 21 Nov 2006

Refer to progress chart. Performance Testing Framework progress chart

Wednesday, 29 Nov 2006

In-class Performance Testing Framework Configuration

  • DPS909 students contributed by testing the framework
  • Students have listed constructive feedback and many great suggestions to help improve the setup of the framework

Sunday, 3 Dec 2006

  • Liz Chak will fix up the sh script
    • Doesn't make a reports folder
    • Still runs the whole script even if it couldn't download the pre-requisites; should terminate instead
    • Missed the paths.py file
    • Requires touch-ups, but everything else is good

Sunday, 10 Dec 2006

Wednesday, 13 Dec 2006

The current Firefox Performance Testing Framework is effective and efficient.

Goals achieved:

  • Effective
    • Easy-to-use
    • Free from confusion
    • Informative and helpful
  • Efficient
    • Takes almost no time to set up the framework
    • Captures most user errors through validation, saving the user the time and frustration of fixing an error while running the framework
      • Informs the user of directory/file invalidity, an invalid config file, etc.

Things To-do (Immediately)

Project References

Project Events

In class Firefox Performance Testing

Before you begin

  1. You have to be on a Windows operating system
  2. Take a deep breath and go through the Performance Testing Setup Configuration Documentation
  3. Make a page under Comments on the Firefox Performance Testing Framework (see below) to list out your comments on the framework
  4. The performance testing is only successful if you see generated results in the reports directory (you'll understand what this means once you start)
  5. Ask Liz Chak if you have any questions

Things to look out for

  1. If the documentation is easy to follow
  2. Did you stumble upon any difficulty? If so, state them
  3. List any system errors, if any (with the error message and a brief description of what you think caused it)

Comments on the Firefox Performance Testing Framework

Instructions:

  • Make a page starting with your name/nick and ending with "perf comments" (to ensure uniqueness)
  • Example: Liz Chak perf comments

List of comments:

Reflections on the project

Now that the first phase of the project is done, I would like to sit back and reflect on some of the experiences I have had with this project. Not only did this project grow, I have been growing with it. I've learned an endless number of things by working on this project (not just technical stuff). And as the saying goes, "you only learn from experience"!

Configuration frustration

Words can't express how frustrated I initially was with the framework. I hit brick walls countless times when I tried to get the framework up and running, and I was on the verge of giving up. It was only through my own perseverance and determination that I got it to work.

Outcome: A list of things to fix in the framework to ease the configuration, strengthen the framework and improve the documentation!!

First deliverable

I asked the class to test my first deliverable. I had faith that my framework would bring delight to my testers, but I was proven wrong. Even with the effort I put into my first deliverable, it still created a group of frustrated and agonized users. Many of them ranted about what could be improved in the framework, hence:

Outcome: A list of things to automate the framework and EASE THE CONFIGURATION

Final deliverable

I'm glad that my first tester was thrilled about my Performance Testing. There is nothing better than a happy user. What I've learned is that TESTING is the KEY to a successful application. Why do so many of us developers neglect that??

Credits

Dean Woodside, Alice Nodelman, Ben Hearsum, Michael Lau, Eva Or, David Hamp Gonsalves, Dave Humphrey, Rob Campbell and, of course, the DPS909 class who tested my framework. These are the individuals who played a significant part in this framework's success!

Bon Echo Community Test Day

Friday, October 06, 2006, from 7am - 5pm PDT
Mozilla QA Community:BonEcho 2.0RC1 prerelease Community Test Day