0.1 Release 2012 WebVTT Test Suite
Latest revision as of 17:22, 27 September 2012
Introduction
Tracking bug: https://github.com/rillian/webvtt/issues/7
In order to write our parser, we'll need a way to verify that it behaves correctly per the spec. Doing so involves creating a spec conformance test suite, consisting of good and bad WebVTT files. Each test file checks that a particular part of the spec holds, and exercises a particular behaviour in the parser.
The W3C defines Conformance Testing as follows:
Focuses on testing only what is formally required in the specification in order to verify whether an implementation conforms to its specifications. Conformance testing does not focus on performance, usability, the capability of an implementation to stand up under stress, or interoperability; nor does it focus on any implementation-specific details not formally required by the specification.
They go on to say that good tests are:
- Mappable to the specification (you must know what portion of the specification it tests)
- Atomic (tests a single feature rather than multiple features)
- Self-documenting (explains what it is testing and what output it expects)
- Focused on the technology under test rather than on ancillary technologies
- Correct
When writing test files, remember that each test should test only one thing. Make them simple, small, and discrete. All you're doing is writing a WebVTT file with enough data in it to trigger a rule in the parser.
Also make sure you capture metadata about your test. What is it testing? Which part(s) of the spec? How did you generate the file? See http://lists.w3.org/Archives/Public/public-texttracks-contrib/2012Aug/att-0000/webvtt_test_cases.html.
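One possible way to capture that metadata (nothing is settled yet — see the Open Questions section) is to embed it in a WebVTT NOTE comment block right after the header. The field names below (`Tests:`, `Spec:`, `Expect:`) are hypothetical, not part of any agreed format:

```shell
# Sketch only: embed test metadata in a WebVTT NOTE comment block.
# The Tests/Spec/Expect field names are hypothetical, not a settled format.
cat > webvtt-signature-good.vtt <<'EOF'
WEBVTT

NOTE
Tests: file begins with the string WEBVTT (no BOM)
Spec: http://dev.w3.org/html5/webvtt/#webvtt-file-body
Expect: valid
EOF

head -n 1 webvtt-signature-good.vtt   # WEBVTT
```

Because a NOTE block is a legal WebVTT comment, the metadata travels with the test file without affecting whether it parses.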
Example
The spec says (http://dev.w3.org/html5/webvtt/#webvtt-file-body) that the file must begin with an optional BOM character, followed by the string WEBVTT. For example, to test the case where the optional BOM character is not present, you'd expect the following file to be valid:
WEBVTT
And the following file to be invalid:
NOT WEBVTT
The first file should go in the good/ directory, the second in bad/, indicating that we expect the first to validate, and the second to fail.
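The rule these two files exercise can be sketched in a few lines of shell. This is an illustration of the signature check only, not the real parser: the first line must be "WEBVTT", alone or followed by a space or tab, after an optional UTF-8 BOM.

```shell
# Illustrative sketch of the WebVTT file-signature rule (not the real parser).
check_signature() {
  bom=$(printf '\357\273\277')   # UTF-8 BOM bytes
  tab=$(printf '\t')
  first=$(head -n 1 "$1")
  first=${first#"$bom"}          # strip an optional leading BOM
  case "$first" in
    WEBVTT | "WEBVTT "* | "WEBVTT$tab"*) echo good ;;
    *) echo bad ;;
  esac
}

printf 'WEBVTT\n'     > good-signature.vtt
printf 'NOT WEBVTT\n' > bad-signature.vtt
check_signature good-signature.vtt   # good
check_signature bad-signature.vtt    # bad
```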
Method
- Install the node-webvtt module:
- Install node.js (which includes npm): http://nodejs.org/
- Install the webvtt npm module:
$ sudo npm install -g webvtt
- Add humphd's webvtt github repo as a remote
$ cd your-webvtt-clone-dir
$ git remote add humphd git://github.com/humphd/webvtt.git
$ git fetch humphd
- Switch to the seneca branch
$ git checkout seneca
- Add your tests to test/spec/good and test/spec/bad
- Run your tests to make sure they pass in the JS WebVTT parser. You can do this directly or with make. To do it directly:
$ webvtt test/spec/good/some-file.vtt
Or, to run it with make across all your files:
$ make check-js
- When you're done, send a pull request to your group member managing your tree, who will assemble all your tests, and then do a pull request to humphd.
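What the make check effectively does can be sketched as a loop over the two directories, assuming the validator exits 0 on valid input and non-zero otherwise. A trivial stand-in validator is used here so the sketch is runnable on its own; in a real checkout you'd substitute the `webvtt` command:

```shell
# Stand-in validator (assumption: real `webvtt` exits non-zero on invalid files).
stub_validator() { head -n 1 "$1" | grep -q '^WEBVTT'; }

run_suite() {  # run_suite <validator> <dir> <expected: pass|fail>
  validator=$1; dir=$2; expect=$3; bad=0
  for f in "$dir"/*.vtt; do
    [ -e "$f" ] || continue
    if "$validator" "$f" >/dev/null 2>&1; then got=pass; else got=fail; fi
    if [ "$got" != "$expect" ]; then echo "UNEXPECTED $got: $f"; bad=1; fi
  done
  return $bad
}

mkdir -p spec/good spec/bad
printf 'WEBVTT\n'     > spec/good/signature.vtt
printf 'NOT WEBVTT\n' > spec/bad/signature.vtt
run_suite stub_validator spec/good pass
run_suite stub_validator spec/bad  fail
```

Files in good/ are expected to validate and files in bad/ are expected to fail; anything else is reported as unexpected.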
Tests
We'll break the tests up by type within the file format specification. Each group will be responsible for determining tests for their section of the spec. The sections, while somewhat arbitrary (feel free to further subdivide or propose other divisions), are:
- General File Structure
- Cue IDs
- Cue Times
- Cue Settings
- Cue Text, including replacements
- Text Tags, Text and CSS
Each group is responsible for determining and writing the tests necessary for its section; these can be listed below or on a separate wiki page. Every section/test needs an owner so it doesn't get lost.
Test Naming Format
http://zenit.senecac.on.ca/wiki/index.php/Test_files
Open Questions
- Which metadata should we capture about each test?
- Which format should we use for this metadata?
- Should we embed it in a WebVTT comment block?
- Should we have test-file.vtt and test-file.vtt.html (i.e., 2 files per test, one with metadata)?
- Should we have one large file with data about the tests in the folder?
- Should we write code/build steps to generate metadata from webvtt files? (e.g., extract from header of test file, merge secondary files into one file, etc.)
- I asked this of the #whatwg channel on freenode, where the W3C spec editors hang out. Their answer is recorded on the "WebVTT Testing Question on #whatwg" wiki page.
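The "generate metadata from the files" option above could be sketched as a small extraction step. This assumes (hypothetically — see the open questions) that metadata lives in a NOTE comment block right after the WEBVTT header, terminated by a blank line:

```shell
# Sketch of extracting metadata from a test file's NOTE block.
# The NOTE-block layout and field names are a hypothetical convention.
cat > sample-test.vtt <<'EOF'
WEBVTT

NOTE
Tests: file begins with the string WEBVTT
Spec: http://dev.w3.org/html5/webvtt/#webvtt-file-body
EOF

extract_metadata() {
  echo "== $1 =="
  # print the lines between NOTE and the next blank line (or end of file)
  sed -n '/^NOTE/,/^$/p' "$1" | grep -v '^NOTE'
}

extract_metadata sample-test.vtt
```

Run across test/spec/good and test/spec/bad, this would merge every file's metadata into one summary, answering the "one large file with data about the tests" option without hand-maintaining it.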
Resources
- WebVTT Spec: http://dev.w3.org/html5/webvtt/
- humphd's webvtt repo, with seneca branch: https://github.com/humphd/webvtt
- node-webvtt repo (CLI version of the online webvtt validator at http://quuz.org/webvtt/): https://github.com/humphd/node-webvtt
- Contributing back to W3C: http://lists.w3.org/Archives/Public/public-texttracks-contrib/2012Aug/0000.html
- W3C Test FAQ: http://www.w3.org/QA/WG/2005/01/test-faq
- UTF-8 Compatible Editors (make sure you are loading/saving UTF-8):
  - http://www.sublimetext.com/
  - http://notepad-plus-plus.org/