[Dev] automated testing terminology
heikki at osafoundation.org
Wed Jan 11 18:00:21 PST 2006
Katie Capps Parlante wrote:
> * PreCheckInTests -- tests that developers should run before checking in
These are really the unit + integration tests that Tinderbox runs, with
the addition that one should also check email functionality and sharing
functionality (if I recall correctly).
> * UnitTests -- automated tests that are run by the build system.
> Developers are responsible for adding unit tests for the components they
> develop. For every component in the system there should be corresponding
> unit tests that can be run just to validate the functionality of that
> component.
At the moment when we say unit tests we actually mean tests that are run
using the Python unit test framework. They are a mixture of real unit
tests and integration tests.
But I would be fine calling these unit tests, as we have before.
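To illustrate the mixture described above, here is a minimal sketch using Python's standard unittest framework. This is hypothetical example code, not actual Chandler code: `format_subject` and `ItemStore` are invented stand-ins. The first test case is a true unit test (one pure function, no collaborators); the second is integration-style, exercising two pieces together.

```python
import unittest


def format_subject(subject):
    """Trim whitespace and cap the length of a subject line (toy function)."""
    return subject.strip()[:78]


class ItemStore:
    """Toy in-memory store standing in for a persistence layer."""
    def __init__(self):
        self._items = []

    def add(self, item):
        self._items.append(item)

    def subjects(self):
        # Uses format_subject, so tests of this path cover both pieces.
        return [format_subject(i) for i in self._items]


class TestFormatSubject(unittest.TestCase):
    # True unit test: exercises one function in isolation.
    def test_trims_and_caps(self):
        self.assertEqual(format_subject("  hello  "), "hello")
        self.assertEqual(len(format_subject("x" * 200)), 78)


class TestItemStoreIntegration(unittest.TestCase):
    # Integration-style test: the store and the formatter together.
    def test_round_trip(self):
        store = ItemStore()
        store.add("  meeting notes  ")
        self.assertEqual(store.subjects(), ["meeting notes"])
```

Both styles run under the same runner (e.g. `python -m unittest`), which is why the distinction blurs in practice: the framework doesn't care whether a test touches one unit or several.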
> * FunctionalTests -- complete set of manual and automated tests to test
> the functionality and performance of each feature in the release. These
> test cases will match one-to-one with the test cases in the test
> specification. Currently we have a very limited number of automated tests
> for Chandler, and any help from the user community will be greatly
> appreciated. If you have any expertise in writing automated tests for
> desktop UI applications and would like to contribute to Chandler test
> development, please contact us.
> * Performance tests -- performance testing may be conducted as part of
> functional tests to test the application startup time, response times,
> memory leaks, CPU utilization, etc. The performance criteria will be
> developed by the Product/Design team for each release.
> * Integration tests -- test cases that test the application
> completely from end to end, after all functional components are code
> complete. This includes test cases with more complex scenarios than
> functional tests.
I don't actually see a need for integration tests defined like this. We
already have unit + integration tests run automatically, and we have
functional tests that cover the application end to end and also exercise
external programs like Cosmo.
> * Regression tests -- subset of automated functional tests that will be
> run nightly during the development cycle to ensure no existing
> functionality was broken because of new feature development.
I don't think we have any separate regression test suites, and I don't
think we should: regression tests should simply be part of the
unit + integration tests that we run continuously.
> * AcceptanceTests -- tests that anyone can run in order to "bless" a
> milestone/release. This is a more extensive list of manual tests that
> are conducted at the end of each milestone/release.
I believe a prerequisite for passing acceptance tests is passing all the
other tests (with some leeway on performance tests).