[Cosmo-dev] Cosmo performance
mikeal at osafoundation.org
Fri Apr 6 17:06:28 PDT 2007
We're going to be moving this work into a contrib project in the
The new project will be responsible for a little more than just these
The project will:
1. Check tinderbox for the newest continuous build
2. Set up the new build with MySQL
3. Run the performance test suite based on the use cases Priscilla
has given to Adam (we'll probably run them several times, the way
the Chandler performance tests are run)
4. Store the results
5. Make pretty graphs and highlight changes (we'll probably make them
look somewhat like the graphs we're currently using for Chandler)
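The steps above could be wired together roughly like this. This is only a sketch of the loop, not the actual project code; the build id, the use-case names, and the helper functions are all made-up placeholders (a real version would poll tinderbox and drive the Cosmo server):

```python
import sqlite3
import time

def fetch_latest_build():
    # Hypothetical step 1: ask tinderbox for the newest continuous
    # build. A real implementation would query the tinderbox server.
    return "cosmo-build-1234"

def run_suite(build_id):
    # Hypothetical step 3: run each use case several times, as the
    # Chandler performance tests do, and keep the median timing.
    results = {}
    for use_case in ["login", "create_event", "sync_collection"]:
        timings = []
        for _ in range(3):
            start = time.time()
            # ... drive the use case against the new build here ...
            timings.append(time.time() - start)
        results[use_case] = sorted(timings)[len(timings) // 2]
    return results

def store_results(build_id, results):
    # Step 4: persist results so step 5 can graph changes over time.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE IF NOT EXISTS perf "
                 "(build TEXT, use_case TEXT, seconds REAL)")
    conn.executemany("INSERT INTO perf VALUES (?, ?, ?)",
                     [(build_id, uc, s) for uc, s in results.items()])
    conn.commit()
    return conn

build = fetch_latest_build()
conn = store_results(build, run_suite(build))
```

The in-memory SQLite store is just for illustration; the real runs would presumably land in the MySQL instance from step 2.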
This is all very rough, but I'm hoping to get something functional by
the end of next week unless other work trumps this work. Once it's
working we can talk about what box we're going to put this on and
keep it running.
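For the "highlight changes" part of step 5, one simple approach is to compare each build's numbers against the previous build and flag anything that moved more than some threshold. A sketch, where the 10% threshold and the data shapes are my assumptions, not anything we've settled on:

```python
def highlight_changes(prev, curr, threshold=0.10):
    # Flag use cases whose median time moved more than `threshold`
    # relative to the previous build, in either direction.
    flagged = {}
    for use_case, seconds in curr.items():
        before = prev.get(use_case)
        if before and abs(seconds - before) / before > threshold:
            flagged[use_case] = (before, seconds)
    return flagged

prev = {"login": 1.00, "create_event": 0.50}
curr = {"login": 1.25, "create_event": 0.51}
changes = highlight_changes(prev, curr)  # only "login" moved > 10%
```

The flagged entries would then get the visual highlight on the graphs.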
> On Apr 6, 2007, at 1:39 PM, Adam Christian wrote:
>> Until we can get hooks into the Python server for Windmill to do
>> all of this, I put together a little webapp for comparing action
>> performance in the UI. You can check it out here:
>> Can I get access to a box at osaf to put this on?
>> For some reason my sort function isn't working, so they don't
>> always show up in ascending order as they should, but I'll fix it
>> when I get a moment.
>> On Apr 5, 2007, at 3:57 PM, Adam Christian wrote:
>>> I am currently measuring Cosmo UI performance using Windmill;
>>> there is a performance tab that uses a timing object I built. In
>>> the near future we will be sending this back to the server and
>>> using Python to store and analyze it, but for now I am working on a
>>> small ajax app to parse the current performance output and graph
>>> it so that we can compare performance times for each checkpoint
>>> we test with Windmill. As soon as I have a demo I will send out
>>> the url.
>>> On Apr 4, 2007, at 5:52 PM, Ted Leung wrote:
>>>> Hi folks
>>>> At this week's preview countdown meeting we spent some time
>>>> talking about Cosmo performance. With the switch to EIM/Morse
>>>> Code, there have been a lot of changes and we don't know if the
>>>> baseline performance numbers that Jared used to size the
>>>> production server machine still hold true. Since one of the
>>>> goals of the EIM/Morse Code was to improve performance, we
>>>> expect things to be better, but we need to verify that. Also,
>>>> it appears, anecdotally, that there are parts of the Cosmo UI
>>>> that should be snappier than they are. Generally speaking,
>>>> then, it seems that we probably need to get more organized on
>>>> the performance front. To that end, I'd like to make the
>>>> following proposal for getting started:
>>>> 1. Have Randy and Jared work together to put together some tests
>>>> that approximate the reasoning used by Jared's old tests.
>>>> These tests were the ones that we used for a baseline
>>>> initially. Some of these tests need to be reworked because
>>>> they are DAV based, and we'll be using EIM based syncing now.
>>>> 2. Start looking at ways to measure the performance of the Cosmo
>>>> UI. In Chandler, we defined a set of use cases and a standard
>>>> data set, and measured the time it took for those operations to
>>>> complete. Things are a bit more complicated with a multi-user
>>>> web app, but we ought to be able to at least get some raw
>>>> numbers for single user responsiveness as a start. PPD is
>>>> going to come up with a list of use cases that we want to measure.
>>>> Any other ideas for areas to look at or for methodology?
>>>> cosmo-dev mailing list
>>>> cosmo-dev at lists.osafoundation.org