[kaffe] Re: Regression/build testing using Xenofarm

Jim Pick jim at kaffe.org
Mon Jul 28 08:41:02 PDT 2003

On Mon, 28 Jul 2003 14:22:21 +0200
Dalibor Topic <robilad at kaffe.org> wrote:

> Hi Jim,
> Jim Pick wrote:
> > Ultimately, I think we'll end up with a variety of regression testing
> > systems.  I'd like to get results for Gump, Mauve, SpecJVM, and others
> > up on the website.
> > 
> > Here's what I've been thinking:
> > 
> > 1) Let's not have one "official" regression testing framework.  Let's
> >    encourage people to develop their own regression testing systems in
> >    a distributed manner.
> > 
> > 2) Let's define a standard reporting format, as an XML format.  People
> >    format their results in this format, and upload them to the kaffe.org
> >    server.
> > 
> > 3) I'll write some scripts to suck in the various submitted XML files,
> >    and produce useful reports.  We could even use a database (for
> >    example, to track benchmark performance over time).
> > 
> > 4) Install something like CVSps so we can more easily cross reference
> >    individual checkins.
> That sounds like the better plan in the long run. Though it also sounds 
> like a nice project in itself, "designing and implementing a 
> regression-test meta-framework". ;)

I did some of the ground work on it this weekend.

The good news is that JSTL works great on Jetty (provided I drop some
of the Apache XML stuff into the BOOTCLASSPATH to override Kaffe's
XML stuff).  Actually compiling the JSPs using kjc is a tad slow and
a memory hog, but it all works.  :-)

I've got Friday off - so I'm shooting for installing the first version
of it on Saturday.  :-)

> I'm not quite sure about 2) though. What should go into the reporting 
> format? A lot of build tools just output (not very formatted) text to 
> stdout and stderr. Or do you have filters in mind, that XMLize the 
> output of 'make', for example?

The XML "reports" would mostly just identify the build number, the
platform, the test performed, and whether or not it was "in progress",
"pass" or "fail".  It would also have additional metadata that might
provide an http link to the compile log.

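For example, a report file might look something like this (the element
names here are just off the top of my head, nothing is settled):

```xml
<test-report>
  <!-- which checkin this run corresponds to -->
  <build>1234</build>
  <!-- platform/configuration the test ran on -->
  <platform>i386-linux</platform>
  <!-- which test suite: mauve, SpecJVM, make check, ... -->
  <test>mauve</test>
  <!-- "in progress", "pass" or "fail" -->
  <status>pass</status>
  <!-- optional metadata, e.g. a link to the compile log -->
  <log href="http://www.kaffe.org/~jim/logs/mauve-1234.log"/>
</test-report>
```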
Developers would drop these .xml files into a designated directory
underneath their home directories on pogo.kaffe.org.  Eventually, I'd
like to write up a way to set up web accounts and HTTP uploading or
WebDAV on kaffe.org so that non-developers could also easily submit
results.

Every few minutes, a cron job will run an ant task which reads all the
.xml files (maybe using XOM) and dumps them, or maybe just the
new/changed ones, into a SQL database (hsqldb?).  The web output will be
a set of JSPs that just pull data out of the SQL database.  I thought of
using XSLT to do the processing, but that doesn't scale as nicely as a
database-backed approach.

The initial web output is going to look something like tinderbox, except
it will be hierarchical, as we have literally hundreds of testable
combinations of tests/platforms/configurations.

There's not actually that much code involved.

We need a way of assigning build numbers to the various checkins.  I
was thinking of just counting the ChangeLog entries to come up with
a number.  It shouldn't be too hard to write scripts that take the
output of any type of test (compilation, make check, mauve, SpecJVM,
Gump, stability tests, etc.), make an XML file, and upload the result.

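The ChangeLog-counting idea could be as simple as matching the GNU-style
date lines that start each entry (this is a guess at the scheme, not a
settled format):

```java
import java.io.BufferedReader;
import java.io.StringReader;
import java.util.regex.Pattern;

public class BuildNumber {

    // GNU-style ChangeLog entries begin with a date line like
    // "2003-07-28  Jim Pick  <jim@kaffe.org>"; continuation lines
    // are tab-indented, so counting the date lines counts entries.
    static final Pattern ENTRY =
        Pattern.compile("^\\d{4}-\\d{2}-\\d{2}\\s.*");

    static int countEntries(BufferedReader in) throws Exception {
        int n = 0;
        String line;
        while ((line = in.readLine()) != null) {
            if (ENTRY.matcher(line).matches()) n++;
        }
        return n;
    }

    public static void main(String[] args) throws Exception {
        String log = "2003-07-28  Jim Pick  <jim@kaffe.org>\n\n"
            + "\t* build: first cut at reporting\n\n"
            + "2003-07-27  Dalibor Topic  <robilad@kaffe.org>\n\n"
            + "\t* libraries: XML fixes\n";
        // prints "2" - one per dated entry
        System.out.println(countEntries(
            new BufferedReader(new StringReader(log))));
    }
}
```

Since entries only ever get prepended, the count is monotonically
increasing, which is all we need for a build number.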

 - Jim
