Tags: Java
Okay, go to the patent lookup service at http://patft.uspto.gov/netahtml/srchnum.htm and enter this number: 6,961,937
Scan a few names down the list and you'll see yours truly. I'm speechless, astonished, and more. I kinda don't like patents, but there it is, my name on a patent. This means I'm going to have to hunt up my resume and add a new section on Patents Awarded. My gosh.
I think that I or Constantin might have mentioned the software for which this patent was awarded. It's a pretty cool package that is (?unfortunately?) available only inside Sun. The patent tells the story but you have to read quite a long ways past an impressive list of claims before you get to it.
The Distributed Test Framework (DTF) is the software in question. Its purpose is to schedule test execution jobs across a range of computer platforms and collect the results. Before we designed and built DTF the Quality Team was running around the test lab like madmen, frantically launching test suite runs, trying to remember what was running, and collecting the results afterwards. Human beings can only scale this kind of activity so far before they go nuts.
To test Java we select a couple dozen platform combinations. On each platform combination we run tens of thousands of tests (perhaps over 100 thousand?). And we are striving to do this on a regular basis, such as every night or at the very least every week. You might think, "Oh, Java is cross platform, so why test on every platform?" After all, we've had marketeers telling you guys for years not to quip "write once, test everywhere," so why are we testing everywhere? It's simple: a lot of the Java implementation runs on top of native code whose implementation varies across platforms. AWT is obviously different for every platform, and so, just as obviously, are the Server and Client compilers in the VM. And there's more. Any time there is underlying native code that's unique to a platform, Java is likely to show platform-specific bugs. Hence, to catch platform-specific bugs we have to test on a wide variety of platform combinations.
Like I said, a human trying to schedule all that testing by hand, or with a spreadsheet, would quickly go nuts.
Hence, management came to some of us on the quality team and said "do something about that", and we did. The names you see in the patent were involved with designing the solution. We of course started with a review of the available software (this was 1998-1999) and came up empty-handed in terms of job-execution-scheduling packages that could dispatch jobs, including ones needing to put GUIs onto the local display, to Solaris, Linux, and Windows.
What we did was turn to Jini, which was newly released at the time. The hype around Jini at the time was about embedded devices ... the popular vision being you'd have printers with Jini services built in, the printer sits on the network, whatever device you're carrying around could query the network with Jini for devices that offer a "PRINT" service, and that would solve world hunger, free the slaves, and so on. But that's not quite what we did with Jini because I saw it could be used for our purposes.
In our case we wanted to build a system that:

- supported a self-organizing set of test systems;
- allowed us to submit a job based on a description of test system characteristics;
- by matching a job to test systems on those characteristics, could dispatch the job to any matching test computer;
- hence would let us "scale" the test execution capabilities of our test network simply by adding new computers;
- and, because the set of test systems is self-organizing, would let us add and remove test systems at will, with the system adjusting itself automatically.
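To make the matching idea concrete, here is a minimal plain-Java sketch. It is not the actual DTF code, and the class and method names are made up for illustration: a job states the characteristics it requires, and any test system whose advertised properties satisfy all of them is a candidate to run it.

```java
// Hypothetical illustration of characteristic matching (not DTF source code).
import java.util.HashMap;
import java.util.Map;

public class CharacteristicsMatch {

    // True if every characteristic the job requires is advertised by the machine.
    static boolean matches(Map<String, String> required, Map<String, String> advertised) {
        for (Map.Entry<String, String> req : required.entrySet()) {
            if (!req.getValue().equals(advertised.get(req.getKey()))) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // Properties a test system might advertise about itself.
        Map<String, String> machine = new HashMap<>();
        machine.put("os", "Linux");
        machine.put("arch", "x86");
        machine.put("display", "local");

        // Characteristics a job requires, e.g. a GUI test needing a local display.
        Map<String, String> job = new HashMap<>();
        job.put("os", "Linux");
        job.put("display", "local");

        System.out.println(matches(job, machine));  // prints: true
    }
}
```

In the real system the advertisements were carried through Jini rather than a plain map, which is what the next section describes.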
Jini offered the foundational building blocks to let us easily build that system. Each test system has a "machine service" which advertises the existence of the test system. What it advertises is a set of properties describing the machine's characteristics (e.g. operating system, OS version, CPU, graphics card, etc.). A Jini lookup service collects the advertisements of the test systems. The other leg is a controller which collects jobs to run, finds matching test systems by looking in the lookup service, and dispatches the jobs to them. It dispatches a job by sending a request to an agent on the test system; the agent handles test execution: it runs the specific test harness for the suite, collects the results, and sends out notifications.
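For flavor, here is a hedged sketch of how those pieces could fit together on top of Jini's lookup API. The net.jini.* types (Entry, ServiceTemplate, ServiceRegistrar) are the real Jini interfaces; MachineCharacteristics and TestAgent are hypothetical names standing in for DTF's actual machine-service attributes and agent interface, which the post doesn't spell out.

```java
// A hedged sketch, not DTF source: advertise test-system characteristics as a
// Jini attribute entry and let a controller look up a matching agent.
import java.rmi.Remote;
import java.rmi.RemoteException;

import net.jini.core.entry.Entry;
import net.jini.core.lookup.ServiceRegistrar;
import net.jini.core.lookup.ServiceTemplate;

// Entry describing a test machine; public object-typed fields and a public
// no-arg constructor are Jini's conventions for attribute entries.
public class MachineCharacteristics implements Entry {
    public String os;        // e.g. "SunOS", "Linux", "Windows"
    public String osVersion; // e.g. "5.8"
    public String cpuArch;   // e.g. "sparc", "x86"

    public MachineCharacteristics() {}

    public MachineCharacteristics(String os, String osVersion, String cpuArch) {
        this.os = os;
        this.osVersion = osVersion;
        this.cpuArch = cpuArch;
    }
}

// Hypothetical remote interface the controller calls on a matching test system.
interface TestAgent extends Remote {
    void runJob(String suiteName, String harnessCommand) throws RemoteException;
}

class Controller {
    // Find a test system whose advertised characteristics match the job's needs.
    // Per Jini entry matching, null fields in the template act as wildcards.
    TestAgent findAgent(ServiceRegistrar registrar, MachineCharacteristics wanted)
            throws RemoteException {
        ServiceTemplate tmpl = new ServiceTemplate(
                null,                             // any service ID
                new Class[] { TestAgent.class },  // service type to match
                new Entry[] { wanted });          // required characteristics
        return (TestAgent) registrar.lookup(tmpl);
    }
}
```

One nice property of the entry-matching model is that a job which doesn't care about, say, the graphics card simply leaves that field null and it is treated as a wildcard.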
We've been using DTF successfully for several years now and gotten a lot of mileage out of it.
Still, I'm astonished at having received a patent. First, I still don't think this was patentable since we were drawing on a long history of job scheduling systems. I, for one, was reminiscing while designing the controller about punched cards fed into an IBM 370 and how I used to name my jobs "mine" so that I could point at the job status monitor and exclaim "that job's mine". Okay, so I'm a little strange.
But it dawned on me this evening ... if one were to say things aren't patentable because you simply used or built on existing tools ... then nothing would ever get patented. On the other hand, every new tool that comes out, Jini for example, allows for new combinations of ideas to click together and the new tool allows people to design new systems with features not available before because the new tool enables those new features. Surely it means something about patentability that I used the word "new" five times in that sentence.
This might not be the end of the patent story here. We filed for four patents, so there's a decent chance of more to be approved.
Hopefully the patents are narrowly enough defined so it doesn't stop too many of y'all from implementing your own systems.
Source: weblogs.java.net
Comments
Here's the full link to the patent in question.
Posted by: rlinwood on November 29, 2005 at 07:42 AM
First of all, in the interest of full disclosure, I disagree with the fundamental premise behind patents, and think the US government should stop granting them. But that said, since we're stuck with them, I'd like to address this: if one were to say things aren't patentable because you simply used or built on existing tools ... then nothing would ever get patented.
And how is this bad? Even if we accept that patents are needed, ethical, and make sense, I still do not believe there are more than 3-4 inventions, per year, world-wide, that actually deserve patent protection. And that's because patents are meant for protecting things that are really new, not just an evolution of existing concepts and techniques.
First, I still don't think this was patentable since we were drawing on a long history of job scheduling systems.
I didn't read the entire patent, but if you - as one of the inventors - feel this way, I'd say this is exactly what I'm talking about. It's something that's merely evolutionary, not a fundamentally new invention. And these patents should not, IMO, be granted.
Posted by: sprhodes on November 29, 2005 at 11:08 AM
but if you - as one of the inventors - feel this way, I'd say this is exactly what I'm talking about. It's something that's merely evolutionary, not a fundamentally new invention. And these patents should not, IMO, be granted. --- well, either that, or I'm just predisposed to not liking patents and can't see whether this is a good patent or not. Clearly we didn't patent something as silly as a method to entertain a cat with a flashlight, or a way to swing on a swing. We did go through some interesting thought to develop that tool and that's worth something. Patentable? I don't know.
Posted by: robogeek on November 30, 2005 at 06:41 PM