BOSTON -- After nearly six months of tweaking and collecting feedback from users, the Business Readiness Rating has been refined to the point where its evaluation categories are now final.
The Business Readiness Rating (BRR) is a community forum that helps developers rate open source software in a standardized way. The rating system is sponsored by Carnegie Mellon West Center for Open Source Investigation, O'Reilly CodeZoo, SpikeSource and Intel Corp., and it has been in an evaluation phase since mid-2005.
At the LinuxWorld Conference and Exposition in Boston, Anthony Wasserman, professor of software engineering practice at Carnegie Mellon West, explained why such a system is necessary: open source project repositories like SourceForge.net now host more than 100,000 projects of varying complexity, licenses and support structures.
"If someone with little or no open source software experience went to a site like SourceForge and did a search, they are going to get more results than they know how to deal with," Wasserman said. "A search for open source content management software, for example, would get you 400 results."
But with the BRR, Wasserman said that users would have the ability to address some key concerns in today's data centers, including how to compare open source to closed source software; how to evaluate the validity of an open source project; and how to compare similar open source projects.
The BRR evaluates an open source project or product using seven broad categories and a handful of subcategories. The areas covered include functionality, reliability, scalability, architecture and code quality, support and services, licensing, project management, documentation and community.
Individuals who maintain the BRR said they understand that every developer and adopter of open source software will approach each project or product differently, as it suits the needs of their business. That's why each of the seven broad categories can be weighted according to the user's needs.
Forrester Research Inc.'s Michael Goulde, an analyst and member of the original BRR steering committee, said each user can assign a weight to each category so that the total adds up to 100%. The user's weighted results are then compared against the aggregated data of the entire BRR community to produce a rating from one to five.
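The weighting scheme Goulde describes can be sketched in a few lines of code. The category names below come from the article; the weights and per-category scores are hypothetical illustrations, not real BRR data, and the simple weighted average shown here is only one plausible reading of how user weights would combine with 1-5 category scores.

```python
def brr_score(weights: dict, scores: dict) -> float:
    """Combine per-category scores (1-5) using user-assigned weights
    that must total 100%. Returns a rating on the same 1-5 scale."""
    if abs(sum(weights.values()) - 100) > 1e-9:
        raise ValueError("category weights must add up to 100%")
    # Weighted average: each category contributes weight% of its score.
    return sum(weights[c] / 100 * scores[c] for c in weights)

# Hypothetical user weighting (must sum to 100%).
weights = {
    "functionality": 30,
    "reliability": 25,
    "scalability": 10,
    "support and services": 15,
    "documentation": 10,
    "community": 10,
}

# Hypothetical 1-5 scores for some project in each weighted category.
scores = {
    "functionality": 4,
    "reliability": 5,
    "scalability": 3,
    "support and services": 4,
    "documentation": 2,
    "community": 4,
}

print(round(brr_score(weights, scores), 2))  # 3.95
```

A user who cares most about documentation and support could shift weight into those categories, and the same raw scores would yield a different overall rating, which is the point of the per-user weighting.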
BRR's goal is to provide an unbiased, trusted source for the evaluation of open source software, said Murugan Pal, the founder and chief technology officer of Redwood City, Calif.-based open source certification vendor SpikeSource Inc. The idea behind the BRR is to offer a complete, standard assessment model, not a one-size-fits-all score.
In addition to a weighted user evaluation system, the BRR will also employ a four-stage assessment process. Pal said the process begins with a quick assessment filter, followed by a target usage assessment, then a data collection and processing stage and finally a data translation stage.
The process has newly added software taxonomies from IDC, the Framingham, Mass.-based research firm.
For more information or to get involved with the BRR project, Pal said users can register at www.openbrr.org, download BRR whitepapers or join the group and begin contributing open source software evaluations to help mature the rating system.