This afternoon we demonstrated the final link in the chain of producing Monte Carlo data for CMS using this project (and the -dev project too, of course), namely the transfer of result files from the temporary Data Bridge storage to a CMS Tier 2 site's storage element (SE). To summarise, the steps are:

o Creating a configuration script defining the process(es) to be simulated (a configuration sketch follows this list)
o Submitting a batch of jobs whose duration and result-file size are suitable for running by volunteers
o Having those jobs picked up by volunteers running BOINC and the CMS@Home application, with the result files returned to the Data Bridge
o Running "merge" jobs on a small cluster at CERN to collect the smaller result files into larger files (~2.2 GB) -- this step has to be done at CERN, as most volunteers will not have the bandwidth (or data plan!) to handle the data volumes required. To a large extent it also serves as the verification step needed to satisfy CMS of the result files' integrity (a merge sketch follows this list)
o Transferring the merged files into the Grid environment, where they are then readily available to CMS researchers around the world (a transfer sketch follows this list)
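
For the curious, here is a minimal sketch of what the first step's configuration script might look like, in the style of a CMSSW Python configuration. The process name, generator settings, event count, and output file name below are illustrative assumptions, not the project's actual production values:

    import FWCore.ParameterSet.Config as cms

    process = cms.Process("GEN")

    # Keep the event count per job small enough that a volunteer machine
    # can finish in a reasonable time (the value here is illustrative).
    process.maxEvents = cms.untracked.PSet(input=cms.untracked.int32(500))
    process.source = cms.Source("EmptySource")

    # Hypothetical Pythia8 settings for the physics process to simulate.
    process.generator = cms.EDFilter(
        "Pythia8GeneratorFilter",
        comEnergy=cms.double(13000.0),
        PythiaParameters=cms.PSet(
            processParameters=cms.vstring("SoftQCD:inelastic = on"),
            parameterSets=cms.vstring("processParameters"),
        ),
    )

    # Write the generated events to the result file that BOINC returns
    # to the Data Bridge.
    process.out = cms.OutputModule(
        "PoolOutputModule",
        fileName=cms.untracked.string("file:result.root"),
    )

    process.p = cms.Path(process.generator)
    process.e = cms.EndPath(process.out)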
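
The merge step essentially bin-packs many small result files into ~2.2 GB outputs. As a rough illustration of just the batching logic (the greedy grouping strategy is an assumption; the real merge jobs run CMSSW and also perform the validation mentioned above):

    import os

    TARGET_BYTES = 2.2 * 1024**3  # ~2.2 GB per merged file, as above

    def group_for_merge(paths):
        """Greedily group result files into batches of roughly TARGET_BYTES."""
        batches, current, current_size = [], [], 0
        for path in sorted(paths):
            size = os.path.getsize(path)
            # Start a new batch when adding this file would overshoot.
            if current and current_size + size > TARGET_BYTES:
                batches.append(current)
                current, current_size = [], 0
            current.append(path)
            current_size += size
        if current:
            batches.append(current)
        return batches

Each batch would then be handed to a merge job on the CERN cluster; the merging and verification machinery itself is not shown here.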
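
Finally, the transfer step moves each merged file from the staging area to a Tier 2 storage element. Here is a sketch using the standard gfal-copy Grid transfer tool; the endpoints are placeholders, and whether the project drives transfers this way, rather than through CMS's central transfer machinery, is an assumption for illustration:

    import subprocess

    # Placeholder endpoints: a local merged file and a hypothetical
    # Tier 2 storage element path; neither is a real CMS location.
    source = "file:///data/merged/merged_0001.root"
    dest = "srm://t2-se.example.org/store/user/cmsathome/merged_0001.root"

    # gfal-copy speaks the usual Grid protocols (SRM, GridFTP, xrootd);
    # check=True raises an exception if the transfer fails.
    subprocess.run(["gfal-copy", source, dest], check=True)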

Thanks, everybody. From here on it gets more political, but we have been garnering support as the project has progressed. We now need to move into a more "production" environment and convince the central powers-that-be to take over the responsibility of submitting suitable workflows and collecting the results. You will still see some changes in the future, especially as we bring some of the more advanced features across here from the -dev project.