The invention of a novel component-based model and approach for rapid distributed software development is the core research result of this case study. Using our methodology we have built a fully functional platform — the Grid Integrated Development Environment (GIDE) — which has been used for the development of user applications by several industrial partners. The main economic impact of our work is the new component-based development process, which yields much higher productivity and a shorter development cycle. In addition, the four new international standards approved by ETSI have had an impact on the wider professional community in the areas of grid and cloud computing.
The research in this case study has pioneered knowledge management technology. It has had major impact on drug discovery and translational medicine and is widely adopted in the pharmaceutical and healthcare industries. The impacts are:
Cloud computing is now used ubiquitously in the consumer and commercial domains, yielding unprecedented access to computing and data handling at affordable prices.
Work in this field was pioneered at the University of Southampton (UoS) from 1998 onwards and commercialised from 2008 through Dezineforce to enable companies to exploit cloud computing in engineering:
Throughout this period the team has also engaged in outreach to inspire and educate the next generation of scientists and engineers about High Performance and Cloud computing, including a YouTube video with 485,000 hits and over 300 articles in the media.
Research in the UoA has underpinned the development of the current version of BOINC (Berkeley Open Infrastructure for Network Computing), a technology to enable secure volunteer computing. The research was done as part of the climateprediction.net project that is currently managed as CPDN through the UoA, supporting international climate modelling. CPDN models climate change using donated cycles on users' computers, with almost 700,000 users registered by 2013. Significant work to develop BOINC in CPDN has enabled the public to engage with science more easily and conveniently. BOINC has become recognised as the key open-source tool for volunteer computing and is also available to companies to create their own grid networks. It has been used for a range of applications from driving experiments to find the Higgs particle to using home PCs to detect earthquakes.
A quiet technology revolution in the UK has been changing the way that police officers on the beat and hospital nurses access and record information, using handheld electronic notebooks that bring large time and cost savings. This revolution began as a University of Glasgow research programme and led to the creation of a successful spin-out company, Kelvin Connect. Acquired in 2011 by the UK's largest provider of communications for emergency services, Kelvin Connect has grown to 30 staff. Its Pronto systems are now in use by 10% of UK police forces and nursing staff in several UK hospitals.
Gateway technologies have enhanced the ability of end-users to engage with high-performance computing (HPC) programs on massively distributed computing infrastructures (DCIs) such as clusters, grids and clouds. The technologies are focussed on the needs of business, industry, organisations and communities, enabling them to extract added business and social benefit from custom high-value services running on a wide range of high-performance DCIs. Typically, such services are based on computational workflows tailored to specific business needs. DCIs may comprise resources already owned (e.g. clusters) combined with resources rented on a pay-as-you-go basis (e.g. clouds). Several companies and organisations worldwide are currently using the technologies.
This case study reports our work on the development, application and dissemination of innovative cloud-based technologies to industrial problem domains. First, decentralised scheduling is implemented within federated clouds to facilitate the new drug discovery process for a global pharmaceutical company. Second, multi-objective approaches to the management and optimisation of video processing and analysis workflows in distributed environments are described in the context of an SME organisation that is developing new products, services and markets. Both of these examples have attracted, and continue to attract, commercial funding, and demonstrate the efficacy of knowledge transfer into industry from University of Derby (UoD) research.
In 2011, Peter Coveney of UCL's Department of Chemistry was given a leading role in defining the future strategy for the UK's e-infrastructure, based on the department's expertise and research in this field. This appointment led to the publication of the Strategy for the UK Research Computing Ecosystem document, which has since stimulated debate amongst policy makers and informed government policy. On the basis of its recommendations, the government has set up an advisory E-Infrastructure Leadership Council and allocated £354 million to improving the UK's high-performance computing capabilities and wider e-infrastructure, a move that is having wide-ranging industrial and economic impact in the UK. Most recently, in June 2013 the Strategy document stimulated further debate about the UK's e-infrastructure at the House of Lords.
Researchers in Cambridge have developed a data standard for storing and exchanging data between different programs in the field of macromolecular NMR spectroscopy. The standard has been used as the foundation for the development of an open source software suite for NMR data analysis, leading to improved research tools which have been widely adopted by both industrial and academic research groups, who benefit from faster drug development times and lower development costs. The CCPN data standard is an integral part of major European collaborative efforts for NMR software integration, and is being used by the major public databases for protein structures and NMR data, namely Protein Data Bank in Europe (PDBe) and BioMagResBank.
KCL research played an essential role in the development of data provenance standards published by the World Wide Web Consortium (W3C), the standards body for web technologies responsible for HTTP, HTML, etc. The provenance of data concerns records of the processes by which data was produced, by whom, from what other data, and similar metadata. The standards directly impact practitioners and professional services through adoption by commercial, governmental and other bodies, such as Oracle, IBM and NASA, in handling computational records of the provenance of data.
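To illustrate the kind of record the provenance model describes, the sketch below builds a minimal provenance document in Python, loosely following the structure of the W3C PROV data model (entities, activities, agents and the relations between them). All identifiers (ex:report, ex:dataset, ex:analysis, ex:alice) are hypothetical, and the dictionary layout is an informal approximation, not an authoritative serialisation of the standard.

```python
# Illustrative provenance record, loosely modelled on the W3C PROV data
# model: entities (data), activities (processes) and agents (people),
# linked by relations stating how and by whom each entity was produced.
# All names here are hypothetical.

provenance = {
    "entity": {
        "ex:report": {"prov:type": "document"},
        "ex:dataset": {"prov:type": "raw data"},
    },
    "activity": {
        "ex:analysis": {"prov:startTime": "2013-06-01T09:00:00"},
    },
    "agent": {
        "ex:alice": {"prov:type": "prov:Person"},
    },
    # Relations: the report was generated by the analysis activity,
    # derived from the dataset, and attributed to the agent.
    "wasGeneratedBy": {
        "_:g1": {"prov:entity": "ex:report", "prov:activity": "ex:analysis"},
    },
    "wasDerivedFrom": {
        "_:d1": {"prov:generatedEntity": "ex:report",
                 "prov:usedEntity": "ex:dataset"},
    },
    "wasAttributedTo": {
        "_:a1": {"prov:entity": "ex:report", "prov:agent": "ex:alice"},
    },
}

# A typical provenance query: from which source data was the report derived?
sources = [rel["prov:usedEntity"]
           for rel in provenance["wasDerivedFrom"].values()
           if rel["prov:generatedEntity"] == "ex:report"]
print(sources)  # ['ex:dataset']
```

Such records let downstream consumers answer questions like "which inputs produced this result, and who was responsible?" without re-running the original computation.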