Building on their groundbreaking research and collaborative networks, Babych and Sharoff have developed a range of language technologies which now reach major corporations, small specialist businesses, a large industrial consortium, and agencies of the EU and UN. Their translation tools have had significant industrial impact by improving efficiency, consistency and user experience, and leveraging existing data collections for new purposes. In terms of policy, the research has re-shaped attitudes toward the ownership of data by demonstrating the commercial value of pooling resources. Individual translators have also benefitted from these technologies and related CPD courses, helping them to improve document flow, terminology and translation activities.
The success of the eXtensible Markup Language (XML) has been due in large part to the technologies built around it for constraining, querying, styling and otherwise processing XML documents. Research carried out at Edinburgh has been instrumental in the creation and/or design of many of these core XML technologies, including XSLT, XML Schema, XInclude, XQuery and XProc. Edinburgh staff played key roles in bringing these technologies into widespread use in both the private and public sectors through participation in standards development work.
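To illustrate one of these core technologies, the sketch below shows XInclude processing, which lets one XML document pull in content from another, using only Python's standard library. The document content and the loader are invented for illustration; real XInclude processors resolve the `href` against actual resources.

```python
# Minimal sketch of XInclude processing with Python's standard library.
import xml.etree.ElementTree as ET
from xml.etree import ElementInclude

# A document that includes a chapter from another (hypothetical) file.
doc = ET.fromstring(
    '<book xmlns:xi="http://www.w3.org/2001/XInclude">'
    '<xi:include href="chapter1.xml"/>'
    '</book>'
)

# Stand-in loader for illustration; a real one would read chapter1.xml.
def loader(href, parse, encoding=None):
    return ET.fromstring('<chapter>Introduction</chapter>')

# Replaces each xi:include element in place with the loaded content.
ElementInclude.include(doc, loader=loader)
print(ET.tostring(doc, encoding='unicode'))
```

After processing, the `xi:include` element is replaced by the included `<chapter>` element, yielding a single merged document.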
The Natural Language Toolkit (NLTK) is a widely adopted Python library for natural language processing. NLTK is run as an open-source project; its three project leaders, Steven Bird (Melbourne University), Edward Loper (BBN, Boston) and Ewan Klein (University of Edinburgh), provide its strategic direction.
NLTK has been widely used in academia, commercial and non-profit organisations, and public bodies, including Stanford University and the Educational Testing Service (ETS), which administers widely recognised tests in more than 180 countries. NLTK has played an important role in making core natural language processing techniques easy to grasp, easy to integrate with other software tools, and easy to deploy.
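The ease of use described above can be seen in a few lines of typical NLTK code. This sketch assumes the `nltk` package is installed; it deliberately uses APIs (a regexp-based tokenizer and n-gram utility) that require no extra corpus downloads.

```python
# A small taste of NLTK: tokenise a sentence and extract bigrams.
from nltk.tokenize import WhitespaceTokenizer
from nltk.util import ngrams

text = "natural language processing made easy"
tokens = WhitespaceTokenizer().tokenize(text)  # split on whitespace
bigrams = list(ngrams(tokens, 2))              # adjacent token pairs

print(tokens)   # ['natural', 'language', 'processing', 'made', 'easy']
print(bigrams)  # [('natural', 'language'), ('language', 'processing'), ...]
```

For more linguistically informed tokenisation (e.g. `nltk.word_tokenize`), NLTK also ships trained models, which must be downloaded separately.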
In 1997, Professor David MacKay of the University of Cambridge Department of Physics developed Dasher, a software accessibility tool for entering text by zooming through letters displayed on a screen. Dasher has since transformed computing for tens of thousands of individuals unable to use a conventional keyboard, and is recommended by many charities involved in assistive technologies, such as the European Platform for Rehabilitation network. Since 2008, Dasher has been downloaded over 75,000 times and has been ported to smartphones, making use of input devices such as tilt sensors and joysticks. Linking Dasher's information-efficient text generation from gestures or gaze direction to text-to-speech or real-time-text output channels has made Dasher an ideal component of augmentative and alternative communication (AAC) systems which address digital exclusion.
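The zooming interface rests on a simple idea: the screen partitions an interval among possible next letters in proportion to a language model's probabilities, so likely letters occupy larger, easier-to-hit regions. The toy sketch below illustrates only that partitioning step, with an invented three-letter unigram model; it is not Dasher's actual implementation, which uses adaptive context models and continuous zooming.

```python
# Toy sketch of the interval-partitioning idea behind Dasher's display:
# each candidate letter gets a sub-interval proportional to its probability,
# so pointing or gazing into a region selects that letter efficiently.
def partition(interval, probs):
    lo, hi = interval
    width = hi - lo
    regions = {}
    start = lo
    for letter, p in probs.items():
        regions[letter] = (start, start + p * width)
        start += p * width
    return regions

# Hypothetical unigram model over three letters (for illustration only).
model = {"a": 0.5, "b": 0.3, "c": 0.2}
print(partition((0.0, 1.0), model))
```

Selecting a region and re-partitioning it for the next letter is what the user experiences as "zooming": frequent letter sequences take few, coarse gestures.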
In his 2004 and 2010 Oxford University Press translations of the Qur'an, based on over 30 years of rigorous scholarship, Professor Muhammad Abdel Haleem sought to make "the Qur'an accessible to everyone who speaks English". Numerous accolades, including membership of the Arabic Language Academy in Cairo, approval of his 2010 translation by Al-Azhar University, sales of more than 250,000 copies, and his receipt of hundreds of laudatory messages from readers around the world, attest to both the faithfulness of his translation and its accessibility to a wide readership. Abdel Haleem has contributed substantially to interfaith understanding through his translations and interpretations, and has assisted interfaith dialogue globally.
Political events across Arab nations have focused the attention of stakeholders in government and business, including publishing, on the need for culturally sensitive translations from Arabic. Increasing interest in Arabic literature demands professional and ethical standards in translating. Research-informed translations by Marilyn Booth at the University of Edinburgh, together with research-based translator training, support the development of more sensitive translations, thus aiding a granular understanding of socio-cultural complexity in Arab societies amid dynamic political change. Such translation activities counter 'clash of civilisations' discourses and the stereotyping of Arabs and Islam. The research and resulting training methods have shaped practice and enhanced support for emerging UK and Arab-region translators, approximately 80 to date.
This case study explores the impact of RGCL's Translation Post-Editing Tool (PET) on Hermes Traducciones y Servicios Lingüísticos (Hermes), NLP Technologies Ltd. (NLPT) and the Department of Translation, Interpreting and Communication (DTIC) at Ghent University. Hermes and NLPT are companies providing translation services in varied domains through a pipeline that combines translation technologies and post-editing. DTIC offers postgraduate courses on translation studies and interpreting, including subjects such as post-editing. PET enables the editing of pre-translated text and the detection of 'effort indicators'. This helps improve the assessment of translation systems and approaches, the quality of pre-translated text, and the effort needed to convert it into publishable form. At Hermes, workflows developed using PET have reduced post-editing time by 31-34%; at NLPT, workflows optimised using PET have saved an average of 66 seconds of post-editing time per sentence. At DTIC, PET has been used to enhance the courses Computer-Aided Translation and Technical Translation.
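One common way to quantify post-editing effort is the number of token-level edits separating machine output from its post-edited version. The sketch below is a generic edit-distance illustration of that idea, not PET's actual metric set (PET also logs indicators such as timing and keystrokes); the example sentences are invented.

```python
# Toy illustration of one kind of 'effort indicator': the minimum number
# of token edits (insert, delete, substitute) needed to turn machine
# translation output into the post-edited text.
def token_edits(mt, post_edited):
    a, b = mt.split(), post_edited.split()
    # Classic dynamic-programming edit distance, rolling single row.
    dist = list(range(len(b) + 1))
    for i, ta in enumerate(a, 1):
        prev, dist[0] = dist[0], i
        for j, tb in enumerate(b, 1):
            cur = min(dist[j] + 1,         # delete token from MT output
                      dist[j - 1] + 1,     # insert token from post-edit
                      prev + (ta != tb))   # substitute (or keep if equal)
            prev, dist[j] = dist[j], cur
    return dist[-1]

# Two insertions ("on", "the") were needed here:
print(token_edits("the cat sat mat", "the cat sat on the mat"))  # 2
```

Normalising such a count by sentence length gives an HTER-style score, which, alongside timing data, supports the kind of workflow comparisons described above.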