18 open source translation tools to localize your project

Localization plays a key role in adapting projects for users around the world.

Localization plays a central role in adapting an open source project to suit the needs of users around the world. Besides coding, language translation is one of the main ways people everywhere contribute to and engage with open source projects.

There are tools specific to the language services industry (surprised to hear that’s a thing?) that enable a smooth localization process with a high level of quality. Categories that localization tools fall into include:

  • Computer-assisted translation (CAT) tools
  • Machine translation (MT) engines
  • Translation management systems (TMS)
  • Terminology management tools
  • Localization automation tools

The proprietary versions of these tools can be quite expensive. A single license for SDL Trados Studio (the leading CAT tool) can cost thousands of euros, and even then it is only useful for one individual and the customizations are limited (and psst, they cost more, too). Open source projects that want to localize into many languages and streamline their localization processes should look at open source tools, which save money and offer the flexibility to customize. I’ve compiled this high-level survey of many of the open source localization tool projects out there to help you decide what to use.

Computer-assisted translation (CAT) tools

 


OmegaT CAT tool. Here you see the translation memory (Fuzzy Matches) and terminology recall (Glossary) features at work. OmegaT is licensed under the GNU General Public License version 3 or later.

CAT tools are a staple of the language services industry. As the name implies, CAT tools help translators perform the tasks of translation, bilingual review, and monolingual review as quickly as possible and with the highest possible consistency through reuse of translated content (also known as translation memory). Translation memory and terminology recall are two central features of CAT tools. They enable a translator to reuse previously translated content from old projects in new projects. This allows them to translate a high volume of words in a shorter amount of time while maintaining a high level of quality through terminology and style consistency. This is especially handy for localization, as text in a lot of software and web UIs is often the same across platforms and applications. CAT tools are standalone pieces of software, though, requiring translators who use them to work locally and merge their work to a central repository.
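
At its core, translation memory recall is fuzzy string matching between a new source segment and previously translated segments. Here is a minimal, illustrative Python sketch of that idea; the segments, the English-to-French pairs, and the 75% threshold are invented for the example and are not taken from OmegaT or any other specific tool.

    # Toy translation-memory lookup: find previously translated segments
    # that are similar to a new source segment (a "fuzzy match").
    from difflib import SequenceMatcher

    # A made-up translation memory of source/target pairs.
    translation_memory = {
        "Open the file menu": "Ouvrez le menu Fichier",
        "Save the current file": "Enregistrez le fichier actuel",
        "Close the application": "Fermez l'application",
    }

    def fuzzy_matches(segment, tm, threshold=0.75):
        """Return (score, source, target) tuples for TM entries similar to `segment`."""
        matches = []
        for source, target in tm.items():
            score = SequenceMatcher(None, segment.lower(), source.lower()).ratio()
            if score >= threshold:
                matches.append((score, source, target))
        return sorted(matches, reverse=True)

    # The translator sees the closest stored translations and can reuse or adapt them.
    for score, source, target in fuzzy_matches("Open the File menu", translation_memory):
        print(f"{score:.0%} match: '{source}' -> '{target}'")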

Tools to check out:

Machine translation (MT) engines

Screenshot of the Apertium machine translation platform.

MT engines automate the transfer of text from one language to another. MT is broken up into three primary methodologies: rules-based, statistical, and neural (the new player). The most widespread MT methodology is statistical, which (in very brief terms) draws conclusions about the interconnectedness of a pair of languages by running statistical analyses over annotated bilingual corpus data using n-gram models. When a new source-language phrase is introduced to the engine for translation, it looks within its analyzed corpus data to find statistically relevant equivalents, which it produces in the target language. MT can be useful as a productivity aid to translators, changing their primary task from translating a source text into a target text to post-editing the MT engine’s target-language output. I don’t recommend using raw MT output in localizations, but if your community is trained in the art of post-editing, MT can be a useful tool to help them make large volumes of contributions.
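
To make the statistical approach a bit more concrete, here is a toy, greedy word-by-word lookup over a made-up phrase table. Real statistical engines such as Moses score many competing hypotheses with translation and language models; this sketch shows only the basic lookup step, and every probability in it is invented.

    # Toy phrase-table lookup: for each source word, pick the target word
    # with the highest (made-up) translation probability.
    phrase_table = {
        "the": [("el", 0.6), ("la", 0.4)],
        "house": [("casa", 0.9), ("vivienda", 0.1)],
        "is": [("es", 0.7), ("está", 0.3)],
        "red": [("roja", 0.8), ("rojo", 0.2)],
    }

    def translate(sentence):
        """Greedily pick the most probable target word for each source word."""
        output = []
        for word in sentence.lower().split():
            candidates = phrase_table.get(word)
            if candidates:
                best_target, _ = max(candidates, key=lambda c: c[1])
                output.append(best_target)
            else:
                output.append(word)  # pass unknown words through untranslated
        return " ".join(output)

    # Prints "el casa es roja": without a language model there is no gender
    # agreement, which is exactly why real engines score whole hypotheses.
    print(translate("the house is red"))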

Tools to check out:

Translation management systems (TMS)

 


Mozilla’s Pontoon translation management system user interface. With WYSIWYG editing, you can translate content in context and simultaneously perform translation and quality assurance. Pontoon is licensed under the BSD 3-clause New or Revised License.

TMS tools are web-based platforms that allow you to manage a localization project and enable translators and reviewers to do what they do best. Most TMS tools aim to automate many manual parts of the localization process by including version control system (VCS) integrations, cloud services integrations, and project reporting, as well as the standard translation memory and terminology recall features. These tools are most amenable to community localization or translation projects, as they allow large groups of translators and reviewers to contribute to a project. Some also use a WYSIWYG editor to give translators context for their translations. This added context improves translation accuracy and cuts down on the time a translator has to wait between translating a string and reviewing that translation in the user interface.
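
Much of the project reporting a TMS provides comes down to walking a project’s translation files and tallying their status. As a rough sketch of that, the following Python snippet uses the third-party polib library (installable with pip) to report completion statistics for gettext PO files; the locale list and the file layout are hypothetical.

    # Rough sketch of TMS-style project reporting over gettext PO files.
    # Requires the third-party polib library: pip install polib
    import polib

    locales = ["fr", "de", "ja"]  # hypothetical set of target locales

    for locale in locales:
        # Hypothetical layout: locale/<lang>/LC_MESSAGES/myproject.po
        po = polib.pofile(f"locale/{locale}/LC_MESSAGES/myproject.po")
        untranslated = len(po.untranslated_entries())
        fuzzy = len(po.fuzzy_entries())
        print(f"{locale}: {po.percent_translated()}% translated, "
              f"{untranslated} untranslated, {fuzzy} fuzzy")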

Tools to check out:

Terminology management tools


Brigham Young University’s BaseTerm tool displays the new-term entry dialogue window. BaseTerm is licensed under the Eclipse Public License.

Terminology management tools give you a GUI to create terminology resources (known as termbases) to add context and ensure translation consistency. These resources are consumed by CAT tools and TMS platforms to aid translators in the process of translation. For languages in which a term could be either a noun or a verb depending on the context, terminology management tools allow you to add metadata for a term that labels its gender, part of speech, monolingual definition, and context clues. Terminology management is often an underserved, but no less important, part of the localization process. In both the open source and proprietary ecosystems, there are only a small handful of options available.
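
To illustrate the kind of metadata a termbase entry carries, here is a minimal sketch using a Python dataclass. Real terminology tools typically store entries in the TBX (TermBase eXchange) XML format; the fields chosen here and the example terms are purely illustrative.

    # Minimal illustration of the metadata stored with a termbase entry.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TermEntry:
        term: str
        language: str
        part_of_speech: str           # e.g. "noun" or "verb"
        gender: Optional[str] = None  # relevant for gendered languages
        definition: str = ""          # monolingual definition
        context: str = ""             # example sentence showing the term in use

    glossary = [
        TermEntry(term="build", language="en", part_of_speech="noun",
                  definition="A compiled version of the software.",
                  context="The nightly build failed."),
        TermEntry(term="build", language="en", part_of_speech="verb",
                  definition="To compile source code into an executable.",
                  context="Build the project before running the tests."),
    ]

    # A CAT tool or TMS would surface whichever entry matches the part of speech in context.
    for entry in glossary:
        print(f"{entry.term} ({entry.part_of_speech}): {entry.definition}")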

Tools to check out:

Localization automation tools


The Ratel and Rainbow components of the Okapi Framework. Photo courtesy of the Okapi Framework. The Okapi Framework is licensed under the Apache License version 2.0.

Localization automation tools streamline the way you process localization data. This can include text extraction, file format conversion, tokenization, VCS synchronization, term extraction, pre-translation, and various quality checks over common localization standard file formats. In some tool suites, like the Okapi Framework, you can create automation pipelines for performing various localization tasks. These pipelines can be useful in many situations, but their main value is the time they save by automating repetitive tasks. They can also move you closer to a more continuous localization process.
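
One of the simplest automated quality checks is verifying that placeholders survive translation intact. The snippet below is a standalone Python sketch of that single check, with made-up example strings; suites like the Okapi Framework bundle many such checks into configurable pipelines.

    # Standalone sketch of a placeholder-consistency QA check.
    import re

    # Matches {named} placeholders and simple printf-style %s / %d markers.
    PLACEHOLDER = re.compile(r"\{[^}]+\}|%[sd]")

    def check_placeholders(source, target):
        """Return placeholders missing from, or added to, the translation."""
        src = set(PLACEHOLDER.findall(source))
        tgt = set(PLACEHOLDER.findall(target))
        return src - tgt, tgt - src

    pairs = [
        ("Welcome, {username}!", "Bienvenue, {username} !"),  # placeholders match
        ("{count} files deleted", "Fichiers supprimés"),       # placeholder lost
    ]

    for source, target in pairs:
        missing, extra = check_placeholders(source, target)
        if missing or extra:
            print(f"QA warning for '{source}': missing={missing}, extra={extra}")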

Tools to check out:

Why open source is key

Localization is most powerful and effective when done in the open. These tools should give you and your communities the power to localize your projects into as many languages as humanly possible.

Want to learn more? Check out these additional resources:

Originally published on OPENSOURCE.COM

Who do you think you’re apostrophising? The dark side of grammar pedantry

He’s been called “punctuation’s answer to Banksy”. A self-styled grammar vigilante who spends his nights surreptitiously correcting apostrophes on shop signs and billboards. The general consensus is that he’s a modern-day hero – a mysterious crusader against the declining standards of English. But his exploits represent an altogether darker reality.

The man himself is not particularly offensive. In a BBC Radio 4 report, he comes across as a reasonable person who simply feels a compulsion to quietly make a difference to what matters to him. He doesn’t ridicule, he doesn’t court publicity, he simply goes out and adds or removes apostrophes as required. And he does it with care, usually.

So what’s the problem? The problem lies in what this kind of behaviour represents and therefore normalises. In championing our vigilante, we are saying that it’s okay to pull people up on their use of language. It gives people the confidence to unleash their own pet peeves onto the world, however linguistically dubious.

The grammar vigilante himself appears to have a specific type of target, and his approach is nothing if not considerate. However, there is another type of pedant who is not so subtle or self-aware. Some people think nothing of highlighting inconsistent punctuation wherever they might see it, however innocuous or irrelevant it might be (apostrophes rarely actually disambiguate – after all, we get along fine without them in speech).

Never mind that it’s a handwritten notice in a shop window, written by someone for whom English is a second (or third, or fourth) language. Never mind that it’s a leaflet touting for work from someone who didn’t get the chance to complete their education. They need to be corrected and/or posted online for others to see. Otherwise, how will anybody learn?

After all, apostrophes are easy. If people would just take a bit of time to learn the rules, then there wouldn’t be any mistakes. For example, everybody knows that apostrophes are used to indicate possession. So the car belongs to Lynda, the car is Lynda’s. But what about the car belongs to her, the car is her’s? Of course not, we don’t use apostrophes with pronouns (although this was quite common in Shakespeare’s time) as they each have a possessive form of their own. Except for one, that is, which still needs one: one does one’s duty. It doesn’t need one though – it’s is something else entirely, a contraction rather than a possessive.

Then there’s the question of showing possession with nouns already ending in “s”: Chris’s cat or Chris’ cat? Jess’s decision or Jess’ decision? Or plural nouns ending in “s”: The princesses’s schedule or the princesses’ schedule? I don’t remember it being this difficult in the 1980’s/1980s/’80s/80s/80’s.

We definitely don’t use apostrophes to indicate plurals, something that routinely trips up the fabled greengrocer’s with its potato’s (although it was once seen as correct to use apostrophes with some words ending in a vowel). But what about when we need to refer to dotting the i’s and crossing the t’s, or someone makes a sign saying CD’S £5.00?

Clever clogs

The point is, while some are clear, many of the rules around apostrophes are not as transparent as some people would have us believe. This is largely due to the fact that they are not actually rules after all, but conventions. And conventions change over time (see David Crystal’s excellent book for a detailed history).

When things are open to change, there will inevitably be inconsistencies and contradictions. These inconsistencies surround us every day – just look at the London Underground stations of Earl’s Court and Barons Court, or St James’s Park in London, and St James’ Park in Newcastle. Or business names such as McDonald’s, Lloyds Bank, and Sainsbury’s. Is it any surprise people are confused?

As in, belonging to the Earl? Dirk Ingo Franke, CC BY

Of course, all of these conventions are learnable or available to be looked up. But if people haven’t had the opportunity to learn them, or do not have the skills or awareness to look them up, what gives other people the right to criticise? Are those who point out mistakes really doing it to educate, or are they doing it to highlight their own superior knowledge? Are they judging the non-standard punctuation or the sub-standard person?

As in: more than one Baron? asands, CC BY

Picking on someone because of their language is always a cowardly attack. Linguist Deborah Cameron makes the point that this is still the case even when highlighting the poor linguistic skills of bigots and racists on social media. Tempting as it is to call out a racist on their inability to spell or punctuate, by doing so we are simply replacing one prejudice with another, and avoiding the actual issue. As Cameron puts it: “By all means take issue with bigots – but for their politics, not their punctuation.”

Apostrophes matter, at least in certain contexts. Society deems it important that job applications, essays, notices and the like adhere to the current conventions of apostrophe usage. For this reason, it is right that we teach and learn these conventions.

But fetishising the apostrophe as if its rules are set in stone, and then fostering an environment in which it is acceptable to take pleasure in uncovering other people’s linguistic insecurities is not okay. The grammar (punctuation?) vigilante of Bristol is relatively harmless. But he is the unassuming face of a much less savoury world of pedantry.

Originally published on THECONVERSATION.COM