Any company that intends to go global needs its content translated into many different languages, which means at least one translator and one reviewer per language, plus any other actors in that language’s translation workflow (e.g. localization experts, QA analysts, etc.). A company going global across the EU, for example, needs its content translated into 23 or 24 languages, depending on whether its source language is already one of the EU’s official languages. Multiply that by the number of people involved per language, and hundreds of people end up collaborating on a single project.
During the translation process, all of these people need to communicate with each other, send files to one another and download files, and on top of that project managers need to manage all queries, which adds up to hundreds of individual communications. If this process runs on offline, server-based localization technology, the bottlenecks it creates are enormous, especially when localization projects can reach four-figure numbers, as they do for most companies. The result is a chaotic picture, far from cost-efficient or productive. If this scenario sounds familiar, a technological upgrade is overdue. Carrying out the very same process with cloud-based localization technology paints a completely different picture.
According to Foundry’s 2022 Cloud Computing Research, 92% of companies use cloud-based services, and 63% of companies will have most or all of their IT infrastructure in the cloud by the end of 2023, a leap of 22% from mid-2022. With cloud-based localization technology, everyone works on the same platform in real time and can communicate there without going through a project manager or sending emails and waiting for replies. From a project manager’s point of view, cloud-based technology is vital to making projects more efficient: the difference between answering 24 separate queries about the same text and answering it once for everyone to see speaks for itself. Everyone working on the same platform also means moving the program from an inefficient silo-based approach to a more efficient, centralized one.
Without cloud-based technology, every actor involved works on their own computer and keeps their own data, unaware of what everyone else is doing and unable to leverage previous translations that could make their work more efficient. By working together, everyone contributes to the same translation memory (TM). Not only does this significantly accelerate time-to-market, it also saves money down the line: if parts of the same content need to be translated by another translator, they can leverage the existing translations and reduce costs to the company.
Another native feature of cloud-based technology is that it is operating-system (OS) agnostic; in other words, it can be accessed from any operating system or mobile device, whether Windows, Mac, Linux, Android or iOS, to name a few. With a translation tool that only works on a specific OS, or even a specific version of it, any productivity gained from the tool is lost, because you can only deploy it with people who have that specific OS, regardless of their skill.
Director of Support Services, XTM
If you can identify silos in your localization process, your program can improve considerably in terms of efficiency. Silos can occur organically: teams that have all the resources they need to carry out their work can become so focused on their own goals that they lose sight of the bigger picture. An efficient localization program requires everyone involved to have aligned goals and work together, which benefits not only productivity but also cost-efficiency. For this, you will need the right technology to bring everyone together.
Before you start evaluating your connectivity options, you need to ask yourself: “Do I have a structured content source?”. In other words, do you create your content within a content-system interface, or outside of one? If you create it outside of one, you are still setting up your projects manually in your localization technology, and the manual tasks involved are the problem. All your content should be created and stored on platforms such as content management systems (CMS), component content management systems (CCMS), design tools, product information management (PIM) solutions, etc. Once this technology is part of your content strategy, it should be connected to your localization technology, and that happens via connectors.
Connectors are an absolute necessity in the localization industry. Without a connector, you have to manually export files for translation, send them to the linguists via email and then import them back into your content system once translation is complete, verifying along the way that no coding or labeling issues affect the import. These are all manual actions, and each one delays the process: any manual action that could be automated subtracts value from it. For example, if you need to spend 5 or 6 hours selecting content from a Figma file for translation, separating the translatable content from the non-translatable, copying it into a file and sending it off for translation, you have used up the best part of a working day. Connectors automate this entire manual process, performing in moments tasks that take hours by hand. This is the most fundamental benefit of a connector, a value proposition that speaks for itself. Nevertheless, having a connector doesn’t mean it will never need upgrading.
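The “separating translatable from non-translatable content” step above is exactly the kind of work a connector automates. As a minimal sketch, assuming a hypothetical JSON export of a design file (the node structure below is illustrative, not Figma’s real API), a connector walks the node tree and collects only the text that should go to translation:

```python
import json

def extract_translatable(node, out):
    """Recursively collect text nodes, skipping those locked from translation."""
    if node.get("type") == "TEXT" and not node.get("locked", False):
        out.append({"id": node["id"], "text": node["characters"]})
    for child in node.get("children", []):
        extract_translatable(child, out)

# Invented sample export: one frame with a button label, a locked version
# string (non-translatable) and a shape with no text at all.
design_export = json.loads("""
{
  "id": "0", "type": "FRAME", "children": [
    {"id": "1", "type": "TEXT", "characters": "Buy now"},
    {"id": "2", "type": "TEXT", "characters": "v2.4.1-beta", "locked": true},
    {"id": "3", "type": "RECTANGLE"}
  ]
}
""")

strings = []
extract_translatable(design_export, strings)
print(strings)  # only the unlocked TEXT node is sent for translation
```

What takes a person hours of copy-and-paste runs in milliseconds here, and the same pass in reverse writes the translations back into the right nodes.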
Not all connectors support the processes that are key to an efficient localization program. For example, does your connector support continuous localization, i.e. the continuous updating of source files? Does it provide linguists with visual context or metadata, so they can translate and localize with all the information they need? If the answer to either question is “No”, you may need to upgrade your connectivity. With continuous localization, you can localize content updates regularly, with minimum disruption to the system and no human interaction required, reducing costs and keeping your content fresh and up to date. Without context (visual or metadata) for your linguists, you will inevitably have to add post-publication review steps, further delaying the process and increasing costs. These are just two examples, but they are two of the most important reasons why your connectivity needs to make your process more efficient, not hinder it.
Any feature that allows you to remove workflow steps from the process adds value to your program by saving time and costs. Assess your whole program and identify the manual steps that can be automated. If they could be automated but your technology won’t allow it, that is the clearest sign that you need to upgrade your technology stack.
With many companies now having a global presence, expansion into new markets is the natural next step for many of them. This expansion requires planning, and one of the most important factors in whether it is feasible and cost-efficient is whether your localization technology can adjust to it. Can it handle an exponential increase in output? Can it do so without incurring hidden costs such as word caps? Can you customize workflow steps on a language-by-language basis? In other words, is your localization technology scalable?
Scalability has become another non-negotiable characteristic of localization technology. Forbes describes scalability in technology as “a measure of how well a piece of software handles change in expected workload behavior situations or unexpected scenarios.” In other words, how well your technology adapts to changing circumstances with minimum problems. The ever-changing nature of the localization industry makes scalability especially important here: companies need to be prepared for their output volumes to fluctuate, with minimum disruption to the existing process, if any at all, and without costs shooting up. If your localization technology stack is not scalable, your localization costs will grow while you still have to meet your goals one way or another, which in most cases means adding extra human resources to do what your technology can’t.
For example, Company ABC decides to expand into 5 new markets and, as a result, will increase its output volume eightfold, since some of these markets have more than one official language. However, its localization technology has a word cap of 10,000 words per month and a limit of 10 users on its platform. These numbers were acceptable for Company ABC’s previous output volume, but fall considerably short of what its expansion will need. To meet its goals, the company has three options:
Pay more money to extend their current technology’s user limits and word cap.
Add human resources to their program to make up for the lack of technological resources.
Migrate to a scalable technology that can accommodate their growth.
Option 1 is a temporary, reactive solution that might solve the problem in the short term but not in the long term: the company will suffer the effects of its short-sightedness when it expands again in 12 months, or when it pulls out of some of its markets. Option 2 is the most reactive of all, and by extension the most inefficient. Adding human resources to your localization program not only hurts productivity and increases the risk of bottlenecks, it also hurts cost-efficiency, since you will effectively have more people doing the same amount of work. Option 3 solves the problem, and while it requires change and investment, the results are plain to see. Not only does it remove the immediate constraint, it can also accommodate whatever changes come in the future, with the added benefits of accelerated time to market and increased overall ROI, which we’ve seen go up by 80%.
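The gap Company ABC faces can be sketched with back-of-the-envelope arithmetic. The word cap, user cap and eightfold increase come from the example above; the assumption that current output already sits at the cap is ours:

```python
word_cap = 10_000            # words/month allowed by the current plan
user_cap = 10                # users allowed by the current plan

current_volume = 10_000      # assumed: output currently at the cap
projected_volume = current_volume * 8   # eightfold increase after expansion

# Monthly shortfall the current plan cannot absorb
overage = max(0, projected_volume - word_cap)
print(f"Projected volume: {projected_volume:,} words/month")
print(f"Words over the cap: {overage:,} per month")
```

Under these assumptions, seven-eighths of the projected monthly volume has nowhere to go, which is why a per-month overage fee (Option 1) compounds so quickly compared with migrating to a plan without caps (Option 3).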
Your localization costs should go down over time, despite global expansion, thanks to the optimization of your existing resources. If your costs remain static or are increasing, then you’re going the wrong way. Tap into new technological features out there such as Neural Fuzzy Augmented (NFA) or Artificial Intelligence (AI) to enhance your program’s cost-efficiency, because localization should not be perceived as a cost but rather as a business enhancer.
Business decisions should always be based on available data. Whether you are in the localization business or any other industry, making decisions that are not based on data means employing a reactive method rather than following a clearly defined strategy with goals. Data-driven decisions are widespread across all industries: according to MicroStrategy’s 2020 Global State of Enterprise Analytics survey, 78% of global companies believe they are using data and analytics effectively, and 60% of them use data and analytics to drive process and cost-efficiency. A data-driven approach not only has a huge impact on your program’s efficiency but also sets it up for long-term success. Data is vital to building a strategic program and holds the answers to your questions, which is why it’s important to have all the data you need within immediate reach, at the click of a button.
How to obtain this data is the natural follow-up question, and the answer is that your localization technology should give you all the data you need, and complete visibility of your program, to make your decisions: current and trending costs, current and average project turnaround times, vendor performance… You might say that this data has always been available to and used by companies, but putting it together used to require a lengthy process across different files, followed by a side-by-side comparison to draw conclusions. Now, with modern localization technology such as translation management systems, you can export all the project data you wish, going as granular as you like, and have it in a single place for easy analysis and added efficiency.
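Once project records live in one export rather than scattered files, the side-by-side comparison collapses into a few lines of analysis. A minimal sketch, using invented sample records (the field names are assumptions, not any specific TMS schema):

```python
from collections import defaultdict

# Invented per-project records, as they might appear in a TMS export
projects = [
    {"vendor": "VendorA", "words": 12_000, "turnaround_days": 4},
    {"vendor": "VendorA", "words": 8_000,  "turnaround_days": 3},
    {"vendor": "VendorB", "words": 10_000, "turnaround_days": 6},
]

# Aggregate totals per vendor in a single pass
totals = defaultdict(lambda: {"words": 0, "days": 0, "projects": 0})
for p in projects:
    t = totals[p["vendor"]]
    t["words"] += p["words"]
    t["days"] += p["turnaround_days"]
    t["projects"] += 1

for vendor, t in sorted(totals.items()):
    avg = t["days"] / t["projects"]
    print(f"{vendor}: {t['words']:,} words, avg turnaround {avg:.1f} days")
```

The same pattern scales to any dimension the export carries: language pair, content type, quality scores, cost per word.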
Data in localization is key: without it, decisions are made on an arbitrary basis, and that approach is unsustainable, maybe not today or in a month, but definitely in the short-term future. To return to a point made earlier: if your localization costs are static, one of the reasons could be that your TM is providing no value, since TM leverage should reduce costs. Your technology stack should be able to give you data on your current leverage and leverage trends, and from there you can identify opportunities to make it more efficient. Another example is vendor performance. A translation management system (TMS) should give you insight into all vendor activity, including total word counts, average turnaround time, quality-assurance assessments, etc., so you can compare vendors against one another, determine whether each is performing well or underperforming, and make decisions based on that information.
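The TM-leverage point can be made concrete: it is common industry practice to charge a fraction of the full per-word rate depending on the TM match band. The bands, discount rates and word counts below are illustrative assumptions, not any specific vendor’s pricing:

```python
rate_per_word = 0.10  # USD, full rate for new words (assumed)

# Fraction of the full rate charged per match band (assumed discounts)
band_rates = {"repetition": 0.10, "100%": 0.25, "fuzzy 75-99%": 0.60, "new": 1.00}

# Word counts per band from a hypothetical TM analysis of a 50,000-word job
analysis = {"repetition": 10_000, "100%": 15_000, "fuzzy 75-99%": 5_000, "new": 20_000}

full_cost = sum(analysis.values()) * rate_per_word
leveraged_cost = sum(words * band_rates[band] * rate_per_word
                     for band, words in analysis.items())

print(f"Without TM: ${full_cost:,.2f}")
print(f"With TM leverage: ${leveraged_cost:,.2f}")
print(f"Saving: {100 * (1 - leveraged_cost / full_cost):.1f}%")
```

If your leverage report showed the “new” band never shrinking from project to project, that would be exactly the kind of signal that the TM is providing no value and needs attention.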
The answers to your localization program questions should come from data, and having this data readily available at any time, as granular as you want it, is a vital step in the path to localization success. If your technology cannot provide it and you are having to resort to managing different files from different systems to export it and then manually comparing it, you need a tech stack upgrade.
Informed decisions are vital to your program’s success. Whether you are assessing current performance or determining whether your future needs and goals are feasible, every decision should be based on factual information. Having all this data immediately available not only lets you make the right decision, but lets you make it quickly and start applying improvements straight away, which is why you need technology that provides it.