One of the biggest impacts on the language services industry in the 2010s was the growing use and accuracy of machine-generated transcriptions and translations. Companies, freelancers, and consumers can use software (like Google Translate) for free or very cheaply to get language services, and many companies are developing their own software. Even here at Atomic Scribe we’ve introduced an automatic transcription option at $1 per audio minute that combines human quality with machine efficiency.
The good news is that this is speeding up services, and much of the software is widely accessible. The bad news is that accuracy for most files and texts is nowhere near 100%, which is especially worrisome when the software is used on medical records, legal documents, or other files that must be correct. Software also struggles to provide nuance and localization rather than a direct translation of content, which can cause problems (as when Norway’s Olympic team mistakenly ordered 15,000 eggs because of a mistranslation).
So what can we expect next? The use of such services will surely rise, though some fear that computers will erase humans from the equation entirely in this service sector. Fortunately, machines that are 100% accurate on every transcription and translation are far from a reality, and humans are still needed to ensure accuracy. In the meantime, we can leverage the strengths of this software to help human workers perform better and more efficiently.