Replies: 8 comments
-
@NorthOC in short, yes there is! But there are many caveats. One of my projects is also suffering from a linear loop. That's why I started a discussion about asyncio. If I get votes or a positive response, I was thinking of combining asyncio with multiprocessing in this area specifically.
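A minimal sketch of the kind of concurrency hinted at above: running many blocking translator calls in parallel via asyncio and a thread pool. `translate_blocking` here is a hypothetical stand-in for the real Translator API call, not the library's actual function.

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

def translate_blocking(text: str) -> str:
    # Hypothetical stand-in for a blocking Translator.translate() call.
    return text.upper()

async def translate_all(texts, max_workers=8):
    # Fan the blocking calls out across a thread pool so they overlap
    # instead of running one after another.
    loop = asyncio.get_running_loop()
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        tasks = [loop.run_in_executor(pool, translate_blocking, t) for t in texts]
        return await asyncio.gather(*tasks)

results = asyncio.run(translate_all(["hello", "world"]))
print(results)  # ['HELLO', 'WORLD']
```

For CPU-bound work a `ProcessPoolExecutor` could be swapped in, but for network-bound translation calls threads are usually enough.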
-
@NorthOC you can find a speed-up sample here: #195 (comment)
-
Alright, I have found a way to speed up translations for large amounts of data. Basically, my solution was to keep a dict and check whether the same sentence had already been translated before calling the API again. This is way faster than calling the Translator API every time, even for the large datasets I worked on. The longer the program ran, the faster my translations became, due to the repetitive nature of language.
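The dict-based caching described above can be sketched like this. `translate_api` is a hypothetical placeholder for the real (slow) Translator call:

```python
def translate_api(sentence: str) -> str:
    # Hypothetical stand-in for the real, slow Translator API call.
    return sentence[::-1]  # placeholder "translation"

cache: dict[str, str] = {}

def translate_cached(sentence: str) -> str:
    # Only hit the slow API for sentences we have not seen before;
    # repeats are served straight from the dict.
    if sentence not in cache:
        cache[sentence] = translate_api(sentence)
    return cache[sentence]

translate_cached("hello")  # first call goes to the API
translate_cached("hello")  # second call is answered from the dict
```

For a single translation function, `functools.lru_cache` would give the same effect with less code; an explicit dict is handy if the cache needs to be persisted between runs.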
-
Large files are a pain in the ass to translate right now.
The current code for batch processing seems to do what any regular translator does: it translates one string after another, making O(n) sequential API calls.
Maybe there are ways to improve this?
For instance: joining the list items into one string blob with a specific separator, translating the whole batch in a single call, and then splitting it back into a list?
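The join-translate-split idea above could look roughly like this. The separator and `translate_blob` are assumptions for illustration: a real translator might reorder or alter the separator, so it would need to be chosen (and verified) carefully.

```python
# Separator chosen so a real translator is unlikely to merge or alter it (assumption).
SEP = "\n@@@\n"

def translate_blob(text: str) -> str:
    # Hypothetical stand-in for one call to the real translator.
    return text.upper()

def translate_batch(items: list[str]) -> list[str]:
    blob = SEP.join(items)            # many strings -> one API round trip
    translated = translate_blob(blob)
    return translated.split(SEP)      # split back into the original list shape

print(translate_batch(["one", "two", "three"]))  # ['ONE', 'TWO', 'THREE']
```

The win is that n short strings cost one network round trip instead of n; the risk is that the service mangles the separator or enforces a maximum payload size, so the batch size may need a cap.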