The workspace automatically downloads external files via HTTP so that they can be read and processed locally. However, it downloads them to a non-unique path, so a second request for the same asset overwrites the first. If two processes request the same asset in parallel and the first process cleans up when it finishes, the file goes missing for the second operation. This would not happen if the download path were unique.
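As a minimal sketch of the proposed fix (the function name and prefix below are hypothetical, not part of the existing workspace code), each request could derive its own per-request directory, so parallel downloads of the same asset never collide and one process's cleanup cannot remove a file another process still needs:

```python
import tempfile
from pathlib import Path

def unique_asset_path(url: str) -> Path:
    # Hypothetical helper: every call gets a fresh temporary
    # directory, so two parallel requests for the same URL
    # resolve to two distinct local paths.
    workdir = Path(tempfile.mkdtemp(prefix="asset-"))
    # Keep the original file name inside the unique directory.
    return workdir / Path(url).name

# Two requests for the same asset no longer share a path:
p1 = unique_asset_path("http://example.com/data.csv")
p2 = unique_asset_path("http://example.com/data.csv")
assert p1 != p2 and p1.name == p2.name == "data.csv"
```

The actual download would then write to the returned path; deleting one process's directory leaves the other process's copy untouched.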
Re-downloading on each request may be slightly slower, but that is easily avoided by pulling the assets into the working file repository once at the beginning of the workflow.
Solving this issue will allow services to clean up properly, removing files from the workspace once they are done.