An idea:
Googlebot has no special access to your server; it retrieves files through the web server (Apache, in this case), and Apache runs under a specific user account on that server. On an NTFS file system you can set an explicit 'deny access' entry on the files you don't want indexed, targeted at that Apache service account. Normal access to the same files through file shares is unaffected, because those go through different user accounts. Of course, if the maintainers of these files try to open them through the web server, it will fail for them as well.
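A minimal sketch of what that looks like on Windows, assuming the Apache service runs as a local account named "ApacheSvc" and the protected content lives in C:\webroot\private (both names are placeholders; check the service's logon account in services.msc for the real one):

```shell
# Add an explicit "deny read" entry for the Apache service account,
# inherited by subfolders (CI) and files (OI) under the directory:
icacls "C:\webroot\private" /deny ApacheSvc:(OI)(CI)R

# To undo it later, remove the denied-rights entry again:
icacls "C:\webroot\private" /remove:d ApacheSvc
```

Because NTFS evaluates deny entries before allow entries, this blocks the web server even if the account is also a member of a group that is granted read access.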
Kinda brutal, but files the web server can't read are very hard to index, no matter what current or future trickery Google builds into Googlebot.
If you run Apache on a Linux server, file access management is usually easier, but the same caveat applies: a maintainer trying to reach the files through the company web server will be blocked on Linux too.
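On Linux the equivalent is a POSIX ACL that strips all permissions for the Apache user. This assumes Apache runs as "www-data" (the Debian/Ubuntu default; on Red Hat it is usually "apache" -- check the User directive in your Apache config) and that the files live under /var/www/private, both placeholders:

```shell
# Deny the Apache user any access, recursively, on top of the normal
# owner/group/other permissions:
setfacl -R -m u:www-data:--- /var/www/private

# Inspect the resulting ACL, and remove the entry again if needed:
getfacl /var/www/private
setfacl -R -x u:www-data /var/www/private
```

A named-user ACL entry takes precedence over the group and other permission bits, so www-data is locked out even if the files are world-readable.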
If Google can't index those pages, that will hurt their ranking in the Google search results. If that is not an issue or concern, you could consider the idea above.