N.A.N.Y. 2019 / Re: NANY 2019: Shorthand 3_2
« Last post by Shades on February 05, 2019, 09:11 PM »
You couldn't be more wrong.
If batch files are your thing, you can find a lot of help and hints on Rob van der Woude's website.
Downloading? From a website? Or a personal cloud drive?
Because of the environment I inherited, a few batch scripts that are very important to the wheeling and dealing of this company are in my care, and I have expanded their functionality significantly.
Well, you could have the main batch script generate another batch script that handles the download from either a specific or a generic list of files, CALL it at the appropriate time in the main script, and, once the download is done, have the main script clean up the generated batch file. You will need the transfer tools you plan to use already in place on the necessary computer(s) to make that as "smooth" as possible.
But security-wise it is a flawed method. Not too big a deal when you retrieve stuff from your own cloud drive or something like that, but in a security-conscious environment that method is best considered a 'no-no'.
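A minimal sketch of that generate-CALL-delete idea (the file list name is invented, and the generated script only echoes a placeholder where your actual transfer tool would run):

```bat
@echo off
rem Sketch: the main script writes a helper download script, CALLs it,
rem then deletes it. "getfiles.txt" is a hypothetical list of files,
rem one per line; replace the echoed line with your transfer command.
set HELPER=%TEMP%\download_helper.cmd

> "%HELPER%" echo @echo off
for /f "usebackq delims=" %%F in ("getfiles.txt") do (
    >> "%HELPER%" echo echo downloading %%F
)

call "%HELPER%"
del "%HELPER%"
```

As noted above, embedding credentials or URLs in a generated script on disk is exactly why this approach is a 'no-no' in security-conscious environments.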
One element of that script is the transfer of software builds from a computer in Europe to South America where I reside.
Does such a scenario apply to your request? If so, I'll explain what I did. First, I built a server from old PC parts lying around, put Ubuntu Server LTS on it, installed Webmin and NextCloud, and made that server face the Internet. From the other end I tried several different protocols (FTP, FTPS, SFTP, SCP, WebDAV) to see which was the fastest method of transfer. In my case that was WebDAV (and by a really big margin), so the choice for NextCloud on my end was easily made.
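For a rough idea of what a WebDAV upload to NextCloud looks like from a batch script (server name, user, file name and password variable are all made up here; this assumes the curl that ships with recent Windows 10/11):

```bat
@echo off
rem Hypothetical example: push a build archive to a NextCloud WebDAV share.
rem NextCloud exposes user files over WebDAV at /remote.php/dav/files/<user>/
set SERVER=cloud.example.com
set USER=builds

curl --fail --user %USER%:%CLOUD_PASS% ^
     --upload-file build-2019-02-05.zip ^
     "https://%SERVER%/remote.php/dav/files/%USER%/builds/"
if errorlevel 1 echo Upload failed.
```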
On the other end I had to maintain a batch script that makes checkouts from a CVS server. CVS was already old software in 1995, but it is still in use and the people working with it really do not want to change. On that end I really had no option other than to continue with batch. Anyway, the checkout is done by batch, the building process is now also done using batch, and then my transfer batch script comes into play.
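The checkout step itself is short in batch; a sketch like this (server, repository path, tag and module name are all invented for illustration):

```bat
@echo off
rem Hypothetical CVS checkout step driven from batch.
set CVSROOT=:pserver:builder@cvs.example.com:/var/lib/cvsroot

cvs -d %CVSROOT% login
cvs -d %CVSROOT% checkout -r RELEASE_2019_02 our_product
if errorlevel 1 (
    echo CVS checkout failed, aborting build.
    exit /b 1
)
```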
There is no really nice way to do transfers with command-line tools. I did find a tool called CarotDAV, which is dead simple to use for transferring files manually through the Windows GUI. It really is, and you can also use it to connect to Google Drive, Dropbox, OneDrive, etc. But the only software I could find with command-line support was WinSCP. That is a pretty powerful piece of software for transferring files, but it sure isn't fast. It was faster to log into the server and make a manual transfer with CarotDAV than to do it by batch script with WinSCP.
In my situation CarotDAV always finished within 10 to 15 minutes, while WinSCP (which supports the FTP, SCP and WebDAV protocols) would always take between an hour and an hour and a half. But some 10 months ago a new build of CarotDAV came out, which now also has command-line support (and a progress meter).
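For reference, a WinSCP scripted transfer from batch looks roughly like this (host, user, paths and the password variable are invented; winscp.com is WinSCP's console interface, and davs:// is its URL scheme for WebDAV over TLS):

```bat
rem Hypothetical WinSCP scripted upload over WebDAV.
winscp.com /ini=nul /log=transfer.log /command ^
    "open davs://builder:%CLOUD_PASS%@cloud.example.com/" ^
    "put build-2019-02-05.zip /builds/" ^
    "exit"
if errorlevel 1 echo Transfer failed, see transfer.log
```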
So now, when a build is started, it is checked out, archived, put in a local repository, transferred to my environment, and placed in my local repository plus one specific other folder for automatic regression testing. (That testing is done by another script that is not batch, because of "dog fooding", and with that specific script language it is easy to generate reports from the huge battery of regression tests, which also need to be sent by mail to several people.) And when the transfer has been successful, the transfer script also generates an HTML mail message with build-specific content for several people. All automagically, without any further user interaction besides a person initiating the build.
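Batch itself cannot send mail, so a notification step like that has to shell out to something else. One possible sketch (addresses and the SMTP server are made up) writes the HTML body to a file and hands it to PowerShell's Send-MailMessage cmdlet:

```bat
@echo off
rem Hypothetical notification step: build an HTML body, mail it via
rem PowerShell. The ^< and ^> escapes keep cmd from treating the angle
rem brackets as redirection.
> mail.html echo ^<html^>^<body^>^<h1^>Build ready^</h1^>^<p^>Transfer OK.^</p^>^</body^>^</html^>

powershell -NoProfile -Command "Send-MailMessage -SmtpServer 'smtp.example.com' -From 'build@example.com' -To 'team@example.com' -Subject 'New build available' -BodyAsHtml -Body (Get-Content mail.html -Raw)"
```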
What I can tell you is that no one should ever want to use batch scripts in this way. It is messy script code at best, and it has cost me way too much time troubleshooting to make it reliable. Development and troubleshooting all had to take place in a text editor and the command-line box that comes with Windows; no IDE of any kind.
You'll learn to appreciate the Linux command line shells or PowerShell from MS so much more after an exercise like this. Be sure of that.
