
Topic: NANY 2019: Shorthand 3_2  (Read 2718 times)

Maestr0

  • Participant
  • Joined in 2010
  • Posts: 19
Re: NANY 2019: Shorthand 3_2
« Reply #25 on: February 05, 2019, 03:25 PM »
Shades, you seem to be a batch file guru. Is there a failsafe way to download files from within a batch file? Bitsadmin is deprecated, sadly :(

Shades

  • Member
  • Joined in 2006
  • Posts: 2,474
Re: NANY 2019: Shorthand 3_2
« Reply #26 on: February 05, 2019, 09:11 PM »
You couldn't be more wrong.

If batch files are your thing, I found a lot of help and/or hints on Rob van der Woude's website.

Downloading? From a website? Or a personal cloud drive?

Because I inherited the environment, a few batch scripts that are very important to the wheeling and dealing of this company are in my care, and I have expanded their functionality significantly.

One element of that script is the transfer of software builds from a computer in Europe to South America, where I reside.

Does such a scenario apply to your request? If so, here is what I did. First, I made a server from old PC parts lying around, put Ubuntu Server LTS on it, installed WebMin and NextCloud, and made that server face the Internet. On the other end I tried several different protocols (FTP, FTPS, SFTP, SCP, WebDAV) to see which was the fastest method of transfer. In my case that was WebDAV (by a really big margin), so the choice of NextCloud on my end was easily made.

On the other end I had to maintain a batch script that makes checkouts from a CVS server, software that was already old in 1995 but is still in use, and the people working with it really do not want to change. On that end I really had no option but to continue with batch. Anyway, the checkout is done by batch, the build process is now also done with batch, and then my transfer batch script comes into play.

There is no nice way to do transfers with command-line tools. I did find a tool called CarotDAV, which is dead simple to use for transferring files manually through the Windows GUI; you can also use it to connect to Google, DropBox, OneDrive, etc. But the only software I could find with command-line support was WinSCP. That is a pretty powerful piece of software for transferring files, but it sure isn't fast: it was faster to log into the server and make a manual transfer with CarotDAV than to do it by batch script with WinSCP.

In my situation CarotDAV always finished in 10 to 15 minutes, while WinSCP (which supports all of the FTP, SCP and WebDAV protocols) would always take between an hour and an hour and a half. But some 10 months ago a new build of CarotDAV came out that now also has command-line support (and a progress meter).
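For reference, a scripted WinSCP transfer driven from batch looks roughly like this. This is a minimal sketch: the protocol, host, credentials and paths are placeholders, not the actual setup described above.

```bat
@echo off
rem Hypothetical one-shot WinSCP transfer; winscp.com must be on the PATH.
winscp.com /command ^
    "open sftp://user:password@server.example.com/" ^
    "put C:\builds\build.zip /incoming/" ^
    "exit"
if errorlevel 1 echo Transfer failed.
```

WinSCP's `/command` switch runs the quoted script commands in order and exits with a non-zero errorlevel on failure, which the batch script can check.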

So now, when a build is started, it is checked out, archived, put in a local repository, transferred to my environment, and put in my local repository and in a specific other folder for automatic regression testing. That testing is done by another script that is not batch (because of "dog fooding"), and with that specific script language it is easy to generate reports from the huge battery of regression tests, which also need to be sent by mail to several people. When the transfer has been successful, the transfer script also generates an HTML mail message with build-specific content for several people. All automagically, without any further user interaction besides a person initiating the build.

What I can tell you is that no-one should ever want to use batch scripts in this way. It is messy script code at best and has cost me way too much troubleshooting time to make it reliable. Development and troubleshooting all had to take place in a text editor and the command-line box that comes with Windows. No IDE of any kind :mad:

You'll learn to appreciate the Linux command line shells or PowerShell from MS so much more after an exercise like this. Be sure of that.


Well, you could have the main batch script generate another batch script that handles the download from either a specific or generic list of files, CALL it at the appropriate time in the main script, and when the download is done, have the main batch script clean up the generated batch file. You will need the transfer tools you plan to use already in place on the necessary computer(s) to make that as "smooth" as possible.
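A minimal sketch of that generate-CALL-delete pattern, assuming curl is used for the actual download (it ships with Windows 10 since version 1803); the URLs and file names are made up:

```bat
@echo off
rem Main script: write a helper batch file that does the downloads.
set "HELPER=%TEMP%\get_files.bat"
> "%HELPER%" echo @echo off
>>"%HELPER%" echo curl -L -o file1.zip "https://example.com/file1.zip"
>>"%HELPER%" echo curl -L -o file2.zip "https://example.com/file2.zip"

rem Run the generated helper at the appropriate time, then clean it up.
call "%HELPER%"
del "%HELPER%"
```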

But security-wise it is a flawed method. Not too big a deal when you retrieve stuff from your own cloud drive or something like that, but in a security-conscious environment that method is best considered a 'no-no'.


Maestr0

  • Participant
  • Joined in 2010
  • Posts: 19
Re: NANY 2019: Shorthand 3_2
« Reply #27 on: February 06, 2019, 04:39 AM »
Thanks for the info, Shades! I'll look into it.

My current idea is to have the batch file check for (and download as needed) the #include files the main script needs. That can't be done in the main script itself, because it throws an error when it doesn't find an include.
If I can't reliably get the batch file to fetch the required include files, I'll use a script to check for the required files and then run the main script.

Oh, and yeah, the downloaded files are from my public dropbox folder.
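That check-and-fetch step could be sketched in batch like this. The file names and the Dropbox link are hypothetical; note that a Dropbox share link needs `?dl=1` appended to return the raw file instead of the preview page.

```bat
@echo off
rem Download the include file only when it is missing.
if not exist "helpers.ahk" (
    curl -L -o "helpers.ahk" "https://www.dropbox.com/s/PLACEHOLDER/helpers.ahk?dl=1"
)

rem Refuse to start the main script if the include is still missing.
if not exist "helpers.ahk" (
    echo Could not get helpers.ahk, aborting.
    exit /b 1
)
start "" "main.ahk"
```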

Shades

  • Member
  • Joined in 2006
  • Posts: 2,474
Re: NANY 2019: Shorthand 3_2
« Reply #28 on: February 06, 2019, 07:27 PM »
IF EXIST / IF NOT EXIST

That should help you find out whether a file exists or not and apply whatever conditions you like in the main batch script. If, for instance, you have a (short) static list of include files, you could dedicate a section of your main batch script to generating a secondary batch script containing the files that are there to copy. Then let the main batch script call this secondary script, which copies all the existing files, and when the secondary batch script is finished, let the main batch script delete it.
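Sketched in batch, with made-up file names and target folder, that could look like:

```bat
@echo off
rem Generate a secondary script that copies only the files that exist.
set "COPYSCRIPT=%TEMP%\copy_includes.bat"
> "%COPYSCRIPT%" echo @echo off
for %%F in (helpers.txt menus.txt hotkeys.txt) do (
    if exist "%%F" (>>"%COPYSCRIPT%" echo copy "%%F" "C:\target")
)

rem Run the secondary script, then let the main script delete it.
call "%COPYSCRIPT%"
del "%COPYSCRIPT%"
```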

I do that to generate my HTML mail message, which is deleted after the command-line mail client is done sending it. And as long as you don't need passwords, it is a pretty handy way of doing things.

And then you'll find out that you can achieve the same in PowerShell (or the Linux shells), because those have much better support for pipes.   :D