
Author Topic: Create Local & Cloud copy of the same files on multiple computers & stay synced  (Read 5246 times)

questorfla

  • Supporting Member
  • Joined in 2012
  • Posts: 570
  • Fighting Slime all the Time

Sounds weird but this is a "Real" Question. :wallbash:

I am hoping someone can offer some suggestions to help in an unusual situation.
I have 4 people who use a huge number of document files for two different programs.  Both programs need to use the same files but for different reasons for these 4 people.

Due to recent changes in ISP rules, we are no longer able to use network drive mapping as we have been.  Connecting by VPN works fine for everything except mapping a network drive to the remote folder on the server.  Cox has now joined Comcast and many others in blocking port 445 for HOME internet users.  If they pay for business Internet, the problem immediately disappears, so we pretty much narrowed it down very fast with that one fact.

Regardless of the reason, the vendor of the main program, which is a SQL database, said that many of their other users had switched to using CLOUD hosting to solve this problem.  Their SQL relational database requires a fixed drive letter as a reference point.  When the program looks something up, the document is always referenced by its location on a mapped network drive.  As soon as these people go home, they no longer have access to a mapped drive on a business server.
The only thing that changed was that all the ISPs where this occurred were admitting to blocking port 445.

The Cloud service they recommend has a solution for this and does allow me to map a drive letter to their CLOUD storage, which works very well for that program's needs.  Even though CLOUD access is "streaming" files from beginning to end, it is fast enough for that program as it deals with files on a one-by-one basis.  While these are all small files, there are over 50,000 in total and no way to make it any less; it will actually be more over time.  But in size, it is only about 10 GB.

These same people also need to be able to use these same files as documents to be uploaded to a website.  They need to be able to upload several files at a time, which requires a long delay to get them streamed first to their local drive before they can upload them to the website.  We quickly found out that if they tried to go directly from the CLOUD drive to the website, it failed as soon as the buffers filled up: the files were not yet fully on their systems because they were still in the stream coming down.

One possibility I considered is to give each of the 4 people a full “seed” copy of all the files and keep it Synced to the CLOUD copy at all times.  They would use the local copy for their website uploads and the CLOUD copy (which CAN be mapped as a drive letter) to do the work needed for the Program that must have a Drive Letter.
Interestingly enough, the Server that hosts the SQL database program has no problem at all running over the VPN, because port 445 is not needed.  Only Windows requires it for the SMB protocol used in drive mapping, but SQL can run without the mapped drive.  You just have no access to the documents referred to when you pull up a record.

But I can see all kinds of problems in synchronization that could well cause a disaster at some point.  On the good side, if it works, I would have 4 off-site back-up copies of the entire data drive should something happen to it at the CLOUD location.

If I was going to even try this at all, I need a way to create a desktop display showing the total size, number of folders and number of files, along with a baseline showing the largest size the CLOUD drive has been and the largest number of files and folders.

This would provide an easy way for each to monitor their CLOUD drive compared to their corresponding local folder.  A large discrepancy in anything would indicate a problem so it could be resolved before it was a disaster.
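Something like this rough PowerShell sketch is what I am picturing (the two paths are placeholders for the local seed folder and the mapped CLOUD drive); it totals the size, folder count and file count for each so the two copies can be compared at a glance:

  # Sketch only - compare a local seed folder against the mapped CLOUD drive.
  # $localPath and $cloudPath are placeholders; point them at the real locations.
  $localPath = 'D:\Documents'
  $cloudPath = 'Z:\'

  function Get-FolderStats($path) {
      $items   = Get-ChildItem -Path $path -Recurse -ErrorAction SilentlyContinue
      $files   = @($items | Where-Object { -not $_.PSIsContainer })
      $folders = @($items | Where-Object { $_.PSIsContainer })
      New-Object PSObject -Property @{
          Path        = $path
          FileCount   = $files.Count
          FolderCount = $folders.Count
          SizeGB      = [math]::Round(($files | Measure-Object -Property Length -Sum).Sum / 1GB, 2)
      }
  }

  Get-FolderStats $localPath
  Get-FolderStats $cloudPath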

While I am sure there has to be a better way, the only one that I have proven will work is for each person to pay for business Internet at Home. Then their ISP will remove the port 445 block and we can go back to the normal networked drive access over a VPN.

I am hoping someone here has run into something similar and may know another way around this.  The port 445 block was obviously meant to prevent "work from home" rather than to provide the proclaimed "additional security from Internet worms," since blocking remote work is what it does best.

I look forward to seeing if anyone knows a better way to give both of these programs what they need.  I can tell you for 100% certain that the vendor of the software has already said they have no plans to change their literal requirement of a common mapped drive letter for all users.  The "path" to each document is coded into their software and they are not going to change it.

They are not concerned about the fact that we need to use those same documents at the same workstations to do a task unrelated to their own.
Currently, the users are all coming up with various ways to beat the problem by copying what they think they might need into various folders all over their laptops, and it is creating a disaster as to who has the most current copy of any file, as well as where they happened to put it.


Stoic Joker

  • Honorary Member
  • Joined in 2008
  • Posts: 6,646
Assuming you have an actual server, have you considered using the Remote Desktop Service (a.k.a. Terminal Services)? The session can connect to the local drive from the remote server, and the whole shebang connects over port 3389. This would also give everyone direct access to the same singular data set so there wouldn't be any need to fret about synchronization issues.

Also the (single purchase) TS licensing is a hell of a lot cheaper than a (recurring fee) business internet connection, and the hardware requirements for 4 users ain't too bad either.
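Client side it's nothing fancy either - a saved .rdp file with drive redirection turned on is all they'd need.  A rough sketch (the server name and user are placeholders, and it assumes the stock mstsc client):

  # Sketch: write a minimal .rdp file with local drive redirection enabled,
  # then launch it with the stock mstsc client.  Server and user are placeholders.
  $rdpFile = "$env:USERPROFILE\Desktop\work.rdp"
  Set-Content -Path $rdpFile -Value @(
      'full address:s:yourserver.example.com',
      'username:s:jdoe',
      'redirectdrives:i:1',
      'drivestoredirect:s:*'
  )
  mstsc $rdpFile

Inside the session their local drives show up under \\tsclient, so copying a batch of files back down for the website upload is just drag and drop.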

questorfla

  • Supporting Member
  • Joined in 2012
  • Posts: 570
  • Fighting Slime all the Time
Actually not a bad idea for the 4 users and no, I had not even thought of that.  Yes, it is a real Windows 2008 Server.  I have never had much use for Remote Desktop but I will sure look into it.  They don't use a "Domain", so I had gotten used to being totally powerless, as I cannot enforce all the usual restrictions a true Domain-level server would allow.  Since I don't, I have also not put much effort into finding out how much I could limit them as far as possibly trashing the Server.
It could solve my problem with these 4 people IF I can give them ONLY access to those files and folders, and nothing else.
Their use of the files and folders when loading the sites is secondary to the primary reason for the files and folders (they are there as a resource for the other SQL program), and if anything "broke", the loss of the SQL program's data would be a lot worse than the loss of access for 4 people from home.
These 4 are not "special", other than specially NOT knowing a thing about "How Things REALLY Work!"

Turning them loose inside the real server could be a bad idea.  I would have to lock them down to ONLY that single directory and ONLY being able to copy files from it as needed to upload to the website.  The files get uploaded in batches of 5 or 10 files to a mixture of areas, so they still need the "middle" area.  As in: copy to their drive, then upload as needed, THEN delete the copies.  That is the other problem.  In doing it this way, they are "forgetting" that once they have uploaded the files, they need to delete them from their "piles" and start fresh ones.  They tend to give them all kinds of weird folder names and forget what they were.

They have been putting them everywhere.

One of them was so far off that HER favorite method of getting back to the website was to save the page she was on.
NOT save it as a favorite or anything like that.  I mean actually SAVE the PAGE as HTML.  I found folders all over her desktop and asked her where she got them.  At first I didn't think much of it, as I thought they were the names of the specific sites she was working on; the whole name ran off the end.  Then I noticed all these Internet Explorer icons with the same names (again, all over her desktop).

When I asked what they were, she said that was how she gets back to where she was when she left.  So I looked inside the folders.  I almost could not believe what she had been doing!  These were literal COPIES of all the CODE needed to create the webpage. I am sure you know what I mean.  I had her click one of her "shortcuts" and looked at the address bar in IE.  Sure enough, she was inside the folder on her desktop.

Since the site is a password-protected one, that was probably all that saved her.  Even though it was a "copy" of an inside page, as soon as she connected, it forced her to log in.  The login routes you back to a specific page so... if you did not KNOW what you were seeing, it might not even look so weird.  Thank God for hyperlinks!
I would be a bit worried about letting people like that work with the only REAL copies, so I would still have to create a mirror folder on the server, but it would be easier to keep that "COPY" folder updated for sure!

Shades

  • Member
  • Joined in 2006
  • Posts: 2,922
Port forwarding could be an option. Version control software (Mercurial, Git, SVN or even the venerable CVS) works fine when multiple people work on the same set of files. This type of software keeps track of who changed what and when, includes a server (using default ports that might or might not be blocked by the ISP), and all of them are open source.

Once you get used to it (which doesn't take that long) and your users provide each commit with a meaningful description, you'll have no problem going back to a version of a file that works and there is no need to play the blame game anymore. You'll wonder how you or your organization could have ever worked without it.

The users do not need access to shared folders on the server this way and each user will have only one (1) copy of the files locally. Each committed change is stored in a folder on the server and the files from that folder can be used as you see fit.
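To give an idea of how little the users actually have to learn, here is roughly what the daily routine looks like with Git (the repository URL and file name are placeholders):

  # One time only: get a local working copy (repository URL is a placeholder)
  git clone https://server.example.com/documents.git
  cd documents

  # Daily routine: pull the latest changes, edit files, then record your own
  git pull
  git add "contracts/client-1234.docx"
  git commit -m "Updated client 1234 contract after review"
  git push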

I seriously advise you to get all the necessary heads around the concept behind these systems. 

40hz

  • Supporting Member
  • Joined in 2007
  • Posts: 11,857
If you have a Windows server and Windows clients, why aren't they simply establishing a VPN connection into it? Then a nice simple batch file (example: net use E: "\\server_name\share_name" /persistent:no) or powershell script that maps the drive(s) can be invoked by the user and all should be well.
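Something as small as this PowerShell version would do it (server_name and share_name are the placeholders from above; run it once the VPN is up):

  # Sketch of a logon script: once the VPN is connected, map the share to the
  # fixed drive letter the SQL program expects, without saving the mapping.
  net use E: /delete 2>$null
  net use E: "\\server_name\share_name" /persistent:no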

Or am I just tired and missing something obvious... :huh:

---------------------

ADDENDUM: I did miss it. Right in the OP. Sorry!
:-[
« Last Edit: April 11, 2014, 11:28 AM by 40hz »

Stoic Joker

  • Honorary Member
  • Joined in 2008
  • Posts: 6,646
If you have a Windows server and Windows clients, why aren't they simply establishing a VPN connection into it? Then a nice simple batch file (example: net use E: "\\server_name\share_name" /persistent:no) or powershell script that maps the drive(s) can be invoked by the user and all should be well.

Or am I just tired and missing something obvious... :huh:

---------------------

ADDENDUM: I did miss it. Right in the OP. Sorry!
:-[

Actually that one had me baffled out of the gate as well. How the hell is the ISP blocking port 445 inside the tunnel? Raw traffic to the web sure...I can almost understand that one ... But inside the tunnel? ...That's just mean.



Turning them loose inside the real server could be a bad idea.  I would have to lock them down to ONLY that single directory and ONLY being able to copy files from it as needed to upload to the website.  The files get uploaded in batches of 5 or 10 files to a mixture of areas, so they still need the "middle" area.  As in: copy to their drive, then upload as needed, THEN delete the copies.  That is the other problem.  In doing it this way, they are "forgetting" that once they have uploaded the files, they need to delete them from their "piles" and start fresh ones.  They tend to give them all kinds of weird folder names and forget what they were.

They have been putting them everywhere.

Two other handy technologies for stuff like this are the Distributed File System (DFS) and Shadow Copies/Previous Versions. DFS allows you to control access to the file system by only displaying the targets you want to be seen, instead of the whole drive. Granted, NTFS permissions can/will keep them out of stuff too, but I find it's better to keep a user's options as narrow as possible so they don't get lost/tempted/curious/etc. DFS can also provide access to discontiguous locations in a single virtual space. So even if the files were scattered across 9 different drives and servers, they could still access the allowed portions of all of them from a single drive mapping. I leveraged the capabilities of DFS to decommission our old file server during business hours, and while 20 people were in and out of the system all day long... nobody noticed the transition. Also, the DFS roots aren't writable, which is a beautifully simple way of enforcing cleanliness.
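For reference, a bare-bones namespace is only a couple of commands. This sketch assumes the DFSN PowerShell module that ships with Server 2012 and later (on 2008 the DFS Management console or dfsutil does the same job), and every server/share name in it is a placeholder:

  # Sketch: stand-alone DFS namespace that exposes only one folder to the users.
  # FILESRV, DocsRoot and SQLDocs are placeholders for the real server and shares.
  New-DfsnRoot -Path '\\FILESRV\Docs' -TargetPath '\\FILESRV\DocsRoot' -Type Standalone
  New-DfsnFolder -Path '\\FILESRV\Docs\ClientFiles' -TargetPath '\\FILESRV\SQLDocs'

  # Users then map the one letter they need and never see the rest of the drive:
  net use E: '\\FILESRV\Docs' /persistent:no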

On a side note: most of the big multi-function printer/copiers these days have a feature that automatically deletes files scanned from them to a share once they are older than X time period. I'm wondering if there is something like that for file servers ... 40hz, ideas? I'd hate to have to write the thing myself ... but it is kind of tempting (in a sick, evil, fun sort of way). *Shrug* Back on topic!
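If I ever do give in to temptation, it's probably not much more than this sketch run as a nightly scheduled task (the share path and the 7-day cutoff are placeholders):

  # Sketch: prune anything in the staging share that hasn't been touched in 7 days.
  # $share and the cutoff are placeholders - adjust before pointing at real data.
  $share  = '\\FILESRV\Staging'
  $cutoff = (Get-Date).AddDays(-7)

  Get-ChildItem -Path $share -Recurse |
      Where-Object { -not $_.PSIsContainer -and $_.LastWriteTime -lt $cutoff } |
      Remove-Item -Force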

Previous Versions uses/is part of the same Windows System Restore feature we all know and occasionally love or hate, depending on how well it's working that day. When enabled (by default), it takes a snapshot of the drive every 12 hours. So if something gets deleted, it can be restored on the fly from the Previous Versions tab of the parent folder's properties dialog. The snapshot interval is configurable, but it isn't recommended to take one more often than once an hour. I usually either go with the default or bump it to 3 times a day. This is also quite handy for those odd moments when somebody deleted something yesterday, so the previous night's backup media is already off site - and 20+ miles away - yet somebody important needs file X right freakin' now.
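And if the default schedule isn't enough, an extra snapshot time can be added with a plain scheduled task; a sketch (the drive letter and time are placeholders, and it assumes a server SKU where vssadmin can create shadows, run from an elevated prompt):

  # Sketch: add a midday snapshot of the data volume for Previous Versions to use.
  schtasks /create /tn "ShadowCopy_D_Noon" /ru SYSTEM /sc daily /st 12:00 /tr "vssadmin create shadow /for=D:"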

40hz

  • Supporting Member
  • Joined in 2007
  • Posts: 11,857
Ok...from SJ's comment above maybe I didn't miss something?

Is it really possible for your ISP to block a port inside a VPN connection? This is a Microsoft-to-Microsoft VPN connection with no 3rd party involved - so the ISP shouldn't be able to see into it at all. That's the whole point of that "P" in VPN.

If they can, that's a new one for me.  :huh:

This sounds more like a policy or firewall issue on the server side.
« Last Edit: April 11, 2014, 06:32 PM by 40hz »

Innuendo

  • Charter Member
  • Joined in 2005
  • Posts: 2,266
That's the whole point of that "P" in VPN.

Please obey the rules when using our VN. There's no P in it and we'd like to keep it that way.

(Modern take on the old pool joke.)


Well...I thought it was funny. ;)

questorfla

  • Supporting Member
  • Joined in 2012
  • Posts: 570
  • Fighting Slime all the Time
http://customer.comc...st-of-blocked-ports/
for 40hz and company:  Yes, they can and do.  And I was busy proving it years ago, before they started publishing the fact on their website.  I would love to think my persistent pestering actually caused them to "fess up".  Not that it did any good, but at least now I have proof I can point to instead of people telling me I am a looney case and that the problem is I don't know how to create a VPN.

I tried every possible trick, every possible mixture of protocols, every port forward or port "backward" or Ports AHOY!.

Nothing worked.  If they were using certain ISPs, they had no ability to map to a folder.
On other ISPs the same exact procedure worked like a charm.  What got really weird was when SOME of the cellular routers began to do the same thing.
Not all of them, and not necessarily "by the provider" - it was a software thing.
They could be working one day, get a software update on their cellular router (MiFi or whatever your favorite name for them is) and AFTER that update: no mapping.  Nothing else is affected, and the ONLY way you can prove anything is to run port-test software on the USER's system.  It is blocked from them going OUT, not me coming IN, so I can't even see the block.  And none of them was enough of a tech to have such software, let alone know how to use it.

I finally had the "light-bulb" moment and connected to a system at their home as a remote user and ran the software for them from their end.  Sure enough, Port 445 TCP was dead as a doornail!

But on MY end it looks like they should have NO problem and the VPN will connect perfectly.
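The port test itself boils down to something like this little sketch (the server name is a placeholder): a plain TCP connect to 445 that times out on the blocked ISPs and works everywhere else.

  # Sketch of the port 445 check run from the user's machine; server name is a placeholder.
  $client = New-Object System.Net.Sockets.TcpClient
  try {
      $client.Connect('server.example.com', 445)
      'Port 445 TCP is reachable'
  } catch {
      'Port 445 TCP is blocked or filtered'
  } finally {
      $client.Close()
  }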

What is really "Evil" about this is that their "claim of security" holds no water when you think about the fact that if you PAY for a business connection, suddenly they don't CARE if you get "infected"?  I guess they only want to protect "Home Users"?? :-\  And apparently only from the USER getting OUT, not from the "threat" getting IN.

This same policy is now used by almost every large ISP.  I know of only a few who still let all traffic flow freely and I bet it is only a matter of time before they start the same racket.  All about  $$.