
Create Local & Cloud copy of the same files on multiple computers & stay synced



Sounds weird but this is a "Real" Question. :wallbash:

I am hoping someone can offer some suggestions to help in an unusual situation.
I have 4 people who use a huge number of document files for two different programs.  Both programs need to use the same files but for different reasons for these 4 people.

Due to recent changes in ISP rules, we are no longer able to use network drive mapping as we have been.  Connecting by VPN works fine for everything except mapping a network drive to the remote folder on the server.  Cox has now joined Comcast and many others in blocking port 445 for HOME internet users.  If they pay for business Internet, the problem immediately disappears, so we pretty much narrowed it down very fast with that one fact.

Regardless of the reason, the vendor of the main program, which is a SQL database, said that many of their other users had switched to using CLOUD hosting to solve this problem.  This is due to the fact that their SQL relational database requires a fixed drive letter as a reference point.  When using the program to look up something, it is always referenced by storage at a mapped network drive.  As soon as these people go home, they no longer have access to a mapped drive on a business server.
The only thing that changed was that all the ISPs where this occurred admitted to blocking port 445.

The Cloud service they recommend has a solution for this and does allow me to map a drive letter to their CLOUD storage, which works very well for that program's needs.  Even though CLOUD access is "streaming" files from beginning to end, it is fast enough for that program as it deals with files on a one-by-one basis.  While these are all small files, there are over 50,000 in total and no way to make it any fewer; it will actually be more over time.  But in size, it is only about 10 GB.

These same people also need to be able to use these same files as documents to be uploaded to a website.  They need to be able to upload several files at a time, which means a long delay while the files stream down to their local drive before they can be uploaded to the website.  We quickly found out that if they tried to go direct from the CLOUD drive to the website, it failed as soon as the buffers filled up: the files appeared to be on their system ready to upload, but they were still in the stream coming down.

One possibility I considered is to give each of the 4 people a full “seed” copy of all the files and keep it Synced to the CLOUD copy at all times.  They would use the local copy for their website uploads and the CLOUD copy (which CAN be mapped as a drive letter) to do the work needed for the Program that must have a Drive Letter.
Interestingly enough, the server that hosts the SQL database program has no problem at all running over the VPN because port 445 is not needed.  Only Windows requires it for the SMB protocol in drive mapping, but SQL can run without the mapped drive; you just have no access to the documents referred to when you pull up a record.

But I can see all kinds of problems in synchronization that could well cause a disaster at some point.  On the good side, if it works, I would have 4 off-site back-up copies of the entire data drive should something happen to it at the CLOUD location.

If I was going to even try this at all, I would need a way to create a desktop display showing the total size, number of folders and number of files, along with a baseline showing the largest size the CLOUD drive has been and the largest number of files and folders.

This would provide an easy way for each to monitor their CLOUD drive compared to their corresponding local folder.  A large discrepancy in anything would indicate a problem so it could be resolved before it was a disaster.
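A minimal sketch of such a monitor, assuming both the local copy and the mapped CLOUD drive are visible as ordinary folder paths (the drive letters and paths below are hypothetical placeholders, not the real ones):

```python
import os

def folder_stats(root):
    """Walk a folder tree and return (total_bytes, folder_count, file_count)."""
    total_bytes = folder_count = file_count = 0
    for dirpath, dirnames, filenames in os.walk(root):
        folder_count += len(dirnames)
        file_count += len(filenames)
        for name in filenames:
            try:
                total_bytes += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # file vanished or unreadable mid-scan (e.g. still syncing)
    return total_bytes, folder_count, file_count

def report(local_root, cloud_root):
    """Print both sets of counts side by side so a large discrepancy stands out."""
    for label, root in (("LOCAL", local_root), ("CLOUD", cloud_root)):
        size, folders, files = folder_stats(root)
        print(f"{label}: {size / 1e9:.2f} GB, {folders} folders, {files} files")

if __name__ == "__main__":
    # Hypothetical paths -- substitute the real local folder and the mapped CLOUD drive.
    report(r"C:\Documents", r"Z:\Documents")
```

Run on a schedule (or pinned to the desktop via any launcher), this gives each user the size/folder/file comparison described above; tracking the historical high-water mark would just mean writing each run's numbers to a log file.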

While I am sure there has to be a better way, the only one that I have proven will work is for each person to pay for business Internet at Home. Then their ISP will remove the port 445 block and we can go back to the normal networked drive access over a VPN.

I am hoping someone here has run into something similar and may know another way around this.  The port 445 block was obviously to prevent "work from home" rather than the proclaimed "additional security from Internet worms," since blocking remote work is what it does best.

I look forward to seeing if anyone knows a better way to give both of these programs what they need.  I can tell you for 100% certain that the vendor of the software has already said they have no plans for changing their literal requirement of a common mapped drive letter for all users.  The "path" to each document is coded into their software and they are not going to change it.

They are not concerned about the fact that we need to use those same documents at all the same workstations to do a task unrelated to their own.
Currently, the users are all coming up with various ways to beat the problem by copying what they think they might need into various folders all over their laptops, and it is creating a disaster: who has the most current copy of any file, and where did they happen to put it?
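As a stopgap while copies are still scattered, a small script can at least answer "which copy is newest" by scanning the folders people have been dumping files into. A rough sketch, assuming the search roots are supplied by whoever runs it (the example path is hypothetical):

```python
import os

def newest_copies(roots):
    """Scan the given folder trees and, for each filename, keep the path
    with the most recent modification time."""
    newest = {}  # filename -> (mtime, path)
    for root in roots:
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    mtime = os.path.getmtime(path)
                except OSError:
                    continue  # unreadable or vanished file
                if name not in newest or mtime > newest[name][0]:
                    newest[name] = (mtime, path)
    return {name: path for name, (mtime, path) in newest.items()}

if __name__ == "__main__":
    # Hypothetical root -- point it at the folders the users have been copying into.
    for name, path in sorted(newest_copies([r"C:\Users"]).items()):
        print(f"{name}: newest copy at {path}")
```

This only compares by filename and timestamp, so it is a triage tool, not a sync solution; but it makes the "who has the current copy" question answerable in seconds instead of a hunt through the laptops.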

Stoic Joker:
Assuming you have an actual server, have you considered using the Remote Desktop Service (a.k.a. Terminal Services)? The session can connect to the local drive from the remote server, and the whole shebang connects over port 3389. This would also give everyone direct access to the same singular data set so there wouldn't be any need to fret about synchronization issues.

Also the (single purchase) TS licensing is a hell of a lot cheaper than a (recurring fee) business internet connection, and the hardware requirements for 4 users ain't too bad either.

Actually not a bad idea for the 4 users and no, I had not even thought of that.  Yes, it is a real Windows 2008 Server.  I have never had much use for Remote Desktop but I will sure look into it.  They don't use a "Domain" so I had gotten used to being totally powerless, as I cannot enforce all the usual restrictions a true Domain-level server would allow.  Since I don't, I have also not put much effort into finding out how much I could limit them as far as possibly trashing the server.
It could solve my problem with these 4 people IF I can give them ONLY access to those files and folders, and nothing else.
Their use of the files and folders when loading the sites is secondary to the primary reason for the files and folders (they are there as a resource for the SQL program, and if anything "broke," the loss of the SQL program's data would be a lot worse than the loss of access for 4 people from home).
These 4 are not "special" other than "Specially NOT knowing a thing about "How Things REALLY Work!"

Turning them loose inside the real server could be a bad idea.  I would have to lock them down to ONLY that single directory and ONLY being able to copy files from it as needed to upload to the website.  The files get uploaded in batches of 5 or 10 files to a mixture of areas, so they still need the "middle" area: copy to their drive, upload as needed, THEN delete the copies.  That is the other problem.  In doing it this way, they are "forgetting" that once they have uploaded the files, they need to delete them from their "piles" and start fresh ones.  They tend to give the folders all kinds of weird names and forget what they were.

They have been putting them everywhere.

One of them was so far off that HER favorite method of getting back to the website was to save the page she was on.
NOT save as a favorite or anything like that.  I mean actually SAVE the PAGE as HTML.  I found folders all over her desktop and asked her where she got them.  At first I didn't think much of it, as I thought they were the names of the specific sites she was working on; the whole name ran off the end.  Then I noticed all these Internet Explorer icons with the same names (again, all over her desktop).

When I asked what they were, she said that was how she gets back to where she was when she left.  So I looked inside the folders.  I almost could not believe what she had been doing!  These were literal COPIES of all the CODE needed to create the webpage. I am sure you know what I mean.  I had her click one of her "shortcuts" and looked at the address bar in IE.  Sure enough, she was inside the folder on her desktop.

Since the site is a password-protected one, that was probably all that saved her.  Even though it was a "copy" of an inside page, as soon as she connected, it forced her to log in.  The login routes you back to a specific page, so if you did not KNOW what you were seeing it might not even look so weird.  Thank God for hyperlinks!
I would be a bit worried about letting people like that work with the only REAL copies, so I would still have to create a mirror folder on the server, but it would be easier to keep that "COPY" folder updated for sure!

Port forwarding could be an option.  Version control software (Mercurial, Git, SVN or even the venerable CVS) works fine when multiple people work on the same set of files.  This type of software keeps track of who changed what and when, includes a server (using default ports that might or might not be blocked by the ISP), and is open source.

Once you get used to it (which doesn't take that long) and your users provide each commit with a meaningful description, you'll have no problem going back to a version of a file that works and there is no need to play the blame game anymore. You'll wonder how you or your organization could have ever worked without it.

The users do not need access to shared folders on the server this way and each user will have only one (1) copy of the files locally. Each committed change is stored in a folder on the server and the files from that folder can be used as you see fit.

I seriously advise you to get all the necessary heads around the concept behind these systems. 

If you have a Windows server and Windows clients, why aren't they simply establishing a VPN connection into it? Then a nice simple batch file (example: net use E: "\\server_name\share_name" /persistent:no) or powershell script that maps the drive(s) can be invoked by the user and all should be well.

Or am I just tired and missing something obvious... :huh:


ADDENDUM: I did miss it. Right on the OP. Sorry! :-[

