Friday April 10, 2020, 1:51 am

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - questorfla

Pages: [1] 2 3 4 5 6 ... 23
This would need to work on Windows 10 x64 and doesn't need to provide encryption.  This is a case where I need a simple way to put 12 to 15 documents in a folder and lock it such that it would ask for a password to access the files. 

In searching, I have located a number of these, ranging from "Folder Locker" to a few that appear to not even be around anymore.  Some have many additional features included, but in this case that would be worse rather than better.  As long as I can trust the lock, simpler is better.

Now the conditions:   I need it to be something where I could lock the folders on my system and post them for download on a website by a group of people.  The group would be downloading several normal folders of files as well but a few of them would be marked as needing a password and only a few of the group would know that password.   

Some of the "Folder Lock" software I have found appears to be limited to locking folders only on my own system, which would not help.  Nor do I want to hide the folder names; they need to be visible to all but locked to those without the password.  I need the group to be able to download all the folders, but certain folders need to be locked with a password.  Not necessarily encrypted, though, as I have had problems in the past with encrypted files not working properly when they are no longer on the system where they were originally encrypted.

This also has to be easy to use because the people who access them are not tech geniuses.  A simple "enter the password" prompt, and once they enter the correct one it should unlock and allow that user to open all the documents inside without further ado.  As easy as possible, like an electronic version of a safety deposit box.  The folder would contain a mixed bag of document types, some PDF, some Word, etc.  And no one wants to lock the files one at a time.

The last condition is that it needs to be a program I can run on my system to lock the folder, and the recipients should have no problem opening it on theirs once they download it, preferably without needing to install anything at their end.

Some versions of 7-Zip might work if it can be made to create self-extracting archives that only require the correct password to be entered to access the contents.  I figure some form of a self-extracting archive is going to be my best bet but there might be other software out there that would work better. 
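For what it's worth, 7-Zip can do this from its command line.  A hedged sketch, with the password, output name, and folder path as placeholders: `-sfx` wraps the archive in a Windows self-extracting .exe, `-p` sets the password, and `-mhe=on` also hides the file names until the password is entered, so the recipient just runs the .exe with no install needed.

```bat
7z a -sfx -pSecretPhrase -mhe=on LockedDocs.exe "C:\Docs\LockedFolder"
```

One caveat worth testing: the self-extractor extracts to disk rather than "unlocking in place", so after entering the password the recipients end up with a normal unprotected folder of documents.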

I would appreciate any information from those who may have had the same need at one point and found a working solution.  Thanks.

In closing, I hope that all the members of Donation Coders Forum and their Families and friends are doing Ok and staying well during these trying times.

If this question is in the wrong place, please tell me where to move it.
I have run into a situation where I need to connect a system that runs on a single static IP and give it a 2nd IP that is part of our local main network.
Example: the systems must stay active at and I need it also to be seen at .  My first thought was to use a 2nd NIC.  My 2nd thought was to use dual VPN servers as a bridge.  The 10.0.70. network already has SoftEther VPN, and adding the same to the network should allow for a VPN bridge.
But I am not sure which method would best allow for shared folders on the 184.174 system to be accessed by people on the 10.0.70 network.
The dual NICs do work, but from everything I can read this is not the best way to go.
All of the systems are running normal Windows 10 x64 Home.  Unfortunately, MS has recently removed the HomeGroup option from 10 Home.
The third option (which might be best) would be to move the 184.174 system inside the 10.0.70 network, but it is a small Apache web server that I would prefer to keep on the separate static IP it runs on now.  The folder access is for people that have to load and edit files on the websites.  None of them are proficient enough to deal with an FTP program, and they need this to be a simple shared network connection if possible.

Thanks, highend.  That did better than I was doing.  I think I am ready to admit that I need a complete form template of some kind.  There are quite a few of these input fields; some need to have character count limits, some need to be set for numbers only.  We have used this specific setup script for some time and it works so well I hate to give it up.  But it is hard to explain how it works to new hires.
There are hundreds of 'forms programs' out there including MS Access and even Excel.  There are probably only 10 variables that have to be input and then another 4 or 5 that are created within the script.  It would be nice if I could get all this to display on the screen at the same time just like a filled-in form.
This would let me catch any errors before the project gets created with erroneous input.  Clicking the "GO" button at the bottom would accept the data fields and continue the creation and the displayed form could be saved as a filled-in template.  It might come in handy to be able to open that same form at a later date to see what was done.   
In addition, since these things are created sequentially, opening the last form would automatically fill in the same data fields and I could just change the ones that needed it.  A couple of us who use this in the IT department have played around with different options, but since this was written as a batch script, the only thing I can think of for making a nice front-end form would be to use HTA.  Or maybe VBS.  PowerShell is a pain since Windows restricts it by default.
I figured there might even be something already on the Board from long ago, but maybe I am searching with the wrong terms.
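A minimal sketch of what the HTA approach could look like, just to show the shape.  The field names, limits, and the `setup.bat` it calls are all hypothetical stand-ins for the real script's inputs; `maxlength` enforces the character limits and a quick `IsNumeric` check covers the numbers-only fields before the "GO" button hands everything off.

```html
<!-- setupform.hta : minimal data-entry front end, sketch only -->
<html>
<head>
<title>Project Setup</title>
<hta:application id="SetupForm" border="thin" scroll="no"/>
<script language="VBScript">
Sub DoGo
  ' Validate the numbers-only field before continuing.
  If Not IsNumeric(JobNum.Value) Then
    MsgBox "Job number must be numeric"
    Exit Sub
  End If
  ' Hand the values to the existing batch script (hypothetical name).
  Set sh = CreateObject("WScript.Shell")
  sh.Run "setup.bat """ & JobNum.Value & """ """ & ProjName.Value & """"
End Sub
</script>
</head>
<body>
  Job Number: <input type="text" id="JobNum" maxlength="6"/><br/>
  Project Name: <input type="text" id="ProjName" maxlength="40"/><br/>
  <input type="button" value="GO" onclick="vbscript:DoGo"/>
</body>
</html>
```

Since all the fields sit on one page, everything is visible at once like a filled-in form, and saving the entered values to a file would give the reopen-and-edit behavior described above.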

I am trying to do something that I can't seem to manage.  During the running of a long script there are several steps where various data is either input or created.  At the end of the script, it is easy to write up a Job-Run Sheet that shows all the parts and pieces and where they got used and how.
I would like to be able to create the Job-Run Sheet so that it is filled in while the loader is being run.  The first data entered is the job number, and I use that to create a text file named "entered number.txt".  And I was able to echo the first bit of data into it as line 1.
But when I open the file so it can stay open as it is edited, I cannot get the script to continue.
This is a setup for the user to create a setup log, and it would be helpful if they had that information displayed as it was entered instead of after the fact.  I have been able to create the Job-Run <#> text file as well as write the first line, but as soon as I open the file for viewing I am unable to return to the original batch script to continue.
This does not have to be a text file at that point, as long as the user can see an easy-to-read listing of what they have done so far.
Below is the code where the problem starts. 

set /p snum="Enter Job Number: "
>%snum%.txt echo Job Number is %snum%
call notepad %snum%.txt

PS: I also tried cmd /c with the same results.
As soon as I call notepad to open the file so it can be read, the script stops.  I want it to stay in view so each step that echoes more data to it will display the data in the same way.
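The usual batch cure for this is `start`, which launches Notepad in its own process and returns immediately, whereas `call` and `cmd /c` both wait for the program to exit.  A minimal sketch (one caveat: Notepad does not live-refresh an open file, so the user would need to reopen it, or a viewer that reloads on change, to see later lines):

```bat
@echo off
set /p snum="Enter Job Number: "
>"%snum%.txt" echo Job Number is %snum%
rem "start" with an empty title launches notepad and returns at once
start "" notepad "%snum%.txt"
rem ...the script continues here while the file stays open...
>>"%snum%.txt" echo Next step data
```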

I think I'm getting lost in the iterations, which is normal for me :(
I started out trying "Find And Replace" (FAR) but never got it to work properly, perhaps due to the unusual characters.
I ended up getting a working script using the Set command, but hit problems when I tried to run it recursively over all 100+ text files in the folder while keeping the file names unchanged.

Each file contains a single string representing a URL, and there are no spaces in the string.  The domain portion of each has changed, and that is the only part I need to modify: replace the section that says "" with "", change nothing else, and keep the original filename.

This has to be done to over a hundred URLs stored as named text files in a master folder.

So far this is what I've got:

Code:
for /f "tokens=1,* delims=¶" %%A in ('"type *.txt"') do (
  SET string=%%A
  SET modified=!!
  echo !modified! >> out.txt
)

This works on a filename-by-filename basis but requires temporarily renaming the files while they are processed.  I feel sure there is a way to walk the directory recursively and keep the filenames intact during the modification.

As written, the code above requires an infile and an outfile plus a bunch of renaming, and is far more 'work' than should be needed.

I bet 4WD will have what I need right at hand if he is online.  :)
I would like the result to be flexible enough to save the code for use on future projects where I might need to do something similar.
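One way to walk the folder recursively without renaming anything is `for /r`, which hands each file path to the loop body in turn.  A sketch under stated assumptions: the files are single-line, delayed expansion is on, `C:\master` stands in for the real master folder, and `old.example.com` / `new.example.com` stand in for the real domain strings.

```bat
@echo off
setlocal EnableDelayedExpansion
rem Walk every .txt under the master folder, including subfolders.
for /r "C:\master" %%F in (*.txt) do (
    rem Each file holds one URL on its first line.
    set /p "line=" <"%%F"
    rem Swap only the domain portion of the string.
    set "line=!line:old.example.com=new.example.com!"
    rem Overwrite the same file in place, keeping its name.
    >"%%F" echo(!line!
)
```

Because the file is rewritten in place, there is no out.txt and no renaming step; `echo(` is the usual idiom for echoing a variable that might be empty.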


Thanks.  Your instructions might get me there.  Eventually.  I had already tried changing the index.php for something similar, and even with your exact wording it only works if I include a copy of both index.php AND a copy of styles.css in every subdirectory of every folder in the documents directory.
I am sure you are correct and it does have something to do with the relative path, but for some reason on this setup I can't seem to get anything to work other than as it was.
Many thanks for your help.  At least I know now that I was looking in the correct direction.  One of the other members also sent me to a file called Directory Lister, and for some reason THAT one DOES work properly and even has the added pointers for forward and back.  But: it is hard to read, with fonts too small and colors too light, and it doesn't have an easy-to-work-with styles.css like this one to make changes.  I wish I could merge the best of both.

Maybe I need to try to merge the path statements from the new index.php such that it is stated in the first instance of index.php on the website, rather than the following one it finds in the documents folder?  Or maybe it would be easier to change the look of the Directory Lister snippet instead?  Always some kind of decision to make.  :(


Sorry it took so long to reply 4WD but your HFS option is extremely interesting. I can see where it would be Very Useful for many things. 
I am not sure we can do it like that though since we have to share specific folders with different people.  There are normally at least 30 different folders shared to different user-groups at any point in time.

And to Shades: the one we are currently looking at is Sync (of It works fine, but the interface is way too cluttered for my liking.  We just need a way to be sure only the specific people that should be able to see each folder CAN see it.  And no others.  We used to want to control downloads, but at this point that isn't even an issue.

So a website that has the ability to control which viewers can enter which folders on it would be great.  We already have that, but it is very old and very complex to use for something this simple.  Sync allows us to control access to the link to the Sync folder, which (if the viewing area were not full of ads for Sync) would be fine.

Also to Deozaan:
I had found a similar setup and have been trying to get it to work on multiple sublevels with no luck.  I uploaded the code in another of my forum posts, to ATH.  Your option does work better, but I would need to tweak the way it 'looks'.  The CSS code included with the one I call DDC just has an easier-to-read format.


DUH.  I think I don't wait long enough.  :(

ATH: I have attached the entire code here.  Only the style.css and index.php are giving me problems.
I was able to add the single JavaScript file to the website's JS folder and it works fine from there.
And when you look at them, you will see inside them both the path that I assumed was the problem.  It shows their location to be "./name",
as in ./.style.css and ./.index.php etc.

It is a very concise and simple directory display snippet for an Apache website.  Now if only it would work on folders that have multiple subfolders etc.  Instead, the site drops back to using the Apache directory listing default after level 1.

Sorry, not sure what I did wrong the first time, but the attachment failed.  The other post has the complete code in it.

Thanks to those who offered answers on the php website for hosting files.   

Working from all your suggestions I was finally able to get exactly what I wanted.  Unfortunately, I have run into a problem with the CSS for the folder that contains the document files.
The hosting file is an Apache website that does exactly as needed.  I can add a folder to it named 'documents' that will display the documents in it in a perfect format, exactly as asked.  The problem is that for it to work, it requires a specific 'index.php' and 'style.css' that are not used elsewhere.  To get the documents to display in all levels of each subfolder, I have to put those two files in each of the directory levels of every folder.
Sublevels without those two files use the default Apache directory listing CSS, which does not look very good.

Is there a place or file I can use to set the 'documents' folder to always display using the special index.php and style.css in every subdirectory of that folder?
Thanks for advice
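One Apache-side approach worth trying, as a sketch only: `DirectoryIndex` accepts a root-relative URL, and a `.htaccess` directive is inherited by all subdirectories, so a single copy of index.php can answer for every level.  This assumes the server allows overrides (`AllowOverride Indexes` or similar) and that the top-level folder really is `/documents`.

```apache
# .htaccess placed once in /documents -- inherited by every subfolder
DirectoryIndex /documents/index.php
```

Two caveats: the stylesheet link inside index.php would need to be root-relative too (`/documents/style.css` rather than `./style.css`), and index.php must work out *which* directory was requested (e.g. from `$_SERVER['REQUEST_URI']`) rather than listing its own folder, which not every listing snippet does out of the box.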

Might well be, 4WD.  You have a much better memory than I do.   :-[

To describe the problem: we have a folder called Users on a Windows 10 file-share system.  In it are about 50 subfolders, each named for exactly the user that should have read/write access to it.  Occasionally I have had to add someone as a Master to the Users folder with control over all the subfolders.  When I do, Windows often changes the write access for all subfolders: in adding the new Master, it removes the write access each user had for their own subfolder.  Fixing this requires going to each subfolder, clicking down to Properties and sharing permissions, and adding read/write access back to that one folder for its user.  Then on to the next, for 50 or more users.

Is there a way I could script this process so that it would read the name of the folder and add the user by that name with read/write access to their folder?  i.e., in the Users folder, for the subfolder named johnsmith it should add read/write access for the user named johnsmith, then larrybarns would get read/write access to the subfolder named larrybarns, and so on through all 50.  While doing this it should not change any permissions that already exist on these folders.

If this had not happened more than once already, I would not bother to ask.  But since it has, I figure there has to be some way faster than going folder by folder, clicking to get to the share permissions, and typing in each user's name to add them back to their own folder.
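A minimal sketch of how that loop might look in batch, assuming the share lives at `D:\Users` and each folder name matches an account name exactly.  `icacls /grant` only *adds* an access control entry, so existing permissions are left alone; note this edits NTFS ACLs, while share-level permissions on the Users share itself are a separate setting.

```bat
@echo off
rem For each subfolder of D:\Users, grant the same-named user Modify rights.
rem %%~nxD expands to just the folder name, e.g. johnsmith.
rem (OI)(CI)M = inherit to files and subfolders, Modify access; /T recurses.
for /d %%D in ("D:\Users\*") do icacls "%%D" /grant "%%~nxD":(OI)(CI)M /T /C
```

Running it once after adding a Master would put back every user's access in one pass instead of 50 trips through the Properties dialog.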

Two more good ones.  I knew 4WD would have at least one of these proggies in his toolbox.
I am still reading the FAQ and Help pages on HFS, but it looks promising.  I don't really care about using the Apache software, but at least that box has the required static IP with a domain name mapped to it.  From what I see, HFS does need to have those.

What we have been using was a document manager I came up with 8 years ago.  It has served well (and still does), but the girls wanted to load it by drag and drop instead of through the configured PHP loader for the MySQL database.

The other IT guy and I, as well as some outside people, tried several different D&D modules, but nothing seemed to click.  So I started looking at the backups of all the completed projects and realized that, seen from the backend,
all they wanted was a way to display a Windows directory structure with folders and files on a webpage, where the users would be able to read the files inside of their labelled folders.

It needs some kind of basic login/PW as an attempt to keep out the general public, but they didn't need to keep track of any of the users other than that.  The Doc Mgr that used a full PHP/MySQL engine was overkill by a huge amount.  Trying to take apart an automobile and use the parts to make a bicycle just didn't seem like a practical idea.  :mad:

I had considered giving the users access through a vpn and using Windows built-in share controls for managing access levels.   But...I would rather keep outside users "Outside" if possible and we already had the apache web-server configured so...I hated to waste that.

Shades, I see that NextCloud would require me to load the box with Linux or create a VM to host a Linux OS?  Or have I not read the instructions thoroughly enough?  NextCloud looks very much like a Linux version of what used to be BitTorrent (I believe they are now Resilio).

Thanks to all for the great ideas.  One of these will surely work for what they are trying to do.

Thanks Deozaan!  Advice like yours is exactly the kind I was hoping for.  I have used .htaccess before with good results, and you are probably correct in it being able to handle the "front gate".  This isn't a high-security issue.
I will definitely look at Directory Lister because that is exactly what I am trying to do.  If it can display the contents of a folder with its subfolder and file structures just as Windows does, that is all we need.  Hopefully it protects against uploads or modification of contents, but downloads are fine.  Just trying to limit the people who can access and read the files inside.
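For reference, the usual shape of that .htaccess "front gate" is HTTP Basic Auth.  A sketch with assumed paths: the password file is created with Apache's `htpasswd` tool and should live outside the web root.

```apache
# .htaccess in the protected documents folder
AuthType Basic
AuthName "Restricted Documents"
AuthUserFile /var/www/.htpasswd
Require valid-user
```

This only guards reading; preventing uploads is the default anyway on a plain Apache document tree, since nothing accepts POSTed files unless a script does.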

General Software Discussion / Simple php website for hosting files
« on: August 21, 2019, 07:32 PM »
Looking for the simplest possible way to display a folder, along with all of its subfolders and the PDF files in those folders, on a webpage.  It needs to have enough controls to give the webpage viewers read-only access, prevent write/edit, and require a login and password from users to access the site.

Essentially this is an electronic version of a filing cabinet with controlled access.
More simply, I guess it is an eBook where the individual pages are each separate files.

We already have such a program, but it includes too many unnecessary features and cannot be loaded using Windows drag and drop, due to the inclusion of a MySQL database which requires each document to be separately loaded through the loader, which then tracks all access, a feature we do not need.

I have tried using Business OneDrive as well as Google Drive for this, and while it can be done that way, I was not happy with the provisions for controlled access.  And we would prefer keeping the posted documents locally rather than hosted from the cloud.

The hosted side should be something where the folder and its files can be loaded using Windows drag and drop into the display area, and the whole thing has to be compatible with a WAMP-type Apache web host.

Appreciate any guidance.  I am hoping someone might have seen something like this and can point me to the right product

Wraith: two drives; one is 5TB, the other is 6TB.
Both are SATA, both are GPT format.
Once I get this all finished there will be several more main folders on the 6TB,
at which time I plan to clone it to another 6TB for use on another system.
The problem at this point is more of WHY they are so different.  Each item was transferred one piece at a time from about 60 subfolders.

BUT it took place over too long a time.  It is possible that for some reason something got duplicated or misplaced or ??

I had one person recommend FC, but that is way too slow, and creating the dir.txt or whatever type file is too time consuming.  It took almost 10 minutes to create the first one, and it is 80 MB in size.  I KNOW Notepad won't open it, and I don't know how long FC would take even after I create the 2nd 80 MB file to compare it with.
I think I will go for one of the two programs.
Thanks to you both.

Here is the problem: on a Windows 10 system, Folder B should be an exact duplicate of Folder A.  But a check of each shows that B is about 0.3 TB larger than A.

A is 1.43 TB    B is 1.73 TB.   I need a way to quickly find out why.   Both A and B contain the same number of 2nd level folders so it is something deeper. Until I know what, it is possible that B has some files or folders that need to be added to A.  Or maybe it has duplicates that need to be deleted.  I don't want to use an automated process that would make that choice for me.

A program that could quickly list Only the folders/files that don't match showing file and folder names and sizes would be perfect. Does anyone have any suggestions? I know I have seen many over the years but none come to mind right away.  And this is a kind of rush job.
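For what it's worth, robocopy's list-only mode can produce exactly that kind of report without copying or changing anything.  A sketch with assumed paths: `/L` lists instead of copying, `/E` recurses, `/X` reports extra files that exist only in B, `/V` includes skipped (matching) files so mismatches stand out, and the log captures names and sizes.

```bat
robocopy "D:\FolderA" "E:\FolderB" /L /E /X /V /NP /LOG:"C:\diff.txt"
```

Since nothing is copied or deleted, it is safe to run first and then decide by hand whether B's extras should be added to A or removed as duplicates.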
:( Even Beyond Compare is reporting at least 45 minutes to complete the comparison run.
Guess I'll just check back tomorrow.

Merry Christmas and Happy New year to Everyone on the Board.  I was offline for the last part of the month (restrictions by the Wife in Charge) due to family all being together for first time in years. :)

To answer all the great replies though:
Shades: You are always the one to point out the logical, fact-forward solutions.  To be honest, this is what the person who is doing this SHOULD be doing.  That was why I agreed to allow them to do it.
Instead, they created about 5 MORE top-level folders with LONG names and used drag and drop to put all the other folders in those five.  Things immediately went from "Worst to Worser"  :(  I had explained the problem and almost cried when I saw her solution.  Fortunately, I only allowed her to work with a copy of the original mess, so..

Wraith: Thanks for the comments on OneDrive.
I will take another look.  It has been a long time.  Every now and again, I peek at what has 'gathered itself' into my own OneDrive.  The other day, I tried to copy what was in it to another location and ran into a bunch of errors that basically said the files were not "available at that time".  Most of the errors were on larger files, and none of these were anything I deliberately put into OneDrive, so I fully admit that I have not given it a fair test.  If it works for you, I should see what happens on another trial.
The biggest problem we would have is constant flux.  We have a TON of storage with our Business OneDrive, and the files would be constantly added, changed, moved, etc.  That is going to need a ton of bandwidth.  All of that is currently handled in-house at gigabit speeds (or close), and I'm not sure how much of a hit we can take on our 30 Mb (up) cable pipe to the web.  Worth looking at if it works for you, though.

And lastly, 4WD!
😊 I knew I could count on you having some great suggestions.  Seems like you must do a lot of the same things I have to do, but you usually know all the best ways to do them.  I'm looking forward to somebody coming out with an AI that can look at the problem and come up with the best answer (one that can be afforded by the little guys anyway; I'm sure Siri or Alexa could handle it with ease!)
At least you are patient enough to read the blasted HowTos on Robocopy.  Some of your suggestions look Very Interesting, though, and I plan on trying them ALL out ASAP.

Wishing everyone a Great High-Tech New Year  for 2019. 

Again, thanks to all for the great suggestions

You mean you Trust OneDrive?   :tellme:
That would be asking a lot for me to do.   Though I am sure it has gotten better since it first came out. 

One question about Syncovery: Do you know how it will handle cases where the files are buried beyond the MAX_PATH point?  I run into this a lot on these exact folders.  I am sure that a number of them are going to be like that.  Many backup programs either choke or return error codes for those files.  Yet they really are there and can be opened, read, modified, etc. by the programs (and people) that put them there.
Files exactly like that are one of the main reasons for this project.  If I don't shorten the paths soon, every file in those folders will be in paths that cannot be backed up.  I have heard that Microsoft has promised to open up the allowed path lengths to some really extreme number in a soon-to-be-released version of Win 10.  And I think it can even be enabled now with some registry edits.  But as far as I can tell, the original MAX_PATH of 260 total is still the rule at this time.
Besides, the whole point is that people should be more careful, and if I don't enforce some limiting factor they end up copying a whole folder and all its paths under another top level of folders, adding even more complexity.  Searching for specific files in this mess can't be easy on the file indexer service.  This whole effort is to try to remove as many of those ridiculously long file and folder names as possible.
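The registry edit mentioned above does exist in current Windows 10 builds (1607 and later).  A sketch of the usual command, with the standard caveat that classic Win32 programs must also opt in through their application manifest, so flipping this value alone does not fix every tool:

```bat
rem Enable NTFS long-path support (requires an elevated prompt)
reg add "HKLM\SYSTEM\CurrentControlSet\Control\FileSystem" /v LongPathsEnabled /t REG_DWORD /d 1 /f
```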

Wraith, I finally had time to test Syncovery and it does work as advertised.  :)  The robocopy script I came up with does about half the job.  I will probably buy Syncovery because it is the most well-thought-out process I have seen for some time.  I can see where it (especially the scheduling part!) would be GREAT to have.  I might use it for what I need now as well, if I can't get the robocopy switches in the correct order to work properly.  So far it does create the empty directory, but I can't get it to move specific files.  I was hoping to find a simple way to manage this one task, and Syncovery feels almost like using a shotgun to kill a fly  :)
Still, as you say, I think it will do the trick after I run a couple more tests, so I am sure I will know what is going on.  I can see many uses for it on much larger tasks I have to perform daily.

"Are you open to a pay solution?  I use Syncovery, and it can do exactly what you're saying."   Thanks Wraith!

Good to see so many of the oldtimers are still here.  Happy Holidays to ALL!  I don't mind paying for it; my Chri$tma$ present to the board will be going out soon as well.  I assume the mfr. has a demo or trial so I can be certain before buying that it can do what is needed.  The place I work has got to have some of the WORST people I have ever run across in all my years in IT.  Most places would be GLAD to have someone tell the employees how to do what will work and NOT do what won't.  I swear they always try what WON'T work first, in an effort to prove me wrong.  They are so bad I don't even know if I can feel sorry for them.  Every joke you have EVER read in an IT repair manual giving real-life examples?  This is where they come from!  A disaster almost occurred the other day because someone could not find the "ANY" key in a new program we just bought.  As in, "press ANY KEY to continue"?  She stopped everything to file an Urgent Trouble Ticket and waited for me to get there.  It took me a couple of minutes to get enough self-control to explain.  They thought they had an "outdated keyboard" that was missing important keys.  Just like the (now unlabeled) START on the desktop, or the mysterious Windows key.

I can see it is time for Computer-KLASS-01.

Back to the question, though.  Just for my own knowledge, I thought this was an exact described example in the Robocopy documentation, but I cannot find it now, so I am glad to know there IS a way out.
Will let you know how Syncovery works out.

PS: Update.  I FINALLY got Robocopy to work :) and perfectly at that.  I have been told that I can even shorten this, but here is the working script to create an empty copy of a dir with all of its paths:

robocopy.exe /R:0 /W:0 "C:\SCRIPTS\A" "C:\SCRIPTS\B" /E /Z /Copy:DT /XF "*.???" /LOG:"C:\SCRIPTS\EMPTY.LOG"

where A is the original folder and B is the exact copy with no files in it.
At this stage I always add a pause when trying things I don't know for sure, but all ran well and the log said no errors.

Merry Christmas!

I need to move all files of specific file types from Folder A to Folder B and put them under the same path as they existed in A.  I need to do this for several file types, but only one type at a time.  If the folder path already exists in B, the file would just be added to it.  If it doesn't exist, the script would create it during the move.  It might be possible to do this with the right switches using Robocopy, but I can't seem to find the right combination of switches to work as needed.  The script needs to delete the files as they are moved to the new folder (move instead of copy).
If there is a better program for doing this, I would be happy to use it.  I am hoping to be able to delete Folder A when done, as it should only contain 'trash', and Folder B would contain only the file types I need to keep, each in its original folder path.
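For reference, a sketch of the switch combination that matches this description, with assumed paths and `*.pdf` standing in for whichever file type is being processed this pass: `/MOV` deletes each source file after a successful copy, and `/S` recreates the matching folder path in B while skipping folders that end up empty.

```bat
robocopy "C:\FolderA" "C:\FolderB" *.pdf /S /MOV /R:0 /W:0 /LOG:"C:\move-pdf.log"
```

Running it once per file type, with the mask changed each time, leaves A holding only the unwanted leftovers and B holding each kept file in its original relative path.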

I'm with you Wraith and I also questioned their veracity on this topic.  :tellme:

I posted a reply asking to be provided with the source for this information in writing as I also find this extremely hard to believe.  I can give you verbatim what they said but it would be impolite to pass along User names from another forum.   See next in quotes:

"It is my understanding that Microsoft is slowly doing away with mapped drive support and is wanting everyone to begin using UNC instead. At this point mapped drives are basically still around as a nod to programs such as yours that still require drive letters instead of UNC paths. "

On the topic of the problem itself, I was finally able to track down the reason for it by going off site and reconnecting through a VPN.  Connected like that, I was given a lot more choices for where to start the search, so I could see what was causing the whole problem.  It is caused by the way the database software is directing the request for the path.  Unless the user redirects Search to begin at the start of the mapped drive letter, it is by default using the already-interpreted UNC path as the start.

SO: Hats off to you, ATH.  :up:  Essentially you are 100% correct!  The software has the user already inside the UNC path before it even starts the search.  This was WHY it always returned the found items on their UNC path instead of the path using the mapped drive letter.  Fixing it just requires the user to point Search at the mapped letter before it begins.

So that is the end of MY problem.
But like you, I would like to know where this one person got their info re: Windows dropping support for mapped drive letters.  I promise to pass this along if I can get it.

Thanks for your reply, ATH.  I wish that was all it was.  But I have it on good authority that MS intends to do away with supporting mapped drives eventually.  It is just a matter of how soon they fully implement it.  Fully qualified, or 'UNC', paths will be the required way to link to a file on a shared network location.  From what I understand, it has something to do with ensuring security and constant compliance with other network functions.

Sorry I did not get the email telling me you had replied here as well.  I have been so tied up trying to fix this problem with all the links being wrong in a major company database, because the users tried to force a solution on a problem that made things worse instead of better.

I should add that your explanation is probably true as well!  Thanks for the detailed explanation.

What no one can figure out is why this has never happened in the past.  The users have always used the search function in Explorer, and it always returned the mapped drive letter (or so I have been told).  I do not personally do data entry, so it could be that no one ever tried using search and they just don't remember it that way.
