Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - questorfla

26
To describe the problem: We have a folder called Users on a Windows 10 file-share system. In it are about 50 subfolders, each named exactly for the user who should have read/write access to it. Occasionally I have had to add someone as a Master to the Users folder with control over all the subfolders, and when I do, Windows often changes the write access for all subfolders: it adds the new Master, but in doing so it removes each user's write access to their own subfolder. Fixing this requires going to each subfolder, clicking down through Properties and the sharing permissions, and adding read/write access back for that one folder's user, then moving on to the next, for 50 or more users.

Is there a way I could script this process so that it reads the name of each folder and grants the user by that name read/write access to it? I.e., in the Users folder, for the subfolder named johnsmith it should add read/write access for the user named johnsmith, then larrybarns would get read/write access to the subfolder named larrybarns, and so on through all 50. While doing this it should not change any permissions that already exist on these folders.

If this had not happened more than once already I would not bother to ask. But since it has, I figure there has to be some way faster than going folder by folder, clicking through to the share permissions, and typing in each user's name to add them back to their own folder.
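Something along these lines might do it; a minimal PowerShell sketch, assuming the folder names match the account names exactly and that the real top-level path is substituted for the placeholder D:\Users. Since icacls /grant appends to the existing ACL rather than replacing it, the other permissions should be left alone:

Get-ChildItem 'D:\Users' -Directory | ForEach-Object {
    # (OI)(CI)M = inheritable Modify (read/write); /grant adds a rule, it does not replace the ACL
    icacls $_.FullName /grant "$($_.Name):(OI)(CI)M"
}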

27
Two more good ones.  I knew 4WD would have at least one of these proggies in his toolbox. 
I am still reading the FAQ and Help pages on HFS, but it looks promising. I don't really care about using the Apache software, but at least that box has the required static IP with a domain name mapped to it. From what I see, HFS does need to have those.

What we have been using was a document manager I came up with 8 years ago. It has served well (and still does), but the girls wanted to load it by drag and drop instead of through the configured PHP loader for the MySQL database.

The other IT guy and I, as well as some outside people, tried several different D&D modules, but nothing seemed to click. So I started looking at the backups of all the completed projects and realized that, seen from the back end, all they wanted was a way to display a Windows directory structure, folders and files, on a webpage where the users would be able to read the files inside their labelled folders.

It needs some kind of basic login/PW as an attempt to keep out the general public, but they didn't need to keep track of any of the users beyond that. The Doc Mgr that used a full PHP/MySQL engine was overkill by a huge amount. Trying to take apart an automobile and use the parts to make a bicycle just didn't seem like a practical idea. :mad:

I had considered giving the users access through a VPN and using Windows' built-in share controls for managing access levels. But... I would rather keep outside users "outside" if possible, and we already had the Apache web server configured, so... I hated to waste that.

Shades, I see that NextCloud would require me to load the box with Linux or create a VM to host a Linux OS? Or have I not read the instructions thoroughly enough? NextCloud looks very much like a Linux version of what used to be BitTorrent Sync (I believe they are now Resilio).

Thanks to all for the great ideas.  One of these will surely work for what they are trying to do.

28
Thanks, Deozaan! Advice like yours is exactly the kind I was hoping for. I have used htaccess before with good results, and you are probably correct that it can handle the "front gate". This isn't a high-security issue.
I will definitely look at Directory Lister, because that is exactly what I am trying to do. If it can display the contents of a folder with its subfolder and file structure just as Windows does, that is all we need. Hopefully it protects against uploads or modification of contents, but downloads are fine. Just trying to limit the people who can access and read the files inside.
 

29
General Software Discussion / Simple php website for hosting files
« on: August 21, 2019, 07:32 PM »
Looking for the simplest possible way to display a folder, along with all of its subfolders and the PDF files in those folders, on a webpage. It needs enough controls to give webpage viewers read-only access and prevent write/edit, and it should require a login and password from users to access the site.

Essentially this is an electronic version of a filing cabinet with controlled access.
More simply, I guess it is an e-book where the individual pages are each separate files.

We already have such a program, but it includes too many unnecessary features and cannot be loaded using Windows drag and drop, due to the inclusion of a MySQL database that requires each document to be separately loaded through the loader, which then tracks all access, a feature we do not need.

I have tried using Business OneDrive as well as Google Drive for this, and while it can be done that way, I was not happy with the provisions for controlled access. And we would prefer keeping the posted documents locally rather than hosted from the cloud.

The hosted side should be something where the folder and its files can be loaded into the display area using Windows drag and drop, and the whole thing has to be compatible with a WAMP-type Apache web host.

Appreciate any guidance. I am hoping someone might have seen something like this and can point me to the right product.

30
Wraith: two drives, one is 5 TB, the other is 6 TB.
Both are SATA; both are GPT format.
Once I get this all finished there will be several more main folders on the 6 TB,
at which time I plan to clone it to another 6 TB for use on another system.
The problem at this point is more of WHY they are so different. Each item was transferred one piece at a time from about 60 sub-folders.

BUT it took place over too long a time. It is possible that for some reason something got duplicated or misplaced, or ??

I had one person recommend FC, but that is way too slow, and creating the dir.txt (or whatever type of file) is too time consuming. It took almost 10 minutes to create the first one, and it is 80 MB in size. I KNOW Notepad won't open it, and I don't know how long FC would take even after I create the 2nd 80 MB file to compare it with.
I think I will go for one of the two programs:
CompareIt!/SynchronizeIt!
or
WinMerge.
Thanks to you both.


31
Here is the problem: on a Windows 10 system, Folder B should be an exact duplicate of Folder A. But a check of each shows that B is about 0.3 TB larger than A.

A is 1.43 TB; B is 1.73 TB. I need a way to quickly find out why. Both A and B contain the same number of 2nd-level folders, so it is something deeper. Until I know what, it is possible that B has some files or folders that need to be added to A. Or maybe it has duplicates that need to be deleted. I don't want to use an automated process that would make that choice for me.

A program that could quickly list ONLY the folders/files that don't match, showing file and folder names and sizes, would be perfect. Does anyone have any suggestions? I know I have seen many over the years, but none come to mind right away. And this is a kind of rush job.
:( Even Beyond Compare is reporting at least 45 minutes to complete the comparison run.
Guess I'll just check back tomorrow.
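In the meantime, a quick PowerShell sketch along these lines might list just the mismatches; the paths are placeholders, and very deep paths (past MAX_PATH) may still make Get-ChildItem complain:

$a = 'D:\FolderA'; $b = 'E:\FolderB'
$index = @{}
# Index every file under A by its path relative to the root, keyed to its size
Get-ChildItem $a -Recurse -File | ForEach-Object {
    $index[$_.FullName.Substring($a.Length)] = $_.Length
}
# Walk B: report anything missing from A or differing in size, and remove everything seen
Get-ChildItem $b -Recurse -File | ForEach-Object {
    $rel = $_.FullName.Substring($b.Length)
    if (-not $index.ContainsKey($rel))  { "Only in B: $rel" }
    elseif ($index[$rel] -ne $_.Length) { "Size differs: $rel" }
    $index.Remove($rel)
}
# Whatever is left in the index exists only in A
$index.Keys | ForEach-Object { "Only in A: $_" }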

32
Merry Christmas and Happy New Year to everyone on the board. I was offline for the last part of the month (restrictions by the Wife in Charge) due to family all being together for the first time in years. :)

To answer all the great replies though:
 
Shades: You are always the one to point out the logical, fact-forward solutions. To be honest, this is what the person who is doing this SHOULD have been doing. That was why I agreed to let them do it.
Instead, they created about 5 MORE top-level folders with LONG names and used drag and drop to put all the other folders inside those five. Things immediately went from "worst to worser". :( I had explained the problem, and almost cried when I saw her solution. Fortunately, I only allowed her to work with a copy of the original mess, so..

Wraith: Thanks for the comments on OneDrive.
I will take another look; it has been a long time. Every now and again, I peek at what has "gathered itself" into my own OneDrive. The other day, I tried to copy what was in it to another location and ran into a bunch of errors that basically said the files were not "available at that time". Most of the errors were on larger files, and none of them were anything I deliberately put into OneDrive, so I fully admit I have not given it a fair test. If it works for you, I should see what happens on another trial.
The biggest problem we would have is constant flux. We have a TON of storage with our Business OneDrive, and the files would be constantly added, changed, moved, etc. That is going to need a ton of bandwidth. All of that is currently handled in-house at gigabit speeds (or close), and I'm not sure how much of a hit we can take on our 30 Mb (up) cable pipe to the web. Worth looking at if it works for you, though.

And lastly, 4WD!
😊 I knew I could count on you having some great suggestions. Seems like you must do a lot of the same things I have to do, but you usually know all the best ways to do them. I'm looking forward to somebody coming out with an AI that can look at the problem and come up with the best answer (one that can be afforded by the little guys, anyway; I'm sure Siri or Alexa could handle it with ease!)
At least you are patient enough to read the blasted HowTos on Robocopy. Some of your suggestions look very interesting, and I plan on trying them ALL out ASAP.

Wishing everyone a Great High-Tech New Year  for 2019. 

Again, thanks to all for the great suggestions

34
You mean you trust OneDrive? :tellme:
That would be asking a lot of me, though I am sure it has gotten better since it first came out.

One question about Syncovery: do you know how it will handle cases where files are buried beyond the MAX_PATH point? I run into this a lot in these exact folders, and I am sure a number of them are going to be like that. Many backup programs either choke or return error codes for those files. Yet they really are there, and they can be opened, read, modified, etc. by the programs (and people) that put them there.
 
Files exactly like that are one of the main reasons for this project. If I don't shorten the paths soon, every file in those folders will be in paths that cannot be backed up. I have heard that Microsoft has promised to open up the allowed path lengths to some really extreme number in a soon-to-be-released version of Win 10. And I think it can even be enabled now with some registry edits. But as far as I can tell, the original MAX_PATH of 260 characters total is still the rule at this time.
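For reference, the registry opt-in I mentioned is apparently a single value; a hedged sketch (Windows 10 1607 or later, run from an elevated PowerShell, and individual programs still have to declare themselves long-path aware before it helps them):

# Enable NTFS long-path support system-wide (requires admin; a reboot is the safe assumption)
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem' -Name 'LongPathsEnabled' -Value 1 -Type DWord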
 
Besides, the whole point is that people should be more careful, and if I don't enforce some limiting factor, they end up copying a whole folder and all its paths under yet another top level of folders, adding even more complexity. Searching for specific files in this mess can't be easy on the file indexing service. This whole effort is to try to remove as many of those ridiculously long file and folder names as possible.

35
Wraith, I finally had time to test Syncovery and it does work as advertised. :) The robocopy script I came up with does about half the job. I will probably buy Syncovery, because it is the most well-thought-out process I have seen for some time. I can see where it (especially the scheduling part!) would be GREAT to have. I might use it for what I need now as well, if I can't get the robocopy switches in the correct order to work properly. So far it does create the empty directory, but I can't get it to move specific files. I was hoping to find a simple way to manage this one task, and I feel Syncovery is almost like using a shotgun to kill a fly. :)
Still, as you say, I think it will do the trick after I run a couple more tests, so I am sure I know what is going on. I can see many uses for it on much larger tasks I have to perform daily.

36
"Are you open to a pay solution?  I use Syncovery, and it can do exactly what you're saying."   Thanks Wraith!

Good to see so many of the old-timers are still here. Happy Holidays to ALL! I don't mind paying for it; my Chri$tma$ present to the board will be going out soon as well. I assume the mfr. has a demo or trial so I can be certain before buying that it can do what is needed. The place I work has got to have some of the WORST people I have ever run across in all my years in IT. Most places would be GLAD to have someone tell the employees how to do what will work and NOT do what won't. I swear they always try what WON'T first, in an effort to prove me wrong. They are so bad I don't even know if I can feel sorry for them. Every joke you have EVER read in an IT repair manual giving real-life examples? This is where they come from! A disaster almost occurred the other day because someone could not find the "ANY" key in a new program we just bought. As in, press ANY KEY to continue? She stopped everything to file an Urgent Trouble Ticket and waited for me to get there. It took me a couple of minutes to get enough self-control to explain. They thought they had an "outdated keyboard" that was missing important keys. Just like the (now unlabeled) START on the desktop, or the mysterious Windows key.

I can see it is time for Computer-KLASS-01.

Back to the question though. Just for my own knowledge, I thought this exact example was described in the Robocopy documentation, but I cannot find it now, so I am glad to know there IS a way out.
Will let you know how Syncovery works out.

PS: Update. I FINALLY got Robocopy to work, :) and perfectly at that. I have been told I can even shorten this, but here is the working script to create an empty copy of a directory tree, all of the paths and none of the files:


robocopy.exe "C:\SCRIPTS\A" "C:\SCRIPTS\B" /E /Z /COPY:DT /XF * /R:0 /W:0 /LOG:"C:\SCRIPTS\EMPTY.LOG"
where A is the original folder and B is the exact copy with no files in it (/XF * excludes every file, so only the folder tree is copied).
At this stage I always add a pause when trying things I don't know for sure, but all ran well and the log reported no errors.


Merry Christmas!

37
I need to move all files of specific file types from Folder A to Folder B and put them under the same path as they existed in A. I need to do this for several file types, but only one type at a time. If the folder path already exists in B, the file would just be added to it. If it doesn't exist, the script would create it during the move. It might be possible to do this with the right switches using Robocopy, but I can't seem to find the right combination of switches to work as needed. The script needs to delete the files as they are moved to the new folder (move instead of copy).
 
If there is a better program for doing this, I would be happy to use it. I am hoping to be able to delete Folder A when done, as it should only contain 'trash'. And Folder B would contain only the file types I need to keep, each in its original folder path.
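Something like the following may be the switch combination I am missing; a sketch only, with placeholder paths. /MOV deletes each file from A once it lands in B, and /S recreates whatever folder paths are needed:

robocopy "D:\FolderA" "D:\FolderB" *.pdf /S /MOV /R:0 /W:0

(then run it again with *.docx, *.xlsx, and so on, one file type per pass)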

38
I'm with you Wraith and I also questioned their veracity on this topic.  :tellme:

I posted a reply asking to be provided with a written source for this information, as I also find it extremely hard to believe. I can give you verbatim what they said, but it would be impolite to pass along usernames from another forum. See next, in quotes:

"It is my understanding that Microsoft is slowly doing away with mapped drive support and is wanting everyone to begin using UNC instead. At this point mapped drives are basically still around as a nod to programs such as yours that still require drive letters instead of UNC paths. "

On the topic of the problem itself, I was finally able to track down the reason for it by going off-site and reconnecting through a VPN. Connected like that, I was given a lot more choices for where to start the search, so I could see what was causing the whole problem. It is caused by the way the database software is directing the request for the path. Unless the user redirects Search to begin at the start of the mapped drive letter, it is by default using the already-interpreted UNC path as the start.

SO: Hats off to you, ATH. :up: Essentially you are 100% correct! The software has the user already inside the UNC path before it even starts the search. This was WHY it always returned the found items on their UNC path instead of the path using the mapped drive letter. Fixing it just requires the user to point Search at the mapped letter before it begins.

So that is the end of MY problem.
But like you, I would like to know where this one person got their info re: Windows dropping support for mapped drive letters. I promise to pass it along if I can get it.

39
Thanks for your reply, ATH. I wish that was all it was. But I have it on good authority that MS intends to do away with supporting mapped drives eventually. It is just a matter of how soon they fully implement it. Fully qualified, or 'UNC', paths will be the required way to link to a file on a shared network location. From what I understand, it has something to do with ensuring security and constant compliance with other network functions.

Sorry I did not get the email telling me you had replied here as well. I have been tied up trying to fix this problem with all the links being wrong in a major company database, because the users tried to force a solution on a problem and made things worse instead of better.

I should add that your explanation is probably true as well! Thanks for the detailed explanation.


What no one can figure out is why this has never happened in the past. The users have always used the search function in Explorer, and it always returned the mapped drive letter (or so I have been told; I do not personally do data entry, so it could be that no one ever tried using search and they just don't remember it that way).

40
This is an unusual situation that just occurred, and the techs from the company who wrote the software have no idea, and neither do I. But it is creating a huge problem.

This is a database program that has to be loaded with file links, and the only path it will accept is the mapped network drive letter path. For example, R:\docs\filename can be linked to the record in the database as the location for a specific file.

This is done by browsing to the R:\docs folder and scrolling down to find the correct file, which, when clicked, fills in the path as R:\docs\filename in the database record.
However:
If the same process is followed but the user decides to SEARCH for the file in the docs folder, then once the file is found and they click it, THIS time Windows Search fills in the database record for the file location using the full UNC path to the file, without using the mapped drive letter of R:.
The database program has a setting that will not allow it to accept ANY paths that are not specific to the mapped drive letter. So entries made with the UNC path (which is 100% correct in every way) are rejected, with an error saying paths other than the one with the mapped drive letter are not allowed.
I can't find any problems with the system. Everything checks out 100%, and as long as the user stays in Windows Explorer and scrolls down to find the file, it always uses the mapped drive letter.
Does anyone have any idea why using SEARCH in that mapped drive location would return a long "\\remoteserver\mappedfolder\filename" and fill that in instead? It will not work linked like that, but if they go back and scroll to the file from the folder it is in, it works properly every time.
And yes, the UNC path translates to the exact mapped network drive path.
This ONLY happens if they try to search for the file, and it must be something new, as no one has ever had this come up before.
My only hope at this time is to use the old Vsubst program, which always seemed to work right. But no one has ever reported a problem using search before now. Although I don't really see the need to use search when there are not that many files to scroll through. Still, I also don't see any reason for Windows to return the full UNC path when using search and yet return the normal R:\docs\filename when scrolling to it.
I have considered the possibility of rebuilding the Windows index on that system. I don't yet know whether this same bug affects all systems.
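If the search habit can't be broken, translating the results back might be scriptable; a rough PowerShell sketch (the function name is invented, and it only knows about drives mapped in the current session):

function Convert-UncToMapped([string]$Path) {
    # Compare the path against each mapped drive's UNC root (DisplayRoot)
    foreach ($d in Get-PSDrive -PSProvider FileSystem | Where-Object DisplayRoot) {
        if ($Path -like "$($d.DisplayRoot)\*") {
            return "$($d.Name):" + $Path.Substring($d.DisplayRoot.Length)
        }
    }
    return $Path    # no mapped drive matched; hand the path back unchanged
}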

Thanks

41
Thanks for the two replies.
4wd's, being the shortest, would be nice, but so far I can't get it to quite work. I am sure I did something wrong with the new modules I had to import. I saw it flash something in red, but I have not rerun it.
> UPDATE: The error says: Add-NTFSAccess : The 'Add-NTFSAccess' command was found in the module 'NTFSSecurity', but the module could not be loaded. So I am working on that one.
> It also says something about the file not being digitally signed. ;( Getting further just brings more messages telling me that, for some reason, the NTFSSecurity module is not going to run on this system. ;((
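The two standard fixes I keep finding for "not digitally signed" module errors are relaxing the execution policy and unblocking the downloaded module files; these are guesses rather than a confirmed diagnosis, and the module path below is just the usual per-user location:

Set-ExecutionPolicy -Scope CurrentUser RemoteSigned
# Clear the "downloaded from the internet" zone flag on every file in the module
Get-ChildItem "$env:USERPROFILE\Documents\WindowsPowerShell\Modules\NTFSSecurity" -Recurse | Unblock-File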

PS: I should add that this is a fully up-to-date Windows 10 v1709 (Creators Update) system, not Windows Server and not Windows 10 1703. That might make a difference. Also, the folders are accessed by the users over the in-house LAN, not directly from the system they are on. All folders have the default shares, with Admin as owner. The only thing I am trying to do is give each user access to their folder, which is a folder named for them. I normally add read/write for each user when I create the folders. It is ONLY this one share permission that somehow gets removed on rare occasions. The owner share for Admin is not changed when the other shares are removed.


Flatop0315: On yours, I modified it slightly, making use of the $source variable you set up in line 1 and reusing it in lines 5 and 8, so it would be more universal and only need to state the folder name once, using $source thereafter. It appeared to work fine on a short test, but it did something I am not accustomed to with the sharing permissions.
When I looked under the "Share" option, it plainly said "not shared", so at first I thought the command had failed.

However, when I looked under the "Share" option for that folder, it did show that user's name, but with the permissions set to 'Contribute'.
When I normally create these, I don't even see that option. I just click Share, add the username it is shared with, and set permissions of read/write.
I noticed that the arrow beside 'Contribute' would allow a change to 'Read/Write'. I tried to look up what the differences were and got even more confused.
But the main worry is that even with that user listed, the folder does not show up as being shared at all, even when checking under advanced system properties for all shared access.
PS: I did run the script under PowerShell as admin.

These folders serve as a backup for the employees' desktop, documents, and download files. They are added to on a daily basis and, while not perfect, have saved a few people from some major losses.

Any ideas on how to resolve this would be appreciated. Perhaps I still did something wrong; I am continuing to test both versions.

42
ATH: I am open to suggestions on how to prevent this in the first place. I am not even 100% sure the archive utility is to blame, because the timing of the problem is not really synced to the running of the archive program. I have considered other options, such as Windows 10 updates, which get blamed for everything from toe fungus to sunspots :o and I feel sure that the ability to remove user access from shared folders would fall right in line there somewhere. :Thmbsup:
To be honest, I do have an ulterior motive for looking for a command-line method of doing this.
Every time a new person is hired, there are a number of identical tasks that have to be performed using that new person's name and assigned password. It had occurred to me that I could probably set up a master script to perform what now takes about 30 minutes per new hire and get it down to maybe one minute, if everything could be scripted.
Part of that process involves creating these folders and providing the correct information for ownership and sharing access. Knowing how to do it from the command line won't get me a raise or make more money, but if the script could do it all, I would have an additional 29 minutes to play League of Legends every time they hired someone new. 8)
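The skeleton of that master script might be only a few lines; a hypothetical sketch (every name and path is invented, and it assumes local accounts rather than a domain):

$name = 'johnsmith'
$pass = Read-Host -AsSecureString "Initial password for $name"
New-LocalUser -Name $name -Password $pass                      # create the account
New-Item -ItemType Directory "D:\Employees\$name" | Out-Null   # create the personal folder
icacls "D:\Employees\$name" /grant "${name}:(OI)(CI)M"         # grant inheritable read/write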

43
We have a system that hosts a folder in which every employee has a sub-folder for their private use, named the same as their username on that system. There have been a couple of times when those per-user rights got removed. The folders are fine; the admin access is fine. But the user who stores files in a given folder loses all rights to it, both read and write. I believe this might be tied to a special archive program that runs every 3 months, but I have yet to track it down.

I wondered if there was a way to walk the directory, read the name of each sub-folder within it, and add read/write share permissions for each folder to the user with the same name as the folder.

The main folder, named "Employees", has sub-folders named johnsmith, fredwilson, maryjones, etc., for about 40-50 users.
The last two times it happened, I went through the list one by one and restored the read/write permissions. Since the folder names are exactly the same as the usernames (no spaces), I was hoping there would be a way to walk through the sub-folders under "Employees" and add read/write permissions on each folder for the user with the same name. The user named "johnsmith" would get read/write only to his sub-folder named "johnsmith"; same for "fredwilson" and "maryjones".
Below is an example layout.

C:\Employees\
                      fredwilson
                      johnsmith
                      maryjones

There are no loose files in the Employees folder, and no folders that do not belong to users with an account on that system. I just wondered if there was a way to handle the issue programmatically when it occurs: read in the name of the folder and add read/write permissions for that folder to the user with that name.

Would prefer PowerShell or batch, but whatever works. :) I am sure it will be some variation of the "icacls" command, something like:

for /d %%U in ("C:\Employees\*") do icacls "%%U" /grant "%%~nxU":(OI)(CI)M

But I am not sure if this is even close, as I seldom if ever use icacls.

44
Everything works, BUT I end up with two extra spaces in the filename, which I need to avoid if I can.
<edit> The leading space in the name turned out to be from the text file itself, a glitch. The only remaining problem is the center space between the variable and the added text. </edit>

Your code to extract the text file contents into a variable works, but for some reason it leaves a leading space in the variable. At the end, I need to append the word "menucopy" to that variable without a space between it and the original value.

The output of the script goes to a file named "%dmn%menucopy.png",
where %dmn% is the variable created from the text file and "menucopy" is just added text.
But I cannot seem to get rid of a leading space added in the variable %dmn%, and another one between it and the text "menucopy".

<edit> I forgot to setlocal EnableDelayedExpansion before adding the variables. Problem solved. Thanks for the input, 4wd; I needed your code to load the text into variables. </edit>

I was getting something like " jimsmith menucopy.png". :) Now I get "jimsmithmenucopy.png"!!! Perfect.

45
Absolutely correct, 4WD. It would be an automatic setup that I would add to another "master housekeeping" routine. I have a lot of submissions from people that contain the necessary data for a specific job. Part of this is a text file that contains the name the final output should use. Once the first program has it all compiled, it is just named "finished.jpg". The final step is to rename "finished" to the name of the person it belongs to, which is held in their name.txt file. I was hoping to find a way to use the contents of the text file to rename the jpg file; currently, I just go folder by folder to each "finished.jpg" and copy the contents of the name.txt file to use for renaming it.
Tedious, but it gets the job done.
If I find a better way, I will be sure to post it, in case anyone else could ever use it, as there appears to be a lack of utilities or scripts with that ability. I'm probably the only one who ever asked for one.
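Something along these lines might be that better way; a PowerShell sketch only, where the top-level D:\Jobs path is a placeholder for wherever the job folders live:

Get-ChildItem 'D:\Jobs' -Directory | ForEach-Object {
    $txt = Join-Path $_.FullName 'name.txt'
    $jpg = Join-Path $_.FullName 'finished.jpg'
    if ((Test-Path $txt) -and (Test-Path $jpg)) {
        $who = (Get-Content $txt -Raw).Trim()                      # the person's name, whitespace trimmed
        Move-Item $jpg (Join-Path $_.FullName "$who.jpg") -Force   # -Force overwrites an existing copy
    }
}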
Thanks All.


46
For example: I need to rename a file, photo.jpg, using the contents of a text file, "whose.txt".
In all cases, the text file will contain only one word, with no special characters or spaces, only normal alphabetic letters.
For example, whose.txt could contain the word "mypicture". Running the script would rename photo.jpg to mypicture.jpg, and do it even if it overwrites an existing file named "mypicture.jpg".

I know this is probably something simple, but it is late and I can't think. :(
I tried to set a variable to the contents of the text file, then use the variable to rename the jpg file, but I can't get things in the right order to work.
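A minimal PowerShell sketch of the rename, assuming both files sit in the current folder; Move-Item -Force covers the overwrite requirement:

$name = (Get-Content 'whose.txt' -Raw).Trim()   # e.g. 'mypicture'; Trim drops stray whitespace
Move-Item 'photo.jpg' "$name.jpg" -Force        # renames, overwriting any existing mypicture.jpg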

47
It may not even be possible in the way that I would like to do it.
There is a place in the document that refers to using a function in an Excel sheet to calculate cost for use by multiple persons, i.e.: (Click here to use pricing tool.) I can hyperlink those words to the Excel document, but doing so on my system won't provide the correct link when the document is opened on another user's system. The spreadsheet calculates multiples of the various prices shown in the Word doc.

I was wondering if there is a way to literally "embed" these calculations into the Word doc itself, so they don't show up unless someone needs them and clicks the link for multiple-user pricing. Or if the only way to make this work is to host the Excel file on our local network and link it such that it is pulled from the network when clicked.

Unfortunately, the document is for use by "all employees", and they may not be "in-house" when they click the link. So done that way, it would require hosting on a web-accessible location, and Internet access when clicking the link.

Are there options I am not exploring? Or a way to literally include the spreadsheet "hidden" in the Word doc?

48
I almost forgot I had left this 'hanging'. :huh:
That's what happens when everyone 'else' gets the flu bug.

Anyway, I took the lazy way out.

I set the working VBS file to run with Task Scheduler after the nightly backup, and added the creation of the list to the backup routine.
I did try including the VBS as an integral part of the backup.bat file, using "cscript //nologo emailit.vbs" as the final step, but this still gets skipped with no obvious reason or error. As long as I leave Outlook open, running the VBS as a stand-alone item with the scheduler works like a charm, so.. end of that issue.
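For anyone following along, the scheduled-task half is a one-liner; a sketch with an invented task name, script path, and start time:

schtasks /create /tn "EmailBackupList" /tr "cscript //nologo C:\scripts\emailit.vbs" /sc daily /st 04:30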

I am still looking at your setup, 4WD, but I must be doing something wrong, or trying too hard to make the text become an active URL. Or maybe it is that I really need to stick with Outlook where possible, because of all the automatic sync across devices with Exchange Server. I did get the latest BLAT, though, and will keep trying. The end result is an attached RTF file, which is good enough for now.

49
The following Outlook VBA is used to resize all images pasted into the body of an email to 50% scale. It works great for copy-and-paste screenshots that are too large for an email page.

Unfortunately, it doesn't stop with the body of the email, but continues on to shrink the logo.jpg in the email signature. Is there an easy way to exempt the signature portion of the email, or another option that would allow the VBA to work as it should on all images other than the one in the signature?

Sub ResizeAllPicsTo50Pct()
    Const wdInlineShapePicture = 3
    Dim olkMsg As Outlook.MailItem, wrdDoc As Object, wrdShp As Object
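    ' Grab the message currently open in the active inspector and its Word editing surface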
    Set olkMsg = Application.ActiveInspector.CurrentItem
    Set wrdDoc = olkMsg.GetInspector.WordEditor
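    ' Walk every inline shape in the message body; only pictures (Type 3) get rescaled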
    For Each wrdShp In wrdDoc.InlineShapes
        If wrdShp.Type = wdInlineShapePicture Then
            wrdShp.ScaleHeight = 50
            wrdShp.ScaleWidth = 50
        End If
    Next
    Set olkMsg = Nothing
    Set wrdDoc = Nothing
    Set wrdShp = Nothing
End Sub

50
Thanks all. :-[
I've gotta remember to stop making posts at 2 am. :huh:
After giving it more thought in the light of day, I realized that I was making life hard for myself. The reason for the RTF is that the written text can become active URLs once selected and clicked, whereas in a text file this does not happen.
If I could have "read" the contents of the file, whether RTF or TXT, into the body of the email line by line so that it retained the proper formatting, that would have solved the problem, as the email body is HTML.

However: after giving it a lot of thought, I decided that the person I was doing this for was going to have to give up and just open the attached file itself, and from there either copy the link to a browser or do with it as they may to get where they want to go.

But.. at 2 am.. sometimes things don't look the same.
I am also currently caring for two very ill flu victims :( (wife and daughter).

Either way, I have dropped the idea of making life easier for this person.
 
My only glitch in the works now is that there is almost NO way to get Outlook to SEND the blasted email once composed, except by use of the VBS script here, which is one I configured a year or so ago. It was designed to be a 'drag and drop' email app, which it works great for: you see a brief BLINK and the email is gone.

Unfortunately, I need to tie all this to the nightly backup, which is a batch script. I can use 'ipm.note' to build the email in batch, and it works great!
BUT it won't SEND it.
I can build the file in the batch and then use the VBS to send it..
But
so far I have been unable to reconfigure the VBS to use a specific file, rather than working as designed to capture the files dropped onto the script.
This is from a lack of knowing much about WSH, I am sure. The original VBS file was one I reconfigured for my needs at the time. And it still works perfectly for what I needed, as I use it every day to drag and drop all kinds of text files, zip files, etc. that I need to email to myself.

If you happen to see how I can set it to send a specific file at "C:\lists\listname.rtf" as the only attachment, that would do the trick.

Once the backup runs, I can call that VBS to send the resulting list.rtf to the person who needs it, and it would all happen "auto-magically" at 4 am every night when the backup runs, without me doing anything.
So I can go back to taking care of the ill. :(
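(If the VBS stays stubborn, driving Outlook from PowerShell might be a fallback; a hedged sketch with a placeholder address, and Outlook still has to be running with its profile configured:)

$ol   = New-Object -ComObject Outlook.Application
$mail = $ol.CreateItem(0)                          # 0 = olMailItem
$mail.To      = 'someone@example.com'
$mail.Subject = 'Nightly backup list'
$mail.Attachments.Add('C:\lists\listname.rtf') | Out-Null
$mail.Send()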

PS: 4WD, I think what I was doing back then was sending an image file? And I think we did end up with a BLAT script to do it? I will revisit that as soon as I get a few minutes. But if BLAT was how it worked, it may not "fly" now with the new Exchange Server setup and all the security checks on email configurations.
But it might work, and I will look at that option.
I just know for SURE that if I can get the VBS to handle a single named file at a specific location, it will 100% work.