There is something called non-interactive desktop heap memory. No matter how much RAM your system has, if you exceed this limit, Windows will report that there is not enough memory to complete the task.
Back in the XP days, you still had tools available that allowed you, the user, to control this 'desktop heap' memory. In my case I managed to run 50(!) instances of Excel on a computer with 2 GByte of RAM and an Athlon XP 3200 CPU. The reason? A customer had a serious server with 64 CPUs (Xeons) and 256 GByte of RAM. While this is nothing to sneeze at nowadays, back then it was an absolute and very expensive beast. But it had to run on Windows, and they were very displeased that they could not run more than 50 Excel instances at any given time. Yes, my weakling of a workstation was able to do the same as their beast.
Consultants and business processes inside the customer's environment generated Excel files for further processing on that particular server, and the main purpose of that server was to handle several thousand such processes at any given moment. So they, and we, learned the hard way about something called 'heap' memory. Luckily, one of the applications in the software suite we make already had quite a bit of Excel's functionality built in. That application was further adjusted to their needs, and with its included scripting language they were able to ditch practically all of the Excel-generating processes and replace them with our application, which then handled those several thousand business processes.
Since Windows Vista/Windows 7, Microsoft has removed the tools to manage 'desktop heap' memory manually; the operating system now manages it automatically. And yes, I checked: by then my workstation ran Windows 7 with 8 GByte of RAM on a 3rd-generation i5 CPU, and 13 Excel instances at once was the absolute maximum.
The XP workstation back then became unbelievably slow. Killing Excel instances one by one took at least an hour, if not more. All that time the workstation never gave the impression of being frozen, just of being terribly slow. Of course, everything was recorded in a test report sent to the customer, who then started to listen to our recommendation to swap out Excel for proper software. A relatively small test with 100 business processes doing their thing at once with our software was done with their server not even coming close to breaking a sweat. After that, it was not a hard sell any more.
Of course, my example does not apply to the original poster's situation; it is just there to explain why I know there is something like 'desktop heap' memory and that its limitations can seriously spoil your fun, no matter how many resources your computer has or what expectations those resources instil.
Links about 'heap' memory that may be of interest (or not):
https://docs.microso...esktop-heap-overview
https://docs.microso...tation-out-of-memory

The original poster should check whether it is still possible to execute the steps mentioned in the following Microsoft article in his Windows installation:
https://docs.microso...tation-out-of-memory

Microsoft does not recommend this at all, because it is highly likely one will get unexpected and negative experiences if they do. Improvements in the automatic handling of desktop heap memory have made Windows much more stable. Or much less friendly towards badly behaving software, whatever your viewpoint is.
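For reference, the setting those Microsoft articles revolve around lives in the registry, in the SharedSection part of the "Windows" value. If you just want to *look* at what your system currently uses (without changing anything, which is where the unexpected experiences come from), a sketch along these lines should work in an elevated Command Prompt; the key path and value layout below are as I remember them from the documentation, so verify against the linked articles before touching anything:

```shell
:: Inspect the current desktop heap configuration (read-only, no changes made).
:: Inside the printed "Windows" value, look for SharedSection=xxxx,yyyy,zzzz where:
::   xxxx = size (KB) of the system-wide shared heap
::   yyyy = desktop heap (KB) for each interactive window station
::   zzzz = desktop heap (KB) for each non-interactive window station
::          (this last one is the limit my Excel story kept running into)
reg query "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems" /v Windows
```

Editing those numbers (rather than merely reading them) is exactly the step Microsoft discourages nowadays, so treat this purely as a way to see what the automatic management has decided for your machine.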