Well, the power of the original concept was avoiding tedious cut and paste in the first place. Not only does that save time, it helps avoid data transfer errors. Beyond the problems of cut-pasting one item at a time ("Where was I again?"), suppose you batch run it but had forgotten that you deleted a tab: now you have a corrupted batch and have to go debug it.
After a few days of rest, I went back to this. The way through was realizing that the "goal" was to import the data into a spreadsheet. So bugs aside (see below!), the only point of the two-line text file was the import step: a spreadsheet program can split the data on a text/tab delimiter. It's still easy enough to just load one column at a time, though, so that's fine.
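To make that import step concrete, here is a minimal sketch of pairing up such a file, assuming a hypothetical two-line format: line 1 is tab-separated URLs, line 2 is tab-separated titles. The function names and the sample data are my own illustration, not the actual program's output.

```python
import csv
import io

def two_line_export_to_rows(text):
    """Pair the URL line and the title line into (url, title) rows."""
    url_line, title_line = text.splitlines()[:2]
    urls = url_line.split("\t")
    titles = title_line.split("\t")
    # zip() silently drops extras when the lists differ in length,
    # which would hide a misalignment, so check lengths explicitly.
    if len(urls) != len(titles):
        raise ValueError(f"{len(urls)} URLs but {len(titles)} titles")
    return list(zip(urls, titles))

def rows_to_csv(rows):
    """Write rows as CSV text that a spreadsheet can open directly."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["URL", "Title"])
    writer.writerows(rows)
    return buf.getvalue()

sample = "https://a.example\thttps://b.example\nTitle A\tTitle B\n"
rows = two_line_export_to_rows(sample)
# rows == [("https://a.example", "Title A"), ("https://b.example", "Title B")]
```

Going through CSV is just one way to do it; pasting each line into its own column and letting the spreadsheet split on tabs works equally well.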
However, I may have found a bug. In the URL list the first URL comes first, as expected. But in the titles list, the first title seems to end up at the end! Because this whole use case is about lining up data rows, that shifts everything off by one, making every single pairing wrong. (I caught it the second time and fixed it by hand, but it's certainly a bug.)
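Until the bug is fixed upstream, the misalignment can be undone mechanically. This sketch assumes the behavior described above: the titles list is rotated so the first title lands at the end, in which case moving the last entry back to the front restores the order.

```python
def realign_titles(titles):
    """Undo the hypothesized rotation: move the last entry to the front."""
    if not titles:
        return titles
    return [titles[-1]] + titles[:-1]

# Hypothetical buggy output: "Title 1" belongs first but came out last.
buggy = ["Title 2", "Title 3", "Title 1"]
fixed = realign_titles(buggy)
# fixed == ["Title 1", "Title 2", "Title 3"]
```

Worth verifying against a small known tab set first, in case the shift is in the URL list instead or only happens sometimes.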
So otherwise I am almost satisfied with this aspect of the program.