This should be gently amusing to you, gang, and I get to put a different spin on it.
Computing tends to have two common (and a rare third) presentations in movies. One is varying levels of blunders by the writers, with a (moldy?) form of Fridge Logic, because they play the odds of the audience thinking it all sounds good. (Ya know, you'd think the bean counters could override PHB managers by saying "hey, let's give That Techie Guy We Know a pizza and have him do a sanity check", but apparently not, because managers seem to have the strongest voice in a corp and say "Nah! I want another $1,000 explosion for three seconds!") For me, the funniest is the legendary NCIS "2 idiots 1 computer" example, because it's not even "An IP Address can do anything!" — it's two people playing a keyboard like a piano duet! https://www.youtube..../watch?v=u8qgehH3kEQ
One step up is when they flash actual code on the movie screen, though it turns out not to be terribly dramatic. But at least it's real code, usually. That's where this site comes in. At your level, it's a nicely organized time saver showing where those chunks of code actually come from, in classic internet fashion: "one man's research, published to all". http://moviecode.tumblr.com/
1. The rare third is when computing is actually used at least coherently, better yet competently, or even brilliantly, to actually do something! That could be a fun separate thread!
2. However, the script of the NCIS scene clearly came from some writer, and the visual arts guy stuck with the episode decided to at least put a little more smoke and mirrors into it all. Because if you turn off the volume, there is an *epic* amount of *somethings* flashing by on the screen! I wish someone with Too Much Time On Her Hands (gender empowering!) would create a nice detailed list of them!