a red hot cyber suspense tale

“Last month, the standards editor at The New York Times wrote a memo that shocked -- shocked! -- bloggers everywhere. He asked Times writers to avoid using the word ‘tweet’ (as in, ‘to say something on Twitter’).
“‘We don't want to seem Paleolithic,’ he wrote. ‘But we favor established usage and ordinary words over the latest jargon or buzzwords.’
“That the Internet’s reaction was so swift and harsh only proves the point: the techno-savvy population can't even conceive of the existence of a less savvy crowd. If you use jargon every day, you can't imagine that millions of people have no idea what you're talking about.
“I do a lot of public speaking. And even today, when I ask my audience how many know what Twitter is, sometimes only a quarter of the hands go up.
“The response depends a lot on where I'm giving the talk and the audience’s age. But one day it occurred to me: how would they know? All of these buzzy social networking sites like Facebook and Twitter sort of crept up on us. The government never mailed fliers to every household explaining what it’s all about.”
-- from the July 7, 2010 New York Times op-ed piece, "For Those Facebook Left Behind" by David Pogue (which I found through the headline on my personalized iGoogle home page).
I have not heard the phrase "digital divide" much in a while. But I did read the recent articles about Finland making access to broadband a legal right and President Obama pledging to expand broadband access throughout America. To me, a librarian who considers such tools essential for leveling the playing field on which we educate students in the twenty-first century, this expansion of broadband is a positive thing.
Not that I think everything about our brave new digital world is a plus. There are incredibly important privacy issues to be resolved -- if such resolution is at all possible. I also know that too many of us spend far too much time sitting in front of our laptops (like I am right now). It is summertime and I really should be making like Thoreau.
(I am no Twitter aficionado. I think I already mouth off more than enough without starting to tweet, too.)
You can bet that I am going to be a bit more mindful about the real dangers inherent in our digital world after reading nonstop through BRAIN JACK, a breathtaking, fast-moving, futuristic cyber-thriller about high school-age kids, in which the population groups for and against the increasingly pervasive nature of the Internet end up warring against one another.
It all begins with Sam Wilson sitting down in a cafe in lower Manhattan and hacking into the computer system in a high-security building next door. Sam is a high school student and a hacker with extraordinary skills. His goal is to alter some files on a server in the Telcoamerica building so that he can get himself and his best friend some free hardware. He succeeds, but in escaping detection (once the tech people at Telcoamerica realize that their system has been compromised), he causes enough damage that it takes three days for the Internet across America to work at full speed again. The payoff for this invasion of Telcoamerica's system is a pair of top-of-the-line laptops and a pair of Neurotech neuro-headsets that are soon thereafter delivered to his door. These neuro-headsets are the newest in technology advances: one puts on the headset and it provides thought recognition (you work on the computer using your mind, in the same manner that we, today, have voice recognition programs for word processing).
But what Sam missed, as he escaped out the back door of that cafe next to Telcoamerica, is that there was a security camera mounted out there.
Sam's next caper: hack the White House in order to participate in a top-secret hackers' conference:
"He scanned the disk structure of the big server.
"There were over thirty disk drives attached to the machine. He scrolled through the list
Richie Partington, Richie's Picks
July 2010