DECEMBER 2, 2025 – Back in the 90s, a couple of guys cashed in on their religious belief by punching out a whole series of novels about end times, starting with their lead-off book, Left Behind. I was never interested in reading any of the formula-laden volumes, but I knew people who had, and from them I heard the gist of the religious theme: The rapture is manifest when Christian “believers” are snatched from whatever they were doing—walking down the street, driving a car, flying an airplane—and pulled up (never down or sideways, though once you get yourself into outer space, what’s “up,” “down,” or “sideways,” anyway?) into heaven to sit at the base of the Holy Throne, occupied by God Almighty Himself, with Jesus standing in total peace nearby, shepherd’s crook in hand and flock of sheep at his feet. Nice work if you can get it. But you couldn’t achieve eternal life and salvation—not only from sin, but from harm, pain, disease, and high health care premiums—if you weren’t a bona fide believer. The non-believers? They all got . . . left behind.
In the spirit of full disclosure, I’m not sure that I’ve accurately portrayed that scene at the seat of God. In fact, I’m pretty sure that I never heard enough details to describe anything of what became of the folks who “disappeared,” except that they apparently left their clothes behind. If, in fact, that’s what happens to “believers” in real life (“death”?), then we’d have to conclude that heaven is a de facto nudist colony. So much for the naked truth—as peddled by two Christian “believers,” who left their readers behind on the way to the bank.
However little I can represent about the fate of the believers, at least as portrayed in Left Behind, what I know for certain about the books is that those who were left behind were, well, SOL, as it were.
Left Behind and SOL is exactly how I felt when visiting our younger son’s office today. Recently, he’d moved into larger quarters in the attractive office space his firm occupies in the heart of West Hartford, Connecticut. It was not unlike the various office-building offices that I called home for most of my waking hours between 1982 and the advent of Covid—March 11, 2020, to be brutally exact: interior and exterior windows; L-shaped desk with matching credenza; stylish desk chair; art on the walls; on the window shelf, framed photos of his photogenic son and bride (reminiscent of framed photos on my window ledges, featuring (photogenic) Beth, Byron and his brother); a computer system on the desk; and most important of all—a full complement of “high count” snacks, courtesy of a trip or two to Costco, squirreled away in obscure corners of his setup, and therefore relatively secure from pillaging.
So far, so good. In side-by-side photos of my offices over the years and his, things in each would look remarkably similar—perhaps reassuringly so. At least as I surveyed his digs, I felt a certain sense of stability and security about things: over a period of 43 years, during which many aspects of our lives have changed beyond recognition, something as foundational as an “office” where day-to-day business is conducted, has withstood change.
But just then I happened to glance at his three computer screens. They were loaded with numbers, trading symbols, and fine print. Numbers and trading symbols on computer screens aren’t anything novel in the finance industry. Nor is fine print, necessarily, but huge volumes of AI-generated summaries of the fine print? Now we’re talking new-fangled. The “new-fangled” was explained to us when Beth happened to ask (astutely and, at the same time, naively), “How much do you rely on AI?” In response to her question, Byron patiently showed us the summary of a recent earnings call regarding a publicly traded company. In one column was a transcript of every participant’s questions and comments. Some were nearly the length of the screen and covered an extensive discourse on one matter or another germane (we can presume) to the company’s performance and prospects. In an adjoining column, Byron pointed out the “new-fangled”: the AI-generated summaries—concise restatements of what had taken fairly articulate humans three times as many words to convey.
He then toggled to another screen and demonstrated how AI is applied to tracking various customized investment parameters for selected companies.
Next, mostly to humor me, Byron pulled up ChatGPT and entered a query, “Explain to me the dynasties of China over the past 5000 [!] years.”
“Voilà!” as our two-year-old grandson loves to say, along with “Magnifique!” and, whenever I prompt him, “CHI-naa.” As you can imagine, within seconds this popular AI platform presented what appeared to be a detailed outline of the history of Chinese dynasties from the beginning of time.
In reaction to all this, Beth said, “So we don’t need to know anything anymore.”
That’s been the case for quite some time (welcome to Google and other search engines), but advances in AI have produced quantum leaps in what machines can know and do and . . .
“But as the slogan at Mounds Park Academy [where both of our sons attended high school] went,” said Beth, “‘We don’t teach kids what they should know but how to think.’” Yet, I thought, it seems that now machines can “think” as well as “know.” I was unnerved by the thought that Margaret Mead’s (original) quotation, “Children must be taught how to think, not what to think,” is fast becoming obsolete. As I observed Byron adapting to this brave new work world, I began to feel . . . left behind. (Cont.)
© 2025 by Eric Nilsson