Today, Nov. 20, 2017, is disorienting for me. I reach exactly the age, to the day, at which my mother died in 1983.
Yes, a person would have to be an obsessive number cruncher to know that today falls exactly as many days past my 59th birthday last summer as Mom lived past her 59th. And a person would have to be just a little insecure and into the mystical to be preoccupied with this fact.
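For fellow obsessive number crunchers, the date arithmetic behind a milestone like this can be sketched in a few lines of Python. The dates below are hypothetical stand-ins for illustration, not the actual birthdays involved:

```python
from datetime import date

def same_age_day(child_birth: date, parent_birth: date, parent_death: date) -> date:
    """Return the day on which the child reaches, to the day,
    the exact age at which the parent died."""
    parent_lifespan = parent_death - parent_birth  # timedelta measured in days
    return child_birth + parent_lifespan

# Hypothetical example: a parent who lived exactly 60 calendar years,
# and a child born Jan. 1, 2000.
milestone = same_age_day(date(2000, 1, 1), date(1970, 1, 1), date(2030, 1, 1))
print(milestone)  # 2060-01-01
```

Subtracting two `date` objects yields an exact count of days, so leap years are handled automatically; the day after the returned date is the first on which the child is older than the parent ever was.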
It’s not a health concern that’s on my mind. I fully expect to be as fine tonight when I drop off to sleep as I am typing this – spry, mobile, and with a nearly ideal body weight since losing 90 pounds nine years ago and keeping it off since.
But something just isn’t right – I will be older than Mom ever got to be.
Older than Mom. No way. Can’t be.
I am not ready to be senior to my source of wisdom and nurturing. Mom will always be older and savvier than I am.
Millie Morrison raised my older sister and me as a single parent, starting when that was too new even to be a phenomenon.
I will still look to Mom in my memories and in my modern day conception of her for a more mature perspective on things.
Of much more importance than the fact that Mom was our sole financial provider during 15 years when no child support arrived, she was our moral and intellectual guide. A high school English and Humanities teacher, Mom made sure we would love learning as we grew up by incorporating it into what we enjoyed.
When she saw I was fascinated by maps at age five, Mom introduced me to reading by going over the Indiana page in an atlas and teaching me to read our state’s city names. (I’m not sure if it was state loyalty or an abundance of short words; Gary, South Bend and Fort Wayne were great starter outers.)
In a few years I was a sports lover, helped along by the NBC Baseball Game of the Week, so Mom brought home sports-themed fiction and non-fiction books she bought through her school to keep my reading skills developing while I enjoyed the experience.
Mom earned her master’s degree, taking classes at a downtown Louisville college while our grandmother and other adults came over to child sit. Mom also enrolled in a non-credit institute to study the culture and politics of India.
She rose at 5 every morning, fed the beloved cats who owned us, then worked the crossword puzzle with coffee in hand — all before taking her son and daughter to school, then driving on to her job to teach other people’s children.
We didn’t think of Mom as extraordinarily disciplined as such, because we had no comparison. But in retrospect, oh my, how she epitomized efficiency and focus, while maintaining an easy, approachable manner (despite a son who tested her patience).
We did appreciate Mom serving as a fountain of information. Friends, students and even a television news anchorman at a station where my father had worked decades earlier would sometimes call asking her to resolve a grammatical matter, which Mom would do in seconds, then go back to cooking supper.
Books by Hermann Hesse, D.H. Lawrence and Alvin Toffler shared space on her shelves with parenting guides and her college yearbooks – and Mom’s issues of The Atlantic.
Their pages each month represented the eclectic spheres Mom would have seen more of in person, but for tight finances and her firm belief that her children came first.
Besides, Mom was happy sharing minds and hearts with the educated friends she had, in our area and other nearby places.
Of course, she gave us needed advice in a rapidly changing world. And today, as I reluctantly walk past that chronological point where fate took her from us, becoming – as impossible as it is to behold this – older than Mom ever was, she gives me the same advice on my number fixation about today’s date that she often did when I’d overthink and anguish about the shallowness and incongruity of the world:
Stanley Kubrick, in a bold artistic flourish, set out to film the movie by taking Polaroid pictures, then stringing them together. The need to make 30 photographs for a single second of screen time caused him to abandon the plan early, to the relief of a skeptical cast and crew.
I wrote the following letter Aug. 16, 2017 to my congressman, Rep. John Yarmuth, Democrat of Kentucky’s 3rd District:
Dear Rep. Yarmuth,
After careful consideration of the effects it may have on the stability of the nation, I write today to ask you to initiate the use of the 25th Amendment to the U.S. Constitution to remove President Donald Trump from power.
The amendment allows the Vice-President and a majority of the Cabinet to recommend the removal of the president in cases where the president is “unable to discharge the powers and duties of his office,” and allows the House and Senate to confirm the recommendation over the president’s objection by two-thirds vote.
Though it may appear that partisan loyalty by the Vice-President and cabinet members would impede the process, under Section 4 of the 25th Amendment, Congress has considerable input.
Instead of waiting for a cabinet majority to make the recommendation, Congress may by law provide for an independent body, described as a “disability review body,” which, with the Vice-President’s concurrence, could declare the president unable to discharge the powers and duties of his office and send its own written declaration to the Senate president pro tempore and the House speaker.
Twenty-fifth Amendment author U.S. Senator Birch Bayh of Indiana, concurring with former President Dwight Eisenhower, said the question of whether a president should be removed is “really a political question.” Bayh continued that the decision to invoke the 25th Amendment should rest on the “professional judgment of the political circumstances existing at the time.”
Today, President Trump’s performance in office has demonstrated ineptitude and instability which have endangered the security of the nation and the lives of millions of innocent Americans and residents of other nations. I strongly believe that circumstances show, based on Senator Bayh’s criteria, that President Trump is unable to discharge the powers and duties of his office. These circumstances include the president’s:
*Inability to abide by the 14th Amendment’s requirement of equal protection of the law, as shown by his use of derogatory sweeping generalizations of minorities.
*Failure to devote effort to his job in the crucial first months in office, constantly vacationing at his own resort while his agenda flounders in Congress.
*Hazardous and ill-considered nuclear saber rattling done on his personal whim, instead of relying on plural input by military strategists.
*Lack of basic linguistic skills, which undermines the communication with the public needed to ensure consent of the governed, and use of gaslighting trickery and evasive ad hominem responses to criticism.
*Refusal to sit for the crucial American tradition of independent media scrutiny, instead calling reporters enemies of the people, a verbal assault which undermines the 1st Amendment to the U.S. Constitution.
In the May issue of The Atlantic magazine, National Constitution Center president and George Washington University law professor Jeffrey Rosen, while acknowledging that use of the 25th Amendment’s never before employed involuntary removal mechanism on a president not incapacitated by illness “could trigger a political crisis,” added: “…(T)he constitutional test of the president’s being ‘unable to discharge the powers and duties’ of the office was intended to be vague and open-ended.”
Rosen added: “Because the Twenty-fifth Amendment was intended to leave the determination of presidential disability to politicians, rather than to doctors, nothing in the text or history of the Amendment would preclude the vice president, Cabinet, and Congress from determining the president is ‘unable to discharge the powers and duties of his office’ if they deemed it in their political interest to do so.”
The intractable and worsening dangers posed by President Trump’s clear inability to discharge the powers and duties of his office now outweigh any negative effects of the use of the 25th Amendment. Though sufficient votes in the House and Senate certainly would not exist now to remove the president, appointing a disability review body would communicate to the administration that President Trump’s fitness for the office is an issue that will very possibly result in his facing removal if he continues using his current tactics.
I urge you to propose a discussion on the prompt creation of a disability review body for the purpose of weighing the evidence on using the 25th Amendment to the U.S. Constitution to remove President Donald Trump from power.
“Hey, whose Grandpa didn’t tell some tales?” asked the headline on a New York Times obituary in February 2006 for television actor Al Lewis, best known for playing the vampire-ish Grandpa on “The Munsters,” the 1960s CBS comedy.
It was a lighthearted and deservedly cheery send-off for a brilliant character actor, political activist and restaurateur whose contributions to our lives ranged from a million escapist TV laughs to bold radical street activism.
The headline was also a colossal understatement. Al Lewis told more than just “some” tales like the embellished fish stories everybody’s grandfather leaves us with. In fact, the same obituary listed his age as uncertain. That’s because Lewis at different times had given two different birth years as his own.
He was born in either 1910 or 1923 as Alexander Meister. Or Albert Meister. In New York City. Or 287 miles from there in Wolcott, a small town in far upstate Wayne County, N.Y.
That town near Lake Ontario entered the Al Lewis narrative late in his life when a reporter trying to clear up the matter of the actual year in which he was born asked Lewis why no birth certificate bearing his identity could be located in NYC, his hometown. Lewis responded that he was not born in the Big Apple, but entered the world while his mother had briefly lived in Wolcott to work in a factory.
Sealing Al Lewis’ stature as the greatest man of mystery is that no birth record for any A. Meister can be found in Wayne County, according to IMDb.com, the online film and television database.
IMDb reported that days after Al Lewis’ death, one of his three sons announced that Lewis had in fact been born on April 30, 1923, not 1910 as the actor had claimed.
“Why the deception?” asked the web site Everything2.com. “It could’ve been part of his tryouts for ‘The Munsters.’ If he was born in ’23, he was actually a year younger than Yvonne DeCarlo, who was supposed to be his daughter. But by claiming to be 13 years older, perhaps he felt he’d seem more grandfatherly to the show’s producers.
“At any rate,” Everything2 continued, “it seems likely that Lewis told a bunch of stories about his youth, either to support his claims about his birthdate or just for the joy of telling stories.”
Al Lewis’ lifelong penchant for fudging brought anything but joy to historians and journalists, who often had to retract or revamp information they had confidently published about one of the television era’s most beloved and eclectic entertainers.
In fact, that New York Times obituary was the second one within days the nation’s newspaper of record published on Al Lewis, the latter correcting the first’s careless inclusion of already discredited information. The Times obituarist Dan Barry wrote that almost every claim Lewis made about his early life – his birth date and place of birth, his wartime adventures in the merchant marine, his education – was unverifiable and possibly false.
Among others were that Lewis had faced danger touring the maliciously anti-union Southeast to help John L. Lewis organize workers, rallied outside the White House in support of condemned immigrant anarchists Sacco and Vanzetti, performed as a clown in a traveling circus, sold hot dogs at Brooklyn Dodgers games at Ebbets Field, and in the mid-1960s hired Charles Manson to babysit his three children (he recalled Manson as trustworthy and caring).
Regarding Al Lewis’ educational resume, the imdb.com site added: “Although he claimed to have a Ph.D. in child psychology from Columbia University, the university has no record of it, under his stage name or his real name.”
Lewis’ reliability first came into question in the early 2000s, when his wife of two decades, Karen Lewis, preparing for her ostensibly 93-year-old husband’s hospitalization for an angioplasty, found documents showing he was in fact just 80. That was the first she knew of any age discrepancy, but the Times quoted her as saying the finding didn’t affect her feelings about him.
A reporter soon examined the actor’s commonly reported story that he had served as a paralegal in the trial of the Scottsboro Boys, a landmark civil rights case involving nine black Alabama teenagers falsely accused in 1931 of raping two white women.
A 1923 birth would have made Lewis eight during the trial (or college age if he were born in 1910). At whatever stage of life, Lewis said he learned of the Scottsboro Boys’ plight after his mother attended a rally for their freedom.
His mother, if one trusts the following Al Lewis recollection on the web site Everything2.com, “worked in the garment trades. My mother was an indomitable spirit. My grandfather had no sons. He had six daughters. They lived in Poland or Russia, every five years it would change. My mother being the oldest daughter, they saved their money, and when she was about 16 they sent her to the United States, not knowing a word of English. She went to work in the garment center, worked her back and rear-end off and brought over to the United States her five sisters and two parents. I remember going on picket lines with my mother. My mother wouldn’t back down to anyone.”
Nothing suspicious about that classic early 1900s immigrant working class bio.
Also perfectly plausible is the 6-foot-1 Lewis’ description of his playing basketball in his youth in New York City and later serving as an unpaid scout for NBA teams – but was he the very best scout in the game?
Lewis boasted to independent radio station WFMU’s blog: “You can call Marty Blake, the chief scout for the NBA, he lives outside Atlanta, and ask him who is the most knowledgeable man of roundball you have ever met. Without hesitation, he will tell you, Al Lewis.”
So Kliph Nesteroff, the author of WFMU blog entry “The Myths and Politics of Grandpa Munster,” ran that claim past Blake, who concurred: “He (Lewis) knew everything there was to know about basketball from the tips of your toes to the top of your head.”
However, Nesteroff also wrote: “Lewis liked to say he worked on the defense committee of Italian anarchists Sacco and Vanzetti. If there were any semblance of truth to this, it would have occurred when he was no more than five years old…. Neither was he in Washington, as he claimed, the night the American communists Julius and Ethel Rosenberg, sentenced to death for treason, were executed.”
It IS known that Al Lewis, living out his left-wing values, donated his time and most of his earnings from the two-year run of The Munsters to charities, particularly a program helping teenage runaways, who were proliferating in Los Angeles during the late 1960s. But the admiration one feels upon hearing of this altruism quickly turns to skepticism, when Lewis identifies one of those kids he brought under his wing:
“That’s how I met Charlie Manson. He babysat my three kids…. He sat for four or five hours, he amused the kids, he brought the guitar and he played, no big deal, no sweat.”
Back in the real world of documented facts, Lewis ran for governor of New York as the Green Party candidate in 1998, opposing Republican incumbent George Pataki. Like a precursor of Bernie Sanders and with an accent to match, Lewis toured the Empire State fervently condemning health insurance companies, polluting industries, U.S. wars, and corporate tax breaks that shifted the tax burden onto the poor. At age 88 (or 75?), he won 52,533 votes, above the 50,000-vote threshold for receiving automatic ballot placement in the subsequent election. Lewis decided not to make another run, however, citing long odds of being elected as a Green.
He sought to be listed on the 1998 ballot as “Grandpa Al Lewis” to gain momentum from his TV recognition. A state judge turned down the request.
Before The Munsters premiered in 1964, Lewis played New York City police officer Leo Schnauser in the comedy “Car 54, Where Are You?” from 1961 to ’63. Real police in his hometown loved the character, and Lewis did public appearances on their behalf. Relations 40 years later between police and radical candidate Al Lewis were cooler when the Green gubernatorial hopeful criticized police use-of-force practices as racist.
Everyone, however, was warm toward “Grandpa.” The name of Lewis’ most memorable TV character was how he was often addressed by political supporters, TV fans and customers at Grampa’s Bella Gente Italian, a Greenwich Village restaurant he founded and where his regular presence was a draw. Lewis would greet customers as they entered, chat with them, pose for pictures and sign autographs.
One unlikely-sounding distinction of Al Lewis’ that was in fact documented before millions is that he was once censored by Howard Stern. You read right, censored by Howard Stern, America’s chief potty mouth of the air.
Lewis, who discussed political issues with iron fervor, but free of obscenities on his own Saturday radio show in the early 2000s on New York City’s WBAI, once joined Stern in an outdoor rally against the FCC’s frequent fining of Stern and others for regular use of words banned on airwaves. Not realizing that his microphone was tied into a live broadcast of Stern’s show as well as the rally’s public address system, Grandpa told the crowd: “We’re here because we all have a purpose… And that purpose is to say ‘Fuck the FCC! Fuck ’em! Fuck ’em! Fuck ’em!’ ”
An uncharacteristically mortified Stern frantically slapped his hand on the mic to try to keep his fines from piling even higher.
“I really thought [he’d] lost his mind,” Stern said on the WFMU blog. “As far as I was concerned, my career was over because we’re on the radio live.”
For once, there was no doubting Al Lewis meant what he said.
Brian Arbenz loved Grandpa on The Munsters — and the radical left positions he took while running for office.
Forty-nine years later, Mr. Manring froze, fell to his knees on a rock and said he knew this was the spot…. “I hope I’m at peace now.”
Mr. Manring was a family man, a veteran and a machinist in a factory back in the era when blue collar jobs brought the wages and benefits to support a family and veteran status was naturally associated with being an upstanding person.
I still refer to him as “Mr. Manring,” rather than Roy Manring, because he was one of the adult volunteers for our Boy Scout Troop 54 in New Albany in the early 1970s. The form of address I used then for our adult leaders still seems proper to me.
Yet I was a rebel — then, as now. And when I beheld the green uniforms, the trademark salute and the combat medal-like layout of our merit badges, I would be aware of a contradiction between my love of scouting and my passionate opposition to the war in Vietnam.
Among our troop’s adult leaders, two — including my easygoing uncle Joe — were Spiro Agnew-admiring conservatives and World War II veterans.
Mr. Manring, however, didn’t show his ideological cards. He always maintained an easy smile and although his dark eyes were piercing and handsome, they were wide and innocent. He seemed perpetually to be an uncomplicated, contented, locally immersed average person, untouched by the controversies of the wider world.
But before I make him sound like the kind of person who might not follow the national news, let me recall a day when I was 11 and my mother astonished me by telling me that my very own scout troop’s adult volunteer had been a national news story in the early 1950s.
Mr. Manring was interviewed on the Today Show. That’s the NBC Today Show, with the whole country watching.
And no, this wasn’t one of those chance interviews with tourists hanging out by Rockefeller Center. Mr. Manring was a guest in the studio, telling the nation of a medical miracle, as Mom passed the story on to me. She said that while he was in combat in the Korean War, he was hit with a barrage of gunfire and survived having nine bullets in his body at once.
I don’t recall her giving any more details, except that Mr. Manring seemed homespun during the interview, and I got the impression that he was almost casual talking to the nation about such a hellish experience.
I don’t know which Today Show icon conducted the interview, but I figure a polished and professional Hugh Downs or Dave Garroway sitting with a warm and folksy laborer made for as unlikely an encounter as when the unknown Korean villager and the teenager from Southern Indiana faced each other down for a horrifying moment in a war waged by superpowers so beyond the reach of either.
Flash forward to the era of the internet. I now find out that Mr. Manring’s voluminous wounds weren’t from one-to-one combat.
He had been taken prisoner by the North Koreans and was one of 42 U.S. captives shot on a hillside while their hands were tied behind their backs. It was a massacre.
Records were vague and for decades Mr. Manring had understood that he was the sole survivor. Hence, the solo Today Show appearance.
A historian researching the atrocity in the mid-1990s found that in fact five people had survived — three of whom were still living — and persuaded the Army to give the trio medals to note the suffering of all 42 of the POWs. The Pentagon also offered them a trip back to South Korea in 1999 to let them try to identify the exact spot of the massacre so a plaque could be placed there on the 50th anniversary the following year.
From the British newsreel on the Waegwan massacre
One of the three was not physically up to the trip — so my former scout leader and a fellow survivor, a private first class who had been Mr. Manring’s friend during the war, traveled to a hillside near Waegwan, South Korea. (The friend, who of course had been presumed dead by Mr. Manring for more than 40 years, lived in Cincinnati, just 110 miles away, all that time. When Mr. Manring learned that his buddy in fact had not been killed in the massacre, he jumped in his car and drove straight up I-71 to reunite with him).
In 1999, the return trip to Korea commenced, and a Boston Globe reporter accompanied Mr. Manring and his Cincinnati friend all the way to Waegwan. She reported that Mr. Manring had taken not the nine bullets I recall in my mother’s telling, but 14 — including five from what we call “friendly fire.”
The Globe, detailing the horrible events on the hillside in 1950, said that after the North Koreans left the 42 Americans for dead, the bullet-ridden Mr. Manring began to hobble away from the killing site, only to be shot at by a U.S. unit which was unable to identify his tattered uniform.
Ravaged seemingly beyond hope of survival by both sides in a war euphemistically called a “police action,” he spent 18 months in hospitals in Korea, Japan and the United States. Amazingly, as a boy scout, I never recall detecting a limp or a stammer or any other indication that this happy and laid back man could ever have been victimized by violence on such an historic scale.
For a long time, even some of those closest to Mr. Manring didn’t fully know either.
He told the Globe reporter: “My kids knew I was an ex-POW, but they didn’t know what I had been through…. I didn’t want to talk to anyone about it, except my wife.”
The reporter watched Mr. Manring and his buddy examine the terrain around Waegwan for hours, patiently trying to match what they were seeing with 49-year-old memories. Then, in one instant that brought back an anguish utterly unlike the easygoing mood his New Albany peers knew in him, Mr. Manring froze, fell to his knees on a rock and said he knew this was the spot.
Shuddering, he described to the Globe how on that day in 1950, just after the North Koreans pulled out, his grandfather’s image appeared to him, put his arm on the shoulder of the bloodied 18-year-old and warned him: “They’re coming back, get out of here.”
The reporter and others in the entourage then allowed Mr. Manring and his friend a few minutes each alone on the hillside.
Mr. Manring returned, the Globe reported, then whispered:
“I talked to the boys. I hope I’m at peace now. I begged their forgiveness. I have dreams about them all the time. I feel guilty that I survived.”
There was one more profound memory the visit brought out, one which the reporter said caused Mr. Manring to be overcome with emotion.
Speaking softly, he said to her: “I’m going to tell you something I’ve hardly told anyone…. I shot a little Korean girl — she was maybe 8 or 10 years old.”
Mr. Manring then recounted a kill-or-be-killed moment in the early days of the war. His platoon was approached by a group of refugees, but when he took out his binoculars, he saw a girl among them holding a grenade — with the pin removed — forcing him, with no time to think, to become a killer in order to be a lifesaver.
He shot the child, and the grenade exploded at her feet, killing many of the refugees rather than her intended targets. Even though some of the refugees were found to be wearing North Korean uniforms under their civilian clothes, Mr. Manring, almost a half century later, thought of the person who nearly lobbed a live grenade at him and his colleagues first as a little girl, not a guerrilla.
“I put a bullet in between her eyes,” he told the Globe, sobbing. “She bothers me to this day.”
Also around the 50th anniversary of the war, Mr. Manring discussed the incident with a student historian from Indiana University Southeast, who quoted him recalling the little girl on a website: “She comes and sees me every now and then. She asks me, ‘Why, why did you do this to me?’ I told her, ‘I’m sorry honey, but I had to.’ ”
After describing to the student the wartime policy of a ruthless North Korean government of using civilians of all ages as homicidal infiltrators, Mr. Manring added that he would again respond the same way to seeing the child pull the pin.
Reading the full story of the anguish in our cheerful scout volunteer’s past opened my eyes to the dual role of soldiers as victims and offenders in war.
This has always complicated peace activism by rendering expressions of appropriate sympathy for them vulnerable to being twisted into pro-war spin.
Hesitating to kill in a combat situation because of awareness of the enemy’s humanity is precisely what combat training is designed to prevent, as though such a moment is a fatal weakness. It is in fact our greatest strength.
Regarding the two directions from which the gunfire came that ravaged the teenage Mr. Manring, I was socialized during my childhood to see being shot by the other side, or one’s own, as polar opposite phenomena.
One is heroic and noble, the other an absurd boondoggle.
Yet if we accept the overriding principle of our religiosity that we are put in this world to love one another, are not all war wounds from friendly fire?
“Accidental” describes not just the five American-made bullets that hit Mr. Manring, but the whole scenario of a young man from New Albany and counterparts from equally insular villages on the Korean peninsula being whisked from lives of community involvement and small scale economics not to meet and interact, but to kill or be killed.
Roy Manring donated many hours helping our scout troop’s leaders teach me and my young colleagues to work together pitching tents, preparing food, hiking and telling folk tales – fitting his volunteering in around the customary 40 hours a week of conscientious factory work when American industrial jobs were in their prime. His was precisely the day-to-day mundanity which boys of my youth turned to war comics to escape, in pursuit of a glamorous warrior narrative we believed was at the heart of our gender’s identity.
We did not see that the time spent quietly adding to lives by one’s own initiative – rather than imperiling lives, one’s own included, by robotically adapting to an arbitrary and unnatural state of enmity – constituted Mr. Manring’s true moments of valor.
There was a time when one office co-worker or member of the lunch bunch was the go-to person for questions like, “Who was Lincoln’s first vice-president?” or “What year did ‘Jeopardy’ premiere?”
I remember that time well because I was the one people turned to for an instant “Hannibal Hamlin” or “1964.” Then came Google on iPads, and my principal role in the group was obsolete.
As with all who find themselves displaced by technology, I had to find new skills, in this case to keep my sense of validation rather than my employability.
For a while, that was tough! Gradually, though, I learned that I can have a purpose in the group by – this is so simple it is embarrassing – just being a pleasant person. I’d put that: by just being me, but the problem was, “me” had equaled “knowledge” for as far back as I could recall. Being the brain was a great gig for so long that I complacently stuck with it, until my support system was yanked away, forcing me to access the many parts of myself I had been ignoring. So, thank you, Google!
That’s the positive angle on the new, less cerebral, more personable me. There also have been unhappy developments which have influenced this change.
Months after Robin Williams’ stunning death in 2014, his loved ones laid out how he simply could not control the genius currents constantly running through his mind, pushing him always to observe, create comedy and dazzle, a three-step process that had long been as natural, even automatic, to him as breathing.
His stuck-on mind was so fast that being humorous on the spot became a command more so than a talent. He began hallucinating, then experiencing dementia through a condition called Lewy body dementia, so named because a protein called alpha-synuclein is abnormally deposited in the brain in configurations known as Lewy bodies.
No, I have never had that, or experienced anything like Robin Williams’ reported symptoms.
Nor has my mind reached the level of dysfunction endured by Phil Ochs, an outspoken folk singer in the early 1960s. He was my kind of person: left wing, esoteric and fearless.
The son of a World War II army doctor, Ochs produced biting satire which attacked shortcomings he saw among progressives, as well as excoriating capitalism and racism.
Colorful, handsome and daring, Ochs had high standards for his art and for left activism. He occasionally argued with members of his own audiences over pronouncements they shouted.
Yes, I can identify with Phil Ochs, primarily because his depth of understanding was a burden. In a society of snappy phrases and sound bites, getting elaborate messages out through pop culture eventually is futile, I believe.
I figure that may have been one of the factors in his losing his mind in the 1970s, even becoming dissociative from his own identity.
His changes seemed innocent at first. His music’s ardent leftist tone softened as Ochs did songs of centrist Americana, and he grew nostalgic for the martyred brothers John and Robert Kennedy. Then the changes kicked into rapid gear. After becoming homeless, Ochs was diagnosed as genuinely believing he was someone else – a self-invented persona he named John Butler Train (after JFK and William Butler Yeats). And he believed that he, as Train, had killed the great folk figure Phil Ochs.
He eventually regained his identity and seemed clear headed and contented if apathetic while living with relatives on Long Island, N.Y.
He did child care for nephews and nieces, played cards and did little else, acting blasé about his musical achievements and the political struggles wrapped up in them. Internally, however, Phil Ochs was not so sedate. He committed suicide in 1976.
Again, my strains in life have not been as great as what Ochs faced, but had I achieved some national stature, who knows?
I’ve gone through some similar outlook adjustments. In my late teens and early 20s I took on the world, often championing leftist causes in my writings in mainstream and college media, in letters to the editor of newspapers, and over Marxist nations’ shortwave radio stations. (Shortwave was then essential in much of the world, but in the 1970s and ‘80s it was followed by only five percent of Americans, generally introverts and the NSA unit that created dossiers by monitoring letters like mine.)
Suddenly feeling worn down by the absence of results in the me-generation society around me and put off by the sectarian splits on the world’s left, I started seeking social democratic change. I became more mellow and less strident, much in the same manner as Phil Ochs had, and started feeling more affinity with the society’s better angels. I even gave a tip of the hat to John and Bobby Kennedy.
I saw great progress possible via better social policies like family planning, gun controls, mass transit and restorative justice.
My path has resembled Phil Ochs only to a limited extent, but considering his end, the similarities have been enough to give me pause.
If there is one thread common to what I have chronically experienced and the derailments of Williams’ and Ochs’ lives, it is overthinking. I am constantly aware, hyperaware, of meanings to be drawn from events, encounters and statements, even unscripted ones.
“You’re too much of an empath,” one acquaintance told me after I described how a stranger’s momentary frustration that morning over one of life’s rough spots was sticking with me all day.
Yes, I have trouble letting simple events I observe remain simple; I must fight the ingrained habit of referencing everything to the realm of complex ideas, concepts and polemics.
While sauntering along an apartment walkway to visit a friend in what happened to be the year 2001, I was greeted unexpectedly by a pleasant, chatty little girl on a trike. She looked a little like the character Josephine Floyd, who speaks to her father in a picturephone call in the movie 2001: A Space Odyssey.
Most people would be content to note the nominal resemblance and move on. My instincts for drawing parallels wouldn’t nearly be sated with that.
I instantly decided to write a column for the monthly peace and justice newspaper I edited, telling how the encounter with this charming child crystallized in my mind the differences between the future year envisioned by Arthur C. Clarke and Stanley Kubrick and the real 2001.
That actual year was about whether little children would get health care, a home and eventually a job beyond Taco Bell, not whether life on a space station would beckon their fathers, who in the real 2001 may be known to the Josephine Floyds of the nation more for their signatures on child support checks.
That column was never written, a sign that my red hot penchant for epiphanies was beginning to cool.
Then came social media, which reconnected me with elementary school chums with whom I’d had almost no contact for decades. That opened my eyes to something else that might be an imbalance in my mind – it turns out I have precise memory skills that are astonishing, maybe even spooky to some.
“You mean everyone can’t do that?” I asked a 6th grade classmate from 40-plus years earlier when he was flabbergasted at how I, from memory, tagged everyone in his copy of the class picture – in two minutes. No, that is not a normal skill, I learned.
Though neither he nor any of the other schoolmates I checked in with after joining Facebook thought it troubling that I could, say, remember particular questions they had asked our teachers during lessons on adjectives and adverbs or South American geography, I became a little self-conscious.
Was this newfound ability a gift, or was it creepy? After all, some of these folks about whom I could remember such details were people I had never actually talked to back in school.
More to the point of my present agenda, would it be an obstacle to improving my social contacts – just another reminder that I have always been different?
When asked by an innocently smiling person from way back, “How do you remember all this?” I, perhaps out of a sudden awareness that this could indeed be a problem, or just being lightheartedly self-effacing, told her, “I’m forgetful impaired.”
Truth is, my bigger situation is hyperawareness. And as a method of treatment, I am experimenting with being less precise about arcane data. In conversations, I’m saying, “that was more than 30 years ago,” instead of my traditional habit of citing the exact number when it is not essential to the topic.
I’m asking myself how much empathy I should hold onto over a complete stranger missing a bus this morning, or a driver pulling away without realizing a soft drink had been placed on the car roof.
Or a telephone customer to whom I gave the wrong serial number on a model railroad set sold at the store where, at age 18, I worked my first job, forcing her and her husband to drive across town on a snowy night in pursuit of a coveted product that, it turned out, we did not have. Yes, self-forgiveness is another issue involved in my being “forgetful impaired,” perhaps better described as an inability to let go.
I’m also using paraphrases more when they will do instead of exact quotes in recounting statements by public figures, or a judge’s ruling on a water rate hike, or my 2nd grade teacher when she taught us what homonyms were in 1966 – uh, make that more than 50 years ago.
Brian Arbenz, a self-published author and independent journalist, lives in Louisville, whose residents may notice he seems less deep in thought these days.
The United States of America will not become fascist at noon on Jan. 20, 2017. That’s not cause for much relief, however, because while Donald Trump’s taking the oath of office will not make us fascist, it will continue, and perhaps accelerate, a steady move toward autocracy and totalitarianism that has been underway longer than most of us have been alive.
Even if it were Hillary Clinton on the Capitol steps with her hand on the Bible, the scene still would be of a new leader of a continuing system which, because of specific actions taken, largely functions through conspiracy, not the consent of the governed; and of a capitalism in which obtaining and protecting great wealth has more often been based on anti-competitive cartels – hatched at the corporate heights and accommodated at the small business level – than on anything approaching free markets. Democratic presidents of the last 25 years have pushed this trend along about as vigorously as have Republicans.
This must not, however, be taken as a cynical excuse to drop out on the grounds that all systems are destined to be corrupt, so carve out the best deal you can for yourself. Not at all!
The American system has been used by the people to create the Civil Rights movement, the polio vaccine, anti-lynching laws, great music and theater, rural cooperatives, feminism, multi-racial labor movements and same-sex marriage equality.
“The premise of America,” as one friend of mine recently put it, is what is good and worth supporting.
Sabotage by secretive plutocrats has regularly stymied the many leaps forward in the struggles to make free speech, the right to peaceably assemble and the equal protection of the law real, and to make them serve people, not just protect assets.
My wish is that the timeline below might serve as something of a strategy in reverse. It’s not meant to be a gloomy litany that explains how we got to our current danger, but a list of reversible mistakes. I believe correcting them ought to be done to get us back on course toward fulfilling that premise.
There were many thefts of our rights before 1947, including the Palmer raids, the massacre of the Bonus Marchers, the post-Haymarket Bombing crackdown and several U.S. Supreme Court decisions giving corporations — once regarded as licensed to operate in the public interest — 14th Amendment rights, even as the Plessy v. Ferguson decision denied black Americans that amendment’s equal protection of the law.
More to the present time, I see the following 70-year trail which has steadily eroded pluralism to protect corporate assets, confused licensed privilege for absolute entitlement, replaced the educible exchanges of honest discourse with the circular logic and ad hominem attacks of ideological tribalism, and turned media from scrutinizers to stenographers. All this eventually enabled a cagey celebrity of mediocre business ability to unleash groundless xenophobia and claim the presidency:
1947 — Taft-Hartley Act passes over President Harry Truman’s veto. Along with curbing many labor union practices allowed under the Wagner Act passed a decade earlier, Taft-Hartley gives the federal government the power to order all unions to refuse membership to anyone with communist affiliations. Instead of prosecuting people for specific acts of sedition, this preemptively bans political activity based on certain ideologies, solely because their agendas threaten corporate profits.
1947 – Screen Guide for Americans, an 11-page document, is published by Ayn Rand, an anti-egalitarian with close ties to the House Un-American Activities Committee. Though the guide stressed it was for voluntary consideration, Rand also reported a long list of filmmakers to the FBI over themes and specifics in their movies (she told the bureau Frank Capra’s It’s a Wonderful Life contained communist propaganda, and she once condemned The Best Years of Our Lives for a negative portrayal of business and even for suggesting that veterans should receive collateral-free loans).
Rand’s Screen Guide went way beyond cautioning Hollywood against producing outright radical messages. Warning of “snide little touches which communists sneak into scripts,” the guide attempted to micromanage the industry to “present the political ideas of Americanism strongly and honestly,” which Rand said required movie plots to favor business and profit.
1948 – General Motors and other companies are let off with virtually no punishment after a federal court finds they illegally destroyed the nation’s urban passenger rail system, forcing car dependency nationwide. This dependency is portrayed as “The American Love Affair with the Car” by media that ignore GM’s giant illegal trust and the courtroom travesty that re-shaped the American city and set the stage for wars over oil. Beginning on its own in 1926, then 10 years later through a dummy corporation called National City Lines, GM bought rail lines in order to shut them down and vastly increase sales of its buses and cars. By 1946, National City Lines controlled public-transit systems in more than 80 cities, from Los Angeles to Baltimore. Standard Oil of California, Mack Trucks, Phillips Petroleum and Firestone Tire joined GM in backing this sham operation.
“These companies, that had probably eliminated systems that in order to reconstitute today would require maybe $300 billion… were individually fined $5,000,” the documentary film Taken for a Ride said. The film said GM and its co-conspirators kept cutting back rail service deliberately to make riding less attractive. Former L.A. railway worker Jim Holzer said: “…the less attractive, the fewer riders. And then they say, `Well see, we can’t make any money.’ So they abandon it.”
1949, 1950 – Creation of NATO, with permanent stationing of a peacetime U.S. military in many European nations, ends the constitutionally compelled practice of demobilizing the army after a war. This is followed by the U.S. entering the Korean War without the declaration of war required by Article 1, Section 8 of the U.S. Constitution. President Eisenhower in his farewell address a dozen years later acknowledges that a huge permanent army combined with a vast arms industry is “new in the national experience” and is the harbinger of a Military-Industrial Complex through which the “potential for the disastrous rise of misplaced power exists and will persist.” Military spending is greatly increased in three of the next five presidencies, with no changes made to address Eisenhower’s warning.
1950 – McCarran Act, passed by overriding a sternly worded veto by President Truman, gives the government power to order any group which is communist — or even which favors any position in common with communists — to turn over all its internal documents, in blatant violation of the Fourth Amendment’s right of the people to be secure in their houses, persons, papers and effects from unreasonable search and seizure. As with Taft-Hartley but far more broadly, political beliefs — rather than actual seditious acts — are criminalized. Senators Richard Nixon and Lyndon Johnson and Representative John F. Kennedy vote for the McCarran Act and vote to override.
President Truman, writing in his veto message that adequate government powers already existed to prevent seditious activity, said, “In a free country, we punish men for the crimes they commit, but never for the opinions they have,” and he called the McCarran Act “a long step toward totalitarianism.”
1953 – U.S. and Britain overthrow Iran’s democratically elected government of Dr. Mohammad Mossadegh and install the Shah’s dictatorship, keeping secret their role in this coup until 1975. The U.S. under Truman and Eisenhower had expressed concern about Mossadegh’s ability to withstand a hypothetical coup by an Iranian pro-Soviet faction called the Tudeh Party. However, the U.S. National Security Archive acknowledged, “The joint U.S.-British operation ended Iran’s drive to assert sovereign control over its own resources and helped put an end to a vibrant chapter in the history of the country’s nationalist and democratic movements. These consequences resonated with dramatic effect in later years. When the Shah finally fell in 1979, memories of the U.S. intervention in 1953, which made possible the monarch’s subsequent, and increasingly unpopular 25-year reign intensified the anti-American character of the revolution in the minds of many Iranians.”
1953 – Secretary of State John Foster Dulles begins corralling church leaders and pastors to tell them to equate U.S. foreign policy with Christian values.
1954 – Eisenhower signs a law adding “under God” to the secular Pledge of Allegiance, over the objections of the daughter of the Pledge’s author, Christian minister and socialist Francis Bellamy (1855-1931), and in violation of the First Amendment’s prohibition of state establishment of religion.
Eisenhower explained in a Flag Day speech the next year that he re-wrote the Pledge of Allegiance as part of a drive to “strengthen those spiritual weapons which forever will be our country’s most powerful resource in peace and war.” Though History.com said he valued religion, the site noted that Eisenhower had left his family’s faith as a young adult and was baptized a Presbyterian only after becoming president in 1953.
1954 – Eisenhower approves plan refused by Truman for the CIA to overthrow Guatemalan democracy ostensibly to stymie communism, but actually to benefit the finances of United Fruit Company, the board of which includes two high CIA officials. Democratically elected president Jacobo Arbenz Guzman had begun land reforms to reverse confiscation of small farms by the wealthy landowners and United Fruit. The New York Times censors its own correspondent in Guatemala to make its story on the coup match more closely the State Department lie that Arbenz was overthrown in a popular uprising. As a result of the coup, the worst human rights abusing dictatorship in the Americas is brought to power. Various rulers for the next 40 years commit genocide, torture and rape against Guatemalans, with several officers trained at the U.S. Government’s School of the Americas. In the 2000s, the U.S. admits it taught torture techniques at the SOA to Latin American and Caribbean military members.
1954 – Louisville civil rights activist Carl Braden is convicted of “criminal syndicalism” in local criminal court over a bomb that exploded in a home recently bought by the family of Charlotte and Andrew Wade, who were black, in the previously all-white Louisville suburb of Shively. Wife and husband Anne and Carl Braden had helped the couple obtain the home, and the leftists who guarded it were then accused of having planted the bomb to foment a communist-sought race war. Selective evidence distorts the Bradens’ political leanings to persuade the jury. A federal court soon rules that states can’t charge someone with sedition, freeing Carl Braden from prison, but the case is used to red-bait the Civil Rights movement and hamper its ties to the left.
1956 – Eisenhower further breaches the First Amendment by signing a bill making “In God We Trust” the motto of the U.S. and requiring it to be printed on dollar bills.
1961 – Presidents Eisenhower and John F. Kennedy authorize attempts to assassinate Congo’s prime minister Patrice Lumumba and Cuban president Fidel Castro. Lumumba is known as pro-labor union and Castro has nationalized corporate wealth. Attorney General Robert Kennedy is aware that in its attempts on Castro, the CIA is working with criminal mobsters he seeks to prosecute.
Speaking on Nov. 16, 1961 at the University of Washington, JFK falsely assures the world that the U.S. will not use assassination, or other tactics of totalitarian powers, such as releasing false information and assembling counterfeit mobs, both of which Eisenhower did to overthrow Arbenz in Guatemala.
1964 – The facts surrounding an attack by North Vietnamese torpedo boats on a U.S. ship in the Gulf of Tonkin are incorrectly presented to Congress, mostly through honest error, by the Lyndon Johnson administration. Johnson never clarifies or corrects what was later found to be a much smaller attack on the ship, which itself appeared to be poised for an attack on North Vietnam. Indications of oil under the South China Sea motivate LBJ to send more than 600,000 U.S. troops into South Vietnam. His refusal to raise taxes to pay for the sudden full-blown war prompts monetary and fiscal moves that send inflation surging, reversing Johnson’s progress in reducing poverty. Increased welfare use soon results.
1967 – U.S. Senator Birch Bayh’s proposed constitutional amendment to eliminate the Electoral College and rely on direct popular election of the President fails for the first of six times over the next 10 years. Segregationist Strom Thurmond of South Carolina blocked the amendment in committee in 1969 to enable George Wallace’s second try to win enough electoral votes to throw a presidential election into the House, part of a plan to bargain to rescind Civil Rights laws. A conservative filibuster ended Bayh’s final try to pass the amendment in 1977. In 2006, the Indiana liberal and others launch an effort to effectively end the Electoral College through a voluntary interstate compact, described by Bayh on C-SPAN, but it has made scant progress.
1969 — President Richard Nixon declares a war on illegal drugs. By the early 1980s, the U.S. has the largest percentage of people incarcerated of any democracy, and more than half are nonviolent offenders, the Prison Fellowship said in 1984, refuting the unsubstantiated popular media image of a “soft on crime” judicial system. In 2013, the NAACP said that about 14 million whites and 2.6 million African-Americans have reported using an illicit drug, yet African-Americans are sent to prison for drug offenses at 10 times the rate of whites.
The NAACP said African-Americans serve virtually as much time in prison for a drug offense (58.7 months) as whites do for a violent offense (61.7 months).
1976 – The Money as Speech legal doctrine begins to take shape as the Supreme Court, in Buckley v. Valeo, strikes down limits on personal spending for a political campaign, in a case brought by Senator James Buckley of New York’s Conservative Party and independent presidential candidate Eugene McCarthy, a longtime Minnesota Democrat. The court says spending money to influence elections is a form of constitutionally protected free speech. Future rulings culminating in 2010’s Citizens United establish abjectly unequal corporate money as the principal power in U.S. elections, as Political Action Committees and soft-money TV ads target state legislators around the nation, letting corporate money leverage control of many statehouses.
1977 – U.S. Supreme Court, in a suit brought by Arizona lawyers, declares state Bar Association bans on lawyers’ TV and radio advertising infringe on their free speech rights, in one of the key cases involving the concept of “commercial free speech” and the doctrine of “free speech absolutism.” Free speech absolutism, which had been devised by liberal justices, is used by the highest court in striking down campaign finance reforms, while the 14th Amendment’s equal protection clause is not invoked.
1981 – The Ronald Reagan administration breaks the air traffic controllers union, inspiring similar moves in the private sector. The administration also refrains from using constitutional and legal powers allowing the federal government to challenge corporate mergers, which are spurred by its tax cuts.
After 35 years of merger mania, 10 corporations make virtually every food product familiar to U.S. grocery shoppers, the anti-poverty group Oxfam America said.
1983 – New York Times magazine story by Floyd Abrams headlined “The New Effort to Control Information” details how the Reagan administration is making the shaping of perception paramount. That fall, American reporters are told a “shoot to kill” policy will enforce a rule that they not cross into the combat zone of the brief U.S. landing on Grenada. Both political parties begin using consultants to shape their policy statements in more favorable, simpler terms. The next year, both parties’ presidential candidates skip the live, unrehearsed interviews traditionally done on Meet the Press and similar shows.
1984 – Federal Communications Commission, after heavy lobbying by the broadcast industry, allows more TV and radio stations to be owned by a single entity, then shortly allows TV networks to cross-own cable channels. Today, six giant corporations own almost all the nation’s popular media. This same FCC de-regulation order ended the 38-year-old requirement that commercial TV stations include community service programming. In 1946, the FCC required local air time be given for community civic and religious issues and local talent; in 1960, the requirement was generalized to “community needs and interests” based on FCC guidelines. The 1984 de-regulation order said that because competition had increased, “commercial necessity dictates that broadcasters must remain aware of the issues of the community.” Yet under such marketplace forces, public affairs programming and minority-oriented shows largely vanish from local TV, and nightly news reporting becomes shallower and more entertainment-driven.
1987 – Declaring: “The perception of broadcasters as community trustees should be replaced by a view of broadcasters as marketplace participants,” FCC chairman Mark S. Fowler persuades the body to end the Fairness Doctrine, which had required owners of broadcast licenses to present both sides of controversial issues considered to affect the public interest.
A Democratic congress votes to reinstate it, but can’t override President Reagan’s veto. During Bill Clinton’s presidency, a bill to reinstate the Fairness Doctrine can’t get out of committee, and candidate Barack Obama in 2008 announces he will not seek the doctrine’s return, saying he would instead push for more diverse ownership of broadcasting and for independent low-power radio. Broadcast Journalist Nancy Graham Holm, writing in 2014 in the Huffington Post, recalls that the Fairness Doctrine’s demise lessened the depth of TV news and frequency of public affairs programs in the Oakland-San Francisco market where she worked in the ‘80s, and caused the number of one-sided right wing radio talk shows nationwide to skyrocket.
1990 – U.S. Sen. Jesse Helms, a North Carolina Republican, mails 125,000 misleading letters to homes in black voting precincts informing residents of jail time for vote fraud in an unapologetic attempt to reduce black turnout in a toss-up race against former Charlotte mayor Harvey Gantt, who is black. Helms’ campaign soon settles a Justice Department complaint over the race-specific nature of the mailings, but at least one other Republican candidate, in Indiana’s 8th District, copies Helms’ tactic during a tossup race. Republican controlled state legislatures soon begin devising needless voter ID requirements to depress turnout of minorities, students and women.
1995 – President Clinton signs an Anti-Terrorism bill in the wake of the Oklahoma City bombing which includes a provision placed by Republican senators greatly reducing the chance for a convicted state prisoner to appeal sentences to the federal judiciary through a longtime process called writ of habeas corpus. The bill said no Federal court may grant habeas corpus to a state prisoner if state courts had decided his or her claim on the merits — unless the state decision was “contrary to, or involved an unreasonable application of” federal constitutional law.
Backers of the diminished writ wanted to make carrying out the death penalty easier – to curb crime, they asserted. Yet, three months after Oklahoma City bomber Timothy McVeigh is executed in federal prison in June of 2001, al-Qaida kills nearly 17 times as many people at the World Trade Center and the Pentagon.
1996 – President Clinton gets applause during his State of the Union address when he announces a “one strike and you’re out” policy of eviction from public housing for any criminal activity. What he does not add is that the “one strike” rule also calls for eviction if a “tenant, any member of the household, a guest, or another person under the tenant’s control,” commits serious crimes, including possession of illegal drugs. The Journal of Criminal Law and Criminology reports the policy put law-abiding residents into the streets. By the late 1990s, the One Strike policy and some Welfare Reform provisions are seen as causing a jump in homelessness, under the radar of most media, who focus on excellent all-around economic numbers.
1996 – The Helms-Burton Law, following the Torricelli Law of four years earlier, attempts to stop investment in Cuba even by other nations, through elaborate bureaucratic interference in private business outside U.S. jurisdiction. Despite expressing initial interest in opening trade with Cuba, presidents George Bush and then Bill Clinton support Torricelli and Helms-Burton, motivated by their respective re-election concerns. This represents a new ability of PAC money to fragment foreign policy, long apparent also in the power of lobbies for Israel, China trade and Saudi Arabian interests.
1996 — Clinton signs the Telecommunications Act into law after massive lobbying by corporations moves it through Congress with “no” votes from only five senators and 16 House members. Despite glowing predictions of more competition from the act, cable TV and telephone rates have risen, and today, more than 90 percent of media is owned by six corporations: Viacom, News Corporation, Comcast, CBS, Time Warner and Disney. In 1983, 50 corporations owned 90 percent. “Never have so many been held incommunicado by so few,” said Latin American journalist Eduardo Galeano about the Telecommunications Act, before which corporations were limited to owning 40 radio stations. Today, Clear Channel owns 1,240.
“Before the ink was even dry on the 1996 Act,” wrote S. Derek Turner, research director of Free Press, in a 2009 report, “the powerful media and telecommunications giants and their army of overpaid lobbyists went straight to work obstructing and undermining the competition the new law was intended to create.”
2001 – After the 9/11 attacks kill 2,800 people, the Patriot Act is hastily enacted, giving the government greater powers to enter and search homes without informing the occupants, unless they are charged. A reauthorization in 2004 creates a Total Information Awareness central computer, overseen by Iran-Contra one-time felon John Poindexter, with the clearance to record every U.S. resident’s credit card purchases, magazine subscriptions, medical prescriptions, web site visits, e-mails sent or received, academic grades earned, bank deposits made, trips booked and other personal activities.
2010 – President Barack Obama establishes tribunals, rather than jury trials, for suspected national security violators. In coming years, the president aggressively cracks down on government and contractor employees who leak classified information. The Justice Department acknowledges it secretly seized AP reporters’ phone records while investigating a potential CIA leak, and targeted a Fox News reporter as part of a criminal leak case. No journalist was charged with a crime. After an outcry, the Justice Department issued new guidelines limiting when journalists’ records can be sought.
2012 — Obama signs National Defense Authorization Act, of which Sections 1021 and 1022 give the federal government broader powers to arrest anyone, including U.S. citizens in or outside the country, on suspicion of terrorism-related crimes, and to imprison them indefinitely without trial.
A judge’s voiding of these powers as unconstitutional was reversed by an appeals court and the House has twice voted down efforts to rescind them. President Obama said upon signing the NDAA he would strictly avoid using it to infringe on rights, but in 2012 the Huffington Post farsightedly asked whether successor presidents would feel the same way. Dan Johnson, founder of People Against the NDAA (PANDA) told Huff Post:
“The 2012 NDAA’s detention provisions apply to anyone, anywhere. But who is most likely to have the NDAA used against them? It depends on how you define the word terrorist. The Department of Homeland Security said that individuals or organizations ‘reverent of individual liberty’ and ‘suspicious of centralized federal authority’ pose a threat. The state of Georgia calls publishing ‘public records’ terrorism. The FBI added the director of an anti-fracking film to the terror watch list…. The government won’t define ‘terrorist,’ in order to keep their options flexible.… Under Section 1021, anyone who has committed a belligerent act, which even the government could not define when questioned in court, can be detained indefinitely, without charges or trial, as a ‘suspected terrorist.’ ”
2012 – A report in the Southern California Law Review says the concept of “Quality of Life” policing has run amok, with officers in many cities telling young black men socializing on sidewalks to “move along,” then arresting them for loitering or trespassing if they don’t move, even though the men’s congregating is perfectly legal. Exorbitant bail and squalid jail cell conditions often lead to guilty pleas when no crime has been committed and no reasonable suspicion existed. QOL policing was started in New York City during mayor Rudolph Giuliani’s term, ostensibly as a way to stop small offenses from mushrooming into major crime. In New York, offenses such as graffiti, public urination, panhandling, littering and unlicensed street vending were targeted, but so was legal congregating.
“A misdemeanor conviction can deprive a person of a driver’s license, public housing, student loans, or legal immigration status. Even an arrest record can interfere with job prospects, and most employers say they check criminal records before hiring,” said report author Alexandra Natapoff. Noting later that one of the worst abusers of the practice was Ferguson, Mo., where riots broke out over a fatal shooting by police, Natapoff said: “Such wrongful convictions represent the convergence of two of our criminal system’s worst flaws: its racial skew and its rush to convict.”
2012 — Facebook manipulates selected users by changing the number of posts considered “positive” or “negative” to see if the users’ comments showed that their moods were affected. Washington Post digital reporter Caitlin Dewey wrote in 2014 that this experimentation, while “creepy,” was allowed in the fine print of Facebook users’ Terms of Service. A 2014 article in the New Republic, noting the heavy, even exclusive reliance by many politically minded people on Facebook for news, theorized that the social media site could swing entire national elections through users’ newsfeeds, if it wanted to.
2016 – U.S. presidential election characterized by false news, absence of substantive discussion of issues, and complete disappearance of due process, particularly via the theft of Hillary Clinton’s e-mails. Immediately after Donald Trump wins the presidency, Russian oligarchs’ interference in the election to help him is revealed, and evidence grows indicating involvement by Vladimir Putin. President Obama reportedly knew about the Russian interference during the race, but balked at ordering an investigation after being intimidated by Senate Republican leader Mitch McConnell, who threatened to accuse the White House of partisanship if it probed the interference.
Brian Arbenz is a writer, researcher and resister of whatever oppression comes. He lives in Louisville, Ky.
I saw my father infrequently growing up – I mean once or twice a decade – so I do not at all identify with Robert Bly’s assertion that males are collectively wounded by the transition to industrial society that resulted in their fathers leaving home for eight hours a day.
These dads came back each evening, right, Mr. Bly?
When I was 20, on the advice of a sibling, I decided to give a father-son relationship another try. So, in 1979, I boarded a plane for Albuquerque to spend a week with George A. Morrison.
I didn’t really have a sense of who he was, and on a brilliant August day, as my plane crossed the sensually tan Sandia Mountains and landed at the Albuquerque Sunport, my lack of familiarity with him set me apart from most of the 400,000 residents of the city. My father, for 10 years in the 1950s and ‘60s, had been New Mexico’s best-known television news anchorman, delivering daily 6 and 11 pm newscasts which – for lack of another TV market in the state during many of those years – were beamed statewide. That’s a territory that would stretch from Louisville to Minnesota.
After my dad earned a law degree, he left the news business, but remained highly recognized while serving as assistant district attorney for Albuquerque, frequently talking on the air about high profile cases.
So, in 1979, instead of my father showing me his home state, I had the inverted experience of being introduced to him by New Mexico.
In the three trips I had made in 15 years to the Land of Enchantment to visit my father, I had learned that governors, senators and the University of New Mexico football coach were cohorts or acquaintances of his. Two of Dad’s close friends were author William Eastlake (Dad and other friends had helped him choose the title of his signature book Castle Keep) and Clarence Birdseye Jr., whose father’s invention of frozen foods still determines the itinerary of your grocery trips.
In a room filled with my dad’s friends from New Mexico, it seemed the only one whose life story I wouldn’t already know was the one who had sired me.
I knew he was a Democrat and had from time to time been mentioned as a possible candidate for governor, a quest which could have succeeded before Watergate gave media the mandate to report on personal missteps such as the philandering and heavy drinking my father did until his early 50s.
What kind of Democrat was he? I heard him say good things about civil rights (he had once served as the legal counsel for the Zuni Indian tribe), but overall supportive things about the Vietnam War (he told me of a passionate argument with the very anti-war Eastlake). An English lit degree holder from U of L, Dad was, by any standard, pro-civil liberties, and he once oversaw the consumer protection division of Albuquerque’s Bernalillo County — but knowing he came of age in the 1930s and ‘40s, can you hazard a guess about which issue would prompt this otherwise enlightened intellectual to lapse into bigotry at the drop of a hat? Or, more precisely, at a gesture or an enunciation that struck him as effeminate?
I don’t mean my father would ridicule anyone in their presence, but at his apartment during my 1979 visit, I saw him launch into a tirade of insults as we watched a brief TV segment featuring an interview with a man he figured was gay. Suddenly, I saw the Male High School football star and World War II submarine warfare veteran my father also had been.
But there was one more famous person for Dad to introduce me to on this trip. I asked if he knew U.S. Sen. Harrison Schmitt, a first-term New Mexico Republican. In keeping with Dad’s Robin Leach-like knack for associating with the rich and famous, yes, he in fact worked down the hall from and occasionally chatted with Schmitt, who went by his nickname Jack. Dad said he would be glad to try to arrange a meeting.
The senator, my father added, was a political wunderkind, winning election in 1976 as a dogmatic conservative counterpoised to unions in such a pro-labor state. Of course, four years before that, the geologist Jack Schmitt had walked on the moon on Apollo 17, the grandest and most successful of the six lunar landing missions.
Extra-terrestrial glory can obscure a clash in political philosophies – or in the case of John Glenn, even ease the effects of being mired in the S&L scandal.
So Schmitt wasn’t that extremist out to break your union. He was a space hero, who had turned moon dust into politically magic dust.
In the 1990s and 2000s, the ex-senator Jack Schmitt became a climate change denier, repeatedly condemning the theory of human causes of global warming as fiction by an environmental movement he has described as the place communism essentially migrated to after the opening of the Berlin Wall. (Makes sense; I mean there was no environmentalism here before 1989, was there?)
So, we’re talking quite a chasm to bridge when I shake hands with Senator Schmitt. Could it get tense?
No. My fascination with space would make meeting Jack Schmitt an apolitical thrill. And if memories of his three moonwalks at Taurus-Littrow weren’t enough, Harrison H. “Jack” Schmitt also had been my first same-sex crush. I mean minutes after the Apollo 17 crew returned from the moon, splashing down near Samoa, I saw him without his helmet for the first time and… well, he had just returned, and I was now off TO the moon.
That was the instant I, as a 14-year-old, knew I was bisexual. I never told my father of this, and didn’t care to seven years later during my 1979 visit, but wouldn’t Schmitt’s office have been a bizarre venue for that? Imagine coming out to your father, a senator, an astronaut, a veteran journalist, a Republican, a Democrat and a homophobic district attorney all at once!
I don’t know whether I would have been prosecuted, disinherited, evicted or pepper sprayed. You would have read about me in a news story datelined Albuquerque, that’s for certain.
This explosive moment of familial and political drama never happened, though. Schmitt wasn’t in town during my visit.
That is not surprising. You see, the senator went on to be defeated in 1982 – the only loss by an astronaut in nine U.S. election races – and the big issue raised by Democrat Jeff Bingaman (and yes, Dad knew him, too) was that the incumbent simply was never in the state, physically or ideologically. Schmitt was constantly touring, speaking about the cause of mining the moon, an issue absolutely irrelevant during the severe recession of 1982, which had focused voters’ attention on the here and now, not on rocks a quarter-million miles away.
Today, his status as a private citizen gives Schmitt the mobility to challenge the overwhelming scientific consensus on global warming and to press his still passionate cause of building a thriving lunar extraction industry, which almost every other scientist and financier dismisses as pie in the sky.
Oh, and as for my crush – I don’t even remember what I saw in the guy.
This column is from Brian Arbenz’ book “Lost And Found In Louisville.”