The sordid reason Rhode Island abolished the death penalty

Spend any time in Rhode Island and you quickly learn a few things: It’s not an island; it would appear to have the most corruption per capita outside of Sicily; and for such a small entity the Ocean State has more than its fair share of interesting stories.

Consider that Rhode Island hasn’t executed anyone in more than 170 years. Part of the reason is that the last man to die at the hands of the state was almost certainly railroaded, a victim of anti-Irish, anti-Catholic, anti-immigrant bigotry that was prevalent in many areas of the United States into the 20th century.

This particular story begins on Dec. 31, 1843, when textile magnate Amasa Sprague finished supper at his Cranston, RI, mansion and went for a walk. Sprague was powerful both in physique and prominence.

He was a New England Brahmin, and together with his brother William owned a textile business started by his father William Sprague Sr. The Spragues owned several cotton mills in Rhode Island, but their most profitable factory was the print works in nearby Spragueville, which printed calico patterns on cloth.

The A & W Sprague Co. employed most residents of Spragueville, owned the tenements they rented and the company store where they shopped. Amasa Sprague was a man to be feared.

During Sprague’s after-dinner walk, he was accosted by at least two individuals. He was shot in the right wrist and struck with a blunt instrument on the left side of his head, then the right. Despite desperate attempts to fight back, Sprague was overcome and killed.

There was no shortage of potential suspects, according to the New England Historical Society.

Logo of A & W Sprague Co., showing Cranston, RI, textile plant.

There was talk that the murder was politically motivated. The previous year, an individual named Thomas Dorr had been arrested for a failed attempt to force broader democracy in Rhode Island by setting up a rival government that would expand the vote to all adult white males.

Still governed by the state’s 1663 colonial charter, with its relatively high property requirement for suffrage, Rhode Island allowed only white, propertied men – about a third of the adult male population – to vote. The Irish, who were nearly all disfranchised under the colonial charter, strongly supported the Dorr Rebellion.

Sprague, like many wealthy white males, benefited from the system in place and, along with his brother William and brother-in-law Emanuel Rice, helped orchestrate Dorr’s downfall. Some speculated that supporters of Thomas Dorr, who would later be found guilty of treason against the state, assassinated Amasa Sprague, according to the New England Historical Society.

Others looked closer to home. William and Amasa Sprague apparently disagreed about what direction the family business should take. William wanted to expand the company beyond Rhode Island, while Amasa was content to continue the business at its current size and profitability. Neither man had a reputation for backing down when he didn’t get his way.

Suspicion also fell upon Nicholas Gordon, a tavern owner whose establishment was frequented by Sprague’s millworkers, much to Amasa Sprague’s displeasure.

Gordon’s tavern was attached to his home and was located in a section of Cranston that, in keeping with the decidedly indelicate sensibilities prevalent in 19th-century America, was nicknamed “Monkeytown” because of its Irish population.

“Amasa Sprague had successfully fought against renewing Gordon’s liquor license because, he said, his Irish millworkers were getting drunk during work hours and neglecting their jobs,” according to the New England Historical Society. “Gordon and Sprague had fought publicly. Sprague and Gordon had once met on a path and neither would give way. Finally Sprague grabbed Gordon by the collar and shouted, ‘Get out of the way, you damned Irishman!’”

The entire case was a fiasco from beginning to end. William Sprague resigned his U.S. Senate seat to supervise the murder investigation, an apparent conflict of interest.

Not only was Nicholas Gordon quickly arrested, along with his younger brothers, John and William, the Gordons’ mother and a friend of Nicholas’ named Michael O’Brien – because everyone knew the Irish always stuck together – but the Gordons’ dog was apprehended, as well. (The dog was later described by a defense attorney as toothless and old.)

William and John were tried first, with the Irish community rallying behind them and raising funds for their defense.

Ultimately, it was 29-year-old John Gordon, recently arrived from Ireland to join his brothers Nicholas and William, who took the fall for the crime. William was found not guilty, but John was convicted on contradictory circumstantial evidence.

How badly were the cards stacked against John Gordon?

In trials held at what was then the Rhode Island Statehouse, presiding judge Job Durfee told jurors to give greater weight to Yankee witnesses than to Irish witnesses. He added that they did not have to believe anything the Irish witnesses for the defense said because they were by nature unable to tell the truth, according to a 2013 report on the sordid affair by the Cranston Herald.

In addition, Henry Bowen Anthony, editor of the Providence Journal, the leading news source in Rhode Island at the time, provided the public with plenty of “facts” about Gordon’s guilt, many of them asserted without a shred of truth, the Herald added.

One of the pieces of evidence that convicted John was a broken gun found near the body of Amasa Sprague. Nicholas was known to own a gun, but it couldn’t be found in his house, so it was assumed the broken gun was his. After the trial it was discovered that William had hidden Nicholas’ gun under the attic floorboards, according to the New England Historical Society.

Nicholas was tried later, but he had an alibi, and the witnesses whose testimony had convicted his brother were suddenly unsure of their memories. His trial ended in a hung jury. His gun turned up just before his second trial, which also ended in a hung jury.

John Gordon was hanged on Feb. 14, 1845, in Providence. His last words were, “I hope all good Christians will pray for me.”

Many believed he was innocent and the victim of a legal lynching. Some 1,400 Irish came from Rhode Island, Connecticut and Massachusetts for his funeral. The procession took a detour to pass the Statehouse and the homes of the Yankee elite.

Seven years later, the Rhode Island legislature banned capital punishment, in part because of the travesty of John Gordon’s trial.

In 2011, 166 years after John Gordon was hanged by the state of Rhode Island, Gov. Lincoln Chafee pardoned him.

“John Gordon was put to death after a highly questionable judicial process and based on no concrete evidence,” Chafee said in 2011. “There is no question he was not given a fair trial.”

(Old Rhode Island Statehouse, Providence, where John Gordon and his brothers were tried for the murder of Amasa Sprague.)

Memorial Day: Remembering three men from three wars

It’s difficult to walk through any older Southern cemetery and not find gravestones identifying individuals who gave their lives for their country.

Even if one doesn’t include the hundreds of thousands of Confederate dead that dot cemeteries from Virginia to Florida, the Carolinas to Texas, there are many, many thousands who died in the line of duty, whether it was during the American Revolution, the War of 1812, the Indian Wars of the 1830s, the Spanish-American War, World War I, World War II, Korea, Vietnam, or the other conflicts the US has been involved in over the past 240 years.

In a small church cemetery in the South Carolina Midlands rest the remains of three men who died during three major conflicts in which the United States participated during roughly the first half of the 20th century.

They died at very different times and under very different circumstances, yet all are buried in Old Lexington Baptist Church Cemetery within about 15 feet of one another.

Milton Wilkins Shirey was a private in Company B, 31st US Infantry Regiment who perished of pneumonia on Dec. 12, 1919, in Siberia, at age 19.

Gravestone for Pvt. Milton W. Shirey.

US involvement in Siberia is a little-known aspect of the Great War. President Woodrow Wilson sent several thousand troops to Vladivostok in 1918, following the October Revolution, for a number of reasons, including aiding in the rescue of some 40,000 members of the Czechoslovak Legions, who were being held up by Bolshevik forces as they attempted to move along the Trans-Siberian Railway to the Pacific, from where they hoped eventually to make their way around the world to the Western Front.

Also, Wilson wanted to protect large quantities of military supplies and railroad rolling stock that the US had sent to the Russian Far East in support of the prior Russian government’s war efforts on the Eastern Front.

Weather conditions made the Siberian experience a miserable one. There were problems with fuel, ammo, supplies and food, and horses suffered terribly in the sub-zero Russian winter.

Troops struggled, as well. During the American Expeditionary Force’s 19 months in Siberia, 189 soldiers, including Shirey, died.

It took four months for the US government to get Shirey’s body back home to South Carolina, where hundreds attended his funeral in April 1920.

Pvt. Ulysess S.G. Shealy, 23, was killed in action Sept. 27, 1944, in Italy. Details of his service, unit, and where he was killed are sketchy, but online records do show that Shealy’s remains weren’t returned to the US for burial until March 1949.

Gravestone for Pvt. Ulysess S. Shealy.

Given that some 73,000 Americans from World War II are still listed as missing in action, though of course presumed dead, the mere fact that Shealy’s body was returned to his home state was no small feat.

Finally there is the grave of Sgt. First Class George Walter Koon. Koon, 36, enlisted in the US Army in 1936 and served for nearly 15 years.

He was taken captive by Chinese forces on Dec. 1, 1950, after the Battle of the Ch’ongch’on River, a fierce conflict between Chinese and American troops.

Military records show he died of neglect, specifically malnutrition, gangrene and dysentery, while being marched from Kunu-ri to a POW camp along the Yalu River.

Sgt. Koon was one of 11 individuals whose bodies were found in a mass grave by US authorities, assisted by North Korean officials, in 2002. In 2005, Koon’s brother Carl gave a blood sample and the military was eventually able to match it with the remains.

A funeral service for Koon was held in May 2008 at Old Lexington Baptist Church Cemetery, 57 years after his death.

Three men, ranging from a 19-year-old just out of high school to a career soldier nearly twice his age. Men whose causes of death ranged from illness, to wounds and neglect, to being killed in action. Men who died thousands of miles from their homes in the rural South. It was a scene played out, of course, all across the United States.

Each, sadly, is a story that was repeated tens of thousands of times in the 20th century alone. It continues today.

There are those who believe war is wrong under all circumstances; it certainly is a terribly unfortunate occurrence.

This Memorial Day many in the US will give little more than a glancing thought – if that – to the sacrifice of those who gave their lives for their nation. There are many in other parts of the world, including South Korea and Italy, though, who still remember.

Shining a light on anti-independence fallacies

Portrait of a boy with the flag of Wales painted on his face.

Among common canards used to thwart peaceful independence movements is the idea that the entity attempting to go its own way is too small, too poor, has too few people, etc.

These were arguments employed by those who opposed Scotland’s independence referendum in 2014, and who resist sovereignty movements in Catalonia and Corsica, among other regions of the world where a segment of the population is pondering an autonomous path.

But the blog Borthlas, focusing on the idea of Welsh independence from the UK – said by some to be impossible because Wales is “too poor” – raises interesting points:

Borthlas turns to a comparison of national per-capita GDP as a means to judge a region’s muscle, admitting that this is not an exact science because per-capita GDP tells nothing about the relative cost of living in a country.

“The population of a country with a low GDP per capita and a low cost of living might actually feel better off than the people of another country where both figures are higher,” the blog explains. “It also tells us nothing about the way wealth is shared out in a country – so the population of a country with a low GDP per capita but where the wealth is evenly shared might feel better off than the people of a country with a high GDP per head and huge inequality.”

But despite those caveats, per-capita GDP is still a good starting point to assess where Wales would fit were it an independent state, Borthlas writes.

  • According to International Monetary Fund figures, Wales would place 24th in the world in per-capita GDP were it independent of the UK, out of more than 170 countries;
  • The World Bank puts Wales at 27th, ahead of more than 150 other nations; and
  • The United Nations ranks Wales 31st, with some 160 countries beneath it.

Each organization has per-capita GDP figures for a different number of countries; currently there are roughly 195 recognized independent nations.

Map of Wales.

Wales fares relatively well among European Union nations, as well, ranking in the top half, according to Borthlas.

The reason it’s difficult for regions such as Wales, Scotland and Catalonia to gain traction when it comes to independence is multi-fold.

First, these areas are often compared economically to the countries of which they are a part. Wales and Scotland aren’t going to stack up very well against the UK as a whole, but then again, neither would England proper. But if there’s a place in the world for the likes of Andorra, Belize, Equatorial Guinea and Liechtenstein, entities such as an independent Wales, Scotland and Catalonia would not only have little problem surviving, but would almost certainly thrive.

Next, traditionalists, and certainly hidebound imperialists, are almost always reluctant, for psychological and political reasons, to give up that which they have spent centuries holding sway over.

Finally, the loss of any portion of a nation to independence means a loss of money, one way or the other. Some may point to a region such as Wales and say that it receives significant sums from the UK Treasury. However, Wales is denied sovereign control over its natural resources, including water, mineral and energy exports.

Ultimately, the bottom line tends to be the bottom line these days when it comes to adhering to the concept of self-determination.

Local leader fights for employees’ right to remain ignorant

One sometimes wonders if parochial politicians realize how narrow they appear when they express close-minded views, or if it’s actually their goal to put forth that perception in the first place.

Henry Reilly, a councillor representing the Mourne area of County Down on a local council in Northern Ireland, recently wrote a letter to a local publication complaining that workers employed by that same council were being queried about their Irish language skills.

“Workers are being asked if they have an Irish language qualification, how competent they are in Irish, if they would be willing to deal with enquiries from the public in Irish and if they would be willing to take a course in Irish. Staff are even asked if they would like to take such a course during working hours!” Reilly wrote to the News Letter.

Reilly added that council staff members who had contacted him expressed concern that their lack of knowledge of Irish or interest in learning Irish could harm their promotion prospects.

“It is clear to me that the implication of the audit is that having Irish will be a distinct advantage when working for the council,” he added. “This is wrong and discriminatory against the Protestant community.”

So here we have a government entity which, as part of its responsibility to serve its citizenry, seeks to assess the Irish-speaking capabilities of its employees. Understanding that not all employees may be able to speak Irish, it asks if they would be interested in taking a course in the language during working hours.

The council is willing to pay to enable employees to learn another language, to help them better serve the populace. But an elected official finds fault with that. Not because of the potential cost, or because it would potentially leave the council staff shorthanded during working hours, but because it somehow discriminates against the Protestant community.

As I noted when I first learned of this on the blog An Sionnach Fionn, I wish someone would pay me to learn a second language.

The only thing that seems unfair is that the people of Mourne find themselves represented by an ignorant ass who is either kowtowing to a handful of bigots who don’t want to learn Irish because they see it as the language of Catholics, or grandstanding in a bid to lock up votes for the next election.

I don’t know what the threshold should be for having civil servants learn different languages to serve a polyglot population, but clearly there are many regions whose staffs would benefit from some understanding of the language(s) of those they serve, whether it’s Irish in Northern Ireland, Spanish in parts of the United States, French in parts of Canada, and so on.

Public service isn’t about bending the job to the employee’s whims, but adapting to what the populace needs, when possible.

If Reilly has his way, services that could be better provided by a staff at least somewhat conversant in Irish would either go undelivered, or be delivered in a decidedly less efficient manner. Either way, some of Reilly’s constituents would lose – but he’d rather pander than serve all of the public.

(Top: Henry Reilly, councillor on the Newry, Mourne and Down District Council representing the Mourne area.)

I’ll have the free lunch – as long as he’s paying for it

Here’s an unsurprising bit of news out of our nation’s capital:

An overwhelming majority of Washington, D.C., residents support a proposal before the District Council to give each worker in the city 16 weeks of paid time off to care for a newborn or for a dying family member, according to the Washington Post.

The predictable part is that more than half of those polled also say they don’t want workers themselves to have to pay for the largesse.

Sorry, guys (and gals), but as Milton Friedman stated ever so eloquently, there’s no such thing as a free lunch. Someone somewhere is going to have to pick up the tab.

If you understand and accept that you’re going to pay one way or the other, that’s fine. But if you expect others to willingly pony up, or that benefits will flow like manna from heaven, you’ve got another thing coming.

The last time I looked, the District of Columbia didn’t have its own printing presses with which to churn out money, so D.C. would have to raise taxes and/or cut employees to pay for such a benefit.

Understand, that’s not a judgment on whether the benefit is worth the cost, but a simple matter of fact. If workers are going to be allowed 16 weeks of paid time off to care for newborns or dying family members, the district will need funds to oblige.

Those pushing for the minimum wage to be increased to $15 an hour need to recognize this reality, as well. Over the course of a year, a full-time worker making $15 an hour would earn a little more than $31,000. That’s all well and good but, again, that money has to come from somewhere.
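For what it’s worth, the arithmetic behind that figure is simple. The snippet below is just an illustrative sketch, assuming a standard 40-hour week and 52 paid weeks with no overtime or unpaid leave:

    # Illustrative only: annualized full-time earnings at a given hourly wage,
    # assuming a 40-hour week and 52 paid weeks (no overtime, no unpaid time off).
    def annual_earnings(hourly_wage, hours_per_week=40, weeks_per_year=52):
        return hourly_wage * hours_per_week * weeks_per_year

    print(annual_earnings(15.00))  # 31200.0 – a little more than $31,000 before taxes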

As the alchemists of old discovered, you can’t get something for nothing. There is a cost to every benefit, even if that cost is hidden. To pretend otherwise is to be foolish, disingenuous or willfully naïve.

The public wants what the public gets …

Democratic Presidential Debate in Vegas

Living in a nation in which presidential canvassing is a never-ending cycle where campaigning for the next election begins almost as soon as the last one ends, it’s difficult to pay much heed to the myriad candidates promising an endless array of bread and circuses or, conversely, labeling foes as the antichrist.

The media has done much to create this horse race atmosphere, dispatching a multitude of reporters to follow candidates and catch the daily 20-second platitudes of aspiring nominees while real news around the nation and the world goes uncovered.

Candidates understand the game and manipulate the media, who play along in order to maintain access. Negative stories appear, but generally unless a candidate has an absolutely astounding number of craptacular skeletons in his or her closet, the media’s not going to scuttle anyone’s campaign.

More candidates equal more possibilities which equal more news. And more news means more ad sales, at least for television.

So when my daughters or friends ask me who I’m voting for, I tell them it’s early so I haven’t made up my mind. This is true, as our presidential election is still more than a year away.

The reality is that I have better things to do than listen to highly coached politicians spout well-rehearsed lines that have been trotted out and approved by focus groups.

Sadly, one of the most astute bits of political analysis I’ve come across is the following, from What Would Tyler Durden Do?, a website largely dedicated to mocking celebrities.

Although the site rarely strays into politics, and can be more obscene than a Kardashian attempting to read Shakespeare, Tyler Durden has a pragmatic take on the American political system that, though few in power would like to admit it, is likely closer to reality than many average US citizens realize.

Consider its take on the recent Democratic debate:

Bernie Sanders supporters are largely more educated than Clinton supporters, but widely less practical. Despite the fact that Sanders college-aged Internet minions flooded the polling sites post-Democratic debate to declare Bernie Sanders the hippy atheist god almighty, every single major media outlet including CNN which ran the debate picked Hillary Clinton as the winner. Now Sanders followers are outraged, bemused, and frazzled. The standard emotional state of socialists.

I’m reluctantly forced to admire young Utopian dreamers. Before you get your first real STD or crappy job to pay the rent or unwanted pregnancy or draft notice or lousy marriage or mortgage or cancer, that is the time to dream of a perfect world. A land where everybody chooses bikes over cars, the homeless are no longer mentally ill alcoholics but misunderstood poets, and the fry guy and the McDonald’s CEO both make 40 bucks an hour, 10 after taxes. But politics isn’t about childish dreams. It’s about Mafioso-level bodies in the dumpster realities.

CNN is owned by Time Warner Cable. It donates heavily to the Clintons and Bushes for a reason that has nothing to do with the political philosophy you cherish while smoking pot in the quad and discussing Marx. It has to do with access and power and money. Big huge gobs of money in billion-dollar chunks. Let’s see, do we support the socialist who wants to break us up into little bits and force us to compete with public access channels on taxpayer-funded steroids or do we want the hacks who will keep us tight and flush with monopoly cash?

Agree or disagree with the above, it possesses more than a little truth. We’re certainly a long way from what the Founding Fathers, imperfect though they may have been, had in mind more than 225 years ago.

(“The public wants what the public gets” is from “Going Underground,” by The Jam.)

How the tyranny of the petty minded can infect a society

Like most US states, South Carolina has elected some bad governors over the years. Pitchfork Ben Tillman, an avowed racist and demagogue who did a great deal to divide the state in the late 19th century, is currently getting some much-needed scrutiny, but one of his protégés, Cole Blease, never fails to amaze when his career is analyzed.

Blease was a self-proclaimed pro-lynching, anti-black education politician who was cut from the same cloth as Tillman. He was elected to the state’s highest office in 1910 through his ability “to play on race, religion and class prejudices,” appealing especially to South Carolina’s farmers and mill workers, according to Ernest Lander’s work, “A History of South Carolina 1865-1960.”

Blease acquired such a bad reputation that he was said to represent the worst aspects of Jim Crow and Ben Tillman, a noxious combination if there ever was one. Blease, for example, is said to have once buried the severed finger of a lynched black man in the South Carolina gubernatorial garden in Columbia.

He was not only doggedly political, but arrogant about it, as well.

In early February 1911, less than a month after taking office, Blease stated publicly that he wouldn’t appoint anyone but friends to public office if he could help it.

The matter came to a head after a judge elected in Richland County, where Columbia is located, did not qualify in time to take office immediately, and a short-term intermediary was needed.

The Richland County Bar Association endorsed Duncan J. Ray as a special judge, and Ira B. Jones, chief justice of the SC Supreme Court, wrote the governor recommending and requesting the appointment of Ray, adding that this was “the course prescribed by the law, as the statute governing special judges says they shall be appointed by the governor upon the recommendation of Supreme Court,” according to an article in the Feb. 9, 1911, edition of the Bamberg Herald.

“However, the governor had already taken the bit in his teeth and appointed F.J. Caldwell, of Newberry, to preside, and when the Chief Justice wrote him recommending Mr. Ray, he replied that he would not appoint anybody but his friends to public office,” the paper added.

Blease made no apologies for injecting politics directly into the judicial system.

“My friends,” he said, “are to receive some consideration from this administration. I do not expect to appoint my enemies to office upon the recommendation of anybody unless it be that I cannot find a friend who is competent and worthy of the position.”

The (Columbia) State newspaper, begun in 1891 as a response to Tillman and his politics, took Blease to task.

Amid ignorance, compassion and humanity shine through

Because we in South Carolina haven’t had enough strife over the past month, what with the racially motivated killings at Emanuel AME Church in Charleston on June 17 and the ensuing polarizing debate about removing the Confederate flag from the Statehouse grounds in Columbia, a pair of dubious groups from out of state descended upon our capital over the weekend to try to add fuel to the fire.

The North Carolina-based Loyal White Knights of the Ku Klux Klan held a rally at the Statehouse this past Saturday, as did the Florida-based Black Educators for Justice, described as a subset of the “New Black Panther Party.”

While there weren’t more than a few dozen members from either group on hand to spread their bizarre brand of fanaticism, there were as many as 2,000 individuals who protested the interlopers.

Yet amid the foolishness of two groups seemingly hell-bent on stirring up odious emotions for the sake of publicity, there was at least one inspiring moment.

In a scene caught by a civilian photographer, a black police officer came to the aid of an older white man, garbed in a Nazi T-shirt, who had been overcome by heat during Saturday’s activities.

In the above photo, provided to the Associated Press by Rob Godfrey, the former spokesman for Gov. Nikki Haley, S.C. Department of Public Safety Chief Leroy Smith helps an unidentified man wearing National Socialist Movement attire up the stairs of the South Carolina Statehouse.

The image showed “who we are in South Carolina,” Smith told the Charleston Post and Courier.

One never knows what’s in the hearts of individuals such as the character who was assisted by Smith, but it can only be hoped that the latter’s actions might force the former to at least reconsider his long-held positions on matters such as race. Stranger things have happened.

Alphabetical rankings: The United States’ national shame

As if Americans – beset by murder, mayhem and political strife – haven’t had enough bad news lately, there’s this staggering bit of misfortune:

Of 196 countries in existence today, the United States ranks 182nd in the world alphabetically.

This, despite the fact that the US has an abundance of natural resources, top-notch health care, one of the highest literacy rates in the world and is one of the longest-existing modern democracies.

Now, we Americans could stand around and play the blame game, but the simple fact is we should all be embarrassed. Ponder this: There are but 13 countries the US ranks ahead of alphabetically, and they include such political basket cases as Uzbekistan and Yemen.

Consider those nations that have outpaced us in the ABCs: Cuba, El Salvador, Guinea-Bissau and even Kyrgyzstan, where citizens struggle daily just to spell their country’s name correctly.

Sadly, even after years of conflict in both Afghanistan and Iraq, the US is still classified behind both of those nations alphabetically, despite pouring billions of dollars into military efforts.

As has been noted, it’s time for Americans to take a long, sobering look at this country, and how it ended up all the way down at No. 182.

If we’re ever going to remedy this deplorable situation, we have to act now. If you won’t do it for yourself, do it for future generations. As always, think of the children!

(HT: Clickhole)

Deep debate cast aside for quick decisions based on ‘perception’

The Confederate battle flag flies near the South Carolina State Capitol building in Columbia.

Over the past few days it has been stated repeatedly that the Confederate flag should be removed from the South Carolina Statehouse grounds because it’s a racist symbol – no matter what its advocates claim – because “perception is reality.”

Certainly the Confederate battle flag was misappropriated in the 1950s and ’60s by groups opposed to the Civil Rights movement. That these groups, such as the Ku Klux Klan and the White Citizens Council, also made ample use of the Stars and Stripes seems to be of small concern to those who would like to see the Confederate flag placed in a museum.

While there’s plenty of room for debate about the role of the Confederate flag in public life, if the basis for one’s arguments includes “perception is reality,” then one is starting from a position of weakness.

History has shown that perception can be both erroneous and damaging.

Black Codes and Jim Crow laws were enforced in part because blacks were perceived by many as being inferior to whites. Most ex-slaves, thanks to law and/or custom, had never been taught to read or write. They were therefore perceived as being less intelligent than whites, even though the playing field was never close to being level.

This perception continues to hold currency even today among some, who mistakenly believe that blacks as a group don’t have the capacity to keep pace with whites and some other ethnic groups, while overlooking the fact that in many areas where African-Americans make up a significant percentage of the population, substandard schooling and a history of state indifference to education are the real culprits.

Along those same lines, blacks were perceived well into the 20th century as lacking the educational skills necessary for college. At the time of the Harlem Renaissance in the 1920s, only about 10,000 American blacks – one in 1,000 – were college educated, according to the Journal of Blacks in Higher Education. Today, more than 4.5 million blacks hold a four-year college degree.

Consider also that blacks who volunteered or were drafted into the US military were discriminated against for many decades because of the perception that they were suited only for “heavy lifting” rather than positions that relied on brainpower.

At the outset of the Civil War, neither free blacks nor escaped slaves were allowed to enlist in the Union Army. The prevailing view among Union officers was that the black man lacked mental ability, discipline and courage, and could never be trained to fight like the white soldier. It would take the better part of two years before white military leaders, desperate for troops, consented to the use of black soldiers, enabling this error to be disproved.

Up through World War I, black troops were often given thankless tasks that white soldiers sought to avoid, and racial segregation in the US military remained in place until after World War II.

During the latter conflict, the Navy assigned most blacks who did enlist to mess duty, and the Marines barred blacks entirely until 1942. The military as a whole held to the “perception” that blacks weren’t as good at “soldiering” as whites.
