History
Sometimes, what seems like a necessary change, made with the best of intentions, ends up being far more detrimental than helpful. In the United States, from 1879 to 1933, our monetary system was based on gold. The lone exception was the embargo on gold exports during World War I. Under the gold standard, creditors had the right to demand payment in gold. It was a stable and tangible form of payment, and it ensured that the borrower actually had the funds to make the payment. The bank failures during the Great Depression of the 1930s frightened the public into hoarding gold, which made the “payment in gold” policy untenable and furthered the panic. On June 5, 1933, Congress enacted a joint resolution nullifying the right of creditors to demand payment in gold, thereby taking the United States off the gold standard. That meant that our currency was no longer backed by gold. It may not have seemed like a bad solution at the time, but it certainly opened the door for a number of huge problems later.
President Roosevelt had seen how this worked when Britain, facing similar pressures, dropped the gold standard in 1931. So, when the Great Depression hit, Roosevelt decided to do the same in the US. Soon after taking office in March 1933, he declared a nationwide bank moratorium in order to prevent a run on the banks by consumers who lacked confidence in the economy. At the same time, he forbade banks from paying out gold or exporting it. He based his decision on Keynesian economic theory, which holds that one of the best ways to fight off an economic downturn is to inflate the money supply. If the amount of gold held by the Federal Reserve were increased, it would in turn increase its power to inflate the money supply.
With that in mind, on April 5, 1933, Roosevelt ordered all gold coins and gold certificates in denominations of more than $100 turned in for other money. The order required all persons to deliver all gold coin, gold bullion, and gold certificates owned by them to the Federal Reserve by May 1, 1933. In exchange, they were to be given $20.67 per ounce. I’m sure it seemed like a good deal, so the people complied. By May 10, the government had taken in $300 million of gold coin and $470 million of gold certificates. Two months later, Congress enacted a joint resolution repealing the gold clauses in many public and private obligations that had required the debtor to repay the creditor in gold dollars of the same weight and fineness as those borrowed. To complete the inflation process, in 1934, the government price of gold was increased to $35 per ounce, effectively increasing the value of the gold on the Federal Reserve’s balance sheet by 69 percent. This increase in assets allowed the Federal Reserve to further inflate the money supply. So, with a stroke of the pen, the Federal Reserve, and thereby the government, increased their wealth by 69 percent…overnight.
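Just to double-check that figure with my own back-of-the-envelope arithmetic (the percentage isn’t spelled out in the order itself), raising the official price from $20.67 to $35 per ounce works out to roughly a 69 percent increase in the stated value of every ounce the Federal Reserve held:

$$\frac{\$35.00 - \$20.67}{\$20.67} = \frac{\$14.33}{\$20.67} \approx 0.693 \approx 69\%$$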
The $35 per ounce value of gold held until August 15, 1971, when President Richard Nixon announced that the United States would no longer convert dollars to gold at a fixed value, thus completely abandoning the gold standard. Now, I’m not an economist, but if I understand this correctly, it was at that point that the United States, as well as many other nations, began using what is known as “fiat money.” Fiat money is “a type of currency that is not backed by a precious metal, such as gold or silver. Fiat money is an intrinsically valueless object or record that is accepted widely as a means of payment. Accordingly, the value of fiat money is greater than the value of its metal or paper content.” It is designated by the issuing government to be legal tender and is authorized by government regulations, which means that the government decides what its value will be; on its own, the paper is essentially worthless. Not that it changed the amount of fiat money in use at the time, but in 1974, President Gerald Ford signed legislation that permitted Americans to own gold bullion again.
When people started moving west in the United States, the trip took approximately four months by wagon train. At that time, saying “goodbye” to loved ones was a very sad moment, because many times people would never see each other again. The distance was just too great and the cost too much, and unless people were moving permanently, they simply couldn’t take the trip. At that time, in the 1860s, they likely had no idea that things were about to get easier.
Nevertheless, just sixteen short years later, all that changed. The train wasn’t a completely new invention in the 1860s; the steam locomotive dates back to 1829. Still, as the United States was experiencing its expansion to the west, there were few towns and no railroads out there. So, moving west still meant that families might never see their departing loved ones again, and if they did, it would be after a number of years. During the early 19th century, when Thomas Jefferson first dreamed of an American nation stretching from “sea to shining sea,” it took the president, who served from 1801 to 1809, ten days to travel the 225 miles from Monticello to Philadelphia by carriage. The distance from New York City to San Francisco (a city founded in 1776 that didn’t become part of the United States until the signing of the Treaty of Guadalupe Hidalgo in 1848) is 2,565 miles, making the carriage time in those days approximately 114 days, or a little less than a third of a year. If you weren’t moving west, you would also have to make the return trip. It just wasn’t feasible.
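For what it’s worth, here is the rough arithmetic behind that 114-day estimate (my own calculation, not a figure from the period): Jefferson’s 225 miles in 10 days is about 22.5 miles per day, and at that pace the 2,565-mile coast-to-coast distance takes roughly 114 days, or just under a third of a year:

$$\frac{2{,}565 \text{ miles}}{22.5 \text{ miles per day}} \approx 114 \text{ days}, \qquad \frac{114}{365} \approx 0.31$$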
Jefferson saw a possible answer to his dilemma as early as 1802, with “the introduction of so powerful an agent as steam.” At that time, he predicted the possibility of “a carriage on wheels” that would make “a great change in the situation of man.” For Jefferson, the change would not come soon enough, and he never witnessed the “carriage on wheels” he had envisioned. Nevertheless, within half a century, America would have more railroads than any other nation in the world, and by 1869, the first transcontinental line linking the coasts was completed. Now, journeys that had previously taken months by horse, and that nobody undertook unless they were moving, could be made in less than a week. In fact, on June 4, 1876, the Transcontinental Express train traveled from New York City to San Francisco in a mere 83 hours.
It seemed inconceivable that a human being could travel across the entire nation in less than four days. Nevertheless, the impossible had become not only possible, but a reality, and the United States became a very mobile nation. Prior to that time, the coasts were months apart, and it was speculated that the nation couldn’t possibly stay united. They couldn’t even communicate very well, much less see each other. Now, those problems were in the past. Just five days after the line was completed, daily passenger service began. You simply can’t hold progress back. Once the technology is available, the reality isn’t far behind. Americans were amazed at the speed and comfort of this new form of travel. Of course, it wasn’t for everyone, because as with all new inventions, it was expensive and therefore only for the wealthy. The train was luxurious, with first-class passengers riding in beautifully appointed cars with plush velvet seats that converted into snug sleeping berths. The finer amenities included steam heat, fresh linen daily, and gracious porters who catered to their every whim. For an extra $4 a day, the wealthy traveler could opt to take the weekly Pacific Hotel Express, which offered first-class dining on board. As one happy passenger wrote, “The rarest and richest of all my journeying through life is this three-thousand miles by rail.”
A third-class ticket could be purchased for only $40, less than half the price of the first-class fare. At this low rate, the traveler received no luxuries. Their cars, fitted with rows of narrow wooden benches, were congested, noisy, and uncomfortable. The railroad often attached the coach cars to freight cars that were constantly shunted aside to make way for the express trains. Consequently, the third-class traveler’s journey west might take 10 or more days. Still, no one complained, because 10 days was far better than four months, and sitting on a hard bench seat was still much better than months of walking alongside a Conestoga wagon on the Oregon Trail. Of course, this amazing method of transportation would only be amazing for a short time in history, as trains gave way to cars and eventually airplanes. Still, those things were either a ways off or would have to wait for the necessary roads and airports to make those forms of travel possible. Trains were here now, and life was good!!
In late April 1919, anarchists decided to take matters into their own hands. Anarchists are a group of people who believe that “society should have no government, laws, police, or any other authority. Most anarchists in the US advocate change through non-violent, non-criminal means. However, a small minority believed change could only be accomplished through violence and criminal acts.” If you ask me, their actions proved, without a doubt, why we need things like government (but not BIG government), laws, police, and other authority figures. In these bombings, radicals mailed over 30 booby-trapped, dynamite-filled bombs to prominent politicians and appointees, including the Attorney General of the United States, as well as justice officials, newspaper editors, and businessmen, such as John D Rockefeller and the mayor of Seattle, Washington. All but one of the bombs were addressed to high-level officials, but strangely, one bomb was addressed to the home of a Bureau of Investigation field agent, Rayme Weston Finch, who had been tasked with investigating the Galleanists, and who in 1918 had arrested two prominent Galleanists while leading a police raid on the offices of their publication Cronaca Sovversiva. No one was killed in these bombings, but unfortunately, one senator’s maid lost her hands.
With the epic failure of the first bombings fresh in their minds, the anarchists decided to try again. On June 2, 1919, in seven US cities, all within approximately 90 minutes of each other, bombs once again rocked the country. These bombs were much bigger. Explosions took place in some of the most significant urban areas in America, including New York, Boston, Pittsburgh, Cleveland, Washington DC, Philadelphia, and Paterson, New Jersey. It is believed that the bombers were most likely disciples of Luigi Galleani, an extremely radical anarchist who pushed for violence to rid the world of laws and capitalism. The bombs that exploded on June 2 were much larger than those sent in April, comprising up to 25 pounds of dynamite packaged with heavy metal slugs designed to act as shrapnel. The bombs were sent to government officials who had endorsed anti-sedition laws and the deportation of immigrants suspected of crimes or associated with illegal movements, as well as judges who had sentenced anarchists to prison.
One of the bombs was set off by a militant anarchist named Carlo Valdinoci, a former editor of the Galleanist publication Cronaca Sovversiva and a close associate of Luigi Galleani. Valdinoci planted the bomb at the home of newly appointed Attorney General A Mitchell Palmer in Washington, DC, and it blew up the front of the house. Unfortunately for Valdinoci, the bomb exploded too early, and he was killed in the very blast he created. At the time, a young Franklin and Eleanor Roosevelt lived across the street and were also shaken by the blast. That bombing was just one in a series of coordinated attacks that day on judges, politicians, law enforcement officials, and others in cities across the nation.
Within minutes of the bombing of Attorney General Palmer’s home, more bombs were exploding in other cities, including Philadelphia, Pennsylvania. There, two bombs exploded within seconds of each other under the porch of the rectory of the Our Lady of Victory Catholic Church, caving in the porch and shattering every window in the rectory and the basement. The church was still smoldering when another bomb exploded less than a mile away at the home of Philadelphia jeweler Louis Jajieky. That seems like an unlikely target, given the political nature of most of the bombs. The interior of the Jajieky residence was utterly demolished, leaving only the four exterior walls standing.
While the anarchists were persistent, they were not especially “good” anarchists. During these bombings and simultaneous explosions in six other cities, none of the men who were being targeted were killed, but one bomb took the life of New York City night watchman William Boehner. Each of the bombs was delivered with several copies of a pink flyer titled “Plain Words,” which read, “War, Class war, and you were the first to wage it under the cover of the powerful institutions you call order, in the darkness of your laws. There will have to be bloodshed; we will not dodge; there will have to be murder: we will kill because it is necessary; there will have to be destruction; we will destroy to rid the world of your tyrannical institutions.”
It was discovered that the flyer had been printed in a printing shop operated by two anarchists, Andrea Salsedo, who was a typesetter, and Roberto Elia, who was a compositor. Both of them were known Galleanists. Rather than go to jail, Salsedo committed suicide. Elia was given the opportunity to testify about his role, and thereby avoid deportation, but he refused the offer. Unfortunately, the prosecutors could not obtain enough evidence to proceed with a criminal trial. Authorities continued to deport known Galleanists using the Anarchist Exclusion Act and related statutes.
The Bureau of Investigation, the predecessor to the FBI, had been investigating these anarchists; the Philadelphia investigation was headed by Special Agent Todd Daniel, working under the Bureau’s Acting Director, William Flynn. Days after the bombings, Special Agent Daniel said, “The terrorist movement is national in scope, and its headquarters may be located in this city [Philadelphia].” Daniel also noted the large number of “anarchists in this city and so many places used by them for meeting places.” Daniel’s first thought was that the perpetrators of the Philadelphia bombings were members of the Industrial Workers of the World (a leftist union that embraced socialistic principles). On June 5, federal and local investigators were tracking down members of the “bomb-throwing squad,” which was said to have included women. They had 12 radicals suspected of having a hand in the city’s attacks under constant surveillance.
At the time of the bombings, there was already a lot of anxiety in America. The world was in the middle of a deadly wave of pandemic flu, the Bolshevik Revolution in Russia and the ensuing over-hyped “Red Scare,” and sometimes violent labor strikes across the country. The attacks angered Americans. People wanted answers, and they wanted a severe response. The Attorney General, looking to make a bid for the presidency and not wanting the unrest to boil over into revolution, was ready to oblige. He created a small division to gather intelligence on the radical threat and placed a young Justice Department lawyer named J Edgar Hoover in charge. Hoover collected and organized every piece of intelligence gathered by the Bureau of Investigation (the FBI’s predecessor) and other agencies to identify the anarchists most likely involved in violent activity. Even with all the “investigating,” the bombings were never officially pinned on anyone. Authorities did arrest, under recently passed laws like the Sedition Act, suspected radicals and foreigners identified by Hoover’s group, including well-known leaders Emma Goldman and Alexander Berkman. With much public fanfare in December, several radicals were put on a ship, dubbed the “Red Ark” or “Soviet Ark” by the press, and deported to Russia. Basically, that ended all the repercussions over the 1919 anarchist bombings.
Most wars end when one of the sides surrenders because they know they have lost. It would be strange for two enemies to simply walk away from a war. Nevertheless, that is roughly what happened in 585 BC, when the Medians were at war with the Lydians. When the king of Lydia refused to hand over the Scythians to Cyaxares, war broke out between the two kingdoms. The fighting had been going on for five years, and things weren’t going well for either side; neither seemed to be able to get the upper hand. Then, it all changed. Still, it didn’t change in the way you would expect.
Technically, this battle was called the Battle of Halys, but because it ended with the solar eclipse that Thales predicted, it came to be called the Battle of the Eclipse. The solar eclipse so terrified the Medians and Lydians that they forgot about the vicious battle they were fighting and fled the battlefield. It had to have been the fastest end to a war in history…before or since.
Modern astronomical research places the date of the battle at May 28, 585 BC. Other dates have been proposed with varied evidence, but May 28 is the most widely accepted date for the event. For these ancient people, a solar eclipse would have appeared to be an omen from the gods. They didn’t know any better. Subsequently, they laid down their weapons and quickly ushered in a truce.
While this eclipse wasn’t the first one to be recorded, it was the first to be accurately predicted. According to the Greek historian Herodotus, Thales of Miletus had predicted that it would occur in the year that it did. How he made the prediction is a mystery, and the answer continues to elude modern historians, because predicting such events requires astronomical knowledge that people of that era weren’t thought to have had. Nevertheless, he was probably the only one who was not particularly surprised by the sudden darkness on the battlefield. Everyone else was in a state of panic, and both sides decided that it was time to get the heck off the battlefield before something really bad happened.
Memorial Day is a different kind of day, because it is not a holiday of celebration, but rather a day of remembrance. We cannot celebrate this day, because it is about honoring those soldiers who went to war and didn’t make it back. Theirs was the ultimate sacrifice. As the saying goes concerning soldiers, “all gave some, but some gave all!” When a soldier goes to war, they know. They are very aware that the possibility exists that they will not come back home. They know that their sacrifice might be the ultimate sacrifice. They want to make it home, but they know they may not. Today is about those soldiers who did not make it home.
I doubt there are many families that can say they have never lost a soldier in battle. While I don’t specifically know of any in my own family, I’m sure there are some somewhere back down the line. There have been many wars, and with each one, the chance of having a relative who died at war increases. It doesn’t matter anyway, because Memorial Day is a day to honor those who gave all, whether they are related to us or not. Their sacrifice is what makes us free today. They fought for people they didn’t even know, gave up time with the family they loved, and died in a place they didn’t want to be. That is the epitome of bravery and courage.
Some of them, including my uncle Jim Richards’ brother, Dale Richards, never left the place where they died. Dale fought in Normandy, France, and that is where he rests to this day. The people of France are so grateful for the soldiers who fought and died over there that they keep the graves looking beautiful. It’s nice to know that there are people who continue to show their appreciation for those men who “gave all” for them. Their sacrifice should never be forgotten. Their families certainly can never forget. They have had to go forward with their lives without the love and support of the soldier who went to war and never came home. That soldier had potential. They could have been anything they wanted to be, but instead, they chose to give their life to ensure the freedom of other human beings. Today, we honor all of those men who “gave all” for us and so many others. We thank you for your service, and we honor your memory. God bless you all, from a grateful nation.
After World War I ended, war-weary Americans decided that we needed to isolate ourselves from the rest of the world to a degree, in order to protect ourselves. Of course, that didn’t stop people from other countries from wanting to immigrate to the United States. The Johnson-Reed Immigration Act reflected that desire. Americans wanted to push back and distance ourselves from Europe amid growing fears of the spread of communist ideas, and the new law seemed the best way to accomplish that. Unfortunately, the law also reflected the pervasiveness of racial discrimination in American society at the time. As more and more largely unskilled and uneducated immigrants tried to come into America, many Americans came to see the enormous influx of immigrants during the early 1900s as unfair competition for jobs and land.
Under the new law, immigration was to remain open to those with a “college education and/or special skills, but entry was denied disproportionately to Eastern and Southern Europeans and Japanese.” At the same time, the legislation allowed for more immigration from Northern European nations such as Britain, Ireland, and the Scandinavian countries. The law also set a quota that limited immigration from any given nation to two percent of that nation’s residents already in the US as of 1890, a provision designed to maintain America’s largely Northern European racial composition. In 1927, the “two percent rule” was eliminated and a cap of 150,000 total immigrants annually was established. While this was fairer all around, it particularly angered Japan. In 1907, Japan and US President Theodore Roosevelt had created a “Gentlemen’s Agreement,” which included more liberal immigration quotas for Japan. Of course, that was unfair to other nations looking to send immigrants to the United States. Strong US agricultural and labor interests, particularly from California, had already pushed through a form of exclusionary laws against Japanese immigrants by 1924, so they favored the more restrictive legislation signed by Coolidge.
Of course, the Japanese government saw the American law as an insult and protested by declaring May 26 a “National Day of Humiliation” in Japan. With that, Japan experienced a wave of anti-American sentiment, which inspired a Japanese citizen to commit suicide outside the American embassy in Tokyo in protest. That created even more anti-American sentiment, but in the end, it made no difference. On May 26, 1924, President Calvin Coolidge signed the Immigration Act of 1924 into law. It was the most stringent, and possibly the most controversial, US immigration policy in the nation’s history up to that time. Despite becoming known for such isolationist legislation, Coolidge also established the Statue of Liberty as a national monument in 1924.
There are many diseases that can become an epidemic, but I never expected laughter to be one of them. After all, isn’t laughter a good thing? It is known as the “best medicine,” as far as I have ever heard. Nevertheless, I suppose that one can have too much of a good thing. Such was the case on January 30, 1962, in or near the village of Kashasha on the western coast of Lake Victoria in Tanganyika (which, once united with Zanzibar, became the modern nation of Tanzania), near the border with Uganda. On that day, the Tanganyika laughter epidemic of 1962 began. It was then that an outbreak categorized as mass hysteria, or mass psychogenic illness (MPI), was said to have occurred.
The laughter epidemic began at a mission-run boarding school for girls in Kashasha. It started with three girls and quickly spread throughout the school, affecting 95 of the 159 pupils, who were between the ages of 12 and 18. I wonder if the teachers thought the girls were just trying to cause trouble or disrupt classes. The symptoms lasted from a few hours to as much as 16 days, with the average being around 7 days. Strangely, the teaching staff was not affected, other than to report that students were unable to concentrate on their lessons. All told, the first outbreak in Kashasha lasted about 48 days. That was just too long to be able to run the school successfully, so the school was forced to close on March 18, 1962. Unbelievably, a second phase of the outbreak began when the school reopened on May 21, 1962. This second phase affected an additional 57 pupils. The all-girl boarding school closed again at the end of June.
You might think that the girls were just having a great time messing with their teachers, but the epidemic spread to Nshamba, a village in the Muleba District 55 miles west of Bukoba, where several of the girls lived. In April and May 1962, 217 villagers, mostly young people, had laughing attacks over the course of 34 days. In a crazy twist, the Kashasha school was sued for “allowing” the children and their parents to transmit it to the surrounding area. In June, the laughing epidemic spread to the Ramashenye girls’ middle school, affecting 48 girls. Additional schools and the village of Kanyangereka were also affected to some degree. About 18 months after it started, the laughing epidemic simply died off. It was contained to the area within a 100-mile radius of Bukoba. In all, 14 schools were shut down and 1,000 people were affected.
Symptoms of the Tanganyika “laughter epidemic” included laughter and crying, along with general restlessness and pain, as well as fainting, respiratory problems, and rashes. Contrary to the name, this was a very real epidemic with very real health concerns. Many of the symptoms experienced were stress-induced, brought on by various external factors. This kind of stress and anxiety provokes mass hysteria outbreaks, which are reactions to perceived threats, cultural transitions, instances of uncertainty, and social stressors. You would think that these kinds of stressors would be common and happen to people every day. Maybe they were, but for whatever reason, this time, these stressors seemed much worse to those suffering from them. On top of that, the majority of the population affected by this epidemic were young people and adolescent children. The outbreak has been attributed to the young not having the appropriate coping skills to manage such stresses and anxieties. In addition to those stressors, young people have a need for acceptance and are eager to blend into a group, making them vulnerable to “influence” contagion. Influence contagion refers to “the subtle and sometimes unwitting spread of emotions or behaviors from one individual to others. It occurs when people influence each other’s thoughts, feelings, or actions without necessarily being aware of it.”
Basically, in this case, linguist Christian F Hempelmann theorized that the episode was stress-induced. In 1962, Tanganyika had just won its independence, he said, and students had reported feeling stressed because of higher expectations from teachers and parents. MPI, he says, usually occurs in people without a lot of power. “MPI is a last resort for people of a low status. It’s an easy way for them to express that something is wrong.” Whatever the case may be, the “laughter epidemic” ended up being a very real phenomenon.
In centuries past, men could only be knighted for some kind of military bravery or prowess. Since 1917, however, by proclamation of King George V, the reigning British monarch at the time, Britain has been awarding membership in The Most Excellent Order of the British Empire, better known as the British Order of Knighthood, to reward both civilian and military wartime service. The honor is open to men and women alike, which makes sense, since women are in the military now, and there are often civilian participants in some operations of war.
If that had been the only change, it might have been okay, but in 1918, a separate military division of the order was created, which allowed persons of renown (basically celebrities) to be brought into the order as well. The five classes of both the civil and military divisions, listed in descending order and conferred on men and women equally, are Knight and Dame Grand Cross (GBE), Knight and Dame Commander (KBE and DBE, respectively), Commander (CBE), Officer (OBE), and Member (MBE). Conferment of the two highest classes entails admission into knighthood, if the candidate is not already a knight or dame, and the right to the title of “Sir” or “Dame” as appropriate. (Knights and Dames Grand Cross, together with Knights of the Garter and of the Thistle, may be granted the use of supporters with their arms.) Appointments are usually made on the recommendation of the British Secretary of State for Defense and the Secretary of State for Foreign and Commonwealth Affairs. It seems strange to me that it would happen like that.
I understand that they were trying to include more people, but in doing so, it seems to me that the honor is somehow cheapened. These days, being knighted holds a much different meaning than it used to. Nations with a monarch as their head of state would, once upon a time, issue knighthoods to their loyal subjects and to foreign citizens who had done great deeds for their country. Today, you can still earn a knighthood through military prowess, but you can also earn one if your artistic, scientific, or civil service “shines greatly” upon the crown. Really?? Of course, you could also be a genocidal Marxist dictator who overthrows the government and you’ll eventually be knighted…or you could just be a penguin…I mean it!!! To me, being knighted for anything other than extraordinary service to the country makes the award of little to no value. I don’t dispute these actors’ and others’ right to recognition; it’s just that knighthood doesn’t seem the place for it.
Flight has always been an obsession with humans. In the early 11th century, an English Benedictine monk named Eilmer of Malmesbury attempted a gliding flight using wings he had made. The “flight” went pretty well, at first. It is said that he “flew” about 220 yards before crashing and breaking both his legs. I suppose you could call that a flight, thereby making it the first winged flight in history, but the reality is that it was quite likely the first “crash” instead. Either way, this “flying monk” had made his maiden flight.
As the story goes, Eilmer of Malmesbury built himself a pair of wings around the year 1005. He then climbed up onto a tower and proceeded to jump off. He managed to glide into a headwind for about 220 yards. Unfortunately, the headwind was quite strong, and combined with a bit of panic on the part of the “flying monk,” he began to descend rapidly…i.e., crash. He ended up veering off to the side and crashing. I hardly think what he did could be called a flight. To me, it couldn’t even be called a controlled crash. Basically, he jumped off a roof, and after the few seconds it took to travel 220 yards, he dropped like a rock. It’s very possible that he found himself at the mercy of his own rash decision to take on such a venture.
Eilmer of Malmesbury survived the “crash landing,” but with both legs broken. His broken legs healed, but in those days the setting of his legs couldn’t have been done well, and most likely wasn’t done at all. With that working against him, Eilmer walked with a limp for the rest of his life. Eilmer had used a bird-like apparatus to glide downwards against the breeze. Unfortunately, he was unable to balance himself forward and backward, as a bird does by slight movements of its wings, head, and legs. He would have needed a large tail to maintain that equilibrium. While Eilmer could not have achieved true soaring flight, he might have glided down safely with a tail. Eilmer himself said he had “forgotten to provide himself with a tail.” I guess every endeavor has its flaws, and this was no exception.
Fidel Castro was a thorn in the side of the United States and many other countries for many years. The CIA has long been rumored to have planned a series of assassination attempts against Castro using elaborate means, from exploding cigars to poisoned pills to thallium salt in his shoes. Castro’s inner circle estimated that there were some 634 attempts on his life, including giving Castro poisoned cigars, gifting him a scuba suit lined with a skin-eating fungus, and stabbing him with a poisoned needle hidden inside a pen. There were also a number of plots to humiliate him, including a plot to spray Castro’s broadcasting studio with an LSD-like substance so he’d trip and fall while giving one of his speeches and (hopefully) sound very silly too, and another to dust his shoes with a substance that would make his beard fall out. Those didn’t happen either.
But official documents reveal that some of the schemes focused on Castro’s love of skin diving. One involved a booby-trapped seashell, the hope being that he would pick up the shell on the beach and trigger an explosive; another considered giving the revolutionary a contaminated diving suit. That, of course, leaves a lot to chance; it makes the beach unsafe for anyone else there, and the wrong person might pick up the seashell. These ideas for assassination attempts, whether real or rumored, were wild and often crazy, but at the very least, they show just how frustrating a man Fidel Castro was. When people consider booby-trapping a seashell, you know they are grasping at straws. Still, such an attempt wouldn’t work on just any “mark,” because not every mark would be likely to pick up a seashell.
In each of these attempts, real or imagined, Fidel Castro walked away unscathed. It wasn’t that attempts weren’t made; it was just that Fidel Castro wasn’t killed in them. I don’t know how he escaped, and many people wish he had not. His reign of terror was far too long and far too horrific. In the end, Castro, who had overthrown US-backed Cuban leader Fulgencio Batista in 1959, died of natural causes on November 25, 2016, at age 90, leaving power to his brother Raul.