When Rome Opened Its Army to Auxiliaries
The decision by the Roman Empire to allow foreigners to serve in its army marked a significant turning point in the history of the ancient world.
This policy shift, driven by a desperate need for manpower, had profound and far-reaching consequences for the empire.
In the early days of the Roman Republic, citizenship and the right to serve in the military were tightly linked.
However, as Rome's borders expanded, eventually stretching from the British Isles to the Middle East and North Africa, the demand for soldiers grew exponentially and outstripped the pool of citizens willing or able to serve in the legions.
By the late Republic and early Empire periods, Rome faced numerous external threats and internal challenges. The need for a larger and more diverse army became evident.
The Roman Empire began incorporating auxiliaries, non-citizen soldiers, into its ranks. The auxiliaries were recruited from the provinces and provided specialized skills and local knowledge that proved valuable.
The Roman army welcomed a diverse array of warriors. Germanic tribesmen were skilled in close-quarter combat. Numidian cavalry, mounted on swift horses, provided mobility and reconnaissance. Archers from the Eastern provinces shot arrows with deadly accuracy.
The inclusion of non-citizens in the military provided a pathway for assimilation and integration, fostering a sense of loyalty and shared identity among the diverse soldiers. However, the loyalty of non-citizen soldiers was sometimes questioned. Language barriers and cultural differences also posed challenges.
To help mitigate this, Rome implemented various strategies such as offering citizenship as a reward for exemplary service, creating a sense of belonging and shared destiny.
Auxiliaries were often organized into units based on their origin, allowing them to maintain their cultural identities. Roman officers, fluent in multiple languages, served as bridges between the different groups, fostering cohesion and understanding.
One of the main concerns was the potential for auxiliary soldiers to turn against Rome. This concern was not unfounded, as there were instances where foreign soldiers, particularly those from Germanic tribes, revolted against the empire.
Another concern was the impact of foreign soldiers on the Roman military culture. The Roman army was known for its discipline and strict adherence to military regulations. The inclusion of foreign soldiers, with their different customs and practices, while sometimes beneficial, could also undermine this military culture and weaken the overall effectiveness of the Roman army.
The cultural exchange within the military had effects on the broader Roman society, contributing to a more cosmopolitan identity. This cultural diversity also presented challenges in terms of unity and a shared sense of purpose.
The influx of foreigners, particularly those of non-Roman religions, raised concerns among some Roman citizens. Some feared the erosion of their traditional values and identity, leading to occasional social clashes.
As Roman citizen recruitment declined, the army increasingly relied on barbarian auxiliaries. Though initially beneficial, these soldiers often lacked the ingrained loyalty and discipline of citizen legionaries, and this dilution of the legions' core strength may have compromised their effectiveness in critical battles. The reliance on auxiliaries accelerated some existing problems, but it also provided temporary solutions and contributed to the empire's cultural dynamism.
Over time, the Roman Empire faced internal strife, economic troubles, and increasing difficulty in defending its vast borders. The decline of the Roman economy, the debasement of the currency, and internal power struggles weakened the empire internally. Simultaneously, external pressures from Germanic tribes, the Huns, and other invaders strained the Roman military.
While the inclusion of foreigners in the military was not a direct cause of the fall of the Roman Empire, the broader consequences of this policy, such as changes in the cultural and social fabric, contributed to the evolving nature of Roman society.
The inclusion of non-citizen soldiers marked a significant chapter in the history of the Roman Empire, shaping its military strategy, cultural identity, and the dynamics of its vast and diverse territories.
Clemente's Legacy: Baseball Brilliance and Humanitarian Heroism
Roberto Clemente was one of baseball's greatest players. His life was marked by extraordinary achievements on the field, profound humanitarian efforts, and, sadly, a tragic end.
Born on August 18, 1934, in Carolina, Puerto Rico, Clemente showed exceptional talent from a young age, and his early life was shaped by his love for baseball. By the time he was 17, he was playing for the Cangrejeros de Santurce in the Puerto Rican Winter League, where his remarkable skills as an outfielder and powerful hitter quickly caught the attention of Major League Baseball scouts.
Clemente signed with the Brooklyn Dodgers, becoming one of the first Latin American players to break into Major League Baseball. But his career really took off after he joined the Pittsburgh Pirates in 1955. His exceptional skills as an outfielder, powerful bat, and unmatched arm strength quickly earned him widespread recognition.
While Clemente’s achievements on the baseball diamond are legendary, his legacy extends beyond the confines of the stadium. In 1958, during the offseason, he enlisted in the United States Marine Corps Reserve, driven by a sense of duty and patriotism. He served his six-month active duty commitment at Parris Island, South Carolina and Camp Lejeune, North Carolina.
While there, Clemente’s tireless work ethic and unyielding spirit helped to inspire fellow recruits. The rigors of military life also honed his mental and physical toughness, helping to solidify his legendary on-field tenacity.
By the early 1960s, Clemente had blossomed into a full-fledged star. He led the Pirates to two World Series victories and claimed a National League MVP award.
Clemente championed racial equality, becoming a vocal advocate for Latino players and speaking out against poverty and inequality.
His dedication to humanitarian causes reached its pinnacle in 1972, when a devastating earthquake struck Nicaragua, leaving thousands dead and hundreds of thousands homeless. Deeply moved by the suffering of the people, Clemente organized relief efforts to provide aid.
The first three shipments were diverted by corrupt government officials and never made it to the quake victims. So Clemente boarded the next rescue flight himself, determined to see his mission through. Sadly, on New Year's Eve 1972, the plane carrying Clemente crashed off the coast of Puerto Rico, taking his life. He was only 38.
In recognition of his outstanding contributions to baseball and his humanitarian efforts, Roberto Clemente was posthumously inducted into the Baseball Hall of Fame in 1973, making him the first Latin American and Caribbean player to receive this honor.
His tragic death solidified his place as a national hero, not just in Puerto Rico but across the globe. His contributions to baseball, his unwavering commitment to justice, and his compassionate humanitarian efforts have left an enduring legacy that continues to inspire and resonate with people around the world.
Photo Credits: The Clemente Museum, US Marine Corps
RoySmith @ wiki commons
Music: Nothing's Quite the Same by Kit Wheston
Fossil Feud: The Bone Wars of the 19th Century
The Bone Wars, also known as the Great Dinosaur Rush, was a period of intense fossil hunting and discovery during the late 19th century.
The rush was driven primarily by the heated rivalry between two paleontologists, Othniel Charles Marsh and Edward Drinker Cope. This fierce competition significantly contributed to the discovery of numerous dinosaur fossils in the western United States, particularly in states like Wyoming and Colorado.
The feud between Marsh and Cope began in the 1870s when numerous dinosaur fossils were discovered in the American West. The competition between the two scientists led to a race to uncover and name new dinosaur species.
Initially, they maintained a collegial relationship, exchanging ideas and specimens. However, as their ambition and desire for recognition intensified, their collaboration quickly turned into a fierce rivalry.
The rivalry was fueled by a combination of professional jealousy, personal animosity, and the intense pressure to outdo one another in the field of paleontology. Both Cope and Marsh sought to amass the largest collection of dinosaur fossils, and they spared no expense in their pursuit.
This led to a frantic race to discover, describe, and name new dinosaur species, often without thorough scientific scrutiny.
The Bone Wars were characterized by underhanded tactics, including bribing quarry workers to provide exclusive access to fossil-rich sites, espionage, and attempts to discredit each other's work.
The two paleontologists engaged in a relentless public relations battle, using newspapers and scientific journals to criticize and undermine their opponent's credibility.
The competition reached its peak during the late 1870s and early 1880s. The two scientists were so focused on outdoing each other that they sometimes overlooked the accuracy of their findings. This resulted in the misclassification and misinterpretation of several dinosaur species, which later had to be corrected by subsequent generations of paleontologists.
Cope and Marsh continued to feud until the early 1890s, when they both began to suffer from financial problems. Cope's financial woes were partly due to his own extravagance, but they were also exacerbated by his obsession with outdoing Marsh. Marsh's financial problems were due in part to the high cost of his fossil expeditions, but they were also exacerbated by his legal battles with Cope.
Marsh died in 1897, and Cope died in 1899. Both men were deeply in debt at the time of their deaths.
Despite its negative aspects, the Bone Wars played a pivotal role in advancing the field of paleontology, yielding the discovery of over 140 new species of dinosaurs, including Triceratops, Allosaurus, Diplodocus, and Stegosaurus.
Photo credits: ScottRobertAnselmo, Ad Meskens, Anky-man, Allie_Caulfield, Michael Barera, The_Wookies @ wiki commons
Music - Lit by JNGS
Inside the Brain: The Profound Personality Shifts of Phineas Gage
Phineas Gage suffered a horrific injury that changed him forever. His case has shaped our understanding of, and raised enduring interest in, how the human brain operates.
Born in 1823, Gage worked as a railroad construction foreman in the mid-19th century. On September 13, 1848, he suffered a bizarre and life-altering accident.
Gage was working as a foreman for a crew of men who were blasting rock to build a railroad bed in Cavendish, Vermont. He was using a tamping iron to pack explosive powder into a drilled hole when the powder prematurely detonated. The tamping iron was propelled through Gage's head, entering just below his left cheek and exiting through the top of his skull.
Amazingly, Gage survived the accident, but those close to him said that he suffered profound changes in his personality and behavior.
Before the accident, Gage was described as a responsible, reliable, and industrious individual, qualities that contributed to his success as a railroad construction foreman.
But afterwards people who knew Gage said his personality saw a marked loss of inhibitions and an increase in impulsivity.
His case drew the attention of physician Dr. John Harlow, who played a crucial role in treating Gage's injuries and later became instrumental in documenting the case for the historical record.
Harlow's efforts helped establish Gage's case as a landmark in the study of neuroscience, particularly in the exploration of the frontal lobes' role in personality and social behavior.
At the time, scientists debated the extent to which different mental functions were localized to specific regions of the brain. Gage's case complicated the picture: his injury caused no paralysis or loss of basic cognitive functions, yet he was clearly changed.
Gage's case provided early evidence that damage to the frontal lobes could result in personality changes, ushering in a new era of research into the functions of different brain regions.
There has been debate over the years over the extent of Gage’s personality changes, which some argue have been exaggerated.
He was said to have become prone to fits of anger after the accident and quick to become agitated over minor issues. Previously affable and well-liked, he became socially inappropriate and exhibited a lack of empathy.
There have also been reports that his mental health improved and that in later life he was far more functional, and socially far better adapted, than in the years immediately following his accident.
Still, Gage faced numerous challenges in the aftermath of his injury. He worked as a stagecoach driver but suffered health problems that made work difficult and that later extended to epileptic seizures. Gage died in 1860 at the age of 36.
Gage's contribution to neuroscience lies not only in the survival of a traumatic brain injury but also in the invaluable insights it provided into the intricate workings of the human brain.
Over time, researchers and neuroscientists have continued to investigate the intricate connections between brain structure and function, building upon the insights provided by Gage's unique case.
Photo Credits: ادمین سایت, Guorong Ma, Hongying Fan, Chanchan Shen, Wei Wang, Anthony E. D. Mobbs, E.J. Barnes, L.B. Lee, EEng, Department of Radiology, Uppsala University Hospital, Erald Mecani, Sydniemlinde, Daniel G. Axtell, Van Horn JD, Irimia A, Torgerson CM, Chambers MC, Kikinis R, Der Lange, Yun-Jun Sun, Barbara J. Sahakian, Christelle Langley, Anyi Yang, Yuchao Jiang, Jujiao Kang, Xingming Zhao, Chunhe Li, Wei Cheng and Jianfeng Feng @ wikicommons
Pablo Escobar and the White House: The mysteries of the iconic photo
Pablo Emilio Escobar Gaviria, the notorious Colombian drug lord, is a name that resonates with infamy in the annals of criminal history. Born on December 1, 1949, in Rionegro, Colombia, Escobar rose from humble beginnings to become the undisputed leader of the infamous Medellín Cartel. His life and influence are a complex web of criminal enterprise, political intrigue, and a legacy that continues to cast a shadow over Colombia and the global war on drugs.
But one of his most iconic moments was a photo. In 1981, Escobar, accompanied by his family, visited the United States. During this trip, he posed with his son, Juan Pablo, in front of the North Portico of the White House in Washington, D.C. Maria Victoria, Escobar's wife, captured the moment on Pennsylvania Avenue, with the U.S. president's house as the backdrop.
The apparent ease with which Escobar was able to take a photograph in front of the White House raises questions. It came at the same time as Escobar was making enormous profits from smuggling cocaine into the United States.
The Medellín Cartel quickly expanded after it bought property in the Bahamas in 1978, which facilitated its smuggling operations.
The co-founder of the cartel, Carlos Lehder, purchased real estate on the small island of Norman's Cay, which subsequently served as a crucial layover point where drug planes could refuel on their journey to the United States.
At the time of the iconic photograph, Pablo Escobar was also actively attempting to position himself as a legitimate politician in Colombia. In 1982 he successfully entered the Colombian Congress as an alternate member and was granted parliamentary immunity and the right to a diplomatic passport.
Escobar undertook extensive initiatives to cultivate a respectable image, investing millions in the development of some of Medellín's most impoverished neighborhoods. He constructed housing complexes, parks, football stadiums, hospitals, schools, and churches. His charitable activities may have drawn attention away from his smuggling operations.
But his stint in politics did not last long. Minister of Justice Rodrigo Lara Bonilla investigated the illicit methods through which Escobar made his money, and Escobar was forced to resign two years after his election.
It is possible that Escobar’s presence in front of the White House did not raise the attention of U.S. security agencies because he was not yet well known. But it’s also possible that the CIA or other agencies were aware of his presence, but saw no benefit from arresting him at that time.
There is also speculation that the CIA had an agreement to allow the passage of cocaine into Florida, though there is no evidence for this claim. The CIA's activities during this period did come under scrutiny over allegations that the agency was involved in the Nicaraguan Contras' cocaine trafficking operations during the 1980s, but U.S. government investigations concluded that these theories were unsupported.
Escobar's trip to the White House was not the only high-profile part of his journey in the United States. His family also made an extravagant trip to Disneyland. According to Juan Pablo Escobar, the family spared no expense during their vacation: Escobar was said to have hired a personal consultant to advise them on attractions, kept a personal driver, and splurged on clothing and souvenirs.
While the exact factors behind Escobar’s brazen photo remain unclear, it remains an intriguing part of his legacy.
Escobar’s notoriety increased after he ordered the bombing of Avianca Flight 203 in 1989, killing all 107 people on board. This was done in retaliation for the Colombian government's extradition of drug traffickers to the United States.
By the early 1990s the Colombian government, supported by a U.S.-led task force, was hunting him relentlessly.
The drug lord's empire crumbled as key members of the cartel were captured or killed. In 1993, facing imminent capture, Escobar was gunned down by Colombian authorities on a rooftop in Medellín.
The legacy of Pablo Escobar is a complex and controversial one. While he is remembered as one of the most notorious criminals in history, his impact on Colombian society is undeniable. The scars of the drug trade and the violence associated with Escobar's reign continue to shape the country's socio-political landscape.
Photo credits: Sebastian Marroquin; Ghazi777755, Claudiabetancourt03, Juandax, Elcarrielnoespatrimonio, david peña, Death270598, Colombia Turiscol, Richard Vandervord, SEMANA, Cristian Santiago Gomez Zapata, James St. John, Tiomono @ wiki commons
Music: Bowers & Wilkins by JNGS @ unminus.com
Don't Blame Benjamin Franklin! The History of Daylight Saving
The biannual adjustment of clocks for daylight saving time is a frustration for many, and for those who hate the practice there appears to be no hope that it will end anytime soon. Let's look at how this all began.
Daylight saving time is often linked to Benjamin Franklin, but the roots of this controversial practice really lie in attempts at energy conservation during wartime.
Franklin playfully suggested the idea in 1784, sarcastically proposing that the party-loving citizens of Paris could save on lamp oil by going to bed earlier. However, it only gained serious consideration in New Zealand and England much later.
George Vernon Hudson, a New Zealand entomologist, proposed the idea in 1895 because he wanted to maximize daylight for his insect collecting hobby. William Willett, an English builder, also championed daylight saving time after his observations during a 1905 horseback ride that many residents of London were still asleep while the sun was rising. An avid golfer, Willett also disliked cutting short his round when the sun set.
Willett published "The Waste of Daylight" in 1907, proposing setting clocks forward by 80 minutes in spring and reversing the process in the fall. Despite initial resistance, he gained support for the idea, but passed away in 1915, a year before it saw widespread adoption.
The impetus for its adoption came during World War One, when governments were concerned about conserving energy.
Germany led the way in Europe, formally adopting daylight saving time on April 30, 1916. The rationale was indeed to conserve coal by optimizing daylight usage and reducing the need for artificial lighting. The idea was that by shifting the clocks forward during the longer days of summer, people would use less electricity for lighting in the evening.
Austria-Hungary and the United Kingdom soon followed suit, and the United States introduced the concept in 1918.
Daylight saving time continued to be adopted during wartime and intermittently ceased thereafter. It regained popularity during the 1970s energy crisis and began the road toward its current permanent presence.
But research on the energy-saving effects of daylight saving time has produced mixed results. The impact can vary depending on factors such as geographical location, climate, and technological advancements. Some studies also suggest that the reduction in lighting costs during the extended daylight hours is offset by increased energy consumption in other areas, such as heating and cooling, as well as transportation.
As societies have evolved and technology has advanced, the primary reasons for adopting daylight saving have expanded to include factors like increased outdoor activities, economic benefits, and potential reductions in traffic accidents.
It has been argued that farmers enjoy the ritual, but in fact they largely oppose the schedule, saying that it adds to the challenges of farming.
Much like Hudson, the New Zealand bug collector, office workers benefit from daylight saving, which increases the available light for after-work activities.
More recently there has also been more attention on the health consequences of changing the clock, which include disrupted sleep and stress, along with increased risks of strokes, heart attacks, and traffic accidents.
Efforts to end daylight saving have gained momentum, with the European Parliament voting to abolish it in 2019. However, member states have delayed decisions on how to move forward. In the United States, the Senate passed legislation in 2022 for a permanent shift to summer time starting in 2023, but progress stalled in the House of Representatives.
Today, discussions continue about the relevance and potential impacts of daylight saving on energy savings and public well-being, but for those opposing the practice there seems to be little light on the horizon.
Franz Reichelt: The Tragic Tale of the Flying Tailor
Franz Reichelt etched his name into history with a peculiar and tragic experiment that would come to define him as "The Flying Tailor." Born on October 16, 1879, in Wegstädtl, Austria-Hungary (now part of the Czech Republic), Reichelt was a skilled tailor, and by the age of 20 he had moved to Paris to pursue a career in fashion.
He opened his own shop in an upscale neighborhood near the Palais Garnier, home to the Paris Opera.
But Reichelt harbored a fascination with flight and an unyielding desire to conquer the skies. In particular, he fixated on the idea of creating a wearable parachute suit.
The advent of the aviation era had led to a wave of accidents, prompting an increasing focus on safety measures and the quest for a reliable parachute. However, as of 1910, there remained a conspicuous absence of a parachute well-suited for jumps from airplanes or at low altitudes.
Reichelt sought to create a parachute that would be lightweight, easy to wear, and would open automatically when the wearer jumped from a great height.
Reichelt experimented with dummies thrown from his fifth-floor apartment. Initially these tests showed promise: dummies outfitted with foldable silk "wings" landed lightly. However, translating this into a functional wearable "suit" proved a challenge. The original design used 6 square meters (65 sq ft) of material and weighed approximately 70 kilograms (150 lb).
Reichelt persevered, aiming to reduce the weight and increase the surface area of the canopy. But his designs continued to fail: when tested on dummies dropped from his apartment, they landed heavily.
It was reported that Reichelt also tested his outfits himself, once jumping from a height of 8 to 10 meters (26 to 33 ft), where a bed of straw shielded him from injury. Another attempt from a height of 8 meters (26 ft) reportedly ended with a broken leg.
Undeterred, Reichelt attributed the failures at least in part to the short drop distances over which he had conducted his tests. He was keen to experiment from a greater height at the Eiffel Tower.
That fateful day arrived on February 4, 1912. Reichelt climbed to the first platform of the tower, which was 187 feet above the ground. His parachute suit consisted of a long cloak with a hood and two large wings attached to his arms.
After some hesitation Reichelt leaped off the tower, expecting his invention to unfurl and slow his descent gently.
But his parachute failed to open, and he plummeted to the ground, landing with a sickening thud. He was killed instantly. The man who aspired to conquer the skies became a victim of his own daring experiment.
In the aftermath of Reichelt's death, his parachute suit was scrutinized by experts who confirmed its flaws. The design lacked the necessary aerodynamics and structural integrity to function as a reliable parachute. It was too heavy and cumbersome, and it did not have enough surface area to slow his descent.
The police later said that they had approved the experiment only for dummies and had not greenlit a jump by Reichelt himself. Reichelt's friends also said they were surprised by his plan, which he had concealed until the last moment, and they unsuccessfully tried to talk him out of it.
Reichelt's death was filmed by a news crew and became one of the most iconic images of the early aviation era.
Julia Child: From Military Service to Culinary Mastery
Julia Child was born on August 15, 1912, in Pasadena, California, the eldest of three children born to John McWilliams, a wealthy investment banker, and Julia Carolyn, a paper-company heiress.
In her youth, Child enjoyed playing sports such as basketball and tennis. After college, where she majored in history, she briefly worked as a copywriter and editor in New York.
In 1942, after the United States entered World War II, Child joined the Office of Strategic Services (OSS), the precursor to the Central Intelligence Agency (CIA). She quickly rose through the ranks, moving from typist to top-secret researcher.
Notably, during her time at the OSS, Child helped develop a shark repellent that kept sharks away from underwater explosives.
Her intelligence work took her to various locations, including Kandy, Ceylon (now Sri Lanka), where she met her future husband, Paul Child, also a member of the OSS.
After the war, Julia and Paul moved to Paris where Paul, a diplomat, was stationed. Julia quickly fell in love with French cooking and had the opportunity to study at the Cordon Bleu cooking school, where she learned the intricacies of French cooking under the guidance of some of the finest chefs.
In fact, it wasn't until she met Paul that Child began cooking and found joy in making food. Paul was a foodie and imparted his enthusiasm to Julia.
Julia's time at Le Cordon Bleu was transformative, but not without its challenges. As the only woman in her class and more than 6 feet tall, she was an exuberant presence in the kitchen, but she faced skepticism and occasional condescension from her male counterparts. However, her determination and love for the craft propelled her forward until her graduation in 1951.
Julia then turned to the challenge of introducing French cuisine to an American audience unfamiliar with its nuances. In collaboration with Simone Beck and Louisette Bertholle, she co-authored the groundbreaking cookbook "Mastering the Art of French Cooking," which was published in 1961. This comprehensive guide demystified French culinary techniques for an American audience, making gourmet cooking accessible to home cooks.
It was the first comprehensive cookbook on French cuisine to be published in English, and its huge success made Julia Child a household name.
In 1963, Child launched her own cooking show on PBS called The French Chef. The show was an instant hit, and it made her a beloved figure in American culture.
The French Chef was one of the first cooking shows on television, and it revolutionized the way Americans cooked and ate. Child taught viewers how to make classic French dishes such as boeuf bourguignon, coq au vin, and crème brûlée in a way that was both entertaining and informative.
One of Julia Child's enduring contributions to the culinary world is her emphasis on the joy of cooking. She had a charismatic and unpretentious demeanor that made gourmet cooking seem approachable and enjoyable for everyone. Her famous phrase, "If you're afraid of butter, use cream," reflected her philosophy of indulgence in moderation, promoting the idea that good food is meant to be savored.
Julia Child continued to write cookbooks and host cooking shows throughout her career.
The love story between Julia and Paul added a romantic dimension to her culinary journey. Paul, an artist and photographer, contributed significantly to the visual appeal of Julia's cookbooks. His keen eye for detail and artistic sensibilities complemented Julia's culinary expertise, creating a harmonious blend of words and images that made their work distinctive.
Julia Child died in her sleep on August 13, 2004, at the age of 91, after a last meal of French onion soup. She credited her long and adventurous life to eating red meat and drinking gin.
Photo credits: MattHucke, Breville USA, transcendancing, RadioFan, Glenn Dettwiler, Antony-22 @ wiki commons, Mister Mister, Chris Molloy, Roberto Vivancos, GEORGE DESIPRIS @ pexels.com
Music credits: Trumpets in Your Ears by Wowa & Chris Rede, Just Cool by Wowa @unminus.com
Friendly Fire Fiasco: The Bizarre Battle of Karánsebes
In military history, there's a bizarre episode that doesn't stand out for its brilliance or heroism but for its absurdity—the Battle of Karánsebes. Amidst the Austro-Turkish War of 1788-1791, the Austrian army, camped near Karánsebes in present-day Romania, found itself in a comically chaotic situation.
The Austrian army, whose soldiers came from numerous countries and spoke different languages, was poorly disciplined as it prepared to face the Turks.
That night, confident in victory, the generals strategized at the emperor's tent. Meanwhile, hussar cavalrymen, in search of the enemy, encountered Vlach Roma offering liquor and companionship. An agreement was struck, leading to a boisterous celebration.
Later, another Romanian unit sought to share the revelry but was met with resistance, sparking a confrontation. Shouts of "Turcii! Turcii!" intensified the confusion. At midnight, gunfire erupted. Misinterpreting the situation, the hussars believed the Romanians were Ottoman soldiers in disguise. Panic set in, and they fled towards the bridge. German officers shouted "Halt! Halt!" but soldiers who did not understand German mistook the cries for "Allah" and believed the Turks were upon them.
A commander, thinking the Turks were attacking, ordered cannons to fire. Chaos ensued, with the Austrian army fighting among themselves.
The fighting lasted for several hours, and by the time it was over, hundreds of Austrian soldiers were dead or wounded; the Turks, meanwhile, were never there at all. Two days later, the unsuspecting Ottoman army arrived to find a deserted and devastated camp, claiming an unexpected victory.
The Battle of Karánsebes serves as a stark illustration of the unpredictable nature of warfare. Amidst misunderstandings and folly, what could have been a historic victory turned into a farcical defeat.
Music credits: Trumpets in Your Ears by Wowa & Chris Rede, Ambisax by RRound @unminus.com
From Hamilton to the Fed: The Fall and Rise of U.S. Central Banks
For the last century the U.S. Federal Reserve has held unprecedented power over the U.S. economy and banking sector. But the role of a U.S. central bank has been controversial.
The First Bank of the United States was established in 1791 under Alexander Hamilton, the first Secretary of the Treasury.
At the time, the United States was a young country still recovering from the Revolutionary War, and its government was heavily in debt. Hamilton believed that a central bank could help to stabilize the economy by regulating commercial banks and issuing paper currency.
He also believed that a central bank could help the government to finance its debts by buying government bonds. This would provide the government with the money it needed to pay its debts and to invest in infrastructure and other projects.
The bank was opposed by influential people including Thomas Jefferson and James Madison. Critics argued that the bank's existence was unconstitutional and concentrated too much power federally instead of developing state banks. They further worried that it would benefit wealthy businessmen in cities at the expense of ordinary Americans and farmers.
In 1811, Congress voted not to renew the First Bank's charter and the bank was closed.
Following the War of 1812, the Second Bank of the United States was chartered in 1816, again citing a need to address economic challenges. Its responsibilities included regulating state banks, stabilizing the currency, and managing national finances.
The bank came under fire early on for mismanagement. It failed to control the amount of paper money being issued by its branch banks in the West and South, which contributed to the speculative land boom that occurred after the war.
When U.S. markets collapsed in the Panic of 1819, the bank continued to contract credit even as the economy began to recover, a policy that worsened mass unemployment and falling property values.
The bank later oversaw more steady credit expansion under the leadership of Nicholas Biddle, and became a more powerful institution. But it faced opposition, notably from Andrew Jackson, who believed it had failed to produce a stable national currency, lacked constitutional legitimacy, subverted the rights of states and was “dangerous to the liberties of the people.”
The Bank War unfolded, culminating in Jackson's veto of the bank's rechartering in 1832.
The Free Banking Era ensued. During this time the country faced several economic crises, including in 1836, when Jackson attempted to fend off excesses in land speculation with the Specie Circular, which required that payments for government land be made in gold or silver. That shocked state banks that had issued paper money without sufficient backing in precious metals, causing a contraction in credit that culminated in the Panic of 1837.
During the free banking era states passed laws that allowed banks to operate with little regulation. This led to a proliferation of banks, some of which were poorly managed. Proponents of central banks say that this era suffered from not having a central bank to act as a lender of last resort during downturns and panics to stabilize the financial system.
Central bank critics counter that central planning of the economy and the manipulation of the money supply creates asset bubbles and inflation and exacerbates boom and bust cycles. The concentration of power among a few, unelected officials that can make decisions that impact the entire economy is also seen as undemocratic and bailing out banks that make bad investments creates moral hazard.
After the financial panic of 1907, efforts to create a central bank hastened again. This culminated in 1913 with the creation of the Federal Reserve System. The mandate of the Federal Reserve, with its twelve regional banks and Board of Governors, was to address financial panics, provide an elastic currency, and serve as a lender of last resort.
The Fed sets interest rates and manages the country's money supply. After the financial crisis of 2007-2009 it undertook large-scale bond purchases, which increase the money supply in the economy and help to expand credit. Supporters say this helped the economy rebound from the downturn, but critics charge that it led to asset bubbles, contributed to inflation, reduced the value of the U.S. currency, and made it more difficult for the economy to return to normal conditions without central bank support.
The Federal Reserve is also considering launching a digital currency, which critics warn could give it unprecedented control over how U.S. citizens spend their money.
Photo Credits: FamZoo, Rob Lavinsky, iRocks.com, AgnosticPreachersKid, Myotus, Bestbudbrian @ wiki commons
Karolina Grabowska, Erik Scheel, Anna Nekrashevich @ pexels.com
Music credits: Trumpets in Your Ears by Wowa & Chris Rede, Good God by Wowa @unminus.com
The Battle of Los Angeles: A Night of Mystery and Mayhem
In the early hours of February 25, 1942, Los Angeles found itself thrust into a state of wartime frenzy as residents witnessed an otherworldly spectacle overhead. The city, still reeling from the attack on Pearl Harbor in Hawaii, was plunged into hysteria as an unidentified object appeared in the night sky.
Air raid sirens sounded and a citywide blackout was put into effect. Witnesses reported seeing strange lights and objects in the sky. Some people said that they saw parachutes and even flying saucers. Others reported hearing explosions and seeing anti-aircraft fire streaking through the sky.
The military scrambled fighter planes to intercept the unidentified objects, but they were unable to find anything. The anti-aircraft fire continued for several hours, but no enemy aircraft were shot down.
As dawn broke over Los Angeles, the city emerged from the blackout, and the residents were left to survey the aftermath of the chaotic night. Shell fragments caused damage to numerous buildings and vehicles. Three people also lost their lives in car accidents, while two succumbed to heart attacks brought on by the stress of the incident.
But despite the intensity of the anti-aircraft barrage, there was no evidence of an enemy aircraft, and no wreckage was found. The Battle of Los Angeles had ended as mysteriously as it began.
In the wake of the incident, military officials scrambled to provide an explanation for the events of that night. The official statement claimed that the object was likely a weather balloon or a wayward barrage balloon, both common sightings during wartime. The military maintained that the intense anti-aircraft fire was a result of war nerves and heightened anxiety.
However, the official explanation failed to quell speculation and conspiracy theories. Some suggested extraterrestrial origins for the object, citing the inability of the military to bring it down as evidence of otherworldly technology. Others proposed that the incident was a psychological warfare experiment orchestrated by the military to gauge public reactions.
Some also believed that the military was covering up a real attack by Japanese aircraft. However, no Japanese aircraft were found in the area after the incident.
The Battle of Los Angeles remains a compelling mystery in the historical record. Whether misidentified weather phenomena, experimental military technology, or an extraterrestrial visitation, the events of that night continue to be debated.
Photo credits: soron616 @ wiki commons
Music credits: Trumpets in Your Ears by Wowa & Chris Rede, Blue Sky by Wowa @unminus.com
Decay and Demise: Unveiling the Chronicles of the Roman Empire's Fall
The Roman Empire holds immense historical importance as a foundational civilization whose contributions span politics, law, governance, architecture, engineering, art, literature, and culture.
At its zenith in 117 AD, it stretched across 2.3 million square miles, encompassing parts of Africa, Asia, and Europe, with an estimated 60 million inhabitants. It stood as one of history's grandest and mightiest empires. However, its sprawling size became unsustainable.
The decline began shortly after Emperor Trajan's death in 117 when Mesopotamia was lost to the Parthians. Rome never regained this territory. In Europe, defending the vast Germania border strained resources.
In 212, Caracalla's Antonine Constitution granted citizenship to free men across the empire, promoting a shared Roman identity. The Romans, however, continued to view the peoples beyond their borders, especially the Germanic tribes, as barbarians.
The decline began in earnest during the 3rd century, a period often referred to as the "Crisis of the Third Century." This tumultuous era was characterized by frequent changes in leadership, military crises, economic instability, and a breakdown of central authority. Starting in 235, a succession of short-lived emperors ruled the empire, often coming to power through military coups or assassinations. The assassination of Emperor Alexander Severus marked the beginning of this chaotic period.
Rome faced constant pressure at its borders. In the east, the Sassanid Empire (Persia) became a formidable adversary and engaged in a series of wars with Rome. In the west, Germanic tribes, particularly the Goths, began to pose a significant threat. In 251, the Roman Empire suffered a humiliating defeat at the hands of the Goths at the Battle of Abritus, during which Emperor Decius was killed.
The Roman economy also faced severe challenges during this period. Heavy taxation, rampant inflation, and a debased currency (the denarius) eroded the wealth of the empire. The government's efforts to address these economic issues often fell short, leading to further instability.
Emperor Diocletian came to power in 284 and initiated a series of reforms known as the "Tetrarchy." He divided the empire into eastern and western halves, each with its own emperor and subordinate "Caesars." He implemented price controls and strengthened the imperial bureaucracy. While these reforms brought temporary stability, they were not enough to halt the empire's long-term decline.
Constantine the Great, who ruled from 306 to 337, is known for his conversion to Christianity and his establishment of Constantinople (modern-day Istanbul) as the new eastern capital of the Roman Empire in 330. This move signaled a significant shift in the center of power from Rome to the eastern provinces. While it brought administrative efficiency and security to the east, it also contributed to the gradual decline of the western half of the empire.
The Visigoths famously sacked Rome in 410, marking the first time the city had been captured in over 800 years. In 451, the Huns, led by Attila, invaded the western provinces, causing widespread devastation. While these attacks were repelled, they weakened the Roman Empire's ability to defend its borders.
The internal division and power struggles among Roman leaders further contributed to the empire's decline. Civil wars became increasingly common, as different generals and provincial leaders vied for control. The divide between the eastern and western halves of the empire also deepened.
In 476, the Western Roman Empire suffered a final blow when the Germanic chieftain Odoacer deposed its last emperor, Romulus Augustulus. This event is traditionally regarded as the fall of the Western Roman Empire.
While the western half of the empire crumbled, the eastern half, known as the Byzantine Empire, continued to thrive for several centuries, preserving Roman traditions, culture, and governance. The Byzantine Empire would endure until the fall of Constantinople in 1453, marking the end of the Roman legacy in the east.
Historians cite myriad reasons for Rome's fall, from economic decline to external pressures. Internally, economic instability eroded Rome's tax base and disrupted long-distance trade, fracturing the empire's unity. Externally, increasing numbers of outsiders, including Germanic tribes, sought to join Rome, not conquer it. However, Rome's lingering disdain for these newcomers hindered integration.
The Roman Empire's legacy endures. It left an indelible mark on Western civilization, influencing art, architecture, law, language, and governance.
Photo credits - Jean-Pol GRANDMONT, David Corral Gadea, José Luiz Bernardes Ribeiro, Marie-Lan Nguyen, British Museum, DAVID ILIFF, Hartmann Linge, Cjcaesar, Ninara, Classical Numismatic Group, Inc, Rabax63, Bigdaddy1204 @ wiki commons
Music - Trumpets in Your Ears by Wowa & Chris Rede, Piratos by Wowa @unminus.com
The Great Stink of London: A Stench That Sparked Urban Reform
London in the mid-19th century was a rapidly expanding urban hub. The city's population more than doubled between 1800 and 1850.
As the population surged, so did the volume of waste generated by both residents and businesses. But the sanitation infrastructure of the era was utterly inadequate to handle the sheer magnitude of human waste and industrial discharges.
The Great Stink commenced in July 1858, coinciding with scorching temperatures in London that averaged 34-36°C (93-97°F). The oppressive heat caused the Thames River to recede to its lowest levels in years, revealing the putrid sewage that had accumulated on its bed.
The crux of the issue lay in London's antiquated sewer system. These sewers constituted an aging patchwork of channels, cesspits, and drains that discharged directly into the Thames River. Once a vital waterway for transportation and commerce, the Thames had devolved into an open sewer, emitting a foul odor of filth and contagion.
The Great Stink presented a dire public health crisis for London. The fetid air was not only a nuisance but was also seen as a breeding ground for disease. Cholera, typhoid, and other waterborne illnesses were rampant at the time, claiming countless lives.
During the Victorian era, prevailing healthcare theories leaned toward the miasma theory, attributing the spread of contagious diseases to the inhalation of contaminated air. This contamination could stem from the stench of decomposing bodies, sewage, putrefying vegetation, or even the exhalations of infected individuals. Many believed that miasma was the primary mode of cholera transmission, a disease that caused widespread fear due to its rapid dissemination and high mortality rates.
It wasn't until later in the century, notably after Dr. John Snow's investigation into the cholera outbreak in Soho in 1854, that the view that contaminated water was the source of cholera gained acceptance.
The growing popularity of flush toilets contributed to the stench. Fear that miasma from the sewers could spread disease led to more frequent toilet flushing, resulting in even more sewage discharge into the Thames.
Reports of the stench reached the highest levels of government, and members of Parliament were forced to suspend their sessions because of the unbearable conditions in the Palace of Westminster, located on the banks of the Thames.
The government’s response during the early days was to douse the curtains of the Houses of Parliament in chloride of lime, and later to pour chalk lime, chloride of lime and carbolic acid directly into the water. But this was ineffective. The crisis demanded immediate action.
The turning point arrived in the form of an enterprising engineer named Joseph Bazalgette. Appointed as the chief engineer of the Metropolitan Board of Works, Bazalgette had long championed a comprehensive overhaul of London's sewage and sanitation systems.
Bazalgette's visionary plan encompassed the construction of an intricate network of subterranean sewers designed to redirect sewage away from the city and into the Thames Estuary. This ambitious endeavor aimed not only to resolve the immediate Great Stink crisis but also to forestall future outbreaks of waterborne diseases.
Bazalgette's plan required more than just the construction of underground sewers. To accommodate the new system, the Thames itself had to be transformed. Bazalgette oversaw the creation of massive embankments along the river, which not only housed the new sewers but also widened the riverbanks and provided space for the construction of new roads.
The embankments served as marvels of engineering and urban planning. They not only ameliorated sanitation conditions but also improved the city's infrastructure and visual appeal. This ambitious project involved the removal of vast quantities of waste and the relocation of numerous buildings and businesses along the river.
By the mid-1860s, Bazalgette's ambitious project was nearing completion. The new sewer system effectively diverted waste away from the city and into the estuary, eliminating the immediate threat of the Great Stink. The impact on public health was profound, with the incidence of waterborne diseases decreasing dramatically.
The Great Stink of London, while a harrowing episode in the city's history, had a lasting legacy of urban reform. Bazalgette's sewer system set a standard for sanitation infrastructure that would be emulated in cities around the world. It demonstrated the importance of long-term planning, investment in public health, and the role of engineering in shaping urban environments.
Bazalgette's innovative approach to sewage and sanitation not only eradicated the immediate threat but also laid the groundwork for London's future as a modern, healthy, and livable city.
The Early Years of the Roman Empire: From Republic to Imperial Power
The early days of the Roman Empire represent a crucial era when the Roman Republic evolved into a formidable imperial force. This left an enduring mark on Western civilization, deeply impacting politics, legal systems, culture, and more.
According to legend, the city of Rome was founded in 753 BC by the twin brothers Romulus and Remus. Its strategic location along the Tiber River provided access to fertile lands, natural defenses, and crucial trade routes, all of which played a pivotal role in its rise.
Rome's evolution from a monarchy to a republic in 509 BC marked a significant turning point, introducing a novel system of governance, characterized by a delicate balance of power among various branches. This fostered political stability and accountability.
One of the key factors in Rome's ascent was its formidable military. The Roman legions, known for their discipline and organization, became the backbone of the empire's expansion.
Rome's republic held annual elections of two consuls, who served as both political leaders and military commanders. However, the political landscape was marked by a struggle between the patricians and the plebeians.
During the early republic, Rome experienced significant territorial expansion, including the conquest of the Italian peninsula and the epic conflicts known as the Punic Wars against Carthage.
The rise of Julius Caesar in the first century BC marked a pivotal moment in Roman history. A skilled military general and charismatic politician, Caesar began his ascent when he formed the First Triumvirate in 60 BC, amid political turmoil and power struggles. This informal alliance enabled him to secure the consulship and embark on successful military campaigns in Gaul.
As consul, Julius Caesar implemented a series of reforms aimed at addressing social and economic inequalities. Nevertheless, his ambition and popularity made him a polarizing figure within the Roman Senate, leading to a power struggle. In 49 BC, Caesar famously crossed the Rubicon River with his legions, sparking a civil war against the Senate and his rival, Pompey.
In 48 BC, Caesar emerged victorious at the Battle of Pharsalus, compelling Pompey to flee to Egypt, where he met his demise.
Caesar returned to Rome as a conquering hero. He assumed the position of dictator perpetuo, effectively making him the supreme ruler of Rome, and enacted a series of reforms. However, Caesar's growing power and refusal to disband his legions raised concerns among the Senate and traditionalists who feared a return to monarchy.
On the Ides of March in 44 BC, Caesar was assassinated by a group of conspiring senators led by Brutus and Cassius, plunging Rome into turmoil once again.
Eventually, the Second Triumvirate was formed in 43 BC. Comprising Octavian (Caesar's great-nephew and heir), Antony, and Lepidus, the alliance sought to avenge Caesar's death and consolidate its members' authority.
The Second Triumvirate purged political opponents and pursued Caesar's assassins, most notably Brutus and Cassius, who were defeated at the Battle of Philippi in 42 BC. However, personal rivalries and power struggles within the triumvirate led to the marginalization of Lepidus. Tensions between Octavian and Antony grew, culminating in the famous Battle of Actium in 31 BC, where Octavian emerged victorious.
Octavian became the uncontested ruler of Rome. In 27 BC, he formally established the principate. Augustus, as Octavian came to be known, portrayed himself as the princeps, or "first citizen," rather than a monarch.
Augustus wielded enormous power. He held multiple titles, including imperator (military commander), consul, and tribune, which granted him authority over the military, administration, and the Roman Senate. This period marked the beginning of the Roman Empire in earnest.
Augustus oversaw an era of relative peace and stability known as the Pax Romana, which lasted until 180 AD. This period witnessed remarkable prosperity and cultural flourishing as the Roman Empire reached its zenith, spanning from Britannia to Egypt.
Augustus ruled for over four decades, and his succession plan paved the way for a line of emperors known as the Julio-Claudian dynasty. This dynastic succession solidified the principle of hereditary rule, which persisted in subsequent imperial dynasties.
The early years of the Roman Empire left an indelible impact on world history. The transition from the Roman Republic to the Roman Empire marked the end of a republican experiment and the rise of a new political order characterized by autocracy and centralization.
Photo credits - Jebulon, Chris Nas, Joel Bellviure, José Luiz Bernardes Ribeiro, FeaturedPics, Diliff, Doc Searls, Benh LIEU SONG, Marcok, Kurt Kaiser, Vyacheslav Argenberg, Herbert Weber, Jenna Post, Krzysztof Golik, Shakko, Rabax63, Alphanidon, Luke Jones, Pveverdingen, Carole Raddato @ wiki commons
Music - Trumpets in Your Ears by Wowa & Chris Rede, Please Wait by Wowa @unminus.com
Magna Carta: The Medieval Roots of Modern Democracy
Magna Carta, also known as the Great Charter, serves as a foundational cornerstone of modern democracy and the rule of law. Sealed on June 15, 1215, in a meadow at Runnymede, England, it represents a pivotal moment in the struggle for individual liberties and limitations on absolute monarchical power.
The early 13th century was a tumultuous time in England. King John, infamous for his oppressive and arbitrary rule, faced mounting discontent among his barons, clergy, and subjects. His reign was marked by exorbitant taxation, military failures, his abuse of the legal system, and his interference with the Church. By 1215, a group of rebellious barons demanded relief from the king's tyranny. It was in this backdrop of political unrest that Magna Carta was born.
Magna Carta was not a declaration of universal human rights, but rather a practical and specific attempt to address the grievances of the barons and limit the authority of the king. It outlined a set of promises and rights and primarily focused on taxation, feudal obligations, and the administration of justice.
The Charter contained 63 clauses, each addressing a particular grievance or demand. Some clauses were concerned with smaller matters, such as fishing rights on the River Thames. However, other clauses laid the foundation for principles that would become central to the development of constitutional law.
Key Principles of Magna Carta include:
Rule of Law: Magna Carta established the principle that no one, not even the king, is above the law.
Due Process: The Charter ensured that individuals could not be arbitrarily punished or deprived of their rights.
Taxation with Consent: One of the central concerns of the barons was excessive taxation. Magna Carta required that any new taxes or levies must be approved by a council of nobles, known as the Great Council. This principle laid the groundwork for modern concepts of representative government and taxation with representation.
Protection of Widows and Heirs: The Charter included provisions to protect the inheritance rights of widows and heirs.
Freedom of the Church: Magna Carta asserted that the English Church should be free from royal interference.
Habeas Corpus: Although not explicitly mentioned in Magna Carta, the principle of habeas corpus, which ensures that individuals cannot be held in detention without a legal basis, traces back to this document.
Magna Carta was not an instant remedy to the grievances of the barons or the broader issues of governance in medieval England. King John, after reluctantly sealing the Charter, sought papal annulment, plunging the country into civil war. However, Magna Carta's principles gained greater significance over time.
During the reign of John's son, King Henry III, and subsequent monarchs, Magna Carta was reissued and reconfirmed multiple times, and it was expanded and modified to address contemporary concerns. By the late 13th century, it had evolved from a baronial charter into a symbol of broader legal and political principles.
One of the most pivotal moments in Magna Carta's evolution came in 1297 when King Edward I issued a version known as the "Confirmatio Cartarum" (Confirmation of the Charters). This made the document applicable to all free men in the kingdom, not just the barons. It solidified the idea that the king's authority was limited by law and that the rights and liberties enshrined in the Charter were not the exclusive preserve of the aristocracy.
Magna Carta also had a profound impact beyond England. It was a model for the development of constitutional and legal principles in other nations, including Scotland, where the "Declaration of Arbroath" in 1320 drew inspiration from the Charter. In the American colonies, Magna Carta's legacy played a significant role in shaping the concept of individual rights and the resistance to British colonial rule. The famous phrase "No taxation without representation" echoes the Charter's provisions on taxation and consent.
Magna Carta influenced the development of legal systems around the world. Many see it as a precursor to modern constitutionalism, the Bill of Rights, and the Universal Declaration of Human Rights. Its principles have left an indelible mark on the development of democratic societies and the protection of human rights.
Magna Carta transcended its historical context to become a symbol of liberty and justice for all. The document stands for the power of collective action against tyranny and the enduring quest for justice, fairness, and the rule of law.
Photo credits - Basher Eyre, Michael Garlick, Richard Croft, Adrian S Pye, Christine Matthews, Yale Law Library, David Dixon, Philip Halling, Karen Vernon, Tony Hisgett, FDR Presidential Library & Museum, Rasiel Suarez, Artists' Suffrage League @ wikicommons
Music theme - Trumpets in Your Ears by Wowa & Chris Rede @unminus.com
Dr. John Stapp: Pushing the Limits of Human Endurance
Dr. John Paul Stapp was a pioneer in aerospace medicine, known for his groundbreaking experiments that pushed the limits of human endurance. His primary focus was understanding how acceleration and deceleration affect the human body to enhance the safety of aircraft, spacecraft, and automobiles.
In October 1944, Stapp joined the U.S. Army Air Forces as a physician and became a qualified flight surgeon. By 1946, he was assigned to the Aero Medical Laboratory at Wright Field's training facility in Dayton, Ohio, where his interest in testing human tolerance took off.
Driven by the increasing speed and altitude of post-World War II aircraft, Stapp wanted to research human tolerance to G-forces, rapid spins, oxygen deprivation, and exposure to cosmic rays. This endeavor was partly sparked by witnessing ejection seat tests.
Stapp's initial experiments took place in early 1947 on a 2,000-foot track at Muroc Air Force Base (now Edwards Air Force Base). Northrop Aircraft built a rocket-propelled sled capable of a maximum speed of around 200 miles per hour and installed 45-foot-long hydraulic brakes at the end of the course. The sled was code-named "Gee Whiz."
Stapp himself frequently rode the sled, suffering injuries during the experiments that included a fractured wrist. The sled's brakes produced a deceleration of up to 35 Gs, or 35 times the force of gravity. Before the experiments, it was believed that 18 Gs would be fatal for humans.
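A rough sanity check supports these figures (a back-of-the-envelope sketch assuming constant deceleration over the full 45-foot brake run; the actual deceleration was not uniform, so instantaneous peaks exceed this average). With v ≈ 200 mph ≈ 89.4 m/s and d ≈ 45 ft ≈ 13.7 m:

\[
a = \frac{v^2}{2d} \approx \frac{(89.4\ \mathrm{m/s})^2}{2 \times 13.7\ \mathrm{m}} \approx 292\ \mathrm{m/s^2} \approx 30\,g
\]

An average of roughly 30 g is entirely consistent with instrumented peaks of 35 Gs.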
Stapp relocated to Holloman Air Force Base in 1953. There, in collaboration with the Northrop Corporation, he developed the advanced Sonic Wind No. 1 sled, equipped with a rocket propulsion section and a water brake system that stopped the sled with a scoop that dug into dams of water set along the track.
On March 19, 1954, Stapp, strapped into a jet pilot's seat replica, made his first ride on Sonic Wind No. 1. Six rockets propelled the sled to a speed of 421 mph, setting a land speed record.
Subsequently, more objectives, such as testing human tolerance to wind blasts, were integrated into the experiments. On August 20, 1954, the sled reached a remarkable speed of 502 mph, with Stapp exposed to the elements and surviving the ordeal.
Stapp's most notable run took place on December 10, 1954. With nine rockets behind him, he reached a speed of 632 mph in just five seconds. The 1.4-second deceleration subjected him to over 40 Gs. Emergency personnel had to remove him from the seat due to severe injuries, including burst blood vessels in his eyes, temporary blindness, cracked ribs, broken wrists, and circulatory and respiratory issues. His remarkable feat earned him the title of the "Fastest Man on Earth" and a spot on the cover of Time Magazine in September 1955.
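Simple averages put these numbers in perspective (assuming the sled went from 632 mph, about 282.5 m/s, to a dead stop in the stated 1.4 seconds; the 40+ G figure was the instantaneous peak recorded, well above the average):

\[
\bar a_{\text{accel}} = \frac{282.5\ \mathrm{m/s}}{5\ \mathrm{s}} \approx 56.5\ \mathrm{m/s^2} \approx 5.8\,g,
\qquad
\bar a_{\text{decel}} = \frac{282.5\ \mathrm{m/s}}{1.4\ \mathrm{s}} \approx 202\ \mathrm{m/s^2} \approx 21\,g
\]

Stopping was thus roughly three and a half times more violent than the rocket boost, and the water brake's uneven bite pushed the momentary load to nearly twice that 21 g average.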
Stapp soon shifted his focus to the Manhigh Project, collaborating with Dr. David Simons to study human endurance at the edge of space through high-altitude balloon flights. This research was crucial in ensuring astronauts could withstand the G-forces during launch and reentry and the rigors of space travel.
In 1957, Stapp and his team launched Simons to a record-breaking altitude of 101,000 feet in a pressurized gondola as part of Project Manhigh, on a flight that lasted 32 hours. Simons was celebrated on the cover of Life magazine and hailed as "The First Space Man."
With the Soviet Union's launch of Sputnik in 1957, space exploration became a national priority, and the newly formed NASA took over research in this field. Stapp assisted NASA in selecting astronauts for Project Mercury, the United States' first manned space program.
Stapp returned to the Air Force to conduct his final high-altitude balloon experiment, Project Excelsior, with the aim of developing an ejection system for rocket-powered jets at high altitudes. On August 16, 1960, test pilot Joseph Kittinger, who had also worked on Project Manhigh, ascended to 102,800 feet in just 90 minutes, breaking records set by Simons in 1957. Kittinger's leap from the balloon resulted in the longest free fall and parachute jump in history, covering nearly 20 miles in just 13 minutes and 45 seconds.
Stapp's pioneering work laid the foundation for advancements in aerospace safety. He developed safety innovations, including the "side saddle" harness, designed to protect paratroopers during rough landings.
He later also developed a keen interest in the potential applications of his research in the realm of automobile safety and was a prominent advocate for the use of seat belts.
Dr. John Paul Stapp passed away on November 13, 1999, at the age of 89, leaving behind a profound legacy in aerospace medicine and safety.
Photo credits - NASA, Air Force Test Center History Office, U.S. Air Force, Air Force Research Laboratory, Department of Defense, DoD News Channel, Wright Air Development Division - Aerospace Medical Division
The Bermuda Triangle: Unraveling the Mystery
The Bermuda Triangle, also known as the "Devil's Triangle," is an area in the Atlantic Ocean that has fascinated people for years due to stories of mysterious disappearances and strange occurrences. It's roughly defined by Miami, Bermuda, and Puerto Rico. While some say it's all sensationalism and coincidence, the mystery endures.
The legend of the Bermuda Triangle took shape in the early 20th century but gained wide attention in the mid-20th century. Vincent Gaddis coined the term "Bermuda Triangle" in a 1964 magazine article, and Charles Berlitz popularized it in his 1974 book, "The Bermuda Triangle," describing a place where ships and planes vanished mysteriously.
The 1970s saw the peak of the Bermuda Triangle's popularity in popular culture. Documentaries, books, and movies fueled the mystery, often emphasizing supernatural aspects like ghost ships, time warps, and extraterrestrial encounters.
One famous incident is the 1945 disappearance of Flight 19, five U.S. Navy bombers that vanished during a training mission from Fort Lauderdale, Florida. The search and rescue mission also lost a plane, fueling the Bermuda Triangle legend.
In 1963, the SS Marine Sulphur Queen disappeared with its 39 crew members while in the Bermuda Triangle. No wreckage was ever found. Similarly, in 1976, the tanker SS Sylvia L. Ossa vanished with 37 crew members on board while traveling from Tampa, Florida, to Boston, Massachusetts.
Various theories attempt to explain the Bermuda Triangle's mysteries, including extraterrestrial involvement, the influence of Atlantis, and vortices transporting objects to other dimensions. However, there is no scientific evidence to support these theories.
Despite its reputation as a paranormal vortex, others have offered more rational explanations, including:
Human Error: Navigational mistakes, miscommunication, and poor decisions by pilots and sailors contribute to accidents.
Weather: Sudden and severe weather changes, like waterspouts and hurricanes, pose risks.
Methane Hydrate: Some suggest that deposits of methane hydrate, an ice-like solid that traps natural gas, lie on the ocean floor and could release large plumes of gas bubbles, causing vessels to lose buoyancy (see the sketch after this list).
Compass Variations: Magnetic anomalies in the area can make compasses behave erratically.
Traffic Volume: The Bermuda Triangle is heavily trafficked, and more traffic means more accidents statistically.
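The physics behind the methane hydrate theory is straightforward, though whether such releases actually occur at ship-sinking scale is unproven. A vessel floats only while its average density is below that of the surrounding water, so if rising gas bubbles aerate the water, the buoyant force falls in proportion to the gas fraction f:

\[
F_b = \rho_{\text{eff}}\, g V, \qquad \rho_{\text{eff}} = (1 - f)\,\rho_w
\]

A ship of average density ρ_s sinks once f exceeds 1 − ρ_s/ρ_w. Taking an illustrative value of ρ_s = 0.6 ρ_w (an assumption, not a measured figure), roughly 40% of the surrounding water column would have to be displaced by gas, which gives a sense of how extreme such an event would need to be.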
Skeptics argue that selective reporting, sensationalism, and disregard for statistical norms fuel the Bermuda Triangle mystery. Most ships and planes pass through without incident. Modern technology, like GPS and satellite tracking, has reduced unexplained disappearances, which often occurred when such tech was in its infancy.
Investigations into famous Bermuda Triangle incidents have yielded plausible explanations. In 1991, for example, salvage hunters found the wreckage of five Avenger bombers off the coast of Florida and initially identified them as Flight 19, although the identification was later disputed and the flight's exact fate remains unclear.
The Bermuda Triangle remains a captivating mystery. Whether it's the thrill of the unknown, fascination with historical enigmas, or the enduring allure of adventure, this region continues to spark curiosity about the unexplained.
Photo credits - NASA, Luke Hancock, Athanasius Kircher, Alphaios, USGS, U.S. Navy @ wiki commons
Nino Souza, Sean Johnston, Alex Andrews, Brooke Vandy, Korie Jenkins, Earl Andre Roca, Deklanşör 67, Isidoro Esposito, Johannes Plenio, Pixabay, Kamran Norollahi, Kelly, Maël BALLAND, Ruvim Miksanskiy @ pexels.com
Music theme - Trumpets in Your Ears by Wowa & Chris Rede @unminus.com
The Eternal Emperor: Qin Shi Huang's Tomb and His Terracotta Army
Located in the Shaanxi province of China, the Tomb of Qin Shi Huang is a place shrouded in mystery, history, and wonder. It is here that the world-famous Terracotta Army was discovered, an archaeological marvel that has captured the imaginations of people worldwide.
This vast underground necropolis stands as a testament to the power and ambition of Qin Shi Huang, the first Emperor of China, and offers a window into the ancient world of China's imperial past.
To understand the significance of the Tomb of Qin Shi Huang, one must first grasp his remarkable life and achievements. Born as Ying Zheng in 259 BC, he ascended the throne of the Qin State at the tender age of 13.
With a visionary zeal, he sought to unify the various warring states of China under a single rule. After decades of warfare, Ying Zheng succeeded in this monumental task, proclaiming himself Qin Shi Huang, the First Sovereign Emperor, in 221 BC.
Qin Shi Huang's reign marked a pivotal point in Chinese history. He standardized currency, measurements, and writing systems, creating a cohesive nation-state.
He also sought to govern the country from a single center and to curb powerful local rulers. To that end, he tore down the walls that divided the former warring states and had new walls built along the northern border, connecting the older fortifications.
This was meant to protect China from the Xiongnu people in the north. The barriers were not intended to be permanent, but ultimately formed part of what became the Great Wall of China, which was expanded, repaired and maintained by subsequent dynasties.
From the moment he took the throne, Qin Shi Huang was preoccupied with immortality. It is said that he spent much of his rule searching for the Elixir of Life, believing it would grant him immortality, but these efforts were in vain.
Ultimately, the Emperor turned his attention to his own tomb, envisioning an underground palace that would rival the grandeur of his earthly empire. He spared no expense, conscripting hundreds of thousands of laborers to construct his mausoleum over the course of nearly four decades.
The most famous discovery within the Tomb of Qin Shi Huang is the Terracotta Army, a vast collection of life-sized statues meticulously crafted to accompany the Emperor in the afterlife. This astonishing find was made in 1974, when local farmers digging a well stumbled upon fragments of clay figures.
The Terracotta Army comprises thousands of statues, including infantry, cavalry, chariots, and officers, each possessing distinct facial features and expressions. These remarkable sculptures were created to serve and protect the Emperor in the next world, mirroring the military organization of his time.
The sheer scale of the Terracotta Army is staggering. It is estimated that over 8,000 soldiers, 130 chariots, and more than 500 horses have been discovered, with much of the site still left to be excavated. The figures are intricately detailed, with individualized clothing, hairstyles, and weapons.
The purpose of this massive clay army was to serve as an imperial guard and escort for Qin Shi Huang in the afterlife. This reflects the ancient Chinese belief in an afterlife similar to the earthly realm, where one's status and needs continued beyond death.
The meticulous construction of the Terracotta Army is a testament to the sophistication of Qin Dynasty craftsmanship. It also speaks to the autocratic nature of Qin Shi Huang's rule, as the monumental effort and resources devoted to his tomb reflect his absolute power.
The ongoing excavation of the Tomb of Qin Shi Huang has revealed numerous other fascinating finds, and ancient accounts claim the burial chamber itself is protected by booby traps designed to deter intruders.
The preservation of the Terracotta Army and the Tomb of Qin Shi Huang presents unique challenges. Exposure to air and moisture can cause rapid deterioration of the clay figures, leading to a loss of detail and color. To address this, extensive efforts have been made to protect and conserve these priceless artifacts.
In recent years, sophisticated techniques such as 3D scanning and digital reconstruction have been employed to document and study the figures. This not only helps preserve the original statues but also provides researchers with a valuable resource for further analysis and understanding of the Terracotta Army.
The discovery of the Terracotta Army has transformed the site into a major tourist attraction and a symbol of Chinese cultural heritage. Millions of visitors travel to witness this awe-inspiring archaeological marvel.
The Tomb of Qin Shi Huang and the Terracotta Army it houses stand as a testament to the grandeur and ambition of China's first Emperor.
Photo credits - Aaron Zhu, Deadkid dk, Gary Todd, Maros, Zossolino, Jiisi, BrokenSphere, 画中的日记, Siestecita, Francisco Diez, Jorge Láscar, Philg88, Ingo.staudacher @ wiki commons
Music theme - Trumpets in Your Ears by Wowa & Chris Rede @unminus.com
The Great Emu War of Australia: A Feathered Fiasco
The Great Emu War of Australia was an unusual and, some say, ill-advised battle between the government and the big birds of Australia. Australia is known for its unique wildlife, which includes the emu, a large flightless bird known for its speed and agility.
To understand why the government took issue with these animals, we must go back to World War I. After the war, many Australian veterans were given land to farm in Western Australia as part of a government initiative to reward their service. When the Great Depression hit in 1929, these farmers were encouraged to increase their wheat crops. The government promised subsidies for the wheat, though it ultimately failed to deliver on this promise.
The farmers were also caught in a tough market with falling wheat prices. In 1932, they faced an additional and unusual problem: a massive emu population explosion. Thousands of emus, migrating inland after their breeding season, descended upon the farmlands of Western Australia, causing extensive damage to crops and infrastructure.
Faced with the emu onslaught, the farmers appealed to the government for help. The emus were not only devouring their wheat and destroying fences but also proving to be quite elusive targets.
In response to the farmers’ pleas, the government decided to take action. The Minister of Defense at the time, Sir George Pearce, approved a plan to deploy military forces armed with Lewis machine guns to combat the emu menace.
On November 2, 1932, the “war” officially commenced. Major G.P.W. Meredith led two soldiers, Corporal S. McMurray and Gunner J. O’Halloran, armed with two Lewis machine guns and 10,000 rounds of ammunition to the battlefield – the farmlands of Western Australia.
Their mission was to reduce the emu population and alleviate the farmers’ woes. But the soldiers soon discovered that emus were not the easiest of foes. The birds were swift, agile, and adept at scattering in the face of danger.
When the first shots rang out, the emus scattered in all directions, making them difficult to target. Despite their best efforts, the soldiers found it challenging to kill a significant number of emus, and the birds seemed to mockingly dodge the bullets.
The initial battles proved futile for the soldiers. The Lewis machine guns jammed frequently, and the emus remained unscathed. Major Meredith reported that only a few emus were killed during the first encounters. Even the local media found the situation absurd, with one newspaper declaring that the “Emus were Invulnerable to Bullets.” The emus, it seemed, were winning the war.
After a series of unsuccessful skirmishes, the soldiers reevaluated their approach. Major Meredith decided to mount one of the Lewis machine guns on a truck, hoping to increase mobility and accuracy. This adaptation also proved ineffective as emus outran the vehicles.
As the emus continued to damage the farmers' crops, the government decided to make a second attempt to cull the big birds and sent more soldiers, including Meredith, into battle.
By the end of December 1932, the government soldiers had claimed victory, but it was a limited one. They estimated that they had killed around 1,000 emus, which was only a fraction of the estimated 20,000 emus that had plagued the farmlands. With the government’s ammunition running low and the war’s cost escalating, the operation was eventually abandoned. The emus, having endured a barrage of bullets, retreated for the time being.
The success of the operation was questionable and highlighted the absurdity of using military force against birds. But it also offered some valuable lessons: the event shows how government decisions can be poorly conceived or executed, leading to absurd and costly consequences.
The Emu War also underscores the importance of wildlife conservation and sustainable farming practices. Today, Australia takes more measured approaches to manage conflicts between wildlife and agriculture.
J. Robert Oppenheimer: The Complex Legacy of the Father of the Atomic Bomb
J. Robert Oppenheimer was a brilliant American physicist who left an indelible mark on the course of history during the 20th century. Born on April 22, 1904, in New York City, Oppenheimer displayed exceptional intelligence from a young age. He pursued his education at prestigious institutions, including Harvard University and the University of Cambridge, where he made significant contributions to the field of theoretical physics.
His work in quantum mechanics and astrophysics earned him recognition as one of the most promising young physicists of his time. By the 1930s, Oppenheimer's reputation as a brilliant scientist had already been firmly established, and he had become an influential figure in the world of academia.
However, it was during World War II that Oppenheimer's life took a dramatic turn. In 1942, he was appointed as the scientific director of the Manhattan Project, a top-secret government initiative aimed at developing the world's first atomic bomb. Leading a diverse team of scientists, Oppenheimer demonstrated exceptional organizational skills and scientific acumen.
The pinnacle of the Manhattan Project came on July 16, 1945, with the successful test of the atomic bomb, code-named "Trinity," in New Mexico. Witnessing the immense power of this devastating weapon, Oppenheimer famously quoted the Bhagavad Gita, expressing mixed emotions of awe and sorrow.
While the atomic bombings of Hiroshima and Nagasaki brought an end to World War II, they also raised profound ethical questions about the use of such destructive weaponry. Oppenheimer grappled with the moral implications of his creation and became an advocate for international control of atomic weapons.
However, his associations with communist sympathizers during the politically charged post-war era led to the revocation of his security clearance. Despite facing professional and personal setbacks, Oppenheimer continued to contribute to the scientific community and academia.
In his later years, he served as the director of the Institute for Advanced Study in Princeton, where he mentored and inspired future generations of physicists.
Oppenheimer remains a controversial figure, celebrated for his scientific brilliance and criticized for his involvement in the development of nuclear weapons.
The Dark Tale: The Murder and Possible Cannibalism of Johan de Witt
The murder of the de Witt brothers, Cornelius and Johan de Witt, is a tragic and infamous event that occurred during the 17th century in the Netherlands. The brothers were prominent politicians and statesmen who played crucial roles in the Dutch Republic. Their deaths were a result of political turmoil and mob violence during a turbulent period known as the Rampjaar or "Disaster Year."
Cornelius and Johan de Witt were born into a wealthy and influential family. Johan, the younger brother, served as Grand Pensionary (raadpensionaris) of Holland, effectively the head of government of the Dutch Republic, while Cornelius held various political posts, including that of ruwaard (bailiff) of Putten.
In 1672, the Dutch Republic faced a dire crisis as it was invaded by France, England, and several German states. The conflict, known as the Franco-Dutch War, led to widespread panic and instability within the country. The de Witt brothers' policies and efforts to maintain peace and diplomatic solutions were met with criticism and anger from those who favored a more aggressive stance.
On August 20, 1672, amidst the chaos of the Rampjaar, the de Witt brothers were attacked by a mob of Orangists, supporters of the House of Orange who favored a strong monarchy. The mob broke into the prison in The Hague where Cornelius was being held and brutally lynched the brothers; according to contemporary accounts, the bodies were mutilated and parts were eaten by members of the mob, the grisly detail behind the tales of cannibalism.
The murder of the de Witt brothers shocked the nation and reverberated throughout Europe. It was a poignant symbol of the instability and violence that characterized the time. Their deaths also marked a turning point in Dutch politics, as the House of Orange gained more power and the republic began to shift towards a more monarchical system.
The memory of the de Witt brothers has endured through history, and they are remembered both for their political contributions and the tragic manner of their deaths. The events surrounding their murder serve as a cautionary tale about the dangers of mob violence and political extremism, while their lives stand as a testament to the complexities of governance and the challenges faced by leaders during times of crisis.
The Dancing Mania: Unraveling the Mystery of Mass Hysteria
Throughout history, there have been strange occurrences that have baffled and captivated societies across the globe. One such enigmatic phenomenon is the dancing manias, also known as dancing plagues or choreomania, which emerged in various European regions during the late Middle Ages and Early Modern period.
The dancing manias first appeared in the 14th century and reached their peak in the 16th and 17th centuries. These episodes involved groups of people, often numbering in the hundreds or even thousands, who would start dancing uncontrollably in the streets. These dancing outbreaks were not lighthearted or celebratory, but rather bizarre and frenzied, lasting for days, weeks, or even months.
During the dancing manias, those affected were unable to control their movements and continued to dance despite exhaustion, dehydration, and physical injuries. These episodes often led to severe consequences, with some individuals even dancing themselves to death.
Numerous theories have been proposed to explain the dancing manias, but no single cause has been definitively established. Some scholars attribute the outbreaks to religious fervor or spiritual possession, suggesting that the dancers were engaged in a form of ecstatic ritual or seeking divine intervention. Others believe that the dancing manias were a form of mass hysteria, triggered by social and psychological factors, such as stress, anxiety, or collective delusion.
Some researchers have also pointed to possible medical explanations, including neurological disorders, such as epilepsy or encephalitis, or toxicological causes, such as ingestion of hallucinogenic substances or ergot poisoning from contaminated rye.
Regardless of the underlying cause, the dancing manias left a lasting impact on European society, influencing art, literature, and medical discourse of the time. These episodes were often interpreted as divine punishment, and religious authorities sometimes attempted to exorcise the "possessed" dancers through rituals and prayers.
The dancing manias continue to intrigue historians, psychologists, and medical researchers, sparking debates about the nature of mass hysteria and the complex interplay between psychological, social, and cultural factors in shaping human behavior.
Photo credits: Wellcome Library, London, Arnoldius, Radek Kucharski
Music theme - Trumpets in Your Ears by Wowa & Chris Rede @unminus.com
J.P. Morgan's Fateful Decision: Canceling His Trip on the Titanic
In April 1912, the Titanic, a luxurious and supposedly "unsinkable" ocean liner, embarked on its maiden voyage from Southampton, England, to New York City, USA. Among the many passengers scheduled to travel on this historic voyage was the renowned American financier, J.P. Morgan.
J.P. Morgan was a prominent figure in the world of finance, known for his immense wealth and influence. He had intended to travel on the Titanic, but at the last moment he canceled his trip and opted to stay in Europe. The exact reason for his decision remains a matter of speculation, with some suggesting health issues and others pressing business matters as the cause.
Morgan's decision to cancel his voyage proved to be fortuitous, as the Titanic struck an iceberg on the night of April 14, 1912, and sank, killing more than 1,500 people in one of the most infamous maritime disasters in history. J.P. Morgan's last-minute change of plans spared him from the tragedy that unfolded on the Titanic's ill-fated voyage.
The Mau Mau Uprising: Kenya's Struggle for Independence
The Mau Mau Uprising, also known as the Mau Mau Rebellion or simply Mau Mau, was a significant armed resistance movement that took place in Kenya during the 1950s.
The seeds of discontent were sown in the early 20th century when the British arrived in Kenya, seeking fertile land for European settlers. The indigenous people were displaced from their ancestral lands, resulting in a loss of cultural identity and economic stability.
The divide between the settlers and the Africans grew wider as the British government enacted policies favoring the minority white population and marginalizing the African majority. By the 1940s, the Kikuyu, the largest ethnic group in Kenya, had started organizing themselves to challenge British rule.
The uprising officially began in 1952 when the Mau Mau fighters launched coordinated attacks against British officials, settlers, and loyalist Kikuyu members who opposed their cause. The rebels' tactics included sabotage, raids on settlements, and acts of violence. They were particularly effective at using guerrilla warfare, taking advantage of Kenya's dense forests and mountainous terrain, making it difficult for the British security forces to track them down.
In response, the British authorities declared a state of emergency, and troops were dispatched to quell the rebellion. The colonial administration launched a harsh crackdown on suspected Mau Mau sympathizers, leading to widespread arrests, detentions, and extrajudicial killings.
The violence and brutality on both sides escalated, resulting in a cycle of retribution and vengeance. Despite the heavy-handed response by the British authorities, the Mau Mau movement gained popular support among the African population, who saw it as a symbol of resistance against colonial oppression.
In 1960, the British government conceded to pressure and initiated talks with African leaders, leading to the Lancaster House Conference in London. These negotiations paved the way for Kenya's independence, which was achieved on December 12, 1963.
The Mau Mau Uprising was a watershed moment in Kenya's history, marking the end of British colonial rule and the beginning of a new era of self-determination.
Photo credits - JayRiley254, Dr Stima, Jerome KL, Rotsee2 @ wikicommons; Royal Prince Media, Kureng Workx, Engin Akyurt, Antony Trivet @ pexels.com
Nikola Tesla: His Genius and His Eccentricities
Nikola Tesla, born on July 10, 1856, in Smiljan, Croatia, was a visionary inventor, electrical engineer, and futurist who left an indelible mark on the world of science and technology. He is often referred to as the "father of the electrical age" and is renowned for his numerous groundbreaking inventions and innovations.
Tesla's early life showed signs of his remarkable intellect and curiosity. After studying engineering in Graz and Prague, Tesla moved to Budapest and later to Paris, where he worked for the Continental Edison Company.
In 1884, Tesla moved to the United States to work for Thomas Edison, a prominent inventor and entrepreneur. He quickly proved himself as an ingenious electrical engineer and made significant improvements to Edison's direct current (DC) systems. However, a fundamental difference in their approaches to electricity, with Tesla advocating for alternating current (AC), led to a falling out between the two inventors.
Tesla's development of the AC system revolutionized the generation and transmission of electricity. His AC motor design, patented in 1888, laid the foundation for the modern electrical power distribution systems used worldwide. The AC system allowed for the transmission of electrical power over long distances more efficiently than the DC system, making electricity accessible to a broader population.
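The efficiency argument comes down to resistive losses in the transmission lines. For a fixed delivered power P over a line of resistance R, the current drawn is I = P/V, so the power lost as heat is (a standard result from circuit theory, not specific to Tesla's patents):

\[
P_{\text{loss}} = I^2 R = \frac{P^2 R}{V^2}
\]

Doubling the transmission voltage cuts resistive losses by a factor of four. AC prevailed largely because transformers, which work only with alternating current, made it practical to step voltage up for long-distance transmission and back down for safe local distribution.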
Among Tesla's most iconic inventions is the Tesla coil, patented in 1891. The Tesla coil is a resonant transformer that produces high-frequency, high-voltage electricity. It made possible spectacular electrical displays and is still used today in educational demonstrations and entertainment shows.
Another extraordinary invention of Tesla's was the wireless transmission of electricity. He envisioned a world where electrical power could be transmitted through the air without the need for cables. Although he successfully demonstrated wireless power transfer over short distances, his ambitious project of building a global communication and power transmission tower known as the Wardenclyffe Tower faced financial challenges and was never fully realized.
Throughout his life, Tesla was known for his eccentric behavior and peculiar habits. He had an intense fear of germs and was meticulous about cleanliness. He was also known to work long hours without rest, sometimes even forgoing meals.
Despite his brilliance, Tesla struggled with financial difficulties, and much of his later life was spent in relative obscurity and poverty. He continued to work on various projects and ideas, but many of them remained unrealized. Tesla died on January 7, 1943, in New York City at the age of 86.
In the decades following his death, Nikola Tesla's contributions to science and technology received renewed recognition and admiration. His vision of a world powered by clean and accessible electricity has become a reality, with AC power grids providing energy to homes and industries worldwide.
Tesla's legacy continues to inspire generations of scientists, engineers, inventors and business people.
Photo Credits - Fizped, Lošmi, Nikolatesla1855, Zátonyi Sándor, Duncan.Hull, Ermell, Chetvorno, Senator88 @ wikicommons
Killian Eon, Pixabay, Ricky Esquivel, Magda Ehlers, Dmitry Marchenkov, Murry Lee, cottonbro studio, jeff mathew, Pueblo Santo films LTD @ pexels.com