
FYI July 29, 2017


1864 – American Civil War: Confederate spy Belle Boyd is arrested by Union troops and detained at the Old Capitol Prison in Washington, D.C.
Isabella Maria Boyd (May 4, 1844[1] – June 11, 1900[2]), best known as Belle Boyd, as well as Cleopatra of the Secession and Siren of the Shenandoah, was a Confederate spy in the American Civil War. She operated from her father’s hotel in Front Royal, Virginia, and provided valuable information to Confederate General Stonewall Jackson in 1862.

Early life
Isabella Maria Boyd was born on May 4, 1844, in Martinsburg, Virginia (now part of West Virginia). She was the eldest child of Benjamin Reed and Mary Rebecca (Glenn) Boyd. Boyd described her childhood as idyllic: the carefree life of a reckless tomboy who climbed trees, raced through the woods, and dominated her brothers, sisters, and cousins. Despite her family’s lack of money, Boyd received a good education. After some preliminary schooling, she attended the Mount Washington Female College in Baltimore, Maryland.
Southern spy

Boyd’s espionage career began by chance. According to her 1866 account, on July 4, 1861, a band of Union army soldiers heard she had Confederate flags in her room and came to investigate. They hung a Union flag outside her home. That was provocation enough, but when one of them cursed at her mother, she was enraged. Boyd pulled out a pistol and shot and killed the man. A board of inquiry exonerated her, but sentries were posted around the house and officers kept close track of her activities. She profited from this enforced familiarity, charming at least one of the officers, Captain Daniel Keily.[3][4] “To him,” she wrote later, “I am indebted for some very remarkable effusions, some withered flowers, and a great deal of important information.”[5] Boyd conveyed those secrets to Confederate officers via her slave, Eliza Hopewell, who carried the messages in a hollowed-out watch case. On her first attempt at spying, she was caught and told she could be sentenced to death, but was not. Undeterred, she realized she needed to find a better way to communicate.[6]

One evening in mid-May 1862, Union Army General James Shields and his staff gathered in the parlor of the local hotel. Boyd hid in the closet in the room, eavesdropping through a knothole she enlarged in the door. She learned that Shields had been ordered east from Front Royal, Virginia. That night, Boyd rode through Union lines, using false papers to bluff her way past the sentries, and reported the news to Colonel Turner Ashby, who was scouting for the Confederates. She then returned to town. When the Confederates advanced on Front Royal on May 23, Boyd ran to greet Stonewall Jackson’s men, avoiding enemy fire that put bullet holes in her skirt. She urged an officer to inform Jackson that “the Yankee force is very small. Tell him to charge right down and he will catch them all.” Jackson did, and that evening penned a note of gratitude to her: “I thank you, for myself and for the army, for the immense service that you have rendered your country today.” For her contributions, she was awarded the Southern Cross of Honor, and Jackson gave her the honorary positions of captain and aide-de-camp.[7]

After her lover gave her up, Belle Boyd was arrested for the first time on July 29, 1862, and brought to the Old Capitol Prison in Washington, D.C., the next day.[8] An inquiry was held on August 7, 1862, concerning violations of orders that Boyd be kept in close custody.[9] Boyd was held for a month before being released on August 29, 1862, when she was exchanged at Fort Monroe.[10] She was arrested again in June 1863, but was released after contracting typhoid fever.[11]

In March 1864, she attempted to travel to England but was intercepted by a Union blockade and sent to Canada.[12] There she met Union naval officer Samuel Wylde Hardinge, and the two later married in England.[13] They had one child, a daughter. After Hardinge’s death in 1866, Boyd became an actress in England to support her daughter, then returned to the United States on November 11, 1869. She married John Swainston Hammond in New Orleans. After a divorce in 1884, Boyd married Nathaniel Rue High in 1885. A year later, she began touring the country giving dramatic lectures on her life as a Civil War spy.[citation needed]

Post-War years and death

Boyd published a highly fictionalized narrative of her war experiences in a two-volume book titled Belle Boyd in Camp and Prison.[14] While touring the United States (she had gone to address members of a GAR post), she died of a heart attack in Kilbourn City (now known as Wisconsin Dells), Wisconsin, on June 11, 1900. She was 56 years old. She was buried in the Spring Grove Cemetery in Wisconsin Dells, with members of the local GAR post serving as her pallbearers.[15] For years, her grave simply read:

BELLE BOYD
CONFEDERATE SPY
BORN IN VIRGINIA
DIED IN WISCONSIN AND WAS BURIED IN SPRING GROVE CEMETERY
ERECTED BY A COMRADE[16]

In pop culture
The Smiling Rebel is Harnett Kane’s 1955 novel about Belle Boyd.[17]

Her bullet-riddled handbag was the featured artifact on an episode of Legends of the Hidden Temple.

Belle Boyd is a main character in Cherie Priest’s 2010 steampunk novel Clementine.

She Wouldn’t Surrender is James Kendricks’ 1960 novel for Monarch Books about Belle Boyd.



1914 – Irwin Corey, American actor and activist (d. 2017)
“Professor” Irwin Corey (July 29, 1914 – February 6, 2017) was an American stand-up comic, film actor and activist, often billed as The World’s Foremost Authority. He introduced his unscripted, improvisational style of stand-up comedy at the San Francisco club, the hungry i. Lenny Bruce described Corey as “one of the most brilliant comedians of all time”.[2]

Biography
Corey was born on July 29, 1914 in Brooklyn, New York.[3] Poverty-stricken after his father deserted the family, his mother was forced to place him and his five siblings in the Hebrew Orphan Asylum of New York,[4] where Corey remained until his early teens. He then rode in boxcars out to California, and enrolled himself at Belmont High School in Los Angeles.[3] During the Great Depression he worked for the Civilian Conservation Corps and, while working his way back East, became a featherweight Golden Gloves boxing champion.[5]

Corey supported Communist/Socialist left-wing politics.[6] He appeared in support of Cuban children, Mumia Abu-Jamal, and the American Communist Party, and was blacklisted in the 1950s, the effects of which he stated lingered throughout his life. (Corey never returned to Late Night with David Letterman after his first appearance in 1982, which he claimed was a result of the blacklist still being in effect.[7]) During the 1960 election, Corey campaigned for president on Hugh Hefner’s Playboy ticket.[6] During the 2016 Democratic Party presidential primaries, Corey endorsed Vermont United States Senator Bernie Sanders for the nomination and presidency.[8] Corey was a frequent guest on The Tonight Show hosted by Johnny Carson during the late 1960s and early 1970s.[5]

When the publicity-shy Thomas Pynchon won the National Book Award Fiction Citation for Gravity’s Rainbow in 1974, he asked Corey to accept it on his behalf.[9] The New York Times described the resulting speech as “…a series of bad jokes and mangled syntax which left some people roaring with laughter and others perplexed.”[9]

In the Robert A. Heinlein science fiction novel Friday, the eponymous heroine says

At one time there really was a man known as “the World’s Greatest Authority.” I ran across him in trying to nail down one of the many silly questions that kept coming at me from odd sources. Like this: Set your terminal to “research.” Punch parameters in succession “North American culture,” “English-speaking,” “mid-twentieth century,” “comedians,” “the World’s Greatest Authority.” The answer you can expect is “Professor Irwin Corey.” You’ll find his routines timeless humor.[10]

For an October 2011 interview,[3] Corey invited a New York Times reporter to visit his 1840 carriage house on East 36th Street. Corey estimated its resale value at $3.5 million. He said that, when not performing, he panhandled for change from motorists exiting the Queens–Midtown Tunnel. Every few months, he told the interviewer, he donated the money to a group that purchased medical supplies for Cuban children. He said of the drivers who supplied the cash, “I don’t tell them where the money’s going, and I’m sure they don’t care.” Irvin Arthur, Corey’s agent for half a century, assured the reporter that Corey did not need the money for himself. “This is not about money,” Arthur said. “For Irwin, this is an extension of his performing.”[3] In his memoir, Phoning Home, Jacob M. Appel cites a personal encounter with Corey on a street in New York City as the basis for his novel, The Man Who Wouldn’t Stand Up.[11]

Career
Comedy

In 1938 Corey returned to New York, where he got a job writing and performing in Pins and Needles, a musical comedy revue about a union organizer in the “garment district”.[12] He claimed that he was fired from this job for his union organizing activities. Five years later he was working in New Faces of 1943 and appearing at the Village Vanguard, doing his stand-up comedy routine. He was drafted during World War II, but was discharged after six months when, by his own account, he convinced an Army psychiatrist that he was a homosexual.[13]

From the late 1940s he cultivated his “Professor” character. Dressed in seedy formal wear and sneakers, with his bushy hair sprouting in all directions, Corey would amble on stage in a preoccupied manner, then begin his monologue with “However …” He created a new style of double-talk comedy; instead of making up nonsense words like “krelman” and “trilloweg”, like double-talker Al Kelly, the Professor would season his speech with many long and florid, but authentic, words.[12] The Professor would then launch into observations about anything under the sun, seldom actually making sense.

However … we all know that protocol takes precedence over procedures. This parliamentary point of order based on the state of inertia of developing a centrifugal force issued as a catalyst rather than as a catalytic agent, and hastens a change reaction and remains an indigenous brier to its inception.[12] This is a focal point used as a tangent so the bile is excreted through the panaceas.[4]

Changing topics suddenly, he would wander around the stage, pontificating all the while. His quick wit allowed him to hold his own against the most stubborn straight man, heckler or interviewer. One fan of Corey’s comedy, despite their radically different politics, was Ayn Rand.[14] Theatre critic Kenneth Tynan wrote of the Professor in The New Yorker, “Corey is a cultural clown, a parody of literacy, a travesty of all that our civilization holds dear, and one of the funniest grotesques in America. He is Chaplin’s tramp with a college education”.[15]

In 1975, Corey gave a typically long-winded, nonsensical performance in New York City for journalists waiting for the Rolling Stones to announce the band’s 1975 Tour of the Americas. The press was still listening to Corey ramble on when they finally noticed that the Stones were playing “Brown Sugar” on a flatbed truck driving down Fifth Avenue.
Broadway

In 1951 Corey appeared as “Abou Ben Atom”, the Genie, in the cult flop Broadway musical Flahooley along with Yma Sumac, the Bil and Cora Baird Marionettes and Barbara Cook (in her Broadway debut). Corey’s performance of “Springtime Cometh” can be heard on the show’s original cast album.[12]

Film and television
Corey appeared occasionally in 1950s television as a character actor. He appeared in an episode of The Phil Silvers Show titled “Bilko’s Grand Hotel”, in which Corey plays an unkempt Bowery bum being passed off as a hotelier by Sgt. Bilko.[12] The Professor was a frequent guest comic on variety shows and a guest panelist on game shows during the 1960s and 1970s.[16]

Corey became so synonymous with comic erudition that, when a Providence, Rhode Island television station, WJAR-TV, wanted a spokesman to explain changes in network affiliations,[when?] Corey got the job. Lecturing with pointer in hand, Corey manipulated magnetic signs to demonstrate how television schedules would be disrupted. By the end of the commercial, the visual aids were in shambles and the professor had meandered from his original point. Corey would do the same promos for WTMJ-TV in Milwaukee in 1977 during the time that rival stations WITI and WISN-TV switched affiliations.[12]

Corey often appeared on The Steve Allen Show (1962–1964), where he would end his rambling stand-up routine with Allen and stage hands literally chasing him with a giant butterfly net.[12] He later guest-starred on The Donald O’Connor Show in 1968.[17] Corey appeared in various Broadway productions, including as a gravedigger in a production of Hamlet.[12]

In 2009, filmmaker Jordan Stone began filming what was to become an award-winning documentary film called “Irwin & Fran”.[18] Political activist and fellow stand-up comedian Dick Gregory shared some in-depth and provocative memories and Academy Award-winning actress Susan Sarandon narrated in a very personal tone.[18] The film won The People’s Film Festival Best Film Award in 2013.[18]

Personal life and death
Corey was married for 70 years to Frances Berman Corey, who died in May 2011.[3] The couple had two children, the late Margaret Davis, née Corey, an actress, and Richard Corey, a painter. Their two grandsons are Amadeo Corey and Corey Meister.[19][20] Irwin Corey turned 100 in July 2014.[21]

Corey died at the age of 102 on February 6, 2017, at his apartment in Manhattan with his son Richard at his side.[22]

Filmography
How to Commit Marriage (1969) – The Baba Ziba [23]
Fore Play (1975) – Professor Irwin Corey[23]
Car Wash (1976) – The Mad Bomber [23]
Thieves (1977) – Joe Kaminsky (reprising his stage role)[23]
Chatterbox! (1977) – Himself[23]
Fairy Tales (1979) – Dr. Eyes[23]
The Comeback Trail (1982) – Himself[23]
Stuck on You! (1982) – Judge Gabriel [23]
Crackers (1984) – Lazzarelli[23]
That’s Adequate (1989) – D.W. Godilla[23]
Jack (1996) – Poppy[23]
I’m Not Rappaport (1996) – Sol[23]
The Curse of the Jade Scorpion (2001) – Charlie[23]
Irwin & Fran (2013) – Himself[18]

John O’Nolan: Announcing Ghost 1.0
John is the founder and lead designer at Ghost. He works hard behind the scenes pushing pixels, writing code and keeping Ghost users happy. Say hi
 
George Dvorsky: FDA Considering Forcing Companies to Reduce Nicotine in Cigarettes to Non-Addictive Levels
 
By Elizabeth Van Flandern: Meet the Woman Behind New York’s 1800s School For Crooks
 
By Elizabeth Van Flandern: Meet the Other Legendary Female Aviator (who Could Drink Any Sailor Under the Table)
 
By Air Force Senior Airman Ramon A. Adelan, 407th Air Expeditionary Group: Face of Defense: A-10 Pilot Spits Fire in Fight Against ISIS

By Yolanda R. Arrington: The Future is Now: Inside the Navy’s Future Force Science & Tech Expo
 
by Dan Colman: Google Launches Free Course on Deep Learning: The Science of Teaching Computers How to Teach Themselves

FYI July 28, 2017


1866 – At the age of 18, Vinnie Ream becomes the first and youngest female artist to receive a commission from the United States government for a statue (of Abraham Lincoln).
Lavinia Ellen “Vinnie” Ream Hoxie (September 25, 1847 – November 20, 1914) was an American sculptor. Her most famous work is the statue of Abraham Lincoln in the U.S. Capitol rotunda.[1]


Early life

Ream was born September 25, 1847, in a log cabin in Madison, Wisconsin, as Lavinia Ellen Ream. She was the youngest daughter of Lavinia and Robert Ream. Robert Ream was a surveyor and a Wisconsin Territory civil servant. Her mother was a McDonald of Scottish ancestry. The Reams also operated a stage coach stop, one of the first hotels in Madison, from their home. Guests slept on the floor.

Her brother Robert Ream enlisted in the Confederate army, in Arkansas, serving in Woodruff’s battery.[2]

Vinnie Ream attended Christian College in Columbia, Missouri, now known as Columbia College. A portrait of Martha Washington by Ream hangs in St. Clair Hall.[3][4]


Career

In 1861, her family moved to Washington, D.C. After her father’s health began to fail, she began working outside the home to support her family.[5] Vinnie Ream was one of the first women to be employed by the federal government, as a clerk in the dead letter office of the United States Post Office from 1862 to 1866 during the American Civil War. She sang at the E Street Baptist Church, and for the wounded at Washington, D.C. hospitals.[6] She collected materials for the Grand Sanitary Commission.[7]

In 1863, James S. Rollins introduced Ream to sculptor Clark Mills.[8] She became an apprentice in Mills’s sculpting studio the next year, at the age of seventeen.[5] In 1864, President Lincoln agreed to sit for her during morning sessions over a five-month period, and she created a bust of him.[3] During this time, Ream also began intense public relations efforts, selling photographs of herself and soliciting newspaper attention as a marketing strategy.[5]

Vinnie Ream was the youngest artist and first woman to receive a commission as an artist from the United States government for a statue. She was awarded the commission for the full-size Carrara marble statue of Lincoln by a vote of Congress on July 28, 1866, when she was 18 years old.[9] She had used her previous bust of Lincoln as her entry into the selection contest for the full-size sculpture. There was significant debate over her selection as the sculptor, however, because of concern over her inexperience and the slanderous accusations that she was a “lobbyist”, or a public woman of questionable reputation. She was notorious for her beauty and her conversational skills, which likely contributed to these accusations.[5] She worked in a studio in Room A of the basement of the Capitol.[10]

Senator Edmund G. Ross boarded with Ream’s family during the impeachment of Andrew Johnson.[11] Ross cast the decisive vote against the removal of President Johnson from office, and Ream was accused of influencing his vote. She was almost thrown out of the Capitol with her unfinished Lincoln statue, but the intervention of powerful New York sculptors prevented it.[5] Once the U.S. government had approved the plaster model, Ream traveled to Paris, Munich, Florence, then Rome, to produce a finished marble figure.[3][5] She studied with Léon Bonnat in Paris, also producing busts of Gustave Doré, Père Hyacynthe, Franz Liszt, and Giacomo Antonelli.[12] Her studio in Rome was at 45 Via de San Basile.[13] She met Georg Brandes at that time.[14][15] While in Rome, she faced controversial rumors that claimed that it was the Italian workmen and not Ream who were responsible for her successful sculpture of Lincoln.[5]

When the statue was complete, Ream returned to Washington. On January 25, 1871, her white marble statue of President Abraham Lincoln was unveiled in the United States Capitol rotunda, when Ream was only 23 years old.[16][17] She later opened a studio at 704 Broadway, New York.[18] In 1871, she exhibited at the American Institution Fair.[19][20]

She returned to Washington and opened a studio and salon at 235 Pennsylvania Avenue.[21] She was unsuccessful in her entry in the Thomas statue competition.[22] In 1875, George Armstrong Custer sat for a portrait bust.[23] In 1876, she exhibited at the Centennial Exposition.[24] In November 1877, she produced a model for a Lee statue in Richmond.[25] After lobbying William Tecumseh Sherman and Mrs. Farragut, she won a competition to sculpt a statue of Admiral David G. Farragut. Her sculpture, located at Farragut Square, Washington, D.C., was unveiled on May 28, 1878.[26] It was cast in the Washington Navy Yard.[27]

Ream married Richard L. Hoxie, of the U.S. Army Corps of Engineers, on May 28, 1878.[28][29] They had one son. Her husband was reassigned to Montgomery, Alabama, and Saint Paul, Minnesota. Finally, the Hoxies lived at 1632 K Street near Farragut Square,[30] and had a summer home at 310 South Lucas Street, Iowa City, Iowa.[31][32] Vinnie played the harp for entertainment.[26]

Her marbles, America, The West, and Miriam, were exhibited at the 1893 World’s Columbian Exposition.[33] Ream designed the first free-standing statue of a Native American, Sequoyah, to be placed in Statuary Hall at the Capitol.

She died on November 20, 1914.[17] Vinnie Ream Hoxie and her husband are buried in section three of Arlington National Cemetery, marked by her statue Sappho.[34]

Works
Sappho 1865–1870
Thaddeus Stevens 1865
America 1870
The West 1870?
Miriam 1870?
Abraham Lincoln 1871
Abraham Lincoln ca. 1870–1874
Admiral David G. Farragut (Ream statue) 1881
Edwin B. Hay 1902–06
Samuel Jordan Kirkwood 1906
Sequoyah 1912–1914

Legacy

A first-day cover stamp was issued in honor of Vinnie Ream and her work on the statue of Sequoyah, the Native American inventor of the Cherokee alphabet.

George Caleb Bingham painted her portrait twice.[35]

The town of Vinita, Oklahoma, was named in honor of Vinnie Ream.[36]

More on wiki:



1925 – Baruch Samuel Blumberg, American physician and academic, Nobel Prize laureate (d. 2011)
Baruch Samuel Blumberg (July 28, 1925 – April 5, 2011) — known as Barry Blumberg — was an American physician, geneticist, and co-recipient of the 1976 Nobel Prize in Physiology or Medicine (with Daniel Carleton Gajdusek), for his work on the hepatitis B virus while an investigator at the NIH.[2] He was President of the American Philosophical Society from 2005 until his death.

Blumberg received the Nobel Prize for “discoveries concerning new mechanisms for the origin and dissemination of infectious diseases.” Blumberg identified the hepatitis B virus, and later developed its diagnostic test and vaccine.[2][3]

Biography
Early life and education

Blumberg was born in Brooklyn, New York, the son of Ida (Simonoff) and Meyer Blumberg, a lawyer.[4][5] He first attended the Orthodox Yeshivah of Flatbush for elementary school, where he learned to read and write in Hebrew, and to study the Bible and Jewish texts in their original language. (That school also had among its students a contemporary of Blumberg, Eric Kandel, who is another recipient of the Nobel Prize in medicine.) Blumberg then attended Brooklyn’s James Madison High School, a school that Blumberg described as having high academic standards, including many teachers with Ph.Ds.[6] After moving to Far Rockaway, Queens, he transferred to Far Rockaway High School in the early 1940s, a school that also produced fellow laureates Burton Richter and Richard Feynman.[7] Blumberg served as a U.S. Navy deck officer during World War II.[2] He then attended Union College in Schenectady, New York and graduated from there with honors in 1946.[8]

Originally entering the graduate program in mathematics at Columbia University, Blumberg switched to medicine and enrolled at Columbia’s College of Physicians and Surgeons, from which he received his M.D. in 1951. He remained at Columbia Presbyterian Medical Center for the next four years, first as an intern and then as a resident. He then began graduate work in biochemistry at Balliol College, Oxford, earning his Ph.D. there in 1957 and eventually becoming the first American to serve as the college’s Master.[9]

Scientific career
Throughout the 1950s, Blumberg traveled the world taking human blood samples, to study the genetic variations in human beings, focusing on the question of why some people contract a disease in a given environment, while others do not. In 1964, while studying “yellow jaundice” (hepatitis), he discovered a surface antigen for hepatitis B in the blood of an Australian aborigine.[10] His work later demonstrated that the virus could cause liver cancer.[11] Blumberg and his team were able to develop a screening test for the hepatitis B virus, to prevent its spread in blood donations, and developed a vaccine. Blumberg later freely distributed his vaccine patent in order to promote its distribution by drug companies. Deployment of the vaccine reduced the infection rate of hepatitis B in children in China from 15% to 1% in 10 years.[12]

Blumberg became a member of the Institute of Cancer Research (ICR) of the Lankenau Hospital Research Institute in Philadelphia in 1964, which later joined the Fox Chase Cancer Center in 1974, and he held the rank of University Professor of Medicine and Anthropology at the University of Pennsylvania starting in 1977. Concurrently, he was Master of Balliol College from 1989 to 1994. He was elected a Fellow of the American Academy of Arts and Sciences in 1994.[13] From 1999 to 2002, he was also director of the NASA Astrobiology Institute at the Ames Research Center in Moffett Field, California.[14][15][16]

In 2001, Blumberg was named to the Library of Congress Scholars Council, a body of distinguished scholars that advises the Librarian of Congress. Blumberg served on the Council until his death.[17]

In November 2004, Blumberg was named Chairman of the Scientific Advisory Board of United Therapeutics Corporation,[18] a position he held until his death. As Chairman, he convened three “Conference[s] on Nanomedical and Telemedical Technology”,[19] as well as guiding the biotechnology company in the development of a broad-spectrum anti-viral medicine.

Beginning in 2005, Blumberg also served as the President of the American Philosophical Society. He had first been elected to membership in the society in 1986.[20]

In October 2010, Blumberg participated in the USA Science and Engineering Festival’s Lunch with a Laureate program, in which middle and high school students of the Greater Washington D.C., Northern Virginia and Maryland area got to engage in an informal conversation with a Nobel Prize–winning scientist over a brown-bag lunch.[21]

In an interview with the New York Times in 2002 he stated that “[Saving lives] is what drew me to medicine. There is, in Jewish thought, this idea that if you save a single life, you save the whole world”.[22]

In discussing the factors that influenced his life, Blumberg always gave credit to the mental discipline of the Jewish Talmud, and as often as possible, he attended weekly Talmud discussion classes until his death.[23]

Death
Blumberg died on April 5, 2011,[1] shortly after giving the keynote speech at the International Lunar Research Park Exploratory Workshop held at NASA Ames Research Center.[24] At the time of his death Blumberg was a Distinguished Scientist at the NASA Lunar Science Institute, located at the NASA Ames Research Center in Moffett Field, California.[25][26]

Jonathan Chernoff, the scientific director at the Fox Chase Cancer Center where Blumberg spent most of his working life said, “I think it’s fair to say that Barry prevented more cancer deaths than any person who’s ever lived.”[27] In reference to Blumberg’s discovery of the Hepatitis B vaccine, former NASA administrator Daniel Goldin said, “Our planet is an improved place as a result of Barry’s few short days in residence.”[28][29][30]

In 2011, the Library of Congress and National Aeronautics and Space Administration (NASA) announced the establishment of the Baruch S. Blumberg NASA/Library of Congress Chair in Astrobiology, a research position housed within the Library’s John W. Kluge Center, which explores the effects of astrobiology research on society. The Chair was named for Blumberg in recognition of his service to the Library of Congress Scholars Council, and his commitment to “research and dialogue between disciplines.” [31]

In 2011, in recognition of Blumberg’s long professional and personal association with the Department of Biochemistry and the Glycobiology Institute, Oxford University established the Baruch Blumberg Professorship in Virology.

Manuscript Collection
The Baruch S. Blumberg papers are held at the American Philosophical Society in Philadelphia, PA. The collection contains 458 linear feet of materials documenting the life and career of Dr. Blumberg.


By Rhett Jones: A Microsoft Font Really Did Take Pakistan’s Prime Minister Down
 
By Michael Waters: A Look Back at the Desegregation of the U.S. Military
 
By Luke Spencer: The Ghost Villages of Newfoundland
 
By Brandon Katz: Musician Michael Johnson Has Died at 72
 
via Josh Jones: People Who Swear Are More Honest Than Those Who Don’t, Finds a New University Study
“When used judiciously,”
As to the question of whether swearing betrays a lack of education and an impoverished vocabulary, we might turn to linguist, psychologist, and neuroscientist Steven Pinker, who has made a learned defense of foul language, in drily humorous talks, books, and essays. “When used judiciously,” he writes in a 2008 Harvard Brain article, “swearing can be hilarious, poignant, and uncannily descriptive.” His is an argument that relies not only on data but on philosophical reflection and literary appreciation. “It’s a fact of life that people swear,” he says, and so, it’s a fact of art. Shakespeare invented dozens of swears and was never afraid to work blue. Perhaps that’s why we find his representations of humanity so perennially honest.
 
By Kalen Bruce: 5 Real Ways to Actually Make Money Online
 
By Jennie Yabroff: Svetlana Alexievich Gives a Voice to the Women Who Served in WWII
“The idea that the women’s experiences might be of… greater interest than the men’s was downright heresy.”

Robina Asti has always loved flying. She was a commercial pilot and flight instructor, and flew for the Navy in World War II. At 92, Asti tells her story of living as a transgender woman since 1976, and her fight against the Social Security Administration (SSA) in the U.S. to be treated like any other widow. Find out how Asti’s case, taken up by Lambda Legal, succeeded in changing policy at the SSA.

Pursuit (4K) from Mike Olbinski on Vimeo.

Track Trailer
Track Trailer Blog

FYI July 27, 2017


1689 – Glorious Revolution: The Battle of Killiecrankie ends.
The Glorious Revolution, also called the Revolution of 1688, was the overthrow of King James II of England (James VII of Scotland) by a union of English Parliamentarians with the Dutch stadtholder William III, Prince of Orange. William’s successful invasion of England with a Dutch fleet and army led to his ascension to the throne as William III of England jointly with his wife, Mary II, James’s daughter, after the Declaration of Right, leading to the Bill of Rights 1689.

King James’s policies of religious tolerance after 1685 met with increasing opposition from members of leading political circles, who were troubled by the king’s Catholicism and his close ties with France. The crisis facing the king came to a head in 1688, with the birth of the king’s son, James Francis Edward Stuart, on 10 June (Julian calendar).[a] This changed the existing line of succession by displacing the heir presumptive (his daughter Mary, a Protestant and the wife of William of Orange) with young James Francis Edward as heir apparent. The establishment of a Roman Catholic dynasty in the kingdoms now seemed likely. Some Tory members of parliament worked with members of the opposition Whigs in an attempt to resolve the crisis by secretly initiating dialogue with William of Orange to come to England, outside the jurisdiction of the English Parliament.[1] Stadtholder William, the de facto head of state of the Dutch United Provinces, feared a Catholic Anglo–French alliance and had already been planning a military intervention in England.

After consolidating political and financial support, William crossed the North Sea and English Channel with a large invasion fleet in November 1688, landing at Torbay. After only two minor clashes between the two opposing armies in England, and anti-Catholic riots in several towns, James’s regime collapsed, largely because of a lack of resolve shown by the king. However, this was followed by the protracted Williamite War in Ireland and Dundee’s rising in Scotland.[b] In England’s distant American colonies, the revolution led to the collapse of the Dominion of New England and the overthrow of the Province of Maryland’s government. Following a defeat of his forces at the Battle of Reading on 9 December, James and his wife Mary fled England; James, however, returned to London for a two-week period that culminated in his final departure for France on 23 December. By threatening to withdraw his troops, William in February 1689 (New Style Julian calendar)[a] convinced a newly chosen Convention Parliament to make him and his wife joint monarchs.

The Revolution permanently ended any chance of Catholicism becoming re-established in England. For British Catholics its effects were disastrous both socially and politically: Catholics were denied the right to vote and sit in the Westminster Parliament for over a century; they were also denied commissions in the army, and the monarch was forbidden to be Catholic or to marry a Catholic, this latter prohibition remaining in force until 2015. The Revolution led to limited tolerance for Nonconformist Protestants, although it would be some time before they had full political rights. It has been argued, mainly by Whig historians, that James’s overthrow began modern English parliamentary democracy: the Bill of Rights 1689 has become one of the most important documents in the political history of Britain and never since has the monarch held absolute power.

Internationally, the Revolution was related to the War of the Grand Alliance on mainland Europe. It has been seen as the last successful invasion of England.[2] It ended all attempts by England in the Anglo-Dutch Wars of the 17th century to subdue the Dutch Republic by military force. However, the resulting economic integration and military co-operation between the English and Dutch navies shifted the dominance in world trade from the Dutch Republic to England and later to Great Britain.

The expression “Glorious Revolution” was first used by John Hampden in late 1689,[3] and is an expression that is still used by the British Parliament.[4] The Glorious Revolution is also occasionally termed the Bloodless Revolution, albeit inaccurately. The English Civil War (also known as the Great Rebellion) was still within living memory for most of the major English participants in the events of 1688, and for them, in comparison to that war (or even the Monmouth Rebellion of 1685) the deaths in the conflict of 1688 were mercifully few.

More on wiki:



1667 – Johann Bernoulli, Swiss mathematician and academic (d. 1748)
Johann Bernoulli (also known as Jean or John; 6 August [O.S. 27 July] 1667 – 1 January 1748) was a Swiss mathematician and one of the many prominent mathematicians in the Bernoulli family. He is known for his contributions to infinitesimal calculus and for educating the young Leonhard Euler.

Early life and education
Johann was born in Basel, the son of Nicolaus Bernoulli, an apothecary, and his wife, Margaretha Schonauer, and began studying medicine at Basel University. His father desired that he study business so that he might take over the family spice trade, but Johann Bernoulli did not like business and convinced his father to allow him to study medicine instead. However, Johann Bernoulli did not enjoy medicine either and began studying mathematics on the side with his older brother Jacob.[2] Throughout Johann Bernoulli’s education at Basel University the Bernoulli brothers worked together, spending much of their time studying the newly discovered infinitesimal calculus. They were among the first mathematicians not only to study and understand calculus but to apply it to various problems.[3]

Adult life
After graduating from Basel University, Johann Bernoulli moved to teach differential equations. Later, in 1694, he married Dorothea Falkner and soon after accepted a position as the professor of mathematics at the University of Groningen. At the request of his father-in-law, Johann Bernoulli began the voyage back to his home town of Basel in 1705. Just after setting out on the journey he learned of his brother’s death from tuberculosis. Johann Bernoulli had planned on becoming the professor of Greek at Basel University upon returning, but instead was able to take over as professor of mathematics, his older brother’s former position. As a student of Leibniz’s calculus, Johann Bernoulli sided with him in 1713 in the Newton–Leibniz debate over who deserved credit for the discovery of calculus. Johann Bernoulli defended Leibniz by showing that he had solved certain problems with his methods that Newton had failed to solve. Johann Bernoulli also promoted Descartes’ vortex theory over Newton’s theory of gravitation. This ultimately delayed acceptance of Newton’s theory in continental Europe.[4]

In 1724 he entered a competition sponsored by the French Académie Royale des Sciences, which posed the question:

What are the laws according to which a perfectly hard body, put into motion, moves another body of the same nature either at rest or in motion, and which it encounters either in a vacuum or in a plenum?

In defending a view previously espoused by Leibniz he found himself postulating an infinite external force required to make the body elastic by overcoming the infinite internal force making the body hard. In consequence he was disqualified for the prize, which was won by Maclaurin. However, Bernoulli’s paper was subsequently accepted in 1726 when the Académie considered papers regarding elastic bodies, for which the prize was awarded to Pierre Mazière. Bernoulli received an honourable mention in both competitions.

Private life
Although Jacob and Johann worked together before Johann graduated from Basel University, shortly after this, the two developed a jealous and competitive relationship. Johann was jealous of Jacob’s position and the two often attempted to outdo each other. After Jacob’s death Johann’s jealousy shifted toward his own talented son, Daniel. In 1738 the father–son duo nearly simultaneously published separate works on hydrodynamics. Johann Bernoulli attempted to take precedence over his son by purposely and falsely predating his work two years prior to his son’s.[5][6]

Johann married Dorothea Falkner, daughter of an Alderman of Basel. He was the father of Nicolaus II Bernoulli, Daniel Bernoulli and Johann II Bernoulli and uncle of Nicolaus I Bernoulli.

The Bernoulli brothers often worked on the same problems, but not without friction. Their most bitter dispute concerned finding the equation for the path followed by a particle from one point to another in the shortest time, if the particle is acted upon by gravity alone, a problem originally discussed by Galileo. In 1697 Jacob offered a reward for its solution. Accepting the challenge, Johann proposed the cycloid, the path of a point on a moving wheel, pointing out at the same time the relation this curve bears to the path described by a ray of light passing through strata of variable density. A protracted, bitter dispute then arose when Jacob challenged the solution and proposed his own. The dispute marked the origin of a new discipline, the calculus of variations.
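For readers curious about the mathematics behind the dispute, the brachistochrone problem can be stated compactly; the formulation below is a standard textbook statement added here for reference, not text from the original article. With the starting point at the origin and the y-axis pointing downward, the time to slide without friction along a curve y(x) to an endpoint at x = x₁ under gravity g is

\[
T[y] \;=\; \int_{0}^{x_{1}} \sqrt{\frac{1 + \big(y'(x)\big)^{2}}{2\,g\,y(x)}}\; dx ,
\]

and the curve that minimizes this time is the cycloid traced by a point on a rolling wheel,

\[
x = a(\theta - \sin\theta), \qquad y = a(1 - \cos\theta),
\]

where the constant a is fixed by the endpoint. Minimizing integrals of this kind over all candidate curves is exactly the type of problem the new calculus of variations was created to handle.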

L’Hôpital controversy
Bernoulli was hired by Guillaume de l’Hôpital for tutoring in mathematics. Bernoulli and l’Hôpital signed a contract which gave l’Hôpital the right to use Bernoulli’s discoveries as he pleased. L’Hôpital authored the first textbook on infinitesimal calculus, Analyse des Infiniment Petits pour l’Intelligence des Lignes Courbes in 1696, which mainly consisted of the work of Bernoulli, including what is now known as l’Hôpital’s rule.[7][8][9]

Subsequently, in letters to Leibniz, Varignon and others, Bernoulli complained that he had not received enough credit for his contributions, in spite of the fact that l’Hôpital acknowledged fully his debt in the preface of his book:

Je reconnais devoir beaucoup aux lumières de MM. Bernoulli, surtout à celles du jeune (Jean) présentement professeur à Groningue. Je me suis servi sans façon de leurs découvertes et de celles de M. Leibniz. C’est pourquoi je consens qu’ils en revendiquent tout ce qu’il leur plaira, me contentant de ce qu’ils voudront bien me laisser.

I recognize I owe much to the insights of the Messrs. Bernoulli, especially to those of the young (John), currently a professor in Groningen. I did unceremoniously use their discoveries, as well as those of Mr. Leibniz. For this reason I consent that they claim as much credit as they please, and will content myself with what they will agree to leave me.


By Julien K.: 4 Ways to Use an Ice Cube Tray (other Than for Ice)
 
10 Hacks To Keep Pests Out Of Your Prized Garden
 
by jessyratfink: How to Press and Dry Flowers (and Leaves!)
 
by Paige Russell: 3D Printed Bobblehead
 
Raphael Orlove: Watch When Two Americans Took On Rally Finland
 
Lindsey Adler: Smart Man John Urschel Smartly Quits Football
Baltimore Ravens lineman and math genius John Urschel is walking away from football at age 26.

Urschel, in addition to being a full-time football player, has also been pursuing a Ph.D. in mathematics from MIT. It seems the smart man has decided to do the smart thing and save his brain—and walk away after three full seasons, which makes him eligible for an NFL pension.
 
Jennings Brown: Jeff Bezos Surges Ahead of Bill Gates to Become World’s Richest Rich Guy
 
Grabient: Grab yourself a gradient

FYI July 26, 2017


1814 – The Swedish–Norwegian War begins.
The Swedish–Norwegian War, also known as the Campaign against Norway (Swedish: Fälttåget mot Norge), War with Sweden 1814 (Norwegian: Krigen med Sverige 1814), or the Norwegian War of Independence, was a war fought between Sweden and Norway in the summer of 1814. The war resulted in Norway entering into union with Sweden, but with its own constitution and parliament.

Background
Treaty of Kiel

As early as in 1812, prior to the Napoleonic invasion of Russia, the Swedish Crown Prince Charles John had entered into an agreement with Tsar Alexander I that Russia would support a Swedish attack on Norway in order to force Denmark-Norway to cede its northern part to Sweden.[1] The Swedish attack against Norway was rejected, however, and Swedish forces were instead directed against France in Central Europe. The Swedish troops were deployed against Napoleon’s forces as a result of agreements between Charles John and diplomats from the United Kingdom and Prussia, which indicated that Norway would be ceded to Sweden after France and its allies (which included Denmark-Norway) were defeated.[2]

By the Treaty of Kiel in January 1814, King Frederik VI of Denmark-Norway had to cede Norway to the King of Sweden, due to Denmark-Norway’s alliance with France, and its defeat during the later phases of the Napoleonic Wars. This treaty was however not accepted by the Norwegians.

Norwegian Constituent Assembly
Prince Christian Frederick of Denmark, heir presumptive to the thrones of Denmark and Norway and Governor-general of Norway, took the lead in the insurrection and called for a constitutional assembly. The assembly adopted the liberal constitution of 17 May and elected Christian Frederick king of an independent Norway.

As the head of the new state, Christian Frederick desperately tried to gain support from the United Kingdom, or any of the other major powers within the Sixth Coalition, in order to maintain Norway’s independence. However, the foreign diplomats gave no hope for any outside support to the Norwegians.

Armies
The Norwegian Army mustered 30,000 men, and it had taken up positions away from the border with Sweden, in the fear of being outflanked. The Norwegian navy had few vessels, and most of them were stationed at the islands of Hvaler, close to Sweden.

The Swedish Army consisted of 45,000 men, experienced and well-equipped soldiers. The Swedish Navy had a number of large vessels and a capacity for moving and landing troops.
Major Commanders

Jean Baptiste Bernadotte – former Marshal of France and heir presumptive to the Swedish throne
Magnus Fredrik Ferdinand Björnstjerna – Swedish general
Johannes Klingenberg Sejersted – Norwegian major general
Frederik Gottschalck von Haxthausen – Norwegian minister of finance and Oberhofmarschall

War
The hostilities opened on 26 July with a swift Swedish naval attack against the Norwegian gunboats at Hvaler. The Norwegian army was evacuated and the vessels managed to escape, but they did not take part in the rest of the war. The main Swedish offensive came across the border at Halden, bypassing and surrounding the fortress of Fredriksten, and then continuing north, while a second force of 6,000 soldiers landed at Kråkerøy outside of Fredrikstad. This town surrendered the next day. This was the start of a pincer movement around the main part of the Norwegian army at Rakkestad.

On the front towards Kongsvinger the forces were more evenly matched, and the Norwegian army eventually stopped the Swedish advance at Lier on 2 August, and won another victory at Matrand on 5 August. On 3 August, King Christian Frederick reached the front at Østfold and was persuaded to change his strategy and use the 6,000 men stationed at Rakkestad in a counterattack against the Swedes. The order to counterattack was given on the 5th of August, but the order was recalled a few hours later. The Norwegian forces therefore withdrew over the Glomma river at Langnes in Askim.[3] The last major battle of the war was fought on 9 August at the bridgehead at Langnes, where the Swedish forces once more were driven back.[4] Sweden then attempted to outflank the Norwegian line, and successfully did so during the battle of Kjølbergs bro on the 14th of August. The Swedes then had a clear path to Kristiania, the Norwegian capital, which made the Norwegian situation unsustainable.

Although the Norwegian Army had won at Langnes, it was nevertheless clear to both the Norwegian and Swedish military authorities that a defeat was inevitable.[4] Even though the Norwegians had managed to deliver several minor offensive blows, thus applying pressure on the Swedes to accept Norway as a sovereign nation[citation needed], it was considered impossible to stop the Swedes in the long run.[4] The Swedish offer of negotiations was therefore accepted, as the war had put a heavy strain on the Norwegian finances. Every day of delay in securing Norway brought the Swedes uncertainty about the outcome, so both parties were interested in a quick end to the war.

For the ordinary Norwegian soldier the war had seemed ill-prepared and ill-fought.[4] Blame for the loss fell on Christian Frederick and the Norwegian general Haxthausen; the latter was accused of treason. For the Norwegian government it probably[citation needed] had been more a matter of getting the best possible bargaining position, as without the support of major powers Norway’s independence was impossible to secure. But by agreeing to talks following the victory at Langnes they were in a situation where they could avoid an unconditional surrender.

Aftermath
On 7 August, Bernadotte presented a proposal for a cease-fire. The proposal included a major concession—Bernadotte, on behalf of the Swedish government, accepted the Eidsvoll constitution. In doing so, he tacitly gave up any claims that Norway would be merely a Swedish province. Negotiations started in Moss, Norway on 10 August 1814, and after a few days of hard negotiations, a cease fire agreement, called the Convention of Moss, was signed on 14 August 1814. King Christian Frederick was forced to abdicate, but Norway remained nominally independent within a personal union with Sweden, under the Swedish king. Its Constitution was upheld with only such amendments as were required to allow it to enter into the union, and the two united kingdoms retained separate institutions, except for the King and the foreign service and policy.



1919 – James Lovelock, English biologist and chemist
James Ephraim Lovelock CH CBE FRS[2] (born 26 July 1919) is an independent scientist, environmentalist and futurist who lives in Devon, England. He is best known for proposing the Gaia hypothesis, which postulates that the Earth functions as a self-regulating system.[5]

Biography
James Lovelock was born in Letchworth Garden City in Hertfordshire, England, to working class parents who were strong believers in education. Nell, his mother, started work at 13 in a pickle factory. His father, Tom, had served six months hard labour for poaching in his teens and was illiterate until attending technical college. The family moved to London, where Lovelock’s dislike of authority made him, by his own account, an unhappy pupil at Strand School.[6] Lovelock could not afford to go to university after school, something which he believes helped prevent him becoming over-specialised and aided the development of Gaia theory. He worked at a photography firm, attending Birkbeck College during the evenings, before being accepted to study chemistry at the University of Manchester, where he was a student of the Nobel Prize laureate Professor Alexander Todd.[7] Lovelock worked at a Quaker farm before a recommendation from his professor led to him taking up a Medical Research Council post,[1] working on ways of shielding soldiers from burns. Lovelock refused to use the shaved and anaesthetised rabbits that were used as burn victims, and exposed his own skin to heat radiation instead, an experience he describes as “exquisitely painful”.[8] His student status enabled temporary deferment of military service during the Second World War, but he registered as a conscientious objector.[9] He later abandoned this position in the light of Nazi atrocities, and tried to enlist in the armed forces, but was told that his medical research was too valuable for the enlistment to be approved.[10] In 1948 Lovelock received a PhD[11] degree in medicine at the London School of Hygiene and Tropical Medicine. In the United States, he has conducted research at Yale, Baylor College of Medicine, and Harvard University.[1]

Career
A lifelong inventor, Lovelock has created and developed many scientific instruments, some of which were designed for NASA in its program of planetary exploration. It was while working as a consultant for NASA that Lovelock developed the Gaia hypothesis, for which he is most widely known.

In early 1961, Lovelock was engaged by NASA to develop sensitive instruments for the analysis of extraterrestrial atmospheres and planetary surfaces. The Viking program, which visited Mars in the late 1970s, was motivated in part to determine whether Mars supported life, and many of the sensors and experiments that were ultimately deployed aimed to resolve this issue. During work on a precursor of this program, Lovelock became interested in the composition of the Martian atmosphere, reasoning that many life forms on Mars would be obliged to make use of it (and, thus, alter it). However, the atmosphere was found to be in a stable condition close to its chemical equilibrium, with very little oxygen, methane, or hydrogen, but with an overwhelming abundance of carbon dioxide. To Lovelock, the stark contrast between the Martian atmosphere and chemically dynamic mixture of that of the Earth’s biosphere was strongly indicative of the absence of life on the planet.[12] However, when they were finally launched to Mars, the Viking probes still searched (unsuccessfully) for extant life there.
[Image: Electron capture detector developed by Lovelock, now in the Science Museum, London]

Lovelock invented the electron capture detector, which ultimately assisted in discoveries about the persistence of CFCs and their role in stratospheric ozone depletion.[13][14][15] After studying the operation of the Earth’s sulphur cycle,[16] Lovelock and his colleagues, Robert Jay Charlson, Meinrat Andreae and Stephen G. Warren developed the CLAW hypothesis as a possible example of biological control of the Earth’s climate.[17]

Lovelock was elected a Fellow of the Royal Society in 1974. He served as the president of the Marine Biological Association (MBA) from 1986 to 1990, and has been an Honorary Visiting Fellow of Green Templeton College, Oxford (formerly Green College, Oxford) since 1994. He has been awarded a number of prestigious prizes including the Tswett Medal (1975), an American Chemical Society chromatography award (1980), the World Meteorological Organization Norbert Gerbier Prize (1988), the Dr A.H. Heineken Prize for the Environment (1990) and the Royal Geographical Society Discovery Lifetime award (2001). In 2006 he received the Wollaston Medal, the Geological Society’s highest Award, whose previous recipients include Charles Darwin [2]. He became a Commander of the Order of the British Empire CBE in 1990, and a member of the Companions of Honour in 2003. He is a patron of population concern charity Population Matters.

As an independent scientist, inventor, and author, Lovelock worked out of a barn-turned-laboratory he called his “experimental station” located in a wooded valley on the Devon/Cornwall border in the south-west of England.[18]

On 8 May 2012, he appeared on the Radio Four series “The Life Scientific”, talking to Jim al-Khalili about the Gaia hypothesis. On the program, he mentioned how his ideas had been received by various people, including Jonathan Porritt. He also mentioned how he had a claim for inventing the microwave oven. He later explained this claim in an interview with The Manchester Magazine. Lovelock said that he did create an instrument during his time studying causes of damage to living cells and tissue, which had, according to him, “almost everything you would expect in an ordinary microwave oven”. He invented the instrument for the purpose of heating up frozen hamsters in a way that caused less suffering to the animals, as opposed to the traditional way which involved putting red hot spoons on the animals’ chest to heat them up. He believes that at the time, nobody had gone that far and made an embodiment of an actual microwave oven. However, he does not claim to have been the first person to have the idea of using microwaves for cooking.[7]

More on wiki:

Luke Broadwater: $1M settlement planned for family of Tyrone West
The family’s attorney, A. Dwight Pettit, said the money would go to West’s three children — Nashay West, Tyrone West Jr. and a minor child — and lawyers’ fees.
 
By Rina Raphael: This App Wants To Relieve Americans’ Medical Debt
Better
 
Texas Monthly: The Drug Runners
 
Choose your music decade?

Andrew Liszewski: Brilliant Augmented Reality App Lets You Star in Your Own ’80s Music Video
 
Andrew Liszewski: Watch a Firefighter Ride a High-Powered Hose Thrashing Around Like a Raging Bull
 
Arc of Dreams: from foster care to the courtroom
 
Eillie Anzilotti: This App Connects Veterans In Crisis With Other Veterans Who Are Willing To Talk
Objective Zero

by johanmoberg: Online Herb Hydro Grow
 
by Penolopy Bulnick: 37+ Unusual Uses for Lonely Socks

FYI July 25, 2017


864 – The Edict of Pistres of Charles the Bald orders defensive measures against the Vikings.
The Edict of Pistres or Edictum Pistense was a capitulary promulgated, as its name suggests, at Pistres (modern Pîtres, in Eure) on 25 July 864. It is often cited by historians as one of the rare examples of successful government action on the part of Charles the Bald, King of West Francia.

At the time, Vikings ravaged not only the Frankish coastlands more than once a year but, with the aid of Europe’s numerous navigable rivers, much of the interior as well. The king most valued was one who could defeat them in the field and prevent their attacks in the future. The purpose and primary effect of the Edict was long thought to be the protection of the cities and countryside from Viking raids.

Charles created a large force of cavalry upon which he could call as needed. He ordered all men who had horses or could afford horses to serve in the army as cavalrymen. This was one of the beginnings of the French chivalry so famous for the next seven centuries. The intention of Charles was to have a mobile force with which to descend upon the raiders before they could up and leave with their booty.

To prevent the Vikings from even attaining a great booty, Charles also declared that fortified bridges should be built at all towns on rivers. This was to prevent the dreaded longships from sailing into the interior. Simon Coupland believes that only two bridges, at Pont-de-l’Arche (near Pistres) on the Seine and at Les Ponts-de-Cé on the Loire, were ever fortified, though a few others that had fallen into disrepair were rebuilt “in times of crisis in order to increase troop mobility”.[1] Charles also prohibited all trade in weapons with the Vikings, in order to prevent them from establishing bases in Gaul.[2] The penalty for selling horses to the Vikings was death. Since the prohibition on the sale of horses was new, it is probable that mounted Viking raids were on the rise.[3]

Aside from its auspicious military reforms, the Edict had political and economic consequences. King Pepin II of Aquitaine, against whom Charles had been fighting for decades, had been captured in 864 and was formally deposed at Pistres. Economically, besides the prohibitions on commerce with the enemy, Charles tightened his control of the mints and regulated the punishment for counterfeiting. Prior to this edict at least nine places in France had the right of minting but these were reduced to three. Charles also made an attempt to control the building of private castles, but this failed and even minor lords constructed fortresses of their own on local hilltops to defend themselves and their peasants from the constant threat of Scandinavian invasion.

 
 
 
 


1806 – Maria Weston Chapman, American abolitionist (d. 1885)
Maria Weston Chapman (July 25, 1806 – 1885)[1] was an American abolitionist. She was elected to the executive committee of the American Anti-Slavery Society in 1839 and from 1839 until 1842, she served as editor of the anti-slavery journal, The Non-Resistant.

Biography
Family

Maria Weston was born in 1806 in Weymouth, Massachusetts, the eldest of eight children, including five sisters, born to Warren Weston and Anne Bates. Though the Westons were not wealthy, they were well connected, and through her uncle’s patronage Weston was educated in England, where she lived for a time. She returned to Boston in 1828 to serve as principal of a newly founded and socially progressive girls’ high school.

Two years later she left the field of education to marry Henry Grafton Chapman, a second generation abolitionist and wealthy Boston merchant. Over the course of their twelve-year marriage, which ended in Henry’s death from tuberculosis in 1842, Chapman had four children, one of whom died in early childhood. Henry’s parents were also enthusiastic abolitionists. By all accounts the Chapman marriage was a good one, free from ideological and financial strain.

Abolitionism
Maria and Henry were both “Garrisonian” abolitionists, meaning that they believed in an “immediate” and uncompromising end to slavery, brought about by “moral suasion” or non-resistance. They rejected all political and institutional coercion—including churches, political parties and the federal government—as agencies for ending slavery. They did, however, support moral coercion that encompassed “come-outerism” and disunion, both of which opposed association with slaveholders. Gerald Sorin writes, “In [Maria’s] nonresistance principles and in her “come-outerism,” she was rigidly dogmatic and self-righteous, believing that ‘when one is perfectly right, one neither asks nor needs sympathy.’”

Anti-slavery work
Though Chapman came to the anti-slavery cause through her husband’s family, she quickly and stalwartly took up the cause, enduring pro-slavery mobs, social ridicule and public attacks on her character. Her sisters, notably Caroline and Anne, were also active abolitionists, though Maria is generally considered to be the most outspoken and active among her family.[2] According to Lee V. Chambers, through their “kin-work”, the sisters supported each other through family responsibilities in order to take their active public roles.[3] The Chapmans became central figures in the “Boston Clique,” which primarily consisted of wealthy and socially prominent supporters of William Lloyd Garrison.

In 1835, Chapman assumed the leadership of the Boston Anti-Slavery Bazaar, which had been founded the previous year by Lydia Maria Child and Louisa Loring as a major fundraising event. She directed the fair until 1858, when she unilaterally decided to replace the bazaar with the Anti-Slavery Subscription Anniversary. Chapman said that the fair had become passé; she argued that the Anniversary—an exclusive, invitation-only soirée featuring music, food and speeches—was more au courant and would raise more funds than the bazaar. As described by historian Benjamin Quarles, through these years Chapman and other abolitionists became experienced in using “all the refined techniques of solicitation” in their fundraising for the cause of abolitionism.[4]

In addition to her fair work, between 1835 and 1865, Chapman served on the executive and business committees of the Massachusetts Anti-Slavery Society (MASS), the New England Anti-Slavery Society (NEASS) and the American Anti-Slavery Society (AAS). Through these she was active in the petition campaigns of the 1830s. She wrote the annual reports of the Boston Female Anti-Slavery Society (BFASS) and published tracts to raise public awareness.

For nearly 20 years, between 1839 and 1858, Chapman edited The Liberty Bell, an annual anti-slavery gift book sold at the Boston Bazaar as part of fundraising. The giftbook was composed of contributions from various notable figures: Longfellow, Emerson, Elizabeth Barrett Browning, Harriet Martineau, and Bayard Taylor, among others, none of whom was paid for their contributions aside from a copy of The Liberty Bell.[5] She also served as editor to The Liberator in Garrison’s absence, and was on the editorial committee of the National Anti-Slavery Standard, the official mouthpiece of the AAS. Chapman was also a member of the peace organisation, the Non-Resistance Society, which published The Non-Resistant.[6]

Chapman was a prolific writer in her own right, publishing Right and Wrong in Massachusetts in 1839 and How Can I Help to Abolish Slavery? in 1855. Aside from these works, she published her poems and essays in abolitionist periodicals.[7] In 1840 divisions between Garrisonians and the more political wing of the anti-slavery movement split the AAS and correspondingly the BFASS into two opposing factions. Maria, nicknamed “Captain Chapman” and the “great goddess” by her opponents and “Lady Macbeth” even by her friends, outmaneuvered the opposition. She took control of a resurrected BFASS, which from then on mainly focused on organizing the Boston bazaar as a major fundraiser for abolitionism.

The church she attended is featured on the Boston Women’s Heritage Trail.[8]

Travels
Throughout her three decades of involvement in the anti-slavery movement, Chapman spent considerable amounts of time outside of the United States, first in Haiti (1841-1842) and later in Paris (1848-1855). In spite of her prolonged absences, she still figured centrally in the Boston movement generally and the Boston bazaar particularly. While abroad, she tenaciously solicited support and contributions for the Boston fairs from elite members of British and European society, such as Lady Byron, Harriet Martineau, Alexis de Tocqueville, Victor Hugo, and Alphonse de Lamartine. When she returned to the U.S. in 1855, “bloody Kansas” and the rise of the Republican Party brought the issue of slavery to the centre of national debate. It was in this period that Chapman began to manifestly deviate from Garrisonian ideology, by endorsing the Republican Party and later by supporting both the American Civil War and Abraham Lincoln’s proposal in 1862 for gradual, compensated slave emancipation. Unlike many Garrisonians—and Garrison himself—Chapman gave no indication of being conflicted between the principle of non-coercion and the Civil War’s objective of abolishing slavery through violent force. Characteristically, Chapman was as resolute and unapologetic in her new beliefs as she had been in her old. Yet in spite of her newly expressed confidence in the state, Chapman seemingly felt little responsibility to former slaves once they were freed. In 1863, but for a passing interest in the AAS, Chapman retired from public life and for the next two decades, until her death in 1885, she “savored the perceived success of her cause and, equally, her own role in the victory.”

 
 
 
 

 
 
 
 

by Kate Sierzputowski: New Split-View Trash Sculptures by Bordalo II Combine Wood and Colorful Plastics Into Gigantic Animals
 
 
 
 
By Patrick Allan: Giphy’s Mobile Web Tool Turns Your Phone’s Videos and Photos Into GIFs
 
 
 
 
Adobe News: Flash & The Future of Interactive Content
We’ll stop updating & distributing Flash Player by the end of 2020.
 
 
 
 

Andrew Liszewski: Watch 300 Jedi Absolutely Slaughter 60,000 Medieval Soldiers
 
 
 
 
by Christopher Jobson: Floaty Bird: When a Camera’s Frame Rate Matches a Bird’s Flapping Wings
 
 
 
 
by Ernie Smith: The Surprisingly Recent Innovation of the Toilet Duck

 
 
 
 
by Sarah Gailey: A Woman, Explaining Things
 
 
 
 
By Caroline Sinders: The Most Crucial Design Job Of The Future
 
 
 
 
By Valinda Chan: Getting it right: why infographics are not the same as data visualizations
So here’s the difference:

Infographics tell a premeditated story to guide the audience to conclusions (subjective). Data visualizations let the audience draw their own conclusions (objective).
 
 
 
 
Hrvoje Dominko: Despacito Websites
“Don’t just blindly copy design elements, copy the principles.”
Mario Šestak
 
 
 
 
By Tiantian Xu: 100 Days of Vector Illustration
 
 
 
 

 
 
 
 


FYI July 24, 2017


1929 – The Kellogg–Briand Pact, renouncing war as an instrument of foreign policy, goes into effect (it was first signed in Paris on August 27, 1928, by most leading world powers).
The Kellogg–Briand Pact (or Pact of Paris, officially General Treaty for Renunciation of War as an Instrument of National Policy[1]) is a 1928 international agreement in which signatory states promised not to use war to resolve “disputes or conflicts of whatever nature or of whatever origin they may be, which may arise among them”.[2] Parties failing to abide by this promise “should be denied of the benefits furnished by this treaty”.

It was signed by Germany, France, and the United States on 27 August 1928, and by most other nations soon after. Sponsored by France and the U.S., the Pact renounces the use of war and calls for the peaceful settlement of disputes. Similar provisions were incorporated into the Charter of the United Nations and other treaties and it became a stepping-stone to a more activist American policy.[3] It is named after its authors, United States Secretary of State Frank B. Kellogg and French foreign minister Aristide Briand.

More on wiki:

 
 
 
 


1786 – Joseph Nicollet, French mathematician and explorer (d. 1843)
Joseph Nicolas Nicollet (July 24, 1786 – September 11, 1843), also known as Jean-Nicolas Nicollet, was a French geographer, astronomer, and mathematician known for mapping the Upper Mississippi River basin during the 1830s. Nicollet led three expeditions in the region between the Mississippi and Missouri Rivers, primarily in Minnesota, South Dakota, and North Dakota.

Before emigrating to the United States, Nicollet was a professor of mathematics at Collège Louis-le-Grand, and a professor and astronomer at the Paris Observatory with Pierre-Simon Laplace. Political and academic changes in France led Nicollet to travel to the United States to do work that would bolster his reputation among academics in Europe.

Nicollet’s maps were among the most accurate of the time, correcting errors made by Zebulon Pike, and they provided the basis for all subsequent maps of the American interior. They were also among the first to depict elevation by hachuring and the only maps to use regional Native American placenames. Nicollet’s Map of the Hydrographical Basin of the Upper Mississippi was published in 1843, following his death. Nicollet Tower, located in Sisseton, South Dakota, was constructed in 1991 as a monument to Nicollet and his work.

More on wiki:

 
 
 
 

 
 
 
 
Prachi Gupta: Sen. Patty Murray Is Not Giving Up on Rape Kit Legislation
Thanks in part to Griffin’s advocacy and work with state Rep. Tina Orwall, Washington has since passed several reforms regarding rape kits, including, in 2016, becoming the first state to pass a law creating a statewide rape kit tracking system. But federal law still lags behind. Last May, in response to the GAO’s findings, Murray introduced the Survivors’ Access to Supportive Care Act with Sens. Jeanne Shaheen (D-NH), Claire McCaskill (D-MO), Kirsten Gillibrand (D-NY), Tammy Baldwin (D-WI), Richard Blumenthal (D-CT), and Barbara Boxer (D-CA). SASCA would ask the GAO to survey each state to determine the specific needs and standards of care for sexual assault examinations, create a federal guideline and training program around sexual assault health care (which currently does not exist), establish a federal grant to expand training and care offered at hospitals across the country, require colleges to educate students about sexual assault examination services, and build a resource center for hospitals receiving federal funding.
 
 
 
 
BloominThyme: How to Grow Peanuts
 
 
 
 
By Cayleigh Parrish: How To Launch A Killer Email Newsletter
 
 
 
 
By Claire Lower: Should You Be Seasoning Your Cast Iron With Flaxseed Oil?
 
 
 
 

 
 
 
 
Alison Nastasi: Is Electromagnetic Hypersensitivity a Real Illness?
 
 
 
 
By Tom McKay: Snooty, World’s Oldest Known Manatee, Dies at 69 and I’m Not Crying, You’re Crying
 
 
 
 


FYI July 23, 2017


1840 – The Province of Canada is created by the Act of Union.
The United Province of Canada, or the Province of Canada, or the United Canadas was a British colony in North America from 1841 to 1867. Its formation reflected recommendations made by John Lambton, 1st Earl of Durham in the Report on the Affairs of British North America following the Rebellions of 1837–38.

The Act of Union 1840, passed July 23, 1840, by the British Parliament and proclaimed by the Crown on February 10, 1841,[1] merged the Colonies of Upper Canada and Lower Canada by abolishing their separate parliaments and replacing them with a single parliament of two houses, a Legislative Council as the upper chamber and the Legislative Assembly as the lower chamber. In the aftermath of the Rebellions of 1837–1838, unification of the two Canadas was driven by two factors. Firstly, Upper Canada was near bankruptcy because it lacked stable tax revenues and needed the resources of the more populous Lower Canada to fund its internal transportation improvements. Secondly, unification was an attempt to swamp the French vote by giving each of the former provinces the same number of parliamentary seats, despite the larger population of Lower Canada. Although Durham’s report had called for the Union of the Canadas and for responsible government (a government accountable to an independent local legislature), only the first was implemented. The new government was to be led by an appointed Governor General accountable only to the British Crown and the Queen’s Ministers. Responsible government was not to be achieved until the second LaFontaine-Baldwin ministry in 1849.

The Province of Canada ceased to exist at Canadian Confederation on July 1, 1867, when it was redivided into the Canadian provinces of Ontario and Quebec. From 1791 to 1841, the territory roughly corresponding to modern-day Southern Ontario in Canada belonged to the British colony of Upper Canada, while the southern portion of modern-day Quebec belonged to Lower Canada (along with Labrador until 1809, when Labrador was transferred to the colony of Newfoundland[2]). Upper Canada was primarily English-speaking, whereas Lower Canada was primarily French-speaking.

More on wiki:

 
 
 
 


1775 – Étienne-Louis Malus, French physicist and mathematician (d. 1812)
Étienne-Louis Malus /ˈɛtiˌɛn ˈluːiː ˌməˈluːs/ (French: [malys]; 23 July 1775 – 24 February 1812) was a French officer, engineer, physicist, and mathematician.

Malus was born in Paris, France. He participated in Napoleon’s expedition into Egypt (1798 to 1801) and was a member of the mathematics section of the Institut d’Égypte. Malus became a member of the Académie des Sciences in 1810. In 1810 the Royal Society of London awarded him the Rumford Medal.

His mathematical work was almost entirely concerned with the study of light. He studied geometric systems called ray systems, closely connected to Julius Plücker’s line geometry. He conducted experiments to verify Christiaan Huygens’s theories of light and rewrote the theory in analytical form. His discovery of the polarization of light by reflection was published in 1809 and his theory of double refraction of light in crystals, in 1810.

Malus attempted to identify the relationship between the polarising angle of reflection that he had discovered, and the refractive index of the reflecting material. While he deduced the correct relation for water, he was unable to do so for glasses due to the low quality of materials available to him (most glasses at that time showing a variation in refractive index between the surface and the interior of the glass). It was not until 1815 that Sir David Brewster was able to experiment with higher quality glasses and correctly formulate what is known as Brewster’s law.

Malus is probably best remembered for Malus’s law, which gives the resultant intensity when a polariser is placed in the path of an incident beam. His name is one of the 72 names inscribed on the Eiffel Tower.
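
For quick reference (standard optics notation, stated here for the reader rather than taken from the Wikipedia excerpt above), the two laws mentioned in this entry can be written as:

I = I_0 \cos^2\theta (Malus’s law)
\tan\theta_B = n_2 / n_1 (Brewster’s law)

Here I_0 is the intensity of the polarized beam entering the analyser, \theta is the angle between the beam’s polarization and the analyser’s transmission axis, and \theta_B is the polarising angle for light passing from a medium of refractive index n_1 into one of index n_2.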

Selected works
Mémoire sur la mesure du pouvoir réfringent des corps opaques. in Nouveau bulletin des sciences de la Société philomathique de Paris, 1 (1807), 77–81
Mémoire sur de nouveaux phénomènes d’optique. ibid., 2 (1811), 291–295
Traité d’optique. in Mémoires présentés à l’Institut des sciences par divers savants, 2 (1811), 214–302
Théorie de la double réfraction de la lumière dans les substances cristallines. ibid., 303–508

 
 
 
 

Lauren Evans: Chris Christie Raises New Jersey’s Smoking Age to 21
 
 
 
 
By David Nield: All the Sensors in Your Smartphone, and How They Work
 
 
 
 
Comments and recommendations? I used Dropbox for a while, but it was causing havoc with WordPress. I still use Apple Time Machine and a Seagate 5TB external hard drive.

By David Nield: The Best Cloud Storage For Every Need
 
 
 
 
Comments?
By Mechanical Attraction: Harvesting Sound Energy From Passing Cars
 
 
 
 
by Paige Russell: 10 Unusual Uses for Pencils
 
 
 
 
by Ganhaar: The Eggstream Trailer
 
 
 
 

Lauren Bachand: Etsy Feature Addition: case study
 
 
 
 
By Marcus Swan: Meaning without words: an emoji revolution
 
 
 
 

By Jason Kottke: An alphabet made from classic rock band logos
 
 
 
 

By Kelly Faircloth: Who Knew? The History of Baking Powder Is Incredibly Dramatic
 
 
 
 


FYI July 22, 2017


1937 – New Deal: The United States Senate votes down President Franklin D. Roosevelt’s proposal to add more justices to the Supreme Court of the United States.
The Judicial Procedures Reform Bill of 1937[1] (frequently called the “court-packing plan”)[2] was a legislative initiative proposed by U.S. President Franklin D. Roosevelt to add more justices to the U.S. Supreme Court. Roosevelt’s purpose was to obtain favorable rulings regarding New Deal legislation that the court had ruled unconstitutional.[3] The central provision of the bill would have granted the President power to appoint an additional Justice to the U.S. Supreme Court, up to a maximum of six, for every member of the court over the age of 70 years and 6 months.

In the Judiciary Act of 1869 Congress had established that the United States Supreme Court would consist of the Chief Justice and eight associate justices. During Roosevelt’s first term the Supreme Court struck down several New Deal measures as being unconstitutional. Roosevelt sought to reverse this by changing the makeup of the court through the appointment of additional justices who, he hoped, would rule that his legislative initiatives did not exceed the constitutional authority of the government. Since the U.S. Constitution does not define the size of the Supreme Court, Roosevelt pointed out that it was within the power of Congress to change it. The legislation was viewed by members of both parties as an attempt to stack the court, and was opposed by many Democrats, including Vice President John Nance Garner.[4][5] The bill came to be known as Roosevelt’s “court-packing plan”.[2]

In November 1936, Roosevelt won a sweeping reelection victory. In the months following, Roosevelt boldly proposed to reorganize the federal judiciary by adding a new justice each time a justice reached age seventy and failed to retire.[6] The legislation was unveiled on February 5, 1937, and was the subject of Roosevelt’s 9th Fireside chat of March 9, 1937.[7][8] Three weeks after the radio address the Supreme Court published an opinion upholding a Washington state minimum wage law in West Coast Hotel Co. v. Parrish.[9] The 5–4 ruling was the result of an apparently sudden jurisprudential shift by Associate Justice Owen Roberts, who joined the wing of the bench supportive of the New Deal legislation. Since Roberts had previously ruled against most New Deal legislation, his support here was seen as a result of the political pressure the president was exerting on the court. Some interpreted his reversal as an effort to maintain the Court’s judicial independence by alleviating the political pressure to create a court more friendly to the New Deal. This reversal came to be known as “the switch in time that saved nine”; however, recent legal-historical scholarship has called that narrative into question,[10] as Roberts’s decision and vote in the Parrish case predated both the public announcement and introduction of the 1937 bill.[11]

Roosevelt’s legislative initiative ultimately failed. The bill was held up in the Senate Judiciary Committee by Democratic committee chair Henry F. Ashurst who delayed hearings in the Judiciary Committee saying, “No haste, no hurry, no waste, no worry—that is the motto of this committee.”[12] As a result of his delaying efforts, the bill was held in committee for 165 days, and opponents of the bill credited Ashurst as instrumental in its defeat.[5] The bill was further undermined by the untimely death of its chief advocate in the U.S. Senate, Senate Majority Leader Joseph T. Robinson. Contemporary observers broadly viewed Roosevelt’s initiative as political maneuvering. Its failure exposed the limits of Roosevelt’s abilities to push forward legislation through direct public appeal. Public perception of his efforts here was in stark contrast to the reception of his legislative efforts during his first term.[13][14] Roosevelt ultimately prevailed in establishing a majority on the court friendly to his New Deal legislation, though some scholars view Roosevelt’s victory as pyrrhic.[14]

More on wiki:
 
 
 
 


1888 – Selman Waksman, Jewish-American biochemist and microbiologist, Nobel Prize laureate (d. 1973)
Selman Abraham Waksman (July 22, 1888 – August 16, 1973) was a Ukrainian-born, Jewish-American inventor, biochemist and microbiologist whose research into organic substances—largely into organisms that live in soil—and their decomposition promoted the discovery of streptomycin and several other antibiotics. A professor of biochemistry and microbiology at Rutgers University for four decades, he discovered over twenty antibiotics (a word he coined) and introduced procedures that have led to the development of many others. The proceeds earned from the licensing of his patents funded a foundation for microbiological research, which established the Waksman Institute of Microbiology located on Rutgers University’s Busch Campus in Piscataway, New Jersey (USA). In 1952 he was awarded the Nobel Prize in Physiology or Medicine in recognition of “his discovery of streptomycin, the first antibiotic active against tuberculosis.” Waksman was later accused of playing down the role of Albert Schatz, a PhD student who did the work under Waksman’s supervision to discover streptomycin.[1]

In 2005 Selman Waksman’s research was designated an ACS National Historic Chemical Landmark in recognition of the significance of his lab’s work in isolating more than fifteen antibiotics, including streptomycin, the first effective treatment for tuberculosis.[2]

Biography
Selman Waksman was born on July 22, 1888, to Jewish parents, in Nova Pryluka, Podolia Governorate, Russian Empire,[3] now Vinnytsia Oblast, Ukraine. He was the son of Fradia (London) and Jacob Waksman.[4] He immigrated to the United States in 1910, shortly after receiving his matriculation diploma from the Fifth Gymnasium in Odessa, and became a naturalised American citizen six years later.

Waksman attended Rutgers College (now Rutgers University), where he was graduated in 1915 with a Bachelor of Science (BSc) in Agriculture. He continued his studies at Rutgers, receiving a Master of Science (MSc) the following year. During his graduate study, he worked under J. G. Lipman at the New Jersey Agricultural Experiment Station at Rutgers performing research in soil bacteriology. Waksman was then appointed as Research Fellow at the University of California, Berkeley from where he was awarded his Doctor of Philosophy (PhD) in Biochemistry in 1918.

Later he joined the faculty at Rutgers University in the Department of Biochemistry and Microbiology. It was at Rutgers that Waksman’s team discovered several antibiotics, including actinomycin, clavacin, streptothricin, streptomycin, grisein, neomycin, fradicin, candicidin, candidin, and others. Two of these, streptomycin and neomycin, have found extensive application in the treatment of numerous infectious diseases. Streptomycin was the first antibiotic that could be used to cure the disease tuberculosis. Waksman is credited with coining the term antibiotics, to describe compounds derived from other living organisms such as penicillin, though the term was first used by the French dermatologist François Henri Hallopeau, in 1871, to describe a substance opposed to the development of life.[5]

Many awards and honors were showered on Waksman after 1940, most notably the Nobel Prize in 1952, the Star of the Rising Sun bestowed on him by the emperor of Japan, and the rank of Commandeur in the French Légion d’honneur.[3][6]

Selman Waksman died on August 16, 1973, and was interred at the Crowell Cemetery in Woods Hole, Barnstable County, Massachusetts. His tombstone is inscribed simply as Selman Abraham Waksman: Scientist, followed by his dates of birth and death, and the phrase “The earth will open and bring forth salvation” in Hebrew and English, which is a reference to Isaiah 45:8.[3][7]

He was the father of Byron Waksman, who was involved in multiple sclerosis research.

Other little-known contributions of Selman Waksman include anti-fouling paints for the Navy, the use of enzymes in detergents, and the use of Concord grape rootstock to protect French vineyards from fungal infection.

Streptomycin
Main article: Streptomycin

Waksman had been studying the Streptomyces family of organisms since his college days and had, for a time, focused on the organism Streptomyces griseus. Streptomycin was isolated from S. griseus and found effective against tuberculosis by one of Waksman’s graduate students, Albert Schatz.[8]

Controversy
The details and credit for the discovery of streptomycin and its usefulness as an antibiotic were strongly contested by Schatz, eventually leading to litigation.[9] Waksman and Rutgers settled out of court with Schatz, resulting in financial remuneration and entitlement to “legal and scientific credit as co-discoverer of streptomycin.”[10][11]

Systematic experiments to test several strains of antibiotic against several different disease organisms were under way in Waksman’s laboratory at the time. Their classic approach was to explore a complete matrix, with rows consisting of antibiotics and columns consisting of different diseases. The bacterium that produced the antibiotic streptomycin was discovered by Schatz in farmland outside his lab and tested by him.[10] Waksman, however, eventually came to claim sole credit for the discovery.

Neomycin
Main article: Neomycin

Neomycin is derived from actinomycetes and was discovered by Waksman and Hubert A. Lechevalier, one of Waksman’s graduate students. The discovery was published in the journal Science.[12]

Nobel Prize
Waksman was awarded the Nobel Prize in 1952 “for his discovery of streptomycin, the first antibiotic effective against tuberculosis.”[13] In the award speech, Waksman was called “one of the greatest benefactors to mankind,” as the result of the discovery of streptomycin.[14] Schatz protested being left out of the award, but the Nobel committee ruled that he was a mere lab assistant working under an eminent scientist.[10]

In 1951,[15] using half of his personal patent royalties, Waksman created the Waksman Foundation for Microbiology.[16] At a meeting of the Foundation’s board of trustees held in July 1951, he urged the building of a facility for work in microbiology, named the Waksman Institute of Microbiology, which is located on the Busch campus of Rutgers University in Piscataway, New Jersey. Waksman served as the Foundation’s first president and was succeeded in the position by his son, Byron H. Waksman, who held it from 1970 to 2000.

The Selman A. Waksman Award in Microbiology of the National Academy of Sciences is given in his honor.[17]

Publications

Selman Waksman was author or co-author of over 400 scientific papers, as well as twenty-eight books[3] and fourteen scientific pamphlets.

Enzymes (1926)
Humus: origin, chemical composition, and importance in nature (1936, 1938)
Principles of Soil Microbiology (1938)
My Life with the Microbes (1954) (an autobiography)

 
 
 
 

One bullet.
Hazel Cills: Former Owl City Member Daniel Jorgensen Pleads Guilty to Lewdness With a Child
The 32-year-old musician admitted that, in August 2013 in Atlantic City, he had exposed himself to a girl under the age of 13. Initially he was charged in 2015 with attempting to lure the girl and engage in criminal sexual contact. Jorgensen was ultimately sentenced to two years’ probation in Minneapolis.
 
 
 
 

By Gary Price: “Ireland’s Digital Content in Danger of Disappearing, Specialist Warns”
 
 
 
 
Mark Edwards shows why they call him the fastest icon designer this side of the Mississippi.
 
 
 
 
I am glad she did not kill the parrot.
By Associated Press: ‘Don’t [expletive] shoot’: Wife convicted of murder witnessed by parrot

 
 
 
 
Seems reasonable to me. State-funded birth control, fewer or no drug-addicted children, and maybe these folks can get their lives going in a positive direction. What do the naysayers offer as an alternative?
By CBS News: Rise in drug-addicted babies prompts judge’s controversial solution
 
 
 
 
By David Tracy: Take Four Minutes Out Of Your Day To Brush Up On The Difference Between Diesel And Gas Engines
 
 
 
 


Cool Gus says: Where did Special Operations Forces come from? | Bob Mayer July 22, 2017

The lineage of US Special Operations Forces goes back to before there was a United States. So dogs. Cool Gus has had his own Special Ops training. Just try getting by him when he’s on guard dut…


By Messy Nessy Chic: Your Inner Child Needs this NYC Rooftop Cottage For Sale
