Showing posts with label Science. Show all posts
Science
Earth and Saturn now have something in common: plastic. NASA's Cassini spacecraft detected the molecule propylene on Saturn's moon Titan, and propylene is one of the basic ingredients of modern plastic here on Earth.
It's the first extraterrestrial plastic ingredient ever found, reports NASA's Jet Propulsion Laboratory, which uses this quote from one of its scientists to help mere mortals grasp the discovery: "This chemical is all around us in everyday life, strung together in long chains to form a plastic called polypropylene. That plastic container at the grocery store with the recycling code 5 on the bottom—that's polypropylene."
So is this a stunning discovery? Far from it, explains Pacific Standard. Propylene is a hydrocarbon, and scientists have long known that Titan is teeming with other hydrocarbons such as methane and propane.
“Propylene was actually something scientists expected to find since its chemical kissing cousins were already known to be present,” writes Michael Todd. And, sorry, entrepreneurs: Wired shoots a hole in your ambitious plan to zip over to Saturn to harvest the stuff. Not only would that probably violate international treaties set up to protect other worlds, but “fossil fuels are also likely to remain relatively cheap, plentiful, and easy enough to access for many years that space-based extraction of them will remain laughable,” writes Adam Mann.
| BBC #end.
Science
A Scientific Study Confirms Our First Decision is The Right Decision. Always!
Alberta School of Business researcher Sarah Moore and colleagues from Duke and Cornell universities say unconscious feelings about objects in the environment influence the pursuit of long-term goals.
Their study explores how the unconscious mind responds to objects in relation to an individual’s goals – and how the unconscious continues to influence feelings about these objects once the goals are reached, whether or not the outcome has been successful.
‘In the past few years, we recognised that some of [Sigmund] Freud’s ideas on the unconscious mind were, in fact, correct and that a lot of our decision-making and a lot of our feelings are based on things that we’re not really aware of,’ said Moore, who is an assistant professor in the Alberta School of Business.
‘In our study, we looked at how our unconscious feelings about objects in the environment influence how we pursue goals.’
Moore notes previous studies have shown when it comes to short-term, finite goals, such as responding to basic needs like thirst or hunger, the unconscious will evaluate objects and form preferences based on whether the object will help an individual achieve the goal.
She says in the case of thirst, items such as a water fountain or a bottle of Coke will be seen favourably, while a chocolate bar or KFC sign would not.
However, she explains that, once the goal is reached, those same objects will be evaluated differently.
‘Once your thirst is quenched, you don’t evaluate the water fountain positively anymore because you’ve accomplished the goal,’ she said. ‘But there are differences when we look at long-term goals.’
Moore’s research focused on longer-term goals, such as getting in shape or undertaking educational pursuits.
For both types of goals, she says, the process is similar in that the unconscious identifies and responds positively to objects and triggers in the environment that support the goal.
However, the unconscious deals differently with these objects during progress towards long-term goals...
Moore says, unlike with short-term finite goals, the unconscious will continue to positively value objects related to the long-term goals even after a level of success has been achieved.
She says this phenomenon points to the indeterminate nature of the goal.
‘In some sense, we’re never “finished” long-term goals,’ said Moore.
‘If we successfully finish the small steps toward our long-term goals, it becomes a cycle: we take a small step, we succeed, we feel good about it; therefore, we continue to feel good about the long-term goal.
‘This process makes us more likely to take the next small step toward achieving that goal.’
What was surprising for the researchers was how participants in their study reacted to objects after a failure.
While the researchers expected the participants who failed to react negatively or express dislike for objects related to their test goal, Moore and her colleagues found that failure resulted in a neutral view of the objects.
‘You don’t hate the objects related to the goal because that goal is very important to you in the long run,’ said Moore.
‘Your unconscious is telling you, “Now is not the time to pursue the goal. You just failed; let’s leave it alone for a while.
‘“We’re not going to pursue these objects in the environment; we’re going to switch to some other goal.”’
/Photo: Corbis. Via: DailyMail/ #end.
Science × Unbelievable × Varia
A speculative look at how advanced genetic engineering technology might reshape people's faces over time.
From there, Lamm and Dr. Kwan reasoned out how humanity with advanced genetic engineering technology might reshape itself over time, taking over the role played by natural selection so far. Lamm then created a series of images of what he thinks the human face might look like 20,000 years, 60,000 years and 100,000 years in the future. (Note: He said that we shouldn’t read too much into the fact that the man and woman are Caucasian because those were just the best models he could find.)
Image: Today
The first image is an unmodified photo of a man and woman from the present. Nothing special.
Image: 20,000 years
This one shows some changes, but they are not too major yet. Heads are a bit bigger to accommodate larger brains, and those yellow rings that you see in the models’ eyes are special lenses that act kind of like Google Glass does today, but in a much more powerful way.
Image: 60,000 years
In the 60,000 years image, we’re starting to see some major changes. Heads are even larger, but the eyes have grown too. Lamm speculates that this would be a result of human colonization of the solar system, with people living farther away from the sun where there is less light. Skin pigmentation would change and our eyelids would become thicker to offer more protection against UV rays for those living outside of the Earth’s protective ozone layer.
Image: 100,000 years
100,000 years! Here Lamm predicts big changes, the most notable of which is the big Japanese manga-style eyes, which may feature “eye-shine [to] enhance low-light vision and even a sideways blink from re-constituted plica semilunaris” to offer extra protection against cosmic rays. These futuristic faces follow golden-ratio proportions and are perfectly symmetrical from left to right; they have larger nostrils to make breathing in off-planet environments easier, as well as denser hair to contain heat loss from their even larger heads. Various implants might allow the man and woman of the future to always be connected, but these would be subtle and almost invisible.
Now remember, Nickolay Lamm and Dr. Kwan stress that this is not a prediction, but rather speculation (“one possible timeline”), and that it is impossible to know for sure what the future holds. This is just their answer to the question “What do you think the human face might look like in 100,000 years and why?” There are, without a doubt, many other answers, some of which might seem more plausible. But it’s interesting food for thought.
Personally, if I had to criticize this project, I would say that the timeline is probably too long. We’re already starting to have the ability to modify ourselves, so if we ever decide to do so (it’s probably a question of “when” rather than “if”), it probably won’t take thousands of years. Just in the past 100 years, we’ve gone from barely having mastered powered flight with the Wright Brothers to sending space probes to almost every planet and moon of the solar system, and from Morse code telegraphs to a worldwide communication network made up of billions of electronic devices, each of which is more powerful than the supercomputers of a few decades ago. So technological and scientific progress is really fast, and it’s accelerating. The human race’s capabilities in 50 years should be even more impressive to us today than today’s tech would be for someone from 50 years ago — and that’s saying something.
My own speculation on how humans might modify themselves over time would probably go into a different direction than Lamm’s — and wouldn’t result in very striking images because I think most changes wouldn’t be visible. For example, if we successfully cure the diseases of aging (the SENS Research Foundation is working on this, for example), we would look the same, except that people would keep their young adult bodies, and you might not be able to superficially tell the difference between someone who is 30 and someone who is 60 years old. Maybe we’ll upgrade the human eye to give ourselves piercing hawk-like vision and awesome low-light capabilities, but that eye 2.0 might not look different from the outside. Same if we improve our red blood cells so they can carry 10 times more oxygen, our livers to better eliminate toxins or our metabolisms to maintain a healthy weight whatever we do. All these changes would be huge for humanity, yet they might not be visible in a photograph.
But all that is just speculation, one of many possible futures. The bottom line is we can all have an impact on how the future turns out, so let’s make it a good one.
| MNN / Photo: Courtesy of Nickolay Lamm #end.
Lifestyle × Science
A new study shows that the sexes really do see the world differently. Men notice small details and moving things while women are more sensitive to color changes.
"As with other senses, such as hearing and the olfactory system, there are marked sex differences in vision between men and women," researcher Israel Abramov, of the City University of New York (CUNY), said in a statement. Research has shown women have more sensitive ears and sniffers than men.
"[A] recent, large review of the literature concluded that, in most cases females had better sensitivity, and discriminated and categorized odors better than males," Abramov and colleagues write Tuesday (Sept. 4) in the journal Biology of Sex Differences.
Abramov and his team from CUNY's Brooklyn and Hunter Colleges compared the vision of males and females over age 16 who had normal color vision and 20/20 sight — or at least 20/20 vision with glasses or contacts.
In one part of the study, the researchers asked the volunteers to describe different colors shown to them. They found that the guys required a slightly longer wavelength of a color to experience the same shade as women, and the men were less able to tell the difference between hues.
The researchers also showed the participants images made up of light and dark bars that varied in width and alternated in color so that they appeared to flicker, a measure of participants' sensitivity to contrast. Compared with the women, the male volunteers were better able to identify the more rapidly changing images made up of thinner bars, the researchers said.
Abramov explained in a statement these elements of vision are linked to specific sets of thalamic neurons in the brain's primary visual cortex. The development of these neurons is controlled by male sex hormones called androgens when the embryo is developing into a fetus.
"We suggest that, since these neurons are guided by the cortex during embryogenesis, that testosterone plays a major role, somehow leading to different connectivity between males and females," Abramov said. "The evolutionary driving force between these differences is less clear."
Previous research found that men and women also focus differently. In experiments at the University of Southern California, researchers found that men are likely to fixate on the mouth of a person in conversation and also are more likely to be distracted by movement behind that person. Meanwhile, women tend to shift their gaze between a speaker's eyes and body, and they are more likely to be distracted by other people, the researchers found.
| LiveScience / Photo: Michael Zhang #end.
People × Science × World
Twitter Co-Founder Evan Williams has an ambitious new plan: to shift our daily reading habits away from consuming incremental news bites and towards engaging with enlightened ideas curated by an intelligent algorithm.
Before Twitter terraformed the landscape of news distribution, Williams’s first smash hit, Blogger, became the branded namesake for an upstart generation of amateur writers to challenge the established players.
Most importantly, Medium, his new platform for publishing mostly long-form content, has quickly garnered popularity — and infamy. In only a few months, its most popular contributions are making front-page headlines and snagging millions of views. In our Silicon Valley bubble, its contributors semi-regularly spark industry-wide conversations among the Internet elite.
“The site from Twitter’s co-founders is one year old, and still mysterious,” wrote The Atlantic‘s Alexis Madrigal recently, in one of many stories attempting to understand the Internet multi-millionaire’s enigmatic new project.
Now, for the first time since he launched the beta of Medium at our own TechCrunch Disrupt conference last year, Williams is ready to talk.
News “Crap” Vs. A Book
Williams is taking aim squarely at the news industry’s most embarrassing vulnerability: the incessant need to trump up mundane happenings in order to habituate readers into needing news like a daily drug fix. “News in general doesn’t matter most of the time, and most people would be far better off if they spent their time consuming less news and more ideas that have more lasting import,” he tells me during our interview inside a temporary Market Street office space that’s housing Medium until the top two floors are ready for his growing team. “Even if it’s fiction, it’s probably better most of the time.”
It’s true. The daily news cycle doesn’t always do its job of enlightening American democracy. In the aptly titled research paper “Does the Media Matter?”, a team of economists found that getting a randomized group of citizens to read the Washington Post did nothing for “political knowledge, stated opinions, or turnout in post-election survey and voter data.”
News, alone, is evidently insufficient to make us a more informed society.
Instead, Williams argues, citizens should re-calibrate their ravenous appetite for information towards more awe-inspiring content. “Published written ideas and stories are life-changing,” he gushes, recalling his early childhood fascination with books as the motivation to take on the media establishment. The Internet “was freeing that up, that excitement about knowledge that’s inside of books–multiplied and freed and unlocked for the world; and, the world would be better in every way.”
In Williams’s grand vision, the public reads for enlightenment; news takes a backseat directly in proportion to how often it leaves us more informed and inspired.
In addition to better content, the news itself might be better written by industry professionals. Climate deniers at conservative outlets, he argues, are a prime example of how the media has failed in its obligation to inform the public. During an extended rant on global warming in our conversation, he didn’t complete the explanation of why industry-professionals-as-writers would solve the problem. But it’s easy to imagine that if nearly all climate scientists believe in man-made global warming, it would be difficult for media outlets to find a credible writer to claim otherwise.
The elephant in the room was that Williams was not-so-subtly attacking me and my colleagues, especially considering he had claimed that “the state of tech blogs is atrocious — it’s utter crap.” I asked him to explain what he meant, trying not to sound offended.
Diplomatically clarifying his words, he responded: “Part of the reason a lot of tech blogs are bad is the people writing them don’t really understand what they’re writing about. And so I want to change our definition of professional writing. At least expand it.”
Clearly taking a position on the long-standing debate between journalists and industry insiders, Williams says that the kinds of weekend columns TechCrunch runs from noted businesspeople “are absolutely more valuable” than some of the daily news written by reporters with little business experience.
However, Williams was clear: “please don’t set this up as Evan thinks tech blogs are crap and therefore is fixing them with Medium. People are going to publish crap on Medium.”
Williams was referring to a number of infamous Medium posts that were brazenly elitist and spread misinformation. Silicon Valley entrepreneur Peter Shih’s “10 Things I Hate About You: San Francisco Edition” was widely criticized for, among other things, perpetuating Silicon Valley’s abject misogyny and callousness toward the homeless. In another embarrassing moment, Medium contributor Michele Catalano wrongly implied that nefarious government spies seized her computer after she searched Google for “backpacks” and “pressure cookers.” Both posts were later modified or taken down. Though the missteps spawned a series of thoughtful counter-posts on Medium and a handful of media outlets, Williams didn’t try to spin the reactions as a win.
“People are going to publish crap on Medium…. And guess what? There’s crap on Twitter. There’s crap on blogs. There’s crap on the Internet. And if we try to keep crap off the Internet, the Internet wouldn’t be important,” he argues, with a noticeable defensiveness in his voice that belies his leaned-back posture. “The system’s working if there’s great stuff that otherwise wouldn’t see the light of day and/or gets more attention than it would otherwise.”
Okay then, so what’s Williams’s solution for putting a spotlight on the good stuff?
A Simple Medium To Attract Every Good Idea
“Everyone has a story or insight that is worth repeating and they just don’t have the venue to get it heard,” adds former Wired.com editor Evan Hansen, a senior editor at Medium charged with building out its tech, science, and business coverage. Medium sees itself as a hybrid between professional outlets like The New York Times and the unwashed blogger free-for-all of The Huffington Post (which is owned by TechCrunch parent company, Aol). Instead, Medium wants to be the platform for everyone’s one truly viral idea.
Health startup entrepreneur Nick Crocker probably never thought his simple post about a walk through the junk food aisles at his local grocery store would snag over 1 million(!) views. Crocker’s rather elegantly crafted “The World Is Fucking Insane” is a photo-heavy, first-person journey through grocery aisles lined with monster stacks of chemically altered sugary foods on his way to pick up some milk. Its visual simplicity evidently expressed the public’s latent frustration with America’s health crisis in a way that other statistics-packed medical news did not.
In another example, Aron Solomon put a national spotlight on smartphone taxi app, Uber, after it unintentionally instituted surge pricing during Toronto’s massive summer storm. A few news outlets covered the embarrassing incident, but nothing else went viral like the voice of an innocent bystander outraged at the negligent price gouging in his hair-raising post, “The Don’t Be An Asshole Rule.”
Most importantly, both of these posts were composed in the heat of the moment. At Medium, there’s no need to register a website, sift through a mountain of design options, and re-organize your schedule for the habit of blogging. You just write. “If it’s on a whim, that whim is killed the moment you’re forced to find a unique sub domain and find a template,” explains Williams.
The sheer simplicity of Medium’s writing platform is garnering accolades from respected writers and designers. Medium is “the best composition experience on the web, hands-down,” wrote early Facebook designer Julie Zhuo in a Medium post about Medium (so meta). “You see exactly what your post is going to look like. There is no translation, no guess-work, no typey-typey into some fat text area.”
New York Times tech columnist Nick Bilton also gives Medium a thumbs up. “I really like Medium — it’s one of the rare instances where the technology is truly in the background,” he writes to me in an email. “I’d love to be able to replace WordPress with Medium on my personal site.” Fortunately for WordPress, Medium has no plans to become a separate blogging platform — but it might become home to the occasional industry muse who doesn’t want to hassle with setting up a blog.
Still, Medium isn’t betting that viral posts from one-hit wonders are a sustainable foundation. It has allocated a sizable budget to pay for professional magazine-style exposés. Most recently, it bankrolled a massive 10,000-word, movie-worthy script about a 62-year-old commando whose tantalizing life has included a mission to recover $3 million in gold bullion in the Peruvian mines. “Mercenary” was edited by the same acclaimed author whose 2007 Wired piece about freeing Iranian hostages eventually became the Academy Award-winning fictionalized re-enactment, Argo. Over the next 18 months, the partnering studio, Epic, promises five more edge-of-your-seat stories.
War correspondent David Axe is also bringing heavy long-form explainers about creepy military tech to the pages of Medium. Despite Medium’s implied skepticism toward professional writers, it’s clearly willing to invest in public Internet journalism.
“I always felt you couldn’t live without the big expensive scoops as a serious brand in media,” explains Hansen. “If you pursue the low end all the time then the advertisers want nothing to do with you. People with money and affluent readers and people with positions in the areas that you cover of authority and influence don’t read you. You become kind of like, you’re nothing.”
Hansen maintains that Medium is still experimenting. Wherever it ends up, Medium evidently wants to be the home of any bold, viral idea — and it’s willing to run a financial and engineering bulldozer over any barrier to writing.
Yet, even if writers are willing to come to Medium, how will readers find them at a still-obscure publisher?
A Pandora For Substantive Reads, Pageviews Be Damned
Traditional news editors stake their reputations on having an intuition for what drives eyeballs to their sites. Editors don’t, however, know whether readers leave more informed.
Williams thinks Medium has an answer: an intelligent algorithm that suggests stories, primarily based on how long users spend reading certain articles (which he’s discussing publicly for the first time). Like Pandora did for music discovery, Medium’s new intelligent curator aims to improve the ol’ human-powered system of manually scrolling through the Internet and asking others what to read.
In the algorithm itself, Medium prioritizes time spent on an article, rather than simple page views. “Time spent is not actually a value in itself, but in a world where people have infinite choices, it’s a pretty good measure if people are getting value,” explains Williams.
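Medium hasn’t published its actual algorithm, but the core idea — ranking articles by total time readers spend on them rather than by raw page views — can be sketched in a few lines. Everything below (function names, field names, sample numbers) is an illustrative assumption, not Medium’s real code:

```python
def rank_by_read_time(articles):
    """Return articles sorted by total reading time, descending.

    Each article is a dict with a 'title', a 'views' count, and
    'read_seconds': a list of per-visit reading durations.
    """
    def total_read_time(article):
        # Total engaged time across all visits is the ranking signal,
        # so many short drive-by visits count for little.
        return sum(article["read_seconds"])

    return sorted(articles, key=total_read_time, reverse=True)


articles = [
    # Many views, but readers bounce after ~5 seconds each.
    {"title": "Clickbait listicle", "views": 50_000,
     "read_seconds": [5] * 1000},
    # Far fewer views, but each reader stays ~10 minutes.
    {"title": "Long-form essay", "views": 2_000,
     "read_seconds": [600] * 150},
]

ranked = rank_by_read_time(articles)
# Under this metric the essay outranks the listicle
# despite having 25x fewer page views.
```

Under a pure page-view metric the ordering would flip, which is exactly the incentive problem Williams describes: advertisers pay for clicks, so editors fish for them.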
In fairness to news editors, we do know how much time readers spend on an article: We know that less than 60 percent will read more than half of an article, and a significant slice won’t read anything at all. “I’m going to keep this brief, because you’re not going to stick around for long. I’ve already lost a bunch of you,” joked tech columnist Farhad Manjoo, in a cathartic post for Slate that was aptly titled “You Won’t Finish This Article: Why People Online Don’t Read To The End.”
But, because advertisers pay for page views, the incentive is to fish for clicks, no matter how much we try to feature other kinds of higher-quality content.
For example, after Miley Cyrus’ infamous burlesque dance in front of the MTV Video Music Awards’ impressionable tween audience, The Onion brilliantly lambasted CNN’s decision to make a burlesque show front-page news.
“So, you may ask, why was this morning’s top story, a spot usually given to the most important foreign or domestic news of the day, headlined “Miley Cyrus Did What???” and accompanied by the subhead “Twerks, stuns at VMAs”?,” wrote The Onion, in a parody OpEd by CNN’s managing editor.
“The answer is pretty simple. It was an attempt to get you to click on CNN.com so that we could drive up our web traffic, which in turn would allow us to increase our advertising revenue. There was nothing, and I mean nothing, about that story that related to the important news of the day, the chronicling of significant human events, or the idea that journalism itself can be a force for positive change in the world.”
I think most of us in the news industry would love if our audience only cared about deeply substantive stories, but expensive content with relatively few page views doesn’t pay the bills. So, what’s Medium’s plan to make money without advertising?
“Web People” And Financial Stability
The short answer is that no one really knows. “Well, it’s got to be sustainable at some level. So I think revenue is in the model,” says Hansen, with a casual attitude that indicates just how little focus Medium is currently devoting to the issue of monetization.
One option is selling eBooks off of its cinematic scripts. A portion of readers may be willing to pay for the convenience of a Kindle version of “Mercenary,” for instance. But traffic has to be outstanding for that kind of venture to make money. Pageviews for “Mercenary” have been okay. “It hasn’t blown the lid off,” admits Hansen, in reference to other Medium pieces that snagged a few million pageviews.
Other monetization options include licensing its technology, and revenue sharing with established media brands that want to post stories on and from Medium (Mother Jones has placed some stories on Medium, while Gawker has wholesale reblogged popular Medium posts on their own site).
Though Medium doesn’t have a solid business plan yet, there is a method to the madness of thinking about product first and money second (or third). Williams has a fascinating way of grouping business types in Silicon Valley between those who successfully managed companies through the dot-com bust (“Web People”) and those who packed up their empty bags and left (“Dot-com People”).
He explains that Web People “loved the web. We loved what was possible and we loved the creativity and we were in it to create; we weren’t in it to make money.”
So, when the bubble burst, Web People stayed — some ultimately making the (very) profitable products we all use today. “The Web People are more sustainable because they kept going. Because they’re driven to create, they’re attracted to the web because of its creative potential. They weren’t scared away when it seemed like it wasn’t an instant path to riches,” he explains. “They were persistent.”
An Optimistic Bet That Keeps Williams Persistent
“I think more people would be in a better place if more people shared their ideas,” says Williams. Seen this way, Medium is just the next logical step in Williams’s three-product cycle to inject better ideas into the world. Blogger helped open the doors for pajama bloggers to compete with the media moguls. A few years later, Twitter gave the power of broadcast distribution to everyone who had 140 characters to share.
Now, to complete the circuit, Medium wants to make viral information more substantive — the hope in the Pandora’s box of communication. “It’s also an optimistic stance to say that we can build a system where good things can shine and get attention. And there’s an audience for ideas and stories that appeal to more than just the most base desires of human beings.”
Or, in essence, Medium’s biggest bet is, “people will read long things — they’ll read a lot.” And that there’s a business in this.
| techcrunch.com #end.
Science
New research reveals people have a so-called "sixth sense." No, it’s not the ability to see dead people. Think more ... Rain Man. It’s referred to as “numerosity"—or number sense. Scientists have reportedly found a region of the brain that has a sort of “map” for perceiving numbers.
TRANSCRIPT: Basically, numerosity is the ability to look at a group of items and tell how many there are without actually counting each individual one. (Via The Huffington Post) LiveScience reports participants were placed in an MRI scanner and looked at a series of dots over time — from one dot, then two dots, all the way up to eight dots. An advanced imaging method allowed researchers to see how each participant’s brain reacted — and it was actually quite organized. Small numbers of dots were represented in one area of the brain, while larger numbers were represented in another. NPR explains that, like the other five senses, numerosity appears to originate in a specific part of the brain. And the better you are at number sensing, the better you tend to do on standardized tests. The researchers haven’t come across a person with the same number sense as Dustin Hoffman’s character in Rain Man, but say the amount of skill can vary among individuals. | Newsy #end.
Science
From plastic and potato chips to matches and microwaves, these strokes of scientific serendipity have had a big impact on our lives.
1. Matches
Many of us wonder what life was like before electricity or the Internet (shudder), but imagine life before matches. We’re talking magnifying glasses and flint. For those of us who like to create controlled flame from time to time with the strike of a match, we can thank a British pharmacist and his dirty mixing stick. In 1826, John Walker noticed a dried lump on the end of a stick while he was stirring a mix of chemicals. When he tried to scrape it off, voila, sparks and flame. Jumping on the discovery, Walker marketed the first friction matches as “Friction Lights” and sold them at his pharmacy. The initial matches were made of cardboard but he soon replaced those with three-inch long hand-cut wooden splints; the matches came in a box equipped with a piece of sandpaper for striking. Although advised to patent his invention, he chose not to because he considered the product a benefit to mankind — which didn’t stop others from ripping off the idea and taking over the market share, leading Walker to stop producing his version.
2. Mauveine (aniline purple dye)
Before the 1850s, the general palette of common clothing was decidedly drab. Dyes and paints were made from natural materials. Plants, leaves, roots, minerals and insects were used to create lovely hues, but most often they were subtle, inconsistent and impermanent. All this changed in 1856, when 18-year-old chemistry student William Perkin was working to create an artificial quinine to help treat malaria and instead came up with a muddy coal tar residue. Upon closer inspection, he noticed a stunning color: mauve. And just like that, Perkin had stumbled across the world's first aniline dye, a dye that would consistently produce a vivid and uniform shade and that paved the way for synthetic colors as we know them today. (The 1980s thank you, Mr. Perkin.) The royal court fell head over heels for mauve, as did all of London and much of the world. But aside from the mauve madness, the first commercial application of a chemistry discovery created a paradigm shift. Organic chemistry became exciting and profitable — and as a result, it enticed many young minds to pursue industrial applications of chemistry, ultimately leading to important advances in medicine, perfume, photography and explosives.
3. Penicillin
Although antibiotics may get a bum rap for their prevalence and overuse, life before them was fraught with untamable infection and few defensive tools. Penicillin was the first antibiotic, a discovery that happened in 1928 when a young bacteriologist, Sir Alexander Fleming, was tidying up his lab. Returning to work after a vacation, he found that a petri dish of Staphylococcus bacteria had been left uncovered, and noticed that mold on the culture had killed many of the bacteria. He identified the mold as Penicillium notatum, and upon further research found that it could kill other bacteria and could be given to small animals without ill effect. A decade later, Howard Florey and Ernst Chain picked up where Fleming left off and isolated the bacteria-killing substance found in the mold: penicillin. The three won the Nobel Prize in Physiology or Medicine in 1945 “for the discovery of penicillin and its curative effect in various infectious diseases.” [Image: A laboratory worker measures purified penicillin into bottles. In this process, the substance was freeze-dried and the ice evaporated off under vacuum; the powder left behind was penicillin.]
4. Microwave oven
Of all the newfangled, ultra-mod, sci-fi kitchen appliances of the future, few are as notable as the microwave oven. Baking a potato in eight minutes must have seemed beyond imagination before it came along. The technology that promised to lighten the load on housewives everywhere, not to mention bachelors, was discovered in the 1940s when the U.S. company Raytheon was working on magnetron tubes for wartime radar defense. Percy Spencer, an engineer at the company, was working on a magnetron when he noticed that a candy bar in his pocket had started to melt due to the microwaves. Eureka! Spencer developed a box for cooking and found that, indeed, when food was placed in the box with the microwave energy, it cooked quickly. Raytheon filed a U.S. patent for the process, and the first microwave oven was placed in a New England restaurant for testing. The first home microwave oven was introduced in 1967 by Amana (a division of Raytheon), to the delight of Jane Jetson wannabes everywhere.
5. Plastic
Although earlier plastics had relied on organic material, the first fully synthetic plastic was invented in 1907 when Leo Hendrik Baekeland accidentally created Bakelite. His initial quest was to invent a ready replacement for shellac, an expensive product derived from lac beetles. Baekeland combined formaldehyde with phenol, a coal tar derivative, and subjected the mixture to heat. Rather than a shellac-like material, he inadvertently created a polymer that was unique in that it didn’t melt under heat and stress. The new thermosetting plastic was used for everything from phones to jewelry to clocks. It was also the first synthetic material to really stand on its own; it wasn’t used to mimic a natural material like ivory or tortoise shell, ushering in an era of new synthetic materials that has yet to subside.
6. Potato chips
Behold the potato chip: the salty, greasy, crispy wisp of tuber for which Americans dole out more than $7 billion a year. The life of the potato chip didn’t start out as an accident so much as a prank, but its runaway success took its inventor by surprise. As legend has it, in 1853 Saratoga Springs restaurant cook George "Speck" Crum was annoyed with the complaints of a wealthy patron who repeatedly returned his thickly cut French-style potatoes, a common preparation at the time. After the third return, the exasperated Crum sliced the potatoes as thinly as he could, fried the daylights out of them, and covered them in what he assumed to be a prohibitive amount of salt. Much to his surprise, and perhaps initial chagrin, the patron adored them and ordered another round. They quickly became the house specialty, and the history of snacking was changed forever. So much so, in fact, that a major Harvard University study recently identified the potato chip as the single biggest contributor to weight gain in the United States. (We can't blame Crum for that.)
7. X-rays
In 1895, German physicist Wilhelm Conrad Röntgen was tinkering with a cathode ray tube — cathode rays being the phosphorescence-inducing streams of electrons used today in everything from televisions to fluorescent light bulbs — when he noticed that a piece of paper coated in barium platinocyanide began to glow across the room. He knew the flickering he saw was not being created by the cathode rays themselves, because they could not travel that far. Not knowing what the mystery rays were, he named them X-radiation, signifying their unknown nature. Upon further research he discovered a host of materials that were transparent to the radiation, and found that the rays could affect photographic plates. He took an X-ray photograph of his wife's hand that showed her bones and a ring; the image aroused great interest and ensured his place in the history of medicine and science. He was awarded the Nobel Prize in Physics in 1901.
8. Safety glass
Back in the early days of automobiles, before seatbelts and airbags were part of the package, one of the gravest dangers was injury from shards of shattered windshield glass. We can thank French artist and chemist Édouard Bénédictus for chancing upon the invention of laminated glass, also known as safety glass. A glass flask dropped in his lab broke but didn’t shatter; Bénédictus realized that its interior was coated with plastic cellulose nitrate, which held the now-harmless broken pieces together. He applied for a patent in 1909 with a vision of making cars safer, but manufacturers rejected the idea to keep costs down. The glass did, however, become standard for gas mask lenses in World War I. With its success on the battlefield, the automobile industry finally relented, and by the 1930s most cars were equipped with glass that didn’t splinter into jagged pieces upon impact.
9. Viagra
Much as with the fountain of youth, humans have long sought magic ingredients that promise to boost the libido and enhance sexual function. But the breakthrough that gave us Viagra (sildenafil) didn’t occur when researchers were looking for ways to make men manly; rather, they were testing sildenafil as a treatment for hypertension and heart disease. After two phases of testing, researchers concluded that the drug failed to show promising results for the heart, but test subjects noted that … well, you know what part of the body it did wonders for. Bingo! Pfizer patented Viagra in 1996, and it was approved for use in erectile dysfunction by the U.S. FDA in 1998. Sales of Viagra continue to exceed $1 billion per year. Bonus tip: Researchers have also found that 1 milligram of sildenafil dissolved in a vase of water can make fresh cut flowers, um, "stand at attention" for up to a week beyond their natural life span.
10. Chocolate chip cookies
Not all chance discoveries came at the hands of scientists fiddling in labs. Sometimes they happened to cooks twiddling in kitchens — and sometimes in the kitchens of restored tollhouses. Case in point: The beloved Toll House Cookie. Ruth Wakefield and her husband owned and operated the Toll House Inn in Massachusetts where Ruth cooked for the guests. According to legend, one day in 1937 while making cookie dough, she realized she was out of melting baker’s chocolate and instead used a chocolate bar that she chopped into bits, hoping it would melt as well. It didn’t, and thus was born America's favorite cookie. Did the chocolate chip cookie change the world? Probably not, unless you calculate the combined moments of pleasure derived from biting into one fresh from the oven. They’ve certainly been responsible for changing a lot of moods. | MNN #end.
Science × Unbelievable
It's another one of those chance scientific breakthroughs: Scientists from Cornell and Germany have created the thinnest glass known to man entirely by accident, reports LiveScience.
Just how thin? A hard-to-fathom 2 atoms thick, which means you'll need an electron microscope to check it out, notes RedOrbit. The discovery might not only shed important light on the unusual liquid-versus-solid structure of glass; it has already earned the scientists some real-world bragging rights: Guinness World Records included the discovery in its 2014 book.
So how did they create the glass? As the Cornell Chronicle explains, the scientists were making a different uber-thin material called graphene and noticed some "muck" on it. That muck turned out to be made of silicon and oxygen, the stuff of glass. Their best guess is that an air leak caused copper foils to react with a quartz furnace. By studying the happy accident, the researchers have for the first time revealed "the precise arrangement of atoms in glass," explains The Verge. Amazingly, it's nearly spot-on with a theoretical model drawn in 1932 by physicist WH Zachariasen. As for real-world applications, the 2D glass could someday be used in transistors and lead to faster processors for computers and smartphones. (Click to read about another accidental discovery, this one of the world's best water absorbers.) | LiveScience #end.