Science and Journalism in Society

Brandeis University JOUR 130B


Companies responsible for contaminating US rivers decades ago now need to reduce the risk of carcinogens reaching the dinner table

My brother, an environmental engineer, goes into work every day at a small environmental consulting office in downtown Boston. When you think environmental, you might think “out in the field” or hands-on work, but instead he works behind the scenes, programming and inputting data into computer models of rivers known for high concentrations of toxins. His firm serves as the mediator between the EPA and the large industrial corporations responsible for river pollution that happened decades ago, such as PCBs in the Hudson River. The bulk of their work is to identify the concentrations of toxins at specific locations and try to determine where they originated – not an easy task when the actual pollution occurred forty to seventy years ago, and these companies will mount every defense against paying for their mistakes until it is absolutely proven that they are at fault.

It’s interesting to think that it took until the 1970s for the US to ban dumping toxic waste into rivers. Who ever thought that was a good idea? Why did hundreds of companies follow suit and do the same thing? It might have been the cheapest disposal method at the time, but now, forty years after legislation banned the practice, these same companies are required to contribute millions of dollars toward cleanup and risk prevention. It has taken decades to drill into our heads that whatever you put into the environment has to end up somewhere, and often it stays close to home.

An article published by WNYC this month discusses the pollution of the Passaic River in New Jersey. The EPA has found approximately 100 companies responsible for contaminating the river with byproducts from the manufacture of products such as Agent Orange and hydraulic fluids. These toxic wastes are strongly linked to cancer risk, liver damage, and birth defects. The current concern is that fish from the river are being caught and eaten by people in the community. While consuming fish and shellfish from the river is strongly advised against, it is hard to get that message to community members unwilling to accept, or unable to comprehend, the associated health risks. An emergency “fish exchange” program was instituted last year, in which the EPA, in partnership with the polluting companies, encouraged people to exchange fish caught from the river for frozen tilapia bought from Costco. However, only 170 fish were exchanged during the season the program ran, from June to October 2015.

Last-minute measures like these are no long-term solution to purifying these rivers and preserving the health of the community. Cleaning up the Passaic River is estimated to cost around $1.7 billion just to remove and replace two feet of the sediment bed. Even then, it would take many years before the fish could be deemed safe to eat. It’s a large project that needs more attention and more pressure on polluters to make up for their mistakes.

Hot or Not: When Media Allows Opinion to Confound Climate Science

Welcome to the 21st century, an age in which more than 97% of published climate scientists and 200 international science organizations are in consensus that the climate is changing due to human activities.

And yet, one week before The New York Times broke the story that 2015 was the hottest year on record, Forbes published a blog piece by climate change denier and contributor James Taylor, who denigrated current climate data and haughtily insisted that no, “2015 was not even close to hottest year on record.”

So, how exactly did NASA and NOAA scientists – not the “global warming activists” whom a seemingly very confused Taylor continues to insist are the voices behind the science – reach the conclusion that 2015 was the hottest year on record? And what exactly was Taylor trying to say, in his jargon-laden, confusing argument? Let’s examine.

Taylor claims: “Satellite temperature readings going back to 1979 show 1998 was by far the warmest year in the satellite era…”.

When Taylor keeps saying “satellite,” he is referring to the instruments orbiting Earth that can measure things like the temperature of the lower atmosphere. El Niño, which helps explain why 1998 was so warm, describes “a temporary change in the climate of the Pacific ocean,” usually leading to ocean warming.

However, in determining the global average temperature, scientists track the temperature of the Earth’s surface, not the lower atmosphere. Furthermore, a previous study found that “the satellite trends could be off (too cold) by perhaps 30%.”

What about that giant graph with garish blue lines and dots? Taylor attempts to illustrate a historical record of global temperatures using a difficult-to-read graph of temperatures from the Earth’s lower atmosphere. Again, the problem is that scientists use Earth’s surface temperatures, not lower-atmosphere temperatures, to measure warming. Furthermore, the graph was pulled from a blog post written by a climate change skeptic. If I’ve learned anything from college, it is to make sure your sources are credible.

The final piece of “evidence” cited in this opinion is a report written by S. Fred Singer. Ah yes, the same man notorious for denying the negative health impacts of secondhand smoke, and whom Naomi Oreskes wrote about in Merchants of Doubt. Singer is another ardent denier of anthropogenic climate change.

An audacious Taylor concludes his story by dismissing the report as a bunch of baloney courtesy of “global warming activists,” before baselessly accusing them of misleading head-fakes and doctoring temperature records. He also throws a jab at the “compliant media,” I’m guessing for responsibly informing the public about such inconvenient truths.

When the Times broke the story a week later, Forbes did publish an article by staff writer Alex Knapp covering what NASA and NOAA scientists reported. It can be found under the technology section. But as of Jan. 25, Taylor’s opinion piece had racked up nearly 30,000 views to the news article’s 2,347.

When Forbes launched the blogging platform used by Taylor, Forbes Chief Product Officer Lewis DVorkin said that accepted contributors are “vetted by our editors and our staff. We look at their experience, we look at their credentials and what they’ve done. And we turn many people away.” Unfortunately, this appears to be an example of what happens when people like Taylor can use the prominence and influence of the platform to disseminate junk science.

Finding Dogs

Who let the dogs out: men or wolves? And more importantly, when did they get out?

Surprisingly little is known about how dogs came to be at man’s side. What is known is that they descended from wolves sometime between roughly 15,000 and more than 30,000 years ago. Scientists still debate whether a brave hunter decided to tame a wolf puppy, or whether wolves evolved to be friendlier toward prehistoric humans. After all, begging for scraps is easier than running down prey.

Much of the confusion is the fault of 19th-century dog fanciers, whose obsession produced a “giant whirlwind blender of the European crazy Victorian dog-breeding frenzy.” As a result, dog genes look like a truck hit them. Dr. Larson, who received his Ph.D. at Oxford, is leading a collaboration among nearly every canine geneticist to create a canine database, establishing order and providing a bank of information for research to sprout from. The team travels the world collecting samples from fossils and hopes to gather 1,500 DNA samples.

With Dr. Larson’s information, we may be able to determine the dawn of the dog.

Sources:

Gorman, James. “The Big Search to Find Out Where Dogs Come From.” The New York Times, 18 Jan. 2016. Web. http://www.nytimes.com/2016/01/19/science/the-big-search-to-find-out-where-dogs-come-from.html?rref=collection%2Fsectioncollection%2Fscience&action=click&contentCollection=science&region=stream&module=stream_unit&version=latest&contentPlacement=19&pgtype=sectionfront

Should you be concerned about the Zika Virus?

The Zika virus (ZIKV) typically manifests first as a mild headache, then a rash covering prominent areas of the body. Mild fever and back pain usually follow the next day, but you will actually start to feel better by the end of the second day of illness. The fever subsides, and the rash eventually goes away within a couple of weeks. No deaths have been associated with the illness. Other than the discomfort of being ill for a couple of days, why is this a public health concern in Africa, Asia, and now Latin America? To uncover the truth, we will have to go below the surface.

Similar to yellow fever, dengue fever, and West Nile, ZIKV is primarily a mosquito-borne illness (transmitted via Aedes aegypti mosquitoes). The virus originated in the Zika Forest of Uganda, and there have been cases in other African countries, such as Tanzania, Egypt, Nigeria, and Sierra Leone, and in regions of Asia, including India, Thailand, and the Philippines. It wasn’t until 2007, nearly 60 years after its discovery, that the virus was detected outside of Africa and Asia. So how did this become a large concern in Latin America? As with most mosquito-transmitted illnesses, the answer likely lies in travel and tourism, especially in Brazil, which hosted the FIFA World Cup in 2014.

It was reported that an animal study in mice showed the virus to be neurotoxic: neuronal cells were degraded and general softening of the brain was observed in young mice. But this conclusion was not cited, and I could not identify any such research publication. Contrary to this observation, Dr. Brian D. Foy of Colorado State University (an expert in insect-borne illnesses) states that mice, rats, and other common research model organisms cannot be infected with ZIKV, making it difficult to study the virus in traditional animal studies. So what is the evidence of the effects of the Zika virus on brain development? Two pregnant women in Brazil were found with ZIKV in their amniotic fluid, and their babies were born with microcephaly (a smaller head circumference). Alongside the rise of ZIKV, there was also an increase in the number of microcephaly births in northeastern Brazil, which happens to be the country’s poorest region; it saw more than 150 microcephaly cases last year. Because of these two isolated cases and a correlation, couples in other countries who want to start families are being told to delay their pregnancies. Colombia, which has the second-highest infection rate, is recommending that women delay pregnancy for six to eight months, while El Salvador is advising women to delay for two years. Despite El Salvador’s more than 5,000 cases of the Zika virus, there have been no reports of microcephaly there.

The Zika virus, like any other mosquito-transmitted illness, should not be taken lightly. But the scientific evidence directly linking the virus outbreak with birth defects is simply lacking. The only rational fear that I can see in this situation is fear of the unknown.

Was it Medical Malpractice?

As the surgeon at NYU Medical Center gets ready to close up his patient’s chest after a lengthy heart surgery, he accidentally pokes a small extra hole in her heart. For the next 48 hours the woman’s heart beats twice as fast as it is supposed to, and her condition is touch and go. The doctors are not sure whether they should open her up again or whether the heart will heal itself. She has been sedated for the last couple of days, and when the doctors decide to try to take her breathing tube out, they realize she still cannot breathe on her own. Before they sedate her again, she writes: “No flowers, no something (they could not read it), and no obituary.” Doctors and nurses wouldn’t even allow her significant other into the ICU room for more than 10 minutes at a time. What could possibly be going on in her mind? How does she feel? Does she actually know what is going on? Was this medical malpractice?

The doctors rushed a woman into surgery because they found that scar tissue from a previous surgery had blocked her intestine. She had been unable to fully digest her food for a few days. At eighty years old, she already faced elevated surgical risk, but what added to the complications was that the doctor accidentally punctured her lung, causing it to collapse, and she eventually went into heart failure. The icing on the cake was that the doctors didn’t stitch her up correctly, and soon after, she had an infection in her blood. Doctors and nurses told her family to expect the worst: she probably wasn’t going to make it. She spent two weeks in the ICU and another week in the hospital, touch and go for about half of it. Afterward, she spent a month in a rehab facility. She had sores on the back of her head from lying in bed so much, endured weeks of physical therapy, used a walker to move around for the first time in her life, and had to undergo speech therapy. Was this medical malpractice?

Medical malpractice is a hard battle. It is an act or omission by a health care provider in which the treatment delivered falls below the accepted standards of medicine and causes injury or death to the patient. It is important to acknowledge medical malpractice when it happens, even though it can be hard to prove. A case is generally complicated from start to finish and can take months, even years, to complete. Cases are expensive, time consuming (taking time off work), and sometimes just plain impractical. There are many types of medical malpractice; just a few are incorrect incision, incomplete surgical procedure, and inappropriate postoperative care. Claims require sophisticated medical knowledge and strong arguments from both sides, generally from experts in the field. Though the injury has to be physical to qualify, the malpractice takes an emotional toll on the person and the family too. Even though the incidents above were accidental, they were definitely harmful. So, were these two situations malpractice?

*The two women in these stories are related to me.

Don’t Tell Me What to Eat

Too often I read articles claiming that if you do not have celiac disease, going gluten-free offers no health benefits. But what most of these articles also state is that when people choose to go gluten-free, they cut out foods such as cookies, cakes, and bread, which are considered unhealthy. Removing these foods from the diet, even if the reason is “going gluten-free,” does have health benefits. It pushes individuals, potentially, to make healthier choices. What articles need to emphasize is that the gluten-free diet fad can be harmful, or at least lack benefits, when people replace common gluten-containing foods with highly processed gluten-free versions. Going gluten-free may help guide people away from unhealthy choices – choosing fruit over cake for dessert, or having a salad instead of pizza for lunch. The fact that weight loss, or reports of people feeling better on a gluten-free diet, may not be attributable directly to the lack of gluten itself does not mean that a gluten-free diet has no health benefits for those without celiac disease.

Not enough is understood about gluten and its effects on the human body, but for those with celiac disease, cutting out gluten is vital. Wheat gluten is a protein formed when two molecules, glutenin and gliadin, combine and bond. This happens when dough is kneaded, and it is responsible for bread’s chewy texture. Celiac disease is an autoimmune reaction to wheat gluten in which the immune system triggers an inflammatory response in the individual’s intestine. Essentially, the body mistakes gluten for a foreign invader, attacks it, and can cause serious damage to the surface of the small intestine. As the prevalence of celiac disease increases and more research on gluten is published, more people are educating themselves via the media and deciding to try a gluten-free diet.

Articles about the benefits or harms of gluten and gluten-free diets need to share all the facts before advising readers what type of diet to follow. Yes, if you are choosing to go gluten-free, make sure you supplement your diet correctly so you don’t lose out on essential nutrients from whole grains. Just because you are removing gluten from your diet does not mean you should swap common gluten-containing foods for their gluten-free versions – check labels and educate yourself about the ingredients you are putting into your body (the same goes for foods with gluten, too). There should be an emphasis on natural, whole foods. Writers should also include information on the processing of wheat and how much of that processing is harsh and unnatural. Once given all the information, readers can make their own decisions about whether they want to eat gluten, and in what amount.

Every body is different: some people can digest gluten, some people can’t, and some can only in certain amounts. The same goes for all food groups. Information shared about different foods and diets needs to make clear that certain foods work for some people and may not work for others. Trust me, if you told me to eat vegan, I would have stomachaches for days – I physically cannot digest that much fiber and vegetables. Speaking from experience, I have been gluten-free for about six years and was diagnosed with celiac disease a year ago. I support reducing gluten or going gluten-free, not by substituting gluten-free versions for foods with gluten, but by eating a healthier diet overall. As Michael Specter explains in his article, the Western diet today is full of sugary foods and refined, highly caloric carbohydrates. The majority of the wheat eaten in the American diet comes from white flour that is highly processed, containing gluten while lacking vitamins and nutrients. Committing to a gluten-free diet forces you to remove foods that contain white flour, among other things, that have been linked to or can potentially contribute to negative health outcomes. Choosing a gluten-free diet can have benefits for some. It can also have harmful effects if not done correctly. It may not be necessary; it may contribute to weight loss, or it may not. Before advising readers to choose a diet, let’s share all the facts, emphasize the importance of natural foods, and reiterate that every body is different: one diet may not be the answer for you the way it is for me.

http://www.newyorker.com/magazine/2014/11/03/grain

http://www.livescience.com/36863-gluten-free-diet-healthy.html

October done right? Or wrong?

Think Before You Pink – Stop the Distraction

http://bcaction.org/2014/09/30/think-before-you-pink-stop-the-distraction/

How do you feel in October when your whole community is “pinkwashed”? I used to feel informed and proud to be in the presence of so many symbols of thriving life, awareness, and productive research. After reading the article Think Before You Pink – Stop the Distraction, I now feel angered by the whole sham behind the pink products. Author Karuna Jaggar asks, “what have all these pink ribbon products and promotions done for women living with and at risk of breast cancer?” It disgusts me to learn that what started as a means of creating awareness has turned into a “pink ribbon culture” that fills the coffers of marketing giants who don’t put the money toward the use you would expect. The goal of purchasing that extra tote with a pink ribbon on it is not to put dollars in the pockets of marketing companies; it’s not even to create more “awareness”; it is to create action. I always assumed that paying extra for the pink items in October meant that research was being funded, but unfortunately, in most situations, this is not the case. More often than not, “companies sell products, make profits, and seek customer goodwill by claiming to care about breast cancer.” They are so successful because they rely on people’s goodwill. The fine print about where the money goes is never offered or discussed publicly. Here is the most outrageous part of the whole bogus pinkwashing: some companies use the popularity of the pink ribbon to sell products that are actually linked to breast cancer itself! For example, Alhambra Water “is pinkwashing by selling plastic polycarbonate water bottles which contain BPA, a hormone-disrupting chemical linked to breast cancer – while claiming to care about (and profiting from) breast cancer.” How’s that for a contradiction? Clearly, many industries have used pinkwashing as a tactic to take advantage of people’s altruism and charity to selfishly benefit only themselves. So, next October (or whenever you feel like donating to breast cancer action), make sure to do your own research and find out exactly where your money will end up.

No more animal testing: a better way to develop drugs?

For the longest time, animals have been used as models for humans in drug testing. Many medications never make it to market because these models fail to correctly reflect how the human body reacts to drugs. Even those that do become commercially available take anywhere from 10 to 15 years and billions of dollars to reach that stage.

However, the Wyss Institute for Biologically Inspired Engineering has invented what may be a better method for drug development: organs-on-chips (OCs). These OCs aim to imitate the way human organs respond to drugs; they are made of a clear, flexible synthetic plastic about the size of an adult’s thumb and are lined with human cells. OCs could cut the time and money it takes to develop new drugs, as they present an alternative to more expensive, lower-quality animal models.

Despite the positive outlook for this technology, scientists are still unsure whether OCs fully and accurately represent a human organ’s response. More research needs to be conducted, but these organs-on-chips may well be the future of drug testing.

Sources:

http://wyss.harvard.edu/viewpage/461/

http://www.nature.com/news/organs-on-chips-go-mainstream-1.17977

Vitamin Supplements: The Ongoing Debate Between the Yaba-daba Doers and Disbelievers

Despite the hundreds of miles separating us, her voice rang with such clarity in my head that she may as well have been in the same room as me.

“You sound like you’re getting a cold, Madeline.  Take some extra Vitamin C’s and drink plenty of fluids.”

And as I have for the last two decades, I took the extra vitamins – not the awesome Flintstones ones of cushy elementary school days past, but the chewable ones labeled “Adult.” This isn’t because it’s corroborated medical advice, or because I’m afraid to lie to my mom (if you’re reading this, Mom, I would never lie to you). It’s an ingrained behavioral response – I have always followed a daily supplement routine, upping the dosage at the tickle of a sore throat or the onset of a stuffy nose.

Recently, however, vitamin supplements have been accruing more rants than raves from the scientific community.  The latest medical findings are challenging the alleged immunity boost previously believed to be acquired from taking dietary supplements.

The immune system is composed of two components: the innate and the acquired response. The innate response identifies that there has been an intrusion, and the acquired response eliminates the foreign bodies. Charles Bangham, a professor of immunology and infectious diseases at Imperial College London, says there is no place for vitamins in this equation. The only way to speed up the process and boost immunity is through vaccination; otherwise, what manufacturers are doing “is implying that if someone on a normal diet takes [vitamins] they will improve immune function, which is plain wrong.”

Tim Ballard, vice-chair of the Royal College of General Practitioners, cites “methodological weaknesses” and the difficulty of reproducing the findings of studies that claim to demonstrate the benefits of vitamins. He says there isn’t much one can do to combat disease “other than a healthy diet and regular light exercise.”

But medical professionals’ opposition to vitamin supplements has yet to influence consumers. In 2014, consumers spent $14.3 billion on vitamin- and mineral-containing supplements, $5.7 billion of which went to the multivitamin/mineral supplement sector. According to the National Institutes of Health, more than one-third of all Americans take my mom’s advice.

Is this a problem of miscommunication between the medical realm and the public?  Or is it simply that old habits die hard?  Are advertising companies really good at what they do, or will there always be a cohort that will take their mothers’ advice over the cold embrace of statistics and scientific findings?  Whether you’re a superstitious supplement user or a vitamin virgin, there are still two persistent sides to the debate.

Sources:

https://ods.od.nih.gov/factsheets/MVMS-HealthProfessional/

http://www.theguardian.com/science/2016/jan/24/health-foods-immune-system-colds-vitamins

Obesity Is Caused by Eating Too Much?

Original report website: http://well.blogs.nytimes.com/2016/01/07/rethinking-weight-loss-and-the-reasons-were-always-hungry/?_r=0

The article I read, “Rethinking Weight Loss and the Reasons We’re ‘Always Hungry,’” published by The New York Times, is an interview with David Ludwig, author of Always Hungry, an obesity expert, and a professor of nutrition at the Harvard T.H. Chan School of Public Health. Before reading it thoroughly, I scanned it and noticed that the piece uses a question-and-answer format. The title suggested to me that the article might challenge conventional thinking about why we are always hungry and how to lose weight. As expected, in the introductory paragraph the journalist uses an analogy from Dr. Ludwig to overturn previous thinking about losing weight. Dr. Ludwig believes that calorie counting is not an effective way to lose weight.

The opening attracts readers’ attention with its concise language and an interesting analogy. Then, after a brief description of what the book is about, the journalist begins the interview. I would like to quote some of the questions asked:

  1. What is the basic message of your book?
  2. But we’ve all been told that obesity is caused by eating too much. Is that not the case?
  3. That’s very different from the conventional wisdom that weight loss boils down to calories in, calories out.
  4. If it’s not overeating, then what is the underlying cause of obesity?
  5. How do you get your obese patients to lower their insulin?

The questions above contain few complicated terms; instead, they are clear and easy to understand.

As a journalist in training, I assumed that conducting this interview would require a deep professional science background. In reality, it does not. Analyzing the questions this journalist came up with, I noticed two characteristics:

First, the journalist may not know much about the biological theories of obesity, but he does know the conventional wisdom, such as “obesity is caused by eating too much” and “cutting calorie intake leads to weight loss.” This common knowledge is exactly what readers know, and it triggers their curiosity: what is wrong with the conventional view? Many readers do not have a solid background in complicated biological terminology, and they are looking for simple words and explanations to answer their questions. Given that, not knowing much about a specific scientific field is not necessarily a bad thing for a journalist; on the contrary, it may help him find the questions that strike a chord with his readers and figure out the answers to them, which is what readers want to gain from the piece.

Second, though the journalist may not have a strong background in weight loss and obesity, he clearly did his research on Dr. Ludwig’s new book, and the questions he prepared hit home on its main topic. Because of that research, the journalist could adapt his questions to Dr. Ludwig’s answers. When Dr. Ludwig answered that low-fat, very-high-carbohydrate diets raised people’s levels of the hormone insulin and thus caused their obesity, the journalist followed up by asking how he gets obese patients to lower their insulin. This coherent, logical flow of questions helps readers easily understand and capture the answers they are seeking.

Overall, this piece changed my first impression of what it means to be a journalist covering science. Our goal is to use simple words and terms to explain complicated scientific phenomena. Doing research and keeping the big picture in mind really do help a journalist ask flexible questions during an interview.
