“Research” Doesn’t Mean “Unquestionably True”

I like to hear people talk about how obesity wasn’t a problem until everyone started eating low-fat foods. That would be a super intriguing theory if anyone had ever actually consumed a low-fat diet, but they didn’t. In fact, we are eating 20% more fat than we did in 1970. We are also eating 30% more grain (mainly white flour) than in 1970. This is actually an improvement; in 2000 we were consuming 45% more grain than in 1970. Yet, sales of gluten-free foods continue to skyrocket. Take a moment to ponder the absolute absurdity of this…

The truth is that Americans eat more and move less than ever before. This isn’t news. Recently, a small group of people has gone ahead and added an occasional meat and grass-fed butter binge to their repertoire to break up the monotony. People throw in a few burpees here and there, usually followed by an average of FIVE HOURS of television a day. Guess what? Low-fat foods aren’t the problem here; rice and skim milk aren’t the culprits in the dilemma we are facing.

I often hear “health experts” attempt to back up their biased beliefs with research that isn’t even lucky enough to be classified as “questionable” or even “preliminary.” I recently had the pleasure of hearing a person with a 12-month online education on enthralling topics like the blood-type diet, now debunked for over two decades, announce to a room full of vulnerable people that she is a “research junkie” and is always reading the latest science. I bet she is actually reading a lot of stuff; I’m not going to take that away from her. I believe that she believes that she is very well educated.

The truth is that just because something was published in a magazine, or even a journal, doesn’t mean it’s accurate. It doesn’t matter if 1,000 media outlets report on something if that something doesn’t have any credibility to begin with. Sometimes research is designed to be media-friendly but is based on biased sources. At other times, researchers skew the results to fit their emotions or personal beliefs. What gets the most ratings? What gets the most hits? What sells the most product?

I wish that everyone, from bloggers to health professionals to the average American just trying to figure out what the hell to eat these days, would ask themselves a few questions before taking a piece of research as gospel:

  • Are the researchers being paid by an interest with a stake in the outcome (such as a food or supplement manufacturer)? Studies funded by a combination of sources, such as the government and industry, tend to be the best; reviews with no funding are generally the worst.
  • What did the research design include? How many subjects were there, how long was the study, etc.? The fact that the “research” on ingredients in popular dietary supplements often amounts to studying ten people for four weeks or so, in a non-blinded fashion, doesn’t stop manufacturers from profiting to the tune of millions of dollars a year.
  • Did the study account for all potential confounders? “Is it exposure X, or something else about the people exposed to X, that is causing their illness?” For example, people who take vitamin supplements also tend to eat better overall, exercise more, be better educated, and have more expendable income.
  • Were the researchers balanced and unbiased? Review their political, social, and professional affiliations and check out their degree(s) and credentials. Identify conflicts of interest, such as an author who is supported by advertising.
  • Were primary or secondary sources of information used? Look for sources that are members of, or are governed by, reputable industry or professional bodies. A study synopsis published in a “front of the book” blurb in a monthly lifestyle magazine is a secondary source.
  • Was the study done on people or is it “preliminary animal research”? People and mice are pretty different. Mice are tougher; you’d be amazed at what a mouse can survive.
  • Was the study based on the participants’ recollection? Researchers might ask participants how much physical activity they do or how often they eat a specific food, but experience has taught us that people are very rarely accurate when answering these kinds of questions. For instance, they might think that they are eating a low-fat diet when really they are eating a high-fat diet with a few low-fat cookies and a handful of low-fat potato chips added to the menu occasionally.
  • If the researchers gave the participants instructions, such as taking a daily supplement or eating a certain number of servings of a food, is there any evidence that the participants followed the directions as given? Do you tell your doctor when you don’t take your medication as prescribed? Probably not; you don’t want to upset your doctor. Study participants don’t want to upset the researchers either.
  • Was the active substance compared to a placebo in a blinded fashion? Blinding is vital to control for the placebo effect.
  • Could there be reverse causation? For example, are people becoming overweight because they are drinking diet soda or are obese people more likely to choose diet soda in an effort to lose weight?

Be wary of the meta-analysis. When you see that the researchers reviewed hundreds of studies, it’s easy to assume that the results are credible, but sometimes a meta-analysis isn’t the best way to answer a specific question. In one widely publicized example, the authors concluded that people randomly assigned to replace saturated fat with polyunsaturated fat had no lower risk of heart disease, but they had included a group of people who were given margarine that was absolutely loaded with trans fat. They also failed to mention the evidence showing that saturated fat increases LDL cholesterol. The media grabbed onto this meta-analysis and loudly announced that “butter is back!” No, it isn’t. A meta-analysis is only as good as the studies it reviews. You can conduct a meta-analysis of 500 poorly done studies, and people will assume that because so many studies were included, the conclusion must be true; this is nothing more than wishful thinking.

Go ahead and take a minute to delve deeply into something you recently heard about that seemed like maybe it was too good to be true. The unfortunate truth is that it likely IS too good to be true.

While you do that, I’ll be patting myself on the back, because it turns out I’m going to live forever. The “research junkie” says that people with type A blood should eat a vegetarian diet with a high intake of fruits, vegetables, grains, legumes, nuts, and seeds.

Maybe she is on to something after all! If only everyone were a blood type A…