The Ancestral Indigenous Diet: A Whole Foods Meat-Based Carnivore Diet


by Frank Tufano


  The body triggers an immune response whenever these fats end up in the bloodstream. Over time, your body’s fat stores become increasingly composed of linoleic acid (a type of Omega-6 fat). And when the cholesterol circulating in your arteries is built from linoleic acid, it is prone to becoming inflamed. This inflammation, more than the mere presence of cholesterol itself, may be what is causing so many strokes and heart attacks in the United States.

  But the meals we have used as examples are all junk food, you might say. What about “healthier” options? Surely those are fine, right?

  Chicken breast and rice might not seem so bad from an inflammation perspective. Because they are so lean, even grain-fed chickens aren’t adding much in the way of Omega-6 fatty acids. But they probably have hormones and antibiotics that aren’t great to ingest. And what type of rice are you eating? With all the attention paid to the glycemic index these days, most people know that white rice breaks down quickly, so maybe you will avoid it and the related insulin response. Whole-grain rice, however, comes with lectins that can trigger an immune response, among other issues. And any type is probably laced with pollutants, arsenic, and agrochemicals.

  This sure is a lot to keep track of. And don’t forget the primary goal of our diet. Beyond everything else, your “healthy” lunch is still missing the most important thing we need: nutrients.

  Hoping Vitamins Will Protect You

  If nutrient density is the first goal of the diet, the second is removing inflammation. Of course, you could try to figure out which foods your body tolerates and which ones it doesn’t through a process of elimination. But the easiest way is to simply remove everything that doesn’t serve a purpose and then reintroduce foods one at a time.

  This is one of the primary reasons I have chosen to eat carnivore and stick to animal foods. First off, I know this gives my immune system all the vitamins and nutrients it needs to operate optimally. This means that my body is much more capable of handling any inflammation that does occur. Because even if I eat perfectly, I still live in New York and am exposed to contaminants and pollution that my body has to deal with. How can it handle all that if it’s busy fighting off perceived attacks from sugars, lectins, and the other triggers of leaky gut?

  There is one anecdote that helps explain my mentality on all this. I once went through a phase where I ate liver every day with butter. Generally, butter causes me to break out. In some capacity, there is a combination of nutrients and bacteria in every individual’s microbiome that allows different people to tolerate different foods. Combined with allergies and other issues, this means that you and I may react very differently to certain things we eat. The fat particles in butter can be hard for some people to digest, and they were likely giving me leaky gut.

  I’m not fully sure about the science of it all. But my theory was that, even though I am allergic to butter, consuming it alongside such a high-vitamin food would prevent me from getting acne. It worked! The Vitamin A from liver, in particular, seems to always help eliminate acne whenever I have a breakout.

  This is still kind of stupid, though. The better idea — which I quickly learned — was to simply eliminate the butter and eat liver by itself. I can get all the nutrients available from butter in another way. So why am I torturing myself, looking for antidotes to an allergy that I can just avoid in the first place?

  The Elimination Protocol: Cut It Out

  Over the past decade, the “health and wellness” community has been focused more and more on antioxidants. Study after study tries to determine which foods and plants we can eat to lower the levels of oxidative stress that cause inflammation, accelerate aging, and generally harm our health. People are now stuffing their faces with blueberries, turmeric, and goji berries — even chocolate and wine — in hopes of repairing themselves.

  That’s all fine and good. But it seems to me that there’s an even better solution. Rather than looking for magic superfoods that will help reverse the damage — like I once did with liver to cure my acne — why not just stop consuming the things that we know cause oxidative stress and inflammation in the first place?

  At some level, everything you ever eat puts some stress on the body. (This is a main reason why fasting can be so beneficial.) But certain foods — like grass-fed, hormone- and antibiotic-free beef — cause almost no inflammation. Omega-3 fatty acids also don’t cause nearly the same immune-system response as the Omega-6 fatty acids that people shovel into their mouths all day long.

  It’s almost too obvious. The best way to protect yourself from the dangers of inflammation would be to never eat. But that is unrealistic for any human who wants to remain alive.

  The second-best way, then, is to stop eating inflammatory foods. If you can get all the nutrients you need from animal foods and avoid all the inflammation you don’t want, doesn’t it just make sense to stick to these foods? Or is that kale and spinach smoothie just too delicious to give up?

  If you don’t actually need it for nutrition — and it’s not even good for you — the solution is simple: Cut it out. What you don’t eat can’t hurt you.

  Chapter 6

  Plant Food Myths:

  Why Fruits, Vegetables, and Grains Aren’t Necessary

  Oatmeal is a common part of the modern diet, one deemed a bona fide health food by nutritionists across the world. This is despite the fact that it is usually sold in single-serving packets mixed with apple-cinnamon or maple-and-brown-sugar flavor concoctions. Once relegated to vegetarian stores, oat milk has also exploded onto the scene and now forms the base for millions of vegan Oatly smoothies every day.

  In these forms, it’s hard to find anything good to say about this grain. But it has long had a place in the human diet — if prepared properly. The way you make your oatmeal can make all the difference between your breakfast being a nourishing energy source or a potentially inflammatory food to avoid.

  Let’s explore this with a little history lesson. Oatmeal has been a staple in Scotland since before the place was even called Scotland. The grain arrived in the British Isles, likely during the Iron Age, became part of the region’s traditional Gaelic culture, and has been synonymous with these people ever since.

  When Samuel Johnson wrote the Dictionary of the English Language in 1755, he included this in his definition of oats: “A grain, which in England is generally given to horses, but in Scotland supports the people.” (Some local humor can be seen in a popular response: “That’s why England has such fine horses and Scotland such fine men.”)

  Oats became the preferred grain here because they grew well in the wet, cold, harsh climate of these northern highlands. They also happened to be a good choice due to their relatively higher caloric and fat content compared to other options at the time like bulgur wheat, buckwheat, rye, sorghum, and barley, according to the Whole Grains Council. In times of scarcity, oats could be stretched just that much further.

  Today you can even buy “Scottish Oats” all across the world, although even this steel-cut variety differs greatly from the raw “groats” that were used historically. Generally, you won’t find these in your local supermarket, and even the hippies that may opt for this type fail to cook them properly in line with the traditional preparation method.

  In a process that dentist Weston Price observed during his trip to the isolated Isles of Lewis and Harris almost a century ago, the not-yet-modernized Gaelic communities — who he wrote had “teeth of unusual perfection” and “a physique that rivals that found in almost any place in the world” — relied heavily on fermentation to create their porridge.

  Among those still living a traditional lifestyle and eating the traditional way, oats were a main energy source to complement a diet otherwise full of fish, lobster, crabs, oysters, and clams. “An important and highly relished article of diet has been baked cod’s head stuffed with chopped cod’s liver and oatmeal,” wrote Price in his classic book Nutrition and Physical Degeneration. He reported that “fruits are practically unknown” and that even dairy was limited.

  Another book, The Scots Kitchen, written by Florence Marian McNeill in 1929, helps explain the traditional process of turning raw oats into something known to these people as “sughan” (or “sowans” in English). First, the full groat would be milled and ground into husks called “sids.” These would then be soaked in water for seven days, and sometimes even longer. After that long bath, the sids would be tossed aside and the cloudy water reserved to rest for a few more days. The floating remnants — forming something of a cloudy sediment — would settle over time in the container. This substance is what would become the sowans.

  Through generations of passed-on wisdom, these Gaelic communities had learned that this process was a great way to extract all the nutrition from the grain while discarding the negative parts that were hard to digest and generally detrimental to consume. The substance left over “contains practically all the nutritious properties of the oatmeal in its most easily digested form,” wrote McNeill. In modern terminology, we would say this method was a way to make the nutrients more “bioavailable” to our bodies.

  To actually prepare the meal into its final sowans form, they would then heat it up, adding salt and dairy (raw cream or milk most of the time). The result was a meal both high in calories from the grains and rich in vitamins and other nutrients from the added raw milk fat.

  The Role of Oats in the Gaelic Diet

  This classic dish can be traced back to an ancient culture and perfectly illustrates the historic role that plant foods have played in the human diet. Even after the advent of livestock, it was difficult to procure all of a community’s calories from animals alone. There were only so many cows, and any of them could get sick and die at a moment’s notice from a mysterious illness sent by the gods.

  Besides, these domesticated animals didn’t really resemble modern cattle. At least not functionally. The cows these old Gaelic people would have raised would have produced a fraction of the milk that today’s super cows pump out. The increase in just the last century alone has been staggering. Genetic selection, breeding, farm science, and chemicals have forever changed husbandry into a profit-motivated industry.

  In 1900, according to a study from the Journal of Dairy Science, a dairy cow in the United States could be expected to produce an average of around 11 pounds of milk per day. By 1950, this number was up to 14.6 pounds, and it nearly doubled to 28.4 pounds each day by 1975, per USDA figures. But they were just getting started. In 2000, the average cow would produce 49.9 pounds per day — and the top performers would do way better than that.

  “Production as high as 60 kilograms (132 pounds) per day is not uncommon,” stated the Journal of Dairy Science paper about modern high-producing cows. “In fact, the current world-record Holstein produced more than 30,000 kilograms (72,752 pounds) of milk in a year. That’s almost 90 kilograms (198 pounds) per day on average — enough to feed more than 100 people.”

  The people who once lived in modern-day Scotland — even in the best seasons — never had access to the animal products we take for granted today. Living with scarcity, it only made sense to stretch everything as far as possible.

  That was likely the inspiration for this porridge dish that nobody in today’s society would ever expend so much effort to make. As labor-intensive as this process was, the caloric expenditure involved was probably comparable to gathering wild plant foods, and it was a much more certain way to feed yourself.

  But they knew that whole grains were not worth eating in their natural form. So they went through all this to make the sowans. In a pure nutritional sense, they would have been fine just drinking raw cream and eating cheese. But the dairy wouldn’t stretch as far for the community or the individual family.

  Thus, a great compromise: Use half the dairy and mix it into some long-prepared oats. You preserve your resources, get adequate nutrition, and fuel up with carbs for enough energy to make it until tomorrow. There’s even an added bonus to this survival strategy: It tastes great!

  Understanding and Avoiding Anti-Nutrients

  There was one other great reason to put so much effort into oatmeal preparation. Without even knowing the science behind it, they had also gone to great lengths to remove most of the dreaded “anti-nutrients” contained in the grains.

  This was something they surely learned over time. It is likely that they observed negative effects when they consumed oats prepared in other ways. Maybe the issues were just digestive. Maybe it was just feeling bad or lacking energy. Or perhaps they saw dental problems, suffered skeletal issues, or even stunted development.

  Either way, we can forgive the Gaelic people living centuries ago for not knowing the term “anti-nutrients.” But it’s really strange that almost nobody talks about them today either.

  What exactly are anti-nutrients? In the most basic terms, anti-nutrients are negative substances found in plant foods that can inhibit certain processes in the body, such as digestive enzyme activity or mineral absorption, and in turn impair metabolic function.

  In something closer to normal English, they are compounds that make the “paper value” of the minerals and vitamins in many grains and vegetables misleading. The numbers and RDA percentages you see on the bag of veggies may look promising. But anti-nutrients, among other factors, mean your body probably isn’t ever going to use much of it. So does it really matter how much magnesium is technically in a pound of spinach if your gut is incapable of ever absorbing it?

  Gluten, a protein in many grains that some people can't tolerate, is the one anti-nutrient that has become well known. Some cereals also contain other proteins, such as avenins (in oats) and gliadin (in wheat), that can cause digestive issues. Then there are the lectins (found in high levels in legumes), oxalates (common in green vegetables), and many others that are widespread in modern diets.

  As a rule, you don’t want to be consuming these. They are believed to have evolved as something of a defense mechanism for plants. Think of it like this: Cheetahs got really fast to avoid being eaten. Turtles weren’t so lucky in that department, but they have survived with strong armor. Cobras have potent venom and long fangs.

  Plants, on the other hand, can't move or physically protect themselves very well. So they turned to chemistry and started producing substances that can harm anything that eats them. Insects and other smaller animals can face significant problems if they consume too many.

  There have even been documented cases of people dying from toxicity caused by glycoalkaloids in sprouted potatoes. But for humans, anti-nutrients have historically been more of a slow-burning complication than an acute danger. They can cause an inflammatory response in our digestive systems, disrupt the microbiome in our intestines, and lead to other chronic problems.

  In high enough amounts, they may even contribute to problems like “leaky gut” syndrome (increased permeability of the epithelial lining), in which the walls of our small intestine are damaged badly enough to allow larger particles to pass through into our bloodstream. These substances are not meant to exit the gut this way. So it is speculated that this can cause a whole host of issues and promote chronic inflammation, because it triggers our immune system to start fighting what it sees as a foreign invader.

  These and other concerns are a large reason why the Ancestral Indigenous Diet includes no plant foods. In the future — after you fix years of health problems caused by a bad diet — if high-quality organic plant foods are accessible, incorporating them and gauging your individual tolerance could be a reasonable goal. But, simply put, you don’t actually need them for nutrition. They are often inflammatory. And they may cause other types of damage in our gut.

  It is not within the scope of this book to break down all the complex science associated with these compounds or participate in the endless debate about how serious all the potential effects might be. There are countless studies across dozens of peer-reviewed journals that have been analyzing all this for years.

  But there are undoubtedly some real concerns to consider. So the following breakdown offers a list of a few of the most common anti-nutrients, where they are typically found, and why they might be problematic.

  Phytic Acid / Oxalates

  Found in: grains, legumes, greens, nuts, seeds

  While phytic acid (phytates) and oxalates are different in nature, they are frequently discussed together in the context of vegetarian and vegan diets because they both inhibit the absorption of various minerals. They are scientifically distinct compounds, but the main difference for our purposes is that phytic acid is found in grains, legumes, nuts, and seeds, while oxalates are found in both greens and legumes.

  Phytic acid inhibits the absorption of phosphorus, calcium, copper, iron, magnesium, zinc, and manganese. The degree of disruption can be as high as 80% (for phosphorus and calcium) or 40% (for manganese), depending upon the mineral. Oxalates are primarily associated with inhibiting the uptake of calcium, iron, and magnesium (by binding to these minerals). Because these two anti-nutrients overlap so heavily in vegan and vegetarian diets, where intake is high and animal sources are absent, they are one reason that anemia (iron deficiency) is so common in those groups.
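  To make the earlier point about “paper value” concrete, here is a minimal sketch of the arithmetic in Python. The label amount and inhibition rate below are hypothetical placeholders chosen purely for illustration, not measured nutrition data.

  # Illustrative sketch only: the label amount and inhibition rate below are
  # hypothetical placeholders, not measured nutrition data.

  def effectively_absorbed_mg(label_mg, inhibition_pct):
      """Estimate how much of a listed mineral survives anti-nutrient binding."""
      return label_mg * (1 - inhibition_pct / 100)

  # Example: a label claims 100 mg of calcium, and phytic acid is assumed to
  # block up to 80% of absorption (the high-end figure quoted above).
  print(effectively_absorbed_mg(100, 80))  # prints 20.0 -> only ~20 mg usable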

  Glucosinolates

  Found in: cruciferous vegetables, including broccoli, cauliflower, kale, cabbage, and Brussels sprouts

  Glucosinolates are known to be goitrogens (meaning that they can potentially encourage the growth of goiters when consumed in large amounts) and have been shown in some animal studies to increase free radical formation. They also go hand in hand with the presence of isothiocyanates, which can stimulate detoxification enzymes (creating an inflammatory response), interfere with DNA segregation (leading to cell death), and inhibit both iodine uptake and thyroid hormone production. Beyond this, glucosinolate consumption is associated with indoles (inhibiting ATP energy metabolism) and nitriles (stimulating detoxification enzymes).

 
