Created on Tuesday, 24 January 2017 21:10
If you supplement with Vitamin D3, you are wise, and that’s true for just about everybody. But, if you do so thinking that it eliminates your need for sun exposure, you are wrong. It has long been suspected that sunlight does more for the human body than facilitate the production of Vitamin D. And now there is some concrete evidence for that.
A new study by Georgetown University Medical Center researchers, published in Scientific Reports, discovered that sun exposure has immune-boosting effects that are independent of Vitamin D. They found that blue light specifically activates certain key immune cells, known as T lymphocytes, and increases their motility. The researchers found that the blue light triggered the synthesis of hydrogen peroxide, which then activated key signaling pathways that led to increased movement of the T cells.
Blue light is known to reach the dermis, which is the second layer of the skin beneath the epidermis. And the dermis is loaded with T lymphocytes. These T cells can go from the dermis into the blood and get to places all over the body. So, there is a systemic effect from this, where sunlight seems to power-up your whole immune system to help you fight infection.
What does this mean from a practical standpoint? I think it means that if you’re taking Vitamin D, and even if you are taking a lot of Vitamin D, you should still get sun exposure. And this isn’t the first study to suggest sunlight benefits us outside of Vitamin D. For instance, the mood-elevating effects of sunlight are believed to be independent of Vitamin D.
When you consider the extent to which we are solar beings- that our energy, and I mean our very own physical and mental energy, comes, in a straight line, from the sun's energy which showers the Earth, and that our biological cycles, known as circadian rhythms, are timed to the solar cycle- it stands to reason that the effects of sunlight upon us would be vast and not just one thing, Vitamin D.
So, even though we can benefit from pushing the Vitamin D envelope by taking it supplementally, it does not eliminate our need for sunlight exposure.
Men have it easy. If you’re outside, working or playing or just walking, and the sun comes out and it’s warm, don’t be modest; take your shirt off. You can’t get arrested. I do it frequently. Why the heck not?
But remember: you want to avoid tanning, because tanning is a sign of damage, plus tanned skin is not as receptive to the sunlight. So, this is a delicate balancing act in which you want to get small amounts of sunlight often- never staying out in it long enough to provoke a tanning response.
Sunlight is one of the primary and fundamental needs of life, and unfortunately, people tend to either ignore the need for it or overdo it. With care and caution, you should make it a priority to obtain a healthful amount of sunlight- your solar food.
Created on Tuesday, 24 January 2017 00:30
We have it good here in Texas. This time of year- winter- the Texas Ruby Red grapefruit are in season, and they are cheap, plentiful, and sweeter than any other grapefruit you are ever going to eat. Texas grows oranges too, although I don't say they surpass the best California navels. But, I like oranges and grapefruits, and I like other acid fruits, such as kiwis, berries of all kinds, and pineapples. And even apples are quite acidic, being high in malic acid rather than citric acid. And there are plenty of other fruits with rather high acid content, such as plums, mangoes, grapes, and more. Oops, I left out tomatoes.
So, if you are eating a wide variety of these fruits in significant quantity, that's a heck of a lot of acid. Are there any negative consequences to that?
First, note that there is no danger of the acid building up in your body and making your blood acidic. These organic acids are broken down very quickly to carbonic acid, which in turn is very quickly converted to carbon dioxide, which you automatically exhale. So, you breathe out the acid from these fruits, and the process by which this happens is very efficient. So, that isn't a worry.
The biggest problem from eating acid fruit is the effect it has on your teeth, which is to erode your dental enamel. And that enamel doesn't grow back. Once it's gone, it's gone. And when it's gone, your teeth look yellow and dingy; they may chip and break more easily; they may become painfully sensitive to heat, cold, and sugar; and they may become more prone to tooth decay. So, you want to hold on to your tooth enamel for as long as you possibly can.
Then, the other problem that has been linked to eating a lot of acid fruit is the development of canker sores. These mouth sores, which can form on the inside of the lips, the undersurface of the tongue, or the inside of the cheek, are painful ulcers that can last for 1 to 2 weeks, making eating, talking, and even sleeping painful and difficult. Acid fruits are known to be a major trigger of canker sores. And that has been true in my experience and the experience of others I know. Someone may have a citrus tree or a plum tree at home, and when the fruit is plentiful, they indulge liberally, and in a little while, they have one or more canker sores. The acid irritates the cells- actually burns them a little- and the body responds with inflammation, and in susceptible people, the inflammation advances to a destructive form. It really is like an autoimmune disease- one that millions upon millions of people have.
The good news is that canker sores, except in rare cases, eventually heal without a trace, leaving no scar and no remnant that they were ever there. No remnant, that is, except the memory of how painful they were.
So, what can you do to prevent them?
First, control the amount of acid fruit you eat. Limit it to reasonable, moderate amounts. Second, if a fruit is unusually acidic, meaning that the orange or the grapefruit or the pineapple or whatever is super-sour, then don't eat it. For instance, you really shouldn't buy navel oranges until January. They rush them to market in the fall, as early as October, but they really aren't ripe yet. You know how tart a fruit should be, and if it's more sour than that, don't eat it. Pass on it. Make that a rule. Third, rinse out your mouth with warm water after eating acid fruits to remove any film of acid that may still be clinging to your teeth. Fourth, for the most part, avoid acidic drinks. If you take your oranges or grapefruit in juice form, it may actually concentrate the acid since the fiber and other solid matter is being discarded. The acid easily passes into the juice. And note also that when you start juicing, you usually wind up consuming more. Once you start juicing oranges, you might go through 3 or 4 to come up with a whole glass, or even more than 3 or 4, depending on how juicy they are. And that's more than you would eat if you were just eating them.
A life without oranges, grapefruits, plums, mangoes, etc.- I'm not sure it would be worth living. So, I am not suggesting that you avoid these foods completely. In one of the largest and longest longitudinal studies ever done, the lifestyle factor most associated with living longer was eating fresh fruit. So, let's not throw the baby out with the bath water. But, there is a happy medium in which you can enjoy these fruits and not suffer ill effects from eating them. Be conscious and aware of how much acid fruit you eat, and don't go overboard.
I should also point out that mechanical trauma to the mouth from biting your lip or cheek can trigger the start of a canker sore. So, what can I say except eat slowly and chew your food well, taking small, manageable bites without carelessly traumatizing yourself. Become a conscious eater.
Created on Friday, 06 January 2017 21:03
This write-up by Dr. John Cannell is very interesting because it concerns an 83-year-old woman with pancreatic cancer who started taking 50,000 IU a day of Vitamin D3 to treat her cancer, and she showed signs of remission which went on for at least 8 months. After that, they lost track of her, but who knows, maybe she is still going strong.
I'm posting this for two reasons: first, in case someone you know develops pancreatic cancer- and hopefully it won't be you yourself. And second, to demonstrate how safe taking Vitamin D3 really is, because this woman took 50,000 IU a day for at least 8 months, apparently with no adverse effects at all. It didn't even cause her blood calcium level to rise too high- and that's the first thing you look for. And it makes me realize that people like me who take 5000 IU a day have got nothing to worry about.
Pancreatic cancer is one of the most dangerous cancers people develop, ranking as the fourth most common cause of cancer death. Pancreatic cancer typically has a very poor prognosis: only 25% of people survive one year, and only 5% live for five years.
By the time an individual develops symptoms, the tumor has already spread. The most common symptoms and signs of pancreatic cancer are abdominal pain, jaundice (yellow skin), weight loss, light-colored stools and dark urine.
A recent paper reported on an 83-year-old woman who experienced jaundice, unintentional weight loss and abdominal discomfort. She was diagnosed with metastatic pancreatic cancer in January of 2015. The patient underwent one course of chemotherapy before deciding not to undergo any more chemotherapy. Unknown to her doctor, she started taking 50,000 IU/day of vitamin D in March of 2015 to treat her cancer.
Her initial pancreatic CAT scan showed a 3.6 x 2.7 cm mass in her pancreas with metastasis in her lymph nodes. On 9/4/15, the lesion was slightly smaller, and she was feeling quite well. Her calcium was high normal at 9.6, and her 25(OH)D was reported as >150 ng/ml. She was lost to follow up in January of 2016 after having 8 months of symptom free pancreatic cancer.
The authors state:
“Given the poor prognosis of pancreatic cancer and the limited treatment options for patients, this case should stimulate further investigation. The daily dose of 50,000 IU of vitamin D3 was well tolerated in our patient for over 10 months at the time of writing. Consideration should be given to a clinical trial that evaluates a similar dose.”
Due to the poor prognosis and emotional toll of this disease, pancreatic cancer is a health outcome that urgently requires further research. This case report demonstrates that not everyone who takes 50,000 IU/day will develop hypercalcemia. I agree with the authors' statement that researchers should use pharmacological doses of vitamin D (50,000 to 100,000 IU/day) in a clinical trial. I predict some people will respond to such treatment.
Created on Thursday, 29 December 2016 18:48
It could make a huge difference in national health if Americans, across the board, upped their intake of magnesium. That conclusion is derived from a recent study out of China.
"On December 8, 2016 BMC Medicine published the results of a meta-analysis conducted by researchers at Zhejiang University in China which concluded that consuming a higher amount of magnesium is associated with a lower risk of heart failure, stroke, type 2 diabetes and all-cause mortality during up to 30 years of follow-up. The meta-analysis is the first to investigate the effect of dietary magnesium intake on the risk of heart failure and the first quantitative meta-analysis to examine the dose-response relationship between dietary magnesium intake and all-cause mortality."
“Fudi Wang of Zhejiang University’s School of Public Health and colleagues selected 40 publications that included a total of over a million subjects for their analysis. Food frequency questionnaire or dietary recall responses provided information concerning magnesium intake.”
“Over the studies' follow-up periods, 7,678 cases of cardiovascular disease, 6,845 cases of coronary heart disease, 701 cases of heart failure, 14,755 cases of stroke, 26,299 cases of type 2 diabetes and 10,983 deaths were documented. Each 100 milligram (mg) per day increase in magnesium intake was associated with a 22% reduction in heart failure risk, a 7% decrease in stroke risk, a 19% decrease in the risk of type 2 diabetes and a 10% lower risk of dying from any cause.”
“In their discussion, the authors observe that, in comparison with oral supplements and intravenous infusions, increasing the intake of magnesium via the diet may only moderately increase magnesium levels. Although foods such as nuts, beans and whole grains are good sources of the mineral, the authors advise that the daily requirement for magnesium is difficult to achieve by consuming a single serving of any one food item.”
"Our meta-analysis provides the most up-to-date evidence supporting a link between the role of magnesium in food and reducing the risk of disease," Dr Wang stated. "Our findings will be important for informing the public and policy makers on dietary guidelines to reduce magnesium deficiency related health risks."
Dr. Cinque: Magnesium is the second most abundant mineral in the human body after calcium. Besides being a structural mineral like calcium (much of the body's magnesium is stored in bone), magnesium is heavily involved in enzyme function. There are over 400 biochemical reactions in the human body that are dependent on magnesium as a co-factor, and there may be more that aren't yet known. It is obvious from this study that magnesium deficiency is extremely common, and correcting it would ameliorate chronic degenerative diseases across the board. How tragic it is that something so plentiful, so accessible, and so inexpensive could make such a big difference! What are we waiting for?
Magnesium occurs mainly in unrefined plant foods, particularly green vegetables, beans, nuts, seeds, and whole grains. Increasing these foods would be a very good idea. In fact, they are the foods that people should primarily be eating, with the addition of fruits. Everything else should be an afterthought.
But, as the article states, getting up to, say, 400 mg of magnesium a day may be hard for many people even when they try to eat healthily. And note that most multis contain very little magnesium- not a meaningful amount- because it's just too bulky to squeeze into a multi.
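To see why, here's a rough tally of a deliberately magnesium-rich day of eating. This is just a sketch: the per-serving values are ballpark figures from common nutrition tables, so treat them as approximate, not exact.

```python
# Approximate magnesium content (mg) per common serving.
# These are ballpark figures from typical nutrition tables, not exact values.
servings_mg = {
    "almonds, 1 oz": 80,
    "cooked spinach, 1/2 cup": 78,
    "black beans, 1/2 cup": 60,
    "peanut butter, 2 tbsp": 49,
    "cooked brown rice, 1/2 cup": 42,
    "banana, 1 medium": 32,
}

total = sum(servings_mg.values())
print(f"Total from all six servings: {total} mg")                  # 341 mg
print(f"Shortfall against a 400 mg/day target: {400 - total} mg")  # 59 mg
```

Even six solid servings of the best food sources leave you a little short of 400 mg- which is exactly why a modest supplement can make sense.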
So, what I do is keep a container of Mag Complete around, which supplies 120 mg of magnesium per capsule. Its product number is CP1830. It contains several forms of very usable magnesium. And I take it at night before bed. The reason I do that is that magnesium is known to be relaxing- to the nerves and muscles. So, it can help with sleep. It doesn't make you sleepy, but it does help you relax so that you can fall asleep naturally. So, I take one before bed, and if I wake up during the night, I may take another. So, between that, my natural foods diet, and the small amount of magnesium I get from my multi, I am sure I am getting plenty of magnesium.
Really, it is an awful shame that people should be suffering and dying early because of a deficiency of magnesium. It’s a tragedy. It’s a matter of dying out of ignorance- and that may be not just patient ignorance but doctor ignorance. How many doctors are encouraging their patients to consume more magnesium to prevent diseases?
So, make sure that you are getting enough magnesium. It is extremely safe. The worst thing that will happen if you take too much is that you may get some loose stools- as in milk of magnesia. That’s right; in high amounts, magnesium is also used as a laxative.
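One last bit of arithmetic on the study itself. The per-100-mg risk reductions quoted earlier compound if you assume the dose-response is log-linear- that's my simplifying assumption here, as the paper models the relationship more carefully- but it gives a sense of what a larger increase in intake might mean:

```python
# Relative risks per 100 mg/day increase in magnesium intake,
# taken from the figures quoted from the meta-analysis above.
rr_per_100mg = {
    "heart failure": 0.78,        # 22% reduction
    "stroke": 0.93,               # 7% reduction
    "type 2 diabetes": 0.81,      # 19% reduction
    "all-cause mortality": 0.90,  # 10% reduction
}

increase_mg = 200  # e.g., raising daily intake by two 100 mg increments

# Under a log-linear assumption, relative risks multiply across increments.
for outcome, rr in rr_per_100mg.items():
    combined = rr ** (increase_mg / 100)
    print(f"{outcome}: ~{1 - combined:.0%} lower risk at +{increase_mg} mg/day")
```

So a 200 mg/day increase would, under this assumption, correspond to roughly a 39% lower heart failure risk and a 19% lower risk of death from any cause.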
Created on Friday, 09 December 2016 19:58
I believe that irrational exuberance dominated the mindset of Americans- the belief that Modern Medicine was going to increase lifespans indefinitely. But the life expectancy of Americans dipped slightly in 2015 compared with 2014, according to the latest data from the CDC.
And keep in mind that the CDC is like the OPEC of drug companies, so it’s hardly unbiased. I’m not saying that they would ever skew statistics, but then again, yes I am. I know very well that they do. This is the same organization that says that 36,000 people die every year of the flu, just to bolster the sale of the ridiculous flu vaccine.
The following is from the report:
In 2015, life expectancy at birth was 78.8 years for all Americans, a decrease of 0.1 year from 78.9 years in 2014, wrote Elizabeth Arias, PhD, and colleagues from the National Center for Health Statistics, a division of the CDC.
For males, life expectancy at birth changed from 76.5 years in 2014 to 76.3 years in 2015, a decrease of 0.2 year, and for females, it decreased 0.1 year from 81.3 years in 2014 to 81.2 years in 2015.
In 2015, life expectancy at age 65 years for the total population was 19.4 years, the same as in 2014. Life expectancy at age 65 was 20.6 years for women and 18.0 years for men, both unchanged from 2014. In 2015, the difference in life expectancy at age 65 between women and men held steady at 2.6 years.
In 2015, a total of 2,712,630 resident deaths were registered in the United States — 86,212 more than in 2014. From 2014 to 2015, the age-adjusted death rate for the total population rose 1.2%, from 724.6 deaths per 100,000 in 2014 to 733.1 in 2015.
"The rate for the total population rose significantly for the first time since 1999," the authors report.
Top Causes of Death
There was no change from 2014 to 2015 in the 10 top causes of death: heart disease, cancer, chronic lower respiratory tract diseases, unintentional injuries, stroke, Alzheimer's disease, diabetes, influenza and pneumonia, kidney disease, and suicide. Together they accounted for 74.2% of all deaths in the United States in 2015.
However, from 2014 to 2015, age-adjusted death rates rose for 8 of 10 leading causes of death and decreased for 1. The rate increased 0.9% for heart disease, 2.7% for chronic lower respiratory tract diseases, 6.7% for unintentional injuries, 3.0% for stroke, 15.7% for Alzheimer's disease, 1.9% for diabetes, 1.5% for kidney disease, and 2.3% for suicide. The rate decreased by 1.7% for cancer. Age-adjusted death rates for influenza and pneumonia did not change significantly.
Dr. Cinque: So, is this just a fluke, or have we topped out in life expectancy in this country? I think it's more likely the latter. And I don't think there is anyone more cynical than I am about Medicine. Modern Medicine, in many instances, is contributing to the death rate, not the survival rate. I believe the evidence shows that many of the pharmacological interventions are shortening lives rather than lengthening them. It's certainly true of scandalous drugs like Vioxx. And whether you agree with me or not, it is nevertheless true that many medical treatments aren't even tested for the effect they have on longevity. For instance, take high blood pressure drugs. Has it ever been scientifically tested whether the use of anti-hypertensive drugs prolongs lives? No, it hasn't. They have never done a double-blind study in which a control group received placebo pills while the test group got treated. That, they say, would be unethical, and it's the same excuse they give for not testing vaccines. But, there is some data available concerning the widely popular statin drugs, and the evidence is clear that they are NOT prolonging lives. Here is an article about it by Scottish physician and researcher Malcolm Kendrick.
But, there are other reasons besides misguided medical practices that are causing longevity in America to stall. A major one is the rising rate of obesity, in both adults and children. Another is the rising rate of physical inactivity, in both adults and children. Smoking rates have supposedly come down to their lowest level ever, but I have to wonder how accurate the claims are. Tobacco companies seem to be doing well, and by my observations from being out in public, it seems like there are still plenty of people smoking. What say you?
In any case, the expectation that Medicine was going to continue lengthening lives indefinitely seems to be a pipe dream. And if stem cell therapy is going to change that, it certainly hasn't happened yet. The greatest potential of Modern Medicine for prolonging lives, in my opinion, is bio-identical hormone replacement.
Created on Tuesday, 08 November 2016 21:15
Why would a college-educated young man from a well-to-do family from the DC area hitch-hike from South Dakota to Northern Alaska and then wander off into the remote woods with a rifle, ammunition, and little else, determined to live off the land shooting wild game and foraging for wild plants?
A lot of people have wondered about that, but there are no clear answers. And it ended tragically. He survived for a while on game like squirrels, porcupines, birds, ducks, and even, once, a moose, while he also ate wild berries and a native root called "wild potato"- although it is unrelated to the potato that we know. It's actually a leguminous plant, and its edible root has the texture of a carrot.
But, it was still a low-calorie diet because the meats were very lean and the plant foods were all very low in calories. From the start, he was losing weight- and he was thin to begin with. After about two months, he had had enough, and he still had the bodily reserves to walk out, which he decided to do. However, the creek that he had crossed getting there had swelled to a raging river from snowmelt, and he could not cross it. It was too treacherous from the rapids and the rocks, plus the water was deep and only slightly above freezing. He never would have made it, and he knew it. So, he went back to the abandoned bus that he had turned into his camp, and he resumed doing what he was doing. I suppose that his hope at that point was that somebody would come along who could help him get out, but nobody showed up.
Then, there was a piece of bad luck. The wild potato he was eating turns very hard and fibrous in the late summer; it loses its succulence. So, he resorted to eating its seeds. But, what he didn't realize is that the seeds are NOT edible. The seeds are high in a toxic amino acid called canavanine.
I know about canavanine. Remember back in the 1980s when there was a big kick of eating alfalfa sprouts? People grew them at home; markets sold them; salad bars served them; they were everywhere. Well, alfalfa sprouts also contain canavanine, but not as much as this other plant. And, it's one of those things where you have to eat a lot of it to be poisoned. A little cluster of alfalfa sprouts wasn't going to kill anybody. But, he was eating large quantities of these toxic seeds. And reportedly, his thin, weakened, undernourished condition made it harder for his body to tolerate the canavanine. The effect that it had on him symptomatically was to make him very, very weak, to where he could hardly stand. And obviously, if you can't stand and walk, you can't hunt, and you can't forage. So, he just starved. Having arrived there in late April, it's believed that he probably died in the abandoned bus in mid-August. His corpse, which was found by hunters inside a sleeping bag within the bus, presumably about three weeks after he died, weighed 66 pounds.
Again, a lot of questions are circulating about what drove him to do this extreme thing. But, a question that I have, which I haven't seen asked before, is this: he was a young man, 24 years old, at the height of the hormonal surge of young adulthood. So, why wasn't he more interested in other things- you know, chasing girls? He wasn't going to find them hiding behind trees in the Alaskan wilderness. Even before he went to Alaska, he didn't seem to have much interest that way. And apparently, somebody at college ribbed him about being gay, which he lambasted as nonsense, and I have no reason to doubt him. He wasn't interested in boys. He wasn't interested in girls. He wasn't interested in anybody. It wasn't that he was homosexual; it was that he was asexual. But why? What was wrong with him?
At various times in life, there are certain values and interests and urges and pursuits that you expect normal people to have. Why didn't he have them?
So, I have to assume that he had some pretty major psychiatric illness going on. And reflected in that, I believe, is the very slipshod, haphazard way he went about preparing for this venture, with woefully inadequate equipment, supplies, and knowledge.
It's notable that he had taken other solo road trips before this, always remaining within civilization, and upon returning home from these road trips, he was also exceedingly thin. I mean to where his mother was aghast at the sight of him and started cooking 'round the clock to revitalize him. So, why did that happen? Money may have been a factor, but I doubt that accounts for it. His family was well-off, and he could have gone to them for money. I think that once he got away from a structured day in which meals happen according to schedule, according to the clock, he would literally forget to eat. Hunger, alone, was not spurring him to eat enough food- and that happens. And, I think that may have been part of his mental illness too.
I have seen programs on television about guys surviving in the wild in extremely harsh places, like Northern Canada, but these were highly trained individuals. And since there were TV cameras and a crew recording it, the guy obviously wasn't really alone, although they made it look that way. I'm not saying that he cheated and took help, but he was protected in the event of an emergency. There was no chance that he was going to starve to death. He was like a tightrope walker with a net. What Chris McCandless did was extremely reckless, almost to the point of being suicidal. And there's mental illness rearing its ugly head again.
Humans are obviously natural beings, but it doesn't mean that we can live in the wild, especially not in a place like that. Why do it in Northern Alaska? Why not go to a tropical rain forest? And even there, I am sure there are lots of things that can go wrong. Most people know that if you just release a domesticated dog into the wild, he is not going to survive; he is not a wild animal. He is domesticated. Well, we are domesticated too, and 10X more so.
What happened to Chris McCandless was like the exaggeration of all the human ventures that just aren't well thought out. He was lacking in judgment- again, part of his mental illness. He wasn't completely lacking in judgment, because when he reached the river and realized it was too treacherous to cross, he did turn back. But, he struggled with it. He almost tried to cross it. There was probably a 50/50 chance that he would have. He probably put his hand in the water and felt the sting of the iciness, and that jarred him into coherence. That put the brake on. But, he needed a lot more brakes on himself than he had. He killed himself. He killed himself the moment he disappeared into those woods.
If you'd like to read the book, it's called Into The Wild, and the author is Jon Krakauer. It's very well written.
Created on Sunday, 02 October 2016 21:27
This is way off-topic, but I'm putting it up anyway, for the same reason that Bill Clinton gave us: "because I can."
The JFK story is a lie, but we are bombarded with lies, including a lot of medical lies. And, one of those medical lies which they tell in order to support the booming kidney transplant industry is that it doesn't hurt to donate a kidney.
It damn well hurts you. It hurts you a lot. It is a crime against yourself to do it. And now, refreshingly, here is an article written by a medical student who was conned into donating a kidney at the age of 18 and who now regrets it.
I'm putting the whole article up right here. But, Medicine has known all along that donating a kidney is extremely compromising. Consider that the most widely used measure of kidney function is the blood creatinine test. Normally it's at about 1 mg/dl or less- and the lower, the better. You donate a kidney, and it rises to nearly 2. Once it gets to 3, you're in early-stage kidney failure. So, you're half-way there just from donating a kidney. And it makes sense. Normally, in life, your kidneys take turns working. Each has its own ureter draining into the bladder, but at any given time, 90% of the output is coming from just one kidney. Meanwhile, the other one is resting and repairing. But, obviously, if you donate a kidney, your remaining kidney has to work 24/7/365, year after year, until you die. It's like a hamster on a treadmill that never stops, doing twice the work as before.
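To put that reasoning in rough numbers: as a rule of thumb, filtration capacity scales roughly inversely with serum creatinine. That inverse relationship is a simplification on my part- real kidney-function estimates use formulas that also factor in age and sex- but it shows why those creatinine milestones matter:

```python
def filtration_fraction(baseline_cr, current_cr):
    """Rough rule of thumb: kidney filtration capacity is roughly
    inversely proportional to serum creatinine.
    A simplification for illustration, not a clinical formula."""
    return baseline_cr / current_cr

# Creatinine ~1.0 mg/dl with two kidneys, rising toward ~2.0 after
# donation, with ~3.0 marking early-stage kidney failure.
for cr in (1.0, 2.0, 3.0):
    pct = filtration_fraction(1.0, cr)
    print(f"creatinine {cr:.1f} mg/dl -> roughly {pct:.0%} of baseline filtration")
```

In other words, by this rough measure, a creatinine of 2 means about half the baseline filtration is gone- which is what "half-way there" means.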
So, read this refreshingly honest article that somehow slipped past the censors. ANY SURGEON WHO CUTS A HEALTHY KIDNEY OUT OF SOMEONE SHOULD BE PROSECUTED FOR MEDICAL MALPRACTICE AND INFLICTING SEVERE BODILY HARM.
At 18 years old, he donated a kidney. Now, he regrets it.
When I was 18, my stepfather’s brother had been on dialysis for just over a year. He was thin, he exercised regularly and he seemingly was in perfect health, but inexplicably his kidneys began to fail him. Although I was just about to leave for college, I’d heard enough about the misery of dialysis to decide to get tested as a possible donor. In the back of my mind, I knew that the chances of our compatibility were incredibly low because we were not related by blood. Perhaps that made it easy for me to decide to get tested.
When we received the results, I was stunned to find out that he and I were a match. The transplant team gave me plenty of opportunities to back out of the donation, and it put me through countless evaluations, physical and psychological. Much of my family was steadfast against my becoming a donor. Looking back, who could blame them? Their son-grandson-nephew was going to undergo a major operation with no benefit to himself.
However, I continued to be confident in my choice. I relied on the one fact that would be repeated to me many times: “The rate of kidney failure in kidney donors is the same as the general population.” Why wouldn’t everyone donate a kidney, I wondered.
My mother was the only one to — reluctantly — support my decision. She accompanied me to San Francisco, where the surgery took place, and we settled in for the weeks that I would spend recovering. On the day of the surgery, anesthesia flowed into my arm and the world swiftly slipped away. Then, just as quickly, it seemed, I awoke, nauseated and confused. So much preparation for such a short nap. The anxiety I’d felt about the surgery was now gone — as was one of my kidneys.
[Photo caption: Michael Poulson regrets giving that kidney away.]
An uneventful recovery came and went. I returned to college and resumed a normal life. Likewise, my step-uncle did very well and is living a full and healthy life, as is my donated kidney.
Five years after the surgery, when I was 23 and getting ready to go to medical school, I began working in a research lab that was looking at kidney donors who had gone on to develop kidney failure. For that research, I talked to more than 100 such donors. In some cases, the remaining kidneys failed; in others, the organ became injured or developed cancer. The more I learned, the more nervous I became about the logic of my decision at age 18 to donate.
And then in 2014, a study looking at long-term risks for kidney donors found that they had a greater risk of developing end-stage renal disease. Another study that same year raised the possibility that they may face a heightened risk of dying of cardiovascular disease and all-cause mortality (although this point remains controversial).
Other studies and surveys, though, suggest that the risk, while greater, is still fairly small.
The truth is, it is hard to get good numbers about what happens to donors. Hospitals are required to follow them for only two years post-donation, which does not catch such long-term complications as chronic kidney disease, cardiovascular issues or psychiatric issues. There is no national registry for kidney donors or other large-scale means of tracking long-term outcomes.
The result is that we know neither the denominator (the total number of living kidney donations that have occurred over the decades) nor the numerator (the number of donors who have gone on to kidney failure). And what we do know is incomplete. Yet the need for donors remains great, as the number of Americans needing a kidney transplant has steadily increased, to more than 120,000, while the number of transplants performed has remained relatively steady, at about 30,000 per year.
Donors are lauded for their altruism and bravery, and the procedure is promoted as benign, with low long-term risk. We are told about neither the reality of those risks nor the scarcity of the available data.
As a medical student and soon-to-be physician, I’ve come to better understand the imperfections in the idea of informed consent. We work with the data we have, and patients aren’t always told that it may not be that solid. At the time of my surgery, I thought the system was designed to protect me as a donor. Yet, now, more than eight years later, I am angry that I was never fully informed of the lack of research or the unknown long-term health implications for me.
Mostly I’ve come to terms with the increased risks of being a kidney donor. But I’d be lying if I said I don’t get anxious about it. I feel vulnerable. Sometimes I can think of nothing but my remaining kidney. I’ll feel pressure on my ribs, and I think, “Is that my kidney acting up, or simply back tension?” Or I’ll wonder: “Should I be feeling this lump? Am I going into kidney failure?”
Being a kidney donor has become a part of my identity. Some people — particularly in medical school — have put me on a pedestal for my altruism and bravery. But often I find myself hiding the fact that I donated, which I’d like to think of as an act of modesty. The sad and difficult truth is this: Knowing what I know now, I regret donating in the first place.
Created on Wednesday, 28 September 2016 16:36
The term “side effects” is a euphemism for the adverse, toxic effects of medical drugs. And keep in mind that often the desired, sought-after effects are also toxic. For instance, acid-blockers work by poisoning the cells that produce stomach acid. Impairing the production of stomach acid is certainly a toxic effect in my book, since producing stomach acid is normal and healthy.
But the biggest problem with the popular understanding of “side effects” is the assumption that if they don’t manifest visibly and palpably, they don’t exist. It’s often assumed that if a medical drug is well tolerated in the act of taking it, if it doesn’t cause you pain or discomfort, it must be safe. That is a delusion. Let’s say, for instance, that a drug is poisoning the cells in your bone marrow which produce blood cells. Those cells come under attack and start producing abnormal, defective blood cells, whether red, white, or platelets, or a combination. Are you going to feel anything? Probably not, and not for a long time. There are no pain receptors in your bone marrow. And if your blood contains abnormal cells, that is, a high number of them, you won’t necessarily feel anything right away either. Eventually, say if you become anemic from the toxic effect of a medical drug, you’ll start experiencing symptoms, such as fatigue, shortness of breath, lack of stamina, and paleness. But by then, by the time symptoms appear, the condition will be advanced. The early and intermediate stages of the drug-induced pathology will probably entail no symptoms at all.
It’s quite true that some people tolerate a medical drug better than others, and conversely, some people may not tolerate a drug that most do. Take, for instance, statin drugs. Statins cause muscle breakdown, which commonly leads to pain. But in some people the muscle breakdown is so great that the breakdown products of muscle protein overwhelm the kidneys, and the result is kidney failure. Of course, not everybody goes into kidney failure from taking a statin, but I think it’s fair to say that everybody heads in that direction from taking one. Statins increase the risk of kidney failure, diabetes, and cancer, and that’s in everybody. And in exchange for what? A vanishingly small statistical reduction in heart disease risk? It’s so small that 100 people would have to take statins for 10 years in order for 1 of them to avoid 1 heart attack. The risk/reward ratio for those drugs is absolutely appalling.
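The 100-people-for-10-years figure above is a statement about the “number needed to treat” (NNT), which is simply the reciprocal of the absolute risk reduction. A minimal sketch of that arithmetic, where the two event rates are hypothetical values chosen only so the difference matches the 1-in-100-over-10-years figure in the text:

```python
def number_needed_to_treat(control_event_rate, treated_event_rate):
    """NNT = 1 / absolute risk reduction (ARR)."""
    arr = control_event_rate - treated_event_rate
    if arr <= 0:
        raise ValueError("treatment shows no absolute benefit")
    return 1.0 / arr

# Illustrative 10-year heart attack rates (assumed, not from a study):
# 4% untreated vs. 3% treated gives an ARR of 1 percentage point,
# i.e., 1 heart attack avoided per 100 people treated for 10 years.
nnt = number_needed_to_treat(0.04, 0.03)
print(round(nnt))  # 100
```

The point of the calculation is that even a drug that reliably “reduces risk” in relative terms can have a very large NNT when the absolute risk reduction is small.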
Antibiotics are another class of drug that works by poisoning. The whole point of them is to poison bacteria, and you hope that can be done without poisoning you too much. But at least with antibiotics, it’s usually a temporary thing. You take them for a week, maybe 10 days, maybe even 2 weeks. It’s not a life sentence. However, there are many drugs that are meant to be a life sentence. They put you on blood pressure drugs with the expectation that it is a life sentence. They put you on diabetes drugs with the same expectation. When they put you on drugs for arthritis, they don’t expect you to ever stop taking them. It’s meant to be permanent.
And even drugs that are supposed to be for temporary use often wind up being permanent, or at least long-term. For instance, most sleeping pills say that you should only take them for 10 days. But if people were only going to take them for 10 days, how could the drug company afford the expensive ads? They know very well that people stay on Ambien or another sleep drug for years and years. That includes drugs whose safety was only tested over 6 weeks of use.
So, there is no such thing as a “side effect.” There are only effects. And most drugs not only have toxic effects but achieve their desired effect by poisoning something.
The fact is that there are very few drugs in Medicine that anyone with sense should want to take. Almost always, there are alternatives to taking medical drugs. And oftentimes, just living with your condition, whatever it is, is superior to treating it with medical drugs. I kid you not.
The time has come not only to reevaluate medical drugs, but to reevaluate our attitude towards medical drugs. They are, generally speaking, harmful and dangerous, and that is a fact.