Three-Meal Diet (3Mdiet) Improves Glycemic Control In Type 2 DM

Eating a carbohydrate-rich breakfast followed by a substantial lunch and a small dinner — the so-called “three-meal diet” (3Mdiet) — promotes weight loss and significantly improves glucose control in type 2 diabetes, a randomized, controlled trial suggests.

In fact, the 3Mdiet improved glycemic control so significantly that patients could reduce their total daily insulin dose, along with the need for additional antidiabetic medications, relative to baseline, the same small study indicates.

“The traditional diabetic diet specifies six small meals spread throughout the day, but this ‘6Mdiet’ as it is called has not been effective for glycemic control,” lead author Daniela Jakubowicz, MD, professor of medicine, Tel Aviv University, Israel, explains in a statement from American Friends of Tel Aviv University.

“Our research proposes shifting the starch-rich calories to the early hours of the day [to] produce a glucose balance, and we believe that through this regimen it will be possible for diabetics to significantly reduce or even stop injections of insulin and most anti-diabetic medications to achieve excellent control of glucose levels,” she added.

In the study, published in the December issue of Diabetes Care, the researchers hypothesize that the 3Mdiet is more in sync with the natural biological clock — metabolism is optimized for eating in the morning and for fasting during the evening and night, when people are supposed to be asleep.

Both Diets Had Same Macronutrients and Calories

The trial involved 28 volunteers with type 2 diabetes of 5 years’ duration or longer treated with insulin for at least 1 year prior to study entry, at a total daily insulin dose in excess of 25 units. The average body mass index (BMI) was 32.4 kg/m2 and average baseline A1c was 8.1%.

“Subjects were sedentary at baseline and were asked to maintain their usual physical activity levels,” the investigators observe.

Patients were randomized to the 3Mdiet or 6Mdiet and followed for 12 weeks.

Importantly, “both diets had the same macronutrient composition of fat, protein, and carbohydrates (35%, 25%, 40%, respectively), but with different meal timing, frequency, and calorie and carbohydrate distribution over the day,” the authors point out.

For example, participants on the 3Mdiet consumed a large breakfast of around 700 calories that included bread, fruits, and sweets; a medium-sized lunch of 600 calories; and a small dinner of 200 calories that specifically excluded starches, sweets, and fruits.

In contrast, those randomized to the 6Mdiet ate breakfast, lunch, and dinner along with three snacks of 150 calories each such that caloric consumption was relatively uniform across the day, with snacks scheduled for 11:00, 17:00, and 22:00 hours.

Significant Weight Loss With 3Mdiet

After 12 weeks of the intervention, those on the 3Mdiet lost on average 5.4 kg compared with a small 0.3-kg weight gain in the 6Mdiet group, a difference that was highly significant (P < .0001), the investigators report.

Over the same 12 weeks, participants in the 3Mdiet group had a 1.2% decrease in A1c compared with a nonsignificant decrease in the 6Mdiet group (P = .5).

The magnitude of reduction in A1c is comparable to the decrease seen with the addition of a glucagon-like peptide-1 (GLP-1) receptor agonist or a sodium-glucose cotransporter 2 (SGLT2) inhibitor in patients with type 2 diabetes treated with insulin, the investigators observe.

At the end of the study intervention, fasting glucose had dropped in both groups, but the reduction was more pronounced in the 3Mdiet group than in the 6Mdiet group.

Specifically, at week 12, fasting glucose had dropped to 110 mg/dL from 165 mg/dL at baseline in the 3Mdiet group, compared with 141 mg/dL from 164 mg/dL at baseline in the 6Mdiet group.
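
For readers who work in SI units, glucose in mg/dL converts to mmol/L on division by 18: the 3Mdiet group's fasting glucose fell from about 9.2 mmol/L (165/18) to 6.1 mmol/L (110/18), and the 6Mdiet group's from about 9.1 to 7.8 mmol/L.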

There was also a significant reduction in daily 24-hour mean glucose levels at 12 weeks in the 3Mdiet group compared with no significant change in the same glucose parameter in the 6Mdiet group (P < .05).

Continuous glucose monitoring (CGM) assessments in both groups also showed that time spent in normoglycemia increased from 59% at baseline to 83% at week 12 in the 3Mdiet group.

“In contrast, the 6Mdiet did not change the time spent in normoglycemia throughout the study,” say Jakubowicz and colleagues.

Participants assigned to the 3Mdiet also experienced a significant decrease in hunger and in overall cravings for sweets, fast food, and high-fat food by study end, whereas those assigned to the 6Mdiet experienced no change in hunger or cravings for the same foods.

And importantly, in spite of the significant improvements in glycemic measures observed in the 3Mdiet group, participants in that group were not at greater risk for hypoglycemic events.

Total Daily Insulin Dose Reduced With 3Mdiet

For those on the 3Mdiet, the total daily insulin dose fell from 60 units at baseline to 34 units at 12 weeks, a drop of 26 units (roughly 43%).

Again, in contrast, there was a small increase of 4 units in the total daily insulin dose for those on the 6Mdiet across the same study interval.

The magnitude of reduction in insulin needs seen among those on the 3Mdiet is again comparable to that achieved with the addition of a GLP-1 agonist or SGLT2 inhibitor, the authors stress.

And they observe that the significant reduction in the need for insulin seen in the 3Mdiet group was also independent of any weight loss among the same participants.

This is in contrast to the usual pattern of insulin use and weight gain seen in type 2 diabetes, in which many patients gradually require increasing doses of insulin to meet glucose targets.

This increasing need for insulin often sets up a vicious cycle: weight gain raises insulin resistance, which further escalates insulin requirements, drives continued weight gain, and increases the likelihood that patients will fall short of their glycemic targets.

The authors suggest that the circadian pattern of diet-induced thermogenesis, which peaks after a high-calorie morning meal, might help explain why the 3Mdiet led to greater weight loss than the standard diabetic diet.

Source: Medscape

Capecitabine: A New Option For Triple-Negative Breast Cancer

Triple-negative breast cancer (TNBC) is so called because its tumors lack three common receptors known to fuel tumor growth — estrogen, progesterone, and HER2/neu. These biological targets allow many breast tumors to be “druggable” with potentially curative therapies such as tamoxifen and trastuzumab. TNBC is lamented — and feared — because of its paucity of effective treatment options.

Now German investigators report that the chemotherapy drug capecitabine improves outcomes in early-stage disease in this difficult-to-treat breast cancer.

Capecitabine improves both disease-free and overall survival (DFS and OS) when used as an add-on to other standard chemotherapy, either before or after surgery, reported Marion van Mackelenbergh, MD, of the University of Kiel, Germany, and colleagues here at the San Antonio Breast Cancer Symposium 2019.

Capecitabine’s efficacy had been hinted at in single studies but only fully surfaced via the German team’s new meta-analysis of 12 clinical trials involving more than 15,000 patients, said Priyanka Sharma, MD, University of Kansas Medical Center, Westwood, Kansas, who acted as meeting discussant of the study.

The meta-analysis showed that adding capecitabine to standard chemotherapy in TNBC improves DFS by 18% (hazard ratio [HR], 0.82; P = 0.004) and OS by 22% (HR, 0.78; P = 0.004).
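
(For readers unused to hazard ratios: the percentage improvements quoted here are one minus the hazard ratio, so an HR of 0.82 corresponds to a (1 − 0.82) × 100 = 18% relative reduction in the hazard of an event, and an HR of 0.78 to a 22% reduction.)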

It’s a “MAJOR FINDING,” tweeted meeting attendee Harold Burstein, MD, Dana-Farber Cancer Institute, Boston (@DrHBurstein). “We have undervalued this approach because all-comers trials rarely showed a benefit.”

The new study is “big news for #TNBC #SABCS19,” posted Burstein, who is known for measured comments about breast cancer research and treatment.

Capecitabine is an oral fluoropyrimidine, a class of inactive prodrugs that are absorbed through the gastrointestinal mucosa and converted to cytotoxic 5-fluorouracil (5-FU) by enzymes. Capecitabine is approved for use as monotherapy or in combination with docetaxel in metastatic breast cancer.

The new results address usage in early breast cancer.

The results provide evidence to support some current guidelines, observed Sharma. The National Comprehensive Cancer Network and St Gallen guidelines both say that clinicians should “consider adjuvant capecitabine in the setting of residual disease following neoadjuvant taxane/alkylator and anthracycline chemotherapy,” she said.

Such criteria mean that only early breast cancer patients at highest risk (ie, those with residual disease after initial chemotherapy) will be exposed to the additional toxicity of capecitabine, Sharma added, advising on the practical application of the new findings.

The new meta-analysis fills a void, said the researchers.

“Despite the large number of patients with early breast cancer that have been treated with capecitabine in randomized trials, no individual patient data meta-analysis has yet been conducted,” the team wrote in their meeting abstract.

To do so, the German investigators searched for completed randomized trials of capecitabine as adjuvant or neoadjuvant therapy in early breast cancer that enrolled at least 100 patients.

Individual data from 15,457 patients were collected, including 7980 patients who received capecitabine during the course of their treatment and 7477 who were treated in control arms. Median age at diagnosis was 54 years in both groups. Most patients had stage 2 tumors (55.9%) at diagnosis, and the majority presented with nodal involvement (74.0%). Estrogen and progesterone receptor positivity was observed in 66.0% and 56.9% of patients, respectively, and 15.1% were diagnosed as HER2-positive. In sum, 2816 patients (18.2%) received neoadjuvant treatment and 12,641 (81.8%) an adjuvant chemotherapy regimen.

Notably, in the five studies in which capecitabine was given instead of another drug, there was no effect on DFS. The benefit was therefore driven by the remaining seven studies, in which capecitabine was given in addition to standard chemotherapy; those showed the results referenced above.

Source: Medscape

Gadolinium-Based Contrast May Be Used Safely In CKD Patients

Patients with chronic kidney disease (CKD) who receive a gadolinium-based contrast agent (GBCA) have a low risk of developing nephrogenic systemic fibrosis (NSF), a systematic review and meta-analysis published online December 9 in JAMA Internal Medicine has shown.

“Findings suggest that the risk of nephrogenic systemic fibrosis from group II gadolinium-based contrast agent administration in stage 4 or 5 chronic kidney disease is likely less than 0.07%,” write Sean A. Woolen, MD, from the University of Michigan, Ann Arbor, and colleagues.

“[P]otential diagnostic harms of withholding group II gadolinium-based contrast agents for indicated examinations may outweigh the risk of nephrogenic systemic fibrosis in this population.”

NSF is a rare disorder that may occur in patients with acute kidney injury or stage 4 or 5 CKD who have been exposed to GBCAs.

The characteristic features of NSF include diffuse skin thickening and fibrosis. In severe cases, systemic involvement may affect the heart, lungs, liver, and skeletal muscle. The lesions are irreversible, progressive, and sometimes fatal.

The risk for NSF to individual patients remains poorly understood. With this in mind, Woolen and colleagues assessed the incidence of NSF in patients with stage 4 or 5 CKD after exposure to a group II GBCA.

The investigators included studies that involved patients with “stage 4 or 5 CKD with or without dialysis, administration of an unconfounded American College of Radiology classification group II GBCA (gadobenate dimeglumine, gadobutrol, gadoterate meglumine, or gadoteridol), and incident NSF as an outcome.”

They excluded conference abstracts, retracted manuscripts, narrative reviews, editorials, case reports, and manuscripts not reporting total group II GBCA administrations.

Overall, the final analysis comprised 16 studies that included 4931 patients.

These studies were published from May 2008 through April 2019, had a collective study period of 1997 through 2017, and had retrospective cohort (11 of 16 [69%]) and prospective cohort (5 of 16 [31%]) designs.

Most of the 16 studies were conducted in Europe (8 of 16; 50%) or the United States (7 of 16; 44%). Seven (44%) were multicenter studies.

Across all 16 studies, the pooled incidence of NSF in patients with stage 4 or 5 CKD who had received a group II GBCA was 0%.

The upper bound of the 95% confidence interval for this estimate was 0.07%.

According to the authors, the upper bound of risk among the GBCAs involved ranged from 0.12% (for gadobenate dimeglumine; 0 of 3167) to 1.59% (for gadoteridol; 0 of 230).

This range reflects sample size, they add.
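
This report does not state which interval method was used, but the quoted upper bounds are consistent with an exact (Clopper-Pearson) binomial 95% confidence interval for zero observed events, whose upper limit is 1 − (0.025)^(1/n). A minimal Python sketch, using the denominators reported above, reproduces the quoted figures:

  # Exact (Clopper-Pearson) 95% upper bound on risk when 0 events are
  # observed in n exposures: upper = 1 - (alpha/2)**(1/n).
  # The denominators come from the figures quoted above; the interval
  # method itself is an editorial assumption, not stated in the report.
  def zero_event_upper_bound(n, alpha=0.05):
      return 1 - (alpha / 2) ** (1 / n)

  for label, n in [("all group II GBCAs", 4931),
                   ("gadobenate dimeglumine", 3167),
                   ("gadoteridol", 230)]:
      print(f"{label}: 0/{n} -> upper bound {zero_event_upper_bound(n):.2%}")
  # Prints approximately 0.07%, 0.12%, and 1.59%, matching the reported values.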

“These data support recent updates to ACR [American College of Radiology] and European Society of Urogenital Radiology guidelines liberalizing use of low-risk GBCAs for indicated examinations” for patients with stage 4 or 5 CKD, Woolen and colleagues conclude.

In an invited commentary, Saugar Maripuri, MD, and Kirsten L. Johansen, MD, both from Hennepin County Medical Center, Minneapolis, Minnesota, concur that these findings support the assertion that risk for NSF is extremely low in this patient population.

However, they also highlight some of the limitations of this study.

First, although dialysis patients have the highest risk of developing NSF, it is not known how many of the included GBCA exposures occurred in patients receiving dialysis, they note.

Second, they stress that a pooled risk estimate does not reflect differences in risk according to level of kidney function.

“Nonetheless, one cannot ignore the fact that not a single reported case of NSF occurred in nearly 5000 patient exposures,” they write.

Although the data favor use of group II GBCAs for patients with CKD, a disconnect persists between the more conservative approach taken by the US Food and Drug Administration (FDA) and the more permissive guidelines from the ACR.

Maripuri and Johansen explain that in 2010, the FDA updated its gadolinium safety communication, warning that three agents (gadopentetate dimeglumine, gadodiamide, and gadoversetamide) were associated with most cases of NSF and thus were contraindicated in patients with CKD. However, other GBCAs could be used cautiously under certain circumstances.

“This incongruity may be mirrored by a disconnect between nephrologists and radiologists,” they say, “with the former concerned that the lack of cases may be driven by avoidance of GBCAs in high-risk patients and the latter more convinced by the biochemical case for safety of newer GBCAs.”

In view of the emerging data, the editorialists’ opinion is that group II GBCAs may be used cautiously in at-risk patients, including those receiving dialysis. They recommend, however, that patients be given the lowest possible dose and that repeated exposures be avoided. Those on hemodialysis should undergo a dialysis session shortly after GBCA exposure, they add.

Although they acknowledge that changing practice behavior could be challenging, they emphasize that nephrologists may need to come to terms with the natural tendency to focus more on avoiding errors of commission than errors of omission.

“Perhaps the combination of zero events and a solid biochemical rationale will help get us there,” Maripuri and Johansen conclude.

Source: Medscape

A Migraine Headache Variant Often Missed

A 29-year-old woman presents for evaluation of headaches. She reports that they have grown increasingly frequent over the past 12 months. She describes them as a feeling of pressure and pain in her forehead, under her eyes, and over her cheeks. Her only other complaint is “feeling stuffy,” though she denies fever, cough, sneezing, or purulent nasal discharge. She reports that she “sometimes” takes ibuprofen to treat her headache but that it doesn’t work well, so she usually “doesn’t bother.”

What is the most likely etiology?

1. Cluster headache

2. Migraine headache

3. Sinus headache

4. Tension headache

5. Medication overuse headache


The diagnosis in this case is a variant of migraine headache—an important one that has long been misunderstood. The classic picture is the patient who claims to experience sinus headaches two or three times a year. The typical story is, “I get congested, I take antibiotics, and 2 days later I’m better.” We’ve all seen that patient—often on repeated occasions. These patients are typically experiencing a variant of migraine.

Let’s go through the data.

The largest study involved almost 3000 adult patients recruited from a primary care setting with a history of self-reported or physician-diagnosed “sinus” headache who reported at least six headaches during the previous 6 months. On evaluation, 88% of these patients met International Headache Society (IHS) diagnostic criteria for migraine-type headaches. The most common reported symptoms in this cohort were sinus pressure (84%), sinus pain (82%), and nasal congestion (63%).

The Sinus, Allergy and Migraine Study (SAMS), which recruited patients who believed they had sinus headaches via newspaper advertisements, came to essentially the same conclusion. Approximately 100 patients participated. Final diagnoses, based on IHS criteria, were:

  • Migraine with or without aura: 52%;
  • Probable migraine: 23%;
  • Chronic migraine with medication overuse headache: 11%; and
  • Nonclassifiable headache: 9%.

So how long does it take for a patient initially misdiagnosed with sinusitis to get a correct diagnosis of migraine? A more recent study recruited 130 adult patients with migraine who were seen in a referral practice. Just over 80% of this cohort had initially been misdiagnosed as having sinusitis, with a mean delay in the diagnosis of migraine of almost 8 years (range, 1-38 years). Chronic migraine was more common in this initially misdiagnosed group than in patients appropriately diagnosed at the onset. Medication overuse headache was found only in the misdiagnosed group.

If the diagnosis is uncertain, could response to triptans provide helpful information? A small, open-label, nonrandomized study involving 54 patients referred to a tertiary-center otolaryngology department sought to answer this question. All patients presented with multiple episodes of self- or physician-diagnosed “sinus headache.” The vast majority reported headaches that occurred daily or multiple times per week and lasted hours. All received rigorous evaluation, including nasal endoscopy and CT. Those with negative results were treated with triptans.

Of the 38 patients who completed follow-up, over 80% reported significant reduction in headache pain with use of triptans; over 90% experienced significant pain relief with migraine-directed therapy. Of note, the investigators attributed the high dropout rate to patients who were reluctant to accept a diagnosis of migraine.

Keep in mind that patients with intermittent “sinus headache” do not have high fever, acute presentation, or significant sinus pain. These are the people who complain of a “sinus headache” lasting 12-48 hours several times a year. Although they are often treated with antibiotics, they typically get better with or without antibiotics and with or without decongestants.

As the US Centers for Disease Control and Prevention emphasized in the recently released 2019 antibiotic resistance report, antibiotic resistance is higher than previously estimated and is not going away. This is one group of patients that should not be contributing to overuse.

Source: Medscape

Intermittent Fasting – A New Dietary Modification For Weight Loss

I want you to think about the first calorie you consumed yesterday. Mine was probably the sugar in my coffee around 6 AM.

Now think about the last calorie you consumed yesterday. Mine would have been some sugar in my tea around 9:30 PM.

Most adults in the United States are like me, consuming calories over an approximately 15-hour period.

But if you haven’t been living under a pizza lately, you will have heard of intermittent fasting, a dietary plan that extols the virtue of prolonged fasts to reset the metabolism. The details on any individual plan vary, but the central idea revolves around time-restricted eating—limiting caloric consumption to specific hours on the clock. And now, thanks to this paper appearing in Cell Metabolism, we have some evidence that a relatively modest time-restricted eating plan can significantly improve blood parameters among individuals with the metabolic syndrome.

This is a small but nicely done study. Nineteen individuals with metabolic syndrome who had a daily eating interval of about 15 hours were followed for 3 months, during which they were asked to restrict their eating to a 10-hour window—think 8 AM to 6 PM.

Other than that, there were no particular requirements. Participants could eat whatever they wanted and however much they wanted, provided it was in that timeframe.

By and large, this was a compliant bunch, reducing their eating window to just over 10 hours. Detailed dietary profiling found that they weren’t skipping meals but compressing them—eating breakfast a bit later and dinner a bit earlier.

And in the process, they ended up taking in about 200 fewer calories a day than during the baseline period. That reduction in caloric intake led to a fair amount of weight loss: around 7 pounds over the 3-month study.
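
As a rough sanity check, using the common (if imperfect) heuristic that a pound of body weight corresponds to about 3500 kcal: 200 kcal/day × ~90 days ≈ 18,000 kcal, or roughly 5 pounds, in the same ballpark as the loss observed in the study.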

Several metabolic parameters improved. Body fat and systolic blood pressure decreased, LDL cholesterol went down, and the average participant lost about 4 cm of waist circumference.

But not everything changed so dramatically. Fasting blood sugar and hemoglobin A1c, for example, fell slightly but not to the point of statistical significance.

There were a lot of measurements done in this study; 32 are reported in the outcome table, so we need to be a bit worried about false positives. But that’s not really the main limitation here.
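
To put a rough number on that worry: if all 32 outcomes were truly null and each were tested independently at the conventional alpha of .05, we would expect about 32 × 0.05 ≈ 1.6 spuriously “significant” results, and the chance of at least one false positive would be 1 − 0.95^32, or roughly 80%. (Independence is a simplification here, since many of these measures are correlated.)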

The main limitation is that these patients were enrolled in a study. See, without a control group, we don’t know if the beneficial changes seen were due to the effects of intermittent fasting or just because the patients knew they were being “watched.” They had to log in to an app, go to study visits, and so on. That alone may be enough to change behaviors in a beneficial direction.

In other words, we don’t have great support here for particularly large, unique effects of intermittent fasting compared with other diets that lead to calorie restriction.

And this leads to one of my central theories of diet studies: Any diet that makes it harder to eat, whether you are limiting certain types of foods or certain times of day, will probably [lead to weight loss]. One of the central drivers of the obesity epidemic is our ad libitum access to food. We often see promising results like this when we simply limit that free access.

What I like about time-restricted eating is that it’s pretty easy to explain: Eat inside these hours, don’t eat outside of these hours. That’s a bit easier than explaining how, for example, ketosis works. But in the end, the key to any diet plan is adherence. Researchers contacted these participants 3 months after the study ended. At that point, only five were still adhering to the eating window.

Future studies examining novel dietary interventions would do well to prove that participants not only understand the diet but can stick with it.

Source: Medscape

Uterine Transplantation: Both Success and Failure

Until recently, adoption and surrogacy were the only means for women with absolute uterine factor infertility (AUFI) to have a family. The concept of uterine transplantation is not new, but the first such surgery resulting in a live birth was reported only 5 years ago from Sweden. Since then, uterine transplant programs have been established in multiple countries. A recent report summarizes the first 45 cases of uterine transplantation with known outcomes.

AUFI is rare, and the uterine problems that cause it can be congenital or acquired. The most common congenital anomaly is Mayer-Rokitansky-Küster-Hauser (MRKH) syndrome. Acquired uterine infertility can be caused by conditions that interfere with implantation (eg, Asherman syndrome, fibroids, adenomyosis, damage due to ischemia/radiation).

MRKH was the cause of AUFI in 89% of the 45 uterine transplant recipients, whose mean age was 27.8 years. Live donors were used in 36 (80%) of the transplants, whereas the organs came from deceased donors in the remaining nine cases. About half of the donors were relatives of the uterine recipients.

Eleven percent of the live donors suffered severe complications (eg, ureteric injury, fistula formation, vaginal cuff dehiscence), and 28% were affected by minor complications (eg, infection, hypotonic bladder, constipation, leg pain, anemia).

Graft failure following transplantation occurred in 13 women (28.8%), leading to emergency hysterectomies for thrombosis, ischemia, or infection. In another seven cases (15.5%), planned hysterectomy was performed after successful delivery. The remaining 25 women (55.5%) still have functioning grafts.

So far, 18 live births have been reported (17 from live donors and one from a deceased donor). All were born by cesarean section. The mean gestational age at delivery was 34+6 weeks (34 weeks and 6 days), and the mean birth weight was 2500 g. All neonates did well and no congenital anomalies were seen. One third of the women developed preeclampsia and 22% developed cholestasis.

In summary, uterine transplant offers an alternative method to establish a family in women with AUFI. The procedure is associated with significant risks, however, as almost half of the patients experienced minor or more severe complications.

Viewpoint

Uterine transplantation offers women with AUFI another option for starting a family. The procedure is complex, however, and not without risk. Before transplantation, a recipient must undergo in vitro fertilization to cryopreserve embryos for later transfer. Therefore, the ideal candidate would be a young woman with good ovarian reserve, who is either in a stable relationship or open to using donated sperm. Many transplant programs have age restrictions and require the availability of a certain number of embryos before considering a uterine transplant.

The use of a live donor, preferably a close relative, is ideal. This would allow advanced planning and coordination of the hysterectomy and transplant procedures to minimize the risk for ischemic damage of the organ. Donation by a relative allows the use of lower-dose immunosuppression. Live donors are usually older women who have reached their own desired family size. The older uterus may be less elastic and the blood vessels are more likely to be sclerotic, potentially increasing the risk for graft rejection and compromising perinatal outcomes. On the other hand, the use of a brain-dead donor precludes advanced planning, so the risk for ischemic injury is higher.

Most children in this series were born after live-donor uterine transplantation. Prenatal medical complications and the risk for preterm delivery and low birth weight are increased after uterine transplant, possibly as a result of underlying maternal medical problems, suboptimal function of the transplanted uterus, or the need for immunosuppressant therapy. All newborns in this series did well and no congenital anomalies were detected.

Finally, one has to consider the ethical issues surrounding uterus transplantation because the uterus is not a vital organ, and two or three people are exposed to risks (donor, recipient, and newborn) as a result of the procedure. Some of these concerns may be addressed in the future if bioengineered uteri become available. Until then, the surgical technique and the preparation of the recipient should be improved further.

Source: Medscape