PloS One 2024
OBJECTIVES
In contrast to other spectral CT techniques, fat quantification in dual-layer detector CT (dlCT) has only recently been developed. The impact on dlCT fat quantification of concomitant iron overload and of dlCT-specific protocol settings, such as the dose right index (DRI), a measure of image noise and tube current, was unclear. Further, spectral information has only recently become available below 120 kV. Therefore, this study's objective was to evaluate the impact of iron, changing tube voltage, and DRI on dlCT fat quantification.
MATERIAL AND METHODS
Phantoms with 0 and 8 mg/cm³ iron; 0 and 5 mg/cm³ iodine; and 0, 10, 20, 35, 50, and 100% fat and liver equivalent, respectively, were scanned with a dlCT system (CT7500, Philips, the Netherlands) at 100 kV/20 DRI, 120 kV/20 DRI, 140 kV/20 DRI, and at 120 kV/16 DRI and 120 kV/24 DRI. Material decomposition was performed for fat, liver, and iodine (A1); for fat, liver, and iron (A2); and for fat, liver, and combined reference values of iodine and iron (A3). All scans were analyzed with reference values from 120 kV/20 DRI. For statistics, the intraclass correlation coefficient (ICC) and Bland-Altman analyses were used.
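The three-material decomposition described above (fat, liver, plus a third basis material) amounts to solving a small linear system per voxel: two spectral attenuation measurements plus the volume-conservation constraint determine three volume fractions. The sketch below is illustrative only; the basis attenuation values are made-up placeholders, not the study's calibrated dlCT reference values.

```python
import numpy as np

# Hypothetical basis attenuations (columns: fat, liver, iodine solution;
# rows: low-energy channel HU, high-energy channel HU, volume conservation).
A = np.array([
    [-100.0,  60.0, 300.0],   # low-energy spectral channel (HU)
    [ -90.0,  55.0, 150.0],   # high-energy spectral channel (HU)
    [   1.0,   1.0,   1.0],   # f_fat + f_liver + f_iodine = 1
])

def decompose(hu_low: float, hu_high: float) -> np.ndarray:
    """Solve for (fat, liver, iodine) volume fractions of one voxel."""
    b = np.array([hu_low, hu_high, 1.0])
    return np.linalg.solve(A, b)

# Sanity check: a synthetic voxel built as 30% fat / 70% liver / 0% iodine
# is recovered exactly by the decomposition.
b = A @ np.array([0.3, 0.7, 0.0])
fractions = decompose(b[0], b[1])  # fractions ≈ (0.3, 0.7, 0.0) by construction
```

Swapping the iodine column for iron (A2) or for combined iodine/iron reference values (A3) changes only the basis matrix, which is why the choice of reference values drives the accuracy differences reported below.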
RESULTS
In phantoms with iron and iodine, results were best for A3, with a mean deviation from the phantom fat content of 1.3 ± 2.6% (ICC 0.999 [95% confidence interval 0.996-1]). The standard approach A1 yielded a deviation of -2.5 ± 3.0% (ICC 0.998 [0.994-0.999]) and A2 of 6.1 ± 4.8% (ICC 0.991 [0.974-0.997]). With A3 and changing tube voltage, the maximal difference between quantified fat and the phantom ground truth occurred at 100 kV with 4.6 ± 2.1%. Differences between scans were largest between 100 kV and 140 kV (2.0% [-7.1-11.2]). The maximal difference when changing the DRI occurred between 16 and 24 DRI, with 0.4% [-2.2-3.0].
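For readers unfamiliar with the Bland-Altman analysis used here, the "mean deviation ± SD" figures above summarize paired differences between the quantified and true fat fractions; the conventional 95% limits of agreement are the bias ± 1.96 SD. A minimal sketch with synthetic numbers (not the study's measurements):

```python
import numpy as np

def bland_altman(measured, truth):
    """Return (bias, lower limit of agreement, upper limit of agreement)."""
    d = np.asarray(measured, float) - np.asarray(truth, float)
    bias = d.mean()                 # mean deviation
    sd = d.std(ddof=1)              # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative quantified fat percentages vs. nominal phantom fat content.
truth    = [0, 10, 20, 35, 50, 100]
measured = [1.2, 11.5, 21.0, 36.8, 51.1, 101.9]
bias, lo, hi = bland_altman(measured, truth)
```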
CONCLUSION
For dlCT fat quantification in the presence of iron, material decomposition with combined reference values for iodine and iron delivers the most accurate results. Tube voltage-specific calibration of reference values is advisable, while the impact of the DRI on dlCT fat quantification is negligible.
Topics: Iron Overload; Phantoms, Imaging; Tomography, X-Ray Computed; Humans; Radiation Dosage; Adipose Tissue; Liver; Iron; Iodine
PubMed: 38781228
DOI: 10.1371/journal.pone.0302863 -
Cureus Apr 2024
Background
Childhood obesity is recognized as a chronic illness with limited therapeutic options. Tackling obesity (a BMI, the weight in kilograms divided by the square of the height in meters, at the 95th percentile or higher) with lifestyle interventions, especially in adolescents, has proven to be a daunting task, yielding only modest results. Research on the use of liraglutide for weight reduction in pediatric patients has yielded conflicting results. Notably, there is a lack of studies in the Middle East reporting on the outcomes of glucagon-like peptide 1 (GLP-1) receptor agonists in treating obesity in children and adolescents, with or without diabetes. This study, conducted in the Middle East, represents the first investigation into the utilization of liraglutide for weight reduction in this pediatric population.
Methods
This retrospective study collected data on 22 consecutive participants, aged 12 to 19 years, who were diagnosed with obesity (defined as a BMI greater than the 95th percentile for their age and sex), either had type 2 diabetes mellitus (T2DM) or were non-diabetic, and attended endocrine clinics at Sidra Medicine, Doha, Qatar, between 2020 and 2022. The study protocol involved a liraglutide treatment period spanning 18 months (72 weeks), with scheduled follow-up appointments at six-month intervals. The primary endpoints were changes in weight and BMI from baseline to the 72-week mark. Secondary endpoints were safety measures and changes in HbA1c.
Results
Out of the initial cohort of 22 patients, 12 completed the full 72-week duration of the study, while 10 patients either discontinued treatment or did not adhere to the prescribed medication regimen due to side effects. Among the 12 patients who completed the study, six had a diagnosis of T2DM. At baseline, the weight, standard deviation score (SDS), BMI, and BMI standard deviation (SD) were 113.9 kg, 2.9, 40.9 kg/m², and 2.6, respectively. At the 18-month follow-up, the weight, SDS, BMI, and BMI SD were 117.8 kg, 2.6, 39 kg/m², and 2.5, respectively. Thus, no statistically significant change in the weight parameters was evident at 18 months compared to baseline. Dropout from the study and poor compliance were high (10 out of 22 patients) due to side effects, mainly gastrointestinal (nausea, abdominal pain, diarrhea, and vomiting). No statistically significant differences were observed between obese patients and obese patients with T2DM. No significant change in HbA1c was found between baseline and treatment follow-up in the diabetes patients. No adverse effects in terms of impairment of liver and kidney function or pancreatitis were observed.
Conclusions
The administration of liraglutide to adolescents with obesity, regardless of whether they had T2DM, in a real-life setting did not yield statistically significant reductions in BMI/weight parameters or HbA1c levels at the 72-week mark. Nevertheless, the study findings indicate that liraglutide is safe for use within this age group, despite mild gastrointestinal side effects.
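The BMI definition quoted in the Background can be checked directly; the numbers below are illustrative only, not patient data from the study.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    # BMI = weight in kilograms divided by the square of the height in meters.
    return weight_kg / height_m ** 2

# e.g. an assumed 90 kg adolescent at 1.50 m:
print(round(bmi(90.0, 1.50), 1))  # → 40.0
```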
PubMed: 38779269
DOI: 10.7759/cureus.58720 -
Clinical Nutrition ESPEN Jun 2024
Randomized Controlled Trial
Effectiveness of a multidisciplinary and transitional nutritional intervention compared with standard care on health-related quality of life among acutely admitted medical patients aged ≥65 years with malnutrition or risk of malnutrition: A randomized controlled trial.
BACKGROUND & AIM
Malnutrition, risk of malnutrition, and risk factors for malnutrition are prevalent among acutely admitted medical patients aged ≥65 years and have significant health-related consequences. Consequently, we aimed to investigate the effectiveness of a multidisciplinary and transitional nutritional intervention on health-related quality of life compared with standard care.
METHODS
The study was a block randomized, observer-blinded clinical trial with two parallel arms. The Intervention Group was offered a multidisciplinary transitional nutritional intervention consisting of dietary counselling and six sub-interventions targeting individually assessed risk factors for malnutrition, while the Control Group received standard care. The inclusion criteria were a Mini Nutritional Assessment Short-Form score ≤11, age ≥65 years, and acute admittance to the Emergency Department. Outcomes were assessed on admission and 8 and 16 weeks after hospital discharge. The primary outcome was the between-group difference in change in health-related quality of life (assessed by the EuroQol-5D-5L) from baseline to 16 weeks after discharge. The secondary outcomes were differences in intake of energy and protein, well-being, muscle strength, and body weight at all timepoints.
RESULTS
From October 2018 to April 2021, 130 participants were included. Sixteen weeks after discharge, 29% in the Intervention Group and 19% in the Control Group were lost to follow-up. Compliance varied between the sub-interventions targeting nutritional risk factors and was generally low after discharge, ranging from 0 to 61%. No difference was found between groups in change in health-related quality of life, or in well-being, muscle strength, and body weight at any timepoint, in either the intention-to-treat or the per-protocol analysis. The protein intake was higher in the Intervention Group during hospitalization (1.1 (Standard Deviation (SD) 0.4) vs 0.8 (SD 0.5) g/kg/day, p = 0.0092) and 8 weeks after discharge (1.2 (SD 0.5) vs 0.9 (SD 0.4) g/kg/day, p = 0.0025). The percentual intake of calculated protein requirements (82% (SD 24) vs 61% (SD 32), p = 0.0021), but not of calculated energy requirements (89% (SD 23) vs 80% (SD 37), p = 0.2), was higher in the Intervention Group than in the Control Group during hospitalization. Additionally, the Intervention Group had a significantly higher percentual intake of calculated protein requirements (94% (SD 41) vs 74% (SD 30), p = 0.015) and calculated energy requirements (115% (SD 37) vs 94% (SD 31), p = 0.0070) 8 weeks after discharge. The intake of energy and protein was comparable between the groups 16 weeks after discharge.
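The "percentual intake of calculated requirements" figures above are simple ratios of observed intake to the individually calculated requirement. A sketch with illustrative numbers (the abstract does not report individual requirements, so the requirement value below is an assumption):

```python
def percent_of_requirement(intake_g_per_kg: float, requirement_g_per_kg: float) -> float:
    # Observed intake expressed as a percentage of the individually
    # calculated requirement.
    return 100.0 * intake_g_per_kg / requirement_g_per_kg

# e.g. a protein intake of 1.1 g/kg/day against an assumed calculated
# requirement of 1.3 g/kg/day:
pct = percent_of_requirement(1.1, 1.3)
print(round(pct, 1))  # → 84.6
```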
CONCLUSION
We found no effect of a multidisciplinary and transitional nutritional intervention for acutely admitted medical patients aged ≥65 years with malnutrition or risk of malnutrition on our primary outcome, health-related quality of life 16 weeks after discharge. Nor did the intervention affect the secondary outcomes, well-being, muscle strength, and body weight from admission to 8 or 16 weeks after discharge. However, the intervention improved energy and protein intake during hospitalization and 8 weeks after discharge. Low compliance with the intervention after discharge may have compromised the effect of the intervention. The study is registered at ClinicalTrials.gov (identifier: NCT03741283).
Topics: Humans; Quality of Life; Aged; Male; Female; Malnutrition; Aged, 80 and over; Nutrition Assessment; Nutritional Status; Risk Factors; Hospitalization; Geriatric Assessment; Nutrition Therapy; Treatment Outcome
PubMed: 38777473
DOI: 10.1016/j.clnesp.2024.02.031 -
Frontiers in Oral Health 2024
OBJECTIVES
This article reports on four rare cases involving multiple trauma-induced adjacent missing anterior teeth in the maxillary or mandibular region. These cases were successfully treated using four axial implants and an immediate loading protocol.
MATERIAL AND METHODS
This retrospective case series summarizes 4 patients who received a total of 20 immediately loaded implants. These patients had suffered trauma-induced loss of 8-9 adjacent anterior teeth. The four axial implants were inserted with the assistance of digital pioneer drill guides. The surgical procedure involved alveolar bone trimming or ultrasonic osteotomy, eliminating the need for traditional large-area bone augmentation. Pre- and postoperative CBCT scans were matched using the DTX Studio Implant software, and the deviation between the actual implant position and the preoperative design was measured and compared using the SPSS software package.
RESULTS
At an average follow-up of 48 months after delivery of the implant prostheses, the cumulative retention rate of the implants was 100%, the marginal bone loss averaged 0.53 mm (SD 0.15 mm), and buccal plate bone loss averaged 0.62 mm (SD 0.41 mm).
CONCLUSIONS
This retrospective clinical report demonstrates the successful treatment of several patients with multiple adjacent missing maxillary or mandibular anterior teeth using four implants supporting a screw-fixed framework and employing immediate loading. The approach resulted in long-term stable clinical outcomes. Moreover, the method not only shortens the period of edentulism but also facilitates easy disassembly, maintenance, and cleaning. Consequently, it emerges as a highly favorable clinical option for patients suffering from extensive tooth loss.
PubMed: 38774040
DOI: 10.3389/froh.2024.1369494 -
Sports Medicine - Open May 2024
Potential Moderators of the Effects of Blood Flow Restriction Training on Muscle Strength and Hypertrophy: A Meta-analysis Based on a Comparison with High-Load Resistance Training.
BACKGROUND
While it has been examined whether low-load resistance training combined with blood-flow restriction (BFR-RT) and high-load resistance training (HL-RT) produce similar magnitudes of muscle strength and hypertrophy adaptations, some important potential moderators (e.g., age, sex, upper versus lower limbs, and training frequency and duration) have yet to be analyzed further. Furthermore, training status, the specificity of muscle strength tests (dynamic versus isometric or isokinetic), and the specificity of muscle mass assessments (locations of muscle hypertrophy assessments) seem to exhibit different effects on the results of the analysis. The role of these influencing factors, therefore, remains to be elucidated.
OBJECTIVES
The aim of this meta-analysis was to compare the effects of BFR- versus HL-RT on muscle adaptations, when considering the influence of population characteristics (training status, sex and age), protocol characteristics (upper or lower limbs, duration and frequency) and test specificity.
METHODS
Studies were identified through database searches based on the following inclusion criteria: (1) pre- and post-training assessment of muscular strength; (2) pre- and post-training assessment of muscular hypertrophy; (3) comparison of BFR-RT vs. HL-RT; (4) score ≥ 4 on PEDro scale; (5) means and standard deviations (or standard errors) are reported or allow estimation from graphs. In cases where the fifth criterion was not met, the data were requested directly from the authors.
RESULTS
The main finding of the present study was that training status was an important factor influencing the effects of BFR-RT. Trained individuals may gain greater muscle strength and hypertrophy with BFR-RT as compared to HL-RT. However, the results showed that untrained individuals experienced similar muscle mass gains and superior muscle strength gains with HL-RT compared to BFR-RT.
CONCLUSION
Training status is an important factor influencing the effects of BFR-RT: trained individuals can obtain greater muscle strength and hypertrophy gains with BFR-RT, while untrained individuals can obtain greater strength gains and similar hypertrophy with HL-RT.
PubMed: 38773002
DOI: 10.1186/s40798-024-00719-3 -
The Journal of Prosthetic Dentistry May 2024
STATEMENT OF PROBLEM
The optimal pretreatment of radicular dentin before cementing a post with glass ionomer cement is unclear.
PURPOSE
The purpose of this in vitro study was to evaluate the retention of prefabricated tapered titanium posts to endodontically treated teeth after applying different pretreatment protocols on the radicular dentin.
MATERIAL AND METHODS
The coronal part of 32 single-rooted human teeth was removed 1 mm coronal to the cemento-enamel junction. All specimens received endodontic treatment, and the root canals were prepared with an instrument to a depth of 10 mm to receive a titanium post. The dentin walls of each specimen were roughened with a hand-held diamond cutting instrument. The specimens were randomly divided according to the surface treatments into 4 groups (n=8): KW: etched with 20% to 30% polyacrylic acid (PAA) (Ketac Conditioner) and rinsed with water; KWI: etched with 20% to 30% PAA, rinsed with water and 70% isopropanol; DW: etched with 30% to 50% PAA (Durelon Liquid) and rinsed with water; DWI: etched with 30% to 50% PAA, rinsed with water and 70% isopropanol. The prefabricated titanium posts were airborne-particle abraded and cemented with glass ionomer cement. The specimens were fixed in custom-made brass cylindrical holders with autopolymerizing acrylic resin with the holder parallel to the long axis of the post. All specimens were stored in water for 3 days at 37 °C. Retention was evaluated using a tensile test with a universal testing machine (Zwick Z010) at a crosshead speed of 2 mm/min. Data were statistically analyzed with a 1-way ANOVA, followed by the Tukey post hoc test for pairwise comparisons between groups (α=.05).
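The statistical pipeline described above (1-way ANOVA at α = .05, followed by Tukey's post hoc test for pairwise comparisons) can be sketched with SciPy. The samples below are synthetic: the KW, KWI, and DWI means/SDs echo those reported in the Results, while the DW mean is an assumption, since the abstract does not report it.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic retention values in newtons, n = 8 per group as in the study design.
groups = [
    rng.normal(201.8, 55.5, 8),   # KW
    rng.normal(316.0, 58.3, 8),   # KWI
    rng.normal(260.0, 60.0, 8),   # DW (mean assumed; not given in abstract)
    rng.normal(328.1, 70.9, 8),   # DWI
]

# One-way ANOVA across the four pretreatment groups.
f_stat, p_value = stats.f_oneway(*groups)

# Pairwise comparisons with Tukey's HSD if the omnibus test is significant.
if p_value < 0.05:
    tukey = stats.tukey_hsd(*groups)
```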
RESULTS
Mean ± standard deviation retention values ranged from 201.8 ± 55.5 N (KW) to 328.1 ± 70.9 N (DWI). Groups DWI and KWI (316 ± 58.3 N) showed statistically higher retention values than group KW (P<.05) but did not significantly differ from retention values obtained in group DW (P>.05).
CONCLUSIONS
An additional final rinse with isopropanol after using PAA increased the retention of the post significantly for all groups. Although group DWI achieved the highest retention values, pretreatment of radicular dentin as in group KWI may also be considered.
PubMed: 38772782
DOI: 10.1016/j.prosdent.2024.05.004 -
American Society of Clinical Oncology... Jun 2024
Review
Clinical trials are essential for advancing oncology treatment strategies and have contributed significantly to the decline in cancer mortality rates over the past decades. Traditional explanatory trials, focused on establishing intervention efficacy in ideal settings, often lack generalizability and may not reflect real-world patient care scenarios. Furthermore, increasing complexity in cancer clinical trial design has led to challenges such as protocol deviations, slow enrollment leading to lengthened durations of trial, and escalating costs. By contrast, pragmatic trials aim to assess intervention effectiveness in more representative patient populations under routine clinical conditions. Here, we review the principles, methodologies, challenges, and advantages of incorporating pragmatic features (PFs) into cancer clinical trials. We illustrate the application of pragmatic trial designs in oncology and discuss the QUASAR collaborative, TAPUR study, and the ongoing PRAGMATICA-LUNG trial. Although not all oncology trials may be amenable to adopting fully pragmatic designs, integration of PFs when feasible will enhance trial generalizability and real-world applicability. Project Pragmatica and similar initiatives advocate for the integration of real-world practice with clinical trials, fostering a nuanced approach to oncology research that balances efficacy and effectiveness assessments, ultimately with a goal of improving patient outcomes.
Topics: Humans; Neoplasms; Clinical Trials as Topic; Research Design; Pragmatic Clinical Trials as Topic; Medical Oncology
PubMed: 38771997
DOI: 10.1200/EDBK_100040 -
Frontiers in Bioengineering and... 2024
The aging process is commonly accompanied by a general or specific loss of muscle mass, force and/or function that inevitably impacts a person's quality of life. To date, various clinical tests and assessments are routinely performed to evaluate the biomechanical status of an individual, to support and inform the clinical management and decision-making process (e.g., to design a tailored rehabilitation program). However, these assessments (e.g., gait analysis or strength measures on a dynamometer) are typically conducted independently from one another or at different time points, providing clinicians with valuable yet fragmented information. We hereby describe a comprehensive protocol that combines measurements (maximal voluntary isometric contraction test, superimposed neuromuscular electrical stimulation, electromyography, gait analysis, magnetic resonance imaging, and clinical measures) and methods (musculoskeletal modeling and simulations) to enable the full characterization of an individual from the biomechanical standpoint. The protocol, which requires approximately 4 h and 30 min to be completed in all its parts, was tested on twenty healthy young participants and five elderly participants, as a proof of concept. The implemented data processing and elaboration procedures allowing for the extraction of several biomechanical parameters (including muscle volumes and cross-sectional areas, muscle activation and co-contraction levels) are thoroughly described to enable replication. The main parameters extracted are reported as mean and standard deviation across the two populations, to highlight the potential of the proposed approach and show some preliminary findings (which were in agreement with previous literature).
PubMed: 38770274
DOI: 10.3389/fbioe.2024.1356417 -
Cureus May 2024
Introduction
The emergence of the COVID-19 pandemic necessitated the implementation of novel guidelines for managing appendicitis, prompting an evaluation of its effects on patient presentation and treatment at a district general hospital. Healthcare facilities worldwide have adapted protocols to meet the unique challenges of the pandemic, ensuring safe and efficient care. Our study assesses the pandemic's influence on patient demographics, clinical outcomes, surgical procedures, and adherence to guidelines among individuals undergoing emergency appendicitis surgery. Through this investigation, we aimed to determine whether significant deviations occurred in managing acute appendicitis amidst the pandemic.
Methodology
Consecutive adult patients (≥18 years) diagnosed with acute appendicitis were included in two cohorts for this retrospective analysis, comparing cases treated during the COVID-19 pandemic period (April to September 2020) with those treated one year prior. All patients underwent standardized assessments upon emergency department admission, including imaging studies and COVID-19 testing. Demographics, laboratory results, surgical details, and outcomes were compared between the pre-pandemic and pandemic groups, focusing on their overall management.
Results
The research involved a total of 172 individuals. During the pandemic (April to September 2020), 91 of these participants underwent surgery, more than the 81 individuals who had surgery during the same period the previous year (April to September 2019). Preoperative C-reactive protein levels were significantly higher in the pandemic group (p = 0.0455). The time from admission to surgery was shorter in the pandemic group (7.5 ± 4.6 vs. 5.8 ± 4.9; p = 0.0155). The overall operative and laparoscopic operative times were longer in the pandemic group (65 vs. 71 minutes, p = 0.391, and 55 vs. 62 minutes, p = 0.1424, respectively); however, these differences were not statistically significant. The number of patients presenting with complicated appendicitis was significantly higher in the pandemic group than in the nonpandemic group (44.4% vs. 61.4%; p = 0.034). The length of stay was shorter in the pandemic group (p = 0.53).
Conclusions
Our study suggests that surgery for acute appendicitis remains safe and feasible during the COVID-19 pandemic, with comparable outcomes. However, we noted an increase in the number of patients presenting with complicated appendicitis, possibly influenced by national pandemic guidelines in the United Kingdom. Despite this trend, our findings affirm the continued effectiveness of surgical management for acute appendicitis during the pandemic, highlighting the adaptability of healthcare systems in addressing emergent medical needs under challenging circumstances.
PubMed: 38770054
DOI: 10.7759/cureus.60674 -
Global Spine Journal May 2024
Effectiveness of a Preoperative Bowel Preparation Protocol for Patients With Adolescent Idiopathic Scoliosis to Decrease Postoperative Gastrointestinal Morbidities and the Hospital Length of Stay.
STUDY DESIGN
Randomised controlled trial.
OBJECTIVE
This study aimed to determine the effectiveness of a preoperative bowel preparation protocol comprising bisacodyl to minimize postoperative gastrointestinal morbidities and the hospital length of stay for patients with adolescent idiopathic scoliosis.
SUMMARY OF BACKGROUND DATA
Patients who undergo scoliosis correction surgery frequently experience postoperative gastrointestinal morbidities and a prolonged hospital length of stay. Emesis, paralytic ileus and constipation are the most common gastrointestinal morbidities. Opioid medication is a well-known risk factor for gastrointestinal complications after scoliosis correction surgery.
METHODS
Eighty-seven patients (22 boys [25.3%] and 65 girls [74.7%]) with a mean age of 17.7 years (standard deviation [SD], ±2.2 years) diagnosed with adolescent idiopathic scoliosis were enrolled in this study and randomized into 2 groups. Group A comprised 44 patients who received a preoperative bowel preparation comprising bisacodyl. Group B comprised 43 patients who did not receive any preoperative medication. Demographic data, height, weight, medical and surgical comorbidities, Risser status, number of instrumented levels and preoperative opioid consumption of all patients were evaluated.
RESULTS
Group A experienced fewer postoperative abdominal symptoms than group B. The mean hospital length of stay was 4.1 days (SD, ±.6 days; median, 4 days; range, 3-5 days) for group A; however, it was 5.3 days (SD, ±.8 days; median, 5 days; range, 4-7 days) for group B (p = .01).
CONCLUSION
The use of a bowel preparation protocol before scoliosis correction surgery for patients with adolescent idiopathic scoliosis can effectively decrease postoperative gastrointestinal morbidities and the hospital length of stay.
PubMed: 38767157
DOI: 10.1177/21925682241249107