The Western Journal of Emergency Medicine, May 2024
INTRODUCTION
During the coronavirus disease 2019 (COVID-19) pandemic, hospitals in the United States experienced a shortage of contrast agent, much of which is manufactured in China. As a result, significantly less intravenous (IV) contrast was available. We sought to determine the effect of restricting the use of IV contrast on emergency department (ED) length of stay (LOS).
METHODS
We conducted a single-institution, retrospective cohort study on adult patients presenting with abdominal pain to the ED from March 7-July 5, 2022. Of 26,122 patient encounters reviewed, 3,028 (11.6%) included abdominopelvic CT with a complaint including "abdominal pain." We excluded patients with outside imaging and non-ED scans. Routine IV contrast agent was administered to approximately 74.6% of patients between March 7-May 6, 2022, when we altered usage guidelines due to a nationwide shortage. Between May 6-July 5, 2022, 32.8% of patients received IV contrast after institutional recommendations were made to limit contrast use. We compared patient demographics and clinical characteristics between groups with the chi-square test for frequency data. We analyzed ED LOS with the nonparametric Wilcoxon rank-sum test for continuous measures, focusing on the periods before and after the new ED protocol. We also used statistical process control charts and plotted the 1, 2, and 3 sigma control limits to visualize the variation in ED LOS over time. The charts include the mean of the data and upper and lower control limits, corresponding to the number of standard deviations away from the mean.
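As a rough illustration of the two statistical tools named in these methods, the sketch below runs a Wilcoxon rank-sum (Mann-Whitney U) test and computes 1-, 2-, and 3-sigma control limits; the LOS values, group sizes, and the daily grouping are synthetic placeholders, not the study's data.

```python
# Minimal sketch of the abstract's two analyses, on synthetic LOS data.
import numpy as np
from scipy.stats import mannwhitneyu  # Wilcoxon rank-sum test

rng = np.random.default_rng(0)
los_before = rng.gamma(shape=4.0, scale=57.0, size=500)  # ~229 min mean
los_after = rng.gamma(shape=4.0, scale=53.0, size=500)   # ~212 min mean

stat, p = mannwhitneyu(los_before, los_after, alternative="two-sided")
print(f"rank-sum p = {p:.4f}")

# Control-chart limits: mean +/- 1, 2, 3 sigma over daily mean LOS.
daily_mean = los_before.reshape(50, 10).mean(axis=1)  # 50 "days" of 10 visits
center, sigma = daily_mean.mean(), daily_mean.std(ddof=1)
for k in (1, 2, 3):
    print(f"{k}-sigma limits: {center - k*sigma:.1f} to {center + k*sigma:.1f}")
```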
RESULTS
After use of routine IV contrast was discontinued, ED LOS declined by 16.5 minutes (229.0 vs 212.5 minutes, P < 0.001; 95% confidence interval -22 to -10 minutes).
CONCLUSION
Intravenous contrast adds significantly to ED LOS. Decreased use of routine IV contrast in the ED accelerates time to CT completion. A policy change to limit IV contrast during a national shortage significantly decreased ED LOS.
Topics: Humans; Emergency Service, Hospital; Contrast Media; Retrospective Studies; COVID-19; Tomography, X-Ray Computed; Female; Male; Middle Aged; Length of Stay; United States; Administration, Intravenous; Adult; Abdominal Pain; SARS-CoV-2; Pandemics; Aged
PubMed: 38801039
DOI: 10.5811/westjem.18515
Open Forum Infectious Diseases, May 2024
BACKGROUND
People with human immunodeficiency virus (HIV; PWH) in the United States have a lower incidence of colon cancer than the general population. The lower incidence may be explained by differences in receipt of screening. Thus, we sought to estimate colon cancer incidence under scenarios in which Medicaid beneficiaries, with or without HIV, followed the same screening protocols.
METHODS
We used data from 1.5 million Medicaid beneficiaries who were enrolled in 14 US states in 2001-2015 and aged 50-64 years; 72,747 beneficiaries had HIV. We estimated risks of colon cancer and death by age, censoring beneficiaries when they deviated from 3 screening protocols, which were based on Medicaid's coverage policy for endoscopies during the time period, with endoscopy once every 2, 4, or 10 years. We used inverse probability weights to control for baseline and time-varying confounding and informative loss to follow-up. Analyses were performed overall, by sex, and by race/ethnicity.
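The inverse probability weighting mentioned here can be sketched as follows; the data frame, covariates, and logistic model are illustrative assumptions, not the authors' specification, and the sketch simplifies a full time-varying analysis to a single-period stabilized censoring weight.

```python
# Illustrative inverse-probability-of-censoring weights; column names and the
# model are invented for this sketch.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.DataFrame({
    "age": [55, 60, 52, 63, 58, 50],
    "hiv": [1, 0, 1, 0, 0, 1],
    "uncensored": [1, 1, 0, 1, 0, 1],  # 0 = deviated from screening protocol
})

# Model the probability of remaining uncensored given covariates.
model = LogisticRegression().fit(df[["age", "hiv"]], df["uncensored"])
p_uncensored = model.predict_proba(df[["age", "hiv"]])[:, 1]

# Stabilized weight: marginal probability over conditional probability;
# censored rows get weight zero.
marginal = df["uncensored"].mean()
df["ipw"] = df["uncensored"] * marginal / p_uncensored
print(df)
```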
RESULTS
PWH had a lower incidence of colon cancer than beneficiaries without HIV. Compared with beneficiaries without HIV, the risk difference at age 65 years was -1.6% (95% confidence interval, -2.3% to -0.7%) among PWH with the 2-year protocol and -0.8% (-1.3% to -0.3%) with the 10-year protocol. Results were consistent across subgroup and sensitivity analyses.
CONCLUSIONS
Our findings suggest that the lower risk of colon cancer that has been observed among PWH aged 50-64 years compared with those without HIV is not due to differences in receipt of lower endoscopy.
Keywords: colon cancer, colorectal cancer screening, endoscopy, Medicaid, human immunodeficiency virus
PubMed: 38798894
DOI: 10.1093/ofid/ofae246
Pilot and Feasibility Studies, May 2024
Pre-operative mechanical bowel preparation and prophylactic oral antibiotics for pediatric patients undergoing elective colorectal surgery: a protocol for a randomized controlled feasibility trial.
BACKGROUND
Infections after elective colorectal surgery remain a significant burden for patients and the healthcare system. Adult studies suggest that the combination of oral antibiotics and mechanical bowel preparation is effective at reducing infections after colorectal surgery. In children, there is limited evidence for either of these practices and the utility of combining oral antibiotics with mechanical bowel preparation remains uncertain.
METHODS
This study aims to determine the feasibility of conducting a randomized controlled trial assessing the efficacy of oral antibiotics, with or without mechanical bowel preparation, in reducing the rates of post-operative infection in pediatric colorectal surgery. Participants aged 3 months to 18 years undergoing elective colorectal surgery will be randomized pre-operatively to one of three trial arms: (1) oral antibiotics; (2) oral antibiotics and mechanical bowel preparation; or (3) standard care. Twelve patients will be included in each trial arm. Feasibility outcomes of interest include the rate of participant recruitment, post-randomization exclusions, protocol deviations, adverse events, and missed follow-up appointments. Secondary outcomes include the rate of post-operative surgical site infections, length of hospital stay, time to full enteral feeds, reoperation, readmission, and complications.
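A minimal sketch of one way to block-randomize participants into the three arms described above (12 per arm); the block size, seed, and implementation are assumptions, not the trial's actual allocation procedure.

```python
# Illustrative block randomization into three trial arms.
import random

ARMS = ["oral antibiotics", "antibiotics + bowel prep", "standard care"]

def block_randomize(n_per_arm: int, block_size: int = 6, seed: int = 42):
    """Return a randomized allocation list balanced within each block."""
    assert block_size % len(ARMS) == 0
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_per_arm * len(ARMS):
        block = ARMS * (block_size // len(ARMS))
        rng.shuffle(block)  # permute arm labels within each block
        allocation.extend(block)
    return allocation[: n_per_arm * len(ARMS)]

print(block_randomize(12)[:6])  # first block of assignments
```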
DISCUSSION
If this trial proves feasible, a multi-center trial will be completed with sufficient power to evaluate the optimal pre-operative bowel preparation for pediatric patients undergoing elective colorectal surgery.
TRIAL REGISTRATION
ClinicalTrials.gov: NCT03593252.
PubMed: 38796500
DOI: 10.1186/s40814-024-01476-6
Sensors (Basel, Switzerland), May 2024
Multi-agent systems are used increasingly often in the research community and industry, as they can complete tasks faster and more efficiently than single-agent systems. In this paper, we present an optimal approach to the multi-agent navigation problem in simply connected workspaces. The task requires each agent to reach its destination from an initial position along an optimal collision-free trajectory. To achieve this, we design a decentralized control protocol, defined by a navigation function, in which each agent is equipped with a navigation controller that resolves imminent safety conflicts with the other agents, as well as with the workspace boundary, without requiring knowledge of the other agents' goal positions. Our approach is sub-optimal, since each agent follows a predetermined individually optimal policy calculated by a novel off-policy iterative method. We use this method because the computational complexity of learning-based methods needed to calculate the globally optimal solution becomes unrealistic as the number of agents increases. We therefore examine how much the resulting sub-optimal trajectory deviates from the optimal one and how much time the multi-agent system needs to accomplish its task as the number of agents increases. Finally, we compare our results with a discrete centralized policy method, the Multi-Agent Poli-RRT* algorithm, to demonstrate the validity of our method alongside other research algorithms.
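To make the idea of a decentralized, navigation-function-style controller concrete, here is a toy potential-field sketch: each agent descends an attractive goal term plus repulsion from nearby agents and the workspace boundary, using only locally observable positions. The potential shape, gains, and geometry are invented for illustration and are not the paper's construction or its off-policy iterative method.

```python
# Toy decentralized controller in the spirit of a navigation function.
import numpy as np

R_WORKSPACE, R_AGENT, DT = 5.0, 0.3, 0.05

def step(pos, goals):
    """One synchronous gradient-descent step for all agents."""
    new = pos.copy()
    for i, p in enumerate(pos):
        grad = 2.0 * (p - goals[i])                  # attractive goal term
        for j, q in enumerate(pos):
            if j == i:
                continue
            d = p - q
            dist = np.linalg.norm(d)
            if dist < 4 * R_AGENT:                   # repel nearby agents only
                grad -= d / (dist**3 + 1e-9)
        r = np.linalg.norm(p)
        if r > R_WORKSPACE - 4 * R_AGENT:            # repel workspace boundary
            grad += (p / r) / ((R_WORKSPACE - r) ** 2 + 1e-9)
        new[i] = p - DT * grad                       # descend the potential
    return new

pos = np.array([[-3.5, 0.3], [3.5, -0.3]])
goals = np.array([[3.5, 0.3], [-3.5, -0.3]])         # head-on swap, offset in y
for _ in range(400):
    pos = step(pos, goals)
print(pos.round(2))                                   # should end near goals
```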
PubMed: 38793989
DOI: 10.3390/s24103134
International Journal of Environmental Research and Public Health, May 2024
Observational Study
Accurate body temperature measurement is essential for monitoring and managing safety during outdoor activities. Physical activities are an essential consideration for public health, with sports accounting for an important proportion of these. Athletes' performance can be directly affected by body temperature fluctuations, with overheating or hypothermia posing serious health risks. Monitoring these temperatures allows coaches and medical staff to make decisions that enhance performance and safety. Traditional methods, like oral, axillary, and tympanic readings, are widely used but face challenges during intense physical activities in real-world environments, and systems developed for specific placements might generate different sensor readouts. This study evaluated the agreement, correlation, and interchangeability of oral, axillary, and tympanic temperature measurements in outdoor exercise conditions. Conducted as an observational field study, it involved 21 adult participants (11 males and 10 females, average age 25.14 ± 5.80 years) who underwent the Yo-Yo intermittent recovery test protocol on an outdoor court. The main outcomes measured were the agreement and correlation between temperature readings from the three methods, both before and after exercise. The results indicate poor agreement between the measurement sites, with significant deviations observed post-exercise. Although the Spearman correlation coefficients showed consistent temperature changes post-exercise across all methods, the standard deviations in the pairwise comparisons exceeded 0.67 °C. This study concluded that widely used temperature measurement methods are challenging to use during outdoor exercise and should not be considered interchangeable. This variability, especially after exercise, underscores the need for further research using gold-standard temperature measurement methods to determine the most suitable site for accurate readings. Care should thus be taken when temperature screening is done at scale using traditional methods, as each measurement site should be considered in its own right.
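A small sketch of the agreement statistics this study describes (Spearman correlation plus Bland-Altman-style bias, SD of differences, and limits of agreement) on synthetic paired readings; the offsets and noise levels are assumptions, not the study's measurements.

```python
# Agreement between two temperature sites on synthetic paired data.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
oral = 36.8 + rng.normal(0, 0.4, 21)            # post-exercise oral readings
tympanic = oral + 0.3 + rng.normal(0, 0.7, 21)  # site offset plus noise

rho, p = spearmanr(oral, tympanic)
diff = tympanic - oral
bias, sd = diff.mean(), diff.std(ddof=1)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
print(f"bias = {bias:.2f} C, SD of differences = {sd:.2f} C")
print(f"limits of agreement: {bias - 1.96*sd:.2f} to {bias + 1.96*sd:.2f} C")
```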
Topics: Humans; Male; Adult; Female; Exercise; Body Temperature; Young Adult; Mouth; Ear; Monitoring, Physiologic
PubMed: 38791809
DOI: 10.3390/ijerph21050595
International Journal of Molecular Sciences, May 2024
Recombinant adeno-associated virus (rAAV) has emerged as a prominent vector for in vivo gene therapy, owing to its distinct advantages. Accurate determination of the rAAV genome titer is crucial for ensuring the safe and effective administration of clinical doses. The evolution of the rAAV genome titer assay from quantitative PCR (qPCR) to digital PCR (dPCR) has enhanced accuracy and precision, yet practical challenges persist. This study systematically investigated the impact of various operational factors on genome titration in a single-factor manner, aiming to address potential sources of variability in the quantitative determination process. Our findings revealed that a pretreatment procedure without genome extraction exhibits superior precision compared with titration with genome extraction. Additionally, notable variations in titration results across different brands of dPCR instruments were documented, with the relative standard deviation (RSD) reaching 23.47% for AAV5 and 11.57% for AAV8. Notably, optimal DNase I digestion conditions were identified: a treatment time exceeding 30 min was necessary, and thermal inactivation after digestion was not needed. We also highlighted that thermal capsid disruption before serial dilution substantially affected AAV genome titers, causing a greater than ten-fold decrease. Conversely, this study found that the additive components of the dilution buffer are not significant contributors to titration variation. Furthermore, we found that repeated freeze-thaw cycles significantly compromised AAV genome titers. In conclusion, a comprehensive dPCR titration protocol incorporating insights from these impact factors was proposed and successfully tested across multiple AAV serotypes. The results demonstrate acceptable variation, with the RSD consistently below 5.00% for all tested AAV samples. This study provides valuable insights to reduce variability and improve the reproducibility of AAV genome titration using dPCR.
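The relative standard deviation quoted here (e.g., 23.47% for AAV5) is the usual percent RSD; a minimal computation, with hypothetical replicate titers:

```python
# Percent relative standard deviation over replicate dPCR titers.
import numpy as np

def rsd_percent(titers):
    """100 * sample SD / mean, the usual definition of %RSD."""
    t = np.asarray(titers, dtype=float)
    return 100.0 * t.std(ddof=1) / t.mean()

aav5_titers = [1.02e12, 1.35e12, 1.58e12]  # hypothetical, genome copies/mL
print(f"AAV5 RSD = {rsd_percent(aav5_titers):.2f}%")
```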
Topics: Dependovirus; Genome, Viral; Genetic Vectors; Humans; Polymerase Chain Reaction; HEK293 Cells; Genetic Therapy; Viral Load
PubMed: 38791184
DOI: 10.3390/ijms25105149
Biomedicines, May 2024
The fracture of nickel-titanium (Ni-Ti) instruments during root canal instrumentation leads to compromised outcomes in endodontic treatments. Despite the significant impact of instrument fracture during a root canal treatment, there is still no universally accepted method to address this complication. Several previous studies have shown the ability of a Neodymium: Yttrium-Aluminum-Perovskite (Nd: YAP) laser to cut endodontic files. This study aims to determine safe irradiation conditions for a clinical procedure involving the use of an Nd: YAP laser for removing fractured nickel-titanium files in root canals. A total of 54 extracted permanent human teeth (n = 54) were used. This study involved nine distinct groups, each employing different irradiation conditions. Groups 1 s, 3 s, 5 s, 10 s, and 15 s consisted of irradiation for 1, 3, 5, 10, and 15 s, respectively. After identifying the longest safe irradiation time, four additional groups were proposed (labeled A, B, C, and D). Group A was composed of three series of irradiations of 5 s each separated by a rest time of 30 s (L5s + 30 s RT). Group B consisted of three series of irradiations of 5 s each separated by a rest time of 60 s (L5s + 60 s RT). Group C consisted of two series of irradiations of 5 s each separated by a rest time of 30 s (L5s + 30 s RT), and group D consisted of two series of irradiations of 5 s each separated by a rest time of 5 s (L5s + 5 s RT). In all groups, continuous irrigation with 2.5 mL of sodium hypochlorite (3% NaOCl) was carried out during the rest time. The temperature variation during irradiation with the different protocols was recorded with a thermocouple, and the mean and standard deviation of the temperature increase were noted. The temperature rise was calculated as the Δ between the highest temperature recorded at the root surface and the baseline temperature (37 °C). Additionally, scanning electron microscopy (SEM) was used after irradiation in all groups to assess the morphological changes in the root dentinal walls. The Nd: YAP laser irradiation parameters were a power of 3 W, an energy of 300 mJ per pulse, a fiber diameter of 200 µm, a pulsed mode of irradiation with a frequency of 10 Hz, a pulse duration of 150 µs, and an energy density of 955.41 J/cm². Our results show that the safest protocol for bypassing and/or removing broken instruments involves three series of irradiation of 5 s each with a rest time of 30 s between each series. Furthermore, our results suggest that continuous irradiation for 10 s or more may be harmful to periodontal tissue.
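The stated energy density follows from the pulse energy and fiber diameter; the short check below reproduces the reported 955.41 J/cm² to within rounding (the exact figure depends on the precision used for the fiber area).

```python
# Energy density per pulse from the laser settings above.
import math

energy_j = 300e-3            # 300 mJ per pulse
fiber_diameter_cm = 200e-4   # 200 um fiber
area_cm2 = math.pi * (fiber_diameter_cm / 2) ** 2
print(f"energy density = {energy_j / area_cm2:.2f} J/cm^2")  # ~954.93
```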
PubMed: 38790993
DOI: 10.3390/biomedicines12051031
PLoS One, 2024
OBJECTIVES
In contrast to other spectral CT techniques, fat quantification in dual-layer detector CT (dlCT) has only recently been developed. The impact on dlCT fat quantification of concomitant iron overload and of dlCT-specific protocol settings, such as the dose right index (DRI), a measure of image noise and tube current, was unclear. Further, spectral information has only recently become available below 120 kV. Therefore, this study's objective was to evaluate the impact of iron, changing tube voltage, and DRI on dlCT fat quantification.
MATERIAL AND METHODS
Phantoms with 0 and 8 mg/cm³ iron; 0 and 5 mg/cm³ iodine; and 0, 10, 20, 35, 50, and 100% fat and liver equivalent, respectively, were scanned with a dlCT (CT7500, Philips, the Netherlands) at 100 kV/20 DRI, 120 kV/20 DRI, 140 kV/20 DRI, and at 120 kV/16 DRI and 120 kV/24 DRI. Material decomposition was done for fat, liver, and iodine (A1); for fat, liver, and iron (A2); and for fat, liver, and combined reference values of iodine and iron (A3). All scans were analyzed with reference values from 120 kV/20 DRI. For statistics, the intraclass correlation coefficient (ICC) and Bland-Altman analyses were used.
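Three-material decomposition of the kind described can be sketched as a linear solve: two spectral measurements plus the constraint that material fractions sum to one give three equations in the fat, liver, and iodine fractions. The basis attenuation values below are invented placeholders, not the scanner's calibration.

```python
# Toy three-material decomposition under volume conservation.
import numpy as np

# Rows: low-energy HU, high-energy HU, volume constraint.
# Columns: fat, liver, iodine. Basis values are assumed for illustration.
A = np.array([
    [-110.0, 60.0, 2500.0],   # assumed low-energy basis values (HU)
    [-90.0, 55.0, 1200.0],    # assumed high-energy basis values (HU)
    [1.0, 1.0, 1.0],          # fractions sum to one
])
measured = np.array([40.0, 30.0, 1.0])  # a hypothetical voxel

fractions = np.linalg.solve(A, measured)
print(dict(zip(["fat", "liver", "iodine"], fractions.round(3))))
```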
RESULTS
In phantoms with iron and iodine, results were best for A3, with a mean deviation from phantom fat of 1.3±2.6% (ICC 0.999 [95% confidence interval 0.996-1]). The standard approach A1 yielded a deviation of -2.5±3.0% (0.998 [0.994-0.999]), and A2 of 6.1±4.8% (0.991 [0.974-0.997]). With A3 and changing tube voltage, the maximal difference between quantified fat and the phantom ground truth occurred at 100 kV with 4.6±2.1%. Differences between scans were largest between 100 kV and 140 kV (2.0% [-7.1 to 11.2]). The maximal difference from changing DRI occurred between 16 and 24 DRI with 0.4% [-2.2 to 3.0].
CONCLUSION
For dlCT fat quantification in the presence of iron, material decomposition with combined reference values for iodine and iron delivers the most accurate results. Tube voltage-specific calibration of reference values is advisable, while the impact of the DRI on dlCT fat quantification is negligible.
Topics: Iron Overload; Phantoms, Imaging; Tomography, X-Ray Computed; Humans; Radiation Dosage; Adipose Tissue; Liver; Iron; Iodine
PubMed: 38781228
DOI: 10.1371/journal.pone.0302863
Cureus, Apr 2024
Background
Childhood obesity is recognized as a chronic illness with limited therapeutic options. Tackling obesity (BMI, the weight in kilograms divided by the square of the height in meters, at the 95th percentile or higher) with lifestyle interventions, especially in adolescents, has proven to be a daunting task, yielding only modest results. Research on the use of liraglutide for weight reduction in pediatric patients has yielded conflicting results. Notably, there is a lack of studies in the Middle East reporting on the outcomes of glucagon-like peptide 1 (GLP-1) receptor agonists in treating obesity in children and adolescents, with or without diabetes. This study, conducted in the Middle East, represents the first investigation into the utilization of liraglutide for weight reduction in this pediatric population.
Methods
This retrospective study collected data on 22 consecutive participants, aged 12 to 19 years, who were diagnosed with obesity (defined as having a BMI greater than the 95th percentile for their age and sex), with or without type 2 diabetes mellitus (T2DM), who attended endocrine clinics at Sidra Medicine, Doha, Qatar, between 2020 and 2022. The study protocol involved a liraglutide treatment period spanning 18 months (72 weeks), with scheduled follow-up appointments at six-month intervals. The primary endpoints were changes in weight and BMI from baseline to the 72-week mark. Secondary endpoints were safety measures and changes in HbA1c.
Results
Out of the initial cohort of 22 patients, 12 completed the full 72-week duration of the study, while 10 patients either discontinued treatment or did not adhere to the prescribed medication regimen due to side effects. Among the 12 patients who completed the study, six had a diagnosis of T2DM. At baseline, the weight, standard deviation score (SDS), BMI, and BMI standard deviation (SD) were 113.9 kg, 2.9, 40.9 kg/m², and 2.6, respectively. At the 18-month follow-up, the weight, SDS, BMI, and BMI SD were 117.8 kg, 2.6, 39 kg/m², and 2.5, respectively. Thus, no statistically significant change in the weight parameters was evident at 18 months compared to baseline. Dropout from the study and poor compliance were high (10 out of 22 patients) due to side effects, mainly gastrointestinal (nausea, abdominal pain, diarrhea, and vomiting). No statistically significant differences were observed between patients with obesity alone and those with obesity and T2DM. No significant change in HbA1c was found between baseline and treatment follow-up in the patients with diabetes. No adverse effects in terms of impairment of liver or kidney function or pancreatitis were observed.
Conclusions
The administration of liraglutide to adolescents with obesity, with or without T2DM, in a real-life setting did not yield statistically significant reductions in BMI/weight parameters or HbA1c levels at the 72-week mark. Nevertheless, the study findings indicate that liraglutide is safe for use within this age group, despite the presence of mild gastrointestinal side effects.
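For reference, the BMI arithmetic behind these figures, with a simple linear z-score; the height and reference mean/SD are assumptions (pediatric references actually use the LMS method), chosen so the BMI roughly matches the reported baseline of 40.9 kg/m².

```python
# BMI and a placeholder standard deviation score (z-score).
def bmi(weight_kg: float, height_m: float) -> float:
    """Weight in kilograms divided by the square of the height in meters."""
    return weight_kg / height_m ** 2

def sds(value: float, ref_mean: float, ref_sd: float) -> float:
    """Linear z-score against an assumed age/sex reference mean and SD."""
    return (value - ref_mean) / ref_sd

b = bmi(113.9, 1.67)  # baseline weight at an assumed height of 1.67 m
print(f"BMI = {b:.1f} kg/m^2, SDS = {sds(b, 21.0, 4.5):.1f}")
```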
PubMed: 38779269
DOI: 10.7759/cureus.58720
Clinical Nutrition ESPEN, Jun 2024
Randomized Controlled Trial
Effectiveness of a multidisciplinary and transitional nutritional intervention compared with standard care on health-related quality of life among acutely admitted medical patients aged ≥65 years with malnutrition or risk of malnutrition: A randomized controlled trial.
BACKGROUND & AIM
Malnutrition, risk of malnutrition, and risk factors for malnutrition are prevalent among acutely admitted medical patients aged ≥65 years and have significant health-related consequences. Consequently, we aimed to investigate the effectiveness of a multidisciplinary and transitional nutritional intervention on health-related quality of life compared with standard care.
METHODS
The study was a block randomized, observer-blinded clinical trial with two parallel arms. The Intervention Group was offered a multidisciplinary transitional nutritional intervention consisting of dietary counselling and six sub-interventions targeting individually assessed risk factors for malnutrition, while the Control Group received standard care. The inclusion criteria were a Mini Nutritional Assessment Short-Form score ≤11, age ≥65 years, and acute admission to the Emergency Department. Outcomes were assessed on admission and 8 and 16 weeks after hospital discharge. The primary outcome was the difference between groups in change in health-related quality of life (assessed by the EuroQol-5D-5L) from baseline to 16 weeks after discharge. The secondary outcomes were differences in intake of energy and protein, well-being, muscle strength, and body weight at all timepoints.
RESULTS
From October 2018 to April 2021, 130 participants were included. Sixteen weeks after discharge, 29% in the Intervention Group and 19% in the Control Group were lost to follow-up. Compliance varied between the sub-interventions targeting nutritional risk factors and was generally low after discharge, ranging from 0 to 61%. No difference was found between groups in change in health-related quality of life, or in well-being, muscle strength, or body weight at any timepoint, in either the intention-to-treat or the per-protocol analysis. Protein intake was higher in the Intervention Group during hospitalization (1.1 (standard deviation (SD) 0.4) vs 0.8 (SD 0.5) g/kg/day, p = 0.0092) and 8 weeks after discharge (1.2 (SD 0.5) vs 0.9 (SD 0.4) g/kg/day, p = 0.0025). The percentage intake of calculated protein requirements (82% (SD 24) vs 61% (SD 32), p = 0.0021), but not of calculated energy requirements (89% (SD 23) vs 80% (SD 37), p = 0.2), was higher in the Intervention Group than in the Control Group during hospitalization. Additionally, the Intervention Group had a significantly higher percentage intake of calculated protein requirements (94% (SD 41) vs 74% (SD 30), p = 0.015) and calculated energy requirements (115% (SD 37) vs 94% (SD 31), p = 0.0070) 8 weeks after discharge. The intake of energy and protein was comparable between the groups 16 weeks after discharge.
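The intake outcomes above reduce to simple per-kilogram arithmetic; a minimal sketch, assuming an illustrative protein requirement of 1.3 g/kg/day (the study's calculated requirements are not given here).

```python
# Protein intake per kilogram and percentage of a calculated requirement.
def intake_per_kg(grams_per_day: float, weight_kg: float) -> float:
    return grams_per_day / weight_kg

def pct_of_requirement(intake_g_kg: float, required_g_kg: float = 1.3) -> float:
    """Intake as a percentage of an assumed per-kg requirement."""
    return 100.0 * intake_g_kg / required_g_kg

i = intake_per_kg(77.0, 70.0)  # e.g., 77 g protein for a 70 kg patient
print(f"{i:.1f} g/kg/day = {pct_of_requirement(i):.0f}% of requirement")
```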
CONCLUSION
We found no effect of a multidisciplinary and transitional nutritional intervention for acutely admitted medical patients aged ≥65 years with malnutrition or risk of malnutrition on our primary outcome, health-related quality of life 16 weeks after discharge. Nor did the intervention affect the secondary outcomes, well-being, muscle strength, and body weight from admission to 8 or 16 weeks after discharge. However, the intervention improved energy and protein intake during hospitalization and 8 weeks after discharge. Low compliance with the intervention after discharge may have compromised the effect of the intervention. The study is registered at ClinicalTrials.gov (identifier: NCT03741283).
Topics: Humans; Quality of Life; Aged; Male; Female; Malnutrition; Aged, 80 and over; Nutrition Assessment; Nutritional Status; Risk Factors; Hospitalization; Geriatric Assessment; Nutrition Therapy; Treatment Outcome
PubMed: 38777473
DOI: 10.1016/j.clnesp.2024.02.031