European Journal of Public Health, Dec 2023
BACKGROUND
Missed opportunities constitute a main driver of suboptimal seasonal influenza vaccination (SIV) coverage in older adults. Vaccine co-administration is a way to reduce these missed opportunities. In this study, we quantified missed opportunities for SIV, identified some of their socio-structural correlates and documented SIV co-administration patterns.
METHODS
In this registry-based retrospective cohort study, we verified the SIV status of all subjects aged ≥65 years who received at least one dose of coronavirus disease 2019 (COVID-19), pneumococcal or herpes zoster vaccines during the 2022/23 influenza season. The frequency of concomitant same-day administration of SIV with other target vaccines was also assessed.
RESULTS
Among 41 112, 5482 and 3432 older adults who received ≥1 dose of COVID-19, pneumococcal and herpes zoster vaccines, missed opportunities for SIV accounted for 23.3%, 5.0% and 13.2%, respectively. Younger, male and foreign-born individuals were generally more prone to missing SIV. The co-administration of SIV with other recommended vaccines was relatively low, being 11.0%, 53.1% and 17.1% in COVID-19, pneumococcal and herpes zoster cohorts, respectively.
CONCLUSIONS
A sizeable proportion of older adults who received other recommended vaccines during the last influenza season did not receive SIV. These missed opportunities, which are subject to some social inequalities, may be addressed by increasing vaccine co-administration rates and implementing tailored health promotion interventions.
Topics: Humans; Male; Aged; Influenza Vaccines; Influenza, Human; Retrospective Studies; Vaccination; Pneumococcal Vaccines; Herpes Zoster; Herpes Zoster Vaccine; COVID-19; Italy
PubMed: 37632235
DOI: 10.1093/eurpub/ckad155
Vaccines, Dec 2023
Geospatial Analyses of Recent Household Surveys to Assess Changes in the Distribution of Zero-Dose Children and Their Associated Factors before and during the COVID-19 Pandemic in Nigeria.
The persistence of geographic inequities in vaccination coverage often evidences the presence of zero-dose and missed communities and their vulnerabilities to vaccine-preventable diseases. These inequities were exacerbated in many places during the coronavirus disease 2019 (COVID-19) pandemic, due to severe disruptions to vaccination services. Understanding changes in zero-dose prevalence and its associated risk factors in the context of the COVID-19 pandemic is, therefore, critical to designing effective strategies to reach vulnerable populations. Using data from nationally representative household surveys conducted before the COVID-19 pandemic, in 2018, and during the pandemic, in 2021, in Nigeria, we fitted Bayesian geostatistical models to map the distribution of three vaccination coverage indicators: receipt of the first dose of diphtheria-tetanus-pertussis-containing vaccine (DTP1), the first dose of measles-containing vaccine (MCV1), and any of the four basic vaccines (Bacillus Calmette-Guérin (BCG), oral polio vaccine (OPV0), DTP1, and MCV1), and the corresponding zero-dose estimates independently at a 1 × 1 km resolution and the district level during both time periods. We also explored changes in the factors associated with non-vaccination at the national and regional levels using multilevel logistic regression models. Our results revealed no increases in zero-dose prevalence due to the pandemic at the national level, although considerable increases were observed in a few districts. We found substantial subnational heterogeneities in vaccination coverage and zero-dose prevalence both before and during the pandemic, showing broadly similar patterns in both time periods. Areas with relatively higher zero-dose prevalence occurred mostly in the north and a few places in the south in both time periods. We also found consistent areas of low coverage and high zero-dose prevalence using all three zero-dose indicators, revealing the areas in greatest need. At the national level, risk factors related to socioeconomic/demographic status (e.g., maternal education), maternal access to and utilization of health services, and remoteness were strongly associated with the odds of being zero dose in both time periods, while those related to communication were mostly relevant before the pandemic. These associations were also supported at the regional level, but we additionally identified risk factors specific to zero-dose children in each region; for example, communication and cross-border migration in the northwest. Our findings can help guide tailored strategies to reduce zero-dose prevalence and boost coverage levels in Nigeria.
PubMed: 38140234
DOI: 10.3390/vaccines11121830
Vaccines, Sep 2023
Defining the Zero Dose Child: A Comparative Analysis of Two Approaches and Their Impact on Assessing the Zero Dose Burden and Vulnerability Profiles across 82 Low- and Middle-Income Countries.
While there is a coordinated effort around reaching zero dose children and closing existing equity gaps in immunization delivery, it is important that there is agreement and clarity around how 'zero dose status' is defined and what is gained and lost by using different indicators for zero dose status. There are two popular approaches used in research, program design, and advocacy to define zero dose status: one uses a single vaccine to serve as a proxy for zero dose status, while another uses a subset of vaccines to identify children who have missed all routine vaccines. We provide a global analysis utilizing the most recent publicly available DHS and MICS data from 2010 to 2020 to compare the number, proportion, and profile of children aged 12 to 23 months who are 'penta-zero dose' (have not received the pentavalent vaccine), 'truly' zero dose (have not received any dose of BCG, polio, pentavalent, or measles vaccines), and 'misclassified' zero dose children (those who are penta-zero dose but have received at least one other vaccine). Our analysis includes 194,829 observations from 82 low- and middle-income countries. Globally, 14.2% of children are penta-zero dose and 7.5% are truly zero dose, suggesting that 46.5% of penta-zero dose children have had at least one contact with the immunization system. While there are similarities in the profile of children who are penta-zero dose and truly zero dose, there are key differences in the prevalence of several characteristics, including access to maternal and child health services, between truly zero dose and misclassified zero dose children. By understanding the extent of the connection zero dose children may have with the health and immunization system and contrasting it with how much the use of a more feasible definition of zero dose may underestimate the level of vulnerability in the zero dose population, we provide insights that can help immunization programs design strategies that better target the most disadvantaged populations. If the vulnerability profiles of the truly zero dose children are qualitatively different from those of the penta-zero dose children, then failing to distinguish the truly zero dose populations, and how to optimally reach them, may lead to the development of misguided or inefficient strategies for vaccinating the most disadvantaged population of children.
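To make the two definitions concrete, the short sketch below classifies hypothetical child records using the criteria stated above (penta-zero dose: no pentavalent dose; truly zero dose: no BCG, polio, pentavalent, or measles dose). The field names and records are invented for illustration and are not DHS/MICS variables.

```python
# Hypothetical illustration of the two zero-dose definitions described above.
# Field names and records are invented for the example; they are not DHS/MICS variables.

def classify(child: dict) -> str:
    """Classify a child record under both definitions."""
    penta_zero = child["penta_doses"] == 0
    truly_zero = all(child[v] == 0 for v in ("bcg_doses", "polio_doses", "penta_doses", "measles_doses"))
    if truly_zero:
        return "truly zero dose"
    if penta_zero:
        return "penta-zero dose (misclassified: had at least one other vaccine contact)"
    return "not zero dose"

children = [
    {"bcg_doses": 0, "polio_doses": 0, "penta_doses": 0, "measles_doses": 0},  # no contact at all
    {"bcg_doses": 1, "polio_doses": 0, "penta_doses": 0, "measles_doses": 0},  # penta-zero, not truly zero
    {"bcg_doses": 1, "polio_doses": 1, "penta_doses": 3, "measles_doses": 1},  # fully reached
]
for child in children:
    print(classify(child))

# Share of penta-zero dose children with at least one immunization contact, from the two
# prevalences above. The abstract reports 46.5%, presumably computed from unrounded survey
# estimates; the rounded figures used here give approximately the same share.
penta_zero_pct, truly_zero_pct = 14.2, 7.5
print(f"{(penta_zero_pct - truly_zero_pct) / penta_zero_pct:.1%} of penta-zero dose children had some contact")
```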
PubMed: 37896946
DOI: 10.3390/vaccines11101543
Medical Physics, Feb 2024
BACKGROUND
Radiotherapy dose predictions have been trained with data from previously treated patients of similar sites and prescriptions. However, clinical datasets are often inconsistent and do not contain the same number of organ at risk (OAR) structures. The effects of missing contour data in deep learning-based dose prediction models have not been studied.
PURPOSE
The purpose of this study was to investigate the impacts of incomplete contour sets in the context of deep learning-based radiotherapy dose prediction models trained with clinical datasets and to introduce a novel data substitution method that utilizes automated contours for undefined structures.
METHODS
We trained Standard U-Nets and Cascade U-Nets to predict the volumetric dose distributions of patients with head and neck cancers (HNC), using three input variations to evaluate the effects of missing contours as well as a novel data substitution method. Each architecture was trained with original contour (OC) inputs, which included cases with missing contour information; hybrid contour (HC) inputs, in which automated OAR contours generated in software were substituted for missing contour data; and automated contour (AC) inputs, containing only automated OAR contours. A total of 120 HNC treatments were used for model training, 30 for validation and tuning, and 44 for evaluation and testing. Model performance and accuracy were evaluated with global whole-body dose agreement, PTV coverage accuracy, and OAR dose agreement. The differences in these values between dataset variations were used to determine the effects of missing data and automated contour substitutions.
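A minimal sketch of the kind of input-channel substitution described above is given below, assuming one binary mask channel per OAR stacked into the model input; the OAR list, array shapes, and fallback rule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative input assembly for a dose-prediction model: one binary mask channel per OAR.
# The OAR list, volume shape, and fallback rule are assumptions for this sketch, not the study's code.
OARS = ["brainstem", "mandible", "parotid_l", "parotid_r"]
VOLUME_SHAPE = (64, 128, 128)  # (z, y, x), downsampled for the example

def build_input(clinical_masks: dict, automated_masks: dict, mode: str) -> np.ndarray:
    """Stack OAR channels using original (OC), hybrid (HC), or automated (AC) contours."""
    channels = []
    for oar in OARS:
        if mode == "OC":        # clinical contour only; channel stays empty if it is missing
            mask = clinical_masks.get(oar)
        elif mode == "HC":      # clinical contour, with the automated contour substituted if missing
            mask = clinical_masks.get(oar, automated_masks.get(oar))
        elif mode == "AC":      # automated contours only
            mask = automated_masks.get(oar)
        else:
            raise ValueError(mode)
        channels.append(mask if mask is not None else np.zeros(VOLUME_SHAPE, dtype=np.float32))
    return np.stack(channels, axis=0)  # (n_OARs, z, y, x) input tensor

# Example: three OAR contours are absent from the clinical set and get substituted in HC mode.
clinical = {"brainstem": np.ones(VOLUME_SHAPE, np.float32)}
automated = {oar: np.ones(VOLUME_SHAPE, np.float32) for oar in OARS}
print(build_input(clinical, automated, mode="HC").shape)  # (4, 64, 128, 128)
```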
RESULTS
Automated contours used as substitutions for missing data were found to improve dose prediction accuracy in the Standard U-Net and Cascade U-Net, with a statistically significant difference in some global metrics and/or OAR metrics. For both models, PTV coverage between input variations was unaffected by the substitution technique. Automated contours in HC and AC datasets improved mean dose accuracy for some OAR contours, including the mandible and brainstem, with a greater improvement seen with HC datasets. Global dose metrics, including mean absolute error, mean error, and percent error, were different for the Standard U-Net but not for the Cascade U-Net.
CONCLUSION
Automated contours used as a substitution for contour data improved prediction accuracy for some but not all dose prediction metrics. Compared to the Standard U-Net models, the Cascade U-Net achieved greater precision.
Topics: Humans; Organs at Risk; Radiotherapy Planning, Computer-Assisted; Head and Neck Neoplasms; Radiotherapy Dosage; Software
PubMed: 38127972
DOI: 10.1002/mp.16898
The Cochrane Database of Systematic Reviews, Oct 2023
Review
BACKGROUND
Vitamin D possesses immunomodulatory properties and has been implicated in the pathogenesis and severity of inflammatory bowel disease (IBD). Animal studies and emerging epidemiological evidence have demonstrated an association between vitamin D deficiency and worse disease activity. However, the role of vitamin D for the treatment of IBD is unclear.
OBJECTIVES
To evaluate the benefits and harms of vitamin D supplementation as a treatment for IBD.
SEARCH METHODS
We used standard, extensive Cochrane search methods. The latest search date was June 2023.
SELECTION CRITERIA
We included randomised controlled trials (RCTs) in people of all ages with active or inactive IBD comparing any dose of vitamin D with another dose of vitamin D, another intervention, placebo, or no intervention. We defined doses as: vitamin D (all doses), any-treatment-dose vitamin D (greater than 400 IU/day), high-treatment-dose vitamin D (greater than 1000 IU/day), low-treatment-dose vitamin D (400 IU/day to 1000 IU/day), and supplemental-dose vitamin D (less than 400 IU/day).
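The dose strata above are simple threshold rules; one way to encode them is sketched below. The handling of values exactly at 400 or 1000 IU/day follows the stated ranges literally and is an interpretation for illustration, not Cochrane analysis code.

```python
# Sketch of the dose strata defined above. Boundary handling mirrors the stated ranges
# literally (400-1000 IU/day is "low treatment dose"); it is an interpretation, not Cochrane code.

def dose_categories(iu_per_day: float) -> list[str]:
    cats = ["vitamin D (all doses)"]
    if iu_per_day > 400:
        cats.append("any-treatment-dose (>400 IU/day)")
    if iu_per_day > 1000:
        cats.append("high-treatment-dose (>1000 IU/day)")
    elif 400 <= iu_per_day <= 1000:
        cats.append("low-treatment-dose (400-1000 IU/day)")
    else:
        cats.append("supplemental-dose (<400 IU/day)")
    return cats

for dose in (200, 800, 2000):
    print(dose, "->", dose_categories(dose))
```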
DATA COLLECTION AND ANALYSIS
We used standard Cochrane methods. Our primary outcomes were 1. clinical response for people with active disease, 2. clinical relapse for people in remission, 3. quality of life, and 4. withdrawals due to adverse events. Our secondary outcomes were 5. disease activity at end of study, 6. normalisation of vitamin D levels at end of study, and 7. total serious adverse events. We used GRADE to assess certainty of evidence for each outcome.
MAIN RESULTS
We included 22 RCTs with 1874 participants. Study duration ranged from four to 52 weeks. Ten studies enrolled people with Crohn's disease (CD), five enrolled people with ulcerative colitis (UC), and seven enrolled people with CD and people with UC. Seventeen studies included adults, three included children, and two included both. Four studies enrolled people with active disease, six enrolled people in remission, and 12 enrolled both. We assessed each study for risk of bias across seven individual domains. Five studies were at low risk of bias across all seven domains. Ten studies were at unclear risk of bias in at least one domain but with no areas of high risk of bias. Seven studies were at high risk of bias for blinding of participants and assessors.
Vitamin D (all doses) versus placebo or no treatment: Thirteen studies compared vitamin D against placebo or no treatment. We could not draw any conclusions on clinical response for UC as the certainty of the evidence was very low (risk ratio (RR) 4.00, 95% confidence interval (CI) 1.51 to 10.57; 1 study, 60 participants). There were no data on CD. There may be fewer clinical relapses for IBD when using vitamin D compared to placebo or no treatment (RR 0.57, 95% CI 0.34 to 0.96; 3 studies, 310 participants). The certainty of the evidence was low. We could not draw any conclusions on quality of life for IBD (standardised mean difference (SMD) -0.13, 95% CI -3.10 to 2.83; the SMD value indicates a negligible decrease in quality of life, and the corresponding CIs indicate that the effect can range from a large decrease to a large increase in quality of life; 2 studies, 243 participants) or withdrawals due to adverse events for IBD (RR 1.97, 95% CI 0.18 to 21.27; 12 studies, 1251 participants; note that 11 studies reported withdrawals but recorded 0 events in both groups, so the RR and CIs were calculated from 1 study rather than 12). The certainty of the evidence was very low.
High-treatment-dose vitamin D versus low-treatment-dose vitamin D: Five studies compared high-treatment-dose vitamin D against low-treatment-dose vitamin D. There were no data on clinical response. There may be no difference in clinical relapse for CD (RR 0.48, 95% CI 0.23 to 1.01; 1 study, 34 participants). The certainty of the evidence was low. We could not draw any conclusions on withdrawals due to adverse events for IBD as the certainty of the evidence was very low (RR 0.89, 95% CI 0.06 to 13.08; 3 studies, 104 participants; note that 2 studies reported withdrawals but recorded 0 events in both groups, so the RR and CIs were calculated from 1 study rather than 3). The data on quality of life and disease activity could not be meta-analysed, were of very low certainty, and no conclusions could be drawn.
Any-treatment-dose vitamin D versus supplemental-dose vitamin D: Four studies compared treatment doses of vitamin D against supplemental doses. There were no data on clinical response and relapse. There were no data on quality of life that could be meta-analysed. We could not draw any conclusions on withdrawals due to adverse events for IBD as the certainty of the evidence was very low (RR 3.09, 95% CI 0.13 to 73.17; 4 studies, 233 participants; note that 3 studies reported withdrawals but recorded 0 events in both groups, so the RR and CIs were calculated from 1 study rather than 4).
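For readers less familiar with the notation used above, the sketch below computes a risk ratio and its log-scale 95% confidence interval from 2×2 counts using the standard large-sample formula; the counts are invented for illustration and are not data from the included trials.

```python
import math

# Risk ratio and large-sample 95% CI from 2x2 counts (events / total in each arm).
# The counts below are invented for illustration; they are not data from the included trials.
def risk_ratio(events_a: int, total_a: int, events_b: int, total_b: int):
    rr = (events_a / total_a) / (events_b / total_b)
    se_log_rr = math.sqrt(1 / events_a - 1 / total_a + 1 / events_b - 1 / total_b)
    low = math.exp(math.log(rr) - 1.96 * se_log_rr)
    high = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, low, high

rr, low, high = risk_ratio(events_a=12, total_a=80, events_b=21, total_b=78)  # e.g. relapses, vitamin D vs placebo
print(f"RR {rr:.2f}, 95% CI {low:.2f} to {high:.2f}")
```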
AUTHORS' CONCLUSIONS
There may be fewer clinical relapses when comparing vitamin D with placebo, but we cannot draw any conclusions on differences in clinical response, quality of life, or withdrawals, due to very low-certainty evidence. When comparing high and low doses of vitamin D, there were no data for clinical response, but there may be no difference in relapse for CD. We cannot draw conclusions on the other outcomes due to very low certainty evidence. Finally, comparing vitamin D (all doses) to supplemental-dose vitamin D, there were no data on clinical relapse or response, and we could not draw conclusions on other outcomes due to very low certainty evidence or missing data. It is difficult to make any clear recommendations for future research on the basis of the findings of this review. Future studies must be clear on the baseline populations, the purpose of vitamin D treatment, and, therefore, study an appropriate dosing strategy. Stakeholders in the field may wish to reach consensus on such issues prior to new studies.
Topics: Adult; Animals; Child; Humans; Vitamin D; Remission Induction; Neoplasm Recurrence, Local; Colitis, Ulcerative; Crohn Disease; Recurrence
PubMed: 37781953
DOI: 10.1002/14651858.CD011806.pub2
European Journal of Radiology, Mar 2024
Review
The computed tomography (CT) technique is extensively employed as an imaging modality in clinical settings. The radiation dose of CT, however, is substantial, thereby raising concerns regarding the potential radiation damage it may cause. Reducing the X-ray exposure dose in CT scanning may result in a significant decline in imaging quality, thereby elevating the risk of missed diagnosis and misdiagnosis. Reducing the CT radiation dose while acquiring high-quality images that meet clinical diagnostic requirements has always been a critical research focus and challenge in the field of CT. Over the years, scholars have conducted extensive research on enhancing low-dose CT (LDCT) imaging algorithms, among which deep learning-based algorithms have demonstrated superior performance. In this review, we initially introduce the conventional algorithms for CT image reconstruction along with their respective advantages and disadvantages. Subsequently, we provide a detailed description of four aspects of the application of deep neural networks in the LDCT imaging process: preprocessing in the projection domain, post-processing in the image domain, dual-domain processing, and direct deep learning-based reconstruction (DLR). Furthermore, we analyze the merits and demerits of each method. The commercial and clinical applications of LDCT-DLR algorithms are also presented in an overview. Finally, we summarize the existing issues pertaining to LDCT-DLR and conclude the paper by outlining prospective trends for algorithmic advancement.
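Of the four categories listed, image-domain post-processing is the simplest to illustrate: a network maps a reconstructed low-dose image towards its routine-dose counterpart. The sketch below shows a minimal residual CNN of this kind in PyTorch; the architecture, loss, and dummy data are generic assumptions for illustration, not any specific published LDCT-DLR model.

```python
import torch
import torch.nn as nn

# Minimal image-domain post-processing network for low-dose CT denoising: the model predicts
# the noise residual, which is subtracted from the LDCT input. Generic illustration only;
# not a specific published LDCT-DLR architecture.
class ResidualDenoiser(nn.Module):
    def __init__(self, channels: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, channels, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, kernel_size=3, padding=1),
        )

    def forward(self, ldct: torch.Tensor) -> torch.Tensor:
        return ldct - self.net(ldct)  # residual learning: estimate the noise and remove it

model = ResidualDenoiser()
ldct = torch.randn(4, 1, 64, 64)   # batch of low-dose slices (dummy data)
ndct = torch.randn(4, 1, 64, 64)   # paired routine-dose slices (dummy data)
loss = nn.functional.mse_loss(model(ldct), ndct)
loss.backward()
print(float(loss))
```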
Topics: Humans; Deep Learning; Prospective Studies; Radiation Dosage; Tomography, X-Ray Computed; Algorithms; Radiographic Image Interpretation, Computer-Assisted; Image Processing, Computer-Assisted
PubMed: 38325188
DOI: 10.1016/j.ejrad.2024.111355
Nutrition Journal, Dec 2023
OBJECTIVE
To investigate the relationship between dietary carotenoid intake and sleep duration.
METHODS
Adults enrolled in the National Health and Nutrition Examination Survey (NHANES) 2007-2018 without missing information on dietary carotenoid intake (α-carotene, β-carotene, β-cryptoxanthin, lycopene, and lutein + zeaxanthin), sleep duration, and covariates were included. Participants' carotenoid consumption was divided into three groups by quartiles, and sleep duration was grouped as short (< 7 h/night), optimal (7-8 h/night), and long (> 8 h/night). Multinomial logistic regression models were constructed to examine the association between dietary carotenoid intake and sleep duration. Restricted cubic spline (RCS) regression was further utilized to explore their dose-response relationship. The weighted quantile sum (WQS) model was adopted to calculate the mixed and individual effects of the five carotenoid subtypes on sleep duration.
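To make the modelling step concrete, the sketch below fits a multinomial logistic regression of a three-level sleep-duration outcome (optimal as the reference) on a three-level carotenoid-intake group using statsmodels on simulated data; the variable names, simulated data, and the omission of NHANES survey weights, quartile cut-points, and covariate adjustment are simplifications for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Illustrative multinomial logistic regression of sleep-duration category on carotenoid intake
# group, fitted to simulated data. NHANES survey weights, the actual quartile cut-points, and
# the covariate adjustment used in the study are omitted here for brevity.
rng = np.random.default_rng(0)
n = 2000
intake_group = rng.integers(0, 3, size=n)        # 0 = low, 1 = middle, 2 = high intake
p_optimal = 0.45 + 0.10 * intake_group           # simulate: higher intake -> more optimal sleepers
u = rng.random(n)
# 0 = optimal (7-8 h, the reference), 1 = short (<7 h), 2 = long (>8 h)
sleep = np.where(u < p_optimal, 0, np.where(u < p_optimal + 0.35, 1, 2))

X = sm.add_constant(
    pd.get_dummies(pd.Series(intake_group, name="intake"), prefix="intake", drop_first=True).astype(float)
)
fit = sm.MNLogit(sleep, X).fit(disp=False)
print(fit.summary())  # coefficients are log-odds of short and long sleep relative to optimal sleep
```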
RESULTS
Multinomial logistic regression showed that people with higher intakes of α-carotene, β-carotene, β-cryptoxanthin, lycopene, and lutein + zeaxanthin were less likely to have short or long sleep durations. Consistent with these findings, the RCS models suggested a reverse U-shaped relationship between sleep duration and carotenoid intakes. The mixed effects were also significant: β-cryptoxanthin and lutein + zeaxanthin were the top two contributors associated with a decreased risk of short sleep duration, while β-carotene, α-carotene, and β-cryptoxanthin were the main factors related to a lower risk of long sleep duration.
CONCLUSION
Our study revealed that, among American adults, optimal sleep duration was associated with higher dietary carotenoid intake than short or long sleep duration.
Topics: Adult; Humans; United States; Lycopene; beta Carotene; Nutrition Surveys; Lutein; Zeaxanthins; Beta-Cryptoxanthin; Sleep Duration; Carotenoids; Diet
PubMed: 38062512
DOI: 10.1186/s12937-023-00898-x
Clinical Pharmacology and Therapeutics, Feb 2024
Review
Genetic polymorphisms in drug metabolizing enzymes and drug-drug interactions are major sources of inadequate drug exposure and ensuing adverse effects or insufficient responses. The current challenge in assessing drug-drug gene interactions (DDGIs) for the development of precise dose adjustment recommendation systems is to take into account both simultaneously. Here, we analyze the static models of DDGI from in vivo data and focus on the concept of phenoconversion to model inhibition and genetic polymorphisms jointly. These models are applicable to datasets where pharmacokinetic information is missing and are being used in clinical support systems and consensus dose adjustment guidelines. We show that all such models can be handled by the same formal framework, and that models that differ at first sight are all versions of the same linear phenoconversion model. This model includes the linear pharmacogenetic and inhibition models as special cases. We highlight present challenges in this endeavor and the open issues for future research in developing DDGI models for recommendation systems.
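The general idea behind phenoconversion — an inhibitor reduces the effective activity contributed by each functional allele, and the resulting activity score scales the predicted clearance and hence the dose adjustment — can be sketched as follows. The allele activity values, inhibition fraction, and linear clearance mapping are generic textbook-style assumptions, not the specific DDGI models analyzed in the article.

```python
# Generic sketch of a phenoconversion-style activity score and a linear dose-adjustment rule.
# Allele activity values, the inhibition fraction, and the linear clearance mapping are
# illustrative assumptions; they are not the specific DDGI models analyzed in the article.

def phenoconverted_activity(allele_activities: list[float], inhibition_fraction: float) -> float:
    """Sum of allele activity scores, each reduced by the fraction of enzyme activity inhibited."""
    return sum(a * (1.0 - inhibition_fraction) for a in allele_activities)

def dose_fraction(activity: float, reference_activity: float = 2.0) -> float:
    """Linear rule: the recommended dose scales with predicted clearance, i.e. with the activity score."""
    return activity / reference_activity

# Normal metabolizer (two fully functional alleles) taking a moderate inhibitor of the pathway.
activity = phenoconverted_activity([1.0, 1.0], inhibition_fraction=0.5)
print(f"phenoconverted activity score: {activity:.1f}")
print(f"suggested fraction of the usual dose: {dose_fraction(activity):.0%}")
```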
PubMed: 38318716
DOI: 10.1002/cpt.3188
Critical Care and Resuscitation, Sep 2022
The effect of initiating continuous renal replacement therapy (CRRT) on urine output, fluid balance and mean arterial pressure (MAP) in adult intensive care unit (ICU) patients is unclear. We aimed to evaluate the impact of CRRT on urine output, MAP, vasopressor requirements and fluid balance, and to identify factors affecting urine output during CRRT. This was a retrospective cohort study using data from existing databases and CRRT machines, conducted in medical and surgical ICUs at a single university-associated centre and including patients undergoing CRRT between 2015 and 2018. The measures of interest were hourly urine output, fluid balance, MAP and vasopressor dose in the 24 hours before and after CRRT commencement. Missing values were estimated via Kalman smoothing univariate time-series imputation. Mixed linear modelling was performed with noradrenaline equivalent dose and urine output as outcomes. In 215 patients, CRRT initiation was associated with a reduction in urine output. Multivariate analysis confirmed an immediate urine output decrease (-0.092 mL/kg/h; 95% confidence interval [CI], -0.150 to -0.034 mL/kg/h) and a subsequent progressive urine output decline (effect estimate, -0.01 mL/kg/h; 95% CI, -0.02 to -0.01 mL/kg/h). Age and greater vasopressor dose were associated with lower post-CRRT urine output. Higher MAP and lower rates of net ultrafiltration were associated with higher post-CRRT urine output. With MAP unchanged, vasopressor dose increased in the 24 hours before CRRT, then plateaued and declined in the 24 hours thereafter (effect estimate, -0.004 μg/kg/min per hour; 95% CI, -0.005 to -0.004 μg/kg/min per hour). Fluid balance remained positive but declined towards neutrality following CRRT implementation. CRRT was associated with decreased urine output despite a gradual decline in vasopressor dose and a positive fluid balance. The mechanisms behind the reduction in urine output associated with commencement of CRRT require further investigation.
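A hedged sketch of the two analysis steps named above — filling gaps in an hourly series and fitting a mixed linear model with a random intercept per patient — is shown below on simulated data. The variable names and model terms are assumptions, and simple linear interpolation is used only as a stand-in for the Kalman-smoothing imputation reported in the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated hourly data for a small cohort around CRRT start. Variable names and model terms
# are illustrative assumptions, not the study's analysis code.
rng = np.random.default_rng(1)
rows = []
for patient in range(20):
    baseline = rng.normal(0.6, 0.2)
    for hour in range(-24, 25):
        uo = max(baseline - 0.005 * max(hour, 0) + rng.normal(0, 0.05), 0.0)
        rows.append({"patient": patient, "hour": hour, "urine_output": uo})
df = pd.DataFrame(rows)

# Knock out some values and fill them; linear interpolation here is only a simple stand-in for
# the Kalman-smoothing time-series imputation used in the study.
df.loc[rng.choice(df.index, size=100, replace=False), "urine_output"] = np.nan
df["urine_output"] = df.groupby("patient")["urine_output"].transform(
    lambda s: s.interpolate(limit_direction="both")
)

# Mixed linear model: urine output vs. hours after CRRT start, random intercept per patient.
df["hours_after_crrt"] = df["hour"].clip(lower=0)
fit = smf.mixedlm("urine_output ~ hours_after_crrt", df, groups=df["patient"]).fit()
print(fit.summary())
```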
PubMed: 38046211
DOI: 10.51893/2022.3.OA5
JAMA Ophthalmology, Dec 2023
IMPORTANCE
Intra-arterial chemotherapy (IAC) has quickly gained popularity as a mainstay of treatment for retinoblastoma. Intra-arterial chemotherapy has been described as having several advantages over systemic chemotherapy, including reducing systemic toxicity and neutropenia; however, studies on the risk of neutropenia after IAC remain limited.
OBJECTIVE
To estimate the incidence of neutropenia after IAC, as well as identify risk factors associated with the development of neutropenia.
DESIGN, SETTING, AND PARTICIPANTS
This case series included pediatric patients with unilateral or bilateral retinoblastoma who were treated with IAC at a single quaternary care center from July 13, 2013, to January 6, 2023.
EXPOSURE
All patients were treated with IAC and underwent multiple IAC cycles depending on treatment response. The primary chemotherapy agent used was melphalan, but topotecan or carboplatin could be used along with melphalan. Melphalan doses were kept to 0.4 mg/kg or less per cycle. After each IAC cycle, complete blood cell counts were obtained within 10 to 12 days and repeated until the absolute neutrophil count (ANC) was greater than or equal to 1000/μL.
MAIN OUTCOMES AND MEASURES
The primary outcome was the minimum ANC after each IAC cycle. The secondary outcome was the development of severe (grade 3 or 4) neutropenia (ANC <1000/μL). Regression analyses were used to identify associations between variables and outcomes. Receiver operating characteristic curves were used to calculate threshold dose for each chemotherapy agent potentially associated with the development of severe neutropenia.
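One common way to derive a threshold dose from an ROC curve is to take the operating point that maximizes Youden's J (sensitivity + specificity − 1); the scikit-learn sketch below illustrates this on simulated dose/neutropenia data. The Youden rule and the simulated values are assumptions, since the abstract does not state which operating point was used.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Simulated per-cycle melphalan doses (mg/kg) and severe-neutropenia outcomes. Picking the
# threshold that maximizes Youden's J is a common choice and an assumption here; the abstract
# does not state which ROC operating point was used.
rng = np.random.default_rng(2)
dose = rng.uniform(0.1, 0.4, size=171)
p_severe = 1 / (1 + np.exp(-12 * (dose - 0.3)))   # simulate: higher dose -> higher risk
severe = rng.random(171) < p_severe

fpr, tpr, thresholds = roc_curve(severe, dose)
best = np.argmax(tpr - fpr)                        # Youden's J = sensitivity + specificity - 1
print(f"AUC: {roc_auc_score(severe, dose):.2f}")
print(f"candidate threshold dose: {thresholds[best]:.2f} mg/kg")
```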
RESULTS
A total of 64 eyes of 49 patients (mean [SD] age, 1.7 [1.2] years; 25 females [51.0%]) with retinoblastoma were treated with 171 cycles of IAC. The mean (SD) nadir ANC was 1325.3 (890.7)/μL and occurred a median (IQR) of 10 (10-14) days (range, 6-28 days) after IAC administration. The frequency distribution of post-IAC neutropenia grades 0, 1, 2, 3, 4, and missing was 31 (18.1% of cycles), 25 (14.6%), 40 (23.4%), 37 (21.6%), 26 (15.2%), and 12 (7.0%), respectively. Factors weakly correlated with a lower ANC were higher melphalan dose (β = -2356 [95% CI, -4120.6 to -611.2]; adjusted R2 = 0.251; P = .01) and higher topotecan dose (β = -4056 [95% CI, -7003.6 to -1344.5]; adjusted R2 = 0.251; P = .006).
CONCLUSIONS AND RELEVANCE
In this case series of patients with retinoblastoma, the incidence of severe neutropenia after IAC was nearly 40%, which is higher than previously reported. Extended laboratory monitoring may aid in capturing previously overlooked cases of neutropenia. Topotecan may be associated with the development of neutropenia; limiting topotecan doses, especially in the setting of a high melphalan dose, may be beneficial in reducing the risk of neutropenia.
Topics: Female; Humans; Child; Infant; Retinoblastoma; Retinal Neoplasms; Melphalan; Topotecan; Incidence; Neutropenia; Infusions, Intra-Arterial; Risk Factors
PubMed: 37917073
DOI: 10.1001/jamaophthalmol.2023.4825