Heart and Vessels, Sep 2023
The outcome of patients undergoing cardiac surgery with cardiopulmonary bypass (CPB) is also influenced by renal and hepatic organ function. Risk stratification scores for these patients, such as EuroSCORE II or the STS Short-Term Risk Calculator, ignore quantitative renal and hepatic function; therefore, the MELD score was applied in these cases. We retrospectively examined patient data using the MELD score as a predictor of mortality. For univariate analysis, patients were classified into three groups: MELD < 10 (Group 1), MELD 10-19 (Group 2), and MELD ≥ 20 (Group 3). A total of 11,477 participants were initially included; patients with missing MELD scores or missing creatinine, bilirubin, or INR values were dropped from the original cohort, leaving 10,882 patients in the analysis. The primary outcome was postoperative in-hospital mortality. Secondary outcomes included postoperative bleeding (including the requirement for repeat thoracotomy), postoperative neurological complications, and catecholamine requirement on weaning from CPB or the need for mechanical circulatory support. A higher MELD score was associated with increased postoperative mortality: patients in Group 3 experienced 31.2% postoperative mortality, compared with 4.6% in Group 1 and 17.5% in Group 2. Group 3 also showed the highest rates of postoperative bleeding (13.8%), repeat thoracotomy (13.2%), and postoperative pneumonia (17.4%), a roughly threefold increase compared with Group 1, as well as higher rates of renal failure requiring dialysis (N = 235, 2.7% in Group 1 vs. N = 78, 22.9% in Group 3) and of the need for high-dose catecholamines postoperatively or mechanical circulatory support (IABP/ECLS).
An increased MELD score was not, however, associated with a significant increase in the postoperative incidence of stroke/TIA or of sternal wound infections/complications. Higher mortality was observed in patients with reduced liver and renal function, with a significant increase in patients with a MELD score ≥ 20. As current risk stratification scores do not account for this, we recommend calculating the MELD score before considering patients for cardiac surgery.
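The grouping above is based on the laboratory MELD score. As an illustration, here is a minimal sketch of the original (pre-2016) MELD formula and the study's three-group classification, assuming the standard UNOS conventions (laboratory values floored at 1.0, creatinine capped at 4.0 mg/dL, score capped at 40); the function names are ours, not the authors':

```python
import math

def meld_score(bilirubin_mg_dl, inr, creatinine_mg_dl):
    """Original (pre-2016) MELD. Inputs below 1.0 are floored at 1.0
    and creatinine is capped at 4.0 mg/dL, per the UNOS convention."""
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    crea = min(max(creatinine_mg_dl, 1.0), 4.0)
    score = 3.78 * math.log(bili) + 11.2 * math.log(inr) + 9.57 * math.log(crea) + 6.43
    return min(round(score), 40)

def meld_group(score):
    # Group boundaries used in the study: <10, 10-19, >=20
    if score < 10:
        return 1
    if score < 20:
        return 2
    return 3
```

Note that the study used recorded laboratory values; this sketch only illustrates how creatinine, bilirubin, and INR enter the score and how the groups were defined.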
Topics: Humans; Retrospective Studies; Cardiac Surgical Procedures; Risk Factors; Postoperative Complications; Liver; Risk Assessment
PubMed: 37004541
DOI: 10.1007/s00380-023-02262-9
European Journal of Cancer (Oxford,...), Aug 2023
BACKGROUND
The pharmaceutical industry's productivity has been declining over the last two decades, with high attrition rates and fewer regulatory approvals. The development of oncology drugs is particularly challenging, with low rates of approval for novel treatments compared with other therapeutic areas. Reliably establishing the potential of a novel treatment, and the corresponding optimal dosage, is a key component of efficient overall development. There is growing interest in terminating the development of poor treatments quickly while accelerating the development of highly promising interventions.
METHODS
One approach to reliably establish the optimal dosage and the potential of a novel treatment and thereby improve efficiency in the drug development pathway is the use of novel statistical designs that make efficient use of the data collected.
RESULTS
In this paper, we discuss different (seamless) strategies for early oncology development and illustrate their strengths and weaknesses through real trial examples. We provide directions for good practice in early oncology development, discuss frequently missed opportunities for improved efficiency, and outline future approaches that have yet to realise their full potential in early oncology treatment development.
DISCUSSION
Modern dose-finding methods have the potential to shorten and improve dose-finding trials, and only small changes to current approaches are required to realise this potential.
Topics: Humans; Medical Oncology; Drug Development; Research Design; Neoplasms
PubMed: 37301716
DOI: 10.1016/j.ejca.2023.05.005
Drug Metabolism and Disposition: the..., Sep 2023
Review
A 20-Year Research Overview: Quantitative Prediction of Hepatic Clearance Using the In Vitro-In Vivo Extrapolation Approach Based on Physiologically Based Pharmacokinetic Modeling and Extended Clearance Concept.
Understanding the extended clearance concept and establishing a physiologically based pharmacokinetic (PBPK) model are crucial for investigating the impact of changes in transporter and metabolizing enzyme abundance/function on drug pharmacokinetics in blood and tissues. This mini-review provides an overview of the extended clearance concept and a PBPK model that includes transporter-mediated uptake processes in the liver. In general, complete in vitro-in vivo extrapolation (IVIVE) poses challenges due to missing factors that bridge the gap between in vitro and in vivo systems. By adjusting key in vitro parameters, we can capture in vivo pharmacokinetics, a strategy known as the top-down or middle-out approach. We present the latest progress, theory, and practice of the Cluster Gauss-Newton method, which is used for middle-out analyses. As examples of poor IVIVE, we discuss "albumin-mediated hepatic uptake" and "time-dependent inhibition" of OATP1Bs. The hepatic uptake of highly plasma-bound drugs is more efficient than can be accounted for by their unbound concentration alone, a phenomenon referred to as "albumin-mediated" hepatic uptake. IVIVE was improved by measuring hepatic uptake clearance in vitro in the presence of physiologic albumin concentrations. Lastly, we demonstrate the application of Cluster Gauss-Newton method-based analysis to the target-mediated drug disposition of bosentan. Incorporating saturable target binding and OATP1B-mediated hepatic uptake into the PBPK model enables the consideration of nonlinear kinetics across a wide dose range and the prediction of receptor occupancy over time. SIGNIFICANCE STATEMENT: There have been multiple instances where researchers' endeavors to unravel the underlying mechanism of poor in vitro-in vivo extrapolation have led to the discovery of previously undisclosed truths.
These include 1) albumin-mediated hepatic uptake, 2) target-mediated drug disposition of small molecules, and 3) the existence of a trans-inhibition mechanism by inhibitors of OATP1B-mediated hepatic uptake of drugs. Consequently, poor in vitro-in vivo extrapolation, and the subsequent inquisitiveness of scientists, may serve as a pivotal gateway to uncovering hidden mechanisms.
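For readers unfamiliar with the extended clearance concept, a minimal sketch of how it combines with the well-stirred liver model may help; the equations below are the commonly cited textbook forms, and the parameter names are ours, not the review's notation:

```python
def extended_intrinsic_clearance(ps_inf, ps_eff, cl_met, cl_bile=0.0):
    """Overall hepatic intrinsic clearance under the extended clearance
    concept: sinusoidal uptake (ps_inf) acts in series with metabolism
    (cl_met) and biliary excretion (cl_bile), competing with sinusoidal
    backflux (ps_eff)."""
    return ps_inf * (cl_met + cl_bile) / (ps_eff + cl_met + cl_bile)

def hepatic_clearance(q_h, fb, cl_int_all):
    """Well-stirred liver model: q_h is hepatic blood flow and fb the
    unbound fraction in blood."""
    return q_h * fb * cl_int_all / (q_h + fb * cl_int_all)
```

When metabolism and biliary excretion are much faster than backflux, the overall intrinsic clearance collapses to the uptake clearance (uptake-limited disposition), which is why changes in OATP1B-mediated uptake can dominate the kinetics of highly bound substrates.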
Topics: Hepatocytes; Kinetics; Models, Biological; Liver; Albumins
PubMed: 37407092
DOI: 10.1124/dmd.123.001344
Investigative Radiology, Aug 2023
Deep learning approaches are playing an ever-increasing role throughout diagnostic medicine, especially in neuroradiology, to solve a wide range of problems such as segmentation, synthesis of missing sequences, and image quality improvement. Of particular interest is their application in the reduction of gadolinium-based contrast agents, the administration of which has been under cautious reevaluation in recent years because of concerns about gadolinium deposition and its unclear long-term consequences. A growing number of studies are investigating the reduction (low-dose approach) or even complete substitution (zero-dose approach) of gadolinium-based contrast agents in diverse patient populations using a variety of deep learning methods. This work highlights selected research and discusses the advantages and limitations of recent deep learning approaches, the challenges of assessing their output, and the progress toward clinical applicability, distinguishing between the low-dose and zero-dose approaches.
Topics: Humans; Contrast Media; Deep Learning; Gadolinium; Magnetic Resonance Imaging; Radiopharmaceuticals
PubMed: 36822654
DOI: 10.1097/RLI.0000000000000963
PloS One, 2023
INTRODUCTION
In 2021, an estimated 18 million children did not receive a single dose of routine vaccinations and constitute the population known as zero dose children. There is growing momentum and investment in reaching zero dose children and addressing the gross inequity in the reach of immunization services. To effectively do so, there is an urgent need to characterize more deeply the population of zero dose children and the barriers they face in accessing routine immunization services.
METHODS
We utilized the most recent DHS and MICS data spanning 2011 to 2020 from low, lower-middle, and upper-middle income countries. Zero dose status was defined as children aged 12-23 months who had not received any doses of BCG, DTP-containing, polio, and measles-containing vaccines. We estimated the prevalence of zero-dose children in the entire study sample, by country income level, and by region, and characterized the zero dose population by household-level factors. Multivariate logistic regressions were used to determine the household-level sociodemographic and health care access factors associated with zero dose immunization status. To pool multicountry data, we adjusted the original survey weights according to the country's population of children 12-23 months of age. To contextualize our findings, we utilized United Nations Population Division birth cohort data to estimate the study population as a proportion of the global and country income group populations.
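The classification and weight-pooling steps described above can be sketched as follows; the field names and the exact rescaling rule are illustrative assumptions, not the authors' code:

```python
def is_zero_dose(doses):
    """doses: mapping of antigen -> number of doses received for a child
    aged 12-23 months. Zero dose = no doses of BCG, DTP-containing,
    polio, or measles-containing vaccine (hypothetical key names)."""
    antigens = ("bcg", "dtp", "polio", "measles")
    return all(doses.get(a, 0) == 0 for a in antigens)

def pooled_weights(survey_weights, country_pop_12_23m):
    """Rescale a country's original survey weights so that the country
    contributes to the pooled sample in proportion to its population of
    children aged 12-23 months (one plausible reading of the methods)."""
    total_w = sum(survey_weights)
    return [w / total_w * country_pop_12_23m for w in survey_weights]
```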
RESULTS
We included a total of 82 countries in our univariate analyses and 68 countries in our multivariate model. Overall, 7.5% of the study population were zero dose children. More than half (51.9%) of this population was concentrated in African countries. Zero dose children were predominantly situated in rural areas (75.8%) and in households in the lowest two wealth quintiles (62.7%) and were born to mothers who completed fewer than four antenatal care (ANC) visits (66.5%) and had home births (58.5%). Yet, surprisingly, a considerable proportion of zero dose children's mothers did receive appropriate care during pregnancy (33.5% of zero dose children have mothers who received at least 4 ANC visits). When controlled for other factors, children had three times the odds (OR = 3.00, 95% CI: 2.72, 3.30) of being zero dose if their mother had not received any tetanus injections, 2.46 times the odds (95% CI: 2.21, 2.74) of being zero dose if their mother had not received any ANC visits, and had nearly twice the odds (OR = 1.87, 95% CI: 1.70, 2.05) of being zero dose if their mother had a home delivery, compared to children of mothers who received at least 2 tetanus injections, received at least 4 ANC visits, and had a facility delivery, respectively.
DISCUSSION
A lack of access to maternal health care was a strong risk factor for zero dose status, highlighting important opportunities to improve the quality and integration of maternal and child health programs. Additionally, because a substantial proportion of zero dose children and their mothers do receive appropriate care, approaches to reach zero dose children should include mitigating missed opportunities for vaccination.
Topics: Child; Humans; Female; Pregnancy; Infant; Developing Countries; Tetanus; Vaccination; Immunization; Risk Factors; Measles Vaccine
PubMed: 38060516
DOI: 10.1371/journal.pone.0287459
The Journal of Rural Health: Official..., Apr 2024
PURPOSE
Adolescent human papillomavirus (HPV) vaccination rates remain lower than those of other adolescent vaccines, both nationwide and in Iowa. This study examined predictors of missed opportunities for first-dose HPV vaccine administration in Iowa in order to enable more targeted outreach and improve adolescent HPV vaccine uptake.
METHODS
A retrospective study was conducted to identify predictors of missed opportunities for first-dose HPV vaccination in Iowa adolescents using Iowa's Immunization Registry Information System. The study population included 154,905 adolescents aged 11-15 years between 2019 and 2022. Missed opportunity for first-dose HPV vaccination was defined as a vaccination encounter where an adolescent received a Tdap and/or MenACWY vaccine but did not receive the first-dose HPV vaccine during the same encounter.
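The study's definition of a missed opportunity can be expressed as a simple predicate over a vaccination encounter; the field names here are hypothetical, not taken from the registry:

```python
def missed_opportunity(encounter):
    """An encounter is a missed opportunity for first-dose HPV vaccination
    when a Tdap and/or MenACWY vaccine was given but HPV dose 1 was not
    given during the same encounter, per the study definition."""
    gave_other = encounter.get("tdap", False) or encounter.get("menacwy", False)
    return gave_other and not encounter.get("hpv_dose1", False)
```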
FINDINGS
Over a third of the study population experienced a missed opportunity for HPV vaccination between 2019 and 2022. Missed opportunities were most common among individuals living in a rural county (aOR = 1.36), underinsured adolescents (aOR = 1.74), males (aOR = 1.12), teens 13-15 years of age (aOR = 1.76), and adolescents of White race and non-Hispanic ethnicity.
CONCLUSION
This study builds on previously reported predictors of missed opportunities for HPV vaccination in adolescents. An increased understanding of provider needs and barriers to administering HPV vaccination, and further analysis of how the Vaccines for Children Program can play a role in HPV vaccination uptake, are necessary to improve HPV vaccination rates among adolescents in Iowa, particularly in rural communities.
PubMed: 38683043
DOI: 10.1111/jrh.12839
Naunyn-Schmiedeberg's Archives of..., Feb 2024
Meta-Analysis
Hydroxychloroquine (HCQ) has been repurposed for the treatment of COVID-19 patients; however, its efficacy remains controversial, perhaps partly due to the dosage, ranging from 200 to 800 mg/day, reported across studies. Indeed, low-dose HCQ (≤ 2.4 g/5 days) showed a lower risk of side effects compared with high doses. In this study, we performed a systematic review and meta-analysis to investigate the effect of low-dose HCQ used alone on three outcomes in COVID-19 patients: in-hospital mortality, the need for mechanical ventilation, and ICU admission. A systematic review of the English literature from January 2020 to April 2022 was conducted in PubMed, the Cochrane Library, and Google Scholar. Studies reporting a dosage of 400 mg twice daily on the first day, followed by 200 mg twice daily for four days, were included. Pooled odds ratios and 95% confidence intervals were calculated using random-effects models. Eleven studies (12,503 patients) were retained in the quantitative analysis: four observational cohort studies and seven RCTs. When pooling both observational studies and RCTs, low-dose HCQ was associated with decreased mortality (OR = 0.73, 95% CI: [0.55-0.97], I² = 58%), but not with mechanical ventilation need (OR = 1.03, 95% CI: [0.56-1.89], I² = 67%) or ICU admission rate (OR = 0.70, 95% CI: [0.42-1.17], I² = 47%). However, no effect was observed when pooling only RCTs. Despite the limitations of the RCTs, treatment with low-dose HCQ was not associated with improvements in mortality, mechanical ventilation need, or ICU admission rate in COVID-19 patients.
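Random-effects pooling of reported odds ratios, as used here, can be sketched with the DerSimonian-Laird estimator; this is a generic illustration with standard errors back-calculated from the 95% CIs, not the authors' analysis code:

```python
import math

def dersimonian_laird_pool(odds_ratios, ci_lowers, ci_uppers):
    """Random-effects pooling of study odds ratios (DerSimonian-Laird).
    Returns (pooled OR, 95% CI lower, 95% CI upper, I^2 in percent)."""
    y = [math.log(o) for o in odds_ratios]          # log odds ratios
    se = [(math.log(u) - math.log(l)) / (2 * 1.96)  # SE from 95% CI width
          for l, u in zip(ci_lowers, ci_uppers)]
    w = [1 / s**2 for s in se]                      # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))  # Cochran's Q
    df = len(y) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                   # between-study variance
    w_star = [1 / (s**2 + tau2) for s in se]        # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled),
            i2)
```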
Topics: Humans; COVID-19; COVID-19 Drug Treatment; Hydroxychloroquine; Respiration, Artificial; Observational Studies as Topic; Randomized Controlled Trials as Topic
PubMed: 37639021
DOI: 10.1007/s00210-023-02688-y
The New England Journal of Medicine, Jun 2024
BACKGROUND
Adrenal insufficiency in patients with classic 21-hydroxylase deficiency congenital adrenal hyperplasia (CAH) is treated with glucocorticoid replacement therapy. Control of adrenal-derived androgen excess usually requires supraphysiologic glucocorticoid dosing, which predisposes patients to glucocorticoid-related complications. Crinecerfont, an oral corticotropin-releasing factor type 1 receptor antagonist, lowered androstenedione levels in phase 2 trials involving patients with CAH.
METHODS
In this phase 3 trial, we randomly assigned adults with CAH in a 2:1 ratio to receive crinecerfont or placebo for 24 weeks. Glucocorticoid treatment was maintained at a stable level for 4 weeks to evaluate androstenedione values, followed by glucocorticoid dose reduction and optimization over 20 weeks to achieve the lowest glucocorticoid dose that maintained androstenedione control (≤120% of the baseline value or within the reference range). The primary efficacy end point was the percent change in the daily glucocorticoid dose from baseline to week 24 with maintenance of androstenedione control.
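The androstenedione control criterion used during dose optimization (≤120% of the baseline value or within the reference range) can be stated as a simple predicate; the parameter names and units are illustrative, not from the trial protocol:

```python
def androstenedione_controlled(current, baseline, ref_low, ref_high):
    """Androstenedione control per the trial definition: the current
    level is at most 120% of baseline, or lies within the reference
    range (all values in the same units, e.g. ng/dL)."""
    return current <= 1.2 * baseline or ref_low <= current <= ref_high
```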
RESULTS
All 182 patients who underwent randomization (122 to the crinecerfont group and 60 to the placebo group) were included in the 24-week analysis, with imputation of missing values; 176 patients (97%) remained in the trial at week 24. The mean glucocorticoid dose at baseline was 17.6 mg per square meter of body-surface area per day of hydrocortisone equivalents; the mean androstenedione level was elevated at 620 ng per deciliter. At week 24, the change in the glucocorticoid dose (with androstenedione control) was -27.3% in the crinecerfont group and -10.3% in the placebo group (least-squares mean difference, -17.0 percentage points; P<0.001). A physiologic glucocorticoid dose (with androstenedione control) was reported in 63% of the patients in the crinecerfont group and in 18% in the placebo group (P<0.001). At week 4, androstenedione levels decreased with crinecerfont (-299 ng per deciliter) but increased with placebo (45.5 ng per deciliter) (least-squares mean difference, -345 ng per deciliter; P<0.001). Fatigue and headache were the most common adverse events in the two trial groups.
CONCLUSIONS
Among patients with CAH, the use of crinecerfont resulted in a greater decrease from baseline in the mean daily glucocorticoid dose, including a reduction to the physiologic range, than placebo following evaluation of adrenal androgen levels. (Funded by Neurocrine Biosciences; CAHtalyst ClinicalTrials.gov number, NCT04490915.).
PubMed: 38828955
DOI: 10.1056/NEJMoa2404656
Seizure, Aug 2023
OBJECTIVE
Objective seizure count estimates are crucial for ambulatory epilepsy management. Wearables have shown promise for the detection of tonic-clonic seizures but may suffer from false alarms and undetected seizures. Seizure signatures recorded by wearables often extend over prolonged periods, including increased electrodermal activity and heart rate long after seizure EEG onset; however, previous detection methods have only partially exploited these signatures. Understanding the utility of these prolonged signatures for seizure count estimation, and what factors generally determine seizure logging performance, including the role of data quality versus algorithm performance, is thus crucial for improving wearables-based epilepsy monitoring and determining which patients benefit most from this technology.
METHODS
In this retrospective study, we examined 76 pediatric epilepsy patients equipped with a wearable (Empatica E4; recording electrodermal activity [EDA], accelerometry [ACC], and heart rate [HR]; 1983 h total recording time; 45 tonic-clonic seizures) during multiday video-EEG monitoring. To log seizures on prolonged data trends, we applied deep learning to continuous overlapping 1-hour segments of multimodal data in a leave-one-subject-out approach. We systematically examined factors influencing logging performance, including patient age, antiseizure medication (ASM) load, seizure type and duration, and data artifacts. To gain insight into algorithm function and feature importance, we applied Uniform Manifold Approximation and Projection (UMAP; to represent the separability of learned features) and SHapley Additive exPlanations (SHAP; to identify the most informative data signatures).
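The leave-one-subject-out evaluation and overlapping 1-hour segmentation can be sketched generically as follows; the overlap fraction and sampling parameters are assumptions, not values taken from the study:

```python
def leave_one_subject_out(subject_ids):
    """Yield (train_ids, held_out_id) splits so that segments from the
    held-out patient never appear in training."""
    unique = sorted(set(subject_ids))
    for held_out in unique:
        train = [s for s in unique if s != held_out]
        yield train, held_out

def hour_segments(n_samples, fs, overlap=0.5):
    """Start indices of overlapping 1-hour windows over a recording
    sampled at fs Hz (the 50% overlap is an illustrative assumption)."""
    win = int(3600 * fs)
    step = int(win * (1 - overlap))
    return list(range(0, n_samples - win + 1, step))
```

Keeping all segments of the held-out patient out of training is what makes the reported AUCs estimates of performance on unseen patients rather than unseen time windows.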
RESULTS
Performance for tonic-clonic seizure logging increased systematically with patient age (AUC 0.61 for patients < 11 years, AUC 0.77 for patients aged 11-15 years, AUC 0.85 for patients > 15 years). Across all ages, the AUC was 0.75, corresponding to a sensitivity of 0.52 and a false alarm rate of 0.28/24 h. Seizures under high ASM load or of shorter duration were detected less reliably (P=.025 and P=.033, respectively). UMAP visualized discriminatory power at the individual patient level; SHAP analyses identified clonic motor activity and peri-/postictal increases in HR and EDA as most informative. In missed seizures, these features were absent, indicating that recording quality, not the algorithm, caused the low sensitivity in these patients.
SIGNIFICANCE
Our results demonstrate the utility of prolonged, postictal data segments for seizure logging, contribute to algorithm explainability and point to influencing factors, including high ASM dose and short seizure duration. Collectively, these results may help to identify patients who particularly benefit from such technology.
Topics: Humans; Child; Infant; Retrospective Studies; Data Accuracy; Seizures; Epilepsy; Electroencephalography; Wearable Electronic Devices
PubMed: 37336056
DOI: 10.1016/j.seizure.2023.06.002
Journal of Cardiothoracic Surgery, Oct 2023
Observational Study
Changes in vital signs during adrenaline administration for hemostasis in intracordal injection: an observational study with a hypothetical design of endotracheal adrenaline administration in cardiopulmonary arrest.
BACKGROUND
Intravenous adrenaline administration is recommended for advanced cardiovascular life support in adults, and endotracheal administration is given low priority, because the optimal dose of adrenaline for endotracheal administration is unknown and it is ethically difficult to design studies of endotracheal adrenaline administration in patients without cardiopulmonary arrest. As otolaryngologists, we administer adrenaline to the vocal folds for hemostasis after intracordal injection under local anesthesia and have seen few cases of changes in vital signs. We hypothesized that examining vital signs before and after adrenaline administration for hemostasis would help determine the optimal dose of endotracheal adrenaline.
METHODS
We retrospectively examined the medical records of 79 patients who visited our hospital from January 2018 to December 2020 and received adrenaline in the vocal folds and trachea for hemostasis by intracordal injection under local anesthesia to investigate changes in heart rate and systolic blood pressure before and after the injection.
RESULTS
The mean heart rates before and after injection were 83.96 ± 18.51 and 81.50 ± 15.38 beats per minute (mean ± standard deviation), respectively. The mean systolic blood pressures before and after injection were 138.13 ± 25.33 and 135.72 ± 22.19 mmHg, respectively. The corresponding P-values were 0.136 for heart rate and 0.450 for systolic blood pressure, indicating no significant differences.
CONCLUSIONS
Although this was an observational study, changes in vital signs were investigated assuming endotracheal adrenaline administration. The currently recommended dose of adrenaline for endotracheal administration in cardiopulmonary arrest may not be effective. In some cases of cardiopulmonary arrest, intravenous and intraosseous routes of adrenaline administration may be difficult to establish and the opportunity for resuscitation may be missed; it is therefore desirable to have multiple options for adrenaline administration. If the optimal dose and efficacy of endotracheal adrenaline administration can be clarified, early adrenaline administration will become possible, improving return of spontaneous circulation (ROSC) and survival-to-discharge rates.
Topics: Adult; Humans; Blood Pressure; Cardiopulmonary Resuscitation; Epinephrine; Heart Arrest; Hemostasis; Retrospective Studies
PubMed: 37803400
DOI: 10.1186/s13019-023-02376-1