Indian Journal of Public Health Oct 2023
We measured COVID-19-related stigma and discrimination and its drivers using a concurrent mixed-methods design in Punjab. Simple random sampling was used to select blocks, subcenters, and urban primary health centers from each of the four selected districts. Systematic random sampling was used to select households. A sample of 423 adults was interviewed using a structured questionnaire, and 10 in-depth interviews were conducted using an interview guide. Binary logistic regression was performed to identify predictors. Stigma was mild in 18% of respondents, moderate in 45%, and severe in 37%. Logistic regression indicated that stigma was lower in the rural population than in the urban population (P < 0.01). Hospitalized patients faced discrimination more often than those treated or quarantined at home. People feared the police (71%), testing (69%), and contracting the infection (65%). Fear of screening, disclosure of status, and transmission of the virus were the drivers of stigma and discrimination. Co-occurrence of labeling, stereotyping, and cognitive separation was observed.
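The abstract's predictor analysis (binary logistic regression on survey responses) can be sketched as follows. This is an illustrative example only, not the study's actual analysis: the predictors, effect sizes, and data below are simulated, and the variable names are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 423  # sample size reported in the abstract

# Hypothetical binary predictors: residence (1 = urban) and hospitalisation
urban = rng.integers(0, 2, n)
hospitalized = rng.integers(0, 2, n)

# Simulate an outcome in which urban residence raises the odds of severe
# stigma, mirroring the direction of effect reported in the abstract
logit = -1.0 + 0.9 * urban + 0.6 * hospitalized
severe_stigma = rng.random(n) < 1 / (1 + np.exp(-logit))

# Fit the binary logistic regression and express coefficients as odds ratios
X = np.column_stack([urban, hospitalized])
model = LogisticRegression().fit(X, severe_stigma)
odds_ratios = np.exp(model.coef_[0])
print(dict(zip(["urban", "hospitalized"], odds_ratios.round(2))))
```

With this simulated effect, the fitted odds ratio for urban residence comes out above 1, i.e. higher odds of severe stigma among urban respondents, consistent with the abstract's reported direction.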
Topics: Humans; COVID-19; India; Social Stigma; Male; Female; Adult; SARS-CoV-2; Middle Aged; Stereotyping; Rural Population; Urban Population; Social Discrimination; Young Adult
PubMed: 38934816
DOI: 10.4103/ijph.ijph_1255_22 -
Asia-Pacific Journal of Oncology Nursing Jun 2024
OBJECTIVE
The delivery of bad news is an unpleasant but necessary medical task. However, few studies have addressed the experiences and preferences of the families of school-aged children with cancer when they are informed of the children's condition. This study aimed to explore the truth-telling preferences and experiences of families of school-aged children with cancer.
METHODS
This descriptive phenomenological qualitative study was conducted using focus group interviews, with a semistructured interview guide adopted for the in-depth interviews. Fifteen families participated in the study. Data were collected from August 2019 to May 2020 and analyzed using Colaizzi's method.
RESULTS
The study identified two major themes: "caught in a dilemma" and "kind and comprehensive team support." The first major theme focused on families' experiences with cancer truth-telling. Three sub-themes emerged: (1) cultural aspects of cancer disclosure, (2) decision-making regarding informing pediatric patients about their illness, and (3) content of disclosure after weighing the pros and cons. The second major theme, which revealed families' preferences for delivering bad news, was classified into three sub-themes: (1) have integrity, (2) be realistic, and (3) be supportive.
CONCLUSIONS
This study underscores the dilemma encountered by the families of children with cancer after disclosure and their inclination toward receiving comprehensive information and continuous support. Health care personnel must improve their truth-telling ability in order to better address the needs of such families and to provide continuous support throughout the truth-telling process.
PubMed: 38933686
DOI: 10.1016/j.apjon.2024.100500 -
Children (Basel, Switzerland) Jun 2024
Through a thematic analysis of firsthand posts from 258 abuse survivors in online forums from 2016 to 2023, this research examines the barriers that Chinese children encounter when disclosing sexual abuse. The anonymous narratives shed light on the motives behind survivors' reluctance to reveal abuse, the outcomes following disclosure, and the wider implications for survivors and their families within the Chinese cultural context. The findings underscore the need for early intervention upon disclosure, aiming to safeguard children from further harm and foster the development of an effective child protection framework.
PubMed: 38929267
DOI: 10.3390/children11060688 -
Brain Sciences Jun 2024
Cerebral Intraparenchymal Hemorrhage due to Implantation of Electrodes for Deep Brain Stimulation: Insights from a Large Single-Center Retrospective Cross-Sectional Analysis.
Cerebral intraparenchymal hemorrhage due to electrode implantation (CIPHEI) is a rare but serious complication of deep brain stimulation (DBS) surgery. This study retrospectively investigated a large single-center cohort of DBS implantations to calculate the frequency of CIPHEI and identify patient- and procedure-related risk factors for CIPHEI and their potential interactions. We analyzed all DBS implantations between January 2013 and December 2021 in a generalized linear model for binomial responses using bias reduction to account for sparse sampling of CIPHEIs. As potential risk factors, we considered age, gender, history of arterial hypertension, level of invasivity, types of micro/macroelectrodes, and implanted DBS electrodes. If available, postoperative coagulation and platelet function were exploratorily assessed in CIPHEI patients. We identified 17 CIPHEI cases across 839 electrode implantations in 435 included procedures in 418 patients (3.9%). Exploration and cross-validation analyses revealed that the three-way interaction of older age (above 60 years), high invasivity (i.e., use of combined micro/macroelectrodes), and implantation of directional DBS electrodes accounted for 82.4% of the CIPHEI cases. Acquired platelet dysfunction was present only in one CIPHEI case. The findings at our center suggested implantation of directional DBS electrodes as a new potential risk factor, while known risks of older age and high invasivity were confirmed. However, CIPHEI risk is not driven by the three factors alone but by their combined presence. The contributions of the three factors to CIPHEI are hence not independent, suggesting that potentially modifiable procedural risks should be carefully evaluated when planning DBS surgery in patients at risk.
PubMed: 38928612
DOI: 10.3390/brainsci14060612 -
International Journal of Spine Surgery Jun 2024
Allogeneic Disc Progenitor Cells Safely Increase Disc Volume and Improve Pain, Disability, and Quality of Life in Patients With Lumbar Disc Degeneration-Results of an FDA-Approved Biologic Therapy Randomized Clinical Trial.
BACKGROUND
Progenitor cells derived from intervertebral disc tissue demonstrated immunomodulatory and regenerative properties in preclinical studies. We report the safety and efficacy results of a US Food and Drug Administration-approved clinical trial of these cells for the treatment of symptomatic degenerative disc disease.
METHODS
Sixty patients with symptomatic single-level lumbar degenerative disc disease (mean age 37.9 years, 60% men) were enrolled in a randomized, double-blinded, placebo-controlled Phase I/Phase II study at 13 clinical sites. They were randomized to receive single intradiscal injections of either low-dose cells (n = 20), high-dose cells (n = 20), vehicle alone (n = 10), or placebo (n = 10). The primary endpoint was mean visual analog scale (VAS) pain improvement >30% at 52 weeks. Disc volume was radiologically assessed. Adverse events (AEs), regardless of whether they were related to treatment, were reported. Patients were assessed at baseline and at 4, 12, 26, 52, 78, and 104 weeks posttreatment.
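The endpoint logic described above (>30% mean VAS improvement, checked against a 20-point minimal clinically important difference) can be sketched with hypothetical scores. The VAS values below are invented for illustration and are not patient data from the trial.

```python
def percent_change(baseline: float, followup: float) -> float:
    """Percent change in VAS from baseline (negative = improvement)."""
    return (followup - baseline) / baseline * 100.0

# Hypothetical 0-100 VAS scores for one patient
baseline_vas, week52_vas = 70.0, 26.0

change_pct = percent_change(baseline_vas, week52_vas)
meets_endpoint = change_pct < -30.0                 # >30% improvement
exceeds_mcid = (baseline_vas - week52_vas) > 20.0   # 20-point MCID

print(round(change_pct, 1), meets_endpoint, exceeds_mcid)
```

For these made-up scores, the 44-point drop is a 62.9% improvement, which would satisfy both the percentage endpoint and the MCID threshold.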
RESULTS
At week 52, the high-dose group had a mean VAS percentage decrease from baseline (-62.8%, P = 0.0005), achieving the endpoint of back pain improvement >30%; the mean change was also significantly greater than the minimal clinically important difference of a 20-point decrease (-42.8, P = 0.001). This clinical improvement was maintained at week 104. The vehicle group had a smaller significant decrease in VAS (-52.8%, P = 0.044), while the low-dose and placebo groups showed nonsignificant improvements. Only the high-dose group had a significant change in disc volume, with mean increases of 249.0 mm³ (P = 0.028) at 52 weeks and 402.1 mm³ (P = 0.028) at 104 weeks. A minority of patients (18.3%) reported AEs that were severe. Overall, 6.7% of patients experienced serious AEs, all in the vehicle (n = 1) or placebo (n = 3) groups; none were treatment related.
CONCLUSIONS
High-dose allogeneic disc progenitor cells produced statistically significant, clinically meaningful improvements in back pain and disc volume at 1 year following a single intradiscal injection and were safe and well tolerated. These improvements were maintained at 2 years post-injection.
CLINICAL TRIAL REGISTRATION
NCT03347708-Study to Evaluate the Safety and Preliminary Efficacy of Injectable Disc Cell Therapy, a Treatment for Symptomatic Lumbar Intervertebral Disc Degeneration.
PubMed: 38925869
DOI: 10.14444/8609 -
BMJ (Clinical Research Ed.) Jun 2024 (Randomized Controlled Trial)
OBJECTIVES
To assess the efficacy and safety of colchicine versus placebo on reducing the risk of subsequent stroke after high risk non-cardioembolic ischaemic stroke or transient ischaemic attack within the first three months of symptom onset (CHANCE-3).
DESIGN
Multicentre, double blind, randomised, placebo controlled trial.
SETTING
244 hospitals in China between 11 August 2022 and 13 April 2023.
PARTICIPANTS
8343 patients aged 40 years or older with a minor-to-moderate ischaemic stroke or transient ischaemic attack and a high sensitivity C-reactive protein ≥2 mg/L were enrolled.
INTERVENTIONS
Patients were randomly assigned 1:1 within 24 h of symptom onset to receive colchicine (0.5 mg twice daily on days 1-3, followed by 0.5 mg daily thereafter) or placebo for 90 days.
MAIN OUTCOME MEASURES
The primary efficacy outcome was any new stroke within 90 days after randomisation. The primary safety outcome was any serious adverse event during the treatment period. All efficacy and safety analyses were by intention to treat.
RESULTS
4176 patients were assigned to the colchicine group and 4167 were assigned to the placebo group. Stroke occurred within 90 days in 264 patients (6.3%) in the colchicine group and 270 patients (6.5%) in the placebo group (hazard ratio 0.98 (95% confidence interval 0.83 to 1.16); P=0.79). Any serious adverse event was observed in 91 (2.2%) patients in the colchicine group and 88 (2.1%) in the placebo group (P=0.83).
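As a back-of-the-envelope check, the event counts above yield the crude risk ratio below (this is not the Cox hazard ratio the trial actually reports, just the ratio of raw 90-day event proportions):

```python
# Event counts and group sizes as reported in the abstract
colchicine_events, colchicine_n = 264, 4176
placebo_events, placebo_n = 270, 4167

risk_colchicine = colchicine_events / colchicine_n  # 90-day stroke risk
risk_placebo = placebo_events / placebo_n
crude_risk_ratio = risk_colchicine / risk_placebo

print(round(100 * risk_colchicine, 1),  # 6.3 (%)
      round(100 * risk_placebo, 1),     # 6.5 (%)
      round(crude_risk_ratio, 2))       # 0.98
```

The crude ratio (0.98) happens to coincide with the reported hazard ratio here, but in general the two differ because the Cox model accounts for time to event and censoring.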
CONCLUSIONS
The study did not provide evidence that low-dose colchicine could reduce the risk of subsequent stroke within 90 days as compared with placebo among patients with acute non-cardioembolic minor-to-moderate ischaemic stroke or transient ischaemic attack and a high sensitivity C-reactive protein ≥2 mg/L.
TRIAL REGISTRATION
ClinicalTrials.gov, NCT05439356.
Topics: Humans; Colchicine; Male; Female; Double-Blind Method; Middle Aged; Ischemic Attack, Transient; Aged; Ischemic Stroke; Treatment Outcome; China; C-Reactive Protein; Adult
PubMed: 38925803
DOI: 10.1136/bmj-2023-079061 -
BMJ (Clinical Research Ed.) Jun 2024 (Comparative Study)
SGLT-2 inhibitors, GLP-1 receptor agonists, and DPP-4 inhibitors and risk of hyperkalemia among people with type 2 diabetes in clinical practice: population based cohort study.
OBJECTIVES
To evaluate the comparative effectiveness of sodium-glucose cotransporter-2 (SGLT-2) inhibitors, glucagon-like peptide-1 (GLP-1) receptor agonists, and dipeptidyl peptidase-4 (DPP-4) inhibitors in preventing hyperkalemia in people with type 2 diabetes in routine clinical practice.
DESIGN
Population based cohort study with active-comparator, new user design.
SETTING
Claims data from Medicare and two large commercial insurance databases in the United States from April 2013 to April 2022.
PARTICIPANTS
1:1 propensity score matched adults with type 2 diabetes newly starting SGLT-2 inhibitors versus DPP-4 inhibitors (n=778 908), GLP-1 receptor agonists versus DPP-4 inhibitors (n=729 820), and SGLT-2 inhibitors versus GLP-1 receptor agonists (n=873 460).
MAIN OUTCOME MEASURES
Hyperkalemia diagnosis in the inpatient or outpatient setting. Secondary outcomes were hyperkalemia defined as serum potassium levels ≥5.5 mmol/L and hyperkalemia diagnosis in the inpatient or emergency department setting.
RESULTS
Starting SGLT-2 inhibitor treatment was associated with a lower rate of hyperkalemia than DPP-4 inhibitor treatment (hazard ratio 0.75, 95% confidence interval (CI) 0.73 to 0.78) and a slight reduction in rate compared with GLP-1 receptor agonists (0.92, 0.89 to 0.95). Use of GLP-1 receptor agonists was associated with a lower rate of hyperkalemia than DPP-4 inhibitors (0.79, 0.77 to 0.82). The three year absolute risk was 2.4% (95% CI 2.1% to 2.7%) lower for SGLT-2 inhibitors than DPP-4 inhibitors (4.6% v 7.0%), 1.8% (1.4% to 2.1%) lower for GLP-1 receptor agonists than DPP-4 inhibitors (5.7% v 7.5%), and 1.2% (0.9% to 1.5%) lower for SGLT-2 inhibitors than GLP-1 receptor agonists (4.7% v 6.0%). Findings were consistent for the secondary outcomes and among subgroups defined by age, sex, race, medical conditions, other drug use, and hemoglobin A1c levels on the relative scale. Benefits for SGLT-2 inhibitors and GLP-1 receptor agonists on the absolute scale were largest for those with heart failure, chronic kidney disease, or those using mineralocorticoid receptor antagonists. Compared with DPP-4 inhibitors, the lower rate of hyperkalemia was consistently observed across individual agents in the SGLT-2 inhibitor (canagliflozin, dapagliflozin, empagliflozin) and GLP-1 receptor agonist (dulaglutide, exenatide, liraglutide, semaglutide) classes.
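The reported absolute risk differences follow directly from the stated three-year group risks; a quick arithmetic check (risks in %, taken from the abstract; the abstract's 1.2% for the third comparison reflects rounding of the underlying unrounded risks):

```python
# Three-year hyperkalemia risks (%) for each pairwise comparison:
# (risk in the lower-risk arm, risk in the comparator arm)
comparisons = {
    "SGLT-2 vs DPP-4": (4.6, 7.0),
    "GLP-1 vs DPP-4": (5.7, 7.5),
    "SGLT-2 vs GLP-1": (4.7, 6.0),
}

# Absolute risk difference = comparator risk minus lower-arm risk
differences = {name: round(high - low, 1)
               for name, (low, high) in comparisons.items()}
print(differences)
```

This reproduces 2.4 and 1.8 percentage points for the first two comparisons, and 1.3 for the third when computed from the rounded figures.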
CONCLUSIONS
In people with type 2 diabetes, SGLT-2 inhibitors and GLP-1 receptor agonists were associated with a lower risk of hyperkalemia than DPP-4 inhibitors in the overall population and across relevant subgroups. The consistency of associations among individual agents in the SGLT-2 inhibitor and GLP-1 receptor agonist classes suggests a class effect. These ancillary benefits of SGLT-2 inhibitors and GLP-1 receptor agonists further support their use in people with type 2 diabetes, especially in those at risk of hyperkalemia.
Topics: Humans; Diabetes Mellitus, Type 2; Hyperkalemia; Sodium-Glucose Transporter 2 Inhibitors; Dipeptidyl-Peptidase IV Inhibitors; Male; Female; Glucagon-Like Peptide-1 Receptor; Aged; Middle Aged; United States; Cohort Studies; Hypoglycemic Agents; Propensity Score; Glucagon-Like Peptide-1 Receptor Agonists
PubMed: 38925801
DOI: 10.1136/bmj-2023-078483 -
BMJ (Clinical Research Ed.) Jun 2024
OBJECTIVE
To investigate the incidence of cardiovascular disease (CVD) overall and by age, sex, and socioeconomic status, and its variation over time, in the UK during 2000-19.
DESIGN
Population based study.
SETTING
UK.
PARTICIPANTS
1 650 052 individuals registered with a general practice contributing to Clinical Practice Research Datalink and newly diagnosed with at least one CVD from 1 January 2000 to 30 June 2019.
MAIN OUTCOME MEASURES
The primary outcome was incident diagnosis of CVD, comprising acute coronary syndrome, aortic aneurysm, aortic stenosis, atrial fibrillation or flutter, chronic ischaemic heart disease, heart failure, peripheral artery disease, second or third degree heart block, stroke (ischaemic, haemorrhagic, and unspecified), and venous thromboembolism (deep vein thrombosis or pulmonary embolism). Disease incidence rates were calculated individually and as a composite outcome of all 10 CVDs combined and were standardised for age and sex using the 2013 European standard population. Negative binomial regression models investigated temporal trends and variation by age, sex, and socioeconomic status.
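Direct standardisation, as described above, weights age-group-specific rates by a fixed standard population so rates are comparable across years. A minimal sketch of the idea, using hypothetical rates and weights (the real analysis uses the 2013 European standard population stratified by age and sex):

```python
import numpy as np

# Hypothetical crude incidence rates per 1000 person-years, by age band
observed_rates = np.array([0.5, 2.0, 8.0, 25.0])

# Hypothetical standard-population weights for the same age bands (sum to 1)
standard_weights = np.array([0.30, 0.30, 0.25, 0.15])

# Directly standardised rate = weighted sum of the stratum-specific rates
standardized_rate = float(np.dot(observed_rates, standard_weights))
print(round(standardized_rate, 2))  # 6.5 per 1000 person-years
```

Because the weights are held fixed, two years with different age structures can be compared on the standardised scale without the comparison being confounded by population ageing.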
RESULTS
The mean age of the population was 70.5 years and 47.6% (n=784 904) were women. The age and sex standardised incidence of all 10 prespecified CVDs declined by 19% during 2000-19 (incidence rate ratio 2017-19 v 2000-02: 0.80, 95% confidence interval 0.73 to 0.88). The incidence of coronary heart disease and stroke decreased by about 30% (incidence rate ratios for acute coronary syndrome, chronic ischaemic heart disease, and stroke were 0.70 (0.69 to 0.70), 0.67 (0.66 to 0.67), and 0.75 (0.67 to 0.83), respectively). In parallel, an increasing number of diagnoses of cardiac arrhythmias, valve disease, and thromboembolic diseases were observed. As a result, the overall incidence of CVDs across the 10 conditions remained relatively stable from the mid-2000s. Age stratified analyses further showed that the observed decline in coronary heart disease incidence was largely restricted to age groups older than 60 years, with little or no improvement in younger age groups. Trends were generally similar between men and women. A socioeconomic gradient was observed for almost every CVD investigated. The gradient did not decrease over time and was most noticeable for peripheral artery disease (incidence rate ratio most deprived v least deprived: 1.98 (1.87 to 2.09)), acute coronary syndrome (1.55 (1.54 to 1.57)), and heart failure (1.50 (1.41 to 1.59)).
CONCLUSIONS
Despite substantial improvements in the prevention of atherosclerotic diseases in the UK, the overall burden of CVDs remained high during 2000-19. For CVDs to decrease further, future prevention strategies might need to consider a broader spectrum of conditions, including arrhythmias, valve diseases, and thromboembolism, and examine the specific needs of younger age groups and socioeconomically deprived populations.
Topics: Humans; Female; Male; United Kingdom; Incidence; Aged; Middle Aged; Cardiovascular Diseases; Adult; Aged, 80 and over; Social Class; Age Distribution; Sex Distribution; Young Adult
PubMed: 38925788
DOI: 10.1136/bmj-2023-078523 -
Poultry Science May 2024
This review summarizes a Poultry Science Association symposium addressing myopathies in broiler breast meat, focusing on the interactions between genetics, nutrition, husbandry, and meat processing. The Pectoralis major myopathies (woody breast [WB], white striping [WS], and spaghetti meat [SM]) and the Pectoralis minor ("feathering") myopathy are described, followed by a discussion of their prevalence, potential causes, current and future mitigation approaches, detection methods (in live birds and meat), and ways to utilize affected meat. Overall, breast myopathies remain an important focus across the poultry industry, and while much data and knowledge have been gathered, it is clear that much remains to be understood. As multiple factors impact the occurrence of breast myopathies, their reduction relies on a holistic approach. Ongoing balanced breeding strategies by poultry breeders target the longer-term genetic component, but understanding the significant influence of nongenetic factors (short-term solutions such as nutrition) remains a key area of opportunity. Consequently, understanding the physiology and biological needs of the muscle throughout the life of the bird is critical to reducing myopathies (e.g., by minimizing oxidative stress) and gaining more insight into their etiology.
PubMed: 38925081
DOI: 10.1016/j.psj.2024.103801 -
Poultry Science May 2024
In the food industry, assessing the quality of poultry carcasses during processing is a crucial step. This study proposes an effective approach for automating the assessment of carcass quality without requiring skilled labor or inspector involvement. The proposed system is based on machine learning (ML) and computer vision (CV) techniques, enabling automated defect detection and carcass quality assessment. To this end, an end-to-end framework called CarcassFormer is introduced. It is built upon a Transformer-based architecture designed to effectively extract visual representations while simultaneously detecting, segmenting, and classifying poultry carcass defects. The proposed framework can analyze imperfections resulting from production and transport welfare issues, as well as from malfunctions of processing plant stunners, scalders, pickers, and other equipment. To benchmark the framework, a dataset of 7,321 images was acquired, containing both single and multiple carcasses per image. In this study, the performance of the CarcassFormer system is compared with other state-of-the-art (SOTA) approaches across classification, detection, and segmentation tasks. Through extensive quantitative experiments, our framework consistently outperforms existing methods, demonstrating remarkable improvements across evaluation metrics such as AP, AP@50, and AP@75. Furthermore, the qualitative results highlight the strengths of CarcassFormer in capturing fine details, including feathers, and in accurately localizing and segmenting carcasses with high precision. To facilitate further research and collaboration, the source code and trained models will be made publicly available upon acceptance.
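The AP metrics cited above are precision-recall summaries; in detection benchmarks, AP@50 and AP@75 additionally require a predicted box to match a ground-truth box at IoU 0.5 or 0.75 before counting as a true positive. A minimal sketch of the underlying precision-recall AP computation, using invented per-detection labels and confidence scores (not data from this study):

```python
from sklearn.metrics import average_precision_score

# Hypothetical detections: 1 = correct defect detection, 0 = false alarm,
# each with a model confidence score
labels = [1, 0, 1, 1, 0]
scores = [0.9, 0.8, 0.7, 0.3, 0.2]

# AP summarises the precision-recall curve as a weighted mean of precisions
# at each recall increment
ap = average_precision_score(labels, scores)
print(round(ap, 3))
```

Ranking the two false alarms lower would push AP toward 1.0; ranking them above the true detections would pull it down, which is why AP rewards well-calibrated confidence scores and not just raw accuracy.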
PubMed: 38925080
DOI: 10.1016/j.psj.2024.103765