Brain Sciences, Feb 2021
This preliminary study assessed the effects of noise and stimulus presentation order on recall of spoken words, recording pupil sizes while normal-hearing listeners tried to encode a series of words for a subsequent recall task. In three listening conditions (stationary noise in Experiment 1; quiet versus four-talker babble in Experiment 2), participants were instructed to remember as many words as possible and to recall them in any order after each list of seven sentences. In the two noise conditions, lists of sentences fixed at 65 dB SPL were presented at an easily audible level via a loudspeaker. Reading span (RS) scores, split at the median, were used as a grouping variable. The primacy effect was present regardless of noise interference, and the high-RS group significantly outperformed the low-RS group at free recall in both the quiet and four-talker babble conditions. RS scores were positively correlated with free-recall scores. In both the quiet and four-talker babble conditions, sentence baselines, after correction to the initial stimulus baseline, increased significantly with increasing memory load. Larger sentence baselines but smaller peak pupil dilations appeared to be associated with noise interruption. The analysis method for pupil dilation used in this study is likely to provide a more thorough understanding of how listeners respond to a later recall task than previously used methods. Further studies are needed to confirm the applicability of our method in people with impaired hearing, using multiple repetitions to estimate the allocation of relevant cognitive resources.
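The baseline-correction step described above (referencing each trace to a pre-stimulus baseline) can be sketched as follows. This is a minimal illustration, not the authors' analysis pipeline; the sampling rate, window length, and pupil values are assumptions for the example:

```python
import numpy as np

def baseline_correct(trace, fs, baseline_window=1.0):
    """Subtract the mean pupil size in the pre-stimulus window
    (first `baseline_window` seconds) from the whole trace."""
    n_base = int(fs * baseline_window)
    baseline = trace[:n_base].mean()
    return trace - baseline, baseline

# Example: 5 s of pupil data at 60 Hz with a step-like dilation after 1 s
fs = 60
t = np.arange(0, 5, 1 / fs)
trace = 4.0 + 0.3 * (t > 1.0)          # baseline 4 mm, +0.3 mm dilation
corrected, base = baseline_correct(trace, fs)
```

The corrected trace then expresses dilation relative to the pre-stimulus level, so baselines and peak dilations from different sentences become comparable.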
PubMed: 33672410
DOI: 10.3390/brainsci11020277

Physiological Reports, Apr 2019
Unrestrained barometric plethysmography is a common method for characterizing breathing patterns in small animals. One source of variation between unrestrained barometric plethysmography studies is the choice of baseline segment. Baseline may be analyzed at a predetermined time-point, or using tailored segments in which each animal is visually calm. We compared a quiet, minimally active (no sniffing/grooming) breathing segment to a predetermined time-point at 1 h for baseline measurements in young and middle-aged mice during the dark and light cycles. Additionally, we evaluated the magnitude of change for gas challenges based on these two baseline segments. C57BL/6JEiJ x C3Sn.BLiA-Pde6b+/DnJ male mice underwent unrestrained barometric plethysmography with the following baselines used to determine breathing frequency, tidal volume (VT), and minute ventilation (VE): (1) 30 sec of quiet breathing and (2) a 10-min period from 50 to 60 min. Animals were also exposed to 10 min of hypoxic (10% O2, balanced N2), hypercapnic (5% CO2, balanced air), and hypoxic-hypercapnic (10% O2, 5% CO2, balanced N2) gas. Both frequency and VE were higher during the predetermined 10-min baseline versus the 30-sec baseline, while VT was lower (P < 0.05). However, VE/VO2 was similar between the baseline time segments (P > 0.05) in an analysis of one cohort. During baseline, dark-cycle testing showed increased VT values versus the light cycle (P < 0.05). For gas challenges, both frequency and VE showed a higher percent change from the 30-sec baseline compared to the predetermined 10-min baseline (P < 0.05), while VT showed a greater change from the 10-min baseline (P < 0.05). Dark-cycle hypoxic exposure resulted in a larger percent change in breathing frequency versus the light cycle (P < 0.05). Overall, differences between light- and dark-cycle breathing patterns emerged, along with differences between the 30-sec behavioral observation method and a predetermined time segment for baseline.
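The percent-change comparison above reduces to a simple computation: the same challenge response yields a larger percent change when referenced to a lower baseline. A minimal sketch with made-up breathing frequencies, not the study's data:

```python
def percent_change(challenge, baseline):
    """Percent change of a respiratory variable from its baseline."""
    return (challenge - baseline) / baseline * 100.0

# Hypothetical breathing frequencies (breaths/min)
f_quiet_30s = 150.0      # 30-sec quiet-breathing baseline (lower)
f_predet_10min = 180.0   # predetermined 10-min baseline (higher, per the study)
f_hypoxia = 270.0        # frequency during a hypoxic challenge

change_vs_30s = percent_change(f_hypoxia, f_quiet_30s)       # 80.0 %
change_vs_10min = percent_change(f_hypoxia, f_predet_10min)  # 50.0 %
```

This illustrates why the choice of baseline segment changes the apparent magnitude of a gas-challenge response even when the raw challenge data are identical.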
Topics: Aging; Animals; Circadian Rhythm; Hypercapnia; Hypoxia; Male; Mice; Mice, Inbred C57BL; Respiration
PubMed: 31004390
DOI: 10.14814/phy2.14060

Clinical Pharmacokinetics, Nov 2023
BACKGROUND AND OBJECTIVES
Vericiguat is a soluble guanylate cyclase stimulator indicated to reduce the risk of cardiovascular death and hospitalization due to heart failure. A dedicated QTc study in patients with chronic coronary syndromes demonstrated no clinically relevant QTc effect of vericiguat at exposures across the therapeutic dose range (2.5-10 mg). Concentration-QTc (C-QTc) modeling was performed to complement the statistical evaluations of QTc in the dedicated QTc study.
METHODS
Individual time-matched, baseline- and placebo-corrected Fridericia-corrected QT interval values (ΔΔQTcF) were derived. Two approaches for ΔΔQTcF calculation were investigated: (1) ΔΔQTcF correction with data from a single baseline (as in the primary statistical analysis); and (2) ΔΔQTcF correction with a modeled baseline (considering all available individual non-treatment baselines). The ΔΔQTcF values were related to observed vericiguat concentrations with linear mixed-effects modeling.
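As a reference for the corrections involved: the Fridericia heart-rate correction and the baseline/placebo double correction can be sketched as below. This is a generic illustration with made-up values, not the study's mixed-effects model:

```python
def qtcf(qt_ms, rr_s):
    """Fridericia-corrected QT interval: QTcF = QT / RR^(1/3).
    QT in milliseconds, RR in seconds."""
    return qt_ms / rr_s ** (1.0 / 3.0)

def delta_delta_qtcf(drug, drug_base, placebo, placebo_base):
    """Time-matched, baseline- and placebo-corrected change (ΔΔQTcF)."""
    return (drug - drug_base) - (placebo - placebo_base)

# Hypothetical time-matched measurements (QT in ms, RR in s)
ddqtcf = delta_delta_qtcf(
    qtcf(400, 0.8),  # on drug, post-dose
    qtcf(395, 0.8),  # on drug, baseline
    qtcf(398, 0.8),  # on placebo, time-matched
    qtcf(396, 0.8),  # on placebo, baseline
)
```

The resulting ΔΔQTcF values are then regressed against drug concentration; the slope and its confidence interval drive the < 10 ms conclusion reported below.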
RESULTS
For both modeling approaches, a positive relationship was found between ΔΔQTcF and vericiguat concentration; however, the slope for the single-baseline approach was not statistically significant, whereas the slope from the modeled-baseline approach was statistically significant. The upper bound of the two-sided 90% confidence interval for model-derived QTc was < 10 ms at the highest observed exposure (745 μg/L; investigated dose range 2.5-10 mg).
CONCLUSION
By applying a single-baseline approach and a modeled-baseline approach that integrated all available QTc data across doses to characterize the QTc prolongation potential, this study showed that vericiguat 2.5-10 mg is not associated with clinically relevant QTc effects, in line with the conclusion from the primary statistical analysis.
CLINICAL TRIALS REGISTRATION NUMBER
ClinicalTrials.gov NCT03504982.
Topics: Humans; Electrocardiography; Double-Blind Method; Heart; Heterocyclic Compounds, 2-Ring; Cross-Over Studies; Heart Rate
PubMed: 37672197
DOI: 10.1007/s40262-023-01282-y

Cardiovascular Diabetology, Jan 2024
BACKGROUND
The triglyceride-glucose (TyG) index is regarded as a sophisticated surrogate biomarker for insulin resistance, offering a refined means for evaluating cardiovascular diseases (CVDs). However, prospective cohort studies have not simultaneously conducted baseline and multi-timepoint trajectory assessments of the TyG index in relation to CVDs and their subtypes in elderly participants.
METHODS
After excluding participants with missing data or with conditions that could influence the outcomes, this study ultimately incorporated a cohort of 20,185 participants, with data spanning 2016 to 2022. The TyG index was calculated as Ln [fasting triglyceride (mg/dL) × fasting glucose (mg/dL)/2]. A Latent Class Trajectory Model (LCTM) was used to assess trends in the TyG index over multiple time points. Using Cox proportional-hazards models, we assessed the relationship of baseline quartiles of the TyG index, and of the trajectories, with CVDs and their subtypes.
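The TyG index defined above is straightforward to compute; a minimal sketch (the input values are illustrative, not from the cohort):

```python
import math

def tyg_index(triglycerides_mg_dl, glucose_mg_dl):
    """TyG index = Ln[fasting TG (mg/dL) x fasting glucose (mg/dL) / 2]."""
    return math.log(triglycerides_mg_dl * glucose_mg_dl / 2.0)

# Hypothetical fasting labs: TG 150 mg/dL, glucose 100 mg/dL
tyg = tyg_index(150.0, 100.0)  # ≈ 8.92
```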
RESULTS
During a mean follow-up of 4.25 years, 11,099 patients in the elderly population experienced new CVDs. After stratifying by baseline TyG quartiles, higher TyG levels were associated with an increased risk of CVDs; the adjusted hazard ratio (aHR) and 95% CI for the highest quartile group were 1.28 (1.19-1.39). Five trajectory patterns were identified by the LCTM. With the low gradual-increase group as the reference, the medium stable group and the high gradual-increase group exhibited an elevated risk of CVD onset; the aHRs and 95% CIs were 1.17 (1.10-1.25) and 1.25 (1.15-1.35), respectively. Similar results were observed between the trajectories of the TyG index and the subtypes of CVDs.
CONCLUSION
Participants with high levels of baseline TyG index and medium stable or high gradual increase trajectories were associated with an elevated risk of developing CVDs in elderly populations.
Topics: Humans; Aged; Cardiovascular Diseases; Prospective Studies; Fasting; Glucose; Triglycerides; Blood Glucose; Risk Factors; Biomarkers; Risk Assessment
PubMed: 38172936
DOI: 10.1186/s12933-023-02100-2

The American Journal of Managed Care, Oct 2023
OBJECTIVES
To describe trends in US health care spending in a large, national, and commercially insured population during the COVID-19 pandemic.
STUDY DESIGN
Cross-sectional study of commercially insured members enrolled between May 1, 2018, and December 31, 2021.
METHODS
The study utilized a population-based sample of continuously enrolled members in a geographically diverse federation of Blue Cross Blue Shield plans across the United States. Our sample excluded Medicare and Medicare Advantage beneficiaries. The COVID-19 exposure period was defined as 2020-2021; 2018-2019 were pre-COVID-19 years. We defined 4 post-COVID-19 periods: March 1 to April 30, 2020; May 1 to December 31, 2020; January 1 to March 31, 2021; and April 1 to December 31, 2021. The primary outcome was inflation-adjusted overall per-member per-month (PMPM) medical spending adjusted for age, sex, Elixhauser comorbidities, area-level racial composition, income, and education.
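Per-member per-month spending, the primary outcome above, is total spending divided by member-months of enrollment; a minimal unadjusted sketch with made-up numbers (the study additionally adjusted for inflation and member characteristics):

```python
def pmpm(total_spending, member_months):
    """Unadjusted per-member per-month (PMPM) spending."""
    return total_spending / member_months

# Hypothetical month: 1,000 members, all enrolled for the full month
spending = pmpm(281_000.0, 1_000)  # 281.0 per member per month
```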
RESULTS
Our sample included 97,319,130 individuals. Mean PMPM medical spending decreased from $370.92 in January-February 2020 to $281.00 in March-April 2020. Between May and December 2020, mean PMPM medical spending recovered to, but did not exceed, prepandemic levels. Mean PMPM medical spending stayed below prepandemic levels between January and March 2021, rose above prepandemic baselines between April and June 2021, and decreased below baseline between July and December 2021.
CONCLUSIONS
The COVID-19 pandemic induced a spending shock in 2020, and health care spending did not recover to near baseline until mid-2021, with some emerging evidence of pent-up demand. The observed spending below baseline through the end of 2021 will pose challenges to setting spending benchmarks for alternative payment and shared savings models.
Topics: Aged; Humans; United States; Cross-Sectional Studies; Pandemics; Medicare; Health Expenditures; COVID-19
PubMed: 37870545
DOI: 10.37765/ajmc.2023.89440

Sensors (Basel, Switzerland), Jul 2023
This paper evaluates the potential application of Raman baselines in characterizing organic deposition, taking the layered sediments (stromatolites) formed by the growth of early life on Earth as the research object. Raman spectroscopy is an essential means of detecting extraterrestrial life in deep space. Fluorescence is the main interference in Raman spectroscopy detection: it raises the Raman baseline and can drown out the Raman information. The paper aims to evaluate the fluorescence contained in the Raman baseline and to characterize organic sedimentary structure using the Raman baseline. This study combines spectral image fusion with mapping technology to obtain fusion images of high spatial and spectral resolution. To establish that fluorescence from deposited organic matter is the main factor causing Raman baseline enhancement, 5041 Raman spectra were acquired over a 710 μm × 710 μm scanning area, and the gray level of the light-dark layering at each detection point was compared with the Raman baseline. The spatial distribution of carbonate minerals and organic precipitates was detected by combining mapping technology. In addition, based on the BI-IHS algorithm, spectral image fusion of the Raman fluorescence map with reflection, polarization, and orthogonal-polarization micrographs was realized, yielding fusion images with both high spectral and high spatial resolution. The results show that the Raman baseline can serve as useful information for characterizing stromatolite organic sedimentary structure.
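Extracting a fluorescence-dominated baseline from a spectrum is itself a standard preprocessing step. The following is a generic iterative polynomial-fit sketch of baseline estimation, not the fusion method used in the paper; the polynomial degree, iteration count, and synthetic spectrum are assumptions for the example:

```python
import numpy as np

def estimate_baseline(x, y, degree=3, iterations=20):
    """Iteratively fit a polynomial, clipping points above the fit,
    so narrow Raman peaks stop influencing the baseline estimate."""
    y_work = y.copy()
    for _ in range(iterations):
        coeffs = np.polyfit(x, y_work, degree)
        fit = np.polyval(coeffs, x)
        y_work = np.minimum(y_work, fit)   # suppress peaks above the fit
    return np.polyval(np.polyfit(x, y_work, degree), x)

# Synthetic spectrum: smooth fluorescence background + one narrow Raman peak
x = np.linspace(0, 1, 500)
true_baseline = 100 + 50 * x                       # linear "fluorescence"
peak = 80 * np.exp(-((x - 0.5) / 0.01) ** 2)       # narrow Raman line
baseline = estimate_baseline(x, true_baseline + peak, degree=1)
```

The estimated baseline carries the fluorescence information, which is exactly the signal this paper repurposes for characterization rather than discarding.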
Topics: Algorithms; Carbonates; Organic Chemicals; Spectrum Analysis, Raman
PubMed: 37447978
DOI: 10.3390/s23136128

Nature Communications, Dec 2023
Carbon credits generated through jurisdictional-scale avoided deforestation projects require accurate estimates of deforestation emission baselines, but there are serious challenges to their robustness. We assessed the variability, accuracy, and uncertainty of baselining methods by applying sensitivity and variable importance analysis to a range of typically used methods and parameters for 2,794 jurisdictions worldwide. The median jurisdiction's deforestation emission baseline varied by 171% (90% range: 87%-440%) of its mean, with a median forecast error of 0.778 times (90% range: 0.548-3.56) the actual deforestation rate. Moreover, variable importance analysis emphasised the strong influence of the deforestation projection approach. For the median jurisdiction, 68.0% of possible methods (90% range: 61.1%-85.6%) exceeded 15% uncertainty. Tropical and polar biomes exhibited larger uncertainties in carbon estimations. The use of sensitivity analyses and multi-model, multi-source ensemble approaches could reduce variabilities and biases. These findings provide a roadmap for improving baseline estimations to enhance carbon market integrity and trust.
PubMed: 38092814
DOI: 10.1038/s41467-023-44127-9

North American Spine Society Journal, Dec 2023
BACKGROUND CONTEXT
Lumbar spine fusion (LSF) surgery is a viable treatment for several spinal disorders. Treatment effects should preferably be confirmed in real-life settings.
METHODS
This prospective study evaluated the 10-year outcomes of LSF. A population-based series of elective LSFs performed at 2 spine centers between January 2008 and June 2012 was enrolled. Surgeries for tumor, acute fracture, infection, neuromuscular scoliosis, or postoperative conditions were excluded. The following patient-reported outcome measures (PROMs) were collected at baseline and at 1, 2, 5, and 10 years postsurgery: visual analog scale (VAS) for back and leg pain, Oswestry Disability Index (ODI), and SF-36. Longitudinal measures of PROMs were analyzed using mixed-effects models.
RESULTS
A total of 683 patients met the inclusion criteria; 630 (92%) of them completed baseline and at least 1 follow-up PROM and constituted the study population. Mean age was 61 (SD 12) years, and 69% were women. According to surgical indication, patients were stratified into degenerative spondylolisthesis (DS, n=332, 53%), spinal stenosis (SS, n=102, 16%), isthmic spondylolisthesis (IS, n=97, 15%), degenerative disc disease (DDD, n=52, 8%), and deformity (DF, n=47, 7%). All diagnostic cohorts demonstrated significant improvement at 1 year, followed by a partial loss of benefits by 10 years. ODI baselines and changes at 1 and 10 years were: (DS) 45, -21, and -14; (SS) 51, -24, and -13; (IS) 41, -24, and -20; (DDD) 50, -20, and -20; and (DF) 50, -21, and -16, respectively. Comparable patterns were seen in pain scores. Significant HRQoL gains were recorded in all cohorts, greatest in the physical domains but also substantial in the mental aspects of HRQoL.
CONCLUSIONS
Benefits of LSF were partially lost but still meaningful 10 years after surgery. Long-term benefits appeared smaller in degenerative conditions, reflecting progression of the ongoing spinal degeneration. Benefits were most evident in pain and physical-function measures.
PubMed: 37840551
DOI: 10.1016/j.xnsj.2023.100276

Annals of the American Thoracic Society, Apr 2021
In 2017, the U.S. Centers for Disease Control and Prevention (CDC) developed a new surveillance definition of sepsis, the adult sepsis event (ASE), to better track sepsis epidemiology. The ASE requires evidence of acute organ dysfunction and defines baseline organ function pragmatically as the best in-hospital value. This approach may undercount sepsis if new organ dysfunction does not resolve by discharge. Our objective was to understand how sepsis identification and outcomes differ when using the best laboratory values during hospitalization versus methods that use historical lookbacks to define baseline organ function. We identified all patients hospitalized at 138 Veterans Affairs hospitals (2013-2018) who were admitted via the emergency department with two or more systemic inflammatory response criteria, were treated with antibiotics within 48 hours (i.e., had potential infection), and completed 4+ days of antibiotics (i.e., had suspected infection). We considered three approaches to defining baseline renal, hematologic, and liver function: the best values during hospitalization (as in the CDC's ASE), the best values during hospitalization plus the prior 90 days (3-month baseline), and the best values during hospitalization plus the prior 180 days (6-month baseline). We determined how many patients met criteria for sepsis under each approach and then compared the characteristics and outcomes of sepsis hospitalizations between the three approaches. Among 608,128 hospitalizations with potential infection, 72.1%, 68.5%, and 58.4% had creatinine, platelets, and total bilirubin, respectively, measured in the prior 3 months; 86.0%, 82.6%, and 74.8%, respectively, had these labs in the prior 6 months. Using the hospital baseline, 100,568 hospitalizations met criteria for community-acquired sepsis. By contrast, 111,983 and 117,435 met criteria for sepsis using the 3- and 6-month baselines, for relative increases of 11% and 17%, respectively. Patient characteristics were similar across the three approaches. In-hospital mortality was 7.2%, 7.0%, and 6.8% for sepsis hospitalizations identified using the hospital, 3-month, and 6-month baselines, respectively; 30-day mortality was 12.5%, 12.7%, and 12.5%, respectively. Among veterans hospitalized with potential infection, the majority had laboratory values from the prior 6 months. Using 3- and 6-month lookbacks to define baseline organ function resulted in 11% and 17% relative increases, respectively, in the number of sepsis hospitalizations identified.
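The effect of the lookback on case identification can be illustrated with creatinine, using a doubling from the baseline (best) value as the organ-dysfunction criterion, as in the ASE definition; the patient values below are hypothetical:

```python
def meets_creatinine_criterion(in_hospital, lookback=()):
    """Baseline = best (lowest) creatinine over the hospitalization plus any
    lookback values; dysfunction if the in-hospital peak is >= 2x baseline."""
    baseline = min(list(in_hospital) + list(lookback))
    return max(in_hospital) >= 2.0 * baseline

# Hypothetical patient whose creatinine stays elevated through discharge
in_hospital = [2.2, 2.4, 2.1]   # mg/dL, never recovers during the stay
prior_90_days = [1.0]           # outpatient value two months earlier

hospital_only = meets_creatinine_criterion(in_hospital)                  # missed
with_lookback = meets_creatinine_criterion(in_hospital, prior_90_days)   # caught
```

Because the dysfunction never resolved in-hospital, the hospital-only baseline (2.1 mg/dL) hides the doubling that the 90-day lookback (baseline 1.0 mg/dL) reveals, which is exactly the undercounting mechanism the study quantifies.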
Topics: Adult; Cohort Studies; Emergency Service, Hospital; Hospital Mortality; Hospitalization; Humans; Retrospective Studies; Sepsis
PubMed: 33476245
DOI: 10.1513/AnnalsATS.202009-1130OC