Journal of the American Academy of... Jul 2021
BACKGROUND
Speech recognition in noisy environments is a challenge for both cochlear implant (CI) users and device manufacturers. CI manufacturers have been investing in technological innovations for processors and researching strategies to improve signal processing and signal design for better aesthetic acceptance and everyday use.
PURPOSE
This study aimed to compare speech recognition in CI users using off-the-ear (OTE) and behind-the-ear (BTE) processors.
DESIGN
A cross-sectional study was conducted with 51 CI recipients, all users of the BTE Nucleus 5 (CP810) sound processor. Speech perception performance was compared in quiet and noisy conditions using the BTE sound processor Nucleus 5 (N5) and the OTE sound processor Kanso. Each participant was tested with the Brazilian-Portuguese version of the hearing in noise test, using each sound processor in a randomized order. Three test conditions were analyzed with both sound processors: (i) speech level fixed at 65 decibels sound pressure level in quiet, (ii) speech and noise at fixed levels, and (iii) adaptive speech levels with a fixed noise level. To determine the relative performance of the OTE with respect to the BTE processor, paired comparison analyses were performed.
RESULTS
The paired t-tests showed no significant difference between the N5 and Kanso in quiet conditions. In all noise conditions, the performance of the OTE (Kanso) sound processor was superior to that of the BTE (N5), regardless of the order in which they were used. With speech and noise at fixed levels, a significant mean 8.1 percentage point difference was seen between Kanso (78.8%) and N5 (70.7%) in the sentence scores.
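The paired-comparison analysis described above can be sketched digitally. The following is a minimal illustration (not the study's data: the participant scores below are simulated under assumed means and spreads) of a paired t-test on per-participant sentence scores for two processors.

```python
# Hypothetical sketch of a paired t-test on sentence scores, as in the
# analysis above. All numbers here are simulated, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 51  # participants, as in the study

bte_scores = rng.normal(70.7, 10.0, n)             # simulated BTE (N5) sentence scores (%)
ote_scores = bte_scores + rng.normal(8.0, 5.0, n)  # simulated OTE (Kanso) scores

# Paired t-test: each participant is their own control across processors
t_stat, p_value = stats.ttest_rel(ote_scores, bte_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.2e}")
```

A paired design is appropriate here because the same 51 participants were tested with both processors, so between-subject variability cancels out.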
CONCLUSION
CI users had a lower signal-to-noise ratio and a higher percentage of sentence recognition with the OTE processor than with the BTE processor.
Topics: Brazil; Cochlear Implants; Cross-Sectional Studies; Humans; Speech; Speech Perception
PubMed: 34847587
DOI: 10.1055/s-0041-1735252
Meat Science Nov 2018 (Review)
Review
Historically, meat and poultry processors in the U.S. have relied on the use of synthetic antioxidants like butylated hydroxyanisole, butylated hydroxytoluene, tert-butylhydroquinone, and propyl gallate, as well as tocopherols to prevent lipid and protein oxidation. There is a trend towards utilizing natural antioxidants as replacements for synthetic ones. Some processors are already using multi-functional ingredients, such as rosemary and oregano, approved for use as spices and natural flavors to curb oxidation. Yet, there are still other ingredients that have not been applied in this fashion. Spices and natural flavors can often be incorporated in products that have defined statements of identity or composition. Further, these ingredients allow the processor to transition to a clean label without compromising the shelf life and quality of the products. Spices and natural flavors may have higher minimum effective concentrations than their synthetic counterparts, but they will offer increased consumer acceptability, decreased potential health risks, and can often achieve the same degree of oxidation prevention.
Topics: Animals; Antioxidants; Flavoring Agents; Food Handling; Food Preservation; Food Preservatives; Humans; Meat Products; Plant Extracts; Spices; United States
PubMed: 30071458
DOI: 10.1016/j.meatsci.2018.07.020
Acta Otorrinolaringologica Espanola 2022
BACKGROUND AND OBJECTIVE
Osseointegrated auditory devices are hearing devices that use bone conduction of sound to improve hearing. The mechanisms and factors that contribute to this sound transmission have been widely studied; however, other aspects remain unknown, for instance the influence of the processor's power output. The aim of this study was to determine whether there is any relationship between the power output generated by the devices and the hearing improvement they achieve.
MATERIALS AND METHODS
Forty-four patients were implanted with a percutaneous Baha® 5 model. Hearing thresholds in pure-tone audiometry, free-field audiometry, and speech recognition (in quiet and in noise) were measured pre- and postoperatively in each patient. The direct bone conduction thresholds and the power output values from the processors were also obtained.
RESULTS
The pure-tone average threshold in free field was 39.29 dB (SD = 9.15), and the mean gain with the device was 29.18 dB (SD = 10.13). This involved an air-bone gap closure in 63.64% of patients. The pure-tone average threshold in direct bone conduction was 27.6 dB (SD = 10.91), which was 8.4 dB better than the pure-tone average threshold via bone conduction. The mean gain in speech recognition was 39.15% (SD = 23.98) at 40 dB and 36.66% (SD = 26.76) at 60 dB. The mean gain in the signal-to-noise ratio was -5.9 dB (SD = 4.32). The mean power output values were 27.95 dB µN (SD = 6.51) at G40 and 26.22 dB µN (SD = 6.49) at G60. When analysing the relationship between bone conduction thresholds and the G40 and G60 values, a correlation was observed from 1000 Hz upwards. However, no statistically significant association was found between power output and functional gain or speech recognition gain.
CONCLUSIONS
The osseointegrated auditory devices generate hearing improvement in tonal thresholds and speech recognition, even in noise. Most patients closed the air-bone gap with the device. There is a direct relationship between the bone conduction threshold and the power output values from the processor, but only in mid and high frequencies. However, the relationship between power output and gain in speech recognition is weaker. Further investigation of contributing factors is necessary.
Topics: Audiometry, Pure-Tone; Auditory Threshold; Hearing; Hearing Aids; Humans; Speech Perception
PubMed: 35397830
DOI: 10.1016/j.otoeng.2021.01.002
Data in Brief Dec 2022
As e-Commerce continues to shift our shopping preference from the physical to the online marketplace, we leave behind digital traces of our personally identifiable details. For example, the merchant keeps a record of your name and address; the payment processor stores your transaction details, including account or card information; and every website you visit stores other information, such as your device address and type. Cybercriminals constantly steal and use some of this information to commit identity fraud, with devastating consequences for the victims, but also for the card issuers and payment processors, with whom the financial liability most often lies. To this end, we recognise that data is generally compromised in this digital age, and personal data such as card numbers, passwords, personal identification numbers and account details can easily be stolen and used by someone else. However, there is a plethora of data relating to a person's behavioural biometrics that is almost impossible to steal, such as the way they type on a keyboard, move the cursor, or whether they normally do so via a mouse, touchpad or trackball. This data, commonly called keystroke, mouse and touchscreen dynamics, can be used to create a unique profile for the legitimate card owner that can be utilised as an additional layer of user authentication during online card payments. Machine learning is a powerful technique for analysing such data to gain knowledge, and has been used successfully in many sectors for profiling, e.g. genome classification in molecular biology and genetics, where predictions are made for one or more forms of biochemical activity along the genome. Similar techniques are applicable in the financial sector to detect anomalies in user keyboard and mouse behaviour when entering card details online, such that they can be used to distinguish between a legitimate and an illegitimate card owner.
In this article, a behaviour biometrics (i.e., keystroke and mouse dynamics) dataset, collected from 88 individuals, is presented. The dataset holds a total of 1760 instances categorised into two classes (i.e., legitimate and illegitimate card owners' behaviour). The data was collected to facilitate an academic start-up project (called CyberSignature) which received funding from Innovate UK under the Cyber Security Academic Startup Accelerator Programme. The dataset could be helpful to researchers who apply machine learning to develop applications using keystroke and mouse dynamics, e.g., in cybersecurity to prevent identity theft. The dataset, entitled 'Behaviour Biometrics Dataset', is freely available on the Mendeley Data repository.
PubMed: 36426040
DOI: 10.1016/j.dib.2022.108728
Nature Communications Apr 2022
Computational meta-optics brings a twist to accelerating hardware, with the benefits of ultrafast speed, ultra-low power consumption, and parallel information processing in versatile applications. The recent advent of metasurfaces has enabled the full manipulation of electromagnetic waves within subwavelength scales, promising multifunctional, high-throughput, compact and flat optical processors. In this trend, metasurfaces with nonlocality or multi-layer structures have been proposed to perform analog optical computations based on Green's function or Fourier transform approaches, intrinsically constrained by limited operations or large footprints/volumes. Here, we showcase a Fourier-based metaprocessor that imparts customized, highly flexible transfer functions for analog computing upon a single-layer Huygens' metasurface. Basic mathematical operations, including differentiation and cross-correlation, are performed by directly modulating complex wavefronts in the spatial Fourier domain, facilitating edge detection and pattern recognition in various image-processing tasks. Our work substantiates an ultracompact and powerful kernel processor that could find important applications in optical analog computing and image processing.
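Fourier-domain differentiation of the kind performed optically by such a metasurface has a simple digital analogue. The sketch below (a one-dimensional toy, not the optical implementation) applies the transfer function H(k) = ik to a step-like "image", which turns flat regions to zero and leaves strong responses only at edges.

```python
# Digital analogue of Fourier-domain differentiation for edge detection:
# multiply the spectrum by H(k) = i*k, then transform back. A metasurface
# imposes the same kernel optically; this toy does it with FFTs.
import numpy as np

x = np.linspace(-1, 1, 256, endpoint=False)
field = np.where(np.abs(x) < 0.5, 1.0, 0.0)   # toy "image": a bright stripe

k = 2j * np.pi * np.fft.fftfreq(x.size, d=x[1] - x[0])  # transfer function i*k
deriv = np.fft.ifft(k * np.fft.fft(field)).real          # d(field)/dx

edges = np.abs(deriv)  # large only at the stripe boundaries (near x = ±0.5)
print(np.sort(np.abs(x[np.argsort(edges)[-2:]])))  # locations of the two strongest responses
```

In two dimensions the same idea uses H(kx, ky) acting on an image's 2D spectrum; differentiation along one axis highlights edges perpendicular to it.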
Topics: Computers; Fourier Analysis; Image Processing, Computer-Assisted; Optics and Photonics
PubMed: 35449139
DOI: 10.1038/s41467-022-29732-4
Nature Jun 2023
Quantum computing promises to offer substantial speed-ups over its classical counterpart for certain problems. However, the greatest impediment to realizing its full potential is noise that is inherent to these systems. The widely accepted solution to this challenge is the implementation of fault-tolerant quantum circuits, which is out of reach for current processors. Here we report experiments on a noisy 127-qubit processor and demonstrate the measurement of accurate expectation values for circuit volumes at a scale beyond brute-force classical computation. We argue that this represents evidence for the utility of quantum computing in a pre-fault-tolerant era. These experimental results are enabled by advances in the coherence and calibration of a superconducting processor at this scale and the ability to characterize and controllably manipulate noise across such a large device. We establish the accuracy of the measured expectation values by comparing them with the output of exactly verifiable circuits. In the regime of strong entanglement, the quantum computer provides correct results for which leading classical approximations such as pure-state-based 1D (matrix product states, MPS) and 2D (isometric tensor network states, isoTNS) tensor network methods break down. These experiments demonstrate a foundational tool for the realization of near-term quantum applications.
PubMed: 37316724
DOI: 10.1038/s41586-023-06096-3
Molecular Ecology Resources Oct 2021
Software tools for linkage disequilibrium (LD) analyses are designed to calculate LD among all genetic variants in a single region. Since compute and memory requirements grow quadratically with the distance between variants, using these tools for long-range LD calculations leads to long execution times and increased allocation of memory resources. Furthermore, widely used tools do not fully utilize the computational resources of modern processors and/or graphics processing cards, limiting future large-scale analyses on thousands of samples. We present quickLD, a stand-alone and open-source software that computes several LD-related statistics, including the commonly used r². quickLD calculates pairwise LD between genetic variants in a single region or in arbitrarily distant regions with negligible memory requirements. Moreover, quickLD achieves up to 95% and 97% of the theoretical peak performance of a CPU and a GPU, respectively, enabling 21.5× faster processing than current state-of-the-art software on a multicore processor and 49.5× faster processing when the aggregate processing power of a multicore CPU and a GPU is harnessed. quickLD can also be used in studies of selection, recombination, genetic drift, inbreeding and gene flow. The software is available at https://github.com/pephco/quickLD.
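The r² statistic mentioned above has a standard definition for a pair of biallelic variants; the sketch below computes it from 0/1 haplotype vectors (a minimal reference implementation of the formula, not quickLD's optimized code).

```python
# Minimal sketch of the pairwise LD r^2 statistic for two biallelic variants:
# D = pAB - pA*pB, and r^2 = D^2 / (pA*(1-pA)*pB*(1-pB)).
import numpy as np

def ld_r2(hap_a: np.ndarray, hap_b: np.ndarray) -> float:
    """hap_a, hap_b: 0/1 allele vectors, one entry per haplotype."""
    p_a, p_b = hap_a.mean(), hap_b.mean()
    p_ab = np.mean(hap_a * hap_b)   # frequency of the AB haplotype
    d = p_ab - p_a * p_b            # coefficient of linkage disequilibrium
    return d * d / (p_a * (1 - p_a) * p_b * (1 - p_b))

a = np.array([1, 1, 0, 0, 1, 0, 1, 0])
print(ld_r2(a, a))       # identical variants: complete LD, r^2 = 1
print(ld_r2(a, 1 - a))   # perfectly anti-correlated alleles: also r^2 = 1
```

The quadratic scaling the abstract refers to comes from evaluating this quantity over all variant pairs in a window; quickLD's contribution is doing that efficiently on CPUs and GPUs.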
Topics: Algorithms; Genetic Linkage; Linkage Disequilibrium; Software
PubMed: 34062051
DOI: 10.1111/1755-0998.13438
Musculoskeletal Science & Practice Jun 2022
INTRODUCTION
Magnetic resonance imaging (MRI) is the standard to quantify size and structure of lumbar muscles. Three-dimensional volumetric measures are expected to be more closely related to muscle function than two-dimensional measures such as cross-sectional area. Reliability and agreement of a standardized method should be established to enable the use of MRI to assess lumbar muscle characteristics.
OBJECTIVES
This study investigates the intra- and inter-processor reliability for the quantification of (1) muscle volume and (2) fat fraction based on chemical shift MRI images using axial 3D-volume measurements of the lumbar multifidus in patients with low back pain.
METHODS
Two processors manually segmented the lumbar multifidus on the MRI scans of 18 patients with low back pain using Mevislab software following a well-defined method. Fat fraction of the segmented volume was calculated. Reliability and agreement were determined using intra-class correlation coefficients (ICC), Bland-Altman plots and calculation of the standard error of measurement (SEM) and minimal detectable change (MDC).
RESULTS
Excellent ICCs were found for both intra-processor and inter-processor analysis of lumbar multifidus volume measurement, with slightly better results for the intra-processor reliability. The SEMs for volume were lower than 4.1 cm³. Excellent reliability and agreement were also found for fat fraction measures, with ICCs of 0.985-0.998 and SEMs below 0.946%.
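The SEM and MDC reported above follow standard formulas derived from the ICC. The sketch below shows those formulas with hypothetical inputs (the SD and ICC values are invented for illustration, not taken from the study).

```python
# Sketch of the agreement statistics used above: standard error of measurement
# (SEM) and minimal detectable change at 95% confidence (MDC95), derived from
# an ICC. The input numbers here are hypothetical, not the study's.
import math

def sem(sd_pooled: float, icc: float) -> float:
    # SEM = SD * sqrt(1 - ICC)
    return sd_pooled * math.sqrt(1 - icc)

def mdc95(sem_value: float) -> float:
    # MDC95 = 1.96 * sqrt(2) * SEM (two measurements, 95% confidence)
    return 1.96 * math.sqrt(2) * sem_value

s = sem(sd_pooled=10.0, icc=0.99)  # hypothetical volume SD of 10 cm^3
print(f"SEM = {s:.2f} cm^3, MDC95 = {mdc95(s):.2f} cm^3")
```

The MDC is the smallest between-session change that exceeds measurement noise, which is why studies like this one report SEM alongside the ICC.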
CONCLUSION
The proposed method to quantify muscle volume and fat fraction of the lumbar multifidus on MRI was highly reliable, and can be used in further research on lumbar multifidus structure.
Topics: Humans; Low Back Pain; Lumbosacral Region; Magnetic Resonance Imaging; Paraspinal Muscles; Reproducibility of Results
PubMed: 35245881
DOI: 10.1016/j.msksp.2022.102532
International Journal of Environmental... Mar 2023
INTRODUCTION
Fish can be an affordable and accessible animal-source food in many Low- and Middle-Income Countries (LMIC).
BACKGROUND
Traditional fish processing methods pose a risk of exposing fish to various contaminants that may reduce their nutritional benefit. In addition, a lack of literacy may increase women fish processors' vulnerability to malnutrition and foodborne diseases.
OBJECTIVE
The overall aim of the project was to educate women and youth fish processors in Delta State, Nigeria about the benefit of fish in the human diet and to develop low literacy tools to help them better market their products. The objective of this study was to describe the development and validation of a low-literacy flipbook designed to teach women fish processors about nutrition and food safety.
METHOD
Developing and validating instructional material requires an understanding of the target population, high-quality and relevant graphics, and the involvement of relevant experts to conduct the content validation using the Content Validity Index (CVI), with the index value translated with the Modified Kappa Index (κ).
RESULT
The Item-level Content Validity Index (I-CVI) value of all domains evaluated at the initial stage was 0.83 and the Scale-level Content Validity Index (S-CVI) was 0.90. At the final stage, the material was validated by four experts with a CVI of 0.983, satisfying the expected minimum CVI value for this study (CVI ≥ 0.83, p-value = 0.05). The overall evaluation of the newly developed and validated flipbook was "excellent".
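The indices reported above have simple closed forms: I-CVI is the proportion of experts rating an item relevant, S-CVI/Ave averages the I-CVIs, and the modified kappa adjusts I-CVI for chance agreement. The sketch below uses invented ratings (the per-item counts are hypothetical, not the study's data).

```python
# Sketch of the content-validity indices used above. Expert ratings are
# hypothetical; only the formulas are standard.
from math import comb

def i_cvi(relevant: int, n_experts: int) -> float:
    # proportion of experts rating the item relevant
    return relevant / n_experts

def modified_kappa(icvi: float, relevant: int, n_experts: int) -> float:
    # pc: probability that exactly `relevant` of n experts agree by chance
    pc = comb(n_experts, relevant) * 0.5 ** n_experts
    return (icvi - pc) / (1 - pc)

# four experts, hypothetical per-item counts of "relevant" ratings
counts = [4, 4, 3, 4]
icvis = [i_cvi(c, 4) for c in counts]
s_cvi_ave = sum(icvis) / len(icvis)   # S-CVI/Ave: mean of the I-CVIs
print(icvis, round(s_cvi_ave, 4))
print(round(modified_kappa(icvis[2], counts[2], 4), 3))
```

Under the usual interpretation, items with modified kappa above 0.74 are rated "excellent", which is the benchmark validation studies like this one apply.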
CONCLUSIONS
The developed material was found to be appropriate for training fish processors in Nigeria in nutrition and food safety, and could be modified for populations of fish processors in other LMICs.
Topics: Humans; Female; Adolescent; Surveys and Questionnaires; Nigeria; Nutritional Status; Diet; Food Safety; Reproducibility of Results
PubMed: 36981799
DOI: 10.3390/ijerph20064891
Reports on Progress in Physics.... Sep 2022 (Review)
Review
Quantum annealing (QA) is a heuristic quantum optimization algorithm that can be used to solve combinatorial optimization problems. In recent years, advances in quantum technologies have enabled the development of small- and intermediate-scale quantum processors that implement the QA algorithm for programmable use. Specifically, QA processors produced by D-Wave Systems have been studied and tested extensively in both research and industrial settings across different disciplines. In this paper we provide a literature review of the theoretical motivations for QA as a heuristic quantum optimization algorithm, the software and hardware that are required to use such quantum processors, and the state-of-the-art applications and proofs of concept that have been demonstrated using them. The goal of our review is to provide a centralized and condensed source regarding applications of QA technology. We identify the advantages, limitations, and potential of QA for both researchers and practitioners from various fields.
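The problem class QA processors target is QUBO: minimizing a quadratic objective over binary variables. As an illustration only (the instance below is a toy Max-Cut on a triangle, chosen by us, and is solved by classical brute force rather than on an annealer), the sketch shows the objective an annealer would minimize in hardware.

```python
# Toy QUBO (quadratic unconstrained binary optimization): Max-Cut on a
# triangle graph, encoded as minimize x^T Q x and solved by brute force.
# A QA processor would minimize the same objective via annealing.
from itertools import product

# Max-Cut QUBO: Q_ii = -degree(i), Q_ij = 2 for each edge (i, j)
Q = {
    (0, 0): -2, (1, 1): -2, (2, 2): -2,   # each triangle node has degree 2
    (0, 1): 2, (0, 2): 2, (1, 2): 2,      # one quadratic term per edge
}

def energy(x):
    # x^T Q x for a binary assignment x
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Exhaustive search over all 2^3 assignments
best = min(product([0, 1], repeat=3), key=energy)
print(best, energy(best))  # minimum energy -2 corresponds to a cut of size 2
```

Brute force is only viable for tiny instances; the review above surveys how annealers attack the same objective at scales where exhaustive search is impossible.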
PubMed: 36001953
DOI: 10.1088/1361-6633/ac8c54