- Physical Review Letters, May 2021
Ternary quantum processors offer significant potential computational advantages over conventional qubit technologies, leveraging the encoding and processing of quantum information in qutrits (three-level systems). To evaluate and compare the performance of such emerging quantum hardware, it is essential to have robust benchmarking methods suitable for a higher-dimensional Hilbert space. We demonstrate extensions of industry-standard randomized benchmarking (RB) protocols, developed and used extensively for qubits, that are suitable for ternary quantum logic. Using a superconducting five-qutrit processor, we find an average single-qutrit process infidelity of 3.8×10^{-3}. Through interleaved RB, we characterize a few relevant gates, and we employ simultaneous RB to fully characterize crosstalk errors. Finally, we apply cycle benchmarking to a two-qutrit CSUM gate and obtain a two-qutrit process fidelity of 0.85. Our results present and demonstrate RB-based tools for characterizing the performance of a qutrit processor, and a general approach to diagnosing control errors in future qudit hardware.
PubMed: 34114846
DOI: 10.1103/PhysRevLett.126.210504
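The infidelity quoted in this abstract comes from the standard RB procedure: survival probability decays as F(m) = A·p^m + B with sequence length m, and the depolarizing parameter p maps to the average process infidelity r = (d−1)/d·(1−p), with d = 3 for a qutrit. A minimal sketch of that fit on synthetic, noiseless data (not the paper's code; the fixed floor B = 1/d and the log-linear fit are simplifying assumptions):

```python
import numpy as np

def rb_infidelity(seq_lengths, survival_probs, d=3):
    """Estimate average process infidelity from an RB decay curve.

    Fits F(m) = A * p**m + B with B fixed at 1/d (the uniform-guessing
    floor), then converts the depolarizing parameter p to the average
    process infidelity r = (d - 1) / d * (1 - p).
    """
    B = 1.0 / d
    m = np.asarray(seq_lengths, dtype=float)
    y = np.asarray(survival_probs, dtype=float) - B
    # Log-linear least squares: log(F - B) = log(A) + m * log(p)
    slope, _ = np.polyfit(m, np.log(y), 1)
    p = np.exp(slope)
    return (d - 1) / d * (1 - p)

# Synthetic qutrit decay with p chosen so r matches the 3.8e-3
# reported in the abstract: p = 1 - r * d / (d - 1).
r_true = 3.8e-3
p_true = 1 - r_true * 3 / 2
lengths = np.arange(1, 200, 10)
probs = (1 - 1 / 3) * p_true**lengths + 1 / 3  # A + B = 1 (no SPAM error)

print(rb_infidelity(lengths, probs))  # recovers ≈ 3.8e-3
```

Real data would require fitting A and B as well (SPAM errors shift both), typically with a nonlinear least-squares routine.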
- Journal of the American Chemical Society, Nov 2023 (Review)
Programmable biomolecule-mediated computing is a new computing paradigm compared to contemporary electronic computing. It employs nucleic acids and analogous biomolecular structures as information-storing and -processing substrates to tackle computational problems. Investigating programmable biomolecule-mediated processors capable of automatically processing, storing, and displaying information is therefore of great significance. This Perspective presents several conceptual designs of programmable biomolecule-mediated processors and offers insights into potential future research directions for such processors.
PubMed: 37864571
DOI: 10.1021/jacs.3c04142
- Seminars in Hearing, May 2021 (Review)
This case study examines the methods used to troubleshoot a cochlear implant processor via video visit with a nonagenarian (90+ years old) with a bimodal cochlear implant system. This article will discuss the evaluation and management as well as which specific issues could be addressed virtually and how they were resolved. Examples will be provided about how to virtually connect with the patient and how to best facilitate communication during a video visit. Additionally, this article will examine the captioning apps and other hearing assistive technology available for smartphones that can provide further assistance during a cell phone call along with their benefits and limitations.
PubMed: 34381294
DOI: 10.1055/s-0041-1731691
- Philosophical Transactions. Series A,... Oct 2017
There is a broad design space for concurrent computer processors: they can be optimized for low power, low latency or high throughput. This freedom to tune each processor design to its niche has led to an increasing diversity of machines, from powerful pocketable devices to those responsible for complex and critical tasks, such as car guidance systems. Given this context, academic concurrency research sounds notes of both caution and optimism. Caution, because recent work has uncovered flaws in the way we explain the subtle memory behaviour of concurrent systems: specifications have been shown to be incorrect, leading to bugs throughout the many layers of the system. And optimism, because our tools and methods for verifying the correctness of concurrent code, although built above an idealized model of concurrency, are becoming more mature. This paper looks at the way we specify the memory behaviour of concurrent systems and suggests a new direction. Currently, there is a siloed approach, with each processor and programming language specified separately in an incomparable way. But this does not match the structure of our programs, which may use multiple processors and languages together. Instead we propose an approach where program components carry with them a description of the sort of concurrency they rely on, together with a mechanism for composing these descriptions. This will support not only components written for the multiple varied processors found in a modern system but also those that use idealized models of concurrency, providing a sound footing for mature verification techniques. This article is part of the themed issue 'Verified trustworthy software systems'.
PubMed: 28871054
DOI: 10.1098/rsta.2015.0406
- Physical Review Letters, Sep 2022
Approximation based on perturbation theory is the foundation for most of the quantitative predictions of quantum mechanics, whether in quantum many-body physics, chemistry, quantum field theory, or other domains. Quantum computing provides an alternative to the perturbation paradigm, yet state-of-the-art quantum processors with tens of noisy qubits are of limited practical utility. Here, we introduce perturbative quantum simulation, which combines the complementary strengths of the two approaches, enabling the solution of large practical quantum problems using limited noisy intermediate-scale quantum hardware. The use of a quantum processor eliminates the need to identify a solvable unperturbed Hamiltonian, while the introduction of perturbative coupling permits the quantum processor to simulate systems larger than the available number of physical qubits. We present an explicit perturbative expansion that mimics the Dyson series expansion and involves only local unitary operations, and show its optimality over other expansions under certain conditions. We numerically benchmark the method for interacting bosons, fermions, and quantum spins in different topologies, and study different physical phenomena, such as information propagation, charge-spin separation, and magnetism, on systems of up to 48 qubits using only an 8+1-qubit quantum device. We demonstrate our scheme on the IBM quantum cloud, verifying its noise robustness and illustrating its potential for benchmarking large quantum processors with smaller ones.
PubMed: 36179156
DOI: 10.1103/PhysRevLett.129.120505
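The "perturbation paradigm" the abstract contrasts against can be made concrete with the textbook case: second-order Rayleigh-Schrödinger perturbation theory for a ground-state energy, checked against exact diagonalization. This toy 3-level system is an illustration of that baseline, not of the paper's hardware scheme; the Hamiltonian values are arbitrary:

```python
import numpy as np

# Unperturbed Hamiltonian H0 (diagonal in its eigenbasis) and a
# Hermitian perturbation V, for an arbitrary toy 3-level system.
E0 = np.array([0.0, 1.0, 2.5])
H0 = np.diag(E0)
V = np.array([[0.0, 0.2, 0.1],
              [0.2, 0.0, 0.3],
              [0.1, 0.3, 0.0]])
lam = 0.05  # perturbation strength

# Second-order Rayleigh-Schrödinger estimate of the ground-state energy:
# E ≈ E0 + λ⟨0|V|0⟩ + λ² Σ_{n≠0} |⟨n|V|0⟩|² / (E0 - En)
e_pt = E0[0] + lam * V[0, 0] + lam**2 * sum(
    abs(V[n, 0])**2 / (E0[0] - E0[n]) for n in (1, 2))

# Exact ground-state energy from diagonalization of the full H.
e_exact = np.linalg.eigvalsh(H0 + lam * V)[0]

print(e_pt, e_exact)  # agree up to O(λ³) corrections
```

The point of the paper is that on a quantum processor one need not find a diagonalizable H0 at all; the expansion is carried in the coupling between subsystems instead.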
- IEEE Transactions on Biomedical..., Apr 2020 (Review)
This paper reviews the state of the art and trends of AI-based biomedical processing algorithms and hardware. The algorithms and hardware for different biomedical applications, such as ECG, EEG, and hearing aids, have been reviewed and discussed. For algorithm design, various widely used biomedical signal classification algorithms have been discussed, including support vector machines (SVM), back-propagation neural networks (BPNN), convolutional neural networks (CNN), probabilistic neural networks (PNN), recurrent neural networks (RNN), long short-term memory (LSTM) networks, and fuzzy neural networks. The pros and cons of the classification algorithms have been analyzed and compared in the context of application scenarios. The research trends of AI-based biomedical processing algorithms and applications are also discussed. For hardware design, various AI-based biomedical processors have been reviewed and discussed, including ECG classification processors, EEG classification processors, EMG classification processors, and hearing aid processors. Various techniques at the architecture and circuit level have been analyzed and compared. The research trends of the AI-based biomedical processor have also been discussed.
Topics: Algorithms; Artificial Intelligence; Biomedical Engineering; Electrodiagnosis; Humans; Signal Processing, Computer-Assisted
PubMed: 32078560
DOI: 10.1109/TBCAS.2020.2974154
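The classifiers this review surveys all share the same skeleton: extract features from a biosignal, learn a decision boundary, predict a class. A deliberately tiny stand-in (logistic regression trained by gradient descent on synthetic two-class feature data, far simpler than the SVM/CNN/LSTM models actually reviewed; all data here is made up):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class "biomedical feature" data, e.g. stand-ins for
# two features of an ECG beat: class 0 near (-1,-1), class 1 near (+1,+1).
X = np.vstack([rng.normal(-1, 0.5, (100, 2)), rng.normal(+1, 0.5, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

# Minimal logistic-regression classifier trained by gradient descent.
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))   # sigmoid activation
    grad_w = X.T @ (p - y) / len(y)      # gradient of cross-entropy loss
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

acc = np.mean(((X @ w + b) > 0) == (y == 1))
print(acc)  # well-separated clusters give near-perfect training accuracy
```

The hardware question the review then asks is how to run such inference loops within the power budget of a wearable or implant.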
- Attention, Perception & Psychophysics, Aug 2019
The architecture of the numerical cognition system is currently not well understood, but at a general level, assumptions are made about two core components: a quantity processor and an identity processor. The quantity processor is concerned with accessing and using the stored magnitude denoted by a given digit, and the identity processor is concerned with recovery of the corresponding digit's identity. Blanc-Goldhammer and Cohen (Journal of Experimental Psychology: Learning, Memory, and Cognition, 40, 1389-1403, 2014) established that the recovery and use of quantity information operates in an unlimited-capacity fashion. Here we assessed whether the identity processor operates in a similar fashion. We present two experiments that were digit identity variations of Blanc-Goldhammer and Cohen's magnitude estimation paradigm. The data across both experiments reveal a limited-capacity identity processor whose operation reflects cross-talk with the quantity processor. Such findings provide useful evidence that can be used to adjudicate between competing models of the human number-processing system.
Topics: Cognition; Humans; Learning; Mathematics; Mental Processes; Reaction Time
PubMed: 31073948
DOI: 10.3758/s13414-019-01745-0
- Acta Oto-laryngologica, Mar 2021
Signal processing algorithms are the hidden components in the audio processor that convert the received acoustic signal into electrical impulses while maintaining as much relevant information as possible. Signal processing algorithms should be smart enough to mimic the functionality of the external, middle, and inner ear to provide the cochlear implant (CI) user with a hearing experience as natural as possible. Modern sound processing strategies are based on the continuous interleaved sampling (CIS) strategy proposed by B. Wilson in 1991, which provided envelope information over several intracochlear electrodes. The CIS strategy brought significant gains in speech perception. Translational research activities of MED-EL resulted in further improvements in speech understanding in noisy environments as well as enjoyment of music, not only by coding CIS-based envelope information but also by representing temporal fine structure information in the stimulation patterns of the apical channels. Further developments include "complete cochlear coverage" made possible by deep insertion of the intracochlear electrode, elaborate front-end processing, anatomy-based fitting (ABF), triphasic pulse stimulation instrumental in the suppression of facial nerve stimulation, and bimodal delay compensation allowing unilateral CI users to experience hearing with a hearing aid on the contralateral ear. The large number of hardware developments may be exemplified by the RONDO, the world's first single-unit audio processor, introduced in 2013. This article covers the milestones of translational research around signal processing and audio processors that took place in association with MED-EL.
Topics: Acoustic Stimulation; Auditory Pathways; Cochlear Implantation; Cochlear Implants; History, 20th Century; History, 21st Century; Humans; Sound Localization; Speech Perception
PubMed: 33818264
DOI: 10.1080/00016489.2021.1888504
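The envelope coding at the heart of CIS can be sketched for a single analysis band: rectify the band-limited signal, then low-pass it, and the result is what gets mapped onto interleaved electrode pulses. A toy one-channel version (a real processor uses a multi-band filter bank and pulsatile stimulation; the moving-average low-pass here is a simplifying assumption):

```python
import numpy as np

fs = 16000
t = np.arange(0, 0.5, 1 / fs)

# A 1 kHz carrier with a slow 5 Hz amplitude modulation, standing in
# for the output of one band of the analysis filter bank.
modulator = 0.5 + 0.5 * np.sin(2 * np.pi * 5 * t)
signal = modulator * np.sin(2 * np.pi * 1000 * t)

# CIS-style envelope extraction: full-wave rectify, then smooth with
# a short moving average (a crude low-pass filter).
win = int(0.01 * fs)  # 10 ms window
kernel = np.ones(win) / win
envelope = np.convolve(np.abs(signal), kernel, mode="same")

# The smoothed envelope should track the modulator closely; this is
# the information CIS encodes in the per-electrode pulse amplitudes.
corr = np.corrcoef(envelope, modulator)[0, 1]
print(corr)
```

Temporal fine structure, the other information stream the abstract mentions, is precisely what this envelope step discards, which is why MED-EL's apical-channel fine-structure coding was a separate development.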
- Evolutionary Computation, Dec 2022
Evolution-in-Materio is a computational paradigm in which an algorithm reconfigures a material's properties to achieve a specific computational function. This article addresses the question of how successful, well-performing Evolution-in-Materio processors can be designed through the selection of nanomaterials and an evolutionary algorithm for a target application. A physical model of a nanomaterial network is developed which allows for both the randomness and the possibility of Ohmic and non-Ohmic conduction that are characteristic of such materials. These differing networks are then exploited by differential evolution, which optimises several configuration parameters (e.g., configuration voltages, weights, etc.) to solve different classification problems. We show that the ideal nanomaterial choice depends upon problem complexity, with more complex problems being favoured by a complex voltage dependence of conductivity and vice versa. Furthermore, we highlight how intrinsic nanomaterial electrical properties can be exploited by differing configuration parameters, clarifying the role and limitations of these techniques. These findings provide guidance for the rational design of nanomaterials and algorithms for future Evolution-in-Materio processors.
Topics: Algorithms
PubMed: 35289840
DOI: 10.1162/evco_a_00309
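Differential evolution, the optimizer used in this article, has a compact canonical form (DE/rand/1/bin): mutate by adding a scaled difference of two population members to a third, crossover gene-wise, keep the trial if it is no worse. A minimal sketch on a stand-in cost function; the "configuration voltages" framing and the toy quadratic target are illustrative assumptions, not the article's actual material model:

```python
import numpy as np

def differential_evolution(fitness, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimal DE/rand/1/bin minimizer."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = len(lo)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([fitness(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: base vector plus a scaled difference vector.
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover, guaranteeing at least one mutant gene.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection.
            f_trial = fitness(trial)
            if f_trial <= fit[i]:
                pop[i], fit[i] = trial, f_trial
    best = np.argmin(fit)
    return pop[best], fit[best]

# Toy stand-in for tuning a material: find configuration voltages
# minimizing the distance to a target operating point.
target = np.array([1.2, -0.7, 0.3])
cost = lambda v: float(np.sum((v - target) ** 2))
lo, hi = np.full(3, -2.0), np.full(3, 2.0)
v_best, f_best = differential_evolution(cost, (lo, hi))
print(f_best)  # converges close to 0
```

In the Evolution-in-Materio setting the cost function is not analytic: each evaluation means applying the candidate voltages to the nanomaterial (or its physical model) and measuring the classification error.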
- Acta Otorrinolaringologica Espanola, 2022
BACKGROUND AND OBJECTIVE
Osseointegrated auditory devices are hearing devices that use bone conduction of sound to produce hearing improvement. The mechanisms and factors that contribute to this sound transmission have been widely studied; however, other aspects remain unknown, for instance, the influence of the processor's power output. The aim of this study was to determine whether there is any relationship between the power output generated by the devices and the hearing improvement that they achieve.
MATERIALS AND METHODS
44 patients were implanted with a percutaneous Baha® 5 model. Hearing thresholds in pure-tone audiometry, free-field audiometry, and speech recognition (in quiet and in noise) were measured pre- and postoperatively in each patient. The direct bone conduction thresholds and the power output values from the processors were also obtained.
RESULTS
The pure-tone average threshold in free field was 39.29 dB (SD = 9.15), so the mean gain with the device was 29.18 dB (SD = 10.13). This involved an air-bone gap closure in 63.64% of patients. The pure-tone average threshold in direct bone conduction was 27.6 dB (SD = 10.91), which was 8.4 dB better than the pure-tone average threshold via bone conduction. The mean gain in speech recognition was 39.15% (SD = 23.98) at 40 dB and 36.66% (SD = 26.76) at 60 dB. The mean gain in signal-to-noise ratio was -5.9 dB (SD = 4.32). The mean power output values were 27.95 dB µN (SD = 6.51) at G40 and 26.22 dB µN (SD = 6.49) at G60. When analysing the relationship between bone conduction thresholds and the G40 and G60 values, a correlation was observed from 1000 Hz upward. However, no statistically significant association was found between power output and either functional gain or speech recognition gain.
CONCLUSIONS
The osseointegrated auditory devices generate hearing improvement in tonal thresholds and speech recognition, even in noise. Most patients closed the air-bone gap with the device. There is a direct relationship between the bone conduction threshold and the power output values from the processor, but only in mid and high frequencies. However, the relationship between power output and gain in speech recognition is weaker. Further investigation of contributing factors is necessary.
Topics: Audiometry, Pure-Tone; Auditory Threshold; Hearing; Hearing Aids; Humans; Speech Perception
PubMed: 35397830
DOI: 10.1016/j.otoeng.2021.01.002
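The threshold-versus-output association this study reports comes down to a per-frequency correlation between two patient-level measurements. A sketch of that statistic on fabricated data (the values below are invented for illustration and are not the study's data; the 0.5 slope and noise level are arbitrary assumptions):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient, the statistic behind the
    threshold-vs-output association reported in the abstract."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(np.sum(xc * yc) / np.sqrt(np.sum(xc**2) * np.sum(yc**2)))

# Fabricated illustration: bone-conduction thresholds (dB HL) and
# processor power output (dB µN) for 10 hypothetical patients, with
# output loosely tracking threshold.
rng = np.random.default_rng(1)
bc_thresholds = rng.uniform(10, 45, 10)
output = 0.5 * bc_thresholds + rng.normal(0, 2, 10) + 12

print(pearson_r(bc_thresholds, output))
```

Repeating this per audiometric frequency is what lets the study conclude that the association holds only from 1000 Hz upward.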