The Annals of Thoracic Surgery Aug 2020
Topics: Data Management; Heart Valve Diseases; Humans; Mitral Valve; Surgeons
PubMed: 31982439
DOI: 10.1016/j.athoracsur.2019.12.013 -
The Annals of Thoracic Surgery Sep 2019
Topics: Aortic Dissection; Data Management; Databases, Factual; Dissection; Humans
PubMed: 31026429
DOI: 10.1016/j.athoracsur.2019.03.071 -
The Journal of Urology Jan 2020
Topics: Androgen Antagonists; Data Management; Hormone Replacement Therapy; Humans; Lipids; Male; Prostatic Neoplasms, Castration-Resistant
PubMed: 31609669
DOI: 10.1097/01.JU.0000604072.36697.15 -
Journal of Medical Internet Research Nov 2023
BACKGROUND
In the context of the Medical Informatics Initiative, medical data integration centers (DICs) have implemented complex data flows to transfer routine health care data into research data repositories for secondary use. Data management practices matter throughout these processes, and special attention should be given to provenance aspects. Insufficient provenance knowledge can introduce validity risks and reduce confidence in, and the quality of, the processed data. The need to implement maintainable data management practices is undisputed, but their current status remains largely unclear.
OBJECTIVE
Our study examines the current data management practices throughout the data life cycle within the Medical Informatics in Research and Care in University Medicine (MIRACUM) consortium. We present a framework for assessing the maturity of data management practices and offer recommendations to enable trustworthy dissemination and reuse of routine health care data.
METHODS
In this mixed methods study, we conducted semistructured interviews with stakeholders from 10 DICs between July and September 2021. We used a self-designed questionnaire, tailored to the MIRACUM DICs, to collect qualitative and quantitative data. Our study method is compliant with the Good Reporting of a Mixed Methods Study (GRAMMS) checklist.
RESULTS
Our study provides insights into the data management practices at the MIRACUM DICs. We identify several traceability issues that can be partially explained by a lack of contextual information within nonharmonized workflow steps, unclear responsibilities, missing or incomplete data elements, and incomplete information about the computational environment. Based on the identified shortcomings, we suggest a data management maturity framework to provide more clarity and to help define enhanced data management strategies.
CONCLUSIONS
The data management maturity framework supports the production and dissemination of accurate, provenance-enriched data for secondary use. Our work serves as a catalyst for the derivation of an overarching data management strategy, with data integrity and provenance characteristics as key factors. We envision that this work will lead to the generation of FAIRer, well-maintained health research data of high quality.
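As a minimal sketch of the kind of provenance record that would close the gaps identified in the results (unclear responsibilities, missing data fingerprints, incomplete computational environment information), the class and field names below are hypothetical illustrations, not MIRACUM's actual schema:

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import hashlib
import json
import platform
import sys

@dataclass
class ProvenanceRecord:
    """Minimal provenance entry for one data-transformation step in a DIC data flow."""
    step: str              # workflow step name
    responsible: str       # accountable role (addresses unclear responsibilities)
    input_sha256: str      # fingerprint of the input payload
    output_sha256: str     # fingerprint of the output payload
    environment: dict = field(default_factory=lambda: {
        "python": sys.version.split()[0],
        "platform": platform.platform(),
    })
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def sha256_of(payload: bytes) -> str:
    """Content hash used to make each step's inputs and outputs traceable."""
    return hashlib.sha256(payload).hexdigest()

# Hypothetical example: record a pseudonymization step
raw = b'{"patient_id": "12345", "dx": "I50.9"}'
pseudonymized = b'{"pid": "a1b2c3", "dx": "I50.9"}'
record = ProvenanceRecord(
    step="pseudonymization",
    responsible="DIC data steward",
    input_sha256=sha256_of(raw),
    output_sha256=sha256_of(pseudonymized),
)
print(json.dumps(asdict(record), indent=2))
```

Persisting one such record per workflow step would give downstream users of the repository a verifiable chain from routine care data to the research dataset.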
Topics: Humans; Data Management; Delivery of Health Care; Medical Informatics; Surveys and Questionnaires
PubMed: 37938878
DOI: 10.2196/48809 -
Sensors (Basel, Switzerland) Jun 2021
Review
Pipelines play an important role in the national and international transportation of natural gas, petroleum products, and other energy resources. Pipelines are installed in different environments and consequently face various damage mechanisms, such as environmental electrochemical corrosion, welding defects, and external force damage. Defects like metal loss, pitting, and cracks compromise a pipeline's integrity and cause serious safety issues; they should be detected before failure occurs to ensure safe operation. In recent years, different non-destructive testing (NDT) methods have been developed for in-line pipeline inspection, including magnetic flux leakage (MFL) testing, ultrasonic testing (UT), electromagnetic acoustic technology (EMAT), and eddy current (EC) testing. Single-modality and integrated NDT systems, known as pipeline inspection gauges (PIGs), as well as un-piggable robotic inspection systems, have been developed. Moreover, data management, in conjunction with historical data, is becoming important for condition-based pipeline maintenance. In this study, various inspection methods based on non-destructive testing are investigated. The state of the art of PIGs, un-piggable robots, and instrumental applications is systematically compared. Furthermore, data models and management are utilized for defect quantification, classification, failure prediction, and maintenance. Finally, the challenges, problems, and development trends of pipeline inspection and data management are derived and discussed.
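As a toy illustration of the defect quantification and classification that such data models support, a metal-loss anomaly is commonly graded by its depth relative to the pipe wall thickness. The function name and thresholds below are illustrative assumptions, not taken from any inspection standard:

```python
def classify_metal_loss(depth_mm: float, wall_thickness_mm: float) -> str:
    """Grade a metal-loss defect by depth-to-wall-thickness ratio.

    Thresholds are illustrative only; real assessments follow codified
    criteria and account for defect length, pressure, and material grade.
    """
    ratio = depth_mm / wall_thickness_mm
    if ratio >= 0.8:
        return "critical"      # deep loss: immediate action
    if ratio >= 0.4:
        return "monitor"       # moderate loss: track against historical data
    return "acceptable"        # shallow loss: routine re-inspection

# Hypothetical MFL measurements on a 10 mm wall
for depth in (8.0, 5.0, 2.0):
    print(depth, classify_metal_loss(depth, 10.0))
```

Combining such per-defect grades with historical inspection runs is what enables the condition-based maintenance and failure prediction the review surveys.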
Topics: Acoustics; Data Management; Electromagnetic Phenomena; Transportation
PubMed: 34205033
DOI: 10.3390/s21113862 -
Journal of Biotechnology Nov 2021
Collaborative research is common practice in the modern life sciences. In most projects, several researchers from multiple universities collaborate on a specific topic. Frequently, these research projects produce a wealth of data that requires central and secure storage and that should be easy to share among project participants. Only under the best circumstances does this come with minimal technical overhead for the researchers. Moreover, the need to analyze data in a reproducible way often poses a challenge for researchers without a data science background and can become an overly time-consuming process. Here, we report on the integration of CyVerse Austria (CAT), a new cyberinfrastructure for a local community of life science researchers, and provide two examples of how it can be used to facilitate FAIR data management and reproducible analytics for teaching and research. In particular, we describe in detail how CAT can be used (i) as a teaching platform with a defined software environment and data management/sharing possibilities, and (ii) to build a data analysis pipeline using Docker technology tailored to the needs and interests of the researcher.
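A Docker-based pipeline of the kind described typically wraps each analysis step in a container invocation so the software environment is fixed. The sketch below only assembles a `docker run` command line; the helper name, image, and paths are hypothetical, not CAT's actual tooling:

```python
def docker_run_command(image: str, host_dir: str, container_dir: str,
                       cmd: list[str]) -> list[str]:
    """Build a `docker run` argument list that mounts project data read-write
    and executes one reproducible analysis step inside a pinned image."""
    return [
        "docker", "run", "--rm",                     # remove container afterwards
        "-v", f"{host_dir}:{container_dir}",         # bind-mount the project data
        image,                                       # pinned environment
    ] + cmd

# Hypothetical step: run an R script against shared project data
cmd = docker_run_command("rocker/r-base", "/data/project", "/workspace",
                         ["Rscript", "analysis.R"])
print(" ".join(cmd))
```

Pinning the image tag (rather than `latest`) is what makes such a step reproducible months later, which is the point of running analytics this way on shared infrastructure.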
Topics: Austria; Data Management; Software
PubMed: 34400238
DOI: 10.1016/j.jbiotec.2021.08.004 -
Dermatologic Surgery : Official... Dec 2019
Topics: Data Management; Pharmacovigilance; Sclerosing Solutions; World Health Organization
PubMed: 31663875
DOI: 10.1097/DSS.0000000000002169 -
Annales de Pathologie Apr 2019
Review
Tumor banks are called upon to support the development of clinical and translational research projects in oncology. They contribute strongly to the assessment and subsequent validation of diagnostic, prognostic, and predictive biomarkers. The progressive evolution of these structures has led to the professionalization of their operation and to their recognition as key actors in oncology by stakeholders in both the public and private sectors. Advances in biotechnology and therapeutics are rapidly modifying the impact and functioning of biobanks, which now face various challenges, in particular regarding their sustainability. Among the major issues, the increasingly complex integration of clinical and biological data makes it urgent to optimize the roles of the different biobanks in France so that they remain attractive partners in the face of international competition. The purpose of this review is to briefly describe the current evolution of biobanks, their present and future challenges, and the role of pathologists in these new issues in the field of oncology.
Topics: Data Management; Forecasting; Humans; Neoplasms; Tissue Banks
PubMed: 30819623
DOI: 10.1016/j.annpat.2019.01.017 -
Journal of Visualized Experiments : JoVE Jun 2023
Review
Transmission electron microscopy (TEM) enables users to study materials at their fundamental, atomic scale. Complex experiments routinely generate thousands of images with numerous parameters that require time-consuming and complicated analysis. AXON Synchronicity is a machine-vision synchronization (MVS) software solution designed to address the pain points inherent to TEM studies. Once installed on the microscope, it enables the continuous synchronization of images and metadata generated by the microscope, detector, and in situ systems during an experiment. This connectivity enables machine-vision algorithms that apply a combination of spatial, beam, and digital corrections to center and track a region of interest within the field of view and provide immediate image stabilization. In addition to the substantial improvement in resolution afforded by such stabilization, metadata synchronization enables computational and image analysis algorithms that calculate variables between images. This calculated metadata can be used to analyze trends or identify key areas of interest within a dataset, leading to new insights and the development of more sophisticated machine-vision capabilities in the future. One module that builds on this calculated metadata is dose calibration and management. The dose module provides state-of-the-art calibration, tracking, and management of both the electron fluence (e-/Å²·s) and the cumulative dose (e-/Å²) delivered to specific areas of the sample on a pixel-by-pixel basis, enabling a comprehensive overview of the interaction between the electron beam and the sample. Experiment analysis is streamlined through dedicated analysis software in which datasets consisting of images and corresponding metadata are easily visualized, sorted, filtered, and exported. Combined, these tools facilitate efficient collaboration and experimental analysis, encourage data mining, and enhance the microscopy experience.
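The pixel-by-pixel dose bookkeeping described above reduces to accumulating fluence × exposure time per frame. The sketch below is an illustrative model of that arithmetic, not AXON's actual API; the function name and frame values are hypothetical:

```python
def accumulate_dose(dose_map, fluence_map, exposure_s):
    """Add one frame's contribution, fluence (e-/Å²·s) × exposure time (s),
    to the cumulative dose map (e-/Å²), element by element."""
    return [
        [d + f * exposure_s for d, f in zip(dose_row, fluence_row)]
        for dose_row, fluence_row in zip(dose_map, fluence_map)
    ]

# Two frames over a 2x2 field of view (values illustrative)
dose = [[0.0, 0.0], [0.0, 0.0]]
frames = [
    ([[100.0, 100.0], [100.0, 100.0]], 0.5),  # fluence map, exposure time
    ([[200.0, 200.0], [200.0, 200.0]], 0.1),
]
for fluence, t in frames:
    dose = accumulate_dose(dose, fluence, t)
print(dose[0][0])  # 100*0.5 + 200*0.1 = 70.0 e-/Å²
```

Tracking the running sum per pixel, rather than per image, is what lets beam-sensitive regions be flagged before their damage threshold is reached.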
Topics: Data Management; Workflow; Software; Microscopy, Electron, Transmission; Algorithms
PubMed: 37427942
DOI: 10.3791/65446 -
Drug Discovery Today Apr 2023
Review
The FAIR (findable, accessible, interoperable and reusable) principles are data management and stewardship guidelines aimed at increasing the effective use of scientific research data. Adherence to these principles in managing data assets in pharmaceutical research and development (R&D) offers pharmaceutical companies the potential to maximise the value of such assets, but the endeavour is costly and challenging. We describe the 'FAIR-Decide' framework, which aims to guide decision-making on the retrospective FAIRification of existing datasets by using business analysis techniques to estimate costs and expected benefits. This framework supports decision-making on FAIRification in the pharmaceutical R&D industry and can be integrated into a company's data management strategy.
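The cost-versus-expected-benefit estimation at the heart of such a decision framework can be caricatured as a discounted net-value calculation. The function and all figures below are hypothetical assumptions for illustration, not part of the published FAIR-Decide framework:

```python
def fairification_net_value(yearly_benefits, upfront_cost, discount_rate=0.10):
    """Net present value of a retrospective FAIRification effort:
    discounted yearly reuse benefits minus the one-off FAIRification cost."""
    npv = -upfront_cost
    for year, benefit in enumerate(yearly_benefits, start=1):
        npv += benefit / (1 + discount_rate) ** year
    return npv

# Hypothetical dataset: 100 (k€) to FAIRify, 50 (k€)/year reuse benefit for 3 years
print(round(fairification_net_value([50, 50, 50], 100), 2))  # 24.34
```

A positive value under the assumed discount rate would argue for FAIRifying that dataset; a negative one suggests the effort is better spent elsewhere, which is exactly the prioritization decision the framework supports.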
Topics: Retrospective Studies; Research; Drug Industry; Data Management; Pharmaceutical Preparations
PubMed: 36716952
DOI: 10.1016/j.drudis.2023.103510