Experimental Neurology Aug 2024
Effective data management and sharing have become increasingly crucial in biomedical research; however, many laboratory researchers lack the necessary tools and knowledge to address this challenge. This article, produced by practicing scientists, provides an introductory guide to research data management (RDM) and the importance of FAIR (Findable, Accessible, Interoperable, and Reusable) data-sharing principles for laboratory researchers. We explore the advantages of implementing organized data management strategies and introduce key concepts such as data standards, data documentation, and the distinction between machine- and human-readable data formats. Furthermore, we offer practical guidance for creating a data management plan and establishing efficient data workflows within the laboratory setting, suitable for labs of all sizes. This includes an examination of requirements analysis, the development of a data dictionary for routine data elements, the implementation of unique subject identifiers, and the formulation of standard operating procedures (SOPs) for seamless data flow. To aid researchers in implementing these practices, we present a simple organizational system as an illustrative example, which can be tailored to suit individual needs and research requirements. By presenting a user-friendly approach, this guide serves as an introduction to the field of RDM and offers practical tips to help researchers effortlessly meet the common data management and sharing mandates rapidly becoming prevalent in biomedical research.
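Two of the practices this abstract names, a data dictionary for routine data elements and unique subject identifiers, can be made concrete with a short sketch. All field names and the ID scheme below are hypothetical examples, not taken from the article.

```python
# Illustrative sketch: a data dictionary documenting routine data elements,
# a unique subject-ID scheme, and validation of a data row against the
# dictionary. Field names and ID format are invented for illustration.

# One entry per routine data element: type, description, units, allowed values.
data_dictionary = {
    "subject_id": {"type": "string", "description": "Unique subject identifier", "units": None},
    "body_weight": {"type": "float", "description": "Body weight at collection", "units": "g"},
    "treatment_group": {"type": "string", "description": "Assigned group",
                        "allowed": ["control", "treated"]},
}

def make_subject_id(project: str, cohort: int, serial: int) -> str:
    """Build an opaque but unique subject ID, e.g. 'NEURO-C02-0015'."""
    return f"{project}-C{cohort:02d}-{serial:04d}"

def validate_row(row: dict) -> list:
    """Check a data row against the dictionary; return a list of problems."""
    problems = []
    for field, spec in data_dictionary.items():
        if field not in row:
            problems.append(f"missing field: {field}")
        elif "allowed" in spec and row[field] not in spec["allowed"]:
            problems.append(f"{field}: {row[field]!r} not in {spec['allowed']}")
    return problems

row = {"subject_id": make_subject_id("NEURO", 2, 15),
       "body_weight": 21.4, "treatment_group": "treated"}
print(make_subject_id("NEURO", 2, 15))  # NEURO-C02-0015
print(validate_row(row))                # []
```

Keeping the dictionary next to the data means every spreadsheet column has a documented meaning, which is most of what makes a dataset reusable by someone outside the lab.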
Topics: Humans; Biomedical Research; Data Management; Information Dissemination; Research Personnel
PubMed: 38762093
DOI: 10.1016/j.expneurol.2024.114815
Frontiers in Plant Science 2023
Review
The importance of improving the FAIRness (findability, accessibility, interoperability, reusability) of research data is undeniable, especially in the face of the large, complex datasets currently being produced by omics technologies. Facilitating the integration of a dataset with other types of data increases the likelihood of reuse and the potential for answering novel research questions. Ontologies are a useful tool for semantically tagging datasets, as adding relevant metadata increases the understanding of how the data were produced and increases their interoperability. Ontologies provide concepts for a particular domain as well as the relationships between concepts. By tagging data with ontology terms, data become both human- and machine-interpretable, allowing for increased reuse and interoperability. However, the task of identifying ontologies relevant to a particular research domain or technology is challenging, especially within the diverse realm of fundamental plant research. In this review, we outline the ontologies most relevant to the fundamental plant sciences and how they can be used to annotate data related to plant-specific experiments within metadata frameworks, such as Investigation-Study-Assay (ISA). We also outline the repositories and platforms most useful for identifying applicable ontologies or finding ontology terms.
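The dual human/machine readability that ontology tagging provides can be sketched as metadata that pairs a free-text label with a resolvable term ID. The term IDs below are illustrative placeholders in the usual CURIE style, not verified entries from any particular ontology.

```python
# Minimal sketch of semantic tagging: each metadata field carries both a
# human-readable label and a machine-resolvable ontology term ID.
# Term IDs here are illustrative placeholders, not verified ontology entries.

sample_metadata = {
    "organism": {"label": "Arabidopsis thaliana", "term": "NCBITaxon:3702"},
    "tissue":   {"label": "leaf",                 "term": "PO:0025034"},
    "assay":    {"label": "RNA-seq",              "term": "OBI:0001271"},
}

def machine_readable_tags(metadata: dict) -> list:
    """Collect the term IDs — the part software can resolve and reason over."""
    return [entry["term"] for entry in metadata.values()]

def human_readable_tags(metadata: dict) -> list:
    """Collect the labels — the part a reader understands at a glance."""
    return [entry["label"] for entry in metadata.values()]

print(machine_readable_tags(sample_metadata))
print(human_readable_tags(sample_metadata))
```

Frameworks such as ISA structure whole investigations this way: every annotation keeps its label alongside the ontology source and accession, so tools can integrate datasets without guessing what "leaf" meant.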
PubMed: 38098789
DOI: 10.3389/fpls.2023.1279694
Plastic and Reconstructive Surgery Oct 2023
Blockchain technology has attracted substantial interest in recent years, most notably for its effect on global economics through the advent of cryptocurrency. Within the health care domain, blockchain technology has been actively explored as a tool for improving personal health data management, medical device security, and clinical trial management. Despite a strong demand for innovation and cutting-edge technology in plastic surgery, integration of blockchain technologies within plastic surgery is in its infancy. Recent advances and mainstream adoption of blockchain are gaining momentum and have shown significant promise for improving patient care and information management. In this article, the authors explain what defines a blockchain and discuss its history and potential applications in plastic surgery. Existing evidence suggests that blockchain can enable patient-centered data management, improve privacy, and provide additional safeguards against human error. Integration of blockchain technology into clinical practice requires further research and development to demonstrate its safety and efficacy for patients and providers.
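The core property the article attributes to blockchain, tamper evidence for patient records, comes from each block storing the hash of its predecessor. The following is a teaching sketch of that idea only, not a production or healthcare-grade implementation; the record strings are invented.

```python
# Toy hash chain: each block stores the SHA-256 hash of the previous block,
# so altering any past record invalidates every later block.

import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain: list, record: str) -> None:
    """Append a block linked to the current chain tip."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "record": record, "prev_hash": prev})

def chain_is_valid(chain: list) -> bool:
    """Verify every block's stored hash matches its predecessor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, "consent form signed")
add_block(chain, "pre-op photos uploaded")
print(chain_is_valid(chain))       # True
chain[0]["record"] = "tampered"    # any edit to history...
print(chain_is_valid(chain))       # ...breaks the chain: False
```

Real blockchains add distributed consensus and cryptographic signatures on top of this linkage; the hash chain alone is what makes silent after-the-fact edits detectable, the safeguard against human error the abstract mentions.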
Topics: Humans; Blockchain; Delivery of Health Care; Privacy; Data Management; Computer Security
PubMed: 36917745
DOI: 10.1097/PRS.0000000000010409
Molecular Ecology Resources Oct 2023
Advances in sequencing technologies and declining costs are increasing the accessibility of large-scale biodiversity genomic datasets. To maximize the impact of these data, a careful, considered approach to data management is essential. However, challenges associated with the management of such datasets remain, exacerbated by uncertainty among the research community as to what constitutes best practice. As an interdisciplinary team with diverse data management experience, we recognize the growing need for guidance on comprehensive data management practices that minimize the risks of data loss, maximize efficiency for stand-alone projects, enhance opportunities for data reuse, facilitate Indigenous data sovereignty and uphold the FAIR and CARE Guiding Principles. Here, we describe four fictional personas reflecting differing user experiences with data management to identify data management challenges across the biodiversity genomics research ecosystem. We then use these personas to demonstrate realistic considerations, compromises and actions for biodiversity genomic data management. We also launch the Biodiversity Genomics Data Management Hub (https://genomicsaotearoa.github.io/data-management-resources/), containing tips, tricks and resources to support biodiversity genomics researchers, especially those new to data management, in their journey towards best practice. The Hub also provides an opportunity for those biodiversity researchers whose expertise lies beyond genomics and who are keen to advance their data management journey. We aim to support the biodiversity genomics community in embedding data management throughout the research lifecycle to maximize research impact and outcomes.
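One everyday practice behind "minimizing the risks of data loss" for large genomic files is a checksum manifest, which lets you verify later that files survived transfer or archiving intact. The sketch below is a generic illustration; the file names and layout are hypothetical, not taken from the Hub.

```python
# Checksum manifest sketch: record SHA-256 digests of data files, then
# verify them after a transfer or restore. Streams files in chunks so
# multi-gigabyte sequencing files do not need to fit in memory.

import hashlib
from pathlib import Path

def sha256sum(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file incrementally, 1 MiB at a time."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def write_manifest(files: list, manifest: Path) -> None:
    """Write 'digest  filename' lines, one per data file."""
    lines = [f"{sha256sum(p)}  {p.name}" for p in files]
    manifest.write_text("\n".join(lines) + "\n")

def verify_manifest(manifest: Path, directory: Path) -> bool:
    """Re-hash each listed file and compare against the recorded digest."""
    for line in manifest.read_text().splitlines():
        digest, name = line.split("  ", 1)
        if sha256sum(directory / name) != digest:
            return False
    return True

# Usage (paths are illustrative):
# write_manifest(list(Path("raw_reads").glob("*.fastq.gz")), Path("manifest.sha256"))
# assert verify_manifest(Path("manifest.sha256"), Path("raw_reads"))
```

Storing the manifest alongside the data (and with any off-site copy) means corruption is caught at verification time rather than discovered years later during reuse.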
PubMed: 37873890
DOI: 10.1111/1755-0998.13880
BMC Health Services Research Nov 2023
PURPOSE
This study aims to develop a scale that measures individuals' perceptions of privacy, security, use, sharing, benefit and satisfaction in the digital health environment.
METHOD
During the scale development process, the following stages were carried out: literature review, item generation, expert review, a pilot study, construct and criterion validity testing, and reliability analyses. The literature was searched to generate the question items, which were then revised according to feedback from expert reviewers. In line with the study's purpose and objectives, the target group consisted of community-dwelling individuals aged 18 and above, selected by convenience sampling. Data were collected through an online survey administered via Google Forms; before starting the survey, participants were briefed on the content of the research. A pilot study was conducted with 30 participants, and based on their feedback some question items were eliminated, making the scale ready for application. The main study was then conducted with 812 participants from the community. A sociodemographic information form, the scale developed by the researchers, Norman and Skinner's eHealth Literacy Scale, and the Mobile Health and Personal Health Record Management Scale were used as data collection tools.
RESULTS
Content validity was established by obtaining expert opinions and conducting a pilot study. Exploratory factor analysis and confirmatory factor analysis were performed to ensure construct validity. The total variance explained by the scale was 60.43%. The results of confirmatory factor analysis indicated that the 20-item, 5-factor structure exhibited good fit values. According to the analysis of criterion validity, there are significant positive correlations among the Data Management in the Digital Health Environment Scale, Norman and Skinner's eHealth Literacy Scale and the Mobile Health and Personal Health Record Management Scale (p < 0.01; r = .669, .378). The Cronbach's alpha value of the scale is .856, and the test-retest reliability coefficient is .909.
CONCLUSION
The Data Management in the Digital Health Environment Scale is a valid and reliable measurement tool that measures individuals' perceptions of privacy, security, use, sharing, benefit and satisfaction in the digital health environment.
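The internal-consistency statistic reported in the results above, Cronbach's alpha, is computable directly from a respondents-by-items score matrix. The sketch below uses made-up Likert responses, not the study's data.

```python
# Cronbach's alpha from first principles: alpha = k/(k-1) * (1 - sum of
# item variances / variance of total scores). Responses are invented.

def cronbach_alpha(scores: list) -> float:
    """scores: one inner list per respondent, one column per item."""
    k = len(scores[0])                       # number of items
    def var(xs):                             # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

responses = [  # 4 respondents x 3 items, 5-point Likert scale
    [4, 5, 4],
    [3, 4, 3],
    [5, 5, 4],
    [2, 3, 2],
]
print(round(cronbach_alpha(responses), 3))  # → 0.975
```

Values above roughly .70 are conventionally read as acceptable internal consistency, which is why the study's alpha of .856 supports calling the scale reliable.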
Topics: Humans; Data Management; Reproducibility of Results; Pilot Projects; Surveys and Questionnaires; Personal Satisfaction; Psychometrics
PubMed: 37964225
DOI: 10.1186/s12913-023-10205-3
Methods in Molecular Biology (Clifton,... 2024
Genetic design automation (GDA) is the use of computer-aided design (CAD) in designing genetic networks. GDA tools are necessary to create more complex synthetic genetic networks in a high-throughput fashion. At the core of these tools is the abstraction of a hierarchy of standardized components. The components' input, output, and interactions must be captured and parametrized from relevant experimental data. Simulations of genetic networks should use those parameters and include the experimental context to be compared with the experimental results. This chapter introduces Logical Operators for Integrated Cell Algorithms (LOICA), a Python package used for designing, modeling, and characterizing genetic networks using a simple object-oriented design abstraction. LOICA represents different biological and experimental components as classes that interact to generate models. These models can be parametrized by direct connection to the Flapjack experimental data management platform to characterize abstracted components with experimental data. The models can be simulated using stochastic simulation algorithms or ordinary differential equations with varying noise levels. The simulated data can be managed and published using Flapjack alongside experimental data for comparison. LOICA genetic network designs can be represented as graphs and plotted as networks for visual inspection and serialized as Python objects or in the Synthetic Biology Open Language (SBOL) format for sharing and use in other designs.
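The style of abstraction the chapter describes, biological components as Python classes that compose into a simulatable model, can be sketched generically. The class names, parameters, and values below are invented for illustration and are not the actual LOICA API.

```python
# Hedged sketch of an object-oriented genetic-network abstraction:
# a gene product and a constitutive promoter as classes, composed and
# simulated with a forward-Euler ODE step. Names/values are invented,
# NOT the LOICA API.

class GeneProduct:
    def __init__(self, name: str, degradation_rate: float):
        self.name = name
        self.degradation_rate = degradation_rate
        self.concentration = 0.0

class ConstitutivePromoter:
    """Produces its output gene product at a fixed rate."""
    def __init__(self, output: GeneProduct, rate: float):
        self.output, self.rate = output, rate

    def step(self, dt: float) -> None:
        p = self.output
        # dP/dt = rate - degradation * P   (forward Euler integration)
        p.concentration += (self.rate - p.degradation_rate * p.concentration) * dt

gfp = GeneProduct("GFP", degradation_rate=0.1)
promoter = ConstitutivePromoter(gfp, rate=2.0)
for _ in range(10_000):          # simulate 100 time units to steady state
    promoter.step(dt=0.01)
print(round(gfp.concentration, 2))  # approaches rate/degradation = 20.0
```

The payoff of the class abstraction is that a characterization step (as with Flapjack in LOICA) only has to set `rate` and `degradation_rate` from experimental data; the same network object then serves both design and simulation.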
Topics: Software; Programming Languages; Gene Regulatory Networks; Algorithms; Synthetic Biology; Automation
PubMed: 38468100
DOI: 10.1007/978-1-0716-3658-9_22
Scientific Reports Apr 2024
With the acceleration of China's economic integration, enterprises have gained greater advantages in fierce market competition and have gradually trended toward grouping and large scale. However, as a company grows, establishing branches also causes many problems; for example, business performance may show false growth in pursuit of greater profits, creating financial and operational risks. This paper analyzes the current situation and needs of enterprise financial control from both theoretical and practical perspectives, combined with specific engineering projects and taking ZH Group as a case study based on the actual situation of the enterprise. The article first introduces the basic situation of the enterprise. A financial control strategy is then designed, with separate modules implementing financial control. Next, a reverse neural network is used to evaluate the effectiveness of financial management and risk warning, and a particle swarm optimization algorithm seeks the optimal solution and applies it to financial management and risk warning, in order to improve introspection and risk management in decision-making. Finally, the value of computer intelligence algorithms in financial big data management is evaluated by constructing a financial risk indicator system. The analysis of enterprise financial management showed that ZH Group's total asset turnover rate decreased by 0.39 times over 5 years; after 5 years of business adjustment, the company's overall operational and comprehensive business capabilities still needed improvement. Therefore, applying intelligent algorithms for financial control is highly necessary.
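Particle swarm optimization, the optimizer the paper relies on, is a population method in which candidate solutions move under the pull of their own best position and the swarm's best. The sketch below minimizes a simple test function; the objective and all parameter values are illustrative, not the paper's financial model.

```python
# Minimal particle swarm optimization (PSO) sketch. Each particle's
# velocity blends inertia, attraction to its personal best, and
# attraction to the global best. Objective and parameters are illustrative.

import random

def pso(objective, dim=2, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-10.0, 10.0)):
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                     # personal best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]    # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Illustrative objective: a paraboloid with its minimum at (3, -2).
best, best_val = pso(lambda p: (p[0] - 3) ** 2 + (p[1] + 2) ** 2)
print([round(x, 2) for x in best], round(best_val, 4))
```

In a financial-risk setting, the objective would instead score a candidate set of control parameters against the risk indicator system, and PSO would search that space without needing gradients.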
PubMed: 38658586
DOI: 10.1038/s41598-024-59244-8
Healthcare (Basel, Switzerland) Oct 2023
Review
Generative artificial intelligence (AI) and large language models (LLMs), exemplified by ChatGPT, are promising for revolutionizing data and information management in healthcare and medicine. However, there is scant literature guiding their integration for non-AI professionals. This study conducts a scoping literature review to address the critical need for guidance on integrating generative AI and LLMs into healthcare and medical practices. It elucidates the distinct mechanisms underpinning these technologies, such as Reinforcement Learning from Human Feedback (RLHF) and prompting techniques such as few-shot learning and chain-of-thought reasoning, which differentiate them from traditional, rule-based AI systems. Realizing these benefits requires an inclusive, collaborative co-design process that engages all pertinent stakeholders, including clinicians and consumers. Although global research is examining both opportunities and challenges, including ethical and legal dimensions, LLMs offer promising advancements in healthcare by enhancing data management, information retrieval, and decision-making processes. Continued innovation in data acquisition, model fine-tuning, prompt strategy development, evaluation, and system implementation is imperative for realizing the full potential of these technologies. Organizations should proactively engage with these technologies to improve healthcare quality, safety, and efficiency, adhering to ethical and legal guidelines for responsible application.
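Few-shot prompting, one of the mechanisms the review names, amounts to placing worked input/output examples ahead of the new input. The sketch below builds such a prompt as plain text; the clinical examples are invented and no specific LLM API is assumed.

```python
# Few-shot prompt construction: an instruction, worked examples, then the
# new input. Clinical wording is invented for illustration; the resulting
# string would be sent to whatever LLM API is in use.

def few_shot_prompt(task: str, examples: list, query: str) -> str:
    """Assemble a few-shot prompt from (input, output) example pairs."""
    parts = [task, ""]
    for inp, out in examples:
        parts += [f"Input: {inp}", f"Output: {out}", ""]
    parts += [f"Input: {query}", "Output:"]
    return "\n".join(parts)

prompt = few_shot_prompt(
    task="Classify the clinical note as 'urgent' or 'routine'.",
    examples=[
        ("Patient reports crushing chest pain radiating to left arm.", "urgent"),
        ("Annual wellness visit, no complaints.", "routine"),
    ],
    query="Sudden-onset shortness of breath at rest.",
)
print(prompt)
```

Chain-of-thought prompting follows the same pattern, with each example's output showing intermediate reasoning before the answer; in both cases the model is steered by examples rather than retrained, which is what separates prompting from fine-tuning.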
PubMed: 37893850
DOI: 10.3390/healthcare11202776
Environmental Monitoring and Assessment Sep 2023
Data resulting from environmental monitoring programs are valuable assets for natural resource managers, decision-makers, and researchers. These data are often collected to inform specific reporting needs or decisions within a specific timeframe. While program-oriented data and related publications are effective for meeting program goals, sharing well-documented data and metadata allows users to research aspects outside initial program intentions. As part of an effort to integrate data from four long-term, large-scale US aquatic monitoring programs, we evaluated the original datasets against the FAIR (Findable, Accessible, Interoperable, Reusable) data principles and offer recommendations and lessons learned. Differences in data governance across these programs meant that considerable effort was required to access and reuse the original datasets. Requirements, guidance, and resources available to support data publishing and documentation are inconsistent across agencies and monitoring programs, resulting in various data formats and storage locations that are not easily found, accessed, or reused. Making monitoring data FAIR will reduce barriers to data discovery and reuse. Programs are continuously striving to improve data management, data products, and metadata; however, related tools, consistent guidelines and standards, and more resources to do this work are needed. Given the value of these data and the significant effort required to access and reuse them, we describe actions and steps intended to improve data documentation and accessibility.
Topics: Environmental Monitoring; Natural Resources
PubMed: 37665400
DOI: 10.1007/s10661-023-11788-4
Revista Espanola de Cardiologia... Jan 2024
Review
Telemedicine enables the remote provision of medical care through information and communication technologies, facilitating data transmission, patient participation, promotion of heart-healthy habits, diagnosis, early detection of acute decompensation, and monitoring and follow-up of cardiovascular diseases. Wearable devices have multiple clinical applications, ranging from arrhythmia detection to remote monitoring of chronic diseases and risk factors. Integrating these technologies safely and effectively into routine clinical practice will require a multidisciplinary approach. Technological advances and data management will increase telemonitoring strategies, which will allow greater accessibility and equity, as well as more efficient and accurate patient care. However, there are still unresolved issues, such as identifying the most appropriate technological infrastructure, integrating these data into medical records, and addressing the digital divide, which can hamper patients' adoption of remote care. This article provides an updated overview of digital tools for a more comprehensive approach to atrial fibrillation, heart failure, risk factors, and treatment adherence.
Topics: Humans; Cardiovascular Diseases; Heart Failure; Telemedicine; Chronic Disease; Early Diagnosis
PubMed: 37838182
DOI: 10.1016/j.rec.2023.07.009