Nature Medicine, Jan 2019
Review
Big data has become the ubiquitous watchword of medical innovation. The rapid development of machine-learning techniques and artificial intelligence in particular has promised to revolutionize medical practice, from the allocation of resources to the diagnosis of complex diseases. But with big data come big risks and challenges, among them significant questions about patient privacy. Here, we outline the legal and ethical challenges big data brings to patient privacy. We discuss, among other topics, how best to conceive of health privacy; the importance of equity, consent, and patient governance in data collection; discrimination in data uses; and how to handle data breaches. We close by sketching possible ways forward for the regulatory system.
Topics: Big Data; Delivery of Health Care; Health Insurance Portability and Accountability Act; Humans; Privacy; United States
PubMed: 30617331
DOI: 10.1038/s41591-018-0272-7
Frontiers in Surgery, 2022
Review
The legal and ethical issues that Artificial Intelligence (AI) raises for society include privacy and surveillance, bias and discrimination, and, perhaps the deepest philosophical challenge, the role of human judgment. Concerns have arisen that newer digital technologies may become a new source of inaccuracy and data breaches. In healthcare, mistakes in procedure or protocol can have devastating consequences for the patient who is the victim of the error, and it is crucial to remember that patients come into contact with physicians at moments in their lives when they are most vulnerable. Currently, there are no well-defined regulations in place to address the legal and ethical issues that may arise from the use of artificial intelligence in healthcare settings. This review addresses these pertinent issues, highlighting the need for algorithmic transparency, privacy and protection for all the beneficiaries involved, and cybersecurity against associated vulnerabilities.
PubMed: 35360424
DOI: 10.3389/fsurg.2022.862322
Journal of Medical Internet Research, Aug 2023
ChatGPT has promising applications in health care, but potential ethical issues need to be addressed proactively to prevent harm. ChatGPT presents potential ethical challenges from legal, humanistic, algorithmic, and informational perspectives. Legal ethics concerns arise from the unclear allocation of responsibility when patient harm occurs and from potential breaches of patient privacy due to data collection. Clear rules and legal boundaries are needed to properly allocate liability and protect users. Humanistic ethics concerns arise from the potential disruption of the physician-patient relationship, humanistic care, and issues of integrity. Overreliance on artificial intelligence (AI) can undermine compassion and erode trust. Transparency and disclosure of AI-generated content are critical to maintaining integrity. Algorithmic ethics raises concerns about algorithmic bias, responsibility, transparency and explainability, as well as validation and evaluation. Information ethics concerns include data bias, validity, and effectiveness. Biased training data can lead to biased output, and overreliance on ChatGPT can reduce patient adherence and encourage self-diagnosis. Ensuring the accuracy, reliability, and validity of ChatGPT-generated content requires rigorous validation and ongoing updates based on clinical practice. To navigate the evolving ethical landscape of AI, AI in health care must adhere to the strictest ethical standards. Through comprehensive ethical guidelines, health care professionals can ensure the responsible use of ChatGPT, promote accurate and reliable information exchange, protect patient privacy, and empower patients to make informed decisions about their health care.
Topics: Humans; Artificial Intelligence; Reproducibility of Results; Data Collection; Disclosure; Patient Compliance
PubMed: 37566454
DOI: 10.2196/48009
Journal of Advanced Nursing, Nov 2010
AIM
This paper is a report of a study of the type, frequency, and level of stress of ethical issues encountered by nurses in their everyday practice.
BACKGROUND
Everyday ethical issues in nursing practice attract little attention but can create stress for nurses. Nurses often feel uncomfortable in addressing the ethical issues they encounter in patient care.
METHODS
A self-administered survey was sent in 2004 to 1000 nurses in four states in four different census regions of the United States of America. The adjusted response rate was 52%. Data were analysed using descriptive statistics, cross-tabulations and Pearson correlations.
RESULTS
A total of 422 questionnaires were used in the analysis. The five most frequently occurring and most stressful ethical and patient care issues were protecting patients' rights; autonomy and informed consent to treatment; staffing patterns; advanced care planning; and surrogate decision-making. Other common occurrences were unethical practices of healthcare professionals; breaches of patient confidentiality or right to privacy; and end-of-life decision-making. Younger nurses and those with fewer years of experience encountered ethical issues more frequently and reported higher levels of stress. Nurses from different regions also experienced specific types of ethical problems more commonly.
CONCLUSION
Nurses face daily ethical challenges in the provision of quality care. To retain nurses, targeted ethics-related interventions that address caring for an increasingly complex patient population are needed.
Topics: Advance Directives; Age Factors; Bioethical Issues; Burnout, Professional; Clinical Competence; Conflict of Interest; Cross-Sectional Studies; Decision Making; Ethics, Nursing; Female; Humans; Job Satisfaction; Male; Middle Aged; Nursing Research; Nursing Staff; Patient Rights; Personnel Staffing and Scheduling; Personnel Turnover; Surveys and Questionnaires; Terminal Care; United States
PubMed: 20735502
DOI: 10.1111/j.1365-2648.2010.05425.x
BMC Medical Ethics, Feb 2020
BACKGROUND
Sharing de-identified individual-level health research data is widely promoted and has many potential benefits. However, there are also some potential harms, such as misuse of data and breach of participant confidentiality. One way to promote the benefits of sharing while ameliorating its potential harms is the adoption of a managed access approach, where data requests are channeled through a Data Access Committee (DAC) rather than the data being made openly available without restrictions. A DAC, whether a formal or informal group of individuals, has the responsibility of reviewing and assessing data access requests. Many individual groups, consortium, institutional, and independent DACs have been established, but there is currently no widely accepted framework for their organization and function.
MAIN TEXT
We propose that DACs should have the dual role of promoting data sharing and protecting data subjects, their communities, data producers, their institutions, and the scientific enterprise. We suggest that data access should be granted by DACs as long as the data reuse has potential social value and there is low risk of foreseeable harms. To promote data sharing and to motivate data producers, DACs should encourage secondary uses that are consistent with the interests of data producers and their own institutions. Given the suggested roles of DACs, there should be transparent, simple, and clear application procedures for data access. The approach to reviewing applications should be proportionate to the potential risks involved. DACs should be established within institutional and legal frameworks with clear lines of accountability, terms of reference, and membership. We suggest that DACs should not be modelled after research ethics committees (RECs), because their functions and goals of review differ from those of RECs. DAC reviews should be guided by the principles of public health ethics instead of research ethics.
CONCLUSIONS
In this paper we have suggested a framework under which DACs should operate, how they should be organised, and how to constitute them.
Topics: Access to Information; Confidentiality; Ethics Committees, Research; Ethics, Research; Humans; Information Dissemination; Social Responsibility
PubMed: 32013947
DOI: 10.1186/s12910-020-0453-z
Journal of Medical Internet Research, Apr 2014
In 2014, the vast majority of published biomedical research is still hidden behind paywalls rather than open access. For more than a decade, similar restrictions over other digitally available content have engendered illegal activity. Music file sharing became rampant in the late 1990s as communities formed around new ways to share. The frequency and scale of cyber-attacks against commercial and government interests have increased dramatically. Massive troves of classified government documents have become public through the actions of a few. Yet we have not seen significant growth in the illegal sharing of peer-reviewed academic articles. Should we truly expect that biomedical publishing is somehow at less risk than other content-generating industries? What of the larger threat: a "Biblioleaks" event, a database breach and public leak of the substantial archives of biomedical literature? As the expectation that all research should be available to everyone becomes the norm for a younger generation of researchers and the broader community, the motivations for such a leak are likely to grow. We explore the feasibility and consequences of a Biblioleaks event for researchers, journals, publishers, and the broader communities of doctors and the patients they serve.
Topics: Access to Information; Biomedical Research; Computer Security; Copyright; Humans; PubMed; Publishing
PubMed: 24755534
DOI: 10.2196/jmir.3331
Journal of Bioethical Inquiry, Jun 2023
Review
The recently passed Privacy Legislation Amendment (Enforcement and Other Measures) Act 2022 (Cth) introduced important changes to the Australian Privacy Act 1988 (Cth) which increase penalties for serious and repeated interferences with privacy and strengthen the investigative and enforcement powers of the Information Commissioner. The amendments were made subsequent to a number of high-profile data breaches and represent the first set of changes to the Privacy Act following the review of the Act commenced by the Attorney-General in October 2020. The submissions made to the review emphasized the need for more effective enforcement mechanisms, both to increase individuals' control over their personal information and as a form of deterrence. This article reviews the recent amendments to the Privacy Act and explains their effect. It comments upon the relevance of the amendments for health and medical data and other data collected in the context of healthcare, and refers to the Attorney-General's Department's review of the Privacy Act regarding other proposals relating to enforcement which have not yet been put into effect in legislation.
Topics: Humans; Privacy; Australia; Personally Identifiable Information; Confidentiality
PubMed: 37432509
DOI: 10.1007/s11673-023-10249-4
Australian Journal of General Practice, Jul 2022
BACKGROUND
In the era of socially distanced clinical and medical research practices, the use of electronic communication has flourished. The Australian Information Commissioner recently ordered a Victorian general practice to pay $16,400 in compensation following a breach of privacy. This is the largest award of compensation made by the Commissioner in the context of a medical or healthcare privacy matter. The practice had inadvertently sent an email containing sensitive information to an incorrect email address. The email included information concerning the human immunodeficiency virus status of the complainants.
OBJECTIVE
The aim of this article is to provide an overview of this important case in Australian information and privacy law, which relates to the operation of an Australian general practice and research activity undertaken within the practice context.
DISCUSSION
In an era marked by a great increase in the use of electronic communication in the medical setting, it is essential that practices both manage electronic communication well and respond appropriately when an error arises.
Topics: Australia; Communication; Electronics; Family Practice; Humans; Privacy
PubMed: 35773159
DOI: 10.31128/AJGP-05-21-6008
Yearbook of Medical Informatics, Aug 2017
Review
To summarize key contributions to current research in the field of Clinical Research Informatics (CRI) and to select the best papers published in 2016. A bibliographic search using a combination of MeSH and free terms on CRI was performed using PubMed, followed by a double-blind review in order to select a list of candidate best papers to be then peer-reviewed by external reviewers. A consensus meeting between the two section editors and the editorial team was organized to finally conclude on the selection of best papers. Among the 452 papers published in 2016 in the various areas of CRI and returned by the query, the full review process selected four best papers. The authors of the first paper utilized a comprehensive representation of the patient medical record and semi-automatically labeled training sets to create phenotype models via a machine learning process. The second selected paper describes an open source tool chain securely connecting ResearchKit compatible applications (Apps) to the widely used clinical research infrastructure Informatics for Integrating Biology and the Bedside (i2b2). The third selected paper describes the FAIR Guiding Principles for scientific data management and stewardship. The fourth selected paper focuses on the evaluation of the risk of privacy breaches in releasing genomics datasets. A major trend in the 2016 publications is the variety of research on "real-world data" (healthcare-generated data, personal health data, and patient-reported outcomes), highlighting the opportunities provided by new machine learning techniques as well as new potential risks of privacy breaches.
Topics: Biomedical Research; Clinical Trials as Topic; Confidentiality; Humans; Medical Informatics; Observational Studies as Topic
PubMed: 29063566
DOI: 10.15265/IY-2017-024