Journal of Law and Medicine, Dec 2023
People with (a history of) hepatitis C have concerns about privacy and the confidentiality of their health information. This is often due to the association between hepatitis C and injecting drug use and related stigma. In Australia, recent data breaches at a major private health insurer and legislative reforms to increase access to electronic health records have heightened these concerns. Drawing from interviews with people with lived experience of hepatitis C and stakeholders working in this area, this article explores the experiences and concerns of people with (a history of) hepatitis C in relation to the sharing of their health records. It considers the potential application of health privacy principles in the context of hepatitis C and argues for the development of guidelines concerning the privacy of health records held by health departments and public hospitals. Such principles might also inform reforms to legislation regarding access to health records.
Topics: Humans; Privacy; Electronic Health Records; Confidentiality; Hepacivirus; Hepatitis C
PubMed: 38459877
DOI: No ID Found

Neural Networks: The Official Journal..., Jun 2024
With the widespread application of deep neural networks (DNNs), the risk of privacy breaches against DNN models is constantly on the rise, resulting in an increasing need for intellectual property (IP) protection for such models. Although neural network watermarking techniques are widely used to safeguard the IP of DNNs, they can only achieve passive protection and cannot actively prevent unauthorized users from illicit use or embezzlement of the trained DNN models. Therefore, the development of proactive protection techniques to prevent IP infringement is imperative. To this end, we propose SecureNet, a key-based access license framework for DNN models. The proposed approach involves injecting license keys into the model through backdoor learning, enabling correct model functionality only when the appropriate license key is included in the input. To ensure the reusability of DNN models, we also propose a license key replacement algorithm. In addition, based on SecureNet, we designed defense mechanisms against both adversarial attacks and backdoor attacks. Furthermore, we introduce a fine-grained authorization method that enables flexible granting of model permissions to different users. We designed four license-key schemes with different privileges, tailored to various scenarios. We evaluated SecureNet on five benchmark datasets including MNIST, Cifar10, Cifar100, FaceScrub, and CelebA, and assessed its performance on six classic DNN models: LeNet-5, VGG16, ResNet18, ResNet101, NFNet-F5, and MobileNetV3. The results demonstrate that our approach outperforms the state-of-the-art model parameter encryption methods by at least 95% in terms of computational efficiency. Additionally, it provides effective defense against adversarial attacks and backdoor attacks without compromising the model's overall performance.
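The key-based access idea described in this abstract can be illustrated with a toy sketch. In the real system the key-conditioned behavior is learned via backdoor training; here a hand-written function stands in for the trained model, and the key pattern, shapes, and stamping location are illustrative assumptions, not SecureNet's actual design:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical license key: a fixed binary pattern stamped into a
# reserved corner of the input (an assumption for illustration).
KEY = rng.integers(0, 2, size=(4, 4))

def embed_key(image, key):
    """Stamp the license key into the top-left corner of the input."""
    out = image.copy()
    out[:4, :4] = key
    return out

def licensed_model(image, true_label, key=KEY):
    """Toy stand-in for a backdoor-trained DNN: it behaves correctly
    only when the expected key pattern is present in the input."""
    if np.array_equal(image[:4, :4], key):
        return true_label          # authorized input: normal prediction
    return (true_label + 1) % 10   # unauthorized input: degraded output

image = rng.random((28, 28))
assert licensed_model(embed_key(image, KEY), true_label=7) == 7  # with key
assert licensed_model(image, true_label=7) != 7                  # without key
```

The same mechanism supports key replacement: retraining (here, redefining) the model against a new key pattern revokes old licenses without redistributing a new architecture.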
Topics: Learning; Neural Networks, Computer; Algorithms; Benchmarking; Intellectual Property
PubMed: 38452664
DOI: 10.1016/j.neunet.2024.106199

BMC Medical Ethics, Feb 2024
Evaluating the understanding of the ethical and moral challenges of Big Data and AI among Jordanian medical students, physicians in training, and senior practitioners: a cross-sectional study.
AIMS
To examine the understanding of the ethical dilemmas associated with Big Data and artificial intelligence (AI) among Jordanian medical students, physicians in training, and senior practitioners.
METHODS
We implemented a literature-validated questionnaire to examine the knowledge, attitudes, and practices of the target population during the period between April and August 2023. Themes of ethical debate included privacy breaches, consent, ownership, augmented biases, epistemology, and accountability. Participants' responses were showcased using descriptive statistics and compared between groups using t-test or ANOVA.
RESULTS
We included 466 participants. The largest group of respondents were interns and residents (50.2%), followed by medical students (38.0%). Most participants were affiliated with university institutions (62.4%). In terms of privacy, participants acknowledged that Big Data and AI were susceptible to privacy breaches (39.3%); however, 59.0% found such breaches justifiable under certain conditions. Regarding informed consent, 41.6% were aware that obtaining informed consent posed an ethical limitation in Big Data and AI applications, and 44.6% rejected the concept of "broad consent". In terms of ownership, 49.6% acknowledged that data cannot be owned, yet 59.0% accepted that institutions could hold quasi-control of such data. Less than 50% of participants were aware of Big Data and AI's ability to augment or create new biases in healthcare. Furthermore, participants agreed that researchers, institutions, and legislative bodies were responsible for ensuring the ethical implementation of Big Data and AI. Finally, while demonstrating limited experience with such technology, participants generally had positive views of the role of Big Data and AI in complementing healthcare.
CONCLUSION
Jordanian medical students, physicians in training, and senior practitioners have limited awareness of the ethical risks associated with Big Data and AI. Institutions are responsible for raising awareness, especially with the upsurge of such technology.
Topics: Humans; Cross-Sectional Studies; Big Data; Artificial Intelligence; Jordan; Students, Medical; Morals; Physicians
PubMed: 38368332
DOI: 10.1186/s12910-024-01008-0

Health Expectations: An International..., Feb 2024
INTRODUCTION
General practice data, particularly when combined with hospital and other health service data through data linkage, are increasingly being used for quality assurance, evaluation, health service planning and research. In this study, we explored community views on sharing general practice data for secondary purposes, including research, to establish what concerns and conditions need to be addressed in the process of developing a social licence to support such use.
METHODS
We used a mixed-methods approach with focus groups (November-December 2021), followed by a cross-sectional survey (March-April 2022).
RESULTS
The participants in this study strongly supported sharing general practice data with the clinicians responsible for their care, and where there were direct benefits for individual patients. Over 90% of survey participants (N = 2604) were willing to share their general practice information to directly support their health care, that is, for the primary purpose of collection. There was less support for sharing data for secondary purposes such as research and health service planning (36% and 45% respectively in broad agreement) or for linking general practice data to data in the education, social services and criminal justice systems (30%-36%). A substantial minority of participants were unsure or could not see how benefits would arise from sharing data for secondary purposes. Participants were concerned about the potential for privacy breaches, discrimination and data misuse and they wanted greater transparency and an opportunity to consent to data release.
CONCLUSION
The findings of this study suggest that the public may be more concerned about sharing general practice data for secondary purposes than they are about sharing data collected in other settings. Sharing general practice data more broadly will require careful attention to patient and public concerns, including focusing on the factors that will sustain trust and legitimacy in general practice and GPs.
PATIENT AND PUBLIC CONTRIBUTION
Members of the public were participants in the study. Data produced from their participation generated study findings.
CLINICAL TRIAL REGISTRATION
Not applicable.
Topics: Humans; Cross-Sectional Studies; Information Dissemination; Focus Groups; Delivery of Health Care; General Practice
PubMed: 38361335
DOI: 10.1111/hex.13984

JMIR MHealth and UHealth, Feb 2024
Review
BACKGROUND
Smart home technology (SHT) can be useful for aging in place or health-related purposes. However, surveillance studies have highlighted ethical issues with SHTs, including user privacy, security, and autonomy.
OBJECTIVE
As digital technology is most often designed for younger adults, this review summarizes perceptions of SHTs among users aged 50 years and older to explore their understanding of privacy, the purpose of data collection, risks and benefits, and safety.
METHODS
Through an integrative review, we explored the perceptions of SHTs among community-dwelling adults aged 50 years and older, guided by research questions under 4 non-mutually exclusive themes: privacy, the purpose of data collection, risks and benefits, and safety. We screened 1860 titles and abstracts from Ovid MEDLINE, Ovid Embase, the Cochrane Database of Systematic Reviews, the Cochrane Central Register of Controlled Trials, Scopus, the Web of Science Core Collection, and IEEE Xplore/IET Electronic Library, resulting in 15 included studies.
RESULTS
The 15 studies explored user perceptions of smart speakers, motion sensors, or home monitoring systems. Of these, 13 (87%) discussed user privacy concerns regarding data collection and access, 4 (27%) explored user knowledge of the purposes of data collection, 7 (47%) featured risk-related concerns such as data breaches and third-party misuse alongside benefits such as convenience, and 9 (60%) reported user enthusiasm about the potential for home safety.
CONCLUSIONS
Due to the growing size of aging populations and advances in technological capabilities, regulators and designers should focus on user concerns by supporting higher levels of agency regarding data collection, use, and disclosure and by bolstering organizational accountability. This way, relevant privacy regulation and SHT design can better support user safety while diminishing potential risks to privacy, security, autonomy, or discriminatory outcomes.
Topics: Aged; Humans; Middle Aged; Independent Living; Perception; Privacy; Technology
PubMed: 38335026
DOI: 10.2196/48526

PLoS One, 2024
OBJECTIVES
Synthetic datasets are artificially manufactured based on real health systems data but do not contain real patient information. We sought to validate the use of synthetic data in stroke and cancer research by conducting a comparison study of cancer patients with ischemic stroke to non-cancer patients with ischemic stroke.
DESIGN
Retrospective cohort study.
SETTING
We used synthetic data generated by MDClone and compared it to its original source data (i.e. real patient data from the Ottawa Hospital Data Warehouse).
OUTCOME MEASURES
We compared key differences in demographics, treatment characteristics, length of stay, and costs between cancer patients with ischemic stroke and non-cancer patients with ischemic stroke. We used a binary, multivariable logistic regression model to identify risk factors for recurrent stroke in the cancer population.
RESULTS
Using synthetic data, we found cancer patients with ischemic stroke had a lower prevalence of hypertension (52.0% in the cancer cohort vs 57.7% in the non-cancer cohort, p<0.0001), and a higher prevalence of chronic obstructive pulmonary disease (COPD: 8.5% vs 4.7%, p<0.0001), prior ischemic stroke (1.7% vs 0.1%, p<0.001), and prior venous thromboembolism (VTE: 8.2% vs 1.5%, p<0.0001). They also had a longer length of stay (8 days [IQR 3-16] vs 6 days [IQR 3-13], p = 0.011) and higher costs associated with their stroke encounters: $11,498 (IQR $4,440-$20,668) in the cancer cohort vs $8,084 (IQR $3,947-$16,706) in the non-cancer cohort (p = 0.0061). A multivariable logistic regression model identified 5 predictors of recurrent ischemic stroke in the cancer cohort using synthetic data; 3 of these were also identified using real patient data, with similar effect measures. Summary statistics between the synthetic and original datasets did not significantly differ, apart from slight differences in the frequency distributions of numeric data.
CONCLUSION
We demonstrated the utility of synthetic data in stroke and cancer research and provided key differences between cancer and non-cancer patients with ischemic stroke. Synthetic data is a powerful tool that can allow researchers to easily explore hypothesis generation, enable data sharing without privacy breaches, and ensure broad access to big data in a rapid, safe, and reliable fashion.
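The core synthetic-data idea, artificial records that preserve population-level statistics without reproducing any real patient, can be sketched minimally. All values below are invented, and resampling from a fitted marginal distribution is only a conceptual stand-in; MDClone's actual generation engine is far more sophisticated:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "real" cohort: length of stay in days (invented, not hospital data).
real_los = rng.lognormal(mean=2.0, sigma=0.6, size=1000)

# Generate synthetic records by sampling from the distribution fitted to
# the real data, so no individual real record appears in the output.
synthetic_los = rng.lognormal(mean=np.mean(np.log(real_los)),
                              sigma=np.std(np.log(real_los)),
                              size=1000)

# Utility check: summary statistics should be close between the datasets,
# mirroring the validation approach described in the abstract.
assert abs(np.median(real_los) - np.median(synthetic_los)) < 2.0
```

An analysis run on `synthetic_los` (for example, fitting a regression) can then be repeated on `real_los` to check that the conclusions agree, as the study did for its stroke predictors.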
Topics: Humans; Retrospective Studies; Big Data; Stroke; Neoplasms; Risk Factors; Ischemic Stroke; Pulmonary Disease, Chronic Obstructive
PubMed: 38324588
DOI: 10.1371/journal.pone.0295921

Science Advances, Feb 2024
Modern machine learning models applied to omic data analysis pose threats of privacy leakage for the patients represented in those datasets. Here, we propose a secure and privacy-preserving machine learning method (PPML-Omics) based on a decentralized, differentially private federated learning algorithm. We applied PPML-Omics to data from three sequencing technologies and addressed the privacy concern in three major omic data tasks under three representative deep learning models. We examined privacy breaches in depth through privacy attack experiments and demonstrated that PPML-Omics can protect patients' privacy. In each of these applications, PPML-Omics outperformed comparison methods under the same level of privacy guarantee, demonstrating its versatility in simultaneously balancing privacy-preserving capability and utility in omic data analysis. Furthermore, we give a theoretical proof of the privacy-preserving capability of PPML-Omics, suggesting it is the first mathematically guaranteed method with robust and generalizable empirical performance for protecting patients' privacy in omic data.
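The generic recipe behind differentially private federated learning, clip each client's update to bound its influence, add calibrated noise, then aggregate, can be sketched as below. The function name, clip norm, and noise scale are illustrative assumptions, not PPML-Omics's actual algorithm or settings:

```python
import numpy as np

rng = np.random.default_rng(42)

def dp_federated_round(client_grads, clip_norm=1.0, noise_scale=0.5):
    """One round of federated averaging with per-client gradient clipping
    and Gaussian noise (the standard DP-FL pattern, sketched here)."""
    noisy = []
    for g in client_grads:
        # Clip to bound each client's contribution (the sensitivity).
        norm = np.linalg.norm(g)
        g = g * min(1.0, clip_norm / norm)
        # Add Gaussian noise scaled to the clipping bound before sharing.
        noisy.append(g + rng.normal(0.0, noise_scale * clip_norm, size=g.shape))
    # The server only ever sees noisy, clipped updates.
    return np.mean(noisy, axis=0)

grads = [rng.normal(size=8) for _ in range(5)]
update = dp_federated_round(grads)
assert update.shape == (8,)
```

Decentralizing this step (so no single server aggregates all updates) is part of what the paper adds on top of the basic recipe.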
Topics: Humans; Privacy; Algorithms; Data Analysis; Machine Learning; Technology
PubMed: 38295178
DOI: 10.1126/sciadv.adh8601

Australasian Psychiatry: Bulletin of..., Apr 2024
Review
OBJECTIVE
To update psychiatrists and trainees on the realised risks of electronic health record data breaches.
METHODS
This is a selective narrative review and commentary regarding electronic health record data breaches.
RESULTS
Recent events such as the Medibank and Australian Clinical Labs data breaches demonstrate the realised risks for electronic health records. If stolen identity data is publicly released, patients and doctors may be subject to blackmail, fraud, identity theft and targeted scams. Medical diagnoses of psychiatric illness and substance use disorder may be released in blackmail attempts.
CONCLUSIONS
Psychiatrists, trainees and their patients need to understand the inevitability of electronic health record data breaches. This understanding should inform a minimised collection of personal information in the health record to avoid exposure of confidential information and identity theft. Governmental regulation of electronic health record privacy and security is needed.
Topics: Humans; Electronic Health Records; Psychiatrists; Australia; Confidentiality; Delivery of Health Care
PubMed: 38285964
DOI: 10.1177/10398562241230816

Risk Analysis: An Official Publication..., Jan 2024
Recent history has shown both the benefits and risks of information sharing among firms. Information is shared to facilitate mutual business objectives, but sharing can also introduce security concerns that expose a firm to privacy breaches, with significant economic, reputational, and safety implications. Organizations should therefore leverage available information to assess security when evaluating current and potential information-sharing partnerships. The "fine print", or privacy policies, of firms can provide a signal of security across the wide variety of firms being considered for new and continued information-sharing partnerships. In this article, we develop a methodology to gauge and benchmark information security policies in the partner-selection process, helping to direct risk-based investments in information-sharing security. The methodology collects and interprets firm privacy policies, evaluates characteristics of those policies by leveraging natural language processing and benchmarking metrics, and examines how those characteristics relate to one another in information-sharing partnership situations. We demonstrate the methodology on 500 high-revenue firms. The methodology and managerial insights will be of interest to risk managers, information security professionals, and individuals forming information-sharing agreements across industries.
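The kind of text-derived benchmarking features the abstract describes can be sketched with simple stand-ins. The metric names and the keyword list below are illustrative assumptions, not the authors' actual feature set:

```python
import re
from statistics import mean

def policy_metrics(text):
    """Compute simple benchmarking metrics over a privacy-policy text
    (crude proxies for NLP-derived policy characteristics)."""
    sentences = [s for s in re.split(r"[.!?]+\s*", text) if s]
    words = re.findall(r"[A-Za-z']+", text)
    return {
        "sentence_count": len(sentences),
        "word_count": len(words),
        # Longer sentences as a rough proxy for dense "fine print".
        "avg_sentence_length": mean(
            len(re.findall(r"[A-Za-z']+", s)) for s in sentences),
        # Rate of consent/sharing terms as a crude security-signal proxy.
        "consent_term_rate": sum(
            w.lower() in {"consent", "share", "disclose"} for w in words
        ) / len(words),
    }

m = policy_metrics(
    "We may share your data with partners. "
    "You can withdraw consent at any time.")
assert m["sentence_count"] == 2
```

Metrics like these, computed across many candidate partners' policies, give the comparable per-firm scores that benchmarking requires.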
PubMed: 38246627
DOI: 10.1111/risa.14267