Yuan et al., 2020 - Google Patents
A training scheme of deep neural networks on encrypted data
- Document ID: 7030555081862595952
- Authors: Yuan L; Shen G
- Publication year: 2020
- Publication venue: Proceedings of the 2020 International Conference on Cyberspace Innovation of Advanced Technologies
Snippet
Machine learning based on deep neural network has shown great potential in many application fields. Compared with traditional machine learning algorithms, training a reliable neural network model requires a huge amount of data. However, the data required to train a …
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
      - G06N99/00—Subject matter not provided for in other groups of this subclass
        - G06N99/005—Learning machines, i.e. computer in which a programme is changed according to experience gained by the machine itself during a complete run
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
      - G06N5/00—Computer systems utilising knowledge based models
        - G06N5/02—Knowledge representation
          - G06N5/022—Knowledge engineering, knowledge acquisition
- H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
    - H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
      - H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communication
        - H04L9/08—Key distribution or management, e.g. generation, sharing or updating, of cryptographic keys or passwords
          - H04L9/0816—Key establishment, i.e. cryptographic processes or cryptographic protocols whereby a shared secret becomes available to two or more parties, for subsequent use
            - H04L9/0819—Key transport or distribution, i.e. key establishment techniques where one party creates or otherwise obtains a secret value, and securely transfers it to the other(s)
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
      - G06N5/00—Computer systems utilising knowledge based models
        - G06N5/04—Inference methods or devices
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
        - G06F17/10—Complex mathematical operations
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
      - G06N3/00—Computer systems based on biological models
        - G06N3/02—Computer systems based on biological models using neural network models
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
        - G06F17/30—Information retrieval; Database structures therefor; File system structures therefor
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
        - G06F21/60—Protecting data
          - G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
            - G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
              - G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
                - G06F21/6254—Protecting personal data, e.g. for financial or medical purposes by anonymising data, e.g. decorrelating personal data from the owner's identification
- H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
    - H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
      - H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communication
        - H04L9/30—Public key, i.e. encryption algorithm being computationally infeasible to invert or user's encryption keys not requiring secrecy
          - H04L9/3066—Public key, i.e. encryption algorithm being computationally infeasible to invert or user's encryption keys not requiring secrecy involving algebraic varieties, e.g. elliptic or hyper-elliptic curves
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F7/00—Methods or arrangements for processing data by operating upon the order or content of the data handled
- H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
    - H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
      - H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communication
        - H04L9/32—Cryptographic mechanisms or cryptographic arrangements for secret or secure communication including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
          - H04L9/3226—Cryptographic mechanisms or cryptographic arrangements for secret or secure communication including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials using a predetermined code, e.g. password, passphrase or PIN
            - H04L9/3231—Biological data, e.g. fingerprint, voice or retina
- H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
    - H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
      - H04L2209/00—Additional information or applications relating to cryptographic mechanisms or cryptographic arrangements for secret or secure communication H04L9/00
        - H04L2209/50—Oblivious transfer
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F19/00—Digital computing or data processing equipment or methods, specially adapted for specific applications
Similar Documents
Publication | Title
---|---
Wagh et al. | Falcon: Honest-majority maliciously secure framework for private deep learning
Lou et al. | Glyph: Fast and accurately training deep neural networks on encrypted data
Sanyal et al. | TAPAS: Tricks to accelerate (encrypted) prediction as a service
Zhang et al. | Additively homomorphical encryption based deep neural network for asymmetrically collaborative machine learning
Zhao et al. | PVD-FL: A privacy-preserving and verifiable decentralized federated learning framework
Yang et al. | A comprehensive survey on secure outsourced computation and its applications
Kwabena et al. | MSCryptoNet: Multi-scheme privacy-preserving deep learning in cloud computing
Riazi et al. | Chameleon: A hybrid secure computation framework for machine learning applications
Zhang et al. | Achieving efficient and privacy-preserving neural network training and prediction in cloud environments
Liu et al. | Oblivious neural network predictions via MiniONN transformations
De Cock et al. | High performance logistic regression for privacy-preserving genome analysis
US20230325529A1 (en) | System and method for privacy-preserving distributed training of neural network models on distributed datasets
Li et al. | Optimizing privacy-preserving outsourced convolutional neural network predictions
Xie et al. | BAYHENN: Combining Bayesian deep learning and homomorphic encryption for secure DNN inference
Lin et al. | A generic federated recommendation framework via fake marks and secret sharing
Niu et al. | Toward verifiable and privacy preserving machine learning prediction
Wang et al. | Deep learning data privacy protection based on homomorphic encryption in AIoT
Baryalai et al. | Towards privacy-preserving classification in neural networks
Ibarrondo et al. | Banners: Binarized neural networks with replicated secret sharing
Fan et al. | Privacy-preserving deep learning on big data in cloud
Panzade et al. | Towards faster functional encryption for privacy-preserving machine learning
Deng et al. | Non-interactive and privacy-preserving neural network learning using functional encryption
Yuan et al. | A training scheme of deep neural networks on encrypted data
Grebnev et al. | Pitfalls of the sublinear QAOA-based factorization algorithm
Panzade et al. | FENet: Privacy-preserving neural network training with functional encryption