
CN115589332A - System and method for implementing centralized privacy control in decentralized systems - Google Patents


Info

Publication number
CN115589332A
Authority
CN
China
Prior art keywords
data
identifier
unique identifier
time
dynamically changing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211401943.6A
Other languages
Chinese (zh)
Inventor
M. G. LaFever
T. N. Myerson
Steven Mason
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Datawing Intellectual Property Co., Ltd.
Original Assignee
Data Wing Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 15/963,609 (external priority; granted as US10572684B2)
Application filed by Data Wing Co., Ltd.
Publication of CN115589332A
Legal status: Pending


Classifications

    • H04L63/0407 Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the identity of one or more communicating identities is hidden
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/6245 Protecting personal data, e.g. for financial or medical purposes
    • G06F21/6254 Protecting personal data, e.g. for financial or medical purposes by anonymising data, e.g. decorrelating personal data from the owner's identification
    • G06Q20/12 Payment architectures specially adapted for electronic shopping systems
    • G06Q20/385 Payment protocols; Details thereof using an alias or single-use codes
    • H04L9/3239 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, using cryptographic hash functions involving non-keyed hash functions, e.g. modification detection codes [MDCs], MD5, SHA or RIPEMD
    • H04L9/50 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols using hash chains, e.g. blockchains or hash trees
    • G06Q2220/00 Business processing using cryptography
    • H04L2209/42 Anonymization, e.g. involving pseudonyms

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Accounting & Taxation (AREA)
  • Software Systems (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Finance (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Storage Device Security (AREA)
  • Computer And Data Communications (AREA)

Abstract

Systems and methods are disclosed for enforcing centralized privacy controls in decentralized systems. Systems, computer-readable media, and methods improve data privacy/anonymity and data value, wherein data related to a data subject may be used and stored, for example, in a distributed ledger data structure such as a blockchain, while minimizing the risk of re-identification by unauthorized parties and enabling information related to the data to be unlocked only by parties authorized with respect to purpose, time period, location, and/or other criteria that obscure specific data values, including quasi-identifiers, for example in accordance with the European Union General Data Protection Regulation or other similar regulatory schemes. The techniques described herein maintain this level of privacy/anonymity while still satisfying the immutability, auditability, and verification that blockchain and other distributed ledger technologies require for decentralized storage of transaction data. Such systems, media, and methods may be implemented on both classical and quantum computing devices.

Description

System and method for implementing centralized privacy control in decentralized systems
This application is a divisional application of patent application 201880044101.5, filed on April 27, 2018, entitled "System and method for implementing centralized privacy control in decentralized systems".
Cross reference to related applications
This application claims priority to U.S. patent application 15/963,609, filed on April 26, 2018, entitled "System and method for implementing centralized privacy control in decentralized systems"; U.S. provisional patent application No. 62/491,294, filed on April 28, 2017, entitled "Anonosizing collection and sharing of medical data"; U.S. provisional patent application No. 62/535,601, filed on July 21, 2017, entitled "Dynamic pseudonyms in Anonos retention format"; U.S. provisional patent application No. 62/554,000, filed on September 4, 2017, entitled "Anonos-compliant General Data Protection Regulation analysis"; U.S. provisional patent application No. 62/580,628, filed on November 2, 2017, entitled "Anonos BigPrivacy Compatible Dynamic De-Identifiers"; U.S. provisional patent application No. 62/644,463, filed on March 17, 2018, entitled "Anonos BigPrivacy GDPR-compliant blockchain systems and methods"; and U.S. provisional patent application No. 62/649,103, filed on March 28, 2018, entitled "Data compliance in the BigPrivacy cloud", the contents of each of which are incorporated herein by reference in their entirety.
Technical Field
The present disclosure relates generally to improving data security, privacy, and accuracy, and more particularly to representing data elements using dynamically changing identifiers, which may be anonymous, e.g., as stored via Distributed Ledger Technology ("DLT") such as a blockchain. (Note: "privacy" and "anonymity" are used interchangeably herein to refer to data protection, privacy, anonymity, pseudonymity, obscurity, and/or other steps available to legal entities, which may be natural and/or juridical persons such as business or corporate entities or groups of legal entities, to sequester, isolate, or delete information about themselves from unauthorized parties, thereby selectively revealing information about themselves. The phrase "Distributed Ledger Technology" or "DLT" as used herein refers to a shared data storage approach comprising replicated, shared, and/or synchronized digital data, which may be geographically distributed across multiple sites, countries, or organizations. With DLT, there is typically no central administrator or centralized data store. Examples of DLT use include blockchains, cryptocurrencies, smart contracts, and even decentralized file storage.)
Background
This section is intended to provide a background or context to the invention that is recited in the claims. The description herein may include concepts that could be pursued, but are not necessarily ones that have been previously conceived, implemented, or described. Thus, unless otherwise indicated herein, what is described in this section is not prior art to the description and claims in this application and is not admitted to be prior art by inclusion in this section.
There are certain inherent conflicts between: (i) the goal of commercial parties to maximize data value and their goal of respecting personal privacy rights; (ii) the desire of individuals to protect their privacy and their desire to benefit from highly personalized products; and (iii) the goals of U.S. and international government agencies to promote research and commerce and their goals of safeguarding citizens' rights.
One goal of parties unrelated to healthcare is to reach the most "highly qualified" potential customers, i.e., potential buyers with the requisite financial resources, motivation, and authority to make purchases. Commercial parties will pay more to reach such qualified prospects than to reach undifferentiated prospects, because the likelihood of completing a transaction with a qualified prospect, given his or her interests, propensities, and means to complete the transaction, is much higher. The level of product personalization/customization for a potential customer, which is directly related to the likelihood that the potential customer will complete a transaction, is enhanced by the depth and scope of the information available about each potential customer. One goal of healthcare-related parties is to conduct health- and/or disease-related research to facilitate discoveries with applications that may improve human health.
The development, emergence, and widespread adoption of computer networks, the Internet, intranets, and supporting technologies has made cost-effective technology for collecting, transmitting, storing, analyzing, and using information in electronic format widely available. As a result, entities can now easily collect and analyze vast amounts of information. This creates tension between: (i) the increasing availability of information with which to qualify prospective customers, develop personalized/customized products for them, and/or conduct health-related or other research; and (ii) the decreasing security, anonymity, and privacy of individuals, who are often unaware of the existence of many data elements traceable to them and who typically have little or no effective control over those elements.
Data elements may be collected, online and offline, from a variety of sources (both "born digital" and "born analog" later converted to digital format), including but not limited to activity on social networking sites, electronic or digital records, emails, participation in rewards or loyalty card programs that may track purchases and locations, Internet browsing and other online activity, and activity and purchases at brick-and-mortar stores and/or e-commerce sites. Merchants, medical and other service providers, governments, and other entities use the vast amounts of data collected, stored, and analyzed to suggest or find patterns and correlations and to draw useful conclusions. This data is sometimes referred to as "big data" because of the large number of information elements that entities may now collect. With big data analytics, entities can now unlock and maximize the value of data. One example involves non-health-related entities engaging in behavioral marketing, in which material created for distribution is customized in an attempt to increase its relevance to a particular recipient based on that recipient's preferences; another example involves health-related entities accessing big data for medical research. However, with behavioral marketing and big data analytics, the level of privacy and anonymity of the parties concerned is now much lower.
Historically, attempts to reconcile the conflict between privacy/anonymity and value/personalization/research have often involved the use of surrogate identifiers rather than real names or identifying information. However, these surrogate identifiers are typically statically assigned and persist over time. Static identifiers are easier to track, identify, and cross-reference to determine true identity, and can be used to ascertain other data about the subject associated with a data element without the consent of the party concerned. Privacy and information experts have expressed concern that re-identification techniques may be used on data associated with static identifiers, and have challenged the practice of treating data identified only by a particular computer, device, or activity (i.e., by an associated static identifier) as anonymous. When an identifier does not change over time, an adverse entity has unlimited time to accumulate and analyze additional data, even exogenous data, and associate it with the persistent identifier, thereby determining the true identity of the subject and associating other data with that true identity. Moreover, unlimited time gives an adverse entity the opportunity to mount the kind of time-consuming brute-force attack that may be applied to any encrypted data.
According to a 2011 report from the McKinsey Global Institute:
● Retailers could improve their operating margins by more than 60% by making full use of big data;
● Harnessing big data in the public sector has enormous potential: if the U.S. healthcare sector were to use big data creatively and effectively to improve efficiency and quality, the sector could create more than $300 billion in value every year, two-thirds of which would take the form of reducing U.S. healthcare expenditure by about 8%;
● In the developed economies of Europe, government administrators could save more than €100 billion ($149 billion) in operational efficiency improvements by using big data, not including the use of big data to reduce fraud and errors and boost revenue collection; and
● Users of services enabled by personal-location data could capture $600 billion in consumer surplus.
Many of the potential benefits of big data have yet to be fully realized, due to confusion over the ownership/use of the underlying data, tensions associated with the privacy of the underlying data, and the consequences of inaccurate analyses built on erroneous data collected from secondary (as opposed to primary) sources or inferred from the activities of parties who never actively participated in, or were never authenticated in connection with, such collection.
These difficulties are compounded in decentralized settings. Decentralized networks and platforms (including permissionless systems and distributed ledger technologies such as blockchain), including networks based on peer-to-peer or other non-centrally controlled links, further increase the difficulty of maintaining desired levels of privacy/anonymity for users while still allowing authorized third parties to properly extract information value and/or provide personalized services. In particular, because of the immutability, auditability, and verification requirements of distributed ledger technologies, it has heretofore not been possible to provide a high level of privacy/anonymity in such distributed ledgers, owing at least to the necessarily static nature of the information recorded in them.
What is needed are systems, methods, and apparatus that overcome the limitations of static and/or persistent privacy/anonymity and security systems and that improve the accuracy of data exchange, collection, transaction, analysis, and other uses, particularly in systems that use distributed ledger technologies, such as blockchain, to store data in a decentralized manner. In other words, the privacy/anonymity-enhancing techniques provided herein may help reconcile the tension between privacy and auditable, immutable information storage by providing tools that enable authorized users to unlock the "true" meaning of such information at a particular time and in a particular context.
Summary of the invention
Embodiments of the present invention can improve data privacy and security by maintaining "Dynamic Anonymity," i.e., keeping the subject to which data pertains anonymous for as long as, and to the degree, desired. Embodiments of the invention may include creating, accessing, using (e.g., processing, copying, analyzing, combining, modifying, distributing, etc.), storing, and/or erasing data with increased privacy, anonymity, and security, thereby facilitating the availability of more qualified and accurate information. And, when data is authorized to be shared with third parties, embodiments of the invention may facilitate sharing information in a dynamically controlled manner that enables delivery of temporally-, geographically-, and/or purpose-limited information to the receiving party. Embodiments of the invention may even be used in decentralized networks built on blockchain or other distributed ledger technologies, which require immutability and auditability of the records they store over time.
In contrast to existing systems, in which electronic data is readily accessible for use (e.g., collection, processing, copying, analysis, combination, modification, dissemination, etc.) and control over its storage and/or erasure is rare, embodiments of the present invention may enable the temporally limited use of unique, dynamically changing de-identifiers ("DDIDs"), each assigned, for a temporally unique period of time, to a subject (e.g., a person, place, or thing, such as an event, document, contract, or "smart contract") to which data directly or indirectly relates (a "Data Subject"), and/or to actions, activities, processes, and/or traits related to the Data Subject, thereby enabling the Data Subject to operate in a "dynamically anonymous" manner. "Dynamic Anonymity," as used herein, refers to the ability of a Data Subject to remain anonymous until a decision is made not to maintain that anonymity, at which point only the information pertaining to one or more relevant actions, activities, processes, or traits is shared with one or more desired parties. Thus, embodiments of the invention may enable Data Subjects to maintain flexible levels of privacy and/or anonymity under the control of the Data Subject or of a controlling entity, which may be a trusted party or agent.
Embodiments of the present invention may use DDIDs to help prevent the retention, by third parties, of data (sometimes referred to as metadata) that might otherwise provide information about one or more aspects of a Data Subject and/or about data attributes reflecting actions, activities, processes, and/or traits associated with the Data Subject. Such metadata may include, but is not limited to, information relating to the means of creation, purpose, time and/or date of creation, identity of the creator of the data attributes, location where the data attributes were created, criteria used in creating or using the data attributes, etc. This is because, in order to establish a persistent record of information relating to one or more specific data attributes, metadata must be attached to, or associated with, content. As used in this application, the terms "data," "attributes," "elements," and similar terms include any or all of (i) structured data (i.e., data organized in a predetermined structured schema), (ii) unstructured data, (iii) metadata (i.e., data about data), (iv) other data, and/or (v) any of the above types of data originally recorded in an analog format and later converted to a digital format, as appropriate.
Embodiments of the present invention may use a first DDID for a particular purpose relating to a first Data Subject, action, activity, process, and/or trait during a first period of time, then use a second DDID with that first Data Subject, action, activity, process, and/or trait for a different purpose, and/or use the first DDID with a second Data Subject, action, activity, process, and/or trait for a different purpose, and so on. As a result, different DDIDs may be used in connection with different Data Subjects, actions, activities, processes, traits, and/or purposes, each temporally unique. Third-party attempts to retain and aggregate data are thereby rendered ineffective, since different DDIDs may be associated with the same Data Subject, action, activity, process, or trait, and the same DDID may at different times be associated with different ones, absent access to the underlying information relating the DDIDs to one another.
Embodiments of the present invention may track and record the different DDIDs used by, and associated with, a Data Subject at different times for various actions, activities, processes, or traits, thereby enabling the storage, selection, and retrieval of information applicable to a particular action, activity, process, or trait and/or to a particular Data Subject. Conversely, because the relationships among the multiple DDIDs used, and the information needed to link DDIDs to Data Subjects, are determined within the system and are not available outside it, the system may prevent third parties outside the system from effectively retaining and aggregating data relating to Data Subjects, actions, activities, processes, and/or traits.
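For illustration only, the following minimal Python sketch (not part of the patent disclosure; names such as DDIDLedger and assign_ddid are hypothetical) shows a closed system assigning temporally unique DDIDs, recording the assignment periods (the information a time key conveys), and re-associating a DDID observed at a given time with the correct Data Subject:

```python
import secrets
import time

class DDIDLedger:
    """Hypothetical closed-system ledger mapping DDIDs to Data Subjects over time."""

    def __init__(self):
        self.active = {}    # ddid -> subject_id: at most one subject per DDID at a time
        self.history = []   # [ddid, subject_id, start_ts, end_ts]: the "time key" data

    def assign_ddid(self, subject_id):
        # Generate a DDID unique among currently active DDIDs, satisfying the
        # rule that no two Data Subjects share a DDID at the same time.
        while True:
            ddid = secrets.token_hex(8)
            if ddid not in self.active:
                break
        self.active[ddid] = subject_id
        self.history.append([ddid, subject_id, time.time(), None])
        return ddid

    def expire_ddid(self, ddid):
        # End the assignment; the DDID may later be re-assigned to another subject.
        self.active.pop(ddid, None)
        for rec in self.history:
            if rec[0] == ddid and rec[3] is None:
                rec[3] = time.time()

    def resolve(self, ddid, observed_ts):
        # Only the controlling entity, holding this ledger, can re-associate a
        # DDID seen at a given time with the underlying Data Subject.
        for rec_ddid, subject_id, start, end in self.history:
            if rec_ddid == ddid and start <= observed_ts and (end is None or observed_ts <= end):
                return subject_id
        return None

ledger = DDIDLedger()
d1 = ledger.assign_ddid("subject-A")
t1 = time.time()
ledger.expire_ddid(d1)
print(ledger.resolve(d1, t1))   # -> "subject-A"
```

Note that resolve requires both the DDID and the time of observation, since the same DDID value may later represent a different Data Subject.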
Each DDID may be associated with any one or more data attributes to facilitate a particular action, activity, process, or trait, such as, but not limited to: (i) information reflecting actions, activities, processes, or traits associated with the Data Subject while associated with the current DDID (e.g., browsing information reflecting the Data Subject's current web-based activity), before the current DDID is replaced with a different DDID; (ii) information about past actions, activities, processes, or traits previously associated with the Data Subject while associated with one or more previous DDIDs, which the Data Subject now wishes to share with a third party while associated with the current DDID (e.g., pricing information collected from an e-commerce website during a previous browsing session under a previous DDID); and (iii) new information associated with the current DDID that can help facilitate a desired action, activity, process, or trait on behalf of the Data Subject (e.g., the desired size and color of a garment the Data Subject currently wishes to purchase).
From the perspective of implementing a Dynamic Anonymity embodiment as a closed system, the process of assigning the DDIDs that represent the identities of Data Subjects (i.e., "primary identifiers") must be temporally unique: no two Data Subjects may have the same primary-identifier DDID at the same time. This requirement of temporal uniqueness applies when DDIDs are intended to express the separability of Data Subject identities; if DDIDs are intended to represent factors other than the separation of Data Subject identities, DDID assignments may be made accordingly to represent the intended associations, relationships, etc. When separation of Data Subject identities is to be represented by DDIDs, the DDIDs can be instantiated in two ways: (i) created within the implementation itself, or (ii) supplied as externally created identifiers, provided they satisfy the requirement of being temporally unique (e.g., a "cookie" or other unique identifier assigned by a website to a first-time visitor can effectively serve as a DDID).
A cookie is a small piece of data, typically sent from a website and stored in the Data Subject's web browser while the Data Subject browses the website, so that each time the Data Subject returns to the website, the browser sends the cookie back to the server associated with the website to inform the website of the return visit. However, in order for cookies to be used as DDIDs, the browser (which, in this potential embodiment of the invention, acts as a privacy client) can prevent any cookies submitted by a website from persisting between browsing sessions (e.g., by copying the user's cookies, caches, and browsing-history files to a server of the Dynamic Anonymity system and then deleting them from the user's computer), so that a new cookie is assigned for each browsing session. In this way, although the various cookies issued by the website (which, in this example embodiment, serve as DDIDs representing the separation of Data Subject identities) are created "outside" the system, each cookie is temporally unique and does not enable the website to retain state information or aggregate the Data Subject's browsing activity, because the website treats each browsing session as unrelated; the Data Subject can thus remain dynamically anonymous to the extent desired.
As described in the example potential embodiment above, according to some embodiments, the Dynamic Anonymity system may collect and retain information about the various actions, activities, processes, or traits associated with different browsing sessions/different cookies (in this example, used as DDIDs to represent the separability of Data Subject identities) and store the combined information in the Data Subject's aggregated data profile until a decision is made, by or on behalf of the Data Subject, to no longer remain anonymous, at which point only the information required by one or more desired parties in connection with one or more actions, activities, processes, or traits need be shared from the Data Subject's aggregated data profile. In this exemplary embodiment of the invention, this may involve the Data Subject electing to provide information from its aggregated data profile to a website as a TDR reflecting the Data Subject's past activity on that website, all at the election, and under the control, of the Data Subject (or other controlling entity). In the above-described exemplary embodiment, instead of using a cookie assigned by a website visited by the Data Subject as the DDID, the system may instead use a globally unique identifier (GUID, i.e., a unique reference number used as an identifier in computer software) or another temporally unique, dynamically changing proxy de-identifier, i.e., a DDID created either internally or externally to the implementation of the invention. In the above example, control over the data collected as a result of the Data Subject's browsing activity would rest with the Data Subject or other controlling entity, rather than with the websites the Data Subject visits. In other exemplary embodiments of the invention, rather than having the Data Subject decide when to send (i.e., "push") information from the Data Subject's aggregated data profile to a website, the website (with appropriate permissions and verification) may request (i.e., "pull") the information, and/or the associated DDIDs, from the Data Subject's aggregated data profile that the website needs at that time.
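As an illustration of the cookie example above, here is a hedged Python sketch (hypothetical names; not an actual browser implementation) of a privacy client that treats each website-issued cookie as a session-scoped DDID, archiving it with the Dynamic Anonymity system and deleting it locally so that the next session starts fresh:

```python
import time

class SessionCookieClient:
    """Hypothetical privacy client that prevents cookies from persisting across sessions."""

    def __init__(self, anonymity_server_archive):
        self.archive = anonymity_server_archive  # stands in for the Dynamic Anonymity server
        self.jar = {}                            # site -> cookie, for the CURRENT session only

    def receive_cookie(self, site, cookie):
        # The website believes this cookie identifies a returning visitor.
        self.jar[site] = cookie

    def end_session(self, subject_id):
        # Copy session cookies into the subject's aggregated data profile on the
        # anonymity server, then delete them locally, so each cookie is used once.
        for site, cookie in self.jar.items():
            self.archive.setdefault(subject_id, []).append((site, cookie, time.time()))
        self.jar.clear()

archive = {}
client = SessionCookieClient(archive)
client.receive_cookie("shop.example", "cookie-123")
client.end_session("subject-A")
client.receive_cookie("shop.example", "cookie-456")  # the site issues a fresh cookie next visit
client.end_session("subject-A")
# The site saw two unrelated visitors; the subject's own profile links both sessions.
print(archive["subject-A"])
```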
In other exemplary embodiments of the invention, the work of dynamically anonymizing a Data Subject and controlling the transmission of the relevant portions of its aggregated data profile may be handled by: the Data Subject's own client device; the central Dynamic Anonymity system described above; or a combination of both. For example, a full view of the information for a particular Data Subject, and/or of the DDIDs associated with that information, may be stored on the Data Subject's client device for a predetermined or flexible period of time and then re-synchronized back to the central Dynamic Anonymity system (and synchronized with any other client devices that the Data Subject may have registered with the central system).
TDRs and DDIDs may include multiple levels of abstraction for tracking and identification purposes. A system according to some embodiments of the invention may store a TDR (consisting of a DDID value and the data elements, if any, associated with that DDID) together with information regarding the period of time during which each DDID was associated with a particular Data Subject, data attribute, action, activity, process, or trait, thereby allowing the TDR to be re-associated with that particular Data Subject, data attribute, action, activity, process, or trait at a later time. Such a system may be used to facilitate the development of aggregated data profiles by reference to, and use of, keys that reveal the relationships between the various DDIDs, Data Subjects, data attributes, actions, activities, processes, and/or traits. In other words, Data Subjects may benefit from evolving technological advancements (e.g., the Internet of Things (IoT), personalized medicine, etc.) by using the "Dynamic Anonymity" provided by TDRs and/or DDIDs, as described herein, without having to forgo privacy, anonymity, security, or control. This can be achieved by: (i) assigning unique, dynamically changing DDIDs to Data Subjects, actions, activities, processes, and/or traits; (ii) retaining information about the association of DDIDs with Data Subjects, actions, activities, processes, and/or traits; and (iii) providing deterministic control over access to, and use of, that association information to the Data Subject and/or a controlling entity, which may be a trusted party/agent. Because DDIDs are dynamically changeable, temporally unique, and re-assignable, current systems and processes (e.g., web browsers and data analysis engines) cannot identify relationships between disassociated and/or replaced data elements. They can still use existing functionality to process information, but will do so without creating inferences, associations, profiles, or conclusions (unless explicitly authorized by the Data Subject and trusted party/agent). Furthermore, the DDIDs employed by embodiments of the present invention may be dynamically replaced at the data element level, not just at the Data Subject level or the data record level, thereby implementing Dynamic Anonymity. This means that individuals can control which data is shared or accessed, enabling dynamic de-identification without "diminishing" the value of the underlying information.
Control of information down to the data element level makes controlled information sharing possible in the big data era, beyond controls that operate only at the data record level or Data Subject level. It may also enable a "once and for all" relationship between a Data Subject and a website or other entity that receives information about the Data Subject. Most existing systems collect information around unique identifiers over time. With DDIDs, even if a DDID carries a certain amount of history or other information about the Data Subject, the next time the Data Subject visits a site, a store, a doctor, etc., the Data Subject may, if desired, appear to be a completely different Data Subject. Only if a DDID contains a unique identifier (e.g., a name or email address) can the recipient associate the then-current DDID representing the Data Subject with DDIDs previously used to represent that Data Subject, at which point the recipient can interact with the Data Subject making use of the data the recipient has collected about the Data Subject. Even then, the next time the recipient encounters the Data Subject, the Data Subject will not be re-identifiable unless it so desires.
Dynamic Anonymity also enables a Data Subject and/or controlling entity to determine when, and in what context (e.g., time, purpose, location), data and identity are linked (e.g., by obscuring the associations described above). Dynamic Anonymity can thus also revoke or rescind rights or access previously granted to data (e.g., a particular party can be provided access to the data underlying a DDID via one or more replacement keys, and that access can later be revoked by changing the replacement keys), and can support updates to data (i.e., to the value of the data, not necessarily re-identification) for other authorized secondary uses without violating commitments made to a Data Subject (e.g., one or more DDIDs may initially provide access to X-ray results through one or more replacement keys; by changing the replacement keys, the same DDIDs can later reflect the X-ray results together with subsequent physical therapy results).
The reason Dynamic Anonymity remains attractive in the commercial marketplace is that companies are often not really concerned with who the Data Subjects they interact with actually are (i.e., their actual "real world" identities); rather, they care about what Data Subjects are interested in, how Data Subjects behave, and when Data Subjects behave that way. The more accurate the targeting, the less the waste, and the more likely it is that an anonymous consumer will respond to a personalized offering. Dynamic Anonymity thus avoids companies chasing Data Subjects around the digital world in an attempt to persuade them to buy products and/or services they may not really need or want. Dynamic Anonymity allows a more favorable "match" between sellers and interested customers. At present, the best many companies can do is to "segment" potential customers using demographic data, without knowing the actual interests of individual segment members. Dynamic Anonymity also improves aggregate demographics and statistics by providing "highly qualified" expressions of individualized interest from members of a segmented audience. The ability of Dynamic Anonymity to enable a Data Subject to control use of its data, directly or indirectly, in accordance with its personal privacy/anonymity preferences can also support different treatment of data in different jurisdictions, notwithstanding the differing data use/privacy/anonymity requirements of those jurisdictions.
In healthcare, medical research, and other research areas, Dynamic Anonymity will be more attractive than traditional "de-identification" approaches that protect data privacy/anonymity defensively, e.g., by applying a series of masking steps to direct identifiers (e.g., name, address) and masking and/or statistically-based manipulations to quasi-identifiers (e.g., age, gender, occupation) in order to reduce the likelihood of re-identification by unauthorized third parties. Such defensive approaches entail a trade-off between preventing re-identification and retaining access to usable information. In contrast, Dynamic Anonymity can preserve the value of information, without statistically significant risk of re-identification, for use for authorized purposes. DDIDs can be used to represent Data Subjects, actions, activities, processes, and/or traits, and their meaning may change over time, requiring the then-current appropriate key to identify the underlying value. Dynamic Anonymity thus rejects the traditional dichotomy whereby, in order to minimize/reduce the risk of losing anonymity, one must sacrifice information content by making it forever unrecoverable. Instead, Dynamic Anonymity minimizes both the risk of loss of privacy/anonymity and the amount of information lost, so that most, if not all, of the information can be recovered, but only with authorization.
The keys used by embodiments of the present invention may vary depending on the use of the corresponding DDID. For example: time keys ("TKs") may be used to associate a DDID with a Data Subject, action, activity, process, and/or trait for the relevant time period, i.e., the period during which a TDR exists; association keys ("AKs") may be used to show the association between two or more data elements and/or TDRs that are otherwise indistinguishable from one another due to the use of different DDIDs; and replacement keys ("RKs") may be used if/when DDIDs are used to replace one or more data attributes within a TDR, in which case the value of the data attribute(s) replaced by the DDID(s) contained in the TDR may be determined by reference to a lookup table.
If a third party intercepts information related to one or more Data Subjects, actions, activities, processes, and/or traits, but has no access to the applicable TKs, AKs, and/or RKs, that third party will not be able to: (i) in the case of the association function of the present invention, re-identify a Data Subject by associating a DDID with its corresponding data attributes (which together comprise a TDR); and/or (ii) in the case of the replacement function of the present invention, learn the value of the data elements represented by DDIDs so as to correctly understand the information. Rather, embodiments of the present invention may enable a Data Subject or other controlling entity to send to one or more desired third-party systems only those data attributes (tracked by the system's tracking/logging/recording capabilities) that are particularly relevant to a specific action, activity, process, or trait.
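A minimal sketch of the replacement function, assuming a simple lookup-table RK held only by the controlling entity (all names hypothetical; this is illustrative, not the patented implementation):

```python
import secrets

replacement_key = {}   # RK: ddid -> original value, held only by the controlling entity

def replace_value(value):
    # Substitute a temporally unique DDID for the data element's value.
    ddid = secrets.token_hex(6)
    replacement_key[ddid] = value
    return ddid

def recover_value(ddid):
    # Without the RK lookup table, the DDID conveys nothing about the value.
    return replacement_key[ddid]

tdr = {"ddid": secrets.token_hex(6),            # DDID standing in for the Data Subject
       "diagnosis": replace_value("type 2 diabetes")}

print(tdr)                              # an interceptor's view: opaque identifiers only
print(recover_value(tdr["diagnosis"]))  # the authorized view, via the RK
```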
According to various embodiments described herein, the following terms may also be used in connection with anonymous data:
"A-DDID" or "Association DDID": refers to a DDID that is used to replace a value that identifies a data element and dereferences (e.g., points to) the data element, thereby conveying a relationship (association or correlation between the data element and its value) such that an index that assigns an information value in a non-identifying manner (optionally, according to specified grouping rules) for resolving dereferences can include, but is not limited to, a key, a schema translation table, an anonymous identifier, a pseudonym, a token, or other representation. The dereference grouping rule for a-DDID may be (at least) two groups: digital grouping and classification grouping. The number packet refers to a range of values represented by a-DDID. The classification packet replaces the "association" (i.e., two or more related or complementary items) with a-DDIDs that are selected to represent the correlation between the values in each packet category. The a-DDID dereferencing rule may also cover multiple fields. For example, a blood test may encompass many variables from which the risk of a heart attack may be inferred, so a rule may specify various combinations needed to assign the risk of a heart attack to a particular category (e.g., high, medium, or low).
"R-DDID" or "Replacement DDID": refers to a DDID that can be used to replace a value that identifies a data element and dereferences (e.g., points to) the data element.
"mosaic effect" refers to the ability to re-identify a body of data by correlating data between and among seemingly anonymous sets of data.
Various systems, methods, and apparatus for privacy and security management and for the use of information related to one or more Data Subjects (such as people, places, or things) and related actions, activities, processes, and/or traits are disclosed herein. The systems, methods, and devices described herein can abstract data related to a Data Subject, action, activity, process, and/or trait by linking elements of the data to independent or dependent attributes, and by separating elements of the data into independent or dependent attributes. For purposes of this disclosure, an attribute refers to any data element that can be used, alone or in combination with other data elements, to directly or indirectly identify a Data Subject, such as a person, place, or thing, or an associated action, activity, process, and/or trait. It should be noted that a Data Subject may have attributes or combinations of attributes unique to it (for example, the social security number of an individual Data Subject) as well as attributes or combinations of attributes shared with other Data Subjects (for example, a Data Subject's gender or affiliation with a political party). In some cases, an attribute may be an electronic or digital representation of the Data Subject or of an associated action, activity, process, and/or trait. Similarly, an attribute may be an electronic or digital representation of information or data related to a Data Subject or an associated action, activity, process, and/or trait. By separating, linking, combining, rearranging, defining, initializing, or augmenting attributes, combinations of attributes can be formed with respect to any particular Data Subject or group of Data Subjects or related actions, activities, processes, and/or traits. With respect to any Data Subject, action, activity, process, and/or trait, a combination of attributes may include any combination of attributes, as well as other data added to or combined with those attributes. It should also be noted that an attribute or combination of data attributes may identify a Data Subject but is not itself the Data Subject; the individual or legal entity identified by an attribute or combination of data attributes is the subject of that attribute or combination of data attributes and may be considered the related party with respect to it, as he/she/it is the party interested in or associated with the attribute or combination of data attributes.
In some embodiments, one or more features or aspects of the present disclosure may be implemented using a client-server architecture, whether within an enterprise or in a private cloud, public cloud, hybrid cloud, or any combination of the foregoing, whereby, in one example, a privacy server (which may be virtual, logical, or physical) provides functionality and/or services to one or more privacy clients (which may themselves be virtual, logical, or physical). These privacy clients may reside on a Data Subject's device or a service provider's device, may be accessed through and reside in a cloud network, or may reside on the same computing device as the privacy server, and may initiate requests for such functionality and/or services through interactions involving data attributes and/or information associating data attributes with Data Subjects, which is stored in a database on a hard drive or other storage element associated with the privacy server. For example, in response to a request for functionality and/or services from one or more privacy clients, data attributes may be linked to, or separated into, independent or dependent attributes by a privacy server coupled to the database. It should be noted that embodiments of the present invention may use a single computer or computing device as both privacy server and privacy client, while other embodiments may use one or more computers or computing devices located at one or more locations as the privacy server and one or more computers or computing devices located at one or more locations as privacy clients. A plurality of system modules may be used to perform one or more of the features, functions, and processes described herein, such as, but not limited to: determining and modifying the attributes required for attribute combinations; assigning DDIDs; tracking DDID usage; expiring or reassigning existing DDIDs; and enabling or providing the data associations relevant or necessary to a given action, activity, process, or trait.
In one embodiment, the modules may include an abstraction module of the privacy server configured to, among other things: dynamically associate at least one attribute with at least one Data Subject, action, activity, process, and/or trait; determine and modify the required attributes relevant or necessary to a given action, activity, process, or trait; generate, store, and/or assign a DDID to at least one data attribute to form a TDR; and assign a predetermined expiration time to the TDR through the DDID component of the TDR.
These system modules, and other modules disclosed herein if desired, may be implemented in program code executed by a processor in the privacy server computer or in another computer in communication with the privacy server computer. The program code may be stored on a computer-readable medium accessible by the processor. The computer-readable medium may be volatile or non-volatile, and may be removable or non-removable. The computer-readable medium may be, but is not limited to, RAM, ROM, solid-state memory technology, erasable programmable ROM ("EPROM"), electrically erasable programmable ROM ("EEPROM"), CD-ROM, DVD, magnetic tape, magnetic disk storage, or other magnetic or optical storage devices. In some embodiments, the privacy client may reside on, or be used in connection with, a "smart" device (e.g., a wearable, movable, or immovable electronic device, generally connected to other devices or networks via protocols such as Bluetooth, NFC, WiFi, 3G, etc., that can operate to some extent interactively and autonomously), a smartphone, tablet, laptop, or desktop computer, and the privacy client may communicate with one or more privacy servers, which process and respond to requests for information from the privacy client, such as requests regarding data attributes, attribute combinations, and/or the association of data attributes with Data Subjects.
In one implementation of the invention, the DDIDs associated with attributes and attribute combinations may be limited in scope and duration. Furthermore, DDIDs may be reassigned, such that a DDID may reference multiple Data Subjects or multiple actions, activities, processes, or traits at different points in time.
DDIDs may be reassigned on a configurable basis to further abstract and dilute or attenuate data trails while preserving the timeliness and saliency of the TDRs and the data they contain. In one example, rather than storing, transferring, or processing all data attributes related to a Data Subject and/or all data attributes related or necessary to a given action, activity, process, or trait, embodiments of the present invention may introduce an initial layer of abstraction through an association function, for example by including only a portion of the relevant data attributes in each TDR. In this way, the data attributes associated with a Data Subject may be separated among seemingly unrelated TDRs, such that one or more AKs must be accessed and used to know which two or more TDRs must be associated with each other in order to collectively contain all of the data attributes associated with the Data Subject and/or all of the data attributes associated with or necessary for a given action, activity, process, or trait; a sketch follows this paragraph. The privacy, anonymity, and security of the data attributes contained or referenced in a TDR may be further improved or enhanced through the replacement function, e.g., by replacing one or more of the data attributes contained in one or more TDRs with DDIDs, so that one or more RKs must be used in order to consult a lookup table and determine the value of the data elements replaced by those DDIDs. Privacy, anonymity, and security may be improved still further by using other known protection techniques, such as encryption, tokenization, pseudonymization, or obfuscation, and/or by introducing additional layers of abstraction, for example by replacing keys with second-level or nth-level DDIDs.
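A minimal sketch of that association function (illustrative Python; the structure and names are hypothetical): a Data Subject's attributes are split across two seemingly unrelated TDRs with different DDIDs, and an association key held only by the controlling entity records which TDRs belong together:

```python
import secrets

def make_tdr(attributes):
    # Each TDR carries its own DDID and only a portion of the subject's attributes.
    return {"ddid": secrets.token_hex(6), **attributes}

# Split one subject's attributes across two TDRs that share no visible link.
tdr_demo  = make_tdr({"age_band": "40-49", "zip3": "941"})
tdr_claim = make_tdr({"procedure": "MRI", "month": "2018-03"})

# AK: only the controlling entity knows these TDRs describe the same subject.
association_key = {"subject-A": [tdr_demo["ddid"], tdr_claim["ddid"]]}

def reassemble(subject_id, tdrs, ak):
    # Re-link the TDRs that collectively contain the subject's attributes.
    wanted = set(ak[subject_id])
    merged = {}
    for tdr in tdrs:
        if tdr["ddid"] in wanted:
            merged.update({k: v for k, v in tdr.items() if k != "ddid"})
    return merged

print(reassemble("subject-A", [tdr_demo, tdr_claim], association_key))
```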
Both in the case where data attributes related to Data Subjects, actions, activities, processes, and/or traits are disassociated, so that AKs are required, and in the case where such data attributes are replaced, so that RKs are required, the effective level of privacy, anonymity, and security may be increased according to the manner and/or frequency with which the DDIDs associated with the data attributes in question are changed. In one exemplary embodiment of the present invention, a DDID assigned for disassociation and/or replacement purposes may retain its initially assigned value indefinitely (i.e., be "permanently assigned"). In another exemplary embodiment, a DDID may be assigned for disassociation and/or replacement purposes and retain its initially assigned value until that value is changed on an ad hoc basis (i.e., "temporally changed"). In yet another exemplary embodiment, a DDID may be assigned for disassociation and/or replacement purposes and retain its initially assigned value until that value is changed on a random, fixed, variable, or other dynamic basis (i.e., "dynamically changeable").
Embodiments of the invention may create additional layers of abstraction by replacing, with DDIDs, identifying references within the system to external networks, the Internet, intranets, and/or computing devices that may be integrated with, or in communication with, one or more embodiments of the invention. Said external networks, Internet, intranets, and/or computing devices must therefore possess one or more RKs and/or AKs in order to access and use the lookup tables needed to determine the identities underlying those DDIDs.
Because of the changeable, temporally unique, and re-assignable nature of the DDIDs paired with data attributes or attribute combinations to create TDRs, a recipient of a TDR can make use of the information contained in the TDR only during the intended time. This is because the association keys (which may be required to stitch together TDRs in order to understand information contained in seemingly unrelated TDRs) and/or replacement keys (which may be required to learn the value of information represented by temporally unique DDIDs), transmitted to a third party as part of a TDR, may be of only time-limited applicability. In other words, utility is time-limited because, when the intended purpose and/or intended time no longer applies, the Data Subject or other controlling entity may change the DDID components of the TDR so that the AKs and/or RKs no longer reveal the relevant information. Conversely, the relevant information revealed by the AKs and/or RKs may be changed over time to support other secondary uses of the data.
In one example, a maintenance module may be used to store, in a secure database associated with and accessible by the privacy server, information about the association of any particular DDID in a TDR with a particular attribute combination at any particular point in time (the period of association may be represented by a time key (TK) or otherwise). The database is not accessible by parties other than the controlling entity and parties authorized by the controlling entity. In one example, the maintenance module of the privacy server and its associated database may store and maintain all associations between DDIDs and attribute combinations. The system thereby provides secure data exchange and non-repudiation of data attributes, attribute combinations, and TDRs, facilitating more secure collection, use, research, and/or analysis of the related data while satisfying stringent privacy, anonymity, and security criteria.
In one example, a verification module of the privacy server and its associated database may provide an authenticated data structure that allows verification of the integrity of aggregated data profiles and of the information contained in data attributes, DDIDs, attribute combinations, and/or TDRs at any point in time, by methods such as cyclic redundancy checks ("CRCs"), message authentication codes, digital watermarks, or link-based timestamping.
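As a minimal sketch of this verification idea, using one of the techniques listed above (a message authentication code; the key and field names are hypothetical):

```python
import hashlib
import hmac
import json

SERVER_KEY = b"privacy-server-secret"   # hypothetical key held by the privacy server

def seal_profile(profile):
    # Compute a MAC over a canonical serialization of the aggregated data profile.
    payload = json.dumps(profile, sort_keys=True).encode()
    return hmac.new(SERVER_KEY, payload, hashlib.sha256).hexdigest()

def verify_profile(profile, tag):
    # Recompute the MAC and compare in constant time; any change invalidates it.
    payload = json.dumps(profile, sort_keys=True).encode()
    return hmac.compare_digest(tag, hmac.new(SERVER_KEY, payload, hashlib.sha256).hexdigest())

profile = {"ddid": "a1b2c3", "attributes": {"age_band": "40-49"}}
tag = seal_profile(profile)
print(verify_profile(profile, tag))            # True: profile unmodified
profile["attributes"]["age_band"] = "50-59"
print(verify_profile(profile, tag))            # False: tampering detected
```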
In another example, an authentication module of embodiments of the present invention can be used to anonymously verify the right of a Data Subject to participate in an action, activity, process, or trait at a particular time and/or place, as determined by TDR assignment. A privacy client in possession of TDR information may ask the authentication module (in one example, part of a privacy server) to confirm whether the TDR (and the undisclosed Data Subject, data attributes, or attribute combinations associated with it) is authorized to participate in the requested action, activity, process, or trait at a particular time and/or place. In one embodiment, the authentication module may compare the DDID included in the TDR against a list of authorized DDIDs to determine authorization status for the intended participation at the specified time and/or place. Alternatively, the authentication module may require, via DDID verification or other verification techniques (e.g., password verification or multi-factor authentication), that the party in possession of the TDR demonstrate its right to participate in the specified action, activity, process, or trait at the specified time and/or place. In one example, if an optional authorization request is made, the process continues only if the party is authorized. The authentication module may send authorization status information to the party controlling the TDR via the privacy client, and the authorization status may be used to allow or deny performance of the desired action, activity, process, or trait at the specified time and/or place.
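The comparison of a DDID against a list of authorized DDIDs might look like the following hedged sketch (the table and function names are hypothetical):

```python
import time

# Hypothetical authorization table: ddid -> (action, valid_from, valid_until, place)
AUTHORIZED = {
    "ddid-42": ("view_xray", 1_700_000_000, 1_700_086_400, "clinic-7"),
}

def authorize(ddid, action, place, now=None):
    """Anonymously confirm a TDR's right to an action at a time and place,
    without revealing the underlying Data Subject."""
    now = now if now is not None else time.time()
    entry = AUTHORIZED.get(ddid)
    if entry is None:
        return False
    allowed_action, start, end, allowed_place = entry
    return action == allowed_action and place == allowed_place and start <= now <= end

print(authorize("ddid-42", "view_xray", "clinic-7", now=1_700_000_500))  # True
print(authorize("ddid-42", "view_xray", "clinic-9", now=1_700_000_500))  # False
```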
The TDR and/or the DDID contained in the TDR may also be used as a high-level key for known protection techniques such as encryption, tokenization, pseudonymization, elision, or otherwise. The certificate module may be used to withhold the keys required to unlock the protection applied to TDR content (e.g., encryption, tokenization, pseudonymization, hiding, or otherwise) unless and until DDID and/or TDR validation, together with known validation techniques (e.g., password validation, multi-factor authentication, or the like), confirms that the party is authorized to participate in the requested action, activity, process, or trait, with respect to the TDR, DDID, or the undisclosed associated Data Subject, attributes, attribute combinations, or related content, at the specified time and/or place.
In another example, an access log module may be provided that collects and stores information enabling post-event forensic analysis in the event of system or privacy server error and/or misuse.
According to one aspect of one embodiment of the present invention, disclosed herein is a computer-implemented method of providing controlled distribution of electronic information. In one example, the method may include the steps or operations of: receiving data on a computing device; identifying one or more attributes of the data; selecting, by the computing device, a DDID; associating the selected DDID with the one or more data attributes; and creating a temporally unique data representation (TDR) from at least the selected DDID and the one or more data attributes.
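For illustration only, the following is a minimal Java-style sketch of these steps; the type names (DDID, Attribute, TDR), the builder class, and the use of a random UUID as the DDID value are assumptions made for this example, not a definitive implementation:

import java.util.List;
import java.util.UUID;

// Hypothetical types; the names are illustrative only.
record DDID(String value) {}
record Attribute(String name, String value) {}
record TDR(DDID ddid, List<Attribute> attributes) {}

final class TdrBuilder {
    // Selecting a DDID: here, generating a temporally unique value.
    DDID selectDDID() {
        return new DDID(UUID.randomUUID().toString());
    }

    // Associating the selected DDID with the identified attributes
    // creates the temporally unique data representation (TDR).
    TDR createTDR(List<Attribute> identifiedAttributes) {
        return new TDR(selectDDID(), List.copyOf(identifiedAttributes));
    }
}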
In one example, the step of selecting a DDID may comprise generating a temporally unique, dynamically changing DDID; alternatively, in another example, it may comprise accepting or modifying a temporally unique, dynamically changing value created outside the system for use as a DDID.
For purposes herein, the phrase "dynamically changing" means that the DDID assigned with respect to a Data Subject, action, activity, process, or trait: (i) changes over time due to (a) the passage of a predetermined amount of time, (b) the passage of a flexible period of time, (c) the expiration of the purpose for which the DDID was created, or (d) a change in the virtual or real-world location associated with the Data Subject, action, activity, process, or trait; or (ii) differs at different times (i.e., the same DDID is not used at different times) with respect to the same or a similar Data Subject, action, activity, process, or trait.
For purposes herein, the phrase "temporally unique" means that the period of time for which a DDID is assigned to a Data Subject, action, activity, process, or trait is not infinite. The initial assignment of a DDID to a Data Subject, action, activity, process, or trait begins at a discrete point in time, and information about the time of assignment is known and, in some implementations of the invention, may be used to identify the relationship or association between the DDID and the Data Subject, action, activity, process, or trait. If the period of time for which a DDID is assigned ends at a discrete point in time, information about the termination time is likewise known and, in some implementations of the invention, may be used to identify that relationship or association.
For purposes herein, the term "policy" may refer to, but is not limited to, one or more methods of programmatically enforcing mathematical, logical, sampling, or other functions on a data set (e.g., a data set of any number of dimensions) with strength equal to or greater than any privacy-enhancing technique ("PET") for which an enforcement mechanism is enabled, including, but not limited to, public key encryption, k-anonymity, l-diversity, the introduction of "noise", differential privacy, homomorphic encryption, digital rights management, identity management, and suppression, and/or the generalization of certain data row-wise, column-wise, along any other dimension, along any combination of dimensions, by discrete units, by any combination of discrete units, by any combination of rows, columns, and discrete units, or any portion thereof.
For purposes herein, the term "Non-Attributing Data Element Value" ("NADEV") may denote, but is not limited to, the value displayed when an A-DDID is re-identified, or the value that would be displayed if a given A-DDID were re-identified. A NADEV may be generated by creating a derived or correlated version or subset of one or more elements of a data set, reflecting the application of one or more PETs or other privacy and/or security enhancing methods to the data set in order to restrict access to the data set, or at least to selected portions thereof. For example, if a data set contains a Data Subject's heart rate value of 65 beats per minute, that value may be generalized into two NADEVs, e.g., one specifying a range of "61-70 beats per minute" and one simply specifying "normal"; each NADEV may be suppressed or displayed independently without revealing the true data value of 65 beats per minute and without revealing the identity of the Data Subject.
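A minimal sketch of this generalization step follows; the ten-beat bucketing rule, the 60-100 bpm "normal" range, and the class and method names are assumptions made for this example:

import java.util.List;

final class NadevExample {
    // Derive two independently displayable NADEVs from a true heart-rate value.
    static List<String> deriveNadevs(int beatsPerMinute) {
        int low = ((beatsPerMinute - 1) / 10) * 10 + 1;   // e.g., 65 -> 61
        String range = low + "-" + (low + 9) + " beats per minute";
        String label = (beatsPerMinute >= 60 && beatsPerMinute <= 100) ? "normal" : "abnormal";
        return List.of(range, label);   // neither entry reveals the true value
    }

    public static void main(String[] args) {
        System.out.println(deriveNadevs(65));   // [61-70 beats per minute, normal]
    }
}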
In another example, the method may further include causing the association between the selected DDID and the one or more data attributes to expire. In yet another example, the method may include storing, in a database accessible to the computing device, information regarding the time periods during which the selected DDID was associated with different data attributes or attribute combinations, by means of a Time Key (TK) or otherwise.
In another embodiment, the method may further include re-associating the selected DDID with one or more other data attributes or attribute combinations after expiration of the association between the DDID and the one or more initial data attributes.
In one example, expiration of a DDID occurs at a predetermined time, or the expiration may occur after completion of a predetermined event, purpose or activity. In another example, a DDID may be authorized for use only for a given period of time and/or at a predetermined location.
In another example, the method may include changing the DDID associated with one or more data attributes, attribute combinations, and/or TDRs, where the change may occur randomly or programmatically, or may occur after completion of a predetermined activity, purpose, and/or event.
According to another aspect of another embodiment of the invention, a method for facilitating transactions over a network is disclosed herein, wherein the method may include the operations of: receiving, at a privacy server, a request from a client device to conduct an activity over the network; determining which of a plurality of data attributes or attribute combinations in a database are necessary to complete the requested activity; creating or accepting a DDID; associating the DDID with the determined data attributes to create a combined temporally unique data representation (TDR); making the combined TDR accessible to at least one network device to perform or initiate the requested activity; receiving a modified TDR including additional information related to the performed activity; and storing, in a database, association information for the modified TDR and/or the DDID-to-Data Subject association.
In one example, the at least one network device may include an internet service provider, a server operated by a merchant or service provider, a server operated by a mobile platform provider, or a server in a cloud computing environment.
According to another aspect of another embodiment of the present invention, a method of providing controlled distribution of electronic information is disclosed herein. In one example, the method may include: receiving, at a privacy server, a request to conduct an activity over a network; selecting those attributes of data located in a database accessible to the privacy server that are determined to be necessary to satisfy the request, wherein other attributes of the data not determined to be necessary are not selected; assigning or accepting, by an abstraction module of the privacy server, DDIDs for the selected attributes and/or the attribute combinations to which they apply, wherein the DDIDs do not reveal the unselected attributes; recording the time at which each DDID is assigned; receiving an indication that the requested activity is complete; receiving, at the privacy server, the DDID and the determined attributes and/or applicable attribute combinations, wherein the attributes have been modified to include information related to the performed activity; and recording the time at which the performed activity was completed and at which the DDID and the determined attributes and/or applicable attribute combinations were received at the privacy server.
In one example, the method may further include assigning additional DDIDs to one or more selected data attributes and/or attribute combinations contained within the TDR. In another example, the method may include re-associating the DDID and data attributes with the true identity of the data attribute, attribute combination, or Data Subject using a Time Key (TK) that reflects the recorded times. The method may also include reassigning the DDID to other data attributes and recording the time at which the DDID was reassigned.
According to another aspect of another embodiment of the present invention, disclosed herein is a computer-implemented method of improving data security, wherein the data includes at least one attribute. In one example, the method can include associating the at least one attribute with a DDID to create a temporally unique data representation (TDR), wherein the TDR limits access to data attributes to only those necessary to perform a given action, such as completing the purchase of an item from an online website.
In one example, the method may include assigning an Association Key (AK) to the temporally unique data representation (TDR), wherein access to the Association Key (AK) is required for authorized access to the TDR.
In another example, the method may further include expiring the association between the DDID and the at least one attribute, wherein the expiration occurs at a predetermined time and/or after completion of a predetermined event and/or activity. In another embodiment, the method may include re-associating the DDID with at least one different attribute after expiration of the association between the DDID and the at least one attribute. The method may further include storing, in a database, information relating to the one or more time periods during which the DDID was associated with different data attributes or attribute combinations, as reflected by an applicable Time Key (TK).
According to another aspect of another embodiment of the present invention, a system for improving electronic data security is disclosed herein. In one example, the system can include: a module configured to dynamically associate at least one attribute with at least one Data Subject, action, activity, process, and/or trait, the module being further configured to generate or accept a DDID and to associate the DDID with the at least one data attribute; a module configured to track activity related to the DDID and to associate any other electronic data generated by that activity with the DDID; and a module for storing the DDID, the tracked activity, and the time period during which the DDID was used to track the activity.
According to another aspect of another embodiment of the present invention, disclosed herein is an apparatus for conducting secure, private activities over a network. In one example, the apparatus may include a processor configured to execute program modules, wherein the program modules include at least a privacy client; a memory coupled to the processor; and a communication interface for receiving data over a network; wherein the privacy client is configured to receive, from a privacy server, a temporally unique data representation (TDR) comprising a DDID and the associated data attributes necessary for activity on the network.
In one example, the privacy client may be further configured to capture activity performed using the device and to associate the performed activity with the temporally unique data representation (TDR). In another example, the privacy client may be configured to transmit the captured activity and the TDR to the privacy server. In one example, the privacy client may reside on a mobile device as a mobile application. In another example, the privacy client may reside in, and be accessible over, a network as a cloud-based application. In another example, the privacy client may reside as a local application on the same computing device on which one or more privacy servers reside.
In another example, the apparatus may further comprise a geographic location module on the mobile device, wherein the temporally unique data representation (TDR) is modified using information from the geographic location module, and wherein the TDR restricts access to information about the identity of the device. The apparatus may also include a user interface configured to allow a user to modify the TDR, including the option of changing the DDID or the data attributes associated with a particular TDR. The user interface may include selectable options for sharing the TDR only with other network devices within a predetermined physical, virtual, or logical proximity to the mobile device.
In another example, the device may receive targeted advertising or marketing information based on the physical, virtual, or logical location of the mobile device in response to a shared temporally unique data representation (TDR), which, in one example, may include demographic information, time information, geographic location information, psychographic information, and/or other forms of information related to a user of the mobile device. In another example, the shared TDR may include information related to a purchase transaction made, or expected to be made, using the mobile device, and the method may further include receiving targeted advertising or marketing information based on a previous or expected purchase transaction. In this way, a vendor can learn, almost immediately, relevant characteristics of nearby users and potential customers, without knowing or learning the identity of such users, so that the vendor can tailor products and services to the interests of nearby users and potential customers in real time without compromising the privacy/anonymity of the users/potential customers.
In accordance with another aspect of another embodiment of the present invention, disclosed herein is a system for providing electronic data privacy and anonymity. In one example, the system may include at least one user device having a first privacy client operating thereon; at least one service provider device having a second privacy client running thereon; and at least one privacy server connected to the network, the privacy server being in communication with the first and second privacy clients; wherein the privacy server comprises an abstraction module that electronically links and separates data attributes and attribute combinations, and associates DDIDs with data attributes and/or attribute combinations.
In one example, the privacy server may include an authentication module that generates and/or accepts one or more of the DDIDs. In another example, the privacy server may include a maintenance module that stores data attributes and/or combinations of attributes of DDIDs and combinations thereof. In another example, the privacy server may include a verification module that verifies the integrity of the data attributes, attribute combinations, and DDIDs.
In another example, the privacy server may include an access log module that collects and stores information related to DDIDs and data attributes for use in post-event forensic analysis in the event of one or more errors.
In one example, the DDID expires after a predetermined time, and after the DDID expires, the abstraction module assigns the DDID to another Data attribute and/or another Data Subject.
In accordance with another aspect of another embodiment of the present invention, methods, computer-readable media, and systems are disclosed herein for: (i) transforming a multi-dimensional data set by enforcing one or more policies (at the same or different times) on at least one dimension, or a subset of one of the dimensions, of a given data set; (ii) transforming the data sets of part (i) at a time before, during, or after the original transformation, e.g., by creating one or more A-DDIDs; (iii) technically enforcing policies using Just-In-Time-Identity (JITI) keys or other types of access-control-based keys to restrict access to all or part of a data set; (iv) applying parametric or non-parametric techniques and/or mathematical methods to enable ranking or rating of the information in the transformed data set according to various industry-appropriate or industry-relevant value indicators; (v) enforcing one or more privacy policies on one or more individual "units" of data; and/or (vi) enabling an electronic marketplace for the purchase, sale, licensing, and/or other transactions involving policies, wherein such policies may be ranked or rated according to quantitative and/or qualitative measures of their effectiveness in anonymizing the data set.
In accordance with another aspect of another embodiment of the present invention, disclosed herein are methods, computer-readable media, and systems for analyzing the schema, metadata, structure, etc. of a data set using artificial intelligence algorithms to determine operations that may mask, generalize, or otherwise transform the data set so as to comply with a predetermined privacy policy.
In accordance with another aspect of another embodiment of the present invention, methods, computer-readable media, and systems are disclosed herein for providing privacy policies "as a service", e.g., over a network or through an application, to assist one or more users in complying with regulatory and/or contractual restrictions while enhancing data security and privacy in a manner that helps unlock the full value of the data (i.e., by enabling greater use of the data).
According to another aspect of another embodiment of the invention, disclosed herein are methods, computer-readable media, and systems for providing electronic data privacy and anonymity for user information stored in a decentralized manner, e.g., across a permissionless system or using an immutable and verifiable distributed ledger technology such as a blockchain.
Other embodiments of the disclosure are described herein. The features, utilities, and advantages of various embodiments of the disclosure will be apparent from the following more particular description of embodiments as illustrated in the accompanying drawings.
Drawings
Fig. 1 shows an example of a block diagram of a system comprising a privacy server according to an embodiment of the invention.
FIG. 1A shows an example of a block diagram of a system including a privacy server in which the present invention is provided as a service that interacts with external databases, according to one embodiment of the present invention.
Figure 1B illustrates different ways in which the allocation, application, expiration, and recycling of DDIDs may occur with respect to data attributes and/or attribute combinations according to different embodiments of the invention.
Fig. 1C-1 illustrates potential input and output flows of a system including a privacy server from the perspective of a trusted party, according to one embodiment of the invention.
Fig. 1C-2 illustrates potential input and output flows of a system including a privacy server from the perspective of a Data Subject, according to one embodiment of the invention.
FIG. 1D illustrates an example of using a DDID in conjunction with a network blood pressure monitor, according to one embodiment of the invention.
Figure 1E illustrates an example of using DDIDs in servicing patients with Sexually Transmitted Diseases (STDs) according to one embodiment of the invention.
FIG. 1F illustrates an example of the use of DDIDs in connection with providing coupons in accordance with one embodiment of the present invention.
FIG. 1G shows an example of using a DDID in conjunction with a physician viewing blood pressure levels, according to one embodiment of the present invention.
FIG. 1H illustrates an example of using DDIDs to implement dynamic data obfuscation in connection education related information according to one embodiment of the invention.
FIG. 1I illustrates an example of a process of performing Disassociation Level Determination (DLD) and creating an Anonymity Measurement Score (AMS) according to one embodiment of the present invention.
FIG. 1J illustrates an exemplary calculated anonymity measurement score according to one embodiment of the present invention.
Fig. 1K illustrates exemplary categories of consent/participation levels required by a Data Subject for certain calculated anonymity measurement scores according to one embodiment of the present invention.
Fig. 1L illustrates an example of using DDIDs in an emergency response area according to an embodiment of the present invention.
FIG. 1M illustrates an example of using security and privacy with Just-In-Time-Identity (JITI) according to one embodiment of the invention.
FIG. 1N illustrates an example of using Just-In-Time-Identity (JITI) -enabled security and privacy In accordance with one embodiment of the present invention.
FIG. 1P-1 shows an example of using static anonymous identifiers.
FIG. 1P-2 illustrates an example of using Just-In-Time-Identity (JITI) -enabled security and privacy In accordance with an embodiment of the present invention.
FIG. 1Q illustrates an example of using Just-In-Time-Identity (JITI) -enabled security and privacy In a healthcare environment, according to one embodiment of the invention.
FIG. 1R illustrates an example of a system for implementing Just-In-Time-Identity (JITI) -enabled security and privacy In accordance with one embodiment of the present invention.
FIG. 1S illustrates an example of a system for implementing Just-In-Time-Identity (JITI) enabled security and privacy to support OpenHealthPlatform (OH) according to one embodiment of the invention.
FIG. 1T illustrates an example of a system for implementing data de-risking policy management and access control according to one embodiment of the invention.
FIG. 1U illustrates examples of various data de-risking schemes according to one embodiment of the invention.
FIG. 1V illustrates an example of a marketplace for various data de-risking policies available for purchase, according to one embodiment of the invention.
FIG. 1W-1 illustrates an example of an intelligent policy compliance engine according to one embodiment of the invention.
FIGS. 1W-2 illustrate an exemplary flow diagram for using an intelligent policy compliance engine according to one embodiment of the present invention.
FIG. 1X-1 illustrates an exemplary system for providing data privacy services through a shim.
FIG. 1X-2 illustrates an exemplary system for providing data privacy services through online services from a web browser, device, or other sensor.
FIG. 1Y-1 illustrates a cloud-based platform and application for providing a system for de-identifying data.
FIGS. 1Y-2 illustrate a cloud-based platform and application for providing a system for re-identifying data that has been de-identified.
FIGS. 1Y-3 illustrate a cloud-based platform and application for providing a system integrated with an extract, transform and load (ETL) application.
FIG. 1Z-1 illustrates a decentralized network built based on blockchain technology in which anonymous privacy controls may be employed, in accordance with one or more embodiments.
FIG. 1Z-2 illustrates a decentralized network built based on blockchain technology in accordance with one or more embodiments, where anonymous privacy controls may be employed.
FIGS. 1Z-3 illustrate a decentralized network built based on blockchain technology in accordance with one or more embodiments, where anonymous privacy controls may be employed.
Figures 2-4 illustrate examples of the generation and use of TDRs according to one embodiment of the present invention.
Fig. 5 shows two example property combinations with different abstraction levels by means of the association function and the replacement function of the system according to one embodiment of the invention.
FIG. 6 illustrates an example of a process (from an example control entity and system perspective) for selecting attribute combinations, generating TDRs to abstract or anonymize data, and then re-associating or de-anonymizing from the data in accordance with an embodiment of the present invention.
FIG. 6A illustrates an example of a process of receiving attributes from one or more external databases, generating TDRs to abstract or anonymize data, and then re-associating or de-anonymizing data (from the perspective of example control entities and systems), according to one embodiment of the present invention.
FIG. 6B illustrates an example of a process (from the perspective of an example control entity and system) that provides dynamic anonymity for data elements contained in one or more databases that are deemed too sensitive to be revealed in a recognizable manner outside of an organization.
FIG. 7 shows an example of a process (from the perspective of a recipient entity) of the process of FIG. 6, according to one embodiment of the invention.
FIG. 8 shows an example of a process for verifying rights according to one embodiment of the invention.
Fig. 9 shows an example of a process of withholding key protection information unless authenticated according to one embodiment of the present invention.
FIG. 10 shows an example of a process for anonymously analyzing interests of an associated party in accordance with one embodiment of the present invention.
Figures 11-18 illustrate various examples of interactions between an associator, a service provider and a privacy server, including DDID and attribute combinations generated, sent and tracked, according to one embodiment of the invention.
FIG. 19 shows an example of a combination of attributes accessible by multiple service providers, and an example of a combination of attributes that is resent back to the privacy server by each service provider, according to one embodiment of the invention.
Figure 20 shows data accessible by an associated party including all attribute combinations sent to and retransmitted from a service provider according to one embodiment of the present invention.
Fig. 21 and 22 illustrate how a service provider acting as a control entity and providing information to various sellers may provide each seller with only those combinations of attributes necessary to perform the services assigned to it, according to one embodiment of the present invention.
Fig. 23 shows an example of implementation of DDID in the field of internet advertising according to an embodiment of the present invention.
Figures 24-25 illustrate an example of an implementation of a DDID in the healthcare field according to one embodiment of the present invention.
Fig. 26 shows an example of implementation of DDID in the field of mobile communication according to an embodiment of the present invention.
FIG. 27 shows a block diagram of an example of a programmable device implementing techniques for dynamically creating, assigning, changing, reallocating, and using dynamically changeable, temporally unique identifiers (DDIDs) in accordance with an embodiment of the present invention.
Figure 28 illustrates a block diagram showing a privacy client network and a privacy server for implementing techniques for dynamically creating, assigning, changing, reallocating, and using DDIDs, according to one embodiment of the invention.
Detailed Description
Various systems, methods, and apparatus for the privacy- and security-respecting management and use of information related to one or more Data Subjects, such as people, places, or things, and/or related actions, activities, processes, and/or traits, are disclosed herein. The systems, methods, and devices described herein abstract and separate data attributes and/or related attributes concerning Data Subjects and/or associated actions, activities, processes, and/or traits by decoupling the data from the Data Subjects to which it relates. A DDID may then be associated with a selected data attribute or selected attribute combination, thereby creating a TDR. In this manner, embodiments of the invention can be used to provide data security, privacy, anonymity, and accuracy for Data Subjects, such as, for example, people, places, or things, and/or related actions, activities, processes, and/or traits, even for data stored in a decentralized storage system, for example, in the form of the immutable, verifiable, and distributed ledgers provided by blockchain technology. Various embodiments of the invention are disclosed herein.
Dynamic Anonymity/circle of trust (CoT)
The premise of Dynamic Anonymity is that static anonymity is an illusion and that the use of static identifiers is fundamentally flawed. The system dynamically segments data stream elements at various stages and applies re-assignable dynamic de-identifiers (DDIDs) to those elements (note: although the dynamic segmentation may reflect the passage of time, it is more likely to be determined by activity, location, and/or subject matter) to minimize the risk of inadvertent sharing of information in transit, in use, or at rest, while preserving the ability of the intended recipient, and no other party, to re-stitch the data stream elements.
Cleartext primary keys can be used internally within a circle of trust ("CoT"), as shown in FIG. 1C-1, to identify Data Subjects, actions, activities, processes, and/or traits; however, these keys are not shared outside the circle of trust. Instead, Dynamic Anonymity uses dynamically changing and re-assignable compound keys outside the circle of trust, each of which may include (i) a DDID and (ii) the time period/purpose for which the DDID is associated with a Data Subject, action, activity, process, and/or trait. Information about this association is not available outside the circle of trust (a DDID representing a connection to one or more Data Subjects, actions, activities, processes, and/or traits is not re-identifiable if it contains no recoverable information leading back to them).
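As a sketch only, such a compound key might be modeled as follows; the field names are assumptions made for illustration:

import java.time.Instant;

// Compound key used outside the circle of trust: a DDID plus the time
// period/purpose for which it is associated with a Data Subject, action,
// activity, process, or trait. Only the CoT can resolve the association.
record CompoundKey(String ddid, Instant validFrom, Instant validUntil, String purpose) {}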
Dynamic Anonymity enhances privacy, anonymity, and personal data protection in distributed platforms/decentralized ecosystems, while providing superior access to, and use of, data in accordance with policies established by or on behalf of Data Subjects. Thus, everyone (including those who choose to use a closed or distributed system) benefits from enhanced data privacy and anonymity.
Dynamic Anonymity provides immediate benefits without modification of existing business and technical practices. Because DDIDs change dynamically and are temporally unique, current systems and processes (e.g., web browsers and data analytics engines) cannot identify relationships between and among data elements. These systems and processes can use existing functionality to process information without creating inferences, correlations, profiles, or conclusions, unless explicitly authorized by the Data Subject and the trusted party/agent through a circle of trust (CoT). However, new business and technical practices that make use of the specific attributes and functions of DDIDs, Dynamic Anonymity, and/or circles of trust (CoT) will bring even more significant advantages.
Dynamic Anonymity provides benefits at four distinct points in data processing:
A. Data acquisition;
B. Data transmission/storage;
C. Data analysis; and
D. Data privacy/anonymity control.
At each point, the data is protected as specified by the permissions (PERMS) established by, or on behalf of, the Data Subject to which it relates.
A. Data acquisition
In applications where a static identifier would normally be associated with the capture of data related to a Data Subject, Dynamic Anonymity can provide:
1. Dynamic de-identifiers (DDIDs) that change over time (triggered by the passage of time, a change of purpose, a temporary cessation of activity, or a change in virtual or physical location), thereby limiting the ability to track, analyze, or associate data with Data Subjects, actions, activities, processes, and/or traits.
2. Association of each DDID with the applicable Data Subject(s), action, activity, process, and/or trait, known and stored only within the applicable circle of trust (CoT).
Dynamic Anonymity also provides the optional capability of storing the data associated with a DDID within the CoT.
A key feature of Dynamic Anonymity is the ability to anonymize and segregate data at the data element level rather than at the data record level, i.e., at the level of individual data elements related to a Data Subject, action, activity, process, and/or trait, rather than at the level of data records representing all or most of the information related to a Data Subject, action, activity, process, and/or trait. The circle of trust retains the relationship information between data elements and Data Subjects, actions, activities, processes, and/or traits, to allow re-association in accordance with the permissions (sometimes referred to herein as PERMS) established by the Data Subject and/or the privacy/anonymity policies and/or rules established on behalf of the Data Subject.
Example: Search Engines
Consider a person who uses a particular search engine regularly. Currently, the search engine assigns the person (via their browser) a "cookie" or other digital footprint tracker that persists for months or years and then accumulates observed data (e.g., search terms, clicked links, location data) over time; that data is likely to be analyzed and further aggregated by multiple parties, often revealing personally identifiable information without the consent of the Data Subject.
Dynamic Anonymity can leverage the search engine's natural behavior of creating a new cookie/digital footprint tracker for each Data Subject that appears to interact with it for the first time: clearing the history, cache, cookie/digital footprint tracker, and related data causes the search engine to generate a new cookie/digital footprint tracker for the Data Subject. A circle of trust (CoT) may store information about the association of each cookie/digital footprint tracker with the Data Subject, and may optionally store the list of queries and selected links.
In this way, the search engine can still access aggregate data (search terms used, websites visited, advertisements clicked, etc.) but cannot draw inferences about individual Data Subjects from the observed data. The CoT can allow the search engine to perform more detailed analysis if and as permitted by the Data Subject and/or the privacy/anonymity policies and/or rules established on behalf of the Data Subject. This can be accomplished using an HTTP proxy or browser extension, without modification of (or cooperation from) existing search engines.
In the past, anonymous tracking cookies were thought to have solved the problem of supporting both privacy and analytics. However, anonymous tracking cookies fail to achieve this goal because all data is stored together and associated with a random but static identifier, which makes it too easy to generate information that is linked or linkable to a Data Subject ("personal data" or "PD"), thereby invalidating or weakening the value of the static "anonymous" identifier. Dynamic Anonymity overcomes these shortcomings by employing dynamically changing and re-assignable DDIDs, storing the resulting DDID associations and obscuring keys within a circle of trust, and providing a unique interaction model that enables Data Subject participation with trusted parties/third-party participants.
B. Data transmission/storage
A CoT consists of one or more trusted parties, each of which can provide one or more independent data storage facilities, together with secure methods of segmenting and transferring sensitive data to those data stores.
Alternatively, an application developer complying with Dynamic Anonymity requirements may choose to store only the Data Subject-to-DDID associations within the CoT, and instead use Dynamic Anonymity-defined processes to obscure, encrypt, and/or segment data (or use a Dynamic Anonymity-enabled toolkit for such processing), allowing the application to securely store generated or collected information in its own facilities without losing context or business value.
In the past, techniques similar to those employed by the present invention have been used to:
- segment data;
- encrypt and obfuscate data during transmission; and
- distribute, obfuscate, and secure data during storage.
However, Dynamic Anonymity improves on these previous methods by:
- obscuring data at the data element (rather than data record) level using dynamically changing and re-assignable DDIDs;
- storing the resulting DDID associations/obscuring keys within a circle of trust; and
- providing a unique interaction model that enables participation between Data Subjects and trusted/third-party participants.
C. Data analysis
Conventional techniques for data cleansing (also referred to as data scrubbing or data de-identification) suffer, paradoxically, from two distinct and opposite problems.
1. A given data cleansing technique may not be effective at all. Despite careful efforts to hide personal data, even using legally recognized techniques, it may still be possible to identify Data Subjects and personal data from the "cleaned" data. Three notable examples:
a. In the mid-1990s, the Massachusetts Group Insurance Commission (GIC) released data on individual hospital visits by state employees to assist important research. Latanya Sweeney, then a researcher at MIT, purchased the voter registration records for Cambridge, Massachusetts, and by linking two completely innocuous data sets she was able to re-identify the GIC record of Massachusetts Governor William Weld, despite the fact that it had been "anonymized" by deleting all obvious identifiers such as name, address, and social security number.
b. In 2006, Arvind Narayanan, then a graduate student at the University of Texas at Austin, together with his advisor, showed that many Netflix users could be re-identified by linking the "anonymized" Netflix data set to the Internet Movie Database (IMDb), where viewers often rate movies under their own names.
c. In 2013, a team led by Yaniv Erlich of the Whitehead Institute for Biomedical Research re-identified men who had participated in the 1000 Genomes Project, an international consortium that placed the sequenced genomes of "unidentified" people (who, it turned out, numbered about 2,500) into an open online database; the men had also participated in studies of Utah Mormon church families.
2. More effective data cleansing techniques may reduce the business value of the data, i.e., many obfuscation techniques are lossy.
The Dynamic Anonymity approach to data privacy/anonymity avoids both of these pitfalls simultaneously.
D. Data privacy/anonymity control
To protect personal data, Dynamic Anonymity specifies and enforces data privacy/anonymity through a number of means:
1. A system for determining the privacy/anonymity level for each potential type of exposure of data related to a Data Subject, action, activity, process, and/or trait. These privacy/anonymity levels may comprise a continuum of discrete values (between the extremes of complete privacy/anonymity and complete public exposure), and/or mathematical descriptions of such discrete values ("Anonymity Measurement Scores" or "AMS").
2. PERMS, which specify the operations allowed or restricted by policy on the data (e.g., "share", "update").
3. PERMS associate levels of access, permissions, and data with one another, granting or denying particular levels of access to the data according to one or more conditions, including data type, time, the organization seeking access, and the like.
4. The PERMS of a Data Subject may also be merged with, or limited by, legal policies (for example, medical data in the United States must be protected under the United States Health Insurance Portability and Accountability Act (HIPAA)).
In addition, proposals to modify or grant specific and limited permissions can be made to the Data Subject, and accepted if allowed by the trusted party and consented to by the Data Subject.
Dynamic Anonymity also uses these privacy/anonymity level determinations to prevent misuse of data: whether inside or outside the circle of trust, data may only be unmasked and analyzed in a manner consistent with the privacy/anonymity level specified by, or on behalf of, each Data Subject.
Dynamic De-Identifiers (DDIDs)
A dynamic de-identifier (DDID) is a time-bounded pseudonym that both references and obfuscates (i) the value of a primary key that refers to a Data Subject, action, activity, process, and/or trait, (ii) an attribute value of a Data Subject, action, activity, process, and/or trait (e.g., a zip code), and/or (iii) the type or kind of data associated with a Data Subject, action, activity, process, and/or trait (e.g., the fact that some encoded value is a zip code).
DDIDs provide additional protection when there is no discernible, inherent, or computable relationship between the content of a DDID and the value (plaintext) to which it refers. Furthermore, the association between any given DDID and its plaintext value is not disclosed outside the circle of trust (CoT). Unlike static identifiers, obscured values or keys used in different contexts, for different purposes, or at different times need not have the same associated DDID.
DDIDs may be generated within a circle of trust or, if the above conditions are met, an externally created ID may be used as a DDID.
DDIDs have time limits
As previously mentioned, DDID associations are bounded in time, which means that a particular DDID may reference one value at one time but (if needed) another value at another time, even in the same context and for a single type of data (e.g., zip codes).
This necessarily means that, in order to decode or disclose the meaning of a particular DDID, the application must also retain knowledge of the time at which the DDID was applied.
This knowledge can be explicit (i.e., the time of assignment may be stored as part of the record or document storing the DDID) or implicit (e.g., an entire data set may have been obscured in one batch, regardless of how long the process actually took, and assumed to occupy a single moment in time, so that there is only one consistent set of DDID mappings per field type). To reconstruct such data, a reference to the corresponding set of DDID/value associations (stored in the CoT) must also be provided.
DDIDs are purposeful
Note that DDIDs are also bounded by context or purpose, meaning that the same DDID may even recur simultaneously in multiple contexts. For example, consider a stream of records, each containing a Social Security Number (SSN) and a zip code, all occupying one block of time. In this case, a particular DDID may be used as a substitute for both a zip code and an SSN.
As mentioned above, this means that some indication of the context (e.g., whether a given value is a zip code or an SSN) must also be available in order to decode the meaning of a particular DDID.
Replacing data with DDID
Consider the task of replacing a single data stream (data of the same type (e.g., zip code or SSN) occupying the same time block) with DDIDs. A (Java) "pseudo-code" description of an Application Programming Interface (API) that performs this operation in one potential embodiment of the present invention might look like this:
interface DDIDMap {
    DDID protect(Value plaintext);
    Value expose(DDID ddid);
}
In plain English, an "interface" means that we are defining a set of functions (named "DDIDMap") that operate on the same underlying data. Type names here begin with capital letters (e.g., "DDID"), while variable or function parameter names begin with lower-case letters (e.g., the "plaintext" function parameter must be data of type "Value", where "Value" is simply a stand-in for any data type that can be obscured: an ID, an amount, a name, a zip code, etc.).
The function "protect()" accepts a plaintext value and returns the corresponding DDID. If the value has been seen previously, its previously assigned DDID is returned. If it has never been encountered before, a new DDID (unique within this data set so far) is generated, associated with the value, and returned.
The other function, "expose()", reverses the process: when a DDID is passed to it, it looks up and returns the plaintext value that was previously encoded as that DDID. If the given DDID was never issued, the call fails with an error.
The data managed by these operations is thus a bi-directional mapping: from each plaintext value to the DDID replacing it, and from each DDID back to the original value.
Note that although we have said that a given DDID can only reference one value, a variant of the algorithm could be implemented that allows one value to be associated with multiple DDIDs, if desired.
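For illustration, the following is a minimal in-memory sketch of the bi-directional map just described. It builds on the pseudo-code interfaces in this document ("DDIDMap" above and "DDIDFactory" below) and assumes that "Value" and "DDID" implement equals()/hashCode(); persistence, security, and CoT integration are deliberately omitted:

import java.util.HashMap;
import java.util.Map;

final class SimpleDDIDMap implements DDIDMap {
    private final Map<Value, DDID> byValue = new HashMap<>();
    private final Map<DDID, Value> byDDID = new HashMap<>();
    private final DDIDFactory factory;

    SimpleDDIDMap(DDIDFactory factory) { this.factory = factory; }

    // Returns the previously assigned DDID for this value, or
    // generates, records, and returns a new one.
    public DDID protect(Value plaintext) {
        DDID ddid = byValue.get(plaintext);
        if (ddid == null) {
            ddid = factory.createDDID();
            byValue.put(plaintext, ddid);
            byDDID.put(ddid, plaintext);
        }
        return ddid;
    }

    // Looks up the value previously encoded as this DDID; fails otherwise.
    public Value expose(DDID ddid) {
        Value plaintext = byDDID.get(ddid);
        if (plaintext == null) {
            throw new IllegalArgumentException("unknown DDID");
        }
        return plaintext;
    }
}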
Managing DDID maps by time and purpose
Recall that the bi-directional DDID/value map described above operates (i) on a single type of data (i.e., data with the same type, context, and purpose), and (ii) within a single time period. To support operations across time and context, we can posit another potential API that provides the appropriate DDID-to-value map for a given time and purpose:
interface DDIDMapManager {
    DDIDMap getMap(Context context, Time time);
}
Here, a "context" is (or issues) a key that refers to the particular kind of data being obscured. (This is also sometimes referred to elsewhere in this document as an "association key" or "A_K".) For example, a context may be the name of the table and column in which the data to be obscured resides (e.g., "employee.salary"). It may also include other non-temporal indications of purpose or scope.
Since each DDID-to-value map spans one time block, and there are many instants of time within a block, some function must exist to find the time block associated with any given time. (More on this later.)
DDID generation and time restriction policies
Note that different kinds of data may employ different DDID replacement strategies. In addition to the choices discussed in the next two sections, DDIDs may vary in size, in whether they are globally unique or unique only to a data set (or time block), in the encoding they use (e.g., integer or text), and so on. Moreover, although DDID generation should generally be random, it may be desirable for demonstration, testing, or debugging to use a deterministic or pseudo-random DDID generator.
Unique or reusable DDIDs
One potential policy may allow a particular DDID to be assigned to two different Data Subjects in the same context, but during two different time periods. For example, in the same time-anchored record set, the DDID "X3Q" might refer to the zip code "80228" at one time (in one time period) and to "12124" at a later time (in another time period). (We call this strategy "DDID reuse.")
An alternative is to prohibit such "reuse" and to specify that a given DDID may only ever reference a single subject in a given context. (That subject may still receive different DDIDs over time.)
The choice between these two strategies involves a trade-off between increased obscurity and the ease of performing aggregate queries on the obscured data.
Suppose we wish to count patients by zip code. If the zip-code DDIDs are unique, we can aggregate the counts for each DDID and then ask the CoT to complete the query by resolving those DDIDs into their corresponding zip codes and aggregating again. However, if DDIDs have been "reused", the entire list of DDIDs and their corresponding times must be sent to the CoT for resolution (and aggregation), because we cannot otherwise determine whether two instances of the same DDID reference the same value.
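A sketch of the unique-DDID case follows, assuming obscured zip codes arrive as strings; the class and method names are assumptions, and the CoT resolution step is indicated in comments only:

import java.util.HashMap;
import java.util.List;
import java.util.Map;

final class AggregateByDDID {
    // Step 1 (outside the CoT): count patients per obscured zip-code DDID.
    static Map<String, Long> countByZipDDID(List<String> zipDDIDs) {
        Map<String, Long> counts = new HashMap<>();
        for (String ddid : zipDDIDs) {
            counts.merge(ddid, 1L, Long::sum);
        }
        // Step 2 (inside the CoT): resolve each DDID to its zip code and
        // aggregate again. With "reused" DDIDs, the full list of DDIDs and
        // their times would have to be sent to the CoT instead.
        return counts;
    }
}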
DDID time period
Implementations also have the freedom to choose different strategies for splitting DDID maps across time. Time blocks may differ in size and/or offset; sizes may be fixed, random, or determined by the number of records per assignment. (Note that, for a given context, using an infinitely long time block produces behavior equivalent to using a "static" identifier.)
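One possible fixed-size blocking strategy, matching the three-day blocks used in the example below, computes a time key from the epoch day; the arithmetic and names are assumptions made for illustration:

import java.time.LocalDate;

final class ThreeDayBlocks {
    // Days 0-2 share time key 0, days 3-5 share time key 1, and so on
    // (for dates on or after the epoch); all DDID assignments within one
    // block use the same DDID map.
    static long timeKeyFor(LocalDate date) {
        return date.toEpochDay() / 3;
    }
}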
Implementation
Although there may be many policies for creating new DDIDs, the API for generating them looks (essentially) the same no matter which policy is implemented "behind the scenes". For example:
interface DDIDFactory {
    DDID createDDID();
}
Next, consider the task of determining which time block is associated with a given DDID assignment. Since a time period may contain many instants of time, we need some sort of "time key" (sometimes abbreviated "T_K" elsewhere in this document) for each time period. This means that a function is required to obtain the appropriate key for any given time:
TimeKey timeKey = getTimeKey(Time time);
Further, note that both the time-blocking and DDID-generation policies depend on the type of data being obscured. In short, both are associated with a given "context" (which includes or implies the notion of data type and usage), meaning that the "Context" API must provide at least the following functions:
interface Context {
    TimeKey getTimeKey(Time time);
    DDIDFactory createDDIDFactory();
}
Given these two additional functions, we can imagine that the "getMap()" implementation in "DDIDMapManager" (shown earlier) might look like this:
DDIDMap getMap(Context context, Time time) {
    TimeKey timeKey = context.getTimeKey(time);
    DDIDMap map = getExistingMap(context, timeKey);
    if (map == null) {   // no map yet for this context and time block
        DDIDFactory factory = context.createDDIDFactory();
        map = createMap(factory);
        storeNewMap(context, timeKey, map);
    }
    return map;
}
Here, "getExistingMap()" is a function that finds the map assigned to a given context and time key, "createMap()" creates a map that will use the given DDID factory, and "storeNewMap()" associates the newly created map with the context and time key so that the map can be retrieved later.
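Hypothetical usage of this API to obscure one field of one record might look like the following; the method and parameter names are assumptions:

// Obscure a customer number captured at a given time.
DDID obscureCustomerNumber(DDIDMapManager manager, Context customerContext,
                           Time recordTime, Value customerNumber) {
    DDIDMap map = manager.getMap(customerContext, recordTime); // map for this time block
    return map.protect(customerNumber);                        // e.g., 500 -> 0 in Table 2 below
}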
Hiding data and attribute types using context
Dynamic Anonymity may define the following different types of data to protect: (i) primary keys relating to Data Subjects, actions, activities, processes, and/or traits (e.g., employee numbers); (ii) attribute data associated with, but not unique to, Data Subjects, actions, activities, processes, and/or traits (e.g., employee zip codes); and (iii) the indications of the types of the disassociated (obscured) data elements themselves (the "association key" or "A_K").
Each of the above can be handled by defining different contexts. First, we will discuss (i) and (ii), both of which are achieved by obscuring the data value (replaced by a "replacement key" DDID, abbreviated "R_K" elsewhere). We will address (iii), obscuring the indication of the associated data element type, below.
Consider a simple example: an order table records which products each customer purchased on a given day. Each record has a day number, a customer number, and a product number. We want to obscure these data for use or analysis by some third party outside the CoT. In particular, we want to obscure the customer and product IDs but leave the dates intact.
To this end, we can create two "Context" instances: one for the "customer number" and one for the "product number". Although DDIDs should preferably be random, for our purposes let us assume that our "DDIDFactory" creates integer DDIDs sequentially, starting from zero. Further assume that each DDID map spans only three days, so that after three days a new set of DDID maps is used. This also means that DDIDs will be "reused": the same DDID may reference different values in different blocks. (This is not an ideal encoding strategy and is used here for illustrative purposes only.)
Table 1 shows some plaintext sample data:
TABLE 1
Day    Customer number    Product number
1      500                ZZZ
2      600                XXX
3      600                YYY
4      700                TTT
5      500                YYY
6      600                TTT
After being obscured (as described above), the data will appear as shown in Table 2 below:
TABLE 2
Day    Customer number    Product number
1      0                  0
2      1                  1
3      1                  2
4      0                  0
5      1                  1
6      2                  0
To understand this, read down each column and think in groups of three days (the first DDID period covers days 1-3 for each obscured field; the second covers days 4-6).
For the first three days, the customer numbers are: 500, 600, 600. These are encoded as: 0, 1, 1. (Note that 600 repeats, so its DDID, 1, also repeats.)
For the next three days, the customer numbers are: 700, 500, 600. The results (starting again from 0) are: 0, 1, 2. (Note that 500, which was previously 0, is now 1.)
The product numbers use a separate context, and therefore a separate DDID stream, so encoding also starts from zero:
For the first time block, (ZZZ, XXX, YYY) becomes (0, 1, 2).
For the second time block, (TTT, YYY, TTT) becomes (0, 1, 0).
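The following self-contained sketch reproduces Table 2 from Table 1 using sequential integer DDIDs starting from zero and one map per field per three-day block; the class and method names are assumptions made for illustration:

import java.util.HashMap;
import java.util.Map;

final class TableWalkthrough {
    // One DDID map per (field, time block); integer DDIDs issued from zero.
    private static final Map<String, Map<String, Integer>> maps = new HashMap<>();

    static int protect(String field, int day, String plaintext) {
        String blockKey = field + "#" + ((day - 1) / 3); // days 1-3 -> block 0, days 4-6 -> block 1
        Map<String, Integer> map = maps.computeIfAbsent(blockKey, k -> new HashMap<>());
        Integer ddid = map.get(plaintext);
        if (ddid == null) {
            ddid = map.size();   // next sequential integer DDID
            map.put(plaintext, ddid);
        }
        return ddid;
    }

    public static void main(String[] args) {
        String[][] rows = { {"500", "ZZZ"}, {"600", "XXX"}, {"600", "YYY"},
                            {"700", "TTT"}, {"500", "YYY"}, {"600", "TTT"} };
        for (int day = 1; day <= 6; day++) {
            System.out.println(day + "  " + protect("customer", day, rows[day - 1][0])
                                   + "  " + protect("product", day, rows[day - 1][1]));
        }
        // Prints the obscured day/customer/product rows shown in Table 2.
    }
}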
Another "context" may be used to obscure the indications of the types (item (iii) above) of the disassociated (obscured) data elements, where a column name is an example of an association key (A_K). This can be done either by using one DDID-to-value mapping for the entire data set (effectively replacing each column name with a DDID) or in time blocks (as with the other fields in this example). If an appropriately random DDID generation strategy is used, the affected records cannot be analyzed without the help of the circle of trust.
Notes about location and time
The example API defined above assumes that when data is encoded, the encoding time is passed with each datum or record. This is only needed when a DDID is "reused" in the same context (thus requiring the time to distinguish the two potential meanings of the DDID). When each context assigns a DDID to only one value, the DDID alone is sufficient to find the (single) original value.
Time may also become an issue when "reused" DDIDs are used across different systems whose notions of time may differ slightly. If the time associated with a DDID encoding cannot be passed, a (chronological) "buffer" may be used to prevent a DDID from being reused too close to its original assignment; and when the time associated with the data to be encoded can be delivered, that time may be reconciled against the local system clock: skew within a small window (less than the DDID reuse buffer) can be tolerated, while larger differences trigger an error report. Finally, note that there is flexibility in where data is encoded: data can be streamed to a computer residing within the CoT, encoded there, and then sent on to its destination. Alternatively, the encoding portion of the algorithm described above may run outside the circle of trust, provided that the resulting associations of DDIDs and values are (a) not stored on the local host, and (b) streamed securely (e.g., using encryption) to the CoT host, with appropriate measures taken to prevent data loss and to achieve persistence, thereby reducing latency for critical applications.
Dynamic Anonymity: de-identification without loss of value
"De-identification" techniques traditionally used in certain situations (e.g., HIPAA or health-related contexts) to protect data privacy/anonymity are defensive in nature: a series of masking steps is applied to direct identifiers (e.g., name, address), and masking and/or statistics-based operations are applied to quasi-identifiers (e.g., age, gender, occupation), to reduce the likelihood of re-identification by unauthorized third parties.
Dynamic Anonymity, by contrast, offers significant value affirmatively: information can be retained and used for authorized purposes without exposing any of the data to a statistically meaningful risk of re-identification. Dynamic Anonymity rejects the traditional dichotomy that one must sacrifice the value of the information content in order to minimize risk. Instead, it minimizes both risk and information loss, allowing most, if not all, of the information to be recovered, but only under the authority of the Data Subject/trusted party, and never under the authority of an unauthorized adversary or "black hat" hacker.
Dynamic Anonymity can uniquely enable information to be used by multiple parties in different ways within a controlled environment, which helps unlock and maximize the value of the data. Dynamic Anonymity can maximize the value of potential business intelligence, research, analysis and other processes, while significantly improving the quality and performance of data privacy/anonymity processes.
When sensitive data is collected or stored, it may be "separated" from the Data Subject using one or more of the following strategies, none of which results in any loss of value:
1. Segmentation: sensitive data may be divided into portions by data type and sent and/or stored separately (in separate circles of trust, or using different DDID mapping sets maintained by the same trusted party) so that no block alone yields any personal data.
2. Substitution: static identifiers may be replaced with dynamically changing and re-assignable DDIDs to sever the relationship between the data and the Data Subject to which the data refers.
3. Obfuscation: data values and data-type indicators may also be replaced with DDIDs.
The DDIDs associated with these operations are stored in a circle of trust (CoT), as shown in FIG. 1C-1; thus, the original data can be reconstructed by reversing these transformations, but only in coordination with the CoT itself, and thus only if such permission is granted by the Data Subject and/or on behalf of the Data Subject.
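The following sketch illustrates, under assumed names and with a trivial in-memory store standing in for the circle of trust, how the three strategies (segmentation, substitution, obfuscation) might be applied so that reversal is possible only through the CoT:

```python
import secrets

class CircleOfTrust:
    """Holds DDID-to-value mappings; only the CoT can reverse them (sketch)."""
    def __init__(self):
        self.ddid_to_value = {}

    def ddid_for(self, value):
        ddid = secrets.token_hex(8)          # dynamically changing identifier
        self.ddid_to_value[ddid] = value
        return ddid

    def reveal(self, ddid, authorized):
        if not authorized:                   # permission must come from the Data Subject
            raise PermissionError("re-identification not permitted")
        return self.ddid_to_value[ddid]

cot = CircleOfTrust()
record = {"name": "Alice", "diagnosis": "J45"}

# 1. Segmentation: split by data type; each part is shipped/stored separately
parts = [{k: v} for k, v in record.items()]

# 2. Substitution: replace the static identifier with a DDID
anonymized = dict(record, name=cot.ddid_for(record["name"]))

# 3. Obfuscation: replace values *and* the data-type indicators with DDIDs
obscured = {cot.ddid_for(k): cot.ddid_for(v) for k, v in record.items()}
```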
FIG. 1 illustrates an example of an embodiment of the invention, including a system with a privacy server 50 or privacy server module that securely manages various data attributes and data attribute combinations related to Data Subjects (which may include, but are not limited to, behavioral data, transaction history, credit ratings, identity information, social network data, personal history information, medical and employment information, and educational history) for use in different applications 56. These applications 56 may include, but are not limited to:
○ Medical care applications
■ Medical records
■ Mobile applications
■ Real-time intensive care applications
■ Regulatory compliance (e.g., HIPAA)
■ Research
○ Educational applications
■ Student records
■ Research
○ Mobile applications
■ Geographic location (beacons, GPS, Wi-Fi fingerprinting)
■ Mobile payments and loyalty
○ Financial services applications
■ Banks, brokerages, etc.
■ Payment processing
■ Payment Card Industry (PCI) security
■ Authorization
■ Verifying cardholder identity
■ Regulatory compliance
■ Research
■ Credit assessment
■ Fraud detection
○ Network applications
■ Advertisement delivery
■ Content review
■ Electronic commerce
■ Social networking
○ Internet of Things applications
■ Telematics
■ Smart grid
■ Smart city
● Traffic monitoring
● Utility monitoring
o Power
o Fuel oil
o Water/wastewater
● Waste management
■ Smart office
■ Smart factory
■ Smart home
● Networked entertainment
o Television
o Streaming devices
● Automation
o Heating, ventilation and air conditioning
o Lighting
● Security features
o Window/door locks
o Fire/smoke/carbon monoxide detectors
● Appliances
■ Smart vehicles
■ Agricultural sensors
■ Wearable devices
● Healthcare monitoring
● Fitness equipment
● Glasses
● Clothing
■ Drones
○ Private wireless/wired networks
■ Crop sensors
■ Tagged animal tracking
■ Troop movements
○ Private security applications
○ E-commerce applications
○ Offline retail applications
○ Human resources/recruiting applications
○ Government applications
■ National security applications
● Call detail record analysis
● Website browsing behavior analysis
● Online and offline purchasing behavior analysis
● Travel behavior analysis
● Social media activity analysis
● Analysis of circles of friends, acquaintances, and other relationships
○ Attorney/law firm applications
■ Maintaining confidentiality/attorney-client privilege
■ Electronic discovery
○ Customer contest registration applications
○ Dating applications
FIG. 1A illustrates an example of one embodiment of the invention, including a system with a privacy server 50 or privacy server module that receives electronic data from one or more external databases 82 and securely converts various data attributes and data attribute combinations related to Data Subjects from those databases (possibly including, but not limited to, behavioral data, transaction history, credit ratings, identity information, social network data, personal history information, employment information, and medical and educational history), which are stored in TDRs for different applications. Alternatively, the application may store only the association between Data Subjects and DDIDs in the privacy server 50 and use the processes defined by Dynamic Anonymity to mask, encrypt and/or segment the data stored in the external databases 82. In this manner, the association between Data Subjects and DDIDs stored in the privacy server 50 may provide greater context and/or commercial value for information generated, collected and/or stored in the external databases 82.
In one example, embodiments of the invention may form a secure and comprehensive aggregated data profile 58 for Data Subjects across one or more applications 56. A Data Subject, or a party associated therewith such as user 59, can anonymously communicate, or selectively disclose, the identity and/or data attributes of the Data Subject from the aggregated data profile 58 over a network 72 to a seller, service provider, advertiser, or other entity or party interested in communicating 57 with Data Subjects (e.g., to offer a service or conduct a purchase transaction) based on characteristics represented in the aggregated data profile 58, which is comprised of data attributes, combinations of data attributes, or portions thereof, possibly from unrelated data sources. In this manner, embodiments of the present invention provide digital rights management for individuals ("DRMI"), in which a Data Subject, affiliate or third party manages data attributes and data attribute combinations related to the Data Subject, and digital rights management for de-identification ("DRMD"), in which a third party manages the de-identification of data attributes and data attribute combinations associated with one or more Data Subjects. In one example, the extent to which information about data attributes, data attribute combinations, Data Subjects, and/or related parties may be provided to other parties may be controlled by embodiments of the present invention.
In the example of FIGS. 1 and 1A, a plurality of users 59 (e.g., Data Subjects or service providers) utilize devices, such as smart devices 70 (e.g., wearable, mobile, or non-mobile smart devices), smartphones, tablets, laptops, desktop computers, wired or wireless devices, or other computing devices running a privacy client application 60, to access a network 72, such as the internet. As shown in FIGS. 1 and 1A, a system 80 is coupled to and in communication with the internet or another public or private network, and may include a privacy server 50 securely coupled to one or more databases 82. In one example, the privacy server 50 may be implemented using computer program modules, code products or modules running on a server or other computing device. The one or more databases 82 may be implemented using any conventional database technology, including techniques that securely store (e.g., by encryption) data in redundant locations, such as, but not limited to, RAID storage, network attached storage, or any other conventional database technology.
In one example, the privacy server 50 implements one or more of the operations, processes, functions or process steps described herein, and the privacy server 50 may include or be configured to include other operations, functions or process steps as desired in certain embodiments of the present invention, including but not limited to the following processes, operations or functions performed by the indicated modules:
The authentication module 51, which may provide both internal and external authentication, comprises the following processes:
a. Internal authentication of TDRs requested by privacy clients 60 and of TDRs generated by the privacy server 50.
b. External authentication, performed before allowing participation in a desired action, activity or process, to approve receipt of the Time Keys (TKs), Association Keys (AKs) and/or Replacement Keys (RKs) that may be necessary to unlock the content of a TDR, and to authenticate recipients using TDRs.
c. One example implementation of the authorization module may include allowing the ability to request generation of DDIDs and associated TDRs to be delegated to other parties authorized by the controlling entity.
Abstraction module 52, which may provide internal and external abstractions, may include one or more of the following processes:
a. Selecting DDIDs by generating a unique DDID, or by accepting or modifying a temporally unique, dynamically changing value to be used as a DDID.
b. Associating the DDID with a data attribute or combination of attributes to form a TDR for a given Data Subject, operation, activity, process, or trait.
c. Including only a portion of the relevant data attributes in a TDR, thereby disassociating data attributes that relate to a Data Subject and/or to a given operation, activity, process, or trait.
d. Replacing one or more data attributes contained in one or more TDRs with DDIDs.
e. Replacing with DDIDs one or more references to external networks, the internet, intranets, and/or computing devices that may be integrated with, or in communication with, one or more embodiments of the invention.
Maintenance module 53, which may store:
a. data relating to a Data Subject, operation, activity, process, or trait, i.e. "related data" (defined as data initially associated with a DDID and/or data aggregated with a DDID during and/or after the associated time period) and/or the DDIDs themselves; and
b. key information, including (a) Time Keys (TKs) reflecting the time period during which each DDID was associated with a particular Data Subject, attribute, combination of attributes, activity, process or trait, (b) Association Keys (AKs), and/or (c) Replacement Keys (RKs);
thus allowing a TDR to be later re-associated with a particular attribute, combination of attributes, action, activity, process, trait, and/or associated Data Subject. In addition, the maintenance module may perform further analysis and processing of attributes or combinations of attributes in a secure environment.
Access log module 54, which may include collecting and storing information to enable post-incident forensic analysis in the event of system errors and/or misuse.
Verification module 55, which may include validating and verifying the integrity of aggregated data profiles, including data attributes, attribute combinations, DDIDs, and TDRs, at any point in time.
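As an illustration of how these modules might fit together, here is a minimal sketch of a privacy server issuing a TDR and retaining the TK/AK/RK information needed for later re-association. All names, the TTL parameter, and the in-memory key stores are assumptions for illustration, not the patented implementation.

```python
from dataclasses import dataclass, field
import secrets
import time

@dataclass
class TDR:
    """A temporally dynamic data representation: a DDID plus the
    attribute combination it temporarily stands for (sketch)."""
    ddid: str
    attributes: dict
    created: float = field(default_factory=time.time)

class PrivacyServer:
    def __init__(self):
        self.time_keys = {}         # TK: ddid -> (start, end) validity window
        self.association_keys = {}  # AK: ddid -> Data Subject it belonged to
        self.replacement_keys = {}  # RK: ddid -> attribute values it replaced

    def issue_tdr(self, subject_id, attributes, ttl=600):
        ddid = secrets.token_hex(8)
        # Sketch: a real check would consider only currently active windows
        assert ddid not in self.time_keys
        now = time.time()
        self.time_keys[ddid] = (now, now + ttl)
        self.association_keys[ddid] = subject_id
        self.replacement_keys[ddid] = attributes
        return TDR(ddid, attributes)

    def reassociate(self, ddid):
        """Later re-association, permitted only inside the trusted environment."""
        return self.association_keys[ddid], self.replacement_keys[ddid]
```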
As described herein, embodiments of the present invention are directed to facilitating privacy, anonymity, security, and accuracy in connection with electronic data and network communications, analytics, and/or research. In one example, data elements related to a Data Subject, action, activity, process, or trait may be abstracted by linking them to independent or dependent attributes and/or to separate data elements. Information about a Data Subject, action, activity, process, or trait is classified as an independent attribute or a dependent attribute. For purposes of this disclosure, a data attribute may refer to any data element, activity, process, or trait that may be used, either alone or in combination with other data elements, to identify a Data Subject (e.g., a person, place, or thing, and/or an associated action).
As described above, in addition to abstracting data that may be used to identify a Data Subject such as a person, place, or thing, the abstraction module 52 of FIG. 1 or 1A may also be used to abstract data related to the Data Subject, which may include, but is not limited to: physical or virtual things and entities; hardware or virtual devices; software applications; legal entities; objects; images; audio or video information; sensory information; multimedia information; geographic location information; privacy/anonymity information; security information; electronic messaging information, including senders and recipients, message content, hyperlinks in messages, embedded content in messages, and information about devices and servers involved in sending and receiving messages; social media and electronic forums; online websites and blogs; RFID (radio frequency identification) tracking information; tax information; educational information; identifiers related to military, defense, or other government entity programs; virtual reality information; massively multiplayer online role-playing games (i.e., MMORPGs); medical information; biometric data; behavioral metric information; genetic information; data that refers to the physical or virtual location of other data; and instantiations or representations of data or information.
In one example, digital rights management for individuals (DRMI) and/or digital rights management for de-identification (DRMD) may be provided using the systems, methods, and devices described herein. Digital rights management for individuals may include individual-directed privacy/anonymity, where a related party manages data attributes related to one or more related parties; in this case the related party acts as the controlling entity. Alternatively, a third party may manage data attributes relating to one or more related parties, providing entity-directed privacy/anonymity; in this case the third party acts as the controlling entity. Digital rights management for de-identification likewise includes entity-directed privacy/anonymity, where a third party manages data attributes associated with a related party and controls the extent to which information about the data attributes and/or related party is available to others.
The systems, methods, and apparatus disclosed herein may be used to provide DRMI such that one or more related parties may directly or indirectly manage their online digital fingerprints. The related party may also control the extent to which information relating to data attributes, a Data Subject, or one or more related parties may be provided to third parties, so that the information and data may be provided in an anonymous, non-identifiable manner. The systems, methods and apparatus provide a dynamically changing environment in which a related party may wish to share data at one time but not at another. The mechanism can be dynamic in nature, responding to time intervals, specific receiving entities, physical or virtual whereabouts, or other triggers that change the data to be shared. Implementing DRMI can enable non-re-identifiable anonymity and can allow different information about data attributes, Data Subjects, and related parties to be shared for different purposes in dynamically changing time- and/or location-sensitive situations. A specific need for information relating to data attributes, a Data Subject or related parties at a particular time and place can be met without revealing other unnecessary information unless such disclosure is authorized by the controlling entity. Such unnecessary information may include, for example, the true identity of the Data Subject or related party, mailing address, email address, previous online activities, or processes or traits related to other operations and activities of the Data Subject or related party.
The systems, methods, and devices disclosed herein may be used to provide DRMD so that entities can centrally manage the online digital fingerprints of information about the data attributes, Data Subjects, and related parties for which they are responsible; such entities may control the extent to which information is provided to other parties in a re-identifiable or non-re-identifiable manner. This allows the entity to meet its de-identification goals and/or obligations, to comply with the requirements of Data Subjects and related parties, and to satisfy regulatory protections and prohibitions.
Exemplary implementations of some embodiments of the present invention may be configured to provide DRMI and/or DRMD functionality with respect to data attributes comprised of image or video files that reveal identifying facial features, as discussed below. Data Subjects or related parties may be adversely affected by the ability of others to make identity inferences based on the unique facial features of Data Subjects in electronic images. The rapidly growing commercial availability and use of facial recognition technology, together with the increasing availability of electronic images, raises privacy/anonymity and security issues for Data Subjects and related parties. In one example, one or more aspects of the invention may protect the privacy/anonymity and security of Data Subjects and related parties with respect to the data attributes of photographs, including the facial image features of the Data Subject.
In some embodiments, the systems, methods, and devices disclosed herein may be configured to distinguish between registered/authorized visitors and unregistered/unauthorized visitors to a website or other electronic image-sharing application that contains data attributes. Depending on status, it may also be possible to distinguish registered/authorized visitors who are contacts/friends of the Data Subject or related party from those who are not. In one example, the system can control whether any image data attributes containing facial features are presented at all. If image data attributes containing facial features are presented, the system can further control and restrict unauthorized use and copying of the photographs, which could otherwise lead to unintended secondary use, through other protection techniques. In addition, some embodiments of the invention may provide the Data Subject, related parties, and controlling entity with the ability to specify to which additional parties image data attributes may be presented at all, and for which particular purposes. If data attributes are provided, the Data Subject, related party, or controlling entity may specify whether the image uses known protection techniques intended to limit unauthorized use and copying of the photograph, thereby preventing or reducing the risk of unintended use of the image.
DRMI can enable the Data Subject and related parties to directly or indirectly manage photographs containing facial images, and to control the extent to which photographs relating to the related party are available to third parties in an identifiable or non-identifiable, reproducible or non-reproducible manner.
An example of a potential implementation of the present invention may involve the use of DRMI by providers of wearable, implantable, embeddable or otherwise connectable computing technologies/devices to alleviate the public's potential concerns about information obtained and/or processed using those technologies/devices. For example, Google Glass could employ DRMI to facilitate broader adoption by establishing a "no digital display" list (similar to the "Do Not Call" list maintained by the FTC) with which a Data Subject or related party could register to prohibit the digital display of unauthorized photographs taken or displayed using Google Glass. (Google and Glass are trademarks of Google Inc.)
An example of the present invention may further provide functionality for Data Subjects or related parties who are members of the professional networking site LinkedIn.com. In one example, a three-tier classification scheme may be used to control access to, use of, and copying of photographs containing facial images of Data Subjects or related parties:
Category A treatment or identity may apply to LinkedIn.com website visitors who are not registered/authorized members of LinkedIn.com. These visitors may not be provided with the ability to view or copy photographs containing facial images of registered/authorized LinkedIn members (LinkedIn is a trademark of LinkedIn Corporation). Instead, they may be provided, through their web browser, mobile application, or other application, with a graphic, image, indicator, or avatar indicating that the photographs are available only to registered/authorized users of the LinkedIn.com website.
Category B treatment or identity may apply to registered/authorized members of LinkedIn.com who are not verified contacts of another registered/authorized LinkedIn.com member. These registered/authorized members may be provided with a limited ability to view or copy photographs containing facial images of LinkedIn members of whom they are not verified contacts, by means of other protection techniques intended to limit unauthorized use and copying of photographs that could lead to unintended secondary use. These other protection techniques may include, but are not limited to:
1. Tiling: dividing an image into smaller image tiles that appear as one continuous image, so that any entity attempting to copy the image is limited to one tile at a time;
2. Employing image watermarking techniques;
3. Hidden layering: placing the image containing facial features behind a transparent foreground image;
4. Providing images without a color profile or palette;
5. Preventing downloads by using table-based presentation of images to disable "right click" copying or use;
6. Preventing downloads by using JavaScript techniques to disable "right click" copying or use of images;
7. Preventing downloads by using Flash techniques to disable "right click" copying or use of images;
8. Hiding images through URL encoding techniques;
9. Using META tags to prevent images containing facial features from being indexed by search engine spiders, robots, or bots; and
10. Using robots.txt files to prevent images containing facial features from being indexed by search engine spiders, robots, or bots.
Category C treatment or identity may apply to registered/authorized members of LinkedIn.com who are also verified contacts of another registered/authorized LinkedIn.com member. These registered/authorized members may view or copy, in complete form, photographs containing facial images of the members of whom they are verified contacts.
Some examples of the invention may provide DRMD such that an entity may centrally manage the photographic data attributes containing the facial images for which it is responsible, and may control whether those data attributes are provided to other parties in an identifiable or non-identifiable, reproducible or non-reproducible manner.
One example of a potential implementation of the invention involves an entity-controlled system providing DRMD that uses known facial image recognition capabilities to restrict the viewing of photographic data attributes containing recognizable facial elements of registered/authorized Data Subjects or related parties to parties authorized by those Data Subjects or related parties. Conversely, a party attempting to upload, use or view a photograph that includes facial elements of a registered/authorized Data Subject or related party whose facial features have been registered with the DRMD system, but who has not granted authorization to that party, may see and use only a version of the photograph modified by the DRMD system to block or "de-tag" the identifiable facial elements of the registered/authorized Data Subject or related party. As an example, a photograph taken at a public bar that includes the faces of Data Subjects or related parties registered with the system providing DRMD may be modified to obscure or "de-tag" those faces in all versions except those explicitly authorized by the Data Subjects or related parties.
In one example of the invention, the authentication module may be configured such that decisions about who can see what information are made by the controlling entity on a configurable basis. In one example, configurable control may include automatic and/or manual decision-making, updated case by case in a timely manner, by providing each controlling entity with the ability to dynamically change the composition of the data attributes at any time. Enhanced customization through dynamically changing the composition of data attributes results in greater relevance and accuracy of the information provided about the data attributes and/or related parties. As disclosed herein, the use of DDIDs as an integral part of privacy, anonymity, and security enables each receiving entity to receive different information as appropriate for each particular purpose, thereby facilitating the distribution of fresh, timely, and highly relevant and accurate information, rather than the stale, time-burdened, less accurate data provided by conventional persistent or static identifiers or other mechanisms.
FIGS. 1 and 1A also show various examples of privacy clients 60 operating on user devices 70, such as computers, smartphones, or other wired or wireless devices, that may communicate with the privacy server 50 over a network 72, such as the internet or another public or private network.
In one example, the privacy client component of the present disclosure may reside on a mobile device. The privacy client may be provided as part of a mobile application or operating system running on the mobile device, or may be configured as a hardware device, integrated circuit or chip of the mobile device. Mobile devices employing one or more aspects of the present disclosure may possess real-time knowledge of the location, activity and/or behavior of Data Subjects and/or parties associated with the device. The mobile device may also transmit, receive, and process information with other devices and information sources. Mobile applications interacting with privacy clients can also provide the controlling entity with control over the level and timing of participation in location- and time-sensitive applications, and over the degree of information sharing with third parties in an anonymous manner (rather than an identifiable, personal manner). Mobile devices employing one or more aspects of the present disclosure may also leverage the unique capabilities of mobile devices to aggregate user personal preference information collected from a variety of unrelated and disparate sources (whether mobile devices, more traditional computer systems, or a combination of both), and may share the user's information with vendors (on an anonymous or personalized basis) only with the user's approval, to facilitate time- and/or location-sensitive personalized business opportunities. As can now be more clearly understood, users can determine whether the benefits of such time- and/or location-sensitive personalized business opportunities merit revealing their identity in the relevant transactions.
For example, without embodiments of the present invention, the static identifiers traditionally associated with a mobile device enable mobile application providers and other third parties to aggregate information related to use of the device; by aggregating such data, application providers and other third parties may obtain a variety of information, which may include, but is not limited to, frequent physical location information, calling habits, content preferences, and online transaction-related information that they could not obtain from any one-time interaction with the device user. By using some embodiments of the present invention, application providers and other third parties may be prevented from aggregating information about the Data Subjects and related parties using a mobile device; and some embodiments of the invention may be configured for use by mobile applications that need access to the geographic location of a mobile device (e.g., directions or mapping applications) without revealing associated identity information, including that of the mobile device, Data Subject or related party, through the dynamically created, changeable and reassignable DDIDs described herein, rather than conventional static identifiers.
In one example, embodiments of the invention may be configured to provide enhanced privacy, anonymity, security, and accuracy with respect to persistent and/or static identifiers by utilizing DDIDs, rather than allowing aggregation over static identifiers; thus, embodiments of the present invention may provide a solution for the online digital fingerprints left across networks and the internet. Embodiments of the invention may provide the controlling entity with the ability to decide who can see what data, prevent data aggregators from understanding the connections among data belonging to a Data Subject or related party without the controlling entity's permission, and provide the controlling entity with control over the upstream and/or downstream propagation of information.
In one example of the invention, continuous access may be provided by using DDIDs to provide multiple levels of protective abstraction while preserving the benefits of big data analytics. Systems, methods, and devices embodying aspects of the present invention also do not suffer from the fundamental deficiencies of Do-Not-Track and other initiatives that eliminate the data access required for effective big data analytics, access whose elimination is also inconsistent with economic models that provide free or discounted products or services in exchange for information. Do-Not-Track is a technology and policy proposal that allows Data Subjects or related parties online to opt out of certain tracking by data collection entities at websites and third parties, including analytics services, advertising networks, and social platforms. While Do-Not-Track provides enhanced privacy, anonymity, and security for Data Subjects and related parties, it also cuts off the benefit of receiving customized, relevant, personalized products via big data analytics while online. This impacts the economic benefits that big data analytics provides to merchants, service providers, and Data Subjects or related parties themselves.
In contrast, some embodiments of the present invention may have a net neutral-to-positive revenue impact (versus the net negative revenue impact of a Do-Not-Track regime) because the controlling entity may include data attributes in the TDR that enable the receiving entity to track, using existing tracking techniques, for as long as the TDR persists. The controlling entity may also include more accurate information than is available through tracking alone, to facilitate personalization and customization. For example, the controlling entity may choose to include some past browsing data about a website in the combination of attributes about the Data Subject or related party sent to that website by the privacy client, adding other specific, updated information beneficial to both the website and the Data Subject or related party.
Referring to FIGS. 1 and 1A, one embodiment of the invention may include a computer network 72 in which one or more remote privacy clients 60, comprised of computer hardware, firmware or software residing on one or more computing devices 70 or on a network device and accessible via the network device, send requests/queries to, and receive services/responses from, one or more computing devices acting as privacy server 50. The privacy client computing device 70 may be a smart device (i.e., a wearable, mobile, or non-mobile smart device), smartphone, tablet, laptop, desktop, or other computing device programmed to (i) request services from, and/or submit queries to, the privacy server, (ii) provide user interface capabilities, (iii) provide application processing capabilities, and/or (iv) provide localized storage and memory. The privacy server 50 computing device may be a personal computer, minicomputer, mainframe computer, or other computing device programmed to (i) respond to requests for services/queries from privacy clients, (ii) provide centralized or decentralized management of the system, (iii) provide high-volume application processing capabilities, and/or (iv) provide mass storage and memory capabilities integrated with one or more databases. Communication between the privacy server and privacy clients may occur over computer networks, the internet, intranets, public and private networks or communication channels, and supporting technologies.
Referring to FIGS. 1 and 1A, another potential embodiment of the invention may include a computer network in which one or more remote privacy clients 60, comprised of computer hardware, firmware or software residing on one or more computing devices 70 or on network devices and accessible via those devices, send requests/queries to, and receive services/responses from, one or more computing devices acting as privacy server 50, where the privacy server 50 may send and receive services/responses via the internet, an intranet or another network to cards, mobile devices, wearable devices and/or other portable devices that electronically receive and store information, with the wearable and/or other portable devices retaining information about data attributes and/or DDIDs until such time (if any) as that information is modified by the privacy server.
The privacy server and privacy client may implement modules including program code to perform one or more steps or operations of the processes and/or features described herein. The program code may be stored on a computer-readable medium accessible by a processor of the privacy server or the privacy client. The computer-readable medium may be volatile or non-volatile, and may be removable or non-removable. The computer-readable medium may be, but is not limited to, RAM, ROM, solid-state memory technology, erasable programmable ROM ("EPROM"), electrically erasable programmable ROM ("EEPROM"), CD-ROM, DVD, magnetic cassettes, magnetic tape, magnetic disk storage, other magnetic or optical storage devices, or any other conventional storage technology or storage device.
The privacy server and associated databases may store information relating to TDRs, time periods/stamps, DDIDs, attribute combinations, Data Subjects, related parties, associated profiles and other relevant information. The privacy server and associated databases may be managed and accessed by the controlling entity, but in one example cannot be managed or accessed by others unless authorized by the controlling entity. In one example, an authentication module of one or more privacy servers controls data access through the TDR. The privacy client may request from the privacy server the information needed to perform a desired action, activity, process, or trait, and/or query the privacy server as to whether a TDR is authorized to participate in the requested action, activity, process, or trait at a particular time and/or location. The privacy client may also aggregate data, such as tracking data, about the actions, activities, processes, or traits in which TDRs associated with the privacy client participate, thereby avoiding the need to return to the database for data extrapolation. In one example, insights collected by other parties may become part of the TDR for its duration.
In one example implementation of the invention, abstraction module 52 is configured to enable the controlling entity (which may be a Data Subject or related party) to link data belonging to the Data Subject to attributes, and/or to separate data belonging to the Data Subject into attributes that may be divided, combined, rearranged, or added to various attribute combinations. These combinations may include any combination of attributes associated with a Data Subject or previously created combinations of associated attributes.
In this example, with respect to each desired action, activity, process, or trait involving the privacy server, the abstraction module enables the controlling entity to limit the extent to which identifying information is transmitted or stored by selecting only those attributes that are necessary for the desired action, activity, process, or trait, and linking those data attributes to one attribute or a combination of attributes, and/or separating the data attributes into one attribute or a combination of attributes. The controlling entity may then use the abstraction module to dynamically create and/or assign DDIDs to each combination of attributes to form TDRs. A DDID may be configured to expire after a preset delay or prompt and may be reused for data associated with another action, activity, process or trait and/or another Data Subject or related party, so that no precise trail of associations is left outside the privacy server. In one example, before assigning or accepting a DDID to form a TDR, the abstraction module may verify that the DDID is not actively in use in another TDR; to perform such verification, an additional buffer timeout period may be included to account for potential system outages and downtime. The greater the number of data attributes and associated TDRs generated for a desired action, activity, process or trait, the higher the privacy, anonymity and security achieved. In this case, an unauthorized party obtaining access to one of the TDRs will obtain access only to the information contained in that TDR. In one example, the information in a single TDR may be only a portion of the attributes required for the desired action, activity, process, or trait, and provides no information needed to determine the other TDRs containing the required attributes, or to determine any Data Subjects and/or related parties that may be associated with the TDR.
In one example, a TDR created by means of the abstraction module may be based on one or more procedures that match the prescribed steps required to describe or perform different actions, activities, or processes to specified categories of attributes associated with those steps, and that select or combine the attributes necessary for a particular action, activity, process, or trait. The process of creating a TDR by means of the abstraction module may be performed directly by the controlling entity or indirectly by one or more parties authorized by the controlling entity.
For example, a first database containing credit card purchase information may include information that is vital to the credit card issuer in performing big data analysis of purchase information. However, the database need not include identifying information for the credit card users. The credit card user identification information in this first database may be represented by DDIDs, with the DDIDs and the Replacement Keys (RKs) necessary to re-identify the users stored in a separate secure database accessible to the privacy server and/or system modules. In this manner, the system may help protect the identities of credit card users and limit potential financial loss in the event the first database suffers unauthorized access to the credit card purchase information, since the DDIDs and related information are not decipherable by an unauthorized party.
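A sketch of this credit-card example, with hypothetical field names and an in-memory stand-in for the separate secure database, might look like this:

```python
import secrets

purchases_db = []      # shareable for analytics: contains DDIDs, no identities
replacement_keys = {}  # held separately, accessible only via the privacy server

def record_purchase(cardholder_id, amount, merchant):
    ddid = secrets.token_hex(8)
    replacement_keys[ddid] = cardholder_id  # RK kept in the secure store
    purchases_db.append({"user": ddid, "amount": amount, "merchant": merchant})

record_purchase("card-1234", 42.50, "bookstore")
# An analyst sees only DDIDs; a breach of purchases_db exposes no cardholders.
# Re-identification requires the separate replacement-key store:
ddid = purchases_db[0]["user"]
print(replacement_keys[ddid])  # only possible inside the privacy server / CoT
```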
Further, in one example of the invention, real-time or batch analysis of mobile/wearable/portable device data can be performed in a manner that benefits the receiving entity (e.g., merchant or service provider) without sacrificing the privacy/anonymity of the mobile/wearable/portable device user. Each user may be a Data Subject or related party associated with the device itself or with use of the device. In return for special or other offers provided by the receiving entity, the user of the mobile/wearable/portable device may choose to share de-identified TDRs in an anonymous manner based on the user's real-time location or real-time activity, or during a particular time period, e.g., with receiving entities located within a specified distance (e.g., 1 mile, 1,000 feet, 20 feet, or another distance depending on the implementation) of a particular geographic location, or within a specified category (e.g., jewelry, clothing, restaurants, bookstores, or other categories) related to the location of the device. In this way, the receiving entity can obtain an accurate aggregated view of the demographics of its potential customer base in terms of age, gender, income, and other characteristics. These demographics may be revealed by TDRs shared by mobile/wearable/portable device users at different locations, times of day, and days of the week, which may help the recipient more efficiently determine what services, inventory, and other sales, supply chain, or inventory-related activities should be provided for related parties. In one example, Data Subjects and related parties (which may be users of mobile/wearable/portable devices) will benefit from special arrangements or offers without necessarily disclosing their personal information to the receiving entity (which will know only that the Data Subject or related party is registered, but will not know the specific information associated with any particular Data Subject or related party), unless and only to the extent desired by the Data Subject or related party.
In an example implementation of the invention, the authorization module may provide the controlling entity with control over which other entities may be given access to, or use of, TDR information. The controlling entity may also use the abstraction module to control the extent to which other entities have access to particular information elements contained in the system. For example, a mobile/wearable/portable platform provider acting as the controlling entity may provide performance data to a mobile/wearable/portable device manufacturer without revealing the identity of the device, Data Subject, or related-party user, or the location of the Data Subject, related-party user, or device. The mobile/wearable/portable platform provider may also provide a mobile/wearable/portable application provider with a map of device usage or other geographic location data needed for the application, without revealing the identity of the device, Data Subject, or related-party user. Conversely, the platform provider may use the system to provide a 911 emergency system with location and identity data related to the device and to the Data Subject or related-party user of the device. One exemplary implementation of the authorization module may include allowing the ability to request generation of DDIDs and associated TDRs to be delegated to other parties authorized by the controlling entity.
According to an example embodiment of the present invention, a receiving entity may use information about mobile/wearable/portable device related parties to customize the user experience or opportunity at a gathering location without requiring disclosure of personally identifying information. For example, a band playing country-western and gospel music may determine, in real time or near real time, by receiving TDRs associated with Data Subjects or related parties attending a concert, that the majority of attendees prefer gospel music, and adjust the concert song selection accordingly. By analogy, in stores that use video screens to display goods or specials, store management can know in real time when a large number of customers of a specific demographic are in the store, by receiving and analyzing TDRs from the mobile/wearable/portable devices of Data Subject or related-party customers. The store may then play videos for that particular population and change the videos throughout the day in response to changes in the demographics of Data Subjects or related parties communicated to the store system by the clients on the mobile/wearable/portable devices. Demographic data obtained from the information in TDRs may include, but is not limited to, the age, gender, or income level of the Data Subject or related party. Likewise, in a retail store that uses real-time geographic location to identify the particular in-store location of a given customer, customers who are Data Subjects or related parties may be offered special discounts or offers via their mobile phones, tablets or wearable devices through receipt and analysis of TDRs associated with their personal preferences, brand preferences and product purchase preferences, where such TDRs may also include information added in real time based on the products available at the customer's location in the store.
In an exemplary implementation of the invention, the abstraction module of the privacy server assigns a DDID to the combination of attributes necessary to satisfy a request from, and/or a query by, the privacy client (which may reside in a number of locations including, but not limited to, a Data Subject device or a service provider device, may be accessible through and reside in a cloud network, or may reside on the same computing device as the privacy server), thereby creating a TDR for the duration of the association between the DDID and the desired combination of attributes. The TDR in the privacy client may interact freely with the receiving entity for the configured time, action, activity, process, or trait. Once the interaction cycle with the designated receiving entity is complete, in one example, the privacy client can return the TDR, augmented by the attribute combinations related to activity involving the privacy client, to the privacy server and associated database. The privacy server may then associate the various attribute combinations with a particular Data Subject, and update and store the combinations in the Data Subject's aggregated data profile in the secure database. At this point, in one example, the DDID that had been assigned to the combination of attributes may be reassigned to other actions, activities, processes, traits or Data Subjects to continue obfuscating data relationships.
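Reusing the PrivacyServer sketch shown earlier, the round trip described in this paragraph (issue a TDR, let it interact with a receiving entity, return it enhanced with activity data, then free the DDID for reassignment) could be sketched as follows; the Recipient class and attribute names are illustrative assumptions.

```python
class Recipient:
    """Stand-in for a website or merchant; it sees only the DDID and attributes."""
    def interact(self, ddid, attributes):
        return {"pages_viewed": 3}   # tracking data gathered during the session

def browsing_session(server, subject_id):
    tdr = server.issue_tdr(subject_id, {"segment": "outdoor-gear"})
    activity = Recipient().interact(tdr.ddid, tdr.attributes)
    tdr.attributes["activity"] = activity      # TDR enhanced by session activity
    subject, _ = server.reassociate(tdr.ddid)  # only inside the privacy server
    # ...aggregated profile for `subject` is updated; the DDID may now be
    # reassigned to another attribute combination or Data Subject
```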
Other implementations of the invention are also contemplated herein, including various systems and devices. In one embodiment, a system for improving electronic data security is disclosed. In one example, the system may include: an abstraction module configured to dynamically associate one or more attributes with one or more Data Subjects; the abstraction module configured to generate a DDID, or accept or modify a temporally unique, dynamically changing value to serve as the DDID, and further configured to associate the DDID with at least one Data Subject; and a maintenance module configured to track activity related to the DDID and to associate with it any other DDIDs, the tracked activity, and the time period during which activity was tracked, through a Time Key (TK) or otherwise. In one example, the abstraction module is configured to add or delete attributes associated with at least one Data Subject, and the abstraction module may also be configured to modify attributes already associated with at least one Data Subject.
In another implementation, disclosed herein is a device for secure, private, anonymous activity on a network. In one example, the device may include a processor configured to execute program modules, wherein the program modules include at least a privacy client module; a memory connected to the processor; and a communication interface for receiving data over a network; wherein the privacy client may reside on a Data Subject device or a service provider device, be accessible via and reside in a cloud network, or reside on the same computing device as the privacy server, and receives from the privacy server the TDRs and DDIDs of the data attributes necessary to perform activities on the network. In one example, the privacy client may be further configured to capture activity performed using the device and associate the performed activity with the TDR. In another example, the privacy client may be configured to send the captured activity and TDR to the privacy server. In one example, the privacy client may reside on a mobile device as a mobile application. In another example, the privacy client may reside in a network as a cloud-based application accessible via the network. In another example, the privacy client may reside as a local application on the same computing device as the privacy server.
In another example, the device may further include a geolocation module, wherein the TDR is modified with information from the geolocation module, and wherein the TDR restricts access to device identity information. The device may also include a user interface configured to allow a user to modify the TDR, including options for changing the DDID or the data attributes associated with a particular TDR. The user interface may include a selectable option for sharing the TDR only with other network devices having a predetermined physical, virtual, or logical proximity to the mobile device.
In another example, in response to TDRs, the device may receive targeted advertising or marketing information based on the physical, virtual, or logical location of the device; the TDRs may include demographic information related to the device user, with targeted advertising or marketing information received based on that demographic information. In another example, the TDRs may include information about purchase transactions made, or expected to be made, using the device, with targeted advertising or marketing information received based on previous or expected purchase transactions.
In another implementation of the invention, a system for providing privacy and anonymity for electronic data is disclosed. In one example, the system may include at least one user device having a first privacy client running on it; at least one service provider device having a second privacy client running on it; and at least one privacy server coupled to the network and in communication with the first and second privacy clients; wherein the privacy server includes an abstraction module that electronically links Data Subjects to data attributes and attribute combinations, separates data into data attributes and attribute combinations, and associates DDIDs with the data attributes and attribute combinations. In one example, the privacy server may include an authentication module that generates one or more of the DDIDs. In another example, the privacy server may include a maintenance module that stores the data attributes and attribute combinations with which DDIDs are associated. In another example, the privacy server may include a verification module that verifies the integrity of data attributes, attribute combinations, and DDIDs. In another example, the privacy server may include an access log module that collects and stores information related to DDIDs and data attributes for use in post-incident forensic analysis in the event of an error. In one example, a DDID expires after a predetermined time, and after the DDID expires, the abstraction module assigns the DDID to another data attribute or Data Subject.
FIG. 1B highlights some examples of how the allocation, application, expiration, and recycling of DDIDs may occur. It should be noted that, in a potential implementation of an embodiment of the present invention, a DDID may exist indefinitely but be reused for multiple Data Subjects, data attributes, attribute combinations, actions, activities, processes and/or traits. Although a DDID can be reused, two identical DDIDs cannot be in use simultaneously unless required and authorized by the controlling entity. Reassignment of DDIDs can be accomplished by leveraging existing data collection and analysis capabilities to reassign DDIDs to similar, or deliberately different, attribute combinations or Data Subjects. This reassignment enhances the privacy/anonymity and security made feasible by dynamically created and changeable DDIDs.
As shown in FIG. 1B, the system may be configured such that the assignment, expiration, and/or recycling of any given DDID may occur based on any one or more of the following factors: (1) a change in the purpose for which the DDID (and associated TDR) was created, e.g., association with a particular browsing session, Data Subject, transaction, or other purpose; (2) a change in the physical location associated with the DDID (and associated TDR), e.g., upon exiting a physical location, upon reaching a general physical location, upon reaching a specific physical location, upon entering a physical location, or upon some other physical location marker; (3) a change in the virtual location associated with the DDID (and associated TDR), e.g., upon entering a virtual location, upon changing virtual location, upon exiting a virtual location, upon reaching a particular page on a website, upon reaching a particular website, or upon some other virtual location marker; and/or (4) a change based on time, e.g., at a randomized time, at a predetermined time, at specified intervals, or per some other time-based criterion. As can be appreciated, DDIDs separate data from contextual relationships, since outside the system there is no discernible relationship between the related data, the identity of the Data Subject or related party, or the contextual relationship data associated with different DDIDs and/or TDRs. Within the system, relationship information is maintained for authorized use by Data Subjects and trusted parties/agents.
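The rotation triggers of FIG. 1B could be checked with logic along these lines; this is a sketch with assumed thresholds and attribute names, reusing the TDR structure from the earlier sketch.

```python
import random
import time

def should_rotate(tdr, purpose=None, location=None, now=None):
    """Evaluate the FIG. 1B triggers: purpose change, physical/virtual
    location change, and time-based criteria. Thresholds are illustrative."""
    now = now if now is not None else time.time()
    return any([
        purpose is not None and purpose != tdr.attributes.get("purpose"),
        location is not None and location != tdr.attributes.get("location"),
        now - tdr.created > 900,   # assumed fixed interval: 15 minutes
        random.random() < 0.01,    # randomized rotation with small probability
    ])
```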
FIG. 1C-1 represents the concept of a circle of trust (CoT) (denoted as "trusted agent" in FIG. 1C-1 and referred to herein as "trusted agent" and/or "trusted party") from the perspective of a trusted party or trusted agent. First, note that the Data Subject appears in the lower left corner of the diagram. Schematics of most current data-use systems do not include Data Subjects, because Data Subject participation typically takes the form of a binary decision whether to agree to "take it or leave it" online terms and conditions under the traditional "notice and consent" model. After this initial point, the Data Subject typically loses the ability to affect anything that happens to their data, since "they are the product, not the customer". This is widely recognized as a broken model in the digital age, with few effective limitations on current or future data use.
It should be noted that more than one trusted party may work within a single circle of trust, and that a Data Subject may be a participant in any number of circles of trust. Circles of trust may be implemented through a centralized or federated model to improve security. The arrows in FIG. 2 represent data movement; the data inputs and outputs will contain different information.
FIG. 1C-1 illustrates the data-processing flow of two potential embodiments of the present invention. In the first example embodiment, a user (1), here a Data Subject, may indicate interest in using the system to create data inputs relating to a desired action, in this example browsing a website, by forming one or more TDRs (each of which may initially consist of a DDID for collecting and retaining data attributes relevant to the activity involving the TDR, or of the DDID together with a data attribute or combination of attributes retrieved from the Data Subject's aggregated data profile). The system may track and collect data related to the web browsing in connection with the one or more TDRs and transmit it to the controlling entity (3) acting as trusted party or agent. The TDRs, reflecting the tracking data collected in connection with the web browsing, represent output from the browsing session, with which the controlling entity, as trusted party, may choose to augment the aggregated data profile of the user/Data Subject. In the second example embodiment, a user (2) may indicate interest in using the system to create a privatized/anonymized version of a data set in the user's possession that contains personal information about Data Subjects (1). In this example, the user's data set containing personal information about Data Subjects serves as input to the system. The system may identify and track the data values reflecting personal information contained in the data set, and the controlling entity acting as trusted party or agent (3) may choose to replace said personal information with one or more Replacement Keys (RKs), access to which is required in order to re-identify the personal information pertaining to a Data Subject. The resulting modified data set represents output from the system that contains dynamically changing DDIDs rather than personal information about the Data Subjects. In this way, the RKs can later be changed so that the personal information of any one or more Data Subjects can no longer be re-identified, giving applicable Data Subjects a "right to be forgotten," i.e., the ability to have their digital traces deleted from the Internet.
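The second example embodiment, replacing personal information with Replacement Keys (RKs), can be illustrated with a brief sketch. The fragment below is hypothetical (the function name 'privatize' and the record layout are invented); in practice, the RK map would be generated and custodied inside the trusted party's CoT.

```python
import secrets

def privatize(records, personal_fields):
    """Replace personal values with replacement keys (RKs); the mapping
    stays with the trusted party inside the CoT.  Illustrative only."""
    rk_map = {}        # RK -> original value, retained only in the CoT
    anonymized = []
    for rec in records:
        out = dict(rec)
        for field in personal_fields:
            rk = "RK-" + secrets.token_hex(4)
            rk_map[rk] = out[field]
            out[field] = rk
        anonymized.append(out)
    return anonymized, rk_map

records = [{"name": "Jane Doe", "zip": "80302", "visits": 7}]
anon, rk_map = privatize(records, personal_fields=["name", "zip"])
# Re-identification requires rk_map; rotating or discarding it implements
# a "right to be forgotten" for the affected Data Subjects.
```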
As shown in the blocks labeled "privacy policy" and "authorization request" in FIG. 1C-1, data use may be managed for a "user" in accordance with permissions ("PERMs") administered by trusted parties and/or agents. A "user" may be the Data Subject itself, i.e., the subject of the data in question (e.g., users, consumers, patients, etc. acting with respect to data about themselves; referred to herein as a "Subject User"), and/or a third party that is not the subject of the data in question (e.g., a vendor, merchant, healthcare provider, legally permitted government entity, etc.; referred to herein as a "Non-Subject User").
PERMs specify permitted operations, such as which data may be used, by whom, for what purpose, and for what period of time. PERMs may also specify the desired level of anonymization, e.g., when/where/how DDIDs are used to provide contextual anonymity for the identity and/or activities of a Data Subject, when other privacy-enhancing techniques are used together with or in place of DDIDs, when identifying information is provided to facilitate a transaction, and so on.
In a Data Subject implementation of the present invention (e.g., DRMI), Subject Users may build custom PERMs governing use of their data through pre-set policies (e.g., Gold/Silver/Bronze; this is just an example, and mathematically the pre-sets could be a discrete set of k choices, or could be, or be represented by, values on a continuum between a lower and an upper bound) that translate into fine-grained dynamic permissions, or may select a "custom" option to specify more detailed dynamic parameters.
In a "managed" implementation of Dynamic Anonymity (e.g., DRMD), Non-Subject Users may establish data use/access rights that ensure compliance with applicable corporate, legislative, and/or regulatory data use/privacy/anonymity requirements.
Within the PERMs-based CoT reflected in FIG. 1C-1, business intelligence, data analysis, and other processes may be performed with respect to one or more Data Subjects by any combination of I, D, T, and/or X (or interpolations thereof), as shown in Table 3 below:
TABLE 3
FIG. 1C-2 shows the circle of trust (CoT) from the perspective of the Data Subject.
FIG. 1D shows a smartphone application that can track geographic location and blood pressure levels. Using Dynamic Anonymity, such a device can split the data into two streams, each obscured so that neither stream, if intercepted and/or compromised (even if examined after storage), will reveal Personal Data (PD) without the addition of key information protected within the CoT.
More specifically, FIG. 1D illustrates:
1. The blood pressure monitoring application (A) contacts a trusted party within the circle of trust (B), requesting DDIDs for the Data Subject patient.
2. The trusted party's CoT provides the DDIDs for the Data Subject.
3. An application operated by the trusted party sends back two sets of periodically changing information (one for the GPS data and one for the blood pressure levels), each set consisting of a DDID, an offset (to mask the blood pressure level and geographic location), and an encryption key, refreshed each new time period. (These are also stored in a database for later use.)
4. The monitoring application transmits two encrypted and obscured data streams to a "proxy" application or network appliance (C) controlled by Dynamic Anonymity within its corporate network. (At this point, the regularly changing offsets have been applied to both the location and the level.)
5. The "proxy" (C) uses data streams (D, E) from the trusted party (containing only decryption keys) to convert the transmitted data into "cleartext." The proxy also hides the originating IP address, and delivers a stream of DDIDs and masked blood pressure level data (F) (containing information for multiple Data Subjects) and a corresponding stream of GPS locations (G) to databases (H) and (I).
At every point outside the circle of trust in FIG. 1D (and outside the smartphone itself), the patient's data is protected; no Personal Data (PD) is available or is ever generated.
- Transmissions to and from the trusted party (1, 2) do not compromise personal data privacy/anonymity, nor is any personal data stored in the trusted party's database.
- The location and blood pressure level (4) are transmitted separately (intercepting either stream alone reveals nothing), keyed by DDID, and obscured so that even the data itself neither displays nor contains, directly or indirectly, anything about the patient's true location or blood pressure level.
- The Dynamic Anonymity proxy (C) must connect to a trusted party to decrypt the data (preventing man-in-the-middle attacks). Each stream is merged with multiple other data streams after decryption, so that an originating IP address cannot be associated with its decrypted data.
- Once at rest, residing in two separate databases (H and I), the blood pressure level and location data carry different sets of DDIDs, so that even the hosting company cannot establish any association between the two, let alone link either data set to the Data Subject who produced it. (A minimal sketch of this offset-and-DDID masking appears below.)
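A minimal sketch of this masking, under stated assumptions, follows. The epoch structure and field names are invented, and the per-epoch encryption key from step 3 is elided to keep the fragment short; only the DDID-plus-offset masking is shown.

```python
import secrets

def new_epoch():
    """Trusted party issues per-epoch parameters for one stream (sketch);
    the accompanying encryption key is elided here."""
    return {"ddid": secrets.token_hex(6), "offset": secrets.randbelow(1000)}

bp_epoch, gps_epoch = new_epoch(), new_epoch()   # refreshed each time period

def mask_reading(bpm, lat_micro):
    # Two separate records, keyed by unrelated DDIDs, each offset-masked.
    bp_rec  = {"ddid": bp_epoch["ddid"],  "value": bpm + bp_epoch["offset"]}
    gps_rec = {"ddid": gps_epoch["ddid"], "value": lat_micro + gps_epoch["offset"]}
    return bp_rec, gps_rec   # transmitted, and later stored, independently

bp_rec, gps_rec = mask_reading(bpm=55, lat_micro=40_014_984)
# Neither record reveals the true level or location without its offset, and
# no shared identifier links the two databases at rest.
```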
FIG. 1E illustrates using one embodiment of the present invention to select a location for a new clinic serving sexually transmitted disease (STD) patients aged 20 to 30. A "scrubbed" data set may show the incidence of STDs, clustered by neighborhood to protect privacy/anonymity. Another data set may show how many patients live in each neighborhood. However, even if these are combined, one cannot know precisely how many confirmed cases of the disease belong to a particular age group.
Dynamic Anonymity alleviates this dilemma by supporting two different modes of analysis.
Where data must be exposed externally (i.e., outside the CoT), personal data elements can be obscured or encoded as DDIDs, with the resulting associations stored within the CoT. When needed, data (or field) type identifiers may also be obscured in a similar manner.
Later, after the analysis has been performed, the results can be re-associated (where permitted) with the original Data Subjects, field types, and values.
Another way in which Dynamic Anonymity enables lossless analysis is through federated, anonymous queries, whether between different trusted parties within a CoT, between different data stores within the same trusted party, or between a trusted party and an application developer whose data store resides outside the CoT.
Consider again the question of choosing where to site a clinic serving patients aged 20 to 30. Dynamic Anonymity improves on the prior art by allowing the targeted query job to span multiple data stores and by partitioning it so that no participant knows the purpose it serves, and therefore there is no risk of leaking PD.
In this scenario, a query for the number of patients aged 20 to 30 within a (sufficiently large) set of geographic areas is presented to a number of trusted parties within a circle of trust. This aggregate query is then decomposed into several steps, such as:
1. Find patients aged 20 to 30 within a broad geographic area.
2. Of these, select only STD patients.
3. Select only those whose privacy/anonymity policies allow this level of analysis.
4. "Join" these results to the patients' home addresses.
5. Summarize the results by neighborhood, revealing only the number of patients.
The operations required to satisfy this query may span disparate data stores held by different organizations, yet all participants remain protected by, and benefit from, the circle of trust. A sketch of such a decomposed, federated query appears below.
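The decomposition above might be orchestrated roughly as follows. Every interface in this sketch ('health_store.match', 'policy_allows', 'resolve_neighborhoods') is a hypothetical stand-in for a trusted-party service; the point is that only DDIDs and final aggregate counts ever cross organizational boundaries.

```python
def federated_count_by_neighborhood(trusted_parties, age_range, condition):
    """Sketch of a federated, anonymous query over multiple data stores."""
    counts = {}
    for party in trusted_parties:
        # Steps 1-2: health stores match records keyed only by DDID.
        ddids = party.health_store.match(age_range=age_range, condition=condition)
        # Step 3: keep only Data Subjects whose policies allow this analysis.
        ddids = [d for d in ddids if party.policy_allows(d, "aggregate-location")]
        # Steps 4-5: the trusted party joins addresses internally and returns
        # only per-neighborhood counts, never identities.
        for neighborhood in party.resolve_neighborhoods(ddids):
            counts[neighborhood] = counts.get(neighborhood, 0) + 1
    return counts
```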
FIG. 1E shows the following scheme:
1. A potential clinic owner sends a query to a trusted party, asking to find STD patients aged 20 to 30.
2. The trusted party contacts healthcare-related data stores to locate STD patients aged 20 to 30.
3. The relevant healthcare data stores (which are keyed by DDIDs rather than by deterministic, identifiable keys) look up matching records.
4. The matching DDIDs are then transmitted back to the trusted party.
5. The trusted party then resolves these DDIDs to reveal the identified individuals.
6. The trusted party filters the list to those people whose privacy/anonymity policies allow this particular type of query.
7. The CoT then uses its database of addresses to cluster counts by neighborhood (or frequencies of occurrence, if the query is incomplete) in order to produce the desired result.
In this case, the companies operating the healthcare-related databases need not know (or divulge) the identities, locations, or other potentially identifying information of the patients whose data they hold. The records they hold are keyed by DDIDs and may themselves be obscured, so no personal data is generated either when the specified query is executed or when the results are transmitted.
Note that the party posing the query never gains access to this information. Its only interaction with the CoT consists of posing a question and receiving a high-level, aggregated, non-PD result. Note also that this inability to access the information in no way affects the quality, accuracy, or precision of the final result. Dynamic Anonymity thus eliminates the exchange of personal data that does not contribute to the end result; such data serves only to diminish privacy/anonymity, without any corresponding benefit to any party. By filtering out irrelevant data, which would otherwise consume time and resources to analyze, Dynamic Anonymity actually increases the utility and value of the information received.
Personal data is generated only temporarily within a circle of trust (the appropriate place for such information) administered by the trusted party, e.g., when DDIDs are resolved. Such operations are transient, leave no persistent trace beyond the intended query results, and may also be restricted to certain dedicated servers to improve security. The use of DDIDs within circles of trust can thereby avoid the potential drawbacks of conventional data analysis, which can produce discriminatory or even re-identifying results.
FIG. 1F illustrates an embodiment of the present invention that enables a shoe manufacturer to send coupons for a new line of shoes to people who have recently performed web searches related to running within a given city. In exchange for providing a discount on the shoes, the manufacturer may wish to receive the email and/or home addresses of qualifying consumers, and to send a survey to those who redeem the coupon to assess their satisfaction with the new shoes.
In FIG. 1F:
1. The manufacturer, outside the CoT, purchases a list of matching DDIDs from a search engine.
2. The DDIDs are submitted to one or more trusted parties, together with an offer letter and a policy modification that would allow access (upon acceptance) to the email and/or home address of the Data Subject.
3. Each trusted party then forwards the invitation to the Data Subjects matching those DDIDs (provided they have opted in to receive such offers).
4. If a Data Subject recipient accepts the offer, the recipient's policy is updated with (possibly time-limited) permissions to expose his or her home and/or email address to the shoe company.
5. The shoe manufacturer, now part of the CoT, but only with respect to this particular offer and only in the most limited sense, then receives a list of the email and home addresses of those who wish to receive coupons. It should be noted that this list is necessarily highly targeted and accurate, which is of the greatest value to the shoe manufacturer. This is exactly how the CoT can add value even while increasing privacy/anonymity: the shoe manufacturer can be assured that every mailing made in this manner goes to someone with a demonstrated interest in the manufacturer's offer.
FIG. 1G builds on the previous example in FIG. 1D, in which a GPS-enabled blood pressure monitor securely stores the patient's location and blood pressure level via Dynamic Anonymity. Dynamic Anonymity can be used to:
1. Avoid imposing HIPAA data-processing obligations on business associates participating in the data-processing flow, where the data they hold does not constitute Personal Data (PD).
2. Ensure that the doctor's access to, and use of, the data satisfies HIPAA obligations.
Note that the following scenario assumes that both the Data Subject patient and his/her doctor have accounts within the circle of trust.
In FIG. 1G:
1. The monitoring application cooperates with the patient's trusted party to allow the patient to update his/her privacy/anonymity policy rules so that his/her doctor may now access his/her blood pressure levels (but not his/her GPS location data). Note that this authorization may be temporary (similar to the time-limited nature of photos shared via Snapchat: the authorization expires after a period of time) or persistent.
2. The doctor browses (via his/her web browser) to the blood pressure monitor's website, which launches a JavaScript-based blood pressure level viewer application. The application therefore runs in the doctor's browser rather than on the monitoring company's servers (i.e., the joining of the data required to render it identifiable to him/her is performed apart from the trusted party's own servers; see steps 4 and 5 below).
3. The blood pressure level viewer application requires the doctor to log in via his/her trusted party (similar to the way many applications allow you to authenticate using a third-party account such as Facebook) and receives a session cookie that continues to identify him/her to that party. (Facebook is a trademark of Facebook, Inc.)
4. After the doctor selects the time frame to view, the viewer application requests the patient's associated DDIDs and offsets from the trusted party.
5. The trusted party verifies the doctor's access to this information (checking the patient's privacy/anonymity policy rules) and then returns the DDIDs and offsets.
6. The viewer application then contacts its own company's website, requests the blood pressure data corresponding to those DDIDs, receives the results, applies the offsets, and presents the blood pressure levels as a chart.
At this point, the image on the doctor's screen is PHI protected by HIPAA. If the doctor prints the data, the paper copy is likewise constrained by HIPAA. When the doctor finishes viewing the chart and logs off or closes the browser, the application ends and the data is erased.
Note that the re-identified, HIPAA-controlled data resides only in the doctor's browser. The raw blood pressure level data stored in the application provider's database remains unchanged and obscured. The trusted party's data is likewise unaffected.
Note also that the right to view blood pressure data is enforced within the circle of trust. It is not enforced solely by the viewer application (as is common practice today), nor solely by the application's backend servers. This means that an adversary cannot gain unauthorized access to the data merely by compromising the blood pressure level viewer application, because the data does not exist there in any usable or recognizable form. Combining the dynamic data-obscuring capability of Dynamic Anonymity DDIDs with the dynamic data privacy/anonymity controls of the circle of trust maximizes both data privacy/anonymity and data value, and supports personalized medicine and medical research.
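The browser-side flow of FIG. 1G can be condensed into a short sketch. The interfaces below ('trusted_party.login', 'get_keys', 'monitor_api.fetch') are hypothetical; what matters is that the DDID/offset grant and the masked stored data are combined only inside the authorized, transient session.

```python
def view_blood_pressure(doctor, patient, timeframe, trusted_party, monitor_api):
    """Sketch of the FIG. 1G flow: re-identification happens only inside
    the authorized viewer session, never in the provider's database."""
    session = trusted_party.login(doctor)                        # step 3
    grant = trusted_party.get_keys(session, patient, timeframe)  # steps 4-5
    if grant is None:                # policy check failed, or access expired
        return None
    series = []
    for ddid, offset in grant:       # step 6: fetch by DDID, then un-mask
        masked = monitor_api.fetch(ddid)   # stored data stays obscured
        series.append(masked - offset)     # PHI exists only here, in-session
    return series                    # erased when the session ends
```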
The different nodes depicted on the left side of FIG. 1H (FIG. 1H-A) represent data elements associated with two different Data Subjects that can be tracked, profiled, and/or analyzed by third parties, because the elements can be associated with, and/or re-identified for, each Data Subject. FIG. 1H-B is a simplified visual depiction of the same data elements retained dynamically, without loss of context. The Family Educational Rights and Privacy Act (FERPA) is a federal privacy regulation that governs access to, and disclosure of, students' educational records containing Personally Identifiable Information (PII). FERPA specifies that PII cannot be disclosed; but if PII is removed from a record, the student becomes anonymous, privacy is protected, and the resulting de-identified data may be disclosed. In addition to legally specified categories (e.g., name, address, social security number, mother's maiden name, etc.), FERPA's definition of PII includes "...other information that, alone or in combination, is linked or linkable to a specific student that would allow a reasonable person in the school community, who does not have personal knowledge of the relevant circumstances, to identify the student with reasonable certainty." As shown in FIG. 1H-B, Dynamic Anonymity, by means of an Anonos circle of trust (CoT), can obscure the connections between each Data Subject and the data elements in a controlled manner, thereby enabling education-related data to be used without revealing PII.
FIG. 1I illustrates an example of a process for performing a Disassociation Level Determination (DLD) and creating an Anonymity Measurement Score (AMS) according to an embodiment of the present invention. Determining the DLD may involve mathematical and/or empirical analysis of the uniqueness of data elements prior to disassociation, in order to assess the degree of disassociation required to reduce the likelihood of identification or re-association by an improperly authorized adversary. The DLD value may then be used as an input in determining the level of disassociation/replacement appropriate for different types of data elements.
The AMS may be used to associate a mathematically derived level of certainty with tiered levels and/or categories of anonymity, reflecting the likelihood that a third party could discern personal, sensitive, and/or identifying information. In other words, AMS values may be used to evaluate the output of disassociation/replacement activity in order to determine the level/type of consent required before the data may be used.
In step (1) of FIG. 1I, data attributes may be evaluated to determine the DLD, i.e., the data elements are analyzed to determine their potential for directly or indirectly revealing personal, sensitive, identifying, or other information for which anonymity protection is desired. In step (2), based at least in part on the determined DLD, the data elements may be dynamically anonymized by means of disassociation; in addition, data elements may also be replaced. In step (3), a calculation may be performed, e.g., by means of a mathematical function/algorithm (such as one whose output is reflected in FIG. 1J), to produce an AMS associated with the likelihood that a third party could identify the Data Subject associated with the data attributes after disassociation/replacement with DDIDs. Finally, in step (4), the score/rating calculated in step (3) may be used to specify the level of consent/participation required from the Data Subject to whom the anonymized data attributes pertain, versus the degree to which third parties may use the anonymized data attributes without the Data Subject's consent/participation, as shown in the example AMS usage reflected in FIG. 1K below.
Different categories of information have different statistical likelihoods of re-identification. Each data element has an inherent level of uniqueness, both on its own and when combined with other data, determined by location, order, and/or frequency of occurrence. For example, viewed as a single data point, a social security number is highly unique and therefore more easily re-identified than a single data point such as gender, since everyone has an approximately 1-in-2 probability of being male or female. Because gender is far less unique as an identifier than a social security number, the possibility of re-identifying someone by gender alone is much smaller than by social security number.
An Anonymity Measurement Score (AMS) measurement scheme ties the statistical probability of re-identification to the degree and/or level of disassociation and/or replacement applied to data elements, creating a range of ratings. As a single-data-point example, a social security number that is not disassociated or replaced at all might merit an AMS of 100, meaning that its uniqueness classifies it as a very high re-identification risk; gender, as a single-data-point identifier without disassociation or replacement, might merit an AMS of 10, because it is classified as a low re-identification risk even without adequate de-identification measures.
In an exemplary implementation treating the social security number as a single data point, a level 1 implementation may assign a DDID for disassociation and/or replacement purposes while retaining the initially assigned value, i.e., a persistent assignment (e.g., where the data is used as output for a hard-copy representation of the data). In the case of a social security number, a level 1 application of a DDID might reduce the AMS by 10%, resulting in a modified AMS of 90. This still represents a high re-identification risk, but it is more secure than an element that has not been disassociated and/or replaced.
In an exemplary level 2 implementation, the social security number may have a DDID assigned for disassociation and/or replacement purposes, retaining the initially assigned value until the value is changed on a one-way basis, i.e., one-way changeability (e.g., where the data value may be changed unilaterally by sending new information to a remote card, mobile, wearable, and/or other portable device capable of electronically receiving and storing information). The AMS of the social security number might then be reduced by another 10%, resulting in an AMS of 81.
Continuing this example to a level 3 implementation, the social security number may have a DDID assigned for disassociation and/or replacement purposes while retaining the initially assigned value, but the DDID may change on a two-way basis, i.e., dynamic changeability (e.g., where data values may be altered bidirectionally by dynamically sending and/or receiving data between client/server and/or cloud/enterprise devices, with the ability to dynamically receive and alter specified data). The social security number's AMS would then decrease by a further 50%, resulting in an AMS of 40.5.
As de-identification measures are applied to data points via disassociation and/or replacement using DDIDs, the risk of re-identification is reduced. The AMS determination derives from a likelihood function for one or more identifiers being re-identified together. This, combined with the process used to obscure the data elements, can then be broken down into categories or other classification schemes to determine such things as the permitted uses and the level of permission an entity must have before using the data. This process may be applied to individual or aggregate AMS scores. An aggregate AMS score expresses the multi-data-point re-identification probability obtained by grouping AMS scores together to represent the level of uniqueness of the combined data points.
As an example of a possible category classification scheme, AMS scores may be divided into categories A, B, and C, where category A, data with a single or aggregate score of 75 or above, may be used only with the current, explicit, and unambiguous consent of the Data Subject. Category B may represent a single or aggregate AMS score of 40 to 74.9, meaning the data set may be used with (i) current or (ii) prior express consent of the Data Subject. Category C may represent a single or aggregate AMS score of 39.9 or below, which may allow the data set to be used without the Data Subject's consent.
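The level reductions and consent categories described above reduce to a few lines of arithmetic. The sketch below simply encodes the illustrative figures from the text (successive 10%, 10%, and 50% reductions; category cutoffs at 75 and 40); the base scores are examples, not normative values.

```python
BASE_AMS = {"ssn": 100, "gender": 10}   # illustrative single-data-point scores

# Level 1: persistent assignment; level 2: one-way change; level 3: dynamic change.
LEVEL_FACTORS = {0: 1.0, 1: 0.9, 2: 0.9 * 0.9, 3: 0.9 * 0.9 * 0.5}

def ams(identifier, level):
    """AMS after DDID-based disassociation/replacement at the given level."""
    return BASE_AMS[identifier] * LEVEL_FACTORS[level]

def consent_category(score):
    if score >= 75:
        return "A"   # current, explicit, unambiguous consent required
    if score >= 40:
        return "B"   # current or prior express consent suffices
    return "C"       # may be used without Data Subject consent

assert round(ams("ssn", 1), 1) == 90.0
assert round(ams("ssn", 2), 1) == 81.0
assert round(ams("ssn", 3), 1) == 40.5
assert consent_category(ams("ssn", 3)) == "B"
```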
In the example disclosed in FIG. 1J, each identifier other than the social security number discussed above (i.e., credit card number, first name, last name, date of birth, age, and gender) is likewise assigned a non-disassociated/non-replaced AMS rating in the first column. In the next two columns (i.e., levels 1 and 2), the AMS of each identifier is successively reduced by 10%; in the last column (i.e., level 3), the AMS is reduced by a further 50%. These reductions in AMS reflect the increased obfuscation enabled by DDIDs through persistent assignment (level 1), one-way change (level 2), and dynamic change (level 3).
As described above, exemplary calculated Anonymity Measurement Scores are illustrated in FIG. 1J, according to one embodiment of the present invention. These AMS values are for illustrative purposes only and demonstrate the fact that certain types of potentially personally identifying information are more likely than others to reveal the true identity of a Data Subject, and that additional levels of disassociation/replacement, e.g., one-way (i.e., level 2) and/or dynamic (i.e., level 3) change, can increase the amount of anonymity provided to a Data Subject by anonymization systems and schemes.
As described above, FIG. 1K shows exemplary categories of the consent/participation levels required of a Data Subject for certain calculated Anonymity Measurement Scores, according to one embodiment of the present invention. These classifications are for illustrative purposes only and demonstrate that different aggregate scores may be handled under different classifications. For example, category A data may be used only with the current, explicit, and unambiguous consent of the Data Subject; category B data may be used with current or prior explicit consent of the Data Subject; and category C data may be used without the Data Subject's consent. Other approaches may be adopted to meet the needs of particular implementations.
FIG. 1L illustrates an exemplary embodiment using the DDIDs of the present invention for emergency response purposes. In step (1) shown in FIG. 1L, data attributes are evaluated to determine applicable emergency response distinctions, e.g., whether a dwelling is located in a flood zone, or whether an individual has limited mobility or needs specific life-saving equipment or medical care. In step (2), the applicable data elements are dynamically anonymized using DDID disassociation and/or replacement to protect citizens' privacy/anonymity, and the obscured information is sent to a DDID-obscured emergency response database. In step (3), a trusted party evaluates the information to determine the data elements relevant to responding to a particular emergency. Finally, in step (4), the trusted party reveals to the obscured emergency response database the Association Keys (AKs) and/or Replacement Keys (RKs) required to reveal the needed information represented by the DDIDs, for the duration of the emergency event and the associated response.
In the exemplary embodiment reflected in FIG. 1L, the data resides in the emergency response database in a dynamically DDID-obscured state such that identifying information is neither discernible nor re-identifiable until an appropriate triggering event occurs and provides the necessary Association Keys (AKs) and/or Replacement Keys (RKs). A trigger operation performed by the trusted party issues time-sensitive AKs/RKs for the appropriate portions of the data, at a specified level of obscurity or transparency depending on the event type. Identifying information may be stored in the emergency response database, but in a dynamically DDID-obscured state; a data mapping engine controlled by the trusted party maintains the information relating the dynamically changing DDIDs to the AKs/RKs, which is the information necessary to identify and/or re-identify the data and is provided only when an appropriate emergency event occurs.
Policies external to the system determine which information may be relevant to different events and event phases, and what degree of obscurity/transparency is appropriate at different times, so that not all information is released at once and irrelevant or sensitive information is not released without reason. These rights are then encoded so as to trigger access in case of emergency. Compared with a static list or one-way communication capability, this method also allows two-way communication with affected individuals and verification of their locations.
The AKs/RKs are changed and reintroduced into the emergency response database after each event, so that the information is retained, on a continuing electronic basis, in a DDID-obscured state; i.e., a new trigger must issue new AKs/RKs before any portion of the data can be read again (the AKs/RKs provided in response to a previous event will no longer reveal the potential identifying information associated with the dynamically changing DDIDs once that emergency response event has been resolved). This protects the privacy/anonymity of individual citizens while allowing appropriate access to the data, for a limited time, in order to protect their safety in the event of a major incident. For emergency management, this can reduce the need for resource-intensive information acquisition and processing procedures during large events. (A minimal sketch of trigger-based key release and rotation follows.)
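Trigger-based key release and post-event rotation of this kind might be sketched as follows. The class and the 'fields_policy' structure are hypothetical; a real deployment would bind key release to authenticated trigger events controlled by the trusted party.

```python
import secrets

class EmergencyResponseDB:
    """Sketch: data rests DDID-obscured; a trusted-party trigger releases
    time-sensitive keys scoped to the event, then rotates them afterwards."""

    def __init__(self):
        self.keys = {}   # data field -> current association/replacement key

    def trigger(self, event_type, fields_policy):
        """Release AKs/RKs only for the fields that the externally defined
        policy deems relevant to this event type (e.g. flood -> location)."""
        return {f: self.keys.setdefault(f, secrets.token_hex(8))
                for f in fields_policy.get(event_type, ())}

    def resolve_event(self):
        """Rotate all keys so previously released AKs/RKs reveal nothing."""
        self.keys = {f: secrets.token_hex(8) for f in self.keys}

db = EmergencyResponseDB()
policy = {"flood": ("location", "mobility_status")}
released = db.trigger("flood", policy)   # time-sensitive keys for this event
db.resolve_event()                       # the released keys are now useless
```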
In addition, new data relating to individuals may be added during an incident, such as an "accounted for" or "missing" status assigned during an evacuation. With embodiments of the present invention, this new input may become part of the person's profile, held in an obscured state, and retained for future authorized use if helpful in a similar or subsequent emergency.
Under a local opt-in model, citizens may register so that information relevant to emergency situations is stored in the DDID-obscured emergency database. The emergency database may be stored locally or elsewhere, but can interoperate in the event of a cross-jurisdictional incident. Once citizen data is entered into the DDID-obscured system, no one can see or access the data in an identifiable or re-identifiable manner until a trigger mechanism controlled by the trusted party results in the release of the dynamic, context-based AKs/RKs necessary to identify/re-identify the appropriate components of the stored data.
Two examples of emergency management views in potential embodiments of the present invention include:
1. An interactive screen may display overlays that allow Geographic Information System (GIS) and other data to be combined or linked with specific location data; i.e., clicking on a house may display information submitted by the citizen, as well as information held by the jurisdiction about the subject premises and related disaster risks. Flood warnings are a good example of notifications that may provide different amounts of information depending on the specific locations of different individuals: a general flood warning may be sent throughout the region, while specific warnings may be sent to those directly within the flooded area who are at greater risk of flooding.
2. More traditional formats, such as spreadsheets, etc., may be extended to provide non-geographic data.
The two formats described above may also interoperate, with data in one format represented in the other in an interactive or linked manner.
In the case of monitoring and warning, the location of a weather phenomenon (determined by weather radar, GIS maps, etc.) determines the subset of information published, which can be further revealed from within the database.
In another example, a criminal may be targeting a particular demographic. In that case, in addition to partially masked location data, the DDIDs for data such as contact and demographic information would also be activated, in order to create a general perimeter for outgoing messages. The relevant data fields and their DDIDs would be activated to reach individuals matching the demographic, so that they can then be notified of the criminal activity.
In an emergency requiring evacuation, this information may be triggered to help emergency personnel deploy resources more efficiently, in addition to assisting the evacuation or identifying those who may need additional help in the emergency. In another example, such as a snowstorm, the system may be triggered to let emergency personnel know, via GPS location information associated with the patients' mobile devices, exactly where renal dialysis patients are located in their cities for emergency transport through the snow; this information remains represented by non-discernible/non-re-identifiable DDIDs until the triggering event releases the applicable AKs/RKs to reveal the appropriate relevant information.
Contextual security and privacy enabled by just-in-time identity (JITI)
The terms "instant identity" and/or "JITI" are used herein to refer to the dynamic anonymous approach and system described by the policy. The term "JITI key" or the term "key" as used herein refers to the terms "associated key", "alternate key", "temporal key", "AKs", "RKs", "TKs" and/or "key" as used herein.
The methods and systems for universal granularity, context, programmatic protection of data disclosed in this section shift focus to who can access the data (because without an Anonos immediate identity (JITI) key, the data is unintelligible) and re-focus on who can access the JITI key and the range of use supported by each JITI key.
By enforcing data privacy and security policies in a technologically flexible, selective, and contextual manner, down to low-level (even single) data elements, JITI maximizes authorized data use while minimizing unauthorized data use. JITI facilitates compliance with, and auditability of, established privacy policies by enabling mathematical, statistical, and/or actuarial measurement and monitoring of data use. JITI enables the same data store to programmatically support privacy policies applicable to multiple companies, states, regions, countries, industries, etc. simultaneously, and to adjust in real time to the changing requirements of those policies by dynamically modifying the intelligible data forms into which DDIDs are converted.
With JITI, data down to the minimum required data element level (e.g., down to a single datum) is dynamically obscured by replacing the data with dynamic de-identifiers (DDIDs), as described more fully herein. For example, instead of storing a person's actual name, the name is replaced with a DDID. Importantly, JITI replaces data elements at the data layer, rather than masking data at the presentation layer. By replacing data elements with DDIDs, and further by severing the relationships between data elements, data is dynamically obscured down to the element level of the data layer, and tracking, profiling, inferring, deducing, analyzing, or otherwise directly or indirectly understanding, or associating, the data becomes extremely difficult without access to the JITI keys needed to "transform" the DDIDs into an understandable form. For purposes of this application, "transform" means, without limitation, to revise, shorten, compress, encode, replace, render, compute, translate, encrypt, decrypt, substitute, exchange, or otherwise perform a mathematical function or identifiable operation on a DDID by mechanical, physical, electronic, quantum, or other means.
Returning to FIG. 1H, the spheres on the left side of FIG. 1H represent data elements; the metadata (i.e., data that provides information about other data) represents, reveals, and establishes interrelationships between and among the top three spheres, and between and among the bottom four spheres, thereby enabling the tracking, profiling, inference, analysis, understanding, and correlation represented by the dashed lines between and among the spheres on the left side of FIG. 1H. On the right side of FIG. 1H, the different designs on each sphere represent unique dynamic de-identifiers (DDIDs) used to replace the data elements represented by the spheres. As a result of using different DDIDs, no metadata exists or is associated with any of the spheres on the right side of FIG. 1H to indicate any interrelationship between or among any of the spheres representing data elements. Without access to the JITI keys needed to convert the DDIDs into an understandable form, replacing data elements with DDIDs significantly increases the difficulty of any attempt to track, profile, reason about, infer, analyze, understand, or correlate any of the spheres representing data elements.
By providing accurate, context-sensitive, programmatic enforcement at the front end, compliance with policies governing data protection (e.g., security, privacy, and/or anonymity) at the back end can be more easily audited, thereby increasing the accountability and trust necessary for widespread acceptance of data analysis and use, both domestically and internationally, and maximizing data value while also improving protection of that same data. The same data may be subject to different jurisdictional requirements depending on the source and/or use of the data. For example, data representing a heart rate reading (e.g., 55 beats per minute) may be subject to different privacy policies depending on how the data was captured.
For example, if the data is captured by a personal health device in the United States, use of the data may be constrained only by the terms and conditions of the device and/or application that captured the information. If the data is captured in connection with the provision of medical services in the United States, use of the data may be constrained by the federal Health Insurance Portability and Accountability Act (HIPAA) and applicable state laws. If the data is captured in a federally sponsored research context in the United States, use of the data may be constrained by the "Common Rule," e.g.: Department of Agriculture, 7 CFR (Code of Federal Regulations) Part 1c; Department of Energy, 10 CFR Part 745; National Aeronautics and Space Administration, 14 CFR Part 1230; Department of Commerce - National Institute of Standards and Technology, 15 CFR Part 27; Consumer Product Safety Commission, 16 CFR Part 1028; Agency for International Development, 22 CFR Part 225; Department of Housing and Urban Development, 24 CFR Part 60; Department of Justice, 28 CFR Part 46; Department of Defense, 32 CFR Part 219; Department of Education, 34 CFR Part 97; Department of Veterans Affairs - Office of Research Oversight / Office of Research and Development, 38 CFR Part 16; Environmental Protection Agency, 40 CFR Part 26; Department of Health and Human Services, 45 CFR Part 46 (also applicable to the Central Intelligence Agency, the Department of Homeland Security, and the Social Security Administration); and Department of Transportation, 49 CFR Part 11. Scalable, programmatic, generally applicable data protection and compliance technology solutions (e.g., JITI) may therefore be needed, among other reasons, to accommodate privacy policies tailored to the different jurisdictions of different enterprises, industries, governments, regulatory agencies, and/or other stakeholder groups.
In a preferred embodiment, the granular, contextual, programmatic enforcement methods and systems for privacy policies disclosed herein enable real-time de-identification and anonymization solutions and/or services that help address concerns about unintended access to, and use of, data in violation of privacy policies, thereby overcoming the limitations of other approaches to protecting data. By contrast, other methods of protecting data (e.g., improving the security, privacy, and/or anonymity of data) are generally binary: data protection may be furthered at the expense of data value, or data value may be furthered at the expense of data protection. For example, efforts to increase data security by encrypting data may result in data that is protected but unusable in its protected form; conversely, the data becomes vulnerable to attack when decrypted for use.
FIG. 1M compares the impact on the preservation of data value of other methods of data protection (security and privacy) with the preservation (or even expansion) of data value under the present invention (i.e., JITI) and the other inventions contained herein. Column 1 of FIG. 1M represents the effect of binary protection (e.g., encryption), where the black field at the top shows the value of the original data (in unprotected form), while the dashed field indicates that the data's value is lost when the data is in protected form, rendering it unusable. Column 2 of FIG. 1M represents the reduction in data value that results both from data being removed from the ecosystem in response to concerns about data being used for purposes beyond the primary intended use ("data minimization"), and from the use of traditional static approaches to obscuring data to achieve de-identification, which reduce the value of the data. Column 3 of FIG. 1M shows that 100% of the data value is retained under JITI. Finally, column 4 of FIG. 1M shows the potential for data value to be expanded through data fusion enabled by the use of JITI.
It is worth noting that JITI-based techniques need not be used in place of other known data protection techniques (i.e., security and privacy); indeed, JITI may be used in conjunction with them. A principal benefit of rendering data as DDIDs with JITI is that, if other protections fail, the disclosed data has neither value nor meaning to anyone who cannot access the JITI keys necessary to render the DDIDs in an understandable form.
FIG. 1N shows two important steps in one potential embodiment of the JITI invention. Step 1, above the horizontal dividing line in FIG. 1N, highlights eliminating visible links between data elements so that one cannot deduce or infer relationships between them. Rendering the data elements as DDIDs dynamically obscures the cleartext source data. Data represented by DDIDs still exists but, from an information-theoretic perspective, the knowledge or context required to understand the data is separated from the data by the JITI key: the DDID by itself thus conveys no information about the underlying data elements. Step 2, below the horizontal dividing line in FIG. 1N, involves JITI key assignment based on policy-driven controls (e.g., purpose, place, time, and/or other specified triggers) over JITI key enablement, allowing selective disclosure of the data; as the data is selectively revealed, the level of detail/clarity provided to each key holder (e.g., original cleartext, perturbed values, summary information, etc.) may also be dynamically controlled. Notably, there is no limit on the number of different selective disclosures that can be made, in series or in parallel; no limit on the number of different authorized users to whom any one or more disclosures may be made; and no limit on the constraints or policies (e.g., time, purpose, place, other factors (associations, relationships, quantities), etc.) governing such disclosures.
JITI's granular, contextual, programmatic enforcement of data protection (e.g., data security, privacy, and/or anonymity) policies supports statistical evaluation of the probability of data leakage and/or data re-identification, or rank ordering (i.e., non-parametric treatment) of such events. From an information-theoretic perspective, JITI is more efficient than other methods of protecting data because the value of the data remains accessible while the identifying information does not. In other words, no leakage of identifying information means zero information is leaked, while the value of the data is deliberately and safely "leaked" in a positive manner (which can itself be optimized using standard information theory), meaning that this value is available to authorized users.
JITI's granular, contextual, programmatic structure supports mathematical proof that the probability of data leakage or re-identification is significantly reduced. One example of a mathematical demonstration of JITI's effectiveness is the conclusion, upon analysis by data scientists, that data in which values have been replaced by DDIDs down to the data element level (a process referred to herein as "anonosizing" the data) offers no greater probability of re-identification than guessing, much as with strongly encrypted data. Unlike encrypted and other non-anonosized data, however, anonosized data can be used in its protected form to generate value. Further: (a) different DDIDs may be assigned to the same data element at different times, and/or in different places, and/or for different purposes, and/or according to other criteria, making it extremely difficult for parties not in possession of a JITI key to track, profile, infer, deduce, analyze, or otherwise understand the protected data; and (b) if a given DDID expires for any reason, it may (but need not) be assigned to a different data element, again at a different time and/or in a different place and/or for a different purpose and/or according to other criteria, making it extremely difficult for an adversary or other "bad actor" to establish any meaningful continuity or audit trail, because these reassigned DDIDs will reference data elements that bear no meaningful relationship (correlated or otherwise) to any of the data elements to which they were previously assigned. Refer back to FIG. 1B for possible criteria triggering the assignment, application, expiration, and reclamation of DDIDs and/or JITI keys.
JITI's granular, contextual, programmatic enforcement of privacy policies severely curtails the "Mosaic Effect," defined as the phenomenon whereby data, when combined with other data, poses privacy or security risks even if the data itself is not identifying. For example, Latanya Sweeney, Professor of Government and Technology in Residence at Harvard University, is remembered for her work demonstrating that knowledge of just three separate identifiers, (1) zip code, (2) gender, and (3) date of birth, could result in 87% of the U.S. population (i.e., 216 million of the 248 million U.S. citizens at the time) being individually re-identified. To do this, however, the zip code, gender, and date of birth of the same person must be known. Using JITI, the common ownership of data elements can be concealed by associating each data element with a different (or dynamically changing) DDID, rather than associating all three data elements with the same static identifier. With JITI, it is difficult to know whether a given zip code, gender, or date of birth pertains to one person or to several, thereby severely curtailing the "Mosaic Effect."
One potential implementation of the methods and systems for granular, contextual, programmatic data protection disclosed herein involves developing mathematical/statistical/actuarial models to reduce insurance risk. The granular, context-driven, programmatic protections for data disclosed herein make it possible to mathematically measure the regularity needed to develop algorithms that better assess the pricing and underwriting of risk. By ensuring protection of data security, privacy, and/or anonymity at the individual consumer level, aggregating more data on a broader, more demographically representative basis becomes more acceptable, which can improve the accuracy and value of the data associated with the risk.
Another potential embodiment of the methods and systems for granular, contextual, programmatic data protection disclosed herein requires the use of multiple JITI keys, ensuring the consent of multiple interested parties before a DDID is rendered. Requiring multiple JITI keys to unlock data values from DDIDs (i.e., "n of m" models, in which all, or some threshold percentage, of the available key fragments are needed) ensures that the interests of every stakeholder are respected in multi-stakeholder or highly sensitive data access/disclosure scenarios.
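An "n of m" unlock can be sketched as a simple threshold check. The escrow interface below is hypothetical, and a production system would more likely use threshold cryptography (e.g., Shamir secret sharing) so that no single party, including the escrow, could reconstruct the key alone.

```python
def unlock_ddid(ddid, fragments, escrow, threshold):
    """Release the JITI key for a DDID only when enough distinct
    stakeholder key fragments are presented ('n of m').  Sketch only;
    'escrow' is a hypothetical interface."""
    valid = {f for f in fragments if f in escrow.registered_fragments(ddid)}
    if len(valid) < threshold:
        raise PermissionError("insufficient stakeholder consent")
    return escrow.release_key(ddid)
```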
Another potential embodiment of the methods and systems for granular, contextual, programmatic data protection disclosed herein encapsulates highly fine-grained access rules (down to a 1:1 ratio of JITI key triggers to data elements, although this should not be read as excluding many-to-one, one-to-many, or many-to-many mappings between JITI key triggers and data elements, as such embodiments are also envisioned) that, among many potential parameters and without limitation, specify any partial or full degree, context, specificity, abstraction, language, and accuracy to which a DDID is authorized to be translated. In this embodiment, the access rules may be encoded into one or more JITI keys that are programmatically enforced to ensure that a DDID is unlocked, and its original content revealed, only if all explicit access rules are adhered to and enforced. JITI supports multiple and/or cascading policies contained in assigned JITI keys by enabling "overrides," such that when multiple policies apply, only the most restrictive applicable policy is enforced; alternatively, the most restrictive policies can be combined to create a new "maximally" restrictive policy, whether statically or dynamically, and in batch, near-real-time, or real-time scenarios.
FIG. 1P-1 highlights how a hypothetical consumer, "Scott" (represented in 4 different purchase transactions by the static anonymous identifier 7abc1a23), can be re-identified using metadata captured in the financial transactions he entered into. With JITI, after the first assignment of 7abc1a23, each subsequent appearance in FIG. 1P-1 of the static anonymous identifier 7abc1a23 representing "Scott" is replaced by a different DDID.
Accordingly, the DDID 7abc1a23 shown in FIG. 1P-2 appears only once; in the three other transaction records where 7abc1a23 previously appeared, what appear instead are the DDIDs 54#3216, deTym321, and HHyarglm. By using JITI to change the DDID that references Scott, Scott is effectively de-identified on a per-transaction basis, providing him with a just-in-time identity for each transaction. Scott therefore cannot be re-identified by correlating these dynamic anonymous identifiers.
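The per-transaction replacement shown in FIGS. 1P-1 and 1P-2 can be sketched as follows. The function and field names are invented for illustration; the essential point is that the link table pairing each fresh DDID with the underlying identifier is retained only within the circle of trust.

```python
import secrets

def anonosize_transactions(transactions):
    """Replace a static anonymous identifier with a fresh DDID on every
    appearance, so records cannot be linked to one consumer by correlation."""
    link_table = {}   # ddid -> static id; retained only inside the CoT
    public_view = []
    for txn in transactions:
        ddid = secrets.token_hex(4)
        link_table[ddid] = txn["id"]
        public_view.append({**txn, "id": ddid})
    return public_view, link_table

txns = [{"id": "7abc1a23", "merchant": m}
        for m in ("grocer", "pharmacy", "cafe", "bookshop")]
public_view, link_table = anonosize_transactions(txns)
# Each record now carries a different identifier; re-linking the four
# transactions requires the link_table held within the circle of trust.
```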
Different JITI keys may "unlock" different views of the same DDID or its underlying value, thereby providing granular control of the degree of detail or concealment visible to each user based on the context of the user's authorized use of the data (e.g., the purpose, location, time or other usage attributes of the authorization). For purposes of this application, "unlock" means decode, translate, reveal, become permanently or transiently visible, or provide a unique "slice" consisting of a subset of a larger data set, where such a slice may not contain data elements, a single data element, or any combination of any number of data elements. The JITI key renders the DDID into an understandable form is triggered by a specified JITI key trigger (e.g., a destination, a location, a time, and/or other specified trigger), which may be used alone or in combination with other triggers. Thus, all DDIDs (including masked DDIDs) based on satisfying the JITI key trigger will be presented differently to different users and/or at different times, and/or at different locations and/or other attributes used. As described above, fig. 1B depicts various exemplary events that may trigger allocation, application, expiration, and reclamation of DDIDs with respect to data elements (e.g., data attributes and/or attribute combinations) and/or JITI keys.
Another example embodiment of the invention relates to medical services. In this example embodiment, the plaintext value of 55 Beats Per Minute (BPM) is replaced with a DDID having an "ABCD" value. It is noted that the example DDIDs provided in this application are typically presented as a few characters in length for purposes of simplifying the description only, but in practical embodiments these DDIDs may be any finite length. The DDID, ABCD, used in this potential example is programmed to its unmodified original value "55BPM", presented by the key holder of only those JITI keys meeting all of the following applicable requirements (by "applicable", it is meant that JITI key access may be based on one, some or all of the attributes listed below):
1.) Purpose-based: in this example, the following are relevant:
a.) Verifying the identity of the key holder (e.g., by password, multi-factor authentication, or any other verification process); and/or
b.) Verifying that the individual key holder is authorized to view the JITI-key-enabled data (e.g., by comparing the authenticated identity of the key holder against the identities of the medical personnel assigned to care for the patient), or that the key holder is indirectly authorized, through inherited attributes (e.g., from a set, group, class, or other structure of any size to which the individual belongs), to have JITI access to the source data.
2.) Physical-location-based: in this example, the following are relevant:
a.) A physical location associated with providing care to the patient (e.g., within a specified distance of the patient's room, and/or a medical station on the same floor as the patient's room); and/or
b.) The physical location of a device belonging to authenticated and authorized personnel (e.g., within a specified distance of the mobile phone, device, and/or sensor issued to each authenticated and authorized nurse).
3.) Temporal (time-based): verifying that the access occurs within an allowed time period (e.g., by comparing the current time against the times at which the key holder is scheduled to provide care to the patient).
FIG. 1Q illustrates the medical services embodiment described above. For example, a first JITI key, used by an authorized medical provider while on shift and within a specified distance of the patient's room or an associated medical station, may be configured to unlock the full original value of DDID "ABCD," so that the provider is shown "55 BPM." A second JITI key, used by an authorized medical provider during his or her shift but beyond the specified distance from the patient's room or associated medical station, would be configured to unlock a perturbed (e.g., altered) version of the original value of DDID "ABCD," so the provider would see the range "50-60 BPM." A third JITI key, used by an authorized medical provider outside his or her shift hours and beyond the specified distance from the patient's room or associated medical station, would be configured to unlock a descriptive statement about the original value of DDID "ABCD," so the provider would be shown the explanation "normal heart rate," without any current information about the patient's actual heart rate. In a fourth situation, an authorized medical provider (after a successful authentication action) holds a fourth JITI key that is not authorized to reveal patient-specific heart rate information, preventing the provider from seeing anything other than the DDID. By extension, if no JITI key is provided, or if an unauthenticated or unauthorized person attempts to use a JITI key, that person will see nothing but the DDID.
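The four views of DDID "ABCD" described above can be sketched as a single rendering function. Everything here is hypothetical: the 'key' and 'context' objects, the 10-BPM banding, and the "normal" threshold are invented for illustration. The sketch merely shows how one DDID can yield the full value, a perturbed range, a descriptive statement, or nothing but the DDID itself, depending on which JITI key triggers are satisfied.

```python
def render(ddid, key, context, store):
    """Render a DDID according to the JITI key triggers that are satisfied.
    'store' maps DDIDs to source values inside the CoT, e.g. {"ABCD": 55}."""
    if not (key.authenticated and key.authorized_for(context)):
        return ddid                          # nothing but the DDID is visible
    bpm = store[ddid]
    if context.on_shift and context.near_patient:
        return f"{bpm} BPM"                  # full original value: "55 BPM"
    if context.on_shift:
        lo = (bpm // 10) * 10                # perturbed range: "50-60 BPM"
        return f"{lo}-{lo + 10} BPM"
    if 50 <= bpm <= 100:                     # illustrative "normal" band
        return "normal heart rate"           # descriptive statement only
    return "abnormal heart rate"
```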
Fig. 1R illustrates one potential architectural embodiment supporting JITI for the exemplary medical services embodiment described above. In this potential embodiment, an "authentication module" is used to verify a user's authorization to retrieve DDIDs by means of what is referred to hereinafter as the "Anonos JITI policy engine"; the ordering and application of the various JITI key scenarios then determine how much of the source value is revealed and returned to the medical provider. The user interacts with the policy engine through a "query interface," which in turn accesses data in the "Anonos platform" (e.g., DDIDs, JITI keys, and the roles and policies that determine when DDIDs change, as well as DVALs, which provide another level of abstraction over DDIDs) and data in the "information platform" (e.g., primary data in which values have been replaced with DDIDs at the data element level). This potential embodiment illustrates that, even for a trusted and properly authenticated active user, mere possession of a DDID may not be sufficient to unlock any original data elements.
The following description is intended neither to include all possible considerations nor to define a minimum or maximum scope. For example, although the following description uses a conventional tabular database structure, that is merely a single example and a single embodiment of an implementation. JITI may be implemented using NoSQL and/or other methods, including but not limited to emerging technologies such as quantum databases, quantum relational databases, graph databases, triple stores (RDF), or S3DB (as a way to represent data on the semantic web without the rigidity of a relational/XML schema).
Moreover, any such methods and/or databases may be used to support, implement, and/or integrate into the creation, implementation, and/or deployment of privacy clients and/or privacy servers that are themselves used to support JITI implementations or other aspects of the invention described in this or any related patent application. One or both of the privacy client and the privacy server may be integrated with, controlled by, and/or populated with data by a client application, wherein, in some embodiments, the application may (i) run on a stand-alone computing device that is not connected to the internet; (ii) run on a mobile device connected directly or indirectly to the internet, including an Internet of Things device; (iii) run directly as an application, or within an application itself running on, any standard internet browser (e.g., Chrome, Internet Explorer, Microsoft Edge, Firefox, Opera, Safari, the native Android browser, etc.); and/or (iv) utilize semantic web components and services related, in whole or in part, thereto. Likewise, the query and record creation/modification events described below in no way limit embodiments to relational database management system (RDBMS) type designs.
Embodiments of the invention described herein relating to DDIDs and JITI keys may include at least an implementation in which a privacy client (and, at most, both a privacy client and a privacy server, with one or more instances of each) resides on the client side, e.g., as part of an application running in a browser, in a virtual machine, or on any type of physical or logical computing device described herein on which a "privacy client" may run, where such device, or the application running on it, interacts directly or indirectly with such browser. One potential implementation using DDIDs and JITI keys may utilize the semantic web (which extends the web through standards established by the World Wide Web Consortium (W3C), such as the Resource Description Framework, or RDF) for its ability to unify computing environments.
FIG. 1S illustrates one potential JITI-enabled embodiment of a system that uses the Openhealth platform (OH), which natively supports W3C standards and data management resources such as the NoSQL IndexedDB, where one or both of the "privacy client" and/or the "privacy server" may reside on, or be logically positioned behind, the OH platform. Note that, in contrast to example A of FIG. 1S, in example B of FIG. 1S all data and computing operations, including but not limited to privacy client and/or privacy server functions, may be performed by the data provider or domain consumer, such that a dedicated computing infrastructure is no longer required to support JITI enablement or other operations. By implementing OH as a JITI-enabled deployment through the semantic web, OH can manage and coordinate health-related digital assets so as to simultaneously maximize data value and data protection (security and privacy) without being constrained by server-side resources, since, from a resource perspective, the privacy client and privacy server consume at most minimal such resources, thereby enabling and providing greater scalability.
Unlike a conventional DB, raw data is not stored in the main DB of a JITI-enabled system (i.e., only DDID data is stored there). Instead, there may be two databases: the "master DB" (holding DDID data) and the "JITI DB," which contains the keys for decrypting the master DB on a cell-by-cell basis. In this example, each new value in the "master DB" is assigned a unique DDID value, 8 characters in length, where each character is a member of the character classes a-z, A-Z, 0-9. (Such syntax and structural constraints are arbitrary and can be reconfigured to suit any particular deployment or policy objective, including defining a DDID syntax that meets the original syntax requirements of the source data field type while still inserting random values that offer no greater chance of re-identification than guessing.) In general, there are 62 possible values per character (26 lowercase + 26 uppercase + 10 digits). Thus, there are 62^8 (about 2.1834 x 10^14) possible values (a higher-entropy measure is obtained by adding extra characters, and this range increases significantly with each one). This could easily be changed in the future to Base64 (or some other encoding); in this example embodiment, the choice is for aesthetic value only.
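As a minimal sketch only (the alphabet, length, and uniqueness check follow the example above; the function and set names are hypothetical), such DDIDs might be generated as follows:

```python
import secrets
import string

ALPHABET = string.ascii_lowercase + string.ascii_uppercase + string.digits  # 62 chars
DDID_LENGTH = 8  # 62^8, about 2.1834 x 10^14 possible values

_issued = set()  # previously issued DDIDs, for the uniqueness check

def new_ddid() -> str:
    """Return a random, not-previously-issued 8-character DDID."""
    while True:
        candidate = "".join(secrets.choice(ALPHABET) for _ in range(DDID_LENGTH))
        # Random (not sequential) values avoid inference attacks based on
        # knowledge of the original table's ordering.
        if candidate not in _issued:
            _issued.add(candidate)
            return candidate

print(new_ddid())  # e.g., "x3Fq9ZkL"
```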
In one embodiment, a new, unique 8-character DDID may also be assigned to the base value of each DDID in the master DB. For convenience, we will refer to the base value of a DDID as a "DVAL" in order to distinguish it from the DDID itself. For simplicity, a random 8-character DVAL will suffice, as long as a uniqueness check is performed. For future use, random generation may not be sufficient to handle very large data sets (trillions of records). Sequential values (e.g., aaaaaaaa, aaaaaaab) are not used because, if the ordering of the original table is known (e.g., during a database import), sequential unique IDs could be used to launch an inference attack.
In one embodiment, each original value will be encrypted using AES, which produces unique ciphertext even for identical plaintexts due to differing initialization vectors. For example, Table 4 below gives an exemplary set of "raw" values.
TABLE 4
Name | Birthday | Location
John | October 9, 1940 | Prison
Paul | June 18, 1942 | St. John's Wood
George | February 25, 1943 | Rooftop
Ringo | July 7, 1940 | Los Angeles
The DVALs for the values shown in Table 4 might, with random generation, be the values shown in Table 5 below.
TABLE 5
[Table 5 is reproduced only as an image in the source document; it shows a randomly generated 8-character DVAL for each original value in Table 4.]
To re-associate each DVAL with its original value, each DVAL may be written to a DVAL table, as shown in Table 6 below, along with its ciphertext and Initialization Vector (IV) (AES-encrypted using a "for demonstration purposes only" key).
TABLE 6
[Table 6 is reproduced only as an image in the source document; it maps each DVAL to the AES ciphertext and Initialization Vector (IV) of its original value.]
In another embodiment, a one-way hash function may be used to generate the DDIDs that conceal each original value. In yet another embodiment, the DDIDs may be generated using various random processes that are unrelated, and relate in no way, to the DDID, its base value, or any other relevant data (e.g., a worldwide postal-code list divided into 8-character strings and re-randomized every 15 minutes).
Returning to the AES example, the Initialization Vector (IV) may be passed along with the ciphertext, because it is the key, not the IV, that keeps the data secret. One benefit of the IV is that the same plaintext value may have different ciphertexts. For example, if there are 10 records with the same last name or zip code, the DVALs, ciphertexts, and IVs will all be unique, even though the plaintext values of those 10 names or zip codes are identical.
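The following sketch, which assumes the third-party Python `cryptography` package and whose table layout and DVAL strings are illustrative only, shows how identical plaintexts yield distinct ciphertexts when each encryption uses a fresh random IV/nonce:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)  # "for demonstration purposes only"
aes = AESGCM(key)

def encrypt_value(plaintext: str):
    """Encrypt one cell value; return (iv, ciphertext)."""
    iv = os.urandom(12)  # fresh random nonce for each value
    return iv, aes.encrypt(iv, plaintext.encode(), None)

# Two records with the same zip code produce different ciphertexts.
iv1, ct1 = encrypt_value("80302")
iv2, ct2 = encrypt_value("80302")
assert ct1 != ct2  # unique ciphertext despite identical plaintext

# A DVAL table row: (DVAL, ciphertext, IV); the key is held separately.
dval_table = {"k9QpX2rT": (ct1, iv1), "Bc71LmNz": (ct2, iv2)}
```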
To query an anonosized database, the user must possess the appropriate JITI key(s). These are mostly intended to apply policy controls specific to the intended purpose, location, time of use, and other relevant attributes. In addition, a JITI key may enforce expiration-based constraints, yielding, in a preferred embodiment, a triple of constraints: query constraints, display constraints, and time constraints. JITI keys may be stored in a JITI key DB and provide granular access control; they may also determine how the raw data is displayed (e.g., in DDID form, converted by one of the conversion rules, or in raw form).
Method for "anonosizing" data
As described above, the terms "anonosize" and/or "anonosizing" refer to the replacement of data with DDIDs, down to the data element level. More specifically, anonosizing, as used herein, may refer to encoding and decoding data under controlled conditions to support particular uses of such data, for example, in a specified context authorized by a data subject or an authorized third party.
Anonosizing data may allow a data management system to retain the ability to render the data intact, with its original value (e.g., economic, intelligence, or other value) and utility, while limiting the level of identifying information disclosed to that authorized by, for example, data subjects and/or authorized third parties.
In some embodiments, data may be exposed only to the degree required to support each given use. Through control of the anonosized data, for example by "identifying" and "associating" data elements within community and/or personal cohorts, data usage may be limited to those uses permitted by a particular data subject or authorized third party. If a new authorized data use arises, all original data values and utility are retained to support the new use within the scope of the data subject's or authorized third party's authorization, while improper, i.e., unauthorized, use of identifying information may be prevented.
Anonosizing data by dynamically changing DDIDs minimizes the ability to re-identify individuals from apparently unidentifiable data via the "Mosaic Effect." The research of Latanya Sweeney, professor at Harvard University, is often cited as evidence that knowledge of date of birth, gender, and zip code is sufficient to identify up to 87% of Americans. However, in order to combine date of birth, gender, and zip code to achieve an 87% re-identification rate, one must know that these three pieces of information relate to the same person. In a dynamic implementation using DDIDs, by associating a different DDID with each of date of birth, gender, and zip code, it is not possible to determine whether a given date of birth, gender, or zip code belongs to the same person or to some combination of different persons. This lack of knowledge thus defeats re-identification via the Mosaic Effect.
Accordingly, embodiments of anonosizing herein may include: 1.) providing a method for specifying the data fields containing primary and/or secondary "quasi-identifier" data elements, i.e., those that reveal some information about a person but do not explicitly show the person's true identity, which data elements are to be replaced by R-DDIDs and/or A-DDIDs; and 2.) providing a method of establishing de-identification policy rules for replacing primary and secondary "quasi-identifier" data elements, including any specific formatting requirements for the R-DDIDs and/or A-DDIDs, such as field length and character type (e.g., letters, numbers, alphanumeric combinations, etc.), and the dynamic-change requirements of the R-DDIDs and/or A-DDIDs (e.g., the triggers that cause changes, the frequency of changes, etc.). A sketch of such a policy specification follows.
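As an illustration only (the schema, field names, and rule keys below are hypothetical and are not defined by this application), such de-identification policy rules might be captured declaratively:

```python
# Hypothetical declarative policy: which quasi-identifier fields are
# replaced, by which DDID type, in what format, and how often they change.
POLICY = {
    "name":          {"ddid": "R-DDID", "length": 8, "charset": "alnum",
                      "rotate": "per-query"},        # trigger: every query
    "date_of_birth": {"ddid": "A-DDID", "length": 8, "charset": "alnum",
                      "generalize": "decade",        # cohort: birth decade
                      "rotate": "every-15-min"},     # trigger: elapsed time
    "zip_code":      {"ddid": "A-DDID", "length": 5, "charset": "digits",
                      "generalize": "first-3-digits",
                      "rotate": "per-release"},
}

def rule_for(field: str) -> dict:
    """Look up the replacement rule for a quasi-identifier field."""
    return POLICY[field]

print(rule_for("zip_code")["generalize"])  # "first-3-digits"
```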
Data anonosizing policy management and access control
Although some privacy policies (such as those implementing fuzzy logic, non-deterministic, or other similar methods) are intentionally ambiguous as to whether a recipient may view the true underlying data, certain policies described herein enable an explicit "bright line" distinction between permitted and non-permitted views of given data (e.g., a raw heart rate value of 65 beats per minute can be converted to NADEVs concealed using A-DDIDs). Specifically, a NADEV, whether or not obscured by an A-DDID, may include, but is not limited to: (i) synthetic data, i.e., data applicable to a given situation that is not obtained by direct measurement but is stored persistently and used to carry out business processes (as further defined below); (ii) derived values, i.e., data logically extended or modified from the original data; (iii) generalized data, i.e., a generalized version of data produced by inference from or selective extraction of the raw data (e.g., a category or homogeneous group); and (iv) aggregated data, i.e., the result of applying one or more algorithms to multiple data elements in the same record or across records. In one example, a first NADEV may comprise the range 61-70 beats per minute, while a second NADEV may simply comprise the textual description "normal" (each may be individually suppressed or revealed). In addition, the persons or entities authorized to create or use these views (and for what purposes) can also be specified separately. Such policies may also provide for setting time parameters governing when creation or use is authorized or unauthorized, and location parameters, such that where creation or use of such data is authorized may be controlled by place name, GPS coordinates, or other means of identification.
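A minimal sketch (function name and thresholds hypothetical) of deriving several individually releasable NADEVs from one raw value:

```python
def nadevs_for_heart_rate(bpm: int) -> dict:
    """Derive non-identifying, accurate data element values from one reading."""
    low = (bpm - 1) // 10 * 10 + 1                 # e.g., 65 -> 61
    return {
        "range":       f"{low}-{low + 9} BPM",     # generalized value
        "description": "normal" if 60 <= bpm <= 100 else "abnormal",  # derived
    }

views = nadevs_for_heart_rate(65)
print(views["range"])        # "61-70 BPM"  (revealable to one recipient)
print(views["description"])  # "normal"     (revealable to another recipient)
```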
One particular form of generalized data arises with respect to unstructured data. According to Wikipedia, "unstructured data (or unstructured information) refers to information that has no predefined data model or is not organized in a predefined manner. Unstructured information is typically text-intensive, but may also contain dates, numbers, and facts. This results in irregularities and ambiguities that make it more difficult to understand using traditional programs than data stored in fielded form in a database or annotated (semantically tagged) in documents." Unstructured data may also include multimedia data such as pictures, audio, video, and the like. Importantly, data can be anonosized regardless of whether it is structured, unstructured, or any combination thereof. (See https://en.wikipedia.org/wiki/Data_model; https://en.wikipedia.org/wiki/Plain_text; https://en.wikipedia.org/wiki/Ambiguity; https://en.wikipedia.org/wiki/Annotation; https://en.wikipedia.org/wiki/Tag_(metadata).)
In 2016, IBM stated: "Today, 80% of the data comes from previously untapped, unstructured information from the web, such as images, social media channels, news sources, emails, journals, blogs, images, sounds, and videos. Unstructured data, sometimes referred to as 'dark data,' contains important insights needed for faster, more informed decisions. So what about the remaining 20%? That is the structured data that traditionally resides in data warehouses, and it is also important. Without structure, you cannot survive." IBM Chairman, President and CEO Ginni Rometty said: "First, the phenomenon of data. Previously invisible data now becomes visible, especially the more than 80% that is 'unstructured': the natural language in books, literature and social media ... video, audio, images. More and more information from the Internet of Things. Computers can process unstructured data (store, protect, and move it), but traditional programmable computers cannot understand it." Dark data is data obtained through various computer network operations but not used in any way to derive insights or to make decisions. An organization may have the capacity to collect more data than it can analyze; in some cases, the organization may not even be aware that the data is being collected. IBM estimates that roughly 90% of the data produced by sensors and analog-to-digital conversions is never used. In an industrial context, dark data may include information gathered by sensors and telematics. The first use and definition of the term appears to come from the consulting firm Gartner. Organizations retain dark data for many reasons, and it is estimated that most companies analyze only 1% of the data they hold. Such data is often stored for regulatory compliance and record-keeping. Some organizations believe that dark data may be useful to them in the future, once they have acquired better analytics and business-intelligence technology to process the information. Because storage is inexpensive, storing the data is easy. However, storing and securing the data typically entails greater expense (or even risk) than the potential return. Anonosizing may also be applied to such "dark data." (See https://www.ibm.com/blogs/business-analytics/data-is-everywhere/; https://www.ibm.com/ibm/ginni/01_06_2016.html.)
Research firm IDC and storage leader EMC (now owned by Dell) predict that by 2020 data will grow to 40 ZB (zettabytes), a roughly 50-fold increase from the beginning of 2010. Computerworld indicates that unstructured information may account for more than 70-80% of all data in an organization. Thus, in any given organization, any method of preserving data privacy while increasing the value of the data will most likely, if not almost certainly, have to handle unstructured information, among other needs, to be practical.
For example, consider, without limitation, Electronic Medical Records (EMRs). An EMR contains not only specific data such as red blood cell count, blood pressure, ICD disease codes, etc., but also a "comments" field consisting primarily (if not exclusively) of text. By default (i.e., as an automatic opt-in, which may be modified to an opt-out), anonosizing such a comments field results in a de-identifying replacement of the field with an R-DDID. However, the content contained in the comments field may also include significant medical characteristics of the Data Subject, where revealing only a few, or possibly just one, such characteristic could result in the Data Subject being re-identified. For example, "strep throat" is a common condition and therefore unlikely to lead to re-identification, whereas disclosure of "islet cell carcinoma," with few cases per year worldwide (an orphan disease), is so rare that the Data Subject could easily be re-identified from it, whether used alone or in combination with other data.
As noted above, a first attempt at a solution might simply anonosize the comments field, i.e., replace it with an R-DDID that itself displays none of the information in the field, but provides a way to retrieve the entire comments field under controlled conditions (e.g., upon use of an authorized JITI key). The use of A-DDIDs provides an additional approach. A-DDIDs enable cohorts (e.g., those with islet cell carcinoma; those with strep throat; those with both schizophrenia and irritable bowel disorder, and ultimately, perhaps, those of interest to researchers studying the gut microbiome, which is now thought to be associated with mental health) to be identified (whether manually, through the use of machine learning, through the use of artificial intelligence, or through the use of quantum computers) and, once identified, to be represented by such A-DDIDs. In this manner, while an A-DDID may be associated with a range (e.g., systolic pressure between 140 and 160), an A-DDID may also be associated with a particular condition appearing within an annotation field in the EMR. However, generation of such A-DDIDs may default to opt-out, and thus require an override before they are actually generated. Furthermore, any value that may be derived from analysis of any annotation field, including but not limited to Bayesian, Markov, or heuristic analysis, may also be used to define the existence of a cohort; and membership in the cohort may be enabled by an A-DDID assigned to all records of the cohort. Beyond these applications, consider multimedia forms of unstructured data, such as the output of MRI, CT, positron emission tomography, sonography, etc., whether represented as snapshots (as with X-rays) or video (as with positron emission tomography and sonography). The information extractable from such multimedia data is virtually unlimited and can be organized into an unlimited or nearly unlimited number of cohorts. Thus, A-DDIDs may be used to de-identify any cohort derived from the extractable information, presenting the information in a manner that cannot be re-identified to the Data Subject, since the cohort and the data values associated with it can be used independently of the Data Subject's identity. In all of the above cases, those who need to use the extracted information may be authorized (e.g., by a JITI key) to re-identify the relevant A-DDIDs, which may themselves be associated with other A-DDIDs but not with R-DDIDs; or, if they are, R-DDID access would be unnecessary, and thus unauthorized. Since only the R-DDID relates to the Data Subject, a researcher need only learn the available medical information by re-identifying the A-DDIDs, where such A-DDIDs identify not only structured data but also unstructured data (or structured representations of data inferred or derived from unstructured data), thereby improving or maximizing the privacy of the Data Subject while likewise improving or maximizing the value of the data to the researcher.
According to some embodiments, after one or more transformations have been performed on an associated data set to produce a NADEV or set of NADEVs, each member of the resulting data set (or any combination of its members) may be masked or otherwise concealed, to the extent desired by the policy maker, through use of A-DDIDs, in order to meet or exceed the requirements of Privacy Enhancing Technologies (PETs) such as public key encryption, k-anonymity, l-diversity, introduction of "noise," differential privacy, homomorphic encryption, digital rights management, identity management, suppression, and/or generalization. At the same time, the value of the data (e.g., as measured by one or more of numerous factors such as means, joint means, marginal means, variances, correlations, accuracy, precision, etc.) may be maintained at a maximal or optimal level (i.e., compared to the value of the original untransformed data or of further-transformed input data). These techniques are advantageous over existing data concealment methods at least because existing methods are typically: (i) policy-based only (with no technical enforcement method); or (ii) if technically enforced, apt to significantly reduce data value, thereby preventing the desired analyses, associations, discoveries, or breakthroughs from occurring.
As previously mentioned, application of the data anonosizing policies described in the various embodiments disclosed herein provides a way to enforce these policies programmatically against any simple or complex data. Such enforcement properties include, but are not limited to, creating further restrictions on, or exclusions from, the data by using any combination of time-, purpose-, and location-based JITI keys or values (or other types of access-control-based keys or values).
Part of the utility of such data anonosizing policies comes from their ability to transform data in an "atomic" or "cellular" manner, i.e., down to the level of a single unit of data, to whatever extent is possible for a given implementation. An atomic unit of data may be a single datum, or a set of data treated as a single entity for purposes of analysis, association, computation, anonymization, and the like. As discussed below with reference to FIG. 1U, while previous data protection methods were, for example, capable of protecting or encrypting data "row-by-row" or "column-by-column" in a two-dimensional data set, the techniques described herein may protect or encrypt data row-by-row, column-by-column, by vectors in the 3rd, 4th, or even nth dimension, or by any combination thereof. Furthermore, the techniques described herein may be applied in the opposite direction, i.e., down to the level of a single cell in a multivariate data set, or to any collection or arrangement of contiguous, non-contiguous, or discrete cells. These "cellular operation" capabilities, i.e., suppression, generalization, public key encryption, k-anonymity, l-diversity, introduction of "noise," differential privacy, homomorphic encryption, digital rights management, identity management, and other PETs, are made possible by the anonosizing system's ability to tag each datum or any group of data.
The tagging (i.e., anonosizing) of data at the cellular level may also be built into a hierarchy of NADEVs or other values, as well as referencing another datum or group of data. The generated tokens, together with the access control and authorization information itself, may be stored in relational and lookup databases, and these too may be concealed through use of A-DDIDs. Implementation of a given policy may include: (i) protecting data at the elemental or cellular level; (ii) controlling what data is disclosed, when and/or for how long, to whom, and for what purpose; and/or (iii) controlling the disclosed "clarity" of the data, e.g., an authorized party may access the clear values of data at a given authorized time and place, whereas only NADEV representations of the actual data values may be disclosed to another party that does not need access to that level of data specificity. Controlled disclosure of data may involve the use of certain stochastic, parametric, or non-parametric aspects, and also includes the ability to control disclosure by when (i.e., at what time or times), where (i.e., at what physical or virtual place), and why (i.e., for what purpose or purposes).
FIG. 1T illustrates an example of a system for implementing data de-identification policy management and access control according to one embodiment of the invention. First, table 101 represents the original plaintext representation of the source data table. As shown, table 101 stores the unmasked values of each field in the table, i.e., record date, name, bpm, address, city, state, country, and date of birth. Table 102 represents the data table after transformation by replacing data with tokens (i.e., pseudonyms) at the data element level. For example, the beats-per-minute (bpm) value 55 in the second row of table 102 has been replaced with the token value "RD-4a7e8d33", and the date of birth 1944-10-28 in the second row of table 102 has been replaced with the token value "RD-4f0b03c0". Table 103 represents an exemplary data disclosure of the second row of the original source data table 101, where selected data from the table (e.g., the bpm field and the date-of-birth field) has been revealed down to the data element level, while the remaining data remains anonosized/pseudonymized. Table 104 represents an example of NADEVs, i.e., digitally masked, partially masked, generalized, filtered, and/or transformed versions of the underlying data that have been inserted into the data table based on one or more policies. For example, as shown in table 104, two NADEVs corresponding to the bpm value 55 from the second row of the original data table 101 have been inserted, and three NADEVs corresponding to the date-of-birth value 1944-10-28 from the original data table 101 have been inserted. Finally, table 105 represents examples of the underlying NADEV values that are masked, partially masked, generalized, filtered, and/or transformed in the table shown at 104. As described above, only the necessary level of identifying data may be exposed to a given recipient based on one or more applicable policies. For example, one authorized recipient may receive the value "55" for bpm, while another recipient may receive the "51-60" NADEV, and another may receive the "low" NADEV. Similarly, one authorized recipient may receive the date-of-birth value "1944-10-28", while other recipients may receive the "1944-10" NADEV, the "1944" NADEV, or the "1901-1950" NADEV. As can now be more fully appreciated, each NADEV is accurate with respect to the underlying data, although, individually or together, the NADEVs may reveal the true underlying value only to a greater or lesser degree of granularity, all depending on the implementation and design of the relevant policy.
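A minimal sketch of this element-level tokenization follows; the token format is modeled loosely on FIG. 1T, while the row contents, field names, and selective-disclosure policy are hypothetical placeholders:

```python
import secrets

lookup = {}  # token -> original value (held separately from the data table)

def tokenize(value: str) -> str:
    """Replace one data element with an R-DDID-style token."""
    token = "RD-" + secrets.token_hex(4)   # e.g., "RD-4a7e8d33"
    lookup[token] = value
    return token

row = {"name": "Tom Jones", "bpm": "55", "dob": "1944-10-28"}
protected = {field: tokenize(v) for field, v in row.items()}
print(protected)  # every cell replaced at the data element level

# Selective disclosure: reveal only the fields a policy permits.
revealed = {f: (lookup[t] if f in {"bpm", "dob"} else t)
            for f, t in protected.items()}
print(revealed)   # bpm and dob in the clear; name still pseudonymous
```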
FIG. 1U illustrates examples of various data de-identification schemes according to embodiments of the invention. For example, a conventional method for protecting data (e.g., encryption) is shown in scheme 106. Scheme 106 represents a "binary" protection scheme; in other words, such a scheme either reveals every single data element (i.e., the white squares) or reveals no data elements at all (i.e., the darkened squares). As shown in scheme 107, newer methods of protecting data may enable data to be revealed or obscured on a "2-dimensional" basis; in other words, disclosure of data may be performed by row or by column. Finally, scheme 108 reflects the multi-dimensional or "n-dimensional" protection scheme described herein, wherein data can be revealed (or concealed) on a 2-dimensional, 3-dimensional, or n-dimensional basis, down to the single-cell level (including any combination of cells).
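For illustration only, the sketch below contrasts scheme-107-style column masking with scheme-108-style masking of an arbitrary set of cell coordinates (the n-dimensional case reduces to masking by coordinate tuple); the data and function names are hypothetical:

```python
# A small 2-D data set; cells are addressed by (row, column) tuples.
data = {(r, c): f"v{r}{c}" for r in range(3) for c in range(3)}

def mask_column(col: int) -> set:
    """Scheme 107: 2-dimensional masking, a whole column at a time."""
    return {cell for cell in data if cell[1] == col}

def mask_cells(cells: set) -> set:
    """Scheme 108: cellular masking of any (possibly non-contiguous) cells."""
    return {cell for cell in cells if cell in data}

hidden = mask_column(1) | mask_cells({(0, 0), (2, 2)})
view = {cell: ("****" if cell in hidden else v) for cell, v in data.items()}
print(view)  # column 1 plus two discrete cells concealed; all else revealed
```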
Virtual marketplace for data anonosizing policies
FIG. 1V illustrates an example of a marketplace for the various data de-identification policies available for purchase, shown at table 110, according to one embodiment of the invention. The electronic marketplace described herein may sell, or otherwise have internal or third-party privacy policy providers offer, many different policies for consumption. Policies may be ranked using non-parametric metrics (i.e., rank ordering) and/or parametric metrics for a given policy's analytic and performance attributes against quantitative or qualitative measures (e.g., the "accuracy rating" or "privacy rating" shown in table 110), as well as "user ratings" for a particular policy. Further, ranking and analysis may be applied according to the types of privacy or data-value challenges a policy addresses (i.e., the "subject area" in table 110), e.g., HIPAA, GLBA, or FERPA (in the United States), or the European Union (EU) General Data Protection Regulation (GDPR). No known marketplace provides an objective measure of the quality and relevance of a particular privacy policy, i.e., based on its contextual use and applicable laws and regulations, where the policy is to be technically enforced on the underlying data.
Application of artificial intelligence in data anonosizing
As described above, certain embodiments of the present invention may use techniques similar to the Digital Rights Management (DRM) techniques companies use to limit who may make copies of music, movies, and other digital content and, by anonosizing the data, transfer rights from the data business owner to the Data Subject, in addition to authorizing use of the Data Subject's personal data by the Data Subject or by entities the Data Subject trusts. This data protection scheme is also referred to herein as "Privacy Rights Management" (PRM) or "BigPrivacy." Even without the direct involvement of the Data Subject, PRM techniques manage risk to ensure that data is used accountably and with respect for the Data Subject's rights.
PRM or BigPrivacy may be used to replace static, apparently anonymous identifiers with DDIDs. As described above, these dynamic identifiers encapsulate the data and provide control over re-identification throughout the data lifecycle (down to the data element level). Therefore, the same data has different meanings for different persons based on the policy controls imposed by the technology. BigPrivacy techniques may segment sensitive or identifying data and de-reference the segments, e.g., using DDID pointers to obfuscate the identity of segmented data elements and the relationships between and among data elements.
PRM or BigPrivacy techniques may also impose a common data architecture on data collected from different applications and/or platforms, enabling functional interoperability between heterogeneous data sets to support data fusion, big data analytics, machine learning, and Artificial Intelligence (AI). The anonosized data may then be decoded under controlled conditions to support certain uses in a given context, i.e., as authorized by the data subject or an authorized third party (i.e., a "trusted party").
The various so-called "intelligent policy compliance" systems and methods described herein may comprise artificial intelligence algorithms that can analyze the data architecture, metadata, structure, and, optionally, sample records of a data set to determine the algorithmic operations that can be used to obfuscate, generalize, or otherwise transform the data set in order to comply with a predetermined policy using R-DDIDs and/or A-DDIDs, as described above.
According to some embodiments, intelligent policy compliance systems and methods may classify data by analyzing the data's metadata. For example, field names such as "patient_id" or "descriptor_id" may indicate a healthcare-related data set. Advanced classification techniques, including techniques involving remote data lookups, statistical methods, and other algorithms, may be used to improve classification accuracy. Sample records of the data set (if available) may further improve classification accuracy. According to some embodiments, the categories generated by intelligent policy compliance systems and methods may correspond to industry verticals (such as healthcare) or to specific products and services (such as mobile phone call records). Neural network algorithms can also be used to generate conceptual models across different fields and industry verticals, enabling cross-industry and cross-vertical classification. For example, although a jet engine in an aircraft is quite different from a water turbine, both direct the flow of a liquid or gas; thus, a conceptual model may be generated that can guide a flow-measurement strategy.
According to some embodiments, the intelligent policy compliance system and method may analyze data provided to it using operations previously configured for certain classes of data, for example using R-DDIDs and/or A-DDIDs, as described above. This analysis may be used to generate a set of operations that can be applied to the data set to modify it in a particular manner, for example through use of R-DDIDs and/or A-DDIDs, as described above. For example, a set of operations intended to comply with a particular privacy-related policy might completely mask a person's name with an R-DDID while generalizing the person's phone number to its area code only via an A-DDID. The intelligent policy compliance system and method may analyze many combinations of operations to generate one or more combinations suited to a data set. The combinations may comprise a single "best" combination, multiple user-selectable combinations, or any other set of combinations.
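A toy sketch of this classify-then-assign-operations flow follows; the patterns, categories, and operation names are hypothetical placeholders, not the classifier actually described above:

```python
import re

# Hypothetical field-name patterns -> data category.
PATTERNS = {
    r".*patient.*|.*diagnosis.*": "healthcare",
    r".*phone.*|.*msisdn.*":      "telecom",
}

# Hypothetical category -> per-field anonosizing operations.
OPERATIONS = {
    "healthcare": {"name": "mask-with-R-DDID", "phone": "area-code-A-DDID"},
    "telecom":    {"name": "mask-with-R-DDID", "phone": "area-code-A-DDID"},
}

def classify(field_names: list) -> str:
    """Guess a data-set category from its field-name metadata."""
    for name in field_names:
        for pattern, category in PATTERNS.items():
            if re.fullmatch(pattern, name):
                return category
    return "unknown"

fields = ["patient_id", "name", "phone"]
category = classify(fields)
print(category, OPERATIONS.get(category, {}))  # "healthcare" + suggested ops
```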
Through a user interface, a user may modify the operations generated by the intelligent policy compliance system and method, or apply them to data as appropriate. When the user makes such a decision, it may be stored as part of a feedback loop, effectively using machine learning to let the intelligent policy compliance system and method learn from successes and errors.
FIG. 1W-1 illustrates an example of an intelligent policy compliance engine, according to one embodiment of the invention. As shown, a user may interact with the intelligent policy compliance engine through a user interface. The policy compliance engine may include software running one or more classification services and one or more analytics services. As described above, the classification service may use AI-related techniques (including machine learning) to determine which categories of data are stored in the data set of interest. Likewise, the analytics service may analyze the determined categories and recommend one or more privacy policies appropriate for the type of data being managed. Over time, the data store can be used to store and update potential data categories and related policies, as the intelligent policy compliance system uses machine learning or other methods to "learn" which data privacy and interaction policies are most effective (or most preferred, e.g., by users) for a given type of data set.
FIG. 1W-2 illustrates an example flow diagram 130 for using an intelligent policy compliance engine in accordance with an embodiment of the present invention. Beginning at the left side of flow diagram 130, a user may provide a data set (including any relevant metadata) to the classification service of the data privacy system through the user interface. The classification service may request from the data store information regarding commonly used field names, data types, and their associations that is relevant to the particular class of data the user stores. With the benefit of this stored historical information, the classification service may apply AI techniques to classify the incoming data set provided by the user. The determined data categories may then be provided to the analytics service of the data privacy system. Likewise, the analytics service may request from the data store information about data anonymization operations that have been applied to similar data sets in the past. Based on analysis of the returned information, the analytics service may make various policy decisions and assign various operations to the data set to enforce data anonymization. The user may then review and, if necessary, modify the assigned operations and policies through the user interface before the anonymization policies take effect on the data set. Any modifications are then stored in the data store so that the policies can be updated, and the final set of policy operations can be returned to the user for approval and use on the data set as needed.
Application of synthetic data in data anonymization and application of data anonymization in synthetic data
As mentioned above, according to Wikipedia, synthetic data is "any production data applicable to a given situation that are not obtained by direct measurement," per the McGraw-Hill Dictionary of Scientific and Technical Terms; Mullins defines production data as "information that is persistently stored and used by professionals to conduct business processes." In other words, synthetic data is created using various modeling, statistical, Bayesian, Markov, and other methods, but it does not represent any actually measured real data. Rather, synthetic data is a model of real data. Note that real data ultimately refers to actual data subjects, and de-identified real data, if re-identified, will reveal the identities of those data subjects and any quasi-identifiers associated with them. In contrast, synthetic data, whether in plaintext or re-identified form, does not refer to real-world data but to a model of it. Thus, while synthetic data may retain certain abstract statistical properties of the real data, synthetic data can never be de-referenced to produce the real data, unless the application that generated the synthetic data remains connected to, or able to continue accessing, the real data, in which case any authorized (or possibly unauthorized) user of that application may access the real data.
Proposed "privacy policies" may include, but are not limited to, the use of synthetic data. This is because synthetic data does not refer to the actual data subjects in the real data, and data not linked to actual data subjects should, in principle, protect those data subjects' privacy. However, as explained elsewhere herein, this is not necessarily true in practice.
Thus, a privacy policy may: (i) specify the use of synthetic data by itself; (ii) specify the anonosizing of the synthetic data, because, in principle, synthetic data can be reverse-engineered to recover the model of the real-world data, which can then be used, where there is a high degree of correlation between an associated actual real-world data set and the model, to identify data subjects, i.e., a Mosaic Effect applied to the synthetic data and its model; anonosizing the synthetic data prevents all users except authorized parties from using it, thereby reducing the ability of intruders and bad actors to exploit this potential vulnerability; (iii) recognize that the synthetic data generator must have access to the underlying real data for a limited time in order to model the synthetic data, but that once the synthetic data has been generated, the need for such access to the underlying real data no longer exists and can thus be terminated through use of JITI keys, thereby limiting access by time, location, and/or purpose; (iv) combine (ii) and (iii) above, so that not only is the synthetic data anonosized, but the synthetic data generation application cannot access the actual data and its associated data subjects after the synthetic data has been generated, and/or depending on the location or reason (i.e., purpose) for which the data was generated; and/or (v) support any of the above for data sets in which some data is real and some is synthetic.
In one embodiment, BigPrivacy may support a privacy policy that specifies the use of some, mostly, or only synthetic data.
In another embodiment, BigPrivacy may support anonosizing some, most, or all of the synthetic data, such that even access to the synthetic data is available only to authorized parties, for limited times, at limited locations, and/or for limited purposes.
In another embodiment, BigPrivacy may support restricting access to the real data and related data subjects to only those times, places, and purposes necessary for, or relevant to, generating the synthetic data, whether the synthetic data ultimately makes up part, most, or all of the data set to be used.
In another embodiment, BigPrivacy may support synthetic data sets composed partially, mostly, or entirely of any combination of the above.
As described herein, BigPrivacy techniques can be used to facilitate compliance with regulatory and contractual restrictions and to help unlock the full value of data, for example by allowing more data uses while enhancing data security and privacy.
One exemplary implementation of BigPrivacy may be used to help organizations comply with new data protection regimes, e.g., by way of illustration and not limitation, the GDPR, which introduced new protections for EU data subjects beginning in the spring of 2018, with significant fines and penalties for non-compliant data controllers and processors. The GDPR applies to all companies processing the personal data of one or more EU citizens, regardless of where the company is located or operates, and provides for fines of up to 4% of total global revenue, collective litigation, direct liability for data controllers and processors, data breach notification obligations, and more.
Under the GDPR, a company cannot rely on previous legal bases and/or methods for data analytics, artificial intelligence, or machine learning. Although consent remains a legal basis under the GDPR, its definition is very restrictive. Consent must now be "freely given, specific, informed and unambiguous," indicating that the data subject agrees to the processing of personal data relating to him or her. Where there is ambiguity and uncertainty in the data processing, as is typically the case with data analytics, artificial intelligence, or machine learning (e.g., big data analytics), these requirements for GDPR-compliant consent cannot be satisfied. The GDPR's heightened consent requirements shift risk from individual data subjects to data controllers and processors. Before the GDPR, the consequences of failing to fully understand the risks associated with broad consent were borne by individual data subjects. Under the GDPR, broad consent no longer provides an adequate legal basis for big data. Therefore, data controllers and processors managing EU data subject information must now satisfy an alternative legal basis for big data processing. Companies can establish an alternative legal basis for big data processing by satisfying the GDPR's requirements for "legitimate interest," which requires meeting two new technical requirements: "pseudonymisation" and "data protection by default," discussed in more detail below.
Article 4(5) of the GDPR defines "pseudonymisation" as requiring separation of the informational value of data from the means of attributing the data to an individual. The GDPR requires technical and organizational separation between the data and the means by which the data can be connected (or attributed) to an individual. Traditional methods, such as persistent identifiers and data masking, do not satisfy this requirement, because correlation among data elements remains possible without access to the separately protected means of linking the data to individuals. The ability to re-link data to individuals is also referred to as the "correlation effect," "re-identification via linkage attack," or the "Mosaic Effect," because any party with access to the data can link it to a specific individual.
Article 25 of the GDPR also establishes the new requirement of "data protection by default," requiring that data be protected by default and that affirmative steps be taken to use the data (unlike the pre-GDPR default, under which data was available by default and steps were required to protect it); it further requires that any given user be exposed only to the data required at any given time, supporting authorized uses only as needed, with the data re-protected thereafter.
BigPrivacy may support pseudonymisation by separating the informational value of data from the ability to re-attribute the data to an individual, and may also satisfy the GDPR's data-protection-by-default requirements, i.e., disclosing data only at specified times to given users and then re-protecting it. BigPrivacy can be used to meet these requirements by replacing "restricted data elements" (e.g., "personal data" under the GDPR, "protected health information" under HIPAA, contractually restricted elements, etc.) with dynamically changing pseudonymous tokens that are associated with the original data values in a lookup table (these dynamically changing pseudonyms are referred to herein as R-DDIDs, i.e., replacement DDIDs used for de-identification, in which the de-identifier replaces the data element). Using R-DDIDs, the data set can be de-identified using tokens that do not enable correlation, or "linkage attacks," back to an identified state without access to the key. Furthermore, BigPrivacy may provide access to more accurate data, because alternative techniques tend to apply PETs broadly, i.e., without knowing which data will be used for what purpose, which can reduce the value of the data.
As described above, the first step in BigPrivacy may involve replacing common occurrences of the same data element with different pseudonymous tokens using R-DDIDs. The second step may involve inserting NADEVs that can reflect or contain the "group," "range," or "class" to which a data element belongs, without providing a means of linking the data back to the individual (i.e., without providing an identifying element). An example of a NADEV would be replacing the numerical representation of a person's age with an age range. In such an example, data subjects of any age within a particular age range would be assigned the same numerical representation (i.e., NADEV), reflecting that they belong to the "class" for that range. For non-common NADEVs, A-DDIDs may also be used to insert alternate data schemas (related or derived data values) into the protected data fields. A common A-DDID protecting or obscuring a NADEV value may be assigned to all identical data values (i.e., NADEVs) in the same cohort or class, because those NADEVs need not be transformed. In this manner, cohort tagging is accomplished, where (i) the value of the cohort, i.e., the NADEV itself, becomes the primary identifier of the data, the NADEV essentially acting as an A-DDID when no additional level of protection or obfuscation of the NADEV is necessary, relevant, or elected; or (ii) if additional data protection is required, the A-DDID obscuring the NADEV becomes the primary identifier of the data. Under current solutions, such anonymization is not possible, because the identity of the individual is the primary identifier of the data. A sketch of such cohort tagging appears below.
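As a minimal sketch only (the range boundaries and token format are hypothetical), age-range cohort tagging with a shared A-DDID per cohort might look like:

```python
import secrets

cohort_addid = {}  # NADEV (age range) -> shared A-DDID for the cohort

def age_nadev(age: int) -> str:
    """Generalize an exact age to its age-range NADEV."""
    low = age // 10 * 10
    return f"{low}-{low + 9}"

def addid_for(nadev: str) -> str:
    """All records in the same cohort share one A-DDID."""
    if nadev not in cohort_addid:
        cohort_addid[nadev] = "AD-" + secrets.token_hex(4)
    return cohort_addid[nadev]

ages = [31, 38, 52]
tags = [addid_for(age_nadev(a)) for a in ages]
print(tags)  # ages 31 and 38 carry the same A-DDID; 52 carries another
```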
FIG. 1X-1 shows a general method of providing an application of BigPrivacy (140). Each time the privacy system is accessed, incoming data may be sent to the system through a "shim" application (e.g., a small library that transparently intercepts API calls and changes the parameters passed, handles the operation itself, or redirects the operation elsewhere). A shim may also be used to run programs on software platforms other than the one for which they were originally developed. Since BigPrivacy implementations may utilize a random lookup table, in which the correlation between R-DDIDs and/or A-DDIDs and the underlying data values is not mathematically derived but randomly assigned, third parties cannot re-identify the underlying data without access to the correct key.
As shown in FIG. 1X-2, de-identification can also be accomplished "online" through use of a system (150) that implements de-identification and/or re-identification policies at the point of data entry or ingestion, using data communicated over a network with browsers, devices, and sensors.
FIG. 1Y-1 illustrates a cloud-based platform and application (160) for providing BigPrivacy services to de-identify data. A user, automated process, internet-connected device, or other entity ("user") may send "raw" data (i.e., data as it existed prior to de-identification), together with metadata that may specify attributes of the data, to the BigPrivacy cloud platform processor (step 1). Data may be specified as a single data element, a record, an entire data set, or any combination thereof. The system can determine how to process the data by analyzing the provided metadata and looking up de-identification policies through a separate interface (step 2). The policies behind the de-identification policy interface may be stored by an "intelligent policy compliance" engine in a relational database, as files in a server file system, or by other means (step 3). After determining the policy to apply to the user-provided data, the system may de-identify the data according to that policy. If the user has configured the system to store the de-identified data in a data store, message bus, map-reduce system, or other destination, the system may send the de-identified data to that destination (step 4). If the user has configured the system to preserve the mappings between the "original" data elements and their de-identified values (the R-DDIDs and, non-exclusively, the NADEVs, which may themselves act as A-DDIDs or be concealed by associated A-DDIDs), the system may establish a persistent mapping in the data store for future use (step 5). An identifier may be returned to the user so that the user can refer in the future to the de-identified data set or to the mapping between the R-DDIDs and any NADEVs and A-DDIDs (step 6).
The persistent mapping described above in step 5 of FIG. 1Y-1 may be used at some point in the future, by an automated key-generation service or other means, to create re-identification keys (e.g., JITI keys) that can restore some or all of the persisted R-DDIDs and NADEVs or A-DDIDs (or both) in the de-identified data set generated by the system.
FIG. 1Y-2 illustrates a cloud-based platform and application (170) providing BigPrivacy to re-identify data that has been de-identified, e.g., in the BigPrivacy de-identification phase described above with reference to FIG. 1Y-1. A user, automated process, networked device, or other entity (a "user") may require re-identification of one or more data elements. The user provides a reference to the data to be re-identified, e.g., by referencing the unique identifier returned to the user during the de-identification phase, by explicitly specifying the data to be re-identified, or by other means. The user also provides a reference to the JITI key containing the mapping between the specified de-identified data and its re-identified counterpart, e.g., by specifying a unique identifier returned to the user during the de-identification phase, or the like (step 1). To ensure that only appropriate entities gain access to re-identified data, the system may use a JITI key management service (step 2) to authenticate and authorize the user (step 3) before processing the request. As described above with reference to FIG. 1Y-1, the system may also establish a persistent mapping in the data store for future use (step 4). The system then accesses the user-specified de-identified data and the JITI key, reverses the de-identification mapping based on the data contained in the JITI key, and finally may return the requested re-identified data to the user or to another authorized destination configured by the user (step 5).
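A round-trip sketch of this de-identify/re-identify flow follows, with in-memory stand-ins for the data store and JITI key service; all names and structures are hypothetical:

```python
import secrets

data_store = {}  # mapping-id -> {token: original value}

def deidentify(values: list):
    """De-identification phase: tokenize values, persist the mapping."""
    mapping = {"RD-" + secrets.token_hex(4): v for v in values}
    mapping_id = secrets.token_hex(8)   # identifier returned to the user
    data_store[mapping_id] = mapping
    return mapping_id, list(mapping)

def reidentify(mapping_id: str, tokens: list, authorized: bool) -> list:
    """Re-identification phase: reverse the mapping only for authorized users."""
    if not authorized:                  # the JITI key service says no
        return tokens                   # caller sees only the tokens
    jiti_key = data_store[mapping_id]   # the token -> value mapping
    return [jiti_key[t] for t in tokens]

mid, toks = deidentify(["55 BPM", "1944-10-28"])
print(reidentify(mid, toks, authorized=True))   # original values restored
print(reidentify(mid, toks, authorized=False))  # tokens only
```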
Multiple users may be allowed to re-identify different R-DDIDs and A-DDIDs based on their access rights to the underlying data elements. Access-rights verification may be performed by possession (i.e., if a user holds a JITI key, the key reveals all the data it covers), by access requests to an authentication and authorization service (e.g., LDAP), by geographic, temporal, or other parameters, or by any combination of these and/or other methods.
BigPrivacy may generate NADEVs (possibly also obscured by A-DDIDs) during the de-identification phase, thereby pre-computing the derived, correlated, and/or composite data needed from a re-identified data set before that data is required for analytics or other applications. For systems that would otherwise have to perform these computations during the re-identification phase, this represents an improvement in re-identification speed, server power consumption, multi-tenancy capability, and other factors.
FIG. 1Y-3 illustrates a BigPrivacy application integrated with an Extract, Transform and Load (ETL) application (180). A user may use the ETL application to harmonize, transform, or otherwise manipulate data, and use a BigPrivacy plug-in (also called an "add-on"; a software component that can be added or removed in a modular fashion) to perform de-identification tasks (step 1). Using the ETL application, the user can store the de-identified data and/or re-identification key data on their local computer, in a corporate data center, on the BigPrivacy platform, and/or in other authorized locations (step 2). The system receives the user-provided de-identified data and/or re-identification key data and stores it (step 3). In the future, another user with access rights to a re-identified version of the de-identified data may interact with BigPrivacy and request re-identification of one or more data elements originally de-identified from the ETL application (step 4).
As described above, the process of anonymizing data can reduce data-breach notification obligations and liabilities in each jurisdiction, for example: (i) in the European Union, under GDPR Articles 33 and 34; (ii) in the United States, under legislation requiring private or governmental entities to notify individuals of security breaches involving personally identifiable information, including (a) federal regulations such as the HIPAA Breach Notification Rule, 45 C.F.R. §§ 164.400-414, and (b) the rules of forty-seven states, the District of Columbia, Guam, Puerto Rico, and the Virgin Islands; and/or (iii) other similar notification obligations specified by other regulatory schemes. In other words, if an anonymized data table is compromised, the data custodian need not inform data subjects of the breach, since the data remains protected from a re-identification perspective. Furthermore, access to the master table using stolen keys may be made more difficult by established key management functionality; for example, one or more "heartbeat" authorization certificates, multi-key requirements, GPS requirements, etc., may be used to manage the keys of a given system. Moreover, the informational value accessible via any one of these combinations of controls may be individually limited.
All types of BigPrivacy can support anonymization via NADEVs, thereby supporting full-performance analysis and processing using those NADEVs without the need to translate or reference the data through identification/re-identification policy engines, API calls, or "shims." In particular, A-DDIDs can be processed directly, and only when the desired result is achieved at the de-identified level of abstraction is a "call" issued to retrieve the NADEVs. In such a case, NADEVs may be retrieved for only the few users (e.g., 50 users in a data table) whose A-DDIDs show a cohort or class associated with the query, while the A-DDIDs of most users (e.g., the remaining 500,000 users in the data table) do not match the cohort or class associated with the query, so their data need not be retrieved. Notwithstanding the above, BigPrivacy does not require that NADEVs be masked by A-DDIDs; it simply provides a method and apparatus for doing so. If a NADEV is not covered by an A-DDID, the NADEV effectively acts as an A-DDID.
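A minimal sketch of this filtering pattern, under stated assumptions, follows; the labels "A7" and "B3" are hypothetical A-DDIDs standing for opaque cohort designations, and nadev_store is a hypothetical stand-in for wherever NADEVs live:

```python
# A-DDIDs are processed directly; the "call" for NADEVs is issued only
# for the few rows whose A-DDID matches the queried cohort or class.
rows = [{"user": i, "a_ddid": ("A7" if i < 50 else "B3")}
        for i in range(500_050)]
nadev_store = {i: f"nadev-{i}" for i in range(50)}   # hypothetical store

matches = [r["user"] for r in rows if r["a_ddid"] == "A7"]
nadevs = {u: nadev_store[u] for u in matches}
print(len(nadevs))   # 50 retrievals instead of 500,050
```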
BigPrivacy may also support different levels of abstraction in which, beyond just primary and secondary levels or tables, other levels or tables (including but not limited to NADEVs) may represent R-DDIDs and/or A-DDIDs, associating the data with fictional persons, companies, and/or attributes rather than the "true" persons, companies, and/or attributes disclosed by reference to the master table. This may prevent disclosure of the identity of the "true" person, company, and/or attribute to which the NADEVs, R-DDIDs, and/or A-DDIDs relate, while still indicating that the NADEVs, R-DDIDs, and/or A-DDIDs are associated with a common (but unidentified) person, company, and/or attribute. Since different types of data controllers may require different levels of identifying data, they may be given access to different tables, levels, and/or JITI keys only when their particular authorization requirements are met, without revealing identifying information above their authorization level.
BigPrivacy also enables a data processor to implement an individual data subject's "right to be forgotten" (e.g., as required by GDPR Article 17), for example by "deleting" the key required to create the link between the data and the individual. The de-identification policy engine need not delete the data itself; instead, only the link between the data and the true identity of the data subject needs to be deleted from the look-up table or database.
Applications of anonymization in quantum computers, quantum computing, quantum cryptography, and quantum privacy
We distinguish between Classical Computers (CCs) and Quantum Computers (QCs) as follows. CCs, as used herein, refers to binary machines in which the smallest representable unit of information is a binary digit (or "bit"), i.e., 0 or 1. QCs, as used herein, refers to quantum machines in which the smallest representable unit of information is a quantum bit (or "qubit"). A qubit can be 0, or 1, or both at once. Qubits are typically realized as atoms or photons, although in principle they may be any sufficiently small particle, i.e., any particle to which the principles of quantum mechanics apply. This quantum mechanical property is called superposition. In addition, the qubits of a QC can be entangled, meaning that when one qubit is changed, it also affects the other qubits. (Conversely, in CCs, bits are independent: a change in one bit does not necessarily mean that other bits will also change.)
Due to these two properties (superposition and entanglement), a QC can perform a large number of computations simultaneously, in parallel (CCs perform large numbers of computations serially, or require additional processors to achieve parallelism, rather than simply adding bits). For example, there are computational methods, typically solutions to otherwise intractable problems, which could in principle be completed by a QC in seconds, if not less, while the same solutions could cost a CC a span of time nearly equal to or exceeding the age of the universe (and are thus intractable).
Current encryption methods typically include public key encryption and elliptic curve encryption, where the former can only be broken by determining the prime factors of a very large composite number (i.e., p1 and p2). Even the fastest and most powerful computers on Earth cannot break public key encryption involving large numbers of bits (such as 512-bit, 1024-bit, and 2048-bit ciphers). By contrast, a QC could break such encryption in seconds by evaluating all possible solutions simultaneously and settling on the one solution that breaks the encryption.
Apart from BigPrivacy, de-identification usually involves so-called one-way hash functions, because in principle the initial value cannot be determined in the "reverse direction," i.e., from the hash back to the re-identified original string. Yet while QCs can quickly determine the original string from its one-way hash, CCs must perform a brute-force operation (absent a known exploit of the underlying hash algorithm) to decode the hash, which may take days, months, years, or significantly longer to complete. The same deficiencies typically exist for other forms of de-identification, including the other Privacy Enhancing Techniques (PETs) discussed elsewhere herein.
One fundamental problem with all encryption methods is that they encode the original information in some way. At least theoretically, for QCs, the data subject and all of its quasi-identifiers are recoverable at some depth, i.e., re-identifiable from the encoded form, even using methods that are not easily broken by QCs. BigPrivacy, by contrast, does not rely on encoding, but rather on dynamic replacement of the original data with unrelated strings, whether generating R-DDIDs or A-DDIDs. If a string is fundamentally random (QCs are themselves well suited to producing such randomness, although other methods are available), then there is no way to re-identify any type of DDID, since a DDID is an arbitrary string, not an encoding of the original data. Furthermore, since the same data is represented by different DDIDs, there is not even any relationship between the DDIDs. In other words, from an information-theoretic point of view, DDIDs contain no a priori information about the data subject or any original data. Thus, QCs will not be able to determine the original content from DDIDs containing zero information about that content. For this reason, DDIDs provide a technique to protect individual privacy and prevent re-identification of de-identified data, even in a quantum computing world.
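The contrast with encoding can be shown in a few lines. In this hedged sketch, new_ddid is a hypothetical generator: each DDID is an independent random string, so nothing about the DDID can be inverted back to the value it replaces, and two DDIDs for the same value bear no relationship to each other:

```python
import secrets

def new_ddid():
    # Each DDID is an independent random string: it encodes nothing about
    # the value it stands in for, so there is nothing for a QC to invert.
    return secrets.token_urlsafe(9)

# The same original value receives unrelated DDIDs on each occurrence;
# only a protected lookup table (not derivable from the DDIDs) links them.
lookup = {new_ddid(): "John Smith", new_ddid(): "John Smith"}
print(list(lookup))   # two arbitrary strings with no mutual relationship
```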
Accordingly, BigPrivacy further addresses the goal of privacy maximization together with the problem of data value maximization. By contrast, other PETs increase privacy at the expense of reducing or eliminating data value, or, conversely, increase or maximize data value at the expense of reducing or eliminating privacy. Thus, even though QCs can maintain or enhance privacy, they can still reduce or eliminate data value, because massive parallelism and speed, even applied simultaneously, do not of themselves increase or enable data value. Only BigPrivacy (together with QCs) can maximize both data privacy and data value, even if QCs become the standard for all computation.
BigPrivacy can also be applied to encrypted forms of data, even QC-encrypted forms. In other words, BigPrivacy is computationally independent of the underlying nature of the computer, in that it can replace any link between the original data (which may or may not be encrypted) and the data de-identified by BigPrivacy.
BigPrivacy can also exploit fundamental quantum mechanical properties. For example, QCs are themselves ideal for generating truly random numbers. Ordinarily, using a true random number as an input to a computable function defeats the purpose of de-identification, since a correlation remains between the original data and its randomization. In BigPrivacy, however, as described above, true random numbers are used only as DDIDs: the random numbers are independent and have no correlation (or relationship) with the underlying data. In this way, BigPrivacy may actually take advantage of the properties of QCs, ensuring in one embodiment that there is zero correlation (or near-zero correlation) between a DDID (whether an R-DDID or an A-DDID) and the underlying data.
Implementing centralized BigPrivacy control in decentralized systems
The BigPrivacy techniques described above also allow a controlling entity to establish, enforce, validate, and modify centralized privacy and security controls over decentralized networks or platforms, including networks or platforms linked in a peer-to-peer or other decentralized manner, as well as permissionless systems and distributed ledger technologies.
One embodiment of the invention is applicable to decentralized networks based on blockchain technology. Blockchains are the underlying technology behind many popular cryptocurrency platforms today. The best-known use of blockchains is to support cryptocurrencies and cryptocurrency transactions, but they also have a wide range of other applications, such as storing medical data, supply chain management, financial transaction management and verification, supporting and implementing so-called "smart contracts," and social networking.
The term "Blockchain" has no single definition, but generally has two uses (i) to refer to a particular method or process for digitizing, distributing ledgers, verifiable, unique, and theoretically noncorrosive transaction records in a decentralized point-to-point computer network; and (ii) describe the underlying data structure (i.e., blocks) used to represent the transaction itself, i.e., a chain of data blocks, where each data block is linked (or "linked") to a previous data block according to a particular algorithm/programming methodology. If the term "blockchain" is used in a different sense, it will be explained in detail in the context of its use. Transactions for any client or node participating in the blockchain network are recorded on the network in the form of "chunks" of data, which are time stamps and linked to a previous chunk in the blockchain, whichever client or node initiated the transaction. Linking each block to the previous block may confirm the integrity of the transaction chain, all the way back to the first block in the blockchain. Failure to link each block to a previous block confirms the failure of this integrity, which may indicate tampering (i.e., any type of change in data stored in one or more blocks in the blockchain), fraud, and the like. The information in the block is encrypted and protected by an encryption method.
A blockchain is stored in a decentralized network; in other words, there is no centralized or "official copy" of the data stored in the blocks. Instead, there may be, and typically are, multiple identical copies of the blockchain. Each node in a particular network stores an identical instantiation of the blockchain (or, if a node does not have the latest version of the blockchain, that node is considered to have left the network for purposes of verifying later transactions until the node has "caught up" and rejoined the network).
As described above, the GDPR of the European Union defines certain obligations for data "controllers" (i.e., natural or legal persons, public authorities, agencies, or other bodies which, alone or jointly with others, determine the purposes and means of personal data processing) and data "processors" (i.e., natural or legal persons, public authorities, agencies, or other bodies that process personal data on behalf of a controller). In addition to penalizing data processors, the GDPR imposes more stringent obligations on personal data controllers and greatly increases the potential penalties for non-compliance.
Article 17 of the GDPR defines a "right to erasure/right to be forgotten" that allows individual data subjects to request the deletion or removal of personal data where there is no compelling reason for its continued processing.
One key feature of a blockchain is its integrity (i.e., the ability of network users to trust the accuracy of the data stored in the chain's blocks), which is guaranteed by its immutability. Once a block is verified and added to the chain, it generally cannot be deleted, edited, or updated. In fact, the blockchain is designed so that a change to the data stored in any one block "breaks" (i.e., fails) all downstream blocks in the chain. However, even where blockchain data is protected by encryption or static tokenization, it is anticipated that a person might want to exercise the "right to erasure/to be forgotten" under the GDPR (or other similar regulations providing such a right) to require that their data be deleted from the blockchain. With a common blockchain platform, such requests are impossible to fulfill without destroying the integrity of the entire chain.
The UK Financial Conduct Authority (FCA) has warned companies developing blockchain technology to be careful about the incompatibility between immutability and the GDPR. Some solutions to this problem have been proposed, such as allowing an administrator to edit the blockchain when necessary. However, as described above, editing a blockchain breaks the very concept of a blockchain, because it makes the blockchain mutable and thereby removes the guarantee of blockchain integrity.
The GDPR was designed on the assumption that data custodians continue to be central entities; it does not contemplate a decentralized system like a blockchain. The BigPrivacy techniques described herein greatly increase the functionality of the underlying blockchain technology for several reasons. For example, BigPrivacy can be used to keep blockchain data immutable while still allowing that data to satisfy the "right to erasure/to be forgotten" criteria of the GDPR. The BigPrivacy techniques described herein (e.g., the use of DDIDs) can also be applied to the new context of decentralized storage systems (a novel context in that, e.g., the GDPR itself does not consider how the use of immutable decentralized ledgers to store user data can affect the implementation of its requirements). BigPrivacy further allows blockchains to be used in ways that address other obligations of data controllers and processors under the GDPR, as discussed in detail below.
Right to erasure/right to be forgotten
Turning now to FIG. 1Z-1, an exemplary decentralized network based on blockchain technology is illustrated, in which anonymizing privacy controls can be used in accordance with one or more embodiments. The top of FIG. 1Z-1 (185) shows the current situation, in which the name of a data subject may be encrypted (e.g., using a desired encryption algorithm) or replaced with a static token before being stored in the blockchain. In this example, the string "ABCD" is used as an illustrative representation of the result of such an encryption or tokenization process. With access to the appropriate key, it can be determined that the encrypted/tokenized value "ABCD" means "John Smith." As described above, this knowledge is immutable and conflicts with the GDPR requirement to provide the user with a "right to erasure/to be forgotten." This is so because "John Smith," although stored in encrypted form, is contained in the blockchain (185) itself.
The bottom of FIG. 1Z-1 (187) shows a Dynamic De-Identifier (DDID), in this case "DDID652," that can be used in the blockchain in place of the encrypted/tokenized value "ABCD." As described elsewhere herein, the DDID ("DDID652") may serve as a "pointer" to the underlying name of the data subject, "John Smith," unless and until the data subject exercises their "right to erasure/to be forgotten," at which time the DDID may point to an "empty" entry. In this way, the immutable nature and referential integrity of the blockchain can be preserved while the data subject retains the flexibility to exercise their "right to erasure/to be forgotten." It should also be noted that the DDID may point to any other content, i.e., not only to "John Smith" or to "null" or the value 0, but to any other location containing any other desired value. In the BigPrivacy-enabled example (187), in contrast to the traditional blockchain example (185), the value "John Smith" is not actually contained in the blockchain itself; instead, a DDID ("DDID652") that temporarily points to a location containing the value "John Smith" is contained in the blockchain. The DDID value remains unchanged in the blockchain (187), but the value pointed to by the DDID can be changed without changing the blockchain itself.
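A hedged sketch of this pointer arrangement follows; the names on_chain_block, ddid_directory, and resolve are hypothetical illustrations, with the directory standing in for the off-chain location controlled by the data controller:

```python
# The chain immutably stores only the DDID; the value the DDID points to
# lives off-chain and can be changed or nulled without touching the chain.
on_chain_block = {"payer": "DDID652"}        # never changes

ddid_directory = {"DDID652": "John Smith"}   # mutable, held off-chain

def resolve(ddid):
    return ddid_directory.get(ddid)          # may return a value, or None

print(resolve(on_chain_block["payer"]))      # 'John Smith'

# Data subject exercises the right to erasure: only the pointer's target
# changes; the blockchain, and hence its integrity, is untouched.
ddid_directory["DDID652"] = None
print(resolve(on_chain_block["payer"]))      # None (an "empty" entry)
```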
In another embodiment, BigPrivacy may enforce the same "right to erasure/to be forgotten" with respect to a "smart contract" performed between counterparties (or with respect to one or more individual terms among the several terms performed by the parties). BigPrivacy is able to provide this level of privacy/anonymity because, once each counterparty has performed the contract, records of the counterparties are no longer needed (i.e., each counterparty has fulfilled its obligations to the other). For example, the notion of deleting or forgetting the identity of one or more parties to a smart contract may arise in a transaction or exchange of financial instruments.
Mike Bursell of Red Hat states that confidentiality, integrity, and availability are major issues for smart contract fulfillment, as follows:
"once a transaction (or 'smart contract') is completed and enters a blockchain or distributed ledger, it is immutable by definition. But before completion? Then, the simple transactions described at the beginning of this document are atomic-they occur or do not occur, they are "indivisible and unreducible" in jargon. In most cases, they are transient.
This is not the case for 'smart contracts.' They require processing, and therefore must exist over time. This means that they are subject to the various attacks to which any system may be subject while it is processing. Two relevant components of the standard list include:
● Confidentiality: the state of the 'smart contract' may be snooped upon, which may result in asymmetric knowledge or leakage to unauthorized parties.
● Integrity: this is the nightmare case for many 'smart contracts.' If an entity (whether or not a party to the underlying contract) is able, intentionally or unintentionally, to alter the internal state of the code executing the 'smart contract,' then the results of that 'smart contract' will not meet expectations, and any party may have legitimate grounds to dispute the outcome. Furthermore, such a dispute might not even be based on evidence of a loss of integrity, but merely on suspicion. In an execution context, it is extremely difficult to prove integrity at runtime, let alone to mitigate once integrity is shown to have been lost.
The BigPrivacy techniques disclosed herein may also render such snooping issues inconsequential, for example by protecting the identities of the transaction counterparties, as well as information about the terms and conditions of the transaction embodied in the smart contract's elements. In other words, BigPrivacy assumes that snooping may occur in any event, but it ensures that any data obtained by such snooping is of no value to the snooper, since that data consists only of DDIDs and not the underlying "real" values the snooper seeks. With respect to integrity, BigPrivacy helps ensure that parties cannot intentionally or unintentionally change the code, by making the terms themselves (including the identities of both parties to a smart contract) unusable to snoopers, since without knowledge of what the code implements, any change to the code would produce a completely random result.
Data protection by design and by default
GDPR Article 25 requires the data controller to implement appropriate safeguards "when determining the means of processing and during the processing itself." Article 25 further indicates that one way to do this is to pseudonymize personal data.
Data protection by design and by default must be applied as early as possible, so that, by default, data use is limited to the minimum scope and time required to support the specific uses authorized by the data subject. The present default is that data is available, and measures and effort must be expended to protect it. The GDPR requires that this default be reversed. Whether by pseudonymization, which is specifically mentioned in GDPR Article 25, or by some other means, the GDPR requires that protection be demonstrated early, and that data use be limited in scope and time to what the data subject has specifically authorized.
GDPR Recital 78 specifies that, in order to protect the rights and freedoms of natural persons with regard to the processing of personal data, appropriate technical and organizational measures must be taken to ensure that the requirements of the Regulation are met. To be able to demonstrate compliance with the Regulation, the controller should adopt internal policies and implement measures that meet, in particular, the principles of data protection by design and data protection by default. Such measures may include pseudonymizing personal data as soon as possible.
GDPR Article 4(5) defines "pseudonymisation" in a way that requires separating the informational value of data from the risk of re-identification. This separation is necessary to benefit from the legal/regulatory incentives and rewards that the GDPR attaches to pseudonymization. Replacing multiple occurrences of the same personal data element (e.g., the name of a Data Subject) with a "static" (or persistent) identifier does not separate the informational value of the data from the risk of re-identification, since re-identification via correlation and linkage attacks (also known as the "Mosaic Effect") remains possible precisely because a "static" (or persistent) identifier, rather than a dynamic de-identifier, is used.
As described above, a "static" identification method uses a persistent identifier to protect data. By searching for a particular tokenized string that repeats itself within or across the database, a malicious executive component or interactor can obtain sufficient information to uncover the identity of the data body. This is an increasingly large problem for analysis and other procedures that merge and mix internal and external data sources. Conversely, if a data element is replaced each time it is stored using a different pseudonymous DDID, where each different DDID has no algorithmic relationship to other data elements, the same malicious participant or interlude can no longer determine that the DDIDs belong to or are related to the same data body, let alone that the name or other identifying information of the data body is found.
Referring to FIG. 1Z-2, another exemplary decentralized network built on blockchain technology is illustrated, in accordance with one or more embodiments. FIG. 1Z-2 shows the case where the name of the data subject is encrypted (e.g., using an encryption algorithm) or replaced with a static token. For these purposes, the string "ABCD" is again used as an example representation of the result of such an encryption or tokenization process. The same encrypted/tokenized value "ABCD" is used in multiple blockchains to represent "John Smith." This is represented in FIG. 1Z-2 by blocks #1 (190) and #2 (192), each storing the data "ABCD" in one of its respective blocks as described above, so that John Smith can eventually be re-identified because the same encrypted/tokenized value (in this example, "ABCD") is used persistently (or statically), without any need to access keys or mappings showing that "ABCD" = "John Smith." As described above, this failure to protect the identity of John Smith may violate the obligations of a data controller under GDPR Article 25 and Recital 78.
Referring to FIG. 1Z-3, another exemplary decentralized network based on blockchain technology is illustrated in accordance with one or more embodiments, in which anonymizing privacy controls may be used. FIG. 1Z-3 shows how different DDIDs may be used in different blockchains (in this example, block #1 (195) uses "DDID652" and block #2 (196) uses "DDID971"), where each DDID may serve as a "pointer" to the name value "John Smith." In this way, the immutable nature and referential integrity of the blockchain are maintained while still satisfying the pseudonymization requirements specified in GDPR Article 4(5) and the data protection requirements specified in GDPR Article 25.
Thus, BigPrivacy does not require changing the underlying blockchain verification algorithms. Rather, Anonos BigPrivacy first recognizes that currently implemented blockchains cannot (i) comply with key elements of the GDPR (which specifies technical requirements to protect the privacy of individual data subjects) while (ii) remaining immutable. Unless the invention herein is applied to blockchain implementations, these technical requirements imposed by the GDPR (e.g., the right to be forgotten and data protection by design and by default) and the immutability requirements of blockchains cannot both be satisfied. Furthermore, BigPrivacy may be used to mask the identities of the original transaction counterparties to a "smart contract" before, during, and after the execution of such "smart contract."
Other embodiments of the disclosed technology, as applied to distributed ledger technologies such as blockchains, may include, but are not limited to: authenticating copyright registrations; tracking digital usage of, and payments to, the creators of copyrighted content (e.g., musicians, artists, photographers, and writers); tracking the flow of high-value parts through a supply chain; enabling spectrum sharing in wireless networks; supporting online voting; enabling a "right to treatment"; implementing medical record information systems; confirming and verifying ownership of digital art; taking ownership of in-game (digital) assets; providing new distribution models for the insurance industry, such as peer-to-peer insurance, parametric insurance, and microinsurance; and facilitating cooperative partnerships in fields such as the sharing economy and the Internet of Things.
FIG. 2 shows an example of process operations or steps that may be taken by an abstraction module of a privacy server, such as abstraction module 52 shown in FIGS. 1 and 1A, according to one embodiment of the invention. In one example, in step 1, related party ZZ (shown as "RP ZZ") sends a request related to a desired operation, activity, process, or feature through a privacy client, which may reside on a Data Subject device, reside on a service provider device, be accessible through and reside in a cloud network, or reside on the same computing device as the privacy server. Request initiation may be configurable, so that it may be initiated predictively, randomly, automatically, or manually. For example, related party RP ZZ initiates a request to perform desired online Web browsing operations.
In step 2, in one example, the abstraction module of the privacy server determines and retrieves from a database the combination of attributes required to perform the desired operation, activity, process, or feature, as attribute combination A ("AC A"). In this example implementation of the system, the abstraction module of the privacy server is configured to add or delete attributes, retrieve combinations of attributes, and modify attributes in any given combination.
In the example of an e-commerce website selling athletic equipment, the abstraction module of the privacy server may determine that attributes associated with the height, weight, and budget of a Data Subject are necessary to perform the desired operation, activity, process, or feature, and may therefore retrieve the height, weight, and budget attributes of the specified Data Subject from the database to form an attribute combination consisting thereof. In another example, involving a doctor requesting blood pressure information, the abstraction module of the privacy server may determine that the attributes consisting of the most recently recorded systolic and diastolic blood pressure values are required for the desired action, activity, process, or feature, and may therefore retrieve the most recently recorded systolic and diastolic blood pressure values of the specified Data Subject to form an attribute combination consisting thereof. Another example may involve an internet user who visits an online retailer of running shoes. The online retailer may not know who the user is, or even whether the user has visited the site one or more times in the past. The user may wish the visited site to know that he is currently shopping for running shoes, and may also wish the visited site to know which shoes he has looked at on other sites over the past few weeks. The user may instruct the privacy server to release only recent shopping information and other user-defined information to the visited site. Thus, in this example, the privacy server may select the attributes: shoe size = 9; shoes recently viewed on other websites = Nike X, Asics Y, New Balance Z; average price of shoes purchased = $109; shopper's zip code = 80302; shopper's gender = male; shopper's weight = 185 pounds. The privacy server may collect these attributes, generate a unique DDID or accept or modify a temporally unique, dynamically changing value to serve as the DDID, assign the DDID to the attributes, and send the result as a TDR to the visited website. If the user views the Saucony model 123, the website may append this attribute to the information about the shoes viewed and send the information back to the privacy server as part of an enhanced TDR.
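A hedged Python sketch of the running-shoe example follows; the profile dictionary, attribute names, and values are hypothetical stand-ins used only to show how a DDID and a selected attribute combination together form a TDR:

```python
import secrets

profile = {
    "shoe_size": 9,
    "recently_viewed": ["Nike X", "Asics Y", "New Balance Z"],
    "avg_price_paid": 109,
    "zip": "80302", "gender": "male", "weight_lbs": 185,
    "home_address": "(withheld)",   # not selected for this purpose
}

# The privacy server releases only the user-authorized attributes,
# paired with a freshly assigned DDID, as a TDR:
selected = {k: profile[k] for k in
            ("shoe_size", "recently_viewed", "avg_price_paid",
             "zip", "gender", "weight_lbs")}
tdr = {"ddid": secrets.token_hex(6), "attributes": selected}

# The visited site may append what it observes before returning the
# enhanced TDR to the privacy server:
tdr["attributes"]["viewed_this_session"] = "Saucony 123"
print(tdr["ddid"], sorted(tdr["attributes"]))
```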
Yet another example might involve a personal banker at a bank who is working with a customer wishing to add a savings account to the account she originally opened at the bank. The personal banker may not need to know all of the information about the customer, only the information needed to open the account. Using the invention, the banker can query the bank's privacy server through a privacy client to request opening a new savings account for the customer. The bank's privacy server may determine the requestor's data authorization limits and the required operations. The bank's privacy server may collect the following attributes for the customer: name = Jane Doe; current account number = 12345678; current account type = checking; customer address = 123 Home Street, Boulder, CO 80302; other signer on checking account = Bill Doe; relationship of co-signer to customer = husband. After the bank's privacy server collects the attributes, it assigns a DDID to them and sends the information to the personal banker as an enhanced TDR through the privacy client.
For example, the controlling entity may choose to include data attributes in attribute combination A that enable the recipient of the TDR to track related party ZZ anonymously, using existing tracking technologies, for the duration of the generated TDR. The controlling entity may also choose to include more accurate data than that provided by existing tracking technologies, in order to provide personalized and customized service to related party ZZ.
In step 3, in one example, a request is made to the privacy server ("PS") for a DDID. This may include requesting a specified level of abstraction and either generating a unique DDID or accepting or modifying a temporally unique, dynamically changing value to serve as the DDID for the particular activity, action, process, or feature required. Prior to assigning the DDID, the PS may verify that the DDID value is not actively in use by another TDR, possibly including a buffer period to account for potential interruptions and system downtime.
In step 4, in one example, the abstraction module of the PS assigns and stores a DDID in response to the request related to the operation, activity, process, or feature. Step 4 may also comprise, in one example, assigning DDID X to the Web browsing requested by related party ZZ.
In step 5, in one example, the abstraction module of the PS combines the retrieved applicable attribute combination with the assigned DDID X to create a TDR. The TDR itself may not include information about the true identity of related party ZZ, but the maintenance module of the privacy server may retain the information necessary to re-associate the TDR with related party ZZ. Step 5 may also include updating a secure database associated with the Data Subject, thereby providing an internal record in the Data Subject's aggregated data profile that associates related party ZZ with the particular attribute combination A deemed necessary for performance of the desired operation, activity, process, or feature.
FIG. 3 shows an example of additional steps that may be taken by the abstraction module of the privacy server, according to one embodiment of the invention. In step 6, in one example, the TDR created for related party ZZ's Web browsing request is transmitted by the privacy server to the applicable service provider, vendor, or merchant through a privacy client that may reside on the Data Subject device, reside on the service provider device, be accessed through a cloud network, or reside on the same computing device. The privacy client may also capture data related to the desired browsing activity with the service provider, vendor, or merchant.
Once the purpose of the TDR has been satisfied or a predetermined time limit has been reached, in one example the TDR may be sent back to the privacy server by the privacy client, and in step 7 new attribute combinations may augment the returned TDR with data reflecting the desired action, activity, process, or trait for which the TDR was created. In the example shown in FIG. 3, related party ZZ performs the desired Web browsing with the service provider, merchant, or vendor, generating an attribute combination ("AC Q") that reflects the attributes associated with the Web browsing performed. When the Web browsing is complete or the time limit of the TDR expires, the privacy client transmits the TDR (now augmented with attribute combination Q to reflect the data related to the Web browsing) from the service provider, vendor, or merchant back to the privacy server. When the data is received back at the privacy server, a time period/stamp is associated with the TDR, in one example by means of Time Keys (TKs) or other means, and the relevant attribute combination returned from the service provider, vendor, or merchant may be updated and stored in a secure database as part of the Data Subject's aggregated data profile.
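The following hedged sketch illustrates one way such time-keyed association might be recorded; the time_keys dictionary and its layout are hypothetical illustrations, not the system's actual storage schema:

```python
import secrets, time

# "Time keys" here are hypothetical tokens binding a DDID to the time
# period during which it represented a particular session.
time_keys = {}   # (ddid, time_key) -> recorded session state

ddid = "X"                       # DDID assigned to related party ZZ's TDR
start = time.time()
# ... Web browsing occurs; the TDR returns augmented with AC Q ...
end = time.time()

tk = secrets.token_hex(4)
time_keys[(ddid, tk)] = {
    "subject": "RP ZZ",                    # held only by the privacy server
    "period": (start, end),                # the time period/stamp
    "attribute_combinations": ["AC A", "AC Q"],
}
print(time_keys[(ddid, tk)]["attribute_combinations"])
```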
FIG. 4 shows an example of additional steps that may be taken after the operations of FIG. 3, according to one example embodiment of the present invention. As the privacy server receives each augmented TDR, the privacy server's maintenance module may update the source data by associating the time period/stamp (via Time Keys (TKs) or other means), the DDID, and the attribute combinations with the applicable Data Subject. As shown in FIG. 4, the privacy server may record and associate the time period/stamp (via Time Keys (TKs) or other means), DDID X, attribute combination A, and attribute combination Q with requesting related party ZZ in a secure database. In the maintenance module of the privacy server, the relationship information between time periods/stamps, DDIDs, attribute combinations, Data Subjects, and related profiles may be stored, updated, or deleted as required. This may include, in one example, storing or updating all relationship information between time periods/stamps, DDIDs, attribute combinations, Data Subjects, and profiles in the secure database as part of the Data Subject's aggregated data profile. After the association of the new data related to the desired operations, activities, processes, or features with the attribute combinations is complete, the DDID may be reassigned for use in new TDRs, in one example in the same manner described above.
FIG. 5 highlights the differences between a single-level abstraction implementation of the system and a multi-level abstraction implementation of the system, in accordance with an embodiment of the present invention. Example 1 shown in FIG. 5 illustrates a system with a single abstraction layer, as described in the discussion of FIGS. 2-4 regarding Web browsing activity. Example 1 in FIG. 5 shows the final disposition resulting from the Web browsing activity of FIGS. 2-4, wherein, when the secure database is updated, the time period/stamp is associated with requesting related party ZZ by means of Time Keys (TKs) or other means, together with attribute combination A, attribute combination Q, and DDID X. It should be noted that, in Example 1, parties outside the system would not have access to the identifying information associated with the attribute combinations or Data Subjects. Within the system, however, a user possessing the Replacement Keys (RKs) described herein could, in one example, identify related party ZZ, as well as the relationships between related party ZZ, attribute combination A, attribute combination Q, the time period/stamp, and DDID X.
Example 2 in FIG. 5 reflects one potential implementation of a multi-level abstraction implementation of the system, consistent with an embodiment of the present invention. The abstraction provided is a function of multiple applications of the system, rather than an entirely separate component. The dynamic nature of TDRs allows the same baseline principles to be used across levels of abstraction while still providing usable interaction with respect to the requested data. In this example, an entity with authorized access to privacy server A and its related secure database would have access to the associations between DDID X, DDID P, DDID TS, and DDID YY and each attribute combination and time period/stamp associated with those DDIDs. However, in one example, that entity would not have access to any information disclosing the associations among the different DDIDs. Only with access to privacy server B and its related secure database can the second level of abstraction be revealed, i.e., the relationships among DDID X, DDID P, DDID TS, and DDID YY. As shown in FIG. 5, the second-layer abstraction may be the relationship between principal DD and DDIDs X and P, or the relationship between principal CV and DDIDs TS and YY.
If principal CV and principal DD reflect the true identities of the Data Subjects in question, then Example 2 reflects one potential implementation of a two-level abstraction of the system. However, if the values of principal CV and principal DD are themselves assigned dynamically changeable DDIDs, then Example 2 reflects one potential implementation of a three-level abstraction of the system. It should be appreciated that any and all elements of the system may be abstracted at multiple levels to achieve the desired security and privacy/anonymity.
In one example implementation of the system, Examples 1 and 2 in FIG. 5 may represent authenticated data structures that allow the verification module of the privacy server to verify the DDIDs and attribute combinations embodied in TDRs and/or data profiles over time by means of cyclic redundancy checks ("CRCs"), message authentication codes, digital watermarks, and link-based timestamping methods. These methods verify the state and composition of data at different points in time by validating the composition of each Data Subject, attribute, attribute combination, aggregated data profile, and other element contained in the privacy server at different points in time.
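As a hedged sketch of just one of the options named above (message authentication codes), the following illustrates how a verification module might MAC each recorded snapshot so that its state at a point in time can later be validated; the key, snapshot layout, and function names are hypothetical:

```python
import hmac, hashlib, json

SECRET = b"verification-module-key"        # hypothetical key

def mac_snapshot(snapshot):
    # Canonicalize the snapshot, then authenticate it with HMAC-SHA256.
    blob = json.dumps(snapshot, sort_keys=True).encode()
    return hmac.new(SECRET, blob, hashlib.sha256).hexdigest()

snap = {"ddid": "X",
        "attribute_combinations": ["AC A", "AC Q"],
        "time_stamp": "2018-04-26T12:00:00Z"}
tag = mac_snapshot(snap)

# Later, verify that the recorded state has not been altered:
print(hmac.compare_digest(tag, mac_snapshot(snap)))   # True if unchanged
```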
Further, in one example implementation of an embodiment of the present invention, Examples 1 and 2 of FIG. 5 may each include the data required by an access log module to enable after-the-fact forensic analysis in the event of system-related error or misuse.
FIG. 6 shows one example of a process for providing data security and data privacy/anonymity according to one embodiment of the present invention. FIG. 6 shows, in one example, process steps that a controlling entity or the system may implement. The operations outlined in FIGS. 6-10 may be facilitated by known programming techniques, including but not limited to Simple Object Access Protocol (SOAP), Representational State Transfer (REST) Application Programming Interface (API), or Service-Oriented Architecture (SOA) techniques, and canonical industry-standard data models such as HL7 for healthcare, SID for telecommunications, ARTS for retail, ACORD for insurance, M3 for multi-commodity models, OAGIS for manufacturing and provisioning, PPDM for oil and gas/utilities, and the like.
In step 1 of FIG. 6, data attributes are received as input, or are created, as system input. As previously mentioned, for the purposes of this disclosure, a data attribute refers to any data element that can be used, alone or in combination with other data elements, to identify a Data Subject, such as a person, place, or thing, or an associated operation, activity, process, or feature. An example of a data attribute might be a street address consisting of 1777 6th Street, Boulder, Colorado 80302.
In step 2 of FIG. 6, the data attributes are associated with the applicable subjects. In the example above, the data attribute address is associated with the subject Boulder, Colorado courthouse building.
In step 3 of FIG. 6, the elements associated with each data attribute are linked to or bound with the data attribute, and are determined to include an applicable classification, a value, and a nature, to facilitate use of the attribute with respect to desired actions, activities, processes, or traits. For example, the elements associated with the data attribute address may be: (a) classification: street address; (b) value: 1; 7; 7; 7; 6th; S; t; r; e; e; t; B; o; u; l; d; e; r; C; o; l; o; r; a; d; o; 8; 0; 3; 0; 2; 1777; 6th Street; Boulder; Colorado; 80302; or any combination thereof; and (c) nature: constant, since buildings are stationary. Another example of a data attribute relating to the subject building may be the condition of the building, which (a) is classified as a building condition; (b) has the value good; and (c) is classified as variable in nature, because the condition of the building may degrade over time. Yet another example of a data attribute associated with the subject building may be: (a) classified as an organization having offices within the building; (b) having the value Boulder Colorado Alternative Sentencing Program; and (c) classified as variable in nature, because CASP may change the location of its offices in the future. It should be noted that exogenous information may reveal attributes associated with a Data Subject. For example, with respect to the building described above, if one knows that the Boulder Colorado Alternative Sentencing Program (CASP) has an office in the Boulder, Colorado courthouse building, and discovers that John Smith works at CASP and that John Smith frequently appears at 1777 6th Street in Boulder, one might use this exogenous information to identify that address as that of the Boulder, Colorado courthouse building. Thus, the fact that John Smith works at CASP may be an attribute of a Data Subject that potentially reveals the Data Subject located at that address.
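Purely as an illustrative rendering of this step (the class name and field names below are hypothetical, not the system's schema), each element can be modeled as a classification, a value, and a constant/variable nature:

```python
from dataclasses import dataclass

@dataclass
class AttributeElement:
    classification: str
    value: str
    nature: str            # "constant" or "variable"

address = AttributeElement("street address",
                           "1777 6th Street, Boulder, Colorado 80302",
                           "constant")       # buildings are stationary
condition = AttributeElement("building condition", "good", "variable")
tenant = AttributeElement("organization with offices in building",
                          "Boulder Colorado Alternative Sentencing Program",
                          "variable")
print(address, condition, tenant, sep="\n")
```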
In step 4 of FIG. 6, each data attribute entered into the system is added to an aggregated data profile (see FIGS. 1 and 1A). In the above example, the recorded data attributes would be added to the aggregated data profile for the Boulder, Colorado courthouse building.
In step 5, attribute combinations are identified and formed to support the desired activity, operation, process, or trait. This step may include creating or loading templates that specify one or more necessary attributes associated with a particular operation, activity, process, or feature. For example, for an e-commerce operation, the template may request information relating to the age, gender, size, and preferred color of the Data Subject as attributes. In another example, involving a travel reservation function, the template may request information relating to the Data Subject's preferred class of air travel, such as coach, business class, or first class, as an attribute. A privacy server may load or access a plurality of such templates to support various different operations, activities, processes, and/or features. In addition, if desired by the controlling entity, the privacy server can be configured to facilitate manual override of established templates and the creation of new templates relating to new operations, activities, processes, and/or features as required. Such manual override may be implemented, for example, through a graphical user interface of a privacy client running on the Data Subject's mobile device. For example, a Data Subject may use the graphical user interface to override a template requesting information about the Data Subject's preferred class of air travel, such as business class or first class, and may instead specify whether he/she wants a suite, balcony stateroom, outside stateroom, or inside stateroom as an attribute, because in one example the Data Subject may be traveling by cruise ship. In this example, the graphical user interface might allow a Data Subject to select the minimum attributes to be transferred from the Data Subject's aggregated data profile.
In step 6, the privacy server receives, from a privacy client that may reside on a Data Subject device, reside on a service provider device, be accessed through and reside in a cloud network, or reside on the same computing device as the privacy server, a request relating to a particular action, activity, process, or trait. The nature and substance of the requests that the privacy server may receive from the privacy client may vary depending on a number of factors, including whether the system is implemented as DRMI, DRMD, or otherwise, and whether the request relates to healthcare, education, mobile, financial, Web, Internet of Things, or other applications, etc.
In step 7, the level of abstraction appropriate to the level of security, anonymity, privacy, and relevance required for the particular operation, activity, process, or trait is determined. For example, the system may introduce an initial abstraction layer by separating related data attributes into one or more TDRs determined on the basis of the given action, activity, process, or trait. In addition to splitting data attributes into one or more TDRs, further layers of abstraction can be introduced by abstracting individual attributes or combinations of attributes, or by replacing them with DDIDs that cannot be understood without access to the Replacement Keys (RKs). The privacy, anonymity, and security of the attributes contained in a TDR may be further improved or enhanced by using known protection techniques such as encryption, tokenization, pseudonymization, and recycling, together with further abstraction layers. Additional DDIDs may be used to refer to networks, the Internet, intranets, and third-party computers that may be integrated with, or otherwise in communication with, one or more embodiments of the invention.
In step 8, the controlling entity selects from the privacy server the desired combination of attributes relevant to the desired operation, activity, process, or trait, based on the attributes associated with the applicable template. The abstraction module may determine the required attributes, which may be controlled by the controlling entity or delegated to another entity as an authorized party, and the authorized party may choose to have the abstraction module select attributes based on pre-built templates, select attributes dynamically, or intelligently detect the appropriate inputs.
In one example of step 8, involving an e-commerce website selling sporting equipment, an internet browser provider acting as the controlling entity may use the abstraction module of the privacy server to determine that the website requires the height, weight, and budget of the Data Subject in order to select appropriate sporting equipment, such as kayaks and paddles.
In step 9, the abstraction module of the privacy server generates unique DDIDs, or accepts or modifies temporally unique, dynamically changing values to serve as DDIDs, and assigns a DDID to each attribute combination of step 8, forming TDRs. These DDIDs may serve a variety of functions including, but not limited to, replacement or simple association. For example, if the internet browser provider acting as the controlling entity instructs the abstraction module to create a TDR with a single layer of abstraction, it may assign to the data a DDID that is not visibly associated with other TDRs pertaining to the same Data Subject absent access to the Association Keys (AKs). As another example, if the internet browser provider acting as the controlling entity instructs the abstraction module to create a TDR with two layers of abstraction, it may (i) assign a DDID to be associated with the data attributes for the duration of the TDR, and (ii) further abstract the data attributes by assigning the DDID Ab5 to the weight of the Data Subject, the DDID 67h to the height of the Data Subject, and the DDID Gw2 to the budget of the Data Subject, none of which can be understood without use of the Replacement Keys (RKs). Step 9 may also include retrieving one or more attributes associated with the Data Subject from one or more databases. The DDIDs used in step 9 may be newly generated and not currently in use, or may be selected from expired, previously used DDIDs.
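A hedged sketch of step 9's two layers of abstraction follows, reusing the DDIDs named in the text (Ab5, 67h, Gw2) purely for illustration; the attribute values and the dictionary layout are hypothetical:

```python
import secrets

attributes = {"weight": 185, "height": "5'10\"", "budget": 500}

# Layer 2: abstract each attribute name behind its own DDID; this mapping
# is the Replacement Key (RK) and remains with the privacy server.
rk = {"Ab5": "weight", "67h": "height", "Gw2": "budget"}
abstracted = {ddid: attributes[name] for ddid, name in rk.items()}

# Layer 1: the TDR itself receives a DDID with no visible link to the
# Data Subject; that link (the Association Key, AK) also stays behind.
tdr = {"ddid": secrets.token_hex(6), "attributes": abstracted}
print(tdr)   # meaningless to a recipient holding neither the AK nor the RK
```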
In step 10, the TDRs, consisting of attribute combinations and DDIDs, are transmitted by the privacy server through the privacy client to the recipient entity, for use by the recipient entity in the desired operations, activities, processes, or features associated with the recipient entity. For example, in the example above, the internet browser provider acting as the controlling entity may deliver to the e-commerce website, as the receiving entity, the TDR consisting of the DDID and the second-layer abstracted data attributes Ab5, 67h, and Gw2.
In step 11, the receiving entity receives, through the privacy client, the TDRs (possibly consisting of attribute combinations and DDIDs) related to the desired operation, activity, process, or feature. If the intended use of the system is to produce output for big data analytics, receiving the TDRs may be the final step (e.g., see the potential embodiment of the invention discussed with reference to FIG. Z, which provides privatized/anonymized data for big data analytics so that applicable Data Subjects retain the "right to be forgotten"), but more interactive uses of TDRs may involve optional steps 12 to 17.
In optional step 12, the recipient entity interprets, via the privacy client, the TDRs (possibly consisting of attribute combinations and DDIDs for the desired online operations, activities, processes, or features) and is provided access to the AKs and/or RKs as needed to understand the content of the TDRs. For example, in the example above, the e-commerce site, as the receiving entity, would access the RK information to see that the value Ab5 corresponds to the Data Subject's weight, the value 67h to the Data Subject's height, and the value Gw2 to the Data Subject's budget.
In optional step 13, the privacy client may capture new data attributes associated with the desired online operations, activities, processes, or features, which augment the original TDR data attributes as information in a new TDR format.
In optional step 14, the privacy client may capture any new data attributes associated with offline activity that relate to the desired online operations, activities, processes, or features, which supplement the original TDR data attributes as new information in TDR format.
In optional step 15, the privacy client transmits the TDRs, comprising the DDIDs and the attribute combinations associated with the online/offline sessions, back to the privacy server.
Because the TDRs are transmitted by the privacy client to the privacy server in steps 14 and 15 without AKs or RKs, they are transmitted in a disaggregated and anonymous format, so that anyone intercepting the TDRs would be unable to associate them with the applicable Data Subject or with the desired action, activity, process, or feature.
In optional step 16, in one example, re-aggregation of the attribute combinations is performed by the maintenance module, using the relationship information between the DDIDs and the attribute combinations embodied in the Association Keys (AKs) and Replacement Keys (RKs) located on the privacy server. In this example, this means that the original or modified TDRs are returned to the privacy server, which may then modify or add the new information about the suggested kayaks and paddles to the Data Subject's aggregated data profile.
Upon completion of the above-described re-aggregation of new data regarding the desired operations, activities, processes, or traits into the attribute combinations, the DDID may, in one example, be considered expired, and in optional step 17 it may be reintroduced into the system, reassigned, and used with other attributes, attribute combinations, Data Subjects, operations, activities, processes, features, or data to form new TDRs in the manner described above.
For example, the DDIDs Ab5, 67h, and Gw2 assigned to attributes in step 9 above may be reassigned to data attributes associated with other Data Subjects, in a manner similar to a like-case hop or a remote-case hop. A like-case hop might involve re-associating Ab5 with a second Data Subject having the same or similar weight as the original Data Subject, or with another weight datum or something related to the same number but not to the same Data Subject, while a remote-case hop might involve reassigning the DDID Ab5 to an entirely unrelated data attribute.
In the second example of FIG. 6, a doctor may request blood pressure information related to a specified Data Subject, collected offline by a nurse and entered online into the Data Subject's aggregated data profile. The request may cause the abstraction module of the privacy server to extract, as part of step 8 above, the attribute combination consisting of the Data Subject's most recently recorded systolic and diastolic blood pressure values. As part of step 9, instead of specifying the identity of the Data Subject, the privacy server may combine this attribute combination with a DDID assigned by the privacy server to form a TDR. As part of step 10, the blood pressure attributes may be communicated to the doctor, along with the assigned DDID, by a privacy client that may reside on the Data Subject device, reside on a service provider device, be accessible through and reside in a cloud network, or reside on the same computing device as the privacy server. At this point, the combination of the DDID and the blood-pressure-related attributes constitutes the TDR. As part of step 12, the doctor, as the receiving entity, may read the blood pressure values by means of the RKs, and as part of steps 13 and 14, online and offline observations, recommendations, or annotations relating to the blood pressure readings may be recorded as new data attributes. As part of step 15, the TDR, enhanced with the online/offline information, may be returned to the privacy server by the privacy client. As part of step 16, the privacy server may use this information to update the Data Subject's aggregated data profile. In this way, an unintended recipient of the TDR will not be able to determine the identity of the Data Subject and will see only the DDID, which, after the doctor's use, may be reassigned to another Data Subject in a manner similar to a like-case hop or a remote-case hop.
FIG. 6A shows an example of a process for providing data security, data privacy, and anonymity according to one embodiment of the present invention involving interaction with an external database. Fig. 6A shows, in one example, flow steps that a controlling entity or system may implement.
In step 1 in FIG. 6A, a third party data source submits data containing one or more data attributes associated with one or more Data Subjects as input to the system. It should be noted that in the embodiment of the invention represented in fig. 6A, prior to submitting such data to the system, the third party data source will have created an aggregated data profile for each Data Subject (see fig. 1A), which the third party data source will maintain directly or indirectly in one or more databases.
In step 2, the privacy server receives requests from privacy clients, which may reside on Data Subject devices, reside on service provider devices, be accessed through and reside in a cloud network, or reside on the same computing device as the privacy server, relating to a particular action, activity, process or trait. The nature and substance of requests that the privacy server may receive from the privacy client may vary depending on a number of factors, including whether the system is implemented in DRMI, DRMD, or otherwise, whether the request is related to healthcare, educational, mobile, financial, network, Internet of Things, or other applications, etc.
Privacy, anonymity and security of the attributes contained in a TDR may be further improved or enhanced by using known protection techniques such as encryption, tokenization and pseudonymization, together with further layers of abstraction, for example by using additional DDIDs to refer to networks, the Internet, intranets and third party computers that may be integrated with, or in communication with, one or more embodiments of the present invention.
In step 3, the level of abstraction is determined in relation to the level of security, anonymity and privacy required and the relevance of a particular operation, activity, process or trait. For example, the system may introduce abstraction by abstracting individual attributes, combinations of attributes, or both, all represented using DDIDs that cannot be understood without access to the Replacement Keys (RKs). As noted above, additional DDIDs, together with known protection techniques (such as encryption, tokenization and pseudonymization) and further abstraction layers, may be used to refer to network, Internet, intranet, and third party computers that may be integrated with or otherwise in communication with one or more embodiments of the present invention, thereby further improving or enhancing the privacy/anonymity and security of the attributes contained in the TDR.
In step 4, the controlling entity selects from the privacy server a desired combination of attributes, related to a desired operation, activity, process or trait, based on the attributes associated with an applicable template. The abstraction module may determine the required attributes; this determination may be controlled by the controlling entity or delegated to another entity as an authority, and the authority may choose, as appropriate, to use the abstraction module to select the attributes directly, to create a template based on the attributes, or to detect the inputs intelligently.
In one example of step 4, in the context of a healthcare study, a hospital acting as the controlling entity may use the abstraction module of the privacy server to obfuscate information about the height, weight and name of the Data Subject before sending it to the research institution.
In step 5, the abstraction module of the privacy server assigns a DDID to each attribute combination of step 4 to form a TDR. These DDIDs may provide various functions including, but not limited to, replacement or simple association. For example, if the hospital as the controlling entity directs the abstraction module to create a TDR with two layers of abstraction, the module can abstract the data attributes by assigning the DDID Ab5 to the Data Subject's weight, the DDID 67h to the Data Subject's height, and the DDID Gw2 to the Data Subject's name; these values cannot be understood without access to the Replacement Keys (RKs). Step 5 also includes retrieving one or more attributes associated with the Data Subject from one or more databases. The DDIDs used in step 5 may be identified as not currently in use or may be selected from expired, previously used DDIDs.
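A minimal sketch, assuming a simple key-value layout, of how the abstraction in step 5 might look in code; the DDID values Ab5, 67h and Gw2 are taken from the example above, while the variable names and data shapes are assumptions made for illustration only.

```python
# Hypothetical sketch of step 5: replace attribute values with DDIDs
# and record the Replacement Keys (RKs) needed to reverse the mapping.
attributes = {"weight": "175lb", "height": "5ft 10in", "name": "J. Smith"}
ddids = {"weight": "Ab5", "height": "67h", "name": "Gw2"}

tdr = [ddids[k] for k in attributes]            # what the recipient sees
replacement_keys = {ddids[k]: attributes[k]     # retained by the privacy server
                    for k in attributes}

print(tdr)               # ['Ab5', '67h', 'Gw2']: meaningless without RKs
print(replacement_keys)  # {'Ab5': '175lb', ...}: held inside the trust circle
```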
In step 6, the TDRs, which consist of attribute combinations and DDIDs, are transmitted by the privacy server through the privacy client to the recipient entity for use in the desired operations, activities, processes or traits associated with the recipient entity. In the above example, the hospital as the controlling entity may deliver a TDR consisting of the Ab5, 67h and Gw2 abstracted data attributes to the research institution as the receiving entity.
In step 7, the receiving entity receives the TDRs (possibly consisting of DDIDs and combinations of attributes related to the desired operations, activities, processes, or traits) through the privacy client. For example, in the above example, the research institution as the receiving entity would receive information for analysis without being given personally identifying information about weight, height or name. Instead, the institution receives Ab5, 67h and Gw2, which cannot be deciphered without access to the relevant RK information. If the intended purpose is to perform big data analysis, receiving the TDRs may be the last step; however, using the TDRs more interactively may involve optional steps 8 through 13.
In optional step 8, the recipient entity interprets the TDRs (possibly consisting of attribute combinations and DDIDs relating to the desired operation, activity, process or trait) via the privacy client and is provided access to the AKs and/or RKs as required to understand the content of the TDRs.
In optional step 9, the privacy client may obtain new data attributes associated with the desired online operations, activities, processes or traits that augment the original TDR data attributes as information in a new TDR format.
In optional step 10, the privacy client may obtain new data attributes (if any) associated with offline activities that relate to the desired online operations, activities, processes or traits, supplementing the original TDR data attributes as new information in TDR format.
In optional step 11, the privacy client transmits the TDRs consisting of attribute combinations and DDIDs associated with the online/offline sessions back to the privacy server. Since the TDRs are transmitted to the privacy server by the privacy client without AKs and/or RKs, they are transmitted in a disassociated and anonymized format, so if someone intercepts the TDRs, they will not receive all the data applicable to the Data Subject or the desired operation, activity, process or trait.
In optional step 12, in one example, re-aggregation of the attribute combinations is performed by the maintenance module using the relationship information between DDIDs and attribute combinations provided by the Association Keys (AKs) and/or Replacement Keys (RKs) residing at the privacy server. In this example, this means that the original or modified TDRs are returned to the privacy server, which modifies or adds new information about the recommended kayaks and paddles to the Data Subject's aggregated data profile.
Upon completion of the above re-aggregation of new data regarding the desired operation, activity, process or trait into the attribute combinations, the DDID may be considered expired in one example and, in optional step 13, reintroduced into the system, reassigned, and used with other attributes, attribute combinations, Data Subjects, operations, activities, processes, traits or data to form new TDRs in the same manner as described above.
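The re-aggregation of steps 12 and 13 might be sketched as follows; the mapping names (association_keys, profiles) and the subject identifier are hypothetical, and a real embodiment would keep the keys inside the privacy server rather than in application code.

```python
# Hypothetical sketch of steps 12-13: the maintenance module uses an
# Association Key to map a returned TDR's DDID back to the Data Subject
# and folds the new attributes into the aggregated data profile.
association_keys = {"Ab5": "subject-042"}   # DDID -> Data Subject, server-side only
profiles = {"subject-042": {"weight": "175lb"}}

returned_tdr = {"ddid": "Ab5",
                "new_attributes": {"recommended": ["kayak", "paddle"]}}

subject = association_keys[returned_tdr["ddid"]]
profiles[subject].update(returned_tdr["new_attributes"])

# The DDID is now expired and may later be reassigned to unrelated data.
del association_keys[returned_tdr["ddid"]]
print(profiles)
```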
Fig. 6B shows how a potential embodiment of the present invention provides dynamic anonymity for data elements contained in one or more databases (whether internal to the system, as shown in fig. 1A, and/or external to the system, as shown in fig. 1B) that are deemed too sensitive to be displayed in an identifiable manner outside an organization: for example, data that directly identifies a Data Subject or a sensitive operation, activity, process and/or trait (direct identifiers), or data that does so indirectly when used in conjunction with other data (quasi-identifiers).
In one potential embodiment of the present invention, the obfuscation of the sensitive data may be performed for a particular computer application requesting data from the subject database(s), at the presentation level of that application, by intercepting requests for sensitive data and replacing the sensitive data with one or more DDIDs. In another potential embodiment of the invention, sensitive data may be obfuscated for one or more computer applications requesting data from the subject database(s) at the database connection level, again by intercepting requests for sensitive data and replacing the sensitive data with one or more DDIDs, as described above.
FIG. 6B shows, in one example, the flow steps implemented by a controlling entity or system to obfuscate sensitive data.
In step 1 in fig. 6B, the privacy server receives requests from privacy clients, which may reside on Data Subject devices, reside on service provider devices, be accessed through a cloud network and reside in the cloud network, or reside on the same computing device as the privacy server, relating to data elements contained in one or more databases (whether the one or more databases are located internally within the system, as shown in fig. 1A, and/or externally to the system, as shown in fig. 1B) that are deemed too sensitive to be displayed in an identifiable manner outside the organization: for example, data that directly identifies a Data Subject or a sensitive operation, activity, process and/or trait (direct identifiers), or data that does so indirectly when used in conjunction with other data (quasi-identifiers). The nature and substance of requests that the privacy server may receive from the privacy client may vary depending on a number of factors, including whether the system is implemented in DRMI, DRMD, or otherwise, and whether the request is related to healthcare, educational, mobile, financial, web, Internet of Things, or other applications, among others.
In step 2, the abstraction module determines the level of abstraction suitable for the level of security, privacy, anonymity and relevance required for the sensitive data elements, consistent with permissions established by the Data Subject or by a trusted party that develops the DDID association policies, and consistent with the data usage/analysis scope allowed by those permissions.
In step 3, one or more DDIDs determined by the abstraction module are sent to the privacy client to dynamically hide sensitive data elements.
In step 4, one or more sensitive data elements are dynamically obfuscated by replacing the data elements with the one or more DDIDs determined by the abstraction module, and the resulting DDIDs are used in place of the sensitive data elements in data communicated outside the organization. In one example of this step, the obfuscation of sensitive data elements occurs for a particular computer application requesting data from the subject database(s), by intercepting requests for sensitive data at the presentation level of the application and replacing the sensitive data with the one or more DDIDs determined by the abstraction module.
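A minimal sketch of the presentation-level interception described in step 4, assuming a row-oriented data layout; the field names, policy set and key store shown are illustrative assumptions only.

```python
# Hypothetical sketch of step 4: intercept a row at the presentation
# level and replace each sensitive element with a DDID before the data
# leaves the organization.
import secrets

SENSITIVE_FIELDS = {"name", "ssn"}      # direct identifiers under the policy
key_store = {}                          # DDID -> original value, kept in the CoT

def mask_row(row):
    masked = {}
    for field, value in row.items():
        if field in SENSITIVE_FIELDS:
            ddid = secrets.token_hex(4)
            key_store[ddid] = value     # key retained for authorized parties
            masked[field] = ddid
        else:
            masked[field] = value
    return masked

print(mask_row({"name": "J. Smith", "ssn": "123-45-6789", "bp": "120/80"}))
```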
In step 5, the keys needed to understand the association between the one or more DDIDs and the obfuscated sensitive data elements are securely stored within a trust circle (CoT).
In step 6, the keys required to learn the association between the one or more DDIDs and the obfuscated sensitive data elements, securely stored in the trust circle (CoT), are provided only to authorized parties. Sensitive data represented by one or more DDIDs is not revealed unless and until the Data Subject, or one or more parties authorized by a trusted party to receive and/or use the underlying sensitive data, requests a key.
In one example of the present disclosure, fig. 7 illustrates an example of processing steps that may be implemented by a recipient entity.
In step 1, the attribute combination selected by the controlling entity, used in conjunction with a DDID as a TDR, is received by the recipient entity through a privacy client that may reside on a Data Subject device, reside on a service provider device, be accessed through a cloud network and reside in the cloud network, or reside on the same computing device as the privacy server, indicating a request regarding a desired operation, activity, process, or trait. For example, in the kayak example above, the e-commerce site recipient entity may receive a TDR relating to a Data Subject's desired operation, activity, process, or trait.
In step 2, the recipient entity interprets the TDRs (possibly consisting of a combination of attributes and DDIDs for the desired online operations, activities, processes or traits) via the privacy client, which provides access to the AKs and/or RKs as needed to learn the content of the TDRs. For example, in the above example, the e-commerce site would access RK information residing on a Data Subject's device, on a service provider device, accessible through and residing in a cloud network, or on the same computing device as the privacy server, learning that the value Ab5 corresponds to the Data Subject's weight, the value 67h to the Data Subject's height, and the value Gw2 to the Data Subject's budget.
In step 3, the receiving entity may use the TDR information it receives to customize its response to the attributes conveyed by the Data Subject, in one example. In the kayak example, this would allow the e-commerce web site to use this information to provide suggestions for the Data Subject to purchase a kayak and paddle.
In step 4, in one example, the privacy client obtains data about online activities performed at the recipient entity that are associated with the attribute combination; the privacy client may reside on the Data Subject device, reside on a service provider device, be accessed through the cloud network and reside in the cloud network, or reside on the same computing device as the privacy server.
In step 5, in one example, the recipient entity obtains data for offline activity (if any) associated with the attribute combination and converts it to online data. In the kayak example, if the Data Subject is also a rewards-program member at a physical store location operated by the e-commerce web site, and chooses to make other preferences known, the receiving entity can combine that offline data with the online component.
In step 6, in one example, the privacy client transmits the offline activity-related data to the privacy server in a disassociated and anonymized format, together with the attribute combinations and DDIDs for the online session.
In step 7, since the DDID components of the TDRs are reintroduced into the system for reassignment and use with other attributes, attribute combinations, Data Subjects, operations, activities, processes, traits or data to form new TDRs in the same manner as described above, a receiving entity may later see the same DDID, but that DDID cannot be connected to any other TDRs associated with the Data Subject or to other previously associated TDRs. For example, the same DDID may be seen again by the e-commerce website the next day or a week later, but attached to different information relating to a completely different Data Subject.
In the second example of fig. 7, a doctor requesting blood pressure information may, as part of step 1, receive via the privacy client a TDR consisting of the most recently recorded systolic and diastolic blood pressure values and the DDID assigned by the privacy server to the Data Subject. As part of steps 2 and 3, the doctor can read the blood pressure information. As part of steps 4 and 5, the doctor may add observations, suggestions, or comments related to the blood pressure that will, as part of step 6, be sent to the privacy server through a privacy client that may reside on the Data Subject device, reside on the service provider device, be accessed through the cloud network and reside in the cloud network, or reside on the same computing device as the privacy server.
FIG. 8 illustrates a process for verifying authority to proceed with an action, activity, process, or trait at a particular time and/or place, according to one embodiment of the invention.
In step 1, in one example, the recipient entity transmits a request to the privacy server, through a privacy client that may reside on a Data Subject device, reside on a service provider device, be accessed over a cloud network, or reside on the same computing device as the privacy server, to confirm whether the undisclosed Data Subject or related party associated with a TDR has the right to engage in an action, activity, process, or trait at a particular time and place. For example, after browsing the kayaks and paddles recommended on the e-commerce site, when the related party is ready to make a purchase, the e-commerce site may query the authentication module of the privacy server to determine whether the related party is authorized to complete the requested transaction.
In step 2, in one example, the authentication module of the privacy server compares the DDID contained in the TDR to a list of authorized DDIDs contained in a database to determine the authorization of the Data Subject or related party to participate in the desired operation, activity, process, or trait at the specified time and/or location. In the kayak example, the authentication module of the privacy server can ensure that the DDIDs being used are still active and authorized, indicating that the Data Subject or related party is authorized to complete the desired transaction.
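The authorization check in step 2 might be sketched as follows; the table layout, and the idea of keying authorization on an action string and an expiry time, are assumptions made for illustration rather than a prescribed design.

```python
# Hypothetical sketch of step 2: the authentication module checks that
# a TDR's DDID is still active and authorized for the requested action
# within the permitted time window.
import time

authorized = {
    "Gw2": {"action": "purchase", "not_after": time.time() + 3600},
}

def is_authorized(ddid, action, now=None):
    now = now or time.time()
    entry = authorized.get(ddid)
    return bool(entry) and entry["action"] == action and now <= entry["not_after"]

print(is_authorized("Gw2", "purchase"))   # True while the DDID is active
print(is_authorized("Gw2", "transfer"))   # False: wrong action
```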
Alternatively, in step 3, in one example, the privacy server may request that the party controlling a privacy client (which may reside on a Data Subject device, reside on a service provider device, be accessible via a cloud network and reside therein, or reside on the same computing device as the privacy server; in this case the e-commerce site) confirm that it is entitled to participate in the desired transaction.
If optional step 3 is invoked, step 4 checks to determine whether the party controlling the privacy client has been verified as authorized, in one example. For example, to avoid fraudulent attempts to obtain information such as usernames, passwords or credit card details by masquerading as a trustworthy entity (also referred to as "phishing"), step 4 may require the e-commerce website to verify, by known authentication techniques, that it is an authorized kayak equipment dealer.
In step 5, in one example, if authentication is obtained, the authentication module of the privacy server transmits authorization status information to the party controlling the privacy client.
In step 6, in one example, the authorization status information is used to allow or deny the desired operation, activity, process, or trait.
Once the authentication function is performed and any optional additional verification steps are completed, the privacy server will, in step 7, send the AK and/or RK information needed to interpret the TDR content through the privacy client, so that the related party can purchase the desired product, with the transaction processed by the receiving entity, e.g. the e-commerce website.
In the second example of fig. 8, a doctor may send a TDR to the privacy server through a privacy client to verify whether a Data Subject who is a patient is authorized to participate in an exploratory study. This will cause the authentication module of the privacy server (as part of step 2) to compare the DDID of the Data Subject in the TDR with the list of authorized DDIDs contained in a database to determine whether the Data Subject is authorized to participate in the study. Alternatively, in step 3, the authentication module of the privacy server may request that the requesting doctor confirm that they are entitled to enroll the Data Subject as a participant in the exploratory study. If optional step 3 is invoked, step 4 checks whether the doctor is authorized, by known confirmation techniques (e.g. password confirmation or multi-factor authentication). If authentication is obtained, the authentication module of the privacy server may transmit authorization status information via the privacy client in step 5; this authorization status may be used to allow or deny the request for the Data Subject to participate in the exploratory study in step 6, and step 7 will provide access to the AK and/or RK key information needed to interpret the TDR content in order to continue the operation.
Fig. 9 illustrates an example of a process for withholding Replacement Key (RK) or Association Key (AK) information, or other protection keys, unless and until verification occurs, according to an embodiment of the present invention. As shown in step 1, in one example, a party controlling a privacy client holding a TDR transmits, through the privacy client (which may reside on a Data Subject device, reside on a service provider device, be accessed through a cloud network, or reside on the same computing device as the privacy server), a request to the authentication module of the privacy server for the AKs and/or RKs, and/or for keys to unlock TDR data attributes protected using other techniques (such as encryption, tokenization, or pseudonymization).
In the kayak example, the data might be sent using various additional steps to protect its transmission; however, the receiving-entity e-commerce site might need a key to unlock and/or associate the three pieces of information about height, weight and budget that the privacy client originally sent to it. In step 2, in one example, the authentication module of the privacy server compares the TDR recipient's attribute combination with the authorized recipient attribute combinations to determine whether the TDR recipient is an authorized recipient. If the authentication module of the privacy server verifies that the TDR recipient's attribute combination matches an authorized recipient attribute combination, the authentication module will transmit to the TDR recipient, through the privacy client, the keys needed to unlock the TDR, as part of step 3.
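A minimal sketch of the verification-before-key-release flow in steps 2 and 3 of fig. 9; the encoding of recipient attribute combinations as tuples, and the key names, are hypothetical choices made for illustration.

```python
# Hypothetical sketch of steps 2-3: release unlock keys only when the
# recipient's attribute combination matches an authorized combination.
authorized_recipients = {("ecommerce-site", "kayak-dealer")}
unlock_keys = {"Ab5": "key-175lb", "67h": "key-5ft10", "Gw2": "key-budget"}

def release_keys(recipient_attrs, requested_ddids):
    if tuple(recipient_attrs) not in authorized_recipients:
        return None                          # verification failed: no keys
    return {d: unlock_keys[d] for d in requested_ddids if d in unlock_keys}

print(release_keys(["ecommerce-site", "kayak-dealer"], ["Ab5", "Gw2"]))
print(release_keys(["unknown-party"], ["Ab5"]))   # None
```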
In a second example of fig. 9, in step 1, the doctor may receive an encrypted, tokenized or pseudonymized TDR containing the requested blood pressure information, and must send the TDR to the authentication module of the privacy server via the privacy client to verify that the doctor is authorized to view the requested information. In step 2, the authentication module of the privacy server may compare the doctor's TDR information to the authorized recipient attribute combinations to determine whether the doctor is an authorized recipient. If the authentication module of the privacy server verifies that the doctor's TDR information matches an authorized recipient attribute combination, the authentication module may transmit to the doctor, through the privacy client, the keys necessary to unlock the applicable protection technique of the encrypted, tokenized or pseudonymized TDR containing the requested blood pressure information.
FIG. 10 illustrates an example of anonymously analyzing the interests of a related party in accordance with an embodiment of the present invention. At step 1, in one example, related parties (RPs) select combinations of attributes (ACs) to be shared with merchants/service providers through privacy clients on mobile and/or wearable devices. For example, rather than utilizing an e-commerce website, the related party may go to the physical location of an outdoor sports store, sharing the same information about height, weight, and budget through a mobile or wearable device.
In step 2, in one example, the privacy server may assign DDID(s) to the attribute combination from the privacy client located on the mobile/wearable/portable device to form TDR(s).
At step 3, in one example, the TDR(s) are transmitted to the merchant/service provider receiving entity by a privacy client residing on the mobile/wearable/portable device. For example, in the kayak example, the store may receive three separate TDR-protected data attributes from a Data Subject's mobile/wearable/portable device via an in-store device, beacon, etc.
In step 4, in one example, the merchant/service provider recipient entity may view the combination of attributes authorized by the related party and transmitted to it by the privacy client on the mobile/wearable/portable device. For example, the store may view the height, weight, and budget of the related party.
In step 5, in one example, the merchant/service provider recipient entity may anonymously provide offers to the Data Subjects and/or the related parties without knowing the identity of the Data Subjects and/or the related parties.
In step 6, in one example, Data Subjects and/or the related parties may respond to those merchant/service provider recipient entity offers that they consider desirable and complete the transaction.
The systems and methods described herein may provide a related party with a way to achieve greater anonymity and improve data privacy/anonymity and security when utilizing one or more communication networks. Without these systems and methods, a third party may be able to obtain the true identity of Data Subjects, or identifying information related to the activity of a related party on or between networks, through network service and/or technology providers, based on the Data Subjects' or related party's activity on the communication network.
Various other methods of providing data security and data privacy/anonymity are disclosed. For example, a method may include the steps or operations of receiving an electronic data element on a computing device; identifying one or more data attributes using the electronic data element; selecting, by a computing device, a DDID; associating the selected DDID with one or more data attributes; and creating a TDR from at least the selected unique DDID and the one or more data attributes.
In one example, the step of selecting a DDID includes generating a unique DDID, or, in another example, accepting or modifying a temporally unique, dynamically changing value to serve as the DDID. In one example, the method may further include causing the association between the selected DDID and the one or more data attributes to expire. In another example, the method may include storing information about the time periods during which the selected unique DDID is associated with different data attributes or attribute combinations in a database accessible to the computing device. In another embodiment, the method may further include re-associating the selected unique DDID with one or more other data attributes after the association between the DDID and the one or more data attributes has expired. In one example, the expiration of a DDID occurs at a predetermined time, or may occur after completion of a predetermined event or activity. In another example, the TDR may only be authorized for use for a given period of time or at a predetermined location. In another example, the method may include changing the unique DDID assigned to one or more data attributes, where the change of the unique DDID may occur randomly or programmatically, or may occur upon completion of a predetermined activity or event.
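The expiration variants described above (expiry at a predetermined time or upon completion of an event, with assignment periods logged for later re-association) might be sketched as follows; the class shape and log structure are illustrative assumptions only.

```python
# Hypothetical sketch: an association expires at a set time OR when the
# activity completes, and each assignment period is logged so that
# authorized re-association remains possible later.
import time

assignment_log = []   # (ddid, attributes, start, end)

class Association:
    def __init__(self, ddid, attributes, ttl_seconds=None):
        self.ddid, self.attributes = ddid, attributes
        self.start = time.time()
        self.deadline = self.start + ttl_seconds if ttl_seconds else None
        self.done = False

    def mark_activity_complete(self):
        self.done = True

    def expired(self):
        timed_out = self.deadline is not None and time.time() >= self.deadline
        if timed_out or self.done:
            assignment_log.append((self.ddid, self.attributes,
                                   self.start, time.time()))
            return True
        return False

a = Association("Ab5", {"weight": "175lb"}, ttl_seconds=3600)
a.mark_activity_complete()
print(a.expired())        # True: event-based expiration before the timeout
print(assignment_log)     # the record an authorized party can re-associate
```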
Another method of facilitating network transactions is disclosed herein. In one example, the method may include the operations of receiving, at a privacy server, a request from a client device to conduct an activity over a network; determining which data attributes in a database are needed to complete the requested activity; creating a DDID; associating the DDID with the determined data attributes to create a combined TDR; making the combined TDR accessible to at least one network device to perform or initiate the requested activity; receiving a modified TDR containing other information related to the performed activity; and storing the modified TDR in a memory database. In another method implementation, disclosed herein is a method of providing controlled distribution of electronic information. In one example, the method may include receiving, at a privacy control module, a request to perform an activity over a network; selecting attributes of a Data Subject, located in a location accessible to the privacy control module, that are determined to be necessary to satisfy the request, wherein other attributes of the Data Subject that are not determined to be necessary are not selected; assigning a DDID to the selected attributes and the Data Subject(s) to which they apply using an abstraction module of the privacy control module, wherein the DDID does not reveal the unselected attributes; recording the time at which the unique DDID is assigned; receiving an indication that the requested activity is complete; receiving at the privacy control module the unique DDID and the determined attributes, together with the Data Subject(s) to which they apply, wherein the attributes are modified to contain information about the performed activity; and recording the completion time of the performed activity.
In one example, the method may further include assigning an additional DDID to one or more selected attributes or Data Subjects. In another example, the method may include using the recorded times, unique DDIDs and data attributes to re-associate them with the true identity of the Data Subject. The method may further include reassigning the unique DDID to other data attributes and recording the time of reassignment of the unique DDID.
Another method of improving data security is also disclosed. In one example, the method can include associating a Data Subject with at least one attribute; and associating a DDID with the at least one attribute to create a TDR, wherein the TDR limits access to Data Subject attributes to only those required to perform a given operation. In one example, the method may include assigning an Association Key (AK) and/or a Replacement Key (RK) to the TDR, wherein access to the AK and/or the RK is necessary for authorized access to the TDR. In another example, the method may further include causing the association between the DDID and the at least one attribute to expire, wherein the expiration occurs at a predetermined time and/or upon completion of a predetermined event or activity. In another embodiment, the method may include re-associating the DDID with at least one different attribute after the association between the DDID and the at least one attribute has expired. The method may further include storing, in a database, information about one or more time periods during which the DDID is associated with different data attributes or attribute combinations.
Various methods may be used to associate DDIDs with different combinations of attributes to form TDRs. DDIDs may be of fixed or variable length and may be composed of various code elements (e.g., numbers, characters, upper and lower case letters, and/or special characters). Furthermore, the DDIDs may be changed at random or at set intervals. In one example, only authorized parties having access to the Association Keys (AKs) and/or Replacement Keys (RKs) maintained by the maintenance module can determine which attribute combinations are properly associated with other attribute combinations, Data Subjects, related parties, or aggregated data profiles. A site can still track and utilize the attribute combinations contained in TDRs in real time, but it must recognize that their lifetime is limited and that the associated DDIDs may later be used for different operations, activities, processes, traits, attribute combinations, Data Subjects and/or related parties.
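A sketch of DDID generation consistent with the formats described above (fixed or variable length, mixed character classes, rotation at set or random intervals); the alphabet, length bounds and rotation policy shown are arbitrary illustrative choices, not requirements.

```python
# Hypothetical sketch: generate DDIDs of variable length from a mixed
# alphabet and schedule rotation at set or randomized intervals.
import secrets
import string
import random

ALPHABET = string.ascii_letters + string.digits + "!@#$%"

def new_ddid(min_len=6, max_len=12):
    length = random.randint(min_len, max_len)        # variable length
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

def next_rotation(base_seconds=300, jitter=True):
    # Rotate on a set interval, or add jitter for random intervals.
    return base_seconds + (random.randint(0, base_seconds) if jitter else 0)

print(new_ddid())          # e.g. 'q7!Rk2Mx'
print(next_rotation())     # seconds until this DDID is replaced
```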
The transmitted attribute combinations may include explicit data, Personally Identifiable Information (PII), behavioral data, derived data, enriched data, or other data, singly or in various combinations.
Example A.
In a first example, the system may be configured to authorize the related party, as controlling entity, to specify which of its attribute combinations are released to which other parties. Example A illustrates how the system handles information generated by a related party ("associator X" or "RP X") engaged in four different online sessions over three different communication networks ("CNs") with two different service providers ("SPs") from different industries. Figs. 11-20 illustrate this first example, showing how information is managed at different stages and under different circumstances. It is understood that figs. 11-20 are provided by way of example only and that embodiments of the present invention may be practiced otherwise than as illustrated in figs. 11-20.
In the example illustrated in fig. 11, associator X sends attribute combination A (explicit data) to a website service provider, such as Pandora Radio ("SP 1"), through online Internet access ("communication network 1" or "CN 1"). Attribute combination A is assigned the identifier code DDID 1 by the abstraction module of the privacy server ("PS") for a limited period of time. In fig. 11, the combination of DDID 1 and attribute combination A represents the TDR of associator X within that limited period of time.
In the example illustrated in fig. 12, when interacting with SP1, associator X generates activity information (behavioral data) tracked by SP1, which is sent as attribute combination A1 back to the privacy server by a privacy client possibly residing on a Data Subject device, residing on a service provider device, accessible through a cloud network and residing in the cloud network, or residing on the same computing device as the privacy server. The maintenance module of the privacy server may maintain information about the various DDID codes assigned to each attribute combination at different points in time, as well as the CNs and SPs associated with each attribute combination. In fig. 12, the combination of DDID 1, attribute combination A, and attribute combination A1 represents the TDR of associator X for the limited time during which DDID 1 is associated with attribute combinations A and A1. DDID 1 may be reassigned to a new TDR after the association of new data regarding the planned action, activity or process with the attribute combination is complete. The combinations of DDIDs and attribute combinations shown in figs. 13 to 20 likewise represent TDRs for the time periods during which the DDIDs and attribute combinations are associated.
In the example illustrated in fig. 13, associator X sends another attribute combination E (explicit data) to Pandora Radio ("SP 1") through online Internet access ("CN 1"). Attribute combination E is assigned the identifier code DDID 4 by the privacy server ("PS") for a limited period of time, and the identifier code is transmitted along with attribute combination E to SP1 by the privacy client via CN 1.
In the example illustrated in fig. 14, when interacting with SP1, associator X generates activity information (behavioral data) that SP1 tracks, which is sent as attribute combination E1 back to the abstraction module of the privacy server via the privacy client (possibly located on the Data Subject device, on a service provider device, accessible through the cloud network and residing therein, or residing on the same computing device as the privacy server).
In the example illustrated in fig. 15, associator X transmits attribute combination Q (explicit data) to another version of SP1, the Pandora Radio mobile application, accessed through mobile device communication ("communication network 2" or "CN 2"). The privacy server assigns the identifier code DDID 9 to attribute combination Q for a limited period of time, and the identifier code is passed as a TDR by the privacy client to SP1 via CN2, along with attribute combination Q.
In the example illustrated in fig. 16, when interacting with SP1, associator X generates activity information (behavioral data) tracked by SP1, which is sent as attribute combination Q1 back to the abstraction module of the privacy server via the privacy client (possibly located on the Data Subject device, on a service provider device, accessible through the cloud network and residing therein, or residing on the same computing device as the privacy server).
In the example illustrated in fig. 17, associator X sends attribute combination P (behavioral data), through a privacy client that may reside on a Data Subject device, reside on a service provider device, be accessed through and reside in a cloud network, or reside on the same computing device as the privacy server, to a service provider ("SP 2") providing monitoring services related to athletic activity, such as FitBit, through wearable device communication ("communication network 3" or "CN 3"). For a limited period of time, attribute combination P is assigned the identifier code DDID 7 by the PS, and the identifier code is transmitted by the privacy client to SP2 as a TDR via CN3, together with attribute combination P.
In the example illustrated in fig. 18, when interacting with SP2, SP2 calculates the percentage of the required daily calorie consumption (derived data) completed by associator X, and this information is sent as attribute combination P1 back to the privacy server by a privacy client that may be located on the Data Subject device, on a service provider device, accessed through and residing in the cloud network, or on the same computing device as the privacy server.
The example illustrated in fig. 19 shows the attribute combinations each SP can access, as transmitted and retransmitted through privacy clients that may reside on the Data Subject device, reside on a service provider device, be accessed through the cloud network and reside therein, or reside on the same computing device as the privacy server. FIG. 19 emphasizes that the sessions appear unrelated to the SPs: without access to the secure Association Keys of the maintenance module, the SPs do not, in one example, have the information needed to determine associations between attribute combinations. They do, however, have access to the attribute combinations created during each limited time period, delimited by the changing DDIDs. For example, SP1 does not know that DDID 1 and DDID 9 both belong to associator X, who accessed two different versions of the website maintained by SP1: one via the online Internet and the other via a mobile device.
In the example illustrated in FIG. 20, the data accessible to associator X includes all information sent to and returned from the SPs. Fig. 20 emphasizes that, by accessing the secure Association Keys maintained by the maintenance module, associator X as the controlling entity may, in one example, have the information needed to determine the associations between the attribute combinations for aggregation and normalization purposes. In addition, associator X may use the data itself, or may direct the maintenance module acting as a data facilitator to perform further analysis and processing of the data in a secure environment. The new attribute combination Z indicates that the maintenance module, at the request of associator X, generated new data ("enriched data") by comparing all data related to DDID 1, DDID 9, DDID 4 and DDID 7 to predict other music selections that associator X may like and that would help achieve the desired daily calorie consumption. Attribute combination Z may encompass the list of music selections the prediction produces, as well as the data associated with the various other DDIDs. Attribute combination Z will not be communicated to any party (SP 1, SP 2 or other) unless directed by associator X as the controlling entity. When associator X wishes to share attribute combination Z, it will, in one example, be assigned a DDID code and then transmitted to the recipient designated by associator X. This new, more comprehensive attribute combination is disclosed only if and when associator X decides to distribute it to a receiving entity.
Example B.
In a second example illustrated in figs. 21-22, the system is configured such that a service provider ("SP 3") is the controlling entity, authorized to specify which parties receive select combinations of attributes associated with SP3's customers. SP3 may use the system to better protect the identity and privacy/anonymity of its customers. This includes reducing the likelihood of consumer or government objections due to potential loss of privacy or anonymity, as well as improving market penetration, usage and acceptance of SP3 products. It is understood that figs. 21-22 are provided by way of example only and that embodiments of the present invention may be practiced otherwise than as illustrated in figs. 21-22.
In the example illustrated in figs. 21 and 22, SP3 provides that each input technology vendor, such as a website company that helps capture order information ("ITV"), each processing technology vendor, such as an online electronic payment processor ("PTV"), and each output technology vendor, such as a party that electronically delivers selected products to customers ("OTV"), has only the attribute combinations needed to perform the services assigned to it. No single vendor can obtain personally identifiable information ("PII") revealing the identity of an SP3 customer.
Fig. 23 illustrates an implementation example of dynamically created, modifiable, and re-assignable DDIDs in the field of Internet behavioral advertising. Without certain embodiments of the present invention, Internet behavioral advertising is based primarily on an advertising network placing cookies in a user's web browser and building a profile of the user by tracking the websites the user visits that carry advertisements from the same advertising network. In this way, the network can build a profile of the websites the user visits, which can be augmented with data from other sources to associate details about the user with the cookie information.
In general, when a user first accesses a website ("website 1") in fig. 23, the website: (i) delivers content from the website to the user's browser; (ii) sends a cookie to the user's browser; and (iii) directs the user's browser to retrieve advertising content from an advertising network ("ad network 1") to be placed on the website. The cookie delivered in (ii) above is referred to as a "first-party cookie" because it comes from the website the user chose to visit. First-party cookies help save "state" information, such as login progress, items in a shopping basket, and other relevant information that may improve the user experience. When the user's browser requests advertisement information from ad network 1 as part of (iii) above, ad network 1 transmits an advertisement to the user's browser, which displays it as part of website 1. If this is the first time the user's browser has requested advertising content from ad network 1, ad network 1 will also send a cookie to the user's browser. This cookie is referred to as a "third-party cookie" because it does not come from the web page the user intended to access. If ad network 1 has not previously tracked the user, ad network 1 will deliver an advertisement based on conventional targeting techniques (e.g., based on the nature of the content on website 1). As the user visits more and more websites carrying advertisements placed by ad network 1, ad network 1 (via the third-party cookie it sent to the user's browser) builds a profile of user behavioral data from the pages visited, the time spent on each page, and other variables such as information from the user's social networks, online or offline purchasing behavior, psychographic and demographic information, and further user information gathered by ad network 1 itself or integrated from third-party data providers. Based on the profile of the user created and managed by ad network 1, ad network 1 may display advertisements aligned with the content it determines the user is most interested in.
This traditional way of tracking users from one website to another, and from one page to another, by third-party advertising networks raises concerns about privacy/anonymity. In response, the "Do Not Track" (DNT) initiative was started through the World Wide Web Consortium (W3C). W3C is an international organization whose member organizations, full-time staff and the public work together to establish Web standards for adoption by regulatory agencies, private organizations and business entities. The major browsers (i.e., IE, Chrome, Firefox, Safari) now provide a DNT option; however, there is no consensus on how a recipient website should respond to a DNT preference.
Nevertheless, some providers have taken the position that DNT applies to third-party website tracking, rather than first-party website tracking. According to the W3C draft standard, if a first party receives a DNT:1 signal, the first party may still collect and use data normally. This includes the ability to customize content, services, and advertisements based on the first-party experience. Under the proposal, the first party must not share data about the network interaction with third parties that could not collect the data themselves; however, data about the transaction may be shared with a service provider acting on behalf of the first party. In the "Do Not Track" case, when the user visits a website ("website 1"), the user's browser sends a notification to website 1 stating that the user does not wish to be tracked; website 1 sends the first-party cookie and content to the user's browser, together with the address from which the browser should request the advertisement to be placed on website 1 from the advertising network ("ad network 1"). Ad network 1 receives the Do Not Track request and sends the advertising content to the user's browser, but no third-party cookie is placed on the user's browser. Advertisements are provided to the user according to conventional targeting methods, which may include, but are not limited to, targeting advertisements to the content of the page (i.e., contextual targeting). Under this Do Not Track implementation, as described above, there is little or no restriction on the first party (except that the first party must not share data about DNT users' network interactions with third parties that could not collect the data themselves).
By contrast, with embodiments of the present invention, Do Not Track can protect the privacy/anonymity of Data Subjects or related parties while still delivering content and targeted advertising to support the primary revenue model of the Internet. FIG. 23 illustrates one of many potential implementations of the present invention for advertising services.
In step 1 of fig. 23, the Data Subject or related party visits website 1 for the first time, and the browser sends a "Do Not Track" header to website 1. If the Data Subject or related party so chooses, the browser may also send a TDR to website 1, enabling it to include "state" information to improve the Data Subject's or related party's experience there. Website 1 then sends the content to the Data Subject's or related party's browser.
In step 2, in one example, the Data Subject's or related party's browser requests an advertisement from ad network 1 (with or without a TDR) for placement on website 1. If no TDR is sent, the Data Subject or related party will receive a conventionally targeted advertisement from ad network 1 based on the page content. If a TDR is sent, ad network 1 can deliver highly targeted advertisements to the Data Subject's or related party's browser according to the attributes contained in the TDR. In this regard, TDR-based ads served by ad network 1 may be more relevant to Data Subjects or related parties, with greater accuracy than conventionally served ads or ads served using behavioral profile information inferred (through general inference) by the advertising network.
In step 3, a process similar to steps 1 and 2 occurs when the Data Subject or related party visits another site ("website N"). When TDRs are included, the website content and the advertising content will be highly targeted; however, ad network 1 still has no ability to collect or track information about Data Subjects or related parties. Furthermore, the TDR may be contained in the information sent to the website or ad network 1 by a privacy client residing in the browser or by other mechanisms.
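The browser-side exchange in steps 1 through 3 might be sketched as follows; the request shape, header handling and ad-selection logic are hypothetical simplifications of what a privacy client and ad network might do, not a specified protocol.

```python
# Hypothetical sketch of steps 1-2: the browser sends a Do Not Track
# header plus only the TDR attributes the Data Subject chose to share;
# no cookie or persistent identifier is included.
ad_request = {
    "headers": {"DNT": "1"},                 # do-not-track preference
    "tdr": {
        "ddid": "q7Rk2M",                    # time-limited, reassignable
        "attributes": {"interest": "kayaking", "budget": "low"},
    },
}

def serve_ad(request):
    attrs = request.get("tdr", {}).get("attributes")
    if attrs:                                # targeted, but identity-free
        return f"ad for {attrs['interest']} gear"
    return "contextual ad"                   # fallback when no TDR is sent

print(serve_ad(ad_request))
```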
In summary, under existing ad targeting techniques, users can be tracked wherever they go online, yet are still served advertisements based on aggregated data from which the advertising network infers particular users' preferences. This results in no user privacy/anonymity and only low-to-medium advertisement relevance. By combining aspects of the present invention with "Do Not Track," users are empowered to decide which information is sent to which websites and advertising networks. This enhances not only privacy/anonymity but also advertisement relevance to the user, increasing sales and return on investment for merchants.
Figs. 24 and 25 illustrate the potential benefits of some embodiments of the present invention in the healthcare field. Fig. 24 highlights how time-unique and purpose-limited data representations (TDRs) can be used, in one potential embodiment of the present invention, to protect the confidentiality and privacy/anonymity of user and patient Personally Identifiable Information (PII) and/or Personal Health Information (PHI) in a healthcare information system. With the benefit of one embodiment of the present invention, the medical system may generate real-time TDRs that do not reveal sensitive PII/PHI, without losing context or access to such information. In step 1.0, information including PII/PHI related to the registration process can be received as input to the system. To protect the privacy/anonymity of sensitive PII/PHI information, the output of the registration process may replace the PII/PHI user information [A] with TDRs (comprised of dynamically changing and re-assignable DDIDs standing in for the PII/PHI information) without exposing sensitive PII/PHI data. This user data (including TDRs instead of PII/PHI information) is then entered to create, augment or alter the user data file at D1 without revealing the PII/PHI information [B]. Similarly, the PII/PHI information output from the step 2.0 scheduling process can be replaced with TDRs (comprised of dynamically changing and re-assignable DDIDs standing in for the PII/PHI information), again without exposing sensitive PII/PHI data. This clinical data (including TDRs instead of PII/PHI information) is then entered to create, augment or alter the clinical data file at D2 without revealing the PII/PHI information [C]. The clinical data from D2 (after the clinical information search process in step 3.0) can then be combined with the user data from D1 as input to the step 4.0 user profile search process, with the PII/PHI information revealed only by accessing and using the time-unique and purpose-limited TDRs. The PII/PHI user information component output by the step 4.0 user profile search process can likewise be replaced with TDRs (comprised of dynamically changing and re-assignable DDIDs standing in for the PII/PHI information) without exposing sensitive PII/PHI data. Finally, the user data at D1 (including TDRs instead of PII/PHI information) can be used as input to the appointment record browsing process at step 5.0, with the PII/PHI information again revealed only by accessing and using the time-unique and purpose-limited TDRs. When an authorized healthcare or ancillary service requires access to detailed information from a user data file and/or clinical data file, the Association Keys (AKs) and/or Replacement Keys (RKs) may be used to identify the relevant sensitive PII/PHI data associated with the applicable TDRs and DDIDs.
In the example illustrated in fig. 25, dynamically created, modifiable, and re-assignable TDRs (comprised of dynamically changing and re-assignable DDIDs standing in for PII/PHI information) may be used to protect the confidentiality and privacy/anonymity of the PII/PHI contained in a patient medical record. Fig. 25 illustrates the use of multiple levels of abstraction to implement the present invention, establishing "privacy rings" that provide only the level of identifying information needed to perform a desired service or enable a function. In this example, each of the provider, state, multi-state, and country levels receives an attribute combination appropriate for its respective permitted uses, with time-unique and purpose-limited data representations (TDRs) protecting the confidentiality and privacy/anonymity of user and patient Personally Identifiable Information (PII) and/or Personal Health Information (PHI). With the benefit of one embodiment of the present invention, healthcare-related information may use TDRs that do not reveal sensitive PII/PHI, without losing context or access to such information. Information may be provided at each successive level (from the lowest provider level up to the highest country level) in which the PII/PHI information is represented only by time-unique and purpose-limited TDRs (comprised of dynamically changing and re-assignable DDIDs, without the PII/PHI information being displayed), so that sensitive PII/PHI data is not exposed. When access to PII/PHI information is required for a properly authorized use at a particular level, Association Keys (AKs) and/or Replacement Keys (RKs) can be used to identify the relevant sensitive PII/PHI data and DDIDs associated with the applicable TDR. Furthermore, since the DDIDs change over time, and information related to new DDIDs can reflect new and additional information without revealing the Data Subject/patient identity, the DDIDs can help facilitate longitudinal studies. This can be achieved by using DDIDs to separate "context" or "meta" from the data needed to perform the analysis. The analysis results may be shared with trusted parties/agents who apply the "context" or "meta" to the analyzed data. There are numerous participants in the healthcare industry, many of which use different data structures. Dynamic anonymity can support the collection of different data from different sources in different formats, normalize the information into a common structure, and achieve the separation of "context" or "meta" from "content" by dynamically assigning, reassigning and tracking DDIDs, thereby enabling efficient research and analysis without revealing identifying information. This approach may allow data from individual Data Subjects/patients to be linked together from different sources without raising authorization issues, because no individual can be identified in the process. Only within the trust circle ("CoT") identified in fig. 1C-1 can the identifying information be accessed, by accessing a mapping engine associated with the individual.
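The "privacy rings" of fig. 25 might be sketched as a per-level attribute filter, as follows; the level names follow the figure, while the field names, DDID values and permitted sets are illustrative assumptions only.

```python
# Hypothetical sketch of fig. 25: each level receives only the attribute
# combination permitted for its use; identifying fields remain DDIDs
# unless AK/RK access is separately authorized.
record = {"name": "DDID:Gw2", "diagnosis": "DDID:Xp1",
          "zip3": "803", "age_band": "40-49"}

ALLOWED = {
    "provider":    {"name", "diagnosis", "zip3", "age_band"},  # with key access
    "state":       {"diagnosis", "zip3", "age_band"},
    "multi-state": {"diagnosis", "age_band"},
    "country":     {"age_band"},
}

def view_for(level):
    return {k: v for k, v in record.items() if k in ALLOWED[level]}

print(view_for("state"))     # no name field at all; diagnosis still a DDID
print(view_for("country"))   # only the coarsest attribute
```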
For example, in healthcare/life sciences research today, HIPAA provides a method of de-identifying Personal Health Information (PHI); once PHI is de-identified, it is no longer subject to HIPAA regulation and may be used for any purpose. However, concerns remain about the sufficiency of existing HIPAA de-identification methods, the potential risk of re-identifying individuals, responsibility for unauthorized re-identification of de-identified data, and limits on the use of de-identified data. By addressing these concerns, embodiments of the present invention may greatly reduce the regulatory, compliance, and privacy/anonymity review and engineering burdens borne by companies, while providing a more complete data set for healthcare-related research and development, and may help covered entities and business associates reduce compliance issues under the HIPAA final rule.
The example of fig. 26 demonstrates some of the potential benefits of the invention in the field of mobile/wearable/portable device communications. Mobile/wearable/portable applications implementing the systems disclosed herein, or aspects thereof, may provide a controlling entity with control over the timing and level of participation in location- and time-sensitive applications. The controlling entity may use the functionality of the abstraction module of the privacy server to control the extent to which attribute combinations are shared with third parties, whether on an anonymous or a personally identifiable basis. For example, a static identifier associated with a mobile/wearable/portable device in an existing system may allow mobile/wearable/portable application providers and other third parties to aggregate attribute combination data related to usage of the device. Use of the present invention may prevent application providers and other third parties from aggregating such attribute combinations and, by substituting TDRs and/or DDIDs for static identifiers, may further enable a mobile/wearable/portable device to use mobile applications that require access to geographic location information (e.g., directions or mapping applications) without revealing the identity of the device or its user.
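The following short Python sketch illustrates the idea of replacing a static device identifier with a per-request DDID for a mapping query. The request_directions function and the lambda standing in for a privacy-server call are hypothetical names invented for this illustration; the sketch only shows why two queries bearing different DDIDs cannot be linked by the application provider.

```python
import secrets

def request_directions(issue_ddid, location, destination):
    """Send a mapping query that carries a one-time DDID rather than a
    static device identifier, so queries cannot be linked across sessions."""
    return {"id": issue_ddid(), "from": location, "to": destination}

issue = lambda: secrets.token_hex(8)  # stand-in for a privacy-server call
q1 = request_directions(issue, (40.75, -73.99), (40.71, -74.01))
q2 = request_directions(issue, (40.75, -73.99), (40.71, -74.01))
assert q1["id"] != q2["id"]  # the provider cannot aggregate the two queries
```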
Fig. 27 is a simplified functional block diagram example illustrating an example programmable device 2700 that may implement one or more processes, methods, steps, features or aspects described herein. Programmable device 2700 may include one or more communication circuits 2710, memory 2720, storage device 2730, processor 2740, control entity interface 2750, display 2760, and communication bus 2770. Processor 2740 may be any suitable programmable control device or other processing unit and may control the operation of many of the functions performed by programmable device 2700. Processor 2740 may drive display 2760 and may receive control entity input from control entity interface 2750. The embedded processor provides a versatile and robust programmable control device that can be used to perform the disclosed techniques.
Storage 2730 may store combinations of attributes, software (e.g., for implementing various functions on device 2700), preference information, device profile information, and any other suitable data. Storage 2730 may include one or more storage media for tangibly recording data and program instructions, including, for example, a hard disk drive or solid state memory, permanent storage such as ROM, semi-permanent storage such as RAM, or cache. The program instructions may comprise a software implementation encoded in any desired computer programming language.
The memory 2720 may include one or more different types of memory modules that may be used to perform device functions. The memory 2720 may include cache, ROM, and/or RAM, for example. The communication bus 2770 may provide a path for transferring data to, from, or between at least the memory 2720, the storage device 2730, and the processor 2740.
Although referred to as a bus, the communication bus 2770 is not limited to any particular data transfer technique. Control entity interface 2750 may allow a control entity to interact with programmable device 2700. For example, control entity interface 2750 may take a variety of forms, such as a button, keyboard, dial, click wheel, mouse, touch or voice command screen, or any other form of input or user interface.
In one embodiment, programmable device 2700 may be a programmable device capable of processing data. For example, programmable device 2700 may be any identifiable device other than a smartphone, tablet, laptop, or desktop that has communication capabilities and is embedded with a sensor, identification device, or machine-readable identifier (a "smart device"), or it may be a smartphone, tablet, laptop, desktop, or other suitable personal device.
Fig. 28 illustrates a block diagram of a system 2800 of networked devices that may implement one or more processes, methods, steps, features, or aspects described herein. For example, the privacy client described above may be implemented on any smart device (i.e., a wearable, removable, or non-removable smart device) 2810, smartphone 2820, tablet 2830, notebook 2840, or desktop 2850. Each of these devices is connected through one or more networks 2860 to a privacy server 2870, which is coupled to a database 2880 for storing relevant attribute combinations, TDRs, Data Subjects, aggregated Data Subject profiles, time periods/stamps (via Time Keys (TKs) or other means), Association Keys (AKs), Replacement Keys (RKs), and their related information. Database 2880 may be any desired form of data storage, including structured databases and unstructured flat files. The privacy server 2870 may also provide, in a different database (not shown), remote storage for attribute combinations, TDRs, Data Subjects, aggregated Data Subject profiles, time periods/stamps (via Time Keys (TKs) or otherwise), Association Keys (AKs), and Replacement Keys (RKs) and their related information that have been or will be passed to the privacy clients on devices 2810, 2820, 2830, 2840, 2850, on database 2880, or on other suitable devices.
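A record in database 2880 might conceptually resemble the following Python sketch. The field names (ddid, time_key, association_key, replacement_key) are illustrative assumptions only; the specification does not prescribe a schema.

```python
# Hypothetical layout of one TDR record in database 2880; all field names
# are assumptions made for illustration, not a prescribed schema.
tdr_record = {
    "ddid": "9f3a1c2b44d0e6a7",                        # time-unique identifier
    "attributes": {"zip": "80302", "visit": "outpatient"},
    "time_key": {"valid_from": "2014-01-01T00:00:00Z",  # TK: validity window
                 "valid_to":   "2014-01-01T01:00:00Z"},
    "association_key": "AK-5521",   # AK: re-links the DDID to its Data Subject
    "replacement_key": "RK-0083",   # RK: re-links replaced data elements
}

def find_by_time_key(records, instant):
    """Return records whose Time Key window contains the given instant."""
    return [r for r in records
            if r["time_key"]["valid_from"] <= instant < r["time_key"]["valid_to"]]

print(find_by_time_key([tdr_record], "2014-01-01T00:30:00Z"))
```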
Although a single network 2860 is shown in fig. 28, network 2860 may comprise a plurality of interconnected networks, and privacy server 2870 may be connected to each privacy client on devices 2810, 2820, 2830, 2840, 2850, or other suitable devices through a different network 2860. Network 2860 may be any type of network, including a local area network, a wide area network, or the global Internet.
Embodiments of the invention may provide privacy and security applications for various industries, environments, and technologies, including but not limited to online transactions, healthcare, education, card payment or processing, information security, transportation, supply chain management, manufacturing resource planning, geolocation, mobile or cellular systems, energy and smart grid technologies, the Internet, and defense and intelligence technologies and programs.
When used in an online transaction environment, embodiments of the present invention may provide consumers with the ability to control the collection and use of their data, while giving data custodians the ability to ensure that third parties involved in data communication or dissemination receive only the information necessary to fulfill their particular roles. This may enhance consumer confidence and potentially enable people to continue to enjoy the benefits of the "Internet of Things" described above without relinquishing the rights of Data Subjects or other interested parties, and without subjecting the industry to inappropriate regulation.
In the medical field, embodiments of the present invention can help maintain the effectiveness of existing medical privacy laws by improving de-identification. In addition, by increasing the protection of data confidentiality, embodiments of the present invention may increase the likelihood that patients will consent to research, allowing individual consumers and society as a whole to benefit from the analysis of medical big data.
As another example, when used in an educational environment, embodiments of the present invention may provide educators and administrators with a secure means to access and use data associated with students, analyzed individually or in the aggregate across a school system, so that schools benefit from enhanced data analysis without compromising student privacy/anonymity.
In the national security setting, the invention may be used, for example, by a governmental national security organization to analyze limited telephone records for individual telecommunications users without any personally identifying information being provided to the security organization. For example, the time of a call, the "called to" and "called from" numbers (in de-identified form), the duration of the call, and the zip codes of the "called to" and "called from" numbers may be disclosed without disclosing the actual telephone numbers used to make or receive the call or personal information about the parties making or receiving the call. In this example, the security organization may analyze the limited phone records to determine whether any suspicious activity has occurred, at which point a search warrant or other judicial approval may be sought in order to receive other, more detailed attributes of the phone records. In this manner, embodiments of the present invention can further national security interests while maintaining the privacy/anonymity of phone users unless and until judicial review requires disclosure of additional, more detailed attributes.
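A minimal Python sketch of such a limited call-record view follows. The field names and the number_to_ddid helper are hypothetical assumptions; the point is that analysts can correlate calling patterns through stable DDIDs while the underlying numbers stay inside the circle of trust.

```python
import secrets

def limited_view(call, number_to_ddid):
    """Release only non-identifying attributes, with DDIDs standing in
    for the actual phone numbers."""
    return {
        "time": call["time"],
        "duration_s": call["duration_s"],
        "from_zip": call["from_zip"],
        "to_zip": call["to_zip"],
        "from": number_to_ddid(call["from_number"]),
        "to": number_to_ddid(call["to_number"]),
    }

ddids = {}
def number_to_ddid(number):
    # A per-analysis DDID lets analysts spot repeated contacts without
    # ever seeing the underlying number.
    return ddids.setdefault(number, secrets.token_hex(6))

call = {"time": "2014-03-01T02:14:00Z", "duration_s": 37,
        "from_zip": "22101", "to_zip": "20505",
        "from_number": "+15551230001", "to_number": "+15551230002"}
print(limited_view(call, number_to_ddid))
```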
Examples of the invention
The following examples relate to further embodiments. Example 1 is a system, comprising: a communication interface for transmitting data over a network; a memory having computer program code stored therein; and one or more processing units operatively coupled to the memory and configured to execute instructions in the computer program code that cause the one or more processing units to: generate one or more dynamically changing, temporally unique identifiers; receive, over the network from a first client, a first request for a generated identifier related to a first data subject; in response to the first request, associate a first generated identifier with the first data subject; generate first time period data, wherein the first time period data includes information defining a first time period in which the first data subject can be identified using the first generated identifier; store the first generated identifier and the first time period data in the memory; and transmit the first generated identifier to the first client over the network.
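Purely as an illustration of the flow recited in example 1 (generate, associate, define a validity period, store, transmit), here is a minimal Python sketch. The PrivacyServer class and its handle_request method are invented names; real embodiments would add authentication, persistence, and network transport.

```python
import secrets
import time

class PrivacyServer:
    def __init__(self):
        self.store = {}  # ddid -> (data_subject, valid_from, valid_to)

    def handle_request(self, data_subject, period_seconds=600):
        ddid = secrets.token_hex(8)              # dynamically generated
        now = time.time()
        # First time period data: when this DDID identifies the subject.
        self.store[ddid] = (data_subject, now, now + period_seconds)
        return ddid                              # transmitted to the client

server = PrivacyServer()
ddid = server.handle_request("data-subject-001")
print(ddid, server.store[ddid])
```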
Example 2 includes the subject matter of example 1, wherein the instructions in the computer program code further cause the one or more processing units to: one or more data attributes are associated with the first generated identifier.
Example 3 includes the subject matter of example 2, wherein at least one of the one or more data attributes associated with the first generated identifier is related to an action, activity, process, purpose, identity, or characteristic of the first data subject.
Example 4 includes the subject matter of example 3, wherein the instructions in the computer program code further cause the one or more processing units to: receive, over the network from a second client, a second request for at least one of the one or more data attributes associated with the first generated identifier during the first time period; determine that the second request is authorized; and grant the second client, over the network, the ability to determine the requested one or more data attributes associated with the first generated identifier during the first time period.
Example 5 includes the subject matter of example 1, wherein the instructions in the computer program code further cause the one or more processing units to: associate the first generated identifier with a second data subject during the first time period or a second time period.
Example 6 includes the subject matter of example 1, wherein the instructions in the computer program code further cause the one or more processing units to: associate a second generated identifier with the first data subject in response to the first request; generate second time period data, wherein the second time period data includes information defining a second time period during which the second generated identifier is usable to identify the first data subject; store the second generated identifier and the second time period data in the memory; and transmit the second generated identifier to the first client over the network.
Example 7 includes the subject matter of example 6, wherein the instructions in the computer program code further cause the one or more processing units to: associate the one or more data attributes with the second generated identifier, wherein at least one of the one or more data attributes associated with the second generated identifier is related to an action, activity, process, purpose, identity, or characteristic of the first data subject.
Example 8 includes the subject matter of example 7, wherein at least one of the one or more data attributes associated with the first generated identifier is different from at least one of the one or more data attributes associated with the second generated identifier.
Example 9 includes the subject matter of example 3, wherein the instructions in the computer program code further cause the one or more processing units to: associate the first generated identifier with a second data subject over a second time period, wherein at least one of the one or more data attributes associated with the first generated identifier during the first time period is the same as one of the one or more data attributes associated with the first generated identifier during the second time period.
Example 10 includes the subject matter of example 1, wherein the instructions in the computer program code further cause the one or more processing units to: receive, from a second client over the network, a second identifier associated with a second data subject; associate the second identifier with the second data subject; generate second time period data, wherein the second time period data includes information defining a second time period in which the second data subject can be identified using the second identifier; and store the second identifier and the second time period data in the memory.
Example 11 includes the subject matter of example 4, wherein the instructions in the computer program code further cause the one or more processing units to: revoke, over the network, the ability of the second client to determine the requested one or more data attributes associated with the first generated identifier during a second time period.
Example 12 is a non-transitory computer-readable medium comprising computer-executable instructions stored thereon to cause one or more processing units to: generate one or more dynamically changing, temporally unique identifiers; receive, from a first client over a network, a first request for a generated identifier relating to a first data subject; associate, in response to the first request, a first generated identifier with the first data subject; generate first time period data, wherein the first time period data includes information defining a first time period in which the first data subject can be identified using the first generated identifier; store the first generated identifier and the first time period data in a memory; and transmit the first generated identifier to the first client over the network.
Example 13 includes the subject matter of example 12, wherein the instructions further cause the one or more processing units to: one or more data attributes are associated with the first generated identifier.
Example 14 includes the subject matter of example 13, wherein at least one of the one or more data attributes associated with the first generated identifier relates to an action, activity, process, purpose, identity, or characteristic of the first data subject.
Example 15 includes the subject matter of example 14, wherein the instructions further cause the one or more processing units to: receiving, over the network from the second client, a second request for at least one of the one or more data attributes associated with the first generated identifier over the first time period; determining that the request is authorized; and granting, over the network, the second client the ability to determine the requested one or more data attributes associated with the first generated identifier within the first time period.
Example 16 includes the subject matter of example 12, wherein the instructions further cause the one or more processing units to: associate the first generated identifier with a second data subject during a second time period.
Example 17 includes the subject matter of example 12, wherein the instructions further cause the one or more processing units to: associate the first generated identifier with a second data subject within the first time period.
Example 18 includes the subject matter of example 12, wherein the instructions further cause the one or more processing units to: in response to the first request, associate a second generated identifier with the first data subject; generate second time period data, wherein the second time period data includes information defining a second time period in which the first data subject can be identified using the second generated identifier; store the second generated identifier and the second time period data in the memory; and send the second generated identifier to the first client over the network.
Example 19 includes the subject matter of example 18, wherein the first time period and the second time period do not overlap.
Example 20 includes the subject matter of example 18, wherein the first time period and the second time period at least partially overlap.
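Examples 19 and 20 differ only in whether the two validity periods are disjoint or overlap, a distinction a simple interval test captures. The following Python sketch is an illustrative assumption of how such a check might look; the examples do not prescribe any particular representation of time period data.

```python
def periods_overlap(p1, p2):
    """True if two (start, end) time periods overlap at least partially."""
    return p1[0] < p2[1] and p2[0] < p1[1]

first = (0, 600)
second = (600, 1200)   # example 19: the periods do not overlap
assert not periods_overlap(first, second)
second = (300, 900)    # example 20: the periods at least partially overlap
assert periods_overlap(first, second)
```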
Example 21 includes the subject matter of example 18, wherein the instructions further cause the one or more processing units to: associate one or more data attributes with the second generated identifier, wherein at least one of the one or more data attributes associated with the second generated identifier relates to an action, activity, process, purpose, identity, or characteristic of the first data subject.
Example 22 includes the subject matter of example 21, wherein at least one of the one or more data attributes associated with the first generated identifier is different from at least one of the one or more data attributes associated with the second generated identifier.
Example 23 includes the subject matter of example 14, wherein the instructions further cause the one or more processing units to: associate the first generated identifier with a second data subject in a second time period, wherein at least one of the one or more data attributes associated with the first generated identifier in the first time period is the same as one of the one or more data attributes associated with the first generated identifier in the second time period.
Example 24 includes the subject matter of example 12, wherein the instructions further cause the one or more processing units to: receive, over the network, a second identifier associated with a second data subject from a second client; associate the second identifier with the second data subject; generate second time period data, wherein the second time period data includes information defining a second time period during which the second identifier is usable to identify the second data subject; and store the second identifier and the second time period data in the memory.
Example 25 includes the subject matter of example 24, wherein the second identifier comprises an HTTP cookie.
Example 26 includes the subject matter of example 12, wherein the instructions further cause the one or more processing units to: receive, over the network, a second request from a second client for the identity of the first data subject associated with the first generated identifier during the first time period; determine that the second request is authorized; and grant the second client, over the network, the ability to determine the identity of the first data subject during the first time period.
Example 27 includes the subject matter of example 26, wherein the instructions further cause the one or more processing units to: revoke, over the network, the ability of the second client to determine the identity of the first data subject during the first time period.
Example 28 includes the subject matter of example 15, wherein the instructions further cause the one or more processing units to: revoke, over the network, the ability of the second client to determine the requested one or more data attributes associated with the first generated identifier during a second time period.
Example 29 includes the subject matter of example 13, wherein the first generated identifier is not mathematically derived from any of the one or more data attributes associated with the first generated identifier.
Example 30 includes the subject matter of example 12, wherein the first generated identifier comprises a primary identifier of the first data subject.
Example 31 is a system, comprising: a communication interface for transmitting data over a network; a memory having computer program code stored therein; and one or more processing units operatively coupled to the memory and configured to execute instructions in the computer program code that cause the one or more processing units to: generate a first temporally unique identifier; associate the first temporally unique identifier with a first data subject; associate one or more data attributes with the first temporally unique identifier; generate first time period data, wherein the first time period data comprises information defining a first time period during which the first temporally unique identifier is usable to identify the first data subject and retrieve the associated one or more data attributes; store the first temporally unique identifier, the one or more data attributes, and the first time period data in a memory; and send the first temporally unique identifier and the one or more data attributes to a first client over the network.
Example 32 includes the subject matter of example 31, wherein the instructions to generate the first temporally unique identifier are executed based on at least one of: time, purpose, and location.
Example 33 includes the subject matter of example 31, wherein the instructions in the computer program code further cause the one or more processing units to: terminate the ability of the first temporally unique identifier to identify the first data subject and retrieve the associated one or more data attributes.
Example 34 includes the subject matter of example 33, wherein the instructions to terminate the ability of the first temporally unique identifier to identify the first data subject and retrieve the associated one or more data attributes are executed based on at least one of: time, purpose, and location.
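Examples 32 through 34 recite identifiers whose issuance and termination are driven by time, purpose, or location. The following Python sketch is one hypothetical way to model that; the class, field names, and thresholds are assumptions made for illustration only.

```python
import secrets
import time

class TemporalIdentifier:
    def __init__(self, purpose, location, ttl_seconds):
        self.value = secrets.token_hex(8)     # generated per time/purpose/location
        self.purpose = purpose
        self.location = location
        self.expires = time.time() + ttl_seconds
        self.terminated = False

    def usable(self, purpose, location):
        return (not self.terminated and time.time() < self.expires
                and purpose == self.purpose and location == self.location)

    def terminate(self):
        # Example 33: end the identifier's ability to identify the subject.
        self.terminated = True

tid = TemporalIdentifier(purpose="billing", location="clinic-12", ttl_seconds=60)
assert tid.usable("billing", "clinic-12")
assert not tid.usable("research", "clinic-12")   # wrong purpose
tid.terminate()
assert not tid.usable("billing", "clinic-12")    # terminated
```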
Example 35 includes the subject matter of example 31, wherein at least one of the one or more data attributes associated with the first temporally unique identifier is related to an action, activity, process, purpose, identity, or characteristic of the first data subject.
Example 36 includes the subject matter of example 31, wherein the instructions in the computer program code further cause the one or more processing units to: associate the first temporally unique identifier with a second data subject during a second time period.
Example 37 includes the subject matter of example 31, wherein the instructions in the computer program code further cause the one or more processing units to: associate the first temporally unique identifier with a second data subject within the first time period.
Example 38 includes the subject matter of example 31, wherein the instructions in the computer program code further cause the one or more processing units to: receive, over the network from a second client, a first request for the identity of the first data subject associated with the first temporally unique identifier during the first time period; determine that the first request is authorized; and grant the second client, over the network, the ability to determine the identity of the first data subject during the first time period.
Example 39 includes the subject matter of example 38, wherein the instructions in the computer program code further cause the one or more processing units to: revoke, over the network, the ability of the second client to determine the identity of the first data subject during the first time period.
Example 40 includes the subject matter of example 31, wherein the instructions in the computer program code further cause the one or more processing units to: receive, over the network from a second client, a first request for one or more data attributes associated with the first temporally unique identifier during the first time period; determine that the first request is authorized; and grant the second client, over the network, the ability to determine the requested one or more data attributes associated with the first temporally unique identifier during the first time period.
Example 41 includes the subject matter of example 40, wherein the instructions in the computer program code further cause the one or more processing units to: revoke, over the network, the ability of the second client to determine the requested one or more data attributes associated with the first temporally unique identifier during the first time period.
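Examples 38 through 41 recite a grant-then-revoke access pattern. The sketch below shows one hypothetical way such grants might be tracked; the AccessController class and its tuple-based grant set are assumptions for illustration, not the claimed implementation.

```python
class AccessController:
    def __init__(self):
        self.grants = set()  # (client, identifier, period) capabilities

    def grant(self, client, identifier, period, authorized):
        if not authorized:
            raise PermissionError("request not authorized")
        self.grants.add((client, identifier, period))

    def revoke(self, client, identifier, period):
        self.grants.discard((client, identifier, period))

    def can_resolve(self, client, identifier, period):
        return (client, identifier, period) in self.grants

acl = AccessController()
acl.grant("client-2", "9f3a1c2b", "period-1", authorized=True)
assert acl.can_resolve("client-2", "9f3a1c2b", "period-1")
acl.revoke("client-2", "9f3a1c2b", "period-1")
assert not acl.can_resolve("client-2", "9f3a1c2b", "period-1")
```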
Example 42 includes the subject matter of example 31, wherein the first time-unique identifier is not mathematically derived from any of the one or more data attributes associated with the first time-unique identifier.
Example 43 includes the subject matter of example 31, wherein the first temporally unique identifier comprises a primary identifier of the first data subject.
Example 44 is a non-transitory computer-readable medium comprising computer-executable instructions stored thereon that cause one or more processing units to perform operations comprising: generating a first temporally unique identifier; associating the first temporally unique identifier with a first data subject; associating one or more data attributes with the first temporally unique identifier; generating first time period data, wherein the first time period data comprises information defining a first time period during which the first temporally unique identifier is usable to identify the first data subject and retrieve the associated one or more data attributes; storing in a memory the first temporally unique identifier, the one or more data attributes, and the first time period data; and sending the first temporally unique identifier and the one or more data attributes to a first client over the network.
Example 45 includes the subject matter of example 44, wherein the instructions to generate the first temporally unique identifier are executed based on at least one of: time, purpose, and location.
Example 46 includes the subject matter of example 44, wherein the instructions further cause the one or more processing units to: terminate the ability of the first temporally unique identifier to identify the first data subject and retrieve the associated one or more data attributes.
Example 47 includes the subject matter of example 46, wherein the instructions to terminate the ability of the first temporally unique identifier to identify the first data subject and retrieve the associated one or more data attributes are executed based on at least one of: time, purpose, and location.
Example 48 includes the subject matter of example 44, wherein at least one of the one or more data attributes associated with the first temporally unique identifier is related to an action, activity, process, purpose, identity, or characteristic of the first data subject.
Example 49 includes the subject matter of example 44, wherein the instructions further cause the one or more processing units to: associate the first temporally unique identifier with a second data subject during a second time period.
Example 50 includes the subject matter of example 44, wherein the instructions further cause the one or more processing units to: associate the first temporally unique identifier with a second data subject within the first time period.
Example 51 includes the subject matter of example 44, wherein the instructions further cause the one or more processing units to: receive, over the network from a second client, a first request for the identity of the first data subject associated with the first temporally unique identifier during the first time period; determine that the first request is authorized; and grant the second client, over the network, the ability to determine the identity of the first data subject during the first time period.
Example 52 includes the subject matter of example 51, wherein the instructions further cause the one or more processing units to: revoke, over the network, the ability of the second client to determine the identity of the first data subject during the first time period.
Example 53 includes the subject matter of example 44, wherein the instructions further cause the one or more processing units to: receive, over the network from a second client, a first request for one or more data attributes associated with the first temporally unique identifier during the first time period; determine that the first request is authorized; and grant the second client, over the network, the ability to determine the requested one or more data attributes associated with the first temporally unique identifier during the first time period.
Example 54 includes the subject matter of example 53, wherein the instructions further cause the one or more processing units to: revoke, over the network, the ability of the second client to determine the requested one or more data attributes associated with the first temporally unique identifier during the first time period.
Example 55 includes the subject matter of example 44, wherein the first time-unique identifier is not mathematically derived from any of the one or more data attributes associated with the first time-unique identifier.
Example 56 includes the subject matter of example 44, wherein the first temporally unique identifier comprises a primary identifier of the first data subject.
Example 57 is an apparatus, comprising: a user interface; a communication interface for transmitting data over a network; a memory having computer program code stored therein; and one or more processing units operatively coupled to the memory and configured to execute instructions in the computer program code that cause the one or more processing units to: request, from a first privacy server over the network, a first temporally unique identifier; associate the first temporally unique identifier with a first data subject that is a user of the device; associate one or more data attributes with the first temporally unique identifier; generate first time period data, wherein the first time period data comprises information defining a first time period within which the first temporally unique identifier is usable to identify the first data subject and retrieve the associated one or more data attributes; store the first temporally unique identifier, the one or more data attributes, and the first time period data in the memory; and, in response to determining that a first condition is satisfied, transmit the first temporally unique identifier, the first time period data, and the one or more data attributes to the first privacy server over the network.
Example 58 includes the subject matter of example 57, wherein determining that the first condition has been met comprises at least one of: determining that a predetermined amount of time has elapsed; determining that a flexible amount of time has elapsed; determining that the purpose of the first temporally unique identifier has expired; or determining that the location of the first data subject has changed.
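The four triggering conditions of example 58 lend themselves to a simple predicate, sketched below in Python. The state dictionary, its field names, and the thresholds are assumptions made only to illustrate the logic.

```python
import time

def first_condition_met(state, now=None):
    """Return True when any of the example-58 conditions holds."""
    now = now if now is not None else time.time()
    return (now - state["issued_at"] >= state["fixed_ttl"]        # predetermined time
            or now >= state["flexible_deadline"]                  # flexible time
            or state["purpose_expired"]                           # purpose ended
            or state["location"] != state["issued_location"])     # subject moved

state = {"issued_at": time.time() - 120, "fixed_ttl": 60,
         "flexible_deadline": time.time() + 600,
         "purpose_expired": False,
         "location": "zone-B", "issued_location": "zone-A"}
print(first_condition_met(state))  # True: time elapsed and location changed
```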
Example 59 includes the subject matter of example 57, wherein the instructions in the computer program code further cause the one or more processing units to: one or more data attributes associated with the first time-unique identifier are modified.
Example 60 includes the subject matter of example 57, wherein the instructions in the computer program code further cause the one or more processing units to: use of the first time unique identifier is tracked.
Example 61 includes the subject matter of example 57, wherein the instructions in the computer program code further cause the one or more processing units to: revoke the ability of the first temporally unique identifier to identify the first data subject and retrieve the associated one or more data attributes.
Example 62 includes the subject matter of example 57, wherein the device is located on a same computing device as the privacy server.
Example 63 includes the subject matter of example 57, wherein the instructions in the computer program code further cause the one or more processing units to: in response to a change in the first temporally unique identifier, the first time period data, or the one or more data attributes, transmit at least one of the first temporally unique identifier, the first time period data, and the one or more data attributes over the network to one or more client devices that have registered with the first privacy server to be synchronized with the device.
Example 64 includes the subject matter of example 57, wherein the first time unique identifier, the first time period data, and the one or more data attributes are transmitted over the network to the first privacy server in the form of an HTTP cookie.
Example 65 includes the subject matter of example 57, wherein the first time-unique identifier is not mathematically derived from any of the one or more data attributes associated with the first time-unique identifier.
Example 66 includes the subject matter of example 57, wherein the first temporally unique identifier comprises a primary identifier of the first data subject.
Example 67 is a non-transitory computer-readable medium comprising computer-executable instructions stored thereon to cause one or more processing units to: request a first temporally unique identifier from a first privacy server over a network; associate the first temporally unique identifier with a first data subject that is a user of a first client device; associate one or more data attributes with the first temporally unique identifier; generate first time period data, wherein the first time period data comprises information defining a first time period in which the first data subject can be identified and the associated one or more data attributes retrieved using the first temporally unique identifier; store, in a memory of the first client device, the first temporally unique identifier, the one or more data attributes, and the first time period data; and, in response to determining that a first condition is satisfied, transmit the first temporally unique identifier, the first time period data, and the one or more data attributes to the first privacy server over the network.
Example 68 includes the subject matter of example 67, wherein determining that the first condition has been satisfied comprises at least one of: determining that a predetermined amount of time has elapsed; determining that a flexible amount of time has elapsed; determining that the purpose of the first temporally unique identifier has expired; or determining that the location of the first data subject has changed.
Example 69 includes the subject matter of example 67, wherein the instructions further cause the one or more processing units to: one or more data attributes associated with the first temporally unique identifier are modified.
Example 70 includes the subject matter of example 67, wherein the instructions further cause the one or more processing units to: use of the first time unique identifier is tracked.
Example 71 includes the subject matter of example 67, wherein the instructions further cause the one or more processing units to: revoke the ability of the first temporally unique identifier to identify the first data subject and retrieve the associated one or more data attributes.
Example 72 includes the subject matter of example 67, wherein the first client device is located on the same computing device as the privacy server.
Example 73 includes the subject matter of example 67, wherein the instructions further cause the one or more processing units to: in response to a change in the first temporally unique identifier, the first time period data, or the one or more data attributes, transmit at least one of the first temporally unique identifier, the first time period data, and the one or more data attributes over the network to one or more client devices that have registered with the first privacy server to be synchronized with the first client device.
Example 74 includes the subject matter of example 67, wherein the first time unique identifier, the first time period data, and the one or more data attributes are transmitted over the network to the first privacy server in the form of an HTTP cookie.
Example 75 includes the subject matter of example 67, wherein the first time-unique identifier is not mathematically derived from any of the one or more data attributes associated with the first time-unique identifier.
Example 76 includes the subject matter of example 67, wherein the first temporally unique identifier comprises a primary identifier of the first data subject.
Example 77 is an apparatus, comprising: a user interface; a communication interface for transmitting data over a network; a memory having computer program code stored therein; and one or more processing units operatively coupled to the memory and configured to execute instructions in the computer program code that cause the one or more processing units to: obtain, over the network, a first temporally unique identifier from a first privacy server, wherein the first temporally unique identifier is associated at the first privacy server, during a first time period, with a first data subject that is a user of the device; associate one or more data attributes with the first temporally unique identifier; generate first time period data, wherein the first time period data comprises information defining the first time period in which the first data subject may be identified and the associated one or more data attributes retrieved using the first temporally unique identifier; store the first temporally unique identifier, the one or more data attributes, and the first time period data in the memory; transmit, over the network, the first temporally unique identifier, the first time period data, and the one or more data attributes to the first privacy server; and receive, over the network, a second temporally unique identifier from the first privacy server, wherein the second temporally unique identifier is associated at the first privacy server, during a second time period, with the first data subject and the one or more data attributes.
Example 78 includes the subject matter of example 77, wherein the instructions in the computer program code that cause the one or more processing units to receive, over the network, the second temporally unique identifier from the first privacy server are executed in response to a determination that a first condition has been met.
Example 79 includes the subject matter of example 78, wherein determining that the first condition has been satisfied comprises at least one of: determining that a predetermined amount of time has elapsed; determining that a flexible amount of time has elapsed; determining that the purpose of the first temporally unique identifier has expired; or determining that the location of the first data subject has changed.
Example 80 includes the subject matter of example 77, wherein the instructions in the computer program code further cause the one or more processing units to: one or more data attributes associated with the first temporally unique identifier are modified.
Example 81 includes the subject matter of example 77, wherein the instructions in the computer program code further cause the one or more processing units to: use of the first time unique identifier is tracked.
Example 82 includes the subject matter of example 77, wherein the instructions in the computer program code further cause the one or more processing units to: revoke the ability of the first temporally unique identifier to identify the first data subject and retrieve the associated one or more data attributes.
Example 83 includes the subject matter of example 77, wherein the instructions in the computer program code further cause the one or more processing units to: request confirmation from the first privacy server whether the identity or one or more data attributes of the first data subject may be revealed to a first requestor; and, in response to receiving confirmation from the first privacy server that the identity or the one or more data attributes of the first data subject may be revealed to the first requestor, send the identity or the one or more data attributes of the first data subject to the first requestor.
Example 84 includes the subject matter of example 83, wherein the requested confirmation further includes information as to whether an identity or one or more data attributes and a time period or location of the first data subject may be revealed to the first requestor for the particular request.
Example 85 includes the subject matter of example 83, wherein the requested confirmation further includes information as to whether an identity or one or more data attributes and a time period or location of the first data subject may be revealed to the first requestor for the particular request.
Example 86 includes the subject matter of example 84, wherein the requested confirmation further includes information as to whether an identity or one or more data attributes and a time period or location of the first data subject may be revealed to the first requestor for the particular request.
Example 87 is a system, comprising: a communication interface for transmitting data over a network; a memory having computer program code stored therein; and one or more processing units operatively coupled to the memory and configured to execute instructions in the computer program code that cause the one or more processing units to: generate one or more dynamically changing, temporally unique identifiers; receive, over the network from a first data subject, a first request for a generated dynamically changing, temporally unique identifier relating to an attribute of the first data subject; in response to the first request, associate a first generated dynamically changing, temporally unique identifier with the attribute of the first data subject; convert a value of the first generated dynamically changing, temporally unique identifier into a first non-decryptable form, wherein a first key is usable to convert the first non-decryptable form back into a first view of the first generated dynamically changing, temporally unique identifier, wherein a second key is usable to convert the first non-decryptable form back into a second view of the first generated dynamically changing, temporally unique identifier, wherein the first key is different from the second key, and wherein the first view is different from the second view; store the first generated dynamically changing, temporally unique identifier, the first key, the second key, and the first non-decryptable form in the memory; and send the first non-decryptable form to the first data subject over the network.
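The following Python sketch illustrates the idea recited in example 87: one obscured form, two different keys, two different views. Here the "keys" are capability tokens held by a hypothetical key service; this is a data-model illustration under stated assumptions, not production cryptography and not the claimed implementation.

```python
import secrets

class KeyService:
    def __init__(self):
        self._views = {}  # (key, obscured_form) -> view

    def obscure(self, detailed_view, coarse_view):
        form = secrets.token_hex(8)          # the non-decryptable form
        key1, key2 = secrets.token_hex(8), secrets.token_hex(8)
        self._views[(key1, form)] = detailed_view   # first view (more detail)
        self._views[(key2, form)] = coarse_view     # second view (less detail)
        return form, key1, key2

    def reveal(self, key, form):
        return self._views[(key, form)]

svc = KeyService()
form, k1, k2 = svc.obscure({"age": 34, "zip": "80302"}, {"age": "30-39"})
print(svc.reveal(k1, form))  # detailed view for highly trusted parties
print(svc.reveal(k2, form))  # generalized view for everyone else
```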
Example 88 includes the subject matter of example 87, wherein the first view provides more detail than the second view.
Example 89 includes the subject matter of example 87, wherein the first non-decryptable form comprises encrypted text.
Example 90 includes the subject matter of example 87, wherein the instructions in the computer program code further include instructions to cause the one or more processing units to also associate the first generated dynamically changing, temporally unique identifier with an attribute of a second data subject.
Example 91 includes the subject matter of example 90, wherein the instructions that cause the one or more processing units to also associate the first generated dynamically changing, temporally unique identifier with the attribute of the second data subject perform at least one of the following: associating at a time different from when the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject; associating at a physical or virtual location different from where the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject; and associating for a purpose different from that for which the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject.
Example 92 includes the subject matter of example 87, wherein the instructions in the computer program code further include instructions to cause the one or more processing units to: associate a second generated dynamically changing, temporally unique identifier with the attribute of the first data subject.
Example 93 includes the subject matter of example 92, wherein the instructions in the computer program code that cause the one or more processing units to associate the second generated dynamically changing, temporally unique identifier with the attribute of the first data subject perform at least one of the following: associating at a time different from when the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject; associating at a physical or virtual location different from where the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject; and associating for a purpose different from that for which the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject.
Example 94 is a non-transitory computer-readable medium comprising computer-executable instructions stored thereon to cause one or more processing units to: generate one or more dynamically changing, temporally unique identifiers; receive, over the network from a first data subject, a first request for a generated dynamically changing, temporally unique identifier relating to an attribute of the first data subject; in response to the first request, associate a first generated dynamically changing, temporally unique identifier with the attribute of the first data subject; convert a value of the first generated dynamically changing, temporally unique identifier into a first non-decryptable form, wherein a first key is usable to convert the first non-decryptable form back into a first view of the first generated dynamically changing, temporally unique identifier, wherein a second key is usable to convert the first non-decryptable form back into a second view of the first generated dynamically changing, temporally unique identifier, wherein the first key is different from the second key, and wherein the first view is different from the second view; store the first generated dynamically changing, temporally unique identifier, the first key, the second key, and the first non-decryptable form in a memory; and send the first non-decryptable form to the first data subject over the network.
Example 95 includes the subject matter of example 94, wherein the first view provides more detail than the second view.
Example 96 includes the subject matter of example 94, wherein the first non-decryptable form comprises encrypted text.
Example 97 includes the subject matter of example 94, wherein the instructions further comprise instructions to cause the one or more processing units to further associate the first generated dynamically changing, temporally unique identifier with an attribute of the second data subject.
Example 98 includes the subject matter of example 97, wherein the instructions that cause the one or more processing units to further associate the first generated dynamically changing, temporally unique identifier with the attribute of the second data subject perform at least one of the following: associating at a time different from when the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject; associating at a physical or virtual location different from where the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject; and associating for a purpose different from that for which the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject.
Example 99 includes the subject matter of example 94, wherein the instructions further comprise instructions to cause the one or more processing units to: associate a second generated dynamically changing, temporally unique identifier with the attribute of the first data subject.
Example 100 includes the subject matter of example 99, wherein the instructions that cause the one or more processing units to associate the second generated dynamically changing, temporally unique identifier with the attribute of the first data subject perform at least one of the following: associating at a time different from when the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject; associating at a physical or virtual location different from where the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject; and associating for a purpose different from that for which the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject.
Example 101 is a computer-implemented method, comprising: generating one or more dynamically changing, temporally unique identifiers; receiving, over a network from a first data subject, a first request for a generated dynamically changing, temporally unique identifier relating to an attribute of the first data subject; in response to the first request, associating a first generated dynamically changing, temporally unique identifier with the attribute of the first data subject; converting a value of the first generated dynamically changing, temporally unique identifier into a first non-decryptable form, wherein a first key is usable to convert the first non-decryptable form back into a first view of the first generated dynamically changing, temporally unique identifier, wherein a second key is usable to convert the first non-decryptable form back into a second view of the first generated dynamically changing, temporally unique identifier, wherein the first key is different from the second key, and wherein the first view is different from the second view; storing the first generated dynamically changing, temporally unique identifier, the first key, the second key, and the first non-decryptable form in a memory; and sending the first non-decryptable form to the first data subject over the network.
Example 102 includes the subject matter of example 101, wherein the first view provides more detail than the second view.
Example 103 includes the subject matter of example 101, further comprising associating the first generated dynamically changing temporally unique identifier with an attribute of the second data subject.
Example 104 includes the subject matter of example 103, wherein the act of associating the first generated dynamically changing, temporally unique identifier with the attribute of the second data subject is performed in at least one of the following ways: at a time different from when the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject; at a physical or virtual location different from where the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject; and for a purpose different from that for which the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject.
Example 105 includes the subject matter of example 101, further comprising: associating a second generated dynamically changing, temporally unique identifier with the attribute of the first data subject.
Example 106 includes the subject matter of example 105, wherein the act of associating the second generated dynamically changing, temporally unique identifier with the attribute of the first data subject is performed in at least one of the following ways: at a time different from when the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject; at a physical or virtual location different from where the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject; and for a purpose different from that for which the first generated dynamically changing, temporally unique identifier was associated with the attribute of the first data subject.
Example 107 is a system, comprising: a communication interface for transmitting data over a network; a memory having computer program code stored therein; one or more data sources; and one or more processing units operatively coupled to the memory and configured to execute instructions in the computer program code that cause the one or more processing units to: obtain, from each of one or more data sources, data belonging to a first plurality of data subjects; generate a first dynamically changing, temporally unique identifier for a first data subject of the first plurality of data subjects, wherein the first data subject is in each of a first data source and a second data source of the one or more data sources; generate one or more second dynamically changing, temporally unique identifiers corresponding to one or more quasi-identifiers in each of the first data source and the second data source, wherein each quasi-identifier has a value; receive, over the network, a first request for values of one or more quasi-identifiers in the first data source; receive, over the network, a second request for values of one or more quasi-identifiers in the second data source; convert the values obtained from the first request into one or more third dynamically changing, temporally unique identifiers; convert the values obtained from the second request into one or more fourth dynamically changing, temporally unique identifiers; store, in the memory, the first dynamically changing, temporally unique identifier, the one or more second dynamically changing, temporally unique identifiers, the one or more third dynamically changing, temporally unique identifiers, and the one or more fourth dynamically changing, temporally unique identifiers; and transmit the first dynamically changing, temporally unique identifier, the one or more second dynamically changing, temporally unique identifiers, the one or more third dynamically changing, temporally unique identifiers, and the one or more fourth dynamically changing, temporally unique identifiers over the network.
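A short Python sketch of the idea in example 107 follows: the same data subject appearing in two data sources receives one replacement identifier (an R-DDID, per example 108), while quasi-identifier values are converted into association identifiers (A-DDIDs, per example 109) that encode cohorts rather than raw values. The bucket boundaries and helper names are assumptions made for illustration only.

```python
import secrets

r_ddid = secrets.token_hex(8)        # same R-DDID for the subject in both sources

_age_buckets = {}
def a_ddid_for_age(age):
    """Convert a quasi-identifier value into a cohort-level A-DDID."""
    bucket = f"{(age // 10) * 10}-{(age // 10) * 10 + 9}"   # numeric grouping
    return _age_buckets.setdefault(bucket, secrets.token_hex(6))

source_1 = {"subject": r_ddid, "age": a_ddid_for_age(34)}
source_2 = {"subject": r_ddid, "age": a_ddid_for_age(36)}
# Records can be joined on the R-DDID, and ages analyzed by cohort, without
# exposing the subject's identity or exact age.
assert source_1["subject"] == source_2["subject"]
assert source_1["age"] == source_2["age"]    # both fall in the 30-39 bucket
```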
Example 108 includes the subject matter of example 107, wherein the first dynamically changing, temporally unique identifier comprises a replacement DDID (R-DDID).
Example 109 includes the subject matter of example 108, wherein the one or more third dynamically changing, temporally unique identifiers comprise association DDIDs (A-DDIDs).
Example 110 includes the subject matter of example 107, wherein the R-DDID includes a particular value.
Example 111 includes the subject matter of example 107, wherein each A-DDID includes a particular value.
Example 112 includes the subject matter of example 109, wherein the instructions further cause the one or more processing units to: convert the R-DDID to a first view of the R-DDID using a first key; and convert the R-DDID to a second view of the R-DDID using a second key, wherein the first key is different from the second key.
Example 113 includes the subject matter of example 109, wherein the instructions further cause the one or more processing units to: convert a first one of the A-DDIDs to a third view of the first one of the A-DDIDs using a third key; and convert the first one of the A-DDIDs to a fourth view of the first one of the A-DDIDs using a fourth key, wherein the third key is different from the fourth key, and the third view is different from the fourth view.
Example 114 includes the subject matter of example 107, wherein a first one of the second dynamically changing time-unique identifiers has a same value in the first data source and the second data source.
Example 115 includes the subject matter of example 107, wherein at least one of the one or more third dynamically changing time-unique identifiers comprises a first non-decryptable form.
Example 116 includes the subject matter of example 107, wherein at least one of the one or more fourth dynamically changing time-unique identifiers comprises a second difficult-to-decrypt form.
Example 117 includes the subject matter of example 115, wherein the first non-decryptable form comprises encrypted data.
Example 118 includes the subject matter of example 116, wherein the first non-decryptable form comprises encrypted data.
Example 119 includes the subject matter of example 107, wherein at least one of the one or more data sources comprises a particular subset, population, or cohort of data subjects.
Example 120 includes the subject matter of example 107, wherein each of the one or more data sources comprises data belonging to a particular plurality of data subjects during a particular time period.
Example 121 includes the subject matter of example 109, wherein at least one of the one or more A-DDIDs comprises one of: a numeric grouping or a categorical grouping.
Example 122 includes the subject matter of example 109, wherein the one or more A-DDIDs include at least one of: a discrete value or a set of discrete values.
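As a hedged illustration of an A-DDID carrying a numeric grouping or range rather than a raw value (examples 121 and 122), the sketch below buckets a numeric quasi-identifier into a range label and assigns each distinct label its own identifier; the bucket width and label format are arbitrary choices made here for illustration:

    def numeric_bucket(value: int, width: int) -> str:
        # Map a numeric quasi-identifier to a range label, e.g. 37 -> "30-39".
        lo = (value // width) * width
        return f"{lo}-{lo + width - 1}"

    ages = [23, 37, 38, 61]
    labels = [numeric_bucket(a, 10) for a in ages]
    # Each distinct label (a numeric grouping) receives its own A-DDID, so
    # analytics can correlate cohorts without exposing the raw ages.
    a_ddids = {label: f"A-{i:04d}" for i, label in enumerate(sorted(set(labels)))}
    print(labels)
    print(a_ddids)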
Example 123 is a non-transitory computer-readable medium comprising computer-executable instructions stored thereon to cause one or more processing units to: obtaining data relating to a first plurality of data bodies from each of one or more data sources; generating a first dynamically changing, temporally unique identifier for a first data body of the first plurality of data bodies, wherein the first data body is in each of a first data source and a second data source of the one or more data sources; generating one or more second dynamically changing, temporally unique identifiers corresponding to the one or more quasi-identifiers in each of the first data source and the second data source, wherein each quasi-identifier has a value; receiving, over a network, a first request for values of one or more quasi-identifiers in the first data source; receiving, over the network, a second request for values of one or more quasi-identifiers in the second data source; converting the values obtained from the first request into one or more third dynamically changing, temporally unique identifiers; converting the values obtained from the second request into one or more fourth dynamically changing, temporally unique identifiers; storing, in a memory: the first dynamically changing, temporally unique identifier; the one or more second dynamically changing, temporally unique identifiers; the one or more third dynamically changing, temporally unique identifiers; and the one or more fourth dynamically changing, temporally unique identifiers; and transmitting, over the network: the first dynamically changing, temporally unique identifier; the one or more second dynamically changing, temporally unique identifiers; the one or more third dynamically changing, temporally unique identifiers; and the one or more fourth dynamically changing, temporally unique identifiers.
Example 124 includes the subject matter of example 123, wherein the first dynamically changing, temporally unique identifier comprises a Replacement DDID (R-DDID).
Example 125 includes the subject matter of example 124, wherein the one or more third dynamically changing, temporally unique identifiers comprise Association DDIDs (A-DDIDs).
Example 126 includes the subject matter of example 124, wherein the R-DDID includes a particular value.
Example 127 includes the subject matter of example 125, wherein each A-DDID includes a particular value.
Example 128 includes the subject matter of example 125, wherein the instructions further cause the one or more processing units to: converting the R-DDID to a first view of the R-DDID using a first key; and converting the R-DDID to a second view of the R-DDID using a second key, wherein the first key is different from the second key.
Example 129 includes the subject matter of example 125, wherein the instructions further cause the one or more processing units to: converting a first one of the A-DDIDs to a third view of the first A-DDID using a third key; and converting the first A-DDID to a fourth view of the first A-DDID using a fourth key, wherein the third key is different from the fourth key, and the third view is different from the fourth view.
Example 130 includes the subject matter of example 123, wherein a first one of the second dynamically changing, temporally unique identifiers has the same value in the first data source and the second data source.
Example 131 includes the subject matter of example 123, wherein at least one of the one or more third dynamically changing temporally unique identifiers comprises a first non-decryptable form.
Example 132 includes the subject matter of example 123, wherein at least one of the one or more fourth dynamically changing, temporally unique identifiers comprises a second non-decryptable form.
Example 133 includes the subject matter of example 131, wherein the first non-decryptable form includes encrypted data.
Example 134 includes the subject matter of example 132, wherein the second non-decryptable form comprises encrypted data.
Example 135 includes the subject matter of example 123, wherein at least one of the one or more data sources comprises a particular subset, population, or cohort of data subjects.
Example 136 includes the subject matter of example 123, wherein each of the one or more data sources comprises data belonging to a particular plurality of data subjects during a particular time period.
Example 137 includes the subject matter of example 125, wherein at least one of the one or more A-DDIDs comprises one of: a numeric grouping or a categorical grouping.
Example 138 includes the subject matter of example 125, wherein the one or more A-DDIDs comprise at least one of: a discrete value or a set of discrete values.
Example 139 is a computer-implemented method, comprising: obtaining data relating to a first plurality of data bodies from each of one or more data sources; generating a first dynamically changing, temporally unique identifier for a first data body of the first plurality of data bodies, wherein the first data body is in each of a first data source and a second data source of the one or more data sources; generating one or more second dynamically changing, temporally unique identifiers corresponding to the one or more quasi-identifiers in each of the first data source and the second data source, wherein each quasi-identifier has a value; receiving, over a network, a first request for values of one or more quasi-identifiers in the first data source; receiving, over the network, a second request for values of one or more quasi-identifiers in the second data source; converting the values obtained from the first request into one or more third dynamically changing, temporally unique identifiers; converting the values obtained from the second request into one or more fourth dynamically changing, temporally unique identifiers; storing, in a memory: the first dynamically changing, temporally unique identifier; the one or more second dynamically changing, temporally unique identifiers; the one or more third dynamically changing, temporally unique identifiers; and the one or more fourth dynamically changing, temporally unique identifiers; and transmitting, over the network: the first dynamically changing, temporally unique identifier; the one or more second dynamically changing, temporally unique identifiers; the one or more third dynamically changing, temporally unique identifiers; and the one or more fourth dynamically changing, temporally unique identifiers.
Example 140 includes the subject matter of example 139, wherein the first dynamically changing, temporally unique identifier comprises a Replacement DDID (R-DDID).
Example 141 includes the subject matter of example 140, wherein the one or more third dynamically changing, temporally unique identifiers comprise Association DDIDs (A-DDIDs).
Example 142 includes the subject matter of example 140, wherein the R-DDID includes a particular value.
Example 143 includes the subject matter of example 141, wherein each A-DDID includes a particular value.
Example 144 includes the subject matter of example 141, further comprising the acts of: converting the R-DDID to a first view of the R-DDID using a first key; and converting the R-DDID to a second view of the R-DDID using a second key, wherein the first key is different from the second key.
Example 145 includes the subject matter of example 141, further comprising the acts of: converting a first one of the A-DDIDs to a third view of the first A-DDID using a third key; and converting the first A-DDID to a fourth view of the first A-DDID using a fourth key, wherein the third key is different from the fourth key, and wherein the third view is different from the fourth view.
Example 146 includes the subject matter of example 139, wherein a first one of the second dynamically changing, time-unique identifiers has a same value in the first data source and the second data source.
Example 147 includes the subject matter of example 139, wherein at least one of the one or more third dynamically changing, temporally unique identifiers comprises a first non-decryptable form.
Example 148 includes the subject matter of example 139, wherein at least one of the one or more fourth dynamically changing, temporally unique identifiers comprises a second non-decryptable form.
Example 149 includes the subject matter of example 147, wherein the first non-decryptable form comprises encrypted data.
Example 150 includes the subject matter of example 148, wherein the second non-decryptable form comprises encrypted data.
Example 151 includes the subject matter of example 139, wherein at least one of the one or more data sources comprises a particular subset, population, or cohort of data subjects.
Example 152 includes the subject matter of example 139, wherein each of the one or more data sources comprises data belonging to a particular plurality of data subjects during a particular time period.
Example 153 includes the subject matter of example 141, wherein at least one of the one or more A-DDIDs comprises one of: a numeric grouping or a categorical grouping.
Example 154 includes the subject matter of example 141, wherein the one or more A-DDIDs comprise at least one of: a discrete value or a set of discrete values.
Example 155 is a system, comprising: a communication interface for transmitting data over a network; a memory having computer program code stored therein; one or more data stores; and one or more processing units operatively coupled to the memory and configured to execute instructions in the computer program code that cause the one or more processing units to: obtaining a request from a first user to provide a privacy policy; determining a first privacy policy based, at least in part, on the request; obtaining, from the first user, data relating to a first plurality of data bodies; generating a first dynamically changing time-unique identifier (DDID) for a first data body of the first plurality of data bodies, wherein the first dynamically changing, temporally unique identifier is configured to: replace a first value associated with the first data body; and comply with the determined first privacy policy; storing the first dynamically changing, temporally unique identifier in the one or more data stores; receiving, over the network, a first request for the first value associated with the first data body; transmitting, over the network, the first dynamically changing, temporally unique identifier in response to the first request when the first request is not authorized to receive the first value in accordance with the first privacy policy; and transmitting, over the network, the first value in response to the first request when the first request is authorized to receive the first value in accordance with the first privacy policy.
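A minimal sketch of the policy-gated response recited in example 155, assuming an in-memory policy table and invented field and requestor names; a deployed system would consult the determined first privacy policy rather than this toy dictionary:

    # Hypothetical policy: the set of requestors authorized to receive the
    # underlying value for each field.
    POLICY = {"birthdate": {"auditor-7"}}
    R_DDIDS = {"birthdate": "R-4f8a19c2"}
    VALUES = {"birthdate": "1984-02-29"}

    def respond(field: str, requestor: str) -> str:
        # Authorized requests receive the true value; all others receive
        # only the dynamically changing replacement identifier.
        if requestor in POLICY.get(field, set()):
            return VALUES[field]
        return R_DDIDS[field]

    assert respond("birthdate", "auditor-7") == "1984-02-29"
    assert respond("birthdate", "ad-network") == "R-4f8a19c2"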
Example 156 includes the subject matter of example 155, wherein the first dynamically changing, temporally unique identifier comprises a Replacement DDID (R-DDID).
Example 157 includes the subject matter of example 155, wherein the first dynamically changing, temporally unique identifier comprises an Association DDID (A-DDID).
Example 158 includes the subject matter of example 156, wherein the R-DDID comprises a particular value to replace the first value.
Example 159 includes the subject matter of example 157, wherein the A-DDID comprises a particular value.
Example 160 includes the subject matter of example 159, wherein the particular value further comprises a category, cohort, or range of values to replace the first value.
Example 161 includes the subject matter of example 155, wherein at least one of: the request from the first user to provide a privacy policy; the data relating to the first plurality of data bodies; or the first request for the first value is received via a shim.
Example 162 includes the subject matter of example 155, wherein the first value comprises a quasi-identifier.
Example 163 includes the subject matter of example 162, wherein the quasi-identifier comprises unstructured data.
Example 164 includes the subject matter of example 162, wherein the quasi-identifier comprises a category, a cohort, or a range of values.
Example 165 includes the subject matter of example 155, wherein the privacy policy specifies generation of synthetic data.
Example 166 includes the subject matter of example 165, wherein the privacy policy further specifies DDIDs for the generated synthetic data.
Example 167 includes the subject matter of example 155, wherein at least some of the data obtained from the first user comprises synthetic data.
Example 168 includes the subject matter of example 155, wherein the data obtained from the first user includes only synthetic data.
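Where the privacy policy specifies the generation of synthetic data (examples 165 to 168), records might be fabricated along the lines of the sketch below; the field names, value ranges, and identifier format are assumptions made solely for illustration, and no record corresponds to a real data body:

    import random

    def synthesize_records(n: int, seed: int = 0) -> list:
        # Draw synthetic quasi-identifier values from plausible ranges and
        # tag each synthetic record with its own DDID.
        rng = random.Random(seed)
        return [
            {"ddid": f"S-{rng.getrandbits(32):08x}",
             "age_bucket": f"{10 * rng.randint(2, 7)}s",
             "region": rng.choice(["north", "south", "east", "west"])}
            for _ in range(n)
        ]

    for record in synthesize_records(3):
        print(record)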
Example 169 is a computer-implemented method, comprising: obtaining a request from a first user to provide a privacy policy; determining a first privacy policy based, at least in part, on the request; obtaining, from the first user, data relating to a first plurality of data bodies; generating a first dynamically changing time-unique identifier (DDID) for a first data body of the first plurality of data bodies, wherein the first dynamically changing, temporally unique identifier is configured to: replace a first value relating to the first data body; and comply with the determined first privacy policy; storing the first dynamically changing, temporally unique identifier in one or more data stores; receiving, over a network, a first request for the first value relating to the first data body; transmitting, over the network, the first dynamically changing, temporally unique identifier in response to the first request when the first request is not authorized to receive the first value in accordance with the first privacy policy; and transmitting, over the network, the first value in response to the first request when the first request is authorized to receive the first value in accordance with the first privacy policy.
Example 170 includes the subject matter of example 169, wherein the first dynamically changing, temporally unique identifier comprises a Replacement DDID (R-DDID).
Example 171 includes the subject matter of example 169, wherein the first dynamically changing, temporally unique identifier comprises an Association DDID (A-DDID).
Example 172 includes the subject matter of example 170, wherein the R-DDID comprises a particular value to replace the first value.
Example 173 includes the subject matter of example 171, wherein the A-DDID includes a particular value.
Example 174 includes the subject matter of example 173, wherein the particular value further comprises a category, cohort, or range of values to replace the first value.
Example 175 includes the subject matter of example 169, wherein at least one of: the request from the first user to provide a privacy policy; the data relating to the first plurality of data bodies; or the first request for the first value is received via a shim.
Example 176 includes the subject matter of example 169, wherein the first value comprises a quasi-identifier.
Example 177 includes the subject matter of example 176, wherein the quasi-identifier comprises unstructured data.
Example 178 includes the subject matter of example 176, wherein the quasi-identifier comprises a category, a cohort, or a range of values.
Example 179 includes the subject matter of example 169, wherein the privacy policy specifies generation of the synthetic data.
Example 180 includes the subject matter of example 179, wherein the privacy policy further specifies DDIDs for the generated synthetic data.
Example 181 includes the subject matter of example 169, wherein at least some of the data obtained from the first user comprises synthetic data.
Example 182 includes the subject matter of example 169, wherein the data obtained from the first user includes only synthetic data.
Example 183 is a non-transitory program storage device readable by a programmable control device, comprising instructions stored thereon that, when executed, cause the programmable control device to: obtaining a request from a first user to provide a privacy policy; determining a first privacy policy based, at least in part, on the request; obtaining, from the first user, data relating to a first plurality of data bodies; generating a first dynamically changing time-unique identifier (DDID) for a first data body of the first plurality of data bodies, wherein the first dynamically changing, temporally unique identifier is configured to: replace a first value associated with the first data body; and comply with the determined first privacy policy; storing the first dynamically changing, temporally unique identifier in one or more data stores; receiving, over a network, a first request for the first value associated with the first data body; transmitting, over the network, the first dynamically changing, temporally unique identifier in response to the first request when the first request is not authorized to receive the first value in accordance with the first privacy policy; and transmitting, over the network, the first value in response to the first request when the first request is authorized to receive the first value in accordance with the first privacy policy.
Example 184 includes the subject matter of example 183, wherein the first dynamically changing, temporally unique identifier comprises a Replacement DDID (R-DDID).
Example 185 includes the subject matter of example 183, wherein the first dynamically changing, temporally unique identifier comprises an Association DDID (A-DDID).
Example 186 includes the subject matter of example 184, wherein the R-DDID comprises a particular value to replace the first value.
Example 187 includes the subject matter of example 185, wherein the A-DDID includes a particular value.
Example 188 includes the subject matter of example 187, wherein the particular value further comprises a category, cohort, or range of values to replace the first value.
Example 189 includes the subject matter of example 183, wherein at least one of: the request from the first user to provide a privacy policy; the data relating to the first plurality of data bodies; or the first request for the first value is received via a shim.
Example 190 includes the subject matter of example 183, wherein the first value comprises a quasi-identifier.
Example 191 includes the subject matter of example 190, wherein the quasi-identifier comprises unstructured data.
Example 192 includes the subject matter of example 190, wherein the quasi-identifier comprises a category, cohort, or range of values.
Example 193 includes the subject matter of example 183, wherein the privacy policy specifies generation of synthetic data.
Example 194 includes the subject matter of example 193, wherein the privacy policy further specifies DDIDs for the generated synthetic data.
Example 195 includes the subject matter of example 183, wherein at least some of the data obtained from the first user comprises synthetic data.
Example 196 includes the subject matter of example 183, wherein the data obtained from the first user includes only synthetic data.
Example 197 is a system, comprising: a communication interface for transmitting data over a network; a memory having stored therein computer program code and one or more distributed ledgers capable of recording data records; and one or more processing units operatively coupled to the memory and configured to execute instructions in the computer program code that cause the one or more processing units to: obtaining, from a first user, data relating to a first data body; generating a first dynamically changing time-unique identifier (DDID) for the first data body, wherein the first DDID is configured to replace a first value associated with the first data body; storing the first DDID in a first element of a first of the one or more distributed ledgers; receiving, over the network from a first requestor, a first request for the first value associated with the first data body; transmitting, over the network, the first DDID to the first requestor in response to the first request when the first requestor is not authorized to receive the first value; and transmitting, over the network, the first value associated with the first data body to the first requestor in response to the first request when the first requestor is authorized to receive the first value.
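The following toy sketch illustrates the pattern of example 197: an append-only, hash-chained ledger element records only the identifier, while the underlying value remains in mutable, access-controlled off-ledger storage, so the value can later be modified or erased without rewriting the chain. The Ledger class and its fields are hypothetical and greatly simplified relative to any production distributed ledger:

    import hashlib
    import json
    import time

    class Ledger:
        """Toy append-only ledger: each element stores a DDID (never the
        raw value) and is hash-chained to its predecessor."""

        def __init__(self):
            self.elements = []

        def append(self, ddid: str) -> dict:
            prev = self.elements[-1]["hash"] if self.elements else "0" * 64
            body = {"ddid": ddid, "prev": prev, "ts": time.time()}
            body["hash"] = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            self.elements.append(body)
            return body

    off_ledger_store = {"R-7b3e": "alice@example.com"}  # mutable storage
    ledger = Ledger()
    element = ledger.append("R-7b3e")  # only the DDID is recorded on-ledger
    print(element["ddid"], "->", off_ledger_store[element["ddid"]])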
Example 198 includes the subject matter of example 197, wherein the network is decentralized and comprises a plurality of nodes, each node storing a copy of the first of the one or more distributed ledgers.
Example 199 includes the subject matter of example 198, wherein a first one of the one or more distributed ledgers comprises a blockchain, and wherein the first element comprises a first block.
Example 200 includes the subject matter of example 197, wherein the one or more processing units are further configured to execute instructions in the computer program code, the instructions further causing the one or more processing units to: obtaining a request from the first user to provide a privacy policy; and determining a first privacy policy based, at least in part, on the request, wherein the first DDID is further configured to comply with the determined first privacy policy.
Example 201 includes the subject matter of example 197, wherein the first DDID points to a storage location containing the first value associated with the first data body.
Example 202 includes the subject matter of example 201, wherein the one or more processing units are further configured to execute instructions in the computer program code, the instructions further causing the one or more processing units to: obtaining, from the first user, a request to modify the first value associated with the first data body to a first modified value; and storing the first modified value in the storage location containing the first value associated with the first data body.
Example 203 includes the subject matter of example 197, wherein the data relating to the first data body includes a first executable term of a smart contract.
Example 204 is a computer-implemented method, comprising: obtaining, from a first user, data relating to a first data body; generating a first dynamically changing time-unique identifier (DDID) for the first data body, wherein the first DDID is configured to replace a first value associated with the first data body; storing the first DDID in a first element of a first of one or more distributed ledgers; receiving, over a network from a first requestor, a first request for the first value associated with the first data body; transmitting, over the network, the first DDID to the first requestor in response to the first request when the first requestor is not authorized to receive the first value; and transmitting, over the network, the first value associated with the first data body to the first requestor in response to the first request when the first requestor is authorized to receive the first value.
Example 205 includes the subject matter of example 204, wherein the network is decentralized and comprises a plurality of nodes, each node storing a copy of the first of the one or more distributed ledgers, wherein the first of the one or more distributed ledgers comprises a blockchain, and wherein the first element comprises a first block.
Example 206 includes the subject matter of example 204, further comprising: obtaining a request from the first user to provide a privacy policy; and determining a first privacy policy based, at least in part, on the request, wherein the first DDID is further configured to comply with the determined first privacy policy.
Example 207 includes the subject matter of example 204, wherein the first DDID points to a storage location containing the first value associated with the first data body.
Example 208 includes the subject matter of example 207, further comprising: obtaining, from the first user, a request to modify the first value associated with the first data body to a first modified value; and storing the first modified value in the storage location containing the first value associated with the first data body.
Example 209 includes the subject matter of example 204, wherein the data relating to the first data body includes a first executable term of a smart contract.
Example 210 is a non-transitory program storage device readable by a programmable control device, comprising instructions stored thereon that, when executed, cause the programmable control device to: obtaining, from a first user, data relating to a first data body; generating a first dynamically changing time-unique identifier (DDID) for the first data body, wherein the first DDID is configured to replace a first value associated with the first data body; storing the first DDID in a first element of a first of one or more distributed ledgers; receiving, over a network from a first requestor, a first request for the first value associated with the first data body; transmitting, over the network, the first DDID to the first requestor in response to the first request when the first requestor is not authorized to receive the first value; and transmitting, over the network, the first value associated with the first data body to the first requestor in response to the first request when the first requestor is authorized to receive the first value.
Example 211 includes the subject matter of example 210, wherein the network is decentralized and comprises a plurality of nodes, each node storing a copy of the first of the one or more distributed ledgers.
Example 212 includes the subject matter of example 210, wherein a first of the one or more distributed ledgers comprises a blockchain, and wherein the first element comprises a first block.
Example 213 includes the subject matter of example 210, wherein the instructions further comprise instructions that, when executed, cause the programmable control device to: obtain a request from the first user to provide a privacy policy; and determine a first privacy policy based, at least in part, on the request, wherein the first DDID is further configured to comply with the determined first privacy policy.
Example 214 includes the subject matter of example 210, wherein the first DDID points to a storage location containing the first value associated with the first data body.
Example 215 includes the subject matter of example 214, wherein the instructions further comprise instructions that, when executed, cause the programmable control device to: obtain, from the first user, a request to modify the first value associated with the first data body to a first modified value; and store the first modified value in the storage location containing the first value associated with the first data body.
Example 216 includes the subject matter of example 210, wherein the data relating to the first data body includes a first executable term of a smart contract.
While the methods disclosed herein have been described and illustrated with reference to particular operations performed in a particular order, it will be appreciated that these operations may be combined, subdivided, or reordered to form an equivalent method without departing from the spirit of the invention. Accordingly, unless specifically indicated herein, the order and grouping of the operations is not a limitation of the present invention. For example, in alternative embodiments, some of the operations described herein may be rearranged and performed in a different order than that described herein.
It should be appreciated that reference throughout this specification to "one embodiment," "an embodiment," "one example," or "an example" means that a particular feature, structure, or characteristic described in connection with the embodiment or example may be included in at least one embodiment of the present invention. Thus, appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment or example. Furthermore, the particular features, structures, or characteristics may be combined as desired in one or more embodiments of the invention.
It is noted that, as used herein, the term "browser" may refer not only to a browser for the web, but also to a programmable display engine, such as that used in X-Windows; a remote display tool, such as a tool for desktop virtualization; or the user interface of an application on a device, where such an interface may support text and/or multimedia messaging with other parties (e.g., Facebook Messenger, WhatsApp, Snapchat, Wickr, Cyber Dust, or any other consumer or enterprise application that provides such functionality). As used herein, the term "web" refers not only to the World Wide Web (WWW), but may also refer to documents linked, e.g., by plain text, or to interconnected devices, which may be spread over multiple entities or within a single entity (e.g., an intranet). As used herein, "device" may refer to a physical device or a "virtual" device, e.g., a Virtual Machine (VM) or a microservice hosted by Node.js. It is also understood that a server may be comprised of multiple components on different computers or devices and/or multiple components within the same computer or device. Similarly, a client may be comprised of multiple components on different computers or devices and/or multiple components within the same computer or device. Although a server and a client may communicate over a channel such as the Internet, they may also communicate using, for example, Remote Procedure Calls (RPCs) and/or operating system Application Programming Interfaces (APIs).
It should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the presentation of various aspects of the disclosure. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, inventive aspects lie in less than all features of a single foregoing disclosed embodiment, and each embodiment described herein may contain more than one inventive feature.
While the invention has been particularly shown and described with respect to examples thereof, it will be understood by those skilled in the art that various other changes in form and details may be made therein without departing from the spirit and scope of the invention.

Claims (196)

1. A system, comprising:
a communication interface for transmitting data over a network; and
a memory having computer program code stored therein;
one or more processing units operatively coupled to the memory and configured to execute instructions in the computer program code that cause the one or more processing units to:
generating one or more dynamically changing time-unique identifiers;
receiving, over a network from a first client, a first request for a generated identifier related to a first data body;
associating, in response to the first request, the first generated identifier with the first data body;
generating first time period data, wherein the first time period data includes information defining a first time period in which a first data body can be identified using a first generated identifier;
storing the first generated identifier and the first time period data in a memory; and
sending the first generated identifier to the first client over the network.
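As a purely illustrative sketch of the time-period-bounded identifiers recited in claim 1 above (the validity-window representation in epoch seconds and the TimeBoundDDID record are invented for illustration and are not taken from the claims):

    import time

    class TimeBoundDDID:
        """Ties a generated identifier to the time period within which it
        can be used to identify its data body (the "time period data")."""

        def __init__(self, ddid: str, subject_key: str, start: float, end: float):
            self.ddid = ddid
            self.subject_key = subject_key
            self.start = start  # window start, epoch seconds
            self.end = end      # window end, epoch seconds

        def resolves(self, at: float = None) -> bool:
            # The identifier identifies its data body only inside the window.
            at = time.time() if at is None else at
            return self.start <= at < self.end

    now = time.time()
    record = TimeBoundDDID("D-91ac", "subject-1", now, now + 3600)
    assert record.resolves()                 # inside the first time period
    assert not record.resolves(now + 7200)   # expired one hour later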
2. The system of claim 1, wherein the instructions in the computer program code further cause the one or more processing units to:
one or more data attributes are associated with the first generated identifier.
3. The system of claim 2, wherein at least one of the one or more data attributes associated with the first generated identifier is related to an operation, activity, procedure, purpose, identity, or characteristic of the first data subject.
4. The system of claim 3, wherein the instructions in the computer program code further cause the one or more processing units to:
receiving, over the network from a second client, a second request for at least one of the one or more data attributes associated with the first generated identifier over the first time period;
determining that the second request is authorized; and
granting the second client, over the network, the ability to determine the requested one or more data attributes associated with the first generated identifier within the first time period.
5. The system of claim 1, wherein the instructions in the computer program code further cause the one or more processing units to:
associating the first generated identifier with a second data body over the first time period or a second time period.
6. The system of claim 1, wherein the instructions in the computer program code further cause the one or more processing units to:
associating the second generated identifier with the first data body in response to the first request;
generating second time period data, wherein the second time period data includes information defining a second time period during which the second generated identifier is usable to identify the first data body;
storing the second generated identifier and the second time period data in the memory; and
sending the second generated identifier to the first client over the network.
7. The system of claim 6, wherein the instructions in the computer program code further cause the one or more processing units to:
associating one or more data attributes with the second generated identifier,
wherein at least one of the one or more data attributes associated with the second generated identifier is related to an action, activity, procedure, purpose, identification, or characteristic of the first data body.
8. The system of claim 7, wherein at least one of the one or more data attributes associated with the first generated identifier is different from at least one of the one or more data attributes associated with the second generated identifier.
9. The system of claim 3, wherein the instructions in the computer program code further cause the one or more processing units to:
associating the first generated identifier with the second data body for a second time period,
wherein the at least one of the one or more data attributes associated with the first generated identifier during the first time period is the same as one of the one or more data attributes associated with the first generated identifier during the second time period.
10. The system of claim 1, wherein the instructions in the computer program code further cause the one or more processing units to:
receiving, over the network, a second identifier associated with a second data body from a second client;
associating the second identifier with the second data body;
generating second time period data, wherein the second time period data includes information defining a second time period in which a second data body can be identified using a second identifier; and
the second identifier and the second time period data are stored in a memory.
11. The system of claim 4, wherein the instructions in the computer program code further cause the one or more processing units to:
the ability of the second client to determine that the requested one or more data attributes are associated is withdrawn over the network, using the first generated identifier for a second period of time.
12. A non-transitory computer-readable medium comprising computer-executable instructions stored thereon to cause one or more processing units to:
generating one or more dynamically changing temporally unique identifiers;
receiving, from a first client over a network, a first request for a generated identifier relating to a first data body;
In response to the first request, associating the first generated identifier with the first data body;
generating first time period data, wherein the first time period data includes information defining a first time period in which a first data body can be identified using a first generated identifier;
storing the first generated identifier and the first time period data in a memory; and
sending the first generated identifier to the first client over the network.
13. The non-transitory computer-readable medium of claim 12, wherein the instructions further cause the one or more processing units to:
one or more data attributes are associated with the first generated identifier.
14. The non-transitory computer-readable medium of claim 13, wherein at least one of the one or more data attributes associated with the first generated identifier relates to an action, activity, process, purpose, identity, or characteristic of the first data subject.
15. The non-transitory computer-readable medium of claim 14, wherein the instructions further cause the one or more processing units to:
receiving, over the network from the second client, a second request for at least one of the one or more data attributes associated with the first generated identifier over the first time period;
determining that the second request is authorized; and
granting the second client, over the network, the ability to determine the requested one or more data attributes associated with the first generated identifier within the first time period.
16. The non-transitory computer-readable medium of claim 12, wherein the instructions further cause the one or more processing units to:
associating the first generated identifier with a second data body over a second time period.
17. The non-transitory computer-readable medium of claim 12, wherein the instructions further cause the one or more processing units to:
associating the first generated identifier with a second data body over the first time period.
18. The non-transitory computer-readable medium of claim 12, wherein the instructions further cause the one or more processing units to:
associating the second generated identifier with the first data body in response to the first request;
generating second time period data, wherein the second time period data includes information defining a second time period in which the first data body can be identified using the second generated identifier;
storing the second generated identifier and the second time period data in the memory; and
sending the second generated identifier to the first client over the network.
19. The non-transitory computer readable medium of claim 18, wherein the first time period and the second time period do not overlap.
20. The non-transitory computer readable medium of claim 18, wherein the first time period and the second time period at least partially overlap.
21. The non-transitory computer-readable medium of claim 18, wherein the instructions further cause the one or more processing units to:
associating one or more data attributes with the second generated identifier,
wherein at least one of the one or more data attributes associated with the second generated identifier is related to an action, activity, procedure, purpose, identification, or characteristic of the first data body.
22. The non-transitory computer-readable medium of claim 21, wherein at least one of the one or more data attributes associated with the first generated identifier is different from at least one of the one or more data attributes associated with the second generated identifier.
23. The non-transitory computer-readable medium of claim 14, wherein the instructions further cause the one or more processing units to:
Associating the first generated identifier with the second data body for a second time period,
wherein the at least one of the one or more data attributes associated with the first generated identifier during the first time period is the same as one of the one or more data attributes associated with the first generated identifier during the second time period.
24. The non-transitory computer-readable medium of claim 12, wherein the instructions further cause the one or more processing units to:
receiving, over the network, a second identifier associated with a second data body from a second client;
associating the second identifier with the second data body;
generating second time period data, wherein the second time period data includes information defining a second time period during which the second identifier is usable to identify the second data body; and
storing the second identifier and the second time period data in the memory.
25. The non-transitory computer readable medium of claim 24, wherein the second identifier comprises an HTTP cookie.
26. The non-transitory computer-readable medium of claim 12, wherein the instructions further cause the one or more processing units to:
receiving, over the network, a second request from the second client for an identification of the first data body associated with the first generated identifier over the first time period;
determining that the second request is authorized; and
granting the second client, over the network, the ability to determine an identity of the first data body within the first time period.
27. The non-transitory computer-readable medium of claim 26, wherein the instructions further cause the one or more processing units to:
revoking, over the network, the ability of the second client to determine the identity of the first data body within the first time period.
28. The non-transitory computer-readable medium of claim 15, wherein the instructions further cause the one or more processing units to:
revoking, over the network, the ability of the second client to determine the requested one or more data attributes associated with the first generated identifier within a second time period.
29. The non-transitory computer-readable medium of claim 13, wherein the first generated identifier is not mathematically derived from any of the one or more data attributes associated with the first generated identifier.
30. The non-transitory computer-readable medium of claim 12, wherein the first generated identifier comprises a primary identifier of the first data body.
31. A system, comprising:
a communication interface for transmitting data over a network;
a memory having computer program code stored therein;
one or more processing units operatively coupled to the memory and configured to execute instructions in the computer program code that cause the one or more processing units to:
generating a first temporally unique identifier;
associating a first temporally unique identifier with a first data body;
associating one or more data attributes with the first temporally unique identifier;
generating first time period data, wherein the first time period data comprises information defining a first time period in which the first temporally unique identifier is usable to identify the first data body and retrieve the associated one or more data attributes;
storing in a memory a first temporally unique identifier, one or more data attributes, and first time period data; and
transmitting the first temporally unique identifier and the one or more data attributes to the first client over the network.
32. The system of claim 31, wherein the instructions for generating the first temporally unique identifier are performed based on at least one of: time, purpose and location.
33. The system of claim 31, wherein the instructions in the computer program code further cause the one or more processing units to:
terminating the ability of the first temporally unique identifier to identify the first data body and retrieve the associated one or more data attributes.
34. The system of claim 33, wherein the instructions to terminate the ability to identify the first data body and retrieve the associated one or more data attributes with the first temporally unique identifier are executed based on at least one of: time, purpose, and location.
35. The system of claim 31, wherein at least one of the one or more data attributes associated with the first temporally unique identifier relates to an action, activity, procedure, purpose, identification, or characteristic of the first data body.
36. The system of claim 31, wherein the instructions in the computer program code further cause the one or more processing units to:
associating the first temporally unique identifier with a second data body over a second time period.
37. The system of claim 31, wherein the instructions in the computer program code further cause the one or more processing units to:
associating the first temporally unique identifier with a second data body over the first time period.
38. The system of claim 31, wherein the instructions in the computer program code further cause the one or more processing units to:
receiving, over the network from a second client, a first request for an identification of the first data body associated with the first temporally unique identifier over the first time period;
determining that the first request is authorized; and
granting the second client, over the network, the ability to determine an identity of the first data body within the first time period.
39. The system of claim 38, wherein the instructions in the computer program code further cause the one or more processing units to:
revoking, over the network, the ability of the second client to determine the identity of the first data body within the first time period.
40. The system of claim 31, wherein the instructions in the computer program code further cause the one or more processing units to:
Receiving, over a network, one or more requests from a second client, a data attribute associated with an identifier unique at a first time over a first time period;
determining that the first request is authorized; and is
The second client is granted the capability over the network to determine the requested one or more data attributes associated with the identifier unique at the first time within the first time period.
41. The system of claim 40, wherein the instructions in the computer program code further cause the one or more processing units to:
revoking, over the network, the ability of the second client to determine the requested one or more data attributes associated with the first temporally unique identifier within the first time period.
42. The system of claim 31, wherein the first temporally unique identifier is not mathematically derived from any of the one or more data attributes associated with the first temporally unique identifier.
43. The system of claim 31, wherein the first temporally unique identifier comprises a primary identifier of the first data body.
44. A non-transitory computer-readable medium comprising computer-executable instructions stored thereon that cause one or more processing units to:
Generating a first temporally unique identifier;
associating a first temporally unique identifier with a first data body;
associating one or more data attributes with the first temporally unique identifier;
generating first time period data, wherein the first time period data comprises information defining a first time period during which a first temporally unique identifier is available to identify a first data body and retrieve associated one or more data attributes;
storing in a memory a first temporally unique identifier, one or more data attributes, and first time period data; and
transmitting the first temporally unique identifier and the one or more data attributes to the first client over the network.
45. The non-transitory computer readable medium of claim 44, wherein the instructions to generate the first temporally unique identifier are performed based on at least one of: time, purpose and location.
46. The non-transitory computer-readable medium of claim 44, wherein the instructions further cause the one or more processing units to:
terminating the ability of the first temporally unique identifier to identify the first data body and retrieve the associated one or more data attributes.
47. The non-transitory computer readable medium of claim 46, wherein the instructions to terminate the ability of the first temporally unique identifier to identify the first data body and retrieve the associated one or more data attributes are performed in accordance with at least one of time, purpose, and location.
48. The non-transitory computer-readable medium of claim 44, wherein at least one of the one or more data attributes associated with the first temporally unique identifier relates to an action, activity, procedure, purpose, identification, or characteristic of the first data body.
49. The non-transitory computer-readable medium of claim 44, wherein the instructions further cause the one or more processing units to:
associating the first temporally unique identifier with a second data body over a second time period.
50. The non-transitory computer-readable medium of claim 44, wherein the instructions further cause the one or more processing units to:
associating the first temporally unique identifier with a second data body over the first time period.
51. The non-transitory computer-readable medium of claim 44, wherein the instructions further cause the one or more processing units to:
Receiving, over a network, a first request from a second client for an identification of a first data body associated with a first temporally unique identifier over a first time period;
determining that the first request is authorized; and
granting the second client, over the network, the ability to determine an identity of the first data body within the first time period.
52. The non-transitory computer-readable medium of claim 51, wherein the instructions further cause the one or more processing units to:
revoking, over the network, the ability of the second client to determine the identity of the first data body within the first time period.
53. The non-transitory computer-readable medium of claim 44, wherein the instructions further cause the one or more processing units to:
receiving, over a network from a second client, a first request for one or more data attributes associated with a first temporally unique identifier over a first time period;
determining that the first request is authorized; and
granting the second client, over the network, the ability to determine the requested one or more data attributes associated with the first temporally unique identifier within the first time period.
54. The non-transitory computer-readable medium of claim 53, wherein the instructions further cause the one or more processing units to:
revoking, over the network, the ability of the second client to determine the requested one or more data attributes associated with the first temporally unique identifier within the first time period.
55. The non-transitory computer-readable medium of claim 44, wherein the first temporally unique identifier is not mathematically derived from any of one or more data attributes associated with the first temporally unique identifier.
56. The non-transitory computer-readable medium of claim 44, wherein the first temporally unique identifier comprises a primary identifier of the first data body.
57. An apparatus, comprising:
a user interface;
a communication interface for transmitting data over a network;
a memory having computer program code stored therein;
one or more processing units operatively coupled to the memory and configured to execute instructions in the computer program code that cause the one or more processing units to:
requesting, from a first privacy server over a network, a first temporally unique identifier;
Associating a first temporally unique identifier with a first data body that is a user of the device;
associating one or more data attributes with the first temporally unique identifier;
generating first time period data, wherein the first time period data comprises information defining a first time period within which a first temporally unique identifier is available to identify a first data body and retrieve associated one or more data attributes;
storing in a memory a first temporally unique identifier, one or more data attributes, and first time period data; and
transmitting, in response to determining that a first condition is satisfied, the first temporally unique identifier, the first time period data, and the one or more data attributes over the network to the first privacy server.
58. The apparatus of claim 57, wherein determining that the first condition has been met comprises at least one of:
a predetermined amount of time has elapsed;
a flexible amount of time has elapsed;
the purpose of the first temporally unique identifier has expired; or
The location of the first data body has changed.
59. The apparatus of claim 57, wherein the instructions in the computer program code further cause the one or more processing units to:
One or more data attributes associated with the first temporally unique identifier are modified.
60. The apparatus of claim 57, wherein the instructions in the computer program code further cause the one or more processing units to:
tracking use of the first temporally unique identifier.
61. The apparatus of claim 57, wherein the instructions in the computer program code further cause the one or more processing units to:
revoking the ability of the first temporally unique identifier to identify the first data body and retrieve the associated one or more data attributes.
62. The device of claim 57, wherein the device is located on the same computing device as the privacy server.
63. The apparatus of claim 57, wherein the instructions in the computer program code further cause the one or more processing units to:
in response to a change in the first temporally unique identifier, the first time period data, or the one or more data attributes, transmitting at least one of: the first temporally unique identifier, the first time period data, and the one or more data attributes over the network to one or more client devices that have registered with the first privacy server to be synchronized with the device.
64. The apparatus of claim 57, wherein the first temporally unique identifier, the first time period data, and the one or more data attributes are transmitted over a network to the first privacy server in the form of an HTTP cookie.
65. The apparatus of claim 57, wherein the first temporally unique identifier is not mathematically derived from any of one or more data attributes associated with the first temporally unique identifier.
66. The apparatus of claim 57, wherein the first temporally unique identifier comprises a primary identifier of the first data body.
67. A non-transitory computer-readable medium comprising computer-executable instructions stored thereon to cause one or more processing units to perform operations comprising:
requesting, from a first privacy server over a network, a first temporally unique identifier;
associating a first temporally unique identifier with a first data body that is a user of a first client device;
associating one or more data attributes with the first temporally unique identifier;
generating first time period data, wherein the first time period data comprises information defining a first time period in which the first temporally unique identifier is usable to identify the first data body and retrieve the associated one or more data attributes;
Storing, in a memory of a first client device, a first temporally unique identifier, one or more data attributes, and first time period data; and is
transmitting, in response to determining that a first condition is satisfied, the first temporally unique identifier, the first time period data, and the one or more data attributes over the network to the first privacy server.
68. The non-transitory computer-readable medium of claim 67, wherein determining that the first condition has been met comprises at least one of:
a predetermined amount of time has elapsed;
a flexible amount of time has elapsed;
the purpose of the first temporally unique identifier has expired; or
The location of the first data body has changed.
69. The non-transitory computer-readable medium of claim 67, wherein the instructions further cause the one or more processing units to:
one or more data attributes associated with the first temporally unique identifier are modified.
70. The non-transitory computer-readable medium of claim 67, wherein the instructions further cause the one or more processing units to:
track use of the first temporally unique identifier.
71. The non-transitory computer-readable medium of claim 67, wherein the instructions further cause the one or more processing units to:
revoke the ability of the first temporally unique identifier to identify the first data body and retrieve the associated one or more data attributes.
72. The non-transitory computer-readable medium of claim 67, wherein the first client device is located on a same computing device as the privacy server.
73. The non-transitory computer-readable medium of claim 67, wherein the instructions further cause the one or more processing units to:
in response to a change in the first temporally unique identifier, the first time period data, or the one or more data attributes, transmit at least one of the first temporally unique identifier, the first time period data, and the one or more data attributes over the network to one or more client devices that have registered with the first privacy server to be synchronized with the first client device.
74. The non-transitory computer readable medium of claim 67, wherein the first temporally unique identifier, the first time period data, and the one or more data attributes are transmitted over the network to the first privacy server in the form of an HTTP cookie.
75. The non-transitory computer-readable medium of claim 67, wherein the first temporally unique identifier is not mathematically derived from any of one or more data attributes associated with the first temporally unique identifier.
76. The non-transitory computer-readable medium of claim 67, wherein the first temporally unique identifier comprises a primary identifier of the first data body.
77. An apparatus, comprising:
a user interface;
a communication interface for transmitting data over a network;
a memory having computer program code stored therein; and
one or more processing units operatively coupled to the memory and configured to execute instructions in the computer program code that cause the one or more processing units to:
obtain, over a network, a first temporally unique identifier from a first privacy server, wherein the first temporally unique identifier is associated at the first privacy server with a first data body that is a user of the device during a first time period;
associate one or more data attributes with the first temporally unique identifier;
generate first time period data, wherein the first time period data comprises information defining a first time period during which the first temporally unique identifier can be used to identify the first data body and retrieve the associated one or more data attributes;
store, in the memory, the first temporally unique identifier, the one or more data attributes, and the first time period data;
transmit, over the network, the first temporally unique identifier, the first time period data, and the one or more data attributes to the first privacy server; and
receive, over the network, a second temporally unique identifier from the first privacy server, wherein the second temporally unique identifier is associated at the first privacy server with the first data body and the one or more data attributes during a second time period.
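Illustrative note (not part of the claims): one plausible shape for the claim 77 round-trip, in which the device returns the expiring identifier together with its time period data and attributes and receives a second identifier in exchange. The endpoint URL, payload fields, and `rotate_ddid` helper are invented for this sketch; the claims specify no wire protocol.

```python
import json
import urllib.request

# Hypothetical endpoint; the claims do not specify a wire protocol.
PRIVACY_SERVER = "https://privacy-server.example/api"

def rotate_ddid(current_ddid: str, period: dict, attributes: dict) -> str:
    """Send the expiring DDID, its time period data, and its attributes to
    the privacy server, and receive a second DDID for the next time period."""
    payload = json.dumps({
        "ddid": current_ddid,
        "period": period,          # e.g. {"not_before": ..., "not_after": ...}
        "attributes": attributes,  # re-keyed under the new DDID server-side
    }).encode()
    req = urllib.request.Request(
        f"{PRIVACY_SERVER}/rotate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["ddid"]  # the second temporally unique identifier
```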
78. The apparatus of claim 77, wherein the instructions in the computer program code that cause the one or more processing units to receive, over the network, the second temporally unique identifier from the first privacy server are executed in response to a determination that a first condition has been met.
79. The apparatus of claim 78, wherein determining that the first condition has been met comprises at least one of:
a predetermined amount of time has elapsed;
a flexible amount of time has elapsed;
the purpose of the first temporally unique identifier has expired; or
the location of the first data body has changed.
80. The apparatus of claim 77, wherein the instructions in the computer program code further cause the one or more processing units to:
modify one or more data attributes associated with the first temporally unique identifier.
81. The apparatus of claim 77, wherein the instructions in the computer program code further cause the one or more processing units to:
track use of the first temporally unique identifier.
82. The apparatus of claim 77, wherein the instructions in the computer program code further cause the one or more processing units to:
revoke the ability of the first temporally unique identifier to identify the first data body and retrieve the associated one or more data attributes.
83. The apparatus of claim 77, wherein the instructions in the computer program code further cause the one or more processing units to:
request confirmation from the first privacy server as to whether the identity or one or more data attributes of the first data subject may be revealed to a first requestor; and
send the identity or one or more data attributes of the first data subject to the first requestor in response to receiving confirmation from the first privacy server that the identity or one or more data attributes of the first data subject may be revealed to the first requestor.
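Illustrative note (not part of the claims): the confirm-before-disclose exchange of claim 83 reduces to a guard of this shape. The `server` object and its `confirm_disclosure` method are hypothetical stand-ins for the privacy server interface.

```python
from typing import Optional

def disclose_if_permitted(server, requestor_id: str, subject_ddid: str,
                          attributes: dict) -> Optional[dict]:
    """Ask the privacy server whether disclosure is permitted before
    releasing anything to the requestor; withhold on any other answer."""
    confirmed = server.confirm_disclosure(requestor=requestor_id,
                                          ddid=subject_ddid)
    if not confirmed:
        return None       # no confirmation: reveal neither identity nor attributes
    return attributes     # confirmation received: release to the requestor
```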
84. The apparatus of claim 83, wherein the requested confirmation further includes information as to whether the identity or one or more data attributes and a time period or location of the first data subject may be disclosed to the first requestor for the particular request.
85. The apparatus of claim 83, wherein the requested confirmation further includes information as to whether the identity or one or more data attributes and a time period or location of the first data subject may be disclosed to the first requestor for the particular request.
86. The apparatus of claim 84, wherein the requested confirmation further comprises information as to whether the identity or one or more data attributes and a time period or location of the first data subject may be revealed to the first requestor for the particular request.
87. A system, comprising:
a communication interface for transmitting data over a network;
a memory having computer program code stored therein; and
one or more processing units operatively coupled to the memory and configured to execute instructions in the computer program code that cause the one or more processing units to:
generate one or more dynamically changing temporally unique identifiers;
receive, over a network, a first request from a first data body for a generated dynamically changing temporally unique identifier to be associated with an attribute of the first data body;
in response to the first request, associate the first generated dynamically changing temporally unique identifier with the attribute of the first data body;
convert the value of the first generated dynamically changing temporally unique identifier into a first non-decryptable form,
wherein a first key is usable to convert the first non-decryptable form back into a first view of the first generated dynamically changing temporally unique identifier,
wherein a second key is usable to convert the first non-decryptable form back into a second view of the first generated dynamically changing temporally unique identifier,
wherein the first key is different from the second key, and
wherein the first view is different from the second view;
store, in the memory, the first generated dynamically changing temporally unique identifier, the first key, the second key, and the first non-decryptable form; and
send the first non-decryptable form to the first data body over the network.
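Illustrative note (not part of the claims): one way to obtain the "two keys, two views" property of claim 87 is to encrypt a detailed view and a generalized view independently and bundle the ciphertexts; the bundle is opaque ("non-decryptable") to anyone without a key, and each key recovers only its own view. A sketch using the Python `cryptography` package follows; the claims do not prescribe this construction.

```python
# Minimal sketch using the `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet, InvalidToken

key_full, key_coarse = Fernet.generate_key(), Fernet.generate_key()

def seal_views(full_view: str, coarse_view: str) -> dict:
    """Bundle two independently encrypted views of the same DDID value.
    Without a key the bundle is opaque; each key opens only its own view."""
    return {
        "full": Fernet(key_full).encrypt(full_view.encode()),
        "coarse": Fernet(key_coarse).encrypt(coarse_view.encode()),
    }

def open_view(bundle: dict, key: bytes) -> str:
    f = Fernet(key)
    for blob in bundle.values():
        try:
            return f.decrypt(blob).decode()
        except InvalidToken:
            continue
    raise ValueError("this key opens no view in the bundle")

sealed = seal_views("age=37", "age=30-39")           # detailed vs. generalized view
assert open_view(sealed, key_full) == "age=37"       # first key, first view
assert open_view(sealed, key_coarse) == "age=30-39"  # second key, second view
```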
88. The system of claim 87, wherein the first view provides more detail than the second view.
89. The system of claim 87, wherein the non-decryptable form comprises encrypted text.
90. The system of claim 87, wherein the instructions in the computer program code further comprise instructions to cause the one or more processing units to also associate the first generated dynamically changing time-unique identifier with an attribute of the second data body.
91. The system of claim 90, wherein the instructions that cause the one or more processing units to associate the first generated dynamically changing temporally unique identifier with the attribute of the second data body are executed in at least one of the following ways:
at a time different from when the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body;
at a physical or virtual location different from where the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body; and
for a purpose different from that for which the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body.
92. The system of claim 87, wherein the instructions in the computer program code further comprise instructions to cause the one or more processing units to:
associate a second generated dynamically changing temporally unique identifier with the attribute of the first data body.
93. The system of claim 92, wherein the instructions that cause the one or more processing units to associate the second generated dynamically changing temporally unique identifier with the attribute of the first data body are executed in at least one of the following ways:
at a time different from when the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body;
at a physical or virtual location different from where the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body; and
for a purpose different from that for which the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body.
94. A non-transitory computer-readable medium comprising computer-executable instructions stored thereon to cause one or more processing units to:
generate one or more dynamically changing temporally unique identifiers;
receive, over a network, a first request from a first data body for a generated dynamically changing temporally unique identifier to be associated with an attribute of the first data body;
in response to the first request, associate the first generated dynamically changing temporally unique identifier with the attribute of the first data body;
convert the value of the first generated dynamically changing temporally unique identifier into a first non-decryptable form,
wherein a first key is usable to convert the first non-decryptable form back into a first view of the first generated dynamically changing temporally unique identifier,
wherein a second key is usable to convert the first non-decryptable form back into a second view of the first generated dynamically changing temporally unique identifier,
wherein the first key is different from the second key, and
wherein the first view is different from the second view;
store, in a memory, the first generated dynamically changing temporally unique identifier, the first key, the second key, and the first non-decryptable form; and
send the first non-decryptable form to the first data body over the network.
95. The non-transitory computer-readable medium of claim 94, wherein the first view provides more detail than the second view.
96. The non-transitory computer-readable medium of claim 94, wherein the non-decryptable form comprises encrypted text.
97. The non-transitory computer-readable medium of claim 94, wherein the instructions further comprise instructions that cause the one or more processing units to further associate the first generated dynamically changing time-unique identifier with an attribute of the second data body.
98. The non-transitory computer-readable medium of claim 97, wherein the instructions that cause the one or more processing units to associate the first generated dynamically changing temporally unique identifier with the attribute of the second data body are executed in at least one of the following ways:
at a time different from when the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body;
at a physical or virtual location different from where the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body; and
for a purpose different from that for which the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body.
99. The non-transitory computer-readable medium of claim 94, wherein the instructions further comprise instructions to cause the one or more processing units to:
associate a second generated dynamically changing temporally unique identifier with the attribute of the first data body.
100. The non-transitory computer-readable medium of claim 99, wherein the instructions that cause the one or more processing units to associate the second generated dynamically changing temporally unique identifier with the attribute of the first data body are executed in at least one of the following ways:
at a time different from when the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body;
at a physical or virtual location different from where the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body; and
for a purpose different from that for which the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body.
101. A computer-implemented method, comprising:
generating one or more dynamically changing temporally unique identifiers;
receiving, over a network, a first request from a first data body for a generated dynamically changing temporally unique identifier to be associated with an attribute of the first data body;
in response to the first request, associating the first generated dynamically changing temporally unique identifier with the attribute of the first data body;
converting the value of the first generated dynamically changing temporally unique identifier into a first non-decryptable form,
wherein a first key is usable to convert the first non-decryptable form back into a first view of the first generated dynamically changing temporally unique identifier,
wherein a second key is usable to convert the first non-decryptable form back into a second view of the first generated dynamically changing temporally unique identifier,
wherein the first key is different from the second key, and
wherein the first view is different from the second view;
storing, in a memory, the first generated dynamically changing temporally unique identifier, the first key, the second key, and the first non-decryptable form; and
sending the first non-decryptable form to the first data body over the network.
102. The computer-implemented method of claim 101, wherein the first view provides more detail than the second view.
103. The computer-implemented method of claim 101, further comprising associating the first generated dynamically changing temporally unique identifier with an attribute of the second data body.
104. The computer-implemented method of claim 103, wherein the act of associating the first generated dynamically changing temporally unique identifier with the attribute of the second data body is performed in at least one of the following ways:
at a time different from when the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body;
at a physical or virtual location different from where the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body; and
for a purpose different from that for which the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body.
105. The computer-implemented method of claim 101, further comprising: associating a second generated dynamically changing temporally unique identifier with the attribute of the first data body.
106. The computer-implemented method of claim 105, wherein the act of associating the second generated dynamically changing temporally unique identifier with the attribute of the first data body is performed in at least one of the following ways:
at a time different from when the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body;
at a physical or virtual location different from where the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body; and
for a purpose different from that for which the first generated dynamically changing temporally unique identifier was associated with the attribute of the first data body.
107. A system, comprising:
a communication interface for transmitting data over a network;
a memory having computer program code stored therein;
one or more data sources;
one or more processing units operatively coupled to the memory and configured to execute instructions in the computer program code that cause the one or more processing units to:
obtain data relating to a first plurality of data subjects from each of the one or more data sources;
generate a first dynamically changing time-unique identifier for a first data subject of the first plurality of data subjects, wherein the first data subject is in each of a first data source and a second data source of the one or more data sources;
generate one or more second dynamically changing time-unique identifiers corresponding to one or more quasi-identifiers in each of the first data source and the second data source, wherein each quasi-identifier has a value;
receive, over a network, a first request for the values of the one or more quasi-identifiers in the first data source;
receive, over the network, a second request for the values of the one or more quasi-identifiers in the second data source;
convert the values obtained from the first request into one or more third dynamically changing time-unique identifiers;
convert the values obtained from the second request into one or more fourth dynamically changing time-unique identifiers;
store, in the memory, the first dynamically changing time-unique identifier, the one or more second dynamically changing time-unique identifiers, the one or more third dynamically changing time-unique identifiers, and the one or more fourth dynamically changing time-unique identifiers; and
transmit, over the network, the first dynamically changing time-unique identifier, the one or more second dynamically changing time-unique identifiers, the one or more third dynamically changing time-unique identifiers, and the one or more fourth dynamically changing time-unique identifiers.
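Illustrative note (not part of the claims): the heart of claim 107 is consistent token replacement, with the value-to-token table held by the privacy server. A minimal sketch follows, assuming invented names (`DdidMap`, `ddid_for`, `resolve`); in the spirit of claims 65 and 75, each token is generated rather than mathematically derived from the value it replaces.

```python
import secrets

class DdidMap:
    """Consistent token replacement for one time period: the same underlying
    value maps to the same DDID, so a data subject's records stay joinable
    across data sources without exposing the value itself."""
    def __init__(self) -> None:
        self._forward: dict[str, str] = {}
        self._reverse: dict[str, str] = {}

    def ddid_for(self, value: str) -> str:
        if value not in self._forward:
            token = "DDID-" + secrets.token_hex(8)  # not derived from the value
            self._forward[value] = token
            self._reverse[token] = value
        return self._forward[value]

    def resolve(self, token: str) -> str:
        return self._reverse[token]  # lookup held by the privacy server only

subjects = DdidMap()   # R-DDID-style tokens replacing the direct identifier
quasi = DdidMap()      # A-DDID-style tokens replacing quasi-identifier values

source_a = {"name": "alice", "zip": "97035"}
source_b = {"name": "alice", "zip": "97035"}
pseudo_a = {k: (subjects if k == "name" else quasi).ddid_for(v)
            for k, v in source_a.items()}
pseudo_b = {k: (subjects if k == "name" else quasi).ddid_for(v)
            for k, v in source_b.items()}
assert pseudo_a == pseudo_b   # cf. claim 114: same value in both data sources
```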
108. The system of claim 107, wherein the first dynamically changing temporally unique identifier comprises a Replacement DDID (R-DDID).
109. The system of claim 108, wherein the one or more third dynamically changing time-unique identifiers comprise an Association DDID (A-DDID).
110. The system of claim 107, wherein the R-DDID comprises a particular value.
111. The system of claim 107, wherein each A-DDID comprises a particular value.
112. The system of claim 109, wherein the instructions further cause the one or more processing units to:
convert the R-DDID into a first view of the R-DDID using a first key; and
convert the R-DDID into a second view of the R-DDID using a second key, wherein the first key is different from the second key.
113. The system of claim 109, wherein the instructions further cause the one or more processing units to:
convert a first one of the A-DDIDs into a third view of the first A-DDID using a third key; and
convert the first A-DDID into a fourth view of the first A-DDID using a fourth key, wherein the third key is different from the fourth key, and the third view is different from the fourth view.
114. The system of claim 107, wherein a first one of the second dynamically changing time-unique identifiers has the same value in the first data source and the second data source.
115. The system of claim 107, wherein at least one of the one or more third dynamically changing time-unique identifiers comprises a first non-decryptable form.
116. The system of claim 107, wherein at least one of the one or more fourth dynamically changing time-unique identifiers comprises a second non-decryptable form.
117. The system of claim 115, wherein the first non-decryptable form comprises encrypted data.
118. The system of claim 116, wherein the second non-decryptable form comprises encrypted data.
119. The system of claim 107, wherein at least one of the one or more data sources comprises a particular subset, population, or cohort of data subjects.
120. The system of claim 107, wherein each of the one or more data sources belongs to a particular plurality of data subjects for a particular period of time.
121. The system of claim 109, wherein at least one of the one or more A-DDIDs comprises one of: a numeric grouping or a categorical grouping.
122. The system of claim 109, wherein the one or more A-DDIDs comprise at least one of: a discrete value or a set of discrete values.
123. A non-transitory computer-readable medium comprising computer-executable instructions stored thereon to cause one or more processing units to:
obtain data relating to a first plurality of data subjects from each of one or more data sources;
generate a first dynamically changing time-unique identifier for a first data subject of the first plurality of data subjects, wherein the first data subject is in each of a first data source and a second data source of the one or more data sources;
generate one or more second dynamically changing time-unique identifiers corresponding to one or more quasi-identifiers in each of the first data source and the second data source, wherein each quasi-identifier has a value;
receive, over a network, a first request for the values of the one or more quasi-identifiers in the first data source;
receive, over the network, a second request for the values of the one or more quasi-identifiers in the second data source;
convert the values obtained from the first request into one or more third dynamically changing time-unique identifiers;
convert the values obtained from the second request into one or more fourth dynamically changing time-unique identifiers;
store, in a memory, the first dynamically changing time-unique identifier, the one or more second dynamically changing time-unique identifiers, the one or more third dynamically changing time-unique identifiers, and the one or more fourth dynamically changing time-unique identifiers; and
transmit, over a network, the first dynamically changing time-unique identifier, the one or more second dynamically changing time-unique identifiers, the one or more third dynamically changing time-unique identifiers, and the one or more fourth dynamically changing time-unique identifiers.
124. The non-transitory computer-readable medium of claim 123, wherein the first dynamically changing time-unique identifier comprises a Replacement DDID (R-DDID).
125. The non-transitory computer-readable medium of claim 124, wherein the one or more third dynamically changing time-unique identifiers comprise an Association DDID (A-DDID).
126. The non-transitory computer readable medium of claim 123, wherein the R-DDID includes a particular value.
127. The non-transitory computer-readable medium of claim 123, wherein each A-DDID includes a particular value.
128. The non-transitory computer-readable medium of claim 125, wherein the instructions further cause the one or more processing units to:
convert the R-DDID into a first view of the R-DDID using a first key; and
convert the R-DDID into a second view of the R-DDID using a second key, wherein the first key is different from the second key.
129. The non-transitory computer readable medium of claim 125, wherein the instructions further cause the one or more processing units to:
convert a first one of the A-DDIDs into a third view of the first A-DDID using a third key; and
convert the first A-DDID into a fourth view of the first A-DDID using a fourth key, wherein the third key is different from the fourth key, and the third view is different from the fourth view.
130. The non-transitory computer readable medium of claim 123, wherein a first one of the second dynamically changing time-unique identifiers has a same value in the first data source and the second data source.
131. The non-transitory computer readable medium of claim 123, wherein at least one of the one or more third dynamically changing temporally unique identifiers comprises a first non-decryptable form.
132. The non-transitory computer readable medium of claim 123, wherein at least one of the one or more fourth dynamically changing temporally unique identifiers comprises a second non-decryptable form.
133. The non-transitory computer readable medium of claim 131, wherein the first non-decryptable form includes encrypted data.
134. The non-transitory computer-readable medium of claim 132, wherein the second non-decryptable form comprises encrypted data.
135. The non-transitory computer-readable medium of claim 123, wherein at least one of the one or more data sources comprises a particular subset, population, or cohort of data subjects.
136. The non-transitory computer-readable medium of claim 123, wherein each of the one or more data sources belongs to a particular plurality of data subjects over a particular time period.
137. The non-transitory computer-readable medium of claim 125, wherein the one or more A-DDIDs include at least one of: a numeric grouping or a categorical grouping.
138. The non-transitory computer-readable medium of claim 125, wherein the one or more A-DDIDs comprise at least one of: a discrete value or a set of discrete values.
139. A computer-implemented method, comprising:
obtaining data relating to a first plurality of data subjects from each of one or more data sources;
generating a first dynamically changing time-unique identifier for a first data subject of the first plurality of data subjects, wherein the first data subject is in each of a first data source and a second data source of the one or more data sources;
generating one or more second dynamically changing time-unique identifiers corresponding to one or more quasi-identifiers in each of the first data source and the second data source, wherein each quasi-identifier has a value;
receiving, over a network, a first request for the values of the one or more quasi-identifiers in the first data source;
receiving, over the network, a second request for the values of the one or more quasi-identifiers in the second data source;
converting the values obtained from the first request into one or more third dynamically changing time-unique identifiers;
converting the values obtained from the second request into one or more fourth dynamically changing time-unique identifiers;
storing, in a memory, the first dynamically changing time-unique identifier, the one or more second dynamically changing time-unique identifiers, the one or more third dynamically changing time-unique identifiers, and the one or more fourth dynamically changing time-unique identifiers; and
transmitting, over the network, the first dynamically changing time-unique identifier, the one or more second dynamically changing time-unique identifiers, the one or more third dynamically changing time-unique identifiers, and the one or more fourth dynamically changing time-unique identifiers.
140. The computer-implemented method of claim 139, wherein the first dynamically changing time-unique identifier comprises a Replacement DDID (R-DDID).
141. The computer-implemented method of claim 140, wherein the one or more third dynamically changing time-unique identifiers comprise Association DDIDs (A-DDIDs).
142. The computer-implemented method of claim 139, wherein the R-DDID includes a particular value.
143. The computer-implemented method of claim 139, wherein each A-DDID comprises a particular value.
144. The computer-implemented method of claim 141, further comprising the acts of:
converting the R-DDID into a first view of the R-DDID using a first key; and
converting the R-DDID into a second view of the R-DDID using a second key,
wherein the first key is different from the second key.
145. The computer-implemented method of claim 141, further comprising the acts of:
converting a first A-DDID into a third view of the first A-DDID using a third key; and
converting the first A-DDID into a fourth view of the first A-DDID using a fourth key,
wherein the third key is different from the fourth key, and
wherein the third view is different from the fourth view.
146. The computer-implemented method of claim 139, wherein a first one of the second dynamically changing time-unique identifiers has the same value in the first data source and the second data source.
147. The computer-implemented method of claim 139, wherein at least one of the one or more third dynamically changing time-unique identifiers comprises a first non-decryptable form.
148. The computer-implemented method of claim 139, wherein at least one of the one or more fourth dynamically changing time-unique identifiers comprises a second non-decryptable form.
149. The computer implemented method of claim 147, wherein the first non-decryptable form comprises encrypted data.
150. The computer-implemented method of claim 148, wherein the second non-decryptable form comprises encrypted data.
151. The computer-implemented method of claim 139, wherein at least one of the one or more data sources comprises a particular subset, population, or cohort of data subjects.
152. The computer-implemented method of claim 139, wherein each of the one or more data sources belongs to a particular plurality of data subjects over a particular time period.
153. The computer-implemented method of claim 141, wherein at least one of the one or more A-DDIDs comprises one of: a numeric grouping or a categorical grouping.
154. The computer-implemented method of claim 141, wherein the one or more A-DDIDs comprise at least one of: a discrete value or a set of discrete values.
155. A system, comprising:
a communication interface for transmitting data over a network;
a memory having computer program code stored therein;
one or more data stores;
one or more processing units operatively coupled to the memory and configured to execute instructions in the computer program code that cause the one or more processing units to:
obtain a request from a first user to provide a privacy policy;
determine a first privacy policy based at least in part on the request;
obtain, from the first user, data relating to a first plurality of data bodies;
generate a first dynamically changing time-unique identifier (DDID) for a first data body of the first plurality of data bodies, wherein the first dynamically changing time-unique identifier is configured to:
replace a first value relating to the first data body; and
comply with the determined first privacy policy;
store the first dynamically changing time-unique identifier in the one or more data stores;
receive, over a network, a first request for the first value relating to the first data body;
when the first request is not authorized to receive the first value in accordance with the first privacy policy, send, over the network, the first dynamically changing time-unique identifier in response to the first request; and
when the first request is authorized to receive the first value in accordance with the first privacy policy, send, over the network, the first value in response to the first request.
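Illustrative note (not part of the claims): claim 155's dispatch rule is a two-way branch on policy authorization. In this sketch the `PrivacyPolicy` class and its membership test are toy stand-ins for the determined privacy policy.

```python
class PrivacyPolicy:
    """Toy authorization check; a real policy would be far richer."""
    def __init__(self, authorized: set):
        self.authorized = authorized

    def authorizes(self, requestor: str) -> bool:
        return requestor in self.authorized

def answer_request(policy: PrivacyPolicy, requestor: str,
                   real_value: str, ddid: str) -> str:
    """Claim-155-style dispatch: the real value for authorized requests,
    the dynamically changing identifier for everything else."""
    return real_value if policy.authorizes(requestor) else ddid

policy = PrivacyPolicy({"research-team"})
assert answer_request(policy, "research-team", "alice", "DDID-9f2c") == "alice"
assert answer_request(policy, "ad-network", "alice", "DDID-9f2c") == "DDID-9f2c"
```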
156. The system of claim 155, wherein the first dynamically changing time-unique identifier comprises a Replacement DDID (R-DDID).
157. The system of claim 155, wherein the first dynamically changing time-unique identifier comprises an Association DDID (A-DDID).
158. The system of claim 156, wherein the R-DDID comprises a particular value for replacing the first value.
159. The system of claim 157, wherein the A-DDID comprises a particular value.
160. The system of claim 159, wherein the particular value further comprises a category, cohort, or range of values for replacing the first value.
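Illustrative note (not part of the claims): a "category, cohort, or range" standing in for an exact value can be as simple as a bucketing function; the label below is the kind of particular value an A-DDID might carry. The `age_cohort` helper is an invented example.

```python
def age_cohort(age: int) -> str:
    """Replace an exact age with a decade cohort; the cohort label stands
    in for the raw value when the raw value may not be disclosed."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

assert age_cohort(37) == "30-39"   # the range stands in for the exact value
```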
161. The system of claim 155, wherein at least one of the following is received through a shim: the request from the first user to provide a privacy policy; the data relating to the first plurality of data bodies; and the first request for the first value.
162. The system of claim 155, wherein the first value comprises a quasi-identifier.
163. The system of claim 162, wherein the quasi-identifier comprises unstructured data.
164. The system of claim 162, wherein the quasi-identifier comprises a category, a cohort, or a range of values.
165. The system of claim 155, wherein the privacy policy specifies generation of the synthetic data.
166. The system of claim 165, wherein the privacy policy further specifies a DDID used to generate the synthetic data.
167. The system of claim 155, wherein at least some of the data obtained from the first user includes synthetic data.
168. The system of claim 155, wherein the data obtained from the first user includes only synthetic data.
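Illustrative note (not part of the claims): where the policy specifies synthetic data (claims 165-168), generated records preserve the schema while containing no real data subject's values. The sketch below is purely illustrative; the claims prescribe no synthesis method, and the field names and distributions are invented.

```python
import random

def synthesize_records(n: int, seed: int = 0) -> list:
    """Emit schema-shaped records containing no real data subject's values;
    the distributions here are arbitrary placeholders."""
    rng = random.Random(seed)
    records = []
    for _ in range(n):
        decade = rng.randrange(2, 8) * 10             # cohorts 20-29 .. 70-79
        records.append({
            "age_cohort": f"{decade}-{decade + 9}",
            "zip3": f"{rng.randrange(100, 1000)}xx",  # truncated ZIP prefix
        })
    return records

print(synthesize_records(2))
```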
169. A computer-implemented method, comprising:
obtaining a request from a first user to provide a privacy policy;
determining a first privacy policy based at least in part on the request;
obtaining data from a first user relating to a first plurality of data bodies;
generating a first dynamically changing time-unique identifier (DDID) for a first data body of the first plurality of data bodies, wherein the first dynamically changing time-unique identifier is configured to:
replace a first value relating to the first data body; and
comply with the determined first privacy policy;
storing the first dynamically changing time-unique identifier in one or more data stores;
receiving, over a network, a first request for the first value relating to the first data body;
sending, over the network, the first dynamically changing time-unique identifier in response to the first request when the first request is not authorized to receive the first value in accordance with the first privacy policy; and
sending, over the network, the first value in response to the first request when the first request is authorized to receive the first value in accordance with the first privacy policy.
170. The computer-implemented method of claim 169, wherein the first dynamically changing time-unique identifier comprises a Replacement DDID (R-DDID).
171. The computer-implemented method of claim 169, wherein the first dynamically changing time-unique identifier comprises an Association DDID (A-DDID).
172. The computer-implemented method of claim 169, wherein the R-DDID comprises a particular value for replacing the first value.
173. The computer-implemented method of claim 171, wherein the A-DDID comprises a particular value.
174. The computer-implemented method of claim 173, wherein the particular value further comprises a category, cohort, or range of values for replacing the first value.
175. The computer-implemented method of claim 169, wherein at least one of the following is received through a shim: the request from the first user to provide a privacy policy; the data relating to the first plurality of data bodies; and the first request for the first value.
176. The computer-implemented method of claim 169, wherein the first value comprises a quasi-identifier.
177. The computer-implemented method of claim 176, wherein the quasi-identifier comprises unstructured data.
178. The computer-implemented method of claim 176, wherein the quasi-identifier comprises a category, a cohort, or a range of values.
179. The computer-implemented method of claim 169, wherein the privacy policy specifies generation of synthetic data.
180. The computer-implemented method of claim 179, wherein the privacy policy further specifies a DDID used to generate the synthetic data.
181. The computer-implemented method of claim 169, wherein at least some of the data obtained from the first user comprises synthetic data.
182. The computer-implemented method of claim 169, wherein the data obtained from the first user includes only synthetic data.
183. A non-transitory program storage device readable by a programmable control device, comprising instructions stored thereon that, when executed, cause the programmable control device to:
obtain a request from a first user to provide a privacy policy;
determine a first privacy policy based at least in part on the request;
obtain, from the first user, data relating to a first plurality of data bodies;
generate a first dynamically changing time-unique identifier (DDID) for a first data body of the first plurality of data bodies, wherein the first dynamically changing time-unique identifier is configured to:
replace a first value relating to the first data body; and
comply with the determined first privacy policy;
store the first dynamically changing time-unique identifier in one or more data stores;
receive, over a network, a first request for the first value relating to the first data body;
send, over the network, the first dynamically changing time-unique identifier in response to the first request when the first request is not authorized to receive the first value in accordance with the first privacy policy; and
send, over the network, the first value in response to the first request when the first request is authorized to receive the first value in accordance with the first privacy policy.
184. The non-transitory program storage device of claim 183, wherein the first dynamically changing time-unique identifier comprises a Replacement DDID (R-DDID).
185. The non-transitory program storage device of claim 183, wherein the first dynamically changing time-unique identifier comprises an Association DDID (A-DDID).
186. The non-transitory program storage device of claim 184, wherein the R-DDID includes a particular value for replacing the first value.
187. The non-transitory program storage device of claim 185, wherein the a-DDID includes a particular value.
188. The non-transitory program storage device of claim 187, wherein the particular value further comprises a category, cohort, or range of values for replacing the first value.
189. The non-transitory program storage device of claim 183, wherein at least one of the following is received through a shim: the request from the first user to provide a privacy policy; the data relating to the first plurality of data bodies; and the first request for the first value.
190. The non-transitory program storage device of claim 183, wherein the first value comprises a quasi-identifier.
191. The non-transitory program storage device of claim 190, wherein the quasi-identifier comprises unstructured data.
192. The non-transitory program storage device of claim 190, wherein the quasi-identifier comprises a category, a cohort, or a range of values.
193. The non-transitory program storage device of claim 183, wherein the privacy policy specifies generation of synthetic data.
194. The non-transitory program storage device of claim 193, wherein the privacy policy further specifies a DDID used to generate the synthetic data.
195. The non-transitory program storage device of claim 183, wherein at least some of the data obtained from the first user comprises synthetic data.
196. The non-transitory program storage device of claim 183, wherein the data obtained from the first user comprises only synthetic data.
CN202211401943.6A 2017-04-28 2018-04-27 System and method for implementing centralized privacy control in decentralized systems Pending CN115589332A (en)

Applications Claiming Priority (16)

Application Number Priority Date Filing Date Title
US201762491294P 2017-04-28 2017-04-28
US62/491,294 2017-04-28
US201762535601P 2017-07-21 2017-07-21
US62/535,601 2017-07-21
US201762554000P 2017-09-04 2017-09-04
US62/554,000 2017-09-04
US201762580628P 2017-11-02 2017-11-02
US62/580,628 2017-11-02
US201862644463P 2018-03-17 2018-03-17
US62/644,463 2018-03-17
US201862649103P 2018-03-28 2018-03-28
US62/649,103 2018-03-28
US15/963,609 2018-04-26
US15/963,609 US10572684B2 (en) 2013-11-01 2018-04-26 Systems and methods for enforcing centralized privacy controls in de-centralized systems
PCT/US2018/029890 WO2018201009A1 (en) 2017-04-28 2018-04-27 Systems and methods for enforcing centralized privacy controls in de-centralized systems
CN201880044101.5A CN111149332B (en) 2017-04-28 2018-04-27 System and method for implementing centralized privacy control in decentralized systems

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201880044101.5A Division CN111149332B (en) 2017-04-28 2018-04-27 System and method for implementing centralized privacy control in decentralized systems

Publications (1)

Publication Number Publication Date
CN115589332A true CN115589332A (en) 2023-01-10

Family

ID=63919294

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202211401943.6A Pending CN115589332A (en) 2017-04-28 2018-04-27 System and method for implementing centralized privacy control in decentralized systems
CN201880044101.5A Active CN111149332B (en) 2017-04-28 2018-04-27 System and method for implementing centralized privacy control in decentralized systems

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201880044101.5A Active CN111149332B (en) 2017-04-28 2018-04-27 System and method for implementing centralized privacy control in decentralized systems

Country Status (6)

Country Link
EP (1) EP3616383A4 (en)
JP (1) JP7064576B2 (en)
CN (2) CN115589332A (en)
AU (1) AU2018258656B2 (en)
CA (1) CA3061638C (en)
WO (1) WO2018201009A1 (en)

Families Citing this family (81)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2604540B (en) 2016-02-03 2023-01-11 Luther Systems System and method for secure management of digital contracts
WO2019108676A1 (en) * 2017-11-28 2019-06-06 Yale University Systems and methods of formal verification
US10901974B2 (en) * 2018-03-29 2021-01-26 Salesforce.Com, Inc. Hybrid cloud chain management of centralized and decentralized data
US11775479B2 (en) 2018-05-24 2023-10-03 Luther Systems Us Incorporated System and method for efficient and secure private similarity detection for large private document repositories
WO2020051710A1 (en) * 2018-09-12 2020-03-19 Joe Jay System and process for managing digitized security tokens
CN110009334B (en) * 2018-11-07 2020-04-28 阿里巴巴集团控股有限公司 Meckel tree construction and simple payment verification method and device
CN109257108A (en) * 2018-11-13 2019-01-22 广东水利电力职业技术学院(广东省水利电力技工学校) A kind of multiplicate controlling quantum communications protocol implementing method and system
US11860822B2 (en) 2018-11-19 2024-01-02 Luther Systems Us Incorporated Immutable ledger with efficient and secure data destruction, system and method
US11573973B1 (en) * 2018-12-19 2023-02-07 Vivek Vishnoi Methods and systems for the execution of analysis and/or services against multiple data sources while maintaining isolation of original data source
CN109670341A (en) * 2018-12-29 2019-04-23 中山大学 The method for secret protection that a kind of pair of structural data and semi-structured data combine
CA3126149A1 (en) * 2019-01-11 2020-07-16 Metafyre Inc. Systems, devices, and methods for internet of things integrated automation and control architectures
KR102185191B1 (en) * 2019-01-22 2020-12-01 (주)에스투더블유랩 Method and system for analyzing transaction of cryptocurrency
CN109936626B (en) * 2019-02-19 2020-05-29 阿里巴巴集团控股有限公司 Method, node and storage medium for implementing privacy protection in block chain
US20200302468A1 (en) * 2019-03-22 2020-09-24 The Procter & Gamble Company System and Method Including a Distributed Ledger Data Structure for Authenticating and Clearing Coupons
US20220165384A1 (en) * 2019-03-22 2022-05-26 Nephron Pharmaceuticals Corporation Blockchain systems and methods for remote monitoring
US11562134B2 (en) * 2019-04-02 2023-01-24 Genpact Luxembourg S.à r.l. II Method and system for advanced document redaction
CN110034917A (en) * 2019-04-11 2019-07-19 鸿秦(北京)科技有限公司 A kind of alliance's chain data processing method and device based on homomorphic encryption algorithm
PT115479B (en) 2019-04-29 2021-09-15 Mediceus Dados De Saude Sa COMPUTER SYSTEM AND METHOD OF OPERATION TO MANAGE ANNIMIZED PERSONAL DATA
US11106812B2 (en) 2019-05-09 2021-08-31 At&T Intellectual Property I, L.P. Controlling access to datasets described in a cryptographically signed record
EP3971810A4 (en) 2019-05-14 2022-07-06 Panasonic Intellectual Property Corporation of America Information transaction method, information user terminal, and program
FI20195426A1 (en) * 2019-05-23 2020-11-24 Univ Helsinki Compatible anonymization of data sets of different source
US20200402624A1 (en) * 2019-06-19 2020-12-24 Electronic Health Record Data, Inc. Electronic Healthcare Record Data Blockchain System
CN110502592B (en) * 2019-08-27 2023-08-11 深圳供电局有限公司 Project domain topic analysis system based on big data analysis technology
CN110598386B (en) * 2019-09-27 2023-05-30 腾讯科技(深圳)有限公司 Block chain-based data processing method, device, equipment and storage medium
JPWO2021100386A1 (en) * 2019-11-21 2021-05-27
CN114766019A (en) * 2019-11-25 2022-07-19 瑞典爱立信有限公司 Face anonymization system based on block chain
CN110955879B (en) * 2019-11-29 2023-04-18 腾讯科技(深圳)有限公司 Device control method, device, computer device and storage medium
CN111049856A (en) * 2019-12-26 2020-04-21 中国联合网络通信集团有限公司 Authentication method and device
US20210266170A1 (en) * 2020-02-26 2021-08-26 Antonio Rossi System and method of trustless confidential positive identification and de-anonymization of data using blockchain
CN111400756A (en) * 2020-03-13 2020-07-10 杭州复杂美科技有限公司 Private data uplink method, device and storage medium
US11531724B2 (en) 2020-03-28 2022-12-20 Dataparency, LLC Entity centric database
CN111428207B (en) * 2020-04-23 2023-11-14 重庆邮电大学 Digital copyright registration and transaction method based on blockchain technology
WO2020169125A2 (en) 2020-06-08 2020-08-27 Alipay Labs (singapore) Pte. Ltd. Blockchain-based document registration for custom clearance
WO2020169126A2 (en) * 2020-06-08 2020-08-27 Alipay Labs (singapore) Pte. Ltd. Managing user authorizations for blockchain-based custom clearance services
CN111868725B (en) 2020-06-08 2024-05-24 支付宝实验室(新加坡)有限公司 Processing import customs clearance data based on blockchain
EP3837617B1 (en) 2020-06-08 2023-08-02 Alipay Labs (Singapore) Pte. Ltd. Distributed storage of custom clearance data
SG11202102366SA (en) 2020-06-08 2021-04-29 Alipay Labs Singapore Pte Ltd User management of blockchain-based custom clearance service platform
SG11202103226UA (en) 2020-06-08 2021-04-29 Alipay Labs Singapore Pte Ltd Blockchain-based smart contract pools
CN111797400B (en) * 2020-07-08 2023-09-01 国家计算机网络与信息安全管理中心 Dynamic detection method and device for malicious application of Internet of vehicles
CN111881480A (en) * 2020-07-31 2020-11-03 平安付科技服务有限公司 Private data encryption method and device, computer equipment and storage medium
US11481513B2 (en) * 2020-08-14 2022-10-25 Sap, Se Decentralized storage of personal data
CN112073484B (en) * 2020-08-28 2022-01-04 武汉大学 GDPR compliance supervision method and system based on alliance chain
EP4218172A4 (en) * 2020-09-28 2024-04-24 NXM Labs, Inc. Security management of networked devices using a distributed ledger network
CN112199717B (en) * 2020-09-30 2024-03-22 中国科学院信息工程研究所 Privacy model training method and device based on small amount of public data
CN114024958A (en) * 2020-10-30 2022-02-08 北京八分量信息科技有限公司 Trust architecture aiming at autonomous propagation
US12093974B2 (en) * 2020-10-30 2024-09-17 Lucid Ratings, Inc. Review engine with blockchain-based verification
EP3995982A1 (en) * 2020-11-04 2022-05-11 Sistron BV System and method for storing and processing personal data
TWI829215B (en) * 2020-11-10 2024-01-11 林庠序 De-centralized data authorization control system capable of inspecting transfer history of read token to verify activity of read token
TWI829221B (en) * 2020-11-10 2024-01-11 林庠序 De-centralized data authorization control system capable of allowing data requestetr device to inspect correctness of data authorization policy stored in block chain subsystem
TWI829218B (en) * 2020-11-10 2024-01-11 林庠序 De-centralized data authorization control system capable of indirectly transferring read token through third-party service subsystem
TWI829222B (en) * 2020-11-10 2024-01-11 林庠序 De-centralized data authorization control system capable of utilizing third-party service subsystem to provide accessible data list to data requester device
TWI829216B (en) * 2020-11-10 2024-01-11 林庠序 De-centralized data authorization control system capable of forwarding token request through third-party service subsystem
TWI829219B (en) * 2020-11-10 2024-01-11 林庠序 De-centralized data authorization control system capable of transferring read token from block chain subsystem to data requester device
TWI829220B (en) * 2020-11-10 2024-01-11 林庠序 De-centralized data authorization control system capable of utilizing smart contract to generate and transfer authorization token
TWI829217B (en) * 2020-11-10 2024-01-11 林庠序 De-centralized data authorization control system capable of flexibly adjusting data authorization policy
CN112492636B (en) * 2020-12-18 2023-06-16 中国联合网络通信集团有限公司 Method and device for determining propagation loss
IT202000032405A1 (en) * 2020-12-28 2022-06-28 Stella All in One Srl METHOD FOR DIGITAL RIGHTS MANAGEMENT OF DOCUMENTS FOR DIGITIZATION, ARCHIVING AND DESTRUCTION FOR ISO27001 COMPLIANCE
US11874827B2 (en) 2020-12-30 2024-01-16 Luther Systems Us Incorporated System and method for automatic, rapid, and auditable updates of digital contracts
CN113177219A (en) * 2021-05-26 2021-07-27 永旗(北京)科技有限公司 Network data privacy protection method
US11483369B1 (en) * 2021-06-07 2022-10-25 Ciena Corporation Managing confirmation criteria for requested operations in distributed microservice networks
CN113676867B (en) * 2021-06-10 2023-11-07 西安电子科技大学 Internet of vehicles spectrum sharing excitation method, system, equipment, medium and terminal
CN113422681B (en) * 2021-06-16 2022-02-01 国网电子商务有限公司 Block chain digital signature method, device and system based on quantum cryptography
CN113297605B (en) * 2021-06-24 2023-05-05 中国建设银行股份有限公司 Copy data management method, apparatus, electronic device, and computer readable medium
CN113642036B (en) * 2021-07-07 2023-07-28 阿里巴巴华北技术有限公司 Data processing method, device and system
CN113852592B (en) * 2021-07-13 2024-08-20 天翼数字生活科技有限公司 Big data security operation and maintenance management and control method and system based on dynamic access control strategy
KR102570616B1 (en) * 2021-07-15 2023-08-23 주식회사 카카오 Method for generating de-identified key of terminal, server and terminal implementing the method
CN113360417B (en) * 2021-07-27 2024-08-02 中国工商银行股份有限公司 Test method, session modifier, electronic device and medium
US20230075246A1 (en) * 2021-09-07 2023-03-09 Collibra Nv Systems and methods for policy management
TWI790985B (en) * 2021-10-28 2023-01-21 市民永續股份有限公司 Data read authority control system based on block chain and zero-knowledge proof mechanism, and related data service system
CN113810507B (en) * 2021-11-18 2022-02-15 南京信息工程大学 Block chain credible node partitioning method based on IDE
CN114124376B (en) * 2021-11-23 2023-05-23 中国标准化研究院 Data processing method and system based on network data acquisition
CN114022049B (en) * 2021-12-10 2022-07-22 佛山市蜂王人力资源有限公司 Intelligent service information risk processing method and system based on cloud computing
CN114117540B (en) * 2022-01-25 2022-04-29 广州天鹏计算机科技有限公司 Big data analysis processing method and system
CN114978594B (en) * 2022-04-18 2024-02-09 南京工程学院 Self-adaptive access control method for cloud computing privacy protection
KR20230159087A (en) * 2022-05-13 2023-11-21 주식회사 헤세그 Method for using token on blockchain where recombined information is stored and system performing the same
CN115099814B (en) * 2022-06-13 2024-08-02 马上消费金融股份有限公司 Information processing method, device, equipment and storage medium
US12105848B2 (en) * 2022-08-19 2024-10-01 Telesign Corporation User data deidentification system
CN116010127B (en) * 2023-02-24 2023-08-29 荣耀终端有限公司 Message processing method, device and storage medium
KR20240137844A (en) * 2023-03-09 2024-09-20 주식회사 애브체인 Method and system for processing personal information using trust execution environment based on smart contract
US11792125B1 (en) * 2023-05-16 2023-10-17 Citibank, N.A. Reducing network traffic by filtering network requests based on network request-related information systems and methods
KR102686297B1 (en) * 2023-12-04 2024-07-22 (주)에이아이딥 Method for detecting and recognizing personal information, apparatus and computer program for performing the method

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9537650B2 (en) * 2009-12-15 2017-01-03 Microsoft Technology Licensing, Llc Verifiable trust for data through wrapper composition
US8862999B2 (en) * 2010-11-22 2014-10-14 International Business Machines Corporation Dynamic de-identification of data
CN104380690B (en) * 2012-06-15 2018-02-02 阿尔卡特朗讯 Framework for the intimacy protection system of recommendation service
US9129133B2 (en) * 2013-11-01 2015-09-08 Anonos, Inc. Dynamic de-identification and anonymity
US9361481B2 (en) * 2013-11-01 2016-06-07 Anonos Inc. Systems and methods for contextualized data protection
WO2016161073A1 (en) 2015-03-31 2016-10-06 Nasdaq, Inc. Systems and methods of blockchain transaction recordation
US10366204B2 (en) * 2015-08-03 2019-07-30 Change Healthcare Holdings, Llc System and method for decentralized autonomous healthcare economy platform
CA2995492A1 (en) * 2015-08-14 2017-02-23 Identitii Pty Ltd A computer implemented method for processing a financial transaction and a system therefor
US10454901B2 (en) * 2016-01-19 2019-10-22 Datavant, Inc. Systems and methods for enabling data de-identification and anonymous data linkage
JP6731783B2 (en) 2016-05-19 2020-07-29 株式会社野村総合研究所 Tamper detection system and tamper detection method
US11562812B2 (en) * 2016-07-15 2023-01-24 E-Nome Pty Ltd Computer implemented method for secure management of data generated in an EHR during an episode of care and a system therefor

Also Published As

Publication number Publication date
CN111149332A (en) 2020-05-12
CA3061638A1 (en) 2018-11-01
AU2018258656A1 (en) 2019-12-12
CN111149332B (en) 2022-09-23
JP7064576B2 (en) 2022-05-10
CA3061638C (en) 2022-04-26
JP2020519210A (en) 2020-06-25
WO2018201009A1 (en) 2018-11-01
EP3616383A1 (en) 2020-03-04
EP3616383A4 (en) 2020-04-08
AU2018258656B2 (en) 2021-04-22

Similar Documents

Publication Publication Date Title
CN111149332B (en) System and method for implementing centralized privacy control in decentralized systems
US11790117B2 (en) Systems and methods for enforcing privacy-respectful, trusted communications
US12093426B2 (en) Systems and methods for functionally separating heterogeneous data for analytics, artificial intelligence, and machine learning in global data ecosystems
US10572684B2 (en) Systems and methods for enforcing centralized privacy controls in de-centralized systems
US10043035B2 (en) Systems and methods for enhancing data protection by anonosizing structured and unstructured data and incorporating machine learning and artificial intelligence in classical and quantum computing environments
US9619669B2 (en) Systems and methods for anonosizing data
US9361481B2 (en) Systems and methods for contextualized data protection
EP3063691B1 (en) Dynamic de-identification and anonymity
CA2929269C (en) Dynamic de-identification and anonymity
US20230054446A1 (en) Systems and methods for functionally separating geospatial information for lawful and trustworthy analytics, artificial intelligence and machine learning
CA3104119C (en) Systems and methods for enforcing privacy-respectful, trusted communications
Zigomitros et al. A survey on privacy properties for data publishing of relational data
US20230147698A1 (en) System and method for controlling data using containers
WO2019086553A1 (en) Privacy management
CA2975441C (en) Systems and methods for contextualized data protection
US20140287723A1 (en) Mobile Applications For Dynamic De-Identification And Anonymity
Islam Privacy by design for social networks
Muid et al. Electronic Health Record Sharing and Access Controlling Blockchain Architecture using Data De-identification Method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230412

Address after: Oregon, USA

Applicant after: Datawing Intellectual Property Co.,Ltd.

Address before: New York, USA

Applicant before: Data wing Co.,Ltd.