US20150067786A1 - Visual image authentication and transaction authorization using non-determinism - Google Patents
Visual image authentication and transaction authorization using non-determinism
- Publication number
- US20150067786A1 (application number US14/017,735)
- Authority
- US
- United States
- Prior art keywords
- user
- images
- transaction
- service provider
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/36—User authentication by graphic or iconic representation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/38—Payment protocols; Details thereof
- G06Q20/382—Payment protocols; Details thereof insuring higher security of transaction
- G06Q20/3821—Electronic credentials
- G06Q20/38215—Use of certificates or encrypted proofs of transaction rights
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/38—Payment protocols; Details thereof
- G06Q20/385—Payment protocols; Details thereof using an alias or single-use codes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/38—Payment protocols; Details thereof
- G06Q20/40—Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
- G06Q20/401—Transaction verification
- G06Q20/4014—Identity check for transactions
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09C—CIPHERING OR DECIPHERING APPARATUS FOR CRYPTOGRAPHIC OR OTHER PURPOSES INVOLVING THE NEED FOR SECRECY
- G09C5/00—Ciphering apparatus or methods not provided for in the preceding groups, e.g. involving the concealment or deformation of graphic data such as designs, written or printed messages
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/08—Network architectures or network communication protocols for network security for authentication of entities
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
- H04L9/14—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols using a plurality of keys or algorithms
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W12/00—Security arrangements; Authentication; Protecting privacy or anonymity
- H04W12/06—Authentication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/21—Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/2117—User registration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/21—Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/2133—Verifying human interaction, e.g., Captcha
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L2209/00—Additional information or applications relating to cryptographic mechanisms or cryptographic arrangements for secret or secure communication H04L9/00
- H04L2209/24—Key scheduling, i.e. generating round keys or sub-keys for block encryption
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L2463/00—Additional details relating to network architectures or network communication protocols for network security covered by H04L63/00
- H04L2463/102—Additional details relating to network architectures or network communication protocols for network security covered by H04L63/00 applying security measure for e-commerce
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W12/00—Security arrangements; Authentication; Protecting privacy or anonymity
- H04W12/60—Context-dependent security
- H04W12/69—Identity-dependent
- H04W12/77—Graphical identity
Definitions
- This specification relates to security in computers, mobile phones and other devices.
- a shortcoming in the prior art, recognized by this specification, is that there is a lack of a secure integration of the identity of the user to the protection of the user's data and the control of the user's computer.
- A critical part of the computer instructions for an action or a transaction is usually executed on the host domain machine (e.g., the user's computer).
- Some examples of the user's computer are a MacBook Pro, a Dell desktop computer, an iPhone, a BlackBerry or an Android phone.
- cryptography keys are stored on the user's computer or a chip executing the operating system, which is not secure.
- When Bob's computer communicates with Mary's computer, even when using well-implemented Public Key Infrastructure (PKI), Bob's computer can only be sure that it is communicating with Mary's computer. Bob cannot be sure that he is communicating with Mary, and vice versa. Similarly, Bob cannot be certain that the communications he sends Mary are the same as the communications that Mary receives as coming from him.
- each computer cannot be assured of who controls the other computer.
- The Trusted Platform Module (TPM) has the fundamental cybersecurity weakness of not knowing who controls the other computer with which a user may be communicating, or who controls the computer that contains the Trusted Platform Module.
- Not knowing who controls the other computer with which the current computer is communicating may be a significant weakness when the operating system can directly access the TPM. If the user's computer is compromised, then the attacker can access the TPM.
- Another limitation and weakness of the TPM is that there is no mechanism for binding the identity of the user to the user's cryptography keys and other confidential information that should be bound to the user's true identity.
- the authorization of an action could be the execution of a financial transaction from a user's bank account, a stock trade in a user's brokerage account, the execution of an important functionality on the electrical grid, or access to important data on a private network such as SIPRnet (e.g. WikiLeaks).
- the authorization of an action typically occurs through the web browser since the web browser presents a convenient interface for a person. However, the web browser is where the important connection between authentication of a user and authorization of an action may be broken.
- Existing systems have the user authenticating the user's computer, and then the same user's computer also authorizes (and may also execute) the action. Since the user's computer can be hacked, the lack of a secure and direct link between authenticating the user's computer and authorizing the action may render the act of user verification irrelevant.
- Biometrics can be advantageous for security because it offers a reliable method for verifying which person is actually initiating a transaction.
- If the handling of the biometric information, the storage of the biometric data, or the control of actions based on a biometric verification is done on an unsecured user's computer, the value of the biometrics may be greatly reduced or nullified.
- A Trojan attack is an attack in which the attacker pretends to be the user and/or the other system with which the user is communicating. In other words, a valid, authorized user cannot verify that the action he or she is trying to execute is what is actually being executed, because a third party may be masquerading as the other system.
- the RSA SecurID® token is the industry-leading technology for authenticating and securing identity in online transactions.
- The recent attack and subsequent breach of the RSA SecurID token (announced March 2011) has highlighted the fundamental problems with current cybersecurity solutions. Malware played a significant role in causing this breach. Malicious software has many forms: viruses, worms, Trojan horses, spyware, etc., all of which have the singular purpose of undermining the security, confidentiality, integrity or availability of computer systems. Recent über malware is invisible. It encrypts and camouflages itself using the same mathematical techniques used by traditional, white hat cryptography.
- Malware is able to phish passwords or hijack financial transactions made via mobile devices or personal computers without the user's knowledge. It is not necessary for malware to break the cryptography of a device to compromise its security. Contemporary computers and electronic devices are particularly susceptible to malware attacks due to their processor architecture.
- The processors have a von Neumann architecture, which executes only one computing instruction at a time.
- malware has to corrupt or transform only a single machine instruction to initiate execution of malignant code.
- This is a deep vulnerability arising from current processor architecture and it cannot be easily rectified. Only one legitimate jump or branch instruction needs to be changed in a digital computer program to start it executing malware.
- Anti-virus software that is supposed to check the program might not get executed, may be disabled, or in other cases may never detect the malware.
- The sequential execution of von Neumann machine instructions hinders a digital computer program from protecting itself.
- a common malware technique is the so-called “man-in-the-middle” attack.
- This attack is an active form of eavesdropping in which the attacker makes independent connections with the counterparties in a given transaction; by using appropriate authentication the attacker controls the entire transaction.
- the counterparties are unaware of the presence of the attacker and assume they are transacting securely with each other.
- Internet communications and financial transactions can be intercepted and hijacked by malware (malicious software) performing a “man-in-the-middle” attack.
- These attacks are not easy to detect or prevent.
- The RSA SecurID breach demonstrated that pseudo-random number generators (i.e., deterministic algorithms), typically used in two-factor authentication solutions, cannot prevent "man-in-the-middle" attacks launched by malware.
- Malware has a significant weakness: malware is poor at recognizing visual images since computer algorithms cannot match the visual pattern recognition ability of the human brain. Human beings have highly advanced visual pattern recognition skills. The embodiments described here exploit this fundamental weakness of malware.
- A third fundamental shortcoming of current cybersecurity solutions is the fact that static authentication factors, such as passwords, PINs and biometrics, are entered directly into the user's computer or stored on computers in a digital or binary format such as ASCII code. This weakness makes static authentication factors vulnerable to phishing attacks in the host domain or security breaches in the network domain.
- FIG. 1A shows a block diagram of an embodiment of a system for executing secure transactions resistant to malware.
- FIG. 1B shows a memory system that is a component of the system shown in FIG. 1A.
- FIG. 2A shows a block diagram of an embodiment of a service provider system.
- FIG. 2B shows a memory system that is a component of the system in FIG. 2A.
- FIG. 3 shows a flow diagram of a user setting up the system to enable the execution of secure transactions.
- FIG. 3A shows a flow diagram of an embodiment of step A of executing a secure transaction.
- FIG. 3B shows a flow diagram of an embodiment of step B of executing a secure transaction.
- FIG. 4 shows a collection of images that are parts or a whole of a logo. Some of the images are rotated.
- FIG. 5 shows a collection of images.
- One is part of a logo.
- the word NAME is made up of a collection of images with a doodle background texture.
- FIG. 6 shows a collection of images. One is part of a logo. Another is the word BANK with a simplicial and dotted texture background. There are 26 visual images of the alphabet letters “ABCDEFGHIJKLMNOPQRSTUVWXYZ”.
- FIG. 7 shows a collection of images. One is part of a logo. Another is the word “ACCEPT” written with bubble texture on a simplicial background texture. And a third is the word “ABORT” written with bubble texture on a foliation background texture.
- FIG. 8 shows a collection of images.
- One object is a geometric image of a blue rectangle on top of a blue triangle which is on top of a red blob.
- Just to the right is a rectangle with vertical texture on top of a triangle with dotted texture on top of a blob with simplicial texture.
- FIG. 9 shows a collection of images, illustrating some different textures.
- FIG. 9 shows nine different textures: vertical, horizontal, mixed, dotted, bubble, simplicial and foliation.
- FIG. 10 shows recipient account number 9568342710 represented by two distinct collections of images.
- the number 3 is represented with a visual image using a triangular texture.
- the number 4 is represented with the letters “FOUR” using bubble texture to write the letters.
- FIG. 11 shows a collection of images representing a universal identifier. A subset of these can be used for user authentication.
- FIG. 12 a shows a user interface page for enrollment.
- FIG. 12 b shows a user interface page for enrollment that displays different visual image categories.
- FIG. 12 c shows a user interface page for enrollment that displays different images in the “sports” category.
- One image represents “cycling”.
- Another image represents “tennis”.
- Another image represents “skiing”.
- FIG. 13 a shows a user interface page for user verification or user login using visual images.
- FIG. 13 b shows a user interface page for user verification that displays different visual images.
- One image displays an elephant.
- Another image displays a car.
- FIGS. 14 a, 14 b and 14 c show the use of correlation to detect and find the locations of features in an image.
- FIG. 14 a shows an image of the word “apple” in a handwritten style font.
- FIG. 14 b shows an image, representing the letter “p” in a handwritten style font.
- FIG. 14c shows a correlation function image that reveals the presence and exact locations of the letter "p" in the image in FIG. 14a, indicated by the bright peaks.
- FIGS. 15 a, 15 b and 15 c show the use of special types of noise to hinder the use of the correlation operation to find features in an image.
- FIG. 15a shows an image representing the word "apple" in the same handwritten style font as FIG. 14a, but with non-deterministic noise added.
- FIG. 15 b shows a raw image, representing the letter “p” in a handwritten style font.
- FIG. 15 c shows an unintelligible correlation function image, indicating the inability to detect the locations of the “p” in the image in FIG. 15 a.
- FIG. 16 shows a semiconductor device that is a photodetector. This hardware device can detect the arrival of single photons, which is an embodiment of quantum randomness.
- FIG. 17 shows a device that receives a polarized photon and sends it into a linear horizontal/vertical analyzer with a 50% chance of detecting a "0" or a "1".
- This hardware device can detect the polarization of single photons, which is an embodiment of quantum randomness.
- Under the device is a diagram representing a photon that is circularly polarized.
- Under the device, on the right, is a diagram representing a photon that is linearly polarized.
- FIG. 18 shows a random noise generator and a digital logic circuit that captures and outputs this randomness. Below the generator are the time delays between separate events that are detected.
- the random noise generator may be implemented with a photodetector as shown in FIG. 16 . In this embodiment, the arrival times of photons enable quantum randomness.
- a promising method for cybersecurity is described that is more secure against modern malware, and provides a much better user experience compared with passwords or hardware tokens such as SecurID.
- No More Passwords uses visual images that are selected by a user to create a set of “favorites” that can easily be recalled and quickly selected by the user at login.
- No more passwords leverages the superior power of eye-brain processing of humans versus machines to ensure that a human, and not a bot or malware, is involved in a transaction or communication.
- The invention(s) described herein use the unique, innate pattern recognition skills of humans to transform cybersecurity. This advances online transaction security, which currently relies mainly on the straightforward use of passwords or, in some cases, on the addition of other security enhancements that may provide some improvement in security but are still inadequate. These measures typically increase the cost of the system while greatly reducing the convenience to the user.
- This invention can eliminate use of the alpha-numeric "password", such as "34YUiklmn", or a sequence of ASCII symbols, such as "94Yzi2_e$mx&".
- the invention(s) herein also provides a basis for a much-improved user interface and the overall user experience around securing online transactions, access control, and the protection of an individual's personal data and identity.
- the invention(s) described herein use visual representations (images) that are both personal and memorable to each individual user.
- There is an enrollment process in which the user selects a set of images from a group of categories representing the user's “favorites.”
- the user is asked to select some or all personal favorites from a set of randomly-selected options as verification of both the user's identity and the fact that the user is in fact a human instead of an automated system that has hijacked the transaction flow.
- This approach has a number of advantages in terms of convenience to the user, while allowing anti-malware methods to be applied that provide substantial anti-hacking capability.
- the use of visual images to create a unique identity for a user has many advantages: The system is not only highly secure and resistant to various hacks and malware attacks, but is also intuitive, easy to use and attractive to users.
- The core technology behind an identity security system should support a user interface (UI) that provides all of these benefits; there is sufficient flexibility in the UI design and a range of security-enhancing features that can be used together in various ways to allow the UI design to be tailored to the needs of both the user and the device (e.g., PC, iPhone, Android phone, iPad, tablet computer) on which it is being used.
- Since interaction with the user is a key part of the technology, it is helpful to describe a UI design example for two reasons: 1) to ensure that the technology is both effective and easy to use; 2) to help explain how embodiments work.
- the UI should be designed to run on the device(s) of choice within the intended application and tested for intuitiveness, ease of use, functionality, acceptance by and attractiveness to product users.
- the UI described in this section is intended as an example design, and shows how it might be implemented on a mobile phone.
- the example shown here is only meant to provide general clarity about what can be done with this technology and to serve as a high-level use case to describe the flow for creating and entering a unique login identity for a user.
- To enroll, the user first initiates the enrollment.
- the process will start with the launching of an application, or a request to enroll within a running application on a particular device such as a mobile phone, computer, terminal or website.
- In the example here, and in the figures below, the device is a mobile phone and the user starts the enrollment process by launching an app.
- the application starts enrollment by displaying the first enrollment screen with a superimposed popup window that provides brief instructions for enrollment and a box in which the user is asked to enter a username. This is shown in FIG. 12 a.
- the popup window disappears, showing the first enrollment screen that provides a list of categories for the user's “favorites”, as shown in FIG. 12 b.
- categories can include almost anything, such as animals, musical instruments, travel destinations, famous people, sports, etc.
- a few items from the list of categories are displayed on the screen, but the central portion of the screen with the category icons can be scrolled up or down to show other choices.
- the items that the user has currently chosen are shown as small icons at the bottom of the screen. This part of the display (and the header at the top) does not scroll, and provides a running tally of the user's choices throughout the process. This also serves as messaging to the user as to the progress of the enrollment process.
- a second screen appears showing specific items in the chosen category.
- This screen is shown in FIG. 12 c.
- the tally of choices is carried over, and shown at the bottom of this screen as well.
- the central portion of the screen with the icons can be scrolled up or down to display more than the nine items shown on the screen at one time. The user can then select his/her item from the available choices. Once a selection is made, the item chosen appears in the running tally below, and the display reverts back to the first enrollment screen which provides the category choices again.
- This process is repeated seven times in this example.
- The number of choices required from the user for enrollment can be changed, depending on the security level required and on what is an acceptable enrollment process for a particular case. In general, the fewer the choices required of the user, the less secure the embodiment will be, but the trade-off between security and ease of use is important and should be decided on a case-by-case basis.
- FIG. 12 a shows page 1 of an enrollment user interface (UI), which shows the popup window superimposed requesting entry of the username.
- the username is obscured in a similar way as is typically accomplished with a password field.
- FIG. 12b shows page 1 of an enrollment UI after the disappearance of the popup window, showing the same screen with the scrollable "favorites" category selections and the boxes at the bottom of the screen where the running tally of the user's choices will be displayed.
- FIG. 12 c shows enrollment page 2 showing 9 of the specific item choices available, and allowing the user to scroll for more.
- the user had selected “sports” as the category on the previous page, and this is the third favorite, as indicated by the running tally below showing the two previous choices, and the note in the header that reads “selection 3.”
- an enrolled user initiates verification by launching an image-enabled app, or requesting login to a local or remote system.
- the verification screen appears with a randomized group of choices for the user, and a popup window superimposed that requests entry of the username, as depicted in FIG. 13 a.
- the popup window disappears exposing the screen with the randomized choices for the user to select. This is shown in FIG. 13 b.
- the options offered on a single screen contain at least one of the user's “favorite” images that were selected at enrollment, but also contains a number of other “incorrect” options that are selected randomly from a large set of options.
- the central portion of the screen with the icons of the selectable items can be scrolled up or down to expose more choices.
- The user chooses four of the seven favorite items that were chosen at enrollment. Once all favorites have been correctly selected, the screen disappears, and login proceeds. If the choices are incorrect, the login process starts over again from the beginning. For added security, there may be a limit placed on the number of failed attempts a user can make in a login session.
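- The verification flow just described can be sketched as follows (Python; the screen size, the required number of selections, and the use of the secrets module as a stand-in for the non-deterministic hardware generator described later are all illustrative assumptions, not the specification's implementation):

```python
import secrets

def build_screen(favorites, decoy_pool, screen_size=9):
    """Return one verification screen: one enrolled favorite mixed with decoys."""
    favorite = favorites[secrets.randbelow(len(favorites))]
    pool = [d for d in decoy_pool if d not in favorites]
    assert len(pool) >= screen_size - 1, "decoy pool too small for a screen"
    decoys = []
    while len(decoys) < screen_size - 1:
        candidate = pool[secrets.randbelow(len(pool))]
        if candidate not in decoys:
            decoys.append(candidate)
    # Drop the favorite into an unpredictable position among the decoys.
    screen = list(decoys)
    screen.insert(secrets.randbelow(screen_size), favorite)
    return screen, favorite

def verify(favorites, decoy_pool, get_user_choice, required=4):
    """Present `required` screens; every selection must be that screen's favorite."""
    for _ in range(required):
        screen, favorite = build_screen(favorites, decoy_pool)
        if get_user_choice(screen) != favorite:
            return False  # in the real flow, login restarts from the beginning
    return True
```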
- FIG. 13 a shows a verification user interface (UI) page with the popup window superimposed requesting entry of the username.
- the verification page is a single page to make verification quick and simple.
- the username can be obscured in the same way as is typically done with a password field.
- FIG. 13 b shows the verification user interface page after the disappearance of the popup window showing the scrollable “favorites” selections that have been randomly selected from a large array of options.
- the application is ready for the third favorite out of a total of four required for verification.
- more than four favorites may be requested for a successful login.
- more than four favorites may be requested to complete a financial transaction.
- less than four favorites may be requested for a successful login.
- Robust security is desired, but convenience and a positive user experience are also important.
- This tradeoff is fundamental to security technology from the old-fashioned lock and key, to the most modern and sophisticated security technology used today.
- The technology and embodiments have flexibility in this aspect, and the choice of these parameters can be adjusted, not only from one application to another but, if desired, from one transaction to another. For example, if in an embodiment a user has chosen seven items at enrollment, he/she may be asked to select only four items to unlock the phone interface, but when logging into a bank account, he/she may be asked to enter all seven items. In an alternative embodiment, the user may be requested to select 12 items instead of 7. This means that the technology can be adjusted "on the fly" to accommodate varying security levels for different embodiments.
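- As a rough worked example of this trade-off (the nine-images-per-screen figure is an illustrative assumption, not a parameter fixed by this specification): if each verification screen shows nine images of which exactly one is the user's favorite, a machine that cannot recognize the images and guesses blindly succeeds with probability (1/9)^4 = 1/6,561 ≈ 0.015% when four correct selections are required, and (1/9)^7 = 1/4,782,969 ≈ 0.00002% when all seven are required.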
- The use of images, plus the application of image processing and a non-deterministic random number generator, makes the UI and the system secure against sophisticated malware and hacking methods.
- the images shown in the UI diagrams above can be reordered, and the options offered can be changed using the non-deterministic random numbers on every screen during enrollment and verification. This removes the possibility of malware or onlookers recognizing patterns in what is being presented to the user, or following the user's behavior.
- the images themselves are modified to prevent sophisticated malware from running in the background to recognize the images directly by means of computational pattern recognition.
- the UI design presented here is an example of how embodiments can be implemented.
- The user interface may be configured very differently. On some systems, it may be best to guide users through a series of separate screens instead of scrolling. If scrolling is preferred, it can be done in one or two dimensions on the screen, or perhaps using scroll wheels, similar to those used in the Apple iPhone's date and time settings.
- more category options, or sub category options may be useful.
- the items can be categorized, similar to the example for enrollment, and it may be desirable to have all the choices displayed on a single screen, rather than offering more items to choose via scrolling, in which case the categories could be panelized on the screen.
- the choice of the images used is also to be considered. Simple binary images, such as those shown in the example of an embodiment, may be used in some embodiments. Full-color images could be used as well, depending on what sort of image processing is preferred for security enhancements. The shape and size of the images is flexible as well. The images chosen could even be opened up to the user by providing a large database of downloadable images, similar to the wide array of ringtones now available for cell phones. There may be some restrictions on the properties of the images used, however, again depending on the specifics of the security needs, the device, and the user interface design, but overall, it is extremely flexible.
- Unpredictable numbers are used to unpredictably place images on the screen.
- Unpredictable numbers are used to add unpredictable noise to images.
- the sequence of images used for a login/authentication cannot be reproduced by a digital computer program because the numbers are not generated by a deterministic algorithm (i.e., a digital computer program). Instead, quantum devices are used.
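- A minimal sketch (in Python, with illustrative names and screen dimensions) of how such unpredictable numbers might drive image placement on the screen; the secrets module, which draws on operating-system entropy, merely stands in here for the quantum hardware source described below:

```python
import secrets

def place_images(image_ids, screen_w=1080, screen_h=1920, tile=200):
    """Assign each image an unpredictable, non-overlapping grid cell on the screen."""
    cols, rows = screen_w // tile, screen_h // tile
    cells = [(c, r) for c in range(cols) for r in range(rows)]
    placement = {}
    for img in image_ids:
        cell = cells.pop(secrets.randbelow(len(cells)))  # choose and remove a cell
        placement[img] = (cell[0] * tile, cell[1] * tile)  # top-left pixel coordinates
    return placement

# A fresh, unpredictable arrangement is produced for every screen.
print(place_images(["elephant", "car", "tennis", "violin", "apple"]))
```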
- the quantum devices utilize one or more photons being emitted from a device and generating a random 0 or 1 based on the time at which the photon is emitted.
- A well-designed quantum device can generate numbers according to the following two quantum-random properties: there is no bias, and history has no effect on the next event.
- the resulting unpredictability incorporated into the image generation and manipulation in the system can make the recognition of the visual images a difficult artificial intelligence (AI) problem for machines.
- This unpredictability can be applied in the noise generation that is used to make visual images more difficult for machine algorithms to recognize.
- a hardware device detects the polarization of photons and uses this detection to determine a quantum random 0 or 1.
- the hardware detector uses linearly polarized photons (light).
- the hardware detector uses circularly polarized photons (light).
- a quantum random 0 or 1 is generated by the detection of a single photon.
- a quantum random 0 or 1 is generated by the detection of more than one photon.
- a quantum random 0 or 1 is generated based on the relative timing based on quantum events 0, 1 and 2.
- T1 is the time elapsed between quantum event 0 and quantum event 1;
- T2 is the time elapsed between quantum event 1 and quantum event 2.
- If elapsed time T1 is greater than elapsed time T2, then a quantum random 1 is generated; if elapsed time T1 is less than elapsed time T2, then a quantum random 0 is generated.
- events 0, 1, and 2 are the result of detecting a photon. In another embodiment, events 0, 1 and 2 are the result of detecting a photon that is horizontally polarized.
- the detection of a photon may occur in a semiconductor chip as shown in FIG. 16 .
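- The rule just described can be made concrete with a short sketch (Python; the function name and the example timestamps are illustrative, and in practice the timestamps would come from the photon-detection hardware of FIG. 16 rather than a hard-coded list):

```python
from typing import Iterable, List

def bits_from_arrival_times(timestamps: Iterable[float]) -> List[int]:
    """Extract random bits from non-overlapping triples of photon-detection times."""
    ts = list(timestamps)
    bits: List[int] = []
    # Disjoint triples (events 0-1-2, 3-4-5, ...) so no time interval is reused.
    for i in range(0, len(ts) - 2, 3):
        T1 = ts[i + 1] - ts[i]      # time between event 0 and event 1 of the triple
        T2 = ts[i + 2] - ts[i + 1]  # time between event 1 and event 2 of the triple
        if T1 > T2:
            bits.append(1)
        elif T1 < T2:
            bits.append(0)
        # if T1 == T2 the triple is discarded to avoid bias
    return bits

# Example with made-up detection times (seconds); real values would come from
# the photon-detection hardware.
print(bits_from_arrival_times([0.0, 1.7e-6, 2.1e-6, 5.0e-6, 5.3e-6, 9.9e-6]))
```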
- images help ensure that a human, not a machine, is controlling the transaction or the communication between the user and institution. This is based on the highly developed ability of humans to recognize images. Although machine vision is embryonic by comparison with the mature image recognition abilities of the human eye-brain combination, it is possible for machines to recognize images. In order to provide robust security in anticipation of the possibility that sophisticated malware may incorporate machine vision techniques to attack image-based security systems, proprietary methods were developed to counteract computational image recognition, and fully exploit innate human pattern recognition abilities.
- The correlation operation is a direct point-by-point mathematical "comparison" of two functions that can be used not only to detect the presence of a feature in an image, but also to find its location accurately.
- The continuous expression that describes the non-normalized correlation operation C between two real, one-dimensional functions A and B is C(x) = (A ⊕ B)(x) = ∫ A(t)·B(t + x) dt, where the ⊕ operator represents the correlation operation.
- When a rotation of A must also be considered, the correlation can be written as (R(A)) ⊕ B, where R is a rotation operator applied to A.
- A ⊕ B = IFFT((FFT(A)) ⊗ (FFT(B)))
- where A and B are the two image arrays and "IFFT" represents the inverse FFT operation.
- This computation scales much more slowly with image size, increasing as N*log(N).
- Because the FFT is so widely used for many data processing tasks, and FFTs are a common component of most floating-point benchmark tests for processors, many modern processors are designed with FFTs in mind and some are even optimized for performing FFTs. Therefore, for sufficiently large images, the use of FFTs to compare images is efficient.
- As the complexity of the correlation increases, for example if rotation is added, the computational load increases quickly, making computational pattern recognition more difficult.
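- The FFT-based correlation described above can be sketched in a few lines of NumPy; the toy image below is random placeholder data rather than the figures of this specification, and the function name is illustrative:

```python
import numpy as np

def correlate_fft(image: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Non-normalized cross-correlation computed via the FFT (O(N log N))."""
    # Zero-pad the template to the image size so the spectra can be multiplied.
    padded = np.zeros_like(image, dtype=float)
    padded[: template.shape[0], : template.shape[1]] = template
    # Correlation theorem: corr(A, B) = IFFT(FFT(A) * conj(FFT(B)));
    # the complex conjugate gives correlation rather than convolution.
    spectrum = np.fft.fft2(image) * np.conj(np.fft.fft2(padded))
    return np.real(np.fft.ifft2(spectrum))

# Toy example: a 64x64 image containing two copies of an 8x8 binary template.
rng = np.random.default_rng(0)          # deterministic seed only for this demo
template = (rng.random((8, 8)) > 0.5).astype(float)
image = np.zeros((64, 64))
image[5:13, 10:18] = template
image[40:48, 30:38] = template
c = correlate_fft(image, template)
# Both template locations produce (numerically near-identical) peaks; argmax
# reports one of them, here expected at (row, col) = (5, 10).
print(np.unravel_index(np.argmax(c), c.shape))
```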
- One of these techniques is the processing of the image using a specialized noise structure to create a "noise modified image."
- There are several different noise structures that can use the non-deterministic random numbers generated by quantum physics-based hardware. Having various noise structures further enhances the security of the technique because the type of noise used to modify the image can be varied.
- An example of using the noise structure is demonstrated in FIGS. 14 and 15 below.
- In the binarized (black or white pixels only) image in FIG. 14a, both the presence and the exact locations of the letter "p" are found in the word "apple" using a correlation operation.
- the “noise modified image” is correlated with an exact copy of the letters used in the base image, the result is unintelligible noise as shown in FIG. 15 c.
- FIGS. 14a, 14b and 14c show the use of the correlation operation to detect and find the locations of features in an image.
- FIG. 14 a shows an image containing the word “apple” in a handwritten style font.
- FIG. 14b shows an image of the letter "p" in the same font used in the image.
- FIG. 14 c shows the correlation function image showing the detection of the presence, and exact locations of the letter “p” in the image in FIG. 14 a, indicated by the bright peaks in FIG. 14 c.
- FIGS. 15 a, 15 b and 15 c show the addition of special types of noise to defeat the use of the correlation operation to find features in an image.
- FIG. 15 a shows an image containing the same word “apple” in the handwritten style font from FIG. 14 a but with a special type of noise added that enhances the contrast of the noise over the letters versus the background, where the noise contrast is reduced.
- FIG. 15 b shows the raw image of the letter “p” in the same font used in the original image before the noise is added.
- FIG. 15c shows the correlation function image, which is unintelligible, indicating the inability to detect the presence or locations of the letter "p" in the image in FIG. 15a.
- noise methods may be applied to number images (e.g., images of the numbers 0, 1, 2, 3, 4, 5, 6, 7, 8 or 9), images of animals, images of sports items, face images, and other images of favorites.
- security solutions are provided for secure transactions against untrusted browser attacks and other cyberattacks.
- the solution(s) described in the specification secure payment transactions.
- the solution(s) may secure access and use of private networks such as Secret Internet Protocol Router Network (SIPRnet) or resources on a public infrastructure such as the electrical grid.
- FIG. 1A shows an embodiment of a system 100 for providing secure transactions.
- system 100 may include user system 101 , and user system 101 may include secure area 102 , secure memory system 104 , secure processor system 106 , output system 108 , input system 110 , sensor 111 , communication system 112 , memory system 114 , processor system 116 , input/output system 118 , operating system 120 , and network interface 122 .
- System 100 may also include network 124 and service provider system 126 .
- system 100 may not have all of the elements or features listed and/or may have other elements or features instead of, or in addition to, those listed.
- System 100 is a system within which a secure transaction takes place (FIGS. 1A, 1B, 2A, 2B, 3, 3A, and 3B describe various details of system 100 and various methods for using system 100).
- the word system refers to any device or system of devices that communicate with one another.
- User system 101 is one that has a secure area that is dedicated for performing secure transactions over a network.
- User system 101 may be a single device or a combination of multiple devices.
- User system 101 may be a portable device, personal computer, laptop, tablet computer, handheld computer, mobile phone, or other network system, for example (in this specification a network system is any device or system that is capable of sending and/or receiving communications via a network).
- a secure area 102 may be provided for performing secure transactions.
- Authentication information refers to any form of information used for authenticating a user.
- authentication information such as a biometric authentication and/or another form of authentication is bound to the authorization of an action.
- the authentication information is in some way combined with the information for performing the action, such as by being concatenated together and then applying a hash function to the result of the concatenation.
- the words “action” and “transaction” may be switched one with another to obtain different embodiments.
- the information may be concatenated, added together (e.g., in a binary addition of the binary values of information), be different inputs to the same function, and/or combined in another manner.
- a hash function is a function that accepts as its input argument an arbitrarily long string of bits (or bytes) and produces a fixed-size output.
- a hash function maps a variable length message m to a fixed-sized output, ⁇ (m).
- Typical output sizes are 160 bits, 256 bits, or 512 bits, and can also be substantially larger.
- the hash functions that are used are one-way.
- A one-way function ƒ is a function that can be easily computed, but whose inverse ƒ⁻¹ is extremely difficult to compute.
- Other types of one way functions may be used in place of a hash function.
- hash functions Any of a number of hash functions may be used.
- One possible hash function is SHA-1, designed by the National Security Agency and standardized by NIST.
- the output size of SHA-1 is 160 bits.
- Other alternative hash functions are of the type that conform with the standard SHA-256, which produces output values of 256 bits, and SHA-512, which produces output values of 512 bits.
- a hash function could be one of the SHA-3 candidates.
- a candidate example of a hash function is BLAKE.
- Another example of a hash function is Grøstl.
- Another example of a hash function is JH.
- Another example of a hash function is Keccak.
- Another example of a hash function is Skein.
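- As a minimal sketch of the binding described earlier (authentication information concatenated with the transaction information and then hashed), assuming SHA-256 and illustrative field encodings that are not part of this specification:

```python
import hashlib

def bind_authentication_to_transaction(auth_info: bytes, transaction: bytes) -> str:
    """Return a hex digest that binds the user's authentication to one transaction."""
    return hashlib.sha256(auth_info + transaction).hexdigest()

auth_info = b"user=alice;favorites=elephant,violin,skiing,apple"
transaction = b"wire $250.00 to account 9568342710"
print(bind_authentication_to_transaction(auth_info, transaction))
# Changing either the authentication information or the transaction details
# changes the digest, so one cannot be swapped without the other being affected.
```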
- secure area 102 may have its own secure processor system and secure memory system, which are not accessible by the rest of user system 101 .
- Secure area 102 may be capable of taking over and/or blocking access to other parts of user system 101 .
- Secure memory system 104 may be a dedicated memory for securing transactions. In an embodiment, secure memory system 104 may not be accessed by the other processor systems of user system 101 .
- Memory system 104 may include, for example, any one of, some of, any combination of, or all of a long-term storage system, such as a hard drive; a short-term storage system, such as random access memory; a removable storage system, such as a floppy drive or a removable drive; and/or flash memory.
- Memory system 104 may include one or more machine-readable mediums that may store a variety of different types of information. Secure memory system 104 may store methods and information needed to perform the secure transaction, user information, a method of generating a registration key, and encryption/decryption code.
- Secure memory system 104 may include one or more memory units that each write and/or read to one or more machine readable media.
- The term machine-readable medium is used to refer to any non-transient medium capable of carrying information that is readable by a machine.
- a machine-readable medium is a computer-readable medium.
- Another example of a machine-readable medium is paper having holes that are detected that trigger different mechanical, electrical, and/or logic responses. The content of secure memory 104 is discussed further in FIG. 1B , below.
- Secure processor system 106 may include one or more processors.
- Secure processor system 106 may include one or more processors that cannot be accessed by the main processor of the user system 101 . For example, in an embodiment all of the processors of secure processor system 106 cannot be accessed by the main processor of system 101 .
- The operating system of user system 101 may have no access to secure area 102, and in an embodiment, secure area 102 may be programmed without the benefit of an operating system, so that there is no standard manner of programming secure area 102. This thwarts hackers from sending read and/or write commands (or any other commands) to secure area 102, because the secure area does not use standard read and write commands (or any other standard commands).
- providing secure area 102 addresses the weakness of biometric authentication and other authentication methods.
- Output system 108 may include any one of, some of, any combination of, or all of a monitor system, a handheld display system, a printer system, a speaker system, a connection or interface system to a sound system, an interface system to peripheral devices and/or a connection and/or interface system to a computer system, intranet, and/or internet, for example.
- secure processor system 106 may be capable of taking over and using any portion of and/or all of output system 108 .
- a portion of the output system may be a dedicated display system that may be accessed only by secure area 102 .
- secure processor 106 may be capable of receiving input from input system 110 and/or blocking access to output system 108 by the main processor system and/or other devices.
- Input system 110 may include any one of, some of, any combination of, or all of a biometric sensor 111 , a keyboard system, a touch sensitive screen, a tablet pen, a stylus, a mouse system, a track ball system, a track pad system, buttons on a handheld system, a scanner system, a microphone system, a connection to a sound system, and/or a connection and/or interface system to a computer system, intranet, and/or internet (e.g. IrDA, USB).
- Biometric sensor 111 may be a fingerprint scanner or a retinal scanner.
- user system 101 stores the processed data from user information 104 B during registration.
- user system 101 retrieves user information 104 B and compares the scanned output of sensor 111 to user information 104 B to authenticate a user.
- secure processor 106 may be capable of receiving input from input system 110 and/or blocking access to input system 110 by the main processor system and/or other devices.
- processor 116 may capture pressure (e.g., pressing fingers) events on a touch sensitive screen or a mouse clicking corresponding to something of interest (e.g., a visual image) on a PC display.
- FIG. 5 shows images of part of an icon, the word "NAME" and the letters of the alphabet "ABCDEFGHIJKLMNOPQRSTUVWXYZ".
- Communication system 112 communicatively links output system 108 , input system 110 , memory system 114 , processor system 116 , and/or input/output system 118 to each other.
- Communications system 112 may include any one of, some of, any combination of, or all of electrical cables, fiber optic cables, and/or means of sending signals through air or water (e.g. wireless communications), or the like.
- Some examples of means of sending signals through air and/or water include systems for transmitting electromagnetic waves such as infrared and/or radio waves and/or systems for sending sound waves.
- Memory system 114 may include, for example, any one of, some of, any combination of, or all of a long-term storage system, such as a hard drive; a short-term storage system, such as random access memory; a removable storage system, such as a floppy drive or a removable drive; and/or flash memory.
- Memory system 114 may include one or more machine-readable mediums that may store a variety of different types of information. Memory system 114 and memory system 104 may use the same type of memory units and/or machine readable media.
- Memory system 114 may also store the operating system of user system 101 and/or a web browser (which may also be referred to as an HTTP client).
- memory system 114 may also store instructions for input system 110 to read in biometric data and send the biometric data to secure area 102 .
- Processor system 116 may include one or more processors.
- Processor system 116 may include any one of, some of, any combination of, or all of multiple parallel processors, a single processor, a system of processors having one or more central processors and/or one or more specialized processors dedicated to specific tasks.
- Processor system 116 implements the machine instructions stored in memory 114 .
- processor 116 does not have access to secure area 102 .
- processor 116 may capture pressure (e.g., pressing fingers) events on a touch sensitive screen or a mouse clicking corresponding to something of interest (e.g., a visual image) on a PC display.
- Clicking on the red letter "R" (e.g., via image entry 179 in FIG. 1B) shown at the bottom of FIG. 6 would have a similar effect to typing the letter "R" on the keyboard, but would make it more difficult for malware to know what the user is entering.
- processor 116 only communicates to secure area 102 when secure area 102 authorizes processor 116 to communicate with secure area 102 .
- Secure area 102 may prevent processor 116 from communicating with secure area 102 during the secure area's execution of critical operations, such as setup, generation of keys, generation of the registration key, biometric authentication, or decryption of transaction information.
- Input/output system 118 may include devices that have the dual function as input and output devices.
- input/output system 118 may include one or more touch sensitive screens, which display an image and therefore are an output device and accept input when the screens are pressed by a finger or stylus, for example.
- the user may see visual images of letters on a screen as shown in FIG. 5 . In FIG. 5 , pressing a finger over the letter “B” shown just below the word NAME would indicate typing or entering the letter “B”.
- the touch sensitive screen may be sensitive to heat and/or pressure.
- One or more of the input/output devices may be sensitive to a voltage or current produced by a stylus, for example.
- Input/output system 118 is optional, and may be used in addition to or in place of output system 108 and/or input device 110 .
- a portion of the input/output system 118 may be dedicated to secure transactions providing access only to secure area 102 .
- secure processor 106 may be capable of receiving/sending input/output from/via input system 110 and/or blocking access to input system 110 by the main processor system and/or other devices. Restricting access to a portion of and/or all of the input/output system 118 denies access to third party systems trying to hijack the secure transaction.
- Operating system 120 may be a set of machine instructions, stored in memory system 114, to manage output system 108, input system 110, memory system 114, input/output system 118 and processor system 116. Operating system 120 may not have access to secure area 102.
- Network interface 122 may be an interface that connects user system 101 with the network. Network interface 122 may be part of input/output system 118 .
- Network 124 may be any network and/or combination of networks of devices that communicate with one another (e.g., and combination of the Internet, telephone networks, and/or mobile phone networks).
- Service provider system 126 (which will be discussed further in conjunction with FIG. 2A ) may receive the transactions.
- the recipient may be the final recipient or an intermediary recipient of transactions.
- Service provider system 126 may be a financial institution or a recipient of a secure transaction.
- User system 101 may interact with any of a variety of service provider systems, such as service provider system 126 , via a network 124 , using a network interface 122 .
- Service provider system 126 may be a system of one or more computers or another electronic device, and may be operated by a person that grants a particular user access to its resources or enables a particular event (e.g., a financial transaction, a stock trade, or landing a plane at an airport, and so on).
- a financial transaction may be an instance or embodiment of a transaction.
- a stock trade is one embodiment of a financial transaction;
- a bank wire transfer is an embodiment of a financial transaction and
- an online credit card payment is an embodiment of a financial transaction.
- Any operation(s) that runs in a trusted environment, which may be secure area 102 may be treated as a secure transaction.
- every secure transaction may include one or more atomic operations and the use of the word transaction is generic to both financial transactions and operations including atomic operations unless stated otherwise.
- The word transaction is also generic to an individual or indivisible set of operations that must succeed or fail atomically (i.e., as a complete unit that cannot remain in an intermediate state).
- Operations that require security include operations that make use of, or rely on, the confidentiality, integrity, authenticity, authority, and/or accountability of a system; such operations should be executed in a trusted environment (e.g., in a secure area, such as secure area 102).
- Types of operations that require security may be treated as secure transactions.
- a successful transaction other than logging information alters a system (e.g., of service provider 126 ) from one known, good state to another, while a failed transaction does not.
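- A generic sketch of this all-or-nothing behavior (the step functions and the state representation here are placeholders, not this specification's mechanism):

```python
def run_atomic_transaction(state: dict, steps) -> dict:
    """Apply all steps to a copy of the state; commit only if every step succeeds."""
    working = dict(state)          # work on a copy, not the live state
    try:
        for step in steps:
            step(working)          # any step may raise to signal failure
    except Exception:
        return state               # failed transaction: previous known-good state is kept
    return working                 # successful transaction: new known-good state
```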
- A secure transaction assures properties such as the confidentiality, integrity, authenticity, authority, and accountability described above.
- these functionalities may be processed using a mobile phone.
- Some examples of a mobile phone are an Android phone, the iPhone and the BlackBerry.
- a secure chip or secure part of the chip may reside in a personal computer.
- a secure chip may be temporarily or permanently disconnected from the rest of the system so that the operating system 120 does not have access to critical information entered into and received (e.g., read or heard) from the secure area's user interface.
- This critical information may be authentication information, such as a collection of images, biometric information, passwords, passcodes, PINs, other kinds of authentication factors, transaction information, and/or other user credentials.
- The portable device may have a user interface with a keyboard and mouse, or a display screen that is sensitive to the placement of fingers, enabling the user to select buttons, images, letters, numbers or symbols.
- the screen may be used to select one or more images.
- FIG. 7 shows the choice of selecting “ACCEPT” or “ABORT” using images. The selection is captured by image entry 179 shown in FIG. 1B . At least one embodiment may enable the user to enter transaction information using this keyboard and mouse or the display screen.
- Portable embodiments of user system 101 enable users to execute secure transactions in remote places such as inside a jet, on a golf course, inside a moving automobile, from a hotel room, in a satellite, at a military gate, and/or other isolated places.
- a person may be requested to choose their favorite food and he or she may select an apple image—via the user interface—as user verification.
- a transaction may require a person to select one or more images (i.e., a collection of images) from a display screen.
- Example images could be a picture or photo of an orange, a train, a specific pattern such as a peace sign or a diagram or a logo, a Mercedes car, a house, a candle, the Golden Gate bridge or a pen.
- FIG. 4 shows images of parts of a logo.
- FIG. 9 shows different texture images: horizontal, vertical, triangular, mixed, dotted, bubble, simplicial, and foliation.
- At the bottom of FIG. 8 is a geometric image of a blue rectangle on top of a blue triangle which is on top of a red blob.
- Just to the right in FIG. 8 is a rectangle with vertical texture on top of a triangle with dotted texture on top of a blob with simplicial texture.
- FIG. 5 shows images of the alphabet letters: “ABCDEFGHIJKLMNOPQRSTUVWXYZ”.
- the person may add his or her own images using image acquisition 173 , which are then used for user verification during the transaction.
- a display screen may be used, which may be controlled by image display 177.
- user system 101 may be described as using collections of visual images as a user's universal identifier or as user authentication; however, other items, or a combination of items, may be used for verifying the identity of the person, such as face prints, iris scans, finger veins, DNA, toe prints, palm prints, handprints, voice prints, and/or footprints. Anywhere the expression "biometric prints" occurs, any of the specific types of biometrics listed above may be substituted to obtain specific embodiments.
- the authentication items may be PINs, passwords, sequences, collections of images that are easy to remember, and/or even psychometrics.
- the item used to verify the person may be any item that is unique.
- the item(s) used to verify the person may be one or more items that as a combination are difficult for malware to fabricate, guess, find by trial and error, and/or compute.
- the item(s) used to verify the person are uniquely associated with this person.
- the item used to verify the person has an unpredictable element.
- a secure area 102 may be a specialized part of the chip (e.g., a microprocessor), where the operating system 120 and web browser software do not have access to this specialized part of the chip.
- a specialized part of the chip may be able to turn off the operating system 120 's access to presses of the buttons or a screen of a mobile phone (or other computing device), preventing malware and key or screen logging software from intercepting a PIN or the selection of an image.
- a specialized part of the chip may be able to temporarily disconnect the rest of the chip's access to the screen (e.g., by preventing the execution of the operating system 120 and web browser).
- part of the display screen may be permanently disconnected from the part of the chip (e.g., from the microprocessor of the chip) that executes the operating system 120 and web browser.
- a part of the chip may only have access to the biometric sensor, while the rest of the chip—executing the operating system 120 and web browser—is permanently disconnected from the biometric sensor.
- a secure area such as secure area 102
- photons may be produced by the hardware as a part of the unpredictable process.
- the unpredictable process may be produced by a specialized circuit in the secure area.
- biometric prints and/or unpredictable information from an unpredictable physical process are used to generate one or more keys in the secure area 102.
- the secure area 102 may include embedded software.
- the embedded software is on a chip with a physical barrier around the chip to hinder reverse engineering of the chip, and/or hinder access to keys, transaction information, and/or possibly other user credentials.
- the selection of visual images using image entry 179 is less susceptible to theft because the images can be displayed on the screen in a form that is not easily recognizable or captured by malware. Because they are difficult for malware to recognize or apprehend, they can be presented by image display 177 in a less secure part of the system, such as operating system 120 running a web browser.
- Each of the above embodiments may be used separately from one another in combination with any of the other embodiments. All of the embodiments of this specification may be used together or separately.
- some embodiments may use a secure area 102 that may be part of user system 101 or a special part of the chip that is able to acquire biometric prints, store authentication information, and/or authenticate the newly acquired items.
- the authentication information may include templates of biometric prints, images, pins, and/or passwords.
- the secure area may also be a part of the device where critical transaction information may be entered or verified on a display that the secure area only has access to.
- the host computer (domain) and the network have no access to the transaction information, no access to the keys, no access to biometrics, and/or no access to other critical user credentials (the transaction information, the keys, the biometrics, and/or other critical user credentials may be contained and processed by the secure area).
- transaction information refers to one or more items of information that describe the transaction.
- one item of transaction information may be the name of the person or entity sending the money.
- Another item of transaction information may be the name of the person or entity receiving the money.
- Another item of transaction information may be the date or time of day.
- Another item of transaction information may be the sending person's (or entity's) account number.
- Another item of transaction information may be the receiving person's (or entity's) bank account number.
- FIG. 10 shows a recipient account number 9568342710 with a collection of visual images.
- the sending person or entity is the person or entity that sends a message that is part of the transaction and the receiving person or entity (recipient) is the person or entity that receives the message that is part of the transaction.
- Another item of transaction information may be the sending person's (or entity's) routing number.
- Another item of transaction information may be the receiving person's (or entity's) routing number.
- Another item of transaction information may be the amount of money, which may be expressed in dollars, euros, yen, francs, deutschmarks, yuan, or another currency.
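- As a purely illustrative aid (not part of the claimed subject matter), the items of transaction information listed above could be grouped into a single record; the field names and values below are hypothetical.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class TransactionInformation:
    # Hypothetical field names; a real embodiment may use different items,
    # or may represent each item as a visual image rather than text.
    sender_name: str
    recipient_name: str
    sender_account: str
    recipient_account: str
    sender_routing: str
    recipient_routing: str
    amount: str       # e.g., "500.00"
    currency: str     # e.g., "USD", "EUR", "JPY"
    timestamp: datetime

info = TransactionInformation(
    sender_name="Alice", recipient_name="Bob",
    sender_account="000000000001", recipient_account="9568342710",
    sender_routing="000000000", recipient_routing="111111111",
    amount="500.00", currency="USD",
    timestamp=datetime.now(timezone.utc),
)
print(asdict(info))
```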
- one or more images may be acquired by using image acquisition 173 in user system 101 . These one or more images may serve as a user's universal identifier or provide a method to authenticate the user.
- An example of one or more images that may serve as a universal identifier is shown in FIG. 11 .
- no images are stored in user system 101 .
- these images are acquired and encrypted by image encrypt/decrypt 175 and transmitted to service provider system 126 .
- a background texture may be selected by the user, that was generated by image generator 238 in service provider system 126 .
- FIG. 9 shows some examples of textures.
- a symbol, letter, number and/or image texture may be selected or generated.
- FIG. 8 shows the word “SIMPLICIAL” written with a bubble texture using a simplicial background texture.
- a unique icon or image may be chosen or generated by the user system 101 and/or the user and/or service provider system 126 .
- the user makes sure that a recognizable image, generated by image generator 238 and known only to service provider system 126, appears on the user interface.
- FIG. 4 shows an example of different parts of a unique logo that may serve this purpose. The use of a recognizable image in different forms helps hinder malware from capturing important setup information and helps assure the user that he or she is communicating with the appropriate service provider system 126.
- some initial transaction information is provided to service provider system 126 .
- This transaction information may include the user's name, the user's bank account number and bank.
- some of this transaction information provided via image entry 179 to service provider system 126 may be provided by using images (i.e., acquired with image acquisition 173 ) that are difficult for malware to capture or apprehend.
- one or more biometric prints may be acquired, and one or more unique registration keys and cryptography keys may be generated from the one or more of the biometric prints (items) or generated from an unpredictable physical process or both.
- the unpredictable physical process may come from a hardware chip or hardware circuit that uses photons as a part of the unpredictable process to create the cryptography keys.
- the software that secure area 102 executes may be embedded in secure memory 104 .
- the secure biometric print device has a number of components, which are described later.
- the security of the secure area 102 may be enhanced by any one of, any combination of, or all of (1) the use of embedded software, (2) the lack of an operating system, and (3) the secure area being at least part of a self-contained device not connected to a computer or the internet.
- the unit that includes the secure area may contain its own processor.
- the secure area may not have any of these security enhancing features.
- the biometric sensor enables user system 101 to read biometric prints.
- the biometric sensor may include a fingerprint area sensor or a fingerprint sweep sensor, for example.
- the biometric sensor may contain an optical sensor that may acquire one or more types of biometrics.
- the biometric sensor may be a microphone or other kind of sensor that receives acoustic information, such as a person's voice.
- the sensor may be a device that acquires DNA or RNA.
- secure processor system 106 may execute the software instructions, such as acquiring a biometric print from the sensor, matching an acquired biometric print against a stored biometric print, sending communication and control commands to a display, and/or encrypting the registration key and transmitting the registration key to the administrator when the user and administrator are not in the same physical location.
- the security is enhanced, because the external processor is given fewer chances to inspect contents of secure area 102 .
- secure area 102 may store software instructions that are run by secure processor system 106 .
- Processor system 106 performs the biometric print acquisition, and/or the encryption or decryption.
- a specialized logic circuit is built that carries out the functions that the software causes the processors to perform, such as driving sensor 111 (which may be an acquisition unit, such as a biometric sensor).
- Secure memory system 104 may contain non-volatile memory in addition to volatile memory. Non-volatile memory enables the device to permanently store information for generating cryptography keys (encryption or decryption).
- secure memory system 104 may include memory on secure processor system 106 .
- the sensor or input system 110 and secure processor system 106 may be integrated into a single chip. Alternatively, in another embodiment, the sensor in input system 110 and secure processor system 106 may be two separate chips.
- FIG. 1B shows an embodiment of a block diagram of the contents of memory system 104 of FIG. 1A
- Memory system 104 may include instructions 152 , which in turn may include a setup routine 154 , an authentication of user routine 156 , a secure transaction routine 158 , having an initial request routine 160 , a service provider authentication routine 162 , and a completion of transaction routine 164 .
- Instructions 152 (of memory 104 ) may also include registration key generator 166 , drivers 168 , controller 169 , generate cryptography key 170 , perturb cryptography key 174 , hash functions 178 , perturbing functions 180 , and user interface 181 .
- Memory system 104 may also store data 182 , which may include biometric template T 184 , registration key R 186 , current cryptography key K 188 and transaction information S 192 .
- memory system 104 may not have all of the elements or features listed and/or may have other elements or features instead of, or in addition to, those listed.
- Instructions 152 may include machine instructions implemented by processor 106 .
- Setup routine 154 is a routine that handles the setting up of the user system 101 , so that user system 101 may be used for performing secure transactions.
- Setup routine 154 may collect a new user's biometric print and apply a hash function to the biometric print (and/or to other user information) to generate a registration key R.
- there may be specialized hardware in the secure area to help create unpredictableness used for the generation of cryptography key(s), seed(s), and/or registration key(s).
- a registration key, seed, or cryptography key may be generated by applying the hash function to the raw biometric print data, for example.
- setup routine 154 may apply a hash function to authentication information, such as a biometric print, to hardware noise produced by a phototransistor, and/or other user information or a combination of these to generate an initial cryptography key.
- the setup routine 154 may also send the registration key and/or the cryptography key to the service provider system 126 .
- the registration key R and/or the initial cryptography key may be received from service provider 126 .
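- The following sketch illustrates one way setup routine 154 could derive a registration key and an initial cryptography key, assuming SHA-256 as the hash function and treating the biometric template and hardware noise as byte strings; os.urandom stands in for noise from an unpredictable physical process such as a phototransistor, and none of these choices is mandated by this specification.

```python
import hashlib
import os

def derive_key(*parts: bytes) -> bytes:
    """Apply a one-way hash (SHA-256 here) to length-prefixed, concatenated inputs."""
    h = hashlib.sha256()
    for part in parts:
        h.update(len(part).to_bytes(4, "big"))  # length prefix avoids ambiguous concatenation
        h.update(part)
    return h.digest()

# Hypothetical inputs: a biometric template from the sensor and noise from an
# unpredictable physical process (os.urandom is only a stand-in for hardware noise).
biometric_template = b"minutiae bytes from the biometric sensor"
hardware_noise = os.urandom(32)

registration_key_R = derive_key(biometric_template, b"registration")
initial_cryptography_key_K = derive_key(biometric_template, hardware_noise, b"cryptography")
```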
- Authentication of user routine 156 may authenticate the user each time the user attempts to use user system 101 .
- This routine may call image acquisition 173 to acquire a collection of images for user authentication.
- user system 101 may include a biometric sensor (e.g., as sensor 111 ) that scans the user's biometric print, reduces the biometric print to a template, and matches the newly derived biometric template to a stored template (which was obtained by setup routine 154 ). Then, if the stored template and the newly derived template match, the user is allowed to use user system 101 .
- a biometric print acquired may be directly matched with a stored template.
- authentication of user routine 156 may require the user to enter a password. If the password received and the password stored match, the user is allowed to use user system 101 .
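- A minimal sketch of such an authentication check follows; the matcher, the threshold, and the decision to require both factors are illustrative assumptions rather than requirements of this specification.

```python
import hashlib
import hmac

MATCH_THRESHOLD = 0.9  # hypothetical similarity threshold

def match_templates(stored_template: bytes, new_template: bytes) -> float:
    """Placeholder for a real biometric matcher; returns a similarity score in [0, 1]."""
    return 1.0 if stored_template == new_template else 0.0

def authenticate_user(stored_template: bytes, new_template: bytes,
                      stored_password_hash: bytes, entered_password: str) -> bool:
    biometric_ok = match_templates(stored_template, new_template) >= MATCH_THRESHOLD
    entered_hash = hashlib.sha256(entered_password.encode()).digest()
    password_ok = hmac.compare_digest(stored_password_hash, entered_hash)
    # Depending on the embodiment, one factor or both may be required.
    return biometric_ok and password_ok
```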
- Secure transaction routine 158 is a routine that implements the secure transaction.
- the initial request routine 160 is a first phase of secure transaction routine 158 .
- One purpose of initial request routine 160 is to receive (1) a selection of images, known to the user and acting as user authentication, that are difficult for malware to recognize or apprehend, and (2) transaction information entered and represented as images that are likewise difficult for malware to recognize or apprehend.
- the transaction information is encrypted with the cryptography key.
- the cryptography key may be perturbed to obtain a new cryptography key. In an alternative embodiment, the cryptography key is not changed each time.
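- One way to realize encrypt-then-perturb is sketched below, with AES-GCM standing in for the cipher and SHA-256 standing in for the perturbing function; both are assumptions rather than requirements of this specification. Applying the update repeatedly yields a sequence of keys, one per transaction, on both sides.

```python
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # assumed cipher choice

def perturb(key: bytes) -> bytes:
    """A one-way update of the current cryptography key (SHA-256 is an assumption)."""
    return hashlib.sha256(b"perturb" + key).digest()

def send_transaction(transaction_info: bytes, current_key: bytes):
    nonce = os.urandom(12)
    ciphertext = AESGCM(current_key).encrypt(nonce, transaction_info, None)
    next_key = perturb(current_key)  # both sides apply the same update after each use
    return (nonce, ciphertext), next_key
```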
- Service provider authentication routine 162 authenticates the information provided by the service provider.
- the collection of images, representing the user's universal identifier or user authentication, received by system 101 from service provider 126 in reply to initial request 160, may be authenticated by service provider authentication routine 162.
- Drivers 168 may include drivers for controlling input and output devices, such as the keyboard, a monitor, a pointing device (e.g., a mouse and/or a touch pad), a biometric print sensor (for collecting biometric prints).
- Controller 169 may include one or more machine instructions for taking control of the keypad, monitor and/or network interface, so the transaction may be performed securely, without fear of the processor system 116 compromising security as a result of being taken over by malware sent from another machine.
- Generate cryptography key 170 are machine instructions that generate a new cryptography key (e.g., by applying a function). In at least one embodiment, the cryptography key is not updated after the initial step. Perturb cryptography key 174 perturbs the current cryptography key to thereby generate the next cryptography key.
- Image acquisition 173 are machine instructions that acquire images.
- Image encrypt/decrypt 175 are machine instructions that encrypt or decrypt one or more images. In at least one embodiment, these images are encrypted before being sent to service provider system 126. In at least one embodiment, encrypted images received from service provider system 126 are decrypted before they are displayed to the user with image display 177.
- Image display 177 are machine instructions that display one or more images to the user, utilizing user interface 181 . In at least one embodiment, images are displayed on a screen of a mobile phone or PC.
- Image entry 179 are machine instructions that determine which image a user has selected with his or her finger on a touch sensitive screen or has selected with a mouse.
- Hash functions 178 may be one or more one-way functions, which may be used by generate registration key 166 for generating a registration key from a biometric print and/or other user information. Those hash function(s) of hash functions 178 that are used by initial request 160 , authentication of service provider routine 162 , and completion of transaction routine 164 may be the same as one another or different from one another.
- Perturbing functions 180 may include one or more perturbing functions, which may be used by perturb cryptography key 174 . Different perturbing functions of perturbing functions 180 may be used during each initial request 160 , authentication of service provider routine 162 , and/or completion of transaction routine 164 . In this specification anytime a hash function is mentioned or a perturbing function is mentioned any other function may be substituted (e.g., any perturbing function may be replaced with a hash function and any hash function may be replaced with a perturbing function) to obtain another embodiment.
- any perturbing function and/or hash function mentioned in this specification may be a one way function.
- User interface 181 provides a page, a web browser or another method of displaying and entering information so that the user interface may provide one or more of the following functionalities, labeled with the letters A-F.
- A. The user may view the transaction information being sent.
- B. The user may enter instructions for sending transaction information.
- C. The user may receive information about whether or not the user authentication was valid.
- D. The user may enter or generate one or more images known by the user and/or enter another biometric print or another type of user authentication, such as a PIN.
- E. The user may determine the current state in the transaction process.
- F. The user may read directions or enter information for the next step in the transaction process.
- Biometric template T 184 may include templates, such as minutiae and/or other information characterizing biometric prints of users, which may be used to authenticate the user each time the user would like to use secure area 102 and/or system 101 .
- Registration key R 186 may be generated by applying a hash function to a collection of images selected or generated by the user, biometric print(s) and/or information derived from an unpredictable physical process. In one embodiment, the unpredictable physical process may use one or more phototransistors, each of which senses photons.
- Current cryptography key K 188 is the current cryptography key, which may be stored long enough for the next cryptography key to be generated from the current cryptography key.
- Transaction information S 192 may include information about a transaction that the user would like to perform.
- FIG. 2A shows a block diagram of an embodiment of a service provider system 200 in a system for securing transactions against cyber attacks.
- service provider system 200 may include output system 202 , input system 204 , memory system 206 , processor system 208 , communication system 212 , and input/output system 214 .
- the service provider system 200 may not have all the components and/or may have other embodiments in addition to or instead of the components listed above.
- Service provider system 200 may be a system of a financial institution, a power plant, a power grid, a nuclear plant, or any other system requiring secure access.
- service provider system 200 may be an embodiment of service provider system 126 . Any place in this specification where service provider 126 is mentioned service provider 200 may be substituted. Any place in this specification where service provider 200 is mentioned service provider 126 may be substituted.
- Service provider system 200 may include one or more webservers, applications servers, and/or databases, which may be part of a financial institution, for example.
- Output system 202 may include any one of, some of, any combination of, or all of a monitor system, a handheld display system, a printer system, a speaker system, a connection or interface system to a sound system, an interface system to peripheral devices and/or a connection and/or interface system to a computer system, intranet, and/or internet, for example.
- Input system 204 may include any one of, some of, any combination of, or all of a keyboard system, a touch sensitive screen, a tablet pen, a stylus, a mouse system, a track ball system, a track pad system, buttons on a handheld system, a scanner system, a microphone system, a connection to a sound system, and/or a connection and/or interface system to a computer system, intranet, and/or internet (e.g. IrDA, USB).
- Memory system 206 may include, for example, any one of, some of, any combination of, or all of a long term storage system, such as a hard drive; a short term storage system, such as random access memory; a removable storage system, such as a floppy drive or a removable drive; and/or flash memory.
- Memory system 206 may include one or more machine-readable mediums that may store a variety of different types of information.
- the term machine-readable medium is used to refer to any medium capable of carrying information that is readable by a machine.
- One example of a machine-readable medium is a computer-readable medium.
- Another example of a machine-readable medium is paper having holes that, when detected, trigger different mechanical, electrical, and/or logic responses.
- Memory 206 may include encryption/decryption code and algorithms for authenticating transaction information, for example (memory 206 is discussed further in conjunction with FIG. 2B ).
- Processor system 208 executes the secure transactions on system 200 .
- Processor system 208 may include any one of, some of, any combination of, or all of multiple parallel processors, a single processor, a system of processors having one or more central processors and/or one or more specialized processors dedicated to specific tasks.
- processor system 208 may include a network interface to connect system 200 to user system 101 via network 124 .
- processor 208 may execute the encryption and decryption algorithms with which the transaction information was encrypted.
- processor 208 may decrypt secure messages from user system 101 and/or encrypt messages sent to user system 101 .
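- A corresponding sketch of the service-provider side is shown below, assuming an authenticated cipher (AES-GCM) and a cryptography key K already shared with user system 101; the cipher choice is an assumption, not a requirement of this specification.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # assumed cipher choice

def decrypt_from_user(nonce: bytes, ciphertext: bytes, key: bytes) -> bytes:
    # Raises an exception if the message was tampered with in transit.
    return AESGCM(key).decrypt(nonce, ciphertext, None)

def encrypt_to_user(plaintext: bytes, key: bytes) -> tuple[bytes, bytes]:
    nonce = os.urandom(12)
    return nonce, AESGCM(key).encrypt(nonce, plaintext, None)
```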
- Communication system 212 communicatively links output system 202 , input system 204 , memory system 206 , processor system 208 , and/or input/output system 214 to each other.
- Communications system 212 may include any one of, some of, any combination of, or all of electrical cables, fiber optic cables, and/or means of sending signals through air or water (e.g. wireless communications), or the like.
- Some examples of means of sending signals through air and/or water include systems for transmitting electromagnetic waves such as infrared and/or radio waves and/or systems for sending sound waves.
- memory system 206 may store instructions for system 200 to receive authenticated secure transaction information from user system 101 .
- Input/output system 214 may include devices that have the dual function as input and output devices.
- input/output system 214 may include one or more touch sensitive screens, which display an image and therefore are an output device and accept input when the screens are pressed by a finger or stylus, for example.
- the touch sensitive screen may be sensitive to heat and/or pressure.
- One or more of the input/output devices may be sensitive to a voltage or current produced by a stylus, for example.
- Input/output system 214 is optional, and may be used in addition to or in place of output system 202 and/or input system 204 .
- FIG. 2B shows an embodiment of a block diagram of the contents of memory system 206 of FIG. 2A
- Memory system 206 may include instructions 220 , which in turn may include a setup routine 222 , an authentication of user routine 224 , a request for authentication routine 226 , completion of transaction routine 228 , generate registration key 230 , generate cryptography key 232 , hash functions 242 , and perturbing functions 244 .
- Memory system 206 may also store data 245 , which may include registration key R 246 , current cryptography key K 248 , and transaction information S 252 .
- memory system 206 may not have all of the elements or features listed and/or may have other elements or features instead of, or in addition to, those listed.
- Setup routine 222 is a routine that handles the setting up of the service provider system 200 , so that service provider system 200 may be used for performing secure transactions.
- Setup routine 222 may receive a registration key from the user system, which in turn may be used for generating the initial cryptography key.
- the user may send the biometric print or template of the biometric print to service provider system 200 , and service provider system 200 may generate the registration key from the biometric print in the same manner that user system 101 generates the registration key from the template of the biometric print or from the biometric print and/or information obtained from an unpredictable physical process (e.g., by setup routine 222 applying a hash function to the biometric print and/or information derived from an unpredictable physical process).
- the user may visit the location of service provider, where the service provider may acquire a collection of images known to the user, which is used by service provider system 200 for at least partially creating the initial cryptography key.
- Generate cryptography key 232 are machine instructions that generate a new cryptography key from (e.g., by applying a function, such as a perturbing function to) a prior cryptography key. Generate cryptography key 232 may be the same routine as generate cryptography key 170 except that generate cryptography key 232 is implemented at service provider 200 and generate cryptography key 170 is implemented at user system 101 .
- Perturb cryptography key 236 may be the same as perturb cryptography key 174 , and perturb cryptography key 236 perturbs the current cryptography key to thereby generate the next cryptography key
- Hash functions 242 may be the same as hash functions 178 .
- Hash functions 242 may be one-way functions, which may be used by generate cryptography keys routine 230 .
- hash functions 242 may include a different function for generate cryptography keys 230 .
- Those hash function(s) of hash functions 242 that are used by authentication of user routine 224 , request for authentication routine 226 , and completion of transaction routine 228 may be the same as one another or different from one another.
- perturbing functions 244 may be used during each of authentication of user routine 224 , request for authentication routine 226 , and completion of transaction routine 228 .
- although perturbing functions 244 and hash functions 242 are indicated as storage areas separate from perturb cryptography key 236 , the perturbing functions may instead be stored as part of the code for perturb cryptography key 236 .
- Registration key R 246 may be the same as registration key R 186 and may be generated by applying a hash function to a collection of images selected or generated by the user and/or biometric print(s) and/or information from an unpredictable physical process.
- Current cryptography key K 248 may be the same as current cryptography key 188 , and may be the current cryptography key, which may be stored long enough for the next cryptography key to be generated from the current cryptography key.
- Transaction information S 252 may be the same as transaction information S 192 , and may include information about a transaction that the user would like to perform. Transaction information S 252 may be received from user system 101 and may be used to perform a transaction at service provider system 200 on behalf of user system 101 .
- FIG. 3 shows a flowchart of an embodiment of setting up user system 101 for securing transactions.
- This user system method may be the setup performed by user system 101 before enabling a user to execute secure transactions with a bank, financial institution or financial exchange.
- a sequence or collection of visual images that are easy to remember is obtained from the user.
- some visual images may be an image of an animal, an image of a car, an image of a house, an image of a place, an image of a person's name, and/or an image of all or part of a bank logo.
- this collection of universal images may act as a universal identifier for the user.
- the universal identifier for that particular user may be composed of the following 7 images where order is not important: a train, the Golden Gate bridge, pink sparkle shoes, chocolate ice cream in a waffle cone, one of the Wells Fargo stagecoach horses, an orange, and a visual image of the name Haley.
- An example of this visual image of a name is shown in FIG. 11 .
- the universal identifier may use a particular background texture or pattern that is determined by the user or service provider system during setup.
- FIG. 9 shows examples of different textures.
- the visual image of Haley in FIG. 11 is represented with a bubble texture against a foliation background texture.
- all or part of the universal identifier may be requested from the user as user authentication.
- user authentication may involve a subset of these images of the universal identifier or different set of visual images.
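- Because the order of the selected images is not important in the example above, user system 101 could canonicalize the selection before hashing it, for example by sorting image identifiers; the identifier strings and the use of SHA-256 below are illustrative assumptions.

```python
import hashlib

def universal_identifier_digest(selected_image_ids: list[str]) -> str:
    """Hash an unordered collection of image identifiers into a fixed-size value."""
    canonical = "\n".join(sorted(selected_image_ids))  # sorting makes the result order-independent
    return hashlib.sha256(canonical.encode()).hexdigest()

ids = ["train", "golden_gate_bridge", "pink_sparkle_shoes", "chocolate_waffle_cone",
       "stagecoach_horse", "orange", "name_haley"]
# The same digest is obtained regardless of the order in which the images were selected.
assert universal_identifier_digest(ids) == universal_identifier_digest(list(reversed(ids)))
```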
- biometric print information may be obtained from the user from a biometric sensor 111 in input system 110 in order to establish a method of user authentication.
- the user setup method may also collect other setup information, such as a Personal Identification Number (PIN), or a password.
- the setup data that was collected may be denoted as T.
- the universal identifier and user authentication information are encrypted and transmitted to the service provider system.
- this information is encrypted as visual images and then sent back to the service provider system.
- a Diffie-Hellman key exchange is used to establish keys to encrypt the universal identifier and user authentication information.
- in step 306, the service provider system receives the encrypted universal identifier and user authentication information, decrypts them, and stores them.
- in step 308, the user's account is initialized with the service provider and enabled for executing transactions.
- a Diffie-Hellman key exchange is a key exchange method where two parties (Alice and Bob) that have no prior knowledge of each other jointly establish a shared secret key over an unsecure communications channel.
- a group G is a set with a binary operation *, (g*g is denoted as g^2; g*g*g*g*g is denoted as g^5), such that the following four properties hold: (1) closure: for all a and b in G, a*b is in G; (2) associativity: for all a, b, and c in G, (a*b)*c = a*(b*c); (3) identity: there is an identity element e in G such that e*a = a*e = a for every a in G; (4) inverses: for every a in G there is an element a^(-1) in G such that a*a^(-1) = a^(-1)*a = e.
- integers ⁇ . . . , ⁇ 2, ⁇ 1, 0, 1, 2, . . . ⁇ with respect to the binary operation + are an example of an infinite group.
- 0 is the identity element.
- the inverse of 5 is ⁇ 5 and the inverse of ⁇ 107 is 107.
- the set of permutations on n elements {1, 2, . . . , n}, denoted S_n, is an example of a finite group with n! elements, where the binary operation is function composition.
- Each element of S_n is a function p: {1, 2, . . . , n} → {1, 2, . . . , n} that is 1 to 1 and onto.
- p is called a permutation
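- For concreteness, the finite group S_3 and its binary operation (function composition) can be written out as follows; this is standard group theory rather than anything specific to this specification.

```python
from itertools import permutations

n = 3
elements = list(permutations(range(1, n + 1)))   # the 3! = 6 elements of S_3

def compose(p, q):
    """(p * q)(i) = p(q(i)): each tuple represents a function on {1, ..., n}."""
    return tuple(p[q[i - 1] - 1] for i in range(1, n + 1))

identity = tuple(range(1, n + 1))
# Closure and inverses: composing two permutations gives a permutation,
# and every permutation has an inverse under composition.
for p in elements:
    assert all(compose(p, q) in elements for q in elements)
    assert any(compose(p, q) == identity for q in elements)
```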
- H is a non-empty subset of a group G and H is a group with respect to the binary group operation * of G, then H is called a subgroup of G.
- H is a proper subgroup of G if H is not equal to G (i.e., H is a proper subset of G).
- G is a cyclic group if G is generated by a single element g; that is, every element of G is a power g^k of g for some integer k.
- This multiplicative notation (i.e., using superscripts) is used in the description of the Diffie-Hellman key exchange protocol described below.
- Steps 1, 2, 3, 4, and 5 describe the Diffie-Hellman key exchange: (1) Alice and Bob publicly agree on a finite cyclic group G and a generating element g in G; (2) Alice picks a secret natural number a and sends g^a to Bob; (3) Bob picks a secret natural number b and sends g^b to Alice; (4) Alice computes (g^b)^a = g^(ab); (5) Bob computes (g^a)^b = g^(ab). Both parties now share the secret group element g^(ab).
- Alice can encrypt a message m as m g^(ab) and send m g^(ab) to Bob.
- a result from group theory implies that the order of every element of a group divides the number of elements in the group, denoted |G|; consequently, x^|G| = 1 for all x in G, where 1 is the identity element in G.
- Bob calculates (g^a)^(|G| - b) = g^(a|G|) g^(-ab) = g^(-ab), and multiplies m g^(ab) by g^(-ab) to recover the message m.
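- The exchange can be sketched over the multiplicative group of integers modulo a prime p, using the multiplicative notation above; the prime and generator below are for illustration only, and a practical embodiment would use a much larger group or an elliptic-curve group as noted elsewhere in this specification.

```python
import secrets

# Publicly agreed group: integers modulo p under multiplication, with generator g.
p = 2_147_483_647        # 2^31 - 1, an illustrative prime only; far too small for real use
g = 7                    # a primitive root modulo p

a = secrets.randbelow(p - 2) + 1      # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1      # Bob's secret exponent

A = pow(g, a, p)                      # Alice sends g^a
B = pow(g, b, p)                      # Bob sends g^b

shared_alice = pow(B, a, p)           # (g^b)^a = g^(ab)
shared_bob = pow(A, b, p)             # (g^a)^b = g^(ab)
assert shared_alice == shared_bob     # both parties now hold g^(ab)
```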
- the user and the service provider 126 agree upon a common key for the registration key.
- the user then encrypts one of the common keys with the registration key.
- the service provider 126 encrypts the common key with other information, which may be information specific to the user or a random number, for example.
- the user sends the encrypted common key (that was encrypted by the user with the registration key) to the service provider 126, and the service provider 126 sends the encrypted common key that the service provider 126 encrypted to the user.
- the user encrypts the encrypted common key that was received from the service provider 126 with the registration key.
- the service provider 126 encrypts the encrypted common key received from the user (which was encrypted with the registration key) with the same information that was used to encrypt the original copy of the common key of the service provider 126 .
- both the user and the service provider 126 will now have the common encrypted key derived from the registration key supplied by the user and the information supplied by the service provider 126 .
- the resulting encrypted common key may be used as the registration key (instead of the original registration key).
- the user system 101 and the service provider 126 may also agree upon a common key for the cryptography key.
- the common key of the cryptography key and registration key may be the same as one another or different.
- the user system 101 then encrypts one of the common keys with the cryptography key.
- the server encrypts the common key with other information, which may be information specific to the user or a random number, for example (as was done for the registration key).
- the user system 101 sends the encrypted common key (that was encrypted by the user with the cryptography key) to the service provider 126, and the service provider 126 sends the encrypted common key (which was encrypted by service provider 126) to the user.
- the user encrypts the encrypted common key that was received from the service provider 126 with the cryptography key.
- the service provider 126 encrypts the encrypted common key received from the user (which was already encrypted with the cryptography key by the user) with the same information that was used to encrypt the original copy of the common key of the service provider 126.
- both the user and the service provider 126 will now have the common key encrypted by the cryptography key supplied by the user and the information supplied by the service provider 126 .
- the resulting encrypted common key may be used as the cryptography key (instead of the original cryptography key).
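- The exchange just described relies on the two encryptions commuting; one classical way to obtain that property (an assumption here, not a requirement of this specification) is exponentiation modulo a shared prime, since (C^R)^W = (C^W)^R mod p. The sketch below shows both sides arriving at the same doubly encrypted common key.

```python
import secrets

p = 2_147_483_647                       # illustrative shared prime; too small for real use

common_key_C = secrets.randbelow(p - 2) + 2   # the agreed common key, viewed as a group element
R = secrets.randbelow(p - 2) + 2              # user's registration key (used as an exponent)
W = secrets.randbelow(p - 2) + 2              # service provider's secret information

def commutative_encrypt(value: int, key: int) -> int:
    return pow(value, key, p)

# Each side encrypts the common key and sends its result to the other side.
from_user = commutative_encrypt(common_key_C, R)        # C^R
from_provider = commutative_encrypt(common_key_C, W)    # C^W

# Each side encrypts what it received with its own key.
user_result = commutative_encrypt(from_provider, R)     # (C^W)^R
provider_result = commutative_encrypt(from_user, W)     # (C^R)^W
assert user_result == provider_result                   # shared value C^(R*W) mod p
```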
- the secure transmission may use elliptic curve cryptography which is similar to the Diffie-Hellman exchange described previously.
- the secure transmission of cryptography key(s) K may use a camera that reads a proprietary pattern from the user's display of the device after setup is complete.
- the user's display is the screen of a mobile phone.
- the registration key R may be given to the administrator in the same physical place, such as at a bank, or the registration key may be mailed or electronically transmitted to the administrator if setup is accomplished remotely. In some applications, the registration key may be encrypted first and then electronically transmitted or sent by mail.
- the number j in the operator Φ^j( ) (where Φ denotes the key-update or perturbing operator) is the number of times that the operator Φ( ) is applied to R; for example, Φ^2(R) = Φ(Φ(R)).
- one item may be the name of the person or entity sending the money.
- the transaction may be a stock trade.
- the stock account number may be part of the transaction information.
- the ticker symbol of the stock—for example, GOOG—being bought or sold may be part of the transaction information (or the name of a commodity or other item being purchased).
- the number of shares may be part of the transaction information.
- the price per share (or unit price) at which the person wishes to buy or sell the shares may be an item of the transaction information. If the stock purchase (or sale) is a limit order, then an indication that the stock purchase is a limit order may be an item of the transaction information.
- an indication that the purchase is a market order may be an item of the transaction information.
- the name of the stock account (e.g. Ameritrade, Charles Schwab, etc.) or broker may also be an item of the transaction information.
- FIG. 3A shows a flow chart of transaction step A.
- TRANSACTION STEP A: the person looks for one or more logos or visual images that help the person make sure that he or she is communicating with the appropriate bank, financial institution, or other service provider system. In an embodiment, the person learns or creates this image, which verifies the service provider system, during setup.
- the user selects a collection or sequence of visual images that are easy to remember, and/or presents a biometric print match and/or a password or PIN, which are acquired by user system 101. This is referred to as user authentication.
- the person (user) securely enters transaction information by selecting or choosing visual images that are difficult for malware to read or recognize.
- the user may wirelessly transmit the encrypted transaction information via a mobile phone to service provider system 126 .
- the user may submit or enter a collection of images and encrypted transaction information to the web browser of user system 101 and use the Internet for transmission to the administrator (bank) at service provider system 126 .
- the user may submit the user authentication and encrypted transaction information by some other electronic means, such as a fax machine or an ATM machine.
- the current time t1 is determined and provided as transaction information.
- the current time t1 may be rounded to the nearest minute, for example.
- the sender and receiver may compute the difference in time between the clock of the sender and the clock of the receiver prior to sending a message in case the two clocks are not sufficiently synchronized.
- the time may be rounded to the nearest 5 minutes, the nearest 10 minutes, or the nearest hour, for example.
- the reference time is GMT time. For example, if the exact time is 19:05 and 45 seconds GMT, then t1 is set to 19:06 GMT. If the time is not correct or is too delayed from the original time, then the transaction may be aborted.
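- A sketch of the timestamp handling described above, rounding the current GMT (UTC) time to the nearest minute or to a coarser granularity; the helper name is hypothetical.

```python
from datetime import datetime, timezone

def rounded_time(granularity_minutes: int = 1) -> datetime:
    """Round the current UTC (GMT) time to the nearest multiple of the granularity."""
    now = datetime.now(timezone.utc)
    seconds = granularity_minutes * 60
    rounded_epoch = round(now.timestamp() / seconds) * seconds
    return datetime.fromtimestamp(rounded_epoch, tz=timezone.utc)

t1 = rounded_time(1)          # e.g., 19:05:45 GMT rounds to 19:06:00 GMT
t1_coarse = rounded_time(5)   # nearest 5 minutes
```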
- TRANSACTION STEP B The administrator (bank or financial institution) receives at service provider system 126 the encrypted transaction information, encrypted one-time information and encrypted user authentication information.
- FIG. 3B shows a flow chart of transaction step B.
- Step B.1 The service provider system decrypts the user authentication information and checks that it is valid. If it is not valid, then the transaction is aborted. If the user authentication information is valid, then service provider system 126 goes to step B.2.
- Step B.2: The service provider decrypts the encrypted one-time information and checks that the user was able to correctly recognize the one-time information from the user's screen or web browser. If the one-time information decrypted by the service provider system is not valid (i.e., it does not match the one-time information that was originally presented), then the transaction is aborted. If it is valid (i.e., it matches), then service provider system 126 goes to step B.3.
- the one-time information is displayed on the user's screen in a way that is difficult to recognize or apprehend by malware but recognizable by a person.
- Step B.3: The encrypted transaction information E(S, K) is decrypted and the transaction is executed.
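- Steps B.1 through B.3 can be summarized in the following server-side sketch; the decrypt, validation, and execution helpers are placeholders for the routines described in this specification.

```python
class TransactionAborted(Exception):
    pass

def handle_transaction(enc_user_auth, enc_one_time, enc_transaction_info, key,
                       decrypt, is_valid_user, expected_one_time, execute):
    # Step B.1: decrypt and validate the user authentication information.
    user_auth = decrypt(enc_user_auth, key)
    if not is_valid_user(user_auth):
        raise TransactionAborted("user authentication invalid")

    # Step B.2: decrypt the one-time information and compare it with what was presented.
    one_time = decrypt(enc_one_time, key)
    if one_time != expected_one_time:
        raise TransactionAborted("one-time information does not match")

    # Step B.3: decrypt the transaction information S and execute the transaction.
    transaction_info = decrypt(enc_transaction_info, key)
    return execute(transaction_info)
```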
- TRANSACTION STEP C: The service provider system translates the transaction information S into a new collection of visual images that represents the same transaction information as S.
- the service provider system encrypts this new visual representation of the transaction information with the cryptography key K and sends it back to the user system.
- the user system receives the encrypted visual representation, decrypts it, and the user checks that it matches transaction information S. If it does not match transaction information S, then the user may abort the transaction.
- TRANSACTION STEP D: If the visual representation matches the original transaction information submitted by the user, then the user sends a message to the service provider to complete the transaction.
- transaction step D There are a number of methods to implement transaction step D.
- the cryptography key K may be updated, denoted as Φ(K), on both sides. Then the encrypted transaction information E(S, Φ(K)) or E(S, K) is sent from the administrator (bank) back to the user.
- the user interface may be implemented with a web browser in a personal computer or in a mobile phone.
- User input, such as selecting letters, numbers, or other input items, may be accomplished with fingers on the glass screen of an iPhone or Android phone.
- the letters, numbers, or other input items may be entered with a mouse selecting the appropriate letters as shown in FIG. 5 or 6 .
- the display screen may be rendered with a glass screen in a mobile phone such as an Android phone or iPhone.
- the display screen may use an LCD.
- some or all of the financial institution members of SWIFT may be stored in terms of patterns or images in the memory of the service provider system.
- the user may use his or her fingers to scroll on the screen and select one of the banks to make a transaction with.
- the user may use a mouse to scroll on the display of the personal computer.
- the user may be an employee of the bank.
- the device may be used to securely execute wire transfers between two banks.
- visual images of letters that are difficult for malware to read may be displayed as a keyboard to be used by a person to enter a password or transaction information, as shown in FIGS. 5 and 6 .
- the display may enable the user to verify that the transaction information is correct or has not been tampered with by malware before executing the transaction.
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Theoretical Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Accounting & Taxation (AREA)
- General Engineering & Computer Science (AREA)
- Strategic Management (AREA)
- Finance (AREA)
- General Business, Economics & Management (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Computing Systems (AREA)
- Human Computer Interaction (AREA)
- Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)
Abstract
Methods and systems described herein perform a secure transaction. A display presents images that are difficult for malware to recognize but a person can recognize. In at least one embodiment, a person communicates transaction information using visual images received from the service provider system. In at least one embodiment, a universal identifier is represented by images recognizable by a person, but difficult for malware to recognize.
In some embodiments, methods and systems are provided for determining whether to grant access, by generating and displaying visual images on a screen that the user can recognize. In an embodiment, a person presses one's finger(s) on the screen to select images as a method for authenticating and protecting communication from malware.
In at least one embodiment, quantum randomness helps unpredictably vary the image location, generate noise in the image, or change the shape or texture of the image.
Description
- This application incorporates herein by reference U.S. Provisional Patent Application No. 61/698,675, entitled “No More Passwords”, filed Sep. 9, 2012.
- This specification relates to security in computers, mobile phones and other devices.
- The subject matter discussed in the background section should not be assumed to be prior art merely as a result of its mention in the background section. Similarly, a problem mentioned in the background section or associated with the subject matter of the background section should not be assumed to have been previously recognized in the prior art. The subject matter in the background section merely represents different approaches, which in and of themselves may also be inventions.
- A shortcoming in the prior art, recognized by this specification, is that there is a lack of a secure integration of the identity of the user to the protection of the user's data and the control of the user's computer. A critical part of the computer instructions for an action or a transaction is usually executed on the host domain machine (e.g., the user's computer). Some examples of the user's computer are a MacBook Pro, a Dell desktop computer, an iPhone, a Blackberry or an Android phone. Currently, cryptography keys are stored on the user's computer or a chip executing the operating system, which is not secure. For example, when Bob's computer communicates with Mary's computer, even when using well-implemented Public Key Infrastructure (PKI), Bob's computer can only be sure that it is communicating with Mary's computer. Bob cannot be sure that he is communicating with Mary and vice versa. Similarly, even Bob cannot be certain that the communications he sends Mary are the same as the communications that Mary receives as coming from him.
- Sending a secure communication using Public Key Infrastructure (PKI) from one user machine to another user machine ensures communication between the user machines, but may not ensure secure communication between the users of the machines. Continuing, with the above example, as a result of the use of a Public Key Infrastructure, although Mary may be reasonably sure that Mary's machine is communicating with Bob's machine, Boris may be operating one or more computers in Russia and may have remotely broken into Bob's computer and may be using Bob's machine and pretending to be Bob.
- In the prior art, each computer cannot be assured of who controls the other computer. For example, even when a user is present, an intruder (e.g., a hacker) may be physically located thousands of miles away, but is remotely logged onto the user's machine and hijacking the user's intended action(s). Even the Trusted Platform Module (TPM) has the fundamental cyber security weakness of not knowing who controls the other computer with which a user may be in communication, or who controls the computer which contains the Trusted Platform Module. Not knowing who controls the other computer with which a current computer is in communication may be a weakness that is significant when the operating system can directly access the TPM. If the user's computer is compromised, then the attacker can access the TPM. Another limitation and weakness of the TPM is that there is no mechanism for binding the identity of the user to the user's cryptography keys and other confidential information that should be bound to the user's true identity.
- Another shortcoming of cyber security is that a secure link is missing between the authentication of a valid user, and the authorization of an action. The authorization of an action could be the execution of a financial transaction from a user's bank account, a stock trade in a user's brokerage account, the execution of an important functionality on the electrical grid, or access to important data on a private network such as SIPRnet (e.g. WikiLeaks). The authorization of an action typically occurs through the web browser since the web browser presents a convenient interface for a person. However, the web browser is where the important connection between authentication of a user and authorization of an action may be broken. Existing systems have the user authenticating the user's computer, and then the same user's computer also authorizes (and may also execute) the action. Since the user's computer can be hacked, the lack of a secure and direct link between authenticating the user's computer and authorizing the action may render the act of user verification irrelevant.
- Part of the disconnect (vulnerability) between authenticating the user and authorizing the user's action occurs, because authentication (e.g., biometric authentication) is typically and naively represented as an on/off switch. That is, after the user has been authenticated and the initial transaction approved, the remainder of the session is assumed to be secure and all actions after authentication are assumed to be legitimate, without performing any further checks. In the same way, if this on/off implementation occurs in an untrusted computing environment, then outstanding biometric algorithms and sensor(s) become irrelevant because the biometric authentication can be circumvented between the user authentication and the authorization or confidentiality part of the security system.
- The use of biometrics can be advantageous for security, because biometrics offers a reliable method for verifying who (i.e., which person) is actually initiating a transaction. However, even with the use of biometrics, if the handling of the biometric information, the storage of the biometric data, or the control of actions based on a biometric verification is done on an unsecured user's computer, the value of the biometrics may be greatly reduced or nullified.
- An additional aspect of the weakness of current authentication and authorization processes (such as those using biometrics) is that the action can be hijacked by executing a Trojan attack on the user's computer, for example. A Trojan attack is an attack in which the attacker pretends to be the user and/or the other system with which the user is communicating. In other words, a valid, authorized user cannot verify that the action he or she is trying to execute is what is actually being executed, because a third party may be masquerading as the other system.
- An example of this weakness is the untrusted browser attack used to divert money from a user's bank account. Mary's web browser may display to her that she is about to send $500 to Bob's account, but in reality her untrusted browser is configured to send $50,000 to a thief's bank account.
- Since the web browser is executed on the user's computer, the browser cannot be trusted even when using PKI and one-time passcodes! A recent untrusted browser attack on the gold standard of security, RSA SecurID, demonstrates this surprising fact. The consequences of this particular cyberattack were that $447,000 was stolen from a company bank account in a matter of minutes, even though the valid user was using one-time passcodes to make the transaction more secure. The details of this cyberattack are quoted below from an MIT Technology Review article, entitled "Real-Time Hackers Foil Two-Factor Security," Sep. 18, 2009, which states: In mid-July, an account manager at Ferma, a construction firm in Mountain View, Calif., logged into the company's bank account to pay bills, using a one-time password to make the transactions more secure. Yet the manager's computer had a hitchhiker. A forensic analysis performed later would reveal that an earlier visit to another website had allowed a malicious program to invade his computer. While the manager issued legitimate payments, the program initiated 27 transactions to various bank accounts, siphoning off $447,000 in a matter of minutes. "They not only got into my system here, they were able to ascertain how much they could draw, so they drew the limit," says Roy Ferrari, Ferma's president. The theft happened despite Ferma's use of a one-time password, a six-digit code issued by a small electronic device every 30 or 60 seconds. Online thieves have adapted to this additional security by creating special programs—real-time Trojan horses—that can issue transactions to a bank while the account holder is online, turning the one-time password into a weak link in the financial security chain. "I think it's a broken model," Ferrari says. Security experts say that banks and consumers alike need to adapt—that banks should offer their account holders more security and consumers should take more steps to stay secure, especially protecting the computers they use for financial transactions. "We have to fundamentally rethink how customers interact with their banks online," says Joe Stewart, director of malware (malicious software) research for security firm SecureWorks, in Atlanta, Ga. "Putting all the issues with the technology aside, if [attackers] can run their code on your system, they can do anything you can do on your computer. They can become you."
- There is now widespread understanding, both in popular and technical domains, of the theoretical and practical fragility of online transaction security. The RSA SecurID® token is the industry-leading technology for authenticating and securing identity in online transactions. The recent attack and subsequent breach of the RSA SecurID token (announced March 2011) has highlighted the fundamental problems with current cybersecurity solutions. Malware played a significant role in causing this breach. Malicious software has many forms: virus, worm, Trojan horse, spyware etc. all of which have the singular purpose of undermining the security, confidentiality, integrity or availability of computer systems. Recent über malware is invisible. It encrypts and camouflages itself using the same mathematical techniques used by traditional, white hat cryptography. Eric Filiol, “Malicious Cryptology and Mathematics,” Cryptography and Security in Computing (Intech, 2012), pp. 23-50. http://cdn.intechopen.com/pdfs/29700/InTechMalicious_cryptology_and_mathematics.pdf
- Malware is able to phish passwords or hijack financial transactions made via mobile devices or personal computers without the user's knowledge. It is not necessary for malware to break the cryptography of a device to compromise its security. Contemporary computers and electronic devices are particularly susceptible to malware attacks due to their processor architecture.
- Specifically, the processors have a von Neumann architecture, which only executes one computing instruction at a time. As a consequence, malware has to corrupt or transform only a single machine instruction to initiate execution of malignant code. This is a deep vulnerability arising from current processor architecture and it cannot be easily rectified. Only one legitimate jump or branch instruction needs to be changed in a digital computer program to start it executing malware. During machine execution, after the von Neumann machine program has been hijacked by malware, anti-virus software that is supposed to check the program might not get executed, may be disabled, or in other cases may never detect the malware. The sequential execution of von Neumann machine instructions hinders a digital computer program from protecting itself.
- A common malware technique is the so-called “man-in-the-middle” attack. This attack is an active form of eavesdropping in which the attacker makes independent connections with the counterparties in a given transaction; by using appropriate authentication the attacker controls the entire transaction. The counterparties are unaware of the presence of the attacker and assume they are transacting securely with each other. Internet communications and financial transactions can be intercepted and hijacked by malware (malicious software) performing a “man-in-the-middle” attack. These attacks are not easy to detect or prevent. In particular, the RSA SecurID breach demonstrated that pseudo-random number generators (i.e., deterministic algorithms), typically used in two-factor authentication solutions cannot prevent “man-in-the-middle” attacks launched by malware.
- Malware, however, has a significant weakness: malware is poor at recognizing visual images since computer algorithms cannot match the visual pattern recognition ability of the human brain. Human beings have highly advanced visual pattern recognition skills. The embodiments described here exploit this fundamental weakness of malware.
- A third fundamental shortcoming of current cybersecurity solutions is the fact that static authentication factors, such as passwords, PINs and biometrics, are entered directly into the user's computer or stored on computers in a digital or binary format such as ASCII code. This weakness makes static authentication factors vulnerable to phishing attacks in the host domain or security breaches in the network domain.
- In the following drawings, like reference numbers are used to refer to like elements. Although the following figures depict various examples, the one or more implementations are not limited to the examples depicted in the figures.
-
FIG. 1A shows a block diagram of an embodiment of a system for executing secure transactions resistant to malware. -
FIG. 1B shows a memory system that is a component of the system shown in FIG. 1A. -
FIG. 2A shows a block diagram of an embodiment of a service provider system. -
FIG. 2B shows a memory system that is a component of the system in FIG. 2A. -
FIG. 3 shows a flow diagram of a user setting up the system to enable the execution of secure transactions. -
FIG. 3A shows a flow diagram of an embodiment of step A of executing a secure transaction. -
FIG. 3B shows a flow diagram of an embodiment of step B of executing a secure transaction. -
FIG. 4 shows a collection of images that are parts or a whole of a logo. Some of the images are rotated. -
FIG. 5 shows a collection of images. One is part of a logo. There are 26 visual images of the alphabet letters “ABCDEFGHIJKLMNOPQRSTUVWXYZ”. The word NAME is made up of a collection of images with a doodle background texture. -
FIG. 6 shows a collection of images. One is part of a logo. Another is the word BANK with a simplicial and dotted texture background. There are 26 visual images of the alphabet letters “ABCDEFGHIJKLMNOPQRSTUVWXYZ”. -
FIG. 7 shows a collection of images. One is part of a logo. Another is the word “ACCEPT” written with bubble texture on a simplicial background texture. And a third is the word “ABORT” written with bubble texture on a foliation background texture. -
FIG. 8 shows a collection of images. One object is a geometric image of a blue rectangle on top of a blue triangle which is on top of a red blob. Just to the right is a rectangle with vertical texture on top of a triangle with dotted texture on top of a blob with simplicial texture. -
FIG. 9 shows a collection of images illustrating different textures, including vertical, horizontal, mixed, dotted, bubble, simplicial and foliation textures. -
FIG. 10 shows recipient account number 9568342710 represented by two distinct collections of images. In the top representation, the number 3 is represented with a visual image using a triangular texture. In the bottom representation the number 4 is represented with the letters "FOUR" using bubble texture to write the letters. There is also part of a logo in the lower left corner of FIG. 10. -
FIG. 11 shows a collection of images representing a universal identifier. A subset of these can be used for user authentication. -
FIG. 12 a shows a user interface page for enrollment. -
FIG. 12 b shows a user interface page for enrollment that displays different visual image categories. -
FIG. 12 c shows a user interface page for enrollment that displays different images in the “sports” category. One image represents “cycling”. Another image represents “tennis”. Another image represents “skiing”. -
FIG. 13 a shows a user interface page for user verification or user login using visual images. -
FIG. 13 b shows a user interface page for user verification that displays different visual images. One image displays an elephant. Another image displays a car. -
FIGS. 14 a, 14 b and 14 c show the use of correlation to detect and find the locations of features in an image. -
FIG. 14 a shows an image of the word “apple” in a handwritten style font. -
FIG. 14 b shows an image, representing the letter “p” in a handwritten style font. -
FIG. 14 c shows a correlation function image, indicating the detection of the presence and exact locations of the letter "p" in the image in FIG. 14 a, indicated by the bright peaks. -
FIGS. 15 a, 15 b and 15 c show the use of special types of noise to hinder the use of the correlation operation to find features in an image. -
FIG. 15 a shows an image, representing the word "apple" in the same handwritten style font as FIG. 14 a, but with non-deterministic noise added. -
FIG. 15 b shows a raw image, representing the letter “p” in a handwritten style font. -
FIG. 15 c shows an unintelligible correlation function image, indicating the inability to detect the locations of the "p" in the image in FIG. 15 a. -
FIG. 16 shows a semiconductor device that is a photodetector. This hardware device can detect the arrival of single photons, which is an embodiment of quantum randomness. -
FIG. 17 shows a device that receives a polarized photon and passes it through a linear horizontal/vertical analyzer, with a 50% chance of detecting a "0" or a "1". This hardware device can detect the polarization of single photons, which is an embodiment of quantum randomness. Under the device is a diagram representing a photon that is circularly polarized. Under the device, on the right, is a diagram representing a photon that is linearly polarized. -
FIG. 18 shows a random noise generator and a digital logic circuit that captures and outputs this randomness. Below the generator are the time delays between separate events that are detected. In an embodiment, the random noise generator may be implemented with a photodetector as shown in FIG. 16. In this embodiment, the arrival times of photons enable quantum randomness. - Although the issues discussed in the background or elsewhere may have motivated some of the subject matter disclosed below, nonetheless, the embodiments disclosed below do not necessarily solve all of the problems associated with the subject matter discussed in the background or elsewhere. Some embodiments only address one of the problems, and some embodiments do not solve any of the problems associated with the subject matter discussed in the background or elsewhere. In general, the word "embodiment" is used to specify an optional feature and/or configuration.
- A groundbreaking method for cybersecurity is described that is more secure against modern malware, and provides a much better user experience compared with passwords or hardware tokens such as SecurID. No More Passwords uses visual images that are selected by a user to create a set of “favorites” that can easily be recalled and quickly selected by the user at login.
- No More Passwords leverages the superior power of eye-brain processing of humans versus machines to ensure that a human, and not a bot or malware, is involved in a transaction or communication.
- Underlying the simplicity of this approach is a security technology that includes:
- A.) A non-deterministic random number generator hardware based on quantum physics.
- B.) Noise modification of images using the random number generator.
- C.) Visual Image morphing, positioning and reordering based on the random number generator.
- D.) Transaction-dependent passcodes.
- The application of all these methods in concert addresses current cybersecurity issues, as well as anticipating other possible approaches that hackers may attempt in the future, while the flexibility of the approach supports the creation of advanced, user-friendly user interface designs.
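- As an informal illustration of item D above, the following is a minimal sketch, in Python, of how a transaction-dependent passcode might be derived: the transaction details and a fresh random nonce (standing in for output of the non-deterministic hardware generator of item A) are combined and hashed, so the resulting code is valid only for that specific transaction. The function name, field layout and use of os.urandom are illustrative assumptions, not the prescribed implementation.

```python
import hashlib
import os

def transaction_passcode(account_from: str, account_to: str, amount: str, nonce: bytes) -> str:
    """Derive a short passcode that is bound to one specific transaction.

    `nonce` stands in for bytes taken from a non-deterministic hardware random
    number generator; os.urandom is used below only as a placeholder source.
    """
    # Concatenate the transaction details with the nonce and hash the result.
    message = "|".join([account_from, account_to, amount]).encode("utf-8") + nonce
    digest = hashlib.sha256(message).hexdigest()
    # Truncate to an 8-character code that a human can compare or re-enter.
    return digest[:8].upper()

if __name__ == "__main__":
    nonce = os.urandom(16)  # placeholder for hardware-generated randomness
    code = transaction_passcode("12345678", "9568342710", "447000.00", nonce)
    print("passcode for this transaction only:", code)
```

- Because such a passcode depends on both the transaction details and fresh randomness, a code captured by malware cannot be replayed against a different transaction.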
- Malware, phishing scams and other various forms of hacking and cybersecurity breaches have become a major issue today. The use of passwords is inadequate, inefficient and problematic for users and companies, and the problems with password use are increasing steadily.
- The invention(s) described herein uses the unique, innate pattern recognition skills of humans to transform cybersecurity. It advances online transaction security, which currently relies mainly on the straightforward use of passwords or, in some cases, the addition of other security enhancements that may provide some improvement in security, but are still inadequate. These measures typically increase the cost of the system while greatly reducing the convenience to the user.
- Malware resistant authentication and transaction authorization is provided through the combined application of various methods and embodiments. In an embodiment, this invention can eliminate the use of the alpha-numeric "password" such as "34YUiklmn" or a sequence of ASCII symbols such as "94Yzi2_e$mx&". The invention(s) herein also provides a basis for a much-improved user interface and overall user experience around securing online transactions, access control, and the protection of an individual's personal data and identity.
- The invention(s) described herein use visual representations (images) that are both personal and memorable to each individual user. There is an enrollment process in which the user selects a set of images from a group of categories representing the user's “favorites.” At verification (i.e., login time) the user is asked to select some or all personal favorites from a set of randomly-selected options as verification of both the user's identity and the fact that the user is in fact a human instead of an automated system that has hijacked the transaction flow. This approach has a number of advantages in terms of convenience to the user, while allowing anti-malware methods to be applied that provide substantial anti-hacking capability.
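- The enrollment-and-verification flow just described can be summarized in code. The sketch below is a simplified, assumed model (an in-memory store, Python's secrets module standing in for the non-deterministic hardware generator, and selections gathered across one or more screens); the data structures, catalog names and thresholds are illustrative rather than taken from the specification.

```python
import secrets

IMAGE_CATALOG = [f"img_{i:03d}" for i in range(200)]  # hypothetical image identifiers

def enroll(favorites: list[str]) -> set[str]:
    """Store the user's chosen favorite images (normally kept server side)."""
    return set(favorites)

def build_challenge(enrolled: set[str], screen_size: int = 9, favorites_shown: int = 1) -> list[str]:
    """Build one verification screen: a few favorites hidden among random decoys."""
    rng = secrets.SystemRandom()
    favorites = rng.sample(sorted(enrolled), favorites_shown)
    decoy_pool = [img for img in IMAGE_CATALOG if img not in enrolled]
    screen = favorites + rng.sample(decoy_pool, screen_size - favorites_shown)
    rng.shuffle(screen)  # unpredictable placement on every screen
    return screen

def verify(enrolled: set[str], selections: list[str], required: int = 4) -> bool:
    """Pass only if the user picked enough genuine favorites and no decoys."""
    picked = set(selections)
    return picked.issubset(enrolled) and len(picked) >= required

enrolled = enroll(["img_003", "img_042", "img_077", "img_101", "img_120", "img_150", "img_199"])
print(build_challenge(enrolled))
print(verify(enrolled, selections=["img_003", "img_042", "img_077", "img_101"]))
```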
- User Interface Design
- The use of visual images to create a unique identity for a user has many advantages: The system is not only highly secure and resistant to various hacks and malware attacks, but is also intuitive, easy to use and attractive to users. The core technology behind an identity security system should support a user interface (UI) that provides all of these benefits; there is sufficient flexibility in the UI design and a range of security-enhancing features that can be used together in various ways to allow the UI design to be tailored to the needs of both the user and the device (e.g., PC, iPhone, Android phone, iPad, tablet computer) on which it is being used.
- Since interaction with the user is a key part of the technology, it is helpful to describe a UI design example for two reasons: 1) to ensure that the technology is both effective and easy to use; 2) to help explain how embodiments work. The UI should be designed to run on the device(s) of choice within the intended application and tested for intuitiveness, ease of use, functionality, acceptance by and attractiveness to product users.
- The UI described in this section is intended as an example design, and shows how it might be implemented on a mobile phone. The example shown here is only meant to provide general clarity about what can be done with this technology and to serve as a high-level use case to describe the flow for creating and entering a unique login identity for a user.
- Enrollment.
- To enroll, the user first initiates the enrollment. The process will start with the launching of an application, or a request to enroll within a running application on a particular device such as a mobile phone, computer, terminal or website. In the example here, and figures below, the device is a mobile phone and the user starts the enrollment process by launching an app.
- Once launched, the application starts enrollment by displaying the first enrollment screen with a superimposed popup window that provides brief instructions for enrollment and a box in which the user is asked to enter a username. This is shown in
FIG. 12 a. - As soon as the username has been entered, the popup window disappears, showing the first enrollment screen that provides a list of categories for the user's “favorites”, as shown in
FIG. 12 b. These categories can include almost anything, such as animals, musical instruments, travel destinations, famous people, sports, etc. In this example, a few items from the list of categories are displayed on the screen, but the central portion of the screen with the category icons can be scrolled up or down to show other choices. The items that the user has currently chosen are shown as small icons at the bottom of the screen. This part of the display (and the header at the top) does not scroll, and provides a running tally of the user's choices throughout the process. This also serves as messaging to the user as to the progress of the enrollment process. - In an embodiment, after a category has been selected, a second screen appears showing specific items in the chosen category. This screen is shown in
FIG. 12 c. The tally of choices is carried over, and shown at the bottom of this screen as well. Again, in this example, the central portion of the screen with the icons can be scrolled up or down to display more than the nine items shown on the screen at one time. The user can then select his/her item from the available choices. Once a selection is made, the item chosen appears in the running tally below, and the display reverts back to the first enrollment screen which provides the category choices again. - This process is repeated seven times in this example. The number of choices required from the user for enrollment can be changed, depending on the security level required, and an acceptable enrollment process for a particular case. In general, the fewer the choices required by the user, the less secure the embodiment will be, but the trade off between security and ease of use is important, and should be decided on a case-by-case basis.
-
FIG. 12 ashows page 1 of an enrollment user interface (UI), which shows the popup window superimposed requesting entry of the username. In an embodiment, for added security, the username is obscured in a similar way as is typically accomplished with a password field. -
FIG. 12 b showspage 1 of an enrollment UI after the disappearance of the popup window, showing thesame screen 1 the scrollable “favorites” category selections, and the boxes at the bottom of the screen where the running tally of the user's choices will be displayed -
FIG. 12 c showsenrollment page 2 showing 9 of the specific item choices available, and allowing the user to scroll for more. In this diagram, the user had selected “sports” as the category on the previous page, and this is the third favorite, as indicated by the running tally below showing the two previous choices, and the note in the header that reads “selection 3.” - Verification (Login)
- While it may be acceptable for the enrollment process to take a few minutes, and require the user to be guided through multiple steps, verification should be as quick and simple as possible. This is well-known in biometrics since biometric devices usually require a series of steps to enroll. At verification, however, the expectation of the user is that the use of the technology will make verification of their identity not only more secure, but much easier and faster. The same is applicable with implementations. Despite widespread identity theft and hacking, many users are far more concerned with convenience than they are about security.
- In the example presented here, an enrolled user initiates verification by launching an image-enabled app, or requesting login to a local or remote system. Immediately, the verification screen appears with a randomized group of choices for the user, and a popup window superimposed that requests entry of the username, as depicted in
FIG. 13 a. Once the username is entered, the popup window disappears, exposing the screen with the randomized choices for the user to select. This is shown in FIG. 13 b. The options offered on a single screen contain at least one of the user's "favorite" images that were selected at enrollment, but also contain a number of other "incorrect" options that are selected randomly from a large set of options. The central portion of the screen with the icons of the selectable items can be scrolled up or down to expose more choices. In this example, to pass verification, the user chooses four of the seven favorite items that were chosen at enrollment. Once all favorites have been correctly selected, the screen disappears, and login proceeds. If the choices are incorrect, the login process starts over again from the beginning. For added security, there may be a limit placed on the number of failed attempts a user can make in a login session. -
FIG. 13 a shows a verification user interface (UI) page with the popup window superimposed requesting entry of the username. In an embodiment, the verification page is a single page to make verification quick and simple. Also note again that for added security, the username can be obscured in the same way as is typically done with a password field. -
FIG. 13 b shows the verification user interface page after the disappearance of the popup window showing the scrollable “favorites” selections that have been randomly selected from a large array of options. As indicated in the footer at the bottom of the screen, the application is ready for the third favorite out of a total of four required for verification. In other embodiments, more than four favorites may be requested for a successful login. In another embodiment, more than four favorites may be requested to complete a financial transaction. In other embodiments, less than four favorites may be requested for a successful login. - In embodiments, robust security is desired but also convenience and a positive experience of the user are also important. There is sometimes a tradeoff between security and convenience for the user, and this tradeoff is fundamental to security technology from the old-fashioned lock and key, to the most modern and sophisticated security technology used today.
- There is a correlation between the number of favorites required during enrollment, the number of favorites needed to verify, in the specific requirements of the order of the choices, and in the layout and presentation of the images themselves. For example, if the user is required to select his/her favorite images in the same order they were chosen at enrollment, this increases the security greatly, but makes remembering the images much easier than a password, since people memorize and remember by association. Each person has his own personal unique association, which makes this a natural approach to a stronger, more effective security system.
- It is helpful to note that the technology and embodiments have flexibility in this aspect, and that the choice of these parameters can be adjusted, not only from one application to another, but if desired, from one transaction to another. For example, if in an embodiment a user has chosen seven items at enrollment, he/she may be asked to select only four items to unlock the phone interface, but when logging into a bank account, he/she may be asked to enter all seven items. In an alternative embodiment, the user may be requested to select 12 items instead of 7. This means that the technology can be adjusted “on the fly” to accommodate varying security levels for different embodiments.
- In addition, as explained in a previous section, the use of images, plus the application of image processing, and non-deterministic random number generator, makes the UI and the system secure against sophisticated malware and hacking methods. The images shown in the UI diagrams above can be reordered, and the options offered can be changed using the non-deterministic random numbers on every screen during enrollment and verification. This removes the possibility of malware or onlookers recognizing patterns in what is being presented to the user, or following the user's behavior. As explained above, to address security, the images themselves are modified to prevent sophisticated malware from running in the background to recognize the images directly by means of computational pattern recognition. This can be accomplished by again using the non-deterministic random number generator to produce unpredictable parameters for the algorithms that modify the images using special types of noise, or applying rotation or translation to change the orientation or position of the image on the screen, or distorting the images slightly to change their shape. In fact, all of the above modifications can be applied simultaneously, randomly to each image, differently on every step in the enrollment or verification process, each time it is used. The same can be done to the text on the screen in order to make it unreadable by malware as well, if needed. Because the human eye/brain system is so highly adept at recognizing images, these modifications to the images can be made so that it is extremely difficult for sophisticated malware to recognize what is happening on the device, without spoiling the human user's experience
- As stated above, the UI design presented here is an example of how embodiments can be implemented. There are other UI embodiments that use visual images for login and entry of information a non-digital or non ASCII format. The intent is to highlight the main components that make up this system, while showing flexibility. The exact layout and features of the UI are up to the designer of the product or system which uses the technology. Depending on the details of the device, the application and the security requirements, the user interface may be configured very differently. On some systems, it may be best to guide users though a series of separate screens instead of scrolling. If scrolling is preferred, it can be done in one or two dimensions on the screen, or perhaps using scroll wheels, similar to those used in the Apple iPhone's date and time settings. In some cases, more category options, or sub category options may be useful. During the verification process, if preferred, the items can be categorized, similar to the example for enrollment, and it may be desirable to have all the choices displayed on a single screen, rather than offering more items to choose via scrolling, in which case the categories could be panelized on the screen.
- The choice of the images used is also to be considered. Simple binary images, such as those shown in the example of an embodiment, may be used in some embodiments. Full-color images could be used as well, depending on what sort of image processing is preferred for security enhancements. The shape and size of the images is flexible as well. The images chosen could even be opened up to the user by providing a large database of downloadable images, similar to the wide array of ringtones now available for cell phones. There may be some restrictions on the properties of the images used, however, again depending on the specifics of the security needs, the device, and the user interface design, but overall, it is extremely flexible.
- Security Advantages
- Given the dangers posed by malware, it is essential that recipients of internet dataflow in a transaction can be assured that the sender is human and the recipient (on the server side) is the actual institution (e.g., a bank) and not malware posing as a bank. The solution ensures a live human is reading, entering and broadcasting information. A GUI based on special processed images renders messages that are “unreadable” by machines or automated processes. This robust security solution is web server driven making it usable by personal computers, mobile devices and any device with a visual interface. Before describing the interface and GUI, we discuss some security advantages.
- Unpredictability
- On the web server, it uses one or more hardware devices that utilize fundamental laws of physics to generate non-deterministic random numbers. This is in contrast to the use of pseudo-random number generators in RSA SecurID, for example, which are based on deterministic algorithms. These unpredictable numbers are used for three major purposes:
- Unpredictable numbers are used to unpredictably place images on the screen.
- Unpredictable numbers are used to unpredictably change the image shape
- Unpredictable numbers are used to add unpredictable noise to images.
- Given this unpredictability at multiple sites, the sequence of images used for a login/authentication cannot be reproduced by a digital computer program because the numbers are not generated by a deterministic algorithm (i.e., a digital computer program). Instead, quantum devices are used. In some embodiments, the quantum devices utilize one or more photons being emitted from a device and generating a random 0 or 1 based on the time at which the photon is emitted.
- A well-designed quantum device can generate numbers according to the following two quantum-random properties of no bias and history has no effect on the next event.
- There is no bias: A single outcome xk of a bit sequence (x1 x2 . . . ) generated by quantum randomness is unbiased: P(xk=1)=P(xk=0)=½.
- History has no effect on the next event: Each outcome xk is independent of the history. There is no correlation that exists between previous or future outcomes. For each bj ∈ {0, 1}, P(xk=1|x1=b1, . . . , xk−1=bk−1)=½ and P(xk=0|x1=b1, . . . , xk−1=bk−1)=½.
- Let Π={(b1 b2 . . . ): bk ∈ {0, 1}} be the space of infinite sequences of 0's and 1's representing infinite quantum random bit sequences. It can be shown that if a quantum device producing the quantum randomness runs under ideal conditions to infinity, then the resulting infinite sequence of 0's and 1's (i.e., sequence in Π) is incomputable. In other words, no digital computer program (i.e., deterministic algorithm) can reproduce this infinite sequence of 0's and 1's. This incomputability of quantum random sequences is a useful property of non-deterministic random numbers. The resulting unpredictability incorporated into the image generation and manipulation in the system can make the recognition of the visual images a difficult artificial intelligence (AI) problem for machines. This unpredictability can be applied in the noise generation that is used to make visual images more difficult for machine algorithms to recognize.
- In an embodiment, a hardware device, as shown in
FIG. 17 , detects the polarization of photons and uses this detection to determine a quantum random 0 or 1. In an embodiment, the hardware detector uses linearly polarized photons (light). In an embodiment, the hardware detector uses circularly polarized photons (light). In an embodiment, a quantum random 0 or 1 is generated by the detection of a single photon. In an alternative embodiment, a quantum random 0 or 1 is generated by the detection of more than one photon. - In an embodiment, as shown in
FIG. 18, a quantum random 0 or 1 is generated based on the relative timing of quantum events. In FIG. 18, T1 is the time elapsed between quantum event 0 and quantum event 1; T2 is the time elapsed between quantum event 1 and quantum event 2. In an embodiment, if elapsed time T1 is greater than elapsed time T2 then a quantum random 1 is generated; if elapsed time T1 is less than elapsed time T2 then a quantum random 0 is generated. In an alternative embodiment, if elapsed time T1 is greater than elapsed time T2 then a quantum random 0 is generated; if elapsed time T1 is less than elapsed time T2 then a quantum random 1 is generated. In an embodiment, the quantum events are the detected arrivals of individual photons. - In an embodiment, the detection of a photon may occur in a semiconductor chip as shown in
FIG. 16 . - Noise
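- The timing rule described for FIG. 18 can be written directly in code. The sketch below assumes a list of photon-arrival timestamps from the detector, compares consecutive inter-arrival intervals T1 and T2, and emits a 1 when T1 > T2 and a 0 when T1 < T2, discarding the rare equal case; the timestamp format is an assumption made for the example.

```python
def bits_from_arrival_times(timestamps: list[float]) -> list[int]:
    """Convert photon arrival times into bits by comparing successive intervals.

    For events e0, e1, e2: T1 = t(e1) - t(e0) and T2 = t(e2) - t(e1).
    T1 > T2 -> 1, T1 < T2 -> 0, T1 == T2 -> discarded.
    """
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    bits = []
    # Use non-overlapping interval pairs so each comparison is independent.
    for t1, t2 in zip(intervals[0::2], intervals[1::2]):
        if t1 > t2:
            bits.append(1)
        elif t1 < t2:
            bits.append(0)
    return bits

# Example with made-up arrival times (in seconds):
print(bits_from_arrival_times([0.0, 0.4, 1.9, 2.1, 2.6, 3.9, 4.0]))
```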
- As the number, scope and value of transactions being conducted via the Internet and through the use of mobile devices increases, so do the incentives for hackers to apply ever greater resources to their craft. At the same time, the available computing power that can be applied by malware towards attacks of escalating sophistication is increasing. Smart phones today have unprecedented number crunching power; while this power can be used to create clever security systems, it can also be harnessed by malware at any node in the communication path to which malware can gain access.
- In embodiments, images help ensure that a human, not a machine, is controlling the transaction or the communication between the user and institution. This is based on the highly developed ability of humans to recognize images. Although machine vision is embryonic by comparison with the mature image recognition abilities of the human eye-brain combination, it is possible for machines to recognize images. In order to provide robust security in anticipation of the possibility that sophisticated malware may incorporate machine vision techniques to attack image-based security systems, proprietary methods were developed to counteract computational image recognition, and fully exploit innate human pattern recognition abilities.
- One widely used approach to computational pattern recognition is the correlation operation. This is a direct point-by-point mathematical “comparison” of two functions that can be used not only to detect the presence of a feature in an image, but to also find its location accurately. The continuous expression that describes the non-normalized correlation operation C between two real, one-dimensional functions A and B is:
-
- The ⊚ operator represents the correlation operation. In discrete form, as implemented in a digital computer, the correlation can be written as:
-
- This can be extended to two dimensions for use with images as:
-
- It can be further extended to be used with two dimensional images, as well as finding the rotational orientation of one image with respect to the other as:
-
- R is a rotation operator applied to A.
- One reason the correlation operation is so powerful and widely used is that the calculation of the correlation function can be done efficiently using a fast Fourier transform (FFT). Performing the correlation operation directly, point-by-point, can be done very rapidly with modern computers for small images, but the computational complexity increases as N2, where N is the number of data points in the image being cross correlated (for images of equal size). However, the correlation operation can be calculated using FFTs as follows:
-
A⊚B=IFFT((FFT(A))×(FFT(B))) - Here, A and B are the two image arrays and “IFFT” represents the inverse FFT operation. This computation scales with image size much more slowly and increases as N*log(N). In addition, since the FFT is so widely used for many data processing tasks, and FFTs are a common component of most floating-point benchmark tests for processors, many modern processors are designed with FFTs in mind and some are even optimized for performing FFTs. Therefore, for sufficiently large images, the use of FFTs to compare images is efficient. However, as the complexity of the correlation increases, for example if rotation is added, the computational load increases quickly, making computational pattern recognition more difficult.
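- For completeness, here is a small NumPy sketch of the FFT-based correlation described above, used to locate a template (such as the letter "p") inside a larger binary image; note that the template's transform is conjugated, which is what distinguishes correlation from convolution. The array contents are arbitrary test data, not the patent's images.

```python
import numpy as np

def correlate_fft(image: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Cross-correlate `template` against `image` using FFTs (O(N log N))."""
    padded = np.zeros_like(image, dtype=float)
    padded[:template.shape[0], :template.shape[1]] = template
    # Correlation via the frequency domain: IFFT(conj(FFT(template)) * FFT(image)).
    corr = np.fft.ifft2(np.conj(np.fft.fft2(padded)) * np.fft.fft2(image))
    return np.real(corr)

# Tiny demonstration: find a 2x2 feature inside a 6x6 image.
image = np.zeros((6, 6))
image[3:5, 1:3] = 1.0                  # the feature's top-left corner is at row 3, column 1
template = np.ones((2, 2))
corr = correlate_fft(image, template)
peak = tuple(int(v) for v in np.unravel_index(np.argmax(corr), corr.shape))
print("feature detected at", peak)     # prints (3, 1)
```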
- If the images are small enough, the use of FFTs for doing correlations will become inefficient compared with direct correlation because of the extra computations needed to perform the forward FFTs and the inverse FFT. However, image recognition using correlation operations can be extremely effective with the power of modern computers and the choice of direct correlation or the alternate use of FFTs to calculate the correlation function (depending on image size).
- It is important that the system be resistant to hacking through the use of correlation operations, and other computational pattern recognition techniques. Consequently, techniques can be applied to images to disrupt the use of correlation operations that either recognize images or locate features within an image, yet the image remains fully recognizable by a living human observer.
- One of these techniques is the processing of the image using a specialized noise structure to create a “noise modified image.” There are several different noise structures that can use the non-deterministic random numbers generated by quantum physics-based hardware. Having various noise structures further enhances the security of the technique because the type of noise used to modify the image can be varied.
- An example of using the noise structure is demonstrated in
FIGS. 14 and 15 below. In the binarized (black or white pixels only) image inFIG. 14 a, both the presence and exact locations of the letter p are found in the word apple using a correlation operation. When the “noise modified image” is correlated with an exact copy of the letters used in the base image, the result is unintelligible noise as shown inFIG. 15 c. -
FIGS. 14 a, 14 b and 14 c show the use of the correlation operation to detect and find the locations of features in an image. FIG. 14 a shows an image containing the word "apple" in a handwritten style font. FIG. 14 b shows an image of the letter "p" in the same font used in the image. FIG. 14 c shows the correlation function image showing the detection of the presence, and exact locations, of the letter "p" in the image in FIG. 14 a, indicated by the bright peaks in FIG. 14 c. -
FIGS. 15 a, 15 b and 15 c show the addition of special types of noise to defeat the use of the correlation operation to find features in an image.FIG. 15 a shows an image containing the same word “apple” in the handwritten style font fromFIG. 14 a but with a special type of noise added that enhances the contrast of the noise over the letters versus the background, where the noise contrast is reduced. -
FIG. 15 b shows the raw image of the letter “p” in the same font used in the original image before the noise is added. -
FIG. 15 c shows the correlation function image which is unintelligible, indicating the inability to detect the presence or locations in the letter “p” in the image inFIG. 15 a. - In addition to the various noise structures that can be used, other randomized mathematical transformations can be applied to the images to make them even more difficult for machine algorithms to hack. These transformations include (1) translation, as in the figures above with the letters in the word “apple” being shifted up and down randomly; (2) rotation; (3) various types of morphing, including size and aspect ratio changes as well as both linear and non-linear geometric distortion. All of these transformations can be based on the non-deterministic random number generator for maximum security. Several of these different modifications can all be applied to a single image simultaneously, making recognition by a machine nearly impossible. Again, the image of the word “apple” in
FIG. 15 a is an example. Here, the letters are distorted slightly in shape and size, their positions are randomly altered, and the noise structure is applied. - These noise methods may be applied to number images (e.g., images of the
numbers - In some embodiments, security solutions are provided for secure transactions against untrusted browser attacks and other cyberattacks. In some embodiments, the solution(s) described in the specification secure payment transactions. In other embodiments, the solution(s) may secure access and use of private networks such as Secret Internet Protocol Router Network (SIPRnet) or resources on a public infrastructure such as the electrical grid.
-
FIG. 1A shows an embodiment of asystem 100 for providing secure transactions. In an embodiment,system 100 may includeuser system 101, anduser system 101 may includesecure area 102,secure memory system 104,secure processor system 106,output system 108,input system 110,sensor 111,communication system 112,memory system 114,processor system 116, input/output system 118,operating system 120, and network interface 122.System 100 may also includenetwork 124 andservice provider system 126. In other embodiments,system 100 may not have all of the elements or features listed and/or may have other elements or features instead of, or in addition to, those listed. -
System 100 is a system within which a secure transaction takes place (FIGS. 1A , 1B, 2A, 2B, 3, 3A, and 3B describe various details ofsystem 100 and various methods for using system 100). In this specification the word system refers to any device or system of devices that communicate with one another.User system 101 is one that has a secure area that is dedicated for performing secure transactions over a network.User system 101 may be a single device or a combination of multiple devices.User system 101 may be a portable device, personal computer, laptop, tablet computer, handheld computer, mobile phone, or other network system, for example (in this specification a network system is any device or system that is capable of sending and/or receiving communications via a network). In an embodiment, asecure area 102 may be provided for performing secure transactions. In this specification, authentication information references to any form of information used for authenticating a user. In an embodiment, withinsecure area 102, authentication information, such as a biometric authentication and/or another form of authentication is bound to the authorization of an action. In other words, the authentication information is in some way combined with the information for performing the action, such as by being concatenated together and then applying a hash function to the result of the concatenation. In this specification, the words “action” and “transaction” may be switched one with another to obtain different embodiments. Throughout this specification, whenever information is disclosed as being combined, the information may be concatenated, added together (e.g., in a binary addition of the binary values of information), be different inputs to the same function, and/or combined in another manner. - A hash function, denoted by Φ, is a function that accepts as its input argument an arbitrarily long string of bits (or bytes) and produces a fixed-size output. In other words, a hash function maps a variable length message m to a fixed-sized output, Φ(m). Typical output sizes range from 160 bits, 256 bits, 512 bits, or can also be substantially larger.
- An ideal hash function is a function Φ whose output is uniformly distributed in the following way: Suppose the output size of Φ is n bits. If the message m is chosen randomly, then for each of the 2n possible outputs z, the probability that Φ(m)=z is 2−n. In an embodiment, the hash functions that are used are one-way. A one-way function Φ has the property that given an output value z, it is computationally extremely difficult to find a message mz such that Φ(mz)=z. In other words, a one-way function Φ is a function that can be easily computed, but that its inverse Φ−1 is extremely difficult to compute. Other types of one way functions may be used in place of a hash function.
- Any of a number of hash functions may be used. One possible hash function is SHA-1, designed by the National Security Agency and standardized by NIST. The output size of SHA-1 is 160 bits. Other alternative hash functions are of the type that conform with the standard SHA-256, which produces output values of 256 bits, and SHA-512, which produces output values of 512 bits. A hash function could be one of the SHA-3 candidates. A candidate example of a hash function is BLAKE. Another example of a hash function is GrØstl. Another example of a hash function is JH. Another example of a hash function is Keccak. Another example of a hash function is Skein.
- In an embodiment,
secure area 102 may have its own secure processor system and secure memory system, which are not accessible by the rest ofuser system 101.Secure area 102 may be capable of taking over and/or blocking access to other parts ofuser system 101. -
Secure memory system 104 may be a dedicated memory for securing transactions. In an embodiment,secure memory system 104 may not be accessed by the other processor systems ofuser system 101.Memory system 104 may include, for example, any one of, some of, any combination of, or all of a long-term storage system, such as a hard drive; a short-term storage system, such as random access memory; a removable storage system, such as a floppy drive or a removable drive; and/or flash memory.Memory system 104 may include one or more machine-readable mediums that may store a variety of different types of information.Secure memory system 104 may store methods and information needed to perform the secure transaction, user information, a method of generating a registration key, and encryption/decryption code.Secure memory system 104 may include one or more memory units that each write and/or read to one or more machine readable media. The term machine-readable medium is used to refer to any non-transient medium capable carrying information that is readable by a machine. One example of a machine-readable medium is a computer-readable medium. Another example of a machine-readable medium is paper having holes that are detected that trigger different mechanical, electrical, and/or logic responses. The content ofsecure memory 104 is discussed further inFIG. 1B , below. -
Secure processor system 106 may include one or more processors.Processor system 116 may include any one of, some of, any combination of, or all of multiple parallel processors, a single processor, a system of processors having one or more central processors and/or one or more specialized processors dedicated to specific tasks.Processor system 116 implements the machine instructions stored inmemory 114.Secure processor system 106 may include one or more processors that cannot be accessed by the main processor of theuser system 101. For example, in an embodiment all of the processors ofsecure processor system 106 cannot be accessed by the main processor ofsystem 101. In an embodiment, the operating system ofuser system 101 may have no access to securearea 102, and in an embodiment,secure area 102 may be programmed without benefit of an operating system, so that there is no standard manner of programmingsecure area 102, which thwarts hackers from sending read and/or write commands (or any other commands) to securearea 102, because secure area does not use standard read and write commands (and does not use any other standard commands). As a consequence, providingsecure area 102 addresses the weakness of biometric authentication and other authentication methods. -
Output system 108 may include any one of, some of, any combination of, or all of a monitor system, a handheld display system, a printer system, a speaker system, a connection or interface system to a sound system, an interface system to peripheral devices and/or a connection and/or interface system to a computer system, intranet, and/or internet, for example. In an embodiment,secure processor system 106 may be capable of taking over and using any portion of and/or all ofoutput system 108. In an embodiment, a portion of the output system may be a dedicated display system that may be accessed only bysecure area 102. In an embodiment,secure processor 106 may be capable of receiving input frominput system 110 and/or blocking access tooutput system 108 by the main processor system and/or other devices. -
Input system 110 may include any one of, some of, any combination of, or all of abiometric sensor 111, a keyboard system, a touch sensitive screen, a tablet pen, a stylus, a mouse system, a track ball system, a track pad system, buttons on a handheld system, a scanner system, a microphone system, a connection to a sound system, and/or a connection and/or interface system to a computer system, intranet, and/or internet (e.g. IrDA, USB). In an embodiment,biometric sensor 111 may be a finger print scanner or a retinal scanner. In an embodiment,user system 101 stores the processed data from user information 104B during registration. In anembodiment user system 101 retrieves user information 104B and compares the scanned output ofsensor 111 to user information 104B to authenticate a user. In an embodimentsecure processor 106 may be capable of receiving input frominput system 110 and/or blocking access toinput system 110 by the main processor system and/or other devices. In at least one embodiment,processor 116 may capture pressure (e.g., pressing fingers) events on a touch sensitive screen or a mouse clicking corresponding to something of interest (e.g., a visual image) on a PC display.FIG. 5 shows images of part of an icon, the word “NAME” and the letters of the alphabet “ABCDEFGHIJLKLMNOPQRSTUVWXYZ”. -
Communication system 112 communicativelylinks output system 108,input system 110,memory system 114,processor system 116, and/or input/output system 118 to each other.Communications system 112 may include any one of, some of, any combination of, or all of electrical cables, fiber optic cables, and/or means of sending signals through air or water (e.g. wireless communications), or the like. Some examples of means of sending signals through air and/or water include systems for transmitting electromagnetic waves such as infrared and/or radio waves and/or systems for sending sound waves. -
Memory system 114 may include, for example, any one of, some of, any combination of, or all of a long-term storage system, such as a hard drive; a short-term storage system, such as random access memory; a removable storage system, such as a floppy drive or a removable drive; and/or flash memory.Memory system 114 may include one or more machine-readable mediums that may store a variety of different types of information.Memory system 114 andmemory system 104 may use the same type memory units and/or machine readable media.Memory system 114 may also store the operating system ofuser system 101 and/or a web browser (which may also be referred to as an HTTP client). In embodiment,memory system 114 may also store instructions forinput system 110 to read in biometric data and send the biometric data to securearea 102. -
Processor system 116 may include one or more processors.Processor system 116 may include any one of, some of, any combination of, or all of multiple parallel processors, a single processor, a system of processors having one or more central processors and/or one or more specialized processors dedicated to specific tasks.Processor system 116 implements the machine instructions stored inmemory 114. In an embodiment,processor 116 does not have access to securearea 102. In at least one embodiment,processor 116 may capture pressure (e.g., pressing fingers) events on a touch sensitive screen or a mouse clicking corresponding to something of interest (e.g., a visual image) on a PC display. - In an embodiment, clicking on the red letter “R” (e.g., via
image entry 179 inFIG. 1B ) shown at the bottom of theFIG. 6 would have a similar effect to typing the letter “R” on the keyboard but would make it more difficult for malware to know what the user is entering. - In an alternative embodiment,
processor 116 only communicates to securearea 102 whensecure area 102 authorizesprocessor 116 to communicate withsecure area 102.Secure area 102 may preventprocessor 116 from communicating to secure 102 during the secure area's execution of critical operations such as setup, generation of keys, registration key, biometric authentication or decryption of transaction information. - Input/
output system 118 may include devices that have the dual function as input and output devices. For example, input/output system 118 may include one or more touch sensitive screens, which display an image and therefore are an output device and accept input when the screens are pressed by a finger or stylus, for example. In at least one embodiment, the user may see visual images of letters on a screen as shown inFIG. 5 . InFIG. 5 , pressing a finger over the letter “B” shown just below the word NAME would indicate typing or entering the letter “B”. - The touch sensitive screen may be sensitive to heat and/or pressure. One or more of the input/output devices may be sensitive to a voltage or current produced by a stylus, for example. Input/
output system 118 is optional, and may be used in addition to or in place ofoutput system 108 and/orinput device 110. In an embodiment, a portion of the input/output system 118 may be dedicated to secure transactions providing access only to securearea 102. In an embodiment,secure processor 106 may be capable of receiving/sending input/output from/viainput system 110 and/or blocking access toinput system 110 by the main processor system and/or other devices. Restricting access to a portion of and/or all of the input/output system 118 denies access to third party systems trying to hijack the secure transaction. -
Operating system 120 may be a set of machine instructions, stored inmemory system 110, to manageoutput system 108,input system 110,memory system 114, input/output system 118 andprocessor system 116.Operating system 120 may not have access to securearea 102. Network interface 122 may be an interface that connectsuser system 101 with the network. Network interface 122 may be part of input/output system 118. -
Network 124 may be any network and/or combination of networks of devices that communicate with one another (e.g., and combination of the Internet, telephone networks, and/or mobile phone networks). Service provider system 126 (which will be discussed further in conjunction withFIG. 2A ) may receive the transactions. The recipient may be the final recipient or an intermediary recipient of transactions. -
Service provider system 126 may be a financial institution or a recipient of a secure transaction.User system 101 may interact with any of a variety of service provider systems, such asservice provider system 126, via anetwork 124, using a network interface 122.Service provider system 126 may be a system of one or more computers or another electronic device, and may be operated by a person that grants a particular user access to its resources or enables a particular event (e.g., a financial transaction, a stock trade, or landing a plane at an airport, and so on). - Methods for securing transactions are disclosed in this specification, which may be implemented using
system 100. A financial transaction may be an instance or embodiment of a transaction. Further, a stock trade is one embodiment of a financial transaction; a bank wire transfer is an embodiment of a financial transaction and an online credit card payment is an embodiment of a financial transaction. Any operation(s) that runs in a trusted environment, which may besecure area 102 may be treated as a secure transaction. In an embodiment, every secure transaction may include one or more atomic operations and the use of the word transaction is generic to both financial transactions and operations including atomic operations unless stated otherwise. In this specification, the word transactions is also generic to an individual or indivisible set of operations that must succeed or fail atomically (i.e., as a complete unit that cannot remain in an intermediate state). Operations that require security may include operations that make use of, or rely on, the confidentiality, integrity, authenticity, authority, and/or accountability of a system should be executed in a trusted environment (e.g., in a secure area, such as secure area 102). Types of operations that require security may be treated as secure transactions. Further, a successful transaction other than logging information alters a system (e.g., of service provider 126) from one known, good state to another, while a failed transaction does not. To be sure that a transaction results in a change of state only when the transaction is successful—particularly in systems that handle simultaneous actions—rollbacks, rollforwards, and deadlock handling mechanisms may be employed to assure atomicity and system state integrity, so that if there is an error in the transaction, the transaction does not take effect or does not cause an unacceptable state to occur. - In at least one embodiment, a secure transaction assures the following properties:
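- The all-or-nothing behavior described here is what database transactions provide. The sketch below, using Python's built-in sqlite3 module and a hypothetical two-account transfer, either applies both legs of the transfer or rolls back so that neither takes effect; it illustrates atomicity in general, not the patent's specific mechanism.

```python
import sqlite3

def transfer(conn: sqlite3.Connection, src: str, dst: str, amount: int) -> None:
    """Move funds atomically: both updates commit together or neither takes effect."""
    try:
        with conn:  # opens a transaction; commits on success, rolls back on exception
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?", (amount, src))
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?", (amount, dst))
            remaining = conn.execute("SELECT balance FROM accounts WHERE id = ?", (src,)).fetchone()
            if remaining[0] < 0:
                raise ValueError("insufficient funds")  # forces a rollback of both updates
    except ValueError:
        pass  # the database is left in its previous known-good state

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [("A", 100), ("B", 0)])
conn.commit()
transfer(conn, "A", "B", 250)   # rejected: balances stay at 100 and 0
transfer(conn, "A", "B", 40)    # applied: balances become 60 and 40
print(conn.execute("SELECT id, balance FROM accounts ORDER BY id").fetchall())
```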
- A. Availability: Having timely and reliable access to a transactional resource.
- B. Confidentiality: Ensuring that transactional information is accessible only to those authorized to use the transactional information.
- C. Integrity: Ensuring that transactional information is protected from unauthorized modification.
- D. Authentication: Ensuring that transactional resources and users accessing the transactional resources are correctly labeled (identified).
- E. Authorization: Ensuring that only authorized users have access rights to transactional resources.
- F. Accounting: Ensuring that a transaction cannot be repudiated. Any operation that handles or provides access to data deemed too sensitive for an untrusted environment (e.g., any private data) may be treated as a secure transaction to ensure that information leakage does not occur.
- In at least one embodiment, these functionalities may be processed using a mobile phone. Some examples of a mobile phone are an Android phone, the iPhone and the Blackberry. In at least one embodiment, a secure chip or secure part of the chip may reside in a personal computer. In at least one embodiment involving a mobile phone or computer, a secure chip may be temporarily or permanently disconnected from the rest of the system so that the
operating system 120 does not have access to critical information entered into and received (e.g., read or heard) from the secure area's user interface. In at least one embodiment, this critical information may be authentication information, such as a collection of images, biometric information, passwords, passcodes, PINS, other kinds of authentication factors, transaction information, and/or other user credentials. - In at least one embodiment in which
user system 101 is a portable device, the portable device may have a user interface with a keyboard and mouse or display screen that is sensitive to the placement of fingers enables the user to select buttons, images, letters, numbers or symbols. In at least one embodiment, the screen may be used to select one or more images. As an example,FIG. 7 shows the choice of selecting “ACCEPT” or “ABORT” using images. The selection is captured byimage entry 179 shown inFIG. 1B . At least one embodiment may enable the user to enter transaction information using this keyboard and mouse or the display screen. - Portable embodiments of
user system 101 enable users to execute secure transactions in remote places such as inside a jet, on a golf course, inside a moving automobile, from a hotel room, in a satellite, at a military gate, and/or other isolated places. - In at least one embodiment, a person may be requested to choose their favorite food and he or she may select an apple image—via the user interface—as user verification. In another instance at a later time, a transaction may require a person to select one or more images (i.e., a collection of images) from a display screen. Example images could be a picture or photo of an orange, a train, a specific pattern such as a peace sign or a diagram or a logo, a Mercedes car, a house, a candle, the Golden Gate bridge or a pen.
-
FIG. 4 shows images of parts of a logo.FIG. 9 shows different texture images: horizontal, vertical, triangular, mixed, dotted, bubble, simplicial, and foliation. At the bottom ofFIG. 8 is a geometric image of a blue rectangle on top of a blue triangle which is on top of a red blob. Just to the right inFIG. 8 , is a rectangle with vertical texture on top of a triangle with dotted texture on top of a blob with simplicial texture.FIG. 5 shows images of the alphabet letters: “ABCDEFGHIJKLMNOPQRSTUVWXYZ”. In at least one embodiment, during setup the person may add his or her own images usingimage acquisition 173, which are then used for user verification during the transaction. When images are a part of the user verification process, a display screen may be used, which may callimage display 177. - Although some embodiments of
user system 101 below may be described as using collections of visual images as a user's universal identifier or as user authentication, other items or a combination of these items may be used for verifying the identity of the person such as face prints, iris scans, finger veins, DNA, toe prints, palm prints, handprints, voice prints and/or footprints. Any place, the expression “biometric prints” occurs any of the above listed different specific types of biometrics may be substituted to get specific embodiments. In terms of what a person knows, the authentication items may be PINs, passwords, sequences, collections of images that are easy to remember, and/or even psychometrics. In an embodiment, the item used to verify the person may be any item that is unique. In an embodiment, the item(s) used to verify the person may be one or more items that as a combination are difficult for malware to fabricate, guess, find by trial and error, and/or compute. In an embodiment, the item(s) used to verify the person are uniquely associated with this person. In an embodiment, the item used to verify the person has an unpredictable element. - In at least one embodiment, there is a
secure area 102 that may be a specialized part of the chip (e.g., a microprocessor), where the operating system 120 and web browser software do not have access to this specialized part of the chip. In at least one embodiment, a specialized part of the chip may be able to turn off the operating system 120's access to presses of the buttons or a screen of a mobile phone (or other computing device), preventing malware and key or screen logging software from intercepting a PIN or the selection of an image. In at least one embodiment, a specialized part of the chip may be able to temporarily disconnect the rest of the chip's access to the screen (e.g., by preventing the execution of the operating system 120 and web browser). In at least one embodiment, part of the display screen may be permanently disconnected from the part of the chip (e.g., from the microprocessor of the chip) that executes the operating system 120 and web browser. In at least one embodiment, a part of the chip may only have access to the biometric sensor, while the rest of the chip—executing the operating system 120 and web browser—is permanently disconnected from the biometric sensor. - In at least one embodiment, there is a secure area, such as
secure area 102, that executes biometric acquisition and/or storage of cryptography keys and other user credentials, which may be created from the biometric prints, created from unpredictable physical processes in secure area 102, or created from a combination of the biometric prints and unpredictable processes. In at least one embodiment, photons may be produced by the hardware as a part of the unpredictable process. In at least one embodiment, the unpredictable process may be produced by a specialized circuit in the secure area. - In yet another embodiment of the invention, biometric prints and/or unpredictable information from an unpredictable physical process are used to generate one or more keys in the
secure area 102. Thesecure area 102 may include embedded software. In at least one embodiment, the embedded software is on a chip with a physical barrier around the chip to hinder reverse engineering of the chip, and/or hinder access to keys, transaction information, and/or possibly other user credentials. - By executing software from
service provider system 126, the selection of visual images, using image entry 179, is less susceptible to theft because the images can be displayed on the screen in a form that is not easily recognizable or captured by malware. Because they are difficult for malware to recognize or apprehend, they can be presented by image display 177 in a less secure part of the system, such as operating system 120 running a web browser. Each of the above embodiments may be used separately from one another or in combination with any of the other embodiments. All of the embodiments of this specification may be used together or separately. - To provide additional security, some embodiments may use a
secure area 102 that may be part of user system 101 or a special part of the chip that is able to acquire biometric prints, store authentication information, and/or authenticate the newly acquired items. The authentication information may include templates of biometric prints, images, PINs, and/or passwords. The secure area may also be a part of the device where critical transaction information may be entered or verified on a display that only the secure area has access to. In at least one embodiment, the host computer (domain) and the network have no access to the transaction information, no access to the keys, no access to biometrics, and/or no access to other critical user credentials (the transaction information, the keys, the biometrics, and/or other critical user credentials may be contained and processed by the secure area). - In this specification, transaction information refers to one or more items of information that describe the transaction. For a payment transaction, one item of transaction information may be the name of the person or entity sending the money. Another item of transaction information may be the name of the person or entity receiving the money. Another item of transaction information may be the date or time of day. Another item of transaction information may be the sending person's (or entity's) account number. Another item of transaction information may be the receiving person's (or entity's) bank account number.
FIG. 10 shows a recipient account number 9568342710 with a collection of visual images. - The sending person or entity is the person or entity that sends a message that is part of the transaction, and the receiving person or entity (recipient) is the person or entity that receives the message that is part of the transaction. Another item of transaction information may be the sending person's (or entity's) routing number. Another item of transaction information may be the receiving person's (or entity's) routing number. Another item of transaction information may be the amount of money, which may be expressed in dollars, euros, yen, francs, deutschmarks, yuan, or another currency.
- During setup, one or more images may be acquired by using
image acquisition 173 in user system 101. These one or more images may serve as a user's universal identifier or provide a method to authenticate the user. An example of one or more images that may serve as a universal identifier is shown in FIG. 11. In at least one embodiment, no images are stored in user system 101. In at least one embodiment, these images are acquired, encrypted by image encrypt/decrypt 175, and transmitted to service provider system 126. During setup, in at least one embodiment, the user may select a background texture that was generated by image generator 238 in service provider system 126. FIG. 9 shows some examples of textures. - In at least one embodiment, a symbol, letter, number and/or image texture may be selected or generated. As an example,
FIG. 8 shows the word "SIMPLICIAL" written with a bubble texture against a simplicial background texture. In at least one embodiment, a unique icon or image may be chosen or generated by the user system 101 and/or the user and/or service provider system 126. - In at least one embodiment, the user makes sure that a recognizable image generated by
image generator 238, which is known only to service provider system 126, appears on the user interface. FIG. 4 shows an example of different parts of a unique logo that may serve this purpose. The use of a recognizable image in different forms helps hinder malware from capturing important setup information and helps assure that the user is communicating with the appropriate service provider system 126. - During setup, in at least one embodiment, some initial transaction information is provided to
service provider system 126. This transaction information may include the user's name, the user's bank account number, and the user's bank. In at least one embodiment, some of this transaction information, provided via image entry 179 to service provider system 126, may be provided by using images (i.e., acquired with image acquisition 173) that are difficult for malware to capture or apprehend. - In at least one embodiment, during setup one or more biometric prints may be acquired, and one or more unique registration keys and cryptography keys may be generated from one or more of the biometric prints (items), from an unpredictable physical process, or from both. In at least one embodiment, the unpredictable physical process may come from a hardware chip or hardware circuit that uses photons as a part of the unpredictable process to create the cryptography keys. During authentication, if the acquired biometric print is an acceptable match, then a sequence of transaction steps that make up the complete transaction may be initiated.
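For illustration only, the following minimal sketch shows one way such a key derivation could look. It is not the specification's prescribed method: the function names are hypothetical, and `os.urandom` stands in for the unpredictable hardware (e.g., photon-based) process; a real implementation would run inside secure area 102.

```python
import hashlib
import os

def derive_keys(biometric_template: bytes, user_info: bytes = b""):
    """Hypothetical sketch: derive a registration key R and an initial
    cryptography key K from a biometric template plus unpredictable
    hardware noise (os.urandom stands in for a photon-based source)."""
    hardware_noise = os.urandom(32)              # placeholder for non-deterministic hardware
    R = hashlib.sha256(biometric_template + hardware_noise).digest()
    K = hashlib.sha256(R + user_info).digest()   # initial cryptography key derived from R
    return R, K
```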
- In embodiments using a secure area, the software that
secure area 102 executes may be embedded in secure memory 104. In an embodiment, there is no operating system on the device or on secure area 102 of user system 101. In an alternative embodiment, there is an operating system. The secure biometric print device has a number of components, which are described later. The security of the secure area 102 may be enhanced by any one of, any combination of, or all of (1) the use of embedded software, (2) the lack of an operating system, and (3) the secure area being at least part of a self-contained device not connected to a computer or the internet. For example, the unit that includes the secure area may contain its own processor. In an embodiment, the secure area may not have any of these security enhancing features. The biometric sensor enables user system 101 to read biometric prints. The biometric sensor may include a fingerprint area sensor or a fingerprint sweep sensor, for example. In at least one embodiment, the biometric sensor may contain an optical sensor that may acquire one or more types of biometrics. In at least one embodiment, the biometric sensor may be a microphone or other kind of sensor that receives acoustic information, such as a person's voice. In at least one embodiment, the sensor may be a device that acquires DNA or RNA. In an embodiment, secure processor system 106 may execute the software instructions, such as acquiring a biometric print from the sensor, matching an acquired biometric print against a stored biometric print, sending communication and control commands to a display, and/or encrypting the registration key and transmitting the registration key to the administrator when the user and administrator are not in the same physical location. By including processor system 106 in secure area 102, the security is enhanced, because the external processor is given fewer chances to inspect the contents of secure area 102. Alternatively, secure area 102 may store software instructions that are run by secure processor system 106. Processor system 106 performs the biometric print acquisition, and/or the encryption or decryption. Alternatively, a specialized logic circuit is built that carries out the functions that the software causes the processors to perform, such as driving sensor 111 (which may be an acquisition unit, such as a biometric sensor).
Secure memory system 104 may contain non-volatile memory in addition to volatile memory. Non-volatile memory enables the device to permanently store information for generating cryptography keys (encryption or decryption). In another embodiment,secure memory system 104 may include memory onsecure processor system 106. In another embodiment, the sensor orinput system 110 andsecure processor system 106 may be integrated into a single chip. Alternatively, in another embodiment, the sensor ininput system 110 andsecure processor system 106 may be two separate chips. -
FIG. 1B shows an embodiment of a block diagram of the contents of memory system 104 of FIG. 1A. Memory system 104 may include instructions 152, which in turn may include a setup routine 154, an authentication of user routine 156, and a secure transaction routine 158, having an initial request routine 160, a service provider authentication routine 162, and a completion of transaction routine 164. Instructions 152 (of memory 104) may also include registration key generator 166, drivers 168, controller 169, generate cryptography key 170, perturb cryptography key 174, hash functions 178, perturbing functions 180, and user interface 181. Memory system 104 may also store data 182, which may include biometric template T 184, registration key R 186, current cryptography key K 188, and transaction information S 192. In other embodiments, memory system 104 may not have all of the elements or features listed and/or may have other elements or features instead of, or in addition to, those listed.
Instructions 152 may include machine instructions implemented by processor 106. Setup routine 154 is a routine that handles the setting up of the user system 101, so that user system 101 may be used for performing secure transactions. Setup routine 154 may collect a new user's biometric print and apply a hash function to the biometric print (and/or to other user information) to generate a registration key R. In at least one embodiment, there may be specialized hardware in the secure area to help create the unpredictability used for the generation of cryptography key(s), seed(s), and/or registration key(s). Alternatively, a registration key, seed, or cryptography key may be generated by applying the hash function to the raw biometric print data, for example. Similarly, setup routine 154 may apply a hash function to authentication information, such as a biometric print, to hardware noise produced by a phototransistor, and/or to other user information, or to a combination of these, to generate an initial cryptography key. The setup routine 154 may also send the registration key and/or the cryptography key to the service provider system 126. In another embodiment, the registration key R and/or the initial cryptography key may be received from service provider 126. - Authentication of user routine 156 may authenticate the user each time the user attempts to use
user system 101. This routine may callimage acquisition 173 to acquire a collection images for user authentication. For example,user system 101 may include a biometric sensor (e.g., as sensor 111) that scans the user's biometric print, reduces the biometric print to a template, and matches the newly derived biometric template to a stored template (which was obtain by setup routine 154). Then, if the stored template and the newly derived template match, the user is allowed to useuser system 101. - In an alternative embodiment, a biometric print acquired may be directly matched with a stored template. Alternatively or additionally, authentication of user routine 156 may require the user to enter a password. If the password received and the password stored match, the user is allowed to use
user system 101. -
Secure transaction routine 158 is a routine that implements the secure transaction. Theinitial request routine 160 is a first phase ofsecure transaction routine 158. One purpose ofinitial request routine 160 is to receive a selection of images known to the user and acting as a user authentication that are difficult for malware to recognize or apprehend and transaction information entered and represented as images that are difficult for malware to recognize or apprehend. The transaction information is encrypted with the cryptography key. The encrypted transaction information and encrypted user authentication—both represented as images before encryption—are sent to the service provider. Duringinitial request routine 160, the cryptography key may perturbed to obtain a new cryptography key, respectively. In an alternative embodiment, the cryptography key is not changed each time - Service
provider authentication routine 162 authenticates the information provided by the service provider. The collection of images, representing the user's universal identifier or user authentication, received byservice provider 126 tosystem 101 in reply toinitial request 160 may be authenticated by serviceprovider authentication routine 162. -
Drivers 168 may include drivers for controlling input and output devices, such as the keyboard, a monitor, a pointing device (e.g., a mouse and/or a touch pad), a biometric print sensor (for collecting biometric prints).Controller 169 may include one or more machine instructions for taking control of the keypad, monitor and/or network interface, so the transaction may be performed securely, without fear of theprocessor system 116 compromising security as a result of being taken over by malware sent from another machine. - Generate
cryptography key 170 are machine instructions that generate a new cryptography key (e.g., by applying a function). In at least one embodiment, the cryptography key is not updated after the initial step.Perturb cryptography key 174 perturbs the current cryptography key to thereby generate the next cryptography key. -
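As a hedged illustration of what such a perturbing step could look like (the specification leaves the perturbing function abstract), the sketch below advances the current key with a one-way hash so that earlier keys cannot be recovered from later ones; the function and label names are hypothetical.

```python
import hashlib

def perturb_key(current_key: bytes, step_label: bytes = b"") -> bytes:
    """Hypothetical perturbing function: one-way hash of the current key,
    optionally bound to a step label, yielding the next cryptography key."""
    return hashlib.sha256(current_key + step_label).digest()

# Both the user system and the service provider system would apply the same
# steps in the same order to stay synchronized on the current key K.
K = bytes(32)                                    # placeholder initial key
K = perturb_key(K, b"initial request")
K = perturb_key(K, b"completion of transaction")
```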
Image acquisition 173 are machine instructions that acquire images. Image encrypt/decrypt 175 are machine instructions that encrypt or decrypt one or more images. In at least one embodiment, these images are encrypted before being sent to service provider system 126. In at least one embodiment, encrypted images are received from service provider system 126 and decrypted before they are displayed to the user with image display 177. Image display 177 are machine instructions that display one or more images to the user, utilizing user interface 181. In at least one embodiment, images are displayed on a screen of a mobile phone or PC. Image entry 179 are machine instructions that determine which image a user has selected with his or her finger on a touch sensitive screen or has selected with a mouse. - Hash functions 178 may be one or more one-way functions, which may be used by generate
registration key 166 for generating a registration key from a biometric print and/or other user information. Those hash function(s) of hash functions 178 that are used byinitial request 160, authentication ofservice provider routine 162, and completion oftransaction routine 164 may be the same as one another or different from one another. - Perturbing functions 180 may include one or more perturbing functions, which may be used by
perturb cryptography key 174. Different perturbing functions of perturbingfunctions 180 may be used during eachinitial request 160, authentication ofservice provider routine 162, and/or completion oftransaction routine 164. In this specification anytime a hash function is mentioned or a perturbing function is mentioned any other function may be substituted (e.g., any perturbing function may be replaced with a hash function and any hash function may be replaced with a perturbing function) to obtain another embodiment. Optionally, any perturbing function and/or hash function mentioned in this specification may be a one way function. - User interface 181 provides a page, a web browser or another method of displaying and entering information so that the user interface may provide one or more of the following functionalities, labeled with the letters A-F.
- A. The user may view the transaction information being sent. B. The user may enter instructions for sending transaction information. C. The user may receive information about whether or the user authentication was valid. D. The user may enter or generate one or more images known by the user and/or enter another biometric print or another type of user authentication such as a PIN. E. The user may determine the current state in the transaction process. F. The user may read directions or enter information for the next step in the transaction process.
-
Data 182 may include any data that is needed for implementing any of the routines stored inmemory 104.Biometric template T 184 may include templates, such as minutiae and/or other information characterizing biometric prints of users, which may be used to authenticate the user each time the user would like to usesecure area 102 and/orsystem 101. Registrationkey R 186 may be generated by applying a hash function to a collection of images selected or generated by the user, biometric print(s) and/or information derived from an unpredictable physical process. In one embodiment, the unpredictable physical process may use one or more phototransistors, each of which senses photons. - Current cryptography
key K 188 is the current cryptography key, which may be stored long enough for the next cryptography key to be generated from the current cryptography key.Transaction information S 192 may include information about a transaction that the user would like to perform. Service Provider System -
FIG. 2A shows a block diagram of an embodiment of aservice provider system 200 in a system for securing transactions against cyber attacks. In an embodiment,service provider system 200 may includeoutput system 202,input system 204,memory system 206,processor system 208,communication system 212, and input/output system 214. In other embodiments, theservice provider system 200 may not have all the components and/or may have other embodiments in addition to or instead of the components listed above. -
Service provider system 200 may be a financial institution or any other system such as a power plant, a power grid, or a nuclear plant or any other system requiring secure access. In an embodiment,service provider system 200 may be an embodiment ofservice provider system 126. Any place in this specification whereservice provider 126 is mentionedservice provider 200 may be substituted. Any place in this specification whereservice provider 200 is mentionedservice provider 126 may be substituted.Service provider system 200 may include one or more webservers, applications servers, and/or databases, which may be part of a financial institution, for example. -
Output system 202 may include any one of, some of, any combination of, or all of a monitor system, a handheld display system, a printer system, a speaker system, a connection or interface system to a sound system, an interface system to peripheral devices and/or a connection and/or interface system to a computer system, intranet, and/or internet, for example. -
Input system 204 may include any one of, some of, any combination of, or all of a keyboard system, a touch sensitive screen, a tablet pen, a stylus, a mouse system, a track ball system, a track pad system, buttons on a handheld system, a scanner system, a microphone system, a connection to a sound system, and/or a connection and/or interface system to a computer system, intranet, and/or internet (e.g. IrDA, USB). -
Memory system 206 may include, for example, any one of, some of, any combination of, or all of a long term storage system, such as a hard drive; a short term storage system, such as random access memory; a removable storage system, such as a floppy drive or a removable drive; and/or flash memory. Memory system 206 may include one or more machine-readable mediums that may store a variety of different types of information. The term machine-readable medium is used to refer to any medium capable of carrying information that is readable by a machine. One example of a machine-readable medium is a computer-readable medium. Another example of a machine-readable medium is paper having holes that are detected that trigger different mechanical, electrical, and/or logic responses. Memory 206 may include encryption/decryption code, algorithms for authenticating transaction information, for example (memory 206 is discussed further in conjunction with FIG. 2B).
Processor system 208 executes the secure transactions onsystem 200.Processor system 208 may include any one of, some of, any combination of, or all of multiple parallel processors, a single processor, a system of processors having one or more central processors and/or one or more specialized processors dedicated to specific tasks. In an embodiment,processor system 208 may include a network interface to connectsystem 200 touser system 101 vianetwork 124. In an embodiment,processor 208 may execute encryption and decryption algorithms,with which the transaction information was encrypted. In an embodiment,processor 208 may decrypt secure messages fromuser system 101 and/or encrypt messages sent touser system 101. -
Communication system 212 communicativelylinks output system 202,input system 204,memory system 206,processor system 208, and/or input/output system 214 to each other.Communications system 212 may include any one of, some of, any combination of, or all of electrical cables, fiber optic cables, and/or means of sending signals through air or water (e.g. wireless communications), or the like. Some examples of means of sending signals through air and/or water include systems for transmitting electromagnetic waves such as infrared and/or radio waves and/or systems for sending sound waves. In embodiment,memory system 206 may store instructions forsystem 200 to receive authenticated secure transaction information fromuser system 101. - Input/
output system 214 may include devices that have the dual function as input and output devices. For example, input/output system 214 may include one or more touch sensitive screens, which display an image and therefore are an output device and accept input when the screens are pressed by a finger or stylus, for example. The touch sensitive screen may be sensitive to heat and/or pressure. One or more of the input/output devices may be sensitive to a voltage or current produced by a stylus, for example. Input/output system 214 is optional, and may be used in addition to or in place of output system 202 and/or input system 204.
FIG. 2B shows an embodiment of a block diagram of the contents of memory system 206 of FIG. 2A. Memory system 206 may include instructions 220, which in turn may include a setup routine 222, an authentication of user routine 224, a request for authentication routine 226, completion of transaction routine 228, generate registration key 230, generate cryptography key 232, hash functions 242, and perturbing functions 244. Memory system 206 may also store data 245, which may include registration key R 246, current cryptography key K 248, and transaction information S 252. In other embodiments, memory system 206 may not have all of the elements or features listed and/or may have other elements or features instead of, or in addition to, those listed.
Setup routine 222 is a routine that handles the setting up of theservice provider system 200, so thatservice provider system 200 may be used for performing secure transactions.Setup routine 222 may receive a registration key from the user system, which in turn may be used for generating the initial cryptography key. - In an alternative embodiment, the user may send the biometric print or template of the biometric print to
service provider system 200, andservice provider system 200 may generate the registration key from the biometric print in the same manner thatuser system 101 generates the registration key from the template of the biometric print or from the biometric print and/or information obtained from an unpredictable physical process (e.g., bysetup routine 222 applying a hash function to the biometric print and/or information derived from an unpredictable physical process). - In another embodiment, the user may visit the location of service provider, where the service provider may acquire a collection of images known to the user, which is used by
service provider system 200 for at least partially creating the initial cryptography key. - Generate
cryptography key 232 are machine instructions that generate a new cryptography key from (e.g., by applying a function, such as a perturbing function to) a prior cryptography key. Generatecryptography key 232 may be the same routine as generatecryptography key 170 except that generatecryptography key 232 is implemented atservice provider 200 and generatecryptography key 170 is implemented atuser system 101. -
Perturb cryptography key 236 may be the same as perturb cryptography key 174, and perturbcryptography key 236 perturbs the current cryptography key to thereby generate the next cryptography key - Hash functions 242 may be the same as hash functions 178. Hash functions 242 may be one a way functions, which may be used by generate cryptography keys routine 230. Optionally, hash functions 242 may include a different function for generate
cryptography keys 230. Those hash function(s) of hash functions 242 that are used by authentication of user routine 224, request forauthentication routine 226, and completion oftransaction routine 228 may be the same as one another or different from one another. - Different perturbing functions of perturbing
functions 244 may be used during each of authentication of user routine 224, request for authentication routine 226, and completion of transaction routine 228. Although perturbing functions 244 and hash functions 242 are indicated as separate storage areas from perturb cryptography key 236, the perturbing functions may just be stored as part of the code for perturb cryptography key 236.
Data 245 may include any data that is needed for implementing any of the routines stored in memory 206. Registration key R 246 may be the same as registration key R 186 and may be generated by applying a hash function to a collection of images selected or generated by the user and/or biometric print(s) and/or information from an unpredictable physical process. - Current cryptography
key K 248 may be the same ascurrent cryptography key 188, and may be the current cryptography key, which may be stored long enough for the next cryptography key to be generated from the current cryptography key. -
Transaction information S 252 may be the same astransaction 192, and may include information about a transaction that the user would like to perform.Transaction information S 252 may be received fromuser system 101 and may be used to perform a transaction atservice provider system 200 on behalf ofuser system 101. -
FIG. 3 shows a flowchart of an embodiment of setting up user system 101 for securing transactions. This user system method may be the setup performed by user system 101 before enabling a user to execute secure transactions with a bank, financial institution, or financial exchange. - In step 302, a sequence or collection of visual images that are easy to remember is obtained from the user. In an embodiment, some visual images may be an image of an animal, an image of a car, an image of a house, an image of a place, an image of a person's name, or an image of all or part of a bank logo. In at least one embodiment, this collection of universal images may act as a universal identifier for the user. As an example, the universal identifier for that particular user may be composed of the following 7 images, where order is not important: a train, the Golden Gate bridge, pink sparkle shoes, chocolate ice cream in a waffle cone, one of the Wells Fargo stagecoach horses, an orange, and a visual image of the name Haley. An example of such a visual image of a name is shown in
FIG. 11. The universal identifier may use a particular background texture or pattern that is determined by the user or service provider system during setup. FIG. 9 shows examples of different textures. The visual image of Haley in FIG. 11 is represented with a bubble texture against a foliation background texture.
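For illustration only, the sketch below shows one way a collection of selected images could be reduced to a single order-independent value (matching the statement above that the order of the seven images is not important): hypothetical image identifiers are sorted before hashing. The identifiers and the function name are assumptions, not part of this specification.

```python
import hashlib

def universal_identifier_digest(image_ids):
    """Hypothetical sketch: hash a user's image collection so the result
    does not depend on the order in which the images were selected."""
    canonical = "|".join(sorted(image_ids)).encode("utf-8")
    return hashlib.sha256(canonical).digest()

# Hypothetical identifiers for the seven-image example above.
images = ["train", "golden_gate_bridge", "pink_sparkle_shoes",
          "chocolate_ice_cream_waffle_cone", "stagecoach_horse",
          "orange", "name_haley"]
assert universal_identifier_digest(images) == universal_identifier_digest(list(reversed(images)))
```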
- In an alternative embodiment, biometric print information may be obtained from the user from a
biometric sensor 111 ininput system 110 in order to establish a method of user authentication. The user setup method may also collect other setup information, such as a Personal Identification Number (PIN), or a password. The setup data that was collected may be denoted as a T. - In
step 304, the universal identifier and user authentication information are encrypted and transmitted to the service provider system. In at least one embodiment, this information is encrypted as visual images and then sent back to the service provider system. In at least one embodiment, a Diffie-Hellman key exchange is used to establish keys to encrypt the universal identifier and user authentication information. - In step 306, the user service provider receives the encrypted universal identifier and user authentication information and decrypts them and stores them.
- In step 308, user's account is initialized with user service provider and enabled for executing transactions.
- A Diffie-Hellman key exchange is a key exchange method where two parties (Alice and Bob) that have no prior knowledge of each other jointly establish a shared secret key over an unsecure communications channel. Before the Diffie-Hellman key exchange is described it is helpful to review the mathematical notion of a group. A group G is a set with a binary operation *, (g*g is denoted as g2; g*g*g*g*g is denoted as g5), such that the following four properties hold:
- (i.) The binary operation * is closed on G. In other words, a*b lies in G for all elements a and b in G.
- (ii.) The binary operation * is associative on G. a*(b*c)=(a*b)*c for all elements a, b, and c in G
- (iii.) There is a unique identity element e in G. a*e=e*a=a.
- (iv). Each element a in G has a unique inverse denoted as a−1. a*a−1=a−*a=e.
- The integers { . . . , −2, −1, 0, 1, 2, . . . } with respect to the binary operation + are an example of an infinite group. 0 is the identity element. For example, the inverse of 5 is −5 and the inverse of −107 is 107.
- The set of permutations on n elements {1, 2, . . . , n}, denoted as Sn, is an example of a finite group with n! elements where the binary operation is function composition. Each element of Sn is a function p:{1, 2, . . . , n}→{1, 2, . . . , n} that is 1 to 1 and onto. In this context, p is called a permutation The identity permutation e is the identity element in Sn, where e(k)=k for each k in {1, 2, . . . , n}.
- If H is a non-empty subset of a group G and H is a group with respect to the binary group operation * of G, then H is called a subgroup of G. H is a proper subgroup of G if H is not equal to G (i.e., H is a proper subset of G). G is a cyclic group if G has no proper subgroups.
- The integers modulo n (i.e., Zn={[0], [1], . . . [n−1]} are an example of a finite group with respect to addition modulo n. If n=5, [4]+[4]=[3] in Z5 because 5 divides (4+4)−3. Similarly, [3]+[4]=[3] in Z5. Observe that Z5 is a cyclic group because 5 is a prime number. When p is a prime number, 4 is a cyclic group containing p elements {[0], [1], . . . [p−1]}. [1] is called a generating element for cyclic group Zp since [1]m=[m] where m is a natural number such that 0<m≦s p−1 and [1]p=[0]. This multiplicative notation works as follows: [1]2=[1]+[1]; [1]3=[1]+[1]+[1]; and so on. This multiplicative notation (i.e. using superscripts) is used in the description of the Diffie-Hillman key exchange protocol described below.
- There are an infinite number of cyclic groups and an infinite number of these cyclic groups are extremely large. The notion of extremely large means the following: if 21024 is considered to be an extremely large number based on the computing power of current computers, then there are still an infinite number of finite cyclic groups with each cyclic group containing more than 21024 elements.
-
Steps - 1. Alice and Bob agree on an extremely large, finite, cyclic group G and a generating element g in G. (Alice and Bob sometimes agree on finite, cyclic group G and element g long before the rest of the key exchange protocol; g is assumed to be known by all attackers.) The group G is written multiplicatively as explained previously.
- 2. Alice picks a random natural number a and sends ga to Bob.
- 3. Bob picks a random natural number b and sends gb to Alice.
- 4. Alice computes (gb)a.
- 5. Bob computes (ga)b.
- Both Alice and Bob are now in possession of the group element gab, which can serve as the shared secret key. The values of (gb)a and (ga)b are the same because g is an element of group G.
- Alice can encrypt a message m, as mgab, and sends mgab to Bob. Bob knows |G|, b, and ga. A result from group theory implies that the order of every element of a group divides the number of elements in the group, denoted as |G|. This means x|G|=1 for all x in G where 1 is the identity element in G. Bob calculates (ga)|G|−b=(g|G|)a g−ab=(gab)−1. After Bob receives the encrypted message mgab from Alice, then Bob applies (gab)−1 and decrypts the encrypted message by computing mgab(gab)−1=m.
- The user and the
service provider 126 agree upon a common key for the registration key. The user then encrypts one of the common keys with the registration key. Theservice provider 126 encrypts the common key with other information, which may be information specific to the user or a random number, for example. Then the user sends the encrypted common key (that was encrypted by the user with the registration) to theservice provider 126, and theservice provider 126 sends the encrypted common key that theservice provider 126 encrypted to the user. Next, the user encrypts the encrypted common keys that was received from theservice provider 126 with the registration key, and theservice provider 126 encrypts the encrypted common key received from the user (which was encrypted with the registration key) with the same information that was used to encrypt the original copy of the common key of theservice provider 126. Thus, both the user and theservice provider 126 will now have the common encrypted key derived from the registration key supplied by the user and the information supplied by theservice provider 126. The resulting encrypted common key may be used as the registration key (instead of the original registration key). - Optionally, the
user system 101 and theservice provider 126 may also agree upon a common key for the cryptography key. The common key of the cryptography key and registration key may be the same as one another or different. Theuser system 101 then encrypts one of the common keys and the cryptography key. The server encrypts the common key with other information, which may be information specific to the user or a random number for example (as was done for the registration key). Then theuser system 101 sends the encrypted common key (that was encrypted by the user with the cryptography key) to theservice provider 126, and theservice provider 126 sends the encrypted common keys (which was encrypted service provider 126) to the user. Next, the user encrypts the encrypted common key that were received from theservice provider 126 with the cryptography key, and theservice provider 126 encrypts the encrypted common keys received from the user (which was already encrypted with the cryptography key by the user) with the same information that was used to encrypt the original copy of the common keys of theservice provider 126. Thus, both the user and theservice provider 126 will now have the common key encrypted by the cryptography key supplied by the user and the information supplied by theservice provider 126. The resulting encrypted common key may be used as the cryptography key (instead of the original cryptography key). - In other embodiments, the secure transmission may use elliptic curve cryptography which is similar to the Diffie-Hellman exchange described previously. In other embodiments, the secure transmission of cryptography key(s) K may use a camera that reads a proprietary pattern from the user's display of the device after setup is complete. In an embodiment, the user's display is the screen of a mobile phone.
- In at least one embodiment, the registration key R may be given to the administrator in the same physical place, such as at a bank, or the registration key may be mailed or electronically transmitted to the administrator if setup is accomplished remotely. In some applications, the registration key may be encrypted first and then electronically transmitted or sent by mail. The
service provider system 126 uses the registration key R (that service provider system 126 received) to compute the cryptography key K as K = Φ^j(R), where j ≥ 0, and stores cryptography key K for a particular user in a secure area 102. The number j in the operator Φ^j( ) is the number of times that the operator Φ( ) is applied to R.
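For illustration only, the iteration K = Φ^j(R) can be written as below, with a one-way hash standing in for the otherwise unspecified operator Φ; this is an assumption, not the operator the specification prescribes.

```python
import hashlib

def phi(x: bytes) -> bytes:
    # Stand-in for the operator Phi; the specification leaves Phi abstract.
    return hashlib.sha256(x).digest()

def phi_j(R: bytes, j: int) -> bytes:
    """Compute K = Phi^j(R) by applying Phi to R exactly j times (j >= 0)."""
    K = R
    for _ in range(j):
        K = phi(K)
    return K
```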
- In at least one embodiment, there are transaction steps A and B, which are executed to successfully complete a transaction. In at least one embodiment, there are transaction steps A, B, and C, which are executed to successfully complete a transaction.
FIG. 3A shows a flow chart of transaction step A. - TRANSACTION STEP A. In at least one embodiment, the person looks for one or more logos or visual images that helps person make sure that he or she is communicating to the appropriate user's bank, financial institution or other service provider system. In an embodiment, the person learns or creates this image that verifies the service provider system during setup. When a transaction is requested by the person, user selects a collection or sequence of visual images that are easy to remember, and/or presents a biometric print match and/or a password or PIN, that are acquired by
user system 101. This is referred to as user authentication. The person (user) securely enters transaction information by selecting or choosing visual images that are difficult for malware to read or recognize. - Step A.1 The person verifies in web browser or visual display that her or she is communicating to the appropriate bank, financial institution or other service provider system.
- Step A.2 The person enters their user authentication information as a collection of visual images, a PIN or password or a biometric print.
- Step A.3 The person enters a one-time sequence of letters, and/or a one-time sequence of numbers or a one-time sequence of images or a combination that is unique for this transaction and difficult for malware to guess.
- Step A.4 The person selects and enters transaction information into
user system 101. - Step A.5 Transaction information is encrypted with key K denoted as E(, K). User authentication information is encrypted as E(, K). One-time information information is encrypted as E(, K) and are then sent to service provider system.
- There are many different methods for transmitting encrypted user authentication E(, K), encrypted unique information E(, K) and encrypted transaction information E(, K) to the administrator (bank) at
service provider system 126. In one method, the user may wirelessly transmit the encrypted transaction information via a mobile phone to service provider system 126. In another method, the user may submit or enter a collection of images and encrypted transaction information to the web browser of user system 101 and use the Internet for transmission to the administrator (bank) at service provider system 126. In other methods, the user may submit the user authentication and encrypted transaction information by some other electronic means, such as a fax machine or an ATM machine.
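The cipher E( ) is not pinned down in this specification; purely as an illustration of encrypting transaction information with key K before transmission, the sketch below uses a simple hash-based stream construction. It is a teaching sketch under stated assumptions, not production cryptography, and the names are hypothetical.

```python
import hashlib
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Expand key and nonce into a keystream by repeated hashing (illustration only).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """E(message, K): prepend a fresh nonce and XOR the message with the keystream."""
    nonce = os.urandom(16)
    ks = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(x ^ k for x, k in zip(plaintext, ks))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    nonce, body = ciphertext[:16], ciphertext[16:]
    ks = _keystream(key, nonce, len(body))
    return bytes(c ^ k for c, k in zip(body, ks))

# Hypothetical transaction information encrypted before it is sent.
K = os.urandom(32)
wire = encrypt(b"pay 100 USD to account 9568342710", K)
assert decrypt(wire, K) == b"pay 100 USD to account 9568342710"
```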
- TRANSACTION STEP B. The administrator (bank or financial institution) receives at
service provider system 126 the encrypted transaction information, encrypted one-time information and encrypted user authentication information.FIG. 3B shows a flow chart of transaction step B. -
- Step B.2 The service provider decrypts E(, K) and checks that the user was able to correctly recognize the one-time information from the user's screen or web browser. If the one-time information ′ decrypted by the service provider system is not valid (i.e., ′ doesn't match ), then the transaction is aborted. If the one-time information ′ decrypted by the service provider system is valid (i.e., ′ matches ), then
service provider system 126 goes to step B.3. -
-
- TRANSACTION STEP C. The service provider system translates the transaction information to a new collection of visual images but that represent the same transaction information as . The service provider system encrypts this new visual representation of the transaction information as E(, K) and sends E(, K) back to the user system. The user system receives E(, K), decrypts it and the user checks that matches transaction information . If doesn't match transaction information , then the user may abort the transaction.
- Transaction Step D.
-
-
- In at least one embodiment, the user interface may implemented with a web browser in a personal computer or in a mobile phone. User input such as selecting letters, numbers or other input items may be accomplished with fingers on the glass screen of IPhone or Android phone. For a PC, the letters, number or other input items, may be entered with a mouse selecting appropriate letters as shown in
FIG. 5 or 6. In at least one embodiment, the display screen may be rendered with a glass screen in a mobile phone such as an Android or IPhone. In other embodiments, the display screen may use an LCD. In at least one embodiment, some or all of the financial institution members of SWIFT may be stored in terms of patterns or images in the memory of the service provider system. In at least one embodiment, the user may use her or her fingers to scroll on the screen and select one of the banks to make a transaction with. In at least one embodiment, the user may use a mouse to scroll on the display of the personal computer. - In at least one embodiment, the user may be an employee of the bank. In at least one embodiment, the device may be used to securely execute wire transfers between two banks In at least one embodiment, a visual images of letters that are difficult for malware to read may be displayed as a keyboard to be used by a person to enter a password or transaction information as shown in
FIGS. 5 and 6 . In at least one embodiment, the display may enable the user to verify that the transaction information is correct or has not been tampered with by malware before executing the transaction. - Each embodiment disclosed herein may be used or otherwise combined with any of the other embodiments disclosed. Any element of any embodiment may be used in any embodiment. At least one embodiment of this specification includes all of the embodiments being used together except for those that are mutually exclusive.
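For illustration only, a permuted image keyboard of this kind could be laid out as sketched below; `secrets.SystemRandom` stands in for the hardware non-determinism contemplated elsewhere in this specification, and the image file names are hypothetical.

```python
import secrets

LETTERS = list("ABCDEFGHIJKLMNOPQRSTUVWXYZ")

def permuted_image_keyboard():
    """Return (letter, image file) tiles in a freshly permuted order, so a fixed
    screen position does not map to a fixed letter across transactions."""
    tiles = [(ch, "letter_%s.png" % ch) for ch in LETTERS]   # hypothetical image names
    secrets.SystemRandom().shuffle(tiles)
    return tiles

layout = permuted_image_keyboard()   # a different layout on each call
```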
- Although the invention has been described with reference to specific embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the true spirit and scope of the invention. In addition, modifications may be made without departing from the essential teachings of the invention.
Claims (21)
1. A method of securing a transaction comprising:
transaction information is entered into a user system, the user system having a processor system having at least one processor, a communications interface, and a memory system;
the user selects or enters transaction information using images received from the service provider system.
2. The method of claim 1 wherein some of said images represent letters or numbers.
3. The method of claim 1 wherein at least one of said images is an image of an animal.
4. The method of claim 1 wherein at least one of said images has color or texture.
5. The method of claim 1 wherein one-time information is communicated with said images.
6. The method of claim 1 wherein at least one of said images is at least part of a logo.
7. The method of claim 1 wherein said service provider encrypts one or more said images before transmitting them to said user system.
8. The method of claim 1 wherein a user looks at one or more images to check that service provider is valid.
9. The method of claim 1 wherein at least some of said images are used as a universal identifier for said user.
10. The method of claim 1 wherein said service provider is a bank or financial exchange.
11. The method of claim 1 , wherein at least one said visual image is of at least part of a human face that expresses a smile.
12. The method of claim 1 wherein noise is combined with said images and the noise is generated using quantum randomness.
13. A method for determining whether to grant access to a secure entity comprising:
generating visual images and displaying said images on a screen;
and a user selecting said visual images from said display screen;
wherein said determining uses a processor system having a least one processor, a communications interface, and a memory system.
14. The method of claim 13 wherein said screen is a touch sensitive screen and the user selects said images with his or her fingers.
15. The method of claim 13 where the order of said visual images is randomly permuted based on a non-deterministic process generated by hardware.
16. The method of claim 15 wherein said hardware is part of the web server.
17. The method of claim 13 wherein said visual images are randomly generated by a web server and transmitted to a mobile phone or PC.
18. The method of claim 13 wherein noise is combined with the visual images and said noise is generated using quantum randomness.
19. The method of claim 13 wherein at least one of said images is an image of an animal.
20. The method of claim 13 wherein at least one of said images has texture.
21. The method of claim 13 wherein at least one of said images has color.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/017,735 US20150067786A1 (en) | 2013-09-04 | 2013-09-04 | Visual image authentication and transaction authorization using non-determinism |
PCT/US2013/058404 WO2014039763A1 (en) | 2012-09-09 | 2013-09-06 | Visual image authentication and transaction authorization using non-determinism |
US14/857,796 US10592651B2 (en) | 2012-09-09 | 2015-09-17 | Visual image authentication |
US16/153,639 US20190050554A1 (en) | 2013-09-04 | 2018-10-05 | Logo image and advertising authentication |
US16/819,094 US11128453B2 (en) | 2013-09-04 | 2020-03-15 | Visual image authentication |
US17/396,702 US11693944B2 (en) | 2013-09-04 | 2021-08-08 | Visual image authentication |
US18/216,618 US20230359764A1 (en) | 2013-09-04 | 2023-06-30 | Visual Image Authentication |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/017,735 US20150067786A1 (en) | 2013-09-04 | 2013-09-04 | Visual image authentication and transaction authorization using non-determinism |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/857,796 Continuation-In-Part US10592651B2 (en) | 2012-09-09 | 2015-09-17 | Visual image authentication |
Related Child Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/857,796 Continuation-In-Part US10592651B2 (en) | 2012-09-09 | 2015-09-17 | Visual image authentication |
US14/857,796 Continuation US10592651B2 (en) | 2012-09-09 | 2015-09-17 | Visual image authentication |
US16/153,639 Continuation-In-Part US20190050554A1 (en) | 2013-09-04 | 2018-10-05 | Logo image and advertising authentication |
US16/819,094 Continuation-In-Part US11128453B2 (en) | 2013-09-04 | 2020-03-15 | Visual image authentication |
US17/396,702 Continuation-In-Part US11693944B2 (en) | 2013-09-04 | 2021-08-08 | Visual image authentication |
US18/216,618 Continuation-In-Part US20230359764A1 (en) | 2013-09-04 | 2023-06-30 | Visual Image Authentication |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150067786A1 true US20150067786A1 (en) | 2015-03-05 |
Family
ID=52585211
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/017,735 Abandoned US20150067786A1 (en) | 2012-09-09 | 2013-09-04 | Visual image authentication and transaction authorization using non-determinism |
US14/857,796 Active 2035-07-11 US10592651B2 (en) | 2012-09-09 | 2015-09-17 | Visual image authentication |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/857,796 Active 2035-07-11 US10592651B2 (en) | 2012-09-09 | 2015-09-17 | Visual image authentication |
Country Status (1)
Country | Link |
---|---|
US (2) | US20150067786A1 (en) |
Cited By (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150234757A1 (en) * | 2014-02-19 | 2015-08-20 | Samsung Electronics Co., Ltd. | Security information inputting/outputting method and electronic device adapted to the method |
US20150365434A1 (en) * | 2011-05-26 | 2015-12-17 | International Business Machines Corporation | Rotation of web site content to prevent e-mail spam/phishing attacks |
CN105373924A (en) * | 2015-10-10 | 2016-03-02 | 北京思比科微电子技术股份有限公司 | System facing terminal equipment and providing safety payment function |
US20160359877A1 (en) * | 2015-06-05 | 2016-12-08 | Cisco Technology, Inc. | Intra-datacenter attack detection |
US20180144112A1 (en) * | 2016-11-02 | 2018-05-24 | Skeyecode | Method for authenticating a user by means of a non-secure terminal |
CN108092771A (en) * | 2018-02-11 | 2018-05-29 | 成都信息工程大学 | A kind of anti-tamper controlled quantum safety direct communication method and system |
US10027657B1 (en) | 2016-07-06 | 2018-07-17 | Wells Fargo Bank, N.A. | Authentication/authorization without a password |
US10134035B1 (en) * | 2015-01-23 | 2018-11-20 | Island Intellectual Property, Llc | Invariant biohash security system and method |
US20180336333A1 (en) * | 2017-05-17 | 2018-11-22 | American Express Travel Related Services Company, Inc. | Approving transactions using a captured biometric template |
US10289438B2 (en) | 2016-06-16 | 2019-05-14 | Cisco Technology, Inc. | Techniques for coordination of application components deployed on distributed virtual machines |
US10339366B2 (en) * | 2013-10-23 | 2019-07-02 | Mobilesphere Holdings II LLC | System and method for facial recognition |
US10374904B2 (en) | 2015-05-15 | 2019-08-06 | Cisco Technology, Inc. | Diagnostic network visualization |
US10523512B2 (en) | 2017-03-24 | 2019-12-31 | Cisco Technology, Inc. | Network agent for generating platform specific network policies |
US10523541B2 (en) | 2017-10-25 | 2019-12-31 | Cisco Technology, Inc. | Federated network and application data analytics platform |
US10554501B2 (en) | 2017-10-23 | 2020-02-04 | Cisco Technology, Inc. | Network migration assistant |
US10574575B2 (en) | 2018-01-25 | 2020-02-25 | Cisco Technology, Inc. | Network flow stitching using middle box flow stitching |
US10594542B2 (en) | 2017-10-27 | 2020-03-17 | Cisco Technology, Inc. | System and method for network root cause analysis |
US10594560B2 (en) | 2017-03-27 | 2020-03-17 | Cisco Technology, Inc. | Intent driven network policy platform |
US10614206B2 (en) * | 2016-12-01 | 2020-04-07 | International Business Machines Corporation | Sequential object set passwords |
US10680887B2 (en) | 2017-07-21 | 2020-06-09 | Cisco Technology, Inc. | Remote device status audit and recovery |
US10708183B2 (en) | 2016-07-21 | 2020-07-07 | Cisco Technology, Inc. | System and method of providing segment routing as a service |
US10708152B2 (en) | 2017-03-23 | 2020-07-07 | Cisco Technology, Inc. | Predicting application and network performance |
US10764141B2 (en) | 2017-03-27 | 2020-09-01 | Cisco Technology, Inc. | Network agent for reporting to a network policy system |
US20200313880A1 (en) * | 2019-03-25 | 2020-10-01 | Stmicroelectronics (Rousset) Sas | Encryption and/or decryption key device, system and method |
US10798015B2 (en) | 2018-01-25 | 2020-10-06 | Cisco Technology, Inc. | Discovery of middleboxes using traffic flow stitching |
US10797970B2 (en) | 2015-06-05 | 2020-10-06 | Cisco Technology, Inc. | Interactive hierarchical network chord diagram for application dependency mapping |
US20200344231A1 (en) * | 2019-04-23 | 2020-10-29 | Microsoft Technology Licensing, Llc | Resource access based on audio signal |
US10826803B2 (en) | 2018-01-25 | 2020-11-03 | Cisco Technology, Inc. | Mechanism for facilitating efficient policy updates |
US10873794B2 (en) | 2017-03-28 | 2020-12-22 | Cisco Technology, Inc. | Flowlet resolution for application performance monitoring and management |
US10972388B2 (en) | 2016-11-22 | 2021-04-06 | Cisco Technology, Inc. | Federated microburst detection |
US10999149B2 (en) | 2018-01-25 | 2021-05-04 | Cisco Technology, Inc. | Automatic configuration discovery based on traffic flow data |
US11128700B2 (en) | 2018-01-26 | 2021-09-21 | Cisco Technology, Inc. | Load balancing configuration based on traffic flow telemetry |
US11233821B2 (en) | 2018-01-04 | 2022-01-25 | Cisco Technology, Inc. | Network intrusion counter-intelligence |
US11275820B2 (en) * | 2019-03-08 | 2022-03-15 | Master Lock Company Llc | Locking device biometric access |
US11528283B2 (en) | 2015-06-05 | 2022-12-13 | Cisco Technology, Inc. | System for monitoring and managing datacenters |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9235697B2 (en) | 2012-03-05 | 2016-01-12 | Biogy, Inc. | One-time passcodes with asymmetric keys |
US10360395B2 (en) * | 2014-11-28 | 2019-07-23 | Fiske Software, Llc | Hiding information in noise |
WO2016187432A1 (en) * | 2015-05-19 | 2016-11-24 | Michael Fiske | Hiding a public key exchange in noise |
US10285049B2 (en) * | 2015-11-24 | 2019-05-07 | Raytheon Company | Device and method for baseband signal encryption |
US9392460B1 (en) | 2016-01-02 | 2016-07-12 | International Business Machines Corporation | Continuous user authentication tool for mobile device communications |
EP3452943B1 (en) * | 2016-05-02 | 2021-06-30 | Hewlett-Packard Development Company, L.P. | Authentication using sequence of images |
AU2017261844A1 (en) | 2016-05-10 | 2018-11-22 | Commonwealth Scientific And Industrial Research Organisation | Authenticating a user |
US10581985B2 (en) | 2016-10-03 | 2020-03-03 | J2B2, Llc | Systems and methods for providing coordinating identifiers over a network |
US10477345B2 (en) | 2016-10-03 | 2019-11-12 | J2B2, Llc | Systems and methods for identifying parties based on coordinating identifiers |
US10601931B2 (en) | 2016-10-03 | 2020-03-24 | J2B2, Llc | Systems and methods for delivering information and using coordinating identifiers |
US11741196B2 (en) | 2018-11-15 | 2023-08-29 | The Research Foundation For The State University Of New York | Detecting and preventing exploits of software vulnerability using instruction tags |
RU2754240C1 (en) * | 2020-12-16 | 2021-08-30 | ОБЩЕСТВО С ОГРАНИЧЕННОЙ ОТВЕТСТВЕННОСТЬЮ "КуРэйт" (ООО "КуРэйт") | Method and system for confirming transactions using a randomly generated graphical key |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040230843A1 (en) * | 2003-08-20 | 2004-11-18 | Wayne Jansen | System and method for authenticating users using image selection |
US20060045361A1 (en) * | 2004-09-01 | 2006-03-02 | Fuji Xerox Co., Ltd. | Encoding device, decoding device, encoding method, decoding method, and program therefor |
US20090320124A1 (en) * | 2008-06-23 | 2009-12-24 | Echostar Technologies Llc | Apparatus and methods for dynamic pictorial image authentication |
US20100046553A1 (en) * | 2008-08-20 | 2010-02-25 | Esther Finale LLC | Data packet generator for generating passcodes |
US20100186074A1 (en) * | 2009-01-16 | 2010-07-22 | Angelos Stavrou | Authentication Using Graphical Passwords |
US20110202982A1 (en) * | 2007-09-17 | 2011-08-18 | Vidoop, Llc | Methods And Systems For Management Of Image-Based Password Accounts |
US20130097697A1 (en) * | 2011-10-14 | 2013-04-18 | Microsoft Corporation | Security Primitives Employing Hard Artificial Intelligence Problems |
US20140250516A1 (en) * | 2011-06-30 | 2014-09-04 | Dongxuan Gao | Method for authenticating identity of handset user |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5140538A (en) * | 1988-12-27 | 1992-08-18 | University Of Arkansas | Hybrid digital-analog computer parallel processor |
US8473336B1 (en) * | 2004-09-20 | 2013-06-25 | Jared Kopf | Resource access control method and system for imprinting a product or service on the mind of a user of an online resource |
US8117458B2 (en) * | 2006-05-24 | 2012-02-14 | Vidoop Llc | Methods and systems for graphical image authentication |
US20090150983A1 (en) * | 2007-08-27 | 2009-06-11 | Infosys Technologies Limited | System and method for monitoring human interaction |
US8483518B2 (en) * | 2010-02-19 | 2013-07-09 | Microsoft Corporation | Image-based CAPTCHA exploiting context in object recognition |
- 2013-09-04: US application US14/017,735 filed; published as US20150067786A1 (en); status: Abandoned
- 2015-09-17: US application US14/857,796 filed; published as US10592651B2 (en); status: Active
Cited By (87)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10079856B2 (en) * | 2011-05-26 | 2018-09-18 | International Business Machines Corporation | Rotation of web site content to prevent e-mail spam/phishing attacks |
US20150365434A1 (en) * | 2011-05-26 | 2015-12-17 | International Business Machines Corporation | Rotation of web site content to prevent e-mail spam/phishing attacks |
US10339366B2 (en) * | 2013-10-23 | 2019-07-02 | Mobilesphere Holdings II LLC | System and method for facial recognition |
US10664578B2 (en) * | 2014-02-19 | 2020-05-26 | Samsung Electronics Co., Ltd | Security information inputting/outputting method and electronic device adapted to the method |
US20150234757A1 (en) * | 2014-02-19 | 2015-08-20 | Samsung Electronics Co., Ltd. | Security information inputting/outputting method and electronic device adapted to the method |
US10134035B1 (en) * | 2015-01-23 | 2018-11-20 | Island Intellectual Property, Llc | Invariant biohash security system and method |
US10374904B2 (en) | 2015-05-15 | 2019-08-06 | Cisco Technology, Inc. | Diagnostic network visualization |
US11902120B2 (en) | 2015-06-05 | 2024-02-13 | Cisco Technology, Inc. | Synthetic data for determining health of a network security system |
US10623283B2 (en) | 2015-06-05 | 2020-04-14 | Cisco Technology, Inc. | Anomaly detection through header field entropy |
US11968103B2 (en) | 2015-06-05 | 2024-04-23 | Cisco Technology, Inc. | Policy utilization analysis |
US11252058B2 (en) | 2015-06-05 | 2022-02-15 | Cisco Technology, Inc. | System and method for user optimized application dependency mapping |
US10320630B2 (en) | 2015-06-05 | 2019-06-11 | Cisco Technology, Inc. | Hierarchichal sharding of flows from sensors to collectors |
US10326673B2 (en) | 2015-06-05 | 2019-06-18 | Cisco Technology, Inc. | Techniques for determining network topologies |
US12113684B2 (en) | 2015-06-05 | 2024-10-08 | Cisco Technology, Inc. | Identifying bogon address spaces |
US11968102B2 (en) | 2015-06-05 | 2024-04-23 | Cisco Technology, Inc. | System and method of detecting packet loss in a distributed sensor-collector architecture |
US10904116B2 (en) | 2015-06-05 | 2021-01-26 | Cisco Technology, Inc. | Policy utilization analysis |
US10439904B2 (en) | 2015-06-05 | 2019-10-08 | Cisco Technology, Inc. | System and method of determining malicious processes |
US10505828B2 (en) | 2015-06-05 | 2019-12-10 | Cisco Technology, Inc. | Technologies for managing compromised sensors in virtualized environments |
US10516585B2 (en) | 2015-06-05 | 2019-12-24 | Cisco Technology, Inc. | System and method for network information mapping and displaying |
US10516586B2 (en) | 2015-06-05 | 2019-12-24 | Cisco Technology, Inc. | Identifying bogon address spaces |
US20160359877A1 (en) * | 2015-06-05 | 2016-12-08 | Cisco Technology, Inc. | Intra-datacenter attack detection |
US11936663B2 (en) | 2015-06-05 | 2024-03-19 | Cisco Technology, Inc. | System for monitoring and managing datacenters |
US10536357B2 (en) | 2015-06-05 | 2020-01-14 | Cisco Technology, Inc. | Late data detection in data center |
US11924073B2 (en) | 2015-06-05 | 2024-03-05 | Cisco Technology, Inc. | System and method of assigning reputation scores to hosts |
US10567247B2 (en) * | 2015-06-05 | 2020-02-18 | Cisco Technology, Inc. | Intra-datacenter attack detection |
US11902122B2 (en) | 2015-06-05 | 2024-02-13 | Cisco Technology, Inc. | Application monitoring prioritization |
US11368378B2 (en) | 2015-06-05 | 2022-06-21 | Cisco Technology, Inc. | Identifying bogon address spaces |
US11252060B2 (en) | 2015-06-05 | 2022-02-15 | Cisco Technology, Inc. | Data center traffic analytics synchronization |
US11695659B2 (en) | 2015-06-05 | 2023-07-04 | Cisco Technology, Inc. | Unique ID generation for sensors |
US11405291B2 (en) | 2015-06-05 | 2022-08-02 | Cisco Technology, Inc. | Generate a communication graph using an application dependency mapping (ADM) pipeline |
US10862776B2 (en) | 2015-06-05 | 2020-12-08 | Cisco Technology, Inc. | System and method of spoof detection |
US10659324B2 (en) | 2015-06-05 | 2020-05-19 | Cisco Technology, Inc. | Application monitoring prioritization |
US11477097B2 (en) | 2015-06-05 | 2022-10-18 | Cisco Technology, Inc. | Hierarchichal sharding of flows from sensors to collectors |
US11637762B2 (en) | 2015-06-05 | 2023-04-25 | Cisco Technology, Inc. | MDL-based clustering for dependency mapping |
US10693749B2 (en) | 2015-06-05 | 2020-06-23 | Cisco Technology, Inc. | Synthetic data for determining health of a network security system |
US11496377B2 (en) | 2015-06-05 | 2022-11-08 | Cisco Technology, Inc. | Anomaly detection through header field entropy |
US10797970B2 (en) | 2015-06-05 | 2020-10-06 | Cisco Technology, Inc. | Interactive hierarchical network chord diagram for application dependency mapping |
US10728119B2 (en) | 2015-06-05 | 2020-07-28 | Cisco Technology, Inc. | Cluster discovery via multi-domain fusion for application dependency mapping |
US10735283B2 (en) * | 2015-06-05 | 2020-08-04 | Cisco Technology, Inc. | Unique ID generation for sensors |
US10742529B2 (en) | 2015-06-05 | 2020-08-11 | Cisco Technology, Inc. | Hierarchichal sharding of flows from sensors to collectors |
US11601349B2 (en) | 2015-06-05 | 2023-03-07 | Cisco Technology, Inc. | System and method of detecting hidden processes by analyzing packet flows |
US11528283B2 (en) | 2015-06-05 | 2022-12-13 | Cisco Technology, Inc. | System for monitoring and managing datacenters |
US11522775B2 (en) | 2015-06-05 | 2022-12-06 | Cisco Technology, Inc. | Application monitoring prioritization |
US11502922B2 (en) | 2015-06-05 | 2022-11-15 | Cisco Technology, Inc. | Technologies for managing compromised sensors in virtualized environments |
CN105373924A (en) * | 2015-10-10 | 2016-03-02 | 北京思比科微电子技术股份有限公司 | System facing terminal equipment and providing safety payment function |
US10289438B2 (en) | 2016-06-16 | 2019-05-14 | Cisco Technology, Inc. | Techniques for coordination of application components deployed on distributed virtual machines |
US10027657B1 (en) | 2016-07-06 | 2018-07-17 | Wells Fargo Bank, N.A. | Authentication/authorization without a password |
US10581832B1 (en) | 2016-07-06 | 2020-03-03 | Wells Fargo Bank, N.A. | Authentication / authorization without a password |
US10708183B2 (en) | 2016-07-21 | 2020-07-07 | Cisco Technology, Inc. | System and method of providing segment routing as a service |
US11283712B2 (en) | 2016-07-21 | 2022-03-22 | Cisco Technology, Inc. | System and method of providing segment routing as a service |
US20180144112A1 (en) * | 2016-11-02 | 2018-05-24 | Skeyecode | Method for authenticating a user by means of a non-secure terminal |
US10972388B2 (en) | 2016-11-22 | 2021-04-06 | Cisco Technology, Inc. | Federated microburst detection |
US10614206B2 (en) * | 2016-12-01 | 2020-04-07 | International Business Machines Corporation | Sequential object set passwords |
US10708152B2 (en) | 2017-03-23 | 2020-07-07 | Cisco Technology, Inc. | Predicting application and network performance |
US11088929B2 (en) | 2017-03-23 | 2021-08-10 | Cisco Technology, Inc. | Predicting application and network performance |
US10523512B2 (en) | 2017-03-24 | 2019-12-31 | Cisco Technology, Inc. | Network agent for generating platform specific network policies |
US11252038B2 (en) | 2017-03-24 | 2022-02-15 | Cisco Technology, Inc. | Network agent for generating platform specific network policies |
US10594560B2 (en) | 2017-03-27 | 2020-03-17 | Cisco Technology, Inc. | Intent driven network policy platform |
US11509535B2 (en) | 2017-03-27 | 2022-11-22 | Cisco Technology, Inc. | Network agent for reporting to a network policy system |
US11146454B2 (en) | 2017-03-27 | 2021-10-12 | Cisco Technology, Inc. | Intent driven network policy platform |
US10764141B2 (en) | 2017-03-27 | 2020-09-01 | Cisco Technology, Inc. | Network agent for reporting to a network policy system |
US10873794B2 (en) | 2017-03-28 | 2020-12-22 | Cisco Technology, Inc. | Flowlet resolution for application performance monitoring and management |
US11202132B2 (en) | 2017-03-28 | 2021-12-14 | Cisco Technology, Inc. | Application performance monitoring and management platform with anomalous flowlet resolution |
US11683618B2 (en) | 2017-03-28 | 2023-06-20 | Cisco Technology, Inc. | Application performance monitoring and management platform with anomalous flowlet resolution |
US11863921B2 (en) | 2017-03-28 | 2024-01-02 | Cisco Technology, Inc. | Application performance monitoring and management platform with anomalous flowlet resolution |
US20180336333A1 (en) * | 2017-05-17 | 2018-11-22 | American Express Travel Related Services Company, Inc. | Approving transactions using a captured biometric template |
US10747866B2 (en) | 2017-05-17 | 2020-08-18 | American Express Travel Related Services Company, Inc. | Transaction approval based on a scratch pad |
US10339291B2 (en) * | 2017-05-17 | 2019-07-02 | American Express Travel Related Services Company, Inc. | Approving transactions using a captured biometric template |
US10680887B2 (en) | 2017-07-21 | 2020-06-09 | Cisco Technology, Inc. | Remote device status audit and recovery |
US11044170B2 (en) | 2017-10-23 | 2021-06-22 | Cisco Technology, Inc. | Network migration assistant |
US10554501B2 (en) | 2017-10-23 | 2020-02-04 | Cisco Technology, Inc. | Network migration assistant |
US10523541B2 (en) | 2017-10-25 | 2019-12-31 | Cisco Technology, Inc. | Federated network and application data analytics platform |
US10594542B2 (en) | 2017-10-27 | 2020-03-17 | Cisco Technology, Inc. | System and method for network root cause analysis |
US10904071B2 (en) | 2017-10-27 | 2021-01-26 | Cisco Technology, Inc. | System and method for network root cause analysis |
US11750653B2 (en) | 2018-01-04 | 2023-09-05 | Cisco Technology, Inc. | Network intrusion counter-intelligence |
US11233821B2 (en) | 2018-01-04 | 2022-01-25 | Cisco Technology, Inc. | Network intrusion counter-intelligence |
US10999149B2 (en) | 2018-01-25 | 2021-05-04 | Cisco Technology, Inc. | Automatic configuration discovery based on traffic flow data |
US10798015B2 (en) | 2018-01-25 | 2020-10-06 | Cisco Technology, Inc. | Discovery of middleboxes using traffic flow stitching |
US10574575B2 (en) | 2018-01-25 | 2020-02-25 | Cisco Technology, Inc. | Network flow stitching using middle box flow stitching |
US10826803B2 (en) | 2018-01-25 | 2020-11-03 | Cisco Technology, Inc. | Mechanism for facilitating efficient policy updates |
US11128700B2 (en) | 2018-01-26 | 2021-09-21 | Cisco Technology, Inc. | Load balancing configuration based on traffic flow telemetry |
CN108092771A (en) * | 2018-02-11 | 2018-05-29 | 成都信息工程大学 | A kind of anti-tamper controlled quantum safety direct communication method and system |
US11947649B2 (en) | 2019-03-08 | 2024-04-02 | Master Lock Company Llc | Locking device biometric access |
US11275820B2 (en) * | 2019-03-08 | 2022-03-15 | Master Lock Company Llc | Locking device biometric access |
US20200313880A1 (en) * | 2019-03-25 | 2020-10-01 | Stmicroelectronics (Rousset) Sas | Encryption and/or decryption key device, system and method |
US20200344231A1 (en) * | 2019-04-23 | 2020-10-29 | Microsoft Technology Licensing, Llc | Resource access based on audio signal |
US11949677B2 (en) * | 2019-04-23 | 2024-04-02 | Microsoft Technology Licensing, Llc | Resource access based on audio signal |
Also Published As
Publication number | Publication date |
---|---|
US10592651B2 (en) | 2020-03-17 |
US20160034682A1 (en) | 2016-02-04 |
Similar Documents
Publication | Title |
---|---|
US10592651B2 (en) | Visual image authentication |
US20190050554A1 (en) | Logo image and advertising authentication |
US20180144114A1 (en) | Securing Blockchain Transactions Against Cyberattacks |
US10491379B2 (en) | System, device, and method of secure entry and handling of passwords |
US11824991B2 (en) | Securing transactions with a blockchain network |
Sabzevar et al. | Universal multi-factor authentication using graphical passwords |
US11693944B2 (en) | Visual image authentication |
US11128453B2 (en) | Visual image authentication |
US10848304B2 (en) | Public-private key pair protected password manager |
WO2015188424A1 (en) | Key storage device and method for using same |
EP2758922A2 (en) | Securing transactions against cyberattacks |
CN110291755A (en) | Accredited key server |
CN113826096A (en) | User authentication and signature apparatus and method using user biometric identification data |
WO2014039763A1 (en) | Visual image authentication and transaction authorization using non-determinism |
US20230359764A1 (en) | Visual Image Authentication |
Tekawade et al. | Social engineering solutions for document generation using key-logger security mechanism and QR code |
Misbahuddin et al. | A user friendly password authenticated key agreement for multi server environment |
US20240169350A1 (en) | Securing transactions with a blockchain network |
Reddy et al. | A comparative analysis of various multifactor authentication mechanisms |
Liou | Performance measures for evaluating the dynamic authentication techniques |
US12136083B2 (en) | Offline interception-free interaction with a cryptocurrency network using a network-disabled device |
MORAKINYO | A secure bank login system using a multi-factor authentication |
Abdullah et al. | Pass Matrix Based Graphical Password Authentication on the Android Platform |
Molla | Mobile user authentication system (MUAS) for e-commerce applications. |
Marchang et al. | Multidimensional: User with File Content and Server’s Status Based Authentication for Secure File Operations in Cloud |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |