Abstract
Information privacy is constantly negotiated when people interact with enterprises and government agencies via the Internet. In this context, all relevant stakeholders make privacy-related decisions. Individuals, either as consumers buying online products and services or as citizens using e-government services, face decisions regarding the use of online services, the disclosure of personal information, and the use of privacy enhancing technologies. Enterprises make decisions regarding their investments in policies and technologies for privacy protection. Governments also decide on privacy regulations, as well as on the development of e-government services that store and process citizens’ personal information. Motivated by the aforementioned issues and challenges, we focus on aspects of privacy decision-making in the digital era and address issues of individuals’ privacy behavior. We further discuss issues of strategic privacy decision-making for online service providers and e-government service providers.
1 Introduction
Information privacy is a multi-disciplinary and crucial topic for understanding the digital world [4, 65]. Information privacy mainly relates to personal data stored in information systems, such as medical records, financial data, photos, and videos. In this research, we focus on online privacy where personal data are shared over the Internet.
Current research on information privacy highlights issues such as the privacy concerns of online users [8, 17, 74] and the so-called “privacy paradox”, i.e., the inconsistency between users’ privacy-related behavior and their privacy concerns [40, 44, 81]. Another main strand of research concerns Privacy-Enhancing Technologies (PETs) [58].
In the information age, privacy has become a luxury to maintain, as it can be violated on the Internet through technical means such as cookies and the tracking of online activities [11, 57]. Nonetheless, the rapid growth of the Internet and what it has brought to people’s lives (especially during the past ten years) are truly astonishing. The Internet makes people’s lives incredibly convenient, and websites will probably remain an important communication channel, along with direct messaging applications.
Privacy, however, is not merely an Information Technology (IT) problem, even though in many cases it is treated as one. Many psychological, social, and cultural factors play a significant role in the field of privacy. Human behavior is a key variable, as individuals interact with others in online environments, exchanging private information and making decisions about their privacy [15].
The variety of information that individuals share online can potentially characterize them [13, 54]. The mechanisms that individuals use when making online sharing decisions are the main focus of this research.
The individual decision process with respect to privacy is affected by multiple factors. Incomplete information, bounded rationality, and systematic psychological deviations are considerable variables that influence individuals’ privacy behavior [2, 10]. First, incomplete information refers to privacy decision-making in which third parties share personal information about an individual without her being part of the transaction. How personal information will be used might be known only to a subset of the parties making decisions (information asymmetry); thus, risk can be hard to calculate, as it may depend on unknown random variables. Benefits and costs associated with privacy intrusions and protection are complex, multifaceted, and context-specific. They are frequently bundled with other products and services (e.g., a search engine query can produce the desired result but can also give observers information about the searcher’s interests), and they are often realized only after privacy violations have taken place. They can be monetary but also immaterial and, thus, difficult to quantify.
Second, individuals might be unable to act in an optimal way even if they had access to complete information. Especially when individuals have to manage huge volumes of data and make decisions about the protection or disclosure of personal information, bounded rationality limits their ability to process and memorize all their actions. Instead, they rely on simplified mental models, approximate strategies, and heuristics [6].
Third, individuals might deviate from the rational strategy, even if they had access to complete information and could successfully calculate optimization strategies for their privacy-sensitive decisions. A vast body of economics and psychology literature has revealed several forms of systematic psychological deviations from rationality that affect individual decision-making [36, 76]. For example, in addition to their cognitive and computational bounds, individuals are influenced by motivational limitations and misrepresentations of personal utility. Research in psychology also documents how individuals mispredict their own future preferences or draw inaccurate conclusions from past choices [5]. In addition, individuals often suffer from self-control problems, in particular, the tendency to trade off costs and benefits in ways that damage their future utility in favor of immediate gratification. Individuals’ behavior can also be guided by social preferences or norms, such as fairness or altruism. Many of these deviations apply naturally to privacy-sensitive scenarios. Any of these factors might influence decision-making behavior inside and outside the privacy domain, although not all factors need to always be present. Empirical evidence of their influence on privacy decision-making would not necessarily imply that individuals act recklessly or make choices against their own best interest. It would, however, imply bias and limitations in the individual decision process that we should consider when designing privacy public policy and PETs.
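To make the immediate-gratification effect concrete, the following sketch uses the quasi-hyperbolic (“beta-delta”) discounting model, a standard formalization of present bias in behavioral economics; the model choice and all payoff numbers are our own illustrative assumptions, not values taken from the works cited above.

```python
# Illustrative sketch (assumed model and numbers): quasi-hyperbolic
# "beta-delta" discounting formalizes the immediate-gratification effect.

def present_value(payoffs, beta=0.7, delta=0.95):
    """Value today of a payoff stream; payoffs[t] arrives t periods from now.

    beta < 1 overweights the present (self-control problem);
    beta = 1 recovers the time-consistent exponential discounter.
    """
    return payoffs[0] + sum(beta * delta**t * u
                            for t, u in enumerate(payoffs[1:], start=1))

# Disclosing data yields an immediate perk (a 1.5-unit discount, say)
# and a delayed privacy cost of 2 units two periods later.
disclose = [1.5, 0.0, -2.0]

print(present_value(disclose))            # ~ +0.24: present-biased user discloses
print(present_value(disclose, beta=1.0))  # ~ -0.31: time-consistent user would not
```

The same payoff stream is accepted by the present-biased agent and rejected by the time-consistent one, which is exactly the kind of self-defeating disclosure choice described above.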
2 Privacy Trade-Offs in the Digital Age
What are the privacy implications of behavioral decision-making in online transactions? To answer this question, we should first consider what privacy stands for. A long-lasting debate among scholars over what exactly that right entails has been going on for decades [61]. Undoubtedly, privacy is a fundamental human right [79], but it is also a “chameleon” that changes meaning depending on context [37]. Searching the literature for a definition of privacy, we found clear disarray; nobody seems to have a very clear idea of what the right to privacy is [53]. As Solove [72] points out, privacy means different things to different people.
Warren and Brandeis [79], in 1890, described privacy as the protection of an individual’s space and the right to be let alone. Other authors have defined privacy as control over personal information [80], or as an aspect of dignity, integrity, and human freedom [68]. Nonetheless, all approaches have something in common: a reference to the boundaries between private and public.
Privacy in the modern world has two dimensions: first, the identity of a person and, second, the way personal information is used. During their daily online transactions as consumers of products and services, individuals have many issues to consider and many privacy-related decisions to make. Consumers seek maximum benefit at minimum cost for themselves. Firms, on the other hand, can benefit from the ability to learn so much about their customers. Through this prism, scientists working on behavioral decision-making focus their research on these trade-offs and on the protection (or sharing) of information [4].
Privacy transactions nowadays occur in three different types of markets [3]. First, there are transactions for non-privacy goods, in which consumers often reveal personal information that may be collected, analyzed, and processed in some way; here, the potential secondary use of information should be considered. The second type of privacy-related transaction occurs where firms provide consumers with free products or services (e.g., search engines, online social networks, free cloud services). In these transactions, consumers provide personal information directly, although the exchange of services for personal data is not always visible. The third type of privacy-related transaction occurs in the market for privacy tools; for example, consumers may acquire a PET to protect their transactions or hide their browsing behavior [7].
The analysis of consumers’ personal data can improve firms’ marketing capabilities and increase revenues through targeted offers. Consequently, firms employ innovative strategies to entice consumers to provide more personal information and to shape their preferences [60]. By observing consumers’ behavior, firms can learn how to improve their services and turn to price discrimination strategies for clear profit [9]. On the other hand, consumers benefit from targeted advertising strategies, since advertisements are tailored to their interests. Firms and consumers can both benefit from such targeting: the former reduce the cost of communicating with consumers, and the latter easily gain useful information [75].
Finally, a more intangible but also important form of indirect consumer cost is related to the fact that the more an individual’s data is shared with other parties, the greater the bargaining advantage those parties gain in future transactions with that individual. While consumers receive offers for products, data holders accumulate information about them over time and across platforms and transactions. This data permits the creation of a detailed dossier of a consumer’s preferences and tastes, and the prediction of her future behavior [29].
Results from the literature on privacy transactions show that decision-making about the collection and diffusion of private information by firms and other third parties will almost always raise issues for private life. Consumers seem to act shortsightedly when trade-offs involve short-term benefits and long-term costs of privacy invasions, which suggests that consumers may not always behave rationally when facing privacy trade-offs. Current research describes this as the privacy paradox phenomenon: individuals face obstacles in making privacy-sensitive decisions because of incomplete information, bounded access to the available information, and the many deviations and behavioral biases documented by behavioral decision research [2, 6].
3 Information Privacy in Cloud Computing: A Game Theory Approach
In the literature, the adoption and implementation of cloud computing technology is considered an important milestone for modern organizations and is inseparably connected with the protection or disclosure of personal information. A four-factor analysis of the human component, technology, organization, and environment is used to understand cloud computing adoption [43, 46, 51]. Cloud computing adoption by organizations remains a utopia if individual users are not familiar with cloud technology. Sharma et al. [70] point to studies in the information systems field where behavioral constructs are key factors influencing an individual user’s adoption of a new technology [12, 24, 41, 77]. They examine whether, and to what extent, factors such as perceived usefulness, perceived ease of use, computer self-efficacy, and trust affect individual users’ adoption of cloud technologies, and they find that these factors are indeed important.
A major inhibiting factor is the loss of control over the storage of critical data and the outsourced nature of the service. The challenge for cloud providers is to identify and understand the concerns of privacy-sensitive stakeholders and to adopt security practices that meet their requirements [19]. Misunderstanding the privacy concerns of end-users may lead to loss of business, as users may either stop using a perceivably insecure or privacy-abusing service, or falsify the information they provide, thus minimizing the potential for profit via personalized advertising. An end-user may supply fake data if she believes that the service provider is going to abuse the privacy agreement and sell personal data derived from a cloud-based subscription to a third party [16].
Di Vimercati et al. [78] underline that the significant benefit of elasticity in clouds has attracted companies and individual users to cloud technologies. At the same time, this benefit can prove harmful to users’ privacy, as security threats and a potential loss of control by data owners exist, diminishing the adoption of the cloud computing paradigm. The European Network and Information Security Agency (ENISA) [1] lists loss of control over data as a top risk of cloud computing. Also, in 2013 the Cloud Security Alliance (CSA) listed data breaches and data loss as two of the top nine threats in cloud computing [14, 32]. The new complexity of the cloud paradigm (e.g., distribution and virtualization), the class of data involved (e.g., sensitive data), and the fact that CSPs might not be fully trustworthy all raise the security and privacy obstacles to cloud adoption.
In these cases, game theory emerges as an interesting tool to explore the aforementioned issues, as it can be used to interpret stakeholder interactions and interdependencies across the above scenarios. For example, Rajbhandari and Snekkenes [62] implemented a game theory-based approach to analyzing privacy risks, in place of traditional probabilistic risk analysis (PRA). Their scenario is based on an online bookstore where the user has to subscribe in order to access a service. Two players take part in this game: the user and the online bookstore. The user can provide either genuine or fake information, whereas the bookstore can either sell the user’s information to a third party or respect it. A mixed-strategy Nash equilibrium was chosen for solving the game, with the user’s negative payoffs describing quantitatively the level of privacy risk.
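As a concrete illustration of this style of analysis, the sketch below solves a 2x2 game with the same structure as the user-bookstore scenario; the payoff numbers are hypothetical, chosen only so that, as in [62], no pure equilibrium exists and the user’s equilibrium payoff is negative.

```python
from fractions import Fraction as F

# Hypothetical payoffs: rows = user strategies (genuine, fake),
# columns = bookstore strategies (respect, sell).
U = [[F(1), F(-2)],   # user's payoffs
     [F(-1), F(0)]]
B = [[F(2), F(3)],    # bookstore's payoffs
     [F(1), F(-1)]]

# Mixed-strategy Nash equilibrium of a 2x2 game via indifference:
# the user's mix p leaves the bookstore indifferent between its columns,
# the bookstore's mix q leaves the user indifferent between its rows.
p = (B[1][1] - B[1][0]) / (B[0][0] - B[0][1] + B[1][1] - B[1][0])
q = (U[1][1] - U[0][1]) / (U[0][0] - U[0][1] + U[1][1] - U[1][0])

user_value = q * U[0][0] + (1 - q) * U[0][1]  # expected user payoff
print(f"P(genuine) = {p}, P(respect) = {q}, user payoff = {user_value}")
# -> P(genuine) = 2/3, P(respect) = 1/2, user payoff = -1/2
```

The user’s negative equilibrium payoff (here -1/2) is what serves as the quantitative measure of privacy risk in this approach.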
Snekkenes [71] applies Conflicting Incentives Risk Analysis (CIRA) to a case where a bank and a customer are involved in a deal, attempting to identify who takes the role of the risk owner in data breach incidents and which utility factors weigh on the risk owner’s perception of utility. The CIRA approach identifies stakeholders, actions, and payoffs. Each action can be viewed as a strategy in a potentially complex game, where implementing the action amounts to participating in the game. CIRA shows how this method can be used to identify privacy risks and analyze human behavior.
Also, according to Hausken [33], the behavioral dimension is a very important factor in estimating risk: conflict behavior, as recorded in individuals’ choices, can be integrated into a probabilistic risk analysis and analyzed through game theory. Resnick et al. [66] studied the use of “cheap pseudonyms” in measuring reputation in Internet interactions between stakeholders. This was a multi-player game in which users adopted pseudonyms for their interactions in the Internet world and, at each period of time, had the option either to continue playing under the current pseudonym or to find a new one. A suboptimal equilibrium is found in this repeated prisoner’s-dilemma type of game, and methods of limiting identity changes are suggested.
Cai et al. [20] introduce a game-theoretic approach to managing decision errors, as there is a gap between strategic decisions and actions; they study the effects of decision errors on the optimal equilibrium strategies of the firm and the user. Cavusoglu and Raghunathan [22] propose a game-theoretic model for determining whether a provider should invest in high- or low-cost ICT and compare game-theoretic and decision-theoretic approaches. They show that when firms choose their action before attackers choose theirs (a sequential game), firms gain the maximum payoff. Also, when firms use knowledge from previous hacker attacks to estimate future hacker effort, the distance between the results of the decision theory and game theory approaches diminishes.
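The first-mover effect can be seen in a toy version of such a security-investment game. In the sketch below, all payoffs are invented and the model is far simpler than the one in [22]; it only illustrates why leading pays: a visible commitment to high investment deters the attack, whereas simultaneous play leaves only a mixed equilibrium with a lower expected payoff for the firm.

```python
# Toy security-investment game; entries are (firm, attacker) payoffs.
# Assumed numbers: high investment costs more but makes attacking
# unprofitable; low investment invites a damaging attack.
G = {('High', 'Attack'): (-6, -3), ('High', 'No'): (-4, 0),
     ('Low',  'Attack'): (-9,  5), ('Low',  'No'): (-1, 0)}

def sequential(game):
    """Firm commits first; attacker best-responds; firm anticipates it."""
    outcome = {}
    for f in ('High', 'Low'):
        br = max(('Attack', 'No'), key=lambda a: game[(f, a)][1])
        outcome[f] = game[(f, br)][0]
    firm_choice = max(outcome, key=outcome.get)
    return firm_choice, outcome[firm_choice]

def simultaneous_firm_value(game):
    """Firm's expected payoff in the mixed equilibrium of simultaneous
    play (this game has no pure equilibrium): the attacker attacks with
    the probability that makes the firm indifferent between High and Low."""
    fHa, fHn = game[('High', 'Attack')][0], game[('High', 'No')][0]
    fLa, fLn = game[('Low', 'Attack')][0], game[('Low', 'No')][0]
    a = (fHn - fLn) / ((fLa - fLn) - (fHa - fHn))
    return a * fHa + (1 - a) * fHn

print(sequential(G))               # ('High', -4): visible investment deters attack
print(simultaneous_firm_value(G))  # -5.0: strictly worse than moving first
```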
Gao and Zhong [31] address the problem of distorted incentives for stakeholders in an electronic environment, applying differential game theory to a case where two competing firms offer the same product to customers and each can influence the value of its information assets by changing pricing rates. To assure consumers that they do not risk losing sensitive information, and also to increase consumer demand, firms usually integrate their security investment strategies. The researchers reveal that higher consumer demand loss and more highly targeted attacks deter both firms from an aggressive defense policy against hackers; the firms would rather decrease the negative effect of hacker attacks by lowering their pricing rates.
In conclusion, game theory research on online privacy-related decision-making has shown that it can give credible results in understanding privacy-related behavior.
4 Impact of Consumer Trust in Cloud Services
Sato [67] reports that 88% of consumers worldwide are worried about the loss of their data. Who has access to their data? Where is consumers’ data physically stored? Can cloud service providers (CSPs) find ways to gain consumers’ trust? Is the CSPs’ pursuit of consumer trust a value-for-money strategy? These are typical questions that consumers and CSPs ask about trust in clouds and online environments.
Ramachandran and Chang [63] highlight key issues associated with data security in the cloud. One key factor for cloud adoption is building trust when storing and computing sensitive data in the cloud. Trust in e-services offered in virtual online environments is a major topic for consumers and cloud service providers, as well as for cloud researchers, and it is strongly tied to online security. McKnight et al. [49] indicate three significant trust components as prominent factors for new ICT adoption: ability, integrity, and good will. Ability refers to CSPs’ efficiency in resources and skills, which should not deter consumers from adopting cloud technologies; integrity refers to CSPs’ obligation to comply with regulations; and good will means that CSPs give priority to consumers’ needs.
Sharma et al. [70] suggest that trust in clouds has a positive and significant relationship with an individual’s decision to adopt cloud computing services. In clouds, users often want to share sensitive information, and CSPs should ensure their privacy [39]. Svantesson and Clarke [73] suggest that CSPs should apply policies that assure users their data are safe and thus attract them to use clouds.
Consumers trust CSPs only to the extent that the perceived risk is low and the convenience payoff for them is high. Pearson [59] argues that when customers have to decide about trusting CSPs for personal data exchange services, they should consider the organization’s operational, security, privacy, and compliance requirements and choose what best suits them.
5 Asymmetric Information and Strategic Stakeholder Interaction in Clouds
Asymmetric information is a concept encountered often in commercial transactions between sellers and buyers, or end-users and service providers, where one party has more information than the other. Potentially, this can lead to a harmful situation, as one party can take advantage of the other party’s lack of knowledge. Information asymmetries are commonly met in principal-agent problems, where they cause misinformation and affect the communication process [23].
Principal-agent problems occur when an entity (the agent) makes decisions on behalf of another entity: the principal is “a person, who authorizes an agent to act with a third trusted party” [18, 27]. A dilemma exists when the agreement between the participants is not respected and the agent is motivated to act for his own personal gain, contrary to the principal’s interests. Principals do not know enough about whether an agreement has been satisfied; therefore, their decisions are taken under risk and uncertainty and involve costs for both parties. This information problem can be solved if the third trusted party provides incentives for the agents to act appropriately and in accordance with the principals, as the sketch below illustrates. In terms of game theory, the rules should be changed so that rational agents are confronted with what the principal desires [18].
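A minimal numeric sketch of this incentive logic follows; the comply/shirk framing and all numbers are hypothetical, and merely show why a sufficiently large incentive payment makes acting in the principal’s interest the agent’s rational choice.

```python
# Minimal principal-agent sketch with assumed numbers. The agent privately
# chooses an action; the principal only controls the payment w attached
# to the desired action ("comply").

EFFORT_COST = 3       # agent's private cost of complying
SHIRK_GAIN = 2        # agent's private gain from acting opportunistically
PRINCIPAL_VALUE = 10  # value the principal receives when the agent complies

def agent_action(w):
    """A rational agent complies only if the payment covers his effort
    cost plus the gain he forgoes by not shirking."""
    return 'comply' if w - EFFORT_COST >= SHIRK_GAIN else 'shirk'

for w in (4, 6):
    action = agent_action(w)
    principal = PRINCIPAL_VALUE - w if action == 'comply' else 0
    print(f"payment {w}: agent -> {action}, principal's payoff = {principal}")
# payment 4: agent -> shirk, principal's payoff = 0
# payment 6: agent -> comply, principal's payoff = 4
```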
McKinney and Yoos [48] argue that information is almost always unspecified relative to an unbounded variety of problems, and the involved agents (so-called stakeholders) almost always act without having full information about their decisions. While information risk has been adequately studied in recent decades, the literature offers no risk premium for information asymmetry [34]. Easley and O’Hara [26] argue that information asymmetry creates what they call information risk; their model shows that additional private information from consumers yields higher expected returns to the involved agents.
For an agent, a risk premium is the minimum economic benefit by which the expected return from decision-making under risk must exceed the known return of a risk-free decision, in which full information is available to the involved stakeholders. A rational agent is risk averse: he attempts to reduce uncertainty when exposed to information asymmetry, and the utility of such a strategic move is expected to be high in many cases. For such risky outcomes, a decision-maker adopts a criterion as a rule of choice, whereby strategic moves with higher expected value are simply the preferred ones [55].
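As a worked illustration, the sketch below computes a risk premium as the gap between a gamble’s expected value and its certainty equivalent, assuming a logarithmic utility for the risk-averse agent; the utility choice and the numbers are our own assumptions.

```python
import math

def certainty_equivalent(outcomes, probs, u=math.log, u_inv=math.exp):
    """Sure amount the agent values exactly as much as the gamble."""
    expected_utility = sum(p * u(x) for x, p in zip(outcomes, probs))
    return u_inv(expected_utility)

# Hypothetical gamble over final wealth under information asymmetry.
outcomes, probs = [100.0, 25.0], [0.5, 0.5]

ev = sum(p * x for x, p in zip(outcomes, probs))  # expected value: 62.5
ce = certainty_equivalent(outcomes, probs)        # 50.0 for log utility
print(f"expected value {ev}, certainty equivalent {ce:.1f}, "
      f"risk premium {ev - ce:.1f}")              # premium: 12.5
```

The 12.5-unit premium is the minimum extra expected return this agent demands for bearing the risk rather than taking a sure, fully informed outcome.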
From a game theory perspective, uncertain outcomes exist where potential preferences over the appropriate risky choices coincide. In cases where the expected utility hypothesis is satisfied, it can prove useful for explaining choices that seem to contradict the expected value criterion. Asymmetric information in the cloud introduces scenarios where stakeholders (consumers and service providers) interact strategically. A game theory approach based on trust is a useful tool for explaining the conflict and cooperation between intelligent, rational decision-makers.
Njilla et al. [52] introduce a game-theoretic model for trust in clouds, suggesting that risk and trust are two behavioral factors that influence decision-making in uncertain environments such as cloud markets, where consumers do not seem to have full control over their stored data. They adopt a game-theoretic approach to establish a relationship between trust and the factors that could affect risk assessments. The scenario involves three players: end-users, service providers, and attackers. The provider defends the system’s infrastructure against attackers, while end-users tend not to trust an online service after data privacy breaches. Njilla et al. [52] propose a game model that mitigates cyber-attack behavior. They analyze different solutions obtained from the Nash equilibrium (NE) and find that frequent attacks, combined with the provider’s ability to mitigate the loss, might cause the attacker to be detected and caught; it is thus possible, in this case, that the attacker will not attack because of the high risk and penalties. But what about the gain and the loss when the provider invests in security and the attacker nevertheless attacks successfully, compromising users’ private data? What are the payoffs of each player in this case? These remain open questions.
Maghrabi and Pfluegel [47] use game theory from an end-user perspective to assess the risk of moving to public clouds. While previous works focus on helping the cloud provider assess risk, they develop a model of the benefits and costs associated with attacks on the end-user’s assets in order to help the user decide whether or not to adopt the cloud. The end-user relies on a Service Level Agreement (SLA), which promises protection against external attacks.
Douss et al. [25] propose a game-based trust model for mobile ad hoc networks. Assuring reputation and establishing trust between collaborating parties is, indirectly, a way to provide a secure online environment. The authors suggest an evaluation model for trust values, apply computational methods, and develop a framework for trust establishment.
Li et al. [42] study price-bidding strategies when multiple users interact and compete for resource usage in cloud computing. The cloud services are available to end-users in a pay-as-you-go manner [38, 56]. A non-cooperative game model is developed with multiple cloud users, where each cloud user has incomplete and asymmetric information about the other users. The authors work on utility functions with “time efficiency” parameters incorporated, calculating the net profit of each user in order to help them decide whether to use the cloud service. For a cloud provider, the income is the amount of money users pay for resource usage [50]. A rational user maximizes his net reward \(R = U - P\), where \(U\) is the utility of choosing the cloud service and \(P\) is the payment, by choosing the appropriate bidding strategy. However, it is irrational for a cloud provider to provision enough resources for all potential requests at a specific time; therefore, cloud users compete for resource usage. These strategic interactions are analyzed from a game-theoretic perspective, and the existence of a Nash equilibrium is confirmed by a proposed near-equilibrium price-bidding algorithm; a sketch of this kind of dynamics follows below. For future research, a good idea is to study cloud users’ choice among different cloud providers or to determine a properly mixed bidding strategy.
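The sketch below illustrates near-equilibrium bidding dynamics using the classic proportional-share (Kelly) mechanism, in which each user repeatedly plays a best response to the others’ bids; this is a generic stand-in under our own assumptions, not the actual model or algorithm of Li et al. [42].

```python
import math

# Proportional-share bidding game (assumed stand-in model): user i bids
# b[i], receives the resource share b[i] / sum(b), and nets
# v[i] * share - b[i], i.e. utility minus payment.

v = [10.0, 6.0, 4.0]   # hypothetical per-user valuations of the resource
b = [1.0] * len(v)     # initial bids

for _ in range(100):   # best-response iteration toward near-equilibrium
    for i, vi in enumerate(v):
        others = sum(b) - b[i]
        # closed-form best response: argmax_x  vi * x / (x + others) - x
        b[i] = max(0.0, math.sqrt(vi * others) - others)

shares = [bi / sum(b) for bi in b]
net = [vi * s - bi for vi, s, bi in zip(v, shares, b)]
print([round(x, 3) for x in b], [round(x, 3) for x in net])
```

Higher-valuation users end up bidding more and capturing larger shares, while each user’s net reward \(R = U - P\) remains non-negative at the fixed point, since bidding zero always guarantees a payoff of zero.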
Fagnani et al. [28] consider a network of units (e.g., smartphones or tablets) whose users have decided to make external backups of their data and are also able to offer space to store the data of other connected units. They propose a peer-to-peer storage game model and design an algorithm that makes units interact and store data backups from connected neighbors. The algorithm converges to a Nash equilibrium of the game, but several challenges remain for future research on stakeholder interactions in a more trusted environment.
Moreover, Jebalia et al. [35] analyze the resource allocation problem in cloud computing, where users compete to gain more space to run their applications and store their data. They develop a resource allocation model based on a cooperative game approach, in which cloud providers offer a great number of resources in order to maximize profit and combine the adoption of security mechanisms with payoff maximization.
Security and privacy are often positioned as opposing concepts. Much of the focus is on reducing the cost of establishing a trustworthy infrastructure in cloud computing, which gradually requires disclosing private information, leading to models of trading privacy for trust [52, 69]. Also, Lilien and Bhargava [45] point out the tension between maintaining a high level of privacy and establishing trust for transactions in cloud environments: users who are particularly keen to conceal private information are asked by cloud providers for a set of corresponding credentials that establish trust for these users. The tradeoff problem is to reveal the minimum number of credentials that satisfies the trust requirements while assuring minimum loss of the user’s privacy.
Raya et al. [64] suggest a trust-privacy tradeoff game-theoretic model that gives stakeholders incentives to build trust while keeping privacy loss at a minimum level. Individual players do not trust cloud providers unless they receive an appropriate incentive.
Gal-Oz et al. [30] introduce a tradeoff approach studying the relationship between trust and privacy in online transactions. They suggest that pseudonyms are a necessary component for maintaining privacy, since pseudonyms prevent the association of a transaction with an identity while still sustaining a level of reputation; the more pseudonyms a user employs, however, the harder it becomes to accumulate a consistent reputation.
In view of the above issues, we note that any application relying on the emerging cloud computing technology should consider the different possible threats. The problem is the lack of a clearly defined notion of such risk, one that would help cloud users make proper choices and cloud service providers counter threats efficiently.
6 Conclusions
Game theory is adopted as a very general language for modeling choices by agents in settings where the actions of other agents can affect each player’s outcome. Game theory assumes that players choose strategies which maximize the utility of game outcomes, given their beliefs about what others will do.
The most challenging question is often how beliefs are formed. Most approaches suggest that beliefs are derived from what other players are likely to do; game theory thus focuses on preferences and the formation of beliefs. An equilibrium specifies not only a strategy for each player but also a belief for each player, where each belief is the probability of the other players having particular types, given the type of the player holding that belief. Players specify reasonable beliefs by equating them with the equilibrium choices of the other players.
However, some limits arise. First, many games that occur in social life are so complex that players cannot, at a given time, form accurate beliefs about what other players would choose, and therefore they cannot choose equilibrium strategies. So, what strategies might be chosen by players with bounded rationality, and how does a repeated game help players improve their strategic choices? Second, in empirical work, only received payoffs are easily measured (e.g., prices in auctions). A huge variety of experiments shows that game theory sometimes explains behavior adequately and is sometimes badly rejected by behavioral and process data [21]. This inference can be used to create a more general theory which matches the standard theory where it is accurate and explains the cases in which it is badly rejected. This emerging approach, called “behavioral game theory”, uses analytical game theory to explain observed violations by incorporating bounds on rationality.
Game theory is the standard theory for analyzing cases where individuals or firms interact: for example, the strategic interaction of privacy-sensitive end-users with cloud-based mobile apps, e-commerce transactions between sellers and consumers, and many other social dilemmas such as the provision of public goods. Behavioral game theory introduces psychological parameters which extend the rational scenario and give a motivational basis for players’ behavior. Representation, social preferences over outcomes, initial conditions, and learning are the basic components of a precise analysis [21].
In this work, we focus on information privacy in cyberspace transactions. Cyberspace is shorthand for the web of consumer electronics, computers, and communication networks that interconnect the world. The potential surveillance of electronic activities presents a serious threat to information privacy. The collection and use of private information have caused serious consumer concerns about privacy invasion, creating a personalization-privacy tradeoff. The key approach to addressing privacy concerns is the protection of privacy through the implementation of fair information practices, a set of standards governing the collection and use of personal information. We take a game-theoretic approach to explore firms’ motivation for privacy protection and its impact on competition and social welfare in the context of product and price personalization. We find that privacy protection can work as a competition-mitigating mechanism by generating asymmetry in the consumer segments to which firms offer personalization, enhancing the profit-extraction abilities of the firms. In equilibrium, both symmetric and asymmetric choices of privacy protection by the firms can result, depending on the size of the personalization scope and the investment cost of protection. Further, as consumers become more concerned about their privacy, it is more likely that all firms adopt privacy protection strategies. From a welfare perspective, we show that autonomous choices of privacy protection by personalizing firms can improve social welfare at the expense of consumer welfare. We further find that regulation enforcing the implementation of fair information practices can be efficient from the social welfare perspective, mainly by limiting the firms’ incentives to exploit the competition-mitigation effect.
References
Cloud Computing Risk Assessment, ENISA (2009). https://www.enisa.europa.eu/publications/cloud-computing-risk-assessment. Accessed 10 Feb 2017
Acquisti, A.: Privacy in electronic commerce and the economics of immediate gratification. In: Proceedings of the 5th ACM Conference on Electronic Commerce, pp. 21–29. ACM (2004)
Acquisti, A.: The economics of personal data and the economics of privacy (2010)
Acquisti, A., Brandimarte, L., Loewenstein, G.: Privacy and human behavior in the age of information. Science 347(6221), 509–514 (2015)
Acquisti, A., Grossklags, J.: Privacy and rationality. In: Strandburg, K.J., Raicu, D.S. (eds.) Privacy and Technologies of Identity, pp. 15–29. Springer, Heidelberg (2006)
Acquisti, A., Grossklags, J.: What can behavioral economics teach us about privacy. Digit. Priv.: Theory Technol. Pract. 18, 363–377 (2007)
Acquisti, A., John, L.K., Loewenstein, G.: What is privacy worth? J. Legal Stud. 42(2), 249–274 (2013)
Acquisti, A., Taylor, C., Wagman, L.: The economics of privacy. J. Econ. Lit. 54(2), 442–492 (2016)
Acquisti, A., Varian, H.R.: Conditioning prices on purchase history. Mark. Sci. 24(3), 367–381 (2005)
Adjerid, I., Peer, E., Acquisti, A.: Beyond the privacy paradox: objective versus relative risk in privacy decision making (2016)
Aguirre, E., Roggeveen, A.L., Grewal, D., Wetzels, M.: The personalization-privacy paradox: implications for new media. J. Consum. Mark. 33(2), 98–110 (2016)
Al-Somali, S.A., Gholami, R., Clegg, B.: An investigation into the acceptance of online banking in Saudi Arabia. Technovation 29(2), 130–141 (2009)
Alberts, J.K., Nakayama, T.K., Martin, J.N.: Human Communication in Society. Pearson, Upper Saddle River (2015)
Almorsy, M., Grundy, J., Müller, I.: An analysis of the cloud computing security problem. arXiv preprint arXiv:1609.01107 (2016)
Andriotis, P., Takasu, A., Tryfonas, T.: Smartphone message sentiment analysis. In: Peterson, G., Shenoi, S. (eds.) DigitalForensics 2014. IAICT, vol. 433, pp. 253–265. Springer, Heidelberg (2014). doi:10.1007/978-3-662-44952-3_17
Andriotis, P., Tryfonas, T.: Impact of user data privacy management controls on mobile device investigations. In: Peterson, G., Shenoi, S. (eds.) Advances in Digital Forensics XII. IAICT, vol. 484, pp. 89–105. Springer, Cham (2016). doi:10.1007/978-3-319-46279-0_5
Andriotis, P., Tzermias, Z., Mparmpaki, A., Ioannidis, S., Oikonomou, G.: Multilevel visualization using enhanced social network analysis with smartphone data. Int. J. Digit. Crime Forensics (IJDCF) 5(4), 34–54 (2013)
Bosse, D.A., Phillips, R.A.: Agency theory and bounded self-interest. Acad. Manag. Rev. 41(2), 276–297 (2016)
Brunette, G., Mogull, R., et al.: Security guidance for critical areas of focus in cloud computing v2.1. Cloud Secur. Alliance 1–76 (2009)
Cai, C.X., Mei, S.E., Zhong, W.J.: A game-theory approach to manage decision errors. In: MATEC Web of Conferences, vol. 44. EDP Sciences (2016)
Camerer, C.: Behavioral Game Theory: Experiments in Strategic Interaction. Princeton University Press, Princeton (2003)
Cavusoglu, H., Raghunathan, S., Yue, W.T.: Decision-theoretic and game-theoretic approaches to it security investment. J. Manag. Inf. Syst. 25(2), 281–304 (2008)
Christozov, D., Chukova, S., Mateev, P.: Informing processes, risks, evaluation of the risk of misinforming. Found. Informing Sci. 323–356 (2009)
Davis, F.D.: Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 319–340 (1989)
Douss, A.B.C., Abassi, R., El Fatmi, S.G.: A trust management based security mechanism against collusion attacks in a MANET environment. In: 2014 Ninth International Conference on Availability, Reliability and Security (ARES), pp. 325–332. IEEE (2014)
Easley, D., O’Hara, M.: Information and the cost of capital. J. Financ. 59(4), 1553–1583 (2004)
Eisenhardt, K.M.: Agency theory: an assessment and review. Acad. Manag. Rev. 14(1), 57–74 (1989)
Fagnani, F., Franci, B., Grasso, E.: A game theoretic approach to a peer-to-peer cloud storage model. arXiv preprint arXiv:1607.02371 (2016)
Farrell, J.: Can privacy be just another good. J. Telecomm. High Tech. L. 10, 251 (2012)
Gal-Oz, N., Grinshpoun, T., Gudes, E.: Privacy issues with sharing reputation across virtual communities. In: Proceedings of the 4th International Workshop on Privacy and Anonymity in the Information Society, p. 3. ACM (2011)
Gao, X., Zhong, W.: A differential game approach to security investment and information sharing in a competitive environment. IIE Trans. 48(6), 511–526 (2016)
Top Threats Working Group: The notorious nine: cloud computing top threats in 2013. Cloud Secur. Alliance (2013)
Hausken, K.: Probabilistic risk analysis and game theory. Risk Anal. 22(1), 17–27 (2002)
Hirshleifer, D.A., Huang, C., Teoh, S.H.: Information asymmetry, market participation, and asset prices (2016)
Jebalia, M., Ben Letaïfa, A., Hamdi, M., Tabbane, S.: An overview on coalitional game-theoretic approaches for resource allocation in cloud computing architectures. Int. J. Cloud Comput. 4(1), 63–77 (2015)
Kahneman, D., Tversky, A.: Choices, Values, and Frames. Cambridge University Press, Cambridge (2000)
Kang, J., Shilton, K., Estrin, D., Burke, J.: Self-surveillance privacy. Iowa L. Rev. 97, 809 (2011)
Kaur, P.D., Chana, I.: A resource elasticity framework for QoS-aware execution of cloud applications. Future Gener. Comput. Syst. 37, 14–25 (2014)
King, N.J., Raja, V.: Protecting the privacy and security of sensitive customer data in the cloud. Comput. Law Secur. Rev. 28(3), 308–319 (2012)
Kokolakis, S.: Privacy attitudes and privacy behaviour: a review of current research on the privacy paradox phenomenon. Comput. Secur. 64, 122–134 (2017)
Kumar Sharma, S., Madhumohan Govindaluri, S.: Internet banking adoption in India: structural equation modeling approach. J. Indian Bus. Res. 6(2), 155–169 (2014)
Li, W., Huang, Z.: The research of influence factors of online behavioral advertising avoidance. Am. J. Ind. Bus. Manag. 6(09), 947 (2016)
Lian, J.W., Yen, D.C., Wang, Y.T.: An exploratory study to understand the critical factors affecting the decision to adopt cloud computing in Taiwan hospital. Int. J. Inf. Manag. 34(1), 28–36 (2014)
Liang, H., Shen, F., Fu, K.: Privacy protection and self-disclosure across societies: a study of global twitter users. New Media Soc. 1461444816642210 (2016)
Lilien, L., Bhargava, B.: Trading Privacy for Trust in Online Interactions. Idea Group, Hershey (2008)
Low, C., Chen, Y., Wu, M.: Understanding the determinants of cloud computing adoption. Ind. Manag. Data Syst. 111(7), 1006–1023 (2011)
Maghrabi, L., Pfluegel, E.: Moving assets to the cloud: a game theoretic approach based on trust. In: 2015 International Conference on Cyber Situational Awareness, Data Analytics and Assessment (CyberSA), pp. 1–5. IEEE (2015)
McKinney Jr., E.H., Yoos, C.J.: Information about information: a taxonomy of views. MIS Q. 329–344 (2010)
McKnight, D.H., Choudhury, V., Kacmar, C.: The impact of initial consumer trust on intentions to transact with a web site: a trust building model. J. Strateg. Inf. Syst. 11(3), 297–323 (2002)
Mei, J., Li, K., Ouyang, A., Li, K.: A profit maximization scheme with guaranteed quality of service in cloud computing. IEEE Trans. Comput. 64(11), 3064–3078 (2015)
Morgan, L., Conboy, K.: Key factors impacting cloud computing adoption. Computer 46(10), 97–99 (2013)
Njilla, L.Y., Pissinou, N., Makki, K.: Game theoretic modeling of security and trust relationship in cyberspace. Int. J. Commun. Syst. 29(9), 1500–1512 (2016)
Nofer, M., Hinz, O., Muntermann, J., Roßnagel, H.: The economic impact of privacy violations and security breaches. Bus. Inf. Syst. Eng. 6(6), 339–348 (2014)
Nosko, A., Wood, E., Molema, S.: All about me: disclosure in online social networking profiles: the case of Facebook. Comput. Hum. Behav. 26(3), 406–418 (2010)
O’Brien, M.K., Ahmed, A.A.: Rationality in human movement. Exerc. Sport Sci. Rev. 44(1), 20–28 (2016)
Pal, R., Hui, P.: Economic models for cloud service markets: pricing and capacity planning. Theoret. Comput. Sci. 496, 113–124 (2013)
Pan, Y., Zinkhan, G.M.: Exploring the impact of online privacy disclosures on consumer trust. J. Retail. 82(4), 331–338 (2006)
Parra-Arnau, J., Rebollo-Monedero, D., Forné, J.: Measuring the privacy of user profiles in personalized information systems. Future Gener. Comput. Syst. 33, 53–63 (2014)
Pearson, S.: Privacy, security and trust in cloud computing. In: Pearson, S., Yee, G. (eds.) Privacy and Security for Cloud Computing. CCN, pp. 3–42. Springer, Heidelberg (2013). doi:10.1007/978-1-4471-4189-1_1
Pitta, D.A.: Jump on the bandwagon - it’s the last one: new developments in online promotion. J. Consum. Mark. 27(2) (2010)
Post, R.C.: Rereading warren and brandeis: privacy, property, and appropriation. Case W. Res. L. Rev. 41, 647 (1990)
Rajbhandari, L., Snekkenes, E.A.: Mapping between classical risk management and game theoretical approaches. In: Decker, B., Lapon, J., Naessens, V., Uhl, A. (eds.) CMS 2011. LNCS, vol. 7025, pp. 147–154. Springer, Heidelberg (2011). doi:10.1007/978-3-642-24712-5_12
Ramachandran, M., Chang, V.: Towards performance evaluation of cloud service providers for cloud data security. Int. J. Inf. Manag. 36(4), 618–625 (2016)
Raya, M., Shokri, R., Hubaux, J.P.: On the tradeoff between trust and privacy in wireless ad hoc networks. In: Proceedings of the Third ACM Conference on Wireless Network Security, pp. 75–80. ACM (2010)
Regan, P.M.: Privacy as a common good in the digital world. Inf. Commun. Soc. 5(3), 382–405 (2002)
Resnick, P., et al.: The social cost of cheap pseudonyms. J. Econ. Manag. Strategy 10(2), 173–199 (2001)
Sato, M.: Personal data in the cloud: a global survey of consumer attitudes. Fujitsu Research Institute, Minato-ku, Tokyo (2010)
Schoeman, F.D.: Privacy and Social Freedom. Cambridge University Press, Cambridge (1992)
Seigneur, J.-M., Jensen, C.D.: Trading privacy for trust. In: Jensen, C., Poslad, S., Dimitrakos, T. (eds.) iTrust 2004. LNCS, vol. 2995, pp. 93–107. Springer, Heidelberg (2004). doi:10.1007/978-3-540-24747-0_8
Sharma, S.K., Al-Badi, A.H., Govindaluri, S.M., Al-Kharusi, M.H.: Predicting motivators of cloud computing adoption: a developing country perspective. Comput. Hum. Behav. 62, 61–69 (2016)
Snekkenes, E.: Position paper: privacy risk analysis is about understanding conflicting incentives. In: Fischer-Hübner, S., Leeuw, E., Mitchell, C. (eds.) IDMAN 2013. IAICT, vol. 396, pp. 100–103. Springer, Heidelberg (2013). doi:10.1007/978-3-642-37282-7_9
Solove, D.J.: A taxonomy of privacy. Univ. Pa. Law Rev. 477–564 (2006)
Svantesson, D., Clarke, R.: Privacy and consumer risks in cloud computing. Comput. Law Secur. Rev. 26(4), 391–397 (2010)
Tsai, J.Y., Egelman, S., Cranor, L., Acquisti, A.: The effect of online privacy information on purchasing behavior: an experimental study. Inf. Syst. Res. 22(2), 254–268 (2011)
Tucker, C.E.: Social networks, personalized advertising, and privacy controls. J. Mark. Res. 51(5), 546–562 (2014)
Tversky, A., Kahneman, D.: Advances in prospect theory: cumulative representation of uncertainty. In: Arló-Costa, H., Hendricks, V.F., Benthem, J. (eds.) Readings in Formal Epistemology. SGTP, vol. 1, pp. 493–519. Springer, Cham (2016). doi:10.1007/978-3-319-20451-2_24
Venkatesh, V., Morris, M.G., Davis, G.B., Davis, F.D.: User acceptance of information technology: toward a unified view. MIS Q. 425–478 (2003)
Vimercati, S.D.C., Foresti, S., Samarati, P.: Data security issues in cloud scenarios. In: Jajodia, S., Mazumdar, C. (eds.) ICISS 2015. LNCS, vol. 9478, pp. 3–10. Springer, Cham (2015). doi:10.1007/978-3-319-26961-0_1
Warren, S.D., Brandeis, L.D.: The right to privacy. Harv. Law Rev. 193–220 (1890)
Westin, A.F.: Privacy and freedom. Wash. Lee Law Rev. 25(1), 166 (1968)
Young, A.L., Quan-Haase, A.: Privacy protection strategies on Facebook: the internet privacy paradox revisited. Inf. Commun. Soc. 16(4), 479–500 (2013)