# Inconclusive Quantum Measurements and Decisions under Uncertainty

^{1}Department of Management, Technology and Economics, ETH Zürich, Zürich, Switzerland^{2}Bogolubov Laboratory of Theoretical Physics, Joint Institute for Nuclear Research, Dubna, Russia^{3}Swiss Finance Institute, University of Geneva, Geneva, Switzerland

We give a mathematical definition for the notion of inconclusive quantum measurements. In physics, such measurements occur at intermediate stages of a complex measurement procedure, with the final measurement result being operationally testable. Since the mathematical structure of Quantum Decision Theory (QDT) has been developed in analogy with the theory of quantum measurements, the inconclusive quantum measurements correspond, in QDT, to intermediate stages of decision making in the process of taking decisions under uncertainty. The general form of the quantum probability for a composite event is the sum of a utility factor, describing a rational evaluation of the considered prospect, and of an attraction factor, characterizing irrational, subconscious attitudes of the decision maker. Despite the involved irrationality, the probability of prospects can be evaluated. This is equivalent to the possibility of calculating quantum probabilities without specifying hidden variables. We formulate a general way of evaluation, based on the use of non-informative priors. As an example, we suggest the explanation of the decoy effect. Our quantitative predictions are in very good agreement with experimental data.

## 1. Introduction

The standard theory of quantum measurements [1] is based on the projection operator measure corresponding to operationally testable events. Simple measurements really have to be operationally testable in order to possess physical meaning. However, if a measurement is composite, consisting of several parts, the intermediate stages do not necessarily have to be operationally testable, but can be inconclusive.

As a typical example, we can recall the known double-slit experiment, when particles pass through a screen with two slits and then are registered by particle detectors some distance away from the screen. This experiment can be treated as a composite event consisting of two parts, one being the passage through one of the slits and second, registration by detectors. The registration of a particle by a detector is an operationally testable event, since the particle is either detected or not, with the result being evident for the observer. But the passage of the particle through one of the slits is not directly observed, and the experimentalist does not know which of the slits the particle has passed through. In that sense, the passage of the particle through a slit is an inconclusive event. The existence of this inconclusive event, occurring at the intermediate stage of the experiment, is intimately associated with an interference effect. Otherwise, if the experimentalist would precisely determine the slit through which the particle has passed, the interference pattern registered by the particle detectors would be destroyed. The existence of interference is precisely due to the presence of the inconclusive event that happened at the intermediate stage.

The occurrence of inconclusive events in decision making is even more frequent and important. Practically any decision, before it is explicitly formulated, passes through a stage of deliberation and hesitation accompanying the choice. That is, any decision is actually a composite event consisting of an intermediate stage of deliberation and of the final stage of taking a decision. The final stage of decision making is equivalent to an operationally testable event in quantum measurements, while the intermediate stage of deliberation is analogous to an inconclusive event.

The analogy between the theory of quantum measurements and decision theory has been mentioned by von Neumann [1]. Following this analogy, Quantum Decision Theory (QDT) has been advanced [2–7], with a mathematical structure that applies equally to decision making and to quantum measurements. The generality of our framework, being equally suitable for quantum measurements and decision making, is its principal difference from all other attempts that employ quantum techniques in psychological sciences. An extensive literature on various quantum models in psychology and cognitive science can be found in the books [8–11] and review articles [12–15].

Any approach applying quantum techniques to decision theory is naturally based on the notion of probability, because quantum theory is intrinsically probabilistic. This intrinsically probabilistic nature of QDT is what makes it principally different from stochastic decision theory, where the choice is treated as always being deterministic, while the decision maker acts with errors in the process of choosing [16–20]. Such stochastic decision theories can be termed “deterministic theories embedded into an environment with stochastic noise.” The standard way of using a stochastic approach is to assume a probability distribution over the values characterizing the errors made by the subjects in the process of decision making. The parameters entering the distribution are then fitted a posteriori to empirical data by maximizing the log-likelihood function. Such a procedure allows one to better fit the given set of data to the assumed basic deterministic decision theory, in particular owing to the introduction of additional fitting parameters. However, it improves the underlying deterministic theory only slightly and does not essentially change its structure. And, being descriptive, the classical stochastic approach does not provide quantitative predictions.

Contrary to classical stochastic theory, in the quantum approach, we do not assume that the choice of a decision maker is deterministic, with just some weak disturbance by errors. Following the general quantum interpretation, we consider the choice process, including deliberations, hesitations, and subconscious estimation of competing outcomes, as intrinsically random. The probabilistic decision, in the quantum case, is not just a stochastic decoration of a deterministic process, but it is an unavoidable random part of any choice. The existence of the hidden, often irrational subconscious feelings and deliberations, results in the appearance of quantum interference and entanglement. The difference between classical stochastic decision theory and QDT is similar to the difference between classical statistical physics and quantum theory. In the former, all processes are assumed to be deterministic, with statistics coming into play because of measurement uncertainties, such as no precise knowledge of initial conditions and the impossibility of measuring exactly the locations and velocities of all particles. In contrast, quantum theory is principally probabilistic, which becomes especially important for composite measurements.

A detailed mathematical theory of quantum measurements in the case of composite events has been developed in our previous papers [21–23]. In the present paper, we concentrate our attention on composite measurements including intermediate inconclusive events and on the application of this notion for characterizing decision making under risk and uncertainty. The importance of composite events, including intermediate inconclusive events, in decision theory makes it necessary to pay a special attention to the correct mathematical formulation of such events and to the description of their properties allowing for the quantitative evaluation of the corresponding quantum probabilities. We show that, despite uncertainty accompanying inconclusive events, it is possible to give quantitative evaluations for quantum probabilities in decision theory, based on non-informative priors. Considering, as an illustration, the decoy effect, we demonstrate that even the simple non-informative priors provide predictions in very good agreement with experimental data.

## 2. Composite Quantum Measurements and Events

In this section, we give a brief summary of the general scheme for defining quantum probabilities for composite events. As we have stressed above, in our approach, the mathematics is the same for describing either quantum measurements or decision making. Therefore, when referring to an event, we can keep in mind either a fact of measurement or a decision action.

Let *A*_{n} be a conclusive operationally testable event labeled by an index *n*. And let *B* = {*B*_{α}} be a set of inconclusive events labeled by α. Defining the space of events as a Hilbert space ${H}$, we associate with an event *A*_{n} a state |*n*〉 in this Hilbert space and an event operator ${\widehat{P}}_{n}$,

The event operator for an operationally testable event is a projector.

The set of inconclusive events *B* generates in the Hilbert space ${H}$ the state |*B*〉 and the event operator ${\widehat{P}}_{B}$,

where the state reads

with coefficients *b*_{α} being random complex numbers. The event operator for an inconclusive event is not necessarily a projector, but a member of a positive operator-valued measure [7, 21–23].
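The display equations here were lost in extraction. In the notation of [7, 21–23], the event operator of the operationally testable event *A*_{n}, the event operator of the inconclusive set *B*, and the state |*B*〉 presumably read:

```latex
\hat{P}_n = | n \rangle \langle n | ,
\qquad
\hat{P}_B = | B \rangle \langle B | ,
\qquad
| B \rangle = \sum_\alpha b_\alpha \, | \alpha \rangle .
```

Since the random coefficients $b_\alpha$ are not required to normalize $|B\rangle$ to unity, $\hat{P}_B$ need not be a projector, in line with the text above.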

The space of events, in the quantum approach, is the Hilbert space

that is a tensor product of the spaces

Each decision maker is characterized by an operator $\widehat{\rho}$ that can be termed the strategic state of a decision maker, which, in quantum theory, corresponds to a statistical operator. The pair $\left\{{H},\widehat{\rho}\right\}$, in physics, is named a statistical ensemble, and in decision theory, it is a decision ensemble.

A composite event is called a prospect and is denoted as

A prospect π_{n} generates a state |π_{n}〉 in the Hilbert space of events ${H}$ and a prospect operator $\widehat{P}({\pi}_{n})$,

with the prospect state

The prospect operator is a member of a positive operator-valued measure, which implies that these operators satisfy the resolution of unity [21, 23]. Since they contain random quantities *b*_{α}, the corresponding random resolution has to be understood not as a direct equality between numbers, but, e.g., as the equality in mean [24].
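Reconstructing the missing displays (Equations 4–7) from the surrounding definitions, the space of events, the prospect, its state, and the prospect operator presumably read:

```latex
\mathcal{H} = \mathcal{H}_A \otimes \mathcal{H}_B ,
\qquad
\pi_n = A_n \otimes B ,
\qquad
| \pi_n \rangle = | n \rangle \otimes | B \rangle ,
\qquad
\hat{P}(\pi_n) = | \pi_n \rangle \langle \pi_n | .
```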

The prospect probability is

with the trace over the space ${H}$. To form a probability measure, the prospect probabilities are to be normalized:
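The prospect probability and its normalization (Equations 8–9), as described in the text, take the form:

```latex
p(\pi_n) = \mathrm{Tr} \, \hat{\rho} \, \hat{P}(\pi_n) ,
\qquad
\sum_n p(\pi_n) = 1 ,
\qquad
0 \leq p(\pi_n) \leq 1 .
```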

Taking explicitly the trace in expression (Equation 8) and separating diagonal and off-diagonal terms, we see that the prospect probability

is represented as a sum of a positive-definite term

and a sign-undefined term

The appearance of the sign-undefined term is a purely quantum effect responsible, in quantum measurements, for interference patterns. The attenuation of this quantum term is called decoherence. In quantum theory, decoherence can be due to external as well as to internal perturbations and the influence of measurements [25–27]. And in QDT, decoherence can occur due to the accumulation of information [28].
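Separating the diagonal and off-diagonal terms of the trace, as prescribed above, gives (schematically, with $|n\alpha\rangle \equiv |n\rangle \otimes |\alpha\rangle$):

```latex
p(\pi_n) = f(\pi_n) + q(\pi_n) ,
\qquad
f(\pi_n) = \sum_{\alpha} |b_\alpha|^2 \,
  \langle n \alpha | \, \hat{\rho} \, | n \alpha \rangle ,
\qquad
q(\pi_n) = \sum_{\alpha \neq \beta} b_\alpha b_\beta^{*} \,
  \langle n \beta | \, \hat{\rho} \, | n \alpha \rangle .
```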

The disappearance of the quantum term implies the transition to classical theory. This is formulated as the *quantum-classical correspondence principle* [29], which in our case reads as

This principle tells us that the term *f*(π_{n}) plays the role of classical probability, hence is to be normalized:

When decisions concern a choice between lotteries, the classical term *f*(π_{n}) has to be defined according to classical decision theory based either on expected utility theory or on some non-expected value functional. This suggests calling *f*(π_{n}) a *utility factor*, since it is defined on rational grounds and reflects the utility of a choice. The quantum term is caused by the interference and entanglement effects in quantum theory, which correspond, in decision making, to irrational effects describing the attractiveness of choice. Therefore, it can be called the *attraction factor*. From Equations (9) and (14) follows the *alternation law*
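The quantum-classical correspondence principle (Equation 13), the normalization of the utility factor (Equation 14), and the alternation law (Equation 15), all stated in the text, read:

```latex
p(\pi_n) \;\rightarrow\; f(\pi_n)
\quad \text{as} \quad q(\pi_n) \rightarrow 0 ,
\qquad
\sum_n f(\pi_n) = 1 ,
\qquad
\sum_n q(\pi_n) = 0 .
```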

Note that, in quantum theory, the definition of the composite event in the form of prospect (Equation 5) is valid for any type of operators, since they are defined on different spaces. No problem with non-commutativity of operators defined on a common Hilbert space arises. This way makes it possible to introduce joint quantum probabilities for several measurements [21, 23]. Contrary to this, considering operators on the same Hilbert space does not allow one to define joint probabilities for non-commuting operators. Sometimes, one treats the Lüders probability of consecutive measurements as a conditional probability. This, however, is not justified from the theoretical point of view [21, 23, 30] and also contradicts experimental data [31, 32]. But defining the quantum joint probability according to expression (Equation 8) contains no contradictions.

In the present section, the general scheme of QDT is presented. Being limited by the length of this paper, we cannot go into all mathematical details that have been thoroughly described in our previous publications. However, we would like to stress that for the purpose of practical applications, it is not necessary to study all these mathematical peculiarities, but it is sufficient to employ the final formulas following the prescribed rules. One can use the formulated rules as given prescriptions, without studying their justification. The main formulas of this section, which are necessary for the following application, are: the form of the quantum probability (Equation 10), the normalization conditions (Equations 9 and 14), and the alternation law (Equation 15). More details required for practical application will be given in the sections below.

## 3. Non-Informative Prior for Utility Factors

To make the above scheme applicable to decision theory, it is necessary to specify how one should quantify the values of the utility factors and attraction factors. Here we show how these values can be defined as non-informative priors.

Let us consider a set of lotteries *L*_{n} = {*x*_{i}, *p*_{n}(*x*_{i}) : *i* = 1, 2, …, *N*_{n}}, enumerated by the index *n* = 1, 2, …, *N*_{L}, with payoffs *x*_{i} and their probabilities *p*_{n}(*x*_{i}). The related expected utilities $U({L}_{n})=\sum _{i}u({x}_{i}){p}_{n}({x}_{i})$ can be defined according to the expected utility theory [33]. For the present consideration, the utility functions *u*(*x*) do not need to be specified. For instance, they can be taken as linear functions, since this choice has the advantage of making the utility factors independent of the units measuring the payoffs.

In QDT, the act of choosing a lottery *L*_{n}, denoted as *A*_{n}, together with the accompanying set of inconclusive events *B*, including the decision-maker hesitations [6, 30], composes the prospect (Equation 5). Depending on whether the expected utilities are positive or negative, two cases arise.

If the expected utilities of the considered set of lotteries are all positive (non-negative), such that

then it is reasonable to require that zero utility corresponds to zero utility factor:

The case where the utility factor is simply proportional to the related expected utility trivially obeys this condition (Equation 17). Taking into account the normalization condition (Equation 14) gives the utility factor
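The missing display (Equation 18), a utility factor proportional to the expected utility and normalized per Equation (14), is:

```latex
f(\pi_n) = \frac{U(L_n)}{\sum_{m} U(L_m)}
\qquad
\left( U(L_n) \geq 0 \right) .
```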

When the expected utilities are negative, which happens in the domain of losses, such that

the required condition is that an infinite loss corresponds to zero utility factor:

The simplest way to satisfy this condition (Equation 20) is that the utility factor is inversely proportional to the related expected utility. Taking into account the normalization condition, we get
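The missing display (Equation 21), a utility factor inversely proportional to the magnitude of the (negative) expected utility and normalized per Equation (14), is:

```latex
f(\pi_n) = \frac{|U(L_n)|^{-1}}{\sum_{m} |U(L_m)|^{-1}}
\qquad
\left( U(L_n) < 0 \right) .
```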

The utility-factor forms (Equations 18 and 21) coincide with the choice probabilities in the Luce choice axiom [34]. It is possible to show that generalized forms for the utility factors can be derived by maximizing a conditional Shannon entropy or from the principle of minimal information [12, 35, 36].

In the case of positive expected utilities, we consider the information functional, taking into account the normalization condition (Equation 14) and the expected log-likelihood Λ. This functional reads as

where

Minimizing functional (Equation 22) results in the utility factor

in which the positive sign of α is prescribed by the condition that the larger utility implies the larger factor.

In the case of negative expected utilities, the information functional takes the form

where

Then its minimization yields the utility factor

with the positive sign of γ prescribed by the requirement that the larger cost implies the smaller factor.

The utility factors (Equations 23 and 25) are the examples of power-law distributions that are known in many applications [35–37].
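Consistent with the power-law form named here, the generalized utility factors (Equations 23 and 25) obtained from the minimization presumably read:

```latex
f(\pi_n) = \frac{U^{\alpha}(L_n)}{\sum_{m} U^{\alpha}(L_m)}
\quad (\alpha > 0) ,
\qquad
f(\pi_n) = \frac{|U(L_n)|^{-\gamma}}{\sum_{m} |U(L_m)|^{-\gamma}}
\quad (\gamma > 0) .
```

The non-informative priors (Equations 18 and 21) are recovered for $\alpha = \gamma = 1$.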

## 4. Non-Informative Prior for Attraction Factors

Although the attraction factor characterizes irrational features of decision making, it can be estimated by invoking non-informative prior assumptions. An important consequence of the latter is the *quarter law* derived earlier [4, 5, 12]. Here we first give a new, probably the simplest, derivation of the quarter law and, second, we show how this law can be used for estimating the attraction factors in the case of an arbitrary number of prospects.

Let us consider the sum

of the attraction factor moduli, where

plays the role of the attraction-factor distribution. The latter is normalized as

since the attraction factors, in view of condition (Equation 15), vary in the interval [−1, 1]. If *q*(π_{n}) does not equal zero, then normalization (Equation 28) is evident. And when *q*(π_{n}) = 0, then one should use the identity $\int_0^{\infty} \delta(x) \, dx = 1/2$ for the semi-integral of the Dirac delta function.

The use of a non-informative prior implies that the values of the attraction factor are not known. A full ignorance is captured by a uniform distribution, which, according to normalization (Equation 28), gives

In that case, integral (Equation 26) results in the *quarter law*
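Putting the pieces of this derivation together (the displays were lost in extraction), the average modulus of the attraction factor, the normalization of the distribution, and the uniform non-informative prior presumably combine as:

```latex
\bar{q} \,\equiv\, \frac{1}{N_L} \sum_{n=1}^{N_L} |q(\pi_n)|
\,=\, \int_0^1 x \, \varphi(x) \, dx ,
\qquad
\int_{-1}^{1} \varphi(x) \, dx = 1 ,
\qquad
\varphi(x) = \frac{1}{2}
\;\Longrightarrow\;
\bar{q} = \int_0^1 \frac{x}{2} \, dx = \frac{1}{4} .
```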

If the prospect lattice ${L}$ = {π_{n}} consists of *N*_{L} prospects, we can always arrange the corresponding attraction factors in the ascending order, such that

We denote the largest attraction factor as

Given the unknown values of the attraction factors, the non-informative prior assumes that they are uniformly distributed and at the same time they must obey the ordering constraint (Equation 31). Then, the joint cumulative distribution of the attraction factors is given by

where the series η_{1} ≤ η_{2} ≤ … ≤ η_{N_L} of inequalities ensures the ordering. It is then straightforward to show that the average values of the *q*(π_{n}) are equidistant, i.e., the difference between any two neighboring factors is on average

Taking their average values as determining their typical values, we omit the symbol 〈.〉 representing the average operator and use Equation (34) to represent the *n*-th attraction factor as

With notations (Equations 32 and 34), the alternation condition (Equation 15) yields

And the quarter law (Equation 30) leads to the gap

If *N*_{L} is even, then

while when *N*_{L} is odd, then

This allows us to represent gap (Equation 37) as

And for the largest attraction factor, we find

The above expressions make it possible to evaluate, on the basis of the non-informative prior, the whole set

of the attraction factors:

For example, in the case of two prospects, we have

which yields the attraction set

For three prospects, we get

hence

Similarly, for four prospects, we find

with the attraction set

When there are five prospects, then

from where

Thus, we can evaluate the attraction factors for any number of prospects, obtaining a kind of quantized attraction set. In the case of an asymptotically large number *N*_{L} of prospects, the gap behaves as Δ ≈ 1/*N*_{L}, while the largest attraction factor tends to 1/2.
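The quantized attraction sets listed above can be checked numerically. The following sketch assumes the gap formulas implied by the quarter law for equidistant factors, Δ = 1/*N*_{L} for even *N*_{L} and Δ = *N*_{L}/(*N*_{L}² − 1) for odd *N*_{L}; the helper name `attraction_set` is ours, not from the paper:

```python
from fractions import Fraction

def attraction_set(n_l):
    """Non-informative-prior attraction factors for n_l prospects.

    The factors are equidistant and centred on zero, so that they obey
    the alternation law (sum = 0) and the quarter law (mean modulus = 1/4).
    """
    # Gap between neighbouring factors.
    if n_l % 2 == 0:
        delta = Fraction(1, n_l)
    else:
        delta = Fraction(n_l, n_l * n_l - 1)
    # Equidistant values: q_n = (n - (n_l + 1)/2) * delta, n = 1..n_l.
    return [Fraction(2 * n - n_l - 1, 2) * delta for n in range(1, n_l + 1)]

for n_l in (2, 3, 4, 5):
    qs = attraction_set(n_l)
    assert sum(qs) == 0                                      # alternation law
    assert sum(abs(q) for q in qs) / n_l == Fraction(1, 4)   # quarter law
    print(n_l, [str(q) for q in qs])  # e.g., 2 ['-1/4', '1/4']
```

Running this reproduces the sets quoted in the text, e.g., {−1/4, 1/4} for two prospects and {−3/8, −1/8, 1/8, 3/8} for four.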

The non-informative priors can be employed for predicting the results of decision making. This makes the principal difference compared with the introduction into expected utility of adjustment parameters that are fitted *post-hoc* to the given experimental data [38].

## 5. Quantitative Explanation of Decoy Effect

We now show how the non-informative priors of the attraction factors can be employed to explain the decoy effect and for quantitative prediction in decision-making. Throughout this section, we denote, for simplicity, the objects of choice, say *A*, as well as the act of choosing an object *A*, by the same letter *A*. As has been emphasized above, the act of choice under uncertainty is a composite prospect. But, again for simplicity, we employ the same letter for denoting the action *A* and the related prospect (Equation 5).

The decoy effect was first studied by Huber et al. [39], who called it the effect of asymmetrically dominated alternatives. Later this effect has been confirmed in a number of experimental investigations [40–43]. The meaning of the decoy effect can be illustrated by the following example. Suppose a buyer is choosing between two objects, *A* and *B*. The object *A* is of better quality, but of higher price, while the object *B* is of slightly lower quality, while less expensive. Since the functional properties of the two objects are not drastically different, while *B* is cheaper, the majority of buyers value the object *B* higher. At this moment, the salesperson mentions that there is a third object *C*, which is of about the same quality as *A*, but of a much higher price than *A*. This causes the buyer to reconsider the choice between the objects *A* and *B*, while the object *C*, having the same quality as *A* but being much more expensive, is of no interest. Choosing now between *A* and *B*, the majority of buyers prefer the higher quality but more expensive object *A*. The object *C*, being not a choice alternative, plays the role of a decoy. Experimental studies confirmed the decoy effect for a variety of objects: cars, microwave ovens, shoes, computers, bicycles, beer, apartments, mouthwash, etc. The same decoy effect exists in the choice of human mates distinguished by attractiveness and sense of humor [44]. It is common for animals as well, for instance, in the choice by female frogs of males with different attraction calls, characterized either by lower frequency and longer duration or by faster call rates [45].

The decoy effect contradicts the regularity axiom in decision making, which states that if *B* is preferred to *A* in the absence of *C*, then this preference has to remain in the presence of *C*.

In the frame of QDT, the decoy effect is explained as follows. Assume buyers consider an object *A*, which is of higher quality but more expensive, and an object *B*, which is of moderate quality but cheaper. Suppose the buyers have evaluated these objects *A* and *B*, which implies that the initial values of the objects are described by the utility factors *f*(*A*) and *f*(*B*). In experiments, the latter correspond to the fractions of buyers evaluating higher the related object. When the decoy *C*, of high quality but essentially more expensive, is presented, it attracts the attention of buyers to the quality characteristic. The role of the decoy is well understood as attracting the attention of buyers to a particular feature of the considered objects, because of which the decoy effect is sometimes named the attraction effect [40]. In the present case, the decoy attracts the buyers' attention to quality. The attraction, induced by the decoy, is described by the attraction factors *q*(*A*) and *q*(*B*). Hence the probabilities of the related choices become *p*(*A*) = *f*(*A*) + *q*(*A*) and *p*(*B*) = *f*(*B*) + *q*(*B*).

Since the quality feature becomes more attractive, *q*(*A*) > *q*(*B*). According to the non-informative prior, we can estimate the attraction factors as *q*(*A*) = 1/4 and *q*(*B*) = −1/4.

To be more precise, let us take numerical values from the experiment of Ariely and Wallsten [43], where the objects under sale are microwave ovens. The evaluation without a decoy results in *f*(*A*) = 0.4 and *f*(*B*) = 0.6. In the presence of the decoy, we predict that the choice probabilities can be evaluated as

This gives *p*(*A*) = 0.40 + 0.25 = 0.65 and *p*(*B*) = 0.60 − 0.25 = 0.35. The experimental values for the choice between *A* and *B* in the presence of the decoy *C*, excluding the choices of *C* itself, correspond to the fractions *p*_{exp}(*A*) = 0.61 and *p*_{exp}(*B*) = 0.39, which is close to the predicted probabilities.
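The arithmetic of this prediction is short enough to check directly; a minimal sketch (the dictionary layout is ours), using the no-decoy utility factors from [43] and the quarter-law attraction factors:

```python
# Decoy-effect prediction in QDT with the non-informative prior (quarter law).
# Utility factors from the no-decoy evaluation of microwave ovens [43]:
f = {"A": 0.40, "B": 0.60}
# The decoy shifts attention toward quality, favouring A:
q = {"A": 0.25, "B": -0.25}

# Prospect probabilities p = f + q.
p = {k: f[k] + q[k] for k in f}
assert abs(sum(p.values()) - 1.0) < 1e-12  # probabilities stay normalized
print(p)
```

This yields *p*(*A*) ≈ 0.65 and *p*(*B*) ≈ 0.35, to be compared with the observed 0.61 and 0.39.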

Another example can be taken from the studies of the frog mate choice [45], where frog males have attraction calls differing in either low-frequency sound or call rate. The males with lower frequency calls are denoted as *A*, while those with the faster call rate as *B*. In an experiment with 80 frog females, without a decoy, it was found that females rank the faster call rate higher, so that *f*(*A*) = 0.35 and *f*(*B*) = 0.65. In the presence of an inferior decoy, attracting attention to the low-frequency characteristic, the non-informative prior predicts the probabilities

The empirically observed fractions, *p*_{exp}(*A*) = 0.6 and *p*_{exp}(*B*) = 0.4, are in remarkable agreement with the predicted *p*(*A*) = 0.35 + 0.25 = 0.60 and *p*(*B*) = 0.65 − 0.25 = 0.40.

To make it clear how the decoy effect fits the title of the paper “Inconclusive quantum measurements and decisions under uncertainty,” it is worth extending the comments that have been mentioned in the Introduction.

Our principal point of view is that decision making, generally, almost always deals with composite events, since any choice is accompanied by subconscious feelings and irrational biases. The latter are often difficult to formalize and, moreover, their weights usually are not known and are practically unmeasurable. This is why these subconscious irrational factors can be treated as what is called inconclusive events. When choosing between several possibilities, say *A*_{n}, one actually considers composite prospects, as defined in Equation (5). And the composite nature of choices requires the use of quantum techniques, as has been explained in our previous paper [30]. Otherwise, the probabilities of simple events could be characterized by classical theory. It is the composite nature of the considered prospects that yields the appearance of the quantum term *q*(π_{n}) related to interference and coherence effects. In that way, the choice between the objects in the decoy effect is also a composite prospect, being composed of the choice as such and accompanying subconscious feelings forming an inconclusive set. This is why the use of QDT here is necessary and why it gives such good results.

It is admissible to give a schematic picture of the choice in the decoy effect by analogy with the double-slit experiment in physics, which is mentioned in the Introduction. Thus, making a concrete selection of either an object *A* or *B* is the analog of the registration of the particle by a detector. But before such a selection is done, there exists the uncertainty of deciding which of the object features are actually more important. These not precisely defined acts of hesitation play the role of the slits, with the uncertainty associated with which of them the particle has passed through. When it is known which of the slits the particle has passed through, then the interference effects in physics disappear. Similarly, in decision theory, if the values of each object are clearly defined, there are no hesitations, no interference, and the selection can be based on classical rules. Such an objective evaluation in the decoy effect happens in the absence of any decoy, when a decision maker rationally evaluates the features of the given objects, say quality and price. The appearance of a decoy induces hesitations concerning which of the features are actually more important. These hesitations before the choice are the analogs of the uncertainty about which slit the traveling particle will pass through. The uncertainty results in the interference and the arising quantum term, whether in the registration of a particle or in the final choice of a decision maker.

## 6. Discussion

We have presented a mathematical formulation for the concept of inconclusive quantum measurements and events. In physics, this type of measurement occurs at intermediate stages of composite measuring procedures, while the final measurement stage is operationally testable. In decision making, inconclusive events correspond to the intermediate stage of deliberations. Invoking non-informative priors, it is possible to estimate the prospect probabilities and thus to predict the results of decision making.

Generally, invoking more information on the properties of the attraction factor, it is possible to define its form more accurately than the value given by non-informative prior. For example, from condition (Equation 9) it follows that

Hence, for a positive *q*(π_{n}), we have

While for a negative *q*(π_{n}), we get

Therefore, the attraction factor has to satisfy the limits
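Spelled out, the chain of bounds described in this paragraph follows from $0 \leq p(\pi_n) \leq 1$ in condition (Equation 9):

```latex
0 \leq f(\pi_n) + q(\pi_n) \leq 1
\;\Longrightarrow\;
q(\pi_n) \leq 1 - f(\pi_n) \;\; \bigl(q(\pi_n) > 0\bigr) ,
\quad
|q(\pi_n)| \leq f(\pi_n) \;\; \bigl(q(\pi_n) < 0\bigr) ,
```

hence $- f(\pi_n) \leq q(\pi_n) \leq 1 - f(\pi_n)$.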

This suggests that the absolute value of the attraction factor can be modeled by an expression proportional to

with μ and ν being positive parameters and the sign defined by the ambiguity and risk aversion principle [4–6, 12]. More detailed study of such a form will be given in a separate paper.

But it turns out that even the simple non-informative prior provides a rather good estimate, allowing for quantitative predictions in decision making. And we have illustrated the approach by the decoy effect, for which the non-informative priors yield quantitative predictions in very good agreement with empirical data.

In this paper, decision making by separate subjects is considered. We think that the theory can be generalized by considering societies of decision makers. The exchange of information in a society should certainly influence the decisions of the society members. To develop a theory of many agents, it is necessary to generalize the approach by treating a dynamical model of agents exchanging information. Then, we think, it would be feasible to describe the behavior of the agents operating in a financial market and taking decisions about buying or selling shares in the presence of information asymmetry. And it would be possible to explain the known stylized facts in financial markets, such as, for example, the fat tails of return distributions, and volatility clustering, as well as transient bubbles and crashes, which are connected with herding behavior. Some first results in that direction are reported in our previous papers [6, 28], where the role of additional information, received by decision makers, is analyzed and it is shown that the amount of the additional information essentially influences the value of the quantum term. Further work on the generalization of the approach toward a dynamical theory of decision-maker societies is in progress.

## Author Contributions

All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.

## Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

## Acknowledgments

Financial support from the ETH Zürich Risk Center is appreciated. We are grateful to E.P. Yukalova for discussions.

## References

1. von Neumann J. *Mathematical Foundations of Quantum Mechanics*. Princeton, NJ: Princeton University (1955).

2. Yukalov VI, Sornette D. Quantum decision theory as quantum theory of measurement. *Phys Lett A* (2008) **372**:6867–71. doi: 10.1016/j.physleta.2008.09.053

3. Yukalov VI, Sornette D. Physics of risk and uncertainty in quantum decision making. *Eur Phys J B* (2009a) **71**:533–48. doi: 10.1140/epjb/e2009-00245-9

4. Yukalov VI, Sornette D. Mathematical structure of quantum decision theory. *Adv Complex Syst.* (2010) **13**:659–96. doi: 10.1142/S0219525910002803

5. Yukalov VI, Sornette D. Decision theory with prospect interference and entanglement. *Theor Decis.* (2011) **70**:283–328. doi: 10.1007/s11238-010-9202-y

6. Yukalov VI, Sornette D. Manipulating decision making of typical agents. *IEEE Trans Syst Man Cybern Syst.* (2014a) **44**:1155–68. doi: 10.1109/TSMC.2014.2314283

7. Yukalov VI, Sornette D. Positive operator-valued measures in quantum decision theory. *Lect Notes Comput Sci.* (2015a) **8951**:146–61. doi: 10.1007/978-3-319-15931-7_12

9. Busemeyer JR, Bruza P. *Quantum Models of Cognition and Decision*. Cambridge: Cambridge University Press (2012).

12. Yukalov VI, Sornette D. Processing information in quantum decision theory. *Entropy* (2009b) **11**:1073–120. doi: 10.3390/e11041073

13. Sornette D. Physics and financial economics (1776–2014): puzzles, Ising and agent-based models. *Rep Prog Phys.* (2014) **77**:062001. doi: 10.1088/0034-4885/77/6/062001

14. Ashtiani M, Azgomi MA. A survey of quantum-like approaches to decision making and cognition. *Math Soc Sci.* (2015) **75**:49–80. doi: 10.1016/j.mathsocsci.2015.02.004

15. Haven E, Khrennikov A. Quantum probability and mathematical modelling of decision making. *Philos Trans R Soc A* (2016) **374**:20150105. doi: 10.1098/rsta.2015.0105

16. Hey JD. Experimental investigations of errors in decision making under risk. *Eur Econ Rev.* (1995) **39**:633–40. doi: 10.1016/0014-2921(09)40007-4

17. Loomes G, Sugden R. Testing different stochastic specifications of risky choice. *Economica* (1998) **65**:581–98. doi: 10.1111/1468-0335.00147

18. Carbone E, Hey JD. Which error story is best? *J Risk Uncert.* (2000) **20**:161–76. doi: 10.1023/A:1007829024107

19. Hey JD. Why we should not be silent about noise. *Exp Econ.* (2005) **8**:325–45. doi: 10.1007/s10683-005-5373-8

20. Conte A, Hey JD, Moffatt PG. Mixture models of choice under risk. *J Econometr.* (2011) **162**:79–88. doi: 10.1016/j.jeconom.2009.10.011

21. Yukalov VI, Sornette D. Quantum probabilities of composite events in quantum measurements with multimode states. *Laser Phys.* (2013) **23**:105502. doi: 10.1088/1054-660X/23/10/105502

22. Yukalov VI, Sornette D. Quantum theory of measurements as quantum decision theory. *J Phys Conf Ser.* (2015b) **594**:012048. doi: 10.1088/1742-6596/594/1/012048

23. Yukalov VI, Sornette D. Quantum probability and quantum decision making. *Philos Trans R Soc A* (2016) **374**:20150100. doi: 10.1098/rsta.2015.0100

25. Yukalov VI. Equilibration and thermalization in finite quantum systems. *Laser Phys Lett.* (2011) **8**:485–507. doi: 10.1002/lapl.201110002

26. Yukalov VI. Equilibration of quasi-isolated quantum systems. *Phys Lett A* (2012a) **376**:550–4. doi: 10.1016/j.physleta.2011.11.015

27. Yukalov VI. Decoherence and equilibration under nondestructive measurements. *Ann Phys.* (2012b) **327**:253–63. doi: 10.1016/j.aop.2011.09.009

28. Yukalov VI, Sornette D. Role of information in decision making of social agents. *Int J Inf Technol Dec Mak.* (2015c) **14**:1129–66. doi: 10.1142/S0219622014500564

30. Yukalov VI, Sornette D. Conditions for quantum interference in cognitive sciences. *Top Cogn Sci.* (2014b) **6**:79–90. doi: 10.1111/tops.12065

31. Boyer-Kassem T, Duchêne S, Guerci E. Testing quantum-like models of judgment for question order effects. arXiv:1501.04901 (2015a).

32. Boyer-Kassem T, Duchêne S, Guerci E. *Quantum-Like Models Cannot Account for the Conjunction Fallacy*. Working Paper, Tilburg University, Tilburg (2015b).

33. von Neumann J, Morgenstern O. *Theory of Games and Economic Behavior*. Princeton, NJ: Princeton University Press (1953).

35. Frank SA. The common patterns of nature. *J Evol Biol.* (2009) **22**:1563–85. doi: 10.1111/j.1420-9101.2009.01775.x

36. Batty M. Space, scale, and scaling in entropy maximizing. *Geogr Anal.* (2010) **42**:395–421. doi: 10.1111/j.1538-4632.2010.00800.x

37. Saichev A, Malevergne Y, Sornette D. *Theory of Zipf's Law and Beyond*. Heidelberg: Springer (2010).

38. Siniscalchi M. Vector expected utility and attitudes toward variation. *Econometrica* (2009) **77**:801–55. doi: 10.3982/ECTA7564

39. Huber J, Payne JW, Puto C. Adding asymmetrically dominated alternatives: violations of regularity and the similarity hypothesis. *J Consum Res.* (1982) **9**:90–8. doi: 10.1086/208899

40. Simonson I. Choice based on reasons: the case of attraction and compromise effects. *J Consum Res.* (1989) **16**:158–74. doi: 10.1086/209205

41. Wedell DH. Distinguishing among models of contextuality induced preference reversals. *J Exp Psychol Learn Mem Cogn.* (1991) **17**:767–78. doi: 10.1037/0278-7393.17.4.767

42. Tversky A, Simonson I. Context-dependent preferences. *Manag Sci.* (1993) **39**:1179–89. doi: 10.1287/mnsc.39.10.1179

43. Ariely D, Wallsten TS. Seeking subjective dominance in multidimensional space: an explanation of the asymmetric dominance effect. *Org Behav Human Decis Process.* (1995) **63**:223–32. doi: 10.1006/obhd.1995.1075

44. Bateson M, Healy SD. Comparative evaluation and its implications for mate choice. *Trends Ecol Evol.* (2005) **20**:659–64. doi: 10.1016/j.tree.2005.08.013

Keywords: quantum measurements, decision theory, inconclusive events, quantum probability, non-informative priors, decoy effect

Citation: Yukalov VI and Sornette D (2016) Inconclusive Quantum Measurements and Decisions under Uncertainty. *Front. Phys*. 4:12. doi: 10.3389/fphy.2016.00012

Received: 25 January 2016; Accepted: 24 March 2016;

Published: 14 April 2016.

Edited by:

Emmanuel E. Haven, University of Leicester, UK

Reviewed by:

Jan Sladkowski, The University of Silesia, Poland

Salvatore Miccichè, Università degli Studi di Palermo, Italy

Copyright © 2016 Yukalov and Sornette. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Vyacheslav I. Yukalov, yukalov@theor.jinr.ru