915 results for: Database search, Evidential value, Bayesian decision theory, Influence diagrams


Relevance: 40.00%

Abstract:

This study focuses on empirical investigations of trader behavior, using three different methodologies to test its various aspects. The first methodology applies Prospect Theory to characterize trader behavior during periods of extreme wealth contraction. Secondly, a threshold model is formulated to examine the sentiment variable, and thirdly the contagion effect on trader behavior is studied. The connection between consumers' sense of financial well-being, or sentiment, and stock market performance has been studied at length. However, without data on actual versus experimental performance, implications based on this relationship are meaningless. The empirical agenda included examining a proprietary file of daily trader activities over a five-year period. Overall, during periods of extreme wealth-altering conditions, traders "satisfice" rather than choose the "best" alternative. A trader's degree of loss aversion depends on his/her prior investment performance. A model that explains the behavior of traders during periods of turmoil is developed, informed by Prospect Theory and the data file. Additional research included testing a model that permitted the data to signal the crisis through a threshold. The third empirical study investigated the existence of contagion caused by declining global wealth effects, using evidence from the mining industry in Canada. Contagion, where a financial crisis begins locally and subsequently spreads elsewhere, has been studied in terms of correlations among similar regions. The results provide support for Prospect Theory in two of the three empirical studies. The dissertation emphasizes the need to specify precise, testable models of investors' expectations by providing tools to identify paradoxical behavior patterns. True advances in this field must include empirical research using reliable data sources, to mitigate data-mining problems and to allow researchers to distinguish between expectations-based and risk-based explanations of behavior. Through this type of research, it may be possible to systematically exploit "irrational" market behavior.
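
The Prospect Theory machinery behind the first study can be made concrete with Kahneman and Tversky's value function, under which losses loom larger than gains relative to a reference point. A minimal Python sketch; the parameter values are the conventional Tversky-Kahneman (1992) estimates, not the dissertation's fitted values:

    # Prospect Theory value function: concave over gains, convex and
    # steeper over losses, evaluated relative to a reference point
    # such as the trader's prior wealth.
    def pt_value(x, alpha=0.88, beta=0.88, lam=2.25):
        """Tversky-Kahneman (1992) value function; x is the gain or loss
        relative to the reference point, and lam > 1 encodes loss aversion."""
        if x >= 0:
            return x ** alpha
        return -lam * (-x) ** beta

    # A $100 loss is felt roughly 2.25 times as strongly as a $100 gain:
    print(pt_value(100.0))   # ~57.5
    print(pt_value(-100.0))  # ~-129.5

Loss aversion of this kind is one mechanism consistent with traders "satisficing" rather than optimizing when wealth is contracting.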

Relevance: 40.00%

Abstract:

The first essay developed a respondent model of Bayesian updating for the double-bounded dichotomous choice (DB-DC) contingent valuation methodology. I demonstrated by way of data simulations that current DB-DC identifications of true willingness-to-pay (WTP) may often fail given this respondent Bayesian updating context. Further simulations demonstrated that a simple extension of current DB-DC identifications, derived explicitly from the Bayesian updating behavioral model, can correct for much of the WTP bias. Additional results cautioned against viewing respondents as acting strategically toward the second bid. Finally, an empirical application confirmed the simulation outcomes. The second essay applied a hedonic property value model to a unique water quality (WQ) dataset for a year-round, urban, coastal housing market in South Florida, and found evidence that various WQ measures affect waterfront housing prices in this setting. However, the results indicated that this relationship is not consistent across the six particular WQ variables used, and furthermore depends on the specific descriptive statistic employed to represent the WQ measure in the empirical analysis. These results underscore the need to better understand both the WQ measure and the statistical form of it that homebuyers use in making their purchase decisions. The third essay addressed a limitation of existing hurricane evacuation models by developing a dynamic model of hurricane evacuation behavior. A household's evacuation decision was framed as an optimal stopping problem in which, at every potential evacuation time period prior to actual hurricane landfall, the household's optimal choice is either to evacuate or to wait one more period for a revised hurricane forecast. A hypothetical two-period model of evacuation and a realistic multi-period model incorporating actual forecast and evacuation cost data for my designated Gulf of Mexico region were developed for the dynamic analysis. Results from the multi-period model were calibrated against existing evacuation timing data from a number of hurricanes. Given the calibrated dynamic framework, a number of policy questions that plausibly affect the timing of household evacuations were analyzed, yielding a deeper understanding of existing empirical outcomes regarding the timing of the evacuation decision.
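
The Bayesian-updating mechanism the first essay builds on can be sketched with a conjugate Normal model: the respondent treats the first bid as a noisy signal about the good's value, so the belief interrogated by the second bid is no longer the original WTP. This toy version is my own illustration of that anchoring-style logic, not the essay's actual respondent model, and all numbers are made up:

    # Respondent holds a Normal prior over his/her own WTP and treats
    # the first bid as a noisy signal of value, so the second DB-DC
    # response reflects an updated belief rather than the original WTP.
    mu0, tau0 = 50.0, 15.0   # prior mean and sd of the WTP belief
    sigma_b = 20.0           # perceived noise in the bid-as-signal

    def posterior_wtp(bid1, mu=mu0, tau=tau0, sd=sigma_b):
        """Conjugate Normal update of the WTP belief after seeing bid1."""
        prec = 1.0 / tau**2 + 1.0 / sd**2
        mean = (mu / tau**2 + bid1 / sd**2) / prec
        return mean, (1.0 / prec) ** 0.5

    # A high opening bid pulls the belief, and hence the second response,
    # upward; this is the bias the essay's corrected identification targets.
    print(posterior_wtp(80.0))  # posterior mean ~60.8, sd ~12.0

Under this behavior, identification strategies that read the second response as a statement about the original WTP will be biased in the direction of the first bid.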

Relevance: 40.00%

Abstract:

This study investigated the utility of the Story Model for decision making at the jury level by examining the influence of evidence order and deliberation style on story consistency and guilt judgments. Participants were shown a videotaped trial stimulus and then provided case perceptions, including a guilt judgment and a narrative about what occurred during the incident. Participants then deliberated for approximately thirty minutes, using either an evidence-driven or a verdict-driven deliberation style, before again providing case perceptions, including a guilt determination, a narrative about what happened during the incident, and an evidence recognition test. Multi-level regression analyses revealed that evidence order, deliberation style, and sample interacted to influence both the story consistency measures and guilt. Among students, participants in the verdict-driven deliberation condition formed more consistent pro-prosecution stories when the prosecution presented its case in story order, while participants in the evidence-driven deliberation condition formed more consistent pro-prosecution stories when the defense's case was presented in story order. Findings were the opposite among community members: participants in the verdict-driven deliberation condition formed more consistent pro-prosecution stories when the defense's case was presented in story order, and participants in the evidence-driven deliberation condition formed more consistent pro-prosecution stories when the prosecution's case was presented in story order. Additionally, several story consistency measures influenced guilt decisions. Thus, there is some support for the hypothesis that story consistency mediates the influence of evidence order and deliberation style on guilt decisions.

Relevance: 40.00%

Abstract:

Most advertising research has focused on examining the effects of advertising on attitudinal responses or on brand preference and choice. However, in a natural environment, the time period between advertising exposure and purchase decision is filled with prepurchase search. Prepurchase external search refers to information search from sources other than memory prior to making a purchase decision. Usually consumers access only a small subset of the available information and base their choice decisions on it. Prepurchase search therefore acts as a filter, and the final choice depends critically on the small subset of potential inputs the consumer notes in the environment and integrates into the decision. Previous research has identified a variety of factors that affect consumers' prepurchase search behavior. However, there is little understanding of how the specific advertisements designed by marketers impact consumers' prepurchase search. A marketer would like consumers to search information that reflects favorably on his/her brand. Hence, s/he would attempt to influence the brands and attributes on which consumers seek information prior to making a choice. The dissertation investigates the process by which a particular marketer's advertising influences consumers' search on available brands, i.e., the marketer's brand and competing brands. The dissertation considers a situation where exposure to advertising occurs prior to seeking information from any other source; hence, the impact of advertising on subsequent search behavior is the topic of interest. The dissertation develops a conceptual model of advertising effects on brand search and conducts two experiments to test the tenets of this model. Specifically, the dissertation demonstrates that the attitudinal responses generated by advertising mediate advertising effects on search attitudes and behaviors. The dissertation goes on to examine how attitudinal responses generated by advertising, and their subsequent effects on search, alter brand preference and choice.

Relevance: 40.00%

Abstract:

Aging as a social phenomenon is shaped by modes of production and reproduction; that is, it is linked to the peculiarities and conventions of the social structure that influence the values and meanings constructed around it. People's understanding of this reality comes through the appropriation of knowledge and information which, in contemporary times, are increasingly tied to the media, to the point that the media go beyond mere mediation and become instruments that direct attitudes and produce opinions. Among media instruments, news outlets are important means of disseminating information and, consequently, of producing meanings, including about aging. Thus, the objectives are: to apprehend the social representations and meanings associated with aging in the media space; and to examine those social representations and their influence on the relations established in the socio-economic and cultural context. The Theory of Social Representation is used for this purpose. To collect the data, 57 online news items from the three main state newspapers were studied: Tribuna do Norte, Gazeta do Povo and Jornal de Hoje. The items were retrieved through the newspapers' site search tools using the words "aging" and "elderly", and were analyzed using Bardin's qualitative content analysis, which allowed the establishment of five categories: Aging and violence; Aging in contemporary times; Aging and health; Aging and citizenship; and Aging, work and action. In the first category, news items report situations of violence in which, regardless of whether the elderly person is the victim or the one charged with the violence, the frailty of the elderly persists as the frame. In Aging in contemporary times, the news media attempt to explain demographic change, the quantitative increase in the elderly population, and the burden it may place on the full development of the country. In Aging and health, old age appears as a synonym for disease and infirmity. In the fourth category, Aging and citizenship, situations are shown in which the peculiarities and needs of the elderly must be turned into obligations to be fulfilled, exposing the low expression and social power of this group. Finally, Aging, work and action brings together situations indicating that the elderly are not expected to interact with new technologies or to participate in society's decision-making. Overall, this analysis revealed how the newspapers produce meanings about aging: they tend to represent aging through situations framed by hegemonic interests, building a social representation of the elderly person as fragile, submissive, inactive, subject to violence and susceptible to illness.

Relevance: 40.00%

Abstract:

The Ultimatum Game is a Game Theory methodology used to investigate individuals' cooperative behavior in situations involving the division of resources. Studies have shown that about half of subjects do not accept an unfair division of resources, preferring to bear a momentary cost to take revenge on those who cheat them. However, people with assertiveness impairments, such as socially phobic individuals, may have difficulty rejecting unfair offers, especially in situations that cause excessive anxiety, such as being in the presence of an individual perceived to hold a high hierarchical position. A negative perception of one's own worth can also make a person feel undeserving of a fair division. Such individuals also have a strong desire to convey a positive impression to others, which could lead them to be more generous when dividing resources. The aim of this study was to verify, through the Ultimatum Game, whether socially anxious individuals would accept more unfair offers from a high-status confederate than from a low-status confederate, and whether they would be more generous in dividing goods in the same game when compared to individuals without social anxiety. Ninety-five (95) college students participated, completing the Social Phobia Inventory, the Factorial Scale of Extroversion, a socio-demographic questionnaire, a situational anxiety scale and, finally, the Ultimatum Game in four rounds (1st and 3rd: a confederate of high or low rank making an unfair proposal; 2nd: a confederate without social status making a fair proposal; 4th: the research subject proposes the offer). The results showed significant negative correlations between social anxiety and haughtiness and between social anxiety and assertiveness, and a significant positive correlation between social anxiety and situational anxiety. There was no significant difference in situational anxiety due to status for anxious individuals, and no significant difference in the amount of goods donated, showing that generous behavior did not differ between groups. Finally, social status did not influence anxious individuals' decisions in response to the game. These results corroborate other studies showing relationships between social anxiety and assertiveness, and between social anxiety and a negative self-perception of capability and worth (low haughtiness). As the situational anxiety scale results show, the high-status stimulus was not perceived as threatening, which may have affected responses in the game. The Ultimatum Game results follow the same direction as the acceptance rate for unfair proposals (approximately 50%) found in studies with non-clinical samples.
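
For readers unfamiliar with the paradigm, the responder side of the Ultimatum Game can be simulated in a few lines. This is a generic threshold-responder toy, not the present study's design; the threshold parameters are chosen only so that unfair (20%) offers are accepted about half the time, matching the non-clinical acceptance rate cited above:

    import random

    # Toy Ultimatum Game responder: accept an offer when its share of
    # the pie clears a personal fairness threshold drawn per encounter.
    random.seed(1)

    def accepts(offer_share, threshold_mean=0.2, threshold_sd=0.08):
        threshold = random.gauss(threshold_mean, threshold_sd)
        return offer_share >= threshold

    unfair = sum(accepts(0.2) for _ in range(10_000)) / 10_000
    fair   = sum(accepts(0.5) for _ in range(10_000)) / 10_000
    print(f"unfair (20%) accepted: {unfair:.0%}, fair (50%) accepted: {fair:.0%}")

The study's manipulations (confederate status, social anxiety) would then enter as shifts in the fairness threshold or in the offers proposed.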

Relevance: 40.00%

Abstract:

This concise essay attempts to show why Isaak Illich Rubin is, to this day, the best interpreter, commentator and developer of Karl Marx's Capital, understanding Marx's work as an ontology and a gnoseology of the capitalist economic system. To do this, we analyze the relations between Marx, Rubin and the theory of science of the Spanish Marxist philosopher Gustavo Bueno. In this way, we can interpret Rubin's "Essays on Marx's Theory of Value" as likewise an ontology and a gnoseology of capitalism.

Relevance: 40.00%

Abstract:

Many modern applications fall into the category of "large-scale" statistical problems, in which both the number of observations n and the number of features or parameters p may be large. Many existing methods focus on point estimation, despite the continued relevance of uncertainty quantification in the sciences, where the number of parameters to estimate often exceeds the sample size even given the huge increases in n typically seen in many fields. The tendency in some areas of industry to dispense with traditional statistical analysis on the basis that "n = all" is thus of little relevance outside of certain narrow applications. The main result of the Big Data revolution in most fields has instead been to make computation much harder without reducing the importance of uncertainty quantification. Bayesian methods excel at uncertainty quantification, but often scale poorly relative to alternatives. This conflict between the statistical advantages of Bayesian procedures and their substantial computational disadvantages is perhaps the greatest challenge facing modern Bayesian statistics, and is the primary motivation for the work presented here.

Two general strategies for scaling Bayesian inference are considered. The first is the development of methods that lend themselves to faster computation, and the second is design and characterization of computational algorithms that scale better in n or p. In the first instance, the focus is on joint inference outside of the standard problem of multivariate continuous data that has been a major focus of previous theoretical work in this area. In the second area, we pursue strategies for improving the speed of Markov chain Monte Carlo algorithms, and characterizing their performance in large-scale settings. Throughout, the focus is on rigorous theoretical evaluation combined with empirical demonstrations of performance and concordance with the theory.

One topic we consider is modeling the joint distribution of multivariate categorical data, often summarized in a contingency table. Contingency table analysis routinely relies on log-linear models, with latent structure analysis providing a common alternative. Latent structure models lead to a reduced rank tensor factorization of the probability mass function for multivariate categorical data, while log-linear models achieve dimensionality reduction through sparsity. Little is known about the relationship between these notions of dimensionality reduction in the two paradigms. In Chapter 2, we derive several results relating the support of a log-linear model to nonnegative ranks of the associated probability tensor. Motivated by these findings, we propose a new collapsed Tucker class of tensor decompositions, which bridge existing PARAFAC and Tucker decompositions, providing a more flexible framework for parsimoniously characterizing multivariate categorical data. Taking a Bayesian approach to inference, we illustrate empirical advantages of the new decompositions.
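
To make the tensor language concrete: a PARAFAC/latent-class model writes the joint pmf of p categorical variables as a mixture of k product-multinomial components, i.e. a nonnegative rank-k tensor. A minimal sketch of assembling such a tensor (this illustrates only the basic structure; the collapsed Tucker class proposed in Chapter 2 generalizes it):

    import numpy as np

    # PARAFAC / latent-class form of a joint pmf:
    # P(y1=c1,...,yp=cp) = sum_h nu_h * prod_j lambda_j[h, c_j]
    rng = np.random.default_rng(0)
    p, d, k = 3, 4, 2                      # variables, levels, latent classes
    nu = rng.dirichlet(np.ones(k))         # latent class weights
    lam = [rng.dirichlet(np.ones(d), size=k) for _ in range(p)]

    def joint_pmf(nu, lam):
        """Assemble the full probability tensor from its rank-k factors."""
        tensor = None
        for h in range(len(nu)):
            component = nu[h] * lam[0][h]
            for j in range(1, len(lam)):
                component = np.multiply.outer(component, lam[j][h])
            tensor = component if tensor is None else tensor + component
        return tensor

    P = joint_pmf(nu, lam)
    print(P.shape, P.sum())  # (4, 4, 4) 1.0, a valid joint pmf

The full d^p table is thus parameterized by k(1 + p(d-1)) - 1 free parameters rather than d^p - 1, which is the sense in which the factorization reduces dimension.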

Latent class models for the joint distribution of multivariate categorical data, such as the PARAFAC decomposition, play an important role in the analysis of population structure. In this context, the number of latent classes is interpreted as the number of genetically distinct subpopulations of an organism, an important factor in the analysis of evolutionary processes and conservation status. Existing methods focus on point estimates of the number of subpopulations and lack robust uncertainty quantification. Moreover, whether the number of latent classes in these models is even an identified parameter is an open question. In Chapter 3, we show that when the model is properly specified, the correct number of subpopulations can be recovered almost surely. We then propose an alternative method for estimating the number of latent subpopulations that provides good quantification of uncertainty, and we provide a simple procedure for verifying that the proposed method is consistent for the number of subpopulations. The performance of the model in estimating the number of subpopulations and in other common population structure inference problems is assessed in simulations and a real data application.

In contingency table analysis, sparse data is frequently encountered for even modest numbers of variables, resulting in non-existence of maximum likelihood estimates. A common solution is to obtain regularized estimates of the parameters of a log-linear model. Bayesian methods provide a coherent approach to regularization, but are often computationally intensive. Conjugate priors ease computational demands, but the conjugate Diaconis-Ylvisaker priors for the parameters of log-linear models do not give rise to closed-form credible regions, complicating posterior inference. In Chapter 4 we derive the optimal Gaussian approximation to the posterior for log-linear models with Diaconis-Ylvisaker priors, and provide convergence rate and finite-sample bounds for the Kullback-Leibler divergence between the exact posterior and the optimal Gaussian approximation. We demonstrate empirically in simulations and a real data application that the approximation is highly accurate, even in relatively small samples. The proposed approximation provides a computationally scalable and principled approach to regularized estimation and approximate Bayesian inference for log-linear models.
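
As a generic illustration of the object studied in Chapter 4, a Gaussian approximation to a posterior, here is a Laplace-style construction for a one-parameter Poisson log-linear toy model with a flat prior. This is not the chapter's optimal-KL derivation and does not use the Diaconis-Ylvisaker prior; it only shows what "approximate the posterior by a Gaussian" means operationally:

    import numpy as np
    from scipy.optimize import minimize_scalar

    # Toy model: y_i ~ Poisson(exp(theta)), flat prior on theta.
    rng = np.random.default_rng(0)
    y = rng.poisson(lam=np.exp(1.2), size=50)

    def neg_log_post(theta):
        # negative log-posterior up to an additive constant
        return -(np.sum(y) * theta - len(y) * np.exp(theta))

    mode = minimize_scalar(neg_log_post).x
    var = 1.0 / (len(y) * np.exp(mode))  # inverse curvature at the mode
    print(f"Gaussian approximation: N({mode:.3f}, {var:.5f})")

Chapter 4's contribution is to replace this kind of heuristic with the KL-optimal Gaussian and to bound the Kullback-Leibler divergence to the exact posterior at finite sample sizes.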

Another challenging and somewhat non-standard joint modeling problem is inference on tail dependence in stochastic processes. In applications where extreme dependence is of interest, data are almost always time-indexed. Existing methods for inference and modeling in this setting often cluster extreme events or choose window sizes with the goal of preserving temporal information. In Chapter 5, we propose an alternative paradigm for inference on tail dependence in stochastic processes with arbitrary temporal dependence structure in the extremes, based on the idea that the information on strength of tail dependence and the temporal structure in this dependence are both encoded in waiting times between exceedances of high thresholds. We construct a class of time-indexed stochastic processes with tail dependence obtained by endowing the support points in de Haan's spectral representation of max-stable processes with velocities and lifetimes. We extend Smith's model to these max-stable velocity processes and obtain the distribution of waiting times between extreme events at multiple locations. Motivated by this result, a new definition of tail dependence is proposed that is a function of the distribution of waiting times between threshold exceedances, and an inferential framework is constructed for estimating the strength of extremal dependence and quantifying uncertainty in this paradigm. The method is applied to climatological, financial, and electrophysiology data.
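
The basic inferential object here, waiting times between exceedances of a high threshold, is easy to extract from data. A toy extraction follows; the max-stable velocity construction and the estimators themselves are well beyond this sketch:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_t(df=3, size=100_000)   # heavy-tailed toy series

    u = np.quantile(x, 0.99)                 # high threshold
    exceed_times = np.flatnonzero(x > u)
    waits = np.diff(exceed_times)            # waiting times between exceedances

    # With no extremal clustering, waits are near-geometric; strong tail
    # dependence shows up as an excess of short waiting times.
    print(f"{exceed_times.size} exceedances, mean wait {waits.mean():.0f}, "
          f"median wait {np.median(waits):.0f}")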

The remainder of this thesis focuses on posterior computation by Markov chain Monte Carlo (MCMC), the dominant paradigm for posterior computation in Bayesian analysis. It has long been common to control computation time by making approximations to the Markov transition kernel, yet comparatively little attention has been paid to convergence and estimation error in the resulting approximating Markov chains. In Chapter 6, we propose a framework for assessing when to use approximations in MCMC algorithms, and how much error in the transition kernel should be tolerated to obtain optimal estimation performance with respect to a specified loss function and computational budget. The results require only ergodicity of the exact kernel and control of the kernel approximation accuracy. The theoretical framework is applied to approximations based on random subsets of data, low-rank approximations of Gaussian processes, and a novel approximating Markov chain for discrete mixture models.
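
A minimal instance of the setup: replace the exact full-data kernel with a cheaper approximate kernel, here built from one fixed random subset of the data, and ask what estimation error the approximation introduces. This Normal-mean toy is my own illustration of the trade-off, not the chapter's framework:

    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(loc=2.0, scale=1.0, size=100_000)
    subset = rng.choice(data, size=1_000, replace=False)  # kernel approximation
    scale_up = data.size / subset.size

    def approx_log_post(theta):
        # subset-based surrogate for the full-data log-posterior (flat prior)
        return scale_up * np.sum(-0.5 * (subset - theta) ** 2)

    theta, chain = 0.0, []
    for _ in range(5_000):
        prop = theta + rng.normal(scale=0.02)
        if np.log(rng.uniform()) < approx_log_post(prop) - approx_log_post(theta):
            theta = prop
        chain.append(theta)

    # The approximate chain targets a posterior centered on the subset
    # mean, so its error is controlled by the subset size; Chapter 6 asks
    # how much such error is worth tolerating for a given compute budget.
    print(f"approximate mean {np.mean(chain[1000:]):.3f} vs "
          f"full-data mean {data.mean():.3f}")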

Data augmentation Gibbs samplers are arguably the most popular class of algorithm for approximately sampling from the posterior distribution for the parameters of generalized linear models. The truncated Normal and Polya-Gamma data augmentation samplers are standard examples for probit and logit links, respectively. Motivated by an important problem in quantitative advertising, in Chapter 7 we consider the application of these algorithms to modeling rare events. We show that when the sample size is large but the observed number of successes is small, these data augmentation samplers mix very slowly, with a spectral gap that converges to zero at a rate at least proportional to the reciprocal of the square root of the sample size up to a log factor. In simulation studies, moderate sample sizes result in high autocorrelations and small effective sample sizes. Similar empirical results are observed for related data augmentation samplers for multinomial logit and probit models. When applied to a real quantitative advertising dataset, the data augmentation samplers mix very poorly. Conversely, Hamiltonian Monte Carlo and a type of independence chain Metropolis algorithm show good mixing on the same dataset.
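
For concreteness, here is the truncated-Normal (Albert-Chib style) data augmentation sampler for an intercept-only probit model in the rare-event regime the chapter studies. The rare-event numbers are illustrative; the point is that successive draws of theta are highly autocorrelated:

    import numpy as np
    from scipy.stats import truncnorm

    rng = np.random.default_rng(0)
    n, successes = 10_000, 20                # large n, few successes
    y = np.zeros(n)
    y[:successes] = 1.0

    theta, draws = 0.0, []
    for _ in range(500):
        # 1. Augment: z_i ~ N(theta, 1) truncated to z > 0 if y_i = 1,
        #    and to z < 0 if y_i = 0.
        lo = np.where(y == 1, -theta, -np.inf)
        hi = np.where(y == 1, np.inf, -theta)
        z = theta + truncnorm.rvs(lo, hi, size=n, random_state=rng)
        # 2. Update: theta | z ~ N(mean(z), 1/n) under a flat prior.
        theta = rng.normal(z.mean(), 1.0 / np.sqrt(n))
        draws.append(theta)

    # theta creeps toward the probit intercept (about -2.88 for 20/10000)
    # in small, highly autocorrelated steps; this slow mixing is what the
    # spectral-gap result quantifies.
    print(draws[::100])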

Relevance: 40.00%

Abstract:

Advances in three related areas, state-space modeling, sequential Bayesian learning, and decision analysis, are addressed, together with the statistical challenges of scalability and associated dynamic sparsity. The key theme that ties the three areas together is Bayesian model emulation: solving challenging analysis and computational problems using creative model emulators. This idea drives theoretical and applied advances in non-linear, non-Gaussian state-space modeling, dynamic sparsity, decision analysis, and statistical computation, across the linked contexts of multivariate time series and dynamic network studies. Examples and applications in financial time series and portfolio analysis, macroeconomics, and internet studies from computational advertising demonstrate the utility of the core methodological innovations.

Chapter 1 summarizes the three areas/problems and the key idea of emulation in those areas. Chapter 2 discusses the sequential analysis of latent threshold models, using emulating models that allow for analytical filtering to enhance the efficiency of posterior sampling. Chapter 3 examines the emulator model in decision analysis, i.e., the synthetic model that is equivalent to the loss function in the original minimization problem, and shows its performance in the context of sequential portfolio optimization. Chapter 4 describes a method for modeling streaming count data observed on a large network that relies on emulating the whole, dependent network model with independent, conjugate sub-models customized to each set of flows. Chapter 5 reviews these advances and makes concluding remarks.
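
The Chapter 4 idea in miniature: rather than one dependent model for all flows on the network, each flow gets its own conjugate model that can be updated in closed form as counts stream in. For a single flow, a Gamma-Poisson pair suffices; the discount factors and the decouple/recouple machinery of the chapter are omitted here, and all numbers are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    counts = rng.poisson(lam=12.0, size=200)   # streaming counts on one flow

    a, b = 1.0, 0.1                            # Gamma(a, b) prior on the flow rate
    for y in counts:
        a, b = a + y, b + 1.0                  # closed-form conjugate update per count
    print(f"posterior mean rate: {a / b:.2f} (simulation truth 12.0)")

Because each flow's update is independent and O(1), the emulator scales to large networks by construction; dependence across flows is then reintroduced at the recoupling stage described in the chapter.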

Relevance: 40.00%

Abstract:

Humans are natural politicians. We obsessively collect social information that is both observable (e.g., about third-party relationships) and unobservable (e.g., about others’ psychological states), and we strategically employ that information to manage our cooperative and competitive relationships. To what extent are these abilities unique to our species, and how did they evolve? The present dissertation seeks to contribute to these two questions. To do so, I take a comparative perspective, investigating social decision-making in humans’ closest living relatives, bonobos and chimpanzees. In Chapter 1, I review existing literature on theory of mind—or the ability to understand others’ psychological states—in these species. I also present a theoretical framework to guide further investigation of social cognition in bonobos and chimpanzees based on hypotheses about the proximate and ultimate origins of their species differences. In Chapter 2, I experimentally investigate differences in the prosocial behavior of bonobos and chimpanzees, revealing species-specific prosocial motivations that appear to be less flexible than those exhibited by humans. In Chapter 3, I explore through decision-making experiments bonobos’ ability to evaluate others based on their prosocial or antisocial behavior during third-party interactions. Bonobos do track the interactions of third-parties and evaluate actors based on these interactions. However, they do not exhibit the human preference for those who are prosocial towards others, instead consistently favoring an antisocial individual. The motivation to prefer those who demonstrate a prosocial disposition may be a unique feature of human psychology that contributes to our ultra-cooperative nature. In Chapter 4, I investigate the adaptive value of social cognition in wild primates. I show that the recruitment behavior of wild chimpanzees at Gombe National Park, Tanzania is consistent with the use of third-party knowledge, and that those who appear to use third-party knowledge receive immediate proximate benefits. They escape further aggression from their opponents. These findings directly support the social intelligence hypothesis that social cognition has evolved in response to the demands of competing with one’s own group-mates. Thus, the studies presented here help to better characterize the features of social decision-making that are unique to humans, and how these abilities evolved.

Relevance: 40.00%

Abstract:

Marketers have long looked for observables that could explain differences in consumer behavior. Initial attempts centered on demographic factors such as age, gender, and race. Although such variables provide some useful information for segmentation (Bass, Tigert, and Lonsdale 1968), more recent studies have shown that variables tapping into consumers' social classes and personal values have more predictive accuracy and also provide deeper insights into consumer behavior. I argue that one demographic construct, religion, merits further consideration as a factor that has a profound impact on consumer behavior. In this dissertation, I focus on two types of religious guidance that may influence consumer behaviors: religious teachings (being content with one's belongings) and religious problem-solving styles (reliance on God).

Essay 1 focuses on the well-established endowment effect and introduces a new moderator (religious teachings on contentment) that influences both owners' and buyers' pricing behaviors. Through fifteen experiments, I demonstrate that when people are primed with religion or are characterized by stronger religious beliefs, they value their belongings more than people who are not primed with religion or who have weaker religious beliefs. These effects are driven by religious teachings on being content with one's belongings, which lead to the overvaluation of one's own possessions.

Essay 2 focuses on self-control behaviors, specifically healthy eating, and introduces a new moderator (God's role in the decision-making process) that determines the relationship between religiosity and the healthiness of food choices. My findings demonstrate that consumers who indicate that they defer to God in their decision-making make unhealthier food choices as their religiosity increases; the opposite is true for consumers who rely entirely on themselves. Importantly, this relationship is mediated by the consumer's consideration of future consequences. This essay offers an explanation for the existing mixed findings on the relationship between religiosity and obesity.

Relevance: 40.00%

Abstract:

Periods of drought and low streamflow can have profound impacts on both human and natural systems. People depend on a reliable source of water for numerous reasons, including potable water supply and the production of economic value through agriculture or energy. Aquatic ecosystems likewise depend on water, in addition to the economic benefits they provide to society through ecosystem services. Given that periods of low streamflow may become more extreme and frequent in the future, it is important to study the factors that control water availability during these times. In the absence of precipitation, the slower hydrological response of groundwater systems plays an amplified role in water supply. Understanding the variability of the fraction of streamflow contributed by baseflow, or groundwater, during periods of drought provides insight into what future water availability may look like and how it can best be managed. The Mills River Basin in North Carolina is chosen as a case study to test this understanding. First, a physically meaningful estimate of baseflow is obtained from USGS streamflow data via computerized hydrograph analysis techniques. Time series methods, including wavelet analysis, are then applied to highlight signals of non-stationarity and to evaluate changes in variance, in order to better understand the natural variability of baseflow and low flows. In addition to natural variability, human influence must be taken into account to accurately assess how the combined system reacts to periods of low flow. Defining a combined demand, consisting of both natural and human demand, allows a more rigorous assessment of the level of sustainable use of a shared resource, in this case water. The analysis of baseflow variability can differ with regional location and local hydrogeology, but it was found that baseflow varies on scales ranging from multiyear, such as those associated with ENSO (3.5 and 7 years), up to multidecadal, with most of the contributing variance coming from decadal or multiyear scales. It was also found that the behavior of baseflow, and consequently water availability, depends a great deal on overall precipitation, the tracks of hurricanes or tropical storms and associated climate indices, as well as on physiography and hydrogeology. Using the Duke Combined Hydrology Model (DCHM), reasonably accurate estimates of streamflow during periods of low flow were obtained, in part due to the model's ability to capture subsurface processes. Being able to accurately simulate streamflow levels and subsurface interactions during periods of drought can be very valuable to water suppliers and decision makers, and ultimately benefits citizens. Knowledge of future droughts and periods of low flow, combined with tracking of customer demand, will allow water suppliers to adopt better management practices, such as knowing when to withdraw more water during a surplus so that stress on the system is minimized when water supply is not ample.
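
One standard automated hydrograph-separation technique is the Lyne-Hollick recursive digital filter, sketched below. Whether this matches the particular computerized technique applied to the USGS data in this study is an assumption; it is shown only to illustrate what baseflow separation computes, with alpha = 0.925 the conventional filter parameter:

    import numpy as np

    def baseflow_lyne_hollick(q, alpha=0.925):
        """Split a daily streamflow series q into (baseflow, quickflow)."""
        quick = np.zeros_like(q, dtype=float)
        for t in range(1, len(q)):
            quick[t] = alpha * quick[t - 1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
            quick[t] = min(max(quick[t], 0.0), q[t])  # keep 0 <= quickflow <= q
        return q - quick, quick

    # Baseflow index: the fraction of total flow supplied by baseflow.
    q = np.array([10, 9, 8, 20, 35, 18, 12, 10, 9, 8], dtype=float)
    base, quick = baseflow_lyne_hollick(q)
    print(f"baseflow index: {base.sum() / q.sum():.2f}")

A high baseflow index during dry periods indicates that groundwater, rather than recent precipitation, is sustaining streamflow, which is the quantity whose variability the wavelet analysis then decomposes across time scales.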