926 results for posterior choice model


Relevance:

90.00%

Publisher:

Abstract:

The typical daily decision-making process of individuals using a transport system involves mainly three types of decisions: mode choice, departure time choice and route choice. This paper focuses on the mode and departure time choice processes and studies different specifications for a combined mode and departure time choice model. The paper compares different sets of explanatory variables as well as different model structures to capture the correlation among alternatives and the taste variations among commuters. The main hypothesis tested is that departure time alternatives are also correlated through the amount of delay they involve. Correlation among the alternatives is confirmed by analyzing different nesting structures as well as error component formulations. Random coefficient logit models confirm the presence of random taste heterogeneity across commuters. Mixed nested logit models are estimated to jointly account for the random taste heterogeneity and the correlation among alternatives. Results indicate that accounting for random taste heterogeneity as well as inter-alternative correlation improves model performance.
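
The abstract does not give the model's functional form; as a rough illustration of how a nesting structure induces correlation among related alternatives, the sketch below computes two-level nested logit probabilities for a joint mode and departure-time choice set. All utilities and nesting parameters are invented for illustration and are not taken from the paper.

```python
"""Minimal nested logit sketch: joint mode x departure-time choice.

All utilities and nesting parameters below are invented for illustration;
they are not estimates from the paper.
"""
import numpy as np

# Systematic utilities V for (mode, departure-time) alternatives.
V = {
    ("car", "peak"): -1.0, ("car", "off-peak"): -0.6,
    ("bus", "peak"): -1.4, ("bus", "off-peak"): -1.1,
}

# Nest the alternatives by mode; lambda < 1 implies correlation within a nest.
nests = {"car": [("car", "peak"), ("car", "off-peak")],
         "bus": [("bus", "peak"), ("bus", "off-peak")]}
lam = {"car": 0.6, "bus": 0.6}

def nested_logit_probs(V, nests, lam):
    """Return P(alt) = P(nest) * P(alt | nest) for a two-level nested logit."""
    inclusive = {}          # logsum (inclusive value) of each nest
    cond = {}               # conditional probabilities within each nest
    for m, alts in nests.items():
        scaled = np.array([V[a] / lam[m] for a in alts])
        logsum = np.log(np.exp(scaled).sum())
        inclusive[m] = lam[m] * logsum
        cond[m] = dict(zip(alts, np.exp(scaled - logsum)))
    top = np.array([inclusive[m] for m in nests])
    p_nest = dict(zip(nests, np.exp(top) / np.exp(top).sum()))
    return {a: p_nest[m] * cond[m][a] for m in nests for a in nests[m]}

if __name__ == "__main__":
    for alt, p in nested_logit_probs(V, nests, lam).items():
        print(alt, round(p, 3))
```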

Relevance:

90.00%

Publisher:

Abstract:

This research improved the measurement of public transport accessibility by capturing travellers' behaviour, the diversity of public transport modes, and the subjectivity of travellers' decisions in complex transport networks. The results not only highlighted the importance of considering public transport network characteristics but also revealed the impact of public transport diversity in the modelling of public transport accessibility. The research developed a hybrid discrete choice model with a nested logit structure to treat the correlation among public transport mode choices and a logit correction factor to rectify the correlation among stop choices.
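
One common way to turn a mode-choice model into an accessibility indicator is the logsum (expected maximum utility) measure; the research's actual hybrid specification is not reproduced here, and the utilities below are invented purely to show the calculation.

```python
"""Sketch of a logsum (expected maximum utility) accessibility measure.

The utilities below are invented; the research's actual hybrid model
specification is not reproduced here.
"""
import numpy as np

def logsum_accessibility(utilities):
    """Expected maximum utility over the available public transport options."""
    v = np.asarray(utilities, dtype=float)
    return np.log(np.exp(v).sum())

# Accessibility of a zone served by train, busway, and ordinary bus (toy values).
with_train = logsum_accessibility([-1.2, -1.8, -2.0])
without_train = logsum_accessibility([-1.8, -2.0])
print("accessibility gain from the train option:", with_train - without_train)
```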

Relevance:

90.00%

Publisher:

Abstract:

The aim of this dissertation is to provide conceptual tools for the social scientist for clarifying, evaluating and comparing explanations of social phenomena based on formal mathematical models. The focus is on relatively simple theoretical models and simulations, not statistical models. These studies apply a theory of explanation according to which explanation is about tracing objective relations of dependence, knowledge of which enables answers to contrastive why- and how-questions. This theory is developed further by delineating criteria for evaluating competing explanations and by applying the theory to social scientific modelling practices and to the key concepts of equilibrium and mechanism. The dissertation comprises an introductory essay and six published original research articles. The main theses about model-based explanations in the social sciences argued for in the articles are the following. 1) The concept of explanatory power, often used to argue for the superiority of one explanation over another, encompasses five dimensions which are partially independent and involve some systematic trade-offs. 2) Not all equilibrium explanations causally explain the obtaining of the end equilibrium state from the multiple possible initial states. Instead, they often constitutively explain the macro property of the system with the micro properties of the parts (together with their organization). 3) There is an important ambivalence in the concept of mechanism used in many model-based explanations, and this ambivalence corresponds to a difference between two alternative research heuristics. 4) Whether unrealistic assumptions in a model (such as a rational choice model) are detrimental to an explanation provided by the model depends on whether the representation of the explanatory dependency in the model is itself dependent on the particular unrealistic assumptions. Thus evaluating whether a literally false assumption in a model is problematic requires specifying exactly what is supposed to be explained and by what. 5) The question of whether an explanatory relationship depends on particular false assumptions can be explored with the process of derivational robustness analysis, and the importance of robustness analysis accounts for some of the puzzling features of the tradition of model-building in economics. 6) The fact that economists have been relatively reluctant to use true agent-based simulations to formulate explanations can partially be explained by the specific ideal of scientific understanding implicit in the practice of orthodox economics.

Relevance:

90.00%

Publisher:

Abstract:

In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and ambitiously extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and model complexity. There is now a plethora of models, based on different assumptions and applicable in differing contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.

We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles or lotteries. Different decision-making theories evaluate the choices differently and would make distinct predictions about the subjects' choices. Theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on using classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), that sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which inform the next most informative test to run. BROAD utilizes the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees against the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, both theoretically and experimentally, that these popular criteria can, surprisingly, perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
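
As a hedged sketch of the idea behind EC2-style test selection (not the BROAD implementation itself), the following toy example scores candidate tests by the expected weight of hypothesis-graph edges they cut, assuming deterministic predictions and omitting the noisy-response machinery described above. The hypotheses, predictions and priors are invented.

```python
"""Hedged sketch of EC2-style greedy test selection (noise-free version).

This is a simplified reconstruction, not the BROAD implementation: each
candidate "hypothesis" deterministically predicts a binary choice on every
test, and the noisy-response handling of the paper is omitted.
"""
from itertools import combinations

# predictions[h][t] = choice (0 or 1) that hypothesis h predicts on test t.
predictions = {
    "expected_value":  [0, 0, 1, 1],
    "prospect_theory": [1, 0, 1, 0],
    "crra":            [1, 1, 0, 0],
}
prior = {"expected_value": 0.4, "prospect_theory": 0.35, "crra": 0.25}
# Here every theory is its own equivalence class; in BROAD a class would group
# many parameterizations of the same theory.
klass = {h: h for h in predictions}

def edge_weight(alive):
    """Total weight of edges joining surviving hypotheses in different classes."""
    return sum(prior[a] * prior[b]
               for a, b in combinations(alive, 2) if klass[a] != klass[b])

def expected_cut(test, alive):
    """Expected edge weight removed by running `test`, under the prior."""
    before = edge_weight(alive)
    total, norm = 0.0, sum(prior[h] for h in alive)
    for outcome in (0, 1):
        survivors = [h for h in alive if predictions[h][test] == outcome]
        p_outcome = sum(prior[h] for h in survivors) / norm
        total += p_outcome * (before - edge_weight(survivors))
    return total

alive = list(predictions)
best_test = max(range(4), key=lambda t: expected_cut(t, alive))
print("most informative first test:", best_test)
```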

We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA) and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries chosen. Aggregate posterior probabilities over the theories show limited evidence in favour of CRRA and moments models. Classifying the subjects into types showed that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e. subjects could mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out since it is infeasible in practice and since we do not find any signatures of it in our data.
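
For concreteness, a toy valuation of a single lottery under three of the compared theory classes might look as follows; the parameter values (rho, alpha, lambda, gamma) are illustrative defaults, not estimates from these experiments, and the probability weighting is the simple non-cumulative form.

```python
"""Toy valuation of a lottery under three of the compared theory classes.

All parameter values below are illustrative, not estimates from the thesis.
"""
import numpy as np

outcomes = np.array([40.0, -20.0])   # gains/losses relative to the endowment
probs = np.array([0.5, 0.5])

def expected_value(x, p):
    return float(np.dot(p, x))

def crra_utility(x, p, wealth=100.0, rho=0.7):
    """CRRA over final wealth; requires wealth + x > 0."""
    w = wealth + x
    u = (w ** (1 - rho) - 1) / (1 - rho)
    return float(np.dot(p, u))

def prospect_value(x, p, alpha=0.88, lam=2.25, gamma=0.61):
    """Prospect-theory value: S-shaped value function + probability weighting."""
    ax = np.abs(x) ** alpha
    v = np.where(x >= 0, ax, -lam * ax)          # losses loom larger (lambda > 1)
    w = p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)
    return float(np.dot(w, v))

print(expected_value(outcomes, probs),
      crra_utility(outcomes, probs),
      prospect_value(outcomes, probs))
```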

In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, "present bias" models (quasi-hyperbolic (α, β) discounting and fixed cost discounting), and generalized hyperbolic discounting. 40 subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for present bias models and hyperbolic discounting, and most subjects were classified as generalized hyperbolic discounting types, followed by exponential discounting.
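
The competing discount functions can be written down compactly; the sketch below uses common textbook forms with invented parameters (and beta-delta notation for the quasi-hyperbolic model), not the thesis's estimates.

```python
"""Discount factors for the competing time-preference models (toy parameters).

The parameter values are illustrative, not the thesis's estimates; the
quasi-hyperbolic form is written here in beta-delta notation.
"""
import numpy as np

t = np.arange(0, 11, dtype=float)      # delay in, say, weeks

exponential    = 0.95 ** t                                 # D(t) = delta^t
hyperbolic     = 1.0 / (1.0 + 0.25 * t)                    # D(t) = 1 / (1 + k t)
quasi_hyper    = np.where(t == 0, 1.0, 0.7 * 0.95 ** t)    # "present bias" beta-delta
gen_hyperbolic = (1.0 + 0.25 * t) ** (-0.6 / 0.25)         # D(t) = (1 + a t)^(-b/a)

for name, d in [("exponential", exponential), ("hyperbolic", hyperbolic),
                ("quasi-hyperbolic", quasi_hyper),
                ("generalized hyperbolic", gen_hyperbolic)]:
    print(f"{name:>22}: D(1)={d[1]:.3f}  D(10)={d[10]:.3f}")
```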

In these models the passage of time is linear. We instead consider a psychological model where the perception of time is subjective. We prove that when the biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.

We also test the predictions of behavioural theories in the "wild". We focus on prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in a way distinctly different from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is offered at a discount, the demand for it will be greater than that explained by its price elasticity. Even more importantly, when the item is no longer discounted, demand for its close substitute would increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications of consumer loss aversion and strategies for competitive pricing.
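
A minimal sketch of this mechanism, assuming a logit demand model with a linear price term plus a reference-dependent gain/loss term, is shown below; the coefficients, prices and reference prices are invented rather than taken from the retailer data.

```python
"""Sketch of logit demand with a loss-averse, reference-dependent price term.

Functional form and parameters are illustrative only; they are not the
estimates obtained from the eCommerce data described above.
"""
import numpy as np

def loss_averse_utility(price, ref_price, beta_price=-0.08, beta_gl=0.05, lam=2.0):
    """Linear price utility plus a gain/loss term around the reference price."""
    gap = ref_price - price                      # positive = perceived discount
    gain_loss = gap if gap >= 0 else lam * gap   # losses loom larger (lambda > 1)
    return beta_price * price + beta_gl * gain_loss

def logit_shares(utilities):
    u = np.asarray(utilities, dtype=float)
    e = np.exp(u - u.max())
    return e / e.sum()

# Item A returns from a discount (its reference price is still the discounted one);
# its close substitute B then looks relatively attractive and captures extra share.
u_during = [loss_averse_utility(8.0, 10.0), loss_averse_utility(10.0, 10.0)]
u_after  = [loss_averse_utility(10.0, 8.0), loss_averse_utility(10.0, 10.0)]
print("shares during discount:   ", logit_shares(u_during))
print("shares after discount ends:", logit_shares(u_after))
```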

In future work, BROAD could be applied to testing other behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, could be used to eliminate hypotheses more rapidly and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.

Relevance:

90.00%

Publisher:

Abstract:

In many environmental valuation applications, standard sample sizes for choice modelling surveys are impractical to achieve. One can improve data quality by using more in-depth surveys administered to fewer respondents. We report on a study using high-quality rank-ordered data elicited with the best-worst approach. The resulting "exploded logit" choice model, estimated on 64 responses per person, was used to study visitors' willingness to pay for the external benefits of policies which maintain the cultural heritage of alpine grazing commons. We find evidence supporting this approach and reasonable estimates of mean WTP, which appear theoretically valid and policy informative.
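
For illustration, the "exploded logit" decomposes a full ranking into a sequence of standard logit choices over the alternatives not yet ranked; the sketch below computes that likelihood for one hypothetical ranking with invented utilities.

```python
"""Minimal 'exploded logit' (rank-ordered logit) likelihood for one full ranking.

The ranking is decomposed into a sequence of standard logit choices, each made
from the alternatives not yet ranked. The utilities below are invented.
"""
import numpy as np

def exploded_logit_loglik(ranking, utilities):
    """Log-likelihood of observing `ranking` (best to worst) given utilities."""
    loglik = 0.0
    remaining = list(utilities.keys())
    for alt in ranking:
        v = np.array([utilities[a] for a in remaining])
        chosen = remaining.index(alt)
        loglik += v[chosen] - np.log(np.exp(v).sum())
        remaining.remove(alt)
    return loglik

utilities = {"policy_A": 0.8, "policy_B": 0.3, "status_quo": 0.0}
print(exploded_logit_loglik(["policy_A", "status_quo", "policy_B"], utilities))
```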

Relevance:

90.00%

Publisher:

Abstract:

After the “European” experience of BSE and further food safety crises, consumer trust is playing an increasingly important role in political and marketing decision making. This also applies to consumer acceptance of GM food. This paper integrates consumer trust with the theory of planned behavior and a stated choice model to gain a more complete picture of consumer decision making. Preliminary results indicate that when GM products offer practical benefits to consumers, acceptance may increase considerably. Furthermore, both trust and perceived benefits contribute significantly to explaining the level of acceptance.

Relevance:

90.00%

Publisher:

Abstract:

We hypothesise that differences in people's attitudes and personality traits lead them to attribute varying importance to environmental considerations, safety, comfort, convenience and flexibility. Differences in personality traits can be revealed not only in individuals' choice of transport, but also in other actions of their everyday lives, such as how much they recycle and whether they take precautions or avoid dangerous pursuits. Conditioning on a set of exogenous individual characteristics, we use indicators of attitudes and personality traits to form latent variables for inclusion in an otherwise standard discrete mode choice model. With a sample of Swedish commuters, we find that attitudes towards flexibility and comfort, as well as being pro-environmentally inclined, influence the individual's choice of mode. Although modal time and cost are still important, it follows that there are other ways, apart from economic incentives, to attract individuals to the public transport modes that are desirable from society's perspective. Our results should provide useful information to policy-makers and transportation planners developing sustainable transportation systems.

Relevance:

90.00%

Publisher:

Abstract:

Background: The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms, which allow one to obtain parameter posterior distributions based on simulations, without likelihood computations.

Results: Here we present ABCtoolbox, a series of open-source programs to perform Approximate Bayesian Computation (ABC). It implements various ABC algorithms, including rejection sampling, MCMC without likelihood, a particle-based sampler and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can interact with most simulation and summary-statistics computation programs. The usability of ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates and to find that males show smaller population sizes but much higher levels of migration than females.

Conclusion: ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis, from parameter sampling from prior distributions and data simulation, through the computation of summary statistics and the estimation of posterior distributions, to model choice, validation of the estimation procedure, and visualization of the results.
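
As a generic illustration of the ABC rejection idea (this is not ABCtoolbox itself and does not use its input format), the following sketch keeps prior draws whose simulated summary statistic falls close to the observed one.

```python
"""Generic ABC rejection-sampling sketch (not ABCtoolbox itself).

Toy problem: infer the success probability of a binomial sample by keeping
prior draws whose simulated summary statistic lies close to the observed one.
"""
import numpy as np

rng = np.random.default_rng(0)
observed_successes = 37          # observed summary statistic (out of n trials)
n_trials, n_sims, tolerance = 100, 100_000, 2

theta = rng.uniform(0.0, 1.0, size=n_sims)       # draws from the prior
simulated = rng.binomial(n_trials, theta)        # simulate data for each draw
accepted = theta[np.abs(simulated - observed_successes) <= tolerance]

print(f"accepted {accepted.size} draws; "
      f"posterior mean ~ {accepted.mean():.3f}, 95% interval ~ "
      f"{np.percentile(accepted, [2.5, 97.5]).round(3)}")
```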

Relevance:

90.00%

Publisher:

Abstract:

Integrated choice and latent variable (ICLV) models represent a promising new class of models which merge classic choice models with the structural equation modelling (SEM) approach for latent variables. Despite their conceptual appeal, applications of ICLV models in marketing remain rare. We extend previous ICLV applications by first estimating a multinomial choice model and, second, by estimating hierarchical relations between latent variables. An empirical study on travel mode choice clearly demonstrates the value of ICLV models for enhancing the understanding of choice processes. In addition to the usually studied, directly observable variables such as travel time, we show how abstract motivations such as power and hedonism, as well as attitudes such as a desire for flexibility, impact travel mode choice. Furthermore, we show that it is possible to estimate such a complex ICLV model with the widely available structural equation modelling package Mplus. This finding is likely to encourage more widespread application of this appealing model class in the marketing field.
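
A minimal way to picture the ICLV structure is as a data-generating process in which one latent variable feeds both the indicators and the utility of an alternative; the simulation sketch below (not an estimation routine, and not Mplus syntax) uses invented coefficients.

```python
"""Data-generating sketch of an ICLV structure (simulation only, not estimation).

A latent motivation drives both survey indicators (measurement part) and the
utility of the transit alternative (choice part). All coefficients are invented.
"""
import numpy as np

rng = np.random.default_rng(1)
n = 2_000

# Structural part: latent "flexibility motivation" depends on an observed covariate.
age = rng.normal(40, 10, n)
latent = 0.03 * (age - 40) + rng.normal(0, 1, n)

# Measurement part: three Likert-style indicators load on the latent variable.
indicators = np.column_stack([
    1.0 * latent + rng.normal(0, 0.8, n),
    0.8 * latent + rng.normal(0, 0.8, n),
    0.6 * latent + rng.normal(0, 0.8, n),
])

# Choice part: binary logit between car and transit; the latent variable enters
# the transit utility alongside travel time.
time_car, time_transit = rng.normal(25, 5, n), rng.normal(35, 8, n)
v_transit = 0.5 - 0.06 * time_transit + 0.7 * latent
v_car = -0.06 * time_car
p_transit = 1.0 / (1.0 + np.exp(-(v_transit - v_car)))
choice = rng.random(n) < p_transit

print("share choosing transit:", choice.mean().round(3))
```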

Relevance:

90.00%

Publisher:

Abstract:

Formation of deletions by recombination between short direct repeats is thought to involve either a break-join or a copy-choice process. The key step of the latter is slippage of the replication machinery between the repeats. We report that the main replicase of Escherichia coli, DNA polymerase III holoenzyme, slips between two direct repeats of 27 bp that flank an inverted repeat of approximately 300 bp. Slippage was detected in vitro, on a single-stranded DNA template, in a primer extension assay. It requires the presence of a short (8 bp) G+C-rich sequence at the base of a hairpin that can form by annealing of the inverted repeats. It is stimulated by (i) a high salt concentration, which might stabilize the hairpin, and (ii) two proteins that ensure the processivity of the DNA polymerase III holoenzyme: the single-stranded DNA binding protein and the beta subunit of the polymerase. Slippage is rather efficient under optimal reaction conditions, as it can take place on >50% of template molecules. This observation supports the copy-choice model for recombination between short direct repeats.

Relevance:

80.00%

Publisher:

Abstract:

Following Youngjohn, Lees-Haley, and Binder's (1999) comment on Johnson and Lesniak-Karpiak's (1997) study that warnings lead to more subtle malingering, researchers have sought to better understand warning effects. However, such studies have been largely atheoretical and may have confounded warning and coaching. This study examined the effect on malingering of a warning based on criminological-sociological concepts derived from the rational choice model of deterrence theory. A total of 78 participants were randomly assigned to a control group, an unwarned simulator group, or one of two warned simulator groups. The warning groups comprised low- and high-level conditions depending on warning intensity. Simulator participants received no coaching about how to fake tests. Outcome variables were scores derived from the Test of Memory Malingering and the Wechsler Memory Scale-III. When the rate of malingering was compared across the four groups, a high-level warning effect was found such that warned participants were significantly less likely to exaggerate than unwarned simulators. In an exploratory follow-up analysis, the warned groups were divided into those who reported malingering and those who did not, and the performance of these groups was compared to that of unwarned simulators and controls. Using this approach, results showed that participants who were deterred from malingering by the warning performed no worse than controls. However, on a small number of tests, self-reported malingerers in the low-level warning group appeared less impaired than unwarned simulators. This pattern was not observed in the high-level warning condition. Although cautious interpretation of the findings is necessitated by the exploratory nature of some analyses, the overall results suggest that a carefully designed warning may be useful for reducing the rate of malingering. The combination of some noteworthy effect sizes, despite low power and the small size of some groups, suggests that the effects of warnings warrant further investigation to determine them more fully.

Relevance:

80.00%

Publisher:

Abstract:

This paper presents a study on estimating the latent demand for rail transit in the Australian context. Based on travel mode-choice modelling, a two-stage analysis approach is proposed, comprising market population identification and mode share estimation. A case study is conducted on the Midland-Fremantle rail transit corridor in Perth, Western Australia. The required data mainly include journey-to-work trip data from the Australian Bureau of Statistics Census 2006 and the work-purpose mode-choice model in the Perth Strategic Transport Evaluation Model. The market profile is analysed, including catchment areas, market population, mode shares, mode-specific trip distributions and average trip distances. A numerical simulation is performed to test the sensitivity of transit ridership to changes in fuel price. A corridor-level transit demand function of fuel price is thus obtained and its elasticity characteristics are discussed. This study explores a viable approach to developing a decision-support tool for assessing the short-term impacts of policy and operational adjustments on corridor-level demand for rail transit.
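
The elasticity reported by such a demand function can be summarised with a simple arc (midpoint) elasticity; the sketch below uses hypothetical ridership and fuel-price numbers, not the paper's results.

```python
"""Arc-elasticity sketch for a corridor-level transit demand function.

The ridership and fuel-price numbers are hypothetical; the paper's estimated
demand function is not reproduced here.
"""
def arc_elasticity(q0, q1, p0, p1):
    """Midpoint (arc) elasticity of demand q with respect to price p."""
    return ((q1 - q0) / ((q0 + q1) / 2)) / ((p1 - p0) / ((p0 + p1) / 2))

# Simulated corridor ridership before/after a fuel price rise (toy values).
ridership_base, ridership_new = 21_000, 22_300     # daily transit boardings
fuel_base, fuel_new = 1.40, 1.70                   # $/litre

print("cross-elasticity of transit demand w.r.t. fuel price:",
      round(arc_elasticity(ridership_base, ridership_new, fuel_base, fuel_new), 3))
```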

Relevance:

80.00%

Publisher:

Abstract:

Despite its potential multiple contributions to sustainable policy objectives, urban transit is generally not widely used by the public in terms of its market share compared to that of automobiles, particularly in affluent societies with low-density urban forms like Australia. Transit service providers need to attract more people to transit by improving transit quality of service. The key to cost-effective transit service improvements lies in accurate evaluation of policy proposals, taking into account their impacts on transit users. If transit providers knew what is more or less important to their customers, they could focus their efforts on optimising customer-oriented service. Policy interventions could also be specified to influence transit users' travel decisions, with targets of customer satisfaction and broader community welfare. This significance motivates the research into the relationship between urban transit quality of service and its user perception as well as behaviour.

This research focused on two dimensions of transit users' travel behaviour: route choice and access arrival time choice. The study area chosen was a busy urban transit corridor linking the Brisbane central business district (CBD) and the St. Lucia campus of The University of Queensland (UQ). This multi-system corridor provided a 'natural experiment' for transit users between the CBD and UQ, as they can choose between busway route 109 (with grade-separated exclusive right-of-way), ordinary on-street bus route 412, and the linear fast ferry CityCat on the Brisbane River. The population of interest was set as the attendees of UQ who travelled from the CBD or from a suburb via the CBD. Two waves of internet-based self-completion questionnaire surveys were conducted to collect data on sampled passengers' perceptions of transit service quality and their behaviour in using public transit in the study area. The first-wave survey collected behaviour and attitude data on respondents' daily transit usage and their direct ratings of the importance of factors of route-level transit quality of service. A series of statistical analyses was conducted to examine the relationships between transit users' travel and personal characteristics and their transit usage characteristics. A factor-cluster segmentation procedure was applied to respondents' importance ratings on service quality variables regarding transit route preference, to explore users' various perspectives on transit quality of service.

Based on the perceptions of service quality collected in the second-wave survey, a series of quality criteria of the transit routes under study was quantitatively measured, in particular travel time reliability in terms of schedule adherence. It was shown that mixed traffic conditions and peak-period effects can affect transit service reliability. Multinomial logit models of transit users' route choice were estimated using the route-level service quality perceptions collected in the second-wave survey. The relative importance of service quality factors was derived from the choice models' significant parameter estimates, such as access and egress times, seat availability, and the busway system. Interpretations of the parameter estimates were conducted, particularly the in-vehicle time equivalents of access and egress times, and of busway in-vehicle time. Market segmentation by trip origin was applied to investigate the difference in magnitude between the parameter estimates of access and egress times. The significant costs of transfers in transit trips were highlighted.
These importance ratios were applied back to the quality perceptions collected as RP data to compare satisfaction levels between the service attributes and to generate an action relevance matrix for prioritising attributes for quality improvement. An empirical study on the relationship between average passenger waiting time and transit service characteristics was performed using the perceived service quality. Passenger arrivals for services with long headways (over 15 minutes) were found to be clearly coordinated with the scheduled departure times of transit vehicles in order to reduce waiting time. This drove further investigation and modelling innovation in passengers' access arrival time choice and its relationships with transit service characteristics and average passenger waiting time. Specifically, original contributions were made in the formulation of expected waiting time, in the analysis of the risk-aversion attitude towards missing the desired service run in passengers' access arrival time choice, and in extensions of the utility function specification for modelling the passenger access arrival distribution, using complicated expected utility forms and non-linear probability weighting to explicitly accommodate the risk of missing an intended service and passengers' risk-aversion attitude. Discussion of this research's contributions to knowledge, its limitations, and recommendations for future research is provided in the concluding section of this thesis.
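
As a small illustration of the waiting-time idea discussed above (not the thesis's expected-utility formulation with probability weighting), the sketch below contrasts expected waiting time under uniformly random arrivals with a schedule-coordinated arrival distribution; the headway and arrival probabilities are invented.

```python
"""Sketch of expected waiting time under random vs. schedule-coordinated arrivals.

This illustrates the concept only; it is not the thesis's expected-utility
formulation with non-linear probability weighting.
"""
import numpy as np

headway = 20.0                      # minutes between scheduled departures

# (a) Arrivals uniform over the headway: classic result E[wait] = headway / 2.
uniform_wait = headway / 2

# (b) Coordinated arrivals: most passengers arrive a few minutes before the
# scheduled departure. Arrival time is measured in minutes before departure.
minutes_before = np.arange(1, 11)                     # 1..10 minutes early
prob = np.array([0.20, 0.25, 0.20, 0.12, 0.08, 0.06, 0.04, 0.03, 0.01, 0.01])
coordinated_wait = float(np.dot(prob, minutes_before))

print(f"uniform arrivals:     E[wait] = {uniform_wait:.1f} min")
print(f"coordinated arrivals: E[wait] = {coordinated_wait:.1f} min "
      f"(arrival probabilities sum to {prob.sum():.2f})")
```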

Relevance:

80.00%

Publisher:

Abstract:

This paper uses a nonstructural, ordered discrete choice model to measure the effects of various parent and child characteristics upon the independent caregiving decisions of the adult children of elderly parents sampled in the 1982 and 1984 National Long Term Care Survey (NLTCS). While significant effects are noted, emphasis is placed on test statistics constructed to measure the independence of caregiving decisions. The test statistic results are conclusive: The caregiving decisions of adult children are dependent across time and family members. Structural models taking dependencies among family members into account note effects similar to those in the nonstructural model.
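
For readers unfamiliar with the model class, an ordered logit maps a latent index into probabilities over ranked outcomes via rising cutpoints; the sketch below uses invented cutpoints and is not the paper's estimated specification.

```python
"""Ordered logit probability sketch for a caregiving-intensity outcome.

The cutpoints and index value are invented; this is not the paper's estimated
(nonstructural) model.
"""
import numpy as np

def ordered_logit_probs(xb, cutpoints):
    """P(y = k) for ordered categories 0..K given index xb and rising cutpoints."""
    cdf = lambda z: 1.0 / (1.0 + np.exp(-z))
    cum = np.concatenate(([0.0], cdf(np.asarray(cutpoints) - xb), [1.0]))
    return np.diff(cum)

# Three ordered outcomes: no care, some informal care, intensive care (toy cutpoints).
print(ordered_logit_probs(xb=0.4, cutpoints=[-0.5, 1.2]).round(3))
```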

Relevance:

80.00%

Publisher:

Abstract:

As governments seek to transition to more efficient vehicle fleets, one strategy has been to incentivize ‘green’ vehicle choice by exempting some of these vehicles from road user charges. As an example, to stimulate sales of Energy-Efficient Vehicles (EEVs) in Sweden, some of these automobiles were exempted from Stockholm’s congestion tax. In this paper, the effect this policy had on the demand for new, privately owned, exempt EEVs is assessed by first estimating a model of vehicle choice and then applying this model to simulate the market shares of the vehicle alternatives under different policy scenarios. The database used to calibrate the model includes owner-specific demographics merged with vehicle registry data for all new private vehicles registered in Stockholm County during 2008. Characteristics of individuals with a higher propensity to purchase an exempt EEV were identified; the most significant factors included intra-cordon residency (positive), distance from home to the CBD (negative), and commuting across the cordon (positive). By calculating vehicle shares from the vehicle choice model and then comparing these estimates to a simulated scenario in which the congestion tax exemption was inactive, the exemption was estimated to have substantially increased the share of newly purchased, private, exempt EEVs in Stockholm by 1.8% (+/- 0.3%; 95% C.I.), to a total share of 18.8%. This amounts to an estimated 10.7% increase in private, exempt EEV purchases during 2008, i.e. 519 privately owned, exempt EEVs.
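
The policy simulation logic amounts to comparing logit market shares with and without an exemption term in the utility of exempt EEVs; the sketch below uses invented coefficients rather than the paper's estimates.

```python
"""Sketch of the policy simulation: logit market shares with and without an
exemption term in the utility of exempt EEVs. Coefficients are invented,
not the paper's estimates.
"""
import numpy as np

def shares(utilities):
    u = np.asarray(utilities, dtype=float)
    e = np.exp(u - u.max())
    return e / e.sum()

alternatives = ["exempt_EEV", "non_exempt_EEV", "conventional"]
base_utility = np.array([-0.30, -0.10, 0.00])
exemption_bonus = np.array([0.35, 0.0, 0.0])   # utility of avoiding the congestion tax

with_policy = shares(base_utility + exemption_bonus)
without_policy = shares(base_utility)

for alt, p1, p0 in zip(alternatives, with_policy, without_policy):
    print(f"{alt:>15}: {p0:.3f} -> {p1:.3f} (policy effect {p1 - p0:+.3f})")
```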