868 results for Cost Overrun


Relevance:

30.00%

Publisher:

Abstract:

In recent times, crowdsourcing over social networks has emerged as an active tool for complex task execution. In this paper, we address the problem faced by a planner to incentivize agents in the network to execute a task and also help in recruiting other agents for this purpose. We study this mechanism design problem under two natural resource optimization settings: (1) cost critical tasks, where the planner’s goal is to minimize the total cost, and (2) time critical tasks, where the goal is to minimize the total time elapsed before the task is executed. We define a set of fairness properties that should ideally be satisfied by a crowdsourcing mechanism. We prove that no mechanism can satisfy all these properties simultaneously. We relax some of these properties and define their approximate counterparts. Under appropriate approximate fairness criteria, we obtain a non-trivial family of payment mechanisms. Moreover, we provide precise characterizations of cost critical and time critical mechanisms.
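For intuition, one well-known payment scheme in this spirit (an illustration only, not one of the mechanisms characterized above) is the recursive split contract used in the DARPA Network Challenge: the agent who executes the task receives half of the reward, the agent who recruited them a quarter, and so on up the referral chain. A minimal sketch:

# Illustrative geometric split-contract payment over a referral chain
# (a hypothetical example for intuition, not the family of mechanisms
# derived in the paper).
def split_contract_payments(reward, chain):
    # chain: agent ids from the executing agent up to the root recruiter
    payments = {}
    share = reward / 2.0
    for agent in chain:
        payments[agent] = share
        share /= 2.0
    return payments

print(split_contract_payments(100.0, ["executor", "recruiter_1", "recruiter_2"]))
# -> {'executor': 50.0, 'recruiter_1': 25.0, 'recruiter_2': 12.5}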

Relevance:

30.00%

Publisher:

Abstract:

CuIn1-xAlxSe2 (CIASe) thin films were grown by a simple sol-gel route followed by annealing under vacuum. Parameters related to the spin-orbit (Δ_SO) and crystal-field (Δ_CF) splittings were determined using a quasi-cubic model. Highly oriented (002), 2% aluminum-doped ZnO (AZO) films, 100 nm thick, were co-sputtered for CuIn1-xAlxSe2/AZO-based solar cells. The barrier height and ideality factor varied from 0.63 eV to 0.51 eV and from 1.3186 to 2.095 between the dark and 1.38 sun AM 1.5 solar illumination, respectively. Current-voltage characteristics measured at 300 K were confined to a triangle, exhibiting three limiting conduction mechanisms: Ohm's law, the trap-filled limit curve and space-charge-limited conduction (SCLC), with 0.2 V being the cross-over voltage for the quadratic transition from Ohm's to Child's law. Visible photodetection was demonstrated with a CIASe/AZO photodiode configuration. The photocurrent was enhanced by one order of magnitude, from 3 x 10^-3 A in the dark at 1 V to 3 x 10^-2 A under 1.38 sun illumination. The optimized photodiode exhibits an external quantum efficiency falling from over 32% to 10% between 350 and 1100 nm under high-intensity (17.99 mW cm^-2) solar illumination. A high responsivity R_λ ≈ 920 A W^-1, sensitivity S ≈ 9.0 and specific detectivity D* ≈ 3 x 10^14 Jones make CIASe a potential absorber for enhancing the forthcoming technological applications of photodetection.
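For reference, the quoted figures of merit are related by the standard photodetector definitions sketched below (the shot-noise-limited form of D*, with device area A, is an assumption, since the abstract does not specify how D* was computed):

\[
R_{\lambda} = \frac{I_{\mathrm{ph}}}{P_{\mathrm{opt}}}, \qquad
S = \frac{I_{\mathrm{light}} - I_{\mathrm{dark}}}{I_{\mathrm{dark}}}, \qquad
D^{*} = \frac{R_{\lambda}\sqrt{A}}{\sqrt{2\,q\,I_{\mathrm{dark}}}}.
\]

Under these definitions the quoted currents give S = (3 x 10^-2 - 3 x 10^-3) / (3 x 10^-3) = 9, consistent with the reported S ≈ 9.0.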

Relevance:

30.00%

Publisher:

Abstract:

Elasticity in cloud systems provides the flexibility to acquire and relinquish computing resources on demand. However, in current virtualized systems resource allocation is mostly static: resources are allocated during VM instantiation, and any change in workload leading to a significant increase or decrease in resource needs is handled by VM migration. Hence, cloud users tend to characterize their workloads at a coarse-grained level, which potentially leads to under-utilized VM resources or underperforming applications. A more flexible and adaptive resource allocation mechanism would benefit variable workloads, such as those characteristic of web servers. In this paper, we present an elastic resource framework for the IaaS cloud layer that addresses this need. The framework provides an application workload forecasting engine that predicts the expected demand at run time; this prediction is input to the resource manager, which modulates resource allocation accordingly. Depending on the prediction error, resources can be over-allocated or under-allocated compared to the actual demand made by the application. Over-allocation leads to unused resources, and under-allocation can cause underperformance. To strike a good trade-off between over-allocation and underperformance, we derive an excess cost model. In this model, excess resources allocated are captured as an over-allocation cost, and under-allocation is captured as a penalty cost for violating the application's service level agreement (SLA). The confidence interval of the predicted workload is used to minimize this excess cost with minimal effect on SLA violations. A case study of an academic institute's web server workload is presented. Using the confidence interval to minimize the excess cost, we achieve a significant reduction in the resource allocation requirement while keeping application SLA violations below 2-3%.
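A minimal sketch of this kind of excess cost trade-off (the cost coefficients, the upper-confidence-bound allocation rule, and all names below are illustrative assumptions, not the paper's exact model):

# Illustrative excess cost model: over-allocation wastes resources, while
# under-allocation incurs an SLA penalty; allocating at an upper confidence
# bound of the forecast demand trades the two off. All parameters are
# hypothetical.
C_OVER = 1.0    # cost per unit of allocated-but-unused resource
C_SLA = 10.0    # penalty per unit of unmet demand (SLA violation)

def excess_cost(allocated, actual_demand):
    over = max(0.0, allocated - actual_demand)    # unused resources
    under = max(0.0, actual_demand - allocated)   # unmet demand
    return C_OVER * over + C_SLA * under

def allocation_from_forecast(mean_forecast, std_forecast, z=1.64):
    # allocate at roughly the one-sided 95% upper confidence bound of demand
    return mean_forecast + z * std_forecast

alloc = allocation_from_forecast(mean_forecast=80.0, std_forecast=10.0)
print(alloc, excess_cost(alloc, actual_demand=95.0))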

Relevance:

30.00%

Publisher:

Abstract:

Energy harvesting sensor nodes are gaining popularity due to their ability to improve network lifetime and are becoming a preferred choice for supporting green communication. In this paper, we focus on communicating reliably over an additive white Gaussian noise channel using such an energy harvesting sensor node. An important part of this paper involves appropriate modeling of energy harvesting, as realized in various practical architectures. Our main result is the characterization of the Shannon capacity of the communication system. The key technical challenge involves dealing with the dynamic (and stochastic) nature of the (quadratic) cost of the input to the channel. As a corollary, we find close connections between the capacity-achieving energy management policies and the queueing-theoretic throughput-optimal policies.
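For context, the capacity characterization in this setting typically takes the familiar AWGN form with the average power constraint replaced by the mean harvesting rate (a sketch of the standard result under an infinite energy buffer assumption; the paper's statements for specific harvesting architectures may differ):

\[
C \;=\; \frac{1}{2}\log\!\left(1 + \frac{\mathbb{E}[Y]}{\sigma^{2}}\right),
\]

where \(\mathbb{E}[Y]\) is the average energy harvested per channel use and \(\sigma^{2}\) is the noise variance.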

Relevance:

30.00%

Publisher:

Abstract:

In this paper we present a combination of technologies to provide an Energy-on-Demand (EoD) service that enables low-cost innovation suitable for microgrid networks. The system is designed around the low-cost and simple Rural Energy Device (RED) Box, which, in combination with Short Message Service (SMS) communication, serves as an elementary proxy for the smart meters typically used in urban settings. Further, customer behavior and familiarity in using such devices, based on mobile-phone experience, have been incorporated into the design philosophy. Customers are incentivized to interact with the system, thus providing valuable behavioral and usage data to the Utility Service Provider (USP). Data collected over time can be used by the USP for analytics, envisioned to run on remote computing services, i.e. cloud computing, which allows computational resources to be shared at the virtual level across several networks. The customer-system interaction is facilitated by a third-party Telecom Service Provider (TSP). The approximate cost of the RED Box is envisaged to be under USD 10 at production scale.

Relevance:

30.00%

Publisher:

Abstract:

Predation risk can strongly constrain how individuals use time and space. Grouping is known to reduce an individual's time investment in costly antipredator behaviours. Whether grouping might similarly provide a spatial release from antipredator behaviour and allow individuals to use risky habitat more and, thus, improve their access to resources is poorly known. We used mosquito larvae, Aedes aegypti, to test the hypothesis that grouping facilitates the use of high-risk habitat. We provided two habitats, one darker, low-risk and one lighter, high-risk, and measured the relative time spent in the latter by solitary larvae versus larvae in small groups. We tested larvae reared under different resource levels, and thus presumed to vary in body condition, because condition is known to influence risk taking. We also varied the degree of contrast in habitat structure. We predicted that individuals in groups should use high-risk habitat more than solitary individuals allowing for influences of body condition and contrast in habitat structure. Grouping strongly influenced the time spent in the high-risk habitat, but, contrary to our expectation, individuals in groups spent less time in the high-risk habitat than solitary individuals. Furthermore, solitary individuals considerably increased the proportion of time spent in the high-risk habitat over time, whereas individuals in groups did not. Both solitary individuals and those in groups showed a small increase over time in their use of riskier locations within each habitat. The differences between solitary individuals and those in groups held across all resource and contrast conditions. Grouping may, thus, carry a poorly understood cost of constraining habitat use. This cost may arise because movement traits important for maintaining group cohesion (a result of strong selection on grouping) can act to exaggerate an individual preference for low-risk habitat. Further research is needed to examine the interplay between grouping, individual movement and habitat use traits in environments heterogeneous in risk and resources. (C) 2015 The Association for the Study of Animal Behaviour. Published by Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Two-dimensional magnetic recording (TDMR) is a promising technology for next-generation magnetic storage systems, based on a systems-level framework with sophisticated signal processing at its core. The TDMR channel suffers from severe jitter noise, along with electronic noise, that needs to be mitigated during signal detection and recovery. Recently, we developed noise prediction-based techniques coupled with advanced signal detectors to work with these systems. However, it is important to understand the role of harmful patterns that can be avoided during the encoding process. In this paper, we investigate the Voronoi-based media model to study harmful patterns over multitrack shingled recording systems. Through realistic quasi-micromagnetic simulation studies, we identify 2-D data patterns that contribute to high media noise. We look into the generic Voronoi model and present our analysis of multitrack detection with constrained coded data. We show that the 2-D constraints imposed on input patterns result in an order of magnitude improvement in the bit-error rate for TDMR systems. The use of constrained codes can also reduce the complexity of 2-D intersymbol interference (ISI) signal detection, since a smaller 2-D ISI span can be accommodated at the cost of a nominal code rate loss. However, the system must be designed carefully so that the rate loss incurred by a 2-D constraint does not offset the detector performance gain due to more distinguishable readback signals.
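As a sketch of how a 2-D constraint restricts the written data, the check below rejects pages containing an isolated bit (a bit whose four neighbours all take the opposite value); the specific constraint is a hypothetical example, not necessarily one of the harmful patterns identified in the paper:

# Sketch of checking a 2-D constraint on input data patterns. The "no isolated
# bit" constraint below is a hypothetical example of a harmful pattern.
def satisfies_no_isolated_bit(page):
    rows, cols = len(page), len(page[0])
    for r in range(rows):
        for c in range(cols):
            neighbours = []
            if r > 0:        neighbours.append(page[r-1][c])
            if r < rows - 1: neighbours.append(page[r+1][c])
            if c > 0:        neighbours.append(page[r][c-1])
            if c < cols - 1: neighbours.append(page[r][c+1])
            if neighbours and all(n != page[r][c] for n in neighbours):
                return False  # isolated bit found: pattern is disallowed
    return True

print(satisfies_no_isolated_bit([[0, 0, 0], [0, 1, 0], [0, 0, 0]]))  # False
print(satisfies_no_isolated_bit([[0, 0, 0], [1, 1, 0], [0, 0, 0]]))  # True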

Relevance:

30.00%

Publisher:

Abstract:

A new DC plasma torch, in which arc jet states and deposition parameters can be regulated over a wide range, has been built. It showed advantages in producing stable plasma conditions at a small gas flow rate. Plasma jets with and without magnetically rotated arcs could be generated. With straight arc jet deposition, diamond films could be formed at a rate of 39 μm/h on Mo substrates of Φ25 mm, and the conversion rate of carbon in CH4 to diamond was less than 3%. Under magnetically rotated conditions, diamond films could be deposited uniformly over a range of Φ40 mm at 30 μm/h, with a quite low total gas flow rate and a high carbon conversion rate of over 11%. Mechanisms of rapid and uniform deposition of diamond films with low gas consumption and high carbon conversion efficiency are discussed.

Relevance:

30.00%

Publisher:

Abstract:

A 4 Gbit/s directly modulated DBR laser is demonstrated with nanometre-scale thermal tuning over an extended 20-70°C temperature range. >40 dB side-mode suppression over the entire temperature range is achieved. © 2005 Optical Society of America.

Relevance:

30.00%

Publisher:

Abstract:

An uncooled three-section tunable distributed Bragg reflector laser is demonstrated as an athermal transmitter for low-cost uncooled wavelength-division-multiplexing (WDM) systems with tight channel spacing. A ±0.02-nm thermal wavelength drift is achieved under continuous-wave operation up to 70 °C. Dynamic sidemode suppression ratio of greater than 35 dB is consistently obtained under 3.125-Gb/s direct modulation over a 20 °C-70 °C temperature range, with wavelength variation of as low as ±0.2 nm. This indicates that more than an order of magnitude reduction in coarse WDM channel spacing is possible using this source. © 2005 IEEE.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, I examine the treatment of competitive profit by Professor Varian in his textbook on microeconomics, taken as representative of the “modern” post-Marxian view of competitive profit. I show how, on the one hand, Varian defines profit as the surplus of revenues over cost and, thus, as a part of the value of commodities that is not a cost. On the other hand, however, Varian defines profit as a cost, namely, as the opportunity cost of capital, so that, in competitive conditions, the profit or income of capital is determined by the opportunity cost of capital. I argue that this second definition contradicts the first and that it is based on an incoherent conception of opportunity cost.
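Stated schematically (with notation introduced here, not the paper's: R for revenues, C for all non-capital costs, r for the opportunity cost rate of capital, and K for the capital advanced), the two definitions at issue are:

\[
\pi \;=\; R - C
\qquad\text{versus}\qquad
\pi \;=\; rK, \ \text{ with } rK \ \text{itself counted as a cost, so that } R - C - rK = 0 \ \text{in competition.}
\]

The first treats profit as the part of value that is not a cost; the second treats the same magnitude as itself a cost, which is the contradiction the paper develops.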

Relevance:

30.00%

Publisher:

Abstract:

Data have been collected on fisheries catch and effort trends since the latter half of the 1800s. With current trends in declining stocks and stricter management regimes, data need to be collected and analyzed over shorter periods and at finer spatial resolution than in the past. New methods of electronic reporting may reduce the lag time in data collection and provide more accurate spatial resolution. In this study I evaluated the differences between fish dealer and vessel reporting systems for federal fisheries in the US New England and Mid-Atlantic areas. Using data on landing date, report date, gear used, port landed, number of hauls, number of fish sampled and species quotas from available catch and effort records, I compared electronically collected dealer and vessel data against paper-collected dealer and vessel data to determine if electronically collected data are timelier and more accurate. To determine if vessel or dealer electronic reporting is more useful for management, I assessed differences in timeliness and accuracy between vessel and dealer electronic reports. I also compared the cost and efficiency of these new methods with less technology-intensive reporting methods, using available cost data and surveys of seafood dealers for cost information. Using this information I identified potentially unnecessary duplication of effort and identified applications in ecosystem-based fisheries management. This information can be used to guide the decisions of fisheries managers in the United States and other countries that are attempting to identify appropriate fisheries reporting methods for the management regimes under consideration. (PDF contains 370 pages)

Relevance:

30.00%

Publisher:

Abstract:

Executive Summary: This study describes the socio-economic characteristics of the U.S. Caribbean trap fishery that encompasses the Commonwealth of Puerto Rico and Territory of the U.S. Virgin Islands. In-person interviews were administered to one hundred randomly selected trap fishermen, constituting nearly 25% of the estimated population. The sample was stratified by geographic area and trap tier. The number of traps owned or fished to qualify for a given tier varied by island. In Puerto Rico, tier I consisted of fishermen who had between 1-40 fish traps, tier II was made up of fishermen who possessed between 41 and 100 fish traps, and tier III consisted of fishermen who held in excess of 100 fish traps. In St. Thomas and St. John, tier I was composed of fishermen who held between 1 and 50 fish traps, tier II consisted of fishermen who had between 51-150 fish traps and tier III was made up of fishermen who had in excess of 150 fish traps. Lastly, in St. Croix, tier I was made up of fishermen who had less than 20 fish traps and tier II consisted of fishermen who had 20 or more fish traps. The survey elicited information on household demographics, annual catch and revenue, trap usage, capital investment on vessels and equipment, fixed and variable costs, behavioral response to a hypothetical trap reduction program and the spatial distribution of traps. The study found that 79% of the sampled population was 40 years or older. The typical Crucian trap fisherman was older than their Puerto Rican and St. Thomian and St. Johnian counterparts. Crucian fishermen’s average age was 57 years whereas Puerto Rican fishermen’s average age was 51 years, and St. Thomian and St. Johnian fishermen’s average age was 48 years. As a group, St. Thomian and St. Johnian fishermen had 25 years of fishing experience, and Puerto Rican and Crucian fishermen had 30, and 29 years, respectively. Overall, 90% of the households had at least one dependent. The average number of dependents across islands was even, ranging between 2.8 in the district of St. Thomas and St. John and 3.4 in the district of St. Croix. The percentage utilization of catch for personal or family use was relatively low. Regionally, percentage use of catch for personal or family uses ranged from 2.5% in St. Croix to 3.8% in the St. Thomas and St. John. About 47% of the respondents had a high school degree. The majority of the respondents were highly dependent on commercial fishing for their household income. In St. Croix, commercial fishing made up 83% of the fishermen’s total household income, whereas in St. Thomas and St. John and Puerto Rico it contributed 74% and 68%, respectively. The contribution of fish traps to commercial fishing income ranged from 51% in the lowest trap tier in St. Thomas and St. John to 99% in the highest trap tier in St. Croix. On an island basis, the contribution of fish traps to fishing income was 75% in St. Croix, 61% in St. Thomas and St. John, and 59% in Puerto Rico. The value of fully rigged vessels ranged from $400 to $250,000. Over half of the fleet was worth $10,000 or less. The St. Thomas and St. John fleet reported the highest mean value, averaging $58,518. The Crucian and Puerto Rican fleets were considerably less valuable, averaging $19,831 and $8,652, respectively. The length of the vessels ranged from 14 to 40 feet. Fifty-nine percent of the sampled vessels were at least 23 feet in length. The average length of the St. Thomas and St. John fleet was 28 feet, whereas the fleets based in St. 
Croix and Puerto Rico averaged 21 feet. The engine’s propulsion ranged from 8 to 400 horsepower (hp). The mean engine power was 208 hp in St. Thomas and St. John, 108 hp in St. Croix, and 77 hp in Puerto Rico. Mechanical trap haulers and depth recorders were the most commonly used on-board equipment. About 55% of the sampled population reported owning mechanical trap haulers. In St. Thomas and St. John, 100% of the respondents had trap haulers compared to 52% in Puerto Rico and 20% in St. Croix. Forty-seven percent of the fishermen surveyed stated having depth recorders. Depth recorders were most common in the St. Thomas and St. John fleet (80%) and least common in the Puerto Rican fleet (37%). The limited presence of emergency position indication radio beacons (EPIRBS) and radar was the norm among the fish trap fleet. Only 8% of the respondents had EPIRBS and only 1% had radar. Interviewees stated that they fished between 1 and 350 fish traps. Puerto Rican respondents fished on average 39 fish traps, in contrast to St. Thomian and St. Johnian and Crucian respondents, who fished 94 and 27 fish traps, respectively. On average, Puerto Rican respondents fished 11 lobster traps, and St. Thomian and St. Johnian respondents fished 46 lobster traps. None of the Crucian respondents fished lobster traps. The number of fish traps built or purchased ranged between 0 and 175, and the number of lobster traps built or bought ranged between 0 and 200. Puerto Rican fishermen on average built or purchased 30 fish traps and 14 lobster traps, and St. Thomian and St. Johnian fishermen built or bought 30 fish traps and 11 lobster traps. Crucian fishermen built or bought 25 fish traps and no lobster traps. As a group, fish trap average life ranged between 1.3 and 5 years, and lobster traps lasted slightly longer, between 1.5 and 6 years. The study found that the chevron or arrowhead style was the most common trap design. Puerto Rican fishermen owned an average of 20 arrowhead traps. St. Thomian and St. Johnian and Crucian fishermen owned an average of 44 and 15 arrowhead fish traps, respectively. The second most popular trap design was the square trap style. Puerto Rican fishermen had an average of 9 square traps, whereas St. Thomian and St. Johnian fishermen had 33 traps and Crucian fishermen had 2 traps. Antillean Z (or S) -traps, rectangular and star traps were also used. Although Z (or S) -traps are considered the most productive trap design, fishermen prefer the smaller-sized arrowhead and square traps because they are easier and less expensive to build, and larger numbers of them can be safely deployed. The cost of a fish trap, complete with rope and buoys, varied significantly due to the wide range of construction materials utilized. On average, arrowhead traps commanded $94 in Puerto Rico, $251 in St. Thomas and St. John, and $119 in St. Croix. The number of trips per week ranged between 1 and 6. However, 72% of the respondents mentioned that they took two trips per week. On average, Puerto Rican fishermen took 2.1 trips per week, St. Thomian and St. Johnian fishermen took 1.4 trips per week, and Crucian fishermen took 2.5 trips per week. Most fishing trips started at dawn and finished early in the afternoon. Over 82% of the trips lasted 8 hours or less. On average, Puerto Rican fishermen hauled 27 fish traps per trip whereas St. Thomian and St. Johnian fishermen and Crucian fishermen hauled 68 and 26 fish traps per trip, respectively. 
The number of traps per string and soak time varied considerably across islands. In St. Croix, 84% of the respondents had a single trap per line, whereas in St. Thomas and St. John only 10% of the respondents had a single trap per line. Approximately 43% of Puerto Rican fishermen used a single trap line. St. Thomian and St. Johnian fishermen soaked their traps for 6.9 days while Puerto Rican and Crucian fishermen soaked their traps for 5.7 and 3.6 days, respectively. The heterogeneity of the industry was also evidenced by the various economic surpluses generated. The survey illustrated that higher gross revenues did not necessarily translate into higher net revenues. Our analysis also showed that, on average, vessels in the trap fishery were able to cover their cash outlays, resulting in positive vessel income (i.e., financial profits). In Puerto Rico, annual financial profits ranged from $4,760 in the lowest trap tier to $32,467 in the highest tier, whereas in St. Thomas and St. John annual financial profits ranged from $3,744 in the lowest tier to $13,652 in the highest tier. In St. Croix, annual financial profits ranged between $9,229 and $15,781. The survey also showed that economic profits varied significantly across tiers. Economic profits measure residual income after deducting the remuneration required to keep the various factors of production in their existing employment. In Puerto Rico, annual economic profits ranged from ($9,339) in the lowest trap tier to $8,711 in the highest trap tier. In St. Thomas and St. John, annual economic profits ranged from ($7,920) in the highest tier to ($18,486) in the second highest tier. In St. Croix, annual economic profits ranged between ($7,453) and $10,674. The presence of positive financial profits and negative economic profits suggests that higher economic returns could be earned from a societal perspective by redirecting some of these scarce capital and human resources elsewhere in the economy. Furthermore, the presence of negative economic earnings is evidence that the fishery is overcapitalized and that steps need to be taken to ensure the long-run economic viability of the industry. The presence of positive financial returns provides managers with a window of opportunity to adopt policies that will strengthen the biological and economic performance of the fishery while minimizing any adverse impacts on local fishing communities. Finally, the document concludes by detailing how the costs and earnings information could be used to develop economic models that evaluate management proposals. (PDF contains 147 pages)

Relevance:

30.00%

Publisher:

Abstract:

In this paper we demonstrate the design of a low-cost optical current sensor. The sensor principle is the Faraday rotation of a light beam through a magneto-optical material, SF2, when a magnetic field is present. The prototype has a high sensitivity and a high linearity for currents ranging from 0 up to 800 A. The error of the optical fibre sensor is smaller than 1% for electric currents over 175 A.
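A minimal sketch of the measurement principle, assuming illustrative values for the Verdet constant of the SF2 glass and for the sensor geometry (the prototype's actual parameters are not given in the abstract): the polarization rotation is theta = V * B * L, with B obtained from the current via Ampere's law.

import math

# Faraday-rotation current sensing sketch. The Verdet constant and geometry
# below are assumed, illustrative values, not the prototype's parameters.
MU0 = 4 * math.pi * 1e-7     # vacuum permeability, T*m/A
VERDET_SF2 = 10.0            # assumed Verdet constant of the SF2 glass, rad/(T*m)
PATH_LENGTH = 0.05           # optical path length in the glass, m
RADIAL_DISTANCE = 0.02       # distance of the beam from the conductor, m

def rotation_angle(current_amps):
    # field of a straight conductor at distance r, then theta = V * B * L
    b_field = MU0 * current_amps / (2 * math.pi * RADIAL_DISTANCE)
    return VERDET_SF2 * b_field * PATH_LENGTH   # radians

for i in (175, 400, 800):
    print(i, "A ->", math.degrees(rotation_angle(i)), "deg")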

Relevance:

30.00%

Publisher:

Abstract:

In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and the more ambitiously we extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and greater model complexity. There is now a plethora of models, based on different assumptions and applicable in differing contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.

We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles or lotteries. Different decision-making theories evaluate the choices differently and would make distinct predictions about the subjects' choices. Theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on using classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), that sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which in turn informs the next most informative test to run. BROAD utilizes the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees against the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that, surprisingly, these popular criteria can perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders of magnitude speedup over other methods.
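As a simplified illustration of the adaptive loop (using the Information Gain criterion mentioned above as a baseline rather than EC2, and toy theories and tests rather than fitted behavioural models):

import math

# Toy adaptive design loop: keep a posterior over candidate theories, pick the
# test with the highest expected information gain, observe the choice, update.
def entropy(p):
    return -sum(q * math.log(q) for q in p.values() if q > 0)

def update(prior, likelihoods, test, outcome):
    post = {h: prior[h] * likelihoods[h][test][outcome] for h in prior}
    z = sum(post.values())
    return {h: v / z for h, v in post.items()}

def expected_information_gain(prior, likelihoods, test):
    gain = 0.0
    for outcome in (0, 1):  # binary choice between two lotteries
        p_outcome = sum(prior[h] * likelihoods[h][test][outcome] for h in prior)
        if p_outcome > 0:
            post = update(prior, likelihoods, test, outcome)
            gain += p_outcome * (entropy(prior) - entropy(post))
    return gain

def next_test(prior, likelihoods, tests):
    return max(tests, key=lambda t: expected_information_gain(prior, likelihoods, t))

# hypothetical choice probabilities for two theories on two candidate tests
theories = {"prospect": 0.5, "expected_value": 0.5}
lik = {"prospect":       {"A": (0.9, 0.1), "B": (0.5, 0.5)},
       "expected_value": {"A": (0.2, 0.8), "B": (0.5, 0.5)}}
print(next_test(theories, lik, ["A", "B"]))  # picks the more discriminating test "A"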

We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA) and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries chosen. Aggregate posterior probabilities over the theories show limited evidence in favour of CRRA and moments models. Classifying the subjects into types showed that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e. subjects could mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out since it is infeasible in practice, and also since we do not find any signatures of it in our data.
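For concreteness, the prospect theory functional form compared against expected value, CRRA and moments models is typically of the kind sketched below; the parameter values are the commonly cited Tversky-Kahneman estimates, not the values fitted to these subjects, and the separable weighting is a simplification of the cumulative form.

# Standard prospect theory value and probability-weighting functions
# (illustrative parameters, not the experiment's estimates).
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma=0.61):
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

def prospect_value(lottery):
    # lottery: list of (outcome, probability) pairs relative to the reference point
    return sum(weight(p) * value(x) for x, p in lottery)

# a mixed lottery: 50% win 100, 50% lose 100 -> negative value under loss aversion
print(prospect_value([(100, 0.5), (-100, 0.5)]))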

In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, "present bias" models: quasi-hyperbolic (α, β) discounting and fixed cost discounting, and generalized-hyperbolic discounting. 40 subjects from UCLA were given choices between 2 options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for present bias models and hyperbolic discounting, and most subjects were classified as generalized hyperbolic discounting types, followed by exponential discounting.
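The discount functions compared correspond to the textbook forms sketched below (parameter names follow common conventions and the values are illustrative, not the experiment's estimates):

# Discount functions for the time-preference comparison (illustrative parameters).
def exponential(t, delta=0.9):
    return delta ** t

def hyperbolic(t, k=0.1):
    return 1.0 / (1.0 + k * t)

def quasi_hyperbolic(t, beta=0.7, delta=0.95):
    # "present bias": full weight today, an extra discount beta on all future periods
    return 1.0 if t == 0 else beta * delta ** t

def generalized_hyperbolic(t, alpha=1.0, beta=0.5):
    # Loewenstein-Prelec form; nests exponential (alpha -> 0) and hyperbolic (alpha = beta)
    return (1.0 + alpha * t) ** (-beta / alpha)

for t in (0, 1, 5, 30):
    print(t, exponential(t), hyperbolic(t), quasi_hyperbolic(t), generalized_hyperbolic(t))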

In these models the passage of time is linear. We instead consider a psychological model where the perception of time is subjective. We prove that when the biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.

We also test the predictions of behavioural theories in the "wild". We pay attention to prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in a way distinct from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is being offered at a discount, the demand for it will be greater than that explained by its price elasticity. Even more importantly, when the item is no longer discounted, demand for its close substitute would increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications that consumer loss aversion entails, and strategies for competitive pricing.
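A minimal sketch of a loss-averse discrete choice (logit) model of the general kind described, with the reference price entering through asymmetric gain and loss terms; the functional form and all parameters are illustrative assumptions, not the estimated model:

import math

# Logit choice with a reference-dependent, loss-averse price term (illustrative).
def utility(price, ref_price, base_value, b_price=0.05, lam=2.0):
    gain = max(0.0, ref_price - price)       # price below reference: a "gain"
    loss = max(0.0, price - ref_price)       # price above reference: a "loss"
    return base_value - b_price * price + b_price * gain - lam * b_price * loss

def choice_probabilities(options, ref_prices):
    utils = [utility(p, r, v) for (v, p), r in zip(options, ref_prices)]
    z = sum(math.exp(u) for u in utils)
    return [math.exp(u) / z for u in utils]

# two substitutes at the same current price; the first was recently discounted,
# so its reference price is lower
options = [(5.0, 10.0), (5.0, 10.0)]          # (base_value, current_price)
print(choice_probabilities(options, ref_prices=[8.0, 10.0]))

In this toy example the item whose reference price was pulled down by a recent discount is chosen less once it returns to full price, so demand shifts toward the substitute, which is the qualitative pattern described above.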

In future work, BROAD can be widely applied for testing different behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, can be used to more rapidly eliminate hypotheses and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.