54 results for Discrete Choice Model


Relevance: 30.00%

Abstract:

The probit model is a popular device for explaining binary choice decisions in econometrics. It has been used to describe choices such as labor force participation, travel mode, home ownership, and type of education. These and many more examples can be found in papers by Amemiya (1981) and Maddala (1983). Given the contribution of economics towards explaining such choices, and given the nature of data that are collected, prior information on the relationship between a choice probability and several explanatory variables frequently exists. Bayesian inference is a convenient vehicle for including such prior information. Given the increasing popularity of Bayesian inference it is useful to ask whether inferences from a probit model are sensitive to a choice between Bayesian and sampling theory techniques. Of interest is the sensitivity of inference on coefficients, probabilities, and elasticities. We consider these issues in a model designed to explain choice between fixed and variable interest rate mortgages. Two Bayesian priors are employed: a uniform prior on the coefficients, designed to be noninformative for the coefficients, and an inequality restricted prior on the signs of the coefficients. We often know, a priori, whether increasing the value of a particular explanatory variable will have a positive or negative effect on a choice probability. This knowledge can be captured by using a prior probability density function (pdf) that is truncated to be positive or negative. Thus, three sets of results are compared: those from maximum likelihood (ML) estimation, those from Bayesian estimation with an unrestricted uniform prior on the coefficients, and those from Bayesian estimation with a uniform prior truncated to accommodate inequality restrictions on the coefficients.
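The three-way comparison described in this abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: it simulates a binary choice with one explanatory variable (the data, starting values, and sampler step size are all hypothetical), obtains the ML probit estimate, and then draws from the posterior under a uniform prior truncated so that the slope coefficient is positive, using a random-walk Metropolis sampler.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulated binary-choice data (hypothetical stand-in for the mortgage data).
n = 500
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([-0.3, 0.8])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(float)

def neg_loglik(beta):
    # Probit log-likelihood: P(y=1) = Phi(X beta)
    p = np.clip(norm.cdf(X @ beta), 1e-10, 1 - 1e-10)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# Maximum likelihood estimate.
ml = minimize(neg_loglik, np.zeros(2))

# Random-walk Metropolis under a uniform prior truncated so that the
# slope coefficient is positive (the inequality restriction).
draws, beta = [], ml.x.copy()
for _ in range(5000):
    prop = beta + 0.1 * rng.normal(size=2)
    if prop[1] <= 0:                     # zero prior density: reject outright
        draws.append(beta.copy())
        continue
    # Uniform prior cancels, so the acceptance ratio is the likelihood ratio.
    if np.log(rng.uniform()) < neg_loglik(beta) - neg_loglik(prop):
        beta = prop
    draws.append(beta.copy())

posterior_mean = np.mean(draws[1000:], axis=0)   # discard burn-in
```

With an informative sign restriction that is consistent with the data, the posterior mean of the slope stays close to the ML estimate; the restriction matters most when the unrestricted estimate sits near the truncation boundary.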

Relevance: 30.00%

Abstract:

The power required to operate large mills is typically 5-10 MW. Hence, optimisation of power consumption will have a significant impact on overall economic performance and environmental impact. Power draw modelling results using the discrete element code PFC3D have been compared with results derived from the widely used empirical model of Morrell. This is achieved by calculating the power draw for a range of operating conditions for constant mill size and fill factor using the two modelling approaches. The discrete element modelling results show that, apart from density, selection of the appropriate material damping ratio is critical for the accuracy of modelling of the mill power draw. The relative insensitivity of the power draw to the material stiffness allows selection of moderate stiffness values, which result in acceptable computation time. The results obtained confirm that modelling of the power draw for a vertical slice of the mill, of thickness 20% of the mill length, is a reliable substitute for modelling the full mill. The power draw predictions from PFC3D show good agreement with those obtained using the empirical model. Due to its inherent flexibility, power draw modelling using PFC3D appears to be a viable and attractive alternative to empirical models where the necessary code and computing power are available.

Relevance: 30.00%

Abstract:

Predictions of flow patterns in a 600-mm scale model SAG mill made using four classes of discrete element method (DEM) models are compared to experimental photographs. The accuracy of the various models is assessed using quantitative data on shoulder, toe and vortex center positions taken from ensembles of both experimental and simulation results. These detailed comparisons reveal the strengths and weaknesses of the various models for simulating mills and allow the effect of different modelling assumptions to be quantitatively evaluated. In particular, very close agreement is demonstrated between the full 3D model (including the end wall effects) and the experiments. It is also demonstrated that the traditional two-dimensional circular particle DEM model under-predicts the shoulder, toe and vortex center positions by around 10 degrees, and also under-predicts the power draw. The effect of particle shape and the dimensionality of the model are also assessed, with particle shape predominantly affecting the shoulder position while the dimensionality of the model affects mainly the toe position. Crown Copyright (C) 2003 Published by Elsevier Science B.V. All rights reserved.

Relevance: 30.00%

Abstract:

This paper describes a process-based metapopulation dynamics and phenology model of prickly acacia, Acacia nilotica, an invasive alien species in Australia. The model, SPAnDX, describes the interactions between riparian and upland sub-populations of A. nilotica within livestock paddocks, including the effects of extrinsic factors such as temperature, soil moisture availability and atmospheric concentrations of carbon dioxide. The model includes the effects of management events such as changing the livestock species or stocking rate, applying fire, and herbicide application. The predicted population behaviour of A. nilotica was sensitive to climate. Using 35-year daily weather datasets for five representative sites spanning the range of conditions in which A. nilotica is found in Australia, the model predicted biomass levels that closely accord with expected values at each site. SPAnDX can be used as a decision-support tool in integrated weed management, and to explore the sensitivity of cultural management practices to climate change throughout the range of A. nilotica. The cohort-based DYMEX modelling package used to build and run SPAnDX provided several advantages over more traditional population modelling approaches (e.g. an appropriate specific formalism (discrete time, cohort-based, process-oriented), user-friendly graphical environment, extensible library of reusable components, and useful and flexible input/output support framework). (C) 2003 Published by Elsevier Science B.V.

Relevance: 30.00%

Abstract:

Preventive maintenance actions over the warranty period have an impact on the warranty servicing cost to the manufacturer and the cost to the buyer of fixing failures over the life of the product after the warranty expires. However, preventive maintenance costs money and is worthwhile only when the resulting reduction in other costs exceeds the cost of the maintenance itself. The paper deals with a model to determine when preventive maintenance actions (which rejuvenate the unit) carried out at discrete time instants over the warranty period are worthwhile. The cost of preventive maintenance is borne by the buyer. (C) 2003 Elsevier Ltd. All rights reserved.
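The basic decision rule above reduces to a one-line cost comparison: preventive maintenance (PM) pays off only when the repair costs it avoids exceed its own cost. The figures below are hypothetical, purely to illustrate the comparison, and do not reflect the paper's model:

```python
def pm_worthwhile(pm_cost, repair_cost_without_pm, repair_cost_with_pm):
    """PM pays off only when the repair-cost reduction exceeds its price."""
    saving = repair_cost_without_pm - repair_cost_with_pm
    return saving > pm_cost

# Hypothetical figures: PM costs 300, repairs drop from 1200 to 700.
print(pm_worthwhile(pm_cost=300.0,
                    repair_cost_without_pm=1200.0,
                    repair_cost_with_pm=700.0))   # saving 500 > 300 -> True
```

The paper's actual model is richer than this sketch: it determines the discrete time instants at which rejuvenating PM actions should occur, rather than treating PM as a single lump-sum decision.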

Relevance: 30.00%

Abstract:

Information processing speed, as measured by elementary cognitive tasks, is correlated with higher order cognitive ability so that increased speed relates to improved cognitive performance. The question of whether the genetic variation in Inspection Time (IT) and Choice Reaction Time (CRT) is associated with IQ through a unitary factor was addressed in this multivariate genetic study of IT, CRT, and IQ subtest scores. The sample included 184 MZ and 206 DZ twin pairs with a mean age of 16.2 years (range 15-18 years). They were administered a visual (pi-figure) IT task, a two-choice RT task, five computerized subtests of the Multidimensional Aptitude Battery, and the digit symbol substitution subtest from the WAIS-R. The data supported a factor model comprising a genetic structure with a general factor, three group factors (verbal ability, visuospatial ability, broad speediness), and specific factors; a shared environmental factor influencing all tests but IT; plus unique environmental factors that were largely specific to individual measures. The general genetic factor displayed factor loadings ranging between 0.35 and 0.66 for the IQ subtests, with IT and CRT loadings of -0.47 and -0.24, respectively. Results indicate that a unitary factor is insufficient to describe the entire relationship between cognitive speed measures and all IQ subtests, with independent genetic effects explaining further covariation between processing speed (especially CRT) and Digit Symbol.

Relevance: 30.00%

Abstract:

Cluster analysis via a finite mixture model approach is considered. With this approach to clustering, the data can be partitioned into a specified number of clusters g by first fitting a mixture model with g components. An outright clustering of the data is then obtained by assigning an observation to the component to which it has the highest estimated posterior probability of belonging; that is, the ith cluster consists of those observations assigned to the ith component (i = 1,..., g). The focus is on the use of mixtures of normal components for the cluster analysis of data that can be regarded as being continuous. But attention is also given to the case of mixed data, where the observations consist of both continuous and discrete variables.
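The two-step procedure described here, fitting a g-component mixture and then assigning each observation to the component with the highest estimated posterior probability, can be sketched for a one-dimensional normal mixture fitted by EM. The data and starting values below are hypothetical:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
# Two well-separated normal components (hypothetical data).
data = np.concatenate([rng.normal(0, 1, 150), rng.normal(5, 1, 150)])

g = 2                       # number of components = number of clusters
w = np.full(g, 1 / g)       # mixing proportions
mu = np.array([-1.0, 1.0])  # component means (arbitrary starting values)
sd = np.ones(g)             # component standard deviations

for _ in range(100):
    # E-step: posterior probability that each point belongs to each component
    dens = w * norm.pdf(data[:, None], mu, sd)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: update proportions, means, and standard deviations
    nk = resp.sum(axis=0)
    w = nk / len(data)
    mu = (resp * data[:, None]).sum(axis=0) / nk
    sd = np.sqrt((resp * (data[:, None] - mu) ** 2).sum(axis=0) / nk)

# Outright clustering: assign each observation to its most probable component.
clusters = resp.argmax(axis=1)
```

The mixed continuous/discrete case mentioned at the end of the abstract would replace the normal component densities with products of a normal density for the continuous variables and a discrete probability function for the categorical ones; the posterior-assignment step is unchanged.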

Relevance: 30.00%

Abstract:

The best accepted method for design of autogenous and semi-autogenous (AG/SAG) mills is to carry out pilot scale test work using a 1.8 m diameter by 0.6 m long pilot scale test mill. The load in such a mill typically contains 250,000-450,000 particles larger than 6 mm, allowing correct representation of more than 90% of the charge in Discrete Element Method (DEM) simulations. Most AG/SAG mills use discharge grate slots which are 15 mm or more in width. The mass in each size fraction usually decreases rapidly below grate size. This scale of DEM model is now within the possible range of standard workstations running an efficient DEM code. This paper describes various ways of extracting collision data from the DEM model and translating it into breakage estimates. Account is taken of the different breakage mechanisms (impact and abrasion) and of the specific impact histories of the particles in order to assess the breakage rates for various size fractions in the mills. At some future time, the integration of smoothed particle hydrodynamics with DEM will allow for the inclusion of slurry within the pilot mill simulation. (C) 2004 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

This paper introduces the rank-dependent quality-adjusted life-years (QALY) model, a new method to aggregate QALYs in economic evaluations of health care. The rank-dependent QALY model permits the formalization of influential concepts of equity in the allocation of health care, such as the fair innings approach, and it includes as special cases many of the social welfare functions that have been proposed in the literature. An important advantage of the rank-dependent QALY model is that it offers a straightforward procedure to estimate equity weights for QALYs. We characterize the rank-dependent QALY model and argue that its central condition has normative appeal. (C) 2003 Elsevier B.V. All rights reserved.
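Rank-dependent aggregation of the kind described can be illustrated in a few lines: individual QALY levels are ranked worst-off first and each rank position receives an equity weight, with the weights summing to one. The weighting function below is an arbitrary convex-transform choice made for this sketch, not the one characterized in the paper:

```python
def rank_dependent_qalys(qalys, rho=2.0):
    """Rank-dependent aggregate of QALYs; rho > 1 favours the worst-off,
    rho = 1 recovers the plain (utilitarian) sum.  Illustrative weighting."""
    n = len(qalys)
    ordered = sorted(qalys)  # worst-off first
    # Weight of rank i = increment of the convex transform x**rho over the
    # i-th cumulative rank share; the weights telescope to a sum of 1 and
    # decrease with rank when rho > 1.
    weights = [((n - i) / n) ** rho - ((n - i - 1) / n) ** rho
               for i in range(n)]
    return n * sum(wt * q for wt, q in zip(weights, ordered))
```

For rho = 2 and two individuals, the worst-off receives weight 0.75 and the better-off 0.25, so the unequal profile (5, 15) is valued below the equal profile (10, 10) even though both sum to 20, which is the equity-weighting behaviour the abstract describes.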

Relevance: 30.00%

Abstract:

The Lattice Solid Model has been used successfully as a virtual laboratory to simulate fracturing of rocks, the dynamics of faults, earthquakes and gouge processes. However, results from those simulations show that in order to make the next step towards more realistic experiments it will be necessary to use models containing a significantly larger number of particles than current models. Thus, those simulations will require a greatly increased amount of computational resources. Whereas the computing power provided by single processors can be expected to increase according to Moore's law, i.e., to double every 18-24 months, parallel computers can provide significantly larger computing power today. In order to make this computing power available for the simulation of the microphysics of earthquakes, a parallel version of the Lattice Solid Model has been implemented. Benchmarks using large models with several millions of particles have shown that the parallel implementation of the Lattice Solid Model can achieve a high parallel-efficiency of about 80% for large numbers of processors on different computer architectures.
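Parallel efficiency of the kind quoted (about 80%) is conventionally computed as speed-up divided by processor count. The timings below are hypothetical, chosen only to show the arithmetic:

```python
def speedup(t_serial, t_parallel):
    """Speed-up S = T1 / Tp."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    """Parallel efficiency E = S / p; E = 1 is ideal scaling."""
    return speedup(t_serial, t_parallel) / n_procs

# e.g. a run taking 1000 s on one processor and 19.5 s on 64 processors
print(efficiency(1000.0, 19.5, 64))   # ~ 0.80
```

An efficiency of 0.80 at large processor counts means the model spends roughly 20% of its aggregate compute on communication and load imbalance rather than on particle updates.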

Relevance: 30.00%

Abstract:

To determine whether the choice of client fishes in the cleaner fish Labroides dimidiatus was influenced by client size, cleaner fish were given a choice of equal amounts of food spread on large and small models of the client redfin butterflyfish Chaetodon trifasciatus. All large models received bites from cleaners, compared to 27% for small models. Seventy-nine per cent of cleaners took their first bite from the large fish model. The results suggest that client size may affect cleaner fish choice.

Relevance: 30.00%

Abstract:

We apply a three-dimensional approach to describe a new parametrization of the L-operators for the two-dimensional Bazhanov-Stroganov (BS) integrable spin model related to the chiral Potts model. This parametrization is based on the solution of the associated classical discrete integrable system. Using a three-dimensional vertex satisfying a modified tetrahedron equation, we construct an operator which generalizes the BS quantum intertwining matrix S. This operator describes the isospectral deformations of the integrable BS model.

Relevance: 30.00%

Abstract:

In this article we study the effects of adsorbed phase compression, lattice structure, and pore size distribution on the analysis of adsorption in microporous activated carbon. The lattice gas approach of Ono-Kondo is modified to account for the above effects. Data of nitrogen adsorption at 77 K onto a number of activated carbon samples are analyzed to investigate the pore filling pressure versus pore width, the packing effect, and the compression of the adsorbed phase. It is found that the pore size distributions (PSDs) obtained from this analysis are comparable to those obtained by the DFT method. The discrete nature of the PSDs derived from the modified lattice gas theory is due to the inherent assumption of discrete layers of molecules. Nevertheless, it does provide interesting information on the evolution of micropores during the activation process.

Relevance: 30.00%

Abstract:

A dynamic model which describes the impulse behavior of concentrated grounds at high currents is described in this paper. This model is an extension of previous models in that it can successfully account for the surge behavior of concentrated grounds over a much wider range of current densities. It is able to describe the well known effect of ionization of soil as well as the observed effect of discrete breakdowns and filamentary arc paths at much higher currents. Verification against experimental results is also presented.

Relevance: 30.00%

Abstract:

The Burdekin River of northeastern Australia has constructed a substantial delta during the Holocene (delta plain area 1260 km2). The vertical succession through this delta comprises (1) a basal, coarse-grained transgressive lag overlying a continental omission surface, overlain by (2) a mud interval deposited as the coastal region was inundated by the postglacially rising sea, in turn overlain by (3) a generally sharp-based sand unit deposited principally in channel and mouth-bar environments with lesser volumes of floodplain and coastal facies. The Holocene Burdekin Delta was constructed as a series of at least thirteen discrete delta lobes, formed as the river avulsed. Each lobe consists of a composite sand body typically 5-8 m thick. The oldest lobes, formed during the latter stages of the postglacial sea-level rise (10-5.5 kyr BP), are larger than those formed during the highstand (5.5-3 kyr BP), which are in turn larger than those formed during the most recent slight sea-level lowering and stillstand (3-0 kyr BP). Radiocarbon ages and other stratigraphic data indicate that inter-avulsion period has decreased through time coincident with the decrease in delta lobe area. The primary control on Holocene delta architecture appears to have been a change from a pluvial climate known to characterize the region 12-4 kyr BP to the present drier, ENSO-dominated climate. In addition to decreasing the sediment supply via lower rates of chemical weathering, this change may have contributed to the shorter avulsion period by facilitating extreme variability of discharge. More frequent avulsion may also have been facilitated by the lengthening of the delta-plain channels as the system prograded seaward. Copyright © 2006, SEPM (Society for Sedimentary Geology).