13 results for Multinomial logit models with random coefficients (RCL)
in Greenwich Academic Literature Archive - UK
Abstract:
This paper studies two models of two-stage processing with no-wait in process. The first model is the two-machine flow shop, and the other is the assembly model. For both models we consider the problem of minimizing the makespan, provided that the setup and removal times are separated from the processing times. Each of these scheduling problems is reduced to the Traveling Salesman Problem (TSP). We show that, in general, the assembly problem is NP-hard in the strong sense. On the other hand, the two-machine flow shop problem reduces to the Gilmore-Gomory TSP, and is solvable in polynomial time. The same holds for the assembly problem under some reasonable assumptions. Using these and existing results, we provide a complete complexity classification of the relevant two-stage no-wait scheduling models.
Abstract:
Too often, validation of computer models is treated as a "once and forget" task. In this paper a systematic and graduated approach to evacuation model validation is suggested.
Abstract:
The powerful general Pacala-Hassell host-parasitoid model for a patchy environment, which allows host density-dependent heterogeneity (HDD) to be distinguished from between-patch, host density-independent heterogeneity (HDI), is reformulated within the class of the generalized linear model (GLM) family. This improves accessibility through the provision of general software within well-known statistical systems, and allows a rich variety of models to be formulated. Covariates such as age class, host density and abiotic factors may be included easily. For the case where there is no HDI, the formulation is a simple GLM. When there is HDI in addition to HDD, the formulation is a hierarchical generalized linear model. Two forms of HDI model are considered, both with between-patch variability: one has binomial variation within patches and one has extra-binomial, overdispersed variation within patches. Examples are given demonstrating parameter estimation with standard errors, and hypothesis testing. For one example given, the extra-binomial component of the HDI heterogeneity in parasitism is itself shown to be strongly density dependent.
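The GLM reformulation described above can be sketched schematically. The link function, covariates and parameterisation below are illustrative assumptions, not the paper's exact formulation:

```latex
% Illustrative binomial GLM for parasitism in patch j: y_j hosts
% parasitised out of H_j, with host density H_j as a covariate (HDD
% enters through beta_1). Link and covariates are assumed for
% illustration only.
\[
  y_j \sim \mathrm{Binomial}(H_j,\, p_j), \qquad
  \operatorname{logit}(p_j) = \beta_0 + \beta_1 \log H_j .
\]
% Adding HDI as between-patch random variation on the linear predictor
% gives the hierarchical GLM case:
\[
  \operatorname{logit}(p_j) = \beta_0 + \beta_1 \log H_j + u_j, \qquad
  u_j \sim \mathcal{N}(0, \sigma^2).
\]
```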
Abstract:
Numerical models are important tools used in engineering fields to predict the behaviour and the impact of physical elements. There may be advantages to be gained by combining Case-Based Reasoning (CBR) techniques with numerical models. This paper considers how CBR can be used as a flexible query engine to improve the usability of numerical models. In particular, CBR can help to solve inverse and mixed problems, and to solve constraint problems. We discuss this idea with reference to the illustrative example of a pneumatic conveyor problem. The paper describes example problems faced by design engineers in this context and the issues that need to be considered in this approach. Solving these problems requires methods to handle constraints in both the retrieval phase and the adaptation phase of a typical CBR cycle. We show approaches to the solution of these problems via a CBR tool.
Abstract:
We consider various single machine scheduling problems in which the processing time of a job depends either on its position in a processing sequence or on its start time. We focus on problems of minimizing the makespan or the sum of (weighted) completion times of the jobs. In many situations we show that the objective function is priority-generating, and therefore the corresponding scheduling problem under series-parallel precedence constraints is polynomially solvable. In other situations we provide counter-examples that show that the objective function is not priority-generating.
Abstract:
Host-parasitoid models including integrated pest management (IPM) interventions with impulsive effects at both fixed and unfixed times were analyzed with regard to host-eradication, host-parasitoid persistence and host-outbreak solutions. The host-eradication periodic solution with fixed moments is globally stable if the host's intrinsic growth rate is less than the summation of the mean host-killing rate and the mean parasitization rate during the impulsive period. Solutions for all three categories can coexist, with switch-like transitions among their attractors showing that varying dosages and frequencies of insecticide applications and the numbers of parasitoids released are crucial. Periodic solutions also exist for models with unfixed moments for which the maximum amplitude of the host is less than the economic threshold. The dosages and frequencies of IPM interventions for these solutions are much reduced in comparison with the pest-eradication periodic solution. Our results, which are robust to inclusion of stochastic effects and with a wide range of parameter values, confirm that IPM is more effective than any single control tactic.
Abstract:
Abstract not available
Abstract:
Serial Analysis of Gene Expression (SAGE) is a relatively new method for monitoring gene expression levels and is expected to contribute significantly to progress in cancer treatment by enabling precise and early diagnosis. A promising application of SAGE gene expression data is the classification of tumors. In this paper, we build three event models (the multivariate Bernoulli model, the multinomial model and the normalized multinomial model) for SAGE data classification. Both binary classification and multicategory classification are investigated. Experiments on two SAGE datasets show that the multivariate Bernoulli model performs well with small feature sizes, the multinomial model performs better with large feature sizes, and the normalized multinomial model performs well with medium feature sizes. Overall, the multinomial model achieves the highest accuracy.
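The distinction between two of these event models can be illustrated with a minimal naive Bayes classifier on hypothetical SAGE-style tag counts. The classes, data and smoothing choice below are invented for demonstration and are not the paper's datasets or pipeline:

```python
import math

# Toy training data: hypothetical tag-count vectors per SAGE library,
# grouped by invented class labels.
train = {
    "tumour": [[5, 0, 2], [4, 1, 3]],
    "normal": [[0, 3, 1], [1, 4, 0]],
}

def multinomial_log_likelihood(counts, class_docs, alpha=1.0):
    """log P(counts | class) under the multinomial event model:
    tag frequencies matter, with add-alpha (Laplace) smoothing."""
    n_tags = len(counts)
    tag_totals = [sum(doc[t] for doc in class_docs) for t in range(n_tags)]
    total = sum(tag_totals)
    ll = 0.0
    for t, c in enumerate(counts):
        p = (tag_totals[t] + alpha) / (total + alpha * n_tags)
        ll += c * math.log(p)
    return ll

def bernoulli_log_likelihood(counts, class_docs, alpha=1.0):
    """log P(presence pattern | class) under the multivariate Bernoulli
    event model: only presence/absence of each tag matters."""
    n_docs = len(class_docs)
    ll = 0.0
    for t, c in enumerate(counts):
        present = sum(1 for doc in class_docs if doc[t] > 0)
        p = (present + alpha) / (n_docs + 2 * alpha)
        ll += math.log(p) if c > 0 else math.log(1.0 - p)
    return ll

def classify(counts, model):
    """Pick the class with the highest log-likelihood (uniform priors)."""
    return max(train, key=lambda cls: model(counts, train[cls]))

label = classify([3, 0, 1], multinomial_log_likelihood)  # → "tumour"
```

The normalized multinomial variant would additionally rescale the count vectors before scoring; it is omitted here for brevity.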
Abstract:
Recently, research has been carried out to test a novel bumping method which omits the under bump metallurgy forming process by bonding copper columns directly onto the Al pads of the silicon dies. This bumping method could be adopted to simplify the flip chip manufacturing process, increase productivity and achieve a higher I/O count. This paper describes an investigation of the solder joint reliability of flip chips based on this new bumping process. Computer modelling methods are used to predict the shape of solder joints and the response of flip chips to thermal cyclic loading. The accumulated plastic strain energy at the corner solder joints is used as the damage indicator. Models with a range of design parameters have been compared for their reliability. The parameters that have been investigated are the copper column height, radius and solder volume. The ranking of the relative importance of these parameters is given. For most of the results presented in the paper, the solder material has been assumed to be the lead-free 96.5Sn3.5Ag alloy, but some results for 60Sn40Pb solder joints are also presented.
Abstract:
We consider a range of single machine and identical parallel machine pre-emptive scheduling models with controllable processing times. For each model we study a single criterion problem to minimize the compression cost of the processing times subject to the constraint that all due dates should be met. We demonstrate that each single criterion problem can be formulated in terms of minimizing a linear function over a polymatroid, and this justifies the greedy approach to its solution. A unified technique allows us to develop fast algorithms for solving both single criterion problems and bicriteria counterparts.
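The greedy approach that the polymatroid formulation justifies can be illustrated with the classical greedy algorithm (due to Edmonds) for optimizing a linear function over a polymatroid. The submodular function and weights below are invented for illustration; the paper's cost-minimization variant will differ in detail:

```python
# Classical greedy for optimizing a linear objective over the polymatroid
# P(f) = { x >= 0 : sum_{i in S} x_i <= f(S) for all S }, where f is a
# monotone submodular function with f(empty set) = 0. Generic sketch, not
# the paper's exact algorithm.

def greedy_polymatroid_max(weights, f):
    """Maximize sum_i weights[i]*x[i] over P(f): visit coordinates in
    decreasing weight order and give each its marginal capacity.
    Coordinates with non-positive weight are left at 0."""
    n = len(weights)
    order = sorted(range(n), key=lambda i: -weights[i])
    x = [0.0] * n
    chosen = []
    for i in order:
        if weights[i] <= 0:
            break
        x[i] = f(chosen + [i]) - f(chosen)  # marginal value of adding i
        chosen.append(i)
    return x

# Example: a coverage-style submodular function over hypothetical sets.
cover = {0: {"a", "b"}, 1: {"b", "c"}, 2: {"c"}}

def f(S):
    covered = set()
    for i in S:
        covered |= cover[i]
    return float(len(covered))

x = greedy_polymatroid_max([3.0, 2.0, 1.0], f)  # → [2.0, 1.0, 0.0]
```

Correctness of this greedy rests exactly on submodularity of f, which is what the paper's reduction to a polymatroid establishes for its scheduling costs.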
Abstract:
Heat is extracted from an electronic package by convection, conduction, and/or radiation. The amount of heat extracted by forced convection using air is highly dependent on the characteristics of the airflow around the package, including its velocity and direction. Turbulence in the air is also important and must be modeled accurately in thermal design codes that use computational fluid dynamics (CFD). During air cooling the flow can be classified as laminar, transitional, or turbulent. In electronics systems, the flow around the packages is usually in the transition region, which lies between laminar and turbulent flow. This requires a low-Reynolds-number numerical model to fully capture the impact of turbulence on the fluid flow calculations. This paper provides comparisons between a number of turbulence models and experimental data. The models compared include the LVEL model (based on the distance from the nearest wall and the local velocity) and the Wolfshtein, Norris and Reynolds, k-ε, k-ω, shear-stress transport (SST), and k-ε/k-l models. Results show that in terms of the fluid flow calculations most of the models capture the difficult wake recirculation region behind the package reasonably well, although for packages whose heights cause a high degree of recirculation behind the package the SST model appears to struggle. The paper also demonstrates the sensitivity of the models to changes in mesh density; this study is aimed specifically at thermal design engineers, since mesh-independent simulations are rarely conducted in an industrial environment.
Abstract:
Johnson's SB and the logit-logistic are four-parameter distribution models that may be obtained from the standard normal and logistic distributions by a four-parameter transformation. For relatively small data sets, such as diameter at breast height measurements obtained from typical sample plots, distribution models with four or fewer parameters have been found to be empirically adequate. However, in situations in which the distributions are complex, for example in mixed stands or when the stand has been thinned or when working with aggregated data, distribution models with more shape parameters may prove necessary. By replacing the symmetric standard logistic distribution of the logit-logistic with a one-parameter “standard Richards” distribution and transforming by a five-parameter Richards function, we obtain a new six-parameter distribution model, the “Richit-Richards”. The Richit-Richards includes the “logit-Richards”, the “Richit-logistic”, and the logit-logistic as submodels. Maximum likelihood estimation is used to fit the model, and some problems in the maximum likelihood estimation of bounding parameters are discussed. An empirical case study of the Richit-Richards and its submodels is conducted on pooled diameter at breast height data from 107 sample plots of Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.). It is found that the new models provide significantly better fits than the four-parameter logit-logistic for large data sets.
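The four-parameter transformation underlying both base models can be written out explicitly. The notation below is the common one for Johnson's SB (location ξ, range λ, shapes γ and δ) and may differ from the paper's:

```latex
% Bounded four-parameter transformation: z is standard normal for
% Johnson's S_B, or standard logistic for the logit-logistic.
\[
  z = \gamma + \delta \,
      \ln\!\frac{x - \xi}{\xi + \lambda - x},
  \qquad \xi < x < \xi + \lambda .
\]
% The Richit-Richards generalises this scheme: the base variable z
% follows a one-parameter "standard Richards" distribution and the
% transformation gains a fifth (Richards) parameter, giving six
% parameters in total.
```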
Abstract:
Various models for predicting discharge rates have been developed over the last four decades by many researchers (notably Beverloo [1], Johanson [2], Brown [3], Carleton [4], Crewdson [5], Nedderman [6], Gu [7]). In many cases these models offer comparable approaches to the prediction of discharge rates of bulk particulates from storage equipment when gravity alone acts to initiate flow (since they invariably consider the use of mass-flow design equipment). The models that have been developed cover a wide range of bulk particulates (coarse, incompressible, fine, cohesive) and most contemporary works have incorporated validation against test programmes. Research currently underway at The Wolfson Centre for Bulk Solids Handling Technology, University of Greenwich, has considered the relative performance of these models with respect to a range of bulk properties, with particular focus upon the flexibility of the models to cater for different geometrical factors for vessels.
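The best-known of these models, the Beverloo correlation [1], gives the gravity-driven mass discharge rate through a circular orifice; the symbols below follow common usage, which may differ from the original papers:

```latex
% Beverloo correlation for the mass flow rate W of coarse granular
% material through a circular orifice of diameter D_o:
\[
  W = C\, \rho_b \, \sqrt{g}\, (D_o - k\, d)^{5/2},
\]
% where rho_b is the bulk density, g the gravitational acceleration,
% d the particle diameter, and C and k are empirical constants
% (values of about C = 0.58 and k = 1.4 are typically reported for
% roughly spherical particles).
```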