938 results for optimal-stocking model
Abstract:
Almost all leprosy cases reported in industrialized countries occur among immigrants or refugees from developing countries where leprosy remains an important health issue. Screening for leprosy is therefore an important question for governments of countries with immigration and refugee programmes. A decision analysis framework is used to evaluate leprosy screening. The analysis uses a set of criteria and parameters regarding leprosy screening, together with available data, to estimate the number of cases that would be detected by a leprosy screening programme of immigrants from countries with different leprosy prevalences, compared with a policy of waiting for immigrants who develop symptomatic clinical disease to present for health care. In a cohort of 100,000 immigrants from high-prevalence regions (3.6/10,000), screening would detect 32 of the 42 cases that would arise in the destination country over the 14 years after migration; from medium-prevalence areas (0.7/10,000), 6.3 of the total 8.1 cases would be detected; and from low-prevalence regions (0.2/10,000), 1.8 of 2.3 cases. Using Australian data, the migrant mix would produce 74 leprosy cases from 10 years' intake; screening would detect 54, and 19 would be diagnosed subsequently after migration. Screening would produce a significant case yield only among immigrants from regions or social groups with high leprosy prevalence. Since the number of immigrants to Australia from countries of higher endemicity is not large, routine leprosy screening would have a small impact on case incidence.
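The detection fractions implied by the figures quoted above can be checked with a little arithmetic; the sketch below simply recomputes the share of cases a screening programme would catch in each prevalence stratum, using only the numbers in the abstract.

```python
# Cases per 100,000 immigrants, as quoted in the abstract:
# (screen-detected cases, total cases arising over 14 years after migration)
strata = {
    "high (3.6/10,000)":   (32.0, 42.0),
    "medium (0.7/10,000)": (6.3, 8.1),
    "low (0.2/10,000)":    (1.8, 2.3),
}

detection_rate = {name: detected / total
                  for name, (detected, total) in strata.items()}

for name, rate in detection_rate.items():
    print(f"{name}: screening detects {rate:.0%} of cases")
```

In every stratum roughly three quarters of cases are screen-detectable, which is why the absolute case yield, rather than the detection fraction, drives the policy conclusion.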
Abstract:
We model a buyer who wishes to combine objects owned by two separate sellers in order to realize higher value. Sellers are able to avoid entering into negotiations with the buyer, so that the order in which they negotiate is endogenous. Holdout occurs if at least one of the sellers is not present in the first round of negotiations. We demonstrate that complementarity of the buyer's technology is a necessary condition for equilibrium holdout. Moreover, a rise in complementarity leads to an increased likelihood of holdout, and an increased efficiency loss. Applications include patents, the land assembly problem, and mergers.
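A minimal numeric illustration of the complementarity condition (the notation here is illustrative, not the paper's): the buyer's technology is complementary when the value of the combined objects strictly exceeds the sum of their stand-alone values.

```python
def is_complementary(v_a, v_b, v_both):
    """True when combining the two objects yields strictly more than
    the sum of their stand-alone values (superadditive buyer value)."""
    return v_both > v_a + v_b

# Superadditive case: the combination creates surplus beyond the parts,
# so a seller may prefer to hold out and bargain over that surplus later.
print(is_complementary(1.0, 1.0, 3.0))
# Subadditive case: no extra surplus from combining, no holdout motive.
print(is_complementary(1.0, 1.0, 1.8))
```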
Abstract:
Purpose - Using Brandenburger and Nalebuff's 1995 co-opetition model as a reference, the purpose of this paper is to develop a tool that, based on the tenets of classical game theory, would enable scholars and managers to identify which games may be played in response to the different conflict-of-interest situations faced by companies in their business environments. Design/methodology/approach - The literature on game theory and business strategy is reviewed and a conceptual model, the strategic games matrix (SGM), is developed. Two novel games are described and modeled. Findings - The co-opetition model is not sufficient to realistically represent most of the conflict-of-interest situations faced by companies. This paper seeks to address that problem through the development of the SGM, which expands upon Brandenburger and Nalebuff's model by providing a broader perspective, through the incorporation of an additional dimension (the power ratio between players) and three novel postures (rival, individualistic, and associative). Practical implications - The proposed model, based on the concepts of game theory, can be used to train decision- and policy-makers to better understand, interpret and formulate conflict management strategies. Originality/value - A practical and original tool for using game models in conflict-of-interest situations is generated. Basic classical games, such as Nash, Stackelberg, Pareto, and Minimax, are mapped on the SGM to suggest in which situations they could be useful. Two innovative games are described to fit four different types of conflict situations that so far have no corresponding game in the literature. A test application of the SGM to a classic Intel Corporation strategic management case, in the complex personal computer industry, shows that the proposed method is able to describe, interpret, analyze, and prescribe optimal competitive and/or cooperative strategies for each conflict-of-interest situation.
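For readers unfamiliar with the classical games the SGM maps, the minimax (security-level) reasoning can be shown on a hypothetical 2x2 zero-sum payoff matrix; the payoff values below are illustrative only and not taken from the paper.

```python
# Row player's payoffs in a hypothetical 2x2 zero-sum game.
payoff = [[3, -1],
          [0, 2]]

# Maximin: the payoff the row player can guarantee in pure strategies.
maximin = max(min(row) for row in payoff)

# Minimax: the best the column player can hold the row player to.
minimax = min(max(payoff[i][j] for i in range(2)) for j in range(2))

# When maximin < minimax there is no pure-strategy saddle point.
print(maximin, minimax)
```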
Abstract:
Starting from an initial price vector, prices are adjusted in order to eliminate excess demand while keeping the transfers to the sellers as low as possible. The key issue in the description of the algorithm is determining, at each step of the auction, to which set of sellers those transfers should be made. We assume additively separable utilities and introduce a novel distinction by considering multiple sellers owning multiple identical objects and multiple buyers with an exogenously defined quota, consuming more than one object but at most one unit of any seller's good and having multi-dimensional payoffs. This distinction necessarily makes the construction of the over-demanded sets more complicated than the corresponding constructions for other assignment games. Under this approach, our mechanism yields the buyer-optimal competitive equilibrium payoff, which equals the buyer-optimal stable payoff. By the symmetry of the model, the seller-optimal stable payoff can be obtained, and the seller-optimal competitive equilibrium payoff can then also be derived.
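The paper's mechanism operates on over-demanded sets of sellers, which is beyond a short sketch, but the underlying price-adjustment idea (start from the lowest prices and raise them only until excess demand disappears, keeping transfers low) can be illustrated for a single good with unit-demand buyers. The function and numbers below are illustrative assumptions, not the paper's algorithm.

```python
def ascending_price(values, supply, step=1):
    """Raise a single price from zero until demand no longer exceeds
    supply. Buyers have unit demand and want the good iff their value
    is at least the price. A toy tatonnement, not the multi-object
    mechanism described in the abstract."""
    price = 0
    while sum(v >= price for v in values) > supply:
        price += step
    return price

# Four unit-demand buyers, two identical units for sale:
# the price rises just far enough to choke off excess demand.
print(ascending_price([10, 8, 8, 3], supply=2))
```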
Abstract:
Certification of an ISO 14001 Environmental Management System (EMS) is currently an important requirement for enterprises wishing to sell their products in a global market. The system's structure is based on environmental impact evaluation (EIE). However, if an erroneous or inadequate methodology is applied, the entire process may be jeopardized. Many methodologies have been developed for performing EIEs; some are fairly complex and unsuitable for EMS implementation in an organizational context, principally when small and medium-sized enterprises (SMEs) are involved. The proposed methodology for EIE is part of a model for implementing EMS. The methodological approach used was a qualitative exploratory research method based upon sources of evidence such as document analyses, semi-structured interviews and participant observations. By adopting a cooperative implementation model based on the theory of systems engineering, difficulties relating to implementation of the sub-system were overcome, thus encouraging SMEs to implement EMS. (C) 2007 Elsevier Ltd. All rights reserved.
Abstract:
Reasons for performing study: Light microscopical studies show that the key lesion of laminitis is separation at the hoof lamellar dermal-epidermal interface. More precise knowledge of the damage occurring in the lamellar basement membrane zone may result if laminitis-affected tissue is examined with the transmission electron microscope. This could lead to a better understanding of the pathogenesis of the lesions and of the means of treatment or prevention. Objectives: To investigate the ultrastructure of acute laminitis as disease of increasing severity is induced by increasing oligofructose (OF) dosage. Methods: Three pairs of normal horses, dosed with OF at 7.5, 10 and 12.5 g/kg bwt via nasogastric intubation, developed laminitis 48 h later. Following euthanasia, their forefeet were processed for transmission electron microscopy. Lamellar basal cell hemidesmosome (HD) numbers and the distance between the basal cell plasmalemma and the lamina densa of the basement membrane were estimated and compared to control tissue. Results: Increasing OF dosage caused greater HD loss and more severe laminitis. The characteristic separation of the basement membrane, cytoskeleton failure and rounded basal cell nuclei result from combined HD disassembly and anchoring filament failure. Conclusions: Without properly assembled HDs, dysadhesion between the lamina densa of the basement membrane (BM) and epidermal basal cells occurs, emphasising the fundamental importance of HDs in maintaining attachment at the lamellar interface. Medical conditions that trigger lamellar matrix metalloproteinase (MMP) activation and/or compromise entry of glucose into lamellar basal cells appear to promote loss and failure of HDs and, therefore, laminitis development. Potential relevance: A correlation between lameness severity and escalating loss of lamellar HDs now exists. Therapy aimed at protecting the lamellar environment from haematogenous delivery of MMP activators or from glucose deprivation may control laminitis development.
Abstract:
This article presents a proposal for a systemic model composed of the micro and small enterprises (MSEs) of the region of Ribeirão Preto and the agents that influence their environment. The proposed model was based on Stafford Beer's systemic methodology VSM (Viable System Model; Diagnosing the system for organizations. Chichester, Wiley, 1985) and on Werner Ulrich's (1983) CSH (Critical Systems Heuristics). The VSM is a model for diagnosing the structure of an organization and its flows of information through the application of cybernetics concepts (Narvarte, in El Modelo del Sistema Viable-MSV: experiencias de su aplicación en Chile. Proyecto Cerebro Colectivo del IAS, Santiago, 2001). The CSH, on the other hand, focuses on the context of the social group, applying the systemic vision as a counterpoint to the organizational management view taken by the VSM. The MSEs of Ribeirão Preto and Sertãozinho were analyzed as organizations inserted in systems that relate to and integrate with other systems concerning the public administration, entities of representation and promotion agencies. The research questions are: what are the bonds of interaction among the subsystems in this process, and who are the agents involved? The systemic approach not only diagnosed a social group, formed by the MSEs of Ribeirão Preto and Sertãozinho, public authorities and support entities, but was also able to delineate answers aimed at clarifying obscure questions, providing support for the formulation of efficient actions for the development of this system.
Abstract:
No abstract
Abstract:
We develop a forward-looking version of the recursive dynamic MIT Emissions Prediction and Policy Analysis (EPPA) model and apply it to examine the economic implications of proposals in the US Congress to limit greenhouse gas (GHG) emissions. We find that shocks in the consumption path are smoothed out in the forward-looking model and that the lifetime welfare cost of GHG policy is lower than in the recursive model, since the forward-looking model can fully optimize over time. The forward-looking model also allows us to explore issues for which it is uniquely well suited, including revenue recycling and early action crediting. We find capital tax recycling to be more effective at reducing welfare costs than labor tax recycling because of its long-term effect on economic growth. There are also substantial incentives for early action credits; however, when spread over the full horizon of the policy, they do not have a substantial effect on lifetime welfare costs.
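The consumption-smoothing effect of forward-looking behavior can be seen in a toy two-period problem with log utility, no discounting and a zero interest rate (all assumptions of this sketch, not of the EPPA model): the perfect-foresight consumer equalizes consumption across periods, so an anticipated income loss in period 2 is spread over both periods.

```python
def smooth_two_period(y1, y2, r=0.0, beta=1.0):
    """Perfect-foresight consumption plan with log utility.
    With discount factor beta and interest rate r, the Euler
    equation gives c2 = beta * (1 + r) * c1, and the budget
    constraint pins down the levels."""
    wealth = y1 + y2 / (1 + r)
    c1 = wealth / (1 + beta)
    c2 = (wealth - c1) * (1 + r)
    return c1, c2

# A policy 'shock' cuts period-2 income from 100 to 60: the
# forward-looking consumer absorbs half of it in each period.
print(smooth_two_period(100, 60))
```

A recursive (myopic) consumer would instead consume income period by period, taking the full hit in period 2, which is why lifetime welfare costs are higher without foresight.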
Abstract:
This paper develops a multi-regional general equilibrium model for climate policy analysis based on the latest version of the MIT Emissions Prediction and Policy Analysis (EPPA) model. We develop two versions so that we can solve the model either as a fully inter-temporal optimization problem (forward-looking, perfect foresight) or recursively. The standard EPPA model on which these models are based is solved recursively, and it is necessary to simplify some aspects of it to make inter-temporal solution possible. The forward-looking capability allows one to better address economic and policy issues such as borrowing and banking of GHG allowances, efficiency implications of environmental tax recycling, endogenous depletion of fossil resources, international capital flows, and optimal emissions abatement paths, among others. To evaluate the solution approaches, we benchmark each version to the same macroeconomic path, and then compare the behavior of the two versions under a climate policy that restricts greenhouse gas emissions. We find that the energy sector and CO2 price behavior are similar in both versions (in the recursive version of the model we impose the inter-temporal theoretical efficiency result that abatement through time should be allocated such that the CO2 price rises at the interest rate). The main difference is that the macroeconomic costs are substantially lower in the forward-looking version of the model, since it allows consumption shifting as an additional avenue of adjustment to the policy. On the other hand, the simplifications required for solving the model as an optimization problem, such as dropping the full vintaging of the capital stock and including fewer explicit technological options, likely affect the results. Moreover, inter-temporal optimization with perfect foresight poorly represents the real economy, where agents face high levels of uncertainty that likely lead to higher costs than if they knew the future with certainty. We conclude that while the forward-looking model has value for some problems, the recursive model produces similar behavior in the energy sector and provides greater flexibility in the details of the system that can be represented. (C) 2009 Elsevier B.V. All rights reserved.
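The inter-temporal efficiency condition mentioned above, a carbon price rising at the interest rate, is a Hotelling-style price path; the sketch below uses illustrative numbers (initial price and interest rate are assumptions, not values from the paper).

```python
def hotelling_path(p0, r, periods):
    """CO2 price path rising at interest rate r each period:
    p_t = p0 * (1 + r)^t, the condition under which abatement
    is efficiently allocated through time."""
    return [p0 * (1 + r) ** t for t in range(periods)]

# Illustrative: a $10/tCO2 starting price, 4% interest rate, 5 periods.
print(hotelling_path(10.0, 0.04, 5))
```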
Abstract:
We present an electronic model with long-range interactions. Through the quantum inverse scattering method, the integrability of the model is established using a one-parameter family of typical irreducible representations of gl(2|1). The eigenvalues of the conserved operators are derived in terms of the Bethe ansatz, from which the energy eigenvalues of the Hamiltonian are obtained.
Abstract:
Most biogeographical studies propose that southern temperate faunal disjunctions are either the result of vicariance of taxa that originated in Gondwana or the result of transoceanic dispersal of taxa that originated after the breakup of Gondwana. The aim of this paper is to show that this is a false dichotomy. Antarctica retained a mild climate until the mid-Cenozoic and had lasting connections, notably with southern South America and Australia. Both taxa originally Gondwanan and taxa secondarily on Gondwanan areas were subjected to tectonically induced vicariance, and there is no need to invoke ad hoc transoceanic dispersal, even for post-Gondwanan taxa. These different elements with circumantarctic distributions are here called 'allochronic taxa': taxa presently occupying the same area, but whose presence in that area does not belong to the same time period. This model allows the conflicting sources of evidence now available for many groups with circumantarctic distributions to be accommodated. The fact that species from both layers are mixed in the current biodiversity implies the need to use additional sources of evidence, such as biogeographical, palaeontological, geological and molecular data, to discriminate which are the original Gondwanan elements and which are the post-Gondwanan elements in austral landmasses.
Abstract:
In this paper, we propose a new nonlocal density functional theory characterization procedure, the finite wall thickness model, for nanoporous carbons, whereby the heterogeneity of pore sizes and pore walls in the carbon is probed simultaneously. We determine the pore size distributions and pore wall thickness distributions of several commercial activated carbons and coal chars, with good correspondence with X-ray diffraction. It is shown that the conventional infinite wall thickness approach slightly overestimates the pore size. Pore-pore correlation is shown to have a negligible effect on the prediction of pore size and pore wall thickness distributions for small molecules, such as the argon used in characterization. By utilizing the structural parameters (pore size and pore wall thickness distributions) in the generalized adsorption isotherm (GAI), we are able to predict the adsorption uptake of supercritical gases in BPL and Norit R1 Extra carbons, in excellent agreement with experimental adsorption uptake data up to 60 MPa. The method offers a useful technique for probing features of the solid skeleton hitherto studied by crystallographic methods.
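The generalized adsorption isotherm referred to above weights single-pore (local) isotherms by the pore size distribution. The discretized sketch below shows only the structure of that calculation; the Langmuir-style local isotherm, the width-dependent affinity, and the PSD weights are stand-in assumptions, not the NLDFT kernel used in the paper.

```python
def gai_uptake(pressure, pore_widths, psd_weights, local_isotherm):
    """Generalized adsorption isotherm (discretized):
    total uptake = sum over pore widths H of f(H) * rho(P, H),
    where f is the pore size distribution and rho the local isotherm."""
    return sum(w * local_isotherm(pressure, h)
               for h, w in zip(pore_widths, psd_weights))

def langmuir_local(pressure, width):
    # Stand-in local isotherm: Langmuir form with a width-dependent
    # affinity, purely illustrative.
    b = 0.1 * width
    return b * pressure / (1.0 + b * pressure)

widths = [0.7, 1.0, 2.0]    # pore widths in nm, illustrative
weights = [0.5, 0.3, 0.2]   # PSD weights summing to 1
print(gai_uptake(5.0, widths, weights, langmuir_local))
```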
Abstract:
This paper presents a method for estimating the posterior probability density of the cointegrating rank of a multivariate error correction model. A second contribution is the careful elicitation of the prior for the cointegrating vectors derived from a prior on the cointegrating space. This prior obtains naturally from treating the cointegrating space as the parameter of interest in inference and overcomes problems previously encountered in Bayesian cointegration analysis. Using this new prior and Laplace approximation, an estimator for the posterior probability of the rank is given. The approach performs well compared with information criteria in Monte Carlo experiments. (C) 2003 Elsevier B.V. All rights reserved.
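The Laplace approximation used above replaces an intractable marginal-likelihood integral with a Gaussian integral around the mode of the log-integrand. The one-dimensional sketch below is generic, not the paper's cointegration-specific derivation; it is exact when the log-integrand is quadratic.

```python
import math

def laplace_approx(log_f, mode, h=1e-4):
    """Approximate the integral of exp(log_f) by a second-order
    expansion of log_f around its mode:
    integral ~= exp(log_f(mode)) * sqrt(2*pi / -log_f''(mode)),
    with the second derivative taken by central finite difference."""
    second = (log_f(mode + h) - 2 * log_f(mode) + log_f(mode - h)) / h ** 2
    return math.exp(log_f(mode)) * math.sqrt(2 * math.pi / -second)

# Exact for a Gaussian log-integrand: integral of exp(-x^2/2) = sqrt(2*pi)
print(laplace_approx(lambda x: -x * x / 2, 0.0))
```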