64 results for Dental fixed architecture
Abstract:
Many revenue management (RM) industries are characterized by (a) fixed capacities in the short term (e.g., hotel rooms, seats on an airline flight), (b) homogeneous products (e.g., two airline flights between the same cities at similar times), and (c) customer purchasing decisions largely influenced by price. Competition in these industries is also very high, even with just two or three direct competitors in a market. However, RM competition is not well understood, and practically all known implementations of RM software and most published models of RM do not explicitly model competition. For this reason, there has been considerable recent interest and research activity to understand RM competition. In this paper we study price competition for an oligopoly in a dynamic setting, where each of the sellers has a fixed number of units available for sale over a fixed number of periods. Demand is stochastic, and depending on how it evolves, sellers may change their prices at any time. This reflects the fact that firms constantly, and almost costlessly, change their prices (alternately, allocations at a price in quantity-based RM), reacting either to updates in their estimates of market demand, competitor prices, or inventory levels. We first prove existence of a unique subgame-perfect equilibrium for a duopoly. In equilibrium, in each state sellers engage in Bertrand competition, so that the seller with the lowest reservation value ends up selling a unit at a price that is equal to the equilibrium reservation value of the competitor. This structure hence extends the marginal-value concept of bid-price control, used in many RM implementations, to a competitive model. In addition, we show that the seller with the lowest capacity sells all its units first. Furthermore, we extend the results transparently to n firms and perform a number of numerical comparative statics exploiting the uniqueness of the subgame-perfect equilibrium.
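As a concrete illustration of the equilibrium structure described above, the following minimal Python sketch computes candidate duopoly value functions by backward induction. The demand model (one Bernoulli customer arrival per period), the willingness-to-pay cap P_MAX, and the tie-breaking rule are assumptions made for the sketch, not the paper's specification.

```python
# Hedged sketch: backward induction over states (x1, x2, t) for the duopoly
# described in the abstract. LAM, P_MAX, T and the tie-breaking rule are
# illustrative assumptions, not the paper's exact model.
from functools import lru_cache

LAM = 0.7     # probability a customer arrives in a period (assumption)
P_MAX = 10.0  # customers' willingness to pay, caps the price (assumption)
T = 20        # number of selling periods (assumption)

@lru_cache(maxsize=None)
def values(x1, x2, t):
    """(V1, V2): expected equilibrium revenues-to-go with inventories
    x1, x2 and t periods remaining."""
    if t == 0 or (x1 == 0 and x2 == 0):
        return (0.0, 0.0)
    v1_keep, v2_keep = values(x1, x2, t - 1)
    # Reservation (marginal) values: the revenue a seller forgoes by
    # parting with one unit now rather than carrying it forward.
    r1 = v1_keep - values(x1 - 1, x2, t - 1)[0] if x1 > 0 else float("inf")
    r2 = v2_keep - values(x1, x2 - 1, t - 1)[1] if x2 > 0 else float("inf")
    # Bertrand outcome in each state: the seller with the lower reservation
    # value sells one unit at the rival's reservation value (capped at P_MAX).
    if r1 <= r2:
        price = min(r2, P_MAX)
        v1_sale, v2_sale = values(x1 - 1, x2, t - 1)
        v1_sale += price
    else:
        price = min(r1, P_MAX)
        v1_sale, v2_sale = values(x1, x2 - 1, t - 1)
        v2_sale += price
    return (LAM * v1_sale + (1 - LAM) * v1_keep,
            LAM * v2_sale + (1 - LAM) * v2_keep)

print(values(3, 5, T))  # equilibrium values for capacities 3 and 5
```

Because each state's outcome depends only on next-period value functions, uniqueness of the continuation values pins down the equilibrium state by state, which is what the comparative statics mentioned above exploit.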
Abstract:
In the mid-1980s, many European countries introduced fixed-term contracts. Since then their labor markets have become more dynamic. This paper studies the implications of such reforms for the duration distribution of unemployment, with particular emphasis on the changes in the duration dependence. I estimate a parametric duration model using cross-sectional data drawn from the Spanish Labor Force Survey from 1980 to 1994 to analyze the chances of leaving unemployment before and after the introduction of fixed-term contracts. I find that duration dependence has increased since the reform. Semi-parametric estimation of the model also shows that, for long spells, the probability of leaving unemployment has decreased since the reform.
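The abstract does not name the parametric family; purely as an illustration of what "duration dependence" means in such a model, a standard Weibull specification for the hazard of leaving unemployment is

$$\lambda(t \mid x) = \alpha\, t^{\alpha - 1} \exp(x'\beta),$$

where $t$ is elapsed unemployment duration and $x$ the covariates. Duration dependence is governed by $\alpha$: the exit hazard rises with elapsed duration when $\alpha > 1$ and falls when $\alpha < 1$.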
Abstract:
Most methods for small-area estimation are based on composite estimators derived from design- or model-based methods. A composite estimator is a linear combination of a direct and an indirect estimator, with weights that usually depend on unknown parameters which need to be estimated. Although model-based small-area estimators are usually based on random-effects models, the assumption of fixed effects is at face value more appropriate. Model-based estimators are justified by the assumption of random (interchangeable) area effects; in practice, however, areas are not interchangeable. In the present paper we empirically assess the quality of several small-area estimators in the setting in which the area effects are treated as fixed. We consider two settings: one that draws samples from a theoretical population, and another that draws samples from an empirical population of a labor force register maintained by the National Institute of Social Security (NISS) of Catalonia. We distinguish two types of composite estimators: (a) those that use weights involving area-specific estimates of bias and variance; and (b) those that use weights involving a common variance and a common squared-bias estimate for all areas. We assess their precision and discuss alternatives for optimizing composite estimation in applications.
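For concreteness, the composite estimator referred to above has the form

$$\hat\theta_a = w_a\,\hat\theta_a^{\text{dir}} + (1 - w_a)\,\hat\theta_a^{\text{ind}}, \qquad 0 \le w_a \le 1,$$

and, assuming a design-unbiased direct estimator and independence of the two components (assumptions of this illustration, not necessarily of the paper), the MSE-minimizing weight is

$$w_a^{*} = \frac{\mathrm{MSE}(\hat\theta_a^{\text{ind}})}{\mathrm{Var}(\hat\theta_a^{\text{dir}}) + \mathrm{MSE}(\hat\theta_a^{\text{ind}})}.$$

Variant (a) above estimates the bias and variance terms separately for each area, while variant (b) replaces them with a common variance and a common squared bias across areas.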
Abstract:
We propose a stylized model of a problem-solving organization whose internal communication structure is given by a fixed network. Problems arrive randomly anywhere in this network and must find their way to their respective specialized solvers by relying on local information alone. The organization handles multiple problems simultaneously. For this reason, the process may be subject to congestion. We provide a characterization of the threshold of collapse of the network and of the stock of floating problems (or average delay) that prevails below that threshold. We build upon this characterization to address a design problem: the determination of what kind of network architecture optimizes performance for any given problem arrival rate. We conclude that, for low arrival rates, the optimal network is very polarized (i.e., star-like or centralized), whereas it is largely homogeneous (or decentralized) for high arrival rates. We also show that, if an auxiliary assumption holds, the transition between these two opposite structures is sharp and that they are the only ones ever to qualify as optimal.
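A minimal simulation sketch of the centralized (star-like) case, given here only to illustrate the threshold-of-collapse idea: each peripheral node generates a problem with some probability per period, every problem transits the hub, and the hub dispatches one problem per period. The parameters and routing rule are assumptions of the sketch, not the paper's model.

```python
# Hedged sketch: toy discrete-time congestion model for a star network.
# The hub serves one problem per period, so the backlog stays bounded while
# the aggregate arrival rate is below 1 and grows without bound above it.
import random

def star_backlog(n_periphery, arrival_rate, periods, seed=0):
    """Average hub backlog with `n_periphery` nodes, each generating a
    problem with probability `arrival_rate` per period, routed via the hub."""
    rng = random.Random(seed)
    queue = 0
    total = 0
    for _ in range(periods):
        queue += sum(rng.random() < arrival_rate for _ in range(n_periphery))
        if queue:
            queue -= 1  # the hub dispatches one problem per period
        total += queue
    return total / periods

for rate in (0.05, 0.09, 0.11, 0.20):
    # with 10 peripheral nodes the hub saturates near rate = 1/10
    print(rate, round(star_backlog(10, rate, 50_000), 1))
```

Below the threshold (aggregate arrival rate under one problem per period) the average backlog stays bounded; above it the backlog grows linearly in time, which is the "collapse" in the abstract's sense.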
Abstract:
This paper studies the interactions between financing constraints and the employment decisions of firms when both fixed-term and permanent employment contracts are available. We first develop a dynamic model that shows the effects of financing constraints and firing costs on employment decisions. Once calibrated, the model shows that financially constrained firms tend to use fixed-term workers more intensively, and to make them absorb a larger fraction of total employment volatility, than financially unconstrained firms do. We test and confirm the predictions of the model on a unique panel dataset of Italian manufacturing firms with detailed information about the type of workers employed by the firms and about firm financing constraints.
Abstract:
This paper proposes a common and tractable framework for analyzing different definitions of fixed and random effects in a constant-slope, variable-intercept model. It is shown that, regardless of whether effects (i) are treated as parameters or as an error term, (ii) are estimated in different stages of a hierarchical model, or (iii) are allowed to be correlated with the regressors, when the same information on effects is introduced into all estimation methods, the resulting slope estimator is the same across methods. If different methods produce different results, it is ultimately because different information is being used in each method.
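The constant-slope, variable-intercept model in question can be written (in standard panel-data notation, as an illustration) as

$$y_{it} = \alpha_i + x_{it}'\beta + \varepsilon_{it},$$

where the effects $\alpha_i$ are either treated as parameters to be estimated (fixed effects) or as draws from a distribution (random effects); the paper's point is that, for a given information set on the $\alpha_i$, the estimator of the common slope $\beta$ coincides across these treatments.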
Abstract:
The European Space Agency Soil Moisture and Ocean Salinity (SMOS) mission aims at obtaining global maps of soil moisture and sea surface salinity from space for large-scale and climatic studies. It uses an L-band (1400–1427 MHz) Microwave Interferometric Radiometer by Aperture Synthesis to measure the brightness temperature of the Earth's surface at horizontal and vertical polarizations (h and v). These two parameters will be used together to retrieve the geophysical parameters. The retrieval of salinity is a complex process that requires knowledge of other environmental information and accurate processing of the radiometer measurements. Here, we present recent results obtained from several studies and field experiments that were part of the SMOS mission, and highlight the issues still to be solved.
Abstract:
A stochastic nonlinear partial differential equation is constructed for two different models exhibiting self-organized criticality: the Bak-Tang-Wiesenfeld (BTW) sandpile model [Phys. Rev. Lett. 59, 381 (1987); Phys. Rev. A 38, 364 (1988)] and the Zhang model [Phys. Rev. Lett. 63, 470 (1989)]. The dynamic renormalization group (DRG) enables one to compute the critical exponents. However, the nontrivial stable fixed point of the DRG transformation is unreachable for the original parameters of the models. We introduce an alternative regularization of the step function involved in the threshold condition, which breaks the symmetry of the BTW model. Although the symmetry properties of the two models are different, it is shown that they both belong to the same universality class. In this case the DRG procedure leads to a symmetric behavior for both models, restoring the broken symmetry, and makes accessible the nontrivial fixed point. This technique could also be applied to other problems with threshold dynamics.
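The abstract leaves the regularization unspecified; purely as an illustration of what smoothing the threshold step function can look like, one may replace the step $\theta(E - E_c)$ in the toppling condition (with $E$ the local variable and $E_c$ the threshold) by

$$\theta_\varepsilon(E - E_c) = \frac{1}{2}\left[1 + \tanh\frac{E - E_c}{\varepsilon}\right], \qquad \theta_\varepsilon \to \theta \ \text{as}\ \varepsilon \to 0,$$

which makes the dynamics differentiable and hence amenable to a perturbative renormalization-group treatment.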
Abstract:
Our aim is to provide some basic guidelines for the management of patients under anticoagulant therapy, outlining the possible approaches to their medication: keeping the anticoagulant therapy unchanged, modifying the anticoagulant therapy, or combining heparin with oral anticoagulants. The decision should be made jointly with the patient's haematologist, assessing each case according to the intrinsic morbidity of the procedure, the risk of thromboembolism associated with the patient's underlying condition, and the degree of anticoagulation determined by laboratory tests (prothrombin time and thrombotest index).
Abstract:
Dental amalgam fillings are the main source of permanent low-level exposure to mercury vapour (Hg0) and inorganic mercury (Hg(II)) for the general population. The dose of mercury absorbed from amalgam is 2.7 µg/day per person for an average of 7.4 fillings. If this amount consisted entirely of inorganic mercury (Hg(II)), it would be well below the 15 µg/day that the WHO considers a tolerable intake of inorganic mercury for a 65 kg person. In the case of permanent exposure to the same amount as mercury vapour (Hg0), the resulting concentration of 0.18 µg/m³ can be compared with the EPA reference concentration of 0.3 µg/m³ or with the ATSDR minimal risk level of 0.2 µg/m³. Several randomized longitudinal clinical studies have evaluated the relationship between urinary mercury concentration and exposure to mercury from dental amalgam fillings in children, who are particularly vulnerable to Hg0, as well as the possible neurological effects of such exposure. The mean urinary mercury concentration in children treated with amalgam, with an average of 18.7 filled surfaces, rose to a peak of 3.2 µg/L two years after treatment began and had returned to baseline levels by seven years of follow-up, with no alteration detected in any of the neuropsychological monitoring assessments. Likewise, various epidemiological investigations have provided no evidence of a role for amalgam in causing or exacerbating degenerative disorders such as amyotrophic lateral sclerosis, Alzheimer's disease, multiple sclerosis, or Parkinson's disease. Removal of amalgam fillings produces a transient increase in blood mercury levels immediately afterwards, but it is of small magnitude and normalizes within 100 days, so the effect of the rubber dam is of minor toxicological relevance. The conclusion of this review is that dental amalgam remains an excellent filling material.
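A back-of-the-envelope check of the vapour figure above, assuming an adult daily breathing volume of about 15 m³ (the assumption is ours, for illustration):

$$\frac{2.7\ \mu\mathrm{g/day}}{15\ \mathrm{m^3/day}} = 0.18\ \mu\mathrm{g/m^3},$$

which is the concentration compared against the EPA reference concentration (0.3 µg/m³) and the ATSDR minimal risk level (0.2 µg/m³).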
Abstract:
This article reviews the risk of transmission of infectious diseases in the dental clinic. Viral hepatitis, especially hepatitis B and C, human immunodeficiency virus infection, tuberculosis, and other infectious diseases can potentially be transmitted in the course of professional practice, both to patients and to practitioners. Knowledge of the probability of transmission and its characteristics is the basis on which to develop the preventive infection-control measures that seek to avoid, or at least minimize, the probability of acquiring these diseases in the workplace.
Abstract:
This work proposes a parallel architecture for a motion estimation algorithm. It is well known that image processing requires a huge amount of computation, mainly at the low-level processing stage, where algorithms deal with a great number of pixel data. One approach to estimating motion involves detecting the correspondences between two images. Due to its regular processing scheme, a parallel implementation of the correspondence problem is an adequate approach to reducing computation time. This work introduces a parallel, real-time implementation of these low-level tasks, carried out from the moment the current image is acquired by the camera until the pairs of point matchings are detected.
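A hedged sketch of the correspondence step such an architecture parallelizes: each reference block's search is independent of all others, so blocks map naturally onto parallel processing elements (shown here with a process pool). The block size, search radius, and SAD criterion are illustrative choices, not the paper's design.

```python
# Block-matching correspondence between two frames; each block is an
# independent task, which is what makes the problem parallelizable.
from concurrent.futures import ProcessPoolExecutor
import numpy as np

BLOCK, RADIUS = 8, 4  # block size and search radius in pixels (assumed)

def match_block(args):
    """Find the displacement minimizing the sum of absolute differences."""
    prev, curr, y, x = args
    ref = prev[y:y + BLOCK, x:x + BLOCK].astype(int)
    best, best_d = np.inf, (0, 0)
    for dy in range(-RADIUS, RADIUS + 1):
        for dx in range(-RADIUS, RADIUS + 1):
            yy, xx = y + dy, x + dx
            if 0 <= yy <= curr.shape[0] - BLOCK and 0 <= xx <= curr.shape[1] - BLOCK:
                sad = np.abs(curr[yy:yy + BLOCK, xx:xx + BLOCK].astype(int) - ref).sum()
                if sad < best:
                    best, best_d = sad, (dy, dx)
    return (y, x), best_d

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prev = rng.integers(0, 256, (64, 64), dtype=np.uint8)
    curr = np.roll(prev, (2, 1), axis=(0, 1))  # synthetic global motion (+2, +1)
    tasks = [(prev, curr, y, x)
             for y in range(0, 64 - BLOCK + 1, BLOCK)
             for x in range(0, 64 - BLOCK + 1, BLOCK)]
    with ProcessPoolExecutor() as pool:  # blocks matched in parallel
        motion = dict(pool.map(match_block, tasks))
    print(motion[(8, 8)])  # expected (2, 1) away from the wrap-around edges
```

In a real-time hardware realization the same independence would let each block's search run on its own processing element as soon as the relevant image region arrives from the camera.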