900 results for Markov Chains
Abstract:
We address the problem of heat transport in a chain of coupled quantum harmonic oscillators, each exposed to the influence of a local environment, stressing the effects that the specific nature of the environment has on the phenomenology of the transport process. We study in detail the behavior of thermodynamically relevant quantities such as heat currents and mean energies of the oscillators, establishing rigorous analytical conditions for the existence of a steady state, whose features we analyze carefully. In particular, we assess the conditions that must be met to recover trends reminiscent of the classical Fourier law of heat conduction, and highlight how such a possibility depends on the nature of the environment coupled to the system.
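As a rough illustration of the quantities discussed (steady-state temperature profile and boundary heat currents), the following is a minimal classical sketch of a boundary-driven harmonic chain with Langevin baths at its two ends, solved through a Lyapunov equation for the stationary covariance; it is not the paper's quantum treatment, and all parameter values are arbitrary.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Classical sketch (not the quantum model of the paper): a harmonic chain with
# Langevin baths at the two ends.  State z = (x_1..x_N, v_1..v_N) obeys
# dz = A z dt + noise, and the steady-state covariance S solves A S + S A^T + D = 0.
N, k, gamma_b = 10, 1.0, 0.5          # chain length, spring constant, bath coupling
T_left, T_right = 2.0, 1.0            # bath temperatures (k_B = 1, m = 1)

K = 2 * k * np.eye(N) - k * (np.eye(N, k=1) + np.eye(N, k=-1))   # fixed-end couplings
Gamma = np.zeros((N, N)); Gamma[0, 0] = Gamma[-1, -1] = gamma_b  # friction at the ends

A = np.block([[np.zeros((N, N)), np.eye(N)],
              [-K,               -Gamma   ]])
D = np.zeros((2 * N, 2 * N))
D[N, N] = 2 * gamma_b * T_left        # noise acting on v_1
D[-1, -1] = 2 * gamma_b * T_right     # noise acting on v_N

S = solve_continuous_lyapunov(A, -D)  # steady-state covariance matrix

kinetic_temps = np.diag(S)[N:]        # local "temperature" profile <v_i^2>
J_left = gamma_b * (T_left - kinetic_temps[0])     # heat flowing in from the hot bath
J_right = gamma_b * (T_right - kinetic_temps[-1])  # equals -J_left in the steady state
print(kinetic_temps, J_left, J_right)
```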
Abstract:
This paper proposes a continuous-time Markov chain (CTMC) based sequential analytical approach for composite generation and transmission system reliability assessment. The basic idea is to construct a CTMC model for the composite system and perform sequential analyses on it. Various kinds of reliability indices can be obtained, including expectation, variance, frequency, duration and probability distribution. In order to reduce the dimension of the state space, the traditional CTMC modeling approach is modified by merging all high-order contingencies into a single state, which can be evaluated by Monte Carlo simulation (MCS). A state-mergence technique is then developed to integrate all normal states and further reduce the dimension of the CTMC model. Moreover, a time-discretization method is presented for the CTMC model calculation. Case studies are performed on the RBTS and a modified IEEE 300-bus test system. The results indicate that sequential reliability assessment can be performed by the proposed approach. Compared with the traditional sequential Monte Carlo simulation method, the proposed method is more efficient, especially for small-scale or very reliable power systems.
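As a minimal illustration of the CTMC machinery involved (transient probabilities via the matrix exponential, steady-state distribution, frequency and duration indices), the sketch below uses a single two-state up/down component with made-up rates; the paper's composite-system model, state mergence and MCS hybrid are far more elaborate.

```python
import numpy as np
from scipy.linalg import expm

lam, mu = 0.01, 0.1                 # failure and repair rates per hour (made up)
Q = np.array([[-lam,  lam],         # generator matrix: state 0 = up, state 1 = down
              [  mu,  -mu]])

# Transient state probabilities from an "all up" start, via the matrix exponential.
p0 = np.array([1.0, 0.0])
for t in (10.0, 100.0, 1000.0):
    print(f"t = {t:6.0f} h  [P(up), P(down)] = {p0 @ expm(Q * t)}")

# Steady-state distribution: solve pi Q = 0 together with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(2)])
pi, *_ = np.linalg.lstsq(A, np.array([0.0, 0.0, 1.0]), rcond=None)

availability = pi[0]                # long-run probability of being up
failure_freq = pi[0] * lam          # expected failures per hour
mean_down_time = 1.0 / mu           # mean duration of an outage, hours
print(availability, failure_freq, mean_down_time)
```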
Abstract:
On 25 April 1998 part of the tailings pond dike of the Aznalcóllar Zn mine north of the Guadalquivir marshes (Doñana) in southern Spain collapsed, releasing an estimated 5 million m³ of acidic metal-rich waste. This event contaminated farmland and wetland up to >40 km downstream, including the 900-ha 'Entremuros', an important area for birds within the Doñana world heritage site. In spite of the contamination, birds continued to feed in this area. Samples of two abundant macrophytes (Typha dominguensis and Scirpus maritimus) were taken from the Entremuros and nearby uncontaminated areas; these plants are important food items for several bird species. Analyses showed that in the Entremuros mean plant tissue concentrations of Cd were 3-40-fold (0.8-7.4 ppm) and Zn 20-100-fold (20-3384 ppm) greater than those from control areas. Comparable dietary concentrations of Zn have been reported to cause severe physiological damage to aquatic birds under experimental conditions. Elevated Cd concentrations are of concern as Cd bioconcentrates and is a cumulative poison. Metals released in this accident are moving into this food chain and present a considerable risk to species feeding on Typha sp. and Scirpus sp. Many other food webs exist in this area and require detailed examination to identify the species at risk, and to facilitate the management of these risks to minimise future impacts on the wildlife of Doñana. Copyright © 1999 Elsevier Science Ltd.
Abstract:
Quality of care is an important aspect of healthcare monitoring, used to ensure that the healthcare system is delivering care of the highest standard. With populations growing older, there is increased urgency in making sure that the healthcare delivered meets this standard, and healthcare providers are under increased pressure to ensure that this is the case, with the public and government expecting a healthcare system of the highest quality. Quality of care is difficult to model and measure because of the many ways of defining it. This paper introduces a potential model which could be used to take quality of care into account when modelling length of stay. The Coxian phase-type distribution is used to model length of stay, and the associated quality of care is incorporated into the Coxian using a hidden Markov model. Covariates are also introduced to determine their impact on the hidden level and to find out what can potentially affect quality of care. The model is applied to geriatric patient data from the Lombardy region of Italy. The results highlight that bed numbers and the type of hospital (public or private) can have an effect on the quality of care delivered.
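For readers unfamiliar with it, the Coxian phase-type distribution is the distribution of time to absorption in a Markov chain that moves through latent phases in sequence, with discharge (absorption) possible from any phase. A minimal sketch of its density and mean, with arbitrary rates and without the hidden quality-of-care layer described in the paper, might look like this.

```python
import numpy as np
from scipy.linalg import expm

# Coxian phase-type model: from phase i a patient either progresses to phase i+1
# at rate lam[i] or is discharged (absorbed) at rate mu[i].  Rates are made up.
lam = np.array([0.8, 0.4])         # progression rates (phase 1 -> 2, phase 2 -> 3)
mu = np.array([0.3, 0.2, 0.5])     # discharge rates from phases 1..3
n = len(mu)

S = np.zeros((n, n))               # sub-generator over the transient phases
for i in range(n):
    S[i, i] = -(mu[i] + (lam[i] if i < n - 1 else 0.0))
    if i < n - 1:
        S[i, i + 1] = lam[i]
s0 = -S @ np.ones(n)               # exit (discharge) rate vector
alpha = np.zeros(n); alpha[0] = 1  # every stay starts in phase 1

def los_density(t):
    """Density of a length of stay of t under the Coxian model."""
    return alpha @ expm(S * t) @ s0

mean_los = alpha @ np.linalg.inv(-S) @ np.ones(n)   # E[T] = alpha (-S)^-1 1
print(los_density(2.0), mean_los)
```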
Abstract:
The desire for more robust supply chains has led to a growth in theoretical and practical interest in the application of both preventive and impact-reductive disturbance management principles. This application should ultimately lead to less vulnerable and more competitive supply chains. Based on the extant literature, we identify fresh food supply chains' contextual factors (products, processes, supply chain networks and supply chain business environment) and their corresponding characteristics, as well as the main disturbance management principles used. To analyse their influence on the selection and application of disturbance management principles in fresh food supply chains, we conducted three case studies. In each case we collected data on the relevant contextual factors, the disturbance management principles applied and the company background. As an underlying methodology, we first conduct within-case analysis and then expand the analyses to a cross-case context. Based on the findings from these case studies, propositions are built concerning the nature of contextual factors and their characteristics, and their influence on the selection and application of disturbance management principles in fresh food supply chains. Our main findings relate to the identification of contextual characteristics of fresh food supply chains that are either critical vulnerability sources, critical enablers or conditionals, and as such require, facilitate or condition the selection and application of disturbance management principles.
Abstract:
Hidden Markov models (HMMs) are widely used probabilistic models of sequential data. As with other probabilistic models, they require the specification of local conditional probability distributions, whose assessment can be too difficult and error-prone, especially when data are scarce or costly to acquire. The imprecise HMM (iHMM) generalizes HMMs by allowing the quantification to be done by sets of, instead of single, probability distributions. iHMMs have the ability to suspend judgment when there is not enough statistical evidence, and can serve as a sensitivity analysis tool for standard non-stationary HMMs. In this paper, we consider iHMMs under the strong independence interpretation, for which we develop efficient inference algorithms to address standard HMM usage such as the computation of likelihoods and most probable explanations, as well as performing filtering and predictive inference. Experiments with real data show that iHMMs produce more reliable inferences without compromising the computational efficiency.
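For context, the precise forward recursion that the imprecise HMM generalizes is shown below; the iHMM replaces each row of the initial, transition and emission distributions with a set of distributions and optimizes this recursion over that set, which is what the paper's algorithms address. All numbers are purely illustrative.

```python
import numpy as np

# Standard (precise) HMM forward pass with rescaling.
pi = np.array([0.6, 0.4])            # initial state distribution
A = np.array([[0.7, 0.3],            # transition matrix P(next state | state)
              [0.2, 0.8]])
B = np.array([[0.9, 0.1],            # emission matrix P(observation | state)
              [0.3, 0.7]])

def forward_loglik(pi, A, B, obs):
    """Log-likelihood of an observation sequence (list of symbol indices)."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    loglik = np.log(c)
    alpha = alpha / c                # rescale to avoid numerical underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        loglik += np.log(c)
        alpha = alpha / c
    return loglik

print(forward_loglik(pi, A, B, [0, 1, 1, 0]))
```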
Abstract:
In this paper, a novel and effective lip-based biometric identification approach with the Discrete Hidden Markov Model Kernel (DHMMK) is developed. Lips are described by shape features (both geometrical and sequential) on two different grid layouts: rectangular and polar. These features are then specifically modeled by a DHMMK and learnt by a support vector machine classifier. Our experiments are carried out in a ten-fold cross-validation fashion on three different datasets: the GPDS-ULPGC Face Dataset, the PIE Face Dataset and the RaFD Face Dataset. Results show that our approach achieves average classification accuracies of 99.8%, 97.13% and 98.10% on these three datasets, respectively, using only two training images per class. Our comparative studies further show that the DHMMK achieves a 53% improvement over the baseline HMM approach. The comparative ROC curves also confirm the efficacy of the proposed lip-contour-based biometrics learned by the DHMMK. We also show that the performance of linear and RBF SVMs is comparable under the framework of the DHMMK.
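The DHMMK itself is not specified in the abstract. A generic, much simpler way of combining discrete HMM sequence models with an SVM is to use per-class HMM log-likelihoods as features, as sketched below; this is not the paper's kernel, assumes recent versions of hmmlearn (where the discrete model is called CategoricalHMM) and scikit-learn, and uses toy symbol sequences rather than lip shape features.

```python
import numpy as np
from hmmlearn.hmm import CategoricalHMM      # assumption: hmmlearn >= 0.3
from sklearn.svm import SVC

# Toy discrete "contour code" sequences for two classes (made-up data).
rng = np.random.default_rng(0)
def make_seqs(bias, n=20, length=30):
    return [rng.choice(4, size=length, p=bias) for _ in range(n)]
class_seqs = {0: make_seqs([0.4, 0.3, 0.2, 0.1]),
              1: make_seqs([0.1, 0.2, 0.3, 0.4])}

# Train one discrete HMM per class on that class's sequences.
hmms = {}
for c, seqs in class_seqs.items():
    X = np.concatenate(seqs).reshape(-1, 1)
    hmms[c] = CategoricalHMM(n_components=3, n_iter=50, random_state=0)
    hmms[c].fit(X, lengths=[len(s) for s in seqs])

# Feature map: per-class HMM log-likelihoods of a sequence, fed to a linear SVM.
def features(seq):
    return [hmms[c].score(np.asarray(seq).reshape(-1, 1)) for c in sorted(hmms)]

X_feat = [features(s) for c, seqs in class_seqs.items() for s in seqs]
y = [c for c, seqs in class_seqs.items() for _ in seqs]
clf = SVC(kernel="linear").fit(X_feat, y)
print("training accuracy:", clf.score(X_feat, y))
```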
Abstract:
Markov Decision Processes (MDPs) are extensively used to encode sequences of decisions with probabilistic effects. Markov Decision Processes with Imprecise Probabilities (MDPIPs) encode sequences of decisions whose effects are modeled using sets of probability distributions. In this paper we examine the computation of Γ-maximin policies for MDPIPs using multilinear and integer programming. We discuss the application of our algorithms to “factored” models and to a recent proposal, Markov Decision Processes with Set-valued Transitions (MDPSTs), which unifies the fields of probabilistic and “nondeterministic” planning in artificial intelligence research.
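A simpler dynamic-programming relative of the Γ-maximin computation, applicable when the transition credal sets are given by probability intervals, is robust value iteration: each backup takes the worst case over the interval set, which can be solved greedily. The sketch below uses made-up numbers and is not the multilinear/integer programming formulation studied in the paper.

```python
import numpy as np

# Robust (maximin) value iteration for a toy MDP whose transition probabilities
# are only known to lie in intervals [P_lo, P_hi].  All numbers are made up.
n_states, n_actions, gamma = 2, 2, 0.9
R = np.array([[1.0, 0.0],                      # immediate reward R[s, a]
              [0.0, 2.0]])
P_lo = np.array([[[0.6, 0.1], [0.2, 0.5]],     # lower bounds P_lo[s, a, s']
                 [[0.3, 0.4], [0.1, 0.6]]])
P_hi = np.array([[[0.9, 0.4], [0.5, 0.8]],     # upper bounds P_hi[s, a, s']
                 [[0.6, 0.7], [0.4, 0.9]]])

def worst_case_ev(lo, hi, V):
    """min_p p.V  s.t.  lo <= p <= hi, sum(p) = 1, solved greedily."""
    p = lo.copy()
    slack = 1.0 - p.sum()
    for s in np.argsort(V):                    # push spare mass onto low-value states
        add = min(hi[s] - p[s], slack)
        p[s] += add
        slack -= add
    return p @ V

V = np.zeros(n_states)
for _ in range(200):                           # value iteration to convergence
    V = np.array([max(R[s, a] + gamma * worst_case_ev(P_lo[s, a], P_hi[s, a], V)
                      for a in range(n_actions))
                  for s in range(n_states)])
policy = [int(np.argmax([R[s, a] + gamma * worst_case_ev(P_lo[s, a], P_hi[s, a], V)
                         for a in range(n_actions)]))
          for s in range(n_states)]
print(V, policy)
```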
Abstract:
During the late twentieth century the supply chains for gold were considered by the majority of consumers (when they were considered at all) to be driven by simple commercial imperatives. That notion was shattered during the first decade of the twenty-first century by the appearance of ethical campaigns, led by advocates determined to present major players in the gold industry as morally reprehensible. The ‘No Dirty Gold’ campaign sought to shift the purchasing of gold onto a moral register, in order to challenge the activities of large mining corporations. It was followed by the Fairtrade Foundation’s ‘Fairtrade Gold’ initiative, which had aspirations to support subsistence mining communities at the expense of big business. By directly targeting a luxury material and playing on its inherent social ambiguities, campaigners hoped to thoroughly moralise the purchasing of gold objects. Dr Oakley’s presentation will examine the forces behind this developing social phenomenon, describe the trajectories of a selection of major campaigns, and consider the extent to which these have impacted on public attitudes, gold miners and the actions of consumers, producers and retailers of luxury goods.
Abstract:
In the hotel industry, undistributed operating expenses represent a significant portion of a hotel's operating costs, yet exactly how most of these expenses arise is not well understood. Using data from more than 40 hotels operated by a major chain, the authors examine the links between the variety of a hotel's products and customers and its undistributed operating expenses and revenues. Their findings show that undistributed operating expenses are related to the extent of the property's business mix and product-service mix. The results suggest that although increasing a property's product-service mix results in higher undistributed operating expenses, the incremental costs are compensated for by higher revenues. Increasing the business mix, however, increases undistributed operating expenses without producing higher revenues.
Abstract:
The paper addresses the transport activities and associated energy consumption involved in the production and supply of two products: jeans and yoghurt. In the case of jeans, the analysis is from the locations in which cotton is grown, to retail outlets in the UK; in the case of yoghurt, the analysis is from the supply of milk on farms, to retail outlets in France. The results show that the transport stages from the point of jeans manufacture to UK port are responsible for the greatest proportion of transport energy use per kilogram of jeans in the UK supply chain. In the case of the French yoghurt supply chains, the results indicate that each of the three transport stages from farm to third-party distribution centre consume approximately the same proportion of total freight transport energy. The energy used on the transport stage for yoghurt from third-party distribution centre to retail outlet varies depending on the type of retail outlet served. Far greater quantities of energy are used in transporting jeans than yoghurts from farm/field to retail outlet. This is explained by the distances involved in the respective supply chains. Both case studies demonstrate that the energy used by consumers transporting goods to their homes by car can be as great as total freight transport energy used in the supply chain from farm/field to retail outlet (per kilogram of product transported).
Abstract:
An increasing number of producers, retailers and third-party logistics providers are interested in carrying out energy assessments of their product supply chain. This is due to sensitivity about climate change and carbon emissions, but also to high energy prices. This paper presents an analytical approach developed to measure energy use in logistics activities in product supply chains. The approach (based on the Life Cycle Approach) quantifies energy use in transport and logistics activities at all stages of a product supply chain. The work has demonstrated that such an assessment approach based on the supply chain is useful in comparing the energy use implications of different strategies. This supply chain approach can be used to consider options such as sourcing and distribution centre locations, transport modes, road freight vehicle types and weights, vehicle load factors, empty running, transport distance and the balance between consumer shopping trips and delivery to the home.
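A minimal sketch of the kind of stage-by-stage accounting such an approach performs is given below: each leg contributes distance times energy intensity, adjusted for load factor and empty running, and the per-kilogram total can be compared with a consumer shopping trip by car. All intensities, distances and load factors are placeholders, not figures from the study.

```python
# Stage-by-stage transport energy accounting for one product supply chain.
# All numbers below are illustrative placeholders.
STAGES = [
    # (name, distance_km, MJ per tonne-km at full load, load factor, empty-running share)
    ("factory -> port",        400, 0.9, 0.80, 0.20),
    ("port -> distribution",  8000, 0.2, 0.70, 0.00),   # e.g. a deep-sea leg
    ("distribution -> store",  150, 1.8, 0.60, 0.25),
]

def stage_energy_per_kg(distance_km, mj_per_tkm, load_factor, empty_share):
    """Transport energy (MJ) allocated to 1 kg of product for one stage."""
    effective_intensity = mj_per_tkm / load_factor * (1.0 + empty_share)
    return distance_km * effective_intensity / 1000.0    # tonne -> kg

total = 0.0
for name, dist, intensity, load, empty in STAGES:
    mj = stage_energy_per_kg(dist, intensity, load, empty)
    total += mj
    print(f"{name:>24}: {mj:6.3f} MJ/kg")

# Compare with a consumer shopping trip by car carrying a small basket of goods.
car_mj_per_km, trip_km, basket_kg = 2.5, 5.0, 10.0
print("car trip per kg:    ", car_mj_per_km * 2 * trip_km / basket_kg, "MJ/kg")
print("supply chain per kg:", total, "MJ/kg")
```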