958 results for "Probabilistic generalization"
Abstract:
The aim of this paper is to present an economic design of an X chart for short-run production. The process mean starts at μ0 (in control, State I) and at a random time shifts to μ1 > μ0 (out of control, State II). The monitoring procedure consists of inspecting a single item out of every m produced. If the measurement of the quality characteristic does not fall within the control limits, the process is stopped and adjusted, and an additional (r - 1) items are inspected retrospectively. The probabilistic model was developed considering shifts in the process mean only. A direct search technique is applied to find the optimum parameters that minimize the expected cost function. Numerical examples illustrate the proposed procedure. (C) 2009 Elsevier B.V. All rights reserved.
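As a minimal sketch of the sampling rule described above (not the authors' economic model; the parameter values, the k-sigma limit and the fixed shift point are illustrative assumptions only), the inspection scheme can be simulated in Python:

import numpy as np

def simulate_inspections(mu0=0.0, mu1=1.0, sigma=1.0, m=10, k=3.0,
                         shift_item=200, run_length=600, seed=0):
    # Inspect one item out of every m produced; signal when a measurement
    # falls outside mu0 +/- k*sigma. Returns the item index at which the
    # process is stopped for adjustment, or None if no signal occurs.
    rng = np.random.default_rng(seed)
    for i in range(m, run_length + 1, m):
        mean = mu0 if i < shift_item else mu1   # process mean shifts at shift_item
        x = rng.normal(mean, sigma)
        if abs(x - mu0) > k * sigma:
            return i
    return None

print(simulate_inspections())

The retrospective inspection of the additional (r - 1) items and the cost terms minimized by the direct search are omitted from this toy.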
Abstract:
A four-parameter generalization of the Weibull distribution capable of modeling a bathtub-shaped hazard rate function is defined and studied. The beauty and importance of this distribution lie in its ability to model monotone as well as non-monotone failure rates, which are quite common in lifetime and reliability problems. The new distribution has a number of well-known lifetime distributions as special sub-models, such as the Weibull, extreme value, exponentiated Weibull, generalized Rayleigh and modified Weibull distributions, among others. We derive two infinite sum representations for its moments. The density of the order statistics is obtained. The method of maximum likelihood is used to estimate the model parameters, and the observed information matrix is obtained. Two applications are presented to illustrate the proposed distribution. (C) 2008 Elsevier B.V. All rights reserved.
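The four-parameter density itself is not reproduced in the abstract; as a hedged illustration of the maximum-likelihood step for the simplest sub-model mentioned above (the ordinary two-parameter Weibull), SciPy can be used as follows, with purely synthetic data:

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
lifetimes = 100.0 * rng.weibull(1.5, size=200)   # synthetic lifetimes for illustration

# Fit the two-parameter Weibull sub-model (location fixed at zero); the full
# four-parameter family of the paper would require a custom likelihood.
shape, loc, scale = stats.weibull_min.fit(lifetimes, floc=0)
print(f"shape = {shape:.3f}, scale = {scale:.3f}")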
Abstract:
We study in detail the so-called beta-modified Weibull distribution, motivated by the wide use of the Weibull distribution in practice and by the fact that this generalization provides a continuous crossover towards cases with different shapes. The new distribution is important because it contains as special sub-models several widely known distributions, such as the generalized modified Weibull, beta Weibull, exponentiated Weibull, beta exponential, modified Weibull and Weibull distributions, among others. It also provides more flexibility for analysing complex real data. Various mathematical properties of this distribution are derived, including its moments and moment generating function. We examine the asymptotic distributions of the extreme values. Explicit expressions are also derived for the chf, mean deviations, Bonferroni and Lorenz curves, reliability and entropies. The parameters are estimated by two methods, moments and maximum likelihood, and we compare the performance of the resulting estimates by simulation. We obtain the expected information matrix. Two applications are presented to illustrate the proposed distribution.
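For orientation, beta-generalized families of this kind are usually built from a baseline cdf G(x) via the incomplete beta function; assuming the beta-modified Weibull follows this standard beta-G construction (with G the modified Weibull cdf), the cdf and density take the form

F(x) = I_{G(x)}(a, b) = \frac{1}{B(a,b)} \int_0^{G(x)} w^{a-1} (1-w)^{b-1}\, dw,
\qquad
f(x) = \frac{g(x)}{B(a,b)}\, G(x)^{a-1} \left[ 1 - G(x) \right]^{b-1},

where g = G' and a, b > 0 are the extra shape parameters.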
Abstract:
Expokit provides a set of routines aimed at computing matrix exponentials. More precisely, it computes either a small matrix exponential in full, the action of a large sparse matrix exponential on an operand vector, or the solution of a system of linear ODEs with constant inhomogeneity. The backbone of the sparse routines consists of matrix-free Krylov subspace projection methods (Arnoldi and Lanczos processes), which is why the toolkit can cope with sparse matrices of large dimension. The software handles real and complex matrices and provides specific routines for symmetric and Hermitian matrices. The computation of matrix exponentials is a numerical issue of critical importance in the area of Markov chains, where, furthermore, the computed solution is subject to probabilistic constraints. In addition to addressing general matrix exponentials, particular attention is given to the computation of transient states of Markov chains.
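Expokit itself is a Fortran toolkit; as a rough analogue (not Expokit's own interface), SciPy exposes the same kind of Krylov-style computation of the action of a matrix exponential, which can be used for the transient distribution of a small Markov chain:

import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import expm_multiply

# Generator matrix Q of a small continuous-time Markov chain (rows sum to zero).
Q = csr_matrix(np.array([[-0.5,  0.5,  0.0],
                         [ 0.2, -0.7,  0.5],
                         [ 0.0,  0.4, -0.4]]))

p0 = np.array([1.0, 0.0, 0.0])        # initial distribution
t = 2.0
pt = expm_multiply(t * Q.T, p0)       # p(t) = exp(Q^T t) p0, the transient state
print(pt, pt.sum())                   # probabilistic constraint: entries sum to 1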
Abstract:
A generalization of the classical problem of optimal lattice covering of R^n is considered. Solutions to this generalized problem are found in two specific classes of lattices. The globally optimal solution of the generalization is found for R^2. (C) 1998 Elsevier Science Inc. All rights reserved.
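For context on the classical (non-generalized) problem: a lattice L in R^n covers space when balls of the covering radius placed at the lattice points cover all of R^n, and the quantity minimized is the covering density

\Theta(L) = \frac{\operatorname{vol} B_n(\mu(L))}{\det L},
\qquad
\mu(L) = \max_{x \in \mathbb{R}^n} \min_{v \in L} \lVert x - v \rVert,

which for n = 2 is minimized by the hexagonal lattice, with \Theta = 2\pi/\sqrt{27} \approx 1.209.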
Abstract:
In this paper, the minimum-order stable recursive filter design problem is proposed and investigated. This problem plays an important role in pipelined implementations in signal processing. The existence of a high-order stable recursive filter is first proved theoretically, with an upper bound given for the highest order of stable filters. The minimum-order stable linear predictor is then obtained by solving an optimization problem. The popular genetic algorithm approach is adopted, since it is a heuristic probabilistic optimization technique that has been widely used in engineering design. Finally, an illustrative example is used to show the effectiveness of the proposed algorithm.
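A hedged sketch of the stability test that any such search needs (function name and coefficients are illustrative): a recursive (IIR) filter with denominator 1 + a1 z^-1 + ... + aN z^-N is stable when all of its poles lie strictly inside the unit circle, and a genetic algorithm would evaluate a check of this kind inside its fitness function:

import numpy as np

def is_stable(a):
    # a = [1, a1, ..., aN]; the poles are the roots of z^N + a1 z^(N-1) + ... + aN.
    poles = np.roots(a)
    return bool(np.all(np.abs(poles) < 1.0))

print(is_stable([1.0, -1.5, 0.7]))   # True: both poles inside the unit circle
print(is_stable([1.0, -2.0, 1.1]))   # False: poles outside the unit circle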
Abstract:
We consider the electronic properties of layered molecular crystals of the type θ-D2A, where A is an anion and D is a donor molecule such as bis(ethylenedithio)tetrathiafulvalene (BEDT-TTF), which is arranged in the θ-type pattern within the layers. We argue that the simplest strongly correlated electron model that can describe the rich phase diagram of these materials is the extended Hubbard model on the square lattice at one-quarter filling. In the limit where the Coulomb repulsion on a single site is large, the nearest-neighbor Coulomb repulsion V plays a crucial role. When V is much larger than the intermolecular hopping integral t, the ground state is an insulator with charge ordering. In this phase, antiferromagnetism arises due to a novel fourth-order superexchange process around a plaquette on the square lattice. We argue that the charge-ordered phase is destroyed below a critical nonzero value Vc of the order of t. Slave-boson theory is used to demonstrate this explicitly for the SU(N) generalization of the model in the large-N limit. We also discuss the relevance of the model to the all-organic family β-(BEDT-TTF)2SF5YSO3, where Y = CH2CF2, CH2, CHF.
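In the standard notation (assumed here, since the abstract does not write it out), the extended Hubbard model referred to is

H = -t \sum_{\langle ij \rangle, \sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)
  + U \sum_{i} n_{i\uparrow} n_{i\downarrow}
  + V \sum_{\langle ij \rangle} n_i n_j,

studied at one-quarter filling; the charge-ordered insulator discussed above corresponds to the regime of large on-site U with V much larger than the hopping t.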
Abstract:
This study examines the relationship between management accounting and planning profiles in Brazilian companies. The main goal is to understand the consequences of not including a fully structured management accounting scheme in the planning process. The authors conducted field research among medium- and large-sized companies, using a probabilistic sample drawn from a population of 2281 companies. Using the analytic hierarchy process (AHP) and statistical cluster analysis, the authors grouped the entities' strategic and budget planning processes into five profiles and then applied statistical tests to assess the five clusters. The study concludes that poorly or fully implemented strategic and budget-planning processes are related to the management accounting profiles of the Brazilian organizations studied. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
Brazil has consolidated its position as the world's largest producer of sugarcane, sugar and ethanol. The creation of the Programa Nacional do Alcool (PROALCOOL) and the growing use of flexible-fuel cars are among the factors that have further stimulated production. Advances in agricultural and industrial research have made Brazil highly competitive in sugar and ethanol worldwide, as is evident when comparing the quantities produced in the country with the production costs, which have become a major differential. Cost management is therefore of great relevance to sugar and ethanol companies, since it allows a significant rationalization of production processes, with savings of resources and better earnings, besides reducing the operational risk associated with fixed production costs. The present work thus aims to analyze the cost structure of sugar and ethanol companies in the Center-South region of the country through an empirical-analytical study based on methodologies and concepts drawn from cost accounting. It is found that a large share of the costs and operational expenses behave as variable costs, a positive factor for the sector because it reduces the operational risk of the activity. The main limitation of this study is the sample, covering five years and 10% of the mills in Brazil; although these represent 30% of national production, they do not allow generalization of the model.
Abstract:
Objective: Existing evidence suggests that family interventions can be effective in reducing relapse rates in schizophrenia and related conditions. Despite this, such interventions are not routinely delivered in Australian mental health services. The objective of the current study is to investigate the incremental cost-effectiveness ratios (ICERs) of introducing three types of family interventions, namely behavioural family management (BFM), behavioural intervention for families (BIF) and multiple family groups (MFG), into current mental health services in Australia. Method: The ICER of each of the family interventions is assessed from a health sector perspective, including the government, persons with schizophrenia and their families/carers, using a standardized methodology. A two-stage approach is taken to the assessment of benefit. The first stage involves a quantitative analysis based on disability-adjusted life years (DALYs) averted. The second stage involves application of 'second filter' criteria (including equity, strength of evidence, feasibility and acceptability to stakeholders) to the results. The robustness of the results is tested using multivariate probabilistic sensitivity analysis. Results: The most cost-effective intervention is BIF (A$8000 per DALY averted), followed by MFG (A$21 000 per DALY averted) and lastly BFM (A$28 000 per DALY averted). The inclusion of time costs makes BFM more cost-effective than MFG. Variation of the discount rate has no effect on the conclusions. Conclusions: All three interventions are considered 'value for money' within an Australian context. This conclusion needs to be tempered against the methodological challenge of converting clinical outcomes into a generic economic outcome measure (the DALY). Issues surrounding the feasibility of routinely implementing such interventions need to be addressed.
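For reference, the ICER reported above follows the standard definition (a reminder, not specific to this study): with costs C and benefits measured in DALYs averted,

\mathrm{ICER} = \frac{C_{\text{intervention}} - C_{\text{current practice}}}{\text{DALYs averted}},

so that, purely as an arithmetic illustration, an intervention costing A$800 000 more than current practice and averting 100 DALYs would yield A$8000 per DALY averted.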
Abstract:
We discuss the expectation propagation (EP) algorithm for approximate Bayesian inference using a factorizing posterior approximation. For neural network models, we use a central limit theorem argument to make EP tractable when the number of parameters is large. For two types of models, we show that EP can achieve optimal generalization performance when data are drawn from a simple distribution.
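Schematically (standard EP notation, not the paper's), a factorizing posterior approximation q(\theta) \propto \prod_i \tilde f_i(\theta) is refined one site at a time by forming the cavity distribution q^{\setminus i}(\theta) \propto q(\theta)/\tilde f_i(\theta) and moment-matching the tilted distribution,

q^{\text{new}}(\theta) = \operatorname*{arg\,min}_{q' \in \mathcal{Q}} \mathrm{KL}\!\left( \tfrac{1}{Z_i}\, q^{\setminus i}(\theta)\, f_i(\theta) \,\Big\|\, q'(\theta) \right),

after which \tilde f_i \propto q^{\text{new}}/q^{\setminus i} is updated; the central limit theorem argument mentioned in the abstract is what keeps the tilted moments tractable when the number of network parameters is large.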
Abstract:
A comprehensive probabilistic model for simulating dendrite morphology and investigating dendritic growth kinetics during solidification has been developed, based on a modified Cellular Automaton (mCA) for microscopic modeling of nucleation, crystal growth and solute diffusion. The mCA model numerically calculates solute redistribution in both the solid and liquid phases, the curvature of dendrite tips and the growth anisotropy. The model takes thermal, curvature and solute diffusion effects into account and can therefore simulate microstructure formation on the scale of the dendrite tip length. The model was then applied to simulate dendritic solidification of an Al-7%Si alloy. Both directional and equiaxed dendritic growth were simulated to investigate the effects of growth anisotropy and cooling rate on dendrite morphology. The competitive growth and selection of dendritic crystals have also been investigated.
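A heavily simplified toy (not the authors' mCA: solute diffusion, curvature and anisotropy are all omitted, and the capture probability is an arbitrary assumption) illustrates the basic probabilistic capture rule used by cellular-automaton solidification models:

import numpy as np

def grow(steps=50, size=101, p_capture=0.3, seed=0):
    # 1 = solid, 0 = liquid; a single nucleus grows by capturing liquid
    # von Neumann neighbours with probability p_capture per step.
    rng = np.random.default_rng(seed)
    grid = np.zeros((size, size), dtype=int)
    grid[size // 2, size // 2] = 1
    for _ in range(steps):
        has_solid_neighbour = np.zeros_like(grid, dtype=bool)
        for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            has_solid_neighbour |= np.roll(grid, shift, axis=(0, 1)) == 1
        capture = (grid == 0) & has_solid_neighbour & (rng.random(grid.shape) < p_capture)
        grid[capture] = 1
    return grid

print(grow().sum(), "solid cells")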
Abstract:
This study of breast cancer survival is based on analysis of five-year relative survival of 38 362 cases of invasive breast cancer in New South Wales (NSW) women, incident between 1972 and 1991, with follow-up to 1992, using data from the population-based NSW Central Cancer Registry. Survival was ascertained by matching the registry file of breast cancers against NSW death certificates from 1972 to 1992, mainly by automated probabilistic linkage. Absolute survival of cases was compared with expected survival of age- and period-matched NSW women. Proportional hazard regression analysis was used for examination of the effects on excess mortality of age, period of diagnosis and degree of spread at diagnosis. Relative survival at five years increased from 70 per cent in 1972-1976 to 77 per cent in 1987-1991. Survival improved during the 1970s and in the late 1980s. Regression analysis suggested that part of the improved survival in the late 1980s was due to lesser degree of spread at diagnosis, whereas the improved survival during the 1970s may have been due to treatment. Survival was better for those aged 40-49 years (RR = 0.86) and worse for those aged greater than or equal to 70 years (RR = 1.22) compared with the referent group (60-69 years). Excess mortality was much less for those with invasive localised disease than those with regional spread (RR = 3.1) or metastatic cancer (RR = 15.5) at diagnosis. For the most recent period (1987-1991), relative five-year survival was 90, 70 and 18 per cent, respectively, for the three degree-of-spread categories.
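Relative survival as used here is the standard ratio (a reminder, not study-specific):

RS(t) = \frac{S_{\text{observed}}(t)}{S_{\text{expected}}(t)},

the absolute survival of the cases divided by the expected survival of age- and period-matched women from the general population, so a five-year relative survival of 77 per cent means the cases experienced 77 per cent of the survival that would be expected in the absence of the cancer.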
Abstract:
In spite of considerable technical advances in MRI techniques, the resolution of these methods is still limited. Consequently, the delineation of cytoarchitectonic fields based on probabilistic maps, as well as brain volume changes and small-scale changes seen in MRI scans, needs to be verified by neuroanatomical/neuropathological diagnostic tools. To meet the current interdisciplinary needs of the scientific community, brain banks have to broaden their scope in order to provide high-quality tissue suitable for neuroimaging-neuropathology/anatomy correlation studies. The Brain Bank of the Brazilian Aging Brain Research Group (BBBABSG) of the University of Sao Paulo Medical School (USPMS) collaborates with researchers interested in neuroimaging-neuropathological correlation studies by providing brains submitted to postmortem in situ MRI. In this paper we describe and discuss the parameters established by the BBBABSG to select and handle brains for fine-scale neuroimaging-neuropathological correlation studies, and to exclude inappropriate or unsuitable autopsy brains. We assessed the impact of postmortem time and storage of the corpse on the quality of the MRI scans and sought to establish the fixation protocols most appropriate to these correlation studies. After investigation of a total of 36 brains, postmortem interval and low body temperature proved to be the main factors determining the quality of routine MRI protocols. Perfusion fixation of the brains after autopsy with 20% mannitol followed by 20% formalin was the best method for preserving the original brain shape and volume and for allowing further routine and immunohistochemical staining. Taken together, these parameters represent a methodological advance in the screening and processing of human postmortem tissue, helping to guarantee high-quality material for unbiased correlation studies and to avoid unnecessary expenditure on post-imaging analyses and histological processing of brain tissue.
Abstract:
In this paper, we present a fuzzy approach to the Reed-Frost model for epidemic spreading that takes into account uncertainties in the diagnosis of the infection. The heterogeneity of the infected group is based on the clinical signs of the individuals (symptoms, laboratory exams, medical findings, etc.), which are incorporated into the dynamics of the epidemic. The infectivity level is time-varying and the classification of the individuals is performed through fuzzy relations. Simulations of a real problem, using data from a viral epidemic in a children's daycare center, are performed and the results are compared with a stochastic Reed-Frost generalization.
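For reference, the classical (crisp) Reed-Frost chain-binomial model that the fuzzy approach generalizes is

E\!\left[ C_{t+1} \mid C_t, S_t \right] = S_t \left[ 1 - (1 - p)^{C_t} \right],
\qquad S_{t+1} = S_t - C_{t+1},

where C_t and S_t are the numbers of cases and susceptibles at generation t and p is the probability of an effective contact between a given pair of individuals; the fuzzy version replaces the crisp classification of individuals as infected or not with membership degrees derived from their clinical signs.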