956 results for Probabilistic robotics
Abstract:
This work deals with the problem of minimizing the waste of space that occurs in the rotational placement of a set of irregular two-dimensional items inside a two-dimensional container. This problem is approached with a heuristic based on Simulated Annealing (SA) with an adaptive neighborhood. The objective function is evaluated in a constructive approach, in which the items are placed sequentially. The placement is governed by three types of parameters: the sequence of placement, the rotation angle, and the translation. The rotation and translation applied to a polygon are cyclic continuous parameters, while the sequence of placement defines a combinatorial problem; thus, both cyclic continuous and discrete parameters must be controlled. The approaches described in the literature deal with only one type of parameter (sequence of placement or translation). In the proposed SA algorithm, the sensitivity of each continuous parameter is evaluated at each iteration, increasing the number of accepted solutions. The sensitivity of each parameter is associated with its probability distribution in the definition of the next candidate.
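The adaptive-neighborhood idea can be sketched as follows. This is a minimal illustration, not the authors' implementation: it handles only a vector of cyclic continuous parameters (like the rotation angle), omits the placement sequence and the constructive evaluation, and uses an invented linear cooling schedule and adaptation rule.

```python
import math
import random

def adaptive_sa(objective, x0, steps=2000, t0=1.0, seed=0):
    """Simulated annealing with an adaptive neighborhood: the step size of
    each continuous parameter grows or shrinks with its acceptance rate,
    which tends to increase the number of accepted candidate solutions."""
    rng = random.Random(seed)
    x = list(x0)
    fx = objective(x)
    step = [1.0] * len(x)            # per-parameter neighborhood size
    accepted = [0] * len(x)
    tried = [0] * len(x)
    best, fbest = list(x), fx
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-9        # linear cooling schedule
        i = rng.randrange(len(x))                # perturb one parameter at a time
        cand = list(x)
        # cyclic continuous parameter: wrap into [0, 2*pi)
        cand[i] = (cand[i] + rng.uniform(-step[i], step[i])) % (2 * math.pi)
        fc = objective(cand)
        tried[i] += 1
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            accepted[i] += 1
            if fx < fbest:
                best, fbest = list(x), fx
        if tried[i] == 20:                       # adapt every 20 trials of parameter i
            rate = accepted[i] / tried[i]
            step[i] *= 1.5 if rate > 0.5 else 0.5
            accepted[i] = tried[i] = 0
    return best, fbest

# Usage (illustrative): minimize a cyclic objective whose minimum 0 lies at theta = pi.
best, fbest = adaptive_sa(lambda v: (math.cos(v[0]) + 1.0) ** 2, [0.0])
```

Tying the step size to the recent acceptance rate is what keeps the acceptance ratio from collapsing as the temperature falls.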
Diagnostic errors and repetitive sequential classifications in on-line process control by attributes
Abstract:
The procedure of on-line process control by attributes, known as Taguchi's on-line process control, consists of inspecting the mth item (a single item) for every m produced items and deciding, at each inspection, whether the fraction of conforming items has been reduced or not. If the inspected item is nonconforming, the production is stopped for adjustment. As the inspection system can be subject to diagnosis errors, a probabilistic model is developed that classifies the examined item repeatedly until either a conforming or b non-conforming classifications are observed; whichever event occurs first determines the final classification of the examined item. Properties of an ergodic Markov chain were used to obtain an expression for the average cost of the control system, which can be optimized by three parameters: the sampling interval of the inspections (m); the number of repeated conforming classifications (a); and the number of repeated non-conforming classifications (b). The optimum design is compared with two alternative approaches: the first is a simple preventive policy, in which the production system is adjusted after every n produced items (no inspection is performed); the second classifies the examined item repeatedly r (fixed) times and considers it conforming if most classification results are conforming. Results indicate that the current proposal performs better than the procedure that fixes the number of repeated classifications and classifies the examined item as conforming if most classifications were conforming. On the other hand, depending on the magnitudes of the errors and costs, the preventive policy can be, on average, more economical than the alternatives that require inspection. A numerical example illustrates the proposed procedure. (C) 2009 Elsevier B.V. All rights reserved.
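The repeated-classification stopping rule ("a conforming or b non-conforming results, whichever comes first") can be written as a short recursion. This is an illustrative computation of one item's final-classification probability, not the paper's Markov-chain cost model:

```python
from functools import lru_cache

def prob_final_conforming(p_conf, a, b):
    """Probability that an item's final classification is 'conforming':
    classifications are repeated until either a 'conforming' results or
    b 'non-conforming' results have accumulated, whichever comes first.
    p_conf is the probability a single classification says 'conforming'."""
    @lru_cache(maxsize=None)
    def p(c, n):                 # c conforming and n non-conforming seen so far
        if c == a:
            return 1.0           # reached a conforming classifications first
        if n == b:
            return 0.0           # reached b non-conforming classifications first
        return p_conf * p(c + 1, n) + (1 - p_conf) * p(c, n + 1)
    return p(0, 0)
```

Increasing a makes the rule stricter (fewer items finally declared conforming); increasing b makes it more lenient, which is how the two parameters trade off the two kinds of diagnosis error.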
Abstract:
Susceptible-infective-removed (SIR) models are commonly used for representing the spread of contagious diseases. An SIR model can be described in terms of a probabilistic cellular automaton (PCA), in which each individual (corresponding to a cell of the PCA lattice) is connected to others by a random network favoring local contacts. Here, this framework is employed for investigating the consequences of vaccination against the propagation of a contagious infection, by considering vaccination as a game, in the sense of game theory. In this game, the players are the government and the susceptible newborns. In order to maximize their own payoffs, the government attempts to reduce the costs of combating the epidemic, and the newborns may be vaccinated only when infective individuals are found in their neighborhoods and/or the government promotes an immunization program. As a consequence of these strategies, supported by cost-benefit analysis and perceived risk, numerical simulations show that the disease is not fully eliminated and the government implements quasi-periodic vaccination campaigns. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
There are several ways of controlling the propagation of a contagious disease. For instance, to reduce the spreading of an airborne infection, individuals can be encouraged to remain in their homes and/or to wear face masks outside their domiciles. However, when a limited number of masks is available, who should use them: the susceptible subjects, the infective persons, or both populations? Here we employ susceptible-infective-recovered (SIR) models, described in terms of ordinary differential equations and probabilistic cellular automata, in order to investigate how the deletion of links in the random complex network representing the social contacts among individuals affects the dynamics of a contagious disease. The inspiration for this study comes from recent discussions about the impact of measures usually recommended by public health organizations for preventing the propagation of the swine influenza A (H1N1) virus. Our answer to this question can be valid for other eco-epidemiological systems. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
We study the spreading of contagious diseases in a population of constant size using susceptible-infective-recovered (SIR) models described in terms of ordinary differential equations (ODEs) and probabilistic cellular automata (PCA). In the PCA model, each individual (represented by a cell in the lattice) is mainly locally connected to others. We investigate how the topological properties of the random network representing contacts among individuals influence the transient behavior and the permanent regime of the epidemiological system described by the ODE and PCA models. Our main conclusions are: (1) the basic reproduction number (commonly called R0) related to a disease propagation in a population cannot be uniquely determined from features of the transient behavior of the infective group; (2) R0 cannot be associated with a unique combination of clustering coefficient and average shortest path length characterizing the contact network. We discuss how these results can complicate the specification of control strategies for combating disease propagation. (C) 2009 Elsevier B.V. All rights reserved.
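A minimal probabilistic cellular automaton for SIR dynamics looks like the sketch below. It uses only nearest-neighbour (von Neumann) contacts on a periodic lattice; the random long-range links that these studies add to the contact network are omitted, and the infection/recovery probabilities are free parameters:

```python
import random

def pca_sir_step(grid, p_inf, p_rec, rng):
    """One synchronous update of a probabilistic cellular automaton SIR model:
    each susceptible cell ('S') may be infected by each infective ('I')
    neighbour with probability p_inf; each infective recovers with p_rec."""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j] == 'S':
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = (i + di) % n, (j + dj) % n   # periodic lattice
                    if grid[ni][nj] == 'I' and rng.random() < p_inf:
                        new[i][j] = 'I'
                        break
            elif grid[i][j] == 'I' and rng.random() < p_rec:
                new[i][j] = 'R'
    return new

# Usage: a single infective seed; with certain infection and recovery,
# one step infects the four neighbours and removes the seed.
rng = random.Random(42)
grid = [['S'] * 5 for _ in range(5)]
grid[2][2] = 'I'
grid = pca_sir_step(grid, p_inf=1.0, p_rec=1.0, rng=rng)
counts = {s: sum(row.count(s) for row in grid) for s in 'SIR'}
```

Because the update reads the old lattice and writes a new one, all cells change state simultaneously, which is the synchronous-update convention usual in PCA epidemic models.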
Abstract:
The aim of this paper is to present an economical design of an X chart for short-run production. The process mean starts equal to mu0 (in-control, State I) and at a random time shifts to mu1 > mu0 (out-of-control, State II). The monitoring procedure consists of inspecting a single item for every m items produced. If the measurement of the quality characteristic does not meet the control limits, the process is stopped, adjusted, and an additional (r - 1) items are inspected retrospectively. The probabilistic model was developed considering only shifts in the process mean. A direct search technique is applied to find the optimum parameters that minimize the expected cost function. Numerical examples illustrate the proposed procedure. (C) 2009 Elsevier B.V. All rights reserved.
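The direct-search step can be illustrated over the sampling interval m. The cost function below is deliberately made up (an inspection cost spread over m items plus a penalty growing with the interval); the paper's actual expected-cost expression, which involves the shift mechanism and the retrospective inspections, is not reproduced here:

```python
def optimal_sampling_interval(cost, m_range):
    """Direct search over the sampling interval m for the value minimizing
    an expected-cost function, as in economic designs of control charts."""
    return min(m_range, key=cost)

# Purely illustrative cost coefficients (invented for this sketch):
c_inspect = 5.0    # cost of one inspection
c_shift = 100.0    # cost per item produced while out of control
p_shift = 0.01     # chance the process shifts between inspections
cost = lambda m: c_inspect / m + c_shift * p_shift * (m + 1) / 2
m_star = optimal_sampling_interval(cost, range(1, 101))
```

Sampling rarely (large m) saves inspection cost but lets an out-of-control process run longer, so the expected cost is U-shaped in m and an exhaustive or direct search over a modest range suffices.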
Abstract:
In most of the practical six-actuator in-parallel manipulators, the octahedral form is either taken as it stands or is approximated. Yet considerable theoretical attention is paid in the literature to more general forms. Here we touch on the general form, and describe some aspects of its behavior that militate strongly against its adoption as a pattern for a realistic manipulator. We reach the conclusion that the structure of in-parallel manipulators must be triangulated as fully as possible, so leading to the octahedral form. In describing some of the geometrical properties of the general octahedron, we show how they apply to manipulators. We examine in detail the special configurations at which the 6 x 6 matrix of leg lines is singular, presenting results from the point of view of geometry in preference to analysis. In extending and enlarging on some known properties, a few behavioral surprises materialize. In studying special configurations, we start with the most general situation, and every other case derives from this. Our coverage is more comprehensive than any that we have found. We bring to light material that is, we think, of significant use to a designer.
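The singularity test behind the "6 x 6 matrix of leg lines" can be sketched numerically: form the Plücker coordinates of each leg line and check whether the 6 x 6 determinant vanishes. This is an illustrative computation, not the paper's geometric analysis; the example configuration (all six legs through one apex) is chosen only because its singularity is easy to verify:

```python
import math

def plucker_line(p, q):
    """Plücker coordinates (direction; moment) of the line through p and q."""
    d = [q[i] - p[i] for i in range(3)]
    m = [p[1]*d[2] - p[2]*d[1],       # moment = p x d
         p[2]*d[0] - p[0]*d[2],
         p[0]*d[1] - p[1]*d[0]]
    return d + m

def det6(rows):
    """Determinant by Gaussian elimination with partial pivoting."""
    a = [row[:] for row in rows]
    n, det = len(a), 1.0
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(a[r][c]))
        if abs(a[piv][c]) < 1e-12:
            return 0.0                # rank-deficient: singular configuration
        if piv != c:
            a[c], a[piv] = a[piv], a[c]
            det = -det
        det *= a[c][c]
        for r in range(c + 1, n):
            f = a[r][c] / a[c][c]
            for k in range(c, n):
                a[r][k] -= f * a[c][k]
    return det

# Six legs meeting at a common apex: their line coordinates span at most a
# three-dimensional space, so the leg-line matrix must be singular.
apex = (0.0, 0.0, 1.0)
feet = [(math.cos(i * math.pi / 3), math.sin(i * math.pi / 3), 0.0)
        for i in range(6)]
singular = det6([plucker_line(apex, f) for f in feet])
```

When the determinant vanishes, the legs can exert no wrench in at least one direction and the platform gains an uncontrollable freedom, which is why such configurations matter to a designer.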
Abstract:
Expokit provides a set of routines aimed at computing matrix exponentials. More precisely, it computes either a small matrix exponential in full, the action of a large sparse matrix exponential on an operand vector, or the solution of a system of linear ODEs with constant inhomogeneity. The backbone of the sparse routines consists of matrix-free Krylov subspace projection methods (Arnoldi and Lanczos processes), which is why the toolkit is capable of coping with sparse matrices of large dimension. The software handles real and complex matrices and provides specific routines for symmetric and Hermitian matrices. The computation of matrix exponentials is a numerical issue of critical importance in the area of Markov chains, where, furthermore, the computed solution is subject to probabilistic constraints. In addition to addressing general matrix exponentials, particular attention is given to the computation of transient states of Markov chains.
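For the dense "small matrix in full" case, the computation can be sketched in pure Python with a truncated Taylor series plus scaling and squaring. Expokit's own dense routines use Padé approximation (and Krylov projection for the sparse case), so this is only a conceptual stand-in; transient Markov-chain states then follow as p(t) = p(0)·exp(Qt) for a generator Q:

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(A, terms=20):
    """Dense matrix exponential via truncated Taylor series with scaling
    and squaring: exp(A) = (exp(A / 2**s)) ** (2**s)."""
    n = len(A)
    norm = max(sum(abs(x) for x in row) for row in A)   # infinity norm
    s = 0
    while norm > 0.5:                                   # scale until ||A/2**s|| <= 0.5
        norm /= 2.0
        s += 1
    B = [[x / 2 ** s for x in row] for row in A]
    result = [[float(i == j) for j in range(n)] for i in range(n)]   # identity
    term = [row[:] for row in result]
    for k in range(1, terms):                           # term_k = term_{k-1} B / k
        term = [[x / k for x in row] for row in mat_mul(term, B)]
        result = [[result[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    for _ in range(s):                                  # square s times
        result = mat_mul(result, result)
    return result

# Usage: exp of the 2x2 identity is diag(e, e).
E = expm([[1.0, 0.0], [0.0, 1.0]])
```

Scaling first keeps the Taylor series well conditioned; without it the series needs many more terms and loses accuracy for matrices of even moderate norm.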
Abstract:
Standard tools for the analysis of economic problems involving uncertainty, including risk premiums, certainty equivalents and the notions of absolute and relative risk aversion, are developed without making specific assumptions on functional form beyond the basic requirements of monotonicity, transitivity, continuity, and the presumption that individuals prefer certainty to risk. Individuals are not required to display probabilistic sophistication. The approach relies on the distance and benefit functions to characterize preferences relative to a given state-contingent vector of outcomes. The distance and benefit functions are used to derive absolute and relative risk premiums and to characterize preferences exhibiting constant absolute risk aversion (CARA) and constant relative risk aversion (CRRA). A generalization of the notion of Schur-concavity is presented. If preferences are generalized Schur concave, the absolute and relative risk premiums are generalized Schur convex, and the certainty equivalents are generalized Schur concave.
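The paper develops risk premiums and certainty equivalents without assuming a functional form; for contrast, the familiar expected-utility case with constant absolute risk aversion (CARA) has a closed-form certainty equivalent, sketched here as a baseline (the gamble and coefficient below are invented for illustration):

```python
import math

def cara_certainty_equivalent(outcomes, probs, a):
    """Certainty equivalent under CARA utility u(x) = -exp(-a*x):
    CE = -(1/a) * ln E[exp(-a*X)]."""
    eu = sum(p * math.exp(-a * x) for p, x in zip(probs, outcomes))
    return -math.log(eu) / a

# Usage: a 50/50 gamble over 0 or 20 has mean 10; the CE falls below the
# mean, and the gap (the absolute risk premium) grows with a.
ce = cara_certainty_equivalent([0.0, 20.0], [0.5, 0.5], a=0.1)
```

A CARA individual's risk premium is independent of initial wealth, which is exactly the kind of property the distance- and benefit-function approach characterizes without committing to this utility form.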
Abstract:
In this paper, the minimum-order stable recursive filter design problem is proposed and investigated. This problem plays an important role in pipelined implementations in signal processing. Here, the existence of a high-order stable recursive filter is proved theoretically, and an upper bound for the highest order of stable filters is given. Then the minimum-order stable linear predictor is obtained by solving an optimization problem. The popular genetic algorithm approach is adopted, since it is a heuristic probabilistic optimization technique that has been widely used in engineering design. Finally, an illustrative example is used to show the effectiveness of the proposed algorithm.
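A plain genetic algorithm of the kind the paper adopts (tournament selection, one-point crossover, bit-flip mutation) is sketched below. It is shown on a toy bit-counting objective rather than the filter-order problem, and all the operator choices and rates are illustrative defaults:

```python
import random

def genetic_minimize(fitness, n_bits, pop_size=30, gens=60, p_mut=0.05, seed=1):
    """Minimize `fitness` over fixed-length bit strings with a plain GA."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            p1 = min(rng.sample(pop, 3), key=fitness)    # tournament selection
            p2 = min(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)
            child = p1[:cut] + p2[cut:]                  # one-point crossover
            child = [b ^ (rng.random() < p_mut) for b in child]   # bit-flip mutation
            nxt.append(child)
        pop = nxt
        best = min(pop + [best], key=fitness)            # keep best-so-far
    return best, fitness(best)

# Toy objective: minimize the number of 1-bits (optimum is the all-zero string).
best, f_best = genetic_minimize(sum, n_bits=16)
```

In the filter-design setting the bit string would encode candidate predictor coefficients and the fitness would penalize prediction error and instability; here the encoding is left abstract.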
Abstract:
This study examines the relationship between management accounting and planning profiles in Brazilian companies. The main goal is to understand the consequences of not including a fully structured management accounting scheme in the planning process. The authors conducted field research among medium and large-sized companies, using a probabilistic sample from a population of 2281 companies. Using the analytic hierarchy process (AHP) and statistical cluster analysis, the authors grouped the entities' strategic budget planning processes into five profiles, after which they applied statistical tests to assess the five clusters. The study concludes that poorly or fully implemented strategic and budget-planning processes relate to the management accounting profiles of the Brazilian organizations studied. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
Objective: Existing evidence suggests that family interventions can be effective in reducing relapse rates in schizophrenia and related conditions. Despite this, such interventions are not routinely delivered in Australian mental health services. The objective of the current study is to investigate the incremental cost-effectiveness ratios (ICERs) of introducing three types of family interventions, namely: behavioural family management (BFM); behavioural intervention for families (BIF); and multiple family groups (MFG) into current mental health services in Australia. Method: The ICER of each of the family interventions is assessed from a health sector perspective, including the government, persons with schizophrenia and their families/carers using a standardized methodology. A two-stage approach is taken to the assessment of benefit. The first stage involves a quantitative analysis based on disability-adjusted life years (DALYs) averted. The second stage involves application of 'second filter' criteria (including equity, strength of evidence, feasibility and acceptability to stakeholders) to results. The robustness of results is tested using multivariate probabilistic sensitivity analysis. Results: The most cost-effective intervention, in order of magnitude, is BIF (A$8000 per DALY averted), followed by MFG (A$21 000 per DALY averted) and lastly BFM (A$28 000 per DALY averted). The inclusion of time costs makes BFM more cost-effective than MFG. Variation of discount rate has no effect on conclusions. Conclusions: All three interventions are considered 'value-for-money' within an Australian context. This conclusion needs to be tempered against the methodological challenge of converting clinical outcomes into a generic economic outcome measure (DALY). Issues surrounding the feasibility of routinely implementing such interventions need to be addressed.
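The multivariate probabilistic sensitivity analysis can be sketched as straightforward Monte Carlo over assumed cost and effect distributions. The distributions and numbers below are invented for illustration only; they are not the study's inputs, which were derived from its standardized health-sector costing:

```python
import random

def psa_icer(cost_draw, dalys_draw, n=5000, seed=0):
    """Probabilistic sensitivity analysis by Monte Carlo: draw incremental
    cost and DALYs averted from their distributions, form the incremental
    cost-effectiveness ratio each time, and summarize the distribution."""
    rng = random.Random(seed)
    icers = sorted(cost_draw(rng) / dalys_draw(rng) for _ in range(n))
    return {"median": icers[n // 2],
            "ci95": (icers[int(0.025 * n)], icers[int(0.975 * n)])}

# Hypothetical inputs chosen so the central ICER lands near A$8000 per DALY:
summary = psa_icer(
    cost_draw=lambda rng: rng.gauss(800_000, 100_000),   # incremental cost, A$
    dalys_draw=lambda rng: rng.gauss(100.0, 10.0),       # DALYs averted
)
```

Reporting the whole ICER distribution rather than a point estimate is what lets such studies say how robust a "value-for-money" conclusion is to parameter uncertainty.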
Abstract:
A comprehensive probabilistic model for simulating dendrite morphology and investigating dendritic growth kinetics during solidification has been developed, based on a modified Cellular Automaton (mCA) for microscopic modeling of nucleation, crystal growth, and solute diffusion. The mCA model numerically calculates solute redistribution in both the solid and liquid phases, the curvature of dendrite tips, and the growth anisotropy. The modeling takes account of thermal, curvature, and solute diffusion effects, and can therefore simulate microstructure formation on the scale of the dendrite tip length. The model was then applied to simulating dendritic solidification of an Al-7%Si alloy. Both directional and equiaxed dendritic growth were simulated to investigate the effects of growth anisotropy and cooling rate on dendrite morphology. Furthermore, the competitive growth and selection of dendritic crystals have also been investigated.
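The probabilistic capture rule that cellular-automaton solidification models build on can be reduced to a few lines. Curvature, solute diffusion, and growth anisotropy, which the mCA model layers on top of this rule, are deliberately omitted, so this is only the bare skeleton:

```python
import random

def ca_growth_step(grid, p_capture, rng):
    """One step of a minimal probabilistic cellular-automaton growth model:
    a liquid cell (0) adjacent to a solid cell (1) solidifies with
    probability p_capture (in an mCA this probability would depend on
    local undercooling, curvature, and crystallographic orientation)."""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j] == 0:
                touching = any(grid[(i + di) % n][(j + dj) % n] == 1
                               for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)))
                if touching and rng.random() < p_capture:
                    new[i][j] = 1
    return new

# Usage: a single nucleus; with certain capture, one step solidifies the
# nucleus's four neighbours.
rng = random.Random(7)
grid = [[0] * 7 for _ in range(7)]
grid[3][3] = 1
grid = ca_growth_step(grid, p_capture=1.0, rng=rng)
solid = sum(map(sum, grid))
```

Making p_capture directionally dependent is the simplest way such models introduce growth anisotropy, which is what turns circular grains into dendritic shapes.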
Abstract:
This study of breast cancer survival is based on analysis of five-year relative survival of 38 362 cases of invasive breast cancer in New South Wales (NSW) women, incident between 1972 and 1991, with follow-up to 1992, using data from the population-based NSW Central Cancer Registry. Survival was ascertained by matching the registry file of breast cancers against NSW death certificates from 1972 to 1992, mainly by automated probabilistic linkage. Absolute survival of cases was compared with expected survival of age- and period-matched NSW women. Proportional hazard regression analysis was used for examination of the effects on excess mortality of age, period of diagnosis and degree of spread at diagnosis. Relative survival at five years increased from 70 per cent in 1972-1976 to 77 per cent in 1987-1991. Survival improved during the 1970s and in the late 1980s. Regression analysis suggested that part of the improved survival in the late 1980s was due to lesser degree of spread at diagnosis, whereas the improved survival during the 1970s may have been due to treatment. Survival was better for those aged 40-49 years (RR = 0.86) and worse for those aged greater than or equal to 70 years (RR = 1.22) compared with the referent group (60-69 years). Excess mortality was much less for those with invasive localised disease than those with regional spread (RR = 3.1) or metastatic cancer (RR = 15.5) at diagnosis. For the most recent period (1987-1991), relative five-year survival was 90, 70 and 18 per cent, respectively, for the three degree-of-spread categories.
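The "automated probabilistic linkage" used to match registry records against death certificates typically follows the Fellegi-Sunter model; its agreement-weight score is sketched below. The field names and probabilities are invented, and this is a generic sketch, not the registry's actual linkage software:

```python
import math

def match_weight(agreements, m_probs, u_probs):
    """Fellegi-Sunter-style linkage score: the sum over compared fields of
    log2 likelihood ratios. Agreement on field i contributes log2(m_i/u_i),
    where m_i = P(agree | true match) and u_i = P(agree | non-match);
    disagreement contributes log2((1-m_i)/(1-u_i))."""
    w = 0.0
    for agree, m, u in zip(agreements, m_probs, u_probs):
        w += math.log2(m / u) if agree else math.log2((1 - m) / (1 - u))
    return w

# Usage: three compared fields (say surname, birth year, postcode --
# illustrative), with two agreeing and one disagreeing.
w = match_weight(agreements=[True, True, False],
                 m_probs=[0.9, 0.9, 0.9],
                 u_probs=[0.1, 0.1, 0.1])
```

Record pairs scoring above an upper threshold are accepted as links, those below a lower threshold rejected, and the band in between sent for clerical review.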
Abstract:
In spite of considerable technical advances in MRI techniques, the optical resolution of these methods is still limited. Consequently, the delineation of cytoarchitectonic fields based on probabilistic maps and brain volume changes, as well as small-scale changes seen in MRI scans, need to be verified by neuroanatomical/neuropathological diagnostic tools. To meet the current interdisciplinary needs of the scientific community, brain banks have to broaden their scope in order to provide high-quality tissue suitable for neuroimaging-neuropathology/anatomy correlation studies. The Brain Bank of the Brazilian Aging Brain Research Group (BBBABSG) of the University of Sao Paulo Medical School (USPMS) collaborates with researchers interested in neuroimaging-neuropathological correlation studies, providing brains submitted to postmortem in-situ MRI. In this paper we describe and discuss the parameters established by the BBBABSG to select and handle brains for fine-scale neuroimaging-neuropathological correlation studies, and to exclude unsuitable autopsy brains. We assessed the impact of postmortem time and storage of the corpse on the quality of the MRI scans, and sought to establish the fixation protocols most appropriate to these correlation studies. After investigation of a total of 36 brains, postmortem interval and low body temperature proved to be the main factors determining the quality of routine MRI protocols. Perfusion fixation of the brains after autopsy with mannitol 20% followed by formalin 20% was the best method for preserving the original brain shape and volume, and for allowing further routine and immunohistochemical staining. Taken together, these parameters offer a methodological advance in the screening and processing of human postmortem tissue, guaranteeing high-quality material for unbiased correlation studies and avoiding unnecessary expenditure on post-imaging analyses and histological processing of brain tissue.