929 results for Probabilistic Error Correction
Abstract:
This work studies the combination of safe and probabilistic reasoning through the hybridization of Monte Carlo integration techniques with continuous constraint programming. In continuous constraint programming, variables range over continuous domains (represented as intervals) and are subject to constraints (relations between variables); the goal is to find values for those variables that satisfy all the constraints (consistent scenarios). Constraint programming "branch-and-prune" algorithms produce safe enclosures of all consistent scenarios. Algorithms proposed for probabilistic constraint reasoning compute the probability of sets of consistent scenarios, which requires the calculation of an integral over these sets (quadrature). In this work we propose to extend the "branch-and-prune" algorithms with Monte Carlo integration techniques to compute such probabilities. This approach can be useful in robotics for localization problems. Traditional approaches are based on probabilistic techniques that search for the most likely scenario, which may not satisfy the model constraints. We show how to apply our approach to cope with this problem and provide functionality in real time.
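To make the quadrature step concrete, the following is a minimal Python sketch of estimating the probability mass of the consistent scenarios inside a single interval box by plain Monte Carlo sampling, assuming a uniform prior over the initial domain. The function name and the disc constraint are invented for illustration; this is not the paper's branch-and-prune hybrid, which would refine and sum over many such boxes.

```python
import random

# Hedged sketch: estimate the probability mass of the set of consistent
# scenarios inside one interval box, assuming a uniform prior. A full
# branch-and-prune hybrid would combine such estimates over many pruned boxes.

def monte_carlo_box_probability(box, constraint, n_samples=100_000):
    """Return volume(box) * fraction of uniform samples satisfying constraint."""
    volume = 1.0
    for lo, hi in box:
        volume *= hi - lo
    hits = sum(constraint([random.uniform(lo, hi) for lo, hi in box])
               for _ in range(n_samples))
    return volume * hits / n_samples

# Example: mass of the unit disc within the box [-1, 1]^2 (exact value: pi).
box = [(-1.0, 1.0), (-1.0, 1.0)]
inside_disc = lambda p: p[0] ** 2 + p[1] ** 2 <= 1.0
print(monte_carlo_box_probability(box, inside_disc))  # ~3.14
```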
Abstract:
Error can be a symptom of a student's progress; however, when poorly managed, its correction can have the opposite effect, causing the error to fossilize. Not all students appreciate correction: while some expect to be corrected, others avoid error by not participating actively in classroom activities. Consequently, not all correction and remediation strategies are well received by every student. For this reason, we propose the use of new technologies and educational games to motivate students towards self- and peer-correction, treating error as part of the learning process.
Abstract:
This paper is mainly concerned with the tracking accuracy of Exchange Traded Funds (ETFs) listed on the London Stock Exchange (LSE) but also evaluates their performance and pricing efficiency. The findings show that ETFs offer virtually the same return but exhibit higher volatility than their benchmark. It seems that the pricing efficiency, which should come from the creation and redemption process, does not fully hold as equity ETFs show consistent price premiums. The tracking error of the funds is generally small and is decreasing over time. The risk of the ETF, daily price volatility and the total expense ratio explain a large part of the tracking error. Trading volume, fund size, bid-ask spread and average price premium or discount did not have an impact on the tracking error. Finally, it is concluded that market volatility and the tracking error are positively correlated.
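As a concrete reference for the headline metric, one standard definition of tracking error is the standard deviation of the differences between fund and benchmark returns. The Python sketch below uses invented daily return series; the annualization by sqrt(252) and the sample data are assumptions for illustration, not the paper's exact estimator.

```python
import statistics

# Hedged sketch: tracking error as the standard deviation of the daily
# return differences between an ETF and its benchmark. Return series are
# invented for illustration.

def tracking_error(fund_returns, benchmark_returns):
    differences = [f - b for f, b in zip(fund_returns, benchmark_returns)]
    return statistics.stdev(differences)

fund = [0.0012, -0.0030, 0.0045, 0.0001, -0.0018]   # daily returns
bench = [0.0010, -0.0028, 0.0050, 0.0003, -0.0020]
daily_te = tracking_error(fund, bench)
annualized_te = daily_te * 252 ** 0.5               # ~252 trading days/year
print(f"daily TE = {daily_te:.6f}, annualized TE = {annualized_te:.4f}")
```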
Abstract:
The assessment of existing timber structures is often limited to information obtained from non- or semi-destructive testing, as mechanical testing is in many cases not possible due to its destructive nature. Therefore, the available data provide only an indirect measurement of the reference mechanical properties of timber elements, often obtained through empirically based correlations. Moreover, the data must result from the combination of different tests in order to provide a reliable source of information for a structural analysis. Even if general guidelines are available for each typology of testing, there is still a need for a global methodology that combines information from different sources and draws inferences upon that information in a decision process. In this scope, the present work presents the implementation of a probabilistic framework for the safety assessment of existing timber elements. This methodology combines information gathered at different scales and follows a probabilistic framework that allows the structural assessment of existing timber elements, with the possibility of inference and updating of their mechanical properties through Bayesian methods. The framework rests on four main steps: (i) scale of information; (ii) measurement data; (iii) probability assignment; and (iv) structural analysis. In this work, the proposed methodology is implemented in a case study. Data were obtained through a multi-scale experimental campaign carried out on old chestnut timber beams, accounting for correlations of non- and semi-destructive tests with mechanical properties. Finally, different inference scenarios are discussed, aiming at the characterization of the safety level of the elements.
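As a minimal sketch of the probability-assignment and updating steps, the following assumes a conjugate normal-normal model with known observation variance, which gives a closed-form Bayesian update of a reference property. The prior, the test data and the variances are invented, and the paper's framework is considerably richer (multiple scales and correlated tests).

```python
# Hedged sketch: conjugate normal-normal Bayesian update of a timber
# reference property (e.g. bending strength in MPa). Prior parameters and
# test data are invented; observation variance is assumed known.

def normal_update(prior_mean, prior_var, data, data_var):
    """Posterior mean and variance for a normal prior and normal likelihood."""
    n = len(data)
    sample_mean = sum(data) / n
    post_var = 1.0 / (1.0 / prior_var + n / data_var)
    post_mean = post_var * (prior_mean / prior_var + n * sample_mean / data_var)
    return post_mean, post_var

# Prior from lower-scale (non-destructive) information, updated with
# semi-destructive test results on the same element.
post_mean, post_var = normal_update(prior_mean=40.0, prior_var=25.0,
                                    data=[35.1, 38.4, 36.9], data_var=16.0)
print(f"posterior: mean = {post_mean:.2f}, var = {post_var:.2f}")
```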
Abstract:
A novel framework for probabilistic structural assessment of existing structures, which combines model identification and reliability assessment procedures and considers different sources of uncertainty in an objective way, is presented in this paper. A short description of structural assessment applications reported in the literature is given first. Then, the developed model identification procedure, supported by a robust optimization algorithm, is presented. Special attention is given to the experimental and numerical errors to be considered in this algorithm's convergence criterion. An updated numerical model is obtained from this process. The reliability assessment procedure, which considers a probabilistic model of the structure under analysis, is then introduced, incorporating the results of the model identification procedure. The developed model is then updated as new data are acquired, through a Bayesian inference algorithm that explicitly addresses statistical uncertainty. Finally, the developed framework is validated with a set of reinforced concrete beams, which were loaded up to failure in the laboratory.
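To illustrate the reliability-assessment step in isolation, the sketch below estimates a failure probability P(R < S) for an invented normal resistance and load-effect pair by crude Monte Carlo and converts it to a reliability index via the standard normal quantile. The distribution parameters are assumptions; the paper's procedure would additionally feed in the identified and Bayesian-updated model.

```python
import random
from statistics import NormalDist

# Hedged sketch: crude Monte Carlo estimate of P(R < S) for an invented
# limit state, converted to a reliability index beta. Parameters are
# illustrative only.

def failure_probability(n_samples=200_000):
    failures = 0
    for _ in range(n_samples):
        resistance = random.gauss(30.0, 4.0)    # R (assumed normal)
        load_effect = random.gauss(18.0, 3.0)   # S (assumed normal)
        if resistance < load_effect:
            failures += 1
    return failures / n_samples

pf = failure_probability()
beta = -NormalDist().inv_cdf(pf)                # reliability index
print(f"P_f ~ {pf:.2e}, beta ~ {beta:.2f}")
```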
Abstract:
Series title: SpringerBriefs in Applied Sciences and Technology, ISSN 2191-530X
Abstract:
In recent decades, increased interest has been evident in research on multi-scale hierarchical modelling, both in the field of mechanics and in the field of wood products and timber engineering. One of the main motivations for hierarchical modelling is to understand how properties, composition and structure at lower scale levels may influence, and be used to predict, the material properties at the macroscopic and structural engineering scales. This chapter presents the applicability of statistical and probabilistic methods, such as the Maximum Likelihood method and Bayesian methods, to the representation of timber's mechanical properties and their inference, accounting for prior information obtained at different scales of importance. These methods allow the analysis of distinct timber reference properties, such as density, bending stiffness and strength, and hierarchically consider information obtained through different non-, semi- or fully destructive tests. The basis and fundamentals of the methods are described, and recommendations and limitations are discussed. The methods may be used in several contexts; however, they require expert knowledge to assess the correct statistical fit and to define the correlation structure between properties.
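As a toy illustration of the Maximum Likelihood method mentioned above: for a normal model the ML estimates have closed form, namely the sample mean and the 1/n sample variance. The density values below are invented; as the chapter notes, choosing and checking the statistical fit in practice requires expert judgement.

```python
from math import log, pi

# Hedged sketch: closed-form maximum likelihood estimates for a normal
# model of timber density (kg/m^3). Data are invented.

def normal_mle(data):
    n = len(data)
    mu = sum(data) / n
    var = sum((x - mu) ** 2 for x in data) / n   # MLE uses 1/n, not 1/(n-1)
    return mu, var

densities = [545.0, 560.0, 532.0, 571.0, 549.0, 538.0]
mu, var = normal_mle(densities)
log_likelihood = -0.5 * len(densities) * (log(2 * pi * var) + 1)  # at optimum
print(f"mu = {mu:.1f}, var = {var:.1f}, logL = {log_likelihood:.2f}")
```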
Abstract:
We describe the case of a 40-day-old female patient with a history of breathlessness since birth who was referred to our hospital for surgical correction of common arterial trunk. The invasive investigation disclosed a Fallot's tetralogy anatomy associated with an anomalous origin of the left pulmonary artery from the ascending aorta. Immediately after diagnosis, the patient underwent a successful total surgical correction of the defect, including simultaneous anastomosis of the left pulmonary artery to the pulmonary trunk.
Abstract:
OBJECTIVE: To evaluate whether left ventricular end-systolic diameters (ESD) ≥ 51 mm in patients (pt) with severe chronic mitral regurgitation (MR) are predictors of a poor prognosis after mitral valve surgery (MVS). METHODS: Eleven pt (aged 36±13 years) were studied in the preoperative period (pre), median of 36 days; in the early postoperative period (post1), median of 9 days; and in the late postoperative period (post2), mean of 38.5±37.6 months. Clinical and echocardiographic data were gathered from each pt with MR and systolic diameter ≥ 51 mm (mean = 57±4 mm) to evaluate the result of MVS. Ten patients were in NYHA class III/IV. RESULTS: All but 2 pt improved in functional class. Two pt died from heart failure and infectious endocarditis 14 and 11 months, respectively, after valve replacement. According to the ejection fraction (EF) in post2, we identified 2 groups: group 1 (n=6), whose EF decreased in post1 but increased in post2 (p=0.01), and group 2 (n=5), whose EF decreased progressively from post1 to post2 (p=0.10). All pt with symptoms lasting ≤ 48 months had improvement in EF in post2 (p=0.01). CONCLUSION: ESD ≥ 51 mm is not always associated with a poor prognosis after MVS in patients with MR. Symptoms lasting up to 48 months are associated with improvement in left ventricular function.
Abstract:
We present a case of aneurysmal dilation of the residual aortic segment involving the abdominal vessels after corrective surgery for thoracoabdominal aortic aneurysm. By identifying risk groups for recurrent dilation, we aim at the use of a specific operative technique with a branched graft to prevent aneurysm relapse.
Abstract:
The objective pursued by an audit of financial statements is the communication, by the auditor, of a conclusion regarding the degree of reasonableness with which those statements reflect the entity's net-worth, economic and financial position according to the criteria established in the applicable accounting standards. If an auditor issues an erroneous conclusion as a result of their work, they may incur professional, civil and criminal liability arising from claims by users of the financial statements who may have been harmed as a consequence of that erroneous conclusion. National and international accounting standards admit the existence of errors or omissions in the information contained in financial statements, as long as such deviations do not lead the interested users of those statements to a decision different from the one they would make if the errors or omissions did not exist. From the above arises the crucial importance that the determination of the overall materiality level (the level of deviation accepted by the users of the financial statements in the information they contain) acquires in audit processes, as well as the allocation of that level among the different components of the financial statements (allocation of tolerable error), so that auditors avoid incurring professional, civil and/or criminal liability. To date, no mathematical models are known that support, in an objective and verifiable way, the calculation of the overall materiality level and the allocation of tolerable error among the different elements making up the financial statements. We believe that the development and integration of a model for quantifying the overall materiality level and allocating tolerable error would have the following repercussions: 1 – it would give the auditor an element supporting how the materiality level is quantified and how the tolerable error is allocated among the components of the financial statements; 2 – it would allow auditors to reduce the possibility of incurring professional, civil and/or criminal liability as a consequence of their work; 3 – it would represent a first step towards national and international audit standard-setting bodies adopting elements with which to establish guidelines on the calculation of the materiality level and the allocation of tolerable error; 4 – it would eliminate the calculation of the materiality level as a barrier affecting the comparability of financial statements.
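Since the abstract argues that no objective model yet exists, the following is a purely hypothetical Python sketch of the simplest conceivable allocation rule, distributing an overall materiality level among components in proportion to their balances. The function, the component names and the figures are all invented, and the authors' proposed model may differ entirely.

```python
# Purely hypothetical sketch: proportional allocation of overall
# materiality (tolerable error) among financial-statement components.
# All names and figures are invented for illustration.

def allocate_tolerable_error(overall_materiality, component_balances):
    total = sum(component_balances.values())
    return {name: overall_materiality * balance / total
            for name, balance in component_balances.items()}

balances = {"receivables": 400_000, "inventory": 250_000, "fixed_assets": 350_000}
print(allocate_tolerable_error(overall_materiality=50_000,
                               component_balances=balances))
```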
Abstract:
The classical central limit theorem states the uniform convergence of the distribution functions of the standardized sums of independent and identically distributed square-integrable real-valued random variables to the standard normal distribution function. While the first versions of the central limit theorem are already due to de Moivre (1730) and Laplace (1812), a systematic study of this topic started at the beginning of the last century with the fundamental work of Lyapunov (1900, 1901). Meanwhile, extensions of the central limit theorem are available for a multitude of settings, including, e.g., Banach-space-valued random variables as well as substantial relaxations of the assumptions of independence and identical distributions. Furthermore, explicit error bounds have been established, and asymptotic expansions are employed to obtain better approximations. Classical error estimates like the famous bound of Berry and Esseen are stated in terms of absolute moments of the random summands and therefore do not reflect a potential closeness of the distributions of the single random summands to a normal distribution. Non-classical approaches take this issue into account by providing error estimates based on, e.g., pseudomoments. The latter field of investigation was initiated by the work of Zolotarev in the 1960s and is still in its infancy compared with the development of the classical theory. For example, non-classical error bounds for asymptotic expansions seem not to be available up to now ...
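For reference, the Berry-Esseen bound mentioned above can be stated as follows: if the summands X_1, X_2, ... are independent and identically distributed with mean \mu, variance \sigma^2 > 0 and finite third absolute moment \rho = \mathbb{E}|X_1 - \mu|^3, then, with \Phi the standard normal distribution function and C an absolute constant,

\[
\sup_{x \in \mathbb{R}} \left| P\!\left( \frac{S_n - n\mu}{\sigma\sqrt{n}} \le x \right) - \Phi(x) \right| \le \frac{C\,\rho}{\sigma^{3}\sqrt{n}},
\qquad S_n = X_1 + \cdots + X_n .
\]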
Abstract:
Magdeburg, Univ., Faculty of Electrical Engineering and Information Technology, Diss., 2010