858 results for Robust Probabilistic Model, Dyslexic Users, Rewriting, Question-Answering
Abstract:
Most approaches to stereo visual odometry reconstruct the motion based on the tracking of point features along a sequence of images. However, in low-textured scenes it is often difficult to find a large set of point features, or they may be poorly distributed over the image, so that the behavior of these algorithms deteriorates. This paper proposes a probabilistic approach to stereo visual odometry based on the combination of both point and line segment features that works robustly in a wide variety of scenarios. The camera motion is recovered through non-linear minimization of the projection errors of both point and line segment features. In order to combine both types of features effectively, their associated errors are weighted according to their covariance matrices, computed by propagating Gaussian error distributions from the sensor measurements. The method is, of course, computationally more expensive than using only one type of feature, but it still runs in real time on a standard computer and provides interesting advantages, including straightforward integration into any probabilistic framework commonly employed in mobile robotics.
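As a rough illustration of the covariance-weighted error fusion this abstract describes, the sketch below minimizes stacked point and line residuals, each whitened by its covariance so that the cost is a sum of Mahalanobis distances. The projection models, noise levels, and the 2-D translation standing in for the full 6-DoF camera motion are hypothetical placeholders, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import least_squares

def whiten(r, cov):
    # Multiply by the inverse Cholesky factor so that ||whitened r||^2
    # equals the Mahalanobis distance r^T cov^{-1} r.
    L = np.linalg.cholesky(cov)
    return np.linalg.solve(L, r)

def predict_point(t, p):   # hypothetical point projection model
    return p + t

def predict_line(t, n):    # hypothetical line-segment error model
    return n + 0.5 * t

def residuals(t, points, lines):
    # Stack whitened point and line residuals; features with larger
    # covariance contribute less to the total cost.
    res = []
    for p_meas, p_ref, cov in points:
        res.append(whiten(predict_point(t, p_ref) - p_meas, cov))
    for n_meas, n_ref, cov in lines:
        res.append(whiten(predict_line(t, n_ref) - n_meas, cov))
    return np.concatenate(res)

rng = np.random.default_rng(0)
t_true = np.array([0.3, -0.1])
points = [(predict_point(t_true, p) + 0.01 * rng.standard_normal(2), p,
           0.01 * np.eye(2)) for p in rng.standard_normal((5, 2))]
lines = [(predict_line(t_true, n) + 0.05 * rng.standard_normal(2), n,
          0.05 * np.eye(2)) for n in rng.standard_normal((3, 2))]

sol = least_squares(residuals, x0=np.zeros(2), args=(points, lines))
print(sol.x)  # should be close to t_true
```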
Abstract:
Investigation of large, destructive earthquakes is challenged by their infrequent occurrence and the remote nature of geophysical observations. This thesis sheds light on the source processes of large earthquakes from two perspectives: robust and quantitative observational constraints through Bayesian inference for earthquake source models, and physical insights on the interconnections of seismic and aseismic fault behavior from elastodynamic modeling of earthquake ruptures and aseismic processes.
To constrain the shallow deformation during megathrust events, we develop semi-analytical and numerical Bayesian approaches to explore the maximum resolution of the tsunami data, with a focus on incorporating the uncertainty in the forward modeling. These methodologies are then applied to invert for the coseismic seafloor displacement field in the 2011 Mw 9.0 Tohoku-Oki earthquake using near-field tsunami waveforms and for the coseismic fault slip models in the 2010 Mw 8.8 Maule earthquake with complementary tsunami and geodetic observations. From posterior estimates of model parameters and their uncertainties, we are able to quantitatively constrain the near-trench profiles of seafloor displacement and fault slip. Similar characteristic patterns emerge during both events, featuring the peak of uplift near the edge of the accretionary wedge with a decay toward the trench axis, with implications for fault failure and tsunamigenic mechanisms of megathrust earthquakes.
To understand the behavior of earthquakes at the base of the seismogenic zone on continental strike-slip faults, we simulate the interactions of dynamic earthquake rupture, aseismic slip, and heterogeneity in rate-and-state fault models coupled with shear heating. Our study explains the long-standing enigma of seismic quiescence on major fault segments known to have hosted large earthquakes by deeper penetration of large earthquakes below the seismogenic zone, where mature faults have well-localized creeping extensions. This conclusion is supported by the simulated relationship between seismicity and large earthquakes as well as by observations from recent large events. We also use the modeling to connect the geodetic observables of fault locking with the behavior of seismicity in numerical models, investigating how a combination of interseismic geodetic and seismological estimates could constrain the locked-creeping transition of faults and potentially their co- and post-seismic behavior.
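A minimal sketch of the kind of Gaussian Bayesian inversion outlined in the first part of this abstract, where forward-model (theory) uncertainty is added to the observational covariance before computing the posterior. The linear forward operator, dimensions, and covariance values are synthetic assumptions for illustration, not the thesis models.

```python
import numpy as np

# Synthetic linear forward problem d = G m + noise.
rng = np.random.default_rng(1)
n_data, n_model = 40, 8
G = rng.standard_normal((n_data, n_model))
m_true = rng.standard_normal(n_model)

C_obs = 0.05 * np.eye(n_data)   # observational noise covariance
C_fwd = 0.10 * np.eye(n_data)   # forward-model (theory) error covariance
C_d = C_obs + C_fwd             # total data covariance: the key step of
                                # folding modeling uncertainty into the data
d = G @ m_true + rng.multivariate_normal(np.zeros(n_data), C_d)

m0 = np.zeros(n_model)          # Gaussian prior mean
C_m = 1.0 * np.eye(n_model)     # Gaussian prior covariance

# Conjugate Gaussian posterior: mean and covariance in closed form.
Cd_inv, Cm_inv = np.linalg.inv(C_d), np.linalg.inv(C_m)
C_post = np.linalg.inv(G.T @ Cd_inv @ G + Cm_inv)
m_post = C_post @ (G.T @ Cd_inv @ d + Cm_inv @ m0)

print("posterior mean error:", np.linalg.norm(m_post - m_true))
print("posterior std:", np.sqrt(np.diag(C_post)))
```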
Abstract:
Sports and recreation management is addressed here through a model that combines the policies and methodologies applied in the Costa Rican context, arising from a concern with identifying the real needs in the sports, recreation, and health promotion fields through the different manifestations of human movement. This approach was developed over eight years of work at the Costa Rican Sports and Recreation Institute (Instituto Costarricense del Deporte y la Recreación, ICODER) together with different Costa Rican communities, both rural and urban, and local organizations such as Comprehensive Development Community Associations, Sports and Recreation Community Boards (CCDR), municipal mayorships, and NGOs, among others. This article particularly draws on the experience of the CCDRs as entities charged by the Costa Rican Government with promoting and managing municipal sports and recreation services through an offering suited to the needs of users or customers. Accordingly, the article aims to answer the question of how the Boards should manage efficiently while meeting the needs of public users or customers in the country's municipalities, by proposing a management model that serves as an additional instrument for improving the services already managed by these entities. The study presents a Costa Rican management model structured around the theoretical elements that currently define the organization and planning of sports and recreation as a service.
Abstract:
Classification schemes are built at a particular point in time; at inception, they reflect a worldview indicative of that time. This is their strength, but it results in potential weaknesses as worldviews change. For example, if a scheme of mathematics is not updated even though the state of the art has changed, then it is not a very useful scheme to users for the purposes of information retrieval. However, change in schemes is a good thing. Changing allows designers of schemes to update their model and serves as a responsible mediator between resources and users. But change does come at a cost. In the print world, we revise universal classification schemes—sometimes in drastic ways—and this means that over time, the power of a classification scheme to collocate is compromised if we do not account for scheme change in the organization of affected physical resources. If we understand this phenomenon in the print world, we can design ameliorations for the digital world.
Abstract:
Super elastic nitinol (NiTi) wires were exploited as highly robust supports for three distinct crosslinked polymeric ionic liquid (PIL)-based coatings in solid-phase microextraction (SPME). The oxidation of NiTi wires in a boiling (30% w/w) H2O2 solution and subsequent derivatization in vinyltrimethoxysilane (VTMS) allowed vinyl moieties to be appended to the surface of the support. UV-initiated on-fiber copolymerization of the vinyl-substituted NiTi support with monocationic ionic liquid (IL) monomers and dicationic IL crosslinkers produced a crosslinked PIL-based network that was covalently attached to the NiTi wire. This alteration alleviated the receding of the coating from the support that was observed for an analogous crosslinked PIL applied to unmodified NiTi wires. A series of demanding extraction conditions, including extreme pH, pre-exposure to pure organic solvents, and high temperatures, was applied to investigate the versatility and robustness of the fibers. Acceptable precision for the model analytes was obtained for all fibers under these conditions. Method validation, examining the relative recovery of a homologous group of phthalate esters (PAEs), was performed in drip-brewed coffee (maintained at 60 °C) by direct immersion SPME. Acceptable recoveries were obtained for most PAEs at the part-per-billion level, even in this exceedingly harsh and complex matrix.
Abstract:
Stavskaya's model is a one-dimensional probabilistic cellular automaton (PCA) introduced at the end of the 1960s as an example of a model displaying a nonequilibrium phase transition. Although its absorbing-state phase transition is well understood nowadays, the model had never received a full numerical treatment to investigate its critical behavior. In this Brief Report we characterize the critical behavior of Stavskaya's PCA by means of Monte Carlo simulations and finite-size scaling analysis. The critical exponents of the model are calculated and indicate that its phase transition belongs to the directed percolation universality class of critical behavior, as would be expected on the basis of the directed percolation conjecture. We also explicitly establish the relationship of the model with the Domany-Kinzel PCA on its directed site percolation line, a connection that seems to have gone unnoticed in the literature so far.
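A minimal Monte Carlo sketch of such a simulation, assuming the common formulation of Stavskaya's PCA in which each cell takes the maximum of itself and its right neighbour and is then independently reset to 0 with probability alpha, so that the all-0 configuration is absorbing. The lattice size, time horizon, and value of alpha are illustrative choices, not the parameters used in the Brief Report.

```python
import numpy as np

def stavskaya_step(state, alpha, rng):
    # Deterministic part: each cell takes the max of itself and its right
    # neighbour (periodic boundary); noise then resets each cell to 0
    # independently with probability alpha. All-0 is absorbing.
    new = np.maximum(state, np.roll(state, -1))
    return np.where(rng.random(state.size) < alpha, 0, new)

def density_decay(L=10_000, steps=2_000, alpha=0.3, seed=0):
    # Track the density of active (1) cells from a fully active start;
    # sweeping alpha locates the transition, and at criticality the decay
    # should follow the directed-percolation power law.
    rng = np.random.default_rng(seed)
    state = np.ones(L, dtype=np.int8)
    rho = []
    for _ in range(steps):
        state = stavskaya_step(state, alpha, rng)
        rho.append(state.mean())
    return np.array(rho)

rho = density_decay()
print(rho[::200])  # coarse view of the density decay
```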
Abstract:
This paper aims to formulate and investigate the application of various nonlinear H∞ control methods to a free-floating space manipulator subject to parametric uncertainties and external disturbances. From a tutorial perspective, a model-based approach and adaptive procedures based on linear parametrization, neural networks and fuzzy systems are covered by this work. A comparative study is conducted based on experimental implementations performed with an actual underactuated fixed-base planar manipulator which is, following the DEM concept, dynamically equivalent to a free-floating space manipulator.
Abstract:
The purpose of this study is to apply robust inverse dynamics control to a six-degree-of-freedom flight simulator motion system. From an implementation viewpoint, the inverse dynamics control law is simplified by assuming the control law matrices to be constant. The robust control strategy is applied in the outer loop of the inverse dynamics control to counteract the effects of imperfect compensation due to this simplification. The control strategy is designed using Lyapunov stability theory. Forward and inverse kinematics and a full dynamic model of a six-degree-of-freedom motion base driven by electromechanical actuators are briefly presented. A describing function, acceleration step response and some maneuvers computed from the washout filter were used to evaluate the performance of the controllers.
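A one-degree-of-freedom sketch of the control structure described here: inverse dynamics computed with a constant (simplified) inertia estimate, plus a smoothed robust term in the outer acceleration loop. The plant, gains, and saturation-type robust law are illustrative assumptions, not the paper's design.

```python
import numpy as np

def true_dynamics(q, qd, u):
    M = 2.0 + 0.5 * np.sin(q)        # true configuration-dependent inertia
    c = 1.0 * qd + 3.0 * np.sin(q)   # true damping + gravity torque
    return (u - c) / M               # qdd

M_hat, c_hat = 2.0, 0.0              # constant approximations (the
                                     # simplification the abstract mentions)
Kp, Kd, rho, eps = 25.0, 10.0, 8.0, 0.05

dt, T = 1e-3, 5.0
q, qd = 0.0, 0.0
for k in range(int(T / dt)):
    t = k * dt
    q_d, qd_d, qdd_d = np.sin(t), np.cos(t), -np.sin(t)  # reference motion
    e, ed = q_d - q, qd_d - qd
    s = ed + 5.0 * e                  # sliding-type error metric
    v = rho * np.tanh(s / eps)        # smoothed robust outer-loop correction
    a = qdd_d + Kd * ed + Kp * e + v  # outer-loop acceleration command
    u = M_hat * a + c_hat             # simplified inverse dynamics law
    qdd = true_dynamics(q, qd, u)
    qd += qdd * dt
    q += qd * dt

print("final tracking error:", abs(np.sin(T) - q))
```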
Abstract:
Ecological niche modelling combines species occurrence points with environmental raster layers in order to obtain models for describing the probabilistic distribution of species. The process of generating an ecological niche model is complex: it requires dealing with a large amount of data and using different software packages for data conversion, model generation and different types of processing and analyses, among other functionalities. A software platform that integrates all requirements under a single and seamless interface would be very helpful for users. Furthermore, since biodiversity modelling is constantly evolving, new requirements are constantly being added in terms of functions, algorithms and data formats. This evolution must be accompanied by any software intended to be used in this area. In this scenario, a Service-Oriented Architecture (SOA) is an appropriate choice for designing such systems. According to SOA best practices and methodologies, the design of a reference business process must be performed prior to the architecture definition. The purpose is to understand the complexities of the process (business process in this context refers to the ecological niche modelling problem) and to design an architecture able to offer a comprehensive solution, called a reference architecture, that can be further detailed when implementing specific systems. This paper presents a reference business process for ecological niche modelling, as part of a larger effort focused on the definition of a reference architecture based on SOA concepts that will be used to evolve the openModeller software package for species modelling. The basic steps performed while developing a model are described, highlighting important aspects based on the knowledge of modelling experts. In order to illustrate the steps defined for the process, an experiment was developed, modelling the distribution of Ouratea spectabilis (Mart.) Engl. (Ochnaceae) using openModeller. As a consequence of the knowledge gained with this work, many desirable improvements to the modelling software packages have been identified and are presented. A discussion of the potential for large-scale experimentation in ecological niche modelling is also provided, highlighting opportunities for research. The results are very important for those involved in the development of modelling tools and systems, for requirement analysis and for providing insight into new features and trends for this category of systems. They can also be very helpful to beginners in modelling research, who can use the process and the experiment example as a guide to this complex activity.
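As one concrete illustration of the core modelling step (combining occurrence points with environmental raster layers), the sketch below implements a Bioclim-style environmental envelope on toy data. It is a deliberate simplification of one classic niche-modelling algorithm, not openModeller's code.

```python
import numpy as np

# Minimal Bioclim-style envelope: the "model" is the per-layer range of
# environmental values observed at occurrence points; prediction checks
# whether a map cell falls inside that envelope. Toy data throughout.
rng = np.random.default_rng(2)

n_layers, H, W = 3, 50, 50
layers = rng.random((n_layers, H, W))          # environmental raster layers
occ = np.column_stack([rng.integers(0, H, 20),
                       rng.integers(0, W, 20)])  # occurrence cells (row, col)

env_at_occ = layers[:, occ[:, 0], occ[:, 1]]   # layer values at occurrences
lo = env_at_occ.min(axis=1)                    # envelope lower bounds
hi = env_at_occ.max(axis=1)                    # envelope upper bounds

inside = np.all((layers >= lo[:, None, None]) &
                (layers <= hi[:, None, None]), axis=0)
print("fraction of map predicted suitable:", inside.mean())
```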
Abstract:
There are several ways of controlling the propagation of a contagious disease. For instance, to reduce the spreading of an airborne infection, individuals can be encouraged to remain in their homes and/or to wear face masks outside their domiciles. However, when a limited number of masks is available, who should use them: the susceptible subjects, the infective persons, or both populations? Here we employ susceptible-infective-recovered (SIR) models, described in terms of ordinary differential equations and of probabilistic cellular automata, in order to investigate how the deletion of links in the random complex network representing the social contacts among individuals affects the dynamics of a contagious disease. The inspiration for this study comes from recent discussions about the impact of measures usually recommended by public health organizations for preventing the propagation of the swine influenza A (H1N1) virus. Our answer to this question may also be valid for other eco-epidemiological systems.
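A minimal sketch of the ODE side of such a study: the standard SIR equations, with a reduced transmission rate standing in for the deletion of contact links. Parameter values are illustrative assumptions, not taken from the paper.

```python
from scipy.integrate import solve_ivp

# Standard SIR equations: beta is the transmission rate and gamma the
# recovery rate. Lowering beta mimics deleting links in the contact
# network (e.g., staying home or wearing masks).
def sir(t, y, beta, gamma):
    S, I, R = y
    return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

y0 = [0.999, 0.001, 0.0]      # fractions: susceptible, infective, recovered
for beta in (0.5, 0.3):       # link deletion lowers the effective beta
    sol = solve_ivp(sir, (0, 200), y0, args=(beta, 0.1))
    print(f"beta={beta}: final epidemic size = {sol.y[2, -1]:.3f}")
```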
Abstract:
The University of Queensland, Australia, has developed Fez, a world-leading user-interface and management system for Fedora-based institutional repositories, which bridges the gap between a repository and its users. Christiaan Kortekaas, Andrew Bennett and Keith Webster will review this open source software, which gives institutions the power to create a comprehensive repository solution without the hassle.
Abstract:
Modeling volcanic phenomena is complicated by free surfaces that often support large rheological gradients. Analytical solutions and analogue models provide explanations for fundamental characteristics of lava flows, but more sophisticated models are needed, incorporating improved physics and rheology, to capture realistic events. To advance our understanding of the flow dynamics of highly viscous lava in Peléean lava dome formation, axisymmetric finite element method (FEM) models of generic endogenous dome growth have been developed. We use a novel technique, the level-set method, which tracks a moving interface while leaving the mesh unaltered. The model equations are formulated in an Eulerian framework. In this paper we test the quality of this technique in our numerical scheme by considering existing analytical and experimental models of lava dome growth that assume a constant Newtonian viscosity. We then compare our model against analytical solutions for real lava domes extruded on Soufrière, St. Vincent, W.I. in 1979 and Mount St. Helens, USA in October 1980, using an effective viscosity. The level-set method is found to be computationally light and robust enough to model the free surface of a growing lava dome. Moreover, modeling the extruded lava with a constant pressure head naturally results in a drop in extrusion rate with increasing dome height, which can explain lava dome growth observables more appropriately than a fixed extrusion rate. From the modeling point of view, the level-set method will ultimately provide an opportunity to capture more of the physics while benefiting from the numerical robustness of regular grids.
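A 1-D sketch of the level-set idea the paper relies on: the interface is the zero crossing of an implicit function advected on a fixed grid, so the mesh itself never moves or deforms. The constant velocity, toy grid, and first-order upwind scheme are assumptions for illustration, far simpler than the paper's axisymmetric FEM models.

```python
import numpy as np

N, u, T = 400, 0.3, 1.0
dx = 1.0 / N
dt = 0.5 * dx / u                  # CFL-stable time step
x = dx * np.arange(N)
phi = x - 0.2                      # signed distance; interface at x = 0.2

for _ in range(int(round(T / dt))):
    # First-order upwind update of phi_t + u * phi_x = 0 (u > 0).
    phi[1:] -= u * dt / dx * (phi[1:] - phi[:-1])
    phi[0] = phi[1] - dx           # keep the inflow boundary consistent

# The interface (zero crossing of phi) should now sit near
# x = 0.2 + u*T = 0.5, with no remeshing having taken place.
print("interface position:", x[np.argmin(np.abs(phi))])
```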
Abstract:
RWMODEL II simulates the Rescorla-Wagner model of Pavlovian conditioning. It is written in Delphi and runs under Windows 3.1 and Windows 95. The program was designed for novice and expert users and can be employed in teaching as well as in research. It is user-friendly and requires a minimal level of computer literacy, yet is sufficiently flexible to permit a wide range of simulations. It allows the display of empirical data against which predictions from the model can be validated.
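For reference, this is the learning rule RWMODEL II simulates, in a minimal Python form: on each trial, every conditioned stimulus (CS) present is updated by delta_V = alpha * beta * (lambda - V_total), where V_total sums the associative strengths of all CSs present. The parameter values and the blocking demonstration are illustrative, not RWMODEL II's defaults.

```python
def rescorla_wagner(trials, alpha, beta=0.5, lam_reinforced=1.0):
    # alpha: per-CS salience; beta: learning rate for the outcome;
    # lam: asymptote (lam_reinforced on reinforced trials, 0 otherwise).
    V = {cs: 0.0 for cs in alpha}
    history = []
    for present, reinforced in trials:   # e.g. ({"light", "tone"}, True)
        lam = lam_reinforced if reinforced else 0.0
        V_total = sum(V[cs] for cs in present)
        for cs in present:
            V[cs] += alpha[cs] * beta * (lam - V_total)
        history.append(dict(V))
    return history

# Blocking demo: pretrain "light", then reinforce the compound light+tone;
# the tone should acquire little associative strength.
alpha = {"light": 0.3, "tone": 0.3}
trials = [({"light"}, True)] * 20 + [({"light", "tone"}, True)] * 20
hist = rescorla_wagner(trials, alpha)
print("tone strength after compound training:", round(hist[-1]["tone"], 3))
```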
Abstract:
Normal mixture models are increasingly being used to model the distributions of a wide variety of random phenomena and to cluster sets of continuous multivariate data. However, for a data set containing a group or groups of observations with longer-than-normal tails or atypical observations, the use of normal components may unduly affect the fit of the mixture model. In this paper, we consider a more robust approach in which the data are modelled by a mixture of t distributions. The use of the ECM algorithm to fit this t mixture model is described, and examples of its use are given in the context of clustering multivariate data in the presence of atypical observations in the form of background noise.
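A minimal univariate sketch of fitting a two-component t mixture by EM with fixed degrees of freedom nu; the multivariate case and the estimation of nu covered by the paper's ECM treatment are omitted. Initialization and data are toy assumptions.

```python
import numpy as np
from scipy.stats import t as t_dist

def t_mixture_em(x, nu=3.0, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    pi = np.array([0.5, 0.5])
    mu = rng.choice(x, size=2, replace=False)
    sigma = np.array([x.std(), x.std()])
    for _ in range(n_iter):
        # E-step: responsibilities and the t-model's latent scale weights.
        dens = np.stack([pi[j] * t_dist.pdf(x, nu, loc=mu[j], scale=sigma[j])
                         for j in range(2)])
        tau = dens / dens.sum(axis=0)
        delta2 = np.stack([((x - mu[j]) / sigma[j]) ** 2 for j in range(2)])
        u = (nu + 1.0) / (nu + delta2)   # down-weights outlying points
        # CM-steps: weighted updates of mixing proportion, location, scale.
        pi = tau.mean(axis=1)
        mu = (tau * u * x).sum(axis=1) / (tau * u).sum(axis=1)
        sigma = np.sqrt((tau * u * (x - mu[:, None]) ** 2).sum(axis=1)
                        / tau.sum(axis=1))
    return pi, mu, sigma

# Two clusters plus heavy background noise that would distort a normal mixture.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3, 1, 300), rng.normal(3, 1, 300),
                    rng.uniform(-20, 20, 40)])
print(t_mixture_em(x))
```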
Abstract:
Almost all leprosy cases reported in industrialized countries occur amongst immigrants or refugees from developing countries, where leprosy continues to be an important health issue. Whether to screen for leprosy is an important question for governments in countries with immigration and refugee programmes. A decision analysis framework is used to evaluate leprosy screening. The analysis uses a set of criteria and parameters regarding leprosy screening, together with available data, to estimate the number of cases that would be detected by a leprosy screening programme for immigrants from countries with different leprosy prevalences, compared with a policy of waiting for immigrants who develop symptomatic clinical disease to present for health care. In a cohort of 100,000 immigrants from high-prevalence regions (3.6/10,000), screening would detect 32 of the 42 cases that would arise in the destination country over the 14 years after migration; from medium-prevalence areas (0.7/10,000), 6.3 of the total 8.1 cases would be detected; and from low-prevalence regions (0.2/10,000), 1.8 of 2.3 cases. Using Australian data, the migrant mix would produce 74 leprosy cases from 10 years' intake; screening would detect 54, and 19 would be diagnosed subsequently after migration. Screening would only produce a significant case yield amongst immigrants from regions or social groups with high leprosy prevalence. Since the number of immigrants to Australia from countries of higher endemicity is not large, routine leprosy screening would have a small impact on case incidence.