966 results for Algebra of Errors
Automatic location of control points in aerial images based on vertical ground scenes
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Brazil has a strong trading relationship with several countries, including France, which has intensified these links in recent years and intends to do so yet further. Legal documents regulate this trade, resulting in a set of terms which designate concepts specific to this area. Communication between Brazilian and French buyers and sellers is intense and does not allow for errors in understanding orders for merchandise or the terms of purchase and sale. It is therefore very important that agents of International Trade between Brazil and France should have access to a specialised terminographic tool in the area, containing the relevant terms used in French and Portuguese. No such work currently exists; we therefore decided to make a contribution and draw up a proposal for a bilingual French-Portuguese dictionary in this specialised area. During our research, we registered a significant presence of English terms in International Trade texts originally written in Portuguese and in French, which may be explained by the fact that English currently has the role of global lingua franca. However, it is well known that France operates a policy of linguistic protectionism, making the use of French obligatory in all sectors of activity in France. This raised a question: how should one deal with English terms in a bilingual French-Portuguese dictionary? In order to begin the search for an answer to this question, we decided to examine the treatment given to English terms in the area of International Trade in some French dictionaries. In this paper we shall present the principal results obtained during our research.
Abstract:
Graduate Program in Agronomy (Plant Production) - FCAV
Abstract:
The aim of this study was to evaluate the evolution of calibration and maintenance practices for crop sprayers in soybean production areas in Brazil, in the 2006 and 2007 seasons, based on Project IPP data. The evaluation covered issues related to calibration, maintenance condition and the main components of 103 sprayers distributed in the following states: Rio Grande do Sul (35), Paraná (60) and Mato Grosso do Sul (8). The evaluations were done at the rate of one sprayer per farm. The most frequent problems were related to the pressure gauge, spray leaks and calibration errors greater than 50% of the desired volume rate. The analysis of the application rate showed a tendency for the farmers to apply volume rates below the desired value. In 2006 the errors of the application rate were significant, with 70.4% for Rio Grande do Sul State, 74.5% for Paraná State and 37.5% for Mato Grosso do Sul State. In 2007 there was a reduction of errors, with averages of 50.0% for Rio Grande do Sul and 66.7% for Paraná. In general terms, the results showed improvements in the use, maintenance and calibration of crop sprayers in the areas covered by Project IPP, with reductions in the average indexes for calibration errors, leaks and bad tips, among other issues.
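For reference, the calibration error discussed above is the percentage deviation of the applied volume rate from the target rate; the minimal Python sketch below shows one way such a deviation could be computed and checked against a 50% tolerance. The function name, the example figures and the tolerance check are illustrative assumptions, not Project IPP procedures.

def application_rate_error(applied_l_ha: float, target_l_ha: float) -> float:
    """Percentage deviation of the applied volume rate from the target rate."""
    return 100.0 * (applied_l_ha - target_l_ha) / target_l_ha

# Illustrative check against a 50% tolerance on the desired volume rate.
error = application_rate_error(applied_l_ha=70.0, target_l_ha=150.0)
print(f"error = {error:+.1f}% (out of tolerance: {abs(error) > 50.0})")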
Abstract:
End-user programmers are increasingly relying on web authoring environments to create web applications. Although often consisting primarily of web pages, such applications are increasingly going further, harnessing the content available on the web through “programs” that query other web applications for information to drive other tasks. Unfortunately, errors can be pervasive in web applications, impacting their dependability. This paper reports the results of an exploratory study of end-user web application developers, performed with the aim of exposing prevalent classes of errors. The results suggest that end-users struggle the most with the identification and manipulation of variables when structuring requests to obtain data from other web sites. To address this problem, we present a family of techniques that help end-user programmers perform this task, reducing possible sources of error. The techniques focus on simplification and characterization of the data that end-users must analyze while developing their web applications. We report the results of an empirical study in which these techniques are applied to several popular web sites. Our results reveal several potential benefits for end-users who wish to “engineer” dependable web applications.
Abstract:
This paper presents a multilayer perceptron neural network combined with the Nelder-Mead simplex method to detect damage in multiple-support beams. The input parameters are based on natural frequencies and modal flexibility. It was assumed that only a limited number of modes were available and that only vertical degrees of freedom were measured. The reliability of the proposed methodology is assessed through the generation of random damage scenarios and the definition of three types of errors that can be found during the damage identification process. Results show that the methodology can reliably determine the damage scenarios. However, its application to large beams may be limited by the high computational cost of training the neural network.
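As a rough illustration of how such a hybrid scheme could be wired together, the sketch below trains a small multilayer perceptron to map modal features to a damage vector and then refines the network's estimate with the Nelder-Mead simplex method. The surrogate model simulate_features, the network size and all data are illustrative assumptions, not the formulation used in the paper.

# Hedged sketch: an MLP maps modal features (natural frequencies and modal
# flexibility terms) to a damage vector; Nelder-Mead then refines that
# estimate by minimizing the misfit against the "measured" features.
import numpy as np
from scipy.optimize import minimize
from sklearn.neural_network import MLPRegressor

_W = np.linspace(0.1, 0.9, 24).reshape(4, 6)   # fixed toy sensitivity matrix

def simulate_features(damage: np.ndarray) -> np.ndarray:
    """Placeholder for a structural (e.g. finite-element) model of the beam."""
    return np.tanh(damage @ _W)

rng = np.random.default_rng(0)
damage_train = rng.uniform(0.0, 0.5, size=(500, 4))            # 4 candidate damage zones
features_train = np.vstack([simulate_features(d) for d in damage_train])

mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
mlp.fit(features_train, damage_train)                            # features -> damage

measured = simulate_features(np.array([0.0, 0.3, 0.0, 0.1]))     # pretend measurement
x0 = mlp.predict(measured.reshape(1, -1)).ravel()                # MLP first guess

res = minimize(lambda d: np.sum((simulate_features(d) - measured) ** 2),
               x0, method="Nelder-Mead")                         # simplex refinement
print("MLP estimate:", x0.round(3), "refined:", res.x.round(3))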
Abstract:
Bol algebras appear as the tangent algebras of Bol loops. A (left) Bol algebra is a vector space equipped with a binary operation [a, b] and a ternary operation {a, b, c} that satisfy five defining identities. If A is a left or right alternative algebra, then A(b) is a Bol algebra, where [a, b] := ab - ba is the commutator and {a, b, c} := <b, c, a> is the Jordan associator. A special identity is an identity satisfied by A(b) for all right alternative algebras A but not satisfied by the free Bol algebra. We show that there are no special identities of degree <= 7, but that there are special identities of degree 8. We obtain all the special identities of degree 8 in partition six-two. (C) 2011 Elsevier Inc. All rights reserved.
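For readability, the two operations named above can be written in display form; the restatement below assumes the usual Jordan product a∘b = (ab+ba)/2, and the exact sign and argument-order conventions of the cited paper may differ.

% Hedged restatement of the operations defined in the abstract.
\[
  [a,b] = ab - ba, \qquad
  \{a,b,c\} = \langle b,c,a \rangle, \qquad
  \langle x,y,z \rangle = (x \circ y) \circ z - x \circ (y \circ z),
  \quad x \circ y = \tfrac{1}{2}(xy + yx).
\]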
Abstract:
The purpose of this study was to evaluate the visual outcome of chronic occupational exposure to a mixture of organic solvents by measuring color discrimination, achromatic contrast sensitivity and visual fields in a group of gas station workers. We tested 25 workers (20 males) and 25 controls with no history of chronic exposure to solvents (10 males). All participants had normal ophthalmologic exams. Subjects had worked in gas stations for an average of 9.6 ± 6.2 years. Color vision was evaluated with the Lanthony D15d and the Cambridge Colour Test (CCT). Visual field assessment consisted of white-on-white 24-2 automated perimetry (Humphrey II-750i). Contrast sensitivity was measured for sinusoidal gratings of 0.2, 0.5, 1.0, 2.0, 5.0, 10.0 and 20.0 cycles per degree (cpd). Results from both groups were compared using the Mann-Whitney U test. The number of errors in the D15d was higher for workers than for controls (p<0.01). Their CCT color discrimination thresholds were elevated compared to the control group along the protan, deutan and tritan confusion axes (p<0.01), and their ellipse area and ellipticity were higher (p<0.01). Genetic analysis of subjects with very elevated color discrimination thresholds excluded congenital causes for the visual losses. Automated perimetry thresholds showed elevation at 9, 15 and 21 degrees of eccentricity (p<0.01) and in the MD and PSD indexes (p<0.01). Contrast sensitivity losses were found for all spatial frequencies measured (p<0.01) except 0.5 cpd. Significant correlations were found between years of previous work and deutan axis thresholds (rho = 0.59; p<0.05), indexes of the Lanthony D15d (rho = 0.52; p<0.05), perimetry results in the fovea (rho = -0.51; p<0.05) and at 3, 9 and 15 degrees of eccentricity (rho = -0.46; p<0.05). Extensive and diffuse visual changes were found, suggesting that specific occupational exposure limits should be created.
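The group comparisons and correlations reported above rely on standard nonparametric statistics; the short Python sketch below shows how such tests are typically run with scipy.stats. The arrays are made-up placeholders, not the study's data.

# Hedged sketch: Mann-Whitney U test between workers and controls, and a
# Spearman correlation between years of exposure and a visual threshold.
import numpy as np
from scipy.stats import mannwhitneyu, spearmanr

workers_errors  = np.array([4, 6, 3, 7, 5, 8, 2, 6])    # e.g. D15d error counts
controls_errors = np.array([1, 0, 2, 1, 0, 3, 1, 2])

u_stat, p_value = mannwhitneyu(workers_errors, controls_errors,
                               alternative="two-sided")
print(f"Mann-Whitney U = {u_stat}, p = {p_value:.4f}")

years      = np.array([2, 4, 6, 8, 10, 12, 15, 20])      # years of exposure
thresholds = np.array([3.1, 3.4, 4.0, 4.2, 4.8, 5.1, 5.5, 6.2])
rho, p_rho = spearmanr(years, thresholds)
print(f"Spearman rho = {rho:.2f}, p = {p_rho:.4f}")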
Abstract:
Background: Ontologies have increasingly been used in the biomedical domain, which has prompted the emergence of different initiatives to facilitate their development and integration. The Open Biological and Biomedical Ontologies (OBO) Foundry consortium provides a repository of life-science ontologies, which are developed according to a set of shared principles. This consortium has developed an ontology called OBO Relation Ontology aiming at standardizing the different types of biological entity classes and associated relationships. Since ontologies are primarily intended to be used by humans, the use of graphical notations for ontology development facilitates the capture, comprehension and communication of knowledge between its users. However, OBO Foundry ontologies are captured and represented basically using text-based notations. The Unified Modeling Language (UML) provides a standard and widely-used graphical notation for modeling computer systems. UML provides a well-defined set of modeling elements, which can be extended using a built-in extension mechanism named Profile. Thus, this work aims at developing a UML profile for the OBO Relation Ontology to provide a domain-specific set of modeling elements that can be used to create standard UML-based ontologies in the biomedical domain. Results: We have studied the OBO Relation Ontology, the UML metamodel and the UML profiling mechanism. Based on these studies, we have proposed an extension to the UML metamodel in conformance with the OBO Relation Ontology and we have defined a profile that implements the extended metamodel. Finally, we have applied the proposed UML profile in the development of a number of fragments from different ontologies. In particular, we have considered the Gene Ontology (GO), the PRotein Ontology (PRO) and the Xenopus Anatomy and Development Ontology (XAO). Conclusions: The use of an established and well-known graphical language in the development of biomedical ontologies provides a more intuitive form of capturing and representing knowledge than using only text-based notations. The use of the profile requires the domain expert to reason about the underlying semantics of the concepts and relationships being modeled, which helps prevent the introduction of inconsistencies in an ontology under development and facilitates the identification and correction of errors in an already defined ontology.
Abstract:
Obstructive sleep apnoea/hypopnoea syndrome (OSAHS) is the periodic reduction or cessation of airflow during sleep. The syndrome is associated with loud snoring, disrupted sleep and observed apnoeas. Surgery aims to alleviate symptoms of daytime sleepiness, improve quality of life and reduce the signs of sleep apnoea recorded by polysomnography. Surgical intervention for snoring and OSAHS includes several procedures, each designed to increase the patency of the upper airway. Procedures addressing nasal obstruction include septoplasty, turbinectomy, and radiofrequency ablation (RF) of the turbinates. Surgical procedures to reduce soft palate redundancy include uvulopalatopharyngoplasty with or without tonsillectomy, uvulopalatal flap, laser-assisted uvulopalatoplasty, and RF of the soft palate. More significant, however, particularly in cases of severe OSA, is hypopharyngeal or retrolingual obstruction related to an enlarged tongue or, more commonly, to maxillomandibular deficiency. Surgeries in these cases aim at reducing the bulk of the tongue base or providing more space for the tongue in the oropharynx so as to limit posterior collapse during sleep. These procedures include tongue-base suspension, genioglossal advancement, hyoid suspension, lingualplasty, and maxillomandibular advancement. We reviewed 269 patients who underwent OSAS surgery at the ENT Department of Forlì Hospital in the last decade. Surgery was considered a success if the postoperative apnoea/hypopnoea index (AHI) was less than 20/h. Based on the results, we have developed surgical decision algorithms with the aim of optimizing the success of these procedures by identifying proper candidates for surgery and the most appropriate surgical techniques. Although not without risks and not as predictable as positive airway pressure therapy, surgery remains an important treatment option for patients with obstructive sleep apnoea (OSA), particularly for those who have failed or cannot tolerate positive airway pressure therapy. Successful surgery depends on proper patient selection, proper procedure selection, and the experience of the surgeon. The intended purpose of medical algorithms is to improve and standardize decisions made in the delivery of medical care, to assist in standardizing the selection and application of treatment regimens, and to reduce the potential introduction of errors. Nasal Continuous Positive Airway Pressure (nCPAP) is the recommended therapy for patients with moderate to severe OSAS. Unfortunately, this treatment is not accepted by some patients, appears to be poorly tolerated by a non-negligible number of subjects, and compliance may be critical, especially in the long term, when correctly evaluated by interview as well as by analysis of CPAP smart cards. Among the alternative options in the literature, surgery is a long-honoured solution. However, no clear scientific evidence yet exists that surgery can be considered a truly effective option in OSAHS management. We have designed a randomized prospective study comparing MMA and a ventilatory device (Autotitrating Positive Airway Pressure – APAP) in order to assess the real effectiveness of surgery in the management of moderate to severe OSAS. Fifty consecutive, previously fully informed patients suffering from severe OSAHS were enrolled and randomised into a conservative (APAP) or surgical (MMA) arm. The demographic, biometric, PSG and ESS profiles of the two groups were not significantly different.
One year after surgery or continuous APAP treatment, both groups showed a remarkable improvement in mean AHI and ESS; the degree of improvement was not statistically different. Given the relatively small sample of studied subjects and the relatively short follow-up time, MMA proved, in our group of adult patients with severe OSAHS, to be a valuable alternative therapeutic tool with a success rate not inferior to APAP.
Abstract:
In the context of a testing laboratory, one of the most important aspects to deal with is the measurement result. Whenever decisions are based on measurement results, it is important to have some indication of the quality of the results. In every area concerned with noise measurement many standards are available, but without an expression of uncertainty it is impossible to judge whether two results are in compliance or not. ISO/IEC 17025 is an international standard related to the competence of calibration and testing laboratories. It contains the requirements that testing and calibration laboratories have to meet if they wish to demonstrate that they operate a quality system, are technically competent and are able to generate technically valid results. ISO/IEC 17025 deals specifically with the requirements for the competence of laboratories performing testing and calibration and for the reporting of the results, which may or may not contain opinions and interpretations of the results. The standard requires appropriate methods of analysis to be used for estimating uncertainty of measurement. From this point of view, for a testing laboratory performing sound power measurement according to specific ISO standards and European Directives, the evaluation of measurement uncertainties is the most important factor to deal with. Sound power level measurement according to ISO 3744:1994, performed with a limited number of microphones distributed over a surface enveloping a source, is affected by a certain systematic error and a related standard deviation. Comparing measurements carried out with different microphone arrays is difficult because the results are affected by systematic errors and standard deviations that depend on the number of microphones arranged over the surface, their spatial positions and the complexity of the sound field. A statistical approach can give an overview of the differences between sound power levels evaluated with different microphone arrays and an evaluation of the errors that afflict this kind of measurement. In contrast with the classical approach, which tends to follow the ISO GUM, this thesis presents a different point of view on the problem of comparing results obtained from different microphone arrays.
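For orientation, the basic sound power computation that such microphone arrays feed into can be sketched as follows: the levels measured at the N positions are energy-averaged over the enveloping surface and the surface term 10·log10(S/S0) is added. The Python sketch below follows this general ISO 3744-style scheme but omits the background-noise and environmental corrections (K1, K2) prescribed by the standard; all level values are illustrative.

import math

def sound_power_level(spl_db: list[float], surface_m2: float) -> float:
    """Sound power level from an energy-averaged array of SPLs (S0 = 1 m^2)."""
    mean_energy = sum(10 ** (lp / 10.0) for lp in spl_db) / len(spl_db)
    lp_mean = 10.0 * math.log10(mean_energy)             # surface-averaged SPL
    return lp_mean + 10.0 * math.log10(surface_m2 / 1.0)

# Example: 10-microphone hemispherical array of radius 2 m (S = 2*pi*r^2).
levels = [78.2, 77.9, 78.5, 79.1, 78.0, 77.6, 78.8, 79.4, 78.3, 77.7]
print(f"L_W = {sound_power_level(levels, 2 * math.pi * 2.0**2):.1f} dB")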
Abstract:
Sandwich singularities are the singularities on the normalization of blow-ups of a regular (smooth) surface germ. This work establishes a close connection between the topology and deformation theory of sandwich singularities on the one hand and of plane curve singularities on the other. New results concern, among other things, deformations of zero-dimensional complex spaces in the plane that are described by complete ideals, e.g. when 'simultaneous blowing up' of the fibres of such a deformation is possible. In addition, smoothing components and the Kollár conjecture for sandwich singularities are investigated and, in connection with this, numerical criteria for the question of whether the symbolic algebra of a space curve is finitely generated.
Abstract:
The present work concerns the study of debris flows and, in particular, the related hazard in the Alpine environment. During the last years, several methodologies have been developed to evaluate the hazard associated with such a complex phenomenon, whose velocity, impact force and poor temporal predictability are responsible for the related high hazard level. This research focuses on the depositional phase of debris flows through the application of a numerical model (DFlowz), and on hazard evaluation based on the morphometric, morphological and geological characterization of watersheds. The main aims are to test the validity of DFlowz simulations and to assess sources of error in order to understand how the empirical uncertainties influence the predictions; on the other side, the research addresses the possibility of performing hazard analysis starting from the identification of susceptible debris flow catchments and the definition of their activity level. 25 well-documented debris flow events have been back-analyzed with the model DFlowz (Berti and Simoni, 2007): derived from the implementation of empirical relations between event volume and planimetric and cross-sectional inundated areas, the code delineates the areas affected by an event by taking into account information about volume, the preferential flow path and the digital elevation model (DEM) of the fan area. The analysis uses an objective methodology for evaluating the accuracy of the prediction and involves the calibration of the model based on factors describing the uncertainty associated with the semi-empirical relationships. The general assumptions on which the model is based have been verified, although the predictive capabilities are influenced by the uncertainties of the empirical scaling relationships, which have to be taken into account and depend mostly on errors in the estimation of the deposited volume. In addition, in order to test the predictive capabilities of physically based models, some events have been simulated with RAMMS (RApid Mass MovementS). The model, developed by the Swiss Federal Institute for Forest, Snow and Landscape Research (WSL) in Birmensdorf and the Swiss Federal Institute for Snow and Avalanche Research (SLF), adopts a one-phase approach based on the Voellmy rheology (Voellmy, 1955; Salm et al., 1990). The input file combines the total volume of the debris flow, located in a release area, with a mean depth. The model predicts the affected area, the maximum depth and the flow velocity in each cell of the input DTM. As regards hazard analysis based on watershed characterization, the database collected by the Alto Adige Province represents an opportunity to examine debris-flow sediment dynamics at the regional scale and to analyze lithologic controls. With the aim of advancing current understanding of debris flows, this study focuses on 82 events in order to characterize the topographic conditions associated with their initiation, transport and deposition, their seasonal patterns of occurrence, and to examine the role played by bedrock geology on sediment transfer.
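For context, semi-empirical mobility relations of the kind implemented in DFlowz typically express the inundated areas as power laws of the event volume. The sketch below shows the usual form with calibrated coefficients k_A and k_B; the 2/3 exponent and the symbols are the common convention in the literature, not values taken from this thesis.

% Hedged sketch of the form of the mobility relations used by DFlowz-type models.
\[
  A = k_A\, V^{2/3}, \qquad B = k_B\, V^{2/3},
\]
where A is the cross-sectional inundated area, B the planimetric inundated area and V the deposited volume, so that uncertainty in the volume estimate propagates directly into the predicted areas.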
Abstract:
Chronic obstructive pulmonary disease (COPD) is an umbrella term for diseases that lead to coughing, sputum production and dyspnoea (shortness of breath) at rest or on exertion; chronic bronchitis and pulmonary emphysema are counted among them. The progression of COPD is closely linked to the increase in the wall volume of small airways (bronchi). High-resolution computed tomography (CT) is considered the gold standard (the best and most reliable diagnostic method) for examining lung morphology. When measuring bronchi, which are approximately tubular structures, in CT images, their small size compared with the resolution of a clinical CT scanner poses a major problem. This work shows how CT images are computed from conventional X-ray projections, where the mathematical and physical sources of error lie in the image formation process, and how a CT system can be made mathematically tractable by interpreting it as a linear shift-invariant (LSI) system. Based on linear systems theory, ways of describing the resolving power of imaging methods are derived. It is shown how the tracheobronchial tree can be robustly segmented from a CT data set and, using a topology-preserving 3-dimensional skeletonization algorithm, converted into a skeleton representation and subsequently into an acyclic graph. Based on linear systems theory, a new, promising, integral-based method (IBM) for measuring small structures in CT images is presented. To validate the IBM results, various measurements were carried out on a phantom consisting of 10 different silicone tubes. With the help of the skeleton and graph representations, the complete segmented tracheobronchial tree can be measured in 3-dimensional space. For 8 pigs, each scanned twice, good reproducibility of the IBM results was demonstrated. A further study carried out with IBM showed that the average percentage bronchial wall thickness in CT data sets of 16 smokers is significantly higher than in data sets of 15 non-smokers. IBM may also be usable for wall thickness measurements in problems from other fields, or can at least serve as a source of ideas. An article describing the developed methodology and the study results obtained with it has been accepted for publication in the journal IEEE Transactions on Medical Imaging.
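The linear shift-invariant description referred to above models the reconstructed image as the true attenuation distribution blurred by the scanner's point spread function; a minimal sketch of that standard model, and of the modulation transfer function (MTF) commonly used to quantify resolving power, is:

% Hedged sketch of the standard LSI imaging model and the MTF.
\[
  g(\mathbf{x}) = (h * f)(\mathbf{x}) + n(\mathbf{x}), \qquad
  \mathrm{MTF}(\mathbf{k}) = \frac{|\hat{h}(\mathbf{k})|}{|\hat{h}(\mathbf{0})|},
\]
where f is the object, h the point spread function, n the noise and \hat{h} the Fourier transform of h.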
Abstract:
The present thesis is a contribution to the theory of algebras of pseudodifferential operators on singular settings. In particular, we focus on the $b$-calculus and the calculus on conformally compact spaces in the sense of Mazzeo and Melrose in connection with the notion of spectrally invariant transmission operator algebras. We summarize results given by Gramsch et al. on the construction of $\Psi_0$- and $\Psi^*$-algebras and the corresponding scales of generalized Sobolev spaces using commutators of certain closed operators and derivations. In the case of a manifold with corners $Z$ we construct a $\Psi^*$-completion $A_b(Z,{}^b\Omega^{1/2})$ of the algebra of zero-order $b$-pseudodifferential operators $\Psi_{b,cl}(Z,{}^b\Omega^{1/2})$ in the corresponding $C^*$-closure $B(Z,{}^b\Omega^{1/2}) \hookrightarrow \mathcal{L}(L^2(Z,{}^b\Omega^{1/2}))$. The construction also shows that, localised to the (smooth) interior of $Z$, the operators in $A_b(Z,{}^b\Omega^{1/2})$ can be represented as ordinary pseudodifferential operators. In connection with the notion of solvable $C^*$-algebras, introduced by Dynin, we calculate the length of the $C^*$-closure of $\Psi_{b,cl}^0(F,{}^b\Omega^{1/2},\mathbb{R}^{E(F)})$ in $B(F,{}^b\Omega^{1/2},\mathbb{R}^{E(F)})$ by localizing $B(Z,{}^b\Omega^{1/2})$ along the boundary face $F$ using the (extended) indicial family $I^B_{FZ}$. Moreover, we discuss how one can localise a certain solving ideal chain of $B(Z,{}^b\Omega^{1/2})$ in neighbourhoods $U_p$ of arbitrary points $p \in Z$. This localisation process recovers the singular structure of $U_p$; further, the induced length function $l_p$ is shown to be upper semi-continuous. We give construction methods for $\Psi^*$- and $C^*$-algebras admitting only infinitely long solving ideal chains. These algebras are first realized as unconnected direct sums of (solvable) $C^*$-algebras and then refined such that the resulting algebras have arcwise connected spaces of one-dimensional representations. In addition, we recall the notion of transmission algebras on manifolds with corners $(Z_i)_{i \in \mathbb{N}}$, following an idea of Ali Mehmeti, Gramsch et al. Thereby, we connect the underlying $C^\infty$-function spaces using point evaluations in the smooth parts of the $Z_i$ and use generalized Laplacians to generate an appropriate scale of Sobolev spaces. Moreover, it is possible to associate generalized (solving) ideal chains to these algebras, such that for every $n \in \mathbb{N}$ there exists an ideal chain of length $n$ within the algebra. Finally, we discuss the $K$-theory for algebras of pseudodifferential operators on conformally compact manifolds $X$ and give an index theorem for these operators. In addition, we prove that the Dirac operator associated to the metric of a conformally compact manifold $X$ is not a Fredholm operator.