844 results for Orthogonal Representation
Abstract:
Different representations for a control surface freeplay nonlinearity in a three degree of freedom aeroelastic system are assessed. These are the discontinuous, polynomial and hyperbolic tangent representations. The Duhamel formulation is used to model the aerodynamic loads. Assessment of the validity of these representations is performed through comparison with previous experimental observations. The results show that the instability and nonlinear response characteristics are accurately predicted when using the discontinuous and hyperbolic tangent representations. On the other hand, the polynomial representation fails to predict chaotic motions observed in the experiments. (c) 2012 Elsevier Ltd. All rights reserved.
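For intuition about the representations compared above, here is a minimal sketch of a freeplay (dead-zone) restoring moment and its hyperbolic-tangent smoothing. The gap half-width `delta`, stiffness `k`, and sharpness parameter `eps` are hypothetical illustration values, not taken from the paper:

```python
import numpy as np

def freeplay_discontinuous(theta, delta=0.01, k=1.0):
    """Discontinuous (dead-zone) restoring moment: zero inside the
    freeplay gap [-delta, delta], linear stiffness k outside it."""
    return np.where(theta > delta, k * (theta - delta),
                    np.where(theta < -delta, k * (theta + delta), 0.0))

def freeplay_tanh(theta, delta=0.01, k=1.0, eps=500.0):
    """Smooth hyperbolic-tangent approximation of the same dead zone;
    eps (hypothetical value) controls how sharp the transition is."""
    return 0.5 * k * ((theta - delta) * (1 + np.tanh(eps * (theta - delta)))
                      + (theta + delta) * (1 - np.tanh(eps * (theta + delta))))
```

Away from the gap edges the two functions agree closely, while the tanh version remains differentiable everywhere, which is why smooth representations are attractive for time integration even though, per the abstract, only the discontinuous and tanh forms reproduce the observed chaotic response.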
Abstract:
A subspace representation of a poset S = {s_1, ..., s_t} is given by a system (V; V_1, ..., V_t) consisting of a vector space V and subspaces V_i such that V_i ⊆ V_j whenever s_i ≼ s_j. For each real-valued vector χ = (χ_1, ..., χ_t) with positive components, we define a unitary χ-representation of S as a system (U; U_1, ..., U_t) consisting of a unitary space U and subspaces U_i such that U_i ⊆ U_j whenever s_i ≼ s_j, satisfying χ_1 P_1 + ... + χ_t P_t = 1, in which P_i is the orthogonal projection onto U_i. We prove that S has a finite number of unitarily nonequivalent indecomposable χ-representations for each weight χ if and only if S has a finite number of nonequivalent indecomposable subspace representations; that is, if and only if S does not contain any of Kleiner's critical posets. (c) 2012 Elsevier Inc. All rights reserved.
Abstract:
Objective: To identify and compare perceptions of pain, and ways of coping with it, between men and women with central post-stroke pain. Methods: The participants were 25 men and 25 women, aged at least 30 years, with at least four years of schooling, presenting central post-stroke pain for at least three months. The instruments used were: Mini-Mental State Examination; structured interview for the Brief Psychiatric Scale; Survey of Sociodemographic and Clinical Data; Visual Analogue Scale (VAS); Ways of Coping with Problems Scale (WCPS); Revised Illness Perception Questionnaire (IPQ-R); and Beck Depression Inventory (BDI). Results: A significantly greater number of women used the coping strategy "Turn to spiritual and religious activities" in the WCPS, and in the IPQ-R they associated their emotional state with the cause of their pain. "Distraction of attention" was the strategy most used by the subjects overall. Conclusion: Women made greater use of spiritual and religious activities as a coping strategy and perceived their emotional state as the cause of their pain.
Abstract:
An important requirement for computer systems developed for the agricultural sector is to handle the heterogeneity of the data generated in its different processes. Most problems related to this heterogeneity arise from the lack of a common standard across the computing solutions proposed. An efficient solution is to create a single standard for data exchange. The study of the actual process involved in cotton production was based on research by the Brazilian Agricultural Research Corporation (EMBRAPA) that describes all of its phases, compiled from several theoretical and practical studies on the cotton crop. The proposal of a standard starts with the identification of the most important classes of data involved in the process and includes an ontology, i.e. a systematization of the concepts related to the production of cotton fiber, resulting in a set of classes, relations, functions and instances. The results serve as a reference for the development of computational tools, turning implicit knowledge into applications that support the knowledge described. The research is based on data from the Midwest of Brazil. Cotton production was chosen as the case study because Brazil is one of the major players in this market and several improvements are still required for system integration in this segment.
Abstract:
Dynamic texture is a recent field of investigation that has received growing attention from the computer vision community in recent years. Dynamic textures are moving textures in which the concept of self-similarity, central to static textures, is extended to the spatiotemporal domain. In this paper, we propose a novel approach to dynamic texture representation that can be used for both texture analysis and segmentation. In this method, deterministic partially self-avoiding walks are performed on three orthogonal planes of the video in order to combine appearance and motion features. We validate our method on three applications of dynamic texture that present interesting challenges: recognition, clustering and segmentation. Experimental results on these applications indicate that the proposed method improves dynamic texture representation compared to the state of the art.
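The "three orthogonal planes" of a video are commonly the XY slice (appearance) and the XT/YT slices (motion). A minimal sketch, assuming a grayscale video stored as a (T, H, W) NumPy array; the walk algorithm itself is not reproduced here, only the plane extraction it operates on:

```python
import numpy as np

def orthogonal_planes(video, t, y, x):
    """Slice the three orthogonal planes of a (T, H, W) video volume
    through the voxel (t, y, x): spatial XY plane (one frame), and the
    spatio-temporal XT and YT planes (a row / a column over time)."""
    xy = video[t, :, :]   # appearance: frame t
    xt = video[:, y, :]   # motion: image row y across all frames
    yt = video[:, :, x]   # motion: image column x across all frames
    return xy, xt, yt

video = np.random.rand(8, 32, 24)  # hypothetical clip: 8 frames of 32x24
xy, xt, yt = orthogonal_planes(video, t=3, y=10, x=5)
```

Each plane is an ordinary 2D texture, so any static-texture descriptor (here, the deterministic partially self-avoiding walks) can be applied per plane and the results combined.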
Abstract:
[EN] The information provided by the International Commission for the Conservation of Atlantic Tunas (ICCAT) on captures of skipjack tuna (Katsuwonus pelamis) in the central-east Atlantic has a number of limitations, such as gaps in the statistics for certain fleets and the coarse level of spatiotemporal detail at which catches are reported. As a result, the quality of these data and their usefulness for management advice are limited. In order to reconstruct missing spatiotemporal catch data, the present study uses Data INterpolating Empirical Orthogonal Functions (DINEOF), a technique for missing-data reconstruction, applied here for the first time to fisheries data. DINEOF is based on an Empirical Orthogonal Functions decomposition performed with a Lanczos method. DINEOF was tested with different amounts of missing data, intentionally removing between 3.4% and 95.2% of the values, and the reconstructions were then compared with the complete data set. These validation analyses show that DINEOF is a reliable approach to data reconstruction for the purposes of fishery management advice, even when the amount of missing data is very high.
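The core of DINEOF is an iterative EOF (truncated-SVD) infilling loop. A minimal illustrative sketch follows, using a fixed number of modes and plain NumPy SVD rather than the cross-validated mode selection and Lanczos solver of the real method:

```python
import numpy as np

def eof_reconstruct(X, n_modes=2, n_iter=50):
    """Fill NaN gaps in matrix X by iterating: (1) substitute the current
    guess at the gap positions, (2) compute a truncated-SVD (EOF)
    approximation, (3) copy that approximation back into the gaps.
    Mirrors the DINEOF idea; the real method selects n_modes by
    cross-validation instead of fixing it."""
    mask = np.isnan(X)
    filled = np.where(mask, 0.0, X)  # initial guess: zeros in the gaps
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        approx = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes]
        filled = np.where(mask, approx, X)  # observed values stay fixed
    return filled

# Demo: a rank-1 field with one value removed is recovered exactly.
X = np.outer(np.arange(1.0, 6.0), np.arange(1.0, 7.0))
X_gappy = X.copy()
X_gappy[1, 2] = np.nan
X_rec = eof_reconstruct(X_gappy, n_modes=1)
```

The iteration converges because observed entries are re-imposed every pass, so only the gap values drift toward the low-rank field that best explains the data.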
Abstract:
[EN] This paper presents experimental measurements of isobaric vapor−liquid equilibria (iso-p VLE) and excess volumes (vE) at several temperatures in the interval (288.15 to 328.15) K for six binary systems composed of two alkyl (methyl, ethyl) propanoates and three odd-carbon alkanes (C5 to C9). The mixing processes were expansive, vE > 0, with (∂vE/∂T)p > 0, and endothermic. The installation used to measure the iso-p VLE was improved by PC control of three of the experimental variables.
Abstract:
[EN] This paper deals with the orthogonal projection (in the Frobenius sense) AN of the identity matrix I onto the matrix subspace AS (A ∈ R^{n×n}, S being an arbitrary subspace of R^{n×n}). Lower and upper bounds on the normalized Frobenius condition number of the matrix AN are given. Furthermore, for every matrix subspace S ⊆ R^{n×n}, a new index bF(A, S), which generalizes the normalized Frobenius condition number of the matrix A, is defined and analyzed...
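For intuition, the projection AN above is A·N where N minimizes ||AM − I||_F over M ∈ S. A minimal sketch for the special case where S is the subspace of diagonal matrices (an illustrative assumption; the paper treats arbitrary subspaces), for which the minimizer has a simple columnwise closed form:

```python
import numpy as np

def best_diagonal_approx(A):
    """N = argmin over diagonal D of ||A D - I||_F, so that A @ N is the
    Frobenius-orthogonal projection of I onto A*S for S = {diagonal
    matrices}. Columnwise closed form: d_j = A[j, j] / ||A[:, j]||^2
    (assumes no column of A is zero)."""
    d = np.diag(A) / np.sum(A * A, axis=0)
    return np.diag(d)
```

Each column decouples: minimizing ||d_j A[:, j] − e_j||^2 is a scalar least-squares problem, giving d_j as the projection coefficient of e_j onto the j-th column of A.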
Abstract:
[EN] We analyze the best approximation
Abstract:
In the search for potent pharmacological agents, combinatorial chemistry has gained great importance over the last decade as a means of generating a broad spectrum of compounds for biological testing within a short time. As scaffolds, carbohydrates are interesting candidates for combinatorial synthesis, since they provide several derivatization positions in a stereochemically defined manner. This allows a spatially well-defined presentation of attached pharmacophoric groups, as is desirable for use as peptide mimetics. The selective derivatization of individual hydroxyl functions requires an orthogonal protecting-group pattern that is sufficiently stable under the reaction conditions prevailing during the combinatorial synthesis. Furthermore, a suitable linker system has to be found in order to enable solid-phase synthesis and thus automation. To reduce the five mutually orthogonal protecting groups required for hexoses such as galactose, the present work started from galactal, in which only three hydroxyl functions have to be differentiated; the galactose skeleton can subsequently be restored. The differentiation was achieved by means of hydrolases, through regioselective acylation and deacylation reactions, partly using immobilized enzymes. In this way an orthogonal protecting-group pattern could be built up sequentially, one that also shows the necessary stability towards the other, in some cases suitably modified, reaction conditions. For attachment to a solid phase, a linker cleavable by metathesis was developed and attached via the anomeric position with restoration of the galactose skeleton. An oxidatively cleavable and a photolabile linker system were also tested.
Abstract:
“Cartographic heritage” is different from “cartographic history”. The second term refers to the study of the development, through time, of the surveying and drawing techniques related to maps, i.e. through the different types of cultural environment that formed the background for the creation of maps. The first term refers to the whole body of ancient maps, together with these cultural environments, that history has handed down to us and that we perceive as cultural values to be preserved and made available to many users (the public, institutions, experts). Unfortunately, ancient maps often suffer from preservation problems of their analog support, mostly due to aging. Today, metric recovery in digital form and digital processing of historical cartography make it possible to preserve this map heritage. Moreover, modern geomatic techniques offer new ways of using historical information that would be unachievable on analog supports. In this PhD thesis, the whole digital workflow for the recovery and elaboration of ancient cartography is reported, with special emphasis on the use of digital tools in the preservation and elaboration of cartographic heritage.
It is possible to divide the workflow into three main steps, which reflect the chapter structure of the thesis itself:
• map acquisition: conversion of the ancient map support from analog to digital, by means of high-resolution scanning or 3D surveying (digital photogrammetry or laser scanning); this process must be performed carefully, with special instruments, in order to reduce deformation as much as possible;
• map georeferencing: reproducing in the digital image the native metric content of the map, or even improving it, by selecting a large number of still-existing ground control points; in this way it is possible to understand the projection features of the historical map, as well as to evaluate and represent the degree of deformation induced by the old type of cartographic transformation (which may be unknown to us), by surveying errors or by support deformation, all errors usually far larger than current standards allow;
• data elaboration and management in a digital environment, by means of modern software tools: vectorization, giving the map a new and more attractive graphic view (for instance, by creating a 3D model), superimposing it on current base maps, comparing it with other maps, and finally inserting it into a GIS or WebGIS environment as a specific layer.
The study is supported by some case histories, each of them interesting from the point of view of at least one digital cartographic elaboration step.
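The georeferencing step above amounts to fitting a transformation from control points identified on the scanned map to their modern coordinates. A minimal sketch of the simplest such model, a least-squares affine transform (the thesis itself may use richer cartographic transformations; point values here are hypothetical):

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform mapping ground control points
    identified on the scanned map (src, pixel coordinates) to their
    positions on a current base map (dst). Solves dst ~ [src, 1] @ P."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    X = np.hstack([src, np.ones((len(src), 1))])  # homogeneous [x, y, 1]
    P, *_ = np.linalg.lstsq(X, dst, rcond=None)   # P is (3, 2)
    return P

def apply_affine(P, pts):
    """Transform points with a fitted affine matrix P."""
    pts = np.asarray(pts, float)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ P
```

With more control points than parameters, the residuals of this fit are exactly the per-point deformation measure the text mentions: how far the old map departs from a modern, metrically correct representation.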
The ancient maps taken into account are the following:
• three maps of the Po river delta, made at the end of the XVI century by a famous land surveyor, Ottavio Fabri (sole author of the first map, co-author with Gerolamo Pontara of the second, and co-author with Bonajuto Lorini and others of the third), who wrote a methodological textbook in which he explains a new topographical instrument, the squadra mobile (mobile square), invented and used by himself; today all three maps are preserved in the State Archive of Venice;
• the Ichnoscenografia of Bologna by Filippo de’ Gnudi, made in 1702 and today preserved in the Archiginnasio Library of Bologna; it is a scenographic bird’s-eye view of the city, but also has an ichnographic value, as the author himself declares;
• the map of Bologna by the periti Gregorio Monari and Antonio Laghi, the first map of the city derived from a systematic survey, even though it was made only ten years later (1711–1712) than the map by de’ Gnudi; in this map the scenographic view was abandoned in favor of a more correct representation by means of orthogonal projection; today the map is preserved in the State Archive of Bologna;
• the Gregorian Cadastre of Bologna, made in 1831 and updated until 1927, now preserved in the State Archive of Bologna; it is composed of 140 maps and 12 brogliardi (register volumes).
In particular, the three maps of the Po river delta and the Cadastre were studied with respect to their acquisition procedure. Moreover, the Po delta maps were analyzed from the georeferencing point of view, and the Cadastre with respect to a possible GIS insertion. Finally, the Ichnoscenografia was used to illustrate a possible application of digital elaboration, namely 3D modeling.
Last but not least, we must not forget that the study of an ancient map should start, whenever possible, from the consultation of the precious original analog document; analysis by means of current digital techniques then opens up new research opportunities in a rich, modern, multidisciplinary context.
Abstract:
Two of the main features of today's complex software systems, such as pervasive computing systems and Internet-based applications, are distribution and openness. Distribution revolves around three orthogonal dimensions: (i) distribution of control: systems are characterised by several independent computational entities and devices, each representing an autonomous and proactive locus of control; (ii) spatial distribution: entities and devices are physically distributed and connected in a global (such as the Internet) or local network; and (iii) temporal distribution: interacting system components come and go over time, and are not required to be available for interaction at the same time. Openness deals with the heterogeneity and dynamism of system components: complex computational systems are open to the integration of diverse components, heterogeneous in terms of architecture and technology, and are dynamic in that they allow components to be updated, added, or removed while the system is running. The engineering of open and distributed computational systems mandates the adoption of a software infrastructure whose underlying model and technology provide the required level of uncoupling among system components. This is the main motivation behind current research trends in the area of coordination middleware that exploit tuple-based coordination models in the engineering of complex software systems, since such models intrinsically provide coordinated components with communication uncoupling. An additional daunting challenge for tuple-based models comes from knowledge-intensive application scenarios, namely scenarios where most of the activities are based on knowledge in some form, and where knowledge becomes the prominent means by which systems get coordinated.
Handling knowledge in tuple-based systems induces problems in terms of syntax (e.g., two tuples containing the same data may not match due to differences in tuple structure) and, mostly, of semantics (e.g., two tuples representing the same information may not match because they adopt different syntax). Until now, the problem has been faced by exploiting tuple-based coordination within middleware for knowledge-intensive environments, e.g., experiments with tuple-based coordination within Semantic Web middleware. However, such approaches appear to be designed to tackle coordination for specific application contexts, like the Semantic Web and Semantic Web Services, and they result in a rather involved extension of the tuple space model. The main goal of this thesis was to conceive a more general approach to semantic coordination. In particular, the model and technology of semantic tuple centres were developed. The tuple centre model is adopted as the main coordination abstraction to manage system interactions. A tuple centre can be seen as a programmable tuple space, i.e. an extension of a Linda tuple space whose behaviour can be programmed so as to react to interaction events. By encapsulating coordination laws within coordination media, tuple centres promote coordination uncoupling among coordinated components. The tuple centre model was then semantically enriched: a main design choice in this work was not to completely redesign the existing syntactic tuple space model, but rather to provide a smooth extension that, while supporting semantic reasoning, keeps tuples and tuple matching as simple as possible. By encapsulating the semantic representation of the domain of discourse within coordination media, semantic tuple centres promote semantic uncoupling among coordinated components.
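The syntactic brittleness described above can be seen in a minimal Linda-style matching sketch (a hypothetical Python rendering for illustration, not the infrastructure used in the thesis): two tuples carrying the same information but structured differently do not match:

```python
ANY = object()  # wildcard ("formal field") in a template

def matches(template, tuple_):
    """Linda-style purely syntactic matching: templates and tuples match
    only if they have the same arity and every template field is either
    the wildcard ANY or exactly equal to the corresponding value."""
    return (len(template) == len(tuple_) and
            all(f is ANY or f == v for f, v in zip(template, tuple_)))

# Same information, different structure: no syntactic match.
assert matches(("temp", ANY), ("temp", 21))
assert not matches(("temp", ANY), ("temperature", "celsius", 21))
```

A semantic extension, as pursued in the thesis, would instead match tuples on the meaning of their content (e.g., via an ontology) rather than on exact structure and symbols.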
The main contributions of the thesis are: (i) the design of the semantic tuple centre model; (ii) the implementation and evaluation of the model on top of an existing coordination infrastructure; (iii) a view of the application scenarios in which semantic tuple centres seem suitable as coordination media.