830 results for Metric Representation


Relevance: 20.00%

Abstract:

Background: In the analysis of the effects of cell treatments such as drug dosing, identifying changes in gene network structure between normal and treated cells is a key task. A possible way to identify such changes is to compare the structures of networks estimated separately from data on normal and treated cells. However, this approach usually fails to estimate accurate gene networks because of the limited length of the time series and measurement noise. Approaches that identify changes in regulation by using the time series from both conditions in an efficient manner are therefore needed. Methods: We propose a new statistical approach, based on the state-space representation of the vector autoregressive model, that estimates gene networks under two different conditions in order to identify changes in regulation between the conditions. In the mathematical model of our approach, hidden binary variables are introduced to indicate the presence of each regulation under each condition. The hidden binary variables enable efficient data usage: data from both conditions are used for regulations that exist in common, while only the corresponding data are applied to condition-specific regulations. The similarity of the networks under the two conditions is also accounted for automatically through the design of the potential function for the hidden binary variables. To estimate the hidden binary variables, we derive a new variational annealing method that searches for the configuration of the binary variables maximizing the marginal likelihood. Results: For the performance evaluation, we use time series data from two topologically similar synthetic networks and confirm that the proposed approach estimates commonly existing regulations, as well as changes in regulation, with higher coverage and precision than other existing approaches in almost all experimental settings. As a real-data application, the proposed approach is applied to time series data from normal human lung cells and from human lung cells treated by stimulating EGF receptors and dosing the anticancer drug Gefitinib. In the treated lung cells, a cancer-cell condition is simulated by the stimulation of EGF receptors, but the effect should be counteracted by the selective inhibition of EGF receptors by Gefitinib. Nevertheless, gene expression profiles actually differ between the conditions, and the genes related to the identified changes are considered possible off-targets of Gefitinib. Conclusions: On the synthetically generated time series data, the proposed approach identifies changes in regulation more accurately than existing methods. By applying it to the time series data on normal and treated human lung cells, candidate off-target genes of Gefitinib are found. According to published clinical information, one of these genes may be related to a factor in interstitial pneumonia, which is a known side effect of Gefitinib.
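As a rough illustration of the comparison task described above (and not of the paper's state-space/variational-annealing method), the sketch below estimates a sparse VAR(1) coefficient matrix per condition with lasso regression and flags coefficients whose presence differs between conditions; the gene count, regularization strength, and threshold are arbitrary.

```python
# Minimal sketch (not the paper's variational-annealing method): estimate a sparse
# VAR(1) coefficient matrix per condition with lasso regression and compare supports
# to flag candidate changes in regulation. All parameters below are illustrative.
import numpy as np
from sklearn.linear_model import Lasso

def sparse_var1(X, alpha=0.05):
    """X: (T, G) time series; returns a (G, G) coefficient matrix A with X[t] ~ A @ X[t-1]."""
    T, G = X.shape
    A = np.zeros((G, G))
    for g in range(G):
        model = Lasso(alpha=alpha, fit_intercept=False)
        model.fit(X[:-1], X[1:, g])          # regress gene g at time t on all genes at t-1
        A[g] = model.coef_
    return A

def changed_regulations(X_normal, X_treated, tol=1e-3):
    """Return (target, regulator) index pairs whose presence differs between conditions."""
    present_n = np.abs(sparse_var1(X_normal)) > tol
    present_t = np.abs(sparse_var1(X_treated)) > tol
    return np.argwhere(present_n != present_t)

# Toy usage with synthetic data (20 time points, 5 genes per condition).
rng = np.random.default_rng(0)
X_normal = rng.standard_normal((20, 5))
X_treated = rng.standard_normal((20, 5))
print(changed_regulations(X_normal, X_treated))
```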

Relevance: 20.00%

Abstract:

There is a wide range of video services over complex transmission networks, and in some cases end users fail to receive an acceptable quality level. In this paper, the different factors that degrade users' quality of experience (QoE) in video streaming services that use TCP as the transmission protocol are studied. In this specific service, the impairment factors are the number of pauses, their duration, and their temporal location. In order to measure the effect that each temporal segment has on the overall video quality, subjective tests were carried out. Because current subjective test methodologies are not adequate to assess video streaming over TCP, some recommendations are provided here. At the application layer, a customized player is used to evaluate the behavior of the player buffer and, consequently, the end-user QoE. The video subjective test results demonstrate that there is a close correlation between application parameters and subjective scores. Based on this fact, a new metric named VsQM is defined, which considers the importance of the temporal location of pauses in assessing the user QoE of the video streaming service. A useful application scenario is also presented, in which the proposed metric is used to improve video services.
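The exact VsQM formula is not given in the abstract; the toy metric below only illustrates the underlying idea that pauses should be weighted by their duration and by the temporal segment in which they occur. The segment weights and normalization are hypothetical.

```python
# Illustrative sketch only: this is not the VsQM formula. Each pause is weighted by its
# duration and by the temporal segment in which it occurs, reflecting the finding that
# pause location affects perceived quality. Weights and normalization are made up.

def toy_pause_metric(pauses, video_duration, n_segments=4, segment_weights=(0.7, 0.9, 1.1, 1.3)):
    """pauses: list of (start_time_s, duration_s); returns a degradation score (higher = worse)."""
    score = 0.0
    for start, duration in pauses:
        segment = min(int(n_segments * start / video_duration), n_segments - 1)
        score += segment_weights[segment] * duration
    return score / video_duration

# Example: three 2 s pauses in a 120 s clip; the late pause contributes most per second.
print(toy_pause_metric([(10, 2.0), (60, 2.0), (110, 2.0)], video_duration=120))
```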

Relevance: 20.00%

Abstract:

Purpose: There is no consensus on the optimal method to measure the delivered dialysis dose in patients with acute kidney injury (AKI). Direct dialysate-side quantification of dose has been recommended over formal blood-based urea kinetic modeling and simplified blood urea nitrogen (BUN) methods for dose assessment in critically ill patients with AKI. We evaluate six different blood-side and dialysate-side methods for dose quantification. Methods: We examined data from 52 critically ill patients with AKI requiring dialysis. All patients were treated with pre-dilution CVVHDF and regional citrate anticoagulation. Delivered dose was calculated using blood-side and dialysate-side kinetics. Filter function was assessed over the entire course of therapy by calculating the ratio of dialysis fluid urea nitrogen (FUN) to BUN every 12 hours. Results: Median daily treatment time was 1,413 min (1,260-1,440). The median observed effluent rate per treatment was 2,355 mL/h (2,060-2,863) (p<0.001). The urea mass removal rate was 13.0 +/- 7.6 mg/min. Both EKR (r² = 0.250; p<0.001) and K-D (r² = 0.409; p<0.001) correlated well with actual solute removal. EKR and K-D showed a decline in their values that was related to the decrease in filter function assessed by the FUN/BUN ratio. Conclusions: The effluent rate (mL/kg/h) can only provide an empirical estimate of dose in CRRT. For clinical practice, we recommend that the delivered dose be measured and expressed as K-D. EKR also constitutes a good method for dose comparisons over time and across modalities.
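For orientation, the sketch below computes the dialysate-side quantities mentioned above under common textbook definitions (urea mass removal from effluent flow and FUN, a K-D-type clearance as effluent flow scaled by FUN/BUN, and an EKR-type clearance as removal rate over time-averaged BUN); these are not necessarily the exact formulations used in the study, and the numbers are illustrative.

```python
# Minimal sketch of dialysate-side dose quantities under common textbook definitions
# (not necessarily the study's exact formulations). All input values are illustrative.

def urea_mass_removal_rate(effluent_flow_ml_min, fun_mg_dl):
    """Urea nitrogen removal rate (mg/min) from effluent flow and fluid urea nitrogen (FUN)."""
    return effluent_flow_ml_min * fun_mg_dl / 100.0        # mg/dL -> mg/mL

def dialysate_side_clearance(effluent_flow_ml_min, fun_mg_dl, bun_mg_dl):
    """K-D-type clearance (mL/min): effluent flow scaled by the FUN/BUN ratio."""
    return effluent_flow_ml_min * fun_mg_dl / bun_mg_dl

def ekr(urea_removal_mg_min, time_avg_bun_mg_dl):
    """EKR-type clearance (mL/min): removal rate over time-averaged BUN."""
    return urea_removal_mg_min / (time_avg_bun_mg_dl / 100.0)

# Example: 2,355 mL/h effluent (~39 mL/min), FUN 40 mg/dL, BUN 50 mg/dL.
q_eff = 2355 / 60.0
removal = urea_mass_removal_rate(q_eff, 40.0)
print(removal)                                    # ~15.7 mg/min
print(dialysate_side_clearance(q_eff, 40.0, 50.0))  # ~31.4 mL/min
print(ekr(removal, 50.0))                         # ~31.4 mL/min
```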

Relevance: 20.00%

Abstract:

Different representations of a control-surface freeplay nonlinearity in a three-degree-of-freedom aeroelastic system are assessed: the discontinuous, polynomial, and hyperbolic tangent representations. The Duhamel formulation is used to model the aerodynamic loads. The validity of these representations is assessed through comparison with previous experimental observations. The results show that the instability and nonlinear response characteristics are accurately predicted when using the discontinuous and hyperbolic tangent representations. On the other hand, the polynomial representation fails to predict the chaotic motions observed in the experiments.
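A minimal sketch of the three freeplay representations compared above is given below; the stiffness, deadband, smoothing parameter, and polynomial degree are placeholders, not the values used in the paper.

```python
# Sketch of three common ways to represent a freeplay restoring moment with deadband
# delta and stiffness k outside the gap. The polynomial version is a simple
# least-squares surrogate, not the coefficients used in the paper.
import numpy as np

def freeplay_discontinuous(alpha, k=1.0, delta=0.05):
    """Piecewise-linear (discontinuous-slope) freeplay: zero moment inside the gap."""
    return np.where(np.abs(alpha) <= delta, 0.0, k * (alpha - np.sign(alpha) * delta))

def freeplay_tanh(alpha, k=1.0, delta=0.05, eps=1e-3):
    """Hyperbolic-tangent smoothing of the switching between the gap and the linear branches."""
    return 0.5 * k * ((alpha - delta) * (1 + np.tanh((alpha - delta) / eps))
                      + (alpha + delta) * (1 - np.tanh((alpha + delta) / eps)))

def freeplay_polynomial(alpha, k=1.0, delta=0.05, degree=5):
    """Polynomial fitted to the discontinuous representation over the sampled range."""
    coeffs = np.polyfit(alpha, freeplay_discontinuous(alpha, k, delta), degree)
    return np.polyval(coeffs, alpha)

alpha = np.linspace(-0.2, 0.2, 401)
# Largest deviation of the polynomial surrogate from the discontinuous curve.
print(np.max(np.abs(freeplay_discontinuous(alpha) - freeplay_polynomial(alpha))))
```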

Relevance: 20.00%

Abstract:

Objective: To identify and compare perceptions of pain, and ways of coping with it, between men and women with central post-stroke pain. Methods: The participants were 25 men and 25 women, aged at least 30 years and with at least four years of schooling, presenting central post-stroke pain for at least three months. The instruments used were: Mini-Mental State Examination; structured interview for the Brief Psychiatric Scale; Survey of Sociodemographic and Clinical Data; Visual Analogue Scale (VAS); Ways of Coping with Problems Scale (WCPS); Revised Illness Perception Questionnaire (IPQ-R); and Beck Depression Inventory (BDI). Results: A significantly greater number of women used the coping strategy "turn to spiritual and religious activities" in the WCPS, and in the IPQ-R women associated their emotional state with the cause of pain. "Distraction of attention" was the strategy most used by the subjects overall. Conclusion: Women used spiritual and religious activities more as a coping strategy and perceived their emotional state as the cause of pain.

Relevance: 20.00%

Abstract:

An important feature of computer systems developed for the agricultural sector is the ability to handle the heterogeneity of data generated in different processes. Most problems related to this heterogeneity arise from the lack of a standard shared by the different computing solutions proposed. An efficient solution is to create a single standard for data exchange. The study of the actual process involved in cotton production was based on research developed by the Brazilian Agricultural Research Corporation (EMBRAPA), which reports all phases as a result of the compilation of several theoretical and practical studies related to the cotton crop. The proposition of a standard starts with the identification of the most important classes of data involved in the process and includes an ontology, a systematization of the concepts related to the production of cotton fiber, resulting in a set of classes, relations, functions, and instances. The results are used as a reference for the development of computational tools, transforming implicit knowledge into applications that support the knowledge described. This research is based on data from the Midwest of Brazil. The cotton process was chosen as a case study because Brazil is one of the major players in this market and several improvements to system integration are required in this segment.
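Purely as a hypothetical illustration of how a few such concepts could be mapped onto exchangeable data classes (none of these names come from the EMBRAPA material), consider:

```python
# Hypothetical illustration only: a tiny slice of what classes, relations, and instances
# for cotton-production data exchange might look like. Names are invented for this sketch.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Field:
    identifier: str
    area_ha: float

@dataclass
class Operation:
    name: str            # e.g. "sowing", "harvest"
    date: str            # ISO 8601 date
    field_ref: str       # identifier of the Field where it took place

@dataclass
class CropSeason:
    crop: str
    year: int
    operations: List[Operation] = field(default_factory=list)

# Instance of the hypothetical schema.
season = CropSeason(crop="cotton", year=2024)
season.operations.append(Operation(name="sowing", date="2024-10-15", field_ref="F-01"))
print(season)
```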

Relevance: 20.00%

Abstract:

The purpose of this paper is to present some fixed point theorems for Meir-Keeler contractions in a complete metric space endowed with a partial order. MSC: 47H10.
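For reference, the classical Meir-Keeler contraction condition underlying such results (the ordered versions typically restrict it to comparable elements) reads:

```latex
% Meir-Keeler contraction on a metric space (X, d):
\[
\forall\,\varepsilon > 0\ \exists\,\delta > 0:\qquad
\varepsilon \le d(x,y) < \varepsilon + \delta
\;\Longrightarrow\; d(Tx,Ty) < \varepsilon .
\]
```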

Relevance: 20.00%

Abstract:

The purpose of this paper is to present a fixed point theorem for generalized contractions in partially ordered complete metric spaces. We also present an application to first-order ordinary differential equations.
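The paper's exact contractive condition is not stated in the abstract; a representative ordered generalized-contraction condition of the kind used in this literature, for a nondecreasing map T and comparable elements, is:

```latex
% Representative (not necessarily the paper's exact) ordered contraction condition:
\[
d(Tx,Ty) \le \varphi\bigl(d(x,y)\bigr)
\qquad \text{for all comparable } x \succeq y,
\]
```

with φ nondecreasing and φⁿ(t) → 0 for every t > 0; such results typically also assume the existence of some x₀ with x₀ ⪯ Tx₀ plus a continuity or order-regularity hypothesis, and the fixed point theorem is then applied to the integral operator associated with the first-order ODE.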

Relevance: 20.00%

Abstract:

This paper presents experimental measurements of isobaric vapor-liquid equilibria (iso-p VLE) and excess volumes (vE) at several temperatures in the interval (288.15 to 328.15) K for six binary systems composed of two alkyl (methyl, ethyl) propanoates and three odd-carbon alkanes (C5 to C9). The mixing processes were expansive, vE > 0, with (∂vE/∂T)p > 0, and endothermic. The apparatus used to measure the iso-p VLE was improved by controlling, via a PC, three of the variables involved in the experiments.
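For clarity, the excess molar volume referred to above is the standard thermodynamic quantity

```latex
\[
v^{E} = v_{\mathrm{mix}} - \sum_{i} x_{i}\, v_{i}^{*},
\]
```

where x_i are the mole fractions and v_i^* the molar volumes of the pure components; the reported expansive behavior corresponds to v^E > 0 with (∂v^E/∂T)_p > 0.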

Relevance: 20.00%

Abstract:

Until recently, the debate on the ontology of spacetime had only philosophical significance, since, from a physical point of view, General Relativity had been made "immune" to the consequences of the "Hole Argument" simply by reducing the subject to the assertion that solutions of the Einstein equations that are mathematically different but related by an active diffeomorphism are physically equivalent. From a technical point of view, the natural reading of the consequences of the "Hole Argument" has always been to go further and say that the mathematical representation of spacetime in General Relativity inevitably contains a "superfluous structure" brought to light by the gauge freedom of the theory. This apparent split between the philosophical outcome and the physical one has been corrected thanks to a meticulous and complicated formal analysis of the theory in a fundamental recent (2006) work by Luca Lusanna and Massimo Pauri entitled "Explaining Leibniz equivalence as difference of non-inertial appearances: dis-solution of the Hole Argument and physical individuation of point-events". The main result of this article is to have shown how, from a physical point of view, the point-events of Einstein's empty spacetime, in a particular class of models they consider, are literally identifiable with the autonomous degrees of freedom of the gravitational field (the Dirac observables, DO). In the light of philosophical considerations based on realist assumptions about theories and entities, the two authors then conclude that spacetime point-events have a degree of "weak objectivity", since, depending on a NIF (non-inertial frame) and unlike the points of homogeneous Newtonian space, they are immersed in a rich and complex non-local holistic structure provided by the "ontic part" of the metric field. Therefore, given the complex structure of spacetime that General Relativity highlights, and within the declared limits of a methodology based on a Galilean scientific representation, we can certainly assert that spacetime has "elements of reality"; but the inevitably relational elements involved in the physical detection of point-events in the vacuum of matter (highlighted by the "ontic part" of the metric field, the DO) depend closely on the choice of the global spatiotemporal laboratory in which the dynamics is expressed (the NIF). According to the two authors, a peculiar kind of structuralism takes shape: point structuralism, with features common both to the absolutist and substantivalist tradition and to the relationalist one. The intention of this thesis is to propose a method of approaching the problem that is, at least at the outset, independent of the previous ones, namely an approach based on the possibility of describing the gravitational field at three distinct levels. In other words, keeping in mind the results achieved by the work of Lusanna and Pauri, and following their underlying philosophical assumptions, we intend to converge partially on their structuralist approach, but starting from what we believe is the "foundational peculiarity" of General Relativity, a characteristic inherent in the elements that constitute its formal structure: its essentially geometric nature as a theory, considered independently of the empirical necessity of measurement theory.
Observing the theory of General Relativity from this perspective, we find a "triple modality" for describing the gravitational field that is essentially based on a geometric interpretation of the spacetime structure. The gravitational field is now "visible" no longer in terms of its autonomous degrees of freedom (the DO), which in fact have neither a tensorial nor, therefore, a geometric nature, but can be analyzed at three levels: a first, the potential level (which the theory identifies with the components of the metric tensor); a second, the connection level (whose elements determine, in the theory, the forces acting on masses and, as such, offer a level of description analogous to the one Newtonian gravitation provides in terms of components of the gravitational field); and, finally, a third level, that of the Riemann tensor, which is peculiar to General Relativity alone. Focusing from the beginning on this "third level" seems to offer an immediate advantage: it leads directly to a description of spacetime properties in terms of gauge-invariant quantities, which allows one to "short-circuit" the long path that, in the treatments analyzed, leads to the identification of the "ontic part" of the metric field. It is then shown how, at this last level, it is possible to establish a "primitive level of objectivity" of spacetime in terms of the effects that matter exercises on extended domains of the spacetime geometric structure; these effects are described by invariants of the Riemann tensor, in particular of its irreducible part: the Weyl tensor. The convergence towards Lusanna and Pauri's claim that there exists a holistic, non-local and relational structure on which the quantitatively identified properties of point-events depend (in addition to their intrinsic detection), even though reached from different considerations, is realized, in our opinion, in the assignment of a crucial role to the degree of curvature of spacetime defined by the Weyl tensor, even in the case of empty spacetimes (as in the analysis conducted by Lusanna and Pauri). In the end, matter, regarded as the physical counterpart of spacetime curvature whose expression is the Weyl tensor, changes the value of this tensor even in spacetimes without matter. In this way, going back to the approach of Lusanna and Pauri, it affects the evolution of the DOs and, consequently, the physical identification of point-events (as our authors claim). In conclusion, we think it is possible to see the holistic, relational, and non-local structure of spacetime also through the "behavior" of the Weyl tensor as part of the Riemann tensor. This "behavior", which leads to geometric effects of curvature, is characterized from the beginning by the fact that it concerns extended domains of the manifold (although it should be pointed out that the values of the Weyl tensor change from point to point), by virtue of the fact that matter acts indefinitely from elsewhere. Finally, we think that the characteristic relationality of the spacetime structure should be identified with this "primitive level of organization" of spacetime.
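For reference, the standard decomposition that isolates the Weyl tensor as the trace-free ("irreducible") part of the Riemann tensor in four dimensions is

```latex
\[
C_{abcd} \;=\; R_{abcd}
\;-\;\bigl( g_{a[c} R_{d]b} \;-\; g_{b[c} R_{d]a} \bigr)
\;+\;\tfrac{1}{3}\, R\, g_{a[c}\, g_{d]b},
\]
```

so that in empty (Ricci-flat) spacetimes the entire curvature content is carried by the Weyl tensor, which is the sense in which it can encode the effects of distant matter in matter-free regions.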

Relevance: 20.00%

Abstract:

"Cartographic heritage" is different from "cartographic history". The second term refers to the study of the development, through time, of the surveying and drawing techniques related to maps, that is, through the different types of cultural environment that formed the background for the creation of maps. The first term refers to the whole body of ancient maps, together with these different cultural environments, which history has handed down to us and which we perceive as cultural values to be preserved and made available to many users (the public, institutions, experts). Unfortunately, ancient maps often suffer preservation problems of their analog support, mostly due to aging. Today, metric recovery in digital form and digital processing of historical cartography make it possible to preserve this map heritage. Moreover, modern geomatic techniques offer new possibilities for using historical information that would be unachievable on analog supports. In this PhD thesis, the whole digital workflow for the recovery and elaboration of ancient cartography is reported, with special emphasis on the use of digital tools in the preservation and elaboration of cartographic heritage. The workflow can be divided into three main steps, which reflect the chapter structure of the thesis itself:
• map acquisition: conversion of the ancient map support from analog to digital by means of high-resolution scanning or 3D surveying (digital photogrammetry or laser scanning techniques); this process must be performed carefully, with special instruments, in order to reduce deformation as much as possible;
• map georeferencing: reproducing in the digital image the native metric content of the map, or even improving it, by selecting a large number of still-existing ground control points; in this way it is possible to understand the projection features of the historical map, as well as to evaluate and represent the degree of deformation induced by the old type of cartographic transformation (which may be unknown to us), by surveying errors, or by support deformation, all errors usually too large with respect to current standards (a minimal georeferencing sketch is given after the list of maps below);
• data elaboration and management in a digital environment by means of modern software tools: vectorization, giving the map a new and more attractive graphic appearance (for instance, by creating a 3D model), superimposing it on current base maps, comparing it with other maps, and finally inserting it into a GIS or WebGIS environment as a specific layer.
The study is supported by several case histories, each of them interesting from the point of view of at least one step of the digital cartographic elaboration.
The ancient maps taken into account are the following:
• three maps of the Po river delta, made at the end of the XVI century by a famous land surveyor, Ottavio Fabri (sole author of the first map, co-author with Gerolamo Pontara of the second, and co-author with Bonajuto Lorini and others of the third), who wrote a methodological textbook explaining a new topographical instrument, the squadra mobile (mobile square), invented and used by himself; today all three maps are preserved in the State Archive of Venice;
• the Ichnoscenografia of Bologna by Filippo de' Gnudi, made in 1702 and today preserved in the Archiginnasio Library of Bologna; it is a scenographic, bird's-eye view of the city, but it also has an icnographic value, as the author himself declares;
• the map of Bologna by the periti (surveyors) Gregorio Monari and Antonio Laghi, the first map of the city derived from a systematic survey, even though it was made only ten years (1711–1712) after the map by de' Gnudi; in this map the scenographic view was abandoned in favor of a more correct representation by means of orthogonal projection; today the map is preserved in the State Archive of Bologna;
• the Gregorian Cadastre of Bologna, made in 1831 and updated until 1927, now preserved in the State Archive of Bologna; it is composed of 140 maps and 12 brogliardi (register volumes).
In particular, the three maps of the Po river delta and the Cadastre were studied with respect to their acquisition procedure. Moreover, the Po river delta maps were analyzed from the georeferencing point of view, and the Cadastre was analyzed with respect to a possible GIS insertion. Finally, the Ichnoscenografia was used to illustrate a possible application of digital elaboration, namely 3D modeling. Last but not least, we must not forget that the study of an ancient map should start, whenever possible, from the consultation of the precious original analog document; analysis by means of current digital techniques offers new research opportunities in a rich and modern multidisciplinary context.
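As a minimal sketch of the georeferencing step described above, the code below estimates an affine transformation from pixel coordinates of the scanned map to ground coordinates by least squares on ground control points and reports per-point residuals as a rough measure of map deformation; the control-point values are made up for illustration.

```python
# Minimal georeferencing sketch: affine fit from scan pixels to ground coordinates
# using ground control points, with residuals as a rough deformation indicator.
# Control-point coordinates below are invented for illustration.
import numpy as np

def fit_affine(pixel_xy, ground_xy):
    """Solve [x' y'] = A @ [x y 1] in the least-squares sense; returns the 2x3 matrix A."""
    n = len(pixel_xy)
    design = np.hstack([np.asarray(pixel_xy, float), np.ones((n, 1))])
    coeffs, *_ = np.linalg.lstsq(design, np.asarray(ground_xy, float), rcond=None)
    return coeffs.T                                   # shape (2, 3)

def residuals(A, pixel_xy, ground_xy):
    """Euclidean misfit at each control point, in ground units."""
    design = np.hstack([np.asarray(pixel_xy, float), np.ones((len(pixel_xy), 1))])
    pred = design @ A.T
    return np.linalg.norm(pred - np.asarray(ground_xy, float), axis=1)

pixels = [(120, 340), (900, 310), (880, 1200), (140, 1180)]
ground = [(683200.0, 4928800.0), (684000.0, 4928820.0),
          (683980.0, 4927900.0), (683220.0, 4927880.0)]
A = fit_affine(pixels, ground)
print(residuals(A, pixels, ground))
```

In practice, higher-order polynomial or spline transformations are often preferred for strongly deformed supports, at the cost of needing more control points.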