993 results for software component
Abstract:
Three-dimensional spectroscopy techniques are becoming more and more popular, producing an increasing number of large data cubes. The challenge of extracting information from these cubes requires the development of new techniques for data processing and analysis. We apply the recently developed technique of principal component analysis (PCA) tomography to a data cube from the center of the elliptical galaxy NGC 7097 and show that this technique is effective in decomposing the data into physically interpretable information. We find that the first five principal components of our data are associated with distinct physical characteristics. In particular, we detect a low-ionization nuclear-emitting region (LINER) with a weak broad component in the Balmer lines. Two images of the LINER are present in our data, one seen through a disk of gas and dust, and the other after scattering by free electrons and/or dust particles in the ionization cone. Furthermore, we extract the spectrum of the LINER, decontaminated from stellar and extended nebular emission, using only the technique of PCA tomography. We anticipate that the scattered image has polarized light due to its scattered nature.
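As an illustration of the general PCA tomography idea (a sketch, not the authors' actual pipeline): a data cube with two spatial axes and one spectral axis is unfolded into a matrix with one spectrum per spaxel, PCA yields "eigenspectra", and the projection of every spaxel onto each eigenspectrum forms a "tomogram". Function and variable names below are illustrative.

```python
import numpy as np

def pca_tomography(cube, n_components=5):
    """Decompose a (ny, nx, nwave) spectral data cube into eigenspectra
    and tomograms via principal component analysis."""
    ny, nx, nwave = cube.shape
    # Unfold the cube: one row per spaxel, one column per wavelength.
    spectra = cube.reshape(ny * nx, nwave)
    mean_spectrum = spectra.mean(axis=0)
    centered = spectra - mean_spectrum
    # Covariance matrix in wavelength space and its eigendecomposition.
    cov = centered.T @ centered / (centered.shape[0] - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_components]
    eigenspectra = eigvecs[:, order].T          # (n_components, nwave)
    # Tomograms: projection of every spaxel onto each eigenspectrum.
    tomograms = (centered @ eigenspectra.T).reshape(ny, nx, n_components)
    return eigenspectra, tomograms, mean_spectrum
```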
Abstract:
Aims. A model-independent reconstruction of the cosmic expansion rate is essential to a robust analysis of cosmological observations. Our goal is to demonstrate that current data are able to provide reasonable constraints on the behavior of the Hubble parameter with redshift, independently of any cosmological model or underlying gravity theory. Methods. Using type Ia supernova data, we show that it is possible to analytically calculate the Fisher matrix components in a Hubble parameter analysis without assumptions about the energy content of the Universe. We used a principal component analysis to reconstruct the Hubble parameter as a linear combination of the Fisher matrix eigenvectors (principal components). To suppress the bias introduced by the high-redshift behavior of the components, we considered the value of the Hubble parameter at high redshift as a free parameter. We first tested our procedure using a mock sample of type Ia supernova observations and then applied it to the real data compiled by the Sloan Digital Sky Survey (SDSS) group. Results. In the mock sample analysis, we demonstrate that it is possible to drastically suppress the bias introduced by the high-redshift behavior of the principal components. Applying our procedure to the real data, we show that it allows us to determine the behavior of the Hubble parameter with reasonable uncertainty, without introducing any ad hoc parameterizations. Moreover, our reconstruction agrees with completely independent measurements of the Hubble parameter obtained from red-envelope galaxies.
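For illustration only (a sketch under assumptions, not the paper's implementation): with H(z) parameterized as piecewise constant in redshift bins, the Fisher matrix of supernova distance moduli can be built and then eigendecomposed into the principal components used for the reconstruction. Numerical derivatives are used here for brevity, whereas the abstract notes the components can be obtained analytically; bin edges, the step size eps, and all names are illustrative, and NumPy arrays are assumed for all inputs.

```python
import numpy as np

C_KM_S = 299792.458  # speed of light in km/s

def luminosity_distance(z, h_bins, z_edges):
    """d_L (Mpc) for a flat universe with H(z) piecewise constant in bins."""
    zs = np.linspace(0.0, z, 256)
    idx = np.clip(np.searchsorted(z_edges, zs, side="right") - 1,
                  0, len(h_bins) - 1)
    comoving = np.trapz(C_KM_S / h_bins[idx], zs)
    return (1.0 + z) * comoving

def fisher_matrix(z_sn, sigma_mu, h_bins, z_edges, eps=0.5):
    """Fisher matrix of the binned H(z) parameters for SN Ia distance moduli."""
    deriv = np.zeros((len(z_sn), len(h_bins)))
    for i in range(len(h_bins)):
        hp, hm = h_bins.copy(), h_bins.copy()
        hp[i] += eps
        hm[i] -= eps
        for k, z in enumerate(z_sn):
            mu_p = 5 * np.log10(luminosity_distance(z, hp, z_edges)) + 25
            mu_m = 5 * np.log10(luminosity_distance(z, hm, z_edges)) + 25
            deriv[k, i] = (mu_p - mu_m) / (2 * eps)
    weighted = deriv / sigma_mu[:, None]
    return weighted.T @ weighted

# The principal components are the eigenvectors of this matrix; H(z) is then
# reconstructed as a linear combination of the best-constrained ones.
```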
Abstract:
This study was conducted in the Mata do Jambreiro Private Reserve (912 ha), located in the Iron Quadrangle, Minas Gerais, in the southeastern portion of the Espinhaco Range, which is predominantly covered by semideciduous seasonal montane forest. Three topographically and physiognomically similar areas, located 1.3 to 1.5 km apart within a continuous forest fragment, were sampled by the point-quadrat method. In each area, 30 points were marked. Individuals with a minimum perimeter at breast height (PBH) of 15 cm were sampled, totaling 111 species belonging to 40 families. The most representative family was Fabaceae, with 14.29% of the total number of species. Low floristic similarity (5.3% to 34.4%) was observed between the areas, highlighting the importance of how sample units are distributed within continuous fragments. The Shannon diversity index (H') was 4.22 and the Pielou evenness (J) was 0.894. Soil analysis showed some differences in chemical composition among the three studied areas and was an important component in interpreting the floristic variation found. The low floristic similarity observed here between nearby areas justifies the requirement of more detailed inventories by Brazilian environmental agencies in the legal authorization procedures that precede new development projects. Professionals who conduct rapid inventories, particularly environmental consultants, should also pay more attention to this kind of floristic variation and to the methods used to inventory complex forests.
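The diversity figures quoted follow the standard definitions, H' = -Σ p_i ln p_i and Pielou J = H'/ln S. A minimal sketch, with made-up abundance counts for illustration only:

```python
import numpy as np

def shannon_pielou(abundances):
    """Shannon diversity H' (natural log) and Pielou evenness J = H'/ln(S)."""
    counts = np.asarray(abundances, dtype=float)
    counts = counts[counts > 0]
    p = counts / counts.sum()
    h = -np.sum(p * np.log(p))
    j = h / np.log(len(counts))
    return h, j

# Hypothetical abundance counts, for illustration only:
h, j = shannon_pielou([12, 8, 5, 5, 3, 2, 1, 1])
print(f"H' = {h:.2f}, J = {j:.3f}")
```

As a consistency check on the figures above, with 111 species the maximum possible H' is ln(111) ≈ 4.71, and 4.22/4.71 ≈ 0.90, roughly consistent with the reported J of 0.894.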
Abstract:
This paper presents SMarty, a variability management approach for UML-based software product lines (PL). SMarty is supported by a UML profile, the SMartyProfile, and a process for managing variabilities, the SMartyProcess. SMartyProfile aims at representing variabilities, variation points, and variants in UML models by applying a set of stereotypes. SMartyProcess consists of a set of activities that is systematically executed to trace, identify, and control variabilities in a PL based on SMarty. It also identifies variability implementation mechanisms and analyzes specific product configurations. In addition, a more comprehensive application of SMarty is presented using SEI's Arcade Game Maker PL. An evaluation of SMarty and related work are discussed.
Abstract:
In one-component Abelian sandpile models, the toppling probabilities are independent quantities. This is not the case in multicomponent models. The condition of associativity of the underlying Abelian algebras imposes nonlinear relations among the toppling probabilities. These relations are derived for the case of two-component quadratic Abelian algebras. We show that Abelian sandpile models with two conservation laws have only trivial avalanches.
Abstract:
Thousands of Free and Open Source Software Projects (FSP) have been, and continue to be, created on the Internet. This scenario increases the opportunities for collaboration just as much as it promotes competition for users and contributors, who can take projects to levels unachievable by their founders alone. Given that the main goal of FSP founders is to improve their projects through collaboration, it becomes important to understand and manage a project's capacity to attract users and contributors. To support researchers and founders in this challenge, this paper introduces the concept of attractiveness and develops a theoretical-managerial toolkit covering the causes, indicators and consequences of attractiveness, enabling its strategic management.
Abstract:
Type IV secretion systems (T4SS) are used by Gram-negative bacteria to translocate protein and DNA substrates across the cell envelope and into target cells. Translocation across the outer membrane is achieved via a ringed tetradecameric outer membrane complex made up of a small VirB7 lipoprotein (normally 30 to 45 residues in the mature form) and the C-terminal domains of the VirB9 and VirB10 subunits. Several species of the phytopathogenic genus Xanthomonas possess an uncharacterized type IV secretion system with some distinguishing features, one of which is an unusually large VirB7 subunit (118 residues in the mature form). Here, we report the NMR and 1.0 angstrom X-ray structures of the VirB7 subunit from Xanthomonas citri subsp. citri (VirB7(XAC2622)) and its interaction with VirB9. NMR solution studies show that residues 27-41 of the disordered flexible N-terminal region of VirB7(XAC2622) interact specifically with the VirB9 C-terminal domain, resulting in a significant reduction in the conformational freedom of both regions. VirB7(XAC2622) has a unique C-terminal domain whose topology is strikingly similar to that of N0 domains found in proteins from different systems involved in transport across the bacterial outer membrane. We show that VirB7(XAC2622) oligomerizes through interactions involving conserved residues in the N0 domain and residues 42-49 within the flexible N-terminal region and that these homotropic interactions can persist in the presence of heterotropic interactions with VirB9. Finally, we propose that VirB7(XAC2622) oligomerization is compatible with the core complex structure in a manner such that the N0 domains form an extra layer on the perimeter of the tetradecameric ring.
Abstract:
For environmental quality assessment, INAA has been applied to determine chemical elements in small (200 mg) and large (200 g) samples of leaves from 200 trees. By applying Ingamells' sampling constant, the expected percent standard deviation was estimated at 0.9-2.2% for the 200 mg samples. For the composite samples (200 g), on the other hand, the expected standard deviation varied from 0.5 to 10%, despite analytical uncertainties ranging from 2 to 30%. The results therefore suggest expressing the degree of representativeness as a source of uncertainty, which contributes to increasing the reliability of environmental studies, mainly in the case of composite samples.
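For context, Ingamells' sampling constant relates the sampling relative standard deviation R (in %) to the subsample mass w through Ks = R² · w, so R = sqrt(Ks/w). A minimal sketch; the Ks value below is hypothetical, back-calculated from the 2.2% figure quoted for 200 mg purely for illustration:

```python
import math

def sampling_rsd(ks, mass_g):
    """Relative standard deviation (%) from Ingamells' sampling constant:
    Ks = R**2 * w, hence R = sqrt(Ks / w)."""
    return math.sqrt(ks / mass_g)

# Hypothetical Ks derived from R = 2.2 % at w = 0.2 g:
ks = 2.2 ** 2 * 0.2            # in g * %^2
print(sampling_rsd(ks, 0.2))    # ~2.2 % for a 200 mg subsample
print(sampling_rsd(ks, 200.0))  # ~0.07 % for a 200 g composite sample
```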
Abstract:
The purpose of this article is to initiate a philosophical discussion about the ethical component of professional competence in nursing from the perspective of Brazilian nurses. Specifically, this article discusses professional competence in nursing practice in the Brazilian health context, based on two different conceptual frameworks. The first framework is derived from the idealistic and traditional approach while the second views professional competence through the lens of historical and dialectical materialism theory. The philosophical analyses show that the idealistic view of professional competence differs greatly from practice. Combining nursing professional competence with philosophical perspectives becomes a challenge when ideals are opposed by the reality and implications of everyday nursing practice.
Abstract:
Objective To evaluate drug interaction software programs and determine their accuracy in identifying drug-drug interactions that may occur in intensive care units. Setting The study was developed in Brazil. Method Drug interaction software programs were identified through a bibliographic search in PUBMED and in LILACS (database related to the health sciences published in Latin American and Caribbean countries). The programs' sensitivity, specificity, and positive and negative predictive values were determined to assess their accuracy in detecting drug-drug interactions. The accuracy of the software programs identified was determined using 100 clinically important interactions and 100 clinically unimportant ones. Stockley's Drug Interactions 8th edition was employed as the gold standard in the identification of drug-drug interactions. Main outcome Sensitivity, specificity, positive and negative predictive values. Results The programs studied were: Drug Interaction Checker (DIC), Drug-Reax (DR), and Lexi-Interact (LI). DR displayed the highest sensitivity (0.88) and DIC showed the lowest (0.69). A close similarity was observed among the programs regarding specificity (0.88-0.92) and positive predictive values (0.88-0.89). The DIC had the lowest negative predictive value (0.75) and DR the highest (0.91). Conclusion The DR and LI programs displayed appropriate sensitivity and specificity for identifying drug-drug interactions of interest in intensive care units. Drug interaction software programs help pharmacists and health care teams in the prevention and recognition of drug-drug interactions and optimize safety and quality of care delivered in intensive care units.
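The accuracy figures above follow the usual 2x2 screening-test definitions. A minimal sketch with hypothetical counts (not the study's actual data):

```python
def screening_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity, positive and negative predictive values
    from a 2x2 table of flagged vs. true drug-drug interactions."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Hypothetical example: a program flags 88 of the 100 clinically important
# interactions and wrongly flags 11 of the 100 clinically unimportant ones.
print(screening_metrics(tp=88, fn=12, tn=89, fp=11))
```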
Abstract:
Support for interoperability and interchangeability of software components that are part of a fieldbus automation system relies on the definition of open architectures, most of them involving proprietary technologies. Concurrently, standard, open and non-proprietary technologies, such as XML, SOAP, Web Services and the like, have greatly evolved and been diffused in the computing area. This article presents a FOUNDATION fieldbus (TM) device description technology named Open-EDD, based on XML and other related technologies (XSLT, DOM using the Xerces implementation, OO, XML Schema), proposing an open and non-proprietary alternative to the EDD (Electronic Device Description). This initial proposal includes defining Open-EDDML as the programming language of the technology in the FOUNDATION fieldbus (TM) protocol, implementing a compiler and a parser, and finally, integrating and testing the new technology using field devices and a commercial fieldbus configurator. This study attests that this new technology is feasible and can be applied to other configurators or HMI applications used in fieldbus automation systems. (c) 2008 Elsevier B.V. All rights reserved.
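The abstract does not give the Open-EDDML schema, so the fragment below is purely hypothetical; it only sketches how a device description expressed in XML could be walked with a DOM parser (Python's standard-library DOM is used here for brevity, analogous in spirit to the Xerces-based implementation mentioned). Element and attribute names are illustrative, not the actual schema.

```python
from xml.dom import minidom

# Hypothetical device-description fragment; not the real Open-EDDML schema.
DEVICE_XML = """
<device manufacturer="0x123456" deviceType="0x0001" deviceRevision="1">
  <block name="AI_BLOCK">
    <parameter name="PV" type="float" access="read"/>
    <parameter name="OUT_SCALE" type="scaling" access="read-write"/>
  </block>
</device>
"""

dom = minidom.parseString(DEVICE_XML)
for block in dom.getElementsByTagName("block"):
    print("Block:", block.getAttribute("name"))
    for param in block.getElementsByTagName("parameter"):
        print("  parameter", param.getAttribute("name"),
              "type", param.getAttribute("type"))
```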
Abstract:
This paper proposes a novel computer vision approach that processes video sequences of people walking and then recognises those people by their gait. Human motion carries different information that can be analysed in various ways. The skeleton carries motion information about human joints, and the silhouette carries information about boundary motion of the human body. Moreover, binary and gray-level images contain different information about human movements. This work proposes to recover these different kinds of information to interpret the global motion of the human body based on four different segmented image models, using a fusion model to improve classification. Our proposed method considers the set of the segmented frames of each individual as a distinct class and each frame as an object of this class. The methodology applies background extraction using the Gaussian Mixture Model (GMM), a scale reduction based on the Wavelet Transform (WT) and feature extraction by Principal Component Analysis (PCA). We propose four new schemas for motion information capture: the Silhouette-Gray-Wavelet model (SGW) captures motion based on grey level variations; the Silhouette-Binary-Wavelet model (SBW) captures motion based on binary information; the Silhouette-Edge-Wavelet model (SEW) captures motion based on edge information and the Silhouette-Skeleton-Wavelet model (SSW) captures motion based on skeleton movement. The classification rates obtained separately from these four different models are then merged using a new proposed fusion technique. The results suggest excellent performance in terms of recognising people by their gait.
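A rough sketch of the three generic stages named above (GMM background subtraction, wavelet-based scale reduction, PCA feature extraction), using OpenCV, PyWavelets and scikit-learn; the four segmentation models (SGW, SBW, SEW, SSW) and the fusion rule are the paper's own and are not reproduced here. All names are illustrative.

```python
import cv2
import numpy as np
import pywt
from sklearn.decomposition import PCA

def silhouette_features(frames, n_components=20):
    """GMM background subtraction -> wavelet scale reduction -> PCA features
    for a sequence of grayscale frames (NumPy arrays). Assumes the sequence
    has at least n_components frames."""
    subtractor = cv2.createBackgroundSubtractorMOG2()
    reduced = []
    for frame in frames:
        mask = subtractor.apply(frame)                      # foreground silhouette
        approx, _ = pywt.dwt2(mask.astype(float), "haar")   # keep the LL subband
        reduced.append(approx.ravel())
    pca = PCA(n_components=n_components)
    return pca.fit_transform(np.array(reduced))
```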
Abstract:
This paper proposes a reference model for software development aimed at small companies. Despite the importance of small software companies in Latin America, the lack of standards of their own, able to meet their specific needs, has created serious difficulties both in improving their processes and in achieving quality certification. As a contribution to a better understanding of the subject, a reference model is proposed and, as a means of validating the proposal, a report is presented on its application in a small Brazilian company committed to certification under the MPS.BR quality model.
Abstract:
In this paper a bond graph methodology is used to model incompressible fluid flows with viscous and thermal effects. The distinctive characteristic of these flows is the role of pressure, which does not behave as a state variable but as a function that must act in such a way that the resulting velocity field has zero divergence. Velocity and entropy per unit volume are used as independent variables for a single-phase, single-component flow. Time-dependent nodal values and interpolation functions are introduced to represent the flow field, from which nodal vectors of velocity and entropy are defined as state variables. The system of momentum and continuity equations coincides with the one obtained by applying the Galerkin method to the weak formulation of the problem in finite elements. The integral incompressibility constraint is derived based on the integral conservation of mechanical energy. The weak formulation for the thermal energy equation is modeled with true bond graph elements in terms of nodal vectors of temperature and entropy rates, resulting in a Petrov-Galerkin method. The resulting bond graph shows the coupling between the mechanical and thermal energy domains through the viscous dissipation term. All kinds of boundary conditions are handled consistently and can be represented as generalized effort or flow sources. A procedure for causality assignment is derived for the resulting graph, satisfying the second principle of thermodynamics. (C) 2007 Elsevier B.V. All rights reserved.
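For reference, the textbook Galerkin ingredients such a formulation builds on, written in generic notation (this is a sketch of the standard finite-element interpolation and weighted-integral incompressibility condition, not the paper's own derivation): the velocity field is interpolated from nodal values, and the divergence-free condition is imposed in integral form against the same interpolation functions.

```latex
% Nodal interpolation of the velocity field and the Galerkin (weighted
% integral) form of the incompressibility constraint, generic notation:
\mathbf{v}(\mathbf{x},t) \approx \sum_{i=1}^{N} \phi_i(\mathbf{x})\,\mathbf{v}_i(t),
\qquad
\int_{\Omega} \phi_j(\mathbf{x})\,\nabla\!\cdot\mathbf{v}(\mathbf{x},t)\; d\Omega = 0,
\quad j = 1,\dots,N .
```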
Abstract:
Nanocomposite membranes containing polysulfone (PSf) and sodium montmorillonite from Wyoming (MMT) were prepared by a combination of solution dispersion and the immersion step of the wet-phase inversion method. The purpose was to study the addition of 0.5 and 3.0 mass% MMT in the preparation of nanocomposite membranes, evaluating the morphology and the thermal, mechanical and hydrophilic properties of the nanocomposite membranes and comparing these properties with those of the pure PSf membrane. Small-angle X-ray diffraction patterns revealed the formation of intercalated clay mineral layers in the PSf matrix, and TEM images also showed an exfoliated structure. A good dispersion of the clay mineral particles was detected by SEM images. Tensile tests showed that both the elongation at break and the tensile strength of the nanocomposites were improved in comparison with the pristine PSf. The thermal stability of the nanocomposite membranes, evaluated by the onset and final temperatures of degradation, was also enhanced. The hydrophilicity of the nanocomposite membranes, determined by water contact angle measurements, was higher; therefore, the MMT addition was useful for producing more hydrophilic membranes. (C) 2009 Elsevier B.V. All rights reserved.