878 results for "sets of words"
Abstract:
2D electrophoresis is a well-known method for protein separation which is extremely useful in the field of proteomics. Each spot in the image represents a protein accumulation and the goal is to perform a differential analysis between pairs of images to study changes in protein content. It is thus necessary to register two images by finding spot correspondences. Although it may seem a simple task, the manual processing of this kind of image is generally very cumbersome, especially when strong variations between corresponding sets of spots are expected (e.g. strong non-linear deformations and outliers). In order to solve this problem, this paper proposes a new quadratic assignment formulation together with a correspondence estimation algorithm based on graph matching which takes into account the structural information between the detected spots. Each image is represented by a graph and the task is to find a maximum common subgraph. Successful experimental results using real data are presented, including an extensive comparative performance evaluation with ground-truth data. (C) 2010 Elsevier B.V. All rights reserved.
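A minimal sketch of structural spot matching in this spirit, using SciPy's FAQ quadratic-assignment solver as a stand-in for the paper's own algorithm; the spot coordinates and the fully connected distance graphs are illustrative assumptions, and no outlier handling or maximum-common-subgraph search is attempted here.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.optimize import quadratic_assignment

# Illustrative 2D spot centroids detected in two gel images (same count for simplicity).
spots_a = np.array([[10.0, 12.0], [40.0, 55.0], [70.0, 20.0], [30.0, 80.0]])
spots_b = np.array([[12.0, 11.0], [42.0, 57.0], [68.0, 22.0], [33.0, 78.0]])

# Represent each image as a complete graph weighted by inter-spot distances,
# so the matching rewards preserving the relative geometry of the spot pattern.
A = cdist(spots_a, spots_a)
B = cdist(spots_b, spots_b)

# Approximate QAP solution (FAQ); maximizing the cross term is equivalent to
# minimizing the squared mismatch between corresponding distances.
res = quadratic_assignment(A, B, method="faq", options={"maximize": True})
print("correspondences:", list(enumerate(res.col_ind)))  # spot i in A -> res.col_ind[i] in B
```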
Abstract:
We propose a likelihood ratio test (LRT) with Bartlett correction in order to identify Granger causality between sets of time series gene expression data. The performance of the proposed test is compared to a previously published bootstrap-based approach. The LRT is shown to be significantly faster and statistically powerful even under non-normal distributions. An R package named gGranger containing an implementation of both Granger causality identification tests is also provided.
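A hedged Python sketch of the basic likelihood-ratio idea (this is not the gGranger package, whose interface is not reproduced here): compare a restricted autoregression of y on its own past against an unrestricted one that also includes past values of x, and refer the statistic to a chi-squared distribution. The Bartlett correction used in the paper is omitted; lag order and data are illustrative.

```python
import numpy as np
from scipy import stats

def lagged(v, p):
    """Columns [v_{t-1}, ..., v_{t-p}], aligned with v[p:]."""
    return np.column_stack([v[p - k: len(v) - k] for k in range(1, p + 1)])

def granger_lrt(y, x, p=1):
    """LRT of 'x Granger-causes y' with p lags; Gaussian errors, no Bartlett correction."""
    yt = y[p:]
    n = len(yt)
    intercept = np.ones((n, 1))
    X_r = np.hstack([intercept, lagged(y, p)])                # restricted: own past only
    X_f = np.hstack([intercept, lagged(y, p), lagged(x, p)])  # full: plus past of x
    rss_r = np.sum((yt - X_r @ np.linalg.lstsq(X_r, yt, rcond=None)[0]) ** 2)
    rss_f = np.sum((yt - X_f @ np.linalg.lstsq(X_f, yt, rcond=None)[0]) ** 2)
    lr = n * np.log(rss_r / rss_f)        # 2 * (loglik_full - loglik_restricted)
    pval = stats.chi2.sf(lr, df=p)        # asymptotically chi-squared, p restrictions
    return lr, pval

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = np.zeros(200)
for t in range(1, 200):                   # y depends on lagged x, so causality should be detected
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()
print(granger_lrt(y, x, p=1))
```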
Abstract:
Trypanosoma cruzi is highly diverse genetically and has been partitioned into six discrete typing units (DTUs), recently re-named T. cruzi I-VI. Although T. cruzi reproduces predominantly by binary division, accumulating evidence indicates that particular DTUs are the result of hybridization events. Two major scenarios for the origin of the hybrid lineages have been proposed. It is widely accepted that the most heterozygous DTUs, TcV and TcVI, are the result of genetic exchange between TcII and TcIII strains. On the other hand, the participation of a TcI parental in the current genome structure of these hybrid strains is a matter of debate. Here, sequences of the T. cruzi-specific 195-bp satellite DNA of TcI, TcII, Tat, TcV, and TcVI strains have been used to infer network genealogies. The resulting genealogy showed a high degree of reticulation, which is consistent with more than one event of hybridization between the Tc DTUs. The data also strongly suggest that Tat is a hybrid with two distinct sets of satellite sequences, and that genetic exchange between TcI and TcII parentals occurred within the pedigree of the TcV and TcVI DTUs. Although satellite DNAs belong to the fast-evolving portion of eukaryotic genomes, in >100 satellite units of nine T. cruzi strains we found regions that display 100% identity. No DTU-specific consensus motifs were identified, indicating species-wide conservation. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
New basis sets of the atomic natural orbital (ANO) type have been developed for the lanthanide atoms La-Lu. The ANOs have been obtained from the average density matrix of the ground and lowest excited states of the atom, the positive ions, and the atom in an electric field. Scalar relativistic effects are included through the use of a Douglas-Kroll-Hess Hamiltonian. Multiconfigurational wave functions have been used with dynamic correlation included using second-order perturbation theory (CASSCF/CASPT2). The basis sets are applied in calculations of ionization energies and some excitation energies. Computed ionization energies have an accuracy better than 0.1 eV in most cases. Two molecular applications are included as illustration: the cerium diatom and the LuF3 molecule. In both cases it is shown that 4f orbitals are not involved in the chemical bond, in contrast to an earlier claim for the latter molecule.
Abstract:
The solvation of six solvatochromic probes in a large number of solvents (33-68) was examined at 25 degrees C. The probes employed were the following: 2,6-diphenyl-4-(2,4,6-triphenylpyridinium-1-yl) phenolate (RB); 4-[(E)-2-(1-methylpyridinium-4-yl)ethenyl] phenolate (MePM); 1-methylquinolinium-8-olate (QB); 2-bromo-4-[(E)-2-(1-methylpyridinium-4-yl)ethenyl] phenolate (MePMBr); 2,6-dichloro-4-(2,4,6-triphenylpyridinium-1-yl) phenolate (WB); and 2,6-dibromo-4-[(E)-2-(1-methylpyridinium-4-yl)ethenyl] phenolate (MePMBr(2)). Of these, MePMBr is a novel compound. They can be grouped into three pairs, each with similar pK(a) in water but with different molecular properties, for example, lipophilicity and dipole moment. These pairs are formed by RB and MePM; QB and MePMBr; and WB and MePMBr(2). Theoretical calculations were carried out to obtain their physicochemical properties, including bond lengths, dihedral angles, dipole moments, and the wavelength of absorption of the intramolecular charge-transfer band in four solvents (water, methanol, acetone, and DMSO). The calculated data were in excellent agreement with the available experimental data, for example, bond lengths and dihedral angles. This gives credence to the use of the calculated properties in explaining the solvatochromic behaviors observed. The dependence of an empirical solvent polarity scale E(T)(probe), in kcal/mol, on the physicochemical properties of the solvent (acidity, basicity, and dipolarity/polarizability) and those of the probes (pK(a) and dipole moment) was analyzed by using known multiparameter solvation equations. For each pair of probes, values of E(T)(probe) (for example, E(T)(MePM) versus E(T)(RB)) were found to be linearly correlated, with correlation coefficients, r, between 0.9548 and 0.9860. For the merocyanine series, the values of E(T)(probe) also correlated linearly, with r values of 0.9772 (MePMBr versus MePM) and 0.9919 (MePMBr(2) versus MePM). The response of each pair of probes (of similar pK(a)) to solvent acidity is the same, provided that solute-solvent hydrogen bonding is not seriously affected by steric crowding (as in the case of RB). We show, for the first time, that the response to solvent dipolarity/polarizability is linearly correlated with the dipole moment of the probes. The successive introduction of bromine atoms into MePM (to give MePMBr, then MePMBr(2)) leads to a linear decrease in the following: pK(a) in water, length of the phenolate oxygen-carbon bond, length of the central ethylenic bond, susceptibility to solvent acidity, and susceptibility to solvent dipolarity/polarizability. Thus, studying the solvation of probes whose molecular structures are varied systematically produces a wealth of information on the effect of solute structure on its solvation. The results of the solvation of the present probes were employed to test the goodness of fit of two independent sets of solvent solvatochromic parameters.
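For reference, the empirical polarity value of a probe is obtained from its longest-wavelength intramolecular charge-transfer band via the standard molar-transition-energy conversion, and the multiparameter analyses referred to above take a form along the lines of the second equation (the exact parameter set used in the paper may differ; the Catalán-style SA/SB/SPP split shown here is illustrative):

```latex
E_T(\mathrm{probe})\ [\mathrm{kcal\,mol^{-1}}] \;=\; \frac{28591}{\lambda_{\max}\ [\mathrm{nm}]}
\qquad\qquad
E_T(\mathrm{probe}) \;=\; E_{T,0} \;+\; a\,\mathrm{SA} \;+\; b\,\mathrm{SB} \;+\; s\,\mathrm{SPP}
```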
Abstract:
Various significant anti-HCV and cytotoxic sesquiterpene lactones (SLs) have been characterized. In this work, the chemometric tool Principal Component Analysis (PCA) was applied to two sets of SLs and the variance of the biological activity was explored. The first principal component accounts for as much of the variability in the data as possible, and each succeeding component accounts for as much of the remaining variability as possible. The calculations were performed using the VolSurf program. For anti-HCV activity, PC1 (the first principal component) explained 30.3% and PC2 (the second principal component) explained 26.5% of the total variance of the matrix, while for cytotoxic activity, PC1 explained 30.9% and PC2 explained 15.6% of the total variance. The formalism employed generated good exploratory and predictive results, and we identified, for both sets, some structural features that are important for a suitable biological activity and pharmacokinetic profile.
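A minimal sketch of this kind of PCA variance decomposition using scikit-learn; the descriptor matrix below is a random placeholder standing in for the VolSurf descriptors, which are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Placeholder matrix: rows = sesquiterpene lactones, columns = molecular descriptors
# (random data, used only to show the workflow).
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 12))

# Autoscale the descriptors, then project onto the first two principal components.
pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))

# Fraction of total variance captured by PC1 and PC2
# (the paper reports e.g. 30.3% and 26.5% for the anti-HCV set).
print(pca.explained_variance_ratio_)
# Loadings indicate which descriptors drive each component.
print(pca.components_)
```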
Abstract:
The batch-operated bromate/phosphate/acetone/dual catalyst system was studied at four temperatures between 5 and 35 degrees C. The dynamics was followed simultaneously by potential measurements with platinum and bromide-selective electrodes, and spectroscopically at two different wavelengths. By simultaneously recording these four time series it was possible to characterize the dynamics of the sequential oscillations that evolve in time. The existence of three sequential oscillatory patterns at each temperature allowed the activation energy to be estimated in each case. Along with the activation energy of the induction period, it was possible to trace the time evolution of the overall activation energy at four different stages as the reaction proceeds. The study was carried out for two different sets of initial concentrations, and it was observed that the overall activation energy increases as reactants turn into products. This finding was attributed to the decrease in the driving force, or the system's affinity, of the catalytic oxidative bromination of acetone with acidic bromate as the closed system evolves toward thermodynamic equilibrium.
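The activation energies mentioned here are the usual Arrhenius quantities; taking the inverse of a characteristic time of each oscillatory stage (for example the induction period or the oscillation period, an assumption on my part) as an effective rate constant, the activation energy follows from the slope of an Arrhenius plot over the four temperatures:

```latex
k_{\mathrm{eff}} \;\propto\; \frac{1}{\tau} \;=\; A\,e^{-E_a/RT}
\quad\Longrightarrow\quad
\ln\frac{1}{\tau} \;=\; \ln A \;-\; \frac{E_a}{R}\cdot\frac{1}{T}
```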
Abstract:
Background: The sensitivity to microenvironmental changes varies among animals and may be under genetic control. It is essential to take this element into account when aiming at breeding robust farm animals. Here, linear mixed models with genetic effects in the residual variance part of the model can be used. Such models have previously been fitted using EM and MCMC algorithms. Results: We propose the use of double hierarchical generalized linear models (DHGLM), where the squared residuals are assumed to be gamma distributed and the residual variance is fitted using a generalized linear model. The algorithm iterates between two sets of mixed model equations, one on the level of observations and one on the level of variances. The method was validated using simulations and also by re-analyzing a data set on pig litter size that was previously analyzed using a Bayesian approach. The pig litter size data contained 10,060 records from 4,149 sows. The DHGLM was implemented using the ASReml software and the algorithm converged within three minutes on a Linux server. The estimates were similar to those previously obtained using Bayesian methodology, especially the variance components in the residual variance part of the model. Conclusions: We have shown that variance components in the residual variance part of a linear mixed model can be estimated using a DHGLM approach. The method enables analyses of animal models with large numbers of observations. An important future development of the DHGLM methodology is to include the genetic correlation between the random effects in the mean and residual variance parts of the model as a parameter of the DHGLM.
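A stripped-down numerical sketch of the alternating scheme described above, without the random (genetic) effects and with a simple log-linear fit of the squared residuals standing in for the gamma GLM; it only illustrates how the two sets of estimating equations feed each other and is not the ASReml/DHGLM implementation.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # mean-model design
Z = np.column_stack([np.ones(n), rng.normal(size=n)])   # residual-variance design
beta_true, gamma_true = np.array([1.0, 2.0]), np.array([-0.5, 0.8])
y = X @ beta_true + rng.normal(size=n) * np.exp(0.5 * Z @ gamma_true)

log_phi = np.zeros(n)                                    # start from a homogeneous residual variance
for _ in range(20):
    # (1) mean model: weighted least squares with weights 1/phi_i
    w = np.exp(-log_phi)
    WX = X * w[:, None]
    beta = np.linalg.solve(X.T @ WX, X.T @ (w * y))
    # (2) dispersion model: squared residuals regressed on Z on the log scale
    #     (a crude stand-in for the gamma GLM with log link used in the paper)
    d = (y - X @ beta) ** 2
    gamma = np.linalg.lstsq(Z, np.log(d + 1e-12), rcond=None)[0]
    log_phi = Z @ gamma

print("beta  ~", beta)    # close to [1.0, 2.0]
print("gamma ~", gamma)   # slope is recovered; the intercept is shifted by the log-of-chi-squared shortcut
```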
Abstract:
The recent advances in CMOS technology have allowed for the fabrication of transistors with submicronic dimensions, making possible the integration of tens of millions of devices in a single chip that can be used to build very complex electronic systems. This increase in design complexity has created a need for more efficient verification tools that can incorporate more appropriate physical and computational models. Timing verification aims to determine whether the timing constraints imposed on the design can be satisfied. It can be performed by circuit simulation or by timing analysis. Although simulation tends to furnish the most accurate estimates, it has the drawback of being stimulus-dependent. Hence, in order to ensure that the critical situation is taken into account, one must exercise all possible input patterns. Obviously, this is not feasible given the high complexity of current designs. To circumvent this problem, designers must rely on timing analysis. Timing analysis is an input-independent verification approach that models each combinational block of a circuit as a directed acyclic graph, which is used to estimate the critical delay. The first timing analysis tools used only circuit topology information to estimate circuit delay, and are thus referred to as topological timing analyzers. However, such a method may result in overly pessimistic delay estimates, since the longest paths in the graph may not be able to propagate a transition, that is, they may be false. Functional timing analysis, in turn, considers not only circuit topology, but also the temporal and functional relations between circuit elements. Functional timing analysis tools may differ in three aspects: the set of sensitization conditions necessary to declare a path sensitizable (i.e., the so-called path sensitization criterion), the number of paths simultaneously handled, and the method used to determine whether the sensitization conditions are satisfiable. Currently, the two most efficient approaches test the sensitizability of entire sets of paths at a time: one is based on automatic test pattern generation (ATPG) techniques and the other translates the timing analysis problem into a satisfiability (SAT) problem. Although timing analysis has been studied exhaustively over the last fifteen years, some specific topics have not yet received the required attention. One such topic is the applicability of functional timing analysis to circuits containing complex gates. This is the basic concern of this thesis. In addition, and as a necessary step to set the stage, a detailed and systematic study of functional timing analysis is also presented.
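As a small illustration of the topological (non-functional) flavour of timing analysis mentioned above: the critical delay of a combinational block modelled as a DAG is simply the longest weighted path, computable in linear time after a topological sort. Gate names and delays below are made up, and no false-path (sensitization) analysis is done, which is exactly why the estimate may be pessimistic.

```python
from collections import defaultdict

# DAG of a small combinational block: edge (u -> v) carries the delay accumulated at v.
# Node names and delays are illustrative only.
edges = {
    "a": [("g1", 2.0)], "b": [("g1", 2.0), ("g2", 3.0)],
    "g1": [("g3", 1.5)], "g2": [("g3", 2.5)],
    "g3": [("out", 1.0)],
}

def topological_critical_delay(edges):
    """Longest-path (topological) delay estimate; ignores path sensitizability."""
    indeg = defaultdict(int)
    nodes = set(edges)
    for u, outs in edges.items():
        for v, _ in outs:
            indeg[v] += 1
            nodes.add(v)
    arrival = {n: 0.0 for n in nodes}
    ready = [n for n in nodes if indeg[n] == 0]   # primary inputs
    while ready:                                  # Kahn-style topological traversal
        u = ready.pop()
        for v, d in edges.get(u, []):
            arrival[v] = max(arrival[v], arrival[u] + d)
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    return max(arrival.values())

print(topological_critical_delay(edges))  # longest path b -> g2 -> g3 -> out = 3.0 + 2.5 + 1.0 = 6.5
```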
Abstract:
This paper is about economies with a representative consumer. In general a representative consumer need not exist, although there are several well-known sets of sufficient conditions under which one will. It is common practice, however, to use the representative consumer hypothesis without specifically assuming any of these. We show, firstly, that it is possible for the utility of the representative consumer to increase when every actual consumer is made worse off. This shows a serious shortcoming of welfare judgements based on the representative consumer. Secondly, in economies where this does not occur, there exists a social welfare function, which we construct, which is consistent with welfare judgements based on the utility of the representative consumer. Finally we provide a converse to Samuelson's 1956 representative consumer result, which relates it to Scitovsky's community indifference curves.
Abstract:
The relationship between Islamic Law and other legal systems (basically western-type domestic legal orders and international law) is often thought of in terms of compatibility or incompatibility. Concerning certain subject matters of choice, the compatibility of Islamic (legal) principles with the values embedded in legal systems that are regarded as characteristic of the Modern Age is tested by sets of questions: is democracy possible in Islam? Does Islam recognize human rights and are those rights equivalent to a more universal conception? Does Islam recognize or condone more extreme acts of violence and does it justify violence differently? Such questions and many more presuppose the existence of an ensemble of rules or principles which, as any other set of rules and principles, purport to regulate social behavior. This ensemble is generically referred to as Islamic Law. However, one set of questions is usually left unanswered: is Islamic Law a legal system? If it is a legal system, what are its specific characteristics? How does it work? Where does it apply? It is this paper's argument that the relationship between Islamic Law and domestic and international law can only be understood if looked upon as a relationship between distinct legal systems or legal orders.
Abstract:
The work described in this thesis aims to support the distributed design of integrated systems and considers specifically the need for collaborative interaction among designers. Particular emphasis was given to issues which were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymic technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 80's within the electronic design automation community and comprises a layered software environment which aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model to build extensible and reusable object-oriented software subsystems. In this work, we proposed to create an object-oriented framework which includes extensible sets of design data primitives and design tool building blocks. Such an object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration. The implemented CAD Framework - named Cave2 - followed the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the use of the object-oriented framework foundations allowed a series of improvements which were not available in previous approaches: - object-oriented frameworks are extensible by design, so this also holds for the implemented sets of design data primitives and design tool building blocks. This means that both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and that such extensions and adaptations will still inherit the architectural and functional aspects implemented in the object-oriented framework foundation; - the design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows for different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings; - the control of the consistency between semantics and visualization - a particularly important issue in a design environment with multiple views of a single design - is also included in the foundations of the object-oriented framework. Such a mechanism is generic enough to be also used by further extensions of the design data model, as it is based on the inversion of control between view and semantics. The view receives the user input and propagates this event to the semantic model, which evaluates whether a state change is possible. If so, it triggers the change of state of both semantics and view.
Our approach took advantage of this inversion of control and included a layer between semantics and view to take into account the possibility of multi-view consistency; - to optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her respective design views. The information about each interaction is encapsulated inside an event object, which may be propagated to the design semantics - and thus to other possible views - according to the consistency policy which is being used. Furthermore, the use of event pools allows for a late synchronization between view and semantics in case of unavailability of a network connection between them; - the use of proxy objects significantly raised the level of abstraction of the integration of design automation resources, as either remote or local tools and services are accessed through method calls on a local object. The connection to remote tools and services using a look-up protocol also completely abstracted the network location of such resources, allowing for resource addition and removal during runtime; - the implemented CAD Framework is completely based on Java technology, so it relies on the Java Virtual Machine as the layer which grants the independence between the CAD Framework and the operating system. All these improvements contributed to a higher abstraction in the distribution of design automation resources and also introduced a new paradigm for the remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment. This can increase group awareness and allow a richer transfer of experience among designers, significantly improving the collaboration potential when compared to previously proposed file-based or record-based approaches. Three different case studies were conducted to validate the proposed approach, each one focusing on a subset of the contributions of this thesis. The first one uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second one extends the foundations of the implemented object-oriented framework to support interface-based design. Such extensions - design representation primitives and tool blocks - are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study concerns the integration of multimedia metadata into the design data model. This possibility is explored in the context of an online educational and training platform.
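A toy Python sketch (the Cave2 framework itself is written in Java) of the inversion of control between view and semantics outlined above: the view only forwards user events to the semantic model, which decides whether the state change is legal and, if so, notifies every registered view. Class and method names are invented for illustration; event pools and network propagation are omitted.

```python
class SemanticModel:
    """Owns the design state; views never mutate it directly."""
    def __init__(self):
        self._views = []
        self._state = {}

    def attach(self, view):
        self._views.append(view)

    def request_change(self, key, value):
        # The semantic model evaluates whether the state change is possible...
        if self._is_legal(key, value):
            self._state[key] = value
            # ...and, if so, triggers the update of every attached view (multi-view consistency).
            for v in self._views:
                v.refresh(key, value)

    def _is_legal(self, key, value):
        return value is not None          # placeholder consistency rule

class View:
    def __init__(self, name, model):
        self.name = name
        self._model = model
        model.attach(self)

    def on_user_input(self, key, value):
        # Inversion of control: the view forwards the event instead of changing state itself.
        self._model.request_change(key, value)

    def refresh(self, key, value):
        print(f"[{self.name}] {key} -> {value}")

model = SemanticModel()
schematic, layout = View("schematic", model), View("layout", model)
schematic.on_user_input("net_width", 4)   # both views are refreshed through the model
```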
Abstract:
In this dissertation, we investigate the effect of foreign capital participation on Brazilian companies' performance. To carry out this analysis, we constructed two sets of models based on EBITDA margin and return on equity. Panel data analysis is used to examine the relationship between foreign capital ownership and Brazilian firms' performance. We construct a cross-section time-series sample of companies listed on the BOVESPA index from 2006 to 2010. The empirical results led us to validate two hypotheses. First, foreign capital participation improves companies' performance up to a certain level of participation. Second, joint control or a strategic partnership between a Brazilian company and a foreign investor provides higher operating performance.
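A sketch of a fixed-effects panel specification in the spirit of the analysis described above, using the linearmodels package; the variable names, the quadratic foreign-ownership term and the use of firm fixed effects are my assumptions for illustration, not the dissertation's exact specification.

```python
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

# Toy firm-year panel (2006-2010); data and variable names are illustrative.
rng = np.random.default_rng(3)
firms, years = [f"firm{i}" for i in range(50)], range(2006, 2011)
idx = pd.MultiIndex.from_product([firms, years], names=["firm", "year"])
df = pd.DataFrame({
    "foreign_share": rng.uniform(0, 1, len(idx)),
    "log_assets": rng.normal(15, 1, len(idx)),
}, index=idx)
# Simulate an inverted-U effect of foreign ownership on performance, as the first hypothesis suggests.
df["ebitda_margin"] = (0.3 * df["foreign_share"] - 0.25 * df["foreign_share"] ** 2
                       + 0.01 * df["log_assets"] + rng.normal(0, 0.05, len(idx)))
df["foreign_share_sq"] = df["foreign_share"] ** 2

# Firm fixed effects; standard errors clustered by firm.
res = PanelOLS(df["ebitda_margin"],
               df[["foreign_share", "foreign_share_sq", "log_assets"]],
               entity_effects=True).fit(cov_type="clustered", cluster_entity=True)
print(res.params)
```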
Abstract:
An important trend in the luxury market is brand extension into a new market segment through so-called vertical extension, which may be upward or downward. In other words, the organization starts to operate in a new segment within the same product category, but targeting a different audience from that of its original brand. In this process, the company begins activity in a new segment with a different level of luxury. Distribution is a fundamental aspect of the marketing mix, and the importance of the internet as a distribution channel for this industry has grown markedly in recent years. It is therefore necessary to understand how luxury brands manage their online distribution strategies when they develop brand extension processes and enter new segments. With the objective of analyzing the distribution strategy of the luxury industry, an exploratory study was developed focusing on personal luxury goods (in categories such as couture, watches & jewelry, leather goods and shoes). A significant sample consisting of original brands and their extensions was analyzed to build a comparative model between two variables: the level of differentiation between the distribution channels of the original brand and its extensions; and the distance between the brands themselves with respect to their positioning. This study contributes to the understanding of the market's distribution dynamics and helps explain the behavior of the companies operating in it, depending on the type of extensions they develop and the way these are conducted.
Abstract:
This paper analyzes the placement in the private sector of a subset of Brazilian public-sector employees. This group left public employment in the mid-1990s through a voluntary severance program. The paper contrasts their earnings before and after quitting the public sector, and compares both sets of wages to public- and private-sector earnings for similar workers. We find that participants in this voluntary severance program suffered a significant reduction in average earnings and an increase in earnings dispersion. We test, by means of counterfactual simulations, whether the reduction in average earnings and the increase in earnings dispersion are the expected outcomes once one controls for observed characteristics. Several methods of controlling for observed characteristics (parametric and non-parametric) are used for robustness. The results indicate that this group of workers was paid at levels below what would be expected given their observable characteristics.
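A minimal sketch of the parametric flavour of such a counterfactual exercise: fit a wage equation on comparable private-sector workers and use it to predict what program participants would be expected to earn given their observed characteristics, then compare with their actual pay. The variable names, the OLS specification and the simulated data are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Toy data: a private-sector comparison group and the former public-sector workers.
rng = np.random.default_rng(4)
def make_group(n, penalty=0.0):
    d = pd.DataFrame({"educ": rng.integers(4, 17, n), "exper": rng.integers(0, 31, n)})
    d["log_wage"] = 0.8 + 0.10 * d["educ"] + 0.02 * d["exper"] - penalty + rng.normal(0, 0.3, n)
    return d

private = make_group(2000)
quitters = make_group(300, penalty=0.15)   # paid below their predicted level, mimicking the paper's finding

# Parametric counterfactual: wage equation estimated on the private-sector sample...
X = sm.add_constant(private[["educ", "exper"]])
ols = sm.OLS(private["log_wage"], X).fit()

# ...used to predict the earnings the quitters would command given their characteristics.
pred = ols.predict(sm.add_constant(quitters[["educ", "exper"]]))
print("average gap (actual - predicted log wage):", (quitters["log_wage"] - pred).mean())
```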