948 results for Functional Discourse Grammar Theory
Abstract:
We have investigated the optical and transport properties of the molecular structure 2,3,4,5-tetraphenyl-1-phenylethynyl-cyclopenta-2,4-dienol experimentally and theoretically. The optical spectrum was calculated using the Hartree-Fock intermediate-neglect-of-differential-overlap configuration-interaction model. The experimental photoluminescence spectrum showed a peak around 470 nm, which was very well described by the modeling. Electronic transport measurements showed a diode-like effect with strong current rectification. A phenomenological microscopic model based on the non-equilibrium Green's function technique was proposed, and a very good description of the electronic transport was obtained. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4767457]
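The abstract does not reproduce the transport model itself. For orientation, the current in NEGF-based treatments of molecular junctions is commonly written in the Landauer form below; this is the standard textbook expression, not necessarily the exact phenomenological model used in the paper:

```latex
I(V) = \frac{2e}{h} \int dE \; T(E,V)\,\bigl[ f_L(E) - f_R(E) \bigr],
\qquad
T(E,V) = \mathrm{Tr}\!\bigl[ \Gamma_L \, G^r \, \Gamma_R \, G^a \bigr]
```

Here f_{L,R} are the Fermi functions of the contacts and G^{r,a} the retarded/advanced Green's functions; current rectification arises when the transmission T(E,V) responds asymmetrically to the sign of the bias.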
Abstract:
The physical properties of small rhodium clusters, Rh-n, have been in debate due to the shortcomings of density functional theory (DFT). To help resolve those problems, we obtained a set of putative lowest-energy structures for small Rh-n (n = 2-15) clusters employing hybrid-DFT and the generalized gradient approximation (GGA). For n = 2-6, both hybrid and GGA functionals yield similar (compact) ground-state structures; however, hybrid favors compact structures for n = 7-15, while GGA favors open structures based on simple cubic motifs. Thus, experimental results are crucial to indicate the correct ground-state structures; however, we found that a single set of structures (compact or open) is unable to explain all available experimental data. For example, the GGA (open) structures yield total magnetic moments in excellent agreement with experimental data, while the hybrid (compact) structures have larger magnetic moments compared with experiment due to the increased localization of the 4d states. Thus, we would conclude that GGA provides a better description of the Rh-n clusters; however, a recent experimental-theoretical study [Harding et al., J. Chem. Phys. 133, 214304 (2010)] found that only compact structures are able to explain the experimental vibrational data, while open structures cannot. This indicates that the study of Rh-n clusters is a challenging problem, and that further experimental studies are required to help solve this conundrum, as well as a better description of the exchange and correlation effects in the Rh-n clusters using theoretical methods such as the quantum Monte Carlo method.
Abstract:
The aim of the present study was to evaluate the influence of seasonality on the behavior of phytoplankton associations in eutrophic reservoirs with different depths in northeastern Brazil. Five collections were carried out at each of the reservoirs at two depths (0.1 m and near the sediment) at three-month intervals in each season (dry and rainy). The phytoplankton samples were preserved in Lugol's solution and quantified under an inverted microscope for the determination of density values, which were subsequently converted to biomass values based on cellular biovolume and classified into phytoplankton associations. The following abiotic variables were analyzed: water temperature, dissolved oxygen, pH, turbidity, water transparency, total phosphorus, total dissolved phosphorus, orthophosphate and total nitrogen. The data were investigated using canonical correspondence analysis. The influence of seasonality on the dynamics of the phytoplankton community was weaker in the deeper reservoirs. Depth affected the behavior of the algal associations. Variation in light availability was a determinant of changes in the phytoplankton structure. Urosolenia and Anabaena associations were more abundant in shallow ecosystems with a larger euphotic zone, whereas the Microcystis association was more related to deep ecosystems with adequate availability of nutrients. The distribution of the Cyclotella, Geitlerinema, Planktothrix, Pseudanabaena and Cylindrospermopsis associations was different from that seen in subtropical regions, and the substitution of these associations was related to a reduction in the euphotic zone rather than the mixing zone. Published by Elsevier GmbH.
Abstract:
Abstract Background Recently, it was realized that the functional connectivity networks estimated from current brain-imaging technologies (MEG, fMRI and EEG) can be analyzed by means of graph theory, i.e. as a mathematical representation of a network, essentially reduced to nodes and the connections between them. Methods We used high-resolution EEG technology to enhance the poor spatial resolution of the EEG activity on the scalp; it yields a measure of the electrical activity on the cortical surface. Afterwards, we used the Directed Transfer Function (DTF), a multivariate spectral measure for the estimation of the directional influences between any given pair of channels in a multivariate dataset. Finally, a graph theoretical approach was used to model the brain networks as graphs. These methods were used to analyze the structure of cortical connectivity during the attempt to move a paralyzed limb in a group (N=5) of spinal cord injured (SCI) patients and during movement execution in a group (N=5) of healthy subjects. Results Analysis performed on the cortical networks estimated from the group of normal subjects and SCI patients revealed that both groups present few nodes with a high out-degree value (i.e. outgoing links). This property holds in the networks estimated for all the frequency bands investigated. In particular, cingulate motor area (CMA) ROIs act as "hubs" for the outflow of information in both groups, SCI and healthy. Results also suggest that spinal cord injuries affect the functional architecture of the cortical network subserving the volition of motor acts mainly in its local properties. In particular, a higher local efficiency El can be observed in the SCI patients for three frequency bands: theta (3-6 Hz), alpha (7-12 Hz) and beta (13-29 Hz). By taking into account all the possible pathways between different ROI pairs, we were able to clearly separate the network properties of the SCI group from those of the CTRL group.
In particular, we report a sort of compensatory mechanism in the SCI patients for the theta (3-6 Hz) frequency band, indicating a higher level of "activation" Ω within the cortical network during the motor task. The activation index is directly related to diffusion, a type of dynamics that underlies several biological systems, including the possible spreading of neuronal activation across several cortical regions. Conclusions The present study aims at demonstrating the possible applications of graph theoretical approaches in the analysis of brain functional connectivity from EEG signals. In particular, the methodological aspects of (i) estimating cortical activity from scalp EEG signals, (ii) functional connectivity estimation and (iii) graph theoretical indexes are emphasized in the present paper to show their impact in a real application.
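The out-degree "hub" analysis described above can be illustrated with a minimal sketch. This is not the paper's pipeline, and the adjacency matrix below is an illustrative toy, not real DTF-derived EEG data:

```python
# Sketch: identifying out-degree "hubs" in a directed connectivity graph,
# in the spirit of the DTF-derived cortical network analysis described above.
# The adjacency matrix is a toy example, not real data.

def out_degrees(adj):
    """Out-degree of each node in a directed graph given as a 0/1 adjacency
    matrix, where adj[i][j] == 1 means an edge from node i to node j."""
    return [sum(row) for row in adj]

def hubs(adj, threshold):
    """Nodes whose out-degree exceeds a chosen threshold."""
    return [i for i, d in enumerate(out_degrees(adj)) if d > threshold]

# Toy 5-node network: node 0 projects to every other node (a "hub").
adj = [
    [0, 1, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [0, 0, 0, 1, 0],
    [1, 0, 0, 0, 0],
    [0, 1, 0, 0, 0],
]
print(out_degrees(adj))          # [4, 1, 1, 1, 1]
print(hubs(adj, threshold=2))    # [0]
```

In practice one adjacency matrix is built per frequency band from the thresholded DTF values, and the hub pattern is compared across groups.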
Abstract:
With their accession to the European Union, twelve new countries - Romania among them - (re)entered the international community of international donors. In the history of development aid this can be seen as a unique event: it is the first time in history that such a large number of countries have become international donors, at such short notice and in such a particular context, one that sees some scholars announcing the ‘death’ of development. But in spite of what might be claimed regarding the ‘end’ of the development era, development discourse seems to be rather vigorous and in good health: it is able to exert an undeniable force of attraction over the twelve countries that, in a matter of years, have already convinced themselves of its validity and adhered to its main tenets. This thesis collects evidence for improving our understanding of this process, which sees the co-optation of twelve new countries into the dominant theory and practice of development cooperation. The evidence collected seems to show that one of the tools employed by the promoters of this co-optation process is that of constructing the ‘new’ Member States as ‘new’, inexpert donors that need to learn from the ‘old’ ones. By taking a case-study approach, this thesis gathers data suggesting that conceiving of the ‘twelve’ as ‘new’ donors is both historically inaccurate and value-laden. On the one hand, Romania’s case study illustrates how in the (socialist) past at least one of the group of twelve was particularly conversant in the discourse of international development.
On the other hand, the process of co-optation, while being presented as a knowledge-producing process, can also be seen as an ignorance-producing procedure: Romania, along with its fellow new Member States, takes the opportunity of ‘building its capacity’ and ‘raising its awareness’ of development cooperation along the line drawn by the European Union, but at the same time it seems to un-learn and ‘lower’ its awareness of the development experience of its (socialist) past. This is one possible reading of this thesis. At a different level, this thesis can also be seen as an attempt to account for almost five decades of international development discourse in one specific country - Romania - in three different socio-political contexts: the socialist years (up to 1989), the ‘transition years’ (from 1989 to the pre-accession years) and membership in the European Union. In this second reading, the thesis seeks to illustrate how - contrary to widespread beliefs - before 1989 Romania’s international development discourse was particularly vivid: in the most varied national and international settings President Ceausescu unfolded an extensive discursive activity on issues pertaining to international development; generous media coverage of affairs concerning the developing countries and their fight for development was the rule rather than the exception; the political leadership wanted Romanians not only to be familiarized with (or ‘aware of’, to use current terminology) matters of underdevelopment, but also to show a sense of solidarity with these countries, as well as a sense of pride in the relations of ‘mutual help’ that were being built with them; finally, international development was an object of academic attention, and Romanian scholars were able not only to reflect on major developments, but could also formulate critical positions towards the practices of development aid.
Very little remains of all this from the transition years, while in the present those who are engaged in matters pertaining to international development do so with a view to building Romania as an EU-compliant donor.
Abstract:
Recently, an ever-increasing degree of automation has been observed in most industrial processes. This increase is motivated by the demand for systems with high performance in terms of the quality of the products/services generated, productivity, efficiency and low costs in design, realization and maintenance. This trend toward more complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, is merging with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, or buy products in boxes, like food or cigarettes, and so on. Another indication of their complexity derives from the fact that the consortium of machine producers has estimated around 350 types of manufacturing machine. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; in particular, a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact among themselves in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often, this is the case in large-scale systems, organized in a modular and distributed manner.
Even if the success of a modern AMS from a functional and behavioural point of view is still to be attributed to the design choices made in the definition of the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties associated with it. Apart from the activity inherent to the automation of the machine cycles, the supervisory system is called on to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; directing the operator attending the machine to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing in real time information on diagnostics, as a support for maintenance operations on the machine. The kind of facilities that designers can directly find on the market, in terms of software component libraries, in fact provides adequate support for the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices. What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to their specific needs.
Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have been adopted for a long time, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually very "unstructured". No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies.
Industrial automation has lately been receiving this approach, as testified by the IEC standards IEC 61131-3 and IEC 61499, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. During the last years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, in complex systems fault occurrences increase. This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS, together with reliable mechanical elements, an increasing number of electronic devices are also present, which are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and reconfiguring the control system so as to guarantee satisfactory performance.
The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, important improvements to the formal verification of logic control, fault diagnosis and fault-tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey of the state of the software engineering paradigm applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to obtain better reusability and modularity of the control logic. In Chapter 5 a new approach is presented, based on Discrete Event Systems, for the problem of software formal verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, conclusive remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approach presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
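To make the Discrete Event Systems setting concrete, a minimal sketch of a finite-state automaton over events follows. All names and the toy machine are illustrative, not taken from the thesis:

```python
# Minimal sketch of a discrete-event system as a finite automaton:
# states, events, a partial transition function, and a set of marked
# (accepting) states. All names here are illustrative.

class Automaton:
    def __init__(self, transitions, initial, marked):
        self.transitions = transitions  # {(state, event): next_state}
        self.initial = initial
        self.marked = set(marked)

    def run(self, events):
        """Execute a string of events; raise KeyError if an event is not
        enabled in the current state."""
        state = self.initial
        for e in events:
            state = self.transitions[(state, e)]
        return state

    def accepts(self, events):
        """True if the event string is feasible and ends in a marked state."""
        try:
            return self.run(events) in self.marked
        except KeyError:
            return False

# A two-state plant cycling between 'idle' and 'working'.
plant = Automaton(
    transitions={('idle', 'start'): 'working',
                 ('working', 'done'): 'idle'},
    initial='idle',
    marked=['idle'],
)
print(plant.accepts(['start', 'done']))   # True
print(plant.accepts(['start', 'start']))  # False ('start' not enabled twice)
```

Verification questions of the kind treated in Chapter 5 (e.g. whether a faulty event string can reach an unsafe state) reduce to reachability checks on such models, typically on the synchronous composition of plant and controller automata.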
Abstract:
In this thesis we discuss a representation of quantum mechanics and quantum and statistical field theory based on a functional renormalization flow equation for the one-particle-irreducible average effective action, and we employ it to get information on some specific systems.
Abstract:
In this thesis we develop further the functional renormalization group (RG) approach to quantum field theory (QFT) based on the effective average action (EAA) and on the exact flow equation that it satisfies. The EAA is a generalization of the standard effective action that interpolates smoothly between the bare action for k → ∞ and the standard effective action for k → 0. In this way, the problem of performing the functional integral is converted into the problem of integrating the exact flow of the EAA from the UV to the IR. The EAA formalism deals naturally with several different aspects of a QFT. One aspect is related to the discovery of non-Gaussian fixed points of the RG flow that can be used to construct continuum limits. In particular, the EAA framework is a useful setting in which to search for Asymptotically Safe theories, i.e. theories valid up to arbitrarily high energies. A second aspect in which the EAA reveals its usefulness is non-perturbative calculations. In fact, the exact flow that it satisfies is a valuable starting point for devising new approximation schemes. In the first part of this thesis we review and extend the formalism; in particular, we derive the exact RG flow equation for the EAA and the related hierarchy of coupled flow equations for the proper vertices. We show how standard perturbation theory emerges as a particular way to iteratively solve the flow equation if the starting point is the bare action. Next, we explore both technical and conceptual issues by means of three different applications of the formalism: to QED, to general non-linear sigma models (NLσM) and to matter fields on curved spacetimes. In the main part of this thesis we construct the EAA for non-abelian gauge theories and for quantum Einstein gravity (QEG), using the background field method to implement the coarse-graining procedure in a gauge-invariant way.
We propose a new truncation scheme where the EAA is expanded in powers of the curvature or field strength. Crucial to the practical use of this expansion is the development of new techniques to manage functional traces, such as the algorithm proposed in this thesis. This allows one to project the flow of all terms in the EAA which are analytic in the fields. As an application, we show how the low-energy effective action for quantum gravity emerges as the result of integrating the RG flow. In any treatment of theories with local symmetries that introduces a reference scale, the question of preserving gauge invariance along the flow emerges as predominant. In the EAA framework this problem is dealt with by the use of the background field formalism. This comes at the cost of enlarging the theory space where the EAA lives to the space of functionals of both fluctuation and background fields. In this thesis, we study how the identities dictated by the symmetries are modified by the introduction of the cutoff, and we study so-called bimetric truncations of the EAA that contain both fluctuation and background couplings. In particular, we confirm the existence of a non-Gaussian fixed point for QEG, which is at the heart of the Asymptotic Safety scenario in quantum gravity, in the enlarged bimetric theory space where the running of the cosmological constant and of Newton's constant is influenced by fluctuation couplings.
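The exact flow equation satisfied by the EAA, referred to throughout the abstract, is the standard Wetterich equation; with t = ln k and R_k the infrared cutoff kernel, it reads:

```latex
\partial_t \Gamma_k[\varphi]
= \frac{1}{2}\,\mathrm{Tr}\!\left[
\left( \Gamma_k^{(2)}[\varphi] + R_k \right)^{-1} \partial_t R_k
\right]
```

where Γ_k^{(2)} is the second functional derivative of the EAA and the trace runs over momenta and all internal indices.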
Abstract:
We give a brief review of the Functional Renormalization method in quantum field theory, which is intrinsically non-perturbative, in terms of both the Polchinski equation for the Wilsonian action and the Wetterich equation for the generator of the proper vertices. For the latter case we show a simple application to a theory with one real scalar field within the LPA and LPA' approximations. For the first case, instead, we give a covariant "Hamiltonian" version of the Polchinski equation, which consists in performing a Legendre transform of the flow for the corresponding effective Lagrangian, replacing arbitrarily high-order derivatives of the fields with momentum fields. This approach is suitable for studying new truncations in the derivative expansion. We apply this formulation to a theory with one real scalar field and, as a novel result, derive the flow equations for a theory with N real scalar fields with O(N) internal symmetry. Within this new approach we analyze numerically the scaling solutions for N=1 in d=3 (critical Ising model), at leading order in the derivative expansion with an infinite number of couplings, encoded in two functions V(phi) and Z(phi), obtaining an estimate for the quantum anomalous dimension with 10% accuracy (compared with Monte Carlo results).
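Within the LPA mentioned above, the Wetterich equation reduces to a flow for the effective potential alone. For one real scalar field in d dimensions it takes the standard form quoted in the FRG literature (conventions and prefactors vary with the choice of regulator R_k):

```latex
\partial_t V_k(\varphi)
= \frac{1}{2} \int \frac{d^d q}{(2\pi)^d}\,
\frac{\partial_t R_k(q^2)}{q^2 + R_k(q^2) + V_k''(\varphi)}
```

The LPA' improvement adds a running wave-function renormalization, encoded here in the function Z(phi), from which the anomalous dimension is extracted.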
Abstract:
The European Society of Cardiology heart failure guidelines firmly recommend regular physical activity and structured exercise training (ET), but this recommendation is still poorly implemented in daily clinical practice outside specialized centres and in the real world of heart failure clinics. In reality, exercise intolerance can be successfully tackled by applying ET. We need to encourage the mindset that breathlessness may be evidence of signalling between the periphery and central haemodynamic performance and regular physical activity may ultimately bring about favourable changes in myocardial function, symptoms, functional capacity, and increased hospitalization-free life span and probably survival. In this position paper, we provide practical advice for the application of exercise in heart failure and how to overcome traditional barriers, based on the current scientific and clinical knowledge supporting the beneficial effect of this intervention.
Abstract:
Biological systems have acquired effective adaptive strategies to cope with physiological challenges and to maximize biochemical processes under imposed constraints. Striated muscle tissue demonstrates a remarkable malleability and can adjust its metabolic and contractile makeup in response to alterations in functional demands. Activity-dependent muscle plasticity therefore represents a unique model to investigate the regulatory machinery underlying phenotypic adaptations in a fully differentiated tissue. Adjustments in form and function of mammalian muscle have so far been characterized at a descriptive level, and several major themes have evolved. These imply that mechanical, metabolic and neuronal perturbations in recruited muscle groups relay to the specific processes being activated by the complex physiological stimulus of exercise. The important relationship between the phenotypic stimuli and consequent muscular modifications is reflected by coordinated differences at the transcript level that match structural and functional adjustments in the new training steady state. Permanent alterations of gene expression thus represent a major strategy for the integration of phenotypic stimuli into remodeling of muscle makeup. A unifying theory on the molecular mechanism that connects the single exercise stimulus to the multi-faceted adjustments made after the repeated impact of the muscular stress remains elusive. Recently, master switches have been recognized that sense and transduce the individual physical and chemical perturbations induced by physiological challenges via signaling cascades to downstream gene expression events. Molecular observations on signaling systems also extend the long-known evidence for desensitization of the muscle response to endurance exercise after the repeated impact of the stimulus that occurs with training. 
Integrative approaches involving the manipulation of single factors and the systematic monitoring of downstream effects at multiple levels would appear to be the ultimate method for pinpointing the mechanism of muscle remodeling. The identification of the basic relationships underlying the malleability of muscle tissue is likely to be of relevance for our understanding of compensatory processes in other tissues, species and organisms.
Abstract:
The last few years have seen the advent of high-throughput technologies to analyze various properties of the transcriptome and proteome of several organisms. The congruency of these different data sources, or lack thereof, can shed light on the mechanisms that govern cellular function. A central challenge for bioinformatics research is to develop a unified framework for combining the multiple sources of functional genomics information and testing associations between them, thus obtaining a robust and integrated view of the underlying biology. We present a graph theoretic approach to test the significance of the association between multiple disparate sources of functional genomics data by proposing two statistical tests, namely edge permutation and node label permutation tests. We demonstrate the use of the proposed tests by finding significant association between a Gene Ontology-derived "predictome" and data obtained from mRNA expression and phenotypic experiments for Saccharomyces cerevisiae. Moreover, we employ the graph theoretic framework to recast a surprising discrepancy presented in Giaever et al. (2002) between gene expression and knockout phenotype, using expression data from a different set of experiments.
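The node-label permutation test named above can be illustrated with a minimal sketch: given two graphs on the same node set, permute the node labels of one graph and compare the observed edge overlap with the permutation distribution. The graphs below are toy data, not the yeast datasets of the paper:

```python
import random

def edge_overlap(edges_a, edges_b):
    """Number of edges shared by two undirected graphs (edges as frozensets)."""
    return len(edges_a & edges_b)

def node_label_permutation_test(edges_a, edges_b, nodes, n_perm=1000, seed=0):
    """P-value for the association between two graphs: the fraction of random
    relabellings of graph B's nodes that reach at least the observed overlap."""
    rng = random.Random(seed)
    observed = edge_overlap(edges_a, edges_b)
    count = 0
    for _ in range(n_perm):
        perm = list(nodes)
        rng.shuffle(perm)
        relabel = dict(zip(nodes, perm))
        permuted = {frozenset(relabel[u] for u in e) for e in edges_b}
        if edge_overlap(edges_a, permuted) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)  # add-one (pseudo-count) correction

# Toy example: two identical 4-cycles on a 6-node set are strongly associated.
nodes = [0, 1, 2, 3, 4, 5]
cycle = {frozenset(e) for e in [(0, 1), (1, 2), (2, 3), (3, 0)]}
p = node_label_permutation_test(cycle, cycle, nodes)
print(p)  # small: random relabellings rarely reproduce all four edges
```

The edge permutation test is analogous but rewires edges instead of relabelling nodes, preserving a different null model; which null is appropriate depends on what graph property one wants to hold fixed.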
Abstract:
We establish a fundamental equivalence between singular value decomposition (SVD) and functional principal components analysis (FPCA) models. The constructive relationship allows one to deploy the numerical efficiency of SVD to fully estimate the components of FPCA, even for extremely high-dimensional functional objects, such as brain images. As an example, a functional mixed-effect model is fitted to high-resolution morphometric (RAVENS) images. The main directions of morphometric variation in brain volumes are identified and discussed.
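The equivalence invoked above can be sketched in discretized form (a schematic illustration, not the paper's derivation). For a centered data matrix X holding n functional observations sampled on p grid points, the SVD diagonalizes the sample covariance:

```latex
X = U D V^{\top}, \qquad
\hat{K} = \frac{1}{n} X^{\top} X = V \,\frac{D^{2}}{n}\, V^{\top}
```

so the FPCA eigenfunctions are (up to grid normalization) the right singular vectors v_j, the eigenvalues are d_j^2/n, and the principal component scores are the rows of UD. Computing a truncated SVD of X thus delivers all FPCA components without ever forming the p × p covariance, which is what makes the approach feasible for brain images.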