836 results for GIBBS FORMALISM
Abstract:
Program slicing is a well-known family of techniques used to identify code fragments which depend on, or are depended upon by, specific program entities. They are particularly useful in the areas of reverse engineering, program understanding, testing and software maintenance. Most slicing methods, usually oriented towards the imperative or object paradigms, are based on some sort of graph structure representing program dependencies. Slicing techniques amount, therefore, to (sophisticated) graph traversal algorithms. This paper proposes a completely different approach to the slicing problem for functional programs. Instead of extracting program information to build an underlying dependency structure, we resort to standard program calculation strategies, based on the so-called Bird-Meertens formalism. The slicing criterion is specified either as a projection or a hiding function which, once composed with the original program, leads to the identification of the intended slice. Going through a number of examples, the paper suggests this approach may be an interesting, even if not completely general, alternative for slicing functional programs.
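A minimal illustration of the idea, written in Python rather than the point-free Bird-Meertens style the paper actually calculates with, and with all names hypothetical: the slicing criterion is a projection composed with the original program, and simplifying the composition identifies the slice.

    # Hypothetical program returning a pair of results, and a slicing
    # criterion given as a projection onto the first component.
    def sum_and_length(xs):
        s, n = 0, 0
        for x in xs:
            s += x
            n += 1
        return (s, n)

    def criterion(pair):
        # projection: only the sum is of interest
        return pair[0]

    # Calculating the composition criterion . sum_and_length shows that the
    # length accumulator n is irrelevant, which identifies the slice below.
    def slice_sum(xs):
        s = 0
        for x in xs:
            s += x
        return s

    assert criterion(sum_and_length([1, 2, 3])) == slice_sum([1, 2, 3])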
Abstract:
Clone detection is well established for imperative programs. It works mostly on the statement level and is therefore ill-suited for functional programs, whose main constituents are expressions and types. In this paper we introduce clone detection for functional programs using a new intermediate program representation, dubbed Functional Control Tree. We extend clone detection to the identification of non-trivial functional program clones based on the recursion patterns from the so-called Bird-Meertens formalism.
Abstract:
In the past thirty years, a series of plans have been developed by successive Brazilian governments in a continuing effort to maximize the nation's resources for economic and social growth. This planning history has been quantitatively rich but qualitatively poor. The disjunction has stimulated Professor Mello e Souza to address himself to the problem of national planning and to offer some criticisms of Brazilian planning experience. Though political instability has obviously been a factor promoting discontinuity, his criticisms are aimed at the attitudes and strategic concepts which have sought to link planning to national goals and administration. He criticizes the fascination with techniques and plans to the exclusion of a proper diagnosis of the socio-political reality, of developing instruments to coordinate and carry out objectives, and of creating an administrative structure centralized enough to make national decisions and decentralized enough to perform on the basis of those decisions. Thus, fixed, quantified objectives abound while the problem of functioning mechanisms for the coordinated, rational use of resources has been left unattended. Although his interest and criticism are focused on the process and experience of national planning, he recognizes variation in the level and results of Brazilian planning. National plans have failed due to a faulty conception of the function of planning. Sectorial plans, save in the sector of the petroleum industry under government responsibility, have not succeeded in overcoming the problems of formulation and execution, thereby repeating old technical errors. Planning for the private sector has a somewhat brighter history due to the use of Grupos Executivos, which has enabled the planning process to transcend the formalism and tradition-bound attitudes of the regular bureaucracy. Regional planning offers two relatively successful experiences, Sudene and the strategy of the regionally oriented autarchy. Thus, planning history in Brazil is not entirely black but a certain shade of grey. The major part of the article, however, is devoted to a descriptive analysis of the national planning experience. The plans included in this analysis are: the Works and Equipment Plan (POE); the Health, Food, Transportation and Energy Plan (Salte); the Program of Goals; the Trienal Plan of Economic and Social Development; and the Plan of Governmental Economic Action (Paeg). Using these five plans for his historical experience, the author sets out a series of errors of formulation and execution by which he analyzes that experience. With respect to formulation, he speaks of a lack of elaboration of programs and projects, of coordination among diverse goals, and of provision of qualified staff and techniques. He mentions the absence of a definition of the resources necessary to finance the plan and the inadequate quantification of sectorial and national goals due to the lack of reliable statistical information. Finally, he notes the failure to coordinate the annual budget with the multi-year plans. He sees the problems of execution as beginning in the absence of coordination between the various sectors of the public administration, the failure to develop an operative system of decentralization, the absence of any system of financial and fiscal control over execution, the difficulties imposed by the system of public accounting, and the absence of an adequate program of allocation for the liberation of resources.
He ends by pointing to the failure to develop and use an integrated system of political-economic tools in a mode compatible with the objectives of the plans. The body of the article analyzes national planning experience in Brazil using these lists of errors as a rough model of criticism. Several conclusions emerge from this analysis with regard to planning in Brazil and in developing countries in general. Plans have generally been of little avail in Brazil because of the lack of a continuous, bureaucratized (in the Weberian sense) planning organization set in an instrumentally suitable administrative structure and based on thorough diagnoses of socio-economic conditions and problems. Plans have become the justification for planning. Planning has come to be conceived as a rational method of orienting the process of decisions through the establishment of a precise and quantified relation between means and ends. But this conception has led to a planning history rimmed with frustration and failure, because of its rigidity in the face of a flexible and changing reality. Rather, he suggests a conception of planning which understands it "as a rational process of formulating decisions about the policy, economy, and society whose only demand is that of managing the instrumentarium in a harmonious and integrated form in order to reach explicit, but not quantified, ends". He calls this "planning without plans": the establishment of broad-scale tendencies through diagnosis, whose implementation is carried out through an adjustable, coherent instrumentarium of political-economic tools. Administration according to a plan of multiple, integrated goals is a sound procedure if the nation's administrative machinery contains the technical development needed to control the multiple variables linked to any situation of socio-economic change. Brazil does not possess this level of refinement, and any strategy of planning relevant to its problems must recognize this. The reforms which have been attempted fail to make this recognition, as is true of the conception of planning informing the Brazilian experience. Therefore, unworkable, ill-diagnosed plans with little or no supporting instrumentarium or flexibility have been Brazil's legacy. This legacy seems likely to continue until the conception of planning comes to live in the reality of Brazil.
Abstract:
Fluorescent protein microscopy imaging is nowadays one of the most important tools in biomedical research. However, the resulting images present a low signal-to-noise ratio and a time intensity decay due to the photobleaching effect. This phenomenon is a consequence of the decrease in the radiation emission efficiency of the tagging protein, which occurs because the fluorophore permanently loses its ability to fluoresce, due to photochemical reactions induced by the incident light. The Poisson multiplicative noise that corrupts these images, together with the quality degradation caused by photobleaching, makes long-term biological observation very difficult. In this paper a denoising algorithm for Poisson data, where the photobleaching effect is explicitly taken into account, is described. The algorithm is designed in a Bayesian framework where the data fidelity term models the Poisson noise generation process as well as the exponential intensity decay caused by photobleaching. The prior term is built from Gibbs priors and log-Euclidean potential functions, suited to the positivity-constrained nature of the parameters to be estimated. Monte Carlo tests with synthetic data are presented to characterize the performance of the algorithm. One example with real data is included to illustrate its application.
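As a rough sketch only (the paper's exact data model, prior neighbourhood structure and optimizer are not reproduced here, and all names below are hypothetical), the kind of objective involved combines a Poisson log-likelihood, an exponential intensity decay and a quadratic Gibbs-type potential on log-intensities, optimized in log-space so positivity is automatic:

    import numpy as np

    def neg_log_posterior(log_x, y, t, lam, alpha):
        """Illustrative MAP objective: y[k] are observed counts at acquisition
        times t[k], lam is an assumed photobleaching decay rate, alpha weighs
        a log-Euclidean smoothness prior on the underlying intensities."""
        x = np.exp(log_x)                          # positivity by construction
        mu = np.outer(np.exp(-lam * t), x)         # expected counts after decay
        fidelity = np.sum(mu - y * np.log(mu + 1e-12))   # Poisson fidelity term
        prior = alpha * np.sum(np.diff(log_x) ** 2)      # Gibbs prior on log-intensities
        return fidelity + prior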
Abstract:
The two-Higgs-doublet model can be constrained by imposing Higgs-family symmetries and/or generalized CP symmetries. It is known that there are only six independent classes of such symmetry-constrained models. We study the CP properties of all cases in the bilinear formalism. An exact symmetry implies CP conservation. We show that soft breaking of the symmetry can lead to spontaneous CP violation (CPV) in three of the classes.
Abstract:
Deoxyribonucleic acid, or DNA, is the most fundamental aspect of life, but present-day scientific knowledge has merely scratched the surface of the problem posed by its decoding. While experimental methods provide insightful clues, the adoption of analysis tools supported by the formalism of mathematics will lead to a systematic and solid build-up of knowledge. This paper studies human DNA from the perspective of system dynamics. By associating entropy and the Fourier transform, several global properties of the code are revealed. The fractional-order characteristics emerge as a natural consequence of the information content. These properties constitute a small piece of scientific knowledge that will support further efforts towards the final aim of establishing a comprehensive theory of the phenomena involved in life.
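A toy sketch of the kind of analysis described; the encoding, windowing and estimators actually used in the paper are not specified here, so everything below is an assumed illustration rather than the authors' method:

    import numpy as np

    encoding = {"A": 0, "C": 1, "G": 2, "T": 3}   # hypothetical numeric mapping

    def shannon_entropy(seq):
        """Shannon entropy (bits) of the nucleotide distribution."""
        counts = np.bincount([encoding[b] for b in seq], minlength=4)
        p = counts / counts.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def spectral_exponent(seq):
        """Fit |F(k)|^2 ~ 1/k^alpha to the power spectrum of the numeric signal;
        a non-integer alpha is the kind of fractional-order signature alluded to."""
        x = np.array([encoding[b] for b in seq], dtype=float)
        x -= x.mean()
        power = np.abs(np.fft.rfft(x)) ** 2
        k = np.arange(1, len(power))
        slope, _ = np.polyfit(np.log(k), np.log(power[1:] + 1e-12), 1)
        return -slope

    seq = "ACGT" * 256   # toy sequence; real analyses use chromosome-scale data
    print(shannon_entropy(seq), spectral_exponent(seq))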
Abstract:
This paper presents an algorithm to efficiently generate the state-space of systems specified using the IOPT Petri-net modeling formalism. IOPT nets are a non-autonomous Petri-net class, based on Place-Transition nets with an extended set of features designed to allow the rapid prototyping and synthesis of system controllers through an existing hardware-software co-design framework. To obtain coherent and deterministic operation, IOPT nets use a maximal-step execution semantics where, in a single execution step, all enabled transitions fire simultaneously. This fact increases the resulting state-space complexity and can cause an arc "explosion" effect: real-world applications with several million states can reach a number of arcs an order of magnitude higher, leading to the need for high-performance state-space generator algorithms. The proposed algorithm applies a compilation approach, reading a PNML file containing one IOPT model and automatically generating an optimized C program to calculate the corresponding state-space.
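A minimal sketch of maximal-step successor computation for a plain Place/Transition net; conflict resolution, priorities and the input/output signals that make IOPT nets non-autonomous (and that multiply the outgoing arcs per state) are deliberately left out, and all names are illustrative:

    def enabled(marking, pre):
        """Transitions whose input places hold enough tokens."""
        return [t for t, needs in pre.items()
                if all(marking.get(p, 0) >= n for p, n in needs.items())]

    def maximal_step(marking, pre, post):
        """Fire every enabled transition simultaneously (assumes no conflicts)."""
        new = dict(marking)
        for t in enabled(marking, pre):
            for p, n in pre[t].items():
                new[p] = new.get(p, 0) - n
            for p, n in post[t].items():
                new[p] = new.get(p, 0) + n
        return new

    def state_space(m0, pre, post, limit=10**6):
        """Iterate maximal steps from the initial marking, recording states and arcs."""
        seen, arcs, m = [m0], [], m0
        for _ in range(limit):
            m2 = maximal_step(m, pre, post)
            arcs.append((m, m2))
            if m2 in seen:
                break
            seen.append(m2)
            m = m2
        return seen, arcs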
Abstract:
The solubility of ethene in water and in the fermentation medium of Xanthobacter Py2 was determined with a Ben-Naim-Baer type apparatus. The solubility measurements were carried out in the temperature range of (293.15 to 323.15) K and at atmospheric pressure, with a precision of about +/- 0.3 %. The Ostwald coefficients, the mole fractions of dissolved ethene at a gas partial pressure of 101.325 kPa, and the Henry coefficients at the water vapor pressure were calculated using accurate thermodynamic relations. A comparison between the solubility of ethene in water and in the cultivation medium has shown that this gas is about 2.4 % more soluble in pure water. On the other hand, from the temperature dependence of the solubility, the Gibbs energy, enthalpy, and entropy changes for the process of transferring the solute from the gaseous phase to the liquid solutions were also determined. Moreover, the perturbed-chain statistical associating fluid theory equation of state (PC-SAFT EOS) model was used to predict the solubility of ethene in water. New binary interaction parameters, k_ij, are proposed for this system, and it was found that a temperature-dependent k_ij allows the PC-SAFT EOS to describe the solubility of ethene in water at 101.325 kPa more accurately, improving the deviations to 1 %.
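For reference, under one common convention (solute transferred from the gas phase at 101.325 kPa to the solution at equilibrium mole fraction x_2), the thermodynamic functions of solvation follow from the temperature dependence of the solubility roughly as

    \Delta G^{0} = -RT \ln x_{2}, \qquad
    \Delta H^{0} = -R \, \frac{\partial \ln x_{2}}{\partial (1/T)}, \qquad
    \Delta S^{0} = \frac{\Delta H^{0} - \Delta G^{0}}{T}

These are the standard textbook relations, not necessarily the exact working equations of the paper.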
Abstract:
In the two-Higgs-doublet model (THDM), generalized-CP transformations (phi_i -> X_ij phi_j*, where X is unitary) and unitary Higgs-family transformations (phi_i -> U_ij phi_j) have recently been examined in a series of papers. In terms of gauge-invariant bilinear functions of the Higgs fields phi_i, the Higgs-family transformations and the generalized-CP transformations possess a simple geometric description. Namely, these transformations correspond, in the space of scalar-field bilinears, to proper and improper rotations, respectively. In this formalism, recent results relating generalized-CP transformations with Higgs-family transformations have a clear geometric interpretation. We will review what is known regarding THDM symmetries, as well as derive new results concerning those symmetries, namely how they can be interpreted geometrically as applications of several CP transformations.
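Concretely, the gauge-invariant bilinears usually employed are (a standard choice of normalization; the papers under review may differ by factors of 2)

    K_0 = \phi_1^{\dagger}\phi_1 + \phi_2^{\dagger}\phi_2, \quad
    K_1 = 2\,\mathrm{Re}\,(\phi_1^{\dagger}\phi_2), \quad
    K_2 = 2\,\mathrm{Im}\,(\phi_1^{\dagger}\phi_2), \quad
    K_3 = \phi_1^{\dagger}\phi_1 - \phi_2^{\dagger}\phi_2

A Higgs-family transformation phi_i -> U_ij phi_j then acts on (K_1, K_2, K_3) as a proper rotation (det = +1), while a generalized-CP transformation phi_i -> X_ij phi_j* acts as an improper rotation (det = -1), with K_0 invariant in both cases.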
Abstract:
Trends between Hammett's sigma_p and the related normal sigma_p^n, inductive sigma_I, resonance sigma_R, negative sigma_p^- and positive sigma_p^+ polar conjugation and Taft's sigma_p^o substituent constants and the N-H···O distance, the delta(N-H) NMR chemical shift, the oxidation potential (E_p/2^ox, measured in this study by cyclic voltammetry (CV)) and the thermodynamic parameters (pK, Delta G^0, Delta H^0 and Delta S^0) of the dissociation process of unsubstituted 3-(phenylhydrazo)pentane-2,4-dione (HL1) and its para-substituted chloro (HL2), carboxy (HL3), fluoro (HL4) and nitro (HL5) derivatives were examined. The best fits were found for sigma_p and/or sigma_p^- in the cases of d(N···O), delta(N-H) and E_p/2^ox, showing the importance of resonance and conjugation effects in these properties, whereas for the above thermodynamic properties the inductive effects (sigma_I) are dominant. HL2 exists in the hydrazo form in DMSO solution and in the solid state and contains an intramolecular H-bond with an N···O distance of 2.588(3) Å. It was also established that the dissociation of HL1-5 is non-spontaneous, endothermic and entropically unfavourable, and that an increase in the inductive effect (sigma_I) of the para-substituents (-H < -Cl < -COOH < -F < -NO2) leads to a corresponding growth of the N···O distance and a decrease of the pK and of the changes of Gibbs free energy, enthalpy and entropy for the HL1-5 acid dissociation process. The electrochemical behaviour of HL1-5 was interpreted using theoretical calculations at the DFT/HF hybrid level, namely in terms of HOMO and LUMO compositions, and of the reactivities induced by anodic and cathodic electron transfers.
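The thermodynamic quantities quoted for the dissociation follow from pK and its temperature dependence through the usual relations (standard textbook forms, not taken from the paper):

    \Delta G^{0} = RT \ln 10 \cdot \mathrm{p}K = -RT\ln K_a, \qquad
    \frac{\partial \ln K_a}{\partial (1/T)} = -\frac{\Delta H^{0}}{R}, \qquad
    \Delta S^{0} = \frac{\Delta H^{0} - \Delta G^{0}}{T}

A positive Delta G^0 (non-spontaneous), positive Delta H^0 (endothermic) and negative Delta S^0 are then consistent with the qualitative description above.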
Abstract:
The purpose of this paper was to introduce the symbolic formalism based on kneading theory, which allows us to study the renormalization of non-autonomous periodic dynamical systems.
Abstract:
OBJECTIVE: To develop a statistical model based on Bayesian methods to estimate the risk of tuberculosis infection in studies with losses to follow-up, comparing it with a classical deterministic model. METHODS: The proposed stochastic model is based on a Gibbs sampling algorithm that uses the information on losses to follow-up at the end of a longitudinal study. A latent variable was introduced into the new model to represent the unknown number of individuals who were reactors at the end of the study and lost to follow-up, but non-reactors at baseline. An application exercise of both models is presented to compare the resulting estimates. RESULTS: The point estimates provided by the two models are close, but the Bayesian model has the advantage of providing credibility intervals as measures of the sampling variability of the estimated parameters. CONCLUSIONS: The Bayesian model can be useful in longitudinal studies with low adherence to follow-up.
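A heavily simplified sketch of the kind of Gibbs sampler involved, assuming a single infection-risk parameter p with a Beta prior and a latent count of converters among those lost to follow-up; the paper's actual model and priors are not reproduced here, and all names are illustrative:

    import numpy as np

    def gibbs_tb_risk(x, n, m, a=1.0, b=1.0, iters=5000, seed=0):
        """x converters observed among n subjects followed to the end; m subjects
        lost to follow-up with unknown status, handled through a latent count z."""
        rng = np.random.default_rng(seed)
        p, draws = 0.5, []
        for _ in range(iters):
            z = rng.binomial(m, p)                              # latent converters among the lost
            p = rng.beta(a + x + z, b + (n - x) + (m - z))      # conditional posterior of p
            draws.append(p)
        return np.array(draws)

    # e.g. 30 converters among 400 followed, 100 lost to follow-up
    samples = gibbs_tb_risk(x=30, n=400, m=100)
    print(samples.mean(), np.percentile(samples, [2.5, 97.5]))  # point estimate and credibility interval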
Abstract:
The interaction between two disks immersed in a 2D nematic is investigated (i) analytically, using the tensor order parameter formalism for the nematic configuration around isolated disks, and (ii) numerically, using finite-element methods with adaptive meshing to minimize the corresponding Landau-de Gennes free energy. For strong homeotropic anchoring, each disk generates a pair of defects with one-half topological charge, responsible for the 2D quadrupolar interaction between the disks at large distances. At short distances, the position of the defects may change, leading to unexpectedly complex interactions, with the repulsive quadrupolar interactions becoming attractive. This short-range attraction, although present in all directions, is still anisotropic. As the distance between the disks decreases, their preferred relative orientation with respect to the far-field nematic director changes from oblique to perpendicular.
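For orientation, a commonly used form of the two-dimensional Landau-de Gennes free energy being minimized is (one-constant elastic approximation; anchoring terms on the disk boundaries omitted; A < 0 < C in the nematic phase)

    F[\mathbf{Q}] = \int \left[ \frac{L}{2}\,\partial_k Q_{ij}\,\partial_k Q_{ij}
      + \frac{A}{2}\,Q_{ij}Q_{ij} + \frac{C}{4}\,(Q_{ij}Q_{ij})^{2} \right] d^{2}r

In 2D the cubic invariant tr(Q^3) vanishes for a traceless symmetric Q, so only even powers of Q appear in the bulk potential.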