921 results for visualisation formalism


Relevância: 10.00%

Resumo:

In the past thirty years, a series of plans have been developed by successive Brazilian governments in a continuing effort to maximize the nation's resources for economic and social growth. This planning history has been quantitatively rich but qualitatively poor. The disjunction has stimulated Professor Mello e Souza to address himself to the problem of national planning and to offer some criticisms of Brazilian planning experience. Though political instability has obviously been a factor promoting discontinuity, his criticisms are aimed at the attitudes and strategic concepts which have sought to link planning to national goals and administration. He criticizes the fascination with techniques and plans to the exclusion of proper diagnosis of the socio-political reality, of developing instruments to coordinate and carry out objectives, and of creating an administrative structure centralized enough to make national decisions and decentralized enough to perform on the basis of those decisions. Thus, fixed, quantified objectives abound while the problem of functioning mechanisms for the coordinated, rational use of resources has been left unattended. Although his interest and criticism are focused on the process and experience of national planning, he recognizes variation in the level and results of Brazilian planning. National plans have failed due to a faulty conception of the function of planning. Sectorial plans, save in the petroleum sector under government responsibility, have not succeeded in overcoming the problems of formulation and execution, thereby repeating old technical errors. Planning for the private sector has a somewhat brighter history due to the use of Grupos Executivos, which has enabled the planning process to transcend the formalism and tradition-bound attitudes of the regular bureaucracy. Regional planning offers two relatively successful experiences, Sudene and the strategy of the regionally oriented autarchy. 
Thus, planning history in Brazil is not entirely black but a certain shade of grey. The major part of the article, however, is devoted to a descriptive analysis of the national planning experience. The plans included in this analysis are: the Works and Equipment Plan (POE); the Health, Food, Transportation and Energy Plan (Salte); the Program of Goals; the Trienal Plan of Economic and Social Development; and the Plan of Governmental Economic Action (Paeg). Using these five plans as his historical experience, the author sets out a series of errors of formulation and execution by which he analyzes that experience. With respect to formulation, he speaks of a lack of elaboration of programs and projects, of coordination among diverse goals, and of provision of qualified staff and techniques. He mentions the absence of a definition of the resources necessary to finance the plan and the inadequate quantification of sectorial and national goals due to the lack of reliable statistical information. Finally, he notes the failure to coordinate the annual budget with the multi-year plans. He sees the problems of execution as beginning in the absence of coordination between the various sectors of the public administration, the failure to develop an operative system of decentralization, the absence of any system of financial and fiscal control over execution, the difficulties imposed by the system of public accounting, and the absence of an adequate program of allocation for the liberation of resources. He ends by pointing to the failure to develop and use an integrated system of political-economic tools in a mode compatible with the objectives of the plans. The body of the article analyzes the national planning experience in Brazil using these lists of errors as a rough model of criticism. Several conclusions emerge from this analysis with regard to planning in Brazil and in developing countries in general. 
Plans have generally been of little avail in Brazil because of the lack of a continuous, bureaucratized (in the Weberian sense) planning organization set in an instrumentally suitable administrative structure and based on thorough diagnoses of socio-economic conditions and problems. Plans have become the justification for planning. Planning has come to be conceived as a rational method of orienting the process of decisions through the establishment of a precise and quantified relation between means and ends. But this conception has led to a planning history rimmed with frustration and failure because of its rigidity in the face of a flexible and changing reality. Rather, he suggests a conception of planning which understands it "as a rational process of formulating decisions about the policy, economy, and society whose only demand is that of managing the instrumentarium in a harmonious and integrated form in order to reach explicit, but not quantified ends". He calls this "planning without plans": the establishment of broad-scale tendencies through diagnosis, whose implementation is carried out through an adjustable, coherent instrumentarium of political-economic tools. Administration according to a plan of multiple, integrated goals is a sound procedure if the nation's administrative machinery contains the technical development needed to control the multiple variables linked to any situation of socio-economic change. Brazil does not possess this level of refinement, and any strategy of planning relevant to its problems must recognize this. The reforms which have been attempted fail to make this recognition, as is true of the conception of planning informing the Brazilian experience. Therefore, unworkable plans, ill-diagnosed, with little or no supportive instrumentarium or flexibility, have been Brazil's legacy. This legacy seems likely to continue until the conception of planning comes to live in the reality of Brazil.

Relevância: 10.00%

Resumo:

The two-Higgs-doublet model can be constrained by imposing Higgs-family symmetries and/or generalized CP symmetries. It is known that there are only six independent classes of such symmetry-constrained models. We study the CP properties of all cases in the bilinear formalism. An exact symmetry implies CP conservation. We show that soft breaking of the symmetry can lead to spontaneous CP violation (CPV) in three of the classes.

Relevância: 10.00%

Resumo:

Deoxyribonucleic acid, or DNA, is the most fundamental aspect of life, but present-day scientific knowledge has merely scratched the surface of the problem posed by its decoding. While experimental methods provide insightful clues, the adoption of analysis tools supported by the formalism of mathematics will lead to a systematic and solid build-up of knowledge. This paper studies human DNA from the perspective of system dynamics. By associating entropy and the Fourier transform, several global properties of the code are revealed. The fractional-order characteristics emerge as a natural consequence of the information content. These properties constitute a small piece of scientific knowledge that will support further efforts towards the final aim of establishing a comprehensive theory of the phenomena involved in life.
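A minimal sketch of the kind of analysis the abstract pairs together: the Shannon entropy of a nucleotide string, plus the Fourier spectrum of a numeric mapping of the sequence. The mapping values and the naive DFT are illustrative assumptions, not the paper's actual method.

```python
# Illustrative sketch (not the paper's code): map a DNA string to a
# numeric signal, then combine Shannon entropy with a Fourier transform.
import math
from collections import Counter

def shannon_entropy(seq):
    """Shannon entropy (bits/symbol) of a nucleotide string."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def dft_magnitudes(signal):
    """Naive DFT magnitude spectrum of a real-valued signal."""
    n = len(signal)
    mags = []
    for k in range(n):
        re = sum(x * math.cos(2 * math.pi * k * t / n) for t, x in enumerate(signal))
        im = -sum(x * math.sin(2 * math.pi * k * t / n) for t, x in enumerate(signal))
        mags.append(math.hypot(re, im))
    return mags

# Hypothetical numeric mapping of the four bases (an assumption here).
MAPPING = {"A": 1.0, "C": -1.0, "G": 2.0, "T": -2.0}

dna = "ACGTACGTAACCGGTT"
signal = [MAPPING[b] for b in dna]
print(shannon_entropy(dna))        # 2.0 (all four bases equally frequent)
print(dft_magnitudes(signal)[:3])
```

Real analyses would use an FFT (e.g. `numpy.fft`) and windowed entropy over long sequences; the naive DFT above just keeps the sketch self-contained.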

Relevância: 10.00%

Resumo:

This paper presents an algorithm to efficiently generate the state-space of systems specified using the IOPT Petri-net modeling formalism. IOPT nets are a non-autonomous Petri-net class based on Place-Transition nets, with an extended set of features designed to allow the rapid prototyping and synthesis of system controllers through an existing hardware-software co-design framework. To obtain coherent and deterministic operation, IOPT nets use maximal-step execution semantics where, in a single execution step, all enabled transitions fire simultaneously. This fact increases the resulting state-space complexity and can cause an arc "explosion" effect. Real-world applications with several million states can reach an order of magnitude more arcs, creating the need for high-performance state-space generation algorithms. The proposed algorithm applies a compilation approach: it reads a PNML file containing one IOPT model and automatically generates an optimized C program to calculate the corresponding state-space.
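The maximal-step semantics described above can be sketched as follows. This is a hypothetical illustration over plain Place-Transition nets (no I/O signals or guards, and assuming no conflicts between enabled transitions), not the compiled C generator the paper presents.

```python
# Sketch: breadth-first state-space generation under maximal-step
# semantics, where every enabled transition fires in the same step.
from collections import deque

def enabled(marking, pre):
    """Transitions whose input places all hold enough tokens."""
    return [t for t, needs in pre.items()
            if all(marking.get(p, 0) >= n for p, n in needs.items())]

def fire_maximal_step(marking, pre, post):
    """Fire every enabled transition at once (assumes no token conflicts)."""
    m = dict(marking)
    step = enabled(marking, pre)
    for t in step:
        for p, n in pre[t].items():
            m[p] -= n
        for p, n in post[t].items():
            m[p] = m.get(p, 0) + n
    return tuple(sorted(m.items())), tuple(step)

def state_space(m0, pre, post):
    """BFS over reachable markings; returns (states, arcs)."""
    start = tuple(sorted(m0.items()))
    seen, arcs = {start}, []
    queue = deque([start])
    while queue:
        m = queue.popleft()
        nxt, step = fire_maximal_step(dict(m), pre, post)
        if not step:
            continue  # dead marking: no enabled transitions
        arcs.append((m, step, nxt))
        if nxt not in seen:
            seen.add(nxt)
            queue.append(nxt)
    return seen, arcs

# Tiny demo: one transition moving a token from p1 to p2.
pre = {"t1": {"p1": 1}}
post = {"t1": {"p2": 1}}
states, arcs = state_space({"p1": 1, "p2": 0}, pre, post)
print(len(states), len(arcs))  # 2 1
```

Under true maximal-step semantics with conflicting transitions, a marking can have several maximal steps; the sketch sidesteps that by assuming conflict-free nets, which is where the "single successor per state" determinism mentioned in the abstract comes from.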

Relevância: 10.00%

Resumo:

In the two-Higgs-doublet model (THDM), generalized-CP transformations (phi_i -> X_ij phi_j^*, where X is unitary) and unitary Higgs-family transformations (phi_i -> U_ij phi_j) have recently been examined in a series of papers. In terms of gauge-invariant bilinear functions of the Higgs fields phi_i, the Higgs-family transformations and the generalized-CP transformations possess a simple geometric description. Namely, these transformations correspond in the space of scalar-field bilinears to proper and improper rotations, respectively. In this formalism, recent results relating generalized-CP transformations with Higgs-family transformations have a clear geometric interpretation. We will review what is known regarding THDM symmetries, as well as derive new results concerning those symmetries, namely how they can be interpreted geometrically as applications of several CP transformations.
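The geometric statement can be made concrete with the conventional bilinear parametrization; the notation below is the standard one in the THDM literature, assumed here rather than quoted from the paper.

```latex
% Gauge-invariant bilinears built from the two doublets
% (\sigma_a are the Pauli matrices):
r_0 = \phi_i^\dagger \phi_i , \qquad
r_a = \phi_i^\dagger \,(\sigma_a)_{ij}\, \phi_j \quad (a = 1,2,3).

% A Higgs-family transformation acts on (r_1, r_2, r_3) as a proper
% rotation; a generalized-CP transformation as an improper one:
\phi_i \to U_{ij}\phi_j
  \;\Rightarrow\; r_a \to R_{ab}(U)\, r_b , \quad \det R = +1 ,
\qquad
\phi_i \to X_{ij}\phi_j^{*}
  \;\Rightarrow\; r_a \to \bar{R}_{ab}\, r_b , \quad \det \bar{R} = -1 .
```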

Relevância: 10.00%

Resumo:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevância: 10.00%

Resumo:

The purpose of this paper is to introduce a symbolic formalism based on kneading theory, which allows us to study the renormalization of non-autonomous periodic dynamical systems.

Relevância: 10.00%

Resumo:

The interaction between two disks immersed in a 2D nematic is investigated (i) analytically, using the tensor order parameter formalism for the nematic configuration around isolated disks, and (ii) numerically, using finite-element methods with adaptive meshing to minimize the corresponding Landau-de Gennes free energy. For strong homeotropic anchoring, each disk generates a pair of defects with one-half topological charge, responsible for the 2D quadrupolar interaction between the disks at large distances. At short distance, the position of the defects may change, leading to unexpected complex interactions, with the quadrupolar repulsive interactions becoming attractive. This short-range attraction, present in all directions, is nevertheless anisotropic. As the distance between the disks decreases, their preferred relative orientation with respect to the far-field nematic director changes from oblique to perpendicular.

Relevância: 10.00%

Resumo:

In the early 21st century, with the phenomenon of digital convergence, the consecration of Web 2.0, the falling cost of cameras and video recorders, and the proliferation of mobile phones, laptops and wireless technologies, we are witnessing the rise of a new wave of media of an informal, personal and at times "minority" nature, facilitating social networks, a culture of fans, of sharing and remix. As digital networks become fully and deeply intricate in our experience, the idea of "participation" arises as one of the most complex and controversial themes of contemporary critical discourse, namely in what concerns contemporary art and new media art. However, the idea of "participation" as a practice or postulate traverses 20th-century art, playing an essential role in its self-critique, in questioning the concept of author, and in the dilution of the frontiers between art, "life" and society, emphasizing the process, the everyday and a sense of community. As such, questioning new media art in light of a "participatory art" (Frieling, 2008) invokes a double gaze, simultaneously attentive to the emerging figures of a "participatory aesthetics" in digital arts and to the genealogy in which it is included. In fact, relating new media art to the complex and paradoxical phenomenon of "participation" allows us, on the one hand, to avoid "digital formalism" (Lovink, 2008) and analyse the relations between digital art and contemporary social movements; on the other hand, this angle of analysis contributes to reinforcing the dialogue and the links between digital art and contemporary art, questioning the alleged frontiers that separate them.

Relevância: 10.00%

Resumo:

Coronary artery disease (CAD) is currently one of the most prevalent diseases in the world population, and calcium deposits in the coronary arteries are a direct risk factor. These can be assessed by the calcium score (CS) application, available via a computed tomography (CT) scan, which gives an accurate indication of the development of the disease. However, the ionising radiation applied to patients is high. This study aimed to optimise the acquisition protocol in order to reduce the radiation dose, and to explain the flow of procedures to quantify CAD. The main differences in the clinical results when automated or semi-automated post-processing is used will be shown, and the epidemiology, imaging, risk factors and prognosis of the disease described. The software steps and the values that allow the risk of developing CAD to be predicted will be presented. A 64-row multidetector CT scanner with dual source and two phantoms (pig hearts) were used to demonstrate the advantages and disadvantages of the Agatston method. The tube energy was balanced. Two measurements were obtained in each of the three experimental protocols (64, 128, 256 mAs). Considerable changes appeared in the CS values as the protocol varied. The predefined standard protocol provided the lowest dose of radiation (0.43 mGy). This study found that the variation in the radiation dose between protocols, taking into consideration the dose control systems attached to the CT equipment and image quality, was not sufficient to justify changing the default protocol provided by the manufacturer.
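The Agatston method referred to above scores each calcified lesion by its area times a weight determined by its peak attenuation in Hounsfield units (HU). A minimal sketch of the standard scoring rule follows; the lesion data in the demo are invented, not measurements from the study.

```python
# Sketch of the standard Agatston scoring rule: each lesion contributes
# area (mm^2) x density weight, where the weight depends on peak HU.
def density_weight(peak_hu):
    """Standard Agatston weight for a lesion's peak attenuation."""
    if peak_hu < 130:
        return 0  # below the calcium detection threshold
    if peak_hu < 200:
        return 1
    if peak_hu < 300:
        return 2
    if peak_hu < 400:
        return 3
    return 4

def agatston_score(lesions):
    """Total score for an iterable of (area_mm2, peak_hu) pairs."""
    return sum(area * density_weight(hu) for area, hu in lesions)

# Hypothetical lesions from one slice.
print(agatston_score([(4.0, 150), (2.5, 420)]))  # 4*1 + 2.5*4 = 14.0
```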

Relevância: 10.00%

Resumo:

Aim - To use Monte Carlo (MC) together with voxel phantoms to analyze the tissue heterogeneity effect in the dose distributions and equivalent uniform dose (EUD) for (125)I prostate implants. Background - Dose distribution calculations in low dose-rate brachytherapy are based on the dose deposition around a single source in a water phantom. This formalism does not take into account tissue heterogeneities, interseed attenuation, or finite patient dimensions effects. Tissue composition is especially important due to the photoelectric effect. Materials and Methods - The computed tomographies (CT) of two patients with prostate cancer were used to create voxel phantoms for the MC simulations. An elemental composition and density were assigned to each structure. Densities of the prostate, vesicles, rectum and bladder were determined through the CT electronic densities of 100 patients. The same simulations were performed considering the same phantom as pure water. Results were compared via dose-volume histograms and EUD for the prostate and rectum. Results - The mean absorbed doses presented deviations of 3.3-4.0% for the prostate and of 2.3-4.9% for the rectum, when comparing calculations in water with calculations in the heterogeneous phantom. In the calculations in water, the prostate D90 was overestimated by 2.8-3.9% and the rectum D0.1cc resulted in dose differences of 6-8%. The EUD resulted in an overestimation of 3.5-3.7% for the prostate and of 7.7-8.3% for the rectum. Conclusions - The deposited dose was consistently overestimated for the simulation in water. In order to increase the accuracy in the determination of dose distributions, especially around the rectum, the introduction of model-based algorithms is recommended.
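The EUD compared above is commonly computed as the generalized EUD (gEUD); a minimal sketch, with an illustrative volume-effect parameter `a` that is an assumption here, not a value from the paper.

```python
# Sketch of the generalized equivalent uniform dose:
#   gEUD = (sum_i v_i * D_i**a) ** (1/a),
# where v_i are fractional volumes and a is the volume-effect parameter
# (large positive for serial organs such as the rectum).
def geud(doses, volumes, a):
    """gEUD over paired dose bins (Gy) and their volumes."""
    total = sum(volumes)
    return sum((v / total) * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)

# Sanity check: for a uniform dose, gEUD equals that dose for any a.
print(geud([60.0, 60.0], [1.0, 1.0], a=8))  # ~60.0
```

A cold spot pulls the gEUD of a serial structure well below the mean dose, which is why the abstract reports EUD alongside plain dose-volume metrics.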

Relevância: 10.00%

Resumo:

The MCNPX code was used to calculate the TG-43U1 recommended parameters in water and in prostate tissue, in order to quantify the dosimetric impact in 30 patients treated with (125)I prostate implants when replacing the TG-43U1 formalism parameters calculated in water by a prostate-like medium in the planning system (PS), and to evaluate the uncertainties associated with Monte Carlo (MC) calculations. The prostate density was obtained from the CT of 100 patients with prostate cancer. The deviations between our results for water and the TG-43U1 consensus dataset values were -2.6% for prostate V100, -13.0% for V150, and -5.8% for D90; -2.0% for rectum V100, and -5.1% for D0.1; -5.0% for urethra D10, and -5.1% for D30. The differences between our water and prostate results were all under 0.3%. Uncertainty estimates were up to 2.9% for the g_L(r) function, 13.4% for the F(r,θ) function and 7.0% for Λ, mainly due to seed geometry uncertainties. Uncertainties in extracting the TG-43U1 parameters in the MC simulations, as well as in the literature comparison, are of the same order of magnitude as the differences between dose distributions computed for water and for a prostate-like medium. The selection of the parameters for the PS should be done carefully, as it may considerably affect the dose distributions. The seeds' internal geometry uncertainties are a major limiting factor in the MC parameter deduction.
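For reference, the parameters Λ, g_L(r) and F(r,θ) discussed above enter the standard TG-43U1 line-source dose-rate equation:

```latex
\dot{D}(r,\theta) = S_K \,\Lambda\,
  \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\; g_L(r)\, F(r,\theta),
\qquad r_0 = 1~\mathrm{cm}, \quad \theta_0 = 90^\circ,
```

where S_K is the air-kerma strength, Λ the dose-rate constant, G_L the line-source geometry function, g_L the radial dose function, and F the 2D anisotropy function. Since every factor is tabulated per seed model, a systematic shift in any of them propagates directly into the planning-system dose distributions, which is the sensitivity the abstract quantifies.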

Relevância: 10.00%

Resumo:

The basic motivation of this work was the integration of biophysical models within the interval constraints framework for decision support. Comparing the major features of biophysical models with the expressive power of the existing interval constraints framework, it was clear that the most important inadequacy was related to the representation of differential equations. System dynamics is often modelled through differential equations, but there was no way of expressing a differential equation as a constraint and integrating it within the constraints framework. Consequently, the goal of this work is focused on the integration of ordinary differential equations within the interval constraints framework, which for this purpose is extended with the new formalism of Constraint Satisfaction Differential Problems. Such a framework allows the specification of ordinary differential equations, together with related information, by means of constraints, and provides efficient propagation techniques for pruning the domains of their variables. This enabled the integration of all such information in a single constraint whose variables may subsequently be used in other constraints of the model. The specific method used for pruning its variable domains can then be combined with the pruning methods associated with the other constraints in an overall propagation algorithm for reducing the bounds of all model variables. The application of the constraint propagation algorithm for pruning the variable domains, that is, the enforcement of local consistency, turned out to be insufficient to support decisions in practical problems that include differential equations. The domain pruning achieved is not, in general, sufficient to allow safe decisions, and the main reason derives from the non-linearity of the differential equations. 
Consequently, a complementary goal of this work proposes a new strong consistency criterion, Global Hull-consistency, particularly suited to decision support with differential models, by presenting an adequate trade-off between domain pruning and computational effort. Several alternative algorithms are proposed for enforcing Global Hull-consistency and, due to their complexity, an effort was made to provide implementations able to supply any-time pruning results. Since the consistency criterion depends on the existence of canonical solutions, a local search approach is proposed that can be integrated with constraint propagation in continuous domains and, in particular, with the enforcement algorithms, to anticipate the finding of canonical solutions. The last goal of this work is the validation of the approach as an important contribution to the integration of biophysical models within decision support. Consequently, a prototype application that integrates all the proposed extensions to the interval constraints framework is developed and used for solving problems in different biophysical domains.
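The pruning idea behind hull-consistency can be illustrated on a single constraint. Below is a toy sketch, with intervals as `(lo, hi)` pairs and the linear constraint `x = y + z` standing in for the far richer (differential) constraints of the thesis; it is an illustration of interval narrowing in general, not of the thesis' algorithms.

```python
# Toy interval narrowing for the constraint x = y + z:
# repeatedly intersect each domain with the projection implied by the
# other two, until a fixpoint is reached (hull-consistency for this
# single constraint).
def intersect(a, b):
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("inconsistent constraint")
    return (lo, hi)

def add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def sub(a, b):
    return (a[0] - b[1], a[1] - b[0])

def narrow_sum(x, y, z):
    """Prune x, y, z under x = y + z to a fixpoint."""
    while True:
        nx = intersect(x, add(y, z))       # x <- x ∩ (y + z)
        ny = intersect(y, sub(nx, z))      # y <- y ∩ (x - z)
        nz = intersect(z, sub(nx, ny))     # z <- z ∩ (x - y)
        if (nx, ny, nz) == (x, y, z):
            return x, y, z
        x, y, z = nx, ny, nz

print(narrow_sum((0, 10), (2, 5), (4, 9)))  # ((6, 10), (2, 5), (4, 8))
```

For non-linear constraints this local pruning can leave large spurious regions in the domains, which is exactly the weakness of local consistency that motivates the stronger Global Hull-consistency criterion described above.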

Relevância: 10.00%

Resumo:

In the past years, Software Architecture has attracted increased attention from academia and industry as the unifying concept for structuring the design of complex systems. One particular research area deals with the possibility of reconfiguring architectures to adapt the systems they describe to new requirements. Reconfiguration amounts to adding and removing components and connections, and may have to occur without stopping the execution of the system being reconfigured. This work contributes to the formal description of such a process. Taking as a premise that a single formalism hardly ever satisfies all requirements in every situation, we present three approaches, each one with its own assumptions about the systems it can be applied to, and with different advantages and disadvantages. Each approach is based on the work of other researchers and has the aesthetic concern of changing the original formalism as little as possible, keeping its spirit. The first approach shows how a given reconfiguration can be specified in the same manner as the system it is applied to, and in a way that can be efficiently executed. The second approach explores the Chemical Abstract Machine, a formalism for rewriting multisets of terms, to describe architectures, computations, and reconfigurations in a uniform way. The last approach uses a UNITY-like parallel programming design language to describe computations, represents architectures by diagrams in the sense of Category Theory, and specifies reconfigurations by graph transformation rules.
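The Chemical Abstract Machine idea mentioned in the second approach can be sketched as multiset rewriting: a "solution" is a multiset of molecules (terms), and a reaction rule replaces one sub-multiset by another. The rule and molecule names below are invented for illustration, not taken from the work.

```python
# Hypothetical sketch of CHAM-style multiset rewriting using Counter
# as the multiset: a rule (lhs -> rhs) fires when lhs is contained in
# the solution, consuming lhs and producing rhs.
from collections import Counter

def applicable(solution, lhs):
    """A rule applies when its left-hand side is contained in the solution."""
    return all(solution[m] >= n for m, n in lhs.items())

def react(solution, rule):
    """Apply one reaction rule; returns the (possibly unchanged) solution."""
    lhs, rhs = rule
    if not applicable(solution, lhs):
        return solution
    return solution - lhs + rhs  # Counter arithmetic removes lhs, adds rhs

# Invented rule: a client request meets a free server and becomes a session.
rule = (Counter({"request": 1, "server": 1}),
        Counter({"session": 1}))
solution = Counter({"request": 2, "server": 1})
solution = react(solution, rule)
print(sorted(solution.elements()))  # ['request', 'session']
```

In the architectural reading, molecules would stand for components and connections, and reconfiguration rules would rewrite the running configuration in place, which is what makes the CHAM attractive for describing architecture, computation, and reconfiguration uniformly.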

Relevância: 10.00%

Resumo:

The IEEE 802.15.4 protocol proposes a flexible communication solution for Low-Rate Wireless Personal Area Networks, including sensor networks. It has the advantage of fitting the different requirements of potential applications by adequately setting its parameters. When its beacon mode is enabled, the protocol makes real-time guarantees possible through its Guaranteed Time Slot (GTS) mechanism. This paper analyzes the performance of the GTS allocation mechanism in IEEE 802.15.4. The analysis gives a full understanding of the behavior of the GTS mechanism with regard to delay and throughput metrics. First, we propose two accurate models of service curves for a GTS allocation as a function of the IEEE 802.15.4 parameters. We then evaluate the delay bounds guaranteed by a GTS allocation using the Network Calculus formalism. Finally, based on the analytic results, we analyze the impact of the IEEE 802.15.4 parameters on the throughput and delay bound guaranteed by a GTS allocation. The results of this work pave the way for efficient dimensioning of an IEEE 802.15.4 cluster.
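The Network Calculus step can be illustrated with the standard delay-bound result: for a rate-latency service curve beta(t) = R*(t - T)+ and a leaky-bucket arrival curve alpha(t) = b + r*t, the worst-case delay is T + b/R when r <= R. The paper derives R and T from the IEEE 802.15.4 GTS parameters; the numbers below are placeholders, not values from the standard.

```python
# Standard network-calculus delay bound for a rate-latency service curve
# beta(t) = R*(t - T)+ fed by a leaky-bucket arrival curve alpha(t) = b + r*t:
# the maximum horizontal deviation between alpha and beta is T + b/R.
def delay_bound(b, r, R, T):
    """Worst-case delay (needs arrival rate r <= service rate R)."""
    if r > R:
        raise ValueError("unstable: arrival rate exceeds service rate")
    return T + b / R

# Placeholder parameters (bits, bits/s, seconds), not 802.15.4 values.
print(round(delay_bound(b=2000.0, r=500.0, R=5000.0, T=0.01), 6))  # 0.41
```

Dimensioning a cluster then amounts to choosing GTS parameters whose induced (R, T) keep this bound below each flow's deadline.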