884 results for Problem analysis
Abstract:
Event-specific scales commonly have greater power than generalized scales in predicting specific disorders and in testing mediator models for such disorders. Therefore, in a preliminary study, a 6-item Alcohol Helplessness Scale was constructed and found to be reliable for a sample of 98 problem drinkers. Hierarchical multiple regression and its derivative, path analysis, were used to test whether helplessness and self-efficacy moderate or mediate the link between alcohol dependence and depression. A test of a moderation model was not supported, whereas a test of a mediation model was supported. Helplessness and self-efficacy both significantly and independently mediated between alcohol dependence and depression. Nevertheless, a significant direct effect of alcohol dependence on depression also remained. (C) 2001 John Wiley & Sons, Inc.
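As an illustration of the kind of mediation test involved, the sketch below runs Baron-Kenny-style hierarchical regressions on synthetic data; the variable names, effect sizes, and data are hypothetical stand-ins, not the study's actual analysis.

```python
# Minimal Baron-Kenny-style mediation check with OLS regressions.
# Column names and coefficients are hypothetical, for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 98  # same sample size as the study, data entirely synthetic
dependence = rng.normal(size=n)
helplessness = 0.5 * dependence + rng.normal(size=n)
depression = 0.4 * dependence + 0.6 * helplessness + rng.normal(size=n)
df = pd.DataFrame({"dependence": dependence,
                   "helplessness": helplessness,
                   "depression": depression})

# Step 1: total effect of the predictor on the outcome.
total = smf.ols("depression ~ dependence", df).fit()
# Step 2: the predictor must relate to the mediator.
med = smf.ols("helplessness ~ dependence", df).fit()
# Step 3: the direct effect shrinks when the mediator is added.
direct = smf.ols("depression ~ dependence + helplessness", df).fit()

print(total.params["dependence"], direct.params["dependence"])
# A smaller but still significant direct effect indicates partial
# mediation, the pattern reported in the abstract.
```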
Abstract:
In this paper we refer to the gene-to-phenotype modeling challenge as the GP problem. Integrating information across levels of organization within a genotype-environment system is a major challenge in computational biology. However, resolving the GP problem is a fundamental requirement if we are to understand and predict phenotypes given knowledge of the genome and model dynamic properties of biological systems. Organisms are consequences of this integration, and it is a major property of biological systems that underlies the responses we observe. We discuss the E(NK) model as a framework for investigation of the GP problem and the prediction of system properties at different levels of organization. We apply this quantitative framework to an investigation of the processes involved in genetic improvement of plants for agriculture. In our analysis, N genes determine the genetic variation for a set of traits that are responsible for plant adaptation to E environment-types within a target population of environments. The N genes can interact in epistatic NK gene-networks through the way that they influence plant growth and development processes within a dynamic crop growth model. We use a sorghum crop growth model, available within the APSIM agricultural production systems simulation model, to integrate the gene-environment interactions that occur during growth and development and to predict genotype-to-phenotype relationships for a given E(NK) model. Directional selection is then applied to the population of genotypes, based on their predicted phenotypes, to simulate the dynamic aspects of genetic improvement by a plant-breeding program. The outcomes of the simulated breeding are evaluated across cycles of selection in terms of the changes in allele frequencies for the N genes and the genotypic and phenotypic values of the populations of genotypes.
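As a toy illustration of the selection step, the sketch below applies truncation selection to binary genotypes whose fitness comes from random NK-style epistatic lookup tables; all parameter values and the fitness surrogate are illustrative stand-ins for the full E(NK)/APSIM pipeline, not its implementation.

```python
# Toy NK-style fitness with truncation selection on binary genotypes.
import numpy as np

rng = np.random.default_rng(1)
N, K, pop, cycles = 12, 2, 200, 10  # illustrative sizes

# Each gene's contribution depends on its allele and K epistatic partners.
partners = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])
tables = rng.random((N, 2 ** (K + 1)))  # random epistatic lookup tables

def fitness(g):
    idx = [int("".join(map(str, np.r_[g[i], g[partners[i]]])), 2)
           for i in range(N)]
    return tables[np.arange(N), idx].mean()

geno = rng.integers(0, 2, size=(pop, N))
for cycle in range(cycles):
    f = np.array([fitness(g) for g in geno])
    best = geno[np.argsort(f)[-pop // 5:]]        # keep the top 20%
    geno = best[rng.integers(0, len(best), size=pop)].copy()
    mut = rng.random(geno.shape) < 0.01           # low mutation rate
    geno[mut] ^= 1
    print(cycle, geno.mean(axis=0).round(2))      # allele frequencies
```

Tracking the per-gene allele frequencies across cycles, as the last line does, mirrors how the simulated breeding program is evaluated in the paper.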
Abstract:
Recent observations of type Ia supernovae and of cosmic microwave background (CMB) anisotropies have revealed that most of the matter of the Universe interacts in a repulsive manner, composing the so-called dark energy constituent of the Universe. Determining the properties of dark energy is one of the most important tasks of modern cosmology, and this is the main motivation for this work. The analysis of cosmic gravitational waves (GW) represents, besides the CMB temperature and polarization anisotropies, an additional approach to determining the parameters that may constrain the dark energy models and their consistency. In recent work, a generalized Chaplygin gas model was considered in a flat universe and the corresponding spectrum of gravitational waves was obtained. In the present work we have added a massless gas component to that model and compared the new spectrum to the previous one. The Chaplygin gas is also used to simulate a ΛCDM model by means of a particular combination of parameters, so that the Chaplygin gas and ΛCDM models can be easily distinguished in the theoretical scenarios established here. We find that the models are strongly degenerate in the range of frequencies studied. This degeneracy is in part expected, since the models must converge to each other when particular combinations of parameters are considered.
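For background, a generalized Chaplygin gas is commonly defined by the equation of state below; inserting it into the continuity equation gives the density as a function of the scale factor a (these are the standard textbook relations, not necessarily the paper's specific parametrization):

```latex
p = -\frac{A}{\rho^{\alpha}}, \qquad
\rho(a) = \left[ A + \frac{B}{a^{3(1+\alpha)}} \right]^{\frac{1}{1+\alpha}}
```

At early times the density scales like pressureless matter, ρ ∝ a⁻³, while at late times it tends to the constant A^{1/(1+α)}, mimicking a cosmological constant; this is why suitable parameter combinations reproduce ΛCDM-like behaviour and why a degeneracy between the models is expected.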
Abstract:
The central goal of this paper is to reflect on Brazilian military power and its connection to the country's international ambitions in the 21st century. After a comparison with the other BRICs and a historical analysis of Brazil's strategic irrelevance, we aim to establish the minimum military capacity Brazil would need in order to meet the country's latest international interests. We also discuss whether the National Strategy of Defense, approved in 2008, and the recent strategic agreements signed with France represent one more step toward this minimum military capacity.
Abstract:
Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low-speed damageability) is one of the most important attributes. In order to fulfill increasing requirements under shorter cycle times and rising cost pressure, car manufacturers keep intensifying the use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process is highly dependent on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made concerns the actual part thickness which, in reality, can vary locally. However, a constant thickness value is almost always defined throughout the entire part, to keep the model manageable. On the other hand, correct thickness consideration is one key enabler for precise fracture analysis within FEM. Thus, the availability of per-element thickness information, which does not exist explicitly in the FEM model, can significantly contribute to improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms, based on ray tracing and on nearest-neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, as well as a thorough identification of the particular geometric arrangements under which their accuracy can be compared. These results enable the identification of each technique's weaknesses and hint towards a new, integrated approach to the problem that linearly combines the estimates produced by each algorithm.
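A minimal sketch of the two estimation ideas follows, operating on point-sampled surfaces; the sampling, tolerances, and marching scheme are illustrative assumptions, not the paper's algorithms.

```python
# Sketch of the two thickness estimators on point-sampled surfaces;
# real part geometry and normals come from the CAD model, not shown.
import numpy as np
from scipy.spatial import cKDTree

def thickness_nn(midplane_pts, surface_pts):
    """Nearest-neighbour estimate: twice the distance from each
    midplane point to the closest sampled point on the outer surface."""
    tree = cKDTree(surface_pts)
    d, _ = tree.query(midplane_pts)
    return 2.0 * d

def thickness_ray(p, n, surface_pts, step=0.01, max_t=10.0, tol=0.05):
    """Ray estimate: march from midplane point p along +/- normal n
    until coming within tol of the sampled surface; thickness is the
    sum of the two hit distances."""
    tree = cKDTree(surface_pts)
    hits = []
    for sign in (1.0, -1.0):
        t = step
        while t < max_t:
            d, _ = tree.query(p + sign * t * n)
            if d < tol:
                hits.append(t)
                break
            t += step
    return sum(hits) if len(hits) == 2 else np.nan
```

A production implementation would intersect rays with the CAD triangulation directly (e.g. Möller-Trumbore) rather than marching against a point sample, but the structure of the two estimators, and why they fail in different geometric arrangements, is the same.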
Abstract:
Current software development often relies on non-trivial coordination logic for combining autonomous services, eventually running on different platforms. As a rule, however, such a coordination layer is strongly woven into the application at source code level. Therefore, its precise identification becomes a major methodological (and technical) problem and a challenge to any program understanding or refactoring process. The approach introduced in this paper resorts to slicing techniques to extract coordination data from source code. Such data are captured in a specific dependency graph structure from which a coordination model can be recovered, either in the form of an Orc specification or as a collection of code fragments corresponding to the identification of typical coordination patterns in the system. Tool support is also discussed.
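At its core, slicing over a dependency graph is a transitive closure from the slicing criterion. A minimal sketch, with an illustrative toy graph rather than the paper's specific dependency structure:

```python
# Minimal backward slice over a dependency graph stored as adjacency
# lists; the nodes and edges below are illustrative, not the paper's.
def backward_slice(deps, criterion):
    """Return every node the slicing criterion (transitively) depends
    on. `deps` maps each node to the nodes it depends on."""
    seen, stack = set(), [criterion]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(deps.get(node, ()))
    return seen

# Toy graph: 'send' depends on 'marshal' and 'connect', and so on.
deps = {"send": ["marshal", "connect"],
        "marshal": ["payload"],
        "connect": ["config"]}
print(sorted(backward_slice(deps, "send")))
# ['config', 'connect', 'marshal', 'payload', 'send']
```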
Abstract:
The Wyner-Ziv video coding (WZVC) rate-distortion performance is highly dependent on the quality of the side information, an estimate of the original frame created at the decoder. This paper characterizes WZVC efficiency when motion compensated frame interpolation (MCFI) techniques are used to generate the side information, a difficult problem in WZVC, especially because the decoder only has some decoded reference frames available. The proposed WZVC compression efficiency rate model relates the power spectral density of the estimation error to the accuracy of the MCFI motion field. Some interesting conclusions may then be derived regarding the impact on compression performance of the motion field's smoothness and its correlation to the true motion trajectories.
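As a rough illustration of how MCFI side information is formed, the sketch below performs symmetric block matching between the two decoded reference frames and averages the two motion-compensated predictions; the block size, search range, and SAD criterion are illustrative choices, not the paper's scheme.

```python
# Toy symmetric block-based MCFI: for each block of the frame to be
# interpolated, find the linear motion between the two decoded
# references that matches best, then average the two predictions.
import numpy as np

def mcfi(prev, nxt, block=8, search=4):
    h, w = prev.shape
    side = np.zeros_like(prev, dtype=np.float64)
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            best, bv = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y0, x0 = y - dy, x - dx  # backward along trajectory
                    y1, x1 = y + dy, x + dx  # forward along trajectory
                    if min(y0, x0, y1, x1) < 0:
                        continue
                    if max(y0, y1) + block > h or max(x0, x1) + block > w:
                        continue
                    a = prev[y0:y0+block, x0:x0+block].astype(np.float64)
                    b = nxt[y1:y1+block, x1:x1+block].astype(np.float64)
                    err = np.abs(a - b).sum()  # SAD matching criterion
                    if best is None or err < best:
                        best, bv = err, (dy, dx)
            dy, dx = bv
            a = prev[y-dy:y-dy+block, x-dx:x-dx+block].astype(np.float64)
            b = nxt[y+dy:y+dy+block, x+dx:x+dx+block].astype(np.float64)
            side[y:y+block, x:x+block] = 0.5 * (a + b)
    return side
```

The quality of the resulting side information, and hence the WZVC rate-distortion behaviour the model captures, depends on how well the assumed linear trajectories match the true motion.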
Abstract:
As wind power generation undergoes rapid growth, lightning and overvoltage incidents involving wind power plants have come to be regarded as a serious problem. First, lightning location systems are discussed, as well as important parameters regarding lightning protection. The paper then presents a case study, based on a wind turbine with an interconnecting transformer, for the study of adequate lightning and overvoltage protection measures. The electromagnetic transients circuit under study is described, and computational results are presented.
Abstract:
A previously developed model is used to numerically simulate real clinical cases of the surgical correction of scoliosis. This model consists of one-dimensional finite elements with spatial deformation in which (i) the column is represented by its axis; (ii) the vertebrae are assumed to be rigid; and (iii) the deformability of the column is concentrated in springs that connect the successive rigid elements. The metallic rods used for the surgical correction are modeled by beam elements with linear elastic behavior. To obtain the forces at the connections between the metallic rods and the vertebrae, geometrically non-linear finite element analyses are performed. The tightening sequence determines the magnitude of the forces applied to the patient's column, and it is desirable to keep those forces as small as possible. In this study, a Genetic Algorithm optimization is applied to this model in order to determine the sequence that minimizes the corrective forces applied during surgery. This amounts to finding the optimal permutation of the integers 1, ..., n, with n being the number of vertebrae involved. As such, we are faced with a combinatorial optimization problem isomorphic to the Traveling Salesman Problem. The fitness evaluation requires one compute-intensive Finite Element Analysis per candidate solution; thus, a parallel implementation of the Genetic Algorithm is developed.
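A minimal sketch of such a permutation-encoded GA follows, with order crossover and swap mutation; the fitness function is a cheap stand-in for the finite element analysis that the real implementation runs (in parallel) for each candidate sequence.

```python
# Sketch of a permutation-encoded GA; the real fitness would run one
# finite element analysis per candidate tightening sequence.
import random

def fitness(perm):
    # Placeholder cost: the actual cost is the corrective force
    # returned by one FEA of the tightening sequence `perm`.
    return sum(abs(a - b) for a, b in zip(perm, perm[1:]))

def order_crossover(p1, p2):
    """OX: copy a segment from p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    fill = [g for g in p2 if g not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def evolve(n, pop_size=30, gens=50, pmut=0.2):
    pop = [random.sample(range(1, n + 1), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        nxt = pop[:2]                              # elitism
        while len(nxt) < pop_size:
            p1, p2 = random.sample(pop[:10], 2)    # select among best
            c = order_crossover(p1, p2)
            if random.random() < pmut:             # swap mutation
                i, j = random.sample(range(n), 2)
                c[i], c[j] = c[j], c[i]
            nxt.append(c)
        pop = nxt
    return min(pop, key=fitness)

print(evolve(10))  # best tightening order found for 10 vertebrae
```

Because each true fitness evaluation is one full FEA, the population's evaluations are embarrassingly parallel, which is precisely what motivates the parallel implementation described in the abstract.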
Abstract:
Deoxyribonucleic acid, or DNA, is the most fundamental aspect of life, but present-day scientific knowledge has merely scratched the surface of the problem posed by its decoding. While experimental methods provide insightful clues, the adoption of analysis tools supported by the formalism of mathematics will lead to a systematic and solid build-up of knowledge. This paper studies human DNA from the perspective of system dynamics. By associating entropy and the Fourier transform, several global properties of the code are revealed. Fractional-order characteristics emerge as a natural consequence of the information content. These properties constitute a small piece of the scientific knowledge that will support further efforts towards the final aim of establishing a comprehensive theory of the phenomena involved in life.
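To make the entropy/Fourier association concrete, the sketch below computes sliding-window Shannon entropy and a global FFT power spectrum for a synthetic sequence; the numeric base mapping is an arbitrary illustrative choice, not the paper's encoding.

```python
# Sketch: sliding-window Shannon entropy and an FFT power spectrum of
# a numerically mapped DNA sequence (synthetic data, toy mapping).
import numpy as np

rng = np.random.default_rng(2)
seq = rng.choice(list("ACGT"), size=4096)

def entropy(window):
    _, counts = np.unique(window, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

win = 256
H = [entropy(seq[i:i + win]) for i in range(0, len(seq) - win, win)]

# Map bases to numbers and inspect the global power spectrum; fitting
# a power law to such a spectrum is one way fractional-order
# characteristics can be read off the information content.
mapping = {"A": 0.0, "C": 1.0, "G": 2.0, "T": 3.0}
x = np.array([mapping[b] for b in seq])
spectrum = np.abs(np.fft.rfft(x - x.mean())) ** 2
print(np.round(H, 2)[:5], spectrum[:5])
```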
Abstract:
Mathematical Programs with Complementarity Constraints (MPCC) find many applications in fields such as engineering design, economic equilibrium, and mathematical programming theory itself. A queueing system model resulting from a single signalized intersection regulated by pre-timed control in a traffic network is considered. The model is formulated as an MPCC problem. A MATLAB implementation based on a hyperbolic penalty function is used to solve this practical problem, computing the total average waiting time of the vehicles in all queues and the green split allocation. The problem was coded in AMPL.
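For context, one widely used form of the hyperbolic penalty function, sketched here for an inequality constraint g(x) ≥ 0 (the exact variant used in the MATLAB implementation may differ), is:

```latex
P\bigl(g(x);\lambda,\tau\bigr) = -\lambda\, g(x) + \sqrt{\lambda^{2}\, g(x)^{2} + \tau^{2}}
```

P is smooth for τ > 0 and, as τ → 0, tends to zero on the feasible region and to 2λ|g(x)| outside it, so increasing λ while shrinking τ drives the iterates of the penalized problem towards feasibility without the non-differentiability of an exact penalty.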
Abstract:
The Cultural Property Risk Analysis Model was applied in 2006 to a Portuguese archive located in Lisbon. Its results highlighted the need for the institution to address risks related to fire, physical forces, and relative humidity. Five years after this first analysis, the results are revisited and a few changes are introduced due to recent events: fire and high humidity remain important hazards, but are now accompanied by a pressing contaminants problem. Improvements in storage systems were responsible for a large decrease in calculated risk magnitude and proved to be very cost-effective.
Abstract:
Environmental problems such as acid rain, eutrophication, and global warming are discussed daily, yet discussions of the phosphorus problem are rare. Phosphorus has been a real problem through the years and must be discussed more. In this thesis, a global material flow analysis of phosphorus was carried out, based on data from the year 2004. The production of phosphate rock in that year was 18.9 million tonnes; almost all of this amount was applied to the soil as fertilizer, yet plants can take up, on average, only 20% of the fertilizer input, the remainder being lost to the soil phosphorus pool. In the soil there is an equilibrium between the phosphorus available for uptake by plants and the phosphorus associated with other compounds; this equilibrium depends on the kind of soil and is related to the soil pH. A reserve inventory was compiled: the reserve, the amount that is economically extractable, is 15,000 million tonnes, while the reserve base is estimated at 47,000 million tonnes. The major reserves are found in Morocco and Western Sahara, the United States, China, and South Africa. The reserve estimated in 2009 was 15,000 million tonnes of phosphate rock, or 1,963 million tonnes of P. If around 22 Mt/yr of phosphorus continues to be mined (2008 phosphorus production; USGS, 2009), and consumption increases each year with food demand, the reserves of phosphate rock will be exhausted in about 90 years, or maybe even less. For the value/impact assessment, a qualitative analysis was performed: if in the future no more phosphate rock is available to produce fertilizers, a drop in crop yields is to be expected, varying with the kind of soil, while the impact on human food supply and animal production will not be a relevant problem. Phosphorus can be recovered from different waste streams, such as ploughing crop residues back into the soil, food processing plants and food retailers, human and animal excreta, meat and bone meal, manure fibre, sewage sludge, and wastewater. Some of these examples are developed in the paper.
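The 90-year horizon is essentially a static reserve-to-production ratio, a rough sketch that ignores the demand growth the abstract itself anticipates:

```latex
\frac{1963\ \text{Mt P}}{22\ \text{Mt P/yr}} \approx 89\ \text{years}
```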
Abstract:
The mechanisms of speech production are complex and have been drawing the attention of researchers from both the medical and computer vision fields. Within the speech production mechanism, the study of the articulators is a complex issue, since they have a high degree of freedom during this process, namely the tongue, which makes its control and observation a problem. In this work, the shape of the tongue during the articulation of the oral vowels of European Portuguese is automatically characterized using statistical modeling on MR images. A point distribution model is built from a set of images collected during artificially sustained articulations of European Portuguese sounds, which can extract the main characteristics of the motion of the tongue. The model built in this work allows understanding more clearly the dynamic speech events involved in sustained articulations. The tongue shape model can also be useful for speech rehabilitation purposes, specifically to recognize compensatory movements of the articulators during speech production.
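A point distribution model is, at its core, PCA over aligned landmark configurations. The sketch below shows that computation on synthetic landmarks; real landmarks would come from the segmented MR images, after alignment (e.g. Procrustes), which is omitted here.

```python
# Sketch of a point distribution model: PCA over aligned landmark
# sets. Landmarks here are synthetic stand-ins for MR-image contours.
import numpy as np

rng = np.random.default_rng(3)
n_shapes, n_points = 20, 30
shapes = rng.normal(size=(n_shapes, 2 * n_points))  # flattened (x, y)

mean = shapes.mean(axis=0)
X = shapes - mean
# Eigenvectors of the covariance, via SVD, give the modes of variation.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
eigvals = s ** 2 / (n_shapes - 1)
modes = Vt                      # each row is one mode of variation

# New plausible shapes: mean plus a weighted sum of the main modes,
# with each weight b_i usually limited to +/- 3 sqrt(eigenvalue_i).
b = 2.0 * np.sqrt(eigvals[0])
new_shape = mean + b * modes[0]
print(new_shape[:4])
```

Sweeping the weight b along the first mode traces out the dominant tongue-shape variation captured by the model, which is what makes it usable for recognizing compensatory articulator movements.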