987 results for General circulation models
Abstract:
Numerical optimisation methods are being more commonly applied to agricultural systems models, to identify the most profitable management strategies. The available optimisation algorithms are reviewed and compared, with literature and our studies identifying evolutionary algorithms (including genetic algorithms) as superior in this regard to simulated annealing, tabu search, hill-climbing, and direct-search methods. Results of a complex beef property optimisation, using a real-value genetic algorithm, are presented. The relative contributions of the range of operational options and parameters of this method are discussed, and general recommendations listed to assist practitioners applying evolutionary algorithms to the solution of agricultural systems. (C) 2001 Elsevier Science Ltd. All rights reserved.
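The real-value genetic algorithm described above can be illustrated with a minimal sketch. The profit function, parameter ranges, and operator settings below are illustrative assumptions, not the study's beef-property model:

```python
import random

def profit(x):
    # Hypothetical smooth profit surface with its maximum at x = (2.0, 3.0);
    # stands in for the agricultural systems model being optimised.
    return -((x[0] - 2.0) ** 2 + (x[1] - 3.0) ** 2)

def run_ga(bounds, pop_size=30, generations=60, mut_rate=0.2, seed=1):
    rng = random.Random(seed)
    # Real-valued chromosomes: one gene per management parameter.
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=profit, reverse=True)
        elite = pop[: pop_size // 2]          # truncation selection with elitism
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            # Arithmetic (blend) crossover on real-valued genes.
            child = [(ga + gb) / 2 for ga, gb in zip(a, b)]
            # Gaussian mutation, clipped to the feasible range.
            for i, (lo, hi) in enumerate(bounds):
                if rng.random() < mut_rate:
                    child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.3)))
            children.append(child)
        pop = elite + children
    return max(pop, key=profit)

best = run_ga([(0.0, 5.0), (0.0, 5.0)])
```

Because the elite half is carried over unchanged, the best strategy found never degrades between generations, which is one reason evolutionary algorithms behave robustly on the noisy, multimodal surfaces typical of systems models.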
Abstract:
Comparative phylogeography has proved useful for investigating biological responses to past climate change and is strongest when combined with extrinsic hypotheses derived from the fossil record or geology. However, the rarity of species with sufficient, spatially explicit fossil evidence restricts the application of this method. Here, we develop an alternative approach in which spatial models of predicted species distributions under serial paleoclimates are compared with a molecular phylogeography, in this case for a snail endemic to the rainforests of North Queensland, Australia. We also compare the phylogeography of the snail to those from several endemic vertebrates and use consilience across all of these approaches to enhance biogeographical inference for this rainforest fauna. The snail mtDNA phylogeography is consistent with predictions from paleoclimate modeling in relation to the location and size of climatic refugia through the late Pleistocene-Holocene and broad patterns of extinction and recolonization. There is general agreement between quantitative estimates of population expansion from sequence data (using likelihood and coalescent methods) vs. distributional modeling. The snail phylogeography represents a composite of both common and idiosyncratic patterns seen among vertebrates, reflecting the geographically finer scale of persistence and subdivision in the snail. In general, this multifaceted approach, combining spatially explicit paleoclimatological models and comparative phylogeography, provides a powerful approach to locating historical refugia and understanding species' responses to them.
Abstract:
New Zealand has a good Neogene plant fossil record. During the Miocene it was without high topography and it was highly maritime, meaning that its climate, and the resulting vegetation, would be controlled dominantly by zonal climate conditions. Its vegetation record during this time suggests the climate passed from an ever-wet and cool but frostless phase in the Early Miocene, in which Nothofagus subgenus Brassospora was prominent. Then it became seasonally dry, with vegetation in which palms and Eucalyptus were prominent and fires were frequent, and in the mid-Miocene, it developed a dry-climate vegetation dominated by Casuarinaceae. These changes are reflected in a sedimentological change from acidic to alkaline chemistry and the appearance of regular charcoal in the record. The vegetation then changed again to include a prominent herb component including Chenopodiaceae and Asteraceae. Sphagnum became prominent, and Nothofagus returned, but mainly as the subgenus Fuscospora (presently restricted to temperate climates). This is interpreted as a return to a generally wet, but now cold climate, in which outbreaks of cold polar air and frost were frequent. The transient drying out of a small maritime island and the accompanying vegetation/climate sequence could be explained by a higher frequency of the Sub-Tropical High Pressure (STHP) cells (the descending limbs of the Hadley cells) over New Zealand during the Miocene. This may have resulted from an increased frequency of 'blocking', a synoptic situation which occurs in the region today. An alternative hypothesis, that the global STHP belt lay at a significantly higher latitude in the early Neogene (perhaps 55°S) than today (about 30°S), is considered less likely because of physical constraints on STHP belt latitude. In either case, the difference between the early Neogene and present situation may have been a response to an increased polar-equatorial temperature gradient.
This contrasts with current climate models for the geological past, in which the latitude of the High Pressure belt is held invariant through geological time. (C) 2003 Elsevier Science B.V. All rights reserved.
Abstract:
We present two integrable spin ladder models which possess a general free parameter besides the rung coupling J. The models are exactly solvable by means of the Bethe ansatz method and we present the Bethe ansatz equations. We analyze the elementary excitations of the models which reveal the existence of a gap for both models that depends on the free parameter. (C) 2003 American Institute of Physics.
Abstract:
In general, agricultural products are produced on a large scale, and this output grows in proportion to their consumption. However, another factor also grows proportionally: post-harvest losses, which suggests the use of technologies to increase the utilisation of these products, mitigating waste, extending shelf life, and making the product available during the off-season. In the present work, foam-mat drying technology was applied to carrot, beetroot, tomato, and strawberry, products widely produced and consumed in Brazil. The four products were subjected to foam-mat drying in a circulated-air dryer at controlled temperatures of 40, 50, 60, 70, and 80 °C. The drying kinetics were described by fitting mathematical models for each drying-air temperature. In addition, a generalised mathematical model fitted by non-linear regression was proposed. The Page model provided the best fit to the drying data for all products tested, with a coefficient of determination (R²) above 98% at all temperatures evaluated. Furthermore, the influence of air temperature on the parameter k of the Page model could be described by an exponential model. The effective diffusion coefficient increased with temperature, with values between 10⁻⁸ and 10⁻⁷ m²·s⁻¹ for the process temperatures. The relationship between the effective diffusion coefficient and the drying temperature could be described by the Arrhenius equation.
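The two relations named in the abstract, the Page thin-layer drying model and the Arrhenius temperature dependence of the effective diffusion coefficient, can be sketched directly. All parameter values below (k, n, D0, Ea) are illustrative assumptions, not the fitted results:

```python
import math

def page_moisture_ratio(t_min, k=0.05, n=1.1):
    """Page model: MR(t) = exp(-k * t**n), t in minutes."""
    return math.exp(-k * t_min ** n)

def arrhenius_deff(temp_c, d0=1e-4, ea=30e3):
    """Arrhenius relation: D_eff = D0 * exp(-Ea / (R * T)), T in kelvin."""
    R = 8.314  # universal gas constant, J/(mol*K)
    return d0 * math.exp(-ea / (R * (temp_c + 273.15)))

# Moisture ratio decays monotonically with drying time...
mr_early, mr_late = page_moisture_ratio(10), page_moisture_ratio(60)
# ...and the effective diffusion coefficient grows with air temperature.
d40, d80 = arrhenius_deff(40), arrhenius_deff(80)
```

In the study itself, k and n would be fitted to each product and air temperature by non-linear regression, and Ea recovered from the slope of ln(D_eff) against 1/T.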
Abstract:
Current software development relies increasingly on non-trivial coordination logic for combining autonomous services often running on different platforms. As a rule, however, in typical non-trivial software systems, such a coordination layer is tightly woven into the application at source code level. Therefore, its precise identification becomes a major methodological (and technical) problem whose importance cannot be overestimated in any program understanding or refactoring process. Open access to source code, as granted in OSS certification, provides an opportunity for the development of methods and technologies to extract, from source code, the relevant coordination information. This paper is a step in this direction, combining a number of program analysis techniques to automatically recover coordination information from legacy code. Such information is then expressed as a model in Orc, a general purpose orchestration language.
Abstract:
We write down the renormalization-group equations for the Yukawa-coupling matrices in a general multi-Higgs-doublet model. We then assume that the matrices of the Yukawa couplings of the various Higgs doublets to right-handed fermions of fixed quantum numbers are all proportional to each other. We demonstrate that, in the case of the two-Higgs-doublet model, this proportionality is preserved by the renormalization-group running only in the cases of the standard type-I, II, X, and Y models. We furthermore show that a similar result holds even when there are more than two Higgs doublets: the Yukawa-coupling matrices to fermions of a given electric charge remain proportional under the renormalization-group running if and only if there is a basis for the Higgs doublets in which all the fermions of a given electric charge couple to only one Higgs doublet.
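The proportionality assumption above can be written compactly; the following is a sketch in notation commonly used for multi-Higgs-doublet models (the symbols Γ_a, c_a, and n_H are assumptions of this sketch, not taken from the paper):

```latex
% Yukawa-coupling matrices of the n_H doublets \Phi_a to, e.g., the
% right-handed down-type fermions, assumed proportional to a single matrix:
\Gamma_a = c_a \, \Gamma_1 , \qquad a = 2, \ldots, n_H .
% Per the abstract, this relation is preserved by the renormalization-group
% running if and only if there is a Higgs basis in which all fermions of a
% given electric charge couple to only one doublet.
```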
Abstract:
Recent literature has proved that many classical pricing models (Black and Scholes, Heston, etc.) and risk measures (VaR, CVaR, etc.) may lead to “pathological meaningless situations”, since traders can build sequences of portfolios whose risk level tends to −∞ and whose expected return tends to +∞, i.e., (risk = −∞, return = +∞). Such a sequence of strategies may be called a “good deal”. This paper focuses on the risk measures VaR and CVaR and analyzes this caveat in a discrete-time complete pricing model. Under quite general conditions the explicit expression of a good deal is given, and its sensitivity with respect to some possible measurement errors is provided as well. We point out that a critical property is the absence of short sales. In such a case we first construct a “shadow riskless asset” (SRA) without short sales, and then the good deal is given by borrowing more and more money so as to invest in the SRA. It is also shown that the SRA is of interest in itself, even if there are short-selling restrictions.
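The two risk measures the paper studies can be computed empirically from a sample of losses; the sample data and confidence level below are illustrative, not the paper's pricing model:

```python
def var_cvar(losses, alpha=0.95):
    """Empirical Value-at-Risk and Conditional Value-at-Risk.

    VaR is the alpha-quantile of the loss distribution; CVaR is the mean
    loss at or beyond that quantile, so CVaR >= VaR always holds.
    Losses are signed so that positive values are losses.
    """
    ordered = sorted(losses)
    idx = min(int(alpha * len(ordered)), len(ordered) - 1)
    var = ordered[idx]
    tail = ordered[idx:]                # the worst (1 - alpha) share of outcomes
    cvar = sum(tail) / len(tail)
    return var, cvar

losses = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 5.0]
var80, cvar80 = var_cvar(losses, alpha=0.8)   # var80 = 3.0, cvar80 = 4.0
```

Because both measures depend only on a tail quantile, a leveraged position in a riskless payoff can raise expected return without raising them, which is the mechanism behind the "good deal" sequences the paper constructs.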
Abstract:
Background: Paranoid ideation has been regarded as a cognitive and a social process used as a defence against perceived threats. According to this perspective, paranoid ideation can be understood as a process extending across the normal-pathological continuum. Methods: In order to refine the construct of paranoid ideation and to validate a measure of paranoia, 906 Portuguese participants from the general population and 91 patients were administered the General Paranoia Scale (GPS), and two conceptual models (one- and three-dimensional) were compared through confirmatory factor analysis (CFA). Results: Results from the CFA of the GPS supported a different model than the one-dimensional model proposed by Fenigstein and Vanable, comprising three dimensions (mistrust thoughts, persecutory ideas, and self-deprecation). This alternative model presented a better fit and increased sensitivity when compared with the one-dimensional model. Further data analysis of the scale revealed that the GPS is an adequate assessment tool for adults, with good psychometric characteristics and high internal consistency. Conclusion: The model proposed in the current work leads to further refinement and enrichment of the construct of paranoia in different populations, allowing the assessment of three dimensions of paranoia and the risk of clinical paranoia in a single measure for the general population.
Abstract:
The basic motivation of this work was the integration of biophysical models within the interval constraints framework for decision support. Comparing the major features of biophysical models with the expressive power of the existing interval constraints framework, it was clear that the most important inadequacy was related with the representation of differential equations. System dynamics is often modelled through differential equations, but there was no way of expressing a differential equation as a constraint and integrating it within the constraints framework. Consequently, the goal of this work is focussed on the integration of ordinary differential equations within the interval constraints framework, which for this purpose is extended with the new formalism of Constraint Satisfaction Differential Problems. Such framework allows the specification of ordinary differential equations, together with related information, by means of constraints, and provides efficient propagation techniques for pruning the domains of their variables. This enabled the integration of all such information in a single constraint whose variables may subsequently be used in other constraints of the model. The specific method used for pruning its variable domains can then be combined with the pruning methods associated with the other constraints in an overall propagation algorithm for reducing the bounds of all model variables. The application of the constraint propagation algorithm for pruning the variable domains, that is, the enforcement of local consistency, turned out to be insufficient to support decisions in practical problems that include differential equations. The domain pruning achieved is not, in general, sufficient to allow safe decisions, and the main reason derives from the non-linearity of the differential equations.
Consequently, a complementary goal of this work proposes a new strong consistency criterion, Global Hull-consistency, particularly suited to decision support with differential models, offering an adequate trade-off between domain pruning and computational effort. Several alternative algorithms are proposed for enforcing Global Hull-consistency and, due to their complexity, an effort was made to provide implementations able to supply any-time pruning results. Since the consistency criterion is dependent on the existence of canonical solutions, a local search approach is proposed that can be integrated with constraint propagation in continuous domains and, in particular, with the enforcement algorithms, to anticipate the finding of canonical solutions. The last goal of this work is the validation of the approach as an important contribution to the integration of biophysical models within decision support. Consequently, a prototype application that integrates all the proposed extensions to the interval constraints framework is developed and used for solving problems in different biophysical domains.
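The interval-narrowing machinery the framework builds on can be illustrated with a toy example: propagating a single constraint x + y = 10 over interval boxes until a fixed point. The constraint and bounds are illustrative only, not the thesis' Constraint Satisfaction Differential Problems:

```python
def narrow_sum(x, y, total):
    """Narrow intervals x = (xl, xu) and y = (yl, yu) under x + y = total.

    Each bound is tightened by projecting the constraint: x = total - y
    and y = total - x, intersected with the current boxes.
    """
    xl, xu = x
    yl, yu = y
    xl, xu = max(xl, total - yu), min(xu, total - yl)
    yl, yu = max(yl, total - xu), min(yu, total - xl)
    return (xl, xu), (yl, yu)

def propagate(x, y, total):
    """Apply the narrowing operator until the boxes stop changing."""
    while True:
        nx, ny = narrow_sum(x, y, total)
        if (nx, ny) == (x, y):
            return x, y
        x, y = nx, ny

# x in [0, 8] and y in [0, 5] with x + y = 10 narrows x to [5, 8].
x, y = propagate((0.0, 8.0), (0.0, 5.0), total=10.0)
```

As the abstract notes, this kind of local pruning can be weak on non-linear systems: the narrowed box may still contain large regions with no solution, which is what motivates the stronger Global Hull-consistency criterion.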
Abstract:
We discuss theoretical and phenomenological aspects of two-Higgs-doublet extensions of the Standard Model. In general, these extensions have scalar mediated flavour changing neutral currents which are strongly constrained by experiment. Various strategies are discussed to control these flavour changing scalar currents and their phenomenological consequences are analysed. In particular, scenarios with natural flavour conservation are investigated, including the so-called type I and type II models as well as lepton-specific and inert models. Type III models are then discussed, where scalar flavour changing neutral currents are present at tree level, but are suppressed by either a specific ansatz for the Yukawa couplings or by the introduction of family symmetries leading to a natural suppression mechanism. We also consider the phenomenology of charged scalars in these models. Next we turn to the role of symmetries in the scalar sector. We discuss the six symmetry-constrained scalar potentials and their extension into the fermion sector. The vacuum structure of the scalar potential is analysed, including a study of the vacuum stability conditions on the potential and the renormalization-group improvement of these conditions is also presented. The stability of the tree level minimum of the scalar potential in connection with electric charge conservation and its behaviour under CP is analysed. The question of CP violation is addressed in detail, including the cases of explicit CP violation and spontaneous CP violation. We present a detailed study of weak basis invariants which are odd under CP. These invariants allow for the possibility of studying the CP properties of any two-Higgs-doublet model in an arbitrary Higgs basis. A careful study of spontaneous CP violation is presented, including an analysis of the conditions which have to be satisfied in order for a vacuum to violate CP. 
We present minimal models of CP violation where the vacuum phase is sufficient to generate a complex CKM matrix, which is at present a requirement for any realistic model of spontaneous CP violation.
Abstract:
Dissertation presented in partial fulfilment of the requirements for the degree of Doctor in Information Management
Abstract:
OBJECTIVE: The objective of the study was to develop a model for estimating patient 28-day in-hospital mortality using 2 different statistical approaches. DESIGN: The study was designed to develop an outcome prediction model for 28-day in-hospital mortality using (a) logistic regression with random effects and (b) a multilevel Cox proportional hazards model. SETTING: The study involved 305 intensive care units (ICUs) from the basic Simplified Acute Physiology Score (SAPS) 3 cohort. PATIENTS AND PARTICIPANTS: Patients (n = 17138) were from the SAPS 3 database with follow-up data pertaining to the first 28 days in hospital after ICU admission. INTERVENTIONS: None. MEASUREMENTS AND RESULTS: The database was divided randomly into 5 roughly equal-sized parts (at the ICU level). It was thus possible to run the model-building procedure 5 times, each time taking four fifths of the sample as a development set and the remaining fifth as the validation set. At 28 days after ICU admission, 19.98% of the patients were still in the hospital. Because of the different sampling space and outcome variables, both models presented a better fit in this sample than did the SAPS 3 admission score calibrated to vital status at hospital discharge, both on the general population and in major subgroups. CONCLUSIONS: Both statistical methods can be used to model the 28-day in-hospital mortality better than the SAPS 3 admission model. However, because the logistic regression approach is specifically designed to forecast 28-day mortality, and given the high uncertainty associated with the assumption of the proportionality of risks in the Cox model, the logistic regression approach proved to be superior.
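The logistic-regression side of the comparison can be sketched as a scoring function mapping a linear predictor to a 28-day mortality probability. The intercept, slope, and single covariate (an admission severity score) below are illustrative assumptions, not the fitted SAPS 3 model:

```python
import math

def mortality_probability(score, intercept=-6.0, slope=0.07):
    """Logistic model: p = 1 / (1 + exp(-(b0 + b1 * score))).

    Maps any real-valued linear predictor into a probability in (0, 1),
    which is why logistic regression suits a fixed-horizon binary outcome
    such as 28-day in-hospital mortality.
    """
    z = intercept + slope * score
    return 1.0 / (1.0 + math.exp(-z))

p_low, p_high = mortality_probability(40), mortality_probability(100)
```

In contrast, a Cox model estimates a hazard over time and must assume proportional hazards across the follow-up period, the assumption the authors flag as a source of uncertainty.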
Abstract:
Dissertation submitted to obtain the degree of Doctor in Informatics Engineering
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics