855 results for Emergence Prediction
Abstract:
Phenology shifts are the most widely cited examples of the biological impact of climate change, yet there are few assessments of potential effects on the fitness of individual organisms or the persistence of populations. Despite extensive evidence of climate-driven advances in phenological events over recent decades, comparable patterns across species' geographic ranges have seldom been described. Even fewer studies have quantified concurrent spatial gradients and temporal trends between phenology and climate. Here we analyse a large data set (~129 000 phenology measures) over 37 years across the UK to provide the first phylogenetic comparative analysis of the relative roles of plasticity and local adaptation in generating spatial and temporal patterns in butterfly mean flight dates. Although populations of all species exhibit a plastic response to temperature, with adult emergence dates earlier in warmer years by an average of 6.4 days per °C, among-population differences are significantly lower on average, at 4.3 days per °C. Emergence dates of most species are more synchronised over their geographic range than is predicted by their relationship between mean flight date and temperature over time, suggesting local adaptation. Biological traits of species only weakly explained the variation in differences between space-temperature and time-temperature phenological responses, suggesting that multiple mechanisms may operate to maintain local adaptation. As niche models assume constant relationships between occurrence and environmental conditions across a species' entire range, an important implication of the temperature-mediated local adaptation detected here is that populations of insects are much more sensitive to future climate changes than current projections suggest.
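The plastic response reported above is, in essence, a regression slope of mean flight date on temperature. As a rough illustration (with synthetic data, not the UK phenology records), a within-population estimate of 6.4 days per °C could be recovered like this:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic illustration (not the study's data): annual mean flight date
# advances by 6.4 days per degC of warming, plus observational noise.
years = 37
temperature = rng.normal(14.0, 1.0, years)  # hypothetical annual mean temps, degC
flight_date = 170.0 - 6.4 * (temperature - 14.0) + rng.normal(0.0, 2.0, years)

# Plasticity estimate: ordinary least-squares slope of date on temperature
slope, intercept = np.polyfit(temperature, flight_date, 1)
```

The among-population (spatial) slope would be estimated the same way across site means rather than years; the abstract's point is that the spatial slope (4.3 days per °C) is shallower than the temporal one.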
Abstract:
The Madden-Julian Oscillation (MJO) is the dominant mode of intraseasonal variability in the Tropics. It can be characterised as a planetary-scale coupling between the atmospheric circulation and organised deep convection that propagates east through the equatorial Indo-Pacific region. The MJO interacts with weather and climate systems on a near-global scale and is a crucial source of predictability for weather forecasts on medium to seasonal timescales. Despite its global significance, accurately representing the MJO in numerical weather prediction (NWP) and climate models remains a challenge. This thesis focuses on the representation of the MJO in the Integrated Forecasting System (IFS) at the European Centre for Medium-Range Weather Forecasts (ECMWF), a state-of-the-art NWP model. Recent modifications to the model physics in Cycle 32r3 (Cy32r3) of the IFS led to advances in the simulation of the MJO; for the first time the observed amplitude of the MJO was maintained throughout the integration period. A set of hindcast experiments, which differ only in their formulation of convection, were performed between May 2008 and April 2009 to assess the sensitivity of MJO simulation in the IFS to the Cy32r3 convective parameterization. Unique to this thesis is the attribution of the advances in MJO simulation in Cy32r3 to the modified convective parameterization, specifically, the relative-humidity-dependent formulation for organised deep entrainment. Increasing the sensitivity of the deep convection scheme to environmental moisture is shown to modify the relationship between precipitation and moisture in the model. Through dry-air entrainment, convective plumes ascending in low-humidity environments terminate lower in the atmosphere. As a result, there is an increase in the occurrence of cumulus congestus, which acts to moisten the mid-troposphere.
Due to the modified precipitation-moisture relationship, more moisture is able to build up, which effectively preconditions the tropical atmosphere for the transition to deep convection. Results from this thesis suggest that a tropospheric moisture control on convection is key to simulating the interaction between the physics and the large-scale circulation associated with the MJO.
Abstract:
This paper describes a methodology for producing multiprobability predictions for proteomic mass spectrometry data. The methodology is based on a newly developed machine learning framework called Venn machines, which outputs a valid probability interval. For demonstration, we applied the methodology to MALDI-TOF data sets in order to predict the diagnosis of heart disease and the early diagnosis of ovarian cancer and breast cancer. The experiments showed that the probability intervals are narrow, that is, the output of the multiprobability predictor is similar to a single probability distribution. In addition, the probability intervals produced for the heart disease and ovarian cancer data were more accurate than the output of the corresponding probability predictor. When Venn machines were forced to make point predictions, the accuracy of these predictions was, for most data sets, better than that of the underlying algorithm that outputs a single probability distribution over labels. Application of the methodology to MALDI-TOF data sets empirically demonstrates its validity. The accuracy of the proposed method on the ovarian cancer data rises from 66.7% eleven months before the moment of diagnosis to 90.2% at the moment of diagnosis. The same approach was applied to the heart disease data without time dependency, although the achieved accuracy was not as high (up to 69.9%). The methodology also allowed us to confirm mass spectrometry peaks previously identified as carrying statistically significant information for discriminating between controls and cases.
Abstract:
A cylinder experiment was conducted in northern Greece during 2005 and 2006 to assess the emergence dynamics of barnyardgrass (Echinochloa crus-galli (L.) Beauv.) and jimsonweed (Datura stramonium L.) after a switch from conventional to conservation tillage systems. Emergence was surveyed from two burial depths (5 and 10 cm) under simulated reduced tillage (i.e. with soil disturbance) and no-till conditions. Barnyardgrass emergence was significantly affected by burial depth, with greater emergence from 5 cm depth (96%), although 78% of seedlings still emerged from 10 cm depth over the two years of study. Emergence of barnyardgrass was stable across years for the different depths and tillage regimes. Jimsonweed seeds showed lower germination than barnyardgrass during the study period, and jimsonweed emergence was significantly affected by soil disturbance (41% with disturbance versus 28% without). A burial depth × soil disturbance interaction was also detected, which showed higher emergence from 10 cm depth with soil disturbance. In Year 2, jimsonweed emergence from 10 cm depth with soil disturbance was significantly higher. Seasonal emergence timing of barnyardgrass did not vary between the burial depth and soil disturbance regimes: it started in April and lasted until the end of May in both years. Jimsonweed showed a bimodal pattern, with a first flush from the end of April until mid-May and a second ranging from mid-June to mid-August from 10 cm burial depth and from mid-July to mid-August from 5 cm depth, irrespective of soil disturbance in both cases.
Abstract:
It is argued that existing polar prediction systems do not yet meet users' needs, and possible ways forward for advancing prediction capacity in polar regions and beyond are outlined. The polar regions have been attracting increasing attention in recent years, fuelled by the perceptible impacts of anthropogenic climate change. Polar climate change provides new opportunities, such as shorter shipping routes between Europe and East Asia, but also new risks, such as the potential for industrial accidents or emergencies in ice-covered seas. Here, it is argued that environmental prediction systems for the polar regions are less developed than elsewhere. There are many reasons for this situation: the polar regions have (historically) been a lower priority, there are fewer in situ observations, and numerous local physical processes are less well represented by models. By contrasting the relative importance of different physical processes in polar and lower latitudes, the need for a dedicated polar prediction effort is illustrated. Research priorities are identified that will help to advance environmental polar prediction capabilities. Examples include improvement of the polar observing system; the use of coupled atmosphere-sea ice-ocean models, even for short-term prediction; and insight into polar-lower-latitude linkages and their role in forecasting. Given the enormity of some of the challenges ahead in a harsh and remote environment such as the polar regions, it is argued that rapid progress will only be possible with a coordinated international effort. More specifically, it is proposed to hold a Year of Polar Prediction (YOPP) from mid-2017 to mid-2019, in which the international research and operational forecasting communities will work together with stakeholders in a period of intensive observing, modelling, prediction, verification, user-engagement and educational activities.
Abstract:
Understanding how the emergence of the anthropogenic warming signal from the noise of internal variability translates into changes in extreme event occurrence is of crucial societal importance. Using simulations of cumulative carbon dioxide (CO2) emissions and temperature changes from eleven earth system models, we demonstrate that the inherently lower internal variability at tropical latitudes causes large increases in the frequency of extreme daily temperatures (exceedances of the 99.9th percentile derived from pre-industrial climate simulations) to occur much earlier than in mid-to-high-latitude regions. Judged by 2010 GDP-PPP per capita, most of the world's poorest people live at low latitudes; conversely, the wealthiest population quintile disproportionately inhabits more variable mid-latitude climates. Consequently, the fraction of the global population in the lowest socio-economic quintile is exposed to substantially more frequent daily temperature extremes after much smaller increases in both mean global warming and cumulative CO2 emissions.
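The threshold-exceedance mechanism can be made concrete with a toy Gaussian model (illustrative numbers, not the earth-system-model output): where day-to-day variability is lower, a fixed pre-industrial 99.9th-percentile threshold sits fewer standard deviations above the mean, so the same warming produces far more exceedances.

```python
import numpy as np

def exceedance_frequency(warming, sigma, n=200_000, seed=0):
    """Fraction of daily temperatures exceeding the pre-industrial 99.9th
    percentile after a uniform mean shift `warming`, for Gaussian
    day-to-day variability with standard deviation `sigma`."""
    rng = np.random.default_rng(seed)
    preindustrial = rng.normal(0.0, sigma, n)
    threshold = np.quantile(preindustrial, 0.999)  # fixed pre-industrial threshold
    warmed = rng.normal(warming, sigma, n)
    return float(np.mean(warmed > threshold))

# Illustrative standard deviations: low tropical variability versus a
# more variable mid-latitude climate, under the same 1 degC of warming.
tropical = exceedance_frequency(warming=1.0, sigma=0.5)
midlat = exceedance_frequency(warming=1.0, sigma=2.0)
```

With these placeholder numbers the tropical exceedance frequency is more than an order of magnitude higher for identical warming, which is the sense in which the extreme-frequency signal "emerges" earlier at low latitudes.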
Abstract:
A clinical Klebsiella pneumoniae isolate carrying the extended-spectrum beta-lactamase gene variants bla(SHV-40), bla(TEM-116) and bla(GES-7) was recovered. Cefoxitin and ceftazidime activity was most affected by the presence of these genes and an additional resistance to trimethoprim-sulphamethoxazole was observed. The bla(GES-7) gene was found to be inserted into a class 1 integron. These results show the emergence of novel bla(TEM) and bla(SHV) genes in Brazil. Moreover, the presence of class 1 integrons suggests a great potential for dissemination of bla(GES) genes into diverse nosocomial pathogens. Indeed, the bla(GES-7) gene was originally discovered in Enterobacter cloacae in Greece and, to our knowledge, has not been reported elsewhere.
Abstract:
In the present work, a group contribution method is proposed for estimating the viscosity of fatty compounds and biodiesel esters as a function of temperature. The databank used for regression of the group contribution parameters (1070 values for 65 types of substances) included fatty compounds such as fatty acids, methyl and ethyl esters, alcohols, tri- and diacylglycerols, and glycerol. The inclusion of new experimental data for fatty esters, a partial acylglycerol, and glycerol allowed a further refinement of this methodology, in comparison to a prior group contribution equation (Ceriani, R.; Goncalves, C. B.; Rabelo, J.; Caruso, M.; Cunha, A. C. C.; Cavaleri, F. W.; Batista, E. A. C.; Meirelles, A. J. A. Group contribution model for predicting viscosity of fatty compounds. J. Chem. Eng. Data 2007, 52, 965-972), for all classes of fatty compounds. In addition, the influence of small concentrations of partial acylglycerols, intermediate compounds in the transesterification reaction, on the viscosity of biodiesels was also investigated.
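The additive structure of a group contribution model can be sketched as follows. The functional form and all coefficients below are illustrative placeholders, not the fitted equation of the paper: each functional group k, occurring N_k times in the molecule, adds a temperature-dependent term to ln(viscosity).

```python
# Hypothetical group-contribution viscosity sketch: ln(eta) is a sum of
# per-group terms A_k + B_k / T. Coefficients are made-up placeholders,
# chosen only so that viscosity decreases with temperature.
PARAMS = {
    "CH3": (-0.10, 150.0),
    "CH2": (-0.05, 90.0),
    "COOH": (0.20, 400.0),
}

def ln_viscosity(groups, T):
    """ln(eta) = sum over groups of N_k * (A_k + B_k / T); T in kelvin."""
    return sum(n * (PARAMS[g][0] + PARAMS[g][1] / T) for g, n in groups.items())

# e.g. a schematic fatty acid CH3-(CH2)7-COOH
acid = {"CH3": 1, "CH2": 7, "COOH": 1}
```

Fitting such a model amounts to regressing the measured ln(viscosity) values of the databank on the group counts, which is what the 1070-point regression described above does for the real functional form.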
Abstract:
The evolution of commodity computing led to the possibility of efficiently using interconnected machines to solve computationally intensive tasks that were previously solvable only on expensive supercomputers. This, however, required new methods for process scheduling and distribution that consider network latency, communication cost, heterogeneous environments and distributed computing constraints. An efficient distribution of processes over such environments requires an adequate scheduling strategy, as the cost of inefficient process allocation is unacceptably high. Therefore, knowledge and prediction of application behavior are essential for effective scheduling. In this paper, we overview the evolution of scheduling approaches, focusing on distributed environments. We also evaluate current approaches for process behavior extraction and prediction, aiming to select an adequate technique for online prediction of application execution. Based on this evaluation, we propose a novel model for application behavior prediction that considers the chaotic properties of such behavior and automatically detects critical execution points. The proposed model is applied and evaluated for process scheduling in cluster and grid computing environments. The obtained results demonstrate that prediction of process behavior is essential for efficient scheduling in large-scale and heterogeneous distributed environments, outperforming conventional scheduling policies by a factor of 10, and even more in some cases. Furthermore, the proposed approach proves to be efficient for online prediction due to its low computational cost and good precision. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Process scheduling techniques consider the current load situation when allocating computing resources. These techniques rely on approximations, such as averages of communication, processing, and memory access behavior, to improve process scheduling, although processes may behave differently over the course of their execution: they may start with high communication requirements and later shift to pure processing. By discovering how processes behave over time, we believe it is possible to improve resource allocation. This motivated the present paper, which adopts chaos theory concepts and nonlinear prediction techniques to model and predict process behavior. Results confirm that the radial basis function technique provides good predictions with low processing demands, which is essential in a real distributed environment.
Abstract:
This study investigates the numerical simulation of three-dimensional time-dependent viscoelastic free surface flows using the Upper-Convected Maxwell (UCM) constitutive equation and an explicit algebraic model. The investigation was carried out to develop a simplified approach that can be applied to the extrudate swell problem. The relevant physics of this flow phenomenon is discussed in the paper and an algebraic model for predicting extrudate swell is presented. It is based on an explicit algebraic representation of the non-Newtonian extra-stress through a kinematic tensor formed with the scaled dyadic product of the velocity field. The elasticity of the fluid is governed by a single transport equation for a scalar quantity with the dimension of strain rate. The mass and momentum conservation equations and the constitutive equation (UCM or algebraic model) were solved by a three-dimensional time-dependent finite difference method. The free surface of the fluid was modeled using a marker-and-cell approach. The algebraic model was validated by comparing the numerical predictions with analytic solutions for pipe flow. In comparison with the classical UCM model, one advantage of this approach is that the computational workload is substantially reduced: the UCM model employs six differential equations, while the algebraic model uses only one. The results showed stable flows with very large extrudate growths, beyond those usually obtained with standard differential viscoelastic models. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
We have studied an agent model that exhibits the emergence of sexual barriers through the onset of assortative mating, a condition that might lead to sympatric speciation. In the model, individuals are characterized by two traits, each determined by a single locus, A or B. Heterozygotes at A are penalized by introducing an adaptive difference from homozygotes. Two niches are available, and each A homozygote is adapted to one of them. The second trait, called the marker trait, has no bearing on fitness. The model includes mating preferences, which are inherited from the mother and subject to random variations. A parameter controlling the recombination probability of the two loci is also introduced. We study the phase diagram by means of simulations, in the space of parameters (adaptive difference, carrying capacity, recombination probability). Three phases are found, characterized by (i) assortative mating, (ii) extinction of one of the A alleles and (iii) Hardy-Weinberg-like equilibrium. We also perturb these phases to test their robustness. Assortative mating can be gained or lost through changes that present hysteresis loops, showing that the resulting equilibrium has partial memory of the initial state and that the process of going from a polymorphic panmictic phase to a phase where assortative mating acts as a sexual barrier can be described as a first-order transition. (C) 2009 Published by Elsevier Ltd.
Abstract:
Scenarios for the emergence or bootstrap of a lexicon involve the repeated interaction between at least two agents who must reach a consensus on how to name N objects using H words. Here we consider minimal models of two types of learning algorithms: cross-situational learning, in which the individuals determine the meaning of a word by looking for something in common across all observed uses of that word, and supervised operant conditioning learning, in which there is strong feedback between individuals about the intended meaning of the words. Despite the stark differences between these learning schemes, we show that they yield the same communication accuracy in the limits of large N and H, which coincides with the result of the classical occupancy problem of randomly assigning N objects to H words.
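The random-assignment baseline this result reduces to can be simulated directly. One natural reading (an assumption here, not the paper's learning models): each of the N objects independently receives a word drawn uniformly from H, and a hearer who receives a word guesses uniformly among the objects sharing it, so per-trial accuracy equals the number of distinct words used divided by N.

```python
import random

def communication_accuracy(N, H, trials=500, seed=1):
    """Monte Carlo estimate of the occupancy-problem communication accuracy.
    An object in a word-cell of size k is identified with probability 1/k;
    summing k * (1/k) over occupied cells gives (#distinct words)/N per trial."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        words = [rng.randrange(H) for _ in range(N)]
        total += len(set(words)) / N
    return total / trials

# The same expectation in closed form: (H/N) * (1 - (1 - 1/H)**N);
# for N = H both tend to 1 - 1/e as N grows.
acc_50 = communication_accuracy(50, 50)
```

For N = H = 50 the simulated accuracy is close to 1 - (1 - 1/50)**50, roughly 0.64, matching the occupancy-problem limit the abstract refers to.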
Abstract:
Managing software maintenance is rarely a precise task, due to uncertainties in resource and service descriptions. Even when a well-established maintenance process is followed, the risk of delaying tasks remains if new services are not precisely described or if resources change during process execution. Moreover, the delay of a task at an early process stage may translate into a different delay at the end of the process, depending on complexity or service reliability requirements. This paper presents a knowledge-based representation (Bayesian networks) of maintenance project delays, based on specialists' experience, together with a corresponding tool to help manage software maintenance projects. (c) 2006 Elsevier Ltd. All rights reserved.
Abstract:
The purpose of this article is to present a new method for predicting the response variable of an observation in a new cluster in a multilevel logistic regression. The central idea is based on the empirical best estimator for the random effect. Two estimation methods for the multilevel model are compared: penalized quasi-likelihood and Gauss-Hermite quadrature. Performance measures for predicting the probability of a new cluster observation under the multilevel logistic model, in comparison with the usual logistic model, are examined through simulations and an application.
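For a new cluster there is no data to estimate the random effect, so one natural prediction integrates it out. The sketch below shows the Gauss-Hermite quadrature step for a random-intercept logistic model (only the quadrature, under assumed values of the linear predictor and random-effect standard deviation; the paper's model fitting and empirical-best-estimator step are not reproduced here).

```python
import math
import numpy as np

def marginal_prob_new_cluster(x_beta, sigma_u, nodes=25):
    """Marginal success probability for an observation in a new cluster:
    integrate the inverse-logit over the N(0, sigma_u^2) random intercept,
      int logit^{-1}(x_beta + u) dN(0, sigma_u^2)(u)
        ~= (1/sqrt(pi)) * sum_i w_i * logit^{-1}(x_beta + sqrt(2)*sigma_u*z_i),
    using physicists' Gauss-Hermite nodes z_i and weights w_i."""
    z, w = np.polynomial.hermite.hermgauss(nodes)
    p = 1.0 / (1.0 + np.exp(-(x_beta + math.sqrt(2.0) * sigma_u * z)))
    return float(np.dot(w, p) / math.sqrt(math.pi))

p_no_re = marginal_prob_new_cluster(0.5, 0.0)    # sigma_u = 0: plain logistic
p_sym = marginal_prob_new_cluster(0.0, 1.5)      # symmetric about 0.5
p_shrunk = marginal_prob_new_cluster(1.0, 2.0)   # attenuated toward 0.5
```

Note the attenuation: for a positive linear predictor, averaging the S-shaped inverse-logit over the random effect pulls the marginal probability toward 0.5, which is why cluster-specific and marginal predictions differ.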