876 results for 2447: modelling and forecasting
Abstract:
In this work, compliant actuators are developed by coupling braided structures with polymer gels, producing work through controlled gel swelling in the presence of water. Several aspects of the engineering of gel actuators were studied, including gel selection, modelling and experimental characterization of constant-force and constant-displacement behaviour, and response time. The actuator was intended for use as a vibration neutralizer; for this purpose, a force of 10 N had to be generated within one second. Results were promising in terms of force generation, although the response time was still longer than required. In addition, the easiest way to obtain reversibility of the effect is still under discussion: possible routes for improvement are suggested and will be the object of future work.
Abstract:
The problem of modeling solar energetic particle (SEP) events is important to both space weather research and forecasting, and yet it has seen relatively little progress. Most important SEP events are associated with coronal mass ejections (CMEs) that drive coronal and interplanetary shocks. These shocks can continuously produce accelerated particles from the ambient medium to well beyond 1 AU. This paper describes an effort to model real SEP events using a Center for Integrated Space Weather Modeling (CISM) MHD solar wind simulation that includes a cone model of CMEs to initiate the related shocks. In addition to providing observation-inspired shock geometry and characteristics, this MHD simulation describes the time-dependent connections of the observer's field line to the shock source. As a first approximation, we assume a shock jump-parameterized source strength and spectrum, and scatter-free transport outside of the shock source, thus emphasizing the role the shock evolution plays in determining the modeled SEP event profile. Three halo CME events, on May 12, 1997, November 4, 1997, and December 13, 2006, are used to test the modeling approach. While challenges arise in the identification and characterization of the shocks in the MHD model results, this approach illustrates the importance to SEP event modeling of globally simulating the underlying heliospheric event. The results also suggest the potential utility of such a model for forecasting and for the interpretation of separated multipoint measurements such as those expected from the STEREO mission.
Abstract:
The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) is a World Weather Research Programme project. One of its main objectives is to enhance collaboration on the development of ensemble prediction between operational centers and universities by increasing the availability of ensemble prediction system (EPS) data for research. This study analyzes the prediction of Northern Hemisphere extratropical cyclones by nine different EPSs archived as part of the TIGGE project over the 6-month period 1 February 2008–31 July 2008, which included a sample of 774 cyclones. An objective feature-tracking method has been used to identify and track the cyclones along the forecast trajectories. Forecast verification statistics have then been produced [using the European Centre for Medium-Range Weather Forecasts (ECMWF) operational analysis as the truth] for cyclone position, intensity, and propagation speed, showing large differences between the different EPSs. The results show that the ECMWF ensemble mean and control have the highest level of skill for all cyclone properties. The Japan Meteorological Agency (JMA), the National Centers for Environmental Prediction (NCEP), the Met Office (UKMO), and the Canadian Meteorological Centre (CMC) have 1 day less skill for the position of cyclones throughout the forecast range. The relative performance of the different EPSs remains the same for cyclone intensity except for NCEP, which has larger errors than for position. NCEP, the Centro de Previsão de Tempo e Estudos Climáticos (CPTEC), and the Australian Bureau of Meteorology (BoM) all have faster intensity error growth in the earlier part of the forecast. They are also very underdispersive and significantly underpredict intensities, perhaps because the comparatively low spatial resolutions of these EPSs cannot accurately represent the tilted structure essential to cyclone growth and decay.
There is very little difference between the levels of skill of the ensemble mean and control for cyclone position, but the ensemble mean provides an advantage over the control for all EPSs except CPTEC in cyclone intensity and there is an advantage for propagation speed for all EPSs. ECMWF and JMA have an excellent spread–skill relationship for cyclone position. The EPSs are all much more underdispersive for cyclone intensity and propagation speed than for position, with ECMWF and CMC performing best for intensity and CMC performing best for propagation speed. ECMWF is the only EPS to consistently overpredict cyclone intensity, although the bias is small. BoM, NCEP, UKMO, and CPTEC significantly underpredict intensity and, interestingly, all the EPSs underpredict the propagation speed, that is, the cyclones move too slowly on average in all EPSs.
Abstract:
1. Estimates of seed bank depletion rates are essential for modelling and management of plant populations. The seed bag burial method is often used to measure seed mortality in the soil. However, the density of seeds within seed bags is higher than densities in natural seed banks, which may elevate levels of pathogens and influence seed mortality. The aim of this study was to quantify the effects of fungi and of seed density within buried mesh bags on the mortality of seeds. Striga hermonthica was chosen as the study species because it has been widely studied, yet different methods for measuring seed mortality in the soil have yielded contradictory estimates. 2. Seed bags were buried in soil and exhumed at regular time intervals to monitor mortality of the seeds in three field experiments during two rainy seasons. The effect of fungal activity on seed mortality was evaluated in a fungal-exclusion experiment. Different levels of seed-to-seed interaction were obtained by using two and four seed densities within the bags in consecutive years. Densities were created by mixing 1000 seeds with 0, 10, 100 or 1000 g of coarse sand. 3. The mortality rate was significantly lower when fungi were excluded, indicating the possible role of pathogenic fungi. 4. Decreasing the density of seeds in bags significantly reduced seed mortality, most probably because of decreased seed-to-seed contamination by pathogenic fungi. 5. Synthesis and applications. Models of plant populations in general, and annual weeds in particular, often use values from the literature for seed bank depletion rates. These depletion rates have often been estimated by the seed bag burial method, yet seed density within seed bags may be unrealistically high. Consequently, estimates of seed mortality rates may be too high because of an overestimation of the effects of soil- or seed-borne pathogens.
Species that have been classified from such studies as having short-lived seed banks may need to be re-assessed using realistic densities either within seed bags or otherwise. Similarly, models of seed bank dynamics based on such overestimated depletion rates may lead to incorrect conclusions regarding the seed banks and, perhaps, the management of weeds and rare species.
Abstract:
Statistical approaches have been applied to examine amino acid pairing preferences within parallel beta-sheets. The main chain hydrogen bonding pattern in parallel beta-sheets means that, for each residue pair, only one of the residues is involved in main chain hydrogen bonding with the strand containing the partner residue. We call this the hydrogen bonded (HB) residue and the partner residue the non-hydrogen bonded (nHB) residue, and differentiate between the favourability of a pair and that of its reverse pair, e.g. Asn(HB)-Thr(nHB) versus Thr(HB)-Asn(nHB). Significantly (p <= 0.000001) favoured pairings were rationalised using stereochemical arguments. For instance, Asn(HB)-Thr(nHB) and Arg(HB)-Thr(nHB) were favoured pairs, where the residues adopted favoured chi1 rotamer positions that allowed side-chain interactions to occur. In contrast, Thr(HB)-Asn(nHB) and Thr(HB)-Arg(nHB) were not significantly favoured, and could only form side-chain interactions if the residues involved adopted less favourable chi1 conformations. The favourability of hydrophobic pairs, e.g. Ile(HB)-Ile(nHB), Val(HB)-Val(nHB) and Leu(HB)-Ile(nHB), was explained by the residues adopting their most preferred chi1 and chi2 conformations, which enabled them to form nested arrangements. Cysteine-cysteine pairs are significantly favoured, although these do not form intrasheet disulphide bridges. Interactions between positively and negatively charged residues were asymmetrically preferred: those with the negatively charged residue at the HB position were more favoured. This trend was accounted for by the presence of general electrostatic interactions, which, based on analysis of distances between charged atoms, were likely to be stronger when the negatively charged residue is the HB partner. The Arg(HB)-Asp(nHB) interaction was an exception to this trend and its favourability was rationalised by the formation of specific side-chain interactions.
This research provides rules that could be applied to protein structure prediction, comparative modelling and protein engineering and design. The methods used to analyse the pairing preferences are automated and detailed results are available (http://www.rubic.rdg.ac.uk/betapairprefsparallel/).
Abstract:
The importance of dispersal for the maintenance of biodiversity, while long recognized, has remained difficult to quantify. We used molecular markers to measure effective dispersal in a natural population of the vertebrate-dispersed Neotropical tree Simarouba amara (Simaroubaceae), comparing the distances between maternal parents and their offspring, and gene movement via seed versus pollen, in the 50-ha plot of the Barro Colorado Island forest, central Panama. In all cases (parent-pair, mother-offspring, father-offspring, sib-sib) distances between related pairs were significantly greater than distances to the nearest possible neighbours within each category. Long-distance seedling establishment was frequent: 74% of assigned seedlings established > 100 m from the maternal parent [mean = 392 +/- 234.6 m (SD), range = 9.3-1000.5 m] and pollen-mediated gene flow was comparable to that of seed [mean = 345.0 +/- 157.7 m (SD), range = 57.6-739.7 m]. For S. amara we found approximately a 10-fold difference between dispersal distances estimated by inverse modelling and mean seedling recruitment distances (39 m vs. 392 m). Our findings have important implications for future studies of forest demography and regeneration, with most seedlings establishing at distances far beyond those at which negative density-dependent effects have been demonstrated.
Abstract:
Four unsaturated aminopyranosides have been prepared as possible transition-state mimics targeted towards carbohydrate-processing enzymes. The conformations of the protonated aminosugars have been investigated by molecular modelling, and their ability to inhibit alpha- and beta-glucosidases and an alpha-mannosidase has been probed. Two of the targets proved to be moderate inhibitors of alpha-glucosidases from Brewer's yeast and Bacillus stearothermophilus.
Abstract:
Xyloglucan-acting enzymes are believed to affect the mechanical properties of type I primary plant cell walls. To better understand these effects, a range of enzymes with different in vitro modes of action were tested against cell wall analogues (bio-composite materials based on Acetobacter xylinus cellulose and xyloglucan). Tomato pericarp xyloglucan endotransglycosylase (tXET) and nasturtium seed xyloglucanase (nXGase) were produced heterologously in Pichia pastoris. Their action against the cell wall analogues was compared with that of a commercial preparation of Trichoderma endo-glucanase (EndoGase). Both 'hydrolytic' enzymes (nXGase and EndoGase) were able to depolymerise not only the cross-link xyloglucan fraction but also the surface-bound fraction. Consequent major changes in cellulose fibril architecture were observed. In mechanical terms, removal of xyloglucan cross-links from composites resulted in increased stiffness (at high strain) and decreased visco-elasticity with similar extensibility. On the other hand, true transglycosylase activity (tXET) did not affect the cellulose/xyloglucan ratio. No change in composite stiffness or extensibility resulted, but a significant increase in creep behaviour was observed in the presence of active tXET. These results provide direct in vitro evidence for the involvement of cell wall xyloglucan-specific enzymes in the mechanical changes underlying plant cell wall remodelling and growth. The mechanical consequences of tXET action are shown to be complementary to those of cucumber expansin.
Abstract:
Hidden Markov Models (HMMs) have been successfully applied in recent years to modelling and classification problems in a variety of areas. An important step in using HMMs is the initialisation of the model parameters, since the subsequent learning of the HMM's parameters depends on these initial values. This initialisation should take into account knowledge about the problem being addressed, and can also employ optimisation techniques to estimate the best initial parameters under a given cost function and, consequently, the best log-likelihood. This paper proposes initialising Hidden Markov Model parameters using the Differential Evolution optimisation algorithm, with the aim of obtaining the best log-likelihood.
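The idea above can be sketched as follows. This is an illustrative toy (not the paper's actual implementation): unconstrained candidate vectors are mapped to stochastic HMM parameters via row-wise softmax, scored by the forward-algorithm log-likelihood, and searched with SciPy's `differential_evolution`. All names, bounds and the toy data are assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

def forward_loglik(pi, A, B, obs):
    """Scaled forward algorithm: log-likelihood of discrete sequence `obs`
    under an HMM with initial dist pi, transitions A, emissions B."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        loglik += np.log(s)
        alpha = alpha / s
    return loglik

def unpack(x, n_states, n_symbols):
    """Map an unconstrained vector to row-stochastic pi, A, B via softmax."""
    def softmax(v):
        e = np.exp(v - v.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)
    k = n_states
    pi = softmax(x[:k])
    A = softmax(x[k:k + n_states * n_states].reshape(n_states, n_states))
    B = softmax(x[k + n_states * n_states:].reshape(n_states, n_symbols))
    return pi, A, B

def neg_loglik(x, obs, n_states, n_symbols):
    pi, A, B = unpack(x, n_states, n_symbols)
    return -forward_loglik(pi, A, B, obs)

rng = np.random.default_rng(0)
obs = rng.integers(0, 2, size=50)            # toy binary observation sequence
n_states, n_symbols = 2, 2
dim = n_states + n_states ** 2 + n_states * n_symbols
result = differential_evolution(
    neg_loglik, bounds=[(-3, 3)] * dim,
    args=(obs, n_states, n_symbols), seed=0, maxiter=40, tol=1e-7)
pi0, A0, B0 = unpack(result.x, n_states, n_symbols)  # DE-chosen starting point
```

The DE-selected `pi0`, `A0`, `B0` would then serve as the starting point for a conventional Baum-Welch refinement.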
Abstract:
In the forecasting of binary events, verification measures that are “equitable” were defined by Gandin and Murphy to satisfy two requirements: 1) they award all random forecasting systems, including those that always issue the same forecast, the same expected score (typically zero), and 2) they are expressible as the linear weighted sum of the elements of the contingency table, where the weights are independent of the entries in the table, apart from the base rate. The authors demonstrate that the widely used “equitable threat score” (ETS), as well as numerous others, satisfies neither of these requirements and only satisfies the first requirement in the limit of an infinite sample size. Such measures are referred to as “asymptotically equitable.” In the case of ETS, the expected score of a random forecasting system is always positive and only falls below 0.01 when the number of samples is greater than around 30. Two other asymptotically equitable measures are the odds ratio skill score and the symmetric extreme dependency score, which are more strongly inequitable than ETS, particularly for rare events; for example, when the base rate is 2% and the sample size is 1000, random but unbiased forecasting systems yield an expected score of around −0.5, reducing in magnitude to −0.01 or smaller only for sample sizes exceeding 25 000. This presents a problem since these nonlinear measures have other desirable properties, in particular being reliable indicators of skill for rare events (provided that the sample size is large enough). A potential way to reconcile these properties with equitability is to recognize that Gandin and Murphy’s two requirements are independent, and the second can be safely discarded without losing the key advantages of equitability that are embodied in the first. 
This enables inequitable and asymptotically equitable measures to be scaled to make them equitable, while retaining their nonlinearity and other properties such as being reliable indicators of skill for rare events. It also opens up the possibility of designing new equitable verification measures.
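The finite-sample behaviour described above is easy to check numerically. The sketch below (an illustration, not the authors' code) computes ETS from the standard 2x2 contingency table, with the chance-expected hits a_r = (a+b)(a+c)/n, and estimates by Monte Carlo the expected score of a random, unbiased forecasting system at a small sample size; the base rate and sample size are assumed values for the demonstration.

```python
import numpy as np

def ets(a, b, c, d):
    """Equitable threat score from hits a, false alarms b, misses c,
    correct negatives d."""
    n = a + b + c + d
    a_r = (a + b) * (a + c) / n          # hits expected by chance
    denom = a + b + c - a_r
    return (a - a_r) / denom if denom else 0.0

rng = np.random.default_rng(42)
n, base_rate, trials = 30, 0.5, 20000    # small sample, as discussed above
scores = []
for _ in range(trials):
    obs = rng.random(n) < base_rate      # observed events
    fc = rng.random(n) < base_rate       # random, unbiased forecasts
    a = int(np.sum(fc & obs)); b = int(np.sum(fc & ~obs))
    c = int(np.sum(~fc & obs)); d = int(np.sum(~fc & ~obs))
    scores.append(ets(a, b, c, d))
mean_ets = np.mean(scores)               # positive: ETS is only asymptotically equitable
```

A truly equitable measure would give `mean_ets` of zero here; the positive value at small n illustrates the paper's point.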
Abstract:
The Group on Earth Observations System of Systems, GEOSS, is a co-ordinated initiative by many nations to address the needs for earth-system information expressed by the 2002 World Summit on Sustainable Development. We discuss the role of earth-system modelling and data assimilation in transforming earth-system observations into the predictive and status-assessment products required by GEOSS, across many areas of socio-economic interest. First we review recent gains in the predictive skill of operational global earth-system models, on time-scales of days to several seasons. We then discuss recent work to develop from the global predictions a diverse set of end-user applications which can meet GEOSS requirements for information of socio-economic benefit; examples include forecasts of coastal storm surges, floods in large river basins, seasonal crop yield forecasts and seasonal lead-time alerts for malaria epidemics. We note ongoing efforts to extend operational earth-system modelling and assimilation capabilities to atmospheric composition, in support of improved services for air-quality forecasts and for treaty assessment. We next sketch likely GEOSS observational requirements in the coming decades. In concluding, we reflect on the cost of earth observations relative to the modest cost of transforming the observations into information of socio-economic value.
Abstract:
We explore the potential for making statistical decadal predictions of sea surface temperatures (SSTs) in a perfect-model analysis, with a focus on the Atlantic basin. Various statistical methods (lagged correlations, linear inverse modelling and constructed analogue) are found to have significant skill in predicting the internal variability of Atlantic SSTs for up to a decade ahead in control integrations of two different global climate models (GCMs), namely HadCM3 and HadGEM1. Statistical methods which consider non-local information tend to perform best, but which method is most successful depends on the region considered, the GCM data used and the prediction lead time. However, the constructed analogue method tends to have the highest skill at longer lead times. Importantly, the regions of greatest prediction skill can be very different from the regions identified as potentially predictable from variance-explained arguments. This finding suggests that significant local decadal variability is not necessarily a prerequisite for skillful decadal predictions, and that the statistical methods are capturing some of the dynamics of low-frequency SST evolution. In particular, using data from HadGEM1, significant skill at lead times of 6–10 years is found in the tropical North Atlantic, a region with relatively little decadal variability compared to interannual variability. This skill appears to come from reconstructing the SSTs in the far north Atlantic, suggesting that the more northern latitudes are optimal locations for SST observations to improve predictions. We additionally explore whether adding sub-surface temperature data improves these decadal statistical predictions, and find that, again, this depends on the region, prediction lead time and GCM data used. Overall, we argue that the estimated prediction skill motivates the further development of statistical decadal predictions of SSTs as a benchmark for current and future GCM-based decadal climate predictions.
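The constructed analogue method mentioned above can be sketched in a few lines. In this toy illustration (synthetic data, all names assumed, not the study's code), the current state is expressed as a ridge-regularised least-squares combination of past library states, and the same weights are applied to the library states observed one lead time later:

```python
import numpy as np

def constructed_analogue(library_now, library_later, state_now, ridge=1e-3):
    """library_now:   (n_analogues, n_points) past initial states
       library_later: (n_analogues, n_points) same cases, one lead time on
       state_now:     (n_points,) current state to forecast from"""
    L = library_now
    # ridge-regularised least squares for the analogue weights
    w = np.linalg.solve(L @ L.T + ridge * np.eye(len(L)), L @ state_now)
    return w @ library_later             # apply weights to the evolved states

# toy data: a slowly oscillating field sampled at 5 points over 80 "years"
t = np.arange(80)
field = np.sin(2 * np.pi * t[:, None] / 40 + np.linspace(0, 1, 5)[None, :])
lead = 5                                  # forecast lead time in "years"
lib_now, lib_later = field[:60], field[lead:60 + lead]
forecast = constructed_analogue(lib_now, lib_later, field[60])
truth = field[60 + lead]
```

Because the toy dynamics are linear, the analogue forecast tracks the truth closely; in the real application the library states would be observed or simulated SST (and, optionally, sub-surface temperature) fields.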