965 results for PREDICTIONS


Relevance: 20.00%

Publisher:

Abstract:

Understanding and predicting the consequences of warming for complex ecosystems and indeed individual species remains a major ecological challenge. Here, we investigated the effect of increased seawater temperatures on the metabolic and consumption rates of five distinct marine species. The experimental species reflected different trophic positions within a typical benthic East Atlantic food web, and included a herbivorous gastropod, a scavenging decapod, a predatory echinoderm, a decapod and a benthic-feeding fish. We examined the metabolism-body mass and consumption-body mass scaling for each species, and assessed changes in their consumption efficiencies. Our results indicate that body mass and temperature effects on metabolism were inconsistent across species and that some species were unable to meet metabolic demand at higher temperatures, thus highlighting the vulnerability of individual species to warming. While body size explains a large proportion of the variation in species' physiological responses to warming, it is clear that idiosyncratic species responses, irrespective of body size, complicate predictions of population and ecosystem level response to future scenarios of climate change. © 2012 The Royal Society.
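As an illustration of the metabolism-body mass scaling discussed above, the sketch below fits a generic metabolic-theory-style model, rate proportional to M^b times a Boltzmann temperature factor, to per-species measurements. It is a minimal sketch with synthetic data and hypothetical parameter values, not the authors' statistical treatment.

```python
# Minimal sketch (not the authors' analysis): fitting a metabolic-theory-style
# model ln(I) = ln(i0) + b*ln(M) - E/(k*T) to per-species measurements, so that
# the mass exponent b and activation energy E can be compared across species.
import numpy as np

K_BOLTZ = 8.617e-5  # eV K^-1

def fit_metabolic_scaling(mass_g, temp_K, metabolic_rate):
    """Ordinary least-squares fit of ln(rate) on ln(mass) and 1/(kT).

    Returns (ln_i0, mass_exponent_b, activation_energy_E_eV).
    Inputs are 1-D arrays of equal length; all data here are hypothetical.
    """
    X = np.column_stack([
        np.ones_like(mass_g),
        np.log(mass_g),
        -1.0 / (K_BOLTZ * temp_K),
    ])
    coef, *_ = np.linalg.lstsq(X, np.log(metabolic_rate), rcond=None)
    return coef[0], coef[1], coef[2]

# Hypothetical example: one species measured at two temperatures.
mass = np.array([2.0, 5.0, 12.0, 2.0, 5.0, 12.0])           # g
temp = np.array([283.15] * 3 + [293.15] * 3)                 # K
rate = 0.3 * mass**0.75 * np.exp(-0.65 / (K_BOLTZ * temp))   # synthetic rates
print(fit_metabolic_scaling(mass, temp, rate))
```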

Relevance: 20.00%

Publisher:

Abstract:

The speeds of sound in dibromomethane, bromochloromethane, and dichloromethane have been measured in the temperature range from 293.15 to 313.15 K and at pressures up to 100 MPa. Densities and isobaric heat capacities at atmospheric pressure have also been determined. The experimental results were used to calculate the densities and isobaric heat capacities as functions of temperature and pressure by means of a numerical integration technique. Moreover, the experimental data at atmospheric pressure were then used to determine the SAFT-VR Mie molecular parameters for these liquids. The accuracy of the model was then evaluated by comparing the derived experimental high-pressure data with those predicted by SAFT. It was found that the model also makes it possible to predict the isobaric heat capacity of all the selected haloalkanes to within an error of 6%.
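For readers unfamiliar with the acoustic route from speed-of-sound measurements to high-pressure densities, the sketch below shows a simplified step-wise integration of (d rho/d p)_T = 1/u^2 + T*alpha^2/c_p along an isotherm. It holds c_p and alpha constant and uses hypothetical input values, so it illustrates the idea rather than the numerical procedure used in the paper.

```python
# Minimal sketch (simplified, hypothetical inputs) of the acoustic method:
# integrate (d rho/d p)_T = 1/u^2 + T*alpha^2/c_p from atmospheric pressure
# upward.  A full treatment also updates c_p and alpha at each pressure step
# across a grid of isotherms; that refinement is omitted here.
import numpy as np

def integrate_density(p_grid, speed_of_sound, T, rho0, cp, alpha):
    """Integrate density along an isotherm from p_grid[0] upward.

    p_grid         : pressures in Pa (ascending)
    speed_of_sound : u(p) in m/s on the same grid
    T              : temperature in K
    rho0           : density at p_grid[0] in kg/m^3
    cp             : isobaric heat capacity in J/(kg K), taken constant here
    alpha          : isobaric expansivity in 1/K, taken constant here
    """
    rho = np.empty_like(p_grid, dtype=float)
    rho[0] = rho0
    for i in range(1, len(p_grid)):
        dp = p_grid[i] - p_grid[i - 1]
        drho_dp = 1.0 / speed_of_sound[i - 1] ** 2 + T * alpha ** 2 / cp
        rho[i] = rho[i - 1] + drho_dp * dp
    return rho

# Hypothetical numbers, roughly of the right order for a haloalkane at 298 K.
p = np.linspace(0.1e6, 100e6, 200)
u = 1000.0 + 2.5e-6 * (p - p[0])      # synthetic u(p) in m/s
print(integrate_density(p, u, 298.15, 2480.0, 1000.0, 1.1e-3)[-1])
```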

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND: Evolution equipped Bdellovibrio bacteriovorus predatory bacteria to invade other bacteria, digesting and replicating while sealed within them, thus preventing nutrient-sharing with organisms in the surrounding environment. Bdellovibrio were previously described as "obligate predators" because only by mutations, often in gene bd0108, are roughly 1 in 10^7 predatory lab strains of Bdellovibrio converted to prey-independent growth. A previous genomic analysis of B. bacteriovorus strain HD100 suggested that predatory consumption of prey DNA by lytic enzymes made Bdellovibrio less likely than other bacteria to acquire DNA by lateral gene transfer (LGT). However, the Doolittle and Pan groups predicted, in silico, both ancient and recent lateral gene transfer into the B. bacteriovorus HD100 genome.

RESULTS: To test these predictions, we isolated a predatory bacterium from the River Tiber (a good potential source of LGT, as it is rich in diverse bacteria and organic pollutants) by enrichment culturing with E. coli prey cells. The isolate was identified as B. bacteriovorus and named strain Tiberius. Unusually, this Tiberius strain showed simultaneous prey-independent growth on organic nutrients and predatory growth on live prey. Despite the prey-independent growth, the homolog of bd0108 did not have typical prey-independent-type mutations. The dual growth mode may reflect the high carbon content of the river, and gives B. bacteriovorus Tiberius extended non-predatory contact with the other bacteria present. The HD100 and Tiberius genomes were extensively syntenic despite their different cultured-terrestrial versus freshly-isolated aquatic histories, but there were significant differences in gene content indicative of genomic flux and LGT. Gene content comparisons support previously published in silico predictions for LGT in strain HD100, with substantial conservation of genes predicted to have ancient LGT origins but little conservation of AT-rich genes predicted to be recently acquired.

CONCLUSIONS: The natural niche and the dual predatory and prey-independent growth of the B. bacteriovorus Tiberius strain afforded it extensive non-predatory contact with other marine and freshwater bacteria, from which LGT is evident in its genome. Thus, despite their arsenal of DNA-lytic enzymes, Bdellovibrio are not always predatory in natural niches, and their genomes are shaped by acquiring whole genes from other bacteria.
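One common proxy behind statements such as "AT-rich genes predicted to be recently acquired" is a simple per-gene GC-content screen against the genome-wide mean. The sketch below illustrates that idea with toy sequences and an arbitrary cutoff; it is not the in silico method used by the groups cited above.

```python
# Minimal sketch of one common proxy used in LGT screens: flagging genes whose
# GC content is unusually low (AT-rich) relative to the genome-wide mean.
# The sequences and the standard-deviation cutoff below are illustrative only.
import statistics

def gc_fraction(seq):
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def flag_at_rich(genes, n_sd=2.0):
    """genes: dict of gene_id -> nucleotide sequence.  Returns AT-rich gene ids."""
    gc = {g: gc_fraction(s) for g, s in genes.items()}
    mean = statistics.mean(gc.values())
    sd = statistics.stdev(gc.values())
    return [g for g, v in gc.items() if v < mean - n_sd * sd]

genes = {  # toy sequences
    "geneA": "ATGGCGCGCGGCATGCCGGGC",
    "geneB": "ATGAAATTTAATATAAAATAA",
    "geneC": "ATGGGCGCCATTGGCGCATGA",
}
print(flag_at_rich(genes, n_sd=1.0))   # flags the AT-rich toy gene
```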

Relevance: 20.00%

Publisher:

Abstract:

Context. Although the question of progenitor systems and detailed explosion mechanisms remains a matter of discussion, it is commonly believed that Type Ia supernovae (SNe Ia) are production sites of large amounts of radioactive nuclei. Even though the gamma-ray emission due to radioactive decays is responsible for powering the light curves of SNe Ia, gamma rays themselves are of particular interest as a diagnostic tool because they directly lead to deeper insight into the nucleosynthesis and the kinematics of these explosion events. Aims: We study the evolution of gamma-ray line and continuum emission of SNe Ia with the objective of analyzing the relevance of observations in this energy range. We seek to investigate the chances for the success of future MeV missions regarding their capabilities for constraining the intrinsic properties and the physical processes of SNe Ia. Methods: Focusing on two of the most broadly discussed SN Ia progenitor scenarios, a delayed detonation in a Chandrasekhar-mass white dwarf (WD) and a violent merger of two WDs, we used three-dimensional explosion models and performed radiative transfer simulations to obtain synthetic gamma-ray spectra. Both chosen models produce the same mass of 56Ni and have similar optical properties that are in reasonable agreement with the recently observed supernova SN 2011fe. We examine the gamma-ray spectra with respect to their distinct features and draw connections to certain characteristics of the explosion models. Applying diagnostics such as line and hardness ratios, we discuss the detection prospects for future gamma-ray missions with higher sensitivities in the MeV energy range. Results: In contrast to the optical regime, the gamma-ray emission of our two chosen models proves to be quite different. The almost direct connection of the emission of gamma rays to fundamental physical processes occurring in SNe Ia permits additional constraints concerning several explosion model properties that are not easily accessible within other wavelength ranges. Proposed future MeV missions such as GRIPS will resolve all spectral details only for nearby SNe Ia, but hardness ratio and light curve measurements still allow for a distinction of the two different models at 10 Mpc and 16 Mpc for an exposure time of 10^6 s. The possibility of detecting the strongest line features up to the Virgo distance will offer the opportunity to build up a first sample of SN Ia detections in the gamma-ray energy range and underlines the importance of future space observatories for MeV gamma rays.
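A hardness ratio of the kind mentioned above is simply the ratio of counts in a hard energy band to counts in a soft band of the gamma-ray spectrum. The sketch below computes one for a synthetic binned spectrum; the band edges and the spectrum itself are placeholders, not the bands used in the paper.

```python
# Minimal sketch of a hardness-ratio diagnostic: the ratio of counts in a
# "hard" band to counts in a "soft" band of a binned gamma-ray spectrum.
# Band edges and the synthetic spectrum below are illustrative placeholders.
import numpy as np

def hardness_ratio(energy_keV, counts, soft_band, hard_band):
    """Return hard/soft count ratio for a binned spectrum.

    energy_keV : bin-centre energies
    counts     : counts per bin
    soft_band, hard_band : (lo, hi) tuples in keV
    """
    energy_keV = np.asarray(energy_keV)
    counts = np.asarray(counts)
    soft = counts[(energy_keV >= soft_band[0]) & (energy_keV < soft_band[1])].sum()
    hard = counts[(energy_keV >= hard_band[0]) & (energy_keV < hard_band[1])].sum()
    return hard / soft

# Synthetic spectrum: a falling continuum plus a line near the 56Co 847 keV line.
e = np.linspace(100, 3000, 300)
c = 50 * np.exp(-e / 1500) + 200 * np.exp(-0.5 * ((e - 847) / 20) ** 2)
print(hardness_ratio(e, c, soft_band=(100, 800), hard_band=(800, 3000)))
```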

Relevance: 20.00%

Publisher:

Abstract:

Response times in a visual object recognition task decrease significantly when the targets can be distinguished on the basis of two redundant attributes. The redundancy gain for two attributes is a common finding in the literature, but a gain produced by three redundant attributes had only been observed when the three attributes came from three different modalities (tactile, auditory and visual). The present study demonstrates that a redundancy gain for three attributes of the same modality is indeed possible. It also includes a more detailed investigation of the characteristics of the redundancy gain. Besides the decrease in response times, these include a particularly marked decrease in the minimal response times and an increase in the symmetry of the response time distribution. This study presents evidence that neither race models nor coactivation models can account for all of the characteristics of the redundancy gain. In this context, we introduce a new method for evaluating the triple redundancy gain based on the performance for doubly redundant targets. The cascade model is presented to explain the results of this study. This model comprises several processing channels that are triggered by a cascade of activations before a single decision criterion is satisfied. It offers a unified approach to previous research on the redundancy gain. The analysis of the characteristics of response time distributions, namely their mean, symmetry, shift and spread, is an essential tool for this study. It was important to find a statistical test capable of reflecting differences in all of these characteristics. We address the problem of analysing response times without loss of information, as well as the inadequacy of common analysis methods in this context, such as pooling the response times of several participants (e.g. Vincentizing). Distribution tests, the best known being the Kolmogorov-Smirnov test, are a better alternative for comparing distributions, response time distributions in particular. A test still unknown in psychology is introduced: the two-sample Anderson-Darling test. The two tests are compared, and we then present conclusive evidence of the power of the Anderson-Darling test: when comparing distributions that differ only in (1) their shift, (2) their spread, (3) their symmetry, or (4) their tails, the Anderson-Darling test detects the differences better. Moreover, the Anderson-Darling test has a Type I error rate that corresponds exactly to alpha, whereas the Kolmogorov-Smirnov test is too conservative. Consequently, the Anderson-Darling test requires less data to reach sufficient statistical power.
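The two-sample Anderson-Darling test discussed above is available in standard scientific software. The sketch below compares it with the two-sample Kolmogorov-Smirnov test on synthetic response-time samples that differ mainly in their tails; the data and effect sizes are illustrative only.

```python
# Minimal sketch comparing the two-sample Kolmogorov-Smirnov test with the
# k-sample Anderson-Darling test (here with k = 2) on synthetic "response
# time" samples that differ mainly in their tails.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Two ex-Gaussian-like RT samples with equalised means but different tail weight.
rt_a = 300 + rng.normal(0, 30, 400) + rng.exponential(40, 400)
rt_b = 300 + rng.normal(0, 30, 400) + rng.exponential(80, 400)
rt_b = rt_b - (rt_b.mean() - rt_a.mean())   # remove the mean difference

ks = stats.ks_2samp(rt_a, rt_b)
ad = stats.anderson_ksamp([rt_a, rt_b])

print(f"KS  D  = {ks.statistic:.3f}, p = {ks.pvalue:.4f}")
# anderson_ksamp reports an approximate p-value clipped to [0.001, 0.25].
print(f"AD  A2 = {ad.statistic:.3f}, approx p = {ad.significance_level:.4f}")
```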

Relevance: 20.00%

Publisher:

Abstract:

Data mining is one of the most active research areas today, with a wide variety of applications in everyday life. It is concerned with finding interesting hidden patterns in large historical databases. As an example, from a sales database one can find an interesting pattern such as "people who buy magazines tend to buy newspapers also" using data mining. From the sales point of view, the advantage is that these items can be placed together in the shop to increase sales. In this research work, data mining is applied to the domain of placement chance prediction, since taking a wise career decision is crucial for anybody. In India, technical manpower analysis is carried out by an organization named the National Technical Manpower Information System (NTMIS), established in 1983-84 by India's Ministry of Education & Culture. The NTMIS comprises a lead centre in the IAMR, New Delhi, and 21 nodal centres located in different parts of the country. The Kerala State Nodal Centre is located at Cochin University of Science and Technology. The nodal centre collects placement information by regularly sending postal questionnaires to students who have passed out. From the raw data available in the nodal centre, a history database was prepared. Each record in this database includes entrance rank range, reservation, sector, sex, and a particular engineering branch. For each such combination of attributes from the history database of student records, the corresponding placement chance is computed and stored in the history database. From this data, various popular data mining models are built and tested. These models can be used to predict the most suitable branch for a new student with one of the above combinations of criteria. A detailed performance comparison of the various data mining models is also carried out. This research work proposes to use a combination of data mining models, namely a hybrid stacking ensemble, for better predictions. Strategies to predict the overall absorption rate for various branches, as well as the time it takes for all the students of a particular branch to get placed, are also proposed. Finally, this research work puts forward a new data mining algorithm, namely C 4.5 * stat, for numeric data sets, which has been shown to have competitive accuracy on the standard UCI benchmark data sets. It also proposes an optimization strategy called parameter tuning to improve the standard C 4.5 algorithm. In summary, this research work passes through all four dimensions of a typical data mining research work, namely application to a domain, development of classifier models, optimization, and ensemble methods.
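A stacking ensemble of the general kind proposed above can be sketched as follows. The base learners, feature encoding and data are hypothetical stand-ins (a decision tree is used in place of the thesis' C 4.5-based learner), so this illustrates the technique rather than the thesis' exact hybrid.

```python
# Minimal sketch (not the thesis' exact hybrid) of a stacking ensemble for
# placement-chance prediction.  Feature names and data are hypothetical.
import numpy as np
from sklearn.ensemble import StackingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
# Hypothetical encoded records: [entrance-rank band, reservation, sector, sex, branch]
X = rng.integers(0, 5, size=(500, 5)).astype(float)
y = (X[:, 0] + X[:, 4] + rng.normal(0, 1, 500) > 4).astype(int)  # placed / not placed

stack = StackingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(max_depth=5)),
        ("nb", GaussianNB()),
        ("rf", RandomForestClassifier(n_estimators=100)),
    ],
    final_estimator=LogisticRegression(),
    cv=5,
)
print("stacked CV accuracy:", cross_val_score(stack, X, y, cv=5).mean())
```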

Relevance: 20.00%

Publisher:

Abstract:

Hat stiffened plates are used in composite ships and are gaining popularity in metallic ship construction due to their high strength-to-weight ratio. Lightweight structures result in greater payload, higher speeds, reduced fuel consumption, and lower environmental emissions. Numerical investigations have been carried out using the commercial finite element software ANSYS 12 to substantiate the high strength-to-weight ratio of hat stiffened plates over the other open-section stiffeners commonly used in shipbuilding. The analysis of stiffened plates has always been a matter of concern for structural engineers, since it is difficult to quantify the actual load sharing between stiffeners and plating. The Finite Element Method has been accepted as an efficient tool for the analysis of stiffened plated structures. The best results for the analysis of thin plated structures are obtained when both the stiffeners and the plate are modelled using thin plate elements having six degrees of freedom per node. However, one serious problem encountered with this design and analysis process is that the generation of finite element models for a complex configuration is time consuming and laborious. In order to overcome these difficulties, two different methods, an Orthotropic Plate Model and a Superelement for the hat stiffened plate, are suggested in the present work. In the Orthotropic Plate Model, geometric orthotropy is converted to material orthotropy, i.e., the stiffeners are smeared so that they vanish from the field of analysis, and the structure can then be analysed using any commercial finite element software that has orthotropic elements in its element library. The Orthotropic Plate Model developed predicts deflection, stress, and linear buckling load with good accuracy for the case of all four edges simply supported. In the case of two edges fixed and the other two edges simply supported, the stress is still predicted with good accuracy, but there is a large variation in the predicted deflection. This variation arises because, in the Orthotropic Plate Model, the rigidity is uniform throughout the plate, whereas in the actual hat stiffened plate the rigidity along the line of attachment of the stiffeners to the plate is large compared with that of the unsupported portion of the plate. The Superelement technique treats a portion of the structure as if it were a single element even though it is made up of many individual elements. The Superelement predicts the deflection and in-plane stress of the hat stiffened plate with good accuracy for different boundary conditions. A formulation of the Superelement for a composite hat stiffened plate is also presented in the thesis. The capability of the Orthotropic Plate Model and the Superelement to handle typical boundary conditions and characteristic loads in a ship structure is demonstrated through numerical investigations.
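The smearing step of the Orthotropic Plate Model can be illustrated with a simplified textbook formula in which the stiffener bending stiffness is distributed over the stiffener spacing. The formula and the numbers below are a minimal sketch under that simplification, not the calibration used in the thesis.

```python
# Minimal sketch of the smeared-stiffener idea behind an orthotropic plate
# model: the stiffener bending stiffness is distributed over the stiffener
# spacing and added to the plate rigidity in the stiffener direction.
# Simplified textbook formula with hypothetical numbers.

def smeared_rigidities(E, nu, t, EI_stiffener, spacing):
    """Return (D_along_stiffeners, D_across_stiffeners) in N*m.

    E            : Young's modulus of the plate material, Pa
    nu           : Poisson's ratio
    t            : plate thickness, m
    EI_stiffener : bending stiffness of one stiffener about the plate
                   mid-plane, N*m^2
    spacing      : stiffener spacing, m
    """
    D_plate = E * t**3 / (12.0 * (1.0 - nu**2))
    D_along = D_plate + EI_stiffener / spacing   # stiffeners smeared over spacing
    D_across = D_plate
    return D_along, D_across

# Hypothetical steel plate with hat stiffeners.
print(smeared_rigidities(E=210e9, nu=0.3, t=0.008, EI_stiffener=1.2e5, spacing=0.6))
```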

Relevance: 20.00%

Publisher:

Abstract:

The ability of climate models to reproduce and predict land surface anomalies is an important but little-studied topic. In this study, an atmosphere and ocean assimilation scheme is used to determine whether HadCM3 can reproduce and predict snow water equivalent and soil moisture during the 1997–1998 El Niño Southern Oscillation event. Soil moisture is reproduced more successfully, though both snow and soil moisture show some predictability at 1- and 4-month lead times. This result suggests that land surface anomalies may be reasonably well initialized for climate model predictions and hydrological applications using atmospheric assimilation methods over a period of time.
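One common way to quantify the kind of predictability reported above is the anomaly correlation between predicted and verifying anomalies at a given lead time. The sketch below computes it for synthetic series; the arrays are stand-ins, not HadCM3 output, and this is not necessarily the paper's own skill measure.

```python
# Minimal sketch of one common predictability measure: the anomaly correlation
# between predicted and verifying soil-moisture anomalies at a given lead time.
import numpy as np

def anomaly_correlation(forecast, verification):
    """Pearson correlation of anomalies (means removed) between two series."""
    f = forecast - forecast.mean()
    v = verification - verification.mean()
    return float(np.sum(f * v) / np.sqrt(np.sum(f**2) * np.sum(v**2)))

rng = np.random.default_rng(2)
truth = rng.normal(0, 1, 24)                  # 24 months of "observed" anomalies
fcst_1month = truth + rng.normal(0, 0.5, 24)  # more skilful short-lead forecast
fcst_4month = truth + rng.normal(0, 1.2, 24)  # noisier long-lead forecast
print(anomaly_correlation(fcst_1month, truth), anomaly_correlation(fcst_4month, truth))
```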

Relevance: 20.00%

Publisher:

Abstract:

Faced with the realities of a changing climate, decision makers in a wide variety of organisations are increasingly seeking quantitative predictions of regional and local climate. An important issue for these decision makers, and for organisations that fund climate research, is the potential for climate science to deliver improvements in such predictions, especially reductions in uncertainty. Uncertainty in climate predictions arises from three distinct sources: internal variability, model uncertainty and scenario uncertainty. Using data from a suite of climate models, we separate and quantify these sources. For predictions of changes in surface air temperature on decadal timescales and regional spatial scales, we show that uncertainty for the next few decades is dominated by sources (model uncertainty and internal variability) that are potentially reducible through progress in climate science. Furthermore, we find that model uncertainty is of greater importance than internal variability. Our findings have implications for managing adaptation to a changing climate. Because the costs of adaptation are very large, and greater uncertainty about future climate is likely to be associated with more expensive adaptation, reducing uncertainty in climate predictions is potentially of enormous economic value. We highlight the need for much more work to compare: a) the cost of various degrees of adaptation, given current levels of uncertainty; and b) the cost of new investments in climate science to reduce current levels of uncertainty. Our study also highlights the importance of targeting climate science investments on the most promising opportunities to reduce prediction uncertainty.
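The three-way separation of uncertainty described above can be sketched as a variance decomposition over a multi-scenario, multi-model ensemble. The sketch below uses synthetic projections and a simple externally supplied estimate of internal variability; it follows the general idea rather than the paper's exact method.

```python
# Minimal sketch of partitioning projection uncertainty into model uncertainty,
# scenario uncertainty and internal variability, from smoothed projections
# indexed as [scenario, model, year].  Synthetic numbers, illustrative only.
import numpy as np

def partition_uncertainty(proj, internal_var):
    """proj: array (n_scenarios, n_models, n_years) of smoothed anomalies.
    internal_var: scalar variance of detrended interannual residuals.
    Returns per-year fractional contributions of each source."""
    model_var = proj.var(axis=1).mean(axis=0)      # spread across models, per year
    scenario_var = proj.mean(axis=1).var(axis=0)   # spread across scenario means
    total = model_var + scenario_var + internal_var
    return {
        "model": model_var / total,
        "scenario": scenario_var / total,
        "internal": internal_var / total,
    }

rng = np.random.default_rng(3)
years = np.arange(30)
# Synthetic projections: common trend + per-scenario offset + per-model offset.
proj = (0.02 * years
        + np.array([0.0, 0.3, 0.6])[:, None, None] * (years / 30)
        + rng.normal(0, 0.1, (3, 5, 1)))
fractions = partition_uncertainty(proj, internal_var=0.02)
print({k: float(v[-1]) for k, v in fractions.items()})   # shares in the final year
```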

Relevance: 20.00%

Publisher:

Abstract:

Remote sensing can potentially provide information useful in improving pollution transport modelling in agricultural catchments. Realisation of this potential will depend on the availability of the raw data, the development of information extraction techniques, and the impact of assimilating the derived information into models. High spatial resolution hyperspectral imagery of a farm near Hereford, UK is analysed. A technique is described to automatically identify the soil and vegetation endmembers within a field, enabling vegetation fractional cover estimation. The aerially-acquired laser altimetry is used to produce digital elevation models of the site. At the subfield scale, the hypothesis that higher resolution topography will make a substantial difference to contaminant transport is tested using the AGricultural Non-Point Source (AGNPS) model. Slope aspect and direction information are extracted from the topography at different resolutions to study the effects on soil erosion, deposition, runoff and nutrient losses. Field-scale models are often used to model drainage water, nitrate and runoff/sediment loss, but the demanding input data requirements make scaling up to catchment level difficult. By determining the input range of spatial variables gathered from EO data, and comparing the response of models to the range of variation measured, the critical model inputs can be identified. Response surfaces to variation in these inputs constrain uncertainty in model predictions and are presented. Although optical earth observation analysis can provide fractional vegetation cover, cloud cover and semi-random weather patterns can hinder data acquisition in Northern Europe. A Spring and Autumn cloud cover analysis is carried out over seven UK sites close to agricultural districts, using historic satellite image metadata, climate modelling and historic ground weather observations. Results are assessed in terms of acquisition probability and the implications for future earth observation missions. © 2003 Elsevier Ltd. All rights reserved.
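The endmember-based fractional cover estimation mentioned above reduces, in the two-endmember case, to a small least-squares problem per pixel. The sketch below illustrates it with made-up reflectance spectra; in the paper the endmembers were identified automatically from the imagery itself.

```python
# Minimal sketch of two-endmember linear unmixing: each pixel spectrum is
# modelled as f*vegetation + (1-f)*soil, and the vegetation fraction f is
# recovered by least squares.  Endmember spectra below are made up.
import numpy as np

def vegetation_fraction(pixel, veg_endmember, soil_endmember):
    """Least-squares fraction f in pixel ~ f*veg + (1-f)*soil, clipped to [0, 1]."""
    d = veg_endmember - soil_endmember
    f = np.dot(pixel - soil_endmember, d) / np.dot(d, d)
    return float(np.clip(f, 0.0, 1.0))

# Hypothetical 4-band reflectances (e.g. green, red, red-edge, NIR).
veg = np.array([0.08, 0.05, 0.30, 0.50])
soil = np.array([0.15, 0.20, 0.25, 0.30])
pixel = 0.6 * veg + 0.4 * soil + np.random.default_rng(4).normal(0, 0.01, 4)
print(vegetation_fraction(pixel, veg, soil))   # ~0.6 for this synthetic pixel
```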

Relevance: 20.00%

Publisher:

Abstract:

This paper investigates the impact of aerosol forcing uncertainty on the robustness of estimates of the twentieth-century warming attributable to anthropogenic greenhouse gas emissions. Attribution analyses on three coupled climate models with very different sensitivities and aerosol forcing are carried out. The Third Hadley Centre Coupled Ocean-Atmosphere GCM (HadCM3), Parallel Climate Model (PCM), and GFDL R30 models all provide good simulations of twentieth-century global mean temperature changes when they include both anthropogenic and natural forcings. Such good agreement could result from a fortuitous cancellation of errors, for example, by balancing too much (or too little) greenhouse warming by too much (or too little) aerosol cooling. Despite a very large uncertainty for estimates of the possible range of sulfate aerosol forcing obtained from measurement campaigns, results show that the spatial and temporal nature of observed twentieth-century temperature change constrains the component of past warming attributable to anthropogenic greenhouse gases to be significantly greater (at the 5% level) than the observed warming over the twentieth century. The cooling effects of aerosols are detected in all three models. Both spatial and temporal aspects of observed temperature change are responsible for constraining the relative roles of greenhouse warming and sulfate cooling over the twentieth century. This is because there are distinctive temporal structures in differential warming rates between the hemispheres, between land and ocean, and between mid- and low latitudes. As a result, consistent estimates of warming attributable to greenhouse gas emissions are obtained from all three models, and predictions are relatively robust to the use of more or less sensitive models. The transient climate response following a 1% yr^-1 increase in CO2 is estimated to lie between 2.2 and 4 K century^-1 (5-95 percentiles).
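Attribution analyses of the kind described above are, at their core, regressions of observed change onto model-simulated response patterns. The sketch below shows an ordinary-least-squares version with synthetic data; a real analysis uses optimal fingerprinting with a noise covariance estimated from control runs, which is omitted here.

```python
# Minimal sketch of the regression idea behind attribution analyses: observed
# temperature change is regressed onto model-simulated responses to greenhouse
# gases and to aerosols, giving a scaling factor for each forcing.
# Synthetic data, ordinary least squares only; purely illustrative.
import numpy as np

def scaling_factors(obs, fingerprints):
    """OLS estimate of beta in obs ~ fingerprints @ beta.

    obs          : (n,) observed space-time pattern, flattened
    fingerprints : (n, k) simulated response patterns, one column per forcing
    """
    beta, *_ = np.linalg.lstsq(fingerprints, obs, rcond=None)
    return beta

rng = np.random.default_rng(5)
n = 200
ghg = rng.normal(0, 1, n).cumsum() / 20        # synthetic greenhouse-gas pattern
aer = -0.3 * ghg + rng.normal(0, 0.1, n)       # synthetic aerosol pattern
obs = 1.1 * ghg + 0.8 * aer + rng.normal(0, 0.2, n)
print(scaling_factors(obs, np.column_stack([ghg, aer])))   # ~[1.1, 0.8]
```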