95 results for Scaling Of Chf


Relevance:

30.00%

Publisher:

Abstract:

The fast increase in the size and number of databases demands data mining approaches that are scalable to large amounts of data. This has led to the exploration of parallel computing technologies in order to perform data mining tasks concurrently using several processors. Parallelization seems to be a natural and cost-effective way to scale up data mining technologies. One of the most important of these data mining technologies is the classification of newly recorded data. This paper surveys advances in parallelization in the field of classification rule induction.
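
To make the parallelization idea concrete, below is a minimal, hypothetical sketch of data-parallel rule induction in Python: a toy majority-class rule learner runs concurrently on data partitions and the resulting rule sets are merged. The learner and the naive merge step are illustrative assumptions, not algorithms from the survey.

```python
# A toy sketch of data-parallel classification rule induction: each worker
# induces rules on its own partition, then rule sets are merged. The
# "learner" here (majority class per attribute value) is purely illustrative.
from multiprocessing import Pool
from collections import Counter

def induce_rules(partition):
    """Map each (attribute index, value) pair to its majority class."""
    counts = {}
    for features, label in partition:
        for i, v in enumerate(features):
            counts.setdefault((i, v), Counter())[label] += 1
    return {key: c.most_common(1)[0][0] for key, c in counts.items()}

def parallel_induction(data, n_workers=2):
    """Induce rules on partitions concurrently, then merge naively."""
    chunks = [data[i::n_workers] for i in range(n_workers)]
    with Pool(n_workers) as pool:
        rule_sets = pool.map(induce_rules, chunks)
    merged = {}
    for rules in rule_sets:          # later partitions win on conflicts
        merged.update(rules)
    return merged

if __name__ == "__main__":
    data = [((0, 1), "a"), ((1, 1), "b"), ((0, 0), "a"), ((1, 0), "b")]
    print(parallel_induction(data))
```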

Relevance:

30.00%

Publisher:

Abstract:

How fast can a mammal evolve from the size of a mouse to the size of an elephant? Achieving such a large transformation calls for major biological reorganization. Thus, the speed at which this occurs has important implications for extensive faunal changes, including adaptive radiations and recovery from mass extinctions. To quantify the pace of large-scale evolution we developed a metric, the clade maximum rate, which represents the maximum evolutionary rate of a trait within a clade. We applied this metric to body mass evolution in mammals over the last 70 million years, during which multiple large evolutionary transitions occurred in oceans and on continents and islands. Our computations suggest that it took a minimum of 1.6, 5.1, and 10 million generations for terrestrial mammal mass to increase 100-, 1,000-, and 5,000-fold, respectively. Values for whales were roughly half as long (i.e., 1.1, 3, and 5 million generations), perhaps due to the reduced mechanical constraints of living in an aquatic environment. When differences in generation time are considered, we find an exponential increase in maximum mammal body mass during the 35 million years following the Cretaceous–Paleogene (K–Pg) extinction event. Our results also indicate a basic asymmetry in macroevolution: very large decreases (such as extreme insular dwarfism) can happen at more than 10 times the rate of increases. Our findings allow more rigorous comparisons of microevolutionary and macroevolutionary patterns and processes. Keywords: haldanes, biological time, scaling, pedomorphosis
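
As a rough illustration of the haldane unit named in the keywords, the sketch below back-computes the sustained evolutionary rate implied by the paper's reported generation counts. The within-population standard deviation of ln(mass) is an assumed placeholder, not a figure from the paper.

```python
# Back-of-envelope haldane arithmetic: a sustained rate h (in haldanes)
# over g generations shifts the mean of ln(mass) by h * g * SD(ln mass).
import math

def implied_rate(fold_change, generations, sd_ln_mass=0.1):
    """Sustained rate in haldanes: ln(fold change) / (generations * SD)."""
    return math.log(fold_change) / (generations * sd_ln_mass)

# The paper reports a minimum of ~10 million generations for a 5,000-fold
# terrestrial mass increase; the implied sustained rate is tiny.
print(f"{implied_rate(5000, 10e6):.1e} haldanes")   # ~8.5e-06
```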

Relevance:

30.00%

Publisher:

Abstract:

A favoured method of assimilating information from state-of-the-art climate models into integrated assessment models of climate impacts is to use the transient climate response (TCR) of the climate models as an input, sometimes accompanied by a pattern matching approach to provide spatial information. More recent approaches to the problem use TCR with another independent piece of climate model output: the land-sea surface warming ratio (φ). In this paper we show why the use of φ in addition to TCR has such utility. Multiple linear regressions of surface temperature change onto TCR and φ in 22 climate models from the CMIP3 multi-model database show that the inclusion of φ explains a much greater fraction of the inter-model variance than using TCR alone. The improvement is particularly pronounced in North America and Eurasia in the boreal summer season, and in the Amazon all year round. The use of φ as the second metric is beneficial for three reasons: firstly, it is uncorrelated with TCR in state-of-the-art climate models and can therefore be considered an independent metric; secondly, because of its projected time-invariance, the magnitude of φ is better constrained than TCR in the immediate future; thirdly, the use of two variables is much simpler than approaches such as pattern scaling from climate models. Finally, we show how using the latest estimates of φ from climate models, with a mean value of 1.6 (as opposed to previously reported values of 1.4), can significantly increase the mean time-integrated discounted damage projections in a state-of-the-art integrated assessment model by about 15%. When compared to damages calculated without the inclusion of the land-sea warming ratio, this figure rises to 65%, equivalent to almost 200 trillion dollars over 200 years.
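
A minimal sketch of the statistical idea, regressing each model's regional warming onto (TCR, φ) across an ensemble and reporting the variance explained, is shown below with numpy. All numbers are synthetic stand-ins for CMIP3 output.

```python
# Multiple linear regression of regional warming onto TCR and phi across
# an ensemble of models; the ensemble values here are invented placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_models = 22
tcr = rng.normal(1.8, 0.4, n_models)    # transient climate response, K (assumed spread)
phi = rng.normal(1.6, 0.15, n_models)   # land-sea warming ratio (assumed spread)
regional_dT = 1.1 * tcr + 0.9 * phi + rng.normal(0.0, 0.1, n_models)

# Ordinary least squares with an intercept: dT ~ b0 + b1*TCR + b2*phi.
X = np.column_stack([np.ones(n_models), tcr, phi])
coef, *_ = np.linalg.lstsq(X, regional_dT, rcond=None)

resid = regional_dT - X @ coef
r2 = 1.0 - resid.var() / regional_dT.var()
print(coef, f"variance explained: {r2:.2f}")
```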

Relevance:

30.00%

Publisher:

Abstract:

Results from aircraft and surface observations provided evidence for the existence of mesoscale circulations over the Boreal Ecosystem-Atmosphere Study (BOREAS) domain. Using an integrated approach that included the use of analytical modeling, numerical modeling, and data analysis, we have found that there are substantial contributions to the total budgets of heat over the BOREAS domain generated by mesoscale circulations. This effect is largest when the synoptic flow is relatively weak, yet it is present under less favorable conditions, as shown by the case study presented here. While further analysis is warranted to document this effect, the existence of mesoscale flow is not surprising, since it is related to the presence of landscape patches, including lakes, which are of a size on the order of the local Rossby radius and which have spatial differences in maximum sensible heat flux of about 300 W m⁻². We have also analyzed the vertical temperature profile simulated in our case study as well as high-resolution soundings and we have found vertical profiles of temperature change above the boundary layer height, which we attribute in part to mesoscale contributions. Our conclusion is that in regions with organized landscapes, such as BOREAS, even with relatively strong synoptic winds, dynamical scaling criteria should be used to assess whether mesoscale effects should be parameterized or explicitly resolved in numerical models of the atmosphere.
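
For scale, the snippet below estimates a local (internal) Rossby radius of deformation, the patch size the abstract refers to, from typical boundary-layer values. The stratification, depth, and latitude are generic assumptions, not BOREAS measurements.

```python
# A hedged estimate of the internal Rossby radius of deformation,
# L_R = N * H / f, the landscape-patch scale the abstract cites.
import math

N = 0.01     # Brunt-Vaisala frequency, s^-1 (assumed stratification)
H = 1000.0   # boundary-layer depth, m (assumed)
lat = 55.0   # approximate BOREAS latitude, degrees

f = 2 * 7.2921e-5 * math.sin(math.radians(lat))  # Coriolis parameter, s^-1
L_R = N * H / f                                  # internal Rossby radius, m
print(f"L_R ~ {L_R / 1000:.0f} km")              # ~80-90 km at these values
```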

Relevance:

30.00%

Publisher:

Abstract:

Advances in hardware and software technology enable us to collect, store and distribute large quantities of data on a very large scale. Automatically discovering and extracting hidden knowledge in the form of patterns from these large data volumes is known as data mining. Data mining technology is not only a part of business intelligence, but is also used in many other application areas such as research, marketing and financial analytics. For example, medical scientists can use patterns extracted from historic patient data to determine whether a new patient is likely to respond positively to a particular treatment; marketing analysts can use patterns extracted from customer data for future advertisement campaigns; and finance experts have an interest in patterns that forecast the development of certain stock market shares for investment recommendations. However, extracting knowledge in the form of patterns from massive data volumes imposes a number of computational challenges in terms of processing time, memory, bandwidth and power consumption. These challenges have led to the development of parallel and distributed data analysis approaches and the utilisation of Grid and Cloud computing. This chapter gives an overview of parallel and distributed computing approaches and how they can be used to scale up data mining to large datasets.
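
As a toy illustration of the map-and-combine style that distributed data mining frameworks build on, the sketch below counts co-occurring item pairs over partitions of a transaction set. The pair-counting task and two-worker setup are hypothetical, not taken from the chapter; a real deployment would use a cluster framework rather than processes on one machine.

```python
# Map-reduce in miniature: count item pairs per partition (map), then
# combine the partial counts by summation (reduce).
from concurrent.futures import ProcessPoolExecutor
from collections import Counter
from functools import reduce
from itertools import combinations

def map_count(partition):
    """Count co-occurring item pairs in one partition of transactions."""
    c = Counter()
    for transaction in partition:
        c.update(combinations(sorted(transaction), 2))
    return c

if __name__ == "__main__":
    transactions = [{"a", "b"}, {"a", "b", "c"}, {"b", "c"}, {"a", "c"}]
    parts = [transactions[0::2], transactions[1::2]]
    with ProcessPoolExecutor(max_workers=2) as ex:
        counts = list(ex.map(map_count, parts))
    print(reduce(lambda x, y: x + y, counts))   # global pair frequencies
```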

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an assessment of the impacts of climate change on a series of indicators of hydrological regimes across the global domain, using a global hydrological model run with climate scenarios constructed using pattern-scaling from 21 CMIP3 (Coupled Model Intercomparison Project Phase 3) climate models. Changes are compared with natural variability, with a significant change being defined as greater than the standard deviation of the hydrological indicator in the absence of climate change. Under an SRES (Special Report on Emissions Scenarios) A1b emissions scenario, substantial proportions of the land surface (excluding Greenland and Antarctica) would experience significant changes in hydrological behaviour by 2050; under one climate model scenario (Hadley Centre HadCM3), average annual runoff increases significantly over 47% of the land surface and decreases over 36%; only 17% therefore sees no significant change. There is considerable variability between regions, depending largely on projected changes in precipitation. Uncertainty in projected river flow regimes is dominated by variation in the spatial patterns of climate change between climate models (hydrological model uncertainty is not included). There is, however, a strong degree of consistency in the overall magnitude and direction of change. More than two-thirds of climate models project a significant increase in average annual runoff across almost a quarter of the land surface, and a significant decrease over 14%, with considerably higher degrees of consistency in some regions. Most climate models project increases in runoff in Canada and high-latitude eastern Europe and Siberia, and decreases in runoff in central Europe, around the Mediterranean, the Mashriq, central America and Brazil. There is some evidence that projected change in runoff at the regional scale does not scale linearly with change in global average temperature. The effect of uncertainty in the rate of future emissions is relatively small.
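
A minimal sketch of the two ingredients of the method, pattern scaling of local change from global-mean warming and a significance test against the natural-variability standard deviation, follows. The gridded arrays are synthetic placeholders for climate-model output.

```python
# Pattern scaling plus the paper's significance criterion: local change =
# spatial pattern x global-mean warming, flagged significant only where it
# exceeds the natural-variability SD.
import numpy as np

rng = np.random.default_rng(1)
pattern = rng.normal(0.0, 0.3, size=(45, 90))              # runoff change per K, per cell
natural_sd = np.abs(rng.normal(0.2, 0.05, size=(45, 90)))  # SD without climate change

dT_global = 2.0                        # global-mean warming by 2050 (assumed)
d_runoff = pattern * dT_global         # pattern-scaled local change

significant = np.abs(d_runoff) > natural_sd
print(f"{significant.mean():.0%} of cells change significantly")
```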

Relevance:

30.00%

Publisher:

Abstract:

Dynamical downscaling of Global Climate Models (GCMs) through regional climate models (RCMs) potentially improves the usability of the output for hydrological impact studies. However, a further downscaling or interpolation of precipitation from RCMs is often needed to match the precipitation characteristics at the local scale. This study analysed three Model Output Statistics (MOS) techniques to adjust RCM precipitation: (1) a simple direct method (DM), (2) quantile-quantile mapping (QM) and (3) a distribution-based scaling (DBS) approach. The modelled precipitation was daily means from 16 RCMs driven by ERA40 reanalysis data over the period 1961–2000, provided by the ENSEMBLES (ENSEMBLE-based Predictions of Climate Changes and their Impacts) project, over a small catchment located in the Midlands, UK. All methods were conditioned on the entire time series, on separate months and on an objective classification of Lamb's weather types. The performance of the MOS techniques was assessed with regard to temporal and spatial characteristics of the precipitation fields, as well as modelled runoff using the HBV rainfall-runoff model. The results indicate that the DBS approach conditioned on classification patterns performed better than the other methods; however, an ensemble approach in terms of both climate models and downscaling methods is recommended to account for uncertainties in the MOS methods.
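
Of the three MOS techniques, quantile-quantile mapping is the easiest to sketch: model values are mapped through the quantile correction learned in a calibration period. The gamma-distributed "observed" and "model" series below are invented; a real application would condition the mapping on months or Lamb weather types, as the study does.

```python
# Empirical quantile-quantile mapping: correct RCM precipitation so its
# calibration-period quantiles match the observed ones.
import numpy as np

def quantile_map(rcm_cal, obs_cal, rcm_new, n_q=100):
    """Map rcm_new through the calibration-period quantile correction."""
    q = np.linspace(0, 1, n_q)
    rcm_q = np.quantile(rcm_cal, q)
    obs_q = np.quantile(obs_cal, q)
    # For each new value, find its quantile in the RCM climate, then read
    # off the observed value at that same quantile.
    return np.interp(rcm_new, rcm_q, obs_q)

rng = np.random.default_rng(2)
obs = rng.gamma(0.8, 6.0, 5000)   # "observed" daily precipitation, mm (synthetic)
rcm = rng.gamma(1.2, 3.0, 5000)   # biased "model" precipitation (synthetic)
print(quantile_map(rcm, obs, rcm[:5]))
```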

Relevance:

30.00%

Publisher:

Abstract:

The complete details of our calculation of the NLO QCD corrections to heavy flavor photo- and hadroproduction with longitudinally polarized initial states are presented. The main motivation for investigating these processes is the determination of the polarized gluon density at the COMPASS and RHIC experiments, respectively, in the near future. All methods used in the computation are extensively documented, providing a self-contained introduction to this type of calculation. Some of the tools employed may also be of general interest, e.g., the series expansion of hypergeometric functions. The relevant parton-level results are collected and plotted in the form of scaling functions. However, the simplification of the gluon-gluon virtual contributions has not yet been completed, so NLO phenomenological predictions are given only for photoproduction. The theoretical uncertainties of these predictions, in particular with respect to the heavy quark mass, are carefully considered. It is also shown that transverse momentum cuts can considerably enhance the measured production asymmetries. Finally, unpolarized heavy quark production is reviewed in order to derive conditions for a successful interpretation of future spin-dependent experimental data.
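
As a small, self-contained example of the hypergeometric-series machinery the abstract mentions as being of general interest, the sketch below sums the Gauss series 2F1(a, b; c; z) directly and checks it against a closed form. The truncation depth is an arbitrary choice, and the paper's actual expansions are considerably more involved.

```python
# Term-by-term summation of the Gauss hypergeometric series 2F1(a,b;c;z).
import math

def hyp2f1(a, b, c, z, terms=200):
    """Truncated series for 2F1(a, b; c; z), valid for |z| < 1."""
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        # Ratio of consecutive series terms: (a+n)(b+n) / ((c+n)(1+n)) * z.
        term *= (a + n) * (b + n) / ((c + n) * (1 + n)) * z
    return total

# Check against a closed form: 2F1(1, 1; 2; z) = -ln(1 - z) / z.
z = 0.5
print(hyp2f1(1, 1, 2, z), -math.log(1.0 - z) / z)
```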

Relevance:

30.00%

Publisher:

Abstract:

Recent aircraft measurements, primarily in the extratropics, of the horizontal variance of nitrous oxide (N2O) and ozone (O3) in the middle stratosphere indicate that horizontal spectra of the tracer variance scale nearly as k⁻², where k is the spatial wavenumber along the aircraft flight track [Strahan and Mahlman, 1994; Bacmeister et al., 1996]. This spectral scaling has been regarded as inconsistent with the accepted picture of stratospheric tracer motion; large-scale quasi-two-dimensional tracer advection typically yields a k⁻¹ scaling (i.e., the classical Batchelor spectrum). In this paper it is argued that the nearly k⁻² scaling seen in the measurements is a natural outcome of quasi-two-dimensional filamentation of the polar vortex edge. The accepted picture of stratospheric tracer motion can thus be retained: no additional physical processes are needed to account for deviations from the Batchelor spectrum. Our argument is based on the finite lifetime of tracer filaments and on the “singularity spectrum” associated with a one-dimensional field composed of randomly spaced jumps in concentration.
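
The paper's mechanism can be checked numerically in a few lines: a one-dimensional field built from randomly spaced concentration jumps should show a variance spectrum close to k⁻². The synthetic field below is an illustration, not aircraft data.

```python
# A piecewise-constant 1-D "tracer" field made of randomly placed jumps
# has a power spectrum near k^-2 (each step contributes |FT|^2 ~ 1/k^2).
import numpy as np

rng = np.random.default_rng(3)
n = 2**16
signal = np.zeros(n)
jumps = rng.choice(n, size=200, replace=False)   # random jump locations
signal[jumps] = rng.normal(size=200)             # random jump amplitudes
field = np.cumsum(signal)                        # piecewise-constant field

power = np.abs(np.fft.rfft(field))**2
k = np.arange(1, len(power))

# Fit the log-log slope over a mid-range of wavenumbers.
sel = (k > 10) & (k < 1000)
slope = np.polyfit(np.log(k[sel]), np.log(power[1:][sel]), 1)[0]
print(f"spectral slope ~ {slope:.2f}")           # expect close to -2
```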

Relevance:

30.00%

Publisher:

Abstract:

An eddy-resolving numerical model of a zonal flow, meant to resemble the Antarctic Circumpolar Current, is described and analyzed using the framework of J. Marshall and T. Radko. In addition to wind and buoyancy forcing at the surface, the model contains a sponge layer at the northern boundary that permits a residual meridional overturning circulation (MOC) to exist at depth. The strength of the residual MOC is diagnosed for different strengths of surface wind stress. It is found that the eddy circulation largely compensates for the changes in Ekman circulation. The extent of the compensation and thus the sensitivity of the MOC to the winds depend on the surface boundary condition. A fixed-heat-flux surface boundary severely limits the ability of the MOC to change. An interactive heat flux leads to greater sensitivity. To explain the MOC sensitivity to the wind strength under the interactive heat flux, transformed Eulerian-mean theory is applied, in which the eddy diffusivity plays a central role in determining the eddy response. A scaling theory for the eddy diffusivity, based on the mechanical energy balance, is developed and tested; the average magnitude of the diffusivity is found to be proportional to the square root of the wind stress. The MOC sensitivity to the winds based on this scaling is compared with the true sensitivity diagnosed from the experiments.
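
The diagnosed scaling, eddy diffusivity proportional to the square root of the wind stress, can be expressed as a simple log-log fit. The stress-diffusivity pairs below are invented placeholders for the kind of values diagnosed from the experiments.

```python
# A hedged illustration of the scaling K ~ sqrt(tau): fit a power-law
# exponent to (wind stress, diffusivity) pairs; the values are invented.
import numpy as np

tau = np.array([0.05, 0.1, 0.2, 0.4])        # surface wind stress, N m^-2
K = np.array([310.0, 450.0, 620.0, 900.0])   # eddy diffusivity, m^2 s^-1

# Fit log K = log a + p * log tau; the theory predicts p ~ 0.5.
p, log_a = np.polyfit(np.log(tau), np.log(K), 1)
print(f"fitted exponent p = {p:.2f}")        # close to 0.5 for these values
```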

Relevance:

30.00%

Publisher:

Abstract:

Background: Expression microarrays are increasingly used to obtain large-scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and to analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while inter-array variability from replicate array measurements has a standard deviation (SD) of around 0.5 log(2) units (6% of mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables which define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years and by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and inter-array variability, and a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
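
The paper's pipeline is implemented as an R function; the numpy sketch below is an illustrative stand-in for three of the steps it describes: log2 transformation, quantile normalisation across arrays, and multidimensional scaling of inter-array distances. The synthetic gene-by-array matrix is an assumption for demonstration only.

```python
# Log2 transform, quantile normalisation, and classical MDS of inter-array
# distances, as a stand-in for the paper's R pipeline.
import numpy as np

def quantile_normalise(x):
    """Force every column (array) to share the same value distribution."""
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)
    mean_sorted = np.sort(x, axis=0).mean(axis=1)
    return mean_sorted[ranks]

def classical_mds(D, k=2):
    """Classical MDS from a distance matrix via double centring."""
    n = len(D)
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D**2) @ J
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:k]
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

rng = np.random.default_rng(4)
signal = rng.lognormal(6, 1, size=(2000, 8))      # genes x arrays (synthetic)
log_sig = quantile_normalise(np.log2(signal))
D = np.linalg.norm(log_sig[:, :, None] - log_sig[:, None, :], axis=0)
print(classical_mds(D))                           # 2-D layout of the 8 arrays
```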

Relevance:

30.00%

Publisher:

Abstract:

To optimise the placement of small wind turbines in urban areas a detailed understanding of the spatial variability of the wind resource is required. At present, due to a lack of observations, the NOABL wind speed database is frequently used to estimate the wind resource at a potential site. However, recent work has shown that this tends to overestimate the wind speed in urban areas. This paper suggests a method for adjusting the predictions of the NOABL in urban areas by considering the impact of the underlying surface on a neighbourhood scale, in which the nature of the surface is characterised at a 1 km² resolution using an urban morphology database. The model was then used to estimate the variability of the annual mean wind speed across Greater London at a height typical of current small wind turbine installations. Initial validation of the results suggests that the predicted wind speeds are considerably more accurate than the NOABL values. The derived wind map therefore currently provides the best opportunity to identify the neighbourhoods in Greater London at which small wind turbines yield their highest energy production. The model does not consider street-scale processes; however, previously derived scaling factors can be applied to relate the neighbourhood wind speed to a value at a specific rooftop site. The results showed that the wind speed predicted across London is relatively low, exceeding 4 m s⁻¹ at only 27% of the neighbourhoods in the city. Of these sites fewer than 10% are within 10 km of the city centre, with the majority over 20 km from the city centre. Consequently, it is predicted that small wind turbines tend to perform better towards the outskirts of the city; therefore, for cities which fit the Burgess concentric ring model, such as Greater London, ‘distance from city centre’ is a useful parameter for siting small wind turbines. However, there are a number of neighbourhoods close to the city centre at which the wind speed is relatively high, and these sites can only be identified with a detailed representation of the urban surface, such as that developed in this study.
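
A common way to relate a neighbourhood-scale wind speed to a specific rooftop height is the logarithmic wind profile; the sketch below applies it with typical urban roughness and displacement values. These parameters are generic assumptions, not the scaling factors derived in the paper.

```python
# Logarithmic wind profile: u(z) proportional to ln((z - d) / z0), so two
# heights are related by the ratio of their log terms.
import math

def log_profile(u_ref, z_ref, z, z0=1.0, d=5.0):
    """Wind speed at height z from a reference speed, log-law assumption.

    z0: roughness length (m), d: displacement height (m); both are
    typical urban placeholder values, not calibrated factors.
    """
    return u_ref * math.log((z - d) / z0) / math.log((z_ref - d) / z0)

# Adjust a neighbourhood-scale speed to a lower rooftop hub height.
print(f"{log_profile(u_ref=4.0, z_ref=25.0, z=15.0):.2f} m/s")  # ~3.1 m/s
```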

Relevance:

30.00%

Publisher:

Abstract:

Black carbon aerosol plays a unique and important role in Earth’s climate system. Black carbon is a type of carbonaceous material with a unique combination of physical properties. This assessment provides an evaluation of black-carbon climate forcing that is comprehensive in its inclusion of all known and relevant processes and that is quantitative in providing best estimates and uncertainties of the main forcing terms: direct solar absorption; influence on liquid, mixed-phase, and ice clouds; and deposition on snow and ice. These effects are calculated with climate models, but when possible, they are evaluated with both microphysical measurements and field observations. Predominant sources are combustion related, namely, fossil fuels for transportation, solid fuels for industrial and residential uses, and open burning of biomass. Total global emissions of black carbon using bottom-up inventory methods are 7500 Gg yr⁻¹ in the year 2000 with an uncertainty range of 2000 to 29000. However, global atmospheric absorption attributable to black carbon is too low in many models and should be increased by a factor of almost 3. After this scaling, the best estimate for the industrial-era (1750 to 2005) direct radiative forcing of atmospheric black carbon is +0.71 W m⁻² with 90% uncertainty bounds of (+0.08, +1.27) W m⁻². Total direct forcing by all black carbon sources, without subtracting the preindustrial background, is estimated as +0.88 (+0.17, +1.48) W m⁻². Direct radiative forcing alone does not capture important rapid adjustment mechanisms. A framework is described and used for quantifying climate forcings, including rapid adjustments. The best estimate of industrial-era climate forcing of black carbon through all forcing mechanisms, including clouds and cryosphere forcing, is +1.1 W m⁻² with 90% uncertainty bounds of +0.17 to +2.1 W m⁻². Thus, there is a very high probability that black carbon emissions, independent of co-emitted species, have a positive forcing and warm the climate. We estimate that black carbon, with a total climate forcing of +1.1 W m⁻², is the second most important human emission in terms of its climate forcing in the present-day atmosphere; only carbon dioxide is estimated to have a greater forcing. Sources that emit black carbon also emit other short-lived species that may either cool or warm climate. Climate forcings from co-emitted species are estimated and used in the framework described herein. When the principal effects of short-lived co-emissions, including cooling agents such as sulfur dioxide, are included in net forcing, energy-related sources (fossil fuel and biofuel) have an industrial-era climate forcing of +0.22 (−0.50 to +1.08) W m⁻² during the first year after emission. For a few of these sources, such as diesel engines and possibly residential biofuels, warming is strong enough that eliminating all short-lived emissions from these sources would reduce net climate forcing (i.e., produce cooling). When open burning emissions, which emit high levels of organic matter, are included in the total, the best estimate of net industrial-era climate forcing by all short-lived species from black-carbon-rich sources becomes slightly negative (−0.06 W m⁻² with 90% uncertainty bounds of −1.45 to +1.29 W m⁻²). The uncertainties in net climate forcing from black-carbon-rich sources are substantial, largely due to lack of knowledge about cloud interactions with both black carbon and co-emitted organic carbon.
In prioritizing potential black-carbon mitigation actions, non-science factors, such as technical feasibility, costs, policy design, and implementation feasibility, play important roles. The major sources of black carbon are presently in different stages with regard to the feasibility of near-term mitigation. This assessment, by evaluating the large number and complexity of the associated physical and radiative processes in black-carbon climate forcing, sets a baseline from which to improve future climate-forcing estimates.