96 results for Reproducing kernel


Relevance:

10.00%

Abstract:

This paper critically explores the politics that mediate the use of environmental science assessments as the basis of resource management policy. Drawing on recent literature in the political ecology tradition that has emphasised the politicised nature of the production and use of scientific knowledge in environmental management, the paper analyses a hydrological assessment in a small river basin in Chile, undertaken in response to concerns over the possible overexploitation of groundwater resources. The case study illustrates the limitations of an approach based predominantly on hydrogeological modelling to ascertain the effects of increased groundwater abstraction. In particular, it identifies the subjective ways in which the assessment was interpreted and used by the state water resources agency to underpin water allocation decisions in accordance with its own interests, and the role that a desocialised assessment played in reproducing unequal patterns of resource use and configuring uneven waterscapes. Nevertheless, as Chile’s ‘neoliberal’ political-economic framework privileges the role of science and technocracy, producing other forms of environmental knowledge to complement environmental science is likely to be contentious. In conclusion, the paper considers the potential of mobilising the concept of the hydrosocial cycle to further critically engage with environmental science.

Relevance:

10.00%

Abstract:

Background: Microarray-based comparative genomic hybridisation (CGH) experiments have been used to study numerous biological problems, including understanding genome plasticity in pathogenic bacteria. Typically such experiments produce large data sets that are difficult for biologists to handle. Although there are some programmes available for interpreting bacterial transcriptomics data and CGH microarray data for looking at genetic stability in oncogenes, there are none designed specifically to understand the mosaic nature of bacterial genomes. Consequently a bottleneck still persists in the accurate processing and mathematical analysis of these data. To address this shortfall we have produced a simple and robust CGH microarray data analysis process, which may be automated in the future, to understand bacterial genomic diversity. Results: The process involves five steps: cleaning, normalisation, estimating gene presence and absence or divergence, validation, and analysis of data from test strains against three reference strains simultaneously. Each stage of the process is described, and we have compared a number of methods available for characterising bacterial genomic diversity and for calculating the cut-off between gene presence and absence or divergence, showing that a simple dynamic approach using a kernel density estimator performed better than both the established methods and a more sophisticated mixture modelling technique. We have also shown that current methods commonly used for CGH microarray analysis in tumour and cancer cell lines are not appropriate for analysing our data. Conclusion: After carrying out the analysis and validation for three sequenced Escherichia coli strains, CGH microarray data from 19 E. coli O157 pathogenic test strains were used to demonstrate the benefits of applying this simple and robust process to CGH microarray studies using bacterial genomes.
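
As an illustration of the cut-off step, the sketch below fits a Gaussian kernel density estimate to CGH log-ratios and takes the deepest interior minimum of the density as the boundary between the "present" and "absent/divergent" modes. This is a minimal sketch, not the authors' pipeline: the function name, the use of scipy's `gaussian_kde`, and the two-mode test data are all illustrative.

```python
# Minimal sketch of a KDE-based cut-off between gene presence and
# absence/divergence (illustrative; not the paper's actual pipeline).
import numpy as np
from scipy.stats import gaussian_kde

def kde_cutoff(log_ratios, grid_size=512):
    """Return a cut-off at the deepest interior minimum of the KDE."""
    kde = gaussian_kde(log_ratios)  # Gaussian kernels, automatic bandwidth
    grid = np.linspace(log_ratios.min(), log_ratios.max(), grid_size)
    density = kde(grid)
    # Interior local minima of the estimated density are candidate cut-offs.
    is_min = (density[1:-1] < density[:-2]) & (density[1:-1] < density[2:])
    minima = grid[1:-1][is_min]
    if minima.size == 0:
        raise ValueError("density is unimodal; no natural cut-off found")
    return minima[np.argmin(kde(minima))]

# Hypothetical two-mode data: most genes present (log-ratio near 0),
# a minority absent or divergent (strongly negative log-ratio).
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 0.3, 900), rng.normal(-2.5, 0.5, 100)])
cutoff = kde_cutoff(data)
calls = np.where(data < cutoff, "absent/divergent", "present")
```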

Relevance:

10.00%

Abstract:

Liquid clouds play a profound role in the global radiation budget, but it is difficult to remotely retrieve their vertical profile. Ordinary narrow field-of-view (FOV) lidars receive a strong return from such clouds, but the information is limited to the first few optical depths. Wide-angle multiple-FOV lidars can isolate radiation scattered multiple times before returning to the instrument, often penetrating much deeper into the cloud than the singly scattered signal. These returns potentially contain information on the vertical profile of the extinction coefficient, but are challenging to interpret due to the lack of a fast radiative transfer model for simulating them. This paper describes a variational algorithm that incorporates a fast forward model based on the time-dependent two-stream approximation, and its adjoint. Application of the algorithm to simulated data from a hypothetical airborne three-FOV lidar with a maximum footprint width of 600 m suggests that this approach should be able to retrieve the extinction structure down to an optical depth of around 6, and total optical depth up to at least 35, depending on the maximum lidar FOV. The convergence behavior of Gauss-Newton and quasi-Newton optimization schemes is compared. We then present results from an application of the algorithm to observations of stratocumulus by the 8-FOV airborne “THOR” lidar. It is demonstrated how the averaging kernel can be used to diagnose the effective vertical resolution of the retrieved profile, and therefore the depth to which information on the vertical structure can be recovered. This work enables returns from spaceborne lidar and radar subject to multiple scattering to be exploited more rigorously than previously possible.
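
The variational machinery referred to above can be sketched generically. Assuming a forward model H with Jacobian K, observation-error and prior covariances R and B, and a prior state xa (all names here are generic placeholders; the paper's actual forward model, the time-dependent two-stream approximation and its adjoint, is not reproduced), a Gauss-Newton retrieval and the associated averaging kernel look roughly like this:

```python
# Generic Gauss-Newton sketch for a variational retrieval; placeholder
# names, not the paper's two-stream forward model.
import numpy as np

def gauss_newton(x0, y, H, jacobian, R_inv, B_inv, xa, n_iter=10):
    """Minimise J(x) = (y - H(x))' R^-1 (y - H(x)) + (x - xa)' B^-1 (x - xa)."""
    x = x0.copy()
    for _ in range(n_iter):
        K = jacobian(x)                        # forward-model Jacobian at x
        A = K.T @ R_inv @ K + B_inv            # approximate Hessian of J
        g = K.T @ R_inv @ (y - H(x)) - B_inv @ (x - xa)
        x = x + np.linalg.solve(A, g)          # Gauss-Newton update
    return x

def averaging_kernel(K, R_inv, B_inv):
    """Sensitivity of the retrieved state to the true state; its rows give
    the effective vertical resolution of the retrieval."""
    A = K.T @ R_inv @ K + B_inv
    return np.linalg.solve(A, K.T @ R_inv @ K)
```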

Relevance:

10.00%

Abstract:

The estimation of the long-term wind resource at a prospective site based on a relatively short on-site measurement campaign is an indispensable task in the development of a commercial wind farm. The typical industry approach is based on the measure-correlate-predict (MCP) method, where a relational model between the site wind velocity data and the data obtained from a suitable reference site is built from concurrent records. In a subsequent step, a long-term prediction for the prospective site is obtained from a combination of the relational model and the historic reference data. In the present paper, a systematic study is presented where three new MCP models, together with two published reference models (a simple linear regression and the variance ratio method), have been evaluated based on concurrent synthetic wind speed time series for two sites, simulating the prospective and the reference site. The synthetic method has the advantage of generating time series with the desired statistical properties, including Weibull scale and shape factors, required to evaluate the five methods under all plausible conditions. In this work, first a systematic discussion of the statistical fundamentals behind MCP methods is provided, and three new models, one based on a nonlinear regression and two (termed kernel methods) derived from the use of conditional probability density functions, are proposed. All models are evaluated by using five metrics under a wide range of values of the correlation coefficient, the Weibull scale, and the Weibull shape factor. Only one of the models, a kernel method based on bivariate Weibull probability functions, is capable of accurately predicting all performance metrics studied.
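
To make the "kernel method" idea concrete, the sketch below estimates the joint density of concurrent (reference, site) wind speeds and predicts the site speed for a given reference speed as the conditional mean. It is a minimal sketch only: it uses a Gaussian kernel density estimate, whereas the best-performing model in the paper is built on bivariate Weibull probability functions.

```python
# Sketch of a conditional-density MCP step with a 2-D Gaussian KDE
# (the paper's best model uses bivariate Weibull functions instead).
import numpy as np
from scipy.stats import gaussian_kde

def fit_joint_density(ref_speeds, site_speeds):
    """Joint density of concurrent reference and site wind speeds."""
    return gaussian_kde(np.vstack([ref_speeds, site_speeds]))

def predict_site_speed(joint, ref_speed, grid=np.linspace(0.0, 40.0, 400)):
    """Conditional mean E[site | ref = ref_speed] from the joint density."""
    pts = np.vstack([np.full_like(grid, ref_speed), grid])
    density = joint(pts)                  # joint density along the site axis
    weights = density / density.sum()     # normalise to a conditional pdf
    return np.sum(weights * grid)

# Long-term prediction: apply predict_site_speed to every record in the
# historic reference time series.
```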

Relevance:

10.00%

Abstract:

The mechanisms involved in Atlantic meridional overturning circulation (AMOC) decadal variability and predictability over the last 50 years are analysed in the IPSL–CM5A–LR model using historical and initialised simulations. The initialisation procedure only uses nudging towards sea surface temperature anomalies with a physically based restoring coefficient. When compared to two independent AMOC reconstructions, both the historical and nudged ensemble simulations exhibit skill at reproducing AMOC variations from 1977 onwards, and in particular two maxima occurring around 1978 and 1997 respectively. We argue that one source of skill is related to the large Mount Agung volcanic eruption starting in 1963, which reset an internal 20-year variability cycle in the North Atlantic in the model. This cycle involves the East Greenland Current intensity and the advection of active tracers along the subpolar gyre, which leads to an AMOC maximum around 15 years after the Mount Agung eruption. The 1997 maximum occurs approximately 20 years after the former one. The nudged simulations reproduce this second maximum better than the historical simulations do. This is due to the initialisation of a cooling of the convection sites in the 1980s under the effect of a persistent positive phase of the North Atlantic Oscillation (NAO), a feature not captured in the historical simulations. Hence we argue that the 20-year cycle excited by the 1963 Mount Agung eruption and the NAO forcing both contributed to the 1990s AMOC maximum. These results support the existence of a 20-year cycle in the North Atlantic in the observations. Hindcasts following the CMIP5 protocol are launched from a nudged simulation every 5 years over the 1960–2005 period. They exhibit significant correlation skill against an independent reconstruction of the AMOC for lead-time averages of 4 years and beyond. This encouraging result is accompanied by increased correlation skill in reproducing the observed 2-m air temperature in the regions bordering the North Atlantic, as compared to non-initialised simulations. To a lesser extent, predicted precipitation tends to correlate with the nudged simulation in the tropical Atlantic. We argue that this skill is due to the initialisation and predictability of the AMOC in the present prediction system. The mechanisms evidenced here support the idea of volcanic eruptions as a pacemaker for internal variability of the AMOC. Together with the existence of a 20-year cycle in the North Atlantic, they offer a novel and complementary explanation for the AMOC variations over the last 50 years.
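
The nudging term itself is simple enough to sketch. In the toy version below the modelled SST anomaly is relaxed towards the observed anomaly through a restoring heat flux; the coefficient value and the mixed-layer heat capacity are assumed for illustration and are not taken from the IPSL–CM5A–LR configuration.

```python
# Toy sketch of SST-anomaly nudging; coefficient values are assumptions,
# not the physically based value used with IPSL-CM5A-LR.
import numpy as np

def nudge_sst(sst_model, sst_model_clim, sst_obs_anom,
              restoring_coeff=40.0,        # W m^-2 K^-1 (assumed)
              heat_capacity=2.0e8,         # rho*c_p*h, J m^-2 K^-1 (assumed)
              dt=86400.0):                 # one day in seconds
    """Relax the modelled SST anomaly towards the observed anomaly."""
    model_anom = sst_model - sst_model_clim
    flux = -restoring_coeff * (model_anom - sst_obs_anom)
    return sst_model + dt * flux / heat_capacity
```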

Relevance:

10.00%

Abstract:

Simulating spiking neural networks is of great interest to scientists wanting to model the functioning of the brain. However, large-scale models are expensive to simulate due to the number and interconnectedness of neurons in the brain. Furthermore, where such simulations are used in an embodied setting, the simulation must be real-time in order to be useful. In this paper we present NeMo, a platform for such simulations which achieves high performance through the use of highly parallel commodity hardware in the form of graphics processing units (GPUs). NeMo makes use of the Izhikevich neuron model which provides a range of realistic spiking dynamics while being computationally efficient. Our GPU kernel can deliver up to 400 million spikes per second. This corresponds to a real-time simulation of around 40 000 neurons under biologically plausible conditions with 1000 synapses per neuron and a mean firing rate of 10 Hz.
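
The Izhikevich model at the heart of NeMo is compact enough to state in a few lines. The NumPy step below reproduces its standard update equations for a whole population at once; it is only an illustration of the dynamics, since NeMo's production kernel is written in CUDA and handles synaptic input and spike delivery quite differently.

```python
# One Izhikevich update step for a population of neurons (illustration of
# the model's dynamics; NeMo's actual kernel runs on the GPU in CUDA).
import numpy as np

def izhikevich_step(v, u, I, a, b, c, d, dt=1.0):
    """Advance membrane potential v and recovery variable u by dt (ms)."""
    fired = v >= 30.0                  # spike threshold (mV)
    v = np.where(fired, c, v)          # reset potential of fired neurons
    u = np.where(fired, u + d, u)      # bump their recovery variable
    dv = 0.04 * v**2 + 5.0 * v + 140.0 - u + I
    du = a * (b * v - u)
    return v + dt * dv, u + dt * du, fired
```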

Relevance:

10.00%

Abstract:

The translation of an ensemble of model runs into a probability distribution is a common task in model-based prediction. Common methods for such ensemble interpretations proceed as if verification and ensemble were draws from the same underlying distribution, an assumption that is not viable for most, if any, real-world ensembles. An alternative is to consider an ensemble merely as a source of information rather than as a set of possible scenarios of reality. This approach, which looks for maps between ensembles and probability distributions, is investigated and extended. Common methods are revisited, and an improvement to standard kernel dressing, called ‘affine kernel dressing’ (AKD), is introduced. AKD assumes an affine mapping between ensemble and verification, typically acting not on individual ensemble members but on the entire ensemble as a whole; the parameters of this mapping are determined in parallel with the other dressing parameters, including a weight assigned to the unconditioned (climatological) distribution. These amendments to standard kernel dressing, albeit simple, can improve performance significantly and are shown to be appropriate for both overdispersive and underdispersive ensembles, unlike standard kernel dressing, which exacerbates overdispersion. Studies are presented using operational numerical weather predictions for two locations and data from the Lorenz63 system, demonstrating both effectiveness given operational constraints and statistical significance given a large sample.
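
In outline, the dressed forecast density is a mixture of one kernel per (affinely mapped) ensemble member plus a weighted climatological term. The sketch below shows this structure with Gaussian kernels; the parameter names are illustrative rather than the paper's notation, and in practice all of them would be fitted jointly from training data.

```python
# Structure of an affinely dressed ensemble density (illustrative parameter
# names; in the paper the mapping, bandwidth and climatology weight are
# fitted together from training data).
import numpy as np
from scipy.stats import norm

def akd_density(y, ensemble, offset, scale, bandwidth,
                w_clim, clim_mean, clim_std):
    """Forecast density at y from a dressed, affinely mapped ensemble."""
    z = offset + scale * np.asarray(ensemble)           # affine map
    p_ens = norm.pdf(y, loc=z, scale=bandwidth).mean()  # kernel mixture
    p_clim = norm.pdf(y, loc=clim_mean, scale=clim_std) # climatology term
    return (1.0 - w_clim) * p_ens + w_clim * p_clim
```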

Relevance:

10.00%

Abstract:

Ecosystem fluxes of energy, water, and CO2 result in spatial and temporal variations in atmospheric properties. In principle, these variations can be used to quantify the fluxes through inverse modelling of atmospheric transport, and can improve the understanding of processes and falsifiability of models. We investigated the influence of ecosystem fluxes on atmospheric CO2 in the vicinity of the WLEF-TV tower in Wisconsin using an ecophysiological model (Simple Biosphere, SiB2) coupled to an atmospheric model (Regional Atmospheric Modelling System). Model parameters were specified from satellite imagery and soil texture data. In a companion paper, simulated fluxes in the immediate tower vicinity have been compared to eddy covariance fluxes measured at the tower, with meteorology specified from tower sensors. Results were encouraging with respect to the ability of the model to capture observed diurnal cycles of fluxes. Here, the effects of fluxes in the tower footprint were also investigated by coupling SiB2 to a high-resolution atmospheric simulation, so that the model physiology could affect the meteorological environment. These experiments were successful in reproducing observed fluxes and concentration gradients during the day and at night, but revealed problems during transitions at sunrise and sunset that appear to be related to the canopy radiation parameterization in SiB2.

Relevance:

10.00%

Abstract:

The performance of 18 coupled Chemistry Climate Models (CCMs) in the Tropical Tropopause Layer (TTL) is evaluated using qualitative and quantitative diagnostics. Trends in tropopause quantities in the tropics and the extratropical Upper Troposphere and Lower Stratosphere (UTLS) are analyzed. A quantitative grading methodology for evaluating CCMs is extended to include variability and used to develop four different grades for tropical tropopause temperature and pressure, water vapor and ozone. Four of the 18 models and the multi-model mean meet quantitative and qualitative standards for reproducing key processes in the TTL. Several diagnostics are performed on a subset of the models, analyzing the Tropopause Inversion Layer (TIL), the Lagrangian cold point and the TTL transit time. Historical decreases in tropical tropopause pressure and decreases in water vapor are simulated, lending confidence to future projections. The models simulate continued decreases in tropopause pressure in the 21st century, along with increases of ∼1 K per century in cold point tropopause temperature and of 0.5–1 ppmv per century in water vapor above the tropical tropopause. TTL water vapor increases below the cold point. In two models, these trends are associated with 35% increases in TTL cloud fraction. These changes indicate significant perturbations to TTL processes, specifically to deep convective heating and humidity transport. Ozone in the extratropical lowermost stratosphere has significant and hemispherically asymmetric trends. O3 is projected to increase by nearly 30% due to ozone recovery in the Southern Hemisphere (SH) and enhancements in the stratospheric circulation. These UTLS ozone trends may have significant effects in the TTL and the troposphere.
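
One widely used form of such a grade measures a model's mean bias in units of the observed interannual standard deviation; a variability grade can be constructed analogously from the ratio of standard deviations. The sketch below shows this pattern for illustration only and is not necessarily the exact formulation used in the paper.

```python
# Illustrative grading metrics of the kind used in quantitative CCM
# evaluation (not necessarily the paper's exact formulation).
import numpy as np

def grade_mean(model, obs, n_sigma=3.0):
    """1 = perfect; 0 = mean bias of n_sigma observed std devs or more."""
    g = 1.0 - abs(np.mean(model) - np.mean(obs)) / (n_sigma * np.std(obs))
    return max(0.0, g)

def grade_variability(model, obs, tol=1.0):
    """Grade the ratio of modelled to observed interannual variability."""
    ratio = np.std(model) / np.std(obs)
    return max(0.0, 1.0 - abs(ratio - 1.0) / tol)
```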

Relevance:

10.00%

Abstract:

The Hamburg atmospheric general circulation model ECHAM3 at T106 resolution (1.125° lat./lon.) has considerable skill in reproducing the observed seasonal reversal of mean sea level pressure, the location of the summer heat low, and the position of the monsoon trough over the Indian subcontinent. The present-day climate and its seasonal cycle are realistically simulated by the model over this region. The model simulates the structure, intensity, frequency, movement and lifetime of monsoon depressions remarkably well. The number of monsoon depressions/storms simulated by the model in a year ranged from 5 to 12 with an average frequency of 8.4 yr⁻¹, not significantly different from the observed climatology. The model also simulates the interannual variability in the formation of depressions over the north Bay of Bengal during the summer monsoon season. In the warmer atmosphere under doubled CO2 conditions, the number of monsoon depressions/cyclonic storms forming in Indian seas in a year ranged from 5 to 11 with an average frequency of 7.6 yr⁻¹, not significantly different from that inferred in the control run of the model. However, under doubled CO2 conditions, fewer depressions formed in the month of June. Neither the lowest central pressure nor the maximum wind speed changes appreciably in monsoon depressions identified under simulated enhanced greenhouse conditions. The analysis suggests there will be no significant changes in the number and intensity of monsoon depressions in a warmer atmosphere.

Relevance:

10.00%

Abstract:

In this paper we report on a study conducted using the Middle Atmospheric Nitrogen TRend Assessment (MANTRA) balloon measurements of stratospheric constituents and temperature and the Canadian Middle Atmosphere Model (CMAM). Three different kinds of data are used to assess the inter-consistency of the combined dataset: single profiles of long-lived species from MANTRA 1998, sparse climatologies from the ozonesonde measurements during the four MANTRA campaigns and from HALOE satellite measurements, and the CMAM climatology. In doing so, we evaluate the ability of the model to reproduce the measured fields and thereby test our ability to describe mid-latitude summertime stratospheric processes. The MANTRA campaigns were conducted at Vanscoy, Saskatchewan, Canada (52° N, 107° W) in late August and early September of 1998, 2000, 2002 and 2004. During late summer at mid-latitudes, the stratosphere is close to photochemical control, providing an ideal scenario for the study reported here. From this analysis we find that: (1) reducing the value of the vertical diffusion coefficient in CMAM to a more physically reasonable value results in the model better reproducing the measured profiles of long-lived species; (2) the existence of compact correlations among the constituents, as expected from independent measurements in the literature and from models, confirms the self-consistency of the MANTRA measurements; and (3) the 1998 measurements show structures in the chemical species profiles that can be associated with transport, adding to the growing evidence that the summertime stratosphere can be much more disturbed than anticipated. The mechanisms responsible for such disturbances need to be understood in order to assess the representativeness of the measurements and to isolate long-term trends.

Relevance:

10.00%

Abstract:

We develop a new sparse kernel density estimator using a forward constrained regression framework, within which the nonnegative and summing-to-unity constraints on the mixing weights can easily be satisfied. Our main contribution is to derive a recursive algorithm that selects significant kernels one at a time based on the minimum integrated square error (MISE) criterion, for both the selection of kernels and the estimation of mixing weights. The proposed approach is simple to implement and the associated computational cost is very low. Specifically, the complexity of our algorithm is of the order of the number of training data points N, which is much lower than the order of N² offered by the best existing sparse kernel density estimators. Numerical examples are employed to demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with accuracy comparable to that of the classical Parzen window estimate and other existing sparse kernel density estimators.
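
A toy version of the greedy selection conveys the flavour of the algorithm: kernels centred on training points are added one at a time, keeping equal (hence nonnegative, summing-to-unity) weights, and each candidate is scored against a reference Parzen estimate on a grid. This is only a sketch; the paper's algorithm minimises the MISE criterion directly and estimates the mixing weights within the forward constrained regression framework rather than fixing them to be equal.

```python
# Toy greedy sparse KDE: grid-based squared error against a full Parzen
# estimate stands in for the paper's MISE criterion, and weights are fixed
# equal instead of being estimated by forward constrained regression.
import numpy as np
from scipy.stats import norm

def sparse_kde(data, bandwidth, n_kernels, grid):
    parzen = norm.pdf(grid[:, None], loc=data, scale=bandwidth).mean(axis=1)
    chosen, mix = [], np.zeros_like(grid)
    for k in range(1, n_kernels + 1):
        best_i, best_err = None, np.inf
        for i in range(len(data)):
            if i in chosen:
                continue
            cand = norm.pdf(grid, loc=data[i], scale=bandwidth)
            trial = ((k - 1) * mix + cand) / k   # equal weights sum to unity
            err = np.mean((trial - parzen) ** 2)
            if err < best_err:
                best_i, best_err = i, err
        chosen.append(best_i)
        mix = ((k - 1) * mix
               + norm.pdf(grid, loc=data[best_i], scale=bandwidth)) / k
    return np.array(chosen), mix
```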

Relevance:

10.00%

Abstract:

This chapter takes the example of local African beekeeping to explore how the forest can act as an important locus for men's work in Western Tanzania. Here we scrutinise how beekeeping enables its practitioners to situate themselves in the forest locality and observe how the social relationships, interactions and everyday practices entailed in living and working together are a means through which beekeepers generate a sense of belonging and identity. As part and parcel of this process, men transmit their skills to a new generation, thus reproducing themselves and their social environment.

Relevance:

10.00%

Abstract:

We consider in this paper the solvability of linear integral equations on the real line, in operator form (λ−K)φ=ψ, where λ ∈ ℂ and K is an integral operator. We impose conditions on the kernel, k, of K which ensure that K is bounded as an operator on X. Let Xa denote the weighted space {x ∈ X : x(s) = O(|s|^−a) as |s|→∞}. Our first result is that if, additionally, |k(s,t)| ⩽ κ(s−t), with κ ∈ L¹(ℝ) and κ(s) = O(|s|^−b) as |s|→∞, for some b>1, then the spectrum of K is the same on Xa as on X, for 0 < a ⩽ b−1. Our second result gives the same conclusion when the kernel takes the form k(s,t) = κ(s−t)z(t), with κ ∈ L¹(ℝ), z ∈ L∞(ℝ), and κ(s) = O(|s|^−b) as |s|→∞, for some b>1. As an example where kernels of this latter form occur we discuss a boundary integral equation formulation of an impedance boundary value problem for the Helmholtz equation in a half-plane.
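
Restating the setting in display form may help; note that the precise definition of the weighted space is an assumption based on standard usage, since the source rendering of the abstract is incomplete:

```latex
% Setting of the paper (the form of X_a is assumed from standard usage):
\begin{gather*}
  (\lambda - K)\varphi = \psi, \qquad
  (K\varphi)(s) = \int_{\mathbb{R}} k(s,t)\,\varphi(t)\,\mathrm{d}t,\\
  X_a = \{\, x \in X : x(s) = O(|s|^{-a}) \text{ as } |s|\to\infty \,\},\\
  \text{Result 1: } |k(s,t)| \le \kappa(s-t),\quad
    \kappa \in L^1(\mathbb{R}),\ \kappa(s) = O(|s|^{-b}),\ b > 1,\\
  \text{Result 2: } k(s,t) = \kappa(s-t)\,z(t),\quad
    z \in L^\infty(\mathbb{R}).
\end{gather*}
```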

Relevance:

10.00%

Abstract:

We propose a Nyström/product integration method for a class of second kind integral equations on the real line which arise in problems of two-dimensional scalar and elastic wave scattering by unbounded surfaces. Stability and convergence of the method are established, with convergence rates dependent on the smoothness of components of the kernel. The method is applied to the problem of acoustic scattering by a sound-soft one-dimensional surface which is the graph of a function f, and superalgebraic convergence is established in the case when f is infinitely smooth. Numerical results are presented illustrating this behavior for the case when f is periodic (the diffraction grating case). The Nyström method for this problem is stable and convergent uniformly with respect to the period of the grating, in contrast to standard integral equation methods for diffraction gratings, which fail at a countable set of grating periods.
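
For a smooth kernel on a truncated interval, the Nyström idea reduces the integral equation to a linear system by replacing the integral with a quadrature rule. The sketch below uses the trapezoidal rule for illustration; the paper's method instead uses product integration on the real line to cope with the non-smooth components of the scattering kernel.

```python
# Generic Nystrom sketch for (lambda - K) phi = psi on [a, b] with a smooth
# kernel and trapezoidal quadrature (the paper uses product integration on
# the real line to handle non-smooth kernel components).
import numpy as np

def nystrom_solve(kernel, psi, lam, a, b, n):
    """Solve lambda*phi(s) - int_a^b k(s,t) phi(t) dt = psi(s) at n nodes."""
    t, h = np.linspace(a, b, n, retstep=True)
    w = np.full(n, h)
    w[0] = w[-1] = h / 2.0                  # trapezoidal quadrature weights
    K = kernel(t[:, None], t[None, :]) * w  # K[i, j] = k(t_i, t_j) * w_j
    phi = np.linalg.solve(lam * np.eye(n) - K, psi(t))
    return t, phi
```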