996 results for Density functions


Relevance:

60.00%

Publisher:

Abstract:

Habitat selection has been one of the main research topics in ecology for decades. Nevertheless, many aspects of habitat selection still need to be explored. In particular, previous studies have overlooked the importance of temporal variation in habitat selection and the value of including data on reproductive success in order to describe the best-quality habitat for a species. We used data collected from radiocollared wolves in Yellowstone National Park (USA) between 1996 and 2008 to describe wolf habitat selection. In particular, we aimed to identify i) seasonal differences in wolf habitat selection, ii) factors influencing interannual variation in habitat selection, and iii) the effect of habitat selection on wolf reproductive success. We used probability density functions to describe wolf habitat use and habitat coverages to represent the habitat available to wolves. We used regression analysis to connect habitat use with habitat characteristics and habitat selection with reproductive success. Our most relevant result was the discovery of strong interannual variability in wolf habitat selection. This variability was partly explained by pack identity and by between-year differences in litter size and pack leadership (summer) and in pack size and precipitation (winter). We also detected some seasonal differences. Wolves selected open habitats, intermediate elevations, and intermediate distances from roads, and avoided steep slopes in late winter. They selected areas close to roads and avoided steep slopes in summer. In early winter, wolves selected wetlands, herbaceous and shrub vegetation types, and areas at intermediate elevation and distance from roads. Surprisingly, the habitat characteristics selected by wolves were not useful in predicting reproductive success. We hypothesize that interannual variability in wolf habitat selection may be too strong to detect effects on reproductive success. Moreover, prey availability and competitor pressure, which we did not assess, may also influence wolf reproductive success. This project demonstrated how important temporal variation is in shaping patterns of habitat selection. We still believe in the value of running long-term studies, but the effect of temporal variation should always be taken into account.
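
A minimal sketch of the general approach described above (hypothetical relocations and covariates, not the study's data or model): a probability density function of habitat use is estimated from relocation points with a kernel density estimate, and a simple regression relates use to habitat characteristics.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Hypothetical radiocollar relocations (x, y in km) for one pack-season.
relocations = rng.normal(loc=[10.0, 25.0], scale=[3.0, 2.0], size=(200, 2))

# Probability density function describing habitat use.
use_pdf = gaussian_kde(relocations.T)

# Hypothetical habitat grid with two illustrative covariates.
xx, yy = np.meshgrid(np.linspace(0, 20, 50), np.linspace(15, 35, 50))
grid = np.vstack([xx.ravel(), yy.ravel()])
use = use_pdf(grid)                                   # intensity of use per cell
elevation = 1500 + 40 * xx.ravel() + rng.normal(0, 50, xx.size)   # m
dist_road = np.abs(yy.ravel() - 25) + rng.normal(0, 0.5, yy.size) # km

# Least-squares regression linking habitat use to habitat characteristics.
X = np.column_stack([np.ones(use.size), elevation, dist_road])
coef, *_ = np.linalg.lstsq(X, np.log(use + 1e-12), rcond=None)
print("regression coefficients:", coef)
```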

Relevance:

60.00%

Publisher:

Abstract:

PURPOSE Positron emission tomography (PET)/computed tomography (CT) measurements on small lesions are impaired by the partial volume effect, which is intrinsically tied to the point spread function of the actual imaging system, including the reconstruction algorithms. The variability resulting from different point spread functions hinders the assessment of quantitative measurements in clinical routine and especially degrades comparability within multicenter trials. To improve quantitative comparability there is a need for methods to match different PET/CT systems through elimination of this systemic variability. Consequently, a new method was developed and tested that transforms the image of an object as produced by one tomograph to another image of the same object as it would have been seen by a different tomograph. The proposed new method, termed Transconvolution, compensates for differing imaging properties of different tomographs and particularly aims at quantitative comparability of PET/CT in the context of multicenter trials. METHODS To solve the problem of image normalization, the theory of Transconvolution was mathematically established together with new methods to handle point spread functions of different PET/CT systems. Knowing the point spread functions of two different imaging systems allows determining a Transconvolution function to convert one image into the other. This function is calculated by convolving one point spread function with the inverse of the other point spread function which, when adhering to certain boundary conditions such as the use of linear acquisition and image reconstruction methods, is a numerically accessible operation. For reliable measurement of such point spread functions characterizing different PET/CT systems, a dedicated solid-state phantom incorporating ⁶⁸Ge/⁶⁸Ga-filled spheres was developed. To iteratively determine and represent such point spread functions, exponential density functions in combination with a Gaussian distribution were introduced. Furthermore, simulation of a virtual PET system provided a standard imaging system with clearly defined properties to which the real PET systems were to be matched. A Hann window served as the modulation transfer function for the virtual PET. The Hann window's apodization properties suppressed high spatial frequencies above a certain critical frequency, thereby fulfilling the above-mentioned boundary conditions. The determined point spread functions were subsequently used by the novel Transconvolution algorithm to match different PET/CT systems onto the virtual PET system. Finally, the theoretically elaborated Transconvolution method was validated by transforming phantom images acquired on two different PET systems into nearly identical data sets, as they would be imaged by the virtual PET system. RESULTS The proposed Transconvolution method matched different PET/CT systems for an improved and reproducible determination of a normalized activity concentration. The highest difference in measured activity concentration between the two different PET systems of 18.2% was found in spheres of 2 ml volume. Transconvolution reduced this difference down to 1.6%. In addition to reestablishing comparability, the new method with its parameterization of point spread functions allowed a full characterization of imaging properties of the examined tomographs.
CONCLUSIONS By matching different tomographs to a virtual standardized imaging system, Transconvolution opens a new comprehensive method for cross calibration in quantitative PET imaging. The use of a virtual PET system restores comparability between data sets from different PET systems by exerting a common, reproducible, and defined partial volume effect.
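
A minimal sketch (not the published implementation) of the core operation described above: in the Fourier domain, convolving with the inverse of the measured system's point spread function and applying a Hann-window transfer function for the virtual system amounts to a regularized division and multiplication of transfer functions. All PSF widths and cutoffs here are illustrative assumptions.

```python
import numpy as np

def gaussian_psf(shape, sigma_px):
    """Isotropic Gaussian PSF, centred, normalised to unit sum."""
    grids = np.meshgrid(*[np.arange(n) - n // 2 for n in shape], indexing="ij")
    r2 = sum(g ** 2 for g in grids)
    psf = np.exp(-r2 / (2 * sigma_px ** 2))
    return psf / psf.sum()

def hann_mtf(shape, cutoff):
    """Radial Hann window acting as the virtual system's modulation transfer
    function; suppresses spatial frequencies above `cutoff` (cycles/pixel)."""
    freqs = np.meshgrid(*[np.fft.fftfreq(n) for n in shape], indexing="ij")
    r = np.sqrt(sum(f ** 2 for f in freqs))
    mtf = 0.5 * (1 + np.cos(np.pi * r / cutoff))
    mtf[r > cutoff] = 0.0
    return mtf

def transconvolve(image, system_sigma_px, cutoff=0.25, eps=1e-3):
    """Transform an image from a real system into the image the virtual
    (Hann-limited) system would have produced."""
    shape = image.shape
    H_sys = np.fft.fftn(np.fft.ifftshift(gaussian_psf(shape, system_sigma_px)))
    H_virtual = hann_mtf(shape, cutoff)
    # Division by H_sys implements convolution with the inverse PSF; the Hann
    # cutoff keeps the operation numerically well behaved (boundary condition).
    kernel = H_virtual / (H_sys + eps)
    return np.fft.ifftn(np.fft.fftn(image) * kernel).real

# Example: a small phantom "sphere" imaged by a system with ~2-pixel blur.
phantom = np.zeros((64, 64))
phantom[28:36, 28:36] = 1.0
matched = transconvolve(phantom, system_sigma_px=2.0)
print(matched.max())
```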

Relevance:

60.00%

Publisher:

Abstract:

Botanical data are widely used as terrestrial proxy data for climate reconstructions. Using a newly established method based on probability density functions (pdf-method), the temperature development throughout the last interglacial, the Eemian, is reconstructed for the two German sites Bispingen and Gröbern and the French site La Grande Pile. The results are compared with previous reconstructions using other methods. After a steep increase in January as well as July temperatures in the early phase of the interglacial, the reconstructed most probable climate appears to be slightly warmer than today. While the temperature is reconstructed as relatively stable throughout the Eemian, a certain tendency towards cooler January temperatures is evident. January temperatures decreased from approx. 2–3 °C in the early part to approx. −3 °C in the later part at Bispingen, and from approx. 2 °C to approx. −1 °C at Gröbern and La Grande Pile. A major drop to about −8 °C marks the very end of the interglacial at all three sites. While these results agree well with other proxy data and former reconstructions based on the indicator species method, the results differ significantly from reconstructions based on the modern pollen analogue technique ("pollen transfer functions"). The lack of modern analogues is assumed to be the main reason for the discrepancies. It is concluded that any reconstruction method needs to be evaluated carefully in this respect if used for periods lacking modern analogous plant communities.
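
A minimal sketch of the idea behind a pdf-based reconstruction (Gaussian climate responses and purely illustrative parameter values, not the calibration used in the study): the probability density functions of mean January temperature associated with each taxon present in a sample are multiplied, and the maximum of the combined density is read off as the most probable temperature.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical modern climate ranges (mean, std in °C) of three taxa.
taxa = {"Carpinus": (1.0, 3.0), "Hedera": (2.5, 2.5), "Ilex": (3.5, 2.0)}

temps = np.linspace(-15, 15, 601)
combined = np.ones_like(temps)
for mean, std in taxa.values():
    combined *= norm.pdf(temps, mean, std)   # intersection of climate ranges
combined /= combined.sum()                    # discrete normalisation (argmax unaffected)

most_probable = temps[np.argmax(combined)]
print(f"most probable January temperature: {most_probable:.1f} °C")
```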

Relevance:

60.00%

Publisher:

Abstract:

In this paper we present a global overview of the recent study carried out in Spain for the new hazard map, whose final goal is the revision of the Building Code in our country (NCSE-02). The study was carried out by a working group joining experts from the Instituto Geográfico Nacional (IGN) and the Technical University of Madrid (UPM), with the different phases of the work supervised by a committee of national experts from the public institutions involved in seismic hazard. The PSHA method (Probabilistic Seismic Hazard Assessment) has been followed, quantifying the epistemic uncertainties through a logic tree and the aleatory ones, linked to the variability of parameters, by means of probability density functions and Monte Carlo simulations. In a first phase, the inputs were prepared, which essentially are: 1) a project catalogue updated and homogenized to Mw; 2) proposed zoning models and source characterization; 3) Ground Motion Prediction Equations (GMPEs) calibrated with actual data, together with a local model developed from data collected in Spain for Mw < 5.5. In a second phase, a sensitivity analysis of the different input options on hazard results was carried out in order to establish criteria for defining the branches of the logic tree and their weights. Finally, the hazard estimation was done with the logic tree shown in figure 1, including nodes to quantify the uncertainties corresponding to: 1) the method for hazard estimation (zoning and zoneless); 2) the zoning models; 3) the GMPE combinations used; and 4) the regression method for estimating source parameters. In addition, the aleatory uncertainties corresponding to the magnitude of the events, the recurrence parameters, and the maximum magnitude for each zone were also considered by means of probability density functions and Monte Carlo simulations. The main conclusions of the study are presented here, together with the results obtained in terms of PGA and other spectral accelerations SA(T) for return periods of 475, 975, and 2475 years. Maps of the coefficient of variation (COV) are also presented to give an idea of the zones where the dispersion among results is highest and the zones where the results are robust.
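
A minimal sketch (toy ground-motion relation and illustrative parameter values, not the study's models) of how aleatory uncertainty in recurrence parameters and maximum magnitude can be propagated by Monte Carlo sampling in a PSHA-style calculation; `toy_gmpe` is a made-up relation used only to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_gmpe(mag, dist_km):
    """Hypothetical median ln(PGA in g) and sigma for one source zone."""
    return -1.5 + 0.9 * mag - 1.2 * np.log(dist_km + 10.0), 0.6

def exceedance_rate(pga_g, b_value, m_max, rate_m4=0.05, dist_km=30.0,
                    n_samples=20000):
    """Annual rate of exceeding `pga_g`, integrating over magnitude and
    ground-motion variability by simulation (truncated G-R recurrence)."""
    u = rng.random(n_samples)
    beta = b_value * np.log(10.0)
    # Magnitudes sampled from a Gutenberg-Richter law truncated to [4, m_max].
    mags = 4.0 - np.log(1 - u * (1 - np.exp(-beta * (m_max - 4.0)))) / beta
    ln_med, sigma = toy_gmpe(mags, dist_km)
    ln_pga = ln_med + rng.normal(0.0, sigma, n_samples)
    return rate_m4 * np.mean(ln_pga > np.log(pga_g))

# Aleatory uncertainty on the b-value and Mmax represented by probability
# density functions (normal and uniform here, purely for illustration).
rates = [exceedance_rate(0.1, rng.normal(1.0, 0.1), rng.uniform(6.5, 7.5))
         for _ in range(200)]
print("mean annual exceedance rate:", np.mean(rates),
      "COV:", np.std(rates) / np.mean(rates))
```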

Relevance:

60.00%

Publisher:

Abstract:

The application of a recently developed model of the sonic anemometer measuring process has revealed that these sensors cannot be considered absolute ones when measuring spectral characteristics of turbulent wind speed, since it is demonstrated that the ratios of measured to real spectral density functions depend on the composition and temperature of the considered planetary atmosphere. The new model of the measuring process of sonic anemometers is applied to describe the measuring characteristics of these sensors as fluid/flow dependent (against the traditional hypothesis of fluid/flow independence) and hence dependent on the considered planetary atmosphere. The influence of fluid and flow characteristics (quantified via the Mach number of the flow) and the influence of the design parameters of sonic anemometers (mainly represented by the time delay between pulse shots and the geometry) on turbulence measurement are quantified for the atmospheres of Mars, Jupiter, and Earth. Important differences between the behavior of these sensors for the same averaged wind speed in the three considered atmospheres are detected, in terms of turbulence measurement characteristics as well as of the optimum values of the anemometer design parameters for application in the different planetary atmospheres. These differences cannot be detected by traditional models of the sonic anemometer measuring process based on line averaging along the sonic acoustic paths.
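
For reference, a minimal sketch of the traditional line-averaging model that the abstract contrasts with the new fluid/flow-dependent description: averaging the wind speed along an acoustic path of length l attenuates spectral density by a squared sinc transfer function. Path lengths below are illustrative assumptions.

```python
import numpy as np

def line_average_transfer(k, path_length_m):
    """Ratio of measured to true spectral density at wavenumber k (rad/m)
    under a simple path-averaging sensor model."""
    x = k * path_length_m / 2.0
    return np.sinc(x / np.pi) ** 2   # np.sinc uses the normalised sinc

k = np.logspace(-2, 2, 5)            # wavenumbers, rad/m
for L in (0.10, 0.15):               # hypothetical sonic path lengths, m
    print(L, line_average_transfer(k, L))
```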

Relevance:

60.00%

Publisher:

Abstract:

Master's thesis in Physics, presented to the Universidade de Lisboa through the Faculdade de Ciências, 2016.

Relevance:

60.00%

Publisher:

Abstract:

The ERS-1 Satellite was launched in July 1991 by the European Space Agency into a polar orbit at about 800 km, carrying a C-band scatterometer. A scatterometer measures the amount of radar backscatter generated by small ripples on the ocean surface induced by instantaneous local winds. Operational methods that extract wind vectors from satellite scatterometer data are based on the local inversion of a forward model, mapping scatterometer observations to wind vectors, by the minimisation of a cost function in the scatterometer measurement space.

This report uses mixture density networks, a principled method for modelling conditional probability density functions, to model the joint probability distribution of the wind vectors given the satellite scatterometer measurements in a single cell (the 'inverse' problem). The complexity of the mapping and the structure of the conditional probability density function are investigated by varying the number of units in the hidden layer of the multi-layer perceptron and the number of kernels in the Gaussian mixture model of the mixture density network, respectively. The optimal model for networks trained per trace has twenty hidden units and four kernels. Further investigation shows that models trained with incidence angle as an input have results comparable to those of models trained by trace. A hybrid mixture density network that incorporates geophysical knowledge of the problem confirms other results that the conditional probability distribution is dominantly bimodal.

The wind retrieval results improve on previous work at Aston, but do not match other neural network techniques that use spatial information in the inputs, which is to be expected given the ambiguity of the inverse problem. Current work uses the local inverse model for autonomous ambiguity removal in a principled Bayesian framework. Future directions in which these models may be improved are given.
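
A minimal sketch of a mixture density network (assumed architecture, synthetic data, and PyTorch as the framework, none of which reflect the report's actual implementation): a multi-layer perceptron whose outputs parameterise a Gaussian mixture over the target, trained by maximising the conditional log-likelihood.

```python
import torch
import torch.nn as nn

class MDN(nn.Module):
    def __init__(self, n_in, n_hidden=20, n_kernels=4):
        super().__init__()
        self.hidden = nn.Sequential(nn.Linear(n_in, n_hidden), nn.Tanh())
        self.pi = nn.Linear(n_hidden, n_kernels)          # mixing coefficients
        self.mu = nn.Linear(n_hidden, n_kernels)          # kernel means
        self.log_sigma = nn.Linear(n_hidden, n_kernels)   # kernel widths

    def forward(self, x):
        h = self.hidden(x)
        return (torch.log_softmax(self.pi(h), dim=-1),
                self.mu(h), self.log_sigma(h))

def nll(log_pi, mu, log_sigma, y):
    """Negative log-likelihood of y under the predicted Gaussian mixture."""
    dist = torch.distributions.Normal(mu, log_sigma.exp())
    log_prob = dist.log_prob(y.unsqueeze(-1)) + log_pi
    return -torch.logsumexp(log_prob, dim=-1).mean()

# Synthetic multi-valued data standing in for the ambiguous inverse problem.
x = torch.rand(1000, 1)
sign = torch.sign(torch.randn(1000))
y = sign * x.squeeze(-1) + 0.05 * torch.randn(1000)

model = MDN(n_in=1)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for step in range(500):
    opt.zero_grad()
    loss = nll(*model(x), y)
    loss.backward()
    opt.step()
print("final NLL:", float(loss))
```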

Relevance:

60.00%

Publisher:

Abstract:

We investigate the feasibility of simultaneously suppressing the amplification noise and nonlinearity, representing the most fundamental limiting factors in modern optical communication. To accomplish this task we developed a general design optimisation technique, based on concepts of noise and nonlinearity management. We demonstrate the immense efficiency of the novel approach by applying it to a design optimisation of transmission lines with periodic dispersion compensation using Raman and hybrid Raman-EDFA amplification. Moreover, we showed, using nonlinearity management considerations, that the optimal performance in high bit-rate dispersion-managed fibre systems with hybrid amplification is achieved for a certain amplifier spacing – which is different from the commonly known optimal noise performance corresponding to fully distributed amplification. Complete knowledge of the signal statistics, required for an accurate estimation of the bit error rate, is crucial for modern transmission links with strong inherent nonlinearity. Therefore, we implemented the advanced multicanonical Monte Carlo (MMC) method, acknowledged for its efficiency in estimating distribution tails. We have accurately computed marginal probability density functions for soliton parameters by numerical modelling of the Fokker-Planck equation applying the MMC simulation technique. Moreover, applying the powerful MMC method we have studied the BER penalty caused by deviations from the optimal decision level in systems employing in-line 2R optical regeneration. We have demonstrated that in such systems the analytical linear approximation that makes a better fit in the central part of the regenerator nonlinear transfer function produces a more accurate approximation of the BER and BER penalty. We also present a statistical analysis of the RZ-DPSK optical signal at a direct-detection receiver with Mach-Zehnder interferometer demodulation.
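
A minimal flat-histogram sketch in the spirit of multicanonical Monte Carlo (Wang-Landau-style weight updates on a toy observable, not the thesis's fibre model): the biased random walk visits the far tail of the distribution of the mean of 16 Gaussian variables, a region ordinary Monte Carlo rarely reaches.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 16
edges = np.linspace(0.0, 1.0, 21)          # bins of the observable
log_g = np.zeros(len(edges) - 1)           # log of estimated (unnormalised) pdf

def bin_of(x):
    return int(np.clip(np.searchsorted(edges, x) - 1, 0, len(log_g) - 1))

state = rng.normal(size=N)
b = bin_of(abs(state.mean()))
log_f = 1.0                                 # weight-update increment
for stage in range(12):                     # each stage refines the weights
    for _ in range(20000):
        i = rng.integers(N)
        proposal = state.copy()
        proposal[i] = rng.normal()          # resample one input from its prior
        nb = bin_of(abs(proposal.mean()))
        # Accept with probability min(1, g[b]/g[nb]) to flatten the histogram.
        if np.log(rng.random()) < log_g[b] - log_g[nb]:
            state, b = proposal, nb
        log_g[b] += log_f                   # update current bin's weight
    log_f /= 2.0                            # finer updates in later stages

log_pdf = log_g - log_g.max()
print("estimated relative log-density per bin:", np.round(log_pdf, 1))
```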

Relevance:

60.00%

Publisher:

Abstract:

The ERS-1 Satellite was launched in July 1991 by the European Space Agency into a polar orbit at about 800 km, carrying a C-band scatterometer. A scatterometer measures the amount of backscatter microwave radiation reflected by small ripples on the ocean surface induced by sea-surface winds, and so provides instantaneous snap-shots of wind flow over large areas of the ocean surface, known as wind fields. Inherent in the physics of the observation process is an ambiguity in wind direction; the scatterometer cannot distinguish if the wind is blowing toward or away from the sensor device. This ambiguity implies that there is a one-to-many mapping between scatterometer data and wind direction. Current operational methods for wind field retrieval are based on the retrieval of wind vectors from satellite scatterometer data, followed by a disambiguation and filtering process that is reliant on numerical weather prediction models. The wind vectors are retrieved by the local inversion of a forward model, mapping scatterometer observations to wind vectors, and minimising a cost function in scatterometer measurement space. This thesis applies a pragmatic Bayesian solution to the problem. The likelihood is a combination of conditional probability distributions for the local wind vectors given the scatterometer data. The prior distribution is a vector Gaussian process that provides the geophysical consistency for the wind field. The wind vectors are retrieved directly from the scatterometer data by using mixture density networks, a principled method to model multi-modal conditional probability density functions. The complexity of the mapping and the structure of the conditional probability density function are investigated. A hybrid mixture density network, that incorporates the knowledge that the conditional probability distribution of the observation process is predominantly bi-modal, is developed. The optimal model, which generalises across a swathe of scatterometer readings, is better on key performance measures than the current operational model. Wind field retrieval is approached from three perspectives. The first is a non-autonomous method that confirms the validity of the model by retrieving the correct wind field 99% of the time from a test set of 575 wind fields. The second technique takes the maximum a posteriori probability wind field retrieved from the posterior distribution as the prediction. For the third technique, Markov Chain Monte Carlo (MCMC) techniques were employed to estimate the mass associated with significant modes of the posterior distribution, and make predictions based on the mode with the greatest mass associated with it. General methods for sampling from multi-modal distributions were benchmarked against a specific MCMC transition kernel designed for this problem. It was shown that the general methods were unsuitable for this application due to computational expense. On a test set of 100 wind fields the MAP estimate correctly retrieved 72 wind fields, whilst the sampling method correctly retrieved 73 wind fields.
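
A minimal sketch (toy 1-D problem, not the thesis implementation) of the disambiguation idea described above: each cell has two candidate wind directions from a bimodal local likelihood, and the configuration that maximises likelihood times a Gaussian smoothness prior over neighbouring cells is taken as the MAP wind field.

```python
import itertools
import numpy as np

rng = np.random.default_rng(3)
true_dirs = np.linspace(40.0, 60.0, 6)                  # degrees, slowly varying
candidates = np.stack([true_dirs + rng.normal(0, 3, 6),
                       true_dirs + 180.0], axis=1)       # genuine and flipped
cand_logp = np.log(np.array([0.55, 0.45]))               # near-equal ambiguity

def log_prior(dirs, length_deg=15.0):
    """Gaussian smoothness prior on differences of neighbouring directions."""
    return -0.5 * np.sum(np.diff(dirs) ** 2) / length_deg ** 2

best, best_lp = None, -np.inf
for choice in itertools.product([0, 1], repeat=len(true_dirs)):
    idx = np.array(choice)
    dirs = candidates[np.arange(len(true_dirs)), idx]
    lp = cand_logp[idx].sum() + log_prior(dirs)           # likelihood x prior
    if lp > best_lp:
        best, best_lp = dirs, lp

print("MAP wind directions:", np.round(best, 1))
```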

Relevance:

60.00%

Publisher:

Abstract:

This thesis describes the procedure and results from four years' research undertaken through the IHD (Interdisciplinary Higher Degrees) Scheme at Aston University in Birmingham, sponsored by the SERC (Science and Engineering Research Council) and Monk Dunstone Associates, Chartered Quantity Surveyors. A stochastic networking technique VERT (Venture Evaluation and Review Technique) was used to model the pre-tender costs of public health, heating ventilating, air-conditioning, fire protection, lifts and electrical installations within office developments. The model enabled the quantity surveyor to analyse, manipulate and explore complex scenarios which previously had defied ready mathematical analysis. The process involved the examination of historical material costs, labour factors and design performance data. Components and installation types were defined and formatted. Data was updated and adjusted using mechanical and electrical pre-tender cost indices and location, selection of contractor, contract sum, height and site condition factors. Ranges of cost, time and performance data were represented by probability density functions and defined by constant, uniform, normal and beta distributions. These variables and a network of the interrelationships between services components provided the framework for analysis. The VERT program, in this particular study, relied upon Monte Carlo simulation to model the uncertainties associated with pre-tender estimates of all possible installations. The computer generated output in the form of relative and cumulative frequency distributions of current element and total services costs, critical path analyses and details of statistical parameters. From this data alternative design solutions were compared, the degree of risk associated with estimates was determined, heuristics were tested and redeveloped, and cost significant items were isolated for closer examination. The resultant models successfully combined cost, time and performance factors and provided the quantity surveyor with an appreciation of the cost ranges associated with the various engineering services design options.
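
A minimal sketch (invented cost elements and parameters, not the thesis data) of the Monte Carlo idea underlying the VERT cost model: ranges of element costs are represented by probability density functions (constant, uniform, normal, beta), sampled repeatedly, and combined into a distribution of total services cost.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10000  # simulation runs

# Hypothetical cost elements (currency units) with different distributions.
hvac = rng.normal(250_000, 20_000, n)                 # normal
electrical = rng.uniform(120_000, 160_000, n)         # uniform
lifts = 60_000 + 40_000 * rng.beta(2.0, 5.0, n)       # beta over a range
fire = np.full(n, 35_000.0)                           # constant

total = hvac + electrical + lifts + fire
for q in (10, 50, 90):
    print(f"{q}th percentile of total services cost: {np.percentile(total, q):,.0f}")
```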

Relevance:

60.00%

Publisher:

Abstract:

The process of manufacturing system design frequently includes modeling, and usually, this means applying a technique such as discrete event simulation (DES). However, the computer tools currently available to apply this technique enable only a superficial representation of the people who operate within the systems. This is a serious limitation because the performance of people remains central to the competitiveness of many manufacturing enterprises. Therefore, this paper explores the use of probability density functions to represent the variation of worker activity times within DES models.
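
A minimal sketch of this idea (hypothetical task-time data, and the simpy package rather than the commercial tools discussed in the paper): inside a discrete event simulation, a worker's activity time is drawn from a fitted probability density function instead of being treated as a constant.

```python
import numpy as np
import simpy

rng = np.random.default_rng(5)

# Hypothetical observed assembly times (minutes); fit a lognormal PDF to them.
observed = np.array([4.1, 5.0, 3.8, 6.2, 4.7, 5.5, 4.3, 7.1, 4.9, 5.2])
mu, sigma = np.log(observed).mean(), np.log(observed).std()

completed = 0

def worker(env):
    global completed
    while True:
        activity_time = rng.lognormal(mu, sigma)   # sample from the fitted PDF
        yield env.timeout(activity_time)           # perform the task
        completed += 1

env = simpy.Environment()
env.process(worker(env))
env.run(until=8 * 60)                              # one 8-hour shift, in minutes
print("tasks completed in shift:", completed)
```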

Relevance:

60.00%

Publisher:

Abstract:

We present a data-based statistical study on the effects of seasonal variations in the growth rates of gastro-intestinal (GI) parasitic infection in livestock. The alluded growth rate is estimated through the variation in the number of eggs per gram (EPG) of faeces in animals. In accordance with earlier studies, our analysis too shows that rainfall is the dominant variable in determining EPG infection rates compared to other macro-parameters like temperature and humidity. Our statistical analysis clearly indicates an oscillatory dependence of EPG levels on rainfall fluctuations. The monsoon recorded the highest infection, with an increase of at least 2.5 times over the next most infected period (summer). A least-squares fit of the EPG versus rainfall data indicates an approach towards a super-diffusive (i.e., root mean square displacement growing faster than the square root of the elapsed time, as obtained for simple diffusion) infection growth pattern regime for low rainfall regimes (technically defined as zeroth-level dependence) that gets remarkably augmented for large rainfall zones. Our analysis further indicates that for low fluctuations in temperature (true of the bulk data), the EPG level saturates beyond a critical value of the rainfall, a threshold that is expected to indicate the onset of the nonlinear regime. The probability density functions (PDFs) of the EPG data show oscillatory behavior in the large rainfall regime (greater than 500 mm), the frequency of oscillation, once again, being determined by the ambient wetness (rainfall and humidity). Data recorded over three pilot projects spanning three measures of rainfall and humidity bear testimony to the universality of this statistical argument. © 2013 Chattopadhyay and Bandyopadhyay.
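
A minimal sketch (synthetic numbers, not the project's field data) of the kind of least-squares fit referred to above: regressing log EPG counts on log rainfall to read off a growth exponent, with exponents above 0.5 loosely interpreted, by the diffusion analogy, as super-diffusive.

```python
import numpy as np

rng = np.random.default_rng(6)
rainfall = np.linspace(50, 800, 30)                       # mm
epg = 120 * (rainfall / 100) ** 0.8 * rng.lognormal(0, 0.1, 30)

slope, intercept = np.polyfit(np.log(rainfall), np.log(epg), 1)
print(f"fitted growth exponent: {slope:.2f}")
```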

Relevance:

60.00%

Publisher:

Abstract:

We derive rigorously the Fokker-Planck equation that governs the statistics of soliton parameters in optical transmission lines in the presence of additive amplifier spontaneous emission. We demonstrate that these statistics are generally non-Gaussian. We present exact marginal probability-density functions for soliton parameters for some cases. A WKB approach is applied to describe the tails of the probability-density functions. © 2005 Optical Society of America.
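
For orientation, the generic form such an equation takes for the joint probability density of the soliton parameters is reproduced below; the drift vector and diffusion matrix shown are schematic placeholders, not the expressions derived in the paper.

```latex
% Generic Fokker-Planck equation for the joint PDF P(\mathbf{q},z) of the
% soliton parameters \mathbf{q} along the propagation distance z; a_i and
% D_{ij} are placeholders for the paper's derived drift and diffusion terms.
\frac{\partial P(\mathbf{q},z)}{\partial z}
  = -\sum_i \frac{\partial}{\partial q_i}\bigl[a_i(\mathbf{q})\,P\bigr]
    + \sum_{i,j} \frac{\partial^2}{\partial q_i\,\partial q_j}\bigl[D_{ij}(\mathbf{q})\,P\bigr]
```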

Relevance:

60.00%

Publisher:

Abstract:

We present exact analytical results for the statistics of nonlinear coupled oscillators under the influence of additive white noise. We suggest a perturbative approach for analysing the statistics of such systems under the action of a deterministic perturbation, based on the exact expressions for probability density functions for noise-driven oscillators. Using our perturbation technique we show that our results can be applied to studying the optical signal propagation in noisy fibres at (nearly) zero dispersion as well as to weakly nonlinear lattice models with additive noise. The approach proposed can account for a wide spectrum of physically meaningful perturbations and is applicable to the case of large noise strength. © 2005 Elsevier B.V. All rights reserved.
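
As a generic textbook illustration of the type of exact result referred to (not the paper's specific system or expressions): for coupled modes with gradient dynamics driven by additive white noise, the stationary probability density is of Gibbs form.

```latex
% Generic illustration: for modes q_n obeying
%   dq_n/dt = -\partial H/\partial q_n + \eta_n(t),
%   \langle \eta_n(t)\,\eta_m(t') \rangle = 2D\,\delta_{nm}\,\delta(t-t'),
% the stationary probability density of the configuration is
\rho_{\mathrm{st}}(\{q_n\}) \;=\; Z^{-1}\exp\!\bigl[-H(\{q_n\})/D\bigr],
% the kind of exact noise-driven-oscillator statistics on which a
% perturbative treatment of deterministic perturbations can be built.
```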

Relevance:

60.00%

Publisher:

Abstract:

We study the statistics of optical data transmission in a noisy nonlinear fiber channel with weak dispersion management and zero average dispersion. Applying analytical expressions for the output probability density functions, both for a nonlinear channel and for a linear channel with additive and multiplicative noise, we calculate in closed form a lower-bound estimate of the Shannon capacity for an arbitrary signal-to-noise ratio.
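
The standard information-theoretic construction behind such a bound (schematic, not the paper's closed-form result) is that, for any fixed input distribution, the mutual information evaluated with the output probability density functions bounds the channel capacity from below.

```latex
% For any chosen input distribution p(x), with conditional output density
% p(y|x) and marginal p(y) = \int p(x)\,p(y|x)\,dx:
C \;\ge\; I(X;Y)
  \;=\; \int p(x)\,p(y \mid x)\,
        \log_2\frac{p(y \mid x)}{p(y)}\,\mathrm{d}x\,\mathrm{d}y .
```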