912 results for Dwarf Galaxy Fornax Distribution Function Action Based
Abstract:
In this paper we study buy-and-hold strategies for final wealth optimization problems in a multi-period setting. Since the final wealth is a sum of dependent random variables, each of which corresponds to an amount of capital invested in a particular asset at a given date, we first consider approximations that reduce the multivariate randomness to the univariate case. These approximations are then used to determine the buy-and-hold strategies that optimize, for a given probability level, the VaR and the CLTE of the distribution function of the final wealth. This paper complements the work of Dhaene et al. (2005), where constant rebalancing strategies were considered.
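Absent the comonotonic approximations the abstract alludes to, the VaR of a buy-and-hold final wealth can at least be sketched by Monte Carlo. Everything below (asset parameters, lognormal growth, independence across assets) is an illustrative assumption, not the paper's setting; the paper deals precisely with the dependent case.

```python
import random

random.seed(0)

def terminal_wealth(amounts, mus, sigmas, T):
    # Buy-and-hold: each invested amount grows lognormally until horizon T.
    # Independence across assets is a simplifying assumption here.
    return sum(a * random.lognormvariate((m - 0.5 * s * s) * T, s * T ** 0.5)
               for a, m, s in zip(amounts, mus, sigmas))

def var_quantile(samples, p):
    # Lower p-quantile of final wealth: the level below which
    # wealth falls with probability p.
    s = sorted(samples)
    return s[int(p * len(s))]

# Hypothetical two-asset portfolio: 60/40 split, invented drifts/volatilities.
samples = [terminal_wealth([0.6, 0.4], [0.05, 0.08], [0.15, 0.25], 1.0)
           for _ in range(50000)]
v05 = var_quantile(samples, 0.05)  # 5% VaR level of final wealth
```

The same empirical-quantile routine applies to the CLTE once the tail samples below the VaR level are averaged.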
Abstract:
In this paper we consider diffusion of a passive substance C in a temporally and spatially inhomogeneous two-dimensional medium. As a realization of the latter we choose a phase-separating medium consisting of two substances A and B, whose dynamics are determined by the Cahn-Hilliard equation. Assuming different diffusion coefficients of C in A and B, we find that the variance of the distribution function of the said substance grows less than linearly in time. We derive a simple identity for the variance using a probabilistic ansatz and are then able to identify the interface between A and B as the main cause of this nonlinear dependence. We argue that, finally, for very large times the time-dependent diffusion "constant" approaches a constant asymptotic value D∞ as t^(-1/3). The latter is calculated approximately by employing the effective-medium approximation and by fitting the simulation data to the said time dependence.
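The fitting step described at the end can be illustrated with synthetic data: if D(t) = D∞ + a·t^(−1/3), a simple linear regression in the variable u = t^(−1/3) recovers both D∞ (intercept) and a (slope). The numbers below are invented for illustration, not taken from the paper's simulations.

```python
# Synthetic "simulation data" following D(t) = Dinf + a * t**(-1/3).
ts = [10, 20, 50, 100, 200, 500, 1000]
Dinf_true, a_true = 0.5, 0.8
Ds = [Dinf_true + a_true * t ** (-1 / 3) for t in ts]

# Ordinary least squares on the transformed variable u = t**(-1/3):
us = [t ** (-1 / 3) for t in ts]
n = len(ts)
ubar = sum(us) / n
Dbar = sum(Ds) / n
slope = sum((u - ubar) * (D - Dbar) for u, D in zip(us, Ds)) \
    / sum((u - ubar) ** 2 for u in us)
Dinf = Dbar - slope * ubar  # intercept = asymptotic diffusion constant
```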
Abstract:
When dealing with multi-angular image sequences, problems of reflectance changes due either to the illumination and acquisition geometry or to interactions with the atmosphere naturally arise. These phenomena interplay with the scene and lead to a modification of the measured radiance: for example, according to the angle of acquisition, tall objects may be seen from the top or from the side, and different light scatterings may affect the surfaces. This results in shifts in the acquired radiance that make the problem of multi-angular classification harder and might lead to catastrophic results, since surfaces with the same reflectance return significantly different signals. In this paper, rather than performing atmospheric or bi-directional reflectance distribution function (BRDF) correction, a non-linear manifold learning approach is used to align data structures. This method maximizes the similarity between the different acquisitions by deforming their manifold, thus enhancing the transferability of classification models among the images of the sequence.
Abstract:
Lettuce greenhouse experiments were carried out from March to June 2011 in order to analyze how pesticides behave from the time of application until their intake via human consumption, taking into account the primary distribution of pesticides, field dissipation, and post-harvest processing. In addition, the experimental conditions were used to evaluate a new dynamic plant uptake model, comparing its results with the experimentally derived residues. One application of imidacloprid and two of azoxystrobin were conducted. For evaluating primary pesticide distribution, two approaches based on leaf area index and vegetation cover were used, and results were compared with those obtained from a tracer test. Lettuce density, growth stage and type of sprayer strongly influenced the primary distribution, showing that low densities or early growth stages implied high losses of pesticides to soil. Washed and unwashed samples of lettuce were taken and analyzed from application to harvest to evaluate the removal of pesticides by food processing. Results show that residues found on the Spanish preharvest interval days were in all cases below officially set maximum residue limits, although it was observed that the time between application and harvest is as important for residues as the application amounts. An overall reduction of 40–60% of pesticide residues was obtained from washing lettuce. Experimentally derived residues were compared with modeled residues and deviate by factors of 1.2 and 1.4 for imidacloprid and azoxystrobin, respectively, indicating good model predictions. Resulting human intake fractions range from... for imidacloprid to ... for azoxystrobin.
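The dissipation-and-washing chain can be sketched numerically. The sketch below assumes simple first-order foliar dissipation with a hypothetical half-life and the midpoint of the reported 40–60% washing reduction; it is a generic stand-in, not the paper's dynamic plant uptake model.

```python
from math import exp, log

# First-order foliar dissipation: R(t) = R0 * exp(-k * t).
half_life_days = 5.0          # hypothetical value, for illustration only
k = log(2) / half_life_days   # decay rate from the half-life

def residue(R0, t_days):
    # Residue remaining on the crop t_days after application.
    return R0 * exp(-k * t_days)

def washed(residue_value, removal=0.5):
    # Post-harvest washing: 50% removal, midpoint of the 40-60% range.
    return residue_value * (1 - removal)
```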
Abstract:
The distribution of distances from atoms of a particular element E to a probe atom X (oxygen in most cases), including both bonded and intermolecular non-bonded contacts, has been analyzed. In general, the distribution is characterized by a maximum at short E-X distances corresponding to chemical bonds, followed by a range of unpopulated distances (the van der Waals gap) and a second maximum at longer distances (the van der Waals peak) superimposed on a random distribution function that roughly follows a d^3 dependence. The analysis of more than five million interatomic "non-bonded" distances has led to the proposal of a consistent set of van der Waals radii for most naturally occurring elements, and its applicability to other element pairs has been tested on a set of more than three million data points, all of them compared to over one million bond distances.
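The d^3 baseline for random contacts can be checked with a toy simulation: for atoms scattered uniformly around a probe, the cumulative number of neighbours within distance d grows as the enclosed volume, i.e. as d^3. All the numbers below are illustrative, not from the paper's data set.

```python
import random

random.seed(2)
# Scatter points uniformly in a cube centred on the probe atom.
pts = [(random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
       for _ in range(200000)]

def count_within(d):
    # Cumulative number of neighbours within distance d of the probe.
    return sum(x * x + y * y + z * z <= d * d for x, y, z in pts)

# Doubling d should multiply the count by about 2**3 = 8
# (both spheres lie inside the cube for d <= 1).
ratio = count_within(0.8) / count_within(0.4)
```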
Abstract:
Aim Species distribution models (SDMs) based on current species ranges underestimate the potential distribution when projected in time and/or space. A multi-temporal model calibration approach has been suggested as an alternative, and we evaluate this using 13,000 years of data. Location Europe. Methods We used fossil-based records of presence for Picea abies, Abies alba and Fagus sylvatica and six climatic variables for the period 13,000 to 1000 yr BP. To measure the contribution of each 1000-year time step to the total niche of each species (the niche measured by pooling all the data), we employed a principal components analysis (PCA) calibrated with data over the entire range of possible climates. Then we projected both the total niche and the partial niches from single time frames into the PCA space, and tested whether the partial niches were more similar to the total niche than random. Using an ensemble forecasting approach, we calibrated SDMs for each time frame and for the pooled database. We projected each model to current climate and evaluated the results against current pollen data. We also projected all models into the future. Results Niche similarity between the partial and the total SDMs was almost always statistically significant and increased through time. SDMs calibrated from single time frames gave different results when projected to current climate, providing evidence of a change in the species' realized niches through time. Moreover, they predicted limited climate suitability when compared with the total SDMs. The same results were obtained when projected to future climates. Main conclusions The realized climatic niche of species differed for current and future climates when SDMs were calibrated considering different past climates. Building the niche as an ensemble through time represents a way forward to a better understanding of a species' range and its ecology in a changing climate.
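Comparing a partial (single time frame) niche against the total niche requires a similarity statistic. As an illustration, the sketch below uses Schoener's D on binned occurrence densities along one hypothetical climate axis; the actual statistic and the PCA-based niche space of the study may differ, and all data here are synthetic.

```python
import random

random.seed(3)

def schoener_d(p, q):
    # Schoener's D niche overlap between two discrete densities:
    # 1 means identical niches, 0 means no overlap.
    return 1 - 0.5 * sum(abs(a - b) for a, b in zip(p, q))

def density(samples, bins=20, lo=-4.0, hi=4.0):
    # Bin samples along one climate axis into a normalized histogram.
    counts = [0] * bins
    for x in samples:
        i = min(bins - 1, max(0, int((x - lo) / (hi - lo) * bins)))
        counts[i] += 1
    n = len(samples)
    return [c / n for c in counts]

# Hypothetical total niche vs a slightly shifted single-time-frame niche.
total = density([random.gauss(0.0, 1.0) for _ in range(20000)])
partial = density([random.gauss(0.3, 1.0) for _ in range(20000)])
d = schoener_d(total, partial)  # high but below 1: similar, shifted niches
```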
Abstract:
We provide an incremental quantile estimator for non-stationary streaming data, proposing a method for the simultaneous estimation of multiple quantiles corresponding to given probability levels. Owing to memory limitations, it is not feasible to compute the quantiles by storing the data, so estimating the quantiles as the data pass by is the only possibility. This can be effective in network measurement. To minimize the mean-squared error of the estimation we use a parabolic approximation, and for comparison we simulate the results for different numbers of runs using both linear and parabolic approximations.
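The basic idea of incremental quantile estimation can be sketched with a generic stochastic-approximation update (a linear scheme with a constant step size, chosen here for brevity; it is not the paper's parabolic estimator). Each sample nudges the running estimate without storing any data.

```python
import random

def update(q, x, p, step):
    # Nudge the estimate up when the sample exceeds it (weight p),
    # down otherwise (weight 1 - p); q settles near the p-quantile.
    return q + step * (p - (x < q))

random.seed(0)
q50, q90 = 0.0, 0.0
# Simultaneous estimation of two quantiles from one pass over the stream.
for _ in range(200000):
    x = random.gauss(0.0, 1.0)
    q50 = update(q50, x, 0.50, 0.005)
    q90 = update(q90, x, 0.90, 0.005)
# q50 settles near 0.0 and q90 near 1.28 (standard normal quantiles)
```

A constant step size keeps the estimator responsive when the stream is non-stationary, at the cost of residual noise around the true quantile.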
Abstract:
Fatigue life assessment of welded structures is commonly based on the nominal stress method, but more flexible and accurate methods have been introduced. In general, the assessment accuracy is improved as more localized information about the weld is incorporated. The structural hot spot stress method includes the influence of macro-geometric effects and structural discontinuities on the design stress but excludes the local features of the weld. In this thesis, the limitations of the structural hot spot stress method are discussed, and a modified structural stress method with improved accuracy is developed and verified for selected welded details. The fatigue life of structures in the as-welded state consists mainly of crack growth from pre-existing cracks or defects. The crack growth rate depends on the crack geometry and the stress state on the crack face plane. This means that the stress level and the shape of the stress distribution along the assumed crack path govern the total fatigue life. In many structural details the stress distribution is similar, and adequate fatigue life estimates can be obtained just by adjusting the stress level based on a single stress value, i.e., the structural hot spot stress. There are, however, cases for which the structural stress approach is less appropriate because the stress distribution differs significantly from the more common cases. Plate edge attachments and plates on elastic foundations are some examples of structures with this type of stress distribution. The importance of fillet weld size and weld load variation on the stress distribution is another central topic in this thesis. Structural hot spot stress determination is generally based on a procedure that involves extrapolation of plate surface stresses.
Other possibilities for determining the structural hot spot stress are to extrapolate stresses through the thickness at the weld toe or to use Dong's method, which includes through-thickness extrapolation at some distance from the weld toe. Both of these latter methods are less sensitive to the FE mesh used. Structural stress based on surface extrapolation is sensitive to the extrapolation points selected and to the FE mesh used near these points. Rules for proper meshing, however, are well defined and not difficult to apply. To improve the accuracy of the traditional structural hot spot stress, a multi-linear stress distribution is introduced. The magnitude of the weld toe stress after linearization depends on the weld size, weld load and plate thickness. Simple equations have been derived by comparing assessment results based on the local linear stress distribution with LEFM-based calculations. The proposed method is called the modified structural stress method (MSHS), since the structural hot spot stress (SHS) value is corrected using information on weld size and weld load. The correction procedure is verified using fatigue test results found in the literature. Also, a test case was conducted comparing the proposed method with other local fatigue assessment methods.
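The surface-extrapolation procedure mentioned above can be made concrete. With stresses read at the common 0.4t and 1.0t reference points ahead of the weld toe (an IIW-type linear rule; the exact reference points and coefficients vary between design codes, so treat this as one representative variant), the hot spot stress follows by linear extrapolation back to the toe.

```python
def hot_spot_stress(sigma_04t, sigma_10t):
    # Linear extrapolation of the surface stress from x = 0.4t and x = 1.0t
    # back to the weld toe at x = 0:
    #   sigma_hs = 1.67 * sigma(0.4t) - 0.67 * sigma(1.0t)
    return 1.67 * sigma_04t - 0.67 * sigma_10t

# Sanity check with a linearly varying surface stress field (MPa),
# sigma(x) = 100 - 10 * x / t, which should extrapolate back to ~100 MPa:
shs = hot_spot_stress(100 - 10 * 0.4, 100 - 10 * 1.0)
```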
Abstract:
The effect of the heat flux on the rate of chemical reaction in dilute gases is shown to be important for reactions characterized by high activation energies and in the presence of very large temperature gradients. This effect, obtained from the second-order terms in the distribution function (similar to those obtained in the Burnett approximation to the solution of the Boltzmann equation), is derived on the basis of information theory. It is shown that the analytical results describing the effect are simpler if the kinetic definition of the nonequilibrium temperature is introduced rather than the thermodynamic definition. The numerical results are nearly the same for both definitions.
Abstract:
The use of various IP-based services is constantly increasing while users are becoming ever more mobile. For this reason, the IP protocol will inevitably come to mobile networks as well. This thesis studies the problems that mobility introduces to IP multicasting and simulates them using Network Simulator. The main focus is on the problem caused by the multicast group-join delay. This problem is simulated to determine how the delay, the rate at which mobile users arrive at the service, and the timer settings of the Scalable Reliable Multicast (SRM) protocol affect the number of repair request packets and, through that, the number of retransmissions performed. To examine the influence of the different parameters, simulation results with varied parameters are presented using CDF curves. Based on the results, the most significant factors for retransmission requests are the protocol timer values and the desired service level, while the effect of the delay remains minor. Finally, the suitability of the SRM protocol for mobile networks is examined and alternatives for improving its operation are discussed.
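The SRM timer mechanism studied here works by randomized suppression: each receiver that detects a loss schedules its repair request uniformly at random over an interval scaled by its estimated delay to the data source, so that the closest host usually fires first and suppresses the others. A minimal sketch (the default constants c1 = c2 = 1 are assumptions; tuning them is exactly the knob the thesis investigates):

```python
import random

random.seed(1)

def srm_request_timer(d, c1=1.0, c2=1.0):
    # SRM-style back-off for a repair request: wait a random time drawn
    # uniformly from [c1 * d, (c1 + c2) * d], where d is the estimated
    # one-way delay (seconds) to the node whose data was lost.
    return random.uniform(c1 * d, (c1 + c2) * d)

# 1000 receivers at the same estimated 50 ms distance: their request
# times spread out over [50 ms, 100 ms], enabling suppression.
timers = [srm_request_timer(0.05) for _ in range(1000)]
```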
Abstract:
We propose a novel formulation to solve the problem of intra-voxel reconstruction of the fibre orientation distribution function (FOD) in each voxel of the white matter of the brain from diffusion MRI data. The majority of the state-of-the-art methods in the field perform the reconstruction on a voxel-by-voxel level, promoting sparsity of the orientation distribution. Recent methods have proposed a global denoising of the diffusion data using spatial information prior to reconstruction, while others promote spatial regularisation through an additional empirical prior on the diffusion image at each q-space point. Our approach reconciles voxelwise sparsity and spatial regularisation and defines a spatially structured FOD sparsity prior, where the structure originates from the spatial coherence of the fibre orientation between neighbouring voxels. The method is shown, through both simulated and real data, to enable accurate FOD reconstruction from a much lower number of q-space samples than the state of the art, typically 15 samples, even under quite adverse noise conditions.
Abstract:
In the Arabidopsis thaliana genome, over 1000 putative genes encoding small, presumably secreted, signalling peptides can be recognized. However, a major obstacle in identifying the function of genes encoding small signalling peptides is the limited number of available loss-of-function mutants. To overcome this, a promising new tool, antagonistic peptide technology, was recently developed. Here, this antagonistic peptide technology was tested on selected CLE peptides and the related IDA peptide, and its usefulness in the context of studies of peptide function is discussed. Based on the analyses, it was concluded that the antagonistic peptide approach is not the ultimate means to overcome redundancy or the lack of loss-of-function lines. However, information collected using antagonistic peptide approaches (in the broad sense) can be very useful; these approaches do not work in all cases, though, and require deep insight into the interaction between the ligand and its receptor to be successful. This, as well as peptide ligand structure considerations, should be taken into account before ordering a wide range of synthetic peptide variants and/or generating transgenic plants.
Abstract:
A statistical indentation method has been employed to study the hardness of fire-refined high-conductivity copper using the nanoindentation technique. The Joslin and Oliver approach was used with the aim of separating the hardness (H) contribution of the copper matrix from that of inclusions and grain boundaries. This approach relies on a large array of imprints (around 400 indentations) performed at 150 nm of indentation depth. A statistical study using a cumulative distribution function fit and simulated Gaussian distributions shows that H for each phase can be extracted when the indentation depth is much smaller than the size of the secondary phases. It is found that the thermal treatment produces a hardness increase, due to the partial re-dissolution of the inclusions (mainly Pb and Sn) in the matrix.
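The statistical step can be illustrated on synthetic data: with two well-separated hardness populations, the empirical CDF of a large imprint array plateaus in the gap between the modes, and the plateau value estimates the phase fraction of the softer phase. All numbers below are invented, not the paper's measurements.

```python
import random, statistics

random.seed(1)
# Hypothetical two-phase hardness sample (GPa): a soft Cu matrix plus
# harder inclusions, mimicking a 400-imprint statistical indentation grid.
H = sorted([random.gauss(1.2, 0.08) for _ in range(320)]
           + [random.gauss(2.0, 0.10) for _ in range(80)])

# Empirical CDF over the sorted hardness values.
cdf = [(h, (i + 1) / len(H)) for i, h in enumerate(H)]

# The CDF plateaus in the gap between the two modes; its value there
# estimates the areal fraction of the soft phase.
threshold = 1.6
matrix_fraction = max(F for h, F in cdf if h <= threshold)
matrix_H = statistics.mean(h for h in H if h <= threshold)
```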
Abstract:
Reinsurance is one of the tools that an insurer can use to mitigate the underwriting risk and thereby control its solvency. In this paper, we focus on proportional reinsurance arrangements and examine several optimization and decision problems of the insurer with respect to the reinsurance strategy. To this end, we use as decision tools not only the probability of ruin but also the random variable deficit at ruin, if ruin occurs. The discounted penalty function (Gerber & Shiu, 1998) is employed to calculate, as particular cases, the probability of ruin and the moments and distribution function of the deficit at ruin, if ruin occurs.
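For reference, the discounted penalty function of Gerber & Shiu (1998) cited above takes the standard form

```latex
\phi(u) = \mathbb{E}\!\left[\, e^{-\delta T}\, w\bigl(U(T^-),\, |U(T)|\bigr)\,
\mathbf{1}_{\{T < \infty\}} \;\middle|\; U(0) = u \,\right],
```

where $U(t)$ is the surplus process, $T$ the time of ruin, $\delta \ge 0$ a discount rate, and $w$ a penalty depending on the surplus just before ruin and the deficit at ruin. Choosing $w \equiv 1$ and $\delta = 0$ recovers the ruin probability, while suitable choices of $w$ yield the moments and the distribution function of the deficit at ruin, as used in the paper.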
Abstract:
In this study we used market settlement prices of European call options on stock index futures to extract the implied probability distribution function (PDF). The method used produces a PDF of returns of the underlying asset at the expiration date from the implied volatility smile. With this method, the assumption of lognormal distribution (Black-Scholes model) is tested. The market view of the asset price dynamics can then be used for various purposes (hedging, speculation). We used the so-called smoothing approach for implied PDF extraction presented by Shimko (1993). In our analysis we obtained implied volatility smiles from index futures markets (S&P 500 and DAX indices) and standardized them. The method introduced by Breeden and Litzenberger (1978) was then used for PDF extraction. The results show significant deviations from the assumption of lognormal returns for S&P 500 options, while DAX options mostly fit the lognormal distribution. A deviant subjective view of the PDF can be used to form a strategy, as discussed in the last section.
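The Breeden and Litzenberger (1978) relation states that the risk-neutral density is the discounted second strike derivative of the call price, f(K) = e^(rT) ∂²C/∂K². A self-check on synthetic Black-Scholes prices (all parameters invented for illustration) recovers the lognormal density by finite differences:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1 + erf(x / sqrt(2)))

def bs_call(S, K, r, sigma, T):
    # Black-Scholes price of a European call.
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Breeden-Litzenberger: f(K) = e^(rT) * d2C/dK2, here by central differences
# over a dense strike grid (hypothetical market parameters).
S, r, sigma, T, dK = 100.0, 0.02, 0.2, 0.5, 0.5
strikes = [60 + i * dK for i in range(161)]  # strikes 60..140
calls = [bs_call(S, K, r, sigma, T) for K in strikes]
density = [exp(r * T) * (calls[i - 1] - 2 * calls[i] + calls[i + 1]) / dK ** 2
           for i in range(1, len(calls) - 1)]
```

On real settlement prices the smile must first be smoothed (as in Shimko's approach) before differentiating, since raw quotes make the second derivative noisy.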