941 results for Chebyshev And Binomial Distributions


Relevance:

100.00%

Publisher:

Abstract:

Numerical experiments with different idealized land and mountain distributions are carried out to study the formation of the Asian monsoon and related coupling processes. Results demonstrate that when there is only an extratropical continent located between 0° and 120°E and between 20°/30°N and the North Pole, a rather weak monsoon rainband appears along the southern border of the continent, coexisting with an intense intertropical convergence zone (ITCZ). The continuous ITCZ surrounds the whole globe, prohibits the development of near-surface cross-equatorial flow, and collects water vapor from tropical oceans, resulting in very weak monsoon rainfall. When tropical lands are integrated, the ITCZ over the longitude domain where the extratropical continent exists disappears as a consequence of the development of a strong surface cross-equatorial flow from the winter hemisphere to the summer hemisphere. In addition, an intense interaction between the two hemispheres develops, tropical water vapor is transported to the subtropics by the enhanced poleward flow, and a prototype of the Asian monsoon appears. The Tibetan Plateau acts to enhance the coupling between the lower and upper tropospheric circulations and between the subtropical and tropical monsoon circulations, resulting in an intensification of the East Asian summer monsoon and a weakening of the South Asian summer monsoon. Linking the Iranian Plateau to the Tibetan Plateau substantially reduces the precipitation over Africa and increases the precipitation over the Arabian Sea and the northern Indian subcontinent, effectively contributing to the development of the South Asian summer monsoon.

Relevance:

100.00%

Publisher:

Abstract:

Changes in the frequency and intensity of cyclones and associated windstorms affecting the Mediterranean region simulated under enhanced greenhouse gas forcing conditions are investigated. The analysis is based on seven climate model integrations performed with two coupled global models (ECHAM5/MPIOM and INGV CMCC), comparing the end of the twentieth century with at least the first half of the twenty-first century. As one of the models has a considerably enhanced resolution of the atmosphere and the ocean, it is also investigated whether the climate change signals are influenced by the model resolution. While the higher-resolution simulation is closer to the reanalysis climatology, both in terms of cyclone and windstorm distributions, there is no evidence for an influence of the resolution on the sign of the climate change signal. All model simulations show a reduction in the total number of cyclones crossing the Mediterranean region under climate change conditions. Exceptions are Morocco and the Levant region, where the models predict an increase in the number of cyclones. The reduction is especially strong for cyclones that are intense in terms of their Laplacian of pressure. The influence of the simulated positive shift in the NAO index on the cyclone decrease is restricted to the Western Mediterranean region, where it explains 10–50 % of the simulated trend, depending on the individual simulation. With respect to windstorms, decreases are simulated over most of the Mediterranean basin. This overall reduction is due to a decrease in the number of events associated with local cyclones, while the number of events associated with cyclones outside of the Mediterranean region slightly increases. These systems are, however, less intense in terms of their integrated severity over the Mediterranean area, as they mostly affect the fringes of the region.
In spite of the general reduction in total numbers, several cyclones and windstorms with intensities unknown under current climate conditions are identified in the scenario simulations. For these events, no common trend exists across the individual simulations. Thus, they may be attributed to long-term (e.g. decadal) variability rather than to the greenhouse gas forcing. Nevertheless, the result indicates that high-impact weather systems will remain an important risk in the Mediterranean Basin.
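The cyclone-intensity measure used above, the Laplacian of pressure, can be sketched numerically. Below is a minimal illustration on an idealized Gaussian depression; the grid spacing, background pressure, depth and radius are invented for the example and are not taken from the study:

```python
import numpy as np

def laplacian(field, dx=1.0):
    """Discrete 5-point Laplacian of a 2-D field (interior points only)."""
    return (field[:-2, 1:-1] + field[2:, 1:-1]
            + field[1:-1, :-2] + field[1:-1, 2:]
            - 4.0 * field[1:-1, 1:-1]) / dx**2

# Idealized low-pressure system: a Gaussian depression on a 1013 hPa background.
x = np.linspace(-1000e3, 1000e3, 101)                              # metres
X, Y = np.meshgrid(x, x)
mslp = 1013e2 - 30e2 * np.exp(-(X**2 + Y**2) / (2 * (300e3)**2))   # Pa

lap = laplacian(mslp, dx=x[1] - x[0])
# The Laplacian of pressure peaks at the cyclone centre, which is why it is
# used as an intensity measure: deeper, tighter lows have a larger Laplacian.
i, j = np.unravel_index(lap.argmax(), lap.shape)
```

In the interior (99 x 99) grid the maximum lands at the depression's centre, so ranking cyclones by this quantity ranks them by how sharply curved their pressure minimum is.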

Relevance:

100.00%

Publisher:

Abstract:

Boreal winter wind storm situations over Central Europe are investigated by means of an objective cluster analysis. Surface data from the NCEP reanalysis and an ECHAM4/OPYC3 climate change GHG simulation (IS92a) are considered. To achieve an optimum separation of clusters of extreme storm conditions, 55 clusters of weather patterns are differentiated. To reduce the computational effort, a PCA is initially performed, leading to a data reduction of about 98 %. The clustering itself was computed on 3-day periods constructed from the first six PCs using the "k-means" clustering algorithm. The applied method enables an evaluation of the time evolution of the synoptic developments. The climate change signal is constructed by a projection of the GCM simulation onto the EOFs obtained from the NCEP reanalysis. Consequently, the same clusters are obtained and frequency distributions can be compared. For Central Europe, four primary storm clusters are identified. These clusters feature almost 72 % of the historical extreme storm events while accounting for only 5 % of the total relative frequency. Moreover, they show a statistically significant signature in the associated wind fields over Europe. An increased frequency of Central European storm clusters is detected under enhanced GHG conditions, associated with an enhancement of the pressure gradient over Central Europe. Consequently, more intense wind events over Central Europe are expected. The presented algorithm will be highly valuable for the analysis of huge amounts of data, as required e.g. for multi-model ensemble analysis, particularly because of the enormous data reduction.
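The pipeline described above (PCA for a ~98 % data reduction, then k-means on the leading PCs) can be sketched as follows. The synthetic "weather pattern" matrix, its dimensions and the cluster count are placeholders, not the NCEP/ECHAM4 fields or the study's 55 clusters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for gridded surface data: 600 "3-day periods" x 500 grid points,
# generated around three synthetic circulation regimes.
centres = 5.0 * rng.normal(size=(3, 500))
X = np.vstack([c + rng.normal(size=(200, 500)) for c in centres])

# --- PCA via SVD: keep the first six PCs, mirroring the study's setup ---
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = Xc @ Vt[:6].T          # scores: 600 x 6, i.e. ~99 % fewer columns

# --- plain k-means (Lloyd's algorithm) on the PC scores ---
def kmeans(data, k, n_iter=50, seed=0):
    r = np.random.default_rng(seed)
    centroids = data[r.choice(len(data), k, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(((data[:, None] - centroids) ** 2).sum(-1), axis=1)
        new = []
        for j in range(k):
            pts = data[labels == j]
            # keep the old centroid if a cluster happens to empty out
            new.append(pts.mean(axis=0) if len(pts) else centroids[j])
        centroids = np.array(new)
    return labels

labels = kmeans(pcs, k=3)
```

Clustering in the reduced PC space rather than on the full grid is what makes the method cheap enough for the multi-model ensemble analyses mentioned at the end of the abstract.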

Relevance:

100.00%

Publisher:

Abstract:

During the last decades, several windstorm series hit Europe, leading to large aggregated losses. Such storm series are examples of serial clustering of extreme cyclones, presenting a considerable risk for the insurance industry. Clustering of events and return periods of storm series for Germany are quantified based on potential losses using empirical models. Two reanalysis data sets and observations from German weather stations are considered for 30 winters. Histograms of events exceeding selected return levels (1-, 2- and 5-year) are derived. Return periods of historical storm series are estimated based on the Poisson and the negative binomial distributions. Over 4000 years of general circulation model (GCM) simulations forced with current climate conditions are analysed to provide a better assessment of historical return periods. Estimations differ between distributions, for example 40 to 65 years for the 1990 series. For such less frequent series, estimates obtained with the Poisson distribution clearly deviate from empirical data. The negative binomial distribution provides better estimates, even though a sensitivity to return level and data set is identified. The consideration of GCM data permits a strong reduction of uncertainties. The present results support the importance of explicitly considering the clustering of losses for an adequate risk assessment in economic applications.
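The contrast between the Poisson and negative binomial estimates can be illustrated with method-of-moments fits to a toy storm-count series. The counts, threshold and resulting numbers below are invented for the sketch, not the study's German loss data:

```python
import math

# Hypothetical storm counts per winter (overdispersed toy data):
# variance > mean is the statistical signature of serial clustering.
counts = [0, 0, 1, 0, 3, 0, 0, 2, 0, 5, 0, 1, 0, 0, 4, 0, 0, 1, 0, 3]

n = len(counts)
mean = sum(counts) / n
var = sum((c - mean) ** 2 for c in counts) / (n - 1)

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def nbinom_pmf(k, r, p):
    # negative binomial with real-valued r, via the gamma function
    return math.gamma(k + r) / (math.gamma(r) * math.factorial(k)) * p ** r * (1 - p) ** k

# Method-of-moments fits
lam = mean
r_hat = mean ** 2 / (var - mean)     # only defined when var > mean (clustering)
p_hat = r_hat / (r_hat + mean)

def return_period(threshold, pmf):
    """Average number of winters between seasons with >= threshold storms."""
    return 1.0 / (1.0 - sum(pmf(k) for k in range(threshold)))

rp_poisson = return_period(4, lambda k: poisson_pmf(k, lam))
rp_nbinom = return_period(4, lambda k: nbinom_pmf(k, r_hat, p_hat))
# The Poisson fit makes clustered, storm-rich winters look far rarer than the
# negative binomial fit does - the deviation the study reports for rare series.
```

Because the negative binomial has a free dispersion parameter, it puts more probability mass on storm-rich seasons, which is exactly why it estimates rare series better than the equidispersed Poisson.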

Relevance:

100.00%

Publisher:

Abstract:

Spatially dense observations of gust speeds are necessary for various applications, but their availability is limited in space and time. This work presents an approach to help to overcome this problem. The main objective is the generation of synthetic wind gust velocities. With this aim, theoretical wind and gust distributions are estimated from 10 yr of hourly observations collected at 123 synoptic weather stations provided by the German Weather Service. As pre-processing, an exposure correction is applied on measurements of the mean wind velocity to reduce the influence of local urban and topographic effects. The wind gust model is built as a transfer function between distribution parameters of wind and gust velocities. The aim of this procedure is to estimate the parameters of gusts at stations where only wind speed data is available. These parameters can be used to generate synthetic gusts, which can improve the accuracy of return periods at test sites with a lack of observations. The second objective is to determine return periods much longer than the nominal length of the original time series by considering extreme value statistics. Estimates for both local maximum return periods and average return periods for single historical events are provided. The comparison of maximum and average return periods shows that even storms with short average return periods may lead to local wind gusts with return periods of several decades. Despite uncertainties caused by the short length of the observational records, the method leads to consistent results, enabling a wide range of possible applications.
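Estimating return periods far beyond the record length typically relies on an extreme value fit. Below is a minimal sketch using a Gumbel distribution fitted by the method of moments; the annual gust maxima are invented, and the study's actual statistical model may differ:

```python
import math

# Hypothetical annual maximum gust speeds (m/s) at one station
# (toy data, not the DWD station records used in the study).
annual_max = [24.1, 27.3, 22.8, 31.0, 25.6, 29.4, 23.7, 26.2, 33.5, 28.1]

n = len(annual_max)
mean = sum(annual_max) / n
std = math.sqrt(sum((x - mean) ** 2 for x in annual_max) / (n - 1))

# Gumbel fit by the method of moments:
beta = std * math.sqrt(6) / math.pi      # scale parameter
mu = mean - 0.5772156649 * beta          # location (Euler-Mascheroni constant)

def return_level(T):
    """Gust speed exceeded on average once every T years under the Gumbel fit."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

v50 = return_level(50.0)   # a 50-year gust, extrapolated from a 10-year record
```

The fitted distribution is what allows extrapolation to return periods "much longer than the nominal length of the original time series", at the price of the uncertainty the abstract notes for short records.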

Relevance:

100.00%

Publisher:

Abstract:

The iso-score curves graph (iSCG) and the mathematical relationships between Scoring Parameters (SPs) and Forecasting Parameters (FPs) can be used in the Economic Scoring Formulas (ESFs) applied in tendering to distribute the score among bidders in the economic part of a proposal. Each contracting authority must set an ESF when publishing the tender specifications, and the strategy of each bidder will differ depending on the ESF selected and on the weight of the economic score in the overall proposal scoring. The various mathematical relationships and density distributions that describe the main SPs and FPs, together with the representation of tendering data by means of iSCGs, enable the generation of two new types of graphs that can be very useful for bidders who want to be more competitive: the scoring and position probability graphs.
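As an illustration of how an ESF distributes the economic score among bidders, here is one common linear formula (a generic textbook example, not necessarily one of the formulas analysed in the paper):

```python
def linear_esf(bids, max_score=100.0):
    """One common linear ESF: the lowest bid gets max_score and every other
    bidder is scored in proportion to lowest/bid. (Illustrative only; real
    tender specifications define many different ESF variants.)"""
    lowest = min(bids)
    return [max_score * lowest / b for b in bids]

# Three hypothetical bids (in the same currency units):
scores = linear_esf([90.0, 100.0, 120.0])
```

Under this formula the 90-unit bid earns the full 100 points, and a bid one third higher than the lowest loses a quarter of the score, which is the kind of trade-off a bidder's strategy must anticipate.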

Relevance:

100.00%

Publisher:

Abstract:

Aerosol physical and chemical properties were measured at a forest site in central Amazonia (Cuieiras reservation, 2.61° S, 60.21° W) during the dry season of 2004 (Aug-Oct). Aerosol light scattering and absorption, mass concentration, elemental composition and size distributions were measured at three tower levels (Ground: 2 m; Canopy: 28 m; and Top: 40 m). For the first time, simultaneous eddy covariance fluxes of fine-mode particles and volatile organic compounds (VOC) were measured above the Amazonian forest canopy. Aerosol fluxes were measured by eddy covariance using a Condensation Particle Counter (CPC) and a sonic anemometer. VOC fluxes were measured by disjunct eddy covariance using a Proton Transfer Reaction Mass Spectrometer (PTR-MS). At nighttime, a strong vertical gradient of phosphorus and potassium in the aerosol coarse mode was observed, with higher concentrations at Ground level. This suggests a source of primary biogenic particles below the canopy. Equivalent black carbon measurements indicate the presence of light-absorbing aerosols of biogenic origin. Aerosol number size distributions typically consisted of superimposed Aitken (76 nm) and accumulation (144 nm) modes, without clear events of new particle formation. Isoprene and monoterpene fluxes reached 7.4 and 0.82 mg m⁻² s⁻¹, respectively, around noon. An average fine particle flux of 0.05 ± 0.10 × 10⁶ m⁻² s⁻¹ was calculated, denoting an equilibrium between emission and deposition fluxes of fine-mode particles during daytime. No significant correlations were found between VOC and fine-mode aerosol concentrations or fluxes. (C) 2009 Elsevier Ltd. All rights reserved.
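The eddy-covariance flux mentioned above is, at its core, the covariance of the fluctuations of vertical wind w and a scalar concentration c over an averaging interval. A minimal sketch on synthetic 10 Hz data (all numbers invented; real processing adds rotation, detrending and other corrections):

```python
import random

random.seed(1)

# Synthetic 10-minute, 10 Hz series standing in for sonic-anemometer w (m/s)
# and CPC particle concentration c (cm^-3), with a prescribed correlation.
n = 6000
w, c = [], []
for _ in range(n):
    g = random.gauss(0.0, 1.0)
    w.append(0.3 * g)                                   # vertical wind fluctuations
    c.append(5000.0 + 200.0 * g + random.gauss(0, 50))  # concentration, correlated with w

wbar = sum(w) / n
cbar = sum(c) / n
# Eddy-covariance flux: mean of the product of the fluctuations w' and c'.
# A positive value means net upward transport (emission), negative means deposition.
flux = sum((wi - wbar) * (ci - cbar) for wi, ci in zip(w, c)) / n
```

The near-zero average fine-particle flux reported in the abstract corresponds to this covariance summing to roughly zero over the day, with upward and downward transport balancing.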

Relevance:

100.00%

Publisher:

Abstract:

We present the multiplicity and pseudorapidity distributions of photons produced in Au + Au and Cu + Cu collisions at √(s_NN) = 62.4 and 200 GeV. The photons are measured in the region −3.7 < η < −2.3 using the Photon Multiplicity Detector in the STAR experiment at RHIC. The number of photons produced per average number of participating nucleon pairs increases with the beam energy and is independent of the collision centrality. For collisions with similar average numbers of participating nucleons, the photon multiplicities are observed to be similar for Au + Au and Cu + Cu collisions at a given beam energy. The ratios of the number of charged particles to photons in the measured pseudorapidity range are found to be 1.4 ± 0.1 and 1.2 ± 0.1 for √(s_NN) = 62.4 and 200 GeV, respectively. The energy dependence of this ratio could reflect varying contributions from baryons to charged particles, while mesons are the dominant contributors to photon production in the given kinematic region. The photon pseudorapidity distributions normalized by the average number of participating nucleon pairs, when plotted as a function of η − y_beam, are found to follow a longitudinal scaling independent of centrality and colliding ion species at both beam energies. (C) 2009 Elsevier B.V. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, a simple relation between the Leimkuhler curve and the mean residual life is established. The result is illustrated with several models commonly used in informetrics, such as exponential, Pareto and lognormal. Finally, relationships with some other reliability concepts are also presented. (C) 2010 Elsevier Ltd. All rights reserved.
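The empirical Leimkuhler curve referred to above is straightforward to compute: it gives the share of total output produced by the most productive fraction of sources. A small sketch on invented informetric data (the function and data set are illustrative, not the paper's models):

```python
def leimkuhler(productivities, x):
    """Empirical Leimkuhler curve: share of total output produced by the
    most productive fraction x of sources (0 < x <= 1)."""
    ordered = sorted(productivities, reverse=True)
    k = max(1, round(x * len(ordered)))
    return sum(ordered[:k]) / sum(ordered)

# e.g. papers per author in a toy informetric data set (total output = 50)
prods = [1, 1, 1, 1, 2, 2, 3, 5, 8, 26]
top20 = leimkuhler(prods, 0.20)   # share produced by the top 20 % of authors
```

Here the two most productive authors account for 68 % of all papers, the kind of concentration that Leimkuhler curves (and, per the paper, the mean residual life of the underlying distribution) describe.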

Relevance:

100.00%

Publisher:

Abstract:

We apply the concept of exchangeable random variables to the case of non-additive probability distributions exhibiting uncertainty aversion, and in the class generated by a convex core (convex non-additive probabilities with a convex core). We are able to prove two versions of the law of large numbers (de Finetti's theorems). By making use of two definitions of independence, we prove two versions of the strong law of large numbers. It turns out that we cannot assure the convergence of the sample averages to a constant. We then model the case where there is a "true" probability distribution behind the successive realizations of the uncertain random variable. In this case convergence occurs. This result is important because it renders true the intuition that it is possible to "learn" the "true" additive distribution behind an uncertain event if one repeatedly observes it (a sufficiently large number of times). We also provide a conjecture regarding the "learning" (or updating) process above, and prove a partial result for the case of the Dempster-Shafer updating rule and binomial trials.
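The "learning" intuition above has a familiar classical counterpart: when a true additive distribution governs the realizations, sample averages of repeated binomial trials converge to it. A minimal simulation of that classical case only (the paper's non-additive machinery and Dempster-Shafer updating are not reproduced here):

```python
import random

random.seed(42)
p_true = 0.3   # the "true" additive distribution behind the uncertain event

def sample_average(n):
    """Fraction of successes in n Bernoulli(p_true) trials."""
    return sum(random.random() < p_true for _ in range(n)) / n

# Averages over ever more trials approach p_true (classical strong LLN);
# the paper's result is that this convergence survives when a true additive
# distribution lies behind an otherwise non-additive, uncertainty-averse model.
estimates = [sample_average(n) for n in (10, 100, 10_000)]
```

With 10,000 trials the estimate sits within a few hundredths of the true value 0.3, which is the sense in which repeated observation "learns" the underlying distribution.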

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

Purpose - This paper proposes an interpolating approach of the element-free Galerkin method (EFGM) coupled with a modified truncation scheme for solving Poisson's boundary value problems in domains involving material non-homogeneities. The suitability and efficiency of the proposed implementation are evaluated for a given set of test cases of electrostatic fields in domains involving different material interfaces.

Design/methodology/approach - The authors combined an interpolating approximation with a modified domain truncation scheme, which avoids the additional techniques for enforcing Dirichlet boundary conditions and for dealing with material interfaces usually employed in meshfree formulations.

Findings - The local electric potential and field distributions were correctly described, as were global quantities such as the total power and resistance. Since the treatment of material interfaces becomes practically the same for both the finite element method (FEM) and the proposed EFGM, FEM-oriented programs can easily be extended to provide EFGM approximations.

Research limitations/implications - The robustness of the proposed formulation became evident from the error analyses of the local and global variables, including the case of high material discontinuity.

Practical implications - The proposed approach has shown itself to be as robust as linear FEM. It thus becomes an attractive alternative, also because it avoids the additional techniques for dealing with boundary/interface conditions commonly employed in meshfree formulations.

Originality/value - This paper reintroduces domain truncation in the EFGM context, but by using a set of interpolating shape functions the authors avoid the use of Lagrange multipliers, even in the presence of high material discontinuity.
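The Kronecker-delta (interpolating) property that lets Dirichlet values be imposed directly, without Lagrange multipliers, can be illustrated with a much simpler meshfree interpolant than the paper's EFGM shape functions, e.g. Shepard's inverse-distance weighting (illustrative only; node positions and potentials are invented):

```python
def shepard(x, nodes, values, power=2):
    """Inverse-distance (Shepard) interpolant: a simple meshfree approximation
    that passes exactly through the nodal values, so Dirichlet boundary values
    can be imposed directly on boundary nodes - no Lagrange multipliers needed.
    (Illustrative only; the paper uses interpolating EFGM shape functions.)"""
    weights = []
    for k, xi in enumerate(nodes):
        d = abs(x - xi)
        if d == 0.0:
            return values[k]            # Kronecker-delta property at the nodes
        weights.append(1.0 / d ** power)
    s = sum(weights)
    return sum(w * v for w, v in zip(weights, values)) / s

nodes = [0.0, 0.5, 1.0]                 # 1-D node set, boundary nodes at 0 and 1
potential = [0.0, 0.2, 1.0]             # nodal electric potential values
u_boundary = shepard(1.0, nodes, potential)   # reproduces the Dirichlet value exactly
u_mid = shepard(0.25, nodes, potential)       # blended value between nodes
```

Non-interpolating meshfree approximations (such as classical moving least squares) do not reproduce nodal values exactly, which is precisely why they need the extra boundary-enforcement techniques the abstract says are avoided here.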