971 results for Generalized extreme value distribution
Abstract:
With extreme variability of the Arctic polar vortex being a key link for stratosphere–troposphere influences, its evolution into the twenty-first century is important for projections of changing surface climate in response to greenhouse gases. Variability of the stratospheric vortex is examined using a state-of-the-art climate model and a suite of specifically developed vortex diagnostics. The model has a fully coupled ocean and a fully resolved stratosphere. Analysis of the standard stratospheric zonal mean wind diagnostic shows no significant increase over the twenty-first century in the number of major sudden stratospheric warmings (SSWs) from its historical value of 0.7 events per decade, although the monthly distribution of SSWs does vary, with events becoming more evenly dispersed throughout the winter. However, further analyses using geometric-based vortex diagnostics show that the vortex mean state becomes weaker, and the vortex centroid is climatologically more equatorward by up to 2.5°, especially during early winter. The results using these diagnostics not only characterize the vortex structure and evolution but also emphasize the need for vortex-centric diagnostics over zonally averaged measures. Finally, vortex variability is subdivided into wave-1 (displaced) and wave-2 (split) components, and the results imply that vortex displacement events increase in frequency under climate change, whereas splitting events show little change.
Abstract:
It is thought that speciation in phytophagous insects is often due to colonization of novel host plants, because radiations of plant and insect lineages are typically asynchronous. Recent phylogenetic comparisons have supported this model of diversification for both insect herbivores and specialized pollinators. An exceptional case where contemporaneous plant–insect diversification might be expected is the obligate mutualism between fig trees (Ficus species, Moraceae) and their pollinating wasps (Agaonidae, Hymenoptera). The ubiquity and ecological significance of this mutualism in tropical and subtropical ecosystems has long intrigued biologists, but the systematic challenge posed by >750 interacting species pairs has hindered progress toward understanding its evolutionary history. In particular, taxon sampling and analytical tools have been insufficient for large-scale co-phylogenetic analyses. Here, we sampled nearly 200 interacting pairs of fig and wasp species from across the globe. Two supermatrices were assembled: on average, wasps had sequences from 77% of six genes (5.6 kb), figs had sequences from 60% of five genes (5.5 kb), and overall 850 new DNA sequences were generated for this study. We also developed a new analytical tool, Jane 2, for event-based phylogenetic reconciliation analysis of very large data sets. Separate Bayesian phylogenetic analyses for figs and fig wasps under relaxed molecular clock assumptions indicate Cretaceous diversification of crown groups and contemporaneous divergence for nearly half of all fig and pollinator lineages. Event-based co-phylogenetic analyses further support the co-diversification hypothesis. Biogeographic analyses indicate that the present-day distribution of fig and pollinator lineages is consistent with a Eurasian origin and subsequent dispersal, rather than with Gondwanan vicariance.
Overall, our findings indicate that the fig-pollinator mutualism represents an extreme case among plant-insect interactions of coordinated dispersal and long-term co-diversification.
Abstract:
Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in parallel data mining algorithms and, in particular, in the k-means algorithm for cluster analysis. In the straightforward parallel formulation of the k-means algorithm, data and computation loads are uniformly distributed over the processing nodes. This approach has excellent load balancing characteristics that may suggest it could scale up to large and extreme-scale parallel computing systems. However, at each iteration step the algorithm requires a global reduction operation, which hinders the scalability of the approach. This work studies a different parallel formulation of the algorithm in which the requirement of global communication is removed, while maintaining the same deterministic nature of the centralised algorithm. The proposed approach exploits a non-uniform data distribution, which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error, which allows a further reduction of the communication costs. The effectiveness of the exact and approximate methods has been tested in a parallel computing system with 64 processors and in simulations with 1024 processing elements.
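The global reduction step that the abstract identifies as the scalability bottleneck can be illustrated with a minimal sketch. This is not the paper's proposed communication-free formulation, only the straightforward parallel k-means it improves upon: each simulated node computes partial centroid sums and counts over its local data, and a global reduction combines them. The function name and data layout are illustrative.

```python
import numpy as np

def parallel_kmeans_step(partitions, centroids):
    """One iteration of the straightforward parallel k-means:
    each node computes partial sums/counts over its local data,
    then a global reduction combines them into new centroids
    (the communication step the paper seeks to remove)."""
    k, d = centroids.shape
    partial = []
    for local_data in partitions:  # each element plays the role of one node
        sums = np.zeros((k, d))
        counts = np.zeros(k)
        # assign each local point to its nearest centroid
        labels = np.argmin(
            ((local_data[:, None, :] - centroids[None, :, :]) ** 2).sum(-1),
            axis=1)
        for j in range(k):
            mask = labels == j
            sums[j] = local_data[mask].sum(axis=0)
            counts[j] = mask.sum()
        partial.append((sums, counts))
    # global reduction: sum the partial results across all nodes
    total_sums = sum(s for s, _ in partial)
    total_counts = sum(c for _, c in partial)
    nonempty = total_counts > 0
    new_centroids = centroids.copy()
    new_centroids[nonempty] = (total_sums[nonempty]
                               / total_counts[nonempty, None])
    return new_centroids
```

Because the reduction must see every node's partial sums, its cost grows with system size regardless of how well the data loads are balanced, which is the scalability problem the abstract describes.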
Abstract:
This study aims to characterise the rainfall exceptionality and the meteorological context of the 20 February 2010 flash-floods in Madeira (Portugal). Daily and hourly precipitation records from the available rain-gauge station networks are evaluated in order to reconstitute the temporal evolution of the rainstorm, as well as its geographic incidence, contributing to the understanding of the flash-flood dynamics and the type and spatial distribution of the associated impacts. The exceptionality of the rainstorm is further confirmed by the return period associated with the daily precipitation registered at the two long-term record stations, with 146.9 mm observed in the city of Funchal and 333.8 mm on the mountain top, corresponding to estimated return periods of approximately 290 yr and 90 yr, respectively. Furthermore, the associated synoptic situation responsible for the flash-floods is analysed using different sources of information, e.g., weather charts, reanalysis data, Meteosat images and radiosounding data, with the focus on two main issues: (1) the dynamical conditions that promoted such anomalous humidity availability over the Madeira region on 20 February 2010 and (2) the uplift mechanism that induced deep convection activity.
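The abstract does not state how its return periods were estimated, but a minimal, commonly used alternative to a fitted extreme value distribution is the empirical (Weibull plotting position) estimate from a record of annual maxima. The sketch below is illustrative only; the function name and the clamping of off-record events are my assumptions.

```python
import numpy as np

def empirical_return_period(annual_maxima, event_value):
    """Empirical return period (years) of `event_value` from a
    record of annual maxima, using the Weibull plotting position
    T = (n + 1) / m, where m is the event's rank counted from the
    largest value down."""
    x = np.asarray(annual_maxima, dtype=float)
    n = x.size
    m = int((x >= event_value).sum())  # rank from the top
    m = max(m, 1)  # an event above the whole record gets rank 1
    return (n + 1) / m
```

For events far beyond the observed record (such as the 290 yr estimate above from roughly a century of data), the empirical estimate saturates at n + 1 years, which is why a fitted distribution such as the generalized extreme value distribution is normally used to extrapolate.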
Abstract:
The summer monsoon season is an important hydrometeorological feature of the Indian subcontinent and it has significant socioeconomic impacts. This study is aimed at understanding the processes associated with the occurrence of catastrophic flood events. The study has two novel features that add to the existing body of knowledge about the South Asian Monsoon: 1) it combines traditional hydrometeorological observations (rain gauge measurements) with unconventional data (media and state historical records of reported flooding) to produce a value-added, century-long time series of potential flood events, and 2) it identifies the larger regional synoptic conditions leading to days with flood potential in the time series. The promise of mining unconventional data to extend hydrometeorological records is demonstrated in this study. The synoptic evolution of flooding events on the western-central coast of India and in the densely populated Mumbai area is shown to correspond to active monsoon periods with embedded low-pressure centers and to have far upstream influence from the western edge of the Indian Ocean basin. The coastal processes along the Arabian Peninsula, where the currents interact with the continental shelf, are found to be key features of extremes during the South Asian Monsoon.
Abstract:
Dynamical downscaling is frequently used to investigate the dynamical variables of extra-tropical cyclones, for example, precipitation, using very high-resolution models nested within coarser resolution models to understand the processes that lead to intense precipitation. It is also used in climate change studies, using long time series to investigate trends in precipitation, or to look at the small-scale dynamical processes for specific case studies. This study investigates some of the problems associated with dynamical downscaling and looks at the optimum configuration to obtain the distribution and intensity of a precipitation field that match observations. This study uses the Met Office Unified Model run in limited area mode with grid spacings of 12, 4 and 1.5 km, driven by boundary conditions provided by the ECMWF Operational Analysis to produce high-resolution simulations for the summer 2007 UK flooding events. The numerical weather prediction model is initiated at varying times before the peak precipitation is observed to test the importance of the initialisation and boundary conditions, and how long the simulation can be run for. The results are compared to rain gauge data as verification and show that the model intensities are most similar to observations when the model is initialised 12 hours before the peak precipitation is observed. It was also shown that using non-gridded datasets makes verification more difficult, with the density of observations also affecting the intensities observed. It is concluded that the simulations are able to produce realistic precipitation intensities when driven by the coarser resolution data.
Abstract:
The ability of the HiGEM climate model to represent high-impact, regional, precipitation events is investigated in two ways. The first focusses on a case study of extreme regional accumulation of precipitation during the passage of a summer extra-tropical cyclone across southern England on 20 July 2007 that resulted in a national flooding emergency. The climate model is compared with a global Numerical Weather Prediction (NWP) model and higher resolution, nested limited area models. While the climate model does not simulate the timing and location of the cyclone and associated precipitation as accurately as the NWP simulations, the total accumulated precipitation in all models is similar to the rain gauge estimate across England and Wales. The regional accumulation over the event is insensitive to horizontal resolution for grid spacings ranging from 90km to 4km. Secondly, the free-running climate model reproduces the statistical distribution of daily precipitation accumulations observed in the England-Wales precipitation record. The model distribution diverges increasingly from the record for longer accumulation periods with a consistent under-representation of more intense multi-day accumulations. This may indicate a lack of low-frequency variability associated with weather regime persistence. Despite this, the overall seasonal and annual precipitation totals from the model are still comparable to those from ERA-Interim.
Abstract:
The MATLAB model is contained within the compressed folders (versions are available as .zip and .tgz). This model uses MERRA reanalysis data (>34 years available) to estimate the hourly aggregated wind power generation for a predefined (fixed) distribution of wind farms. A ready-made example is included for the wind farm distribution of Great Britain, April 2014 ("CF.dat"). This consists of an hourly time series of GB-total capacity factor spanning the period 1980-2013 inclusive. Given the global nature of reanalysis data, the model can be applied to any specified distribution of wind farms in any region of the world. Users are, however, strongly advised to bear in mind the limitations of reanalysis data when using this model/data. This is discussed in our paper: Cannon, Brayshaw, Methven, Coker, Lenaghan. "Using reanalysis data to quantify extreme wind power generation statistics: a 33 year case study in Great Britain". Submitted to Renewable Energy in March 2014. Additional information about the model is contained in the model code itself, in the accompanying ReadMe file, and on our website: http://www.met.reading.ac.uk/~energymet/data/Cannon2014/
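The core step in converting reanalysis winds into a capacity-factor time series of this kind is a turbine power curve. The sketch below is a generic Python analogue, not the archived MATLAB model: the piecewise curve shape and the cut-in, rated and cut-out speeds are typical illustrative turbine values, not parameters taken from the model.

```python
import numpy as np

def capacity_factor(wind_speed, cut_in=3.5, rated=13.0, cut_out=25.0):
    """Illustrative piecewise power curve: zero output below cut-in
    and above cut-out, a cubic ramp between cut-in and rated speed,
    and full output (CF = 1) between rated and cut-out. Thresholds
    are generic turbine values, not from the model archive."""
    v = np.asarray(wind_speed, dtype=float)
    cf = np.zeros_like(v)
    ramp = (v >= cut_in) & (v < rated)
    cf[ramp] = (v[ramp] ** 3 - cut_in ** 3) / (rated ** 3 - cut_in ** 3)
    cf[(v >= rated) & (v < cut_out)] = 1.0
    return cf
```

A nationally aggregated series like "CF.dat" would then be a capacity-weighted average of such per-site capacity factors over all wind farms in the fixed distribution, evaluated hour by hour.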
Abstract:
The influence of the size distribution of particles on the viscous properties of an electrorheological fluid has been investigated by the molecular dynamics simulation method. The shear stress of the fluid is found to decrease with increasing variance σ² of the Gaussian distribution of the particle size, and then reach a steady value when σ is larger than 0.5. This phenomenon is attributed to the influence of the particle size distribution on the dynamic structural evolution in the fluid as well as the strength of the different chain-like structures formed by the particles.
Abstract:
The XWS (eXtreme WindStorms) catalogue consists of storm tracks and model-generated maximum 3 s wind-gust footprints for 50 of the most extreme winter windstorms to hit Europe in the period 1979–2012. The catalogue is intended to be a valuable resource for both academia and industries such as (re)insurance, for example allowing users to characterise extreme European storms, and validate climate and catastrophe models. Several storm severity indices were investigated to find which could best represent a list of known high-loss (severe) storms. The best-performing index was Sft, which is a combination of storm area calculated from the storm footprint and maximum 925 hPa wind speed from the storm track. All the listed severe storms are included in the catalogue, and the remaining ones were selected using Sft. A comparison of the model footprint to station observations revealed that storms were generally well represented, although for some storms the highest gusts were underestimated. Possible reasons for this underestimation include the model failing to simulate strong enough pressure gradients and not representing convective gusts. A new recalibration method was developed to estimate the true distribution of gusts at each grid point and correct for this underestimation. The recalibration model allows for storm-to-storm variation which is essential given that different storms have different degrees of model bias. The catalogue is available at www.europeanwindstorms.org.
Abstract:
With a rapidly increasing fraction of electricity generation being sourced from wind, extreme wind power generation events such as prolonged periods of low (or high) generation and ramps in generation, are a growing concern for the efficient and secure operation of national power systems. As extreme events occur infrequently, long and reliable meteorological records are required to accurately estimate their characteristics. Recent publications have begun to investigate the use of global meteorological “reanalysis” data sets for power system applications, many of which focus on long-term average statistics such as monthly-mean generation. Here we demonstrate that reanalysis data can also be used to estimate the frequency of relatively short-lived extreme events (including ramping on sub-daily time scales). Verification against 328 surface observation stations across the United Kingdom suggests that near-surface wind variability over spatiotemporal scales greater than around 300 km and 6 h can be faithfully reproduced using reanalysis, with no need for costly dynamical downscaling. A case study is presented in which a state-of-the-art, 33 year reanalysis data set (MERRA, from NASA-GMAO), is used to construct an hourly time series of nationally-aggregated wind power generation in Great Britain (GB), assuming a fixed, modern distribution of wind farms. The resultant generation estimates are highly correlated with recorded data from National Grid in the recent period, both for instantaneous hourly values and for variability over time intervals greater than around 6 h. This 33 year time series is then used to quantify the frequency with which different extreme GB-wide wind power generation events occur, as well as their seasonal and inter-annual variability. 
Several novel insights into the nature of extreme wind power generation events are described, including (i) that the number of prolonged low or high generation events is well approximated by a Poisson-like random process, and (ii) that, whilst in general there is large seasonal variability, the magnitude of the most extreme ramps is similar in both summer and winter. An up-to-date version of the GB case study data as well as the underlying model are freely available for download from our website: http://www.met.reading.ac.uk/~energymet/data/Cannon2014/.
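Counting "prolonged low generation events" of the kind the abstract describes amounts to finding maximal runs of hours with capacity factor below a threshold, lasting at least some minimum duration. A minimal sketch, with illustrative threshold and duration values (the paper's specific definitions are not given here):

```python
import numpy as np

def count_low_events(cf_series, threshold=0.1, min_hours=24):
    """Count maximal runs of consecutive hours with capacity factor
    below `threshold` lasting at least `min_hours`. Event counts of
    this kind are what the study finds to be Poisson-like."""
    below = np.asarray(cf_series, dtype=float) < threshold
    # pad with False on both sides so run boundaries are transitions
    padded = np.concatenate(([False], below, [False]))
    starts = np.flatnonzero(~padded[:-1] & padded[1:])
    ends = np.flatnonzero(padded[:-1] & ~padded[1:])
    run_lengths = ends - starts
    return int((run_lengths >= min_hours).sum())
```

Applied to a multi-decade hourly series such as the 33 year MERRA-derived record, the per-season counts from a function like this are the statistics whose distribution can be compared against a Poisson model.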
Abstract:
The replacement of fat and sugar in cakes is a challenge as they have an important effect on the structural and sensory properties. Moreover, there is the possibility to incorporate additional value using novel replacers. In this work, inulin and oligofructose were used as fat and sugar replacers, respectively. Different combinations of replacement levels were investigated: fat replacement (0 and 50 %) and sugar replacement (0, 20, 30, 40 and 50 %). Simulated microbaking was carried out to study bubble size distribution during baking. Batter viscosity and weight loss during baking were also analysed. Cake characteristics were studied in terms of cell crumb structure, height, texture and sensory properties. Fat and sugar replacement produced batters with low apparent viscosity values. During heating, bubbles underwent a marked expansion in replaced cakes compared to the control cake. The low batter stability in fat-replaced samples increased bubble movement, producing cakes with bigger cells and less height than the control. Sugar-replaced samples had smaller and fewer cells and lower height than the control. Moreover, sugar replacement decreased hardness and cohesiveness and increased springiness, which could be related to a denser crumb and an easily crumbled product. Regarding the sensory analysis, a replacement of up to 50 % of fat and 30 % of sugar, separately and simultaneously, did not markedly change the overall acceptability of the cakes. However, the sponginess and the sweetness could be improved in all the replaced cakes, according to the Just About Right scales.
Abstract:
Wind generation's contribution to supporting peak electricity demand is one of the key questions in wind integration studies. Unlike conventional units, the available outputs of different wind farms cannot be approximated as being statistically independent, and hence near-zero wind output is possible across an entire power system. This paper reviews the risk model structures currently used to assess wind's capacity value, along with discussion of the resulting data requirements. A central theme is the benefit of performing statistical estimation of the joint distribution for demand and available wind capacity, focusing attention on uncertainties due to limited histories of wind and demand data; examination of Great Britain data from the last 25 years shows that the data requirements are greater than generally thought. A discussion is therefore presented of how analysis of the types of weather system which have historically driven extreme electricity demands can help to deliver robust insights into wind's contribution to supporting demand, even in the face of such data limitations. The role of the form of the probability distribution for available conventional capacity in driving wind capacity credit results is also discussed.
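The abstract's point about estimating the joint distribution of demand and available wind can be made concrete with a small sketch: using historically paired (demand, wind) samples preserves any dependence between cold, calm spells and high demand, whereas shuffling the wind series imposes an independence assumption. This is an illustrative construction, not the paper's risk model, and all names below are mine.

```python
import numpy as np

def net_demand_quantile(demand, wind, q=0.99,
                        assume_independent=False, seed=None):
    """Quantile of demand net of available wind capacity. With paired
    historical samples (default) the demand-wind dependence is kept;
    under the (usually unjustified) independence assumption the wind
    values are randomly shuffled first, breaking the pairing."""
    demand = np.asarray(demand, dtype=float)
    wind = np.asarray(wind, dtype=float)
    if assume_independent:
        rng = np.random.default_rng(seed)
        wind = rng.permutation(wind)
    return float(np.quantile(demand - wind, q))
```

If high demand tends to coincide with low wind, the paired estimate of an upper quantile of net demand exceeds the shuffled one, which is one way limited joint histories can lead independence-based methods to overstate wind's capacity value.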
Abstract:
The number of bidders, N, involved in a construction procurement auction is known to have an important effect on the value of the lowest bid and the mark-up applied by bidders. In practice, for example, it is important for a bidder to have a good estimate of N when bidding for a current contract. One approach, instigated by Friedman in 1956, is to make such an estimate by statistical analysis and modelling. Since then, however, finding a suitable model for N has been an enduring problem for researchers and, despite intensive research activity in the subsequent 30 years, little progress has been made, due principally to the absence of new ideas and perspectives. The debate is resumed by checking old assumptions, providing new evidence relating to concomitant variables and proposing a new model. In doing this and in order to ensure universality, a novel approach is developed and tested by using a unique set of 12 construction tender databases from four continents. This shows the new model provides a significant advancement on previous versions. Several new research questions are also posed and other approaches identified for future study.
Abstract:
Based on previous observational studies of cold extreme events over southern South America, some recent studies suggest a possible relationship between remotely triggered Rossby wave propagation and the occurrence of frost. Using the linear theory of Rossby wave propagation, this paper analyzes the propagation of such waves in two different basic states that correspond to austral winters with maximum and minimum frequency of occurrence of generalized frosts in the Wet Pampa (central-northwest Argentina). In order to determine the wave trajectories, the ray tracing technique is used in this study, and some theoretical discussion of this technique is also presented. The analysis of the basic state, from a theoretical point of view and based on the calculated ray tracings, corroborates that remotely excited Rossby waves are the mechanism that favors the maximum occurrence of generalized frosts. The basic state through which the waves propagate determines where they are excited. The Rossby waves are excited in particular regions of the atmosphere and propagate towards South America along the jet streams, which act as wave guides, favoring the generation of generalized frosts. In summary, this paper presents an overview of the ray tracing technique and how it can be used to investigate an important synoptic event, such as frost in a specific region, and its relationship with the propagation of large-scale planetary waves.
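A quantity central to barotropic Rossby ray tracing of the kind the abstract describes is the stationary wavenumber Ks = sqrt(beta_M / U), with beta_M = beta - d²U/dy²: rays refract toward latitudes of larger Ks, which is why jets, where the flow curvature enhances beta_M, can act as wave guides. The sketch below computes Ks on a latitude grid from a zonal-mean wind profile; it is a standard textbook diagnostic, not the paper's specific implementation, and the function name is mine.

```python
import numpy as np

A = 6.371e6       # Earth radius (m)
OMEGA = 7.292e-5  # Earth rotation rate (s^-1)

def stationary_wavenumber(lat_deg, u):
    """Stationary barotropic Rossby wavenumber Ks = sqrt(beta_M / U),
    with beta_M = beta - d2U/dy2, on a latitude grid. Returns NaN
    where the flow is easterly or beta_M <= 0, since stationary
    Rossby waves cannot propagate there."""
    u = np.asarray(u, dtype=float)
    phi = np.deg2rad(np.asarray(lat_deg, dtype=float))
    y = A * phi                           # meridional distance
    beta = 2.0 * OMEGA * np.cos(phi) / A  # planetary vorticity gradient
    d2u = np.gradient(np.gradient(u, y), y)
    beta_m = beta - d2u
    ks = np.full_like(u, np.nan)
    ok = (u > 0) & (beta_m > 0)
    ks[ok] = np.sqrt(beta_m[ok] / u[ok])
    return ks
```

Along a ray, turning latitudes occur where the wave's total wavenumber reaches Ks, so maps of Ks computed from the two basic states (maximum- and minimum-frost winters) indicate where remotely excited waves can and cannot reach the Wet Pampa.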