191 results for Extreme environments


Relevance: 20.00%

Abstract:

Deposit modelling based on archived borehole logs supplemented by a small number of dedicated boreholes is used to reconstruct the main boundary surfaces and the thickness of the main sediment units within the succession of Holocene alluvial deposits underlying the floodplain in the Barking Reach of the Lower Thames Valley. The basis of the modelling exercise is discussed and the models are used to assess the significance of floodplain relief in determining patterns of sedimentation. This evidence is combined with the results of biostratigraphical and geochronological investigations to reconstruct the environmental conditions associated with each successive stage of floodplain aggradation. The two main factors affecting the history and spatial pattern of Holocene sedimentation are shown to be the regional behaviour of relative sea level and the pattern of relief on the surface of the sub-alluvial, Late Devensian Shepperton Gravel. As is generally the case in the Lower Thames Valley, three main stratigraphic units are recognised: the Lower Alluvium, a peat bed broadly equivalent to the Tilbury III peat of Devoy (1979), and an Upper Alluvium. There is no evidence to suggest that the floodplain was substantially re-shaped by erosion during the Holocene. Instead, the relief inherited from the Shepperton Gravel surface was gradually buried either by the accumulation of peat or by deposition of fine-grained sediment from suspension in standing or slow-moving water. The palaeoenvironmental record from Barking confirms important details of the Holocene record observed elsewhere in the Lower Thames Valley, including the presence of Taxus in the valley-floor fen carr woodland between about 5000 and 4000 cal BP, and the subsequent growth of Ulmus on the peat surface.

Relevance: 20.00%

Abstract:

This work presents a method of information fusion involving data captured by both a standard charge-coupled device (CCD) camera and a time-of-flight (ToF) camera to be used in the detection of the proximity between a manipulator robot and a human. Both cameras are assumed to be located above the work area of an industrial robot. The fusion of colour images and time-of-flight information makes it possible to know the 3D localization of objects with respect to a world coordinate system, while also retaining their colour information. Considering that ToF information given by the range camera contains inaccuracies, including distance error, border error, and pixel saturation, corrections to the ToF information are proposed and developed to improve the results. The proposed fusion method uses the calibration parameters of both cameras to reproject 3D ToF points, expressed in a common coordinate system for both cameras and a robot arm, onto 2D colour images. In addition, using the 3D information, motion detection in an industrial robot environment is achieved, and the fusion of information is applied to the foreground objects previously detected. This combination of information results in a matrix that links colour and 3D information, giving the possibility of characterising an object by its colour in addition to its 3D localisation. Further development of these methods will make it possible to identify objects and their position in the real world and to use this information to prevent possible collisions between the robot and such objects.
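
As a minimal sketch of the reprojection step, assuming a standard pinhole camera model, the following Python fragment projects ToF points (already expressed in the world frame) into the colour image. The function name, the calibration values and the identity pose are illustrative assumptions, not values from the paper.

    import numpy as np

    def reproject_tof_to_rgb(points_3d, R, t, K):
        """Project Nx3 ToF points (world frame, metres) into the colour
        image using the RGB camera's extrinsics (R, t) and intrinsics K."""
        cam = points_3d @ R.T + t                  # world -> camera frame
        uv = cam[:, :2] / cam[:, 2:3]              # perspective division
        # Apply focal lengths and principal point from K.
        return uv * np.array([K[0, 0], K[1, 1]]) + np.array([K[0, 2], K[1, 2]])

    # Assumed 640x480 calibration and identity pose, for illustration only.
    K = np.array([[525.0, 0.0, 320.0],
                  [0.0, 525.0, 240.0],
                  [0.0, 0.0, 1.0]])
    R, t = np.eye(3), np.zeros(3)
    pts = np.array([[0.1, 0.2, 1.5]])              # one point 1.5 m away
    print(reproject_tof_to_rgb(pts, R, t, K))      # pixel coordinates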

Relevance: 20.00%

Abstract:

The ability of the HiGEM climate model to represent high-impact, regional precipitation events is investigated in two ways. The first focusses on a case study of extreme regional accumulation of precipitation during the passage of a summer extra-tropical cyclone across southern England on 20 July 2007 that resulted in a national flooding emergency. The climate model is compared with a global Numerical Weather Prediction (NWP) model and higher-resolution, nested limited-area models. While the climate model does not simulate the timing and location of the cyclone and associated precipitation as accurately as the NWP simulations, the total accumulated precipitation in all models is similar to the rain gauge estimate across England and Wales. The regional accumulation over the event is insensitive to horizontal resolution for grid spacings ranging from 90 km to 4 km. Second, the free-running climate model is shown to reproduce the statistical distribution of daily precipitation accumulations observed in the England-Wales precipitation record. The model distribution diverges increasingly from the record for longer accumulation periods, with a consistent under-representation of more intense multi-day accumulations. This may indicate a lack of low-frequency variability associated with weather regime persistence. Despite this, the overall seasonal and annual precipitation totals from the model are still comparable to those from ERA-Interim.
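
The distribution comparison for longer accumulation periods can be sketched briefly in Python; the gamma-distributed series below are synthetic stand-ins for the England-Wales record and the model output, not the actual data.

    import numpy as np

    def nday_accumulations(daily, n):
        """Rolling n-day totals from a daily precipitation series."""
        return np.convolve(daily, np.ones(n), mode="valid")

    rng = np.random.default_rng(0)
    obs = rng.gamma(shape=0.8, scale=3.0, size=360 * 30)    # stand-in record
    model = rng.gamma(shape=0.8, scale=2.8, size=360 * 30)  # stand-in model

    for n in (1, 5, 10):
        q_obs = np.quantile(nday_accumulations(obs, n), 0.99)
        q_mod = np.quantile(nday_accumulations(model, n), 0.99)
        print(f"{n}-day 99th percentile: obs={q_obs:.1f} mm, model={q_mod:.1f} mm")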

Relevance: 20.00%

Abstract:

Extreme temperature during reproductive development affects rice (Oryza sativa L.) yield and seed quality. A controlled-environment reciprocal-transfer experiment was designed where plants from two japonica cultivars were grown at 28/24 °C and moved to 18/14 °C and vice versa, or from 28/24 to 38/34 °C and vice versa, for 7-d periods to determine the respective temporal pattern of sensitivity of spikelet fertility, yield, and seed viability to each temperature extreme. Spikelet fertility and seed yield per panicle were severely reduced by extreme temperature in the 14-d period prior to anthesis; both cultivars were affected at 38/34 °C, while only cv. Gleva was affected at 18/14 °C. The damage was greater the earlier the panicles were stressed within this period. Later-exserted panicles compensated only partly for yield loss. Seed viability was significantly reduced by 7-d exposure to 38/34 °C or 18/14 °C at 1 to 7 and 1 to 14 d after anthesis, respectively, in cv. Gleva. Cultivar Taipei 309 was not affected by 7-d exposure at 18/14 °C, and no consistent temporal pattern of sensitivity was evident at 38/34 °C. Hence, brief exposure to low or high temperature was most damaging to spikelet fertility and yield 14 to 7 d before anthesis, coinciding with microsporogenesis; and it was almost as damaging around anthesis. Seed viability was most vulnerable to low or high temperature in the 7 or 14 d after anthesis, when histodifferentiation occurs.

Relevance: 20.00%

Abstract:

A new generation of reanalysis products is currently being produced that provides global gridded atmospheric data spanning more than a century. Such data may be useful for characterising the observed long-term variability of extreme precipitation events, particularly in regions where spatial coverage of surface observations is limited, and in the pre-satellite era. An analysis of extreme precipitation events is performed over England and Wales, investigating the ability of the Twentieth Century Reanalysis and ERA-Interim to represent extreme precipitation accumulations as recorded in the England and Wales Precipitation dataset on accumulation time-scales from 1 to 7 days. Significant correlations are found between daily precipitation accumulation observations and both reanalysis products. A hit-rate analysis, in which an event is defined as an accumulation above the 98th percentile, indicates that the reanalyses achieve hit rates of approximately 40–65% for extreme events in both summer (JJA) and winter (DJF). This suggests that both ERA-Interim and the Twentieth Century Reanalysis are of limited use for representing individual extreme precipitation events over England and Wales.
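
One common hit-rate convention (a hit is a coincident exceedance of each series' own 98th percentile; the paper's exact event matching may differ) can be sketched in Python on synthetic data:

    import numpy as np

    def hit_rate(obs, rea, q=0.98):
        """Fraction of observed extreme days on which the reanalysis
        also exceeds its own q-quantile."""
        obs_events = obs > np.quantile(obs, q)
        rea_events = rea > np.quantile(rea, q)
        return (obs_events & rea_events).sum() / obs_events.sum()

    rng = np.random.default_rng(1)
    obs = rng.gamma(0.8, 3.0, size=10_000)
    rea = 0.7 * obs + rng.gamma(0.8, 1.0, size=10_000)  # imperfect reanalysis
    print(f"hit rate: {hit_rate(obs, rea):.2f}")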

Relevance: 20.00%

Abstract:

Recent research into flood modelling has primarily concentrated on the simulation of inundation flow without considering the influences of channel morphology. River channels are often represented by a simplified geometry that is implicitly assumed to remain unchanged during flood simulations. However, field evidence demonstrates that floods can cause significant morphological change by mobilising the boundary sediments. Despite this, the effect of channel morphology on model results has been largely unexplored. To address this issue, the impact of channel cross-section geometry and channel long-profile variability on flood dynamics is examined using an ensemble of 1D-2D hydraulic model (LISFLOOD-FP) simulations of the 1:2102 year recurrence interval floods in Cockermouth, UK, within an uncertainty framework. A series of hypothetical scenarios of channel morphology were constructed based on a simple velocity-based model of critical entrainment. A Monte-Carlo simulation framework was used to quantify the effects of channel morphology together with variations in the channel and floodplain roughness coefficients, grain size characteristics, and critical shear stress on measures of flood inundation. The results showed that the bed elevation modifications generated by the simplistic equations provided a good approximation of the observed spatial patterns of erosion, despite overestimating erosion depths. Uncertainty in channel long-profile variability affected only the local flood dynamics and did not significantly affect the friction sensitivity or flood inundation mapping. The results imply that hydraulic models generally do not need to account for within-event morphodynamic changes of the type and magnitude modelled, as these have a negligible impact that is smaller than other uncertainties, e.g. boundary conditions. Instead, morphodynamic change needs to accumulate over a series of events to become large enough to alter the hydrodynamics of floods in supply-limited gravel-bed rivers like the one used in this research.
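
The paper's velocity-based entrainment model is not reproduced in the abstract; as an illustration of the kind of threshold check involved, the sketch below uses a Shields-type critical shear stress criterion, one standard alternative. All parameter values are assumptions.

    # Illustrative Shields-type entrainment check (not the paper's exact
    # velocity-based formulation; parameter values are assumptions).
    RHO_W, RHO_S, G = 1000.0, 2650.0, 9.81   # kg/m3, kg/m3, m/s2
    THETA_C = 0.047                          # assumed critical Shields number

    def bed_mobilised(depth_m, slope, d50_m):
        """True where bed shear stress exceeds the critical value for d50."""
        tau = RHO_W * G * depth_m * slope              # depth-slope product, Pa
        tau_c = THETA_C * (RHO_S - RHO_W) * G * d50_m  # critical stress, Pa
        return tau > tau_c

    print(bed_mobilised(depth_m=2.0, slope=0.003, d50_m=0.05))  # 50 mm gravel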

Relevance: 20.00%

Abstract:

As the fidelity of virtual environments (VE) continues to increase, the possibility of using them as training platforms is becoming increasingly realistic for a variety of application domains, including military and emergency personnel training. In the past, there was much debate on whether the acquisition and subsequent transfer of spatial knowledge from VEs to the real world is possible, or whether the differences in medium during training would essentially be an obstacle to truly learning geometric space. In this paper, the authors present various cognitive and environmental factors that not only contribute to this process, but also interact with each other to a certain degree, leading to a variable exposure time requirement in order for the process of spatial knowledge acquisition (SKA) to occur. The cognitive factors discussed include a variety of individual user differences such as knowledge and experience; cognitive gender differences; aptitude and spatial orientation skill; and cognitive styles. The environmental factors discussed include size, spatial layout complexity and landmark distribution. Since every individual's brain is unique (not only through experience, but also through genetic predisposition), a one-size-fits-all approach to training would be illogical. Furthermore, considering that various cognitive differences may emerge only when a certain stimulus is present (e.g. a complex environmental space), it makes even more sense to understand how these factors can impact spatial memory, and to adapt the training session by providing visual/auditory cues as well as by changing the exposure time requirements for each individual. The impact of this research domain is important to VE training in general; however, within service and military domains, guaranteeing appropriate spatial training is critical in order to ensure that disorientation does not occur in a life-or-death scenario.

Relevance: 20.00%

Abstract:

This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. First, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem of spectra from six different powder samples which, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required within a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity for the establishment of complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large datasets.
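
For orientation, the core of a basic real-valued ELM is small enough to sketch in Python. This simplification treats each spectrum as a plain real feature vector rather than using the paper's complex-valued amplitude/phase formulation, and the data are synthetic.

    import numpy as np

    def elm_train(X, T, n_hidden=200, seed=0):
        """ELM: fixed random hidden layer; output weights solved in
        closed form with the Moore-Penrose pseudo-inverse."""
        rng = np.random.default_rng(seed)
        W = rng.normal(size=(X.shape[1], n_hidden))
        b = rng.normal(size=n_hidden)
        H = np.tanh(X @ W + b)          # random nonlinear projection
        return W, b, np.linalg.pinv(H) @ T

    def elm_predict(X, W, b, beta):
        return np.tanh(X @ W + b) @ beta

    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 40))              # 100 spectra, 40 features
    labels = rng.integers(0, 2, size=100)       # two classes
    T = np.eye(2)[labels]                       # one-hot targets
    W, b, beta = elm_train(X, T)
    pred = elm_predict(X, W, b, beta).argmax(axis=1)
    print("training accuracy:", (pred == labels).mean())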

Relevance: 20.00%

Abstract:

The XWS (eXtreme WindStorms) catalogue consists of storm tracks and model-generated maximum 3 s wind-gust footprints for 50 of the most extreme winter windstorms to hit Europe in the period 1979–2012. The catalogue is intended to be a valuable resource for both academia and industries such as (re)insurance, for example allowing users to characterise extreme European storms, and validate climate and catastrophe models. Several storm severity indices were investigated to find which could best represent a list of known high-loss (severe) storms. The best-performing index was Sft, which is a combination of storm area calculated from the storm footprint and maximum 925 hPa wind speed from the storm track. All the listed severe storms are included in the catalogue, and the remaining ones were selected using Sft. A comparison of the model footprint to station observations revealed that storms were generally well represented, although for some storms the highest gusts were underestimated. Possible reasons for this underestimation include the model failing to simulate strong enough pressure gradients and not representing convective gusts. A new recalibration method was developed to estimate the true distribution of gusts at each grid point and correct for this underestimation. The recalibration model allows for storm-to-storm variation which is essential given that different storms have different degrees of model bias. The catalogue is available at www.europeanwindstorms.org.
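
The abstract does not give the formula for Sft, but an index of this kind can be sketched as the footprint area above a gust threshold scaled by the maximum 925 hPa track wind; the threshold, grid cell area and data below are all assumptions.

    import numpy as np

    def severity_index(footprint, cell_area_km2, track_wind_925, gust_thresh=25.0):
        """Illustrative severity index: exceedance area of the 3 s gust
        footprint times the maximum 925 hPa wind along the track."""
        area = (footprint > gust_thresh).sum() * cell_area_km2   # km2
        return area * track_wind_925.max()

    rng = np.random.default_rng(2)
    footprint = rng.gamma(4.0, 5.0, size=(100, 100))   # synthetic gusts, m/s
    track = rng.uniform(20.0, 45.0, size=24)           # synthetic 925 hPa winds
    print(f"S = {severity_index(footprint, 625.0, track):.3g}")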

Relevance: 20.00%

Abstract:

With a rapidly increasing fraction of electricity generation being sourced from wind, extreme wind power generation events such as prolonged periods of low (or high) generation and ramps in generation are a growing concern for the efficient and secure operation of national power systems. As extreme events occur infrequently, long and reliable meteorological records are required to accurately estimate their characteristics. Recent publications have begun to investigate the use of global meteorological “reanalysis” data sets for power system applications, many of which focus on long-term average statistics such as monthly-mean generation. Here we demonstrate that reanalysis data can also be used to estimate the frequency of relatively short-lived extreme events (including ramping on sub-daily time scales). Verification against 328 surface observation stations across the United Kingdom suggests that near-surface wind variability over spatiotemporal scales greater than around 300 km and 6 h can be faithfully reproduced using reanalysis, with no need for costly dynamical downscaling. A case study is presented in which a state-of-the-art, 33-year reanalysis data set (MERRA, from NASA-GMAO) is used to construct an hourly time series of nationally-aggregated wind power generation in Great Britain (GB), assuming a fixed, modern distribution of wind farms. The resultant generation estimates are highly correlated with recorded data from National Grid in the recent period, both for instantaneous hourly values and for variability over time intervals greater than around 6 h. This 33-year time series is then used to quantify the frequency with which different extreme GB-wide wind power generation events occur, as well as their seasonal and inter-annual variability. Several novel insights into the nature of extreme wind power generation events are described, including (i) that the number of prolonged low or high generation events is well approximated by a Poisson-like random process, and (ii) that whilst in general there is large seasonal variability, the magnitude of the most extreme ramps is similar in both summer and winter. Up-to-date versions of the GB case study data and the underlying model are freely available for download from our website: http://www.met.reading.ac.uk/~energymet/data/Cannon2014/.
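
A minimal sketch of the wind-to-power chain and the counting of prolonged low-generation events is given below in Python; the simplified power curve, thresholds and synthetic winds stand in for MERRA data and the real GB wind farm distribution.

    import numpy as np

    def power_curve(wind_ms):
        """Simplified capacity factor curve: cut-in 3 m/s, rated 12 m/s,
        cut-out 25 m/s (illustrative values)."""
        cf = np.clip((wind_ms - 3.0) / (12.0 - 3.0), 0.0, 1.0) ** 3
        cf[wind_ms >= 25.0] = 0.0
        return cf

    def prolonged_low_events(cf, thresh=0.1, min_hours=12):
        """Count runs where capacity factor stays below thresh for at
        least min_hours consecutive hours."""
        below = np.concatenate(([0], (cf < thresh).astype(int), [0]))
        edges = np.diff(below)
        runs = np.flatnonzero(edges == -1) - np.flatnonzero(edges == 1)
        return (runs >= min_hours).sum()

    rng = np.random.default_rng(3)
    wind = rng.weibull(2.0, size=24 * 365) * 9.0   # synthetic hourly winds
    print("prolonged low-generation events:", prolonged_low_events(power_curve(wind)))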

Relevance: 20.00%

Abstract:

This study examines the atmospheric circulation patterns and surface features associated with the seven coldest winters in the UK since 1870, using the 20th Century Reanalysis. Six of these winters are outside the scope of previous reanalysis datasets; we examine them here for the first time. All seven winters show a marked lack of the climatological southwesterly flow over the UK, displaying easterly and northeasterly anomalies instead. Six of the seven winters (all except 1890) were associated with a negative phase of the North Atlantic Oscillation; 1890 was instead characterised by a blocking anticyclone over and to the northeast of the UK.

Relevance: 20.00%

Abstract:

At the most recent session of the Conference of the Parties (COP19) in Warsaw (November 2013), the Warsaw international mechanism for loss and damage associated with climate change impacts was established under the United Nations Framework Convention on Climate Change (UNFCCC). The mechanism aims at promoting the implementation of approaches to address loss and damage associated with the adverse effects of climate change. Specifically, it aims to enhance understanding of risk management approaches to address loss and damage. Understanding risks associated with impacts due to highly predictable (slow onset) events like sea-level rise is relatively straightforward, whereas assessing the effects of climate change on extreme weather events and their impacts is much more difficult. However, extreme weather events are a significant cause of loss of life and livelihoods, particularly in vulnerable countries and communities in Africa. The emerging science of probabilistic event attribution is relevant here, as it provides scientific evidence on the contribution of anthropogenic climate change to changes in the risk of extreme events. It thus provides the opportunity to explore scientifically backed assessments of the human influence on such events. However, different ways of framing attribution questions can lead to very different assessments of change in risk. Here we explain the methods and implications of different approaches to attributing extreme weather events, with a focus on Africa. Crucially, we demonstrate that defining the most appropriate attribution question to ask is not a science decision but needs to be made in dialogue with those stakeholders who will use the answers.
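
Probabilistic event attribution is commonly summarised by the risk ratio RR = p1/p0 and the fraction of attributable risk FAR = 1 - p0/p1, where p1 and p0 are the probabilities of exceeding an event threshold in factual and counterfactual (no anthropogenic forcing) model ensembles. A sketch on synthetic ensembles, with all numbers illustrative:

    import numpy as np

    rng = np.random.default_rng(4)
    factual = rng.normal(31.0, 2.0, size=5_000)         # world as it is
    counterfactual = rng.normal(30.0, 2.0, size=5_000)  # without human forcing

    threshold = 35.0                                    # event definition
    p1 = (factual > threshold).mean()
    p0 = (counterfactual > threshold).mean()

    print(f"risk ratio RR = {p1 / p0:.2f}")
    print(f"fraction of attributable risk FAR = {1 - p0 / p1:.2f}")

Changing the event definition (the threshold above) can change RR substantially, which is exactly the framing sensitivity the abstract highlights.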

Relevance: 20.00%

Abstract:

Buildings affect people in various ways. They can help us to work more effectively; they also present a wide range of stimuli for our senses to react to. Intelligent buildings are designed to be aesthetic in sensory terms: not just visually appealing, but buildings in which occupants experience delight, freshness, airiness, daylight, views out and social ambience. All these factors contribute to a general aesthetic which gives pleasure and affects one's mood. If there is to be a common vision, it is essential for architects, engineers and clients to work closely together throughout the planning, design, construction and operational stages which represent the conception, birth and life of the building. There has to be an understanding of how patterns of work are best suited to a particular building form served by appropriate environmental systems. A host of technologies are emerging that help these processes, but in the end it is how we think about achieving responsive buildings that matters. Intelligent buildings should cope with social and technological changes and also be adaptable to short-term and long-term human needs. We live through our senses. They rely on stimulation from the tasks we are focused on and the people around us, but also from the physical environment. We breathe air and its quality affects the olfactory system; temperature is felt by thermoreceptors in the skin; sound enters our ears; the visual scene is beheld by our eyes. All these stimuli are transmitted along the sensory nervous system to the brain for processing, from which physiological and psychological reactions and judgments are formed, depending on perception, expectancies and past experiences. It is clear that the environmental setting plays a role in this sensory process. This is the essence of sensory design. Space plays its part as well. The flow of communication is partly electronic but also largely by people meeting face to face. Our sense of space wants different things at different times: sometimes privacy, at other times the satisfaction of social needs, besides the organizational requirement for effective human communication throughout the building. In general, if the senses are satisfied, people feel better and work better.

Relevance: 20.00%

Abstract:

This paper presents a novel application of virtual environments to assist in encouraging behavior change in individuals who misuse drugs or alcohol. We describe the user-centered design of a series of scenes to engage users in the identification of triggers and to encourage discussions about relevant coping skills. Results from the initial testing of this application with six service users showed variation in user responses. Results also suggested that the system should encourage group discussion and that it was linked to a small improvement in users’ confidence in understanding and identifying triggers.

Relevance: 20.00%

Abstract:

We extend extreme learning machine (ELM) classifiers to complex Reproducing Kernel Hilbert Spaces (RKHS) where the input/output variables as well as the optimization variables are complex-valued. A new family of classifiers, called complex-valued ELM (CELM), suitable for complex-valued multiple-input–multiple-output processing, is introduced. In the proposed method, the associated Lagrangian is computed using induced RKHS kernels, adopting a Wirtinger calculus approach formulated as a constrained optimization problem, similarly to the conventional ELM classifier formulation. When training the CELM, the Karush–Kuhn–Tucker (KKT) theorem is used to solve the dual optimization problem, which consists of simultaneously satisfying the smallest-training-error and smallest-norm-of-output-weights criteria. The proposed formulation also addresses aspects of quaternary classification within a Clifford algebra context. For 2D complex-valued inputs, user-defined complex-coupled hyper-planes divide the classifier input space into four partitions. For 3D complex-valued inputs, the formulation generates three pairs of complex-coupled hyper-planes through orthogonal projections. The six hyper-planes then divide the 3D space into eight partitions. It is shown that the CELM problem formulation is equivalent to solving six real-valued ELM tasks, which are induced by projecting the chosen complex kernel across the different user-defined coordinate planes. A classification example of powdered samples on the basis of their terahertz spectral signatures is used to demonstrate the advantages of the CELM classifiers compared to their SVM counterparts. The proposed classifiers retain the advantages of their ELM counterparts, in that they can perform multiclass classification with lower computational complexity than SVM classifiers. Furthermore, because of their ability to perform classification tasks fast, the proposed formulations are of interest to real-time applications.
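
The complex-valued core of the method, solving for output weights with a pseudo-inverse that uses the conjugate transpose, can be sketched in Python as follows; the split tanh activation and the synthetic data are illustrative choices, not the paper's induced-RKHS/Wirtinger formulation.

    import numpy as np

    rng = np.random.default_rng(5)
    # Complex features: amplitude and phase encoded as complex numbers.
    X = rng.normal(size=(100, 20)) + 1j * rng.normal(size=(100, 20))
    labels = rng.integers(0, 2, size=100)
    T = np.eye(2)[labels].astype(complex)          # one-hot targets

    # Random complex hidden layer with a split activation (tanh applied
    # separately to the real and imaginary parts) - one simple choice.
    W = rng.normal(size=(20, 50)) + 1j * rng.normal(size=(20, 50))
    Z = X @ W
    H = np.tanh(Z.real) + 1j * np.tanh(Z.imag)

    # np.linalg.pinv uses the conjugate transpose for complex matrices.
    beta = np.linalg.pinv(H) @ T
    pred = np.real(H @ beta).argmax(axis=1)
    print("training accuracy:", (pred == labels).mean())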