23 results for 911
in CentAUR: Central Archive University of Reading - UK
Abstract:
A new 'storm-tracking approach' to analysing the prediction of storms by different forecast systems has recently been developed. This paper provides a brief illustration of the type of results that can be obtained using the approach. It also describes in detail how eScience methodologies have been used to help apply the storm-tracking approach to very large datasets.
Abstract:
A regional study of the prediction of extratropical cyclones by the European Centre for Medium-Range Weather Forecasts (ECMWF) Ensemble Prediction System (EPS) has been performed. An objective feature-tracking method has been used to identify and track the cyclones along the forecast trajectories. Forecast error statistics have then been produced for the position, intensity, and propagation speed of the storms. In previous work, data limitations meant it was only possible to present the diagnostics for the entire Northern Hemisphere (NH) or Southern Hemisphere. A larger data sample has allowed the diagnostics to be computed separately for smaller regions around the globe and has made it possible to explore the regional differences in the prediction of storms by the EPS. Results show that in the NH there is a larger ensemble mean error in the position of storms over the Atlantic Ocean. Further analysis revealed that this is mainly due to errors in the prediction of storm propagation speed rather than in direction. Forecast storms propagate too slowly in all regions, but the bias is about 2 times as large in the NH Atlantic region. The results show that storm intensity is generally overpredicted over the ocean and underpredicted over the land and that the absolute error in intensity is larger over the ocean than over the land. In the NH, large errors occur in the prediction of the intensity of storms that originate as tropical cyclones but then move into the extratropics. The ensemble is underdispersive for the intensity of cyclones (i.e., the spread is smaller than the mean error) in all regions. The spatial patterns of the ensemble mean error and ensemble spread are very different for the intensity of cyclones. Spatial distributions of the ensemble mean error suggest that large errors occur during the growth phase of storm development, but this is not indicated by the spatial distributions of the ensemble spread. In the NH there are further differences. 
First, the large errors in the prediction of the intensity of cyclones that originate in the tropics are not indicated by the spread. Second, the ensemble mean error is larger over the Pacific Ocean than over the Atlantic, whereas the opposite is true for the spread. The usefulness of a storm-tracking approach to both weather forecasters and developers of forecast systems is also discussed.
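The ensemble mean error and ensemble spread diagnostics for tracked storm positions can be sketched as follows. This is an illustrative computation for a single tracked storm, not the ECMWF tracking or verification code; the function name, the simple lat/lon averaging used for the ensemble mean position, and the great-circle distance in kilometres are all assumptions for the sketch.

```python
import numpy as np

def position_error_and_spread(forecast_tracks, analysis_track):
    """Ensemble mean position error and ensemble spread for one storm.

    forecast_tracks: array (n_members, n_times, 2) of (lat, lon) degrees
    analysis_track:  array (n_times, 2) of verifying (lat, lon) degrees
    Returns per-time-step distances in km.
    """
    def great_circle_km(p, q):
        # spherical law of cosines; clip guards against rounding error
        lat1, lon1 = np.radians(p[..., 0]), np.radians(p[..., 1])
        lat2, lon2 = np.radians(q[..., 0]), np.radians(q[..., 1])
        cos_d = (np.sin(lat1) * np.sin(lat2)
                 + np.cos(lat1) * np.cos(lat2) * np.cos(lon1 - lon2))
        return 6371.0 * np.arccos(np.clip(cos_d, -1.0, 1.0))

    # Averaging lat/lon directly is a simplification, adequate for small
    # spreads away from the poles and the dateline.
    ensemble_mean = forecast_tracks.mean(axis=0)            # (n_times, 2)
    mean_error = great_circle_km(ensemble_mean, analysis_track)
    spread = great_circle_km(forecast_tracks, ensemble_mean[None]).mean(axis=0)
    return mean_error, spread
```

Underdispersion of the kind reported in the abstract would show up here as `spread` being systematically smaller than `mean_error` along the forecast trajectory.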
Abstract:
Neem leaves, neem cake (a by-product left after the extraction of oil from neem seed) and a commercially refined product aza (azadirachtin) extracted from seed were evaluated. Aqueous extracts of crude neem formulations used as a seedling dip treatment significantly reduced the number of females and egg masses in roots whereas the refined one did not. A split-root technique was used to demonstrate the translocation of active compounds within a plant and their subsequent effect on the development of nematodes. When applied to the root portion all formulations significantly reduced the number of egg masses and eggs per egg mass. On the untreated root portion, neem cake at 3% w/w and aza at 0.1% w/w significantly reduced the number of egg masses compared with neem leaves at 3% w/w, aza at 0.05% and the control. All the neem formulations significantly reduced the number of eggs per egg mass on the untreated root portion. The effect of neem leaves and cake on the development of root-knot nematodes was tested at 2, 4, 6, 8, and 16 weeks after their application to soil. Even after 16 weeks all the treatments significantly reduced the galling index and number of egg masses, but their effectiveness declined over time. After storing neem leaves, cake and aza for 8 months under ambient conditions the efficacy of neem leaves and aza against root-knot nematodes remained stable, whereas that of cake declined. (c) 2006 Elsevier Ltd. All rights reserved.
Abstract:
Research shows that poor indoor air quality (IAQ) in school buildings can cause a reduction in the students' performance assessed by short-term computer-based tests, whereas good air quality in classrooms can enhance children's concentration and also teachers' productivity. Investigation of air quality in classrooms helps us to characterise pollutant levels and implement corrective measures. Outdoor pollution, ventilation equipment, furnishings, and human activities affect IAQ. In school classrooms, the occupancy density is high (1.8–2.4 m2/person) compared to offices (10 m2/person). Ventilation systems expend energy and there is a trend to save energy by reducing ventilation rates. We need to establish the minimum acceptable level of fresh air required for the health of the occupants. This paper describes a project that aims to investigate the effect of IAQ and ventilation rates on pupils' performance and health using psychological tests. The aim is to recommend suitable ventilation rates for classrooms and examine the suitability of the air quality guidelines for classrooms. The air quality, ventilation rates and pupils' performance in classrooms will be evaluated in parallel measurements. In addition, Visual Analogue Scales will be used to assess subjective perception of the classroom environment and SBS symptoms. Pupil performance will be measured with Computerised Assessment Tests (CAT) and Pen and Paper Performance Tasks, while physical parameters of the classroom environment will be recorded using an advanced data logging system. A total of 20 primary schools in the Reading area are expected to participate in the present investigation, and the pupils participating in this study will be within the age group of 9–11 years. On completion of the project, recommendations for suitable ventilation rates for schools will be formulated based on the overall data.
Abstract:
Formal and analytical models that contractors can use to assess and price project risk at the tender stage have proliferated in recent years. However, they are rarely used in practice. Introducing more models would, therefore, not necessarily help. A better understanding is needed of how contractors arrive at a bid price in practice, and how, and in what circumstances, risk apportionment actually influences pricing levels. More than 60 proposed risk models for contractors that are published in journals were examined and classified. Then exploratory interviews with five UK contractors and documentary analyses on how contractors price work generally and risk specifically were carried out to help in comparing the propositions from the literature to what contractors actually do. No comprehensive literature on the real bidding processes used in practice was found, and there is no evidence that pricing is systematic. Hence, systematic risk and pricing models for contractors may have no justifiable basis. Contractors process their bids through certain tendering gateways. They acknowledge the risk that they should price. However, the final settlement depends on a set of complex, micro-economic factors. Hence, risk accountability may be smaller than its true cost to the contractor. Risk apportionment occurs at three stages of the whole bid-pricing process. However, analytical approaches tend not to incorporate this, although they could.
Abstract:
During locomotion, retinal flow, gaze angle, and vestibular information can contribute to one's perception of self-motion. Their respective roles were investigated during active steering: Retinal flow and gaze angle were biased by altering the visual information during computer-simulated locomotion, and vestibular information was controlled through use of a motorized chair that rotated the participant around his or her vertical axis. Chair rotation was made appropriate for the steering response of the participant or made inappropriate by rotating a proportion of the veridical amount. Large steering errors resulted from selective manipulation of retinal flow and gaze angle, and the pattern of errors provided strong evidence for an additive model of combination. Vestibular information had little or no effect on steering performance, suggesting that vestibular signals are not integrated with visual information for the control of steering at these speeds.
Abstract:
Motor vehicle accidents are one of the principal causes of adolescent disability or mortality and male drivers are more likely to be involved in road accidents than female drivers. In part such associations between driver age and sex have been linked to differences in risky behaviour (e.g. speed, violations) and individual characteristics (e.g. sensation seeking, deviant behaviour). The aim of this research is to determine whether associations between risky road user behaviour and individual characteristics are a function of driver behaviour or whether they are intrinsic and measurable in individuals too young to drive. Five hundred and sixty-seven pre-driver students aged 11-16 from three secondary schools completed questionnaires measuring enthusiasm for speed, sensation seeking, deviant behaviour and attitudes towards driver violations. Boys reported more risky attitudes than girls for all measures. Associations between sensation seeking, deviant behaviour and attitudes towards risky road use were present from early adolescence and were strongest around age 14, before individuals learn to drive. Risky attitudes towards road use are associated with individual characteristics and are observed in adolescents long before they learn to drive. Safe attitudes towards road use and driver behaviour should be promoted from childhood in order to be effective. (C) 2007 Elsevier Ltd. All rights reserved.
Abstract:
The paper introduces an efficient construction algorithm for obtaining sparse linear-in-the-weights regression models based on an approach of directly optimizing model generalization capability. This is achieved by utilizing the delete-1 cross validation concept and the associated leave-one-out test error, also known as the predicted residual sums of squares (PRESS) statistic, without resorting to any other validation data set for model evaluation in the model construction process. Computational efficiency is ensured using an orthogonal forward regression, but the algorithm incrementally minimizes the PRESS statistic instead of the usual sum of the squared training errors. A local regularization method can naturally be incorporated into the model selection procedure to further enforce model sparsity. The proposed algorithm is fully automatic, and the user is not required to specify any criterion to terminate the model construction procedure. Comparisons with some of the existing state-of-the-art modeling methods are given, and several examples are included to demonstrate the ability of the proposed algorithm to effectively construct sparse models that generalize well.
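The core idea, forward selection of regressors driven by the PRESS statistic with an automatic stopping rule, can be sketched as follows. This is a simplified illustration, not the paper's algorithm: it refits at each step rather than using the efficient orthogonal forward regression recursion, and it omits local regularization. It relies only on the standard identity that the leave-one-out residual equals the ordinary residual divided by (1 - h_ii), where h_ii is the leverage. All function names are assumptions for the sketch.

```python
import numpy as np

def press_statistic(X, y):
    """Delete-1 (leave-one-out) PRESS for a linear-in-the-weights model.

    Uses the hat-matrix identity e_loo_i = e_i / (1 - h_ii), so no
    explicit n-fold refitting is required.
    """
    H = X @ np.linalg.inv(X.T @ X) @ X.T   # hat matrix
    residuals = y - H @ y                  # ordinary training residuals
    loo_residuals = residuals / (1.0 - np.diag(H))
    return float(np.sum(loo_residuals ** 2))

def forward_select_by_press(X, y):
    """Greedily add the column that most reduces PRESS; stop when no
    candidate improves it (the automatic termination criterion)."""
    n, m = X.shape
    selected, remaining = [], list(range(m))
    best_press = np.inf
    while remaining:
        press, j = min((press_statistic(X[:, selected + [j]], y), j)
                       for j in remaining)
        if press >= best_press:
            break                          # PRESS stopped improving
        best_press = press
        selected.append(j)
        remaining.remove(j)
    return selected, best_press
```

Because PRESS is a generalization estimate rather than a training error, adding an irrelevant regressor typically increases it, which is what lets the loop terminate without a user-specified threshold.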
Abstract:
Encyclopedia entry for "idioms" and "idiomatic expressions" in Italian
Abstract:
Understanding human movement is key to improving input devices and interaction techniques. This paper presents a study of mouse movements of motion-impaired users, with the aim of gaining a better understanding of impaired movement. The cursor trajectories of six motion-impaired users and three able-bodied users are studied according to their submovement structure. Several aspects of the movement are studied, including the frequency and duration of pauses between submovements, verification times, the number of submovements, the peak speed of submovements and the accuracy of submovements in two dimensions. Results include findings that some motion-impaired users pause more often and for longer than able-bodied users, require up to five times more submovements to complete the same task, and exhibit a correlation between error and peak submovement speed that does not exist for able-bodied users.
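A minimal way to extract measures of this kind (submovement count, pause time, per-submovement peak speed) from a cursor speed profile is sketched below. This is an assumed threshold-based segmentation for illustration only; the paper's submovement parsing and the `parse_submovements` name, threshold value, and return values are not from the source.

```python
import numpy as np

def parse_submovements(speed, dt, pause_threshold=0.05):
    """Segment a 1-D cursor speed profile into submovements.

    speed: array of cursor speeds sampled every dt seconds.
    A submovement is a contiguous run with speed above pause_threshold;
    samples at or below the threshold count as pause time.
    Returns (number of submovements, total pause time, peak speed of each
    submovement).
    """
    speed = np.asarray(speed, dtype=float)
    moving = speed > pause_threshold
    # pad with False so every submovement has both a start and an end
    padded = np.concatenate(([False], moving, [False])).astype(int)
    starts = np.flatnonzero(np.diff(padded) == 1)
    ends = np.flatnonzero(np.diff(padded) == -1)
    peaks = [float(np.max(speed[s:e])) for s, e in zip(starts, ends)]
    pause_time = float(np.sum(~moving) * dt)
    return len(starts), pause_time, peaks
```

Comparing the returned counts, pause times, and peak speeds between user groups would yield statistics of the same shape as those reported in the abstract.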
Abstract:
In 'Avalanche', an object is lowered, players staying in contact throughout. Normally the task is easily accomplished. However, with larger groups counter-intuitive behaviours appear. The paper proposes a formal theory for the underlying causal mechanisms. The aim is not only to provide an explicit, testable hypothesis for the source of the observed modes of behaviour, but also to exemplify the contribution that formal theory building can make to understanding complex social phenomena. Mapping reveals the importance of geometry to the Avalanche game; each player has a pair of balancing loops, one involved in lowering the object, the other ensuring contact. For more players, sets of balancing loops interact and these can allow dominance by reinforcing loops, causing the system to chase upwards towards an ever-increasing goal. However, a series of other effects concerning human physiology and behaviour (HPB) is posited as playing a role. The hypothesis is therefore rigorously tested using simulation. For simplicity a 'One Degree of Freedom' case is examined, allowing all of the effects to be included whilst rendering the analysis more transparent. Formulation and experimentation with the model gives insight into the behaviours. Multi-dimensional rate/level analysis indicates that there is only a narrow region in which the system is able to move downwards. Model runs reproduce the single 'desired' mode of behaviour and all three of the observed 'problematic' ones. Sensitivity analysis gives further insight into the system's modes and their causes. Behaviour is seen to arise only when the geometric effects apply (number of players greater than degrees of freedom of the object) in combination with a range of HPB effects. An analogy exists between the co-operative behaviour required here and various examples: conflicting strategic objectives in organizations; Prisoners' Dilemma and integrated bargaining situations.
Additionally, the game may be relatable in more direct algebraic terms to situations involving companies in which the resulting behaviours are mediated by market regulations. Finally, comment is offered on the inadequacy of some forms of theory building and the case is made for formal theory building involving the use of models, analysis and plausible explanations to create deep understanding of social phenomena.
Abstract:
We present a new approach to determine palaeotemperatures (mean annual surface temperatures) based on measurements of the liquid–vapour homogenisation temperature of fluid inclusions in stalagmites. The aim of this study is to explore the potential and the limitations of this new palaeothermometer and to develop a reliable methodology for routine applications in palaeoclimate research. Therefore, we have investigated recent fluid inclusions from the top part of actively growing stalagmites that have formed at temperatures close to the present-day cave air temperature. A precondition for measuring homogenisation temperatures of originally monophase inclusions is the nucleation of a vapour bubble by means of single ultra-short laser pulses. Based on the observed homogenisation temperatures (Th(obs)) and measurements of the vapour bubble diameter at a known temperature, we calculated stalagmite formation temperatures (Tf) by applying a thermodynamic model that takes into account the effect of surface tension on liquid–vapour homogenisation. Results from recent stalagmite samples demonstrate that calculated stalagmite formation temperatures match the present-day cave air temperature within ± 0.2 °C. To avoid artificially induced changes of the fluid density we defined specific demands on the selection, handling and preparation of the stalagmite samples. Application of the method is restricted to stalagmites that formed at cave temperatures greater than ~ 9–11 °C.