39 results for Distress thermometer
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
An improved amplifier for atmospheric fine wire resistance thermometry is described. The amplifier uses a low excitation current (50 µA). This is shown to ensure negligible self-heating of the low mass fine wire resistance sensor, compared with measured nocturnal surface air temperature fluctuations. The system provides sufficient amplification for a ±50 °C span using a ±5 V dynamic range analog-to-digital converter, with a noise level of less than 0.01 °C. A Kelvin four-wire connection cancels the effect of long lead resistances: a 50 m length of screened cable connecting the Reading design of fine wire thermometer to the amplifier produced no measurable temperature change at 12 bit resolution.
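As a rough check on the negligible-self-heating claim, the dissipated power at 50 µA can be estimated from Ohm's law. This is a back-of-envelope sketch: only the 50 µA excitation current comes from the abstract; the 100 Ω element resistance and the ~1 mW/K dissipation constant are assumed for illustration.

```python
# Back-of-envelope self-heating estimate for a fine wire resistance sensor.
I = 50e-6                      # excitation current, A (from the abstract)
R = 100.0                      # assumed sensor resistance, ohm
k_diss = 1e-3                  # assumed dissipation constant, W/K

P = I**2 * R                   # Joule heating in the sensor, W
dT_self = P / k_diss           # equilibrium self-heating, K

print(f"dissipated power: {P * 1e9:.0f} nW")    # ~250 nW
print(f"self-heating: {dT_self * 1e3:.2f} mK")  # ~0.25 mK, well below the 0.01 degC noise floor
```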
Abstract:
Objective. This study investigated whether trait positive schizotypy or trait dissociation was associated with increased levels of data-driven processing and symptoms of post-traumatic distress following a road traffic accident. Methods. Forty-five survivors of road traffic accidents were recruited from a London Accident and Emergency service. Each completed measures of trait positive schizotypy, trait dissociation, data-driven processing, and post-traumatic stress. Results. Trait positive schizotypy was associated with increased levels of data-driven processing and post-traumatic symptoms during a road traffic accident, whereas trait dissociation was not. Conclusions. Previous results which report a significant relationship between trait dissociation and post-traumatic symptoms may be an artefact of the relationship between trait positive schizotypy and trait dissociation.
Abstract:
This work analyzes the use of linear discriminant models, multi-layer perceptron neural networks and wavelet networks for corporate financial distress prediction. Although simple and easy to interpret, linear models require statistical assumptions that may be unrealistic. Neural networks are able to discriminate patterns that are not linearly separable, but the large number of parameters involved in a neural model often causes generalization problems. Wavelet networks are classification models that implement nonlinear discriminant surfaces as the superposition of dilated and translated versions of a single "mother wavelet" function. In this paper, an algorithm is proposed to select dilation and translation parameters that yield a wavelet network classifier with good parsimony characteristics. The models are compared in a case study involving failed and continuing British firms in the period 1997-2000. Problems associated with over-parameterized neural networks are illustrated and the Optimal Brain Damage pruning technique is employed to obtain a parsimonious neural model. The results, supported by a re-sampling study, show that both neural and wavelet networks may be a valid alternative to classical linear discriminant models.
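To make the model class concrete, the sketch below implements a wavelet network discriminant as the abstract describes it: a weighted superposition of dilated and translated copies of a single mother wavelet. The Mexican-hat wavelet, the radial form, and all parameter values are illustrative assumptions; the paper's parameter-selection algorithm is not reproduced here.

```python
import numpy as np

def mexican_hat(u):
    # "Mexican hat" mother wavelet (negative second derivative of a Gaussian).
    return (1.0 - u**2) * np.exp(-0.5 * u**2)

def wavelet_net(x, weights, translations, dilations, bias=0.0):
    # Discriminant surface: weighted superposition of dilated and translated
    # copies of the mother wavelet, evaluated radially in input space.
    out = np.full(x.shape[0], bias)
    for w, t, d in zip(weights, translations, dilations):
        r = np.linalg.norm(x - t, axis=1) / d
        out += w * mexican_hat(r)
    return out                              # classify firms by sign(out)

x = np.random.randn(5, 2)                   # 5 firms, 2 financial ratios (toy)
score = wavelet_net(x,
                    weights=[1.0, -0.8],
                    translations=np.array([[0.0, 0.0], [1.0, 1.0]]),
                    dilations=[1.0, 0.5])
print(np.sign(score))                       # failed vs continuing assignments
```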
Abstract:
This paper analyzes the use of linear and neural network models for financial distress classification, with emphasis on the issues of input variable selection and model pruning. A data-driven method for selecting input variables (financial ratios, in this case) is proposed. A case study involving 60 British firms in the period 1997-2000 is used for illustration. It is shown that the use of the Optimal Brain Damage pruning technique can considerably improve the generalization ability of a neural model. Moreover, the set of financial ratios obtained with the proposed selection procedure is shown to be an appropriate alternative to the ratios usually employed by practitioners.
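For readers unfamiliar with Optimal Brain Damage, the sketch below shows its core pruning step under the usual diagonal-Hessian approximation: each weight's saliency is estimated as 0.5·H_ii·w_i², and the least salient weights are zeroed out. The weights, data, and Hessian values here are placeholders, not the paper's model.

```python
import numpy as np

def obd_prune(weights, hessian_diag, frac=0.25):
    # Saliency under the diagonal-Hessian approximation:
    # s_i = 0.5 * H_ii * w_i**2 (estimated loss increase if w_i -> 0).
    saliency = 0.5 * hessian_diag * weights**2
    n_prune = int(frac * weights.size)
    idx = np.argsort(saliency)[:n_prune]   # least salient weights first
    pruned = weights.copy()
    pruned[idx] = 0.0                      # delete them from the model
    return pruned                          # in practice, retrain afterwards

w = np.random.randn(20)                    # placeholder network weights
h = np.abs(np.random.randn(20))            # stand-in for diag(Hessian)
print(obd_prune(w, h))
```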
Abstract:
This paper is concerned with the use of a genetic algorithm to select financial ratios for corporate distress classification models. For this purpose, the fitness value associated with a set of ratios is made to reflect the requirements of maximizing the amount of information available for the model and minimizing the collinearity between the model inputs. A case study involving 60 failed and continuing British firms in the period 1997-2000 is used for illustration. The classification model based on ratios selected by the genetic algorithm compares favorably with a model employing ratios usually found in the financial distress literature.
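The fitness idea can be sketched as a trade-off between two correlation-based terms: reward ratios that carry information about failure, and penalise ratios that are collinear with each other. The specific proxies and the weighting alpha below are illustrative assumptions; a genetic algorithm would then evolve candidate ratio subsets under this fitness.

```python
import numpy as np

def fitness(subset, X, y, alpha=1.0):
    # Information term: mean |correlation| of each chosen ratio with the
    # failed/continuing label (an illustrative proxy, not the paper's term).
    Xs = X[:, subset]
    info = np.mean([abs(np.corrcoef(Xs[:, j], y)[0, 1])
                    for j in range(Xs.shape[1])])
    # Collinearity term: mean |correlation| between the chosen ratios.
    C = np.corrcoef(Xs, rowvar=False)
    iu = np.triu_indices_from(C, k=1)
    collin = np.mean(np.abs(C[iu]))
    return info - alpha * collin           # maximise info, penalise overlap

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))              # 60 firms, 10 candidate ratios (toy)
y = rng.integers(0, 2, 60)                 # failed (1) vs continuing (0), toy
print(fitness([0, 3, 7], X, y))
```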
Abstract:
Systematic natural ventilation effects on measured temperatures within a standard large wooden thermometer screen are investigated under summer conditions, using well-calibrated platinum resistance thermometers. Under low ventilation (2 m wind speed u2 < 1.1 m s−1), the screen slightly underestimates daytime air temperature but overestimates air temperature nocturnally by 0.2 °C. The screen's lag time L lengthens with decreasing wind speed, following an inverse power law relationship between L and u2. For u2 > 2 m s−1, L ∼ 2.5 min, increasing, when calm, to at least 15 min. Spectral response properties of the screen to air temperature fluctuations vary with wind speed because of the lag changes. Ventilation effects are particularly apparent at the higher (>25 °C) temperatures, both through the lag effect and from solar heating. For sites where wind speed decreases with increasing daytime temperature, thermometer screen temperatures may consequently show larger uncertainties at the higher temperatures. Under strong direct beam solar radiation (>850 W m−2) the radiation effect is likely to be <0.4 °C. Copyright © 2011 Royal Meteorological Society
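The lag behaviour can be illustrated with the standard first-order sensor-response model. The power-law coefficients below are chosen only to reproduce the abstract's limiting values (L ≈ 2.5 min above 2 m s−1, at least 15 min when calm); they are not the paper's fitted parameters.

```python
import numpy as np

def lag_time(u2):
    # L ~ 5/u2 minutes, clipped to the abstract's limits: ~2.5 min above
    # 2 m/s, at least 15 min when calm (coefficients are illustrative).
    return np.clip(5.0 / np.maximum(u2, 1e-3), 2.5, 15.0)

def screen_response(T_air, dt_min, L_min):
    # First-order lag: dT_screen/dt = (T_air - T_screen) / L.
    T = np.empty_like(T_air)
    T[0] = T_air[0]
    for i in range(1, len(T_air)):
        T[i] = T[i-1] + (dt_min / L_min) * (T_air[i-1] - T[i-1])
    return T

t = np.arange(0.0, 60.0, 1.0)                    # minutes
T_air = 20.0 + 2.0 * np.sin(2 * np.pi * t / 30)  # fluctuating air temperature
print(screen_response(T_air, 1.0, lag_time(0.5))[:5])  # light-wind case
```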
Abstract:
Relative humidity (RH) measurements, as derived from wet-bulb and dry-bulb thermometers operated as a psychrometer within a thermometer screen, have limited accuracy because of natural ventilation variations. Standard RH calculations generally assume a fixed screen psychrometer coefficient, but this is too small during poor ventilation. By comparing a reference humidity probe, exposed within a screen containing a psychrometer, with wind-speed measurements under controlled conditions, a wind-speed correction for the screen psychrometer coefficient has been derived; it is applicable when 2-metre wind speeds fall below 3 m s−1. Applying this correction to hourly-averaged data reduced the mean moist RH bias of the psychrometer (over the reference probe) from 1.2% to 0.4%, and reduced the inter-quartile range of the RH differences from 2.0% to 0.8%. This correction is particularly amenable to automatic measurement systems.
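The role of the psychrometer coefficient is easiest to see in the standard psychrometer equation, e = e_s(T_wet) − A·p·(T_dry − T_wet). The sketch below uses a Magnus approximation for e_s and a made-up linear inflation of A below 3 m s−1 to represent the larger effective coefficient under poor ventilation; the paper's actual correction function and fitted values are not reproduced.

```python
import numpy as np

def e_sat(T):
    # Saturation vapour pressure in hPa, Magnus approximation, T in deg C.
    return 6.112 * np.exp(17.62 * T / (243.12 + T))

def relative_humidity(T_dry, T_wet, u2, p=1013.25):
    A0 = 6.5e-4                        # nominal psychrometer coefficient, 1/K
    # Assumed correction: inflate A linearly below 3 m/s (illustrative only).
    A = A0 * (1.0 + 0.05 * max(0.0, 3.0 - u2))
    e = e_sat(T_wet) - A * p * (T_dry - T_wet)   # psychrometer equation
    return 100.0 * e / e_sat(T_dry)

print(relative_humidity(20.0, 15.0, u2=0.5))     # poorly ventilated screen
print(relative_humidity(20.0, 15.0, u2=4.0))     # well-ventilated screen
```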
Abstract:
We present a new approach to determine palaeotemperatures (mean annual surface temperatures) based on measurements of the liquid–vapour homogenisation temperature of fluid inclusions in stalagmites. The aim of this study is to explore the potential and the limitations of this new palaeothermometer and to develop a reliable methodology for routine applications in palaeoclimate research. Therefore, we have investigated recent fluid inclusions from the top part of actively growing stalagmites that have formed at temperatures close to the present-day cave air temperature. A precondition for measuring homogenisation temperatures of originally monophase inclusions is the nucleation of a vapour bubble by means of single ultra-short laser pulses. Based on the observed homogenisation temperatures (Th(obs)) and measurements of the vapour bubble diameter at a known temperature, we calculated stalagmite formation temperatures (Tf) by applying a thermodynamic model that takes into account the effect of surface tension on liquid–vapour homogenisation. Results from recent stalagmite samples demonstrate that calculated stalagmite formation temperatures match the present-day cave air temperature within ± 0.2 °C. To avoid artificially induced changes of the fluid density we defined specific demands on the selection, handling and preparation of the stalagmite samples. Application of the method is restricted to stalagmites that formed at cave temperatures greater than ~ 9–11 °C.
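The surface-tension effect the thermodynamic model must handle is the Young-Laplace pressure difference across the curved bubble-liquid interface (a standard relation; the notation is ours, not the paper's):

\[
  P_{\mathrm{vap}} - P_{\mathrm{liq}} = \frac{2\,\sigma(T)}{r},
\]

where σ(T) is the temperature-dependent surface tension of water and r is the bubble radius. For micrometre-scale bubbles this term is far from negligible, which is why the model needs the measured bubble diameter at a known temperature to convert Th(obs) into the formation temperature Tf.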
Abstract:
Most research on the discourses and practices of urban regeneration in the UK has examined case studies located in areas of relative socio-economic distress. Less research has been undertaken on regeneration projects and agendas in areas characterised by strong economic growth. Yet it is in such places that some of the best examples of the discourses, practices and impacts of contemporary urban regeneration can be found. In some areas of high demand, regeneration projects have used inner urban brownfield sites as locations for new investment. With the New Labour government's urban policy agendas targeting similar forms of regeneration, an examination of completed or on-going schemes is timely and relevant to debates over the direction that policy should take. This paper, drawing on a study of urban regeneration in one of England's fastest-growing towns, Reading in Berkshire, examines the discourses, practices and impacts of redevelopment schemes during the 1990s and 2000s. Reading's experiences have received national attention and have been hailed as a model for other urban areas to follow. The research documents the discursive and concrete aspects of local regeneration and examines the ways in which specific priorities and defined problems have come to dominate agendas. Collectively, the study argues that market-driven objectives come to dominate regeneration agendas, even in areas of strong demand where development agencies wield a relatively high degree of influence. Such regeneration plays a symbolic and practical role in creating new forms of exclusion and interpretations of place. © 2003 Elsevier Science Ltd. All rights reserved.
Abstract:
Thermometer screen properties are poorly characterised at low wind speeds. Temperatures from a large thermometer screen have been compared with those from an automatically shaded open-air fine-wire resistance thermometer. For the majority of 5-minute average measurements obtained between July 2008 and 2009, the screen and fine-wire temperatures agreed closely, with a median difference <0.05 °C. At low wind speeds, however, larger temperature differences occurred. When calm (wind speed at 2 metres, u2, ≤ 0.1 m s−1), the difference between screen and open-air temperatures varied from −0.25 °C to +0.87 °C. At night with u2 < 0.5 m s−1, this difference ranged from −0.14 °C to 0.39 °C, with rare extremes between −0.68 °C and 1.38 °C. At the minimum in the daily temperature cycle, the semi-urban site at Reading had u2 < 1 m s−1 for 52% of the observations from 1997 to 2008, u2 < 0.5 m s−1 for 34%, and calm conditions for 20%. Consequently, uncertainties in the minimum temperature measurements may arise from poor ventilation, which can propagate through calculations to daily average temperatures. In comparison with the daily minimum temperature, the 0900 UTC synoptic temperature measurement has a much lower abundance (5%) of calm conditions.
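A simple illustration of how a ventilation error in the minimum propagates (assuming the common climatological convention of estimating the daily mean from the two extremes; the convention itself is not stated in the abstract):

\[
  \bar{T} = \tfrac{1}{2}\left(T_{\max} + T_{\min}\right)
  \quad\Rightarrow\quad
  \Delta\bar{T} = \tfrac{1}{2}\,\Delta T_{\min},
\]

so the calm-condition extreme of +0.87 °C in the screen minimum would bias such a daily mean by roughly +0.44 °C.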
Abstract:
The conventional method for assessing acute oral toxicity (OECD Test Guideline 401) was designed to identify the median lethal dose (LD50), using the death of animals as an endpoint. Introduced as an alternative method (OECD Test Guideline 420), the Fixed Dose Procedure (FDP) relies on the observation of clear signs of toxicity, uses fewer animals and causes less suffering. More recently, the Acute Toxic Class method and the Up-and-Down Procedure have also been adopted as OECD test guidelines. Both of these methods also use fewer animals than the conventional method, although they still use death as an endpoint. Each of the three new methods incorporates a sequential dosing procedure, which results in increased efficiency. In 1999, with a view to replacing OECD Test Guideline 401, the OECD requested that the three new test guidelines be updated. This was to bring them in line with the regulatory needs of all OECD Member Countries, provide further reductions in the number of animals used, and introduce refinements to reduce the pain and distress experienced by the animals. This paper describes a statistical modelling approach for the evaluation of acute oral toxicity tests, by using the revised FDP for illustration. Opportunities for further design improvements are discussed.
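As a toy illustration of how sequential fixed-dose designs can be evaluated by statistical modelling, the simulation below steps one animal at a time up through fixed dose levels against an assumed log-logistic dose-response curve. The 5/50/300/2000 mg/kg levels follow the FDP convention, but the response curve, stopping rule, and all parameter values are simplifications for illustration, not the OECD TG 420 procedure or the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)
DOSES = [5, 50, 300, 2000]            # mg/kg, FDP-style fixed dose levels

def p_toxicity(dose, td50=150.0, slope=2.0):
    # Assumed log-logistic curve for the probability of clear toxic signs;
    # td50 is the dose giving signs in half of animals (hypothetical).
    return 1.0 / (1.0 + (td50 / dose) ** slope)

def run_fdp(start=0):
    # Simplified sequential rule: dose one animal per step, moving up the
    # fixed doses until clear signs of toxicity appear (death is not the
    # endpoint); classify the substance at the first dose showing signs.
    for dose in DOSES[start:]:
        if rng.random() < p_toxicity(dose):
            return dose
    return DOSES[-1]                  # no signs even at the limit dose

print([run_fdp() for _ in range(5)])  # classification doses over 5 runs
```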