950 results for Automated Hazard Analysis


Relevance:

30.00%

Publisher:

Abstract:

Contributing to the evaluation of seismic hazards, a previously unmapped strand of the Seattle Fault Zone (SFZ), cutting across the southwest side of Lake Washington and southeast Seattle, is located and characterized on the basis of bathymetry, borehole logs, and ground penetrating radar (GPR). Previous geologic mapping and geophysical analyses of the Seattle area have located some strands of the SFZ, but knowledge of the locations of all individual strands of the fault system remains incomplete. A bathymetric scarp-like feature and a co-linear aeromagnetic anomaly lineament defined the extent of the study area. A two-dimensional lithologic cross-section was constructed using six boreholes selected from the suitable boreholes in the study area. In addition, two GPR transects, oblique to the proposed fault trend, served to identify physical differences in subsurface materials. The proposed fault trace follows the previously mapped contact between the Oligocene Blakeley Formation and Quaternary deposits, and topographic changes in slope. GPR profiles in Seward Park and across the proposed fault location show the contact between the Blakeley Formation and unconsolidated glacial deposits, but do not constrain an offset. However, north-dipping beds in the Blakeley Formation are consistent with previous interpretations of P-wave seismic profiles on Mercer Island and Bellevue, Washington. Those seismic profiles show the mapped location of the aeromagnetic lineament in Lake Washington and the inferred location of a steeply-dipping, high-amplitude bedrock reflector, representing a fault strand. This north-dipping reflector is likely the same feature identified in my analysis. I characterize the strand as a splay fault, antithetic to the frontal fault of the SFZ. This new fault may pose a geologic hazard to the region.

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to apply multifailure survival methods to analyze time to multiple occurrences of basal cell carcinoma (BCC). Data from 4.5 years of follow-up in the Nambour Skin Cancer Prevention Trial (1992-1996), a randomized controlled trial of skin cancer prevention, were used to assess the influence of sunscreen application on the time to first BCC and the time to subsequent BCCs. Three different approaches to modelling time to ordered multiple events were applied and compared: the Andersen-Gill, Wei-Lin-Weissfeld, and Prentice-Williams-Peterson models. Robust variance estimation was used for all multifailure survival models. Sunscreen treatment was not associated with time to first occurrence of a BCC (hazard ratio = 1.04, 95% confidence interval: 0.79, 1.45). For time to subsequent BCC tumors, the Andersen-Gill model yielded a lower estimated hazard among the daily sunscreen application group, although statistical significance was not reached (hazard ratio = 0.82, 95% confidence interval: 0.59, 1.15). Similarly, both the Wei-Lin-Weissfeld marginal-hazards and the Prentice-Williams-Peterson gap-time models showed trends toward a lower risk of subsequent BCC tumors in the sunscreen intervention group. These results demonstrate the importance of conducting multiple-event analyses of recurring events, as risk factors for a single event may differ from those identified when repeated events are considered.
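
As an illustration of the kind of recurrent-event model described above, the sketch below fits a Wei-Lin-Weissfeld-style marginal Cox model with robust, subject-clustered variance using the Python lifelines package. The data frame, column names (subject_id, event_number, time, bcc, sunscreen) and the values themselves are hypothetical, and the trial's actual analyses were not necessarily performed with this software.

```python
# Hypothetical sketch of a Wei-Lin-Weissfeld-style marginal Cox model with
# robust (sandwich) variance clustered on subject. Column names and data are
# invented for illustration; they do not come from the Nambour trial.
import pandas as pd
from lifelines import CoxPHFitter

# One row per subject per event number (marginal/total-time formulation):
# 'time'         - follow-up time to the k-th BCC or censoring
# 'bcc'          - 1 if the k-th BCC occurred, 0 if censored
# 'event_number' - k, used as a stratum as in the WLW model
# 'sunscreen'    - 1 = daily sunscreen group, 0 = discretionary use
recurrent = pd.DataFrame({
    "subject_id":   [1, 1, 2, 2, 3, 3, 4, 4],
    "event_number": [1, 2, 1, 2, 1, 2, 1, 2],
    "time":         [2.1, 4.5, 3.0, 4.5, 1.2, 2.8, 0.9, 4.5],
    "bcc":          [1, 0, 1, 0, 1, 1, 1, 0],
    "sunscreen":    [1, 1, 0, 0, 1, 1, 0, 0],
})

cph = CoxPHFitter()
cph.fit(
    recurrent,
    duration_col="time",
    event_col="bcc",
    strata=["event_number"],   # separate baseline hazard per event order
    cluster_col="subject_id",  # robust variance for within-subject correlation
)
cph.print_summary()            # hazard ratio for 'sunscreen' with robust CI
```

Stratifying on event number gives each event order its own baseline hazard, while clustering on subject mirrors the robust variance approach mentioned in the abstract.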

Relevance:

30.00%

Publisher:

Abstract:

Good quality concept lattice drawings are required to effectively communicate logical structure in Formal Concept Analysis. Data analysis frameworks such as the Toscana System use manually arranged concept lattices to avoid the problem of automatically producing high-quality lattice drawings. This limits Toscana systems to a fixed set of concept lattices that have been prepared a priori. To extend the use of Formal Concept Analysis, automated techniques are required that can produce high-quality concept lattice drawings on demand. This paper proposes and evaluates an adaptation of layer diagrams to improve automated lattice drawing. © Springer-Verlag Berlin Heidelberg 2006.
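
For readers unfamiliar with layer diagrams, the sketch below shows the usual first step of a layered lattice drawing: assigning each concept to a layer given by the length of the longest chain from the top element. This is a generic longest-path layering over an invented toy lattice, not the specific adaptation proposed in the paper.

```python
# Generic first step of a layered (layer-diagram) drawing: assign each concept
# a layer equal to the longest path from the top element. Textbook longest-path
# layering over an invented toy lattice, not the paper's specific algorithm.
from functools import lru_cache

# Cover relation of a small concept lattice: concept -> list of lower covers.
covers = {
    "top":    ["a", "b"],
    "a":      ["c"],
    "b":      ["c", "d"],
    "c":      ["bottom"],
    "d":      ["bottom"],
    "bottom": [],
}

@lru_cache(maxsize=None)
def layer(concept: str) -> int:
    """Longest chain length from 'top' down to this concept."""
    parents = [p for p, children in covers.items() if concept in children]
    if not parents:          # the top concept
        return 0
    return 1 + max(layer(p) for p in parents)

layers = {c: layer(c) for c in covers}
print(layers)  # {'top': 0, 'a': 1, 'b': 1, 'c': 2, 'd': 2, 'bottom': 3}
```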

Relevance:

30.00%

Publisher:

Abstract:

Magill and Quinzii (2002; Capital market equilibrium with moral hazard, Journal of Mathematical Economics 38, 149-190) showed that, in a stock market economy with private information, the moral hazard problem may be resolved provided that a spanning overlap condition is satisfied. This result depends on the assumption that the technology is given by a stochastic production function with a single scalar input. The object of the present paper is to extend the analysis of Magill and Quinzii to the case of multiple inputs. We show that their main result extends to this general case if and only if, for each firm, the number of linearly independent combinations of securities having payoffs correlated with, but not dependent on, the firm's output is equal to the number of degrees of freedom in the firm's production technology.

Relevance:

30.00%

Publisher:

Abstract:

We present AUSLEM (AUStralian Land Erodibility Model), a land erodibility modelling system that applies a rule-set of surficial and climatic thresholds within a Geographic Information System (GIS) modelling framework to predict landscape susceptibility to wind erosion. AUSLEM is distinctive in that it quantitatively assesses landscape susceptibility to wind erosion at a 5 × 5 km spatial resolution on a monthly time-step across Australia. The system was implemented for representative wet (1984), dry (1994) and average rainfall (1997) years with corresponding low, high and moderate dust storm day frequencies. Results demonstrate that AUSLEM can identify landscape erodibility and provide an interpretation of the physical nature and distribution of erodible landscapes in Australia. Further, the results offer an assessment of the dynamic tendencies of erodibility in space and time in response to the El Niño Southern Oscillation (ENSO) and seasonal synoptic-scale climate variability. A comparative analysis of AUSLEM output against independent national and international wind erosion, atmospheric aerosol and dust event records indicates a high level of model competency. (c) 2006 Elsevier B.V. All rights reserved.
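
The rule-set idea can be pictured as simple threshold tests applied cell by cell to gridded input layers. The sketch below is a minimal illustration with invented layer names (veg_cover, soil_clay, monthly_rain) and threshold values; it does not reproduce AUSLEM's actual rules or data.

```python
# Minimal sketch of a threshold rule-set applied to gridded layers, in the
# spirit of a GIS-based erodibility model. Layer names and thresholds are
# invented for illustration and are not AUSLEM's actual rules.
import numpy as np

rng = np.random.default_rng(0)
shape = (4, 5)                             # toy 4 x 5 grid of cells

veg_cover    = rng.uniform(0, 100, shape)  # % protective ground cover
soil_clay    = rng.uniform(0, 60, shape)   # % clay content
monthly_rain = rng.uniform(0, 150, shape)  # mm of rain in the month

# A cell is flagged as susceptible when all thresholds are met.
susceptible = (
    (veg_cover < 40.0)         # sparse protective cover
    & (soil_clay < 20.0)       # loose, sandy-textured soil
    & (monthly_rain < 25.0)    # dry antecedent conditions
)

print(susceptible.astype(int))
print(f"{susceptible.mean():.0%} of cells flagged as potentially erodible")
```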

Relevance:

30.00%

Publisher:

Abstract:

Fuzzy signal detection analysis can be a useful complementary technique to traditional signal detection theory analysis methods, particularly in applied settings. For example, traffic situations are better conceived as being on a continuum from no potential for hazard to high potential, rather than either having potential or not having potential. This study examined the relative contribution of sensitivity and response bias to explaining differences in the hazard perception performance of novices and experienced drivers, and the effect of a training manipulation. Novice drivers and experienced drivers were compared (N = 64). Half the novices received training, while the experienced drivers and half the novices remained untrained. Participants completed a hazard perception test and rated potential for hazard in occluded scenes. The response latency of participants to the hazard perception test replicated previous findings of experienced/novice differences and trained/untrained differences. Fuzzy signal detection analysis of both the hazard perception task and the occluded rating task suggested that response bias may be more central to hazard perception test performance than sensitivity, with trained and experienced drivers responding faster and with a more liberal bias than untrained novices. Implications for driver training and the hazard perception test are discussed.
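
For context, fuzzy signal detection analysis assigns each event a degree of "signalness" and each response a degree of "yesness" rather than binary categories. The sketch below computes fuzzy hit and false-alarm rates and the derived sensitivity (d') and bias (c) following the mapping proposed by Parasuraman, Masalonis and Hancock (2000); the membership values are invented and this is not the study's analysis code.

```python
# Illustrative fuzzy signal detection computation (Parasuraman, Masalonis &
# Hancock, 2000). Membership values below are invented for demonstration.
import numpy as np
from scipy.stats import norm

# s: degree to which each traffic event has potential for hazard (signal)
# r: degree to which the observer's response treats it as hazardous
s = np.array([0.9, 0.7, 0.2, 0.1, 0.6, 0.3])
r = np.array([0.8, 0.4, 0.3, 0.0, 0.9, 0.1])

hits               = np.minimum(s, r)
misses             = np.maximum(s - r, 0.0)
false_alarms       = np.maximum(r - s, 0.0)
correct_rejections = np.minimum(1 - s, 1 - r)

hit_rate = hits.sum() / s.sum()                # fuzzy hit rate
fa_rate  = false_alarms.sum() / (1 - s).sum()  # fuzzy false-alarm rate

d_prime   = norm.ppf(hit_rate) - norm.ppf(fa_rate)            # sensitivity
criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))   # response bias c

print(f"hit rate={hit_rate:.2f}, FA rate={fa_rate:.2f}")
print(f"d'={d_prime:.2f}, c={criterion:.2f}")
```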

Relevance:

30.00%

Publisher:

Abstract:

This research examines a behavioural-based safety (BBS) intervention within a paper mill in the South East of England. Two further mills are examined for comparison: one with an established BBS programme, and one improving its safety management system through management ownership. BBS programmes have become popular within the UK, but most of the research on their efficacy is carried out by the BBS providers themselves. This thesis aims to evaluate a BBS intervention from a standpoint that is not commercially biased in favour of BBS schemes. The aim of a BBS scheme is to change personnel behaviours or attitudes, which in turn should positively affect the organisation's safety culture. The research framework involved a qualitative methodology to examine the effects of the intervention on the paper mill's safety culture. The techniques used were questionnaires and semi-structured interviews, in addition to observation and discussions made possible by the author's position as participant observer. The results demonstrated a failure to improve any aspect of the mill's safety culture, which worsened following the BBS intervention. Issues such as trust, morale, communication and support of management showed significant signs of negative workforce response. The paper mill where the safety management system approach was used demonstrated a significantly improved safety culture and achieved site ownership among middle managers and supervisors. Research has demonstrated that a solid foundation is required before a BBS programme can be implemented successfully. For a programme to work there must be middle management support in addition to senior management commitment. If a trade union actively distances itself from BBS, the programme is also unlikely to be effective. This thesis proposes that BBS observation programmes are not suitable for the papermaking industry, particularly when staffing levels are low due to challenging economic conditions. Observers are not available during high-hazard situations, which suggests that BBS implementation is not the correct intervention for the paper industry.

Relevance:

30.00%

Publisher:

Abstract:

Peptidic Nucleic Acids (PNAs) are achiral, uncharged nucleic acid mimetics, with a novel backbone composed of N-(2-aminoethyl)glycine units attached to the DNA bases through carboxymethylene linkers. With the goal of extending and improving upon the molecular recognition properties of PNAs, the aim of this work was to synthesise PNA building block intermediates containing a series of substituted purine bases for subsequent use in automated PNA synthesis. Four purine bases, 2,6-diaminopurine (D), isoGuanine (isoG), xanthine (X) and hypoxanthine (H), were identified for incorporation into PNAs targeted to DNA, with the promise of increased hybrid stability over extended pH ranges together with improvements over the use of adenine (A) in duplex formation and cytosine (C) in triplex formation. A reliable, high-yielding synthesis of the PNA backbone component N-(2-butyloxycarbonyl-aminoethyl)glycinate ethyl ester was established. The precursor N-(2-butyloxycarbonyl)amino acetonitrile was crystallised and analysed by X-ray crystallography for the first time. An excellent refinement (R = 0.0276) was attained for this structure, allowing comparisons with known analogues. Although chemical synthesis of pure, fully-characterised PNA monomers was not achieved, chemical synthesis of PNA building blocks composed of diaminopurine, xanthine and hypoxanthine was completely successful. In parallel, a second objective of this work was to characterise and evaluate novel crystalline intermediates, which formed a new series of substituted purine bases generated by attaching alkyl substituents at the N9 or N7 sites of purine bases. Crystallographic analysis was undertaken to probe the regiochemistry of isomers, and to reveal interesting structural features of the new series of similarly-substituted purine bases. The attainment of the versatile synthetic intermediate 2,6-dichloro-9-(carboxymethyl)purine ethyl ester, and its homologous regioisomers 6-chloro-9-(carboxymethyl)purine ethyl ester and 6-chloro-7-(carboxymethyl)purine ethyl ester, necessitated the use of X-ray crystallographic analysis for unambiguous structural assignment. Successful refinement of the disordered 2,6-diamino-9-(carboxymethyl)purine ethyl ester allowed comparison with the reported structure of the adenine analogue, ethyl adenin-9-yl acetate. Replacement of the chloro moieties with amino, azido and methoxy groups expanded the internal angles at their point of attachment to the purine ring. Crystallographic analysis played a pivotal role in confirming the identity of the peralkylated hypoxanthine derivative diethyl 6-oxo-6,7-dihydro-3H-purine-3,7-diacetate, where two ethyl side chains were found to attach at N3 and N7.

Relevance:

30.00%

Publisher:

Abstract:

A series of N1-benzylideneheteroarylcarboxamidrazones was prepared in an automated fashion and tested against Mycobacterium fortuitum in a rapid screen for antimycobacterial activity. Many of the compounds from this series were also tested against Mycobacterium tuberculosis, and the usefulness of M. fortuitum as a rapid, initial screen for anti-tubercular activity was evaluated. Various deletions were made to the N1-benzylideneheteroarylcarboxamidrazone structure in order to establish the minimum structural requirements for activity. The N1-benzylideneheteroarylcarboxamidrazones were then subjected to molecular modelling studies, and their activities against M. fortuitum and M. tuberculosis were analysed using quantitative structure-activity relationship (QSAR) techniques in the computational package TSAR (Oxford Molecular Ltd.). A set of equations predictive of antimycobacterial activity was thereby obtained. The series of N1-benzylideneheteroarylcarboxamidrazones was also tested against a multidrug-resistant strain of Staphylococcus aureus (MRSA) and, where activity against MRSA was observed, against a panel of Gram-positive and Gram-negative bacteria. A set of antimycobacterial N1-benzylideneheteroarylcarboxamidrazones was thereby discovered, the best of which had MICs against M. fortuitum in the range 4-8 μg ml-1 and displayed 94% inhibition of M. tuberculosis at a concentration of 6.25 μg ml-1. The antimycobacterial activity of these compounds appeared to be specific, since the same compounds were inactive against other classes of organism. Compounds found to be sufficiently active in any screen were also tested for toxicity against human mononuclear leucocytes. Polyethylene glycol (PEG) was used as a soluble polymeric support for the synthesis of some fatty acid derivatives containing an isoxazoline group, which may inhibit mycolic acid synthesis in mycobacteria. Both the PEG-bound products and the cleaved, isolated products were tested against M. fortuitum, and some low levels of antimycobacterial activity were observed; these compounds may serve as leads for further studies.
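
As a generic illustration of deriving a predictive QSAR equation, the sketch below fits a multiple linear regression of activity on molecular descriptors. The descriptor values and activities are invented, and the thesis analyses were performed in the TSAR package rather than with this code.

```python
# Generic illustration of deriving a QSAR equation by multiple linear
# regression of activity on molecular descriptors. Values are invented;
# the thesis used the TSAR package, not this code.
import numpy as np
from sklearn.linear_model import LinearRegression

# Rows = compounds; columns = descriptors (e.g. logP, molar refractivity, dipole)
X = np.array([
    [2.1, 45.0, 3.2],
    [3.4, 52.3, 2.1],
    [1.8, 40.1, 4.0],
    [2.9, 49.7, 2.8],
    [3.1, 55.2, 1.9],
])
# Activity expressed as -log(MIC); higher = more potent (hypothetical values)
y = np.array([1.2, 2.0, 0.8, 1.6, 2.2])

model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_)
print("intercept:   ", model.intercept_)
print("r^2:         ", model.score(X, y))
```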

Relevance:

30.00%

Publisher:

Abstract:

A detailed literature survey confirmed cold roll-forming to be a complex and little-understood process. In spite of its growing value, the process remains largely un-automated, with few principles used in setting up the rolling mill. This work concentrates on experimental investigations of operating conditions in order to gain a scientific understanding of the process. The operating conditions are: inter-pass distance, roll load, roll speed and horizontal roll alignment. Fifty tests were carried out under varied operating conditions, measuring section quality and longitudinal strain to give a picture of bending. A channel section was chosen for its simplicity and compatibility with previous work. Quality was measured in terms of vertical bow, twist and cross-sectional geometric accuracy, and a complete method of classifying quality has been devised. The longitudinal strain profile was recorded by strain gauges attached to the strip surface at five locations. Parameter control is shown to be important in achieving consistent section quality. At present, rolling mills are constructed with large tolerances on operating conditions. By reducing the variability in parameters, section consistency is maintained and mill down-time is reduced. Roll load, alignment and differential roll speed are all shown to affect quality, and can be used to control it. Set-up time is reduced by improving the design of the mill so that parameter values can be measured and set without the need for judgment by eye. Parameter values can be guided by models of the process, although elements of experience are still unavoidable. Despite increased parameter control, section quality remains variable, if only because of variability in strip material properties. Parameters must therefore be changed during rolling, ideally by closed-loop feedback control. Future work lies in overcoming the problems connected with this control.

Relevance:

30.00%

Publisher:

Abstract:

The number of fatal accidents in the agricultural, horticultural and forestry industry in Great Britain has declined from an annual rate of about 135 in the 1960s to its current level of about 50. Changes to the size and makeup of the population at risk mean that there has been no real improvement in fatal injury incidence rates for farmers. The Health and Safety Executive's (HSE) current system of accident investigation, recording, and analysis is directed primarily at identifying fault, allocating blame, and punishing wrongdoers. Relatively little information is recorded about the personal and organisational factors that contributed to, or failed to prevent, accidents. To develop effective preventive strategies, it is important to establish whether errors by the victims and others occur at the skills, rules, or knowledge level of functioning; are violations of some rule or procedure; or stem from failures to correctly appraise or control a hazard. A modified version of the Hale and Glendon accident causation model was used to study 230 fatal accidents. Inspectors' original reports were examined and expert judgement applied to identify and categorise the errors committed by each of the parties involved. The highest proportion of errors that led directly to accidents occurred while the victims were operating at the knowledge level. The mix and proportion of errors varied considerably between different classes of victim and kinds of accident. Different preventive strategies will be needed to address the problem areas identified.

Relevance:

30.00%

Publisher:

Abstract:

The northern half of the parish of St. Catherine in Jamaica was selected as a test area in which to study, by means of remote sensing, the problems of soil erosion in a tropical environment. An initial study was carried out to determine whether eroded land within this environment could be successfully interpreted and mapped from the available 1:25,000 scale aerial photographs. Once satisfied that a sufficiently high percentage of the eroded land could be interpreted on the aerial photographs, the main study was initiated. This involved interpreting the air photo cover of the study area to identify and classify land use and eroded land, and plotting the results on overlays on topographic base maps. These overlays were then composited with data on the soils and slopes of the study area. The areas of the different soil type/slope/land use combinations were then measured, as was the area of eroded land for each combination. These data were then analysed in two ways. The first analysis determined which combinations of soil type, slope and land use were most and least eroded and, on that basis, drew up recommendations concerning future land use. The second analysis aimed to determine which of the three factors, soil type, slope and land use, was most responsible for the rate of erosion. Although it was possible to show that slope was not very significant in determining the rate of erosion, it was much more difficult to separate the effects of land use and soil type. The results do, however, suggest that land use is more significant than soil type in determining the rate of erosion within the study area.
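
The two analyses described above amount to tabulating eroded area within each soil/slope/land-use combination and then comparing erosion rates factor by factor. The sketch below illustrates that bookkeeping with pandas on an invented table; the original study measured areas from map overlays rather than from a data frame.

```python
# Illustrative calculation of the proportion of eroded land within each
# soil/slope/land-use combination. The table is invented for demonstration.
import pandas as pd

units = pd.DataFrame({
    "soil":           ["A", "A", "B", "B", "B"],
    "slope":          ["steep", "gentle", "steep", "steep", "gentle"],
    "land_use":       ["pasture", "crops", "crops", "forest", "crops"],
    "area_ha":        [120.0, 80.0, 60.0, 150.0, 90.0],
    "eroded_area_ha": [30.0,  4.0,  18.0,  3.0,   9.0],
})

by_combo = units.groupby(["soil", "slope", "land_use"]).sum(numeric_only=True)
by_combo["pct_eroded"] = 100 * by_combo["eroded_area_ha"] / by_combo["area_ha"]
print(by_combo.sort_values("pct_eroded", ascending=False))

# A crude look at which single factor matters most: compare the spread of
# erosion percentages when grouping by each factor alone.
for factor in ["soil", "slope", "land_use"]:
    totals = units.groupby(factor).sum(numeric_only=True)
    pct = 100 * totals["eroded_area_ha"] / totals["area_ha"]
    print(factor, "range of % eroded:", round(pct.max() - pct.min(), 1))
```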

Relevance:

30.00%

Publisher:

Abstract:

The study evaluated sources of within- and between-subject variability in standard white-on-white (W-W) perimetry and short-wavelength automated perimetry (SWAP). The influence of staircase strategy on the fatigue effect in W-W perimetry was investigated for a 4 dB single step, single reversal strategy; a variable step size, single reversal dynamic strategy; and the standard 4-2 dB double reversal strategy. The fatigue effect increased as the duration of the examination increased and was greatest in the second eye for all strategies. The fatigue effect was lowest for the 4 dB strategy, which exhibited the shortest examination time, and was greatest for the 4-2 dB strategy, which exhibited the longest examination time. Staircase efficiency was lowest for the 4 dB strategy and highest for the dynamic strategy, which thus offers a reduced examination time and low inter-subject variability. The normal between-subject variability of SWAP was determined for the standard 4-2 dB double reversal strategy and the 3 dB single reversal FASTPAC strategy and compared with that of W-W perimetry. The decrease in sensitivity with increase in age was greatest for SWAP. The between-subject variability of SWAP was greater than that of W-W perimetry. Correction for the influence of ocular media absorption reduced the between-subject variability of SWAP. The FASTPAC strategy yielded the lowest between-subject variability in SWAP, but the greatest between-subject variability in W-W perimetry. The greater between-subject variability of SWAP has profound implications for the delineation of visual field abnormality. The fatigue effect for the Full Threshold strategy in SWAP was evaluated with conventional opaque, and with translucent, occlusion of the fellow eye. SWAP exhibited a greater fatigue effect than W-W perimetry. Translucent occlusion reduced the between-subject variability of W-W perimetry but increased the between-subject variability of SWAP. The elevation of sensitivity was greater with translucent occlusion, which has implications for the statistical analysis of W-W perimetry and SWAP. The influence of age-related cataract extraction and IOL implantation upon the visual field derived by W-W perimetry and SWAP was determined. Cataract yielded a general reduction in sensitivity which was preferentially greater in SWAP, even after correction of SWAP for the attenuation of the stimulus by the ocular media. There was no correlation between either backward or forward light scatter and the magnitude of the attenuation of W-W or SWAP sensitivity. The post-operative mean deviation in SWAP was positive, which has ramifications for the statistical interpretation of SWAP. Short-wavelength-sensitive (SWS) pathway isolation was assessed as a function of stimulus eccentricity using the two-colour increment threshold method. At least 15 dB of SWS pathway isolation was achieved for 440 nm, 450 nm and 460 nm stimuli at a background luminance of 100 cd m-2. There was a slight decrease in SWS pathway isolation for all stimulus wavelengths with increasing eccentricity, which was not of clinical significance. Adopting a 450 nm stimulus may reduce between-subject variability in SWAP due to a reduction in ocular media absorption and macular pigment absorption.
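
To make the staircase strategies concrete, the sketch below simulates a 4-2 dB double-reversal threshold staircase of the kind used in full-threshold static perimetry, with a deterministic observer (no false positives or negatives). The observer model, starting level and function name are illustrative assumptions, not the instrument's actual algorithm.

```python
# Toy simulation of a 4-2 dB double-reversal staircase for static perimetry.
# Higher dB means a dimmer stimulus; the deterministic observer sees any
# stimulus at or below its true threshold. Simplified for illustration only.
def staircase_4_2(true_threshold_db: float, start_db: float = 25.0) -> float:
    level = start_db
    step = 4.0
    reversals = 0
    last_seen = None
    previous_response = None

    while reversals < 2:
        seen = level <= true_threshold_db     # dimmer-than-threshold stimuli are missed
        if seen:
            last_seen = level
        if previous_response is not None and seen != previous_response:
            reversals += 1
            step = 2.0                        # step drops to 2 dB after the first reversal
        previous_response = seen
        level += step if seen else -step      # seen -> dimmer (higher dB); missed -> brighter

    return last_seen                          # threshold estimate: last level seen

print(staircase_4_2(true_threshold_db=31.0))  # returns an estimate near 31 dB
```

Shorter strategies such as the 4 dB single-reversal or dynamic staircases terminate earlier, which is why they trade some precision for the reduced examination time and fatigue noted above.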

Relevance:

30.00%

Publisher:

Abstract:

Progressive addition spectacle lenses (PALs) have now become the method of choice for many presbyopic individuals to alleviate the visual problems of middle age. Such lenses are difficult to assess and characterise because they lack discrete reference points locating their key features. A review of the literature (mostly patents) describing the different designs of these lenses indicates the range of approaches to solving the visual problem of presbyopia. However, very little has been published about the comparative optical performance of these lenses. A method based on interferometry for the assessment of PALs is described here, with a comparison against measurements made on an automatic focimeter. The relative merits of the two techniques are discussed. Although the measurements are comparable, the interferometric method is more readily automated and would ultimately be capable of producing a more rapid result.