950 results for Automated Hazard Analysis


Relevance: 30.00%

Publisher:

Abstract:

To automate means, in effect, to replace human job functions with combined man-machine mechanisms: documentation specialists working with computers are the cornerstone of any modern documentation and information system. From this point of view, the immediate problem is deciding which resources should be applied to the specific problem in each specific case. We will not attempt to propose quick fixes or recipes for deciding what to do in every case; the solution must be worked out anew for each particular problem. What we want is to raise some points that can serve as a basis for reflection and help find the best possible solution once the problem has been correctly defined. The first thing to do before starting any automated-system project is to define exactly the domain it is to cover and to assess its importance as precisely as possible.

Relevance: 30.00%

Publisher:

Abstract:

Background: Statistical analysis of DNA microarray data provides a valuable diagnostic tool for the investigation of genetic components of diseases. To take advantage of the multitude of available data sets and analysis methods, it is desirable to combine both different algorithms and data from different studies. Applying ensemble learning, consensus clustering and cross-study normalization methods for this purpose in an almost fully automated process and linking different analysis modules together under a single interface would simplify many microarray analysis tasks. Results: We present ArrayMining.net, a web-application for microarray analysis that provides easy access to a wide choice of feature selection, clustering, prediction, gene set analysis and cross-study normalization methods. In contrast to other microarray-related web-tools, multiple algorithms and data sets for an analysis task can be combined using ensemble feature selection, ensemble prediction, consensus clustering and cross-platform data integration. By interlinking different analysis tools in a modular fashion, new exploratory routes become available, e.g. ensemble sample classification using features obtained from a gene set analysis and data from multiple studies. The analysis is further simplified by automatic parameter selection mechanisms and linkage to web tools and databases for functional annotation and literature mining. Conclusion: ArrayMining.net is a free web-application for microarray analysis combining a broad choice of algorithms based on ensemble and consensus methods, using automatic parameter selection and integration with annotation databases.
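The abstract does not specify the aggregation rule behind ArrayMining's ensemble feature selection; a common baseline is mean-rank (Borda) aggregation of several selectors' scores, sketched below with hypothetical gene names and scores (the selector names and values are illustrative only):

```python
# Minimal sketch of ensemble feature selection by rank aggregation.
# The mean-rank (Borda) rule is an assumption for illustration;
# ArrayMining's actual combination scheme may differ.

def rank_features(scores):
    """Rank per feature (0 = best) from a {feature: score} dict,
    where a higher score means more relevant."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {f: r for r, f in enumerate(ordered)}

def ensemble_rank(score_sets):
    """Aggregate several selectors' scores by mean rank."""
    ranks = [rank_features(s) for s in score_sets]
    features = score_sets[0].keys()
    mean_rank = {f: sum(r[f] for r in ranks) / len(ranks) for f in features}
    return sorted(features, key=mean_rank.get)  # best first

# Two hypothetical selectors scoring three genes:
t_like   = {"geneA": 2.1, "geneB": 0.3, "geneC": 1.4}
corr_abs = {"geneA": 0.9, "geneB": 0.2, "geneC": 0.7}
print(ensemble_rank([t_like, corr_abs]))  # geneA ranked first
```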

Relevance: 30.00%

Publisher:

Abstract:

As one of the newest members in the field of artificial immune systems (AIS), the Dendritic Cell Algorithm (DCA) is based on behavioural models of natural dendritic cells (DCs). Unlike other AIS, the DCA does not rely on training data; instead, domain or expert knowledge is required to predetermine the mapping between input signals from a particular instance and the three categories used by the DCA. This data preprocessing phase has received the criticism of having manually over-fitted the data to the algorithm, which is undesirable. Therefore, in this paper we have attempted to ascertain whether it is possible to use principal component analysis (PCA) techniques to automatically categorise input data while still generating useful and accurate classification results. The integrated system is tested with a biometrics dataset for the stress recognition of automobile drivers. The experimental results have shown that the application of PCA to the DCA for the purpose of automated data preprocessing is successful.
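As a rough illustration of the idea, the sketch below extracts the dominant principal component from toy 2-D signal data using power iteration (pure Python); how the resulting components would be mapped to the DCA's three signal categories is not specified here and would follow the paper's own procedure:

```python
import math

def first_pc(data):
    """First principal component of mean-centred rows via power iteration
    on the sample covariance matrix (a minimal stand-in for a PCA library)."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    X = [[row[j] - means[j] for j in range(d)] for row in data]
    # Sample covariance matrix (d x d)
    C = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
          for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(200):  # converges to the dominant eigenvector
        w = [sum(C[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v, means

def project(row, pc, means):
    """Score of one instance along the principal component."""
    return sum((x - m) * c for x, m, c in zip(row, means, pc))

# Example: 2-D signal data varying only along the first axis
data = [[1.0, 0.0], [2.0, 0.0], [3.0, 0.0], [4.0, 0.0]]
pc, mu = first_pc(data)
print([round(abs(c), 3) for c in pc])  # dominant axis recovered: [1.0, 0.0]
```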

Relevance: 30.00%

Publisher:

Abstract:

Toppling analysis of a precariously balanced rock (PBR) can provide insights into the nature of ground motion that has not occurred at that location in the past and, by extension, realistic constraints on peak ground motions for use in engineering design. Earlier approaches have targeted simplistic 2-D models of the rock or modeled the rock-pedestal contact using spring-damper assemblies that require re-calibration for each rock. These analyses also assume that the rock does not slide on the pedestal. Here, a method to model PBRs in three dimensions is presented. The 3-D model is created from a point cloud of the rock, the pedestal, and their interface, obtained using Terrestrial Laser Scanning (TLS). The dynamic response of the model under earthquake excitation is simulated using a rigid body dynamics algorithm. The veracity of this approach is demonstrated by comparisons against data from shake table experiments. Fragility maps for toppling probability of the Echo Cliff PBR and the Pacifico PBR as a function of various ground motion parameters, rock-pedestal interface friction coefficient, and excitation direction are presented. The seismic hazard at these PBR locations is estimated using these maps. Additionally, these maps are used to assess whether the synthetic ground motions at these locations resulting from scenario earthquakes on the San Andreas Fault are realistic (toppling would indicate that the ground motions are unrealistically high).
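For contrast with the paper's full 3-D rigid-body simulation, the classical quasi-static 2-D criterion (a rigid rectangular block begins to tip about its base edge once horizontal acceleration exceeds g·b/h) can be sketched as follows; it ignores sliding, rocking dynamics, and 3-D interface geometry, all of which the paper models:

```python
G = 9.81  # gravitational acceleration, m/s^2

def quasi_static_toppling_pga(half_width, half_height):
    """Minimum horizontal acceleration (m/s^2) that starts tipping a rigid
    rectangular block about its base edge, under the quasi-static
    criterion a >= g * b/h (no sliding, no dynamic rocking)."""
    return G * half_width / half_height

# A slender 1 m wide x 3 m tall block starts rocking near 0.33 g:
pga = quasi_static_toppling_pga(0.5, 1.5)
print(round(pga / G, 2))  # ~0.33
```

Real PBRs survive short pulses above this threshold, which is exactly why the dynamic 3-D analysis in the abstract is needed.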

Relevance: 30.00%

Publisher:

Abstract:

The Mount Meager Volcanic Complex (MMVC) in south-western British Columbia is a potentially active, hydrothermally altered massif comprising a series of steep, glaciated peaks. Climatic conditions and glacial retreat have led to further weathering, exposure and de-buttressing of steep slopes composed of weak, unconsolidated material. This has resulted in an increased frequency of landslide events over the past few decades, many of which have dammed the rivers bordering the Complex. The breach of these debris dams presents a risk of flooding to the downstream communities. Preliminary mapping showed there are numerous sites around the Complex where future failure could occur. Some of these areas are currently undergoing progressive slope movement and display supporting features such as anti-scarps and tension cracks. The effect of water infiltration on stability was modelled using the Rocscience program Slide 6.0. The main site of focus was Mount Meager in the south-east of the Complex, where the most recent landslide took place. Two profiles through Mount Meager were analysed, along with one other location in the northern section of the MMVC where instability had been detected. The lowest Factor of Safety (FOS) for each profile was determined and an estimate of the volume that could be generated was deduced. A hazard map showing the inundation zones for various volumes of debris flows was created from simulations using LAHARZ. Results showed the massif is unstable, even before infiltration. Varying the amount of infiltration appears to have no significant impact on the FOS annually, implying that small changes of any kind could also trigger failure. Further modelling could be done to assess the impact of infiltration over shorter time scales.
The Slide models show the volume of material that could be delivered to the Lillooet River Valley to be of the order of 10⁹ m³ which, based on the LAHARZ simulations, would completely inundate the valley and communities downstream. A major hazard is that the removal of such a large amount of material has the potential to trigger an explosive eruption of the geothermal system and renew volcanic activity. Although events of this size are infrequent, there is a significant risk to the communities downstream of the complex.
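LAHARZ delineates inundation zones from semi-empirical scaling laws; the published lahar calibration (cross-sectional area A = 0.05·V^(2/3), planimetric area B = 200·V^(2/3); Iverson et al., 1998) gives a sense of scale for a 10⁹ m³ event, though the thesis's exact LAHARZ configuration is not stated in the abstract:

```python
def laharz_areas(volume_m3):
    """LAHARZ semi-empirical lahar inundation scaling:
    cross-sectional area A = 0.05 * V^(2/3),
    planimetric area    B = 200  * V^(2/3).
    These coefficients are the published lahar calibration
    (debris-flow and rock-avalanche calibrations differ)."""
    v23 = volume_m3 ** (2.0 / 3.0)
    return 0.05 * v23, 200.0 * v23

# The ~10^9 m^3 scenario from the Slide models:
A, B = laharz_areas(1e9)
print(B / 1e6)  # planimetric inundation of ~200 km^2
```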

Relevance: 30.00%

Publisher:

Abstract:

Objectives: To explore the relationship between caregiver characteristics and the adequacy of domestic swimming pool fencing. Setting: A typical metropolitan area of a large Australian capital city, Brisbane. Methods: In a reanalysis of the dataset of the 1989 Brisbane Home Safety Survey of 1050 householders, associations between 10 caregiver factors, pool ownership, and quality of pool fencing were analysed. Household characteristics relating to toddlers (children ≤ 4 years) and socioeconomic measures were also included in the analyses. Pool fencing quality was measured on an ordinal scale derived from Australian Standards Association guidelines and confirmed through home visits by trained inspectors. Results: Caregiver factors did not distinguish households with a swimming pool from those without, nor were they associated with adequacy of pool fencing among pool owners. Pool owners, with or without children, were less likely to perceive having a childproof fence as important. The strongest correlates of adequacy of pool fencing were socioeconomic indicators of the surrounding districts. Conclusions: These results do not support the argument of opponents of compulsory pool fencing that caregiver factors are adequate to prevent toddler drownings and obviate the need for a pool fence. Pool owners do not appear to perceive their pool as a hazard for young children, and complacency about the adequacy of pool fencing needs to be replaced by increased caregiver health beliefs, skills, and perceptions. Injury Prevention 1997;3(4):257-61.

Relevance: 30.00%

Publisher:

Abstract:

This dissertation focused on the longitudinal analysis of business start-ups using three waves of data from the Kauffman Firm Survey. The first essay used the data from years 2004-2008 and examined the simultaneous relationship between a firm's capital structure, its human resource policies, and their impact on the level of innovation. Firm leverage was calculated as debt divided by total financial resources. An index of employee well-being was determined from a set of nine dichotomous questions asked in the survey. A negative binomial fixed effects model was used to analyze the effect of employee well-being and leverage on the count of patents and copyrights, which were used as a proxy for innovation. The paper demonstrated that employee well-being positively affects a firm's innovation, while a higher leverage ratio has a negative impact on innovation. No significant relation was found between leverage and employee well-being. The second essay used the data from years 2004-2009 and inquired whether a higher entrepreneurial speed of learning is desirable, and whether there is a linkage between the speed of learning and the growth rate of the firm. The change in the speed of learning was measured using a pooled OLS estimator on repeated cross-sections. There was evidence of a declining speed of learning over time, and it was concluded that a higher speed of learning is not necessarily a good thing, because the speed of learning is contingent on the entrepreneur's initial knowledge and the precision of the signals he receives from the market. Also, there was no reason to expect the speed of learning to be related to the growth of the firm in one direction over another. The third essay used the data from years 2004-2010 and determined the timing of diversification activities by business start-ups.
It captured when a start-up diversified for the first time, and explored the association between an early diversification strategy adopted by a firm, and its survival rate. A semi-parametric Cox proportional hazard model was used to examine the survival pattern. The results demonstrated that firms diversifying at an early stage in their lives show a higher survival rate; however, this effect fades over time.
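The essay's semi-parametric Cox model is beyond a short sketch, but the related non-parametric Kaplan-Meier estimator illustrates how firm survival curves of the kind described can be computed; the exit times below are made up:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate. times: follow-up in years;
    events: 1 = firm exited, 0 = censored. Returns [(t, S(t))] at exit times."""
    at_risk = len(times)
    data = sorted(zip(times, events))
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = n_at_t = 0
        while i < len(data) and data[i][0] == t:  # group ties at time t
            n_at_t += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= n_at_t
    return curve

# Hypothetical follow-up for five start-ups (years, exited?):
print(kaplan_meier([1, 2, 2, 3, 5], [1, 1, 0, 0, 1]))
```

Fitting one curve for early diversifiers and one for late diversifiers would reproduce, informally, the comparison the essay makes with the Cox model.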

Relevance: 30.00%

Publisher:

Abstract:

This dissertation examines the quality of hazard mitigation elements in a coastal, hazard-prone state. I answer two questions. First, in a state with a strong mandate for hazard mitigation elements in comprehensive plans, does plan quality differ among county governments? Second, if such variation exists, what drives it? My research focuses primarily on Florida's 35 coastal counties, which are all at risk for hurricane and flood hazards, and all fall under Florida's mandate to have a comprehensive plan that includes a hazard mitigation element. Research methods included document review to rate the hazard mitigation elements of all 35 coastal county plans and subsequent analysis against demographic and hazard history factors. Following this, I conducted an electronic, nationwide survey of planning professionals and academics, informed by interviews with planning leaders in Florida counties. I found that hazard mitigation element quality varied widely among the 35 Florida coastal counties but was close to a normal distribution. No plans were of exceptionally high quality. Overall, historical hazard effects did not correlate with hazard mitigation element quality, but some demographic variables associated with urban populations did. The variance in hazard mitigation element quality indicates that while state law may mandate, and even prescribe, hazard mitigation in local comprehensive plans, not all plans will result in equal, or even adequate, protection for people. Furthermore, the mixed correlations with demographic variables representing social and disaster vulnerability show that, at least at the county level, vulnerability to hazards does not have a strong effect on hazard mitigation element quality. From a theory perspective, my research is significant because it compares assumptions about vulnerability based on hazard history and demographics to plan quality.
The only vulnerability-related variables that appeared to correlate with hazard mitigation element quality, and then only mildly, were those typically representing more urban areas. In terms of the theory of Neo-Institutionalism and theories related to learning organizations, my research shows that planning departments appear to have set norms and rules of operation that preclude both significant public involvement and learning from prior hazard events.

Relevance: 30.00%

Publisher:

Abstract:

Thesis (Ph.D., Computing) -- Queen's University, 2016-09-30

Relevance: 30.00%

Publisher:

Abstract:

Despite the development of improved performance test protocols by renowned researchers, there are still road networks that experience premature cracking and failure. One area of major concern in asphalt science and technology, especially in cold regions of Canada, is thermal (low-temperature) cracking. Usually, right after winter periods, severe cracks are seen on poorly designed road networks. Quality assurance tests based on improved asphalt performance protocols have been implemented by government agencies to ensure that roads being constructed are at the required standard, but asphalt binders that pass these quality assurance tests still crack prematurely. While it would be easy to question the competence of the quality assurance test protocols, it should be noted that the performance tests used and repeated in this study, namely the extended bending beam rheometer (EBBR) test, the double-edge-notched tension (DENT) test, the dynamic shear rheometer (DSR) test and X-ray fluorescence (XRF) analysis, have all been verified and proven to successfully predict asphalt pavement behaviour in the field. Hence, this study sought to probe and test the quality and authenticity of the asphalt binders being used for road paving. It covered the thermal cracking and physical hardening phenomena by comparing results from testing asphalt binder samples obtained from the storage 'tank' prior to paving (tank samples) and recovered samples for the same contracts, with the aim of explaining why asphalt binders that have passed quality assurance tests are still prone to premature failure. The study also attempted to find out whether the short testing time and automated procedure of torsion bar experiments can replace the established but tedious procedure of the EBBR. In the end, it was discovered that significant differences in performance and composition exist between tank and recovered samples for the same contracts.
Torsion bar experimental data also indicated some promise in predicting physical hardening.

Relevance: 30.00%

Publisher:

Abstract:

Background: To identify those characteristics of self-management interventions in patients with heart failure (HF) that are effective in influencing health-related quality of life, mortality, and hospitalizations. Methods and Results: Randomized trials on self-management interventions conducted between January 1985 and June 2013 were identified and individual patient data were requested for meta-analysis. Generalized mixed effects models and Cox proportional hazard models including frailty terms were used to assess the relation between characteristics of interventions and health-related outcomes. Twenty randomized trials (5624 patients) were included. Longer intervention duration reduced mortality risk (hazard ratio 0.99, 95% confidence interval [CI] 0.97–0.999 per month increase in duration), risk of HF-related hospitalization (hazard ratio 0.98, 95% CI 0.96–0.99), and HF-related hospitalization at 6 months (risk ratio 0.96, 95% CI 0.92–0.995). Although results were not consistent across outcomes, interventions comprising standardized training of interventionists, peer contact, log keeping, or goal-setting skills appeared less effective than interventions without these characteristics. Conclusion: No specific program characteristics were consistently associated with better effects of self-management interventions, but longer duration seemed to improve the effect of self-management interventions on several outcomes. Future research using factorial trial designs and process evaluations is needed to understand the working mechanism of specific program characteristics of self-management interventions in HF patients.
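Under the proportional-hazards assumption, the reported per-month hazard ratio compounds multiplicatively with duration, so a 12-month difference implies roughly 0.99¹² ≈ 0.89 (an illustrative extrapolation, not a result reported by the meta-analysis):

```python
hr_per_month = 0.99  # reported mortality hazard ratio per month of duration

def cumulative_hr(months, hr=hr_per_month):
    """Proportional-hazards extrapolation: the hazard ratio for a duration
    difference of `months` is hr ** months (assumes the log-hazard is
    linear in duration, as in the Cox model)."""
    return hr ** months

print(round(cumulative_hr(12), 3))  # 12 extra months -> HR ~0.886
```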

Relevance: 30.00%

Publisher:

Abstract:

The present thesis focuses on the on-fault slip distribution of large earthquakes in the framework of tsunami hazard assessment and tsunami warning improvement. It is widely known that ruptures on seismic faults are strongly heterogeneous. In the case of tsunamigenic earthquakes, the slip heterogeneity strongly influences the spatial distribution of the largest tsunami effects along the nearest coastlines. Unfortunately, after an earthquake occurs, the so-called finite-fault models (FFM) describing the coseismic on-fault slip pattern become available over time scales that are incompatible with early tsunami warning purposes, especially in the near field. Our work aims to characterize the slip heterogeneity in a fast, but still suitable, way. Using finite-fault models to build a starting dataset of seismic events, the characteristics of the fault planes are studied with respect to the magnitude. The patterns of the slip distribution on the rupture plane, analysed with a cluster identification algorithm, reveal a preferential single-asperity representation that can be approximated by a two-dimensional Gaussian slip distribution (2D GD). The goodness of the 2D GD model is compared to other distributions used in the literature, and its ability to represent the slip heterogeneity in the form of the main asperity is demonstrated. The magnitude dependence of the 2D GD parameters is investigated and turns out to be of primary importance from an early warning perspective. The Gaussian model is applied to the 16 September 2015 Illapel, Chile, earthquake and used to compute early tsunami predictions that compare satisfactorily with the available observations. The fast computation of the 2D GD and its suitability in representing the slip complexity of the seismic source make it a useful tool for tsunami early warning assessments, especially with regard to the near field.
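A minimal sketch of the 2D GD idea: slip peaks at the asperity centre and decays as a two-dimensional Gaussian, from which a seismic moment and moment magnitude follow. The grid dimensions, rigidity value, and peak slip below are assumptions for illustration, not parameters from the thesis:

```python
import math

def gaussian_slip(x, y, d_max, x0, y0, sx, sy):
    """2-D Gaussian slip model: peak slip d_max (m) at the asperity centre
    (x0, y0), decaying with along-strike / along-dip widths sx, sy."""
    return d_max * math.exp(-((x - x0) ** 2 / (2 * sx ** 2)
                              + (y - y0) ** 2 / (2 * sy ** 2)))

def moment_magnitude(slip_grid, cell_area, mu=3.0e10):
    """Seismic moment M0 = mu * sum(slip * area); Mw = (2/3)(log10 M0 - 9.1).
    mu = 30 GPa is a typical crustal rigidity (an assumption here)."""
    m0 = mu * cell_area * sum(sum(row) for row in slip_grid)
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

# Hypothetical 100 km x 50 km fault, 1 km cells, 10 m peak slip:
grid = [[gaussian_slip(x, y, 10.0, 50.0, 25.0, 15.0, 8.0)
         for x in range(100)] for y in range(50)]
print(round(moment_magnitude(grid, 1.0e6), 1))  # roughly Mw 7.5
```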

Relevance: 30.00%

Publisher:

Abstract:

Three-dimensional (3D) printers for continuous-fiber-reinforced composites, such as the MarkTwo (MT) by Markforged, can be used to manufacture flexible elements and compliant mechanisms (CMs). To date, research works devoted to the study and application of flexible elements and CMs realized with the MT printer are few and very recent. A good numerical and/or analytical tool for analysing the mechanical behavior of these new composites is still missing. In addition, there is still a gap in obtaining the material properties (e.g. the elastic modulus), which are usually unknown and sensitive to the printing parameters used (e.g. infill density), making numerical simulation inaccurate. Consequently, this thesis presents several developments. The first is a preliminary investigation of the tensile and flexural response of Straight Beam Flexures (SBF) realized with the MT printer and featuring different interlayer fiber volume fractions and orientations, as well as different laminate positions within the sample. The second is a numerical analysis within the Carrera Unified Formulation (CUF) framework, based on a component-wise (CW) approach, including a novel preprocessing tool developed to account for all printed regions in an easy and time-efficient way. Among its benefits, the CUF-CW approach enables building an accurate database of first natural frequencies and mode shapes, from which Young's modulus is predicted based on an inverse problem formulation. To validate the tool, the numerical results are compared to experimental natural frequencies evaluated using a digital image correlation method. Further, the CUF-CW model is combined with static condensation to analyze smart structures that can be decomposed into a large number of similar components. Third, the potential of the MT printer in combination with topology optimization and compliant joints design (CJD) is investigated for the realization of automated machinery mechanisms subjected to inertial loads.
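The inverse prediction of Young's modulus from measured first natural frequencies can be illustrated far more simply than with a CUF-CW model, using the closed-form Euler-Bernoulli cantilever; the material and geometry values below are made up:

```python
import math

LAMBDA1 = 1.8751  # first eigenvalue of a cantilever in Euler-Bernoulli theory

def cantilever_f1(E, rho, L, b, h):
    """First natural frequency (Hz) of a rectangular cantilever beam:
    f1 = (lambda1^2 / 2*pi) * sqrt(E*I / (rho*A*L^4))."""
    I = b * h ** 3 / 12.0   # second moment of area
    A = b * h               # cross-section area
    return (LAMBDA1 ** 2 / (2 * math.pi)) * math.sqrt(E * I / (rho * A * L ** 4))

def youngs_modulus_from_f1(f1, rho, L, b, h):
    """Inverse problem: recover E from a measured first frequency."""
    I = b * h ** 3 / 12.0
    A = b * h
    return (2 * math.pi * f1 / LAMBDA1 ** 2) ** 2 * rho * A * L ** 4 / I

# Round-trip check with made-up properties for a printed coupon:
E_true = 6.0e9  # Pa (hypothetical)
f1 = cantilever_f1(E_true, 1200.0, 0.15, 0.02, 0.003)
print(youngs_modulus_from_f1(f1, 1200.0, 0.15, 0.02, 0.003) / 1e9)  # ~6.0
```

The thesis does the same inversion with far richer CUF-CW models, where no closed form exists and a database of computed frequencies replaces the formula.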

Relevance: 30.00%

Publisher:

Abstract:

Biomedicine is a highly interdisciplinary research area at the interface of the sciences, anatomy, physiology, and medicine. In the last decade, biomedical studies have been greatly enhanced by the introduction of new technologies and techniques for automated quantitative imaging, considerably advancing the possibility of investigating biological phenomena through image analysis. However, the effectiveness of this interdisciplinary approach is bounded by the limited knowledge that a biologist and a computer scientist, by professional training, have of each other's fields. A possible remedy lies in training biologists to become interdisciplinary researchers able to develop dedicated image processing and analysis tools by exploiting a content-aware approach. The aim of this thesis is to show the effectiveness of a content-aware approach to automated quantitative imaging through its application to different biomedical studies, with the secondary purpose of motivating researchers to invest in interdisciplinarity. This content-aware approach was applied first to the phenomization of tumour cell response to stress by confocal fluorescence imaging, and second to the texture analysis of trabecular bone microarchitecture in micro-CT scans. Third, the approach served the characterization of new 3-D multicellular spheroids of human stem cells and the investigation of the role of the Nogo-A protein in tooth innervation. Finally, the content-aware approach also prompted the development of two novel methods for local image analysis and colocalization quantification. In conclusion, the content-aware approach has proved its benefit by building new methods that improve the quality of image analysis and strengthen the statistical significance needed to unveil biological phenomena. Hopefully, this thesis will contribute to inspiring researchers to strive for interdisciplinarity.
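The thesis's novel colocalization methods are not described in the abstract; the standard Manders coefficients below are the usual baseline such methods are compared against (the intensity values and zero thresholds are illustrative):

```python
def manders_m1_m2(red, green, thr_r=0.0, thr_g=0.0):
    """Manders colocalization coefficients for two intensity channels:
    M1 = fraction of red intensity falling in pixels where green > thr_g,
    M2 = fraction of green intensity falling in pixels where red > thr_r."""
    m1_num = sum(r for r, g in zip(red, green) if g > thr_g)
    m2_num = sum(g for r, g in zip(red, green) if r > thr_r)
    return m1_num / sum(red), m2_num / sum(green)

# Four pixels of a hypothetical two-channel image, flattened:
red   = [5, 0, 3, 2]
green = [4, 1, 0, 2]
print(manders_m1_m2(red, green))  # (0.7, ~0.857)
```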