999 results for accident models


Relevance:

20.00%

Publisher:

Abstract:

To enhance workplace safety in the construction industry, it is important to understand the interrelationships among safety risk factors associated with construction accidents. This study incorporates systems theory into Heinrich's domino theory to explore the interrelationships of risks and break the chain of accident causation. Through both empirical and statistical analyses of 9,358 accidents that occurred in the U.S. construction industry between 2002 and 2011, the study investigates relationships between accidents and injury elements (e.g., injury type, part of body, injury severity) and the nature of construction injuries by accident type. The study then discusses relationships between accidents and risks, including worker behavior, injury source, and environmental condition, and identifies the key risk factors and risk combinations that cause accidents. The research outcomes will assist safety managers in prioritizing risks according to the likelihood of accident occurrence and injury characteristics, and in balancing significant risk relationships to prevent accidents and achieve safer working environments.
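
The association analysis described above can be illustrated with a contingency-table test. Below is a minimal sketch; the records, column names, and categories are invented stand-ins for the study's 9,358-accident dataset, and a chi-square independence test is only one plausible choice among the "empirical and statistical analyses" the abstract mentions.

```python
# Hypothetical mini-dataset standing in for the 2002-2011 accident records.
import pandas as pd
from scipy.stats import chi2_contingency

records = pd.DataFrame({
    "accident_type":   ["fall", "fall", "struck-by", "caught-in",
                        "fall", "struck-by", "caught-in", "fall"],
    "injury_severity": ["fatal", "hospitalized", "non-hospitalized",
                        "hospitalized", "fatal", "non-hospitalized",
                        "fatal", "hospitalized"],
})

# Cross-tabulate accident type against injury severity...
table = pd.crosstab(records["accident_type"], records["injury_severity"])

# ...and test whether injury profiles differ across accident types.
chi2, p, dof, expected = chi2_contingency(table)
print(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")
```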

Relevance:

20.00%

Publisher:

Abstract:

Developers and policy makers are consistently at odds in the debate over whether impact fees increase house prices. This debate continues despite the extensive body of theoretical and empirical international literature that discusses the passing on of impact fees to home buyers and the corresponding increase in housing prices. In attempting to quantify this impact, over a dozen empirical studies have been carried out in the US and Canada since the 1980s. However, the methodologies used vary greatly, as do the results. Despite similar infrastructure funding policies in numerous developed countries, no such empirical works exist outside the US and Canada. The purpose of this research is to analyse the existing econometric models in order to identify, compare and contrast the theoretical bases, methodologies, key assumptions and findings of each. This research will assist in identifying whether further model development is required and/or whether any of these models have external validity and are readily transferable outside the US. The findings conclude that there is very little explicit rationale behind the various model selections and that significant model deficiencies still appear to exist.

Relevance:

20.00%

Publisher:

Abstract:

A synthesis is presented of the predictive capability of a family of near-wall wall-normal free Reynolds stress models (which are completely independent of wall topology, i.e., of the distance from the wall and the normal-to-the-wall orientation) for oblique-shock-wave/turbulent-boundary-layer interactions. For the purpose of comparison, results are also presented using a standard low-turbulence-Reynolds-number k–ε closure and a Reynolds stress model that uses geometric wall normals and wall distances. Studied shock-wave Mach numbers are in the range M_SW = 2.85–2.9 and incoming boundary-layer-thickness Reynolds numbers are in the range Re_δ0 = 1–2×10^6. Computations were carefully checked for grid convergence. Comparison with measurements shows satisfactory agreement, improving on results obtained using a k–ε model, and highlights the relative importance of the redistribution and diffusion closures, indicating directions for future modeling work.
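
As context for the baseline closure mentioned above (a textbook relation, not a result of this paper), the k–ε model replaces the individual Reynolds stresses with an isotropic eddy viscosity built from the turbulence kinetic energy and its dissipation rate:

```latex
\nu_t = C_\mu \frac{k^2}{\varepsilon}, \qquad C_\mu \approx 0.09
```

Reynolds stress models instead solve a transport equation for each stress component, which is why the redistribution and diffusion closures singled out in the abstract become the decisive modeling choices.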

Relevance:

20.00%

Publisher:

Abstract:

The use of Bayesian methodologies for solving optimal experimental design problems has increased. Many of these methods have been found to be computationally intensive for design problems that require a large number of design points. This paper presents a simulation-based approach that can be used to solve optimal design problems in which one is interested in finding a large number of (near) optimal design points for a small number of design variables. The approach involves the use of lower-dimensional parameterisations, consisting of a few design variables, that generate multiple design points. Using this approach, one simply has to search over a few design variables rather than over a large number of optimal design points, providing substantial computational savings. The methodologies are demonstrated on four applications involving nonlinear models, including the selection of sampling times for pharmacokinetic and heat transfer studies. Several Bayesian design criteria are compared and contrasted, as are several different lower-dimensional parameterisation schemes for generating the many design points.
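
The dimension-reduction idea lends itself to a short sketch. Everything below is an assumption for illustration: the geometric spacing rule, the parameter ranges, and the toy prior-predictive-variance utility stand in for whichever parameterisation scheme and Bayesian design criterion the paper actually compares.

```python
# 15 sampling times are generated from just two design variables
# (t0, ratio), so the search runs over a 2-D space instead of a 15-D one.
import numpy as np

N_TIMES = 15

def design_points(t0, ratio):
    """Geometrically spaced sampling times from two design variables."""
    return t0 * ratio ** np.arange(N_TIMES)

def utility(times, rng, n_sims=200):
    """Toy simulation-based criterion: prior-predictive variance."""
    theta = rng.lognormal(sigma=0.5, size=n_sims)   # draws from a prior
    y = np.exp(-np.outer(theta, times))             # toy PK-style model
    return y.var(axis=0).sum()  # prefer times where prior draws disagree

rng = np.random.default_rng(0)
best = max(
    ((t0, r) for t0 in np.linspace(0.1, 2.0, 20)
             for r in np.linspace(1.1, 2.0, 20)),
    key=lambda p: utility(design_points(*p), rng),
)
print("best (t0, ratio):", best)
print("design points:", design_points(*best).round(2))
```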

Relevance:

20.00%

Publisher:

Abstract:

This chapter is a tutorial that teaches you how to design extended finite state machine (EFSM) test models for a system that you want to test. EFSM models are more powerful and expressive than simple finite state machine (FSM) models, and are one of the most commonly used styles of models for model-based testing, especially for embedded systems. There are many languages and notations in use for writing EFSM models, but in this tutorial we write our EFSM models in the familiar Java programming language. To generate tests from these EFSM models we use ModelJUnit, which is an open-source tool that supports several stochastic test generation algorithms, and we also show how to write your own model-based testing tool. We show how EFSM models can be used for unit testing and system testing of embedded systems, and for offline testing as well as online testing.
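
The tutorial's models are written in Java against ModelJUnit, so the sketch below is purely conceptual: a Python rendering, with invented state variables and actions, of what makes a state machine "extended", namely guarded actions over data variables, plus the random-walk style of test generation that stochastic tools automate.

```python
# Conceptual EFSM sketch (the tutorial itself uses Java + ModelJUnit).
# The state is extended with a data variable (balance); each action
# carries a guard saying when it may fire.
import random

class AccountModel:
    def __init__(self):
        self.reset()

    def reset(self):
        self.open = False
        self.balance = 0

    def actions(self):
        """(guard, action) pairs, mirroring guarded-action EFSM style."""
        return [
            (not self.open,                   self.open_account),
            (self.open,                       self.deposit),
            (self.open and self.balance > 0,  self.withdraw),
            (self.open and self.balance == 0, self.close_account),
        ]

    def open_account(self):  self.open = True
    def deposit(self):       self.balance += 10
    def withdraw(self):      self.balance -= 10
    def close_account(self): self.open = False

# Random-walk test generation: repeatedly fire one enabled action.
model, rng = AccountModel(), random.Random(1)
for _ in range(8):
    enabled = [action for guard, action in model.actions() if guard]
    action = rng.choice(enabled)
    action()
    print(f"{action.__name__:13s} open={model.open} balance={model.balance}")
```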

Relevance:

20.00%

Publisher:

Abstract:

Readily accepted knowledge regarding crash causation is consistently omitted from efforts to model, and subsequently understand, motor vehicle crash occurrence and its contributing factors. For instance, distracted and impaired driving account for a significant proportion of crash occurrence, yet are rarely modeled explicitly. In addition, spatially allocated influences such as local law enforcement efforts, proximity to bars and schools, and chronic roadside distractions (advertising, pedestrians, etc.) play a role in contributing to crash occurrence, yet are routinely absent from crash models. By and large, these well-established omitted effects are simply assumed to contribute to model error, with the predominant focus on modeling the engineering and operational effects of transportation facilities (e.g., AADT, number of lanes, speed limits, width of lanes, etc.). The typical analytical approach, with a variety of statistical enhancements, has been to model crashes that occur at system locations as negative binomial (NB) distributed events that arise from a singular, underlying crash-generating process. These models and their statistical kin dominate the literature; however, it is argued in this paper that these models fail to capture the underlying complexity of motor vehicle crash causes, and thus thwart deeper insights regarding crash causation and prevention. This paper first describes hypothetical scenarios that collectively illustrate why current models mislead highway safety researchers and engineers. It is argued that current model shortcomings are significant and will lead to poor decision-making. Exploiting our current state of knowledge of crash causation, crash counts are postulated to arise from three processes: observed network features, unobserved spatial effects, and 'apparent' random influences that largely reflect behavioral influences of drivers. It is argued, furthermore, that these three processes can in theory be modeled separately to gain deeper insight into crash causes, and that the resulting model represents a more realistic depiction of reality than the state-of-practice NB regression. An admittedly imperfect empirical model that mixes three independent crash occurrence processes is shown to outperform the classical NB model. The questioning of current modeling assumptions and the implications of the latent mixture model for current practice are the most important contributions of this paper, with an initial but rather vulnerable attempt to model the latent mixtures as a secondary contribution.
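
The overdispersion argument can be made concrete with a few lines of simulation. The rates and the gamma mixing below are invented for illustration; the point is only that counts assembled from several latent processes carry more variance than a single homogeneous process would predict, which is what single-process models sweep into the error term.

```python
# Simulate crash counts arising from three latent processes per site.
import numpy as np

rng = np.random.default_rng(42)
n_sites = 5000

network   = rng.poisson(1.5, n_sites)                  # engineered features
spatial   = rng.poisson(rng.gamma(2.0, 0.5, n_sites))  # site-varying spatial rate
behaviour = rng.poisson(0.8, n_sites)                  # driver behaviour

crashes = network + spatial + behaviour

# A single Poisson process would force variance == mean; the mixture
# shows the excess dispersion that NB models absorb without explaining.
print("mean:", crashes.mean().round(2), " variance:", crashes.var().round(2))
```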

Relevance:

20.00%

Publisher:

Abstract:

This study investigated the specificity of the post-concussion syndrome (PCS) expectation-as-etiology hypothesis. Undergraduate students (n = 551) were randomly allocated to one of three vignette conditions. The vignettes depicted either a very mild (VMI), mild (MI), or moderate-to-severe (MSI) motor vehicle-related traumatic brain injury (TBI). Participants reported the PCS and PTSD symptoms that they imagined the depicted injury would produce. Secondary outcomes (knowledge of mild TBI and the perceived undesirability of TBI) were also assessed. After data screening, the distribution of participants by condition was: VMI (n = 100), MI (n = 96), and MSI (n = 71). There was a significant effect of condition on PCS symptomatology, F(2, 264) = 16.55, p < .001. Significantly greater PCS symptomatology was expected in the MSI condition than in the other conditions (MSI > VMI: medium effect, r = .33; MSI > MI: small-to-medium effect, r = .22). The same pattern of group differences was found for PTSD symptoms, F(2, 264) = 17.12, p < .001. Knowledge of mild TBI was not related to differences in expected PCS symptoms by condition, and the perceived undesirability of TBI was associated with reported PCS symptomatology only in the MSI condition. Systematic variation in the severity of a depicted TBI produces different PCS and PTSD symptom expectations. Even a very mild TBI vignette can elicit expectations of PCS symptoms.

Relevance:

20.00%

Publisher:

Abstract:

This article addresses the transformation of a process model with an arbitrary topology into an equivalent structured process model. In particular, it studies the subclass of process models that have no equivalent well-structured representation but which, nevertheless, can be partially structured into their maximally structured representation. The transformations are performed under a behavioral equivalence notion that preserves the observed concurrency of tasks in equivalent process models. The article gives a full characterization of the subclass of acyclic process models that have no equivalent well-structured representation but do have an equivalent maximally structured one, and proposes a complete structuring method. Together with our previous results, this article completes the solution of the process model structuring problem for the class of acyclic process models.

Relevance:

20.00%

Publisher:

Abstract:

Process mining is the research area concerned with knowledge discovery from information system event logs. Within process mining, two prominent tasks can be discerned. First, process discovery deals with the automatic construction of a process model from an event log. Second, conformance checking focuses on assessing the quality of a discovered or designed process model with respect to the actual behavior captured in event logs. To this end, multiple techniques and metrics have been developed and described in the literature. However, the process mining domain still lacks a comprehensive framework for assessing the goodness of a process model from a quantitative perspective. In this study, we describe the architecture of an extensible framework within ProM that allows for the consistent, comparative and repeatable calculation of conformance metrics. Such a framework is greatly valuable for the development and assessment of both process discovery and conformance techniques.
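
A toy example conveys what one conformance metric might compute. Real conformance checking (for example, token replay in ProM) is considerably more nuanced; the model language and log below are invented, and whole-trace acceptance is the crudest possible fitness notion.

```python
# Behaviour permitted by a small, made-up process model.
ALLOWED = {
    ("register", "check", "approve"),
    ("register", "check", "reject"),
}

event_log = [
    ("register", "check", "approve"),
    ("register", "check", "approve"),
    ("register", "approve"),            # deviating trace: skipped "check"
    ("register", "check", "reject"),
]

# Trace-level fitness: fraction of log traces the model accepts.
fitness = sum(trace in ALLOWED for trace in event_log) / len(event_log)
print(f"trace fitness: {fitness:.2f}")   # -> 0.75
```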

Relevance:

20.00%

Publisher:

Abstract:

A pressing cost issue facing construction is the procurement of off-site pre-manufactured assemblies. In order to encourage Australian adoption of off-site manufacture (OSM), a new approach to the underlying processes is required. The advent of object-oriented digital models for construction design assumes intelligent use of data. However, the construction production system relies on traditional methods and data sources, and is expected to benefit from the application of well-established business process management techniques. Integrating the old and new data sources allows for the development of business process models which, by capturing typical construction processes involving OSM, provide insights into such processes. This integrative approach is the foundation of research into the use of OSM to increase construction productivity in Australia. The purpose of this study is to develop business process models capturing the procurement, resources and information flow of construction projects. For each stage of the construction value chain, a number of sub-processes are identified. Business Process Modelling Notation (BPMN), a mainstream business process modelling standard, is used to create baseline generic construction process models. These models identify OSM decision-making points that could provide cost reductions in procurement workflow and management systems. This paper reports on phase one of ongoing research aiming to develop a prototype workflow application that can provide semi-automated support to construction processes involving OSM and assist in decision-making about the adoption of OSM, thus contributing to a sustainable built environment.

Relevance:

20.00%

Publisher:

Abstract:

The shoulder joint is a complex integration of soft and hard tissues. It plays an important role in performing daily activities and can be considered a perfect compromise between mobility and stability. However, the shoulder is vulnerable to complications such as dislocations and osteoarthritis. Finite element (FE) models have been developed to understand shoulder injury mechanisms and the implications of disease for the shoulder complex, and to assess the quality of shoulder implants. A few FE shoulder models have also been used to answer important clinical questions, such as the difference between a normal and an osteoarthritic shoulder joint. However, in the absence of experimental validation, it is questionable whether the constitutive models applied in these FE models adequately represent the mechanical behavior of shoulder elements (cartilage, ligaments, muscles, etc.), and therefore how much confidence can be placed in current models when answering clinically relevant questions. The main objective of this review is to critically evaluate the existing FE shoulder models that have been used to investigate clinical problems. Particular attention is given to whether the constitutive models of shoulder elements are adequate for drawing clinically relevant conclusions. Suggestions are made for improving the existing shoulder models through the inclusion of adequate constitutive models for shoulder elements, so that clinically relevant questions can be answered with confidence.

Relevance:

20.00%

Publisher:

Abstract:

Determining the properties and integrity of subchondral bone in the developmental stages of osteoarthritis, especially in a form that can facilitate real-time characterization for diagnostic and decision-making purposes, is still a matter for research and development. This paper presents relationships between near infrared absorption spectra and properties of subchondral bone obtained from three models of osteoarthritic degeneration induced in laboratory rats via: (i) meniscectomy (MSX); (ii) anterior cruciate ligament transection (ACL); and (iii) intra-articular injection of mono-iodoacetate (1 mg) (MIA), in the right knee joint, with 12 rats per model group (N = 36). After 8 weeks, the animals were sacrificed and the knee joints were collected. A custom-made diffuse reflectance NIR probe of diameter 5 mm was placed on the tibial surface and spectral data were acquired from each specimen in the wavenumber range 4,000–12,500 cm^-1. After spectral acquisition, micro computed tomography (micro-CT) was performed on the samples, and subchondral bone parameters, namely bone volume (BV) and bone mineral density (BMD), were extracted from the micro-CT data. Statistical correlation was then conducted between these parameters and regions of the near infrared spectra using multivariate techniques, including principal component analysis (PCA), discriminant analysis (DA), and partial least squares (PLS) regression. Statistically significant linear correlations were found between the near infrared absorption spectra and subchondral bone BMD (R^2 = 98.84%) and BV (R^2 = 97.87%). In conclusion, near infrared spectroscopic probing can be used to detect, qualify and quantify changes in the composition of subchondral bone, and could potentially assist in distinguishing healthy from OA bone, as demonstrated with our laboratory rat models.
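
The spectra-to-bone-parameter regression step can be sketched in a few lines. The arrays below are synthetic stand-ins (36 specimens, 500 wavenumber bins) for the real NIR spectra and micro-CT BMD values, and the cross-validated PLS shown here is one reasonable reading of how such R^2 figures would be obtained.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(36, 500))                         # spectra: 36 x 500 bins
y = 0.8 * X[:, 100] + rng.normal(scale=0.1, size=36)   # synthetic BMD values

# Cross-validated R^2 guards against overfitting, a real hazard when
# the spectra have far more wavenumber bins than there are specimens.
pls = PLSRegression(n_components=5)
scores = cross_val_score(pls, X, y, cv=6, scoring="r2")
print("mean cross-validated R^2:", scores.mean().round(3))
```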

Relevance:

20.00%

Publisher:

Abstract:

Three-dimensional cellular models that mimic disease are being increasingly investigated and have opened an exciting new research area in understanding pathomechanisms. The advantage of 3D in vitro disease models is that they allow systematic and in-depth studies of physiological and pathophysiological processes at lower cost and with fewer of the ethical concerns associated with animal models. The purpose of the 3D approach is to allow crosstalk between cells and their microenvironment; with cues from the microenvironment, cells can assemble their niche similarly to in vivo conditions. The use of 3D models for mimicking disease processes such as cancer and osteoarthritis is only now emerging, and allows multidisciplinary teams of tissue engineers, biologists, biomaterial scientists and clinicians to work closely together. While in vitro systems require rigorous testing before they can be considered replicates of the in vivo model, major steps have been made, suggesting that they will become powerful tools for studying physiological and pathophysiological processes. This paper aims to summarize some of the existing 3D models and proposes a novel 3D model of the eye structures involved in the most common cause of blindness in the Western world, namely age-related macular degeneration (AMD).

Relevance:

20.00%

Publisher:

Abstract:

Identifying the design features that impact construction is essential to developing cost-effective and constructible designs. The similarity of building components is a critical design feature that affects method selection, productivity, and ultimately construction cost and schedule performance. However, there is limited understanding of what constitutes similarity in the design of building components, and limited computer-based support for identifying this feature in a building product model. This paper contributes a feature-based framework for representing and reasoning about component similarity that builds on ontological modelling, model-based reasoning, and cluster analysis techniques. It describes the ontology we developed to characterize component similarity in terms of the component attributes, the direction, and the degree of variation. It also describes the generic reasoning process we formalized to identify component similarity in a standard product model based on practitioners' varied preferences. The generic reasoning process evaluates the geometric, topological, and symbolic similarities between components, creates groupings of similar components, and quantifies the degree of similarity. We implemented this reasoning process in a prototype cost estimating application, which creates and maintains cost estimates based on a building product model. Validation studies of the prototype system provide evidence that the framework is general and enables a more accurate and efficient cost estimating process.
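
The grouping step lends itself to a compact sketch. The attribute vectors, distance metric, and threshold below are invented; the paper's framework weighs geometric, topological, and symbolic attributes according to practitioners' preferences, whereas this sketch clusters on raw numeric attributes only.

```python
# Group similar components by clustering their attribute vectors.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

# Rows: components; columns (hypothetical): length, width, penetrations.
components = np.array([
    [6.0, 0.4, 2.0],
    [6.1, 0.4, 2.0],   # near-duplicate of the first component
    [6.0, 0.4, 5.0],   # same geometry, different topology
    [3.0, 0.2, 0.0],
])

dist = pdist(components)                 # pairwise degree of variation
tree = linkage(dist, method="average")   # hierarchical grouping
labels = fcluster(tree, t=1.0, criterion="distance")
print("group labels:", labels)           # similar components share a label
```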