891 results for event tree analysis
Abstract:
Effective risk management is crucial for any organisation. One of its key steps is risk identification, but few tools exist to support this process. Here we present a method for the automatic discovery of a particular type of process-related risk, the danger of deadline transgressions or overruns, based on the analysis of event logs. We define a set of time-related process risk indicators, i.e., patterns observable in event logs that highlight the likelihood of an overrun, and then show how instances of these patterns can be identified automatically using statistical principles. To demonstrate its feasibility, the approach has been implemented as a plug-in module to the process mining framework ProM and tested using an event log from a Dutch financial institution.
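As a rough illustration of this kind of time-related indicator (not the authors' ProM plug-in; the log columns, the "End" activity marker and the 90th-percentile cut-off are all assumptions), one could flag running cases whose elapsed duration already exceeds a high quantile of historical completed-case durations:

```python
# Hedged sketch: flag running cases whose elapsed duration exceeds a high
# quantile of completed-case durations. Column names ("case_id", "activity",
# "timestamp"), the "End" marker and the 0.9 quantile are assumptions.
import pandas as pd

log = pd.read_csv("event_log.csv", parse_dates=["timestamp"])  # hypothetical log

# Elapsed duration per case, in seconds.
durations = (log.sort_values("timestamp")
                .groupby("case_id")["timestamp"]
                .agg(lambda ts: (ts.max() - ts.min()).total_seconds()))

# Completed cases are those whose last recorded activity is "End".
last_activity = log.sort_values("timestamp").groupby("case_id")["activity"].last()
completed = durations[last_activity.eq("End")]

threshold = completed.quantile(0.9)  # statistical cut-off for "likely overrun"
at_risk = durations[(durations > threshold) & ~durations.index.isin(completed.index)]
print(at_risk)
```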
Abstract:
A range of authors from the risk management, crisis management, and crisis communications literature have proposed different models as a means of understanding the components of crisis. A common thread across these sources is a focus on preparedness practices before disturbance events and response practices during them. This paper provides a critical analysis of three key explanatory models of how crises escalate, highlighting the strengths and limitations of each approach. The paper introduces an optimised conceptual model utilising components from the previous work under the four phases of pre-event, response, recovery, and post-event. Within these four phases, a ten-step process is introduced that can enhance understanding of the progression of distinct stages of disturbance for different types of events. This crisis evolution framework is examined as a means to provide clarity and applicability to a range of infrastructure failure contexts and to provide a path for further empirical investigation in this area.
Abstract:
Railway is one of the most important, reliable and widely used means of transportation, carrying freight, passengers, minerals, grain, etc. Research on railway tracks is therefore extremely important for the development of railway engineering and technologies. The safe operation of a railway track depends on the track structure, which includes rails, fasteners, pads, sleepers, ballast, subballast and formation. Sleepers are very important components of the entire structure and may be made of timber, concrete, steel or synthetic materials. Concrete sleepers were first installed around the middle of the last century and are currently installed in great numbers around the world. Consequently, the design of concrete sleepers has a direct impact on the safe operation of railways. The "permissible stress" method is currently the most common approach to sleeper design. However, the permissible stress principle does not consider the ultimate strength of materials, the probabilities of actual loads, or the risks associated with failure, all of which can lead to cost-ineffective and over-designed prestressed concrete sleepers. Recently, the limit states design method, which appeared in the last century and has already been applied in the design of buildings, bridges, etc., has been proposed as a better method for the design of prestressed concrete sleepers. Limit states design has significant advantages over permissible stress design, such as utilisation of the full strength of the member and a rational analysis of the probabilities related to sleeper strength and applied loads. This research aims to apply ultimate limit states design to the prestressed concrete sleeper, namely to obtain the load factors for both static and dynamic loads in the ultimate limit states design equations. However, sleepers in rail tracks require different safety levels for different types of tracks, which means that different types of tracks need different load factors in the limit states design equations. Therefore, the core tasks of this research are to find the load factors for the static and dynamic components of loads on track, and the strength reduction factor for sleeper bending strength, in the ultimate limit states design equations for four main types of tracks, i.e., heavy haul, freight, medium speed passenger and high speed passenger tracks. To find these factors, multiple samples of static loads and dynamic loads, and their distributions, are needed. Of the four types of tracks, only the heavy haul track has measured data, from the Braeside Line (a heavy haul line in Central Queensland), from which the distributions of both static and dynamic loads can be derived. The other three types of tracks have no measured site data, and experimental data are hardly available. To generate the data samples and obtain their distributions, computer-based simulations were employed, with wheel-track impacts assumed to be induced by wheel flats of different sizes. A validated simulation package named DTrack was first employed to generate the dynamic loads for the freight and medium speed passenger tracks. However, DTrack is only valid for tracks which carry low or medium speed vehicles. Therefore, a 3-D finite element (FE) model was then established for the wheel-track impact analysis of the high speed track.
This FE model was validated by comparing its simulation results with the DTrack simulation results, and with results from traditional theoretical calculations, based on the case of heavy haul track. The dynamic load data for the high speed track were then obtained from the FE model, and the distributions of both static and dynamic loads were extracted accordingly. All derived load distributions were fitted with appropriate functions. By extrapolating these distributions, the key distribution parameters for the sleeper bending moments induced by static loads and by extreme wheel-rail impact forces were obtained, and the load factors were then derived through limit states design calibration based on reliability analyses with the derived distributions. A sensitivity analysis was subsequently performed and the reliability of the resulting limit states design equations was confirmed. It has been found that limit states design can be effectively applied to railway concrete sleepers. This research contributes significantly to railway engineering and track safety: it helps to reduce track structure failures, risks and accidents; better determines the load range for existing sleepers in track; better rates the strength of concrete sleepers to support larger impacts and loads on railway track; increases the reliability of concrete sleepers; and substantially reduces investment costs for the railway industry. This research also opens several avenues for future work. Firstly, the 3-D FE model has been found suitable for the study of track loadings and track structure vibrations. Secondly, equations for serviceability and damageability limit states can be developed from the concepts behind the ultimate limit states design equations for concrete sleepers obtained in this research.
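The abstract does not write out the design equation, but the factors it seeks (static and dynamic load factors, plus a strength reduction factor on sleeper bending strength) enter a generic ultimate limit states check of the following indicative form:

```latex
% Indicative ultimate limit states check for sleeper bending:
%   \phi                  strength reduction factor on bending strength M_u
%   \gamma_s, \gamma_d    load factors on static and dynamic bending moments
\phi\, M_u \;\ge\; \gamma_s\, M_s + \gamma_d\, M_d
```

Calibration then selects φ, γ_s and γ_d so that the probability of violating this inequality, computed from the fitted load distributions, meets the target reliability for each track type.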
Abstract:
Due to the demand for better and deeper analysis in sports, organizations (both professional teams and broadcasters) are looking to use spatiotemporal data in the form of player tracking information to gain an advantage over their competitors. However, due to the large volume of data, its unstructured nature, and the lack of associated team activity labels (e.g. strategic/tactical), effective and efficient strategies for dealing with such data have yet to be deployed. A bottleneck restricting such solutions is the lack of a suitable representation (i.e. ordering of players) that is immune to the combinatorially large number of possible permutations of player orderings, in addition to the high dimensionality of the temporal signal (e.g. a game of soccer lasts for 90 minutes). We leverage a recent method which utilizes a "role representation", together with a feature reduction strategy that uses a spatiotemporal bilinear basis model, to form a compact spatiotemporal representation. Using this representation, we find the most likely formation patterns of a team associated with match events across nearly 14 hours of continuous player and ball tracking data in soccer. Additionally, we show that we can accurately segment a match into distinct game phases and detect highlights (i.e. shots, corners, free-kicks, etc.) completely automatically using a decision-tree formulation.
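For orientation, the bilinear spatiotemporal basis model referred to here is commonly written as a factorization of the tracking matrix into low-rank temporal and spatial bases; the form below is indicative of that family of models rather than taken from the paper:

```latex
% S in R^{F x P}: F frames by P stacked player coordinates.
% \Theta: temporal basis (e.g. DCT), \Omega: spatial basis, C: coefficients.
S \approx \Theta\, C\, \Omega^{\mathsf{T}},
\qquad
\Theta \in \mathbb{R}^{F \times K_t},\quad
C \in \mathbb{R}^{K_t \times K_s},\quad
\Omega \in \mathbb{R}^{P \times K_s}
```

With K_t ≪ F and K_s ≪ P, the coefficient matrix C gives the compact representation.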
Abstract:
Dengue virus (DENV) transmission in Australia is driven by weather factors and imported dengue fever (DF) cases. However, uncertainty remains regarding the threshold effects of high-order interactions among weather factors and imported DF cases, and the impact of these factors on autochthonous DF. A time-series regression tree model was used to assess the threshold effects of natural temporal variations in weekly weather factors and weekly imported DF cases on the incidence of weekly autochthonous DF from 1 January 2000 to 31 December 2009 in Townsville and Cairns, Australia. In Cairns, mean weekly autochthonous DF incidence increased 16.3-fold when the 3-week lagged moving average maximum temperature was <32 °C, the 4-week lagged moving average minimum temperature was ≥24 °C and the sum of imported DF cases in the previous 2 weeks was >0. When the 3-week lagged moving average maximum temperature was ≥32 °C and the other two conditions remained the same, mean weekly autochthonous DF incidence increased only 4.6-fold. In Townsville, mean weekly autochthonous DF incidence increased 10-fold when the 3-week lagged moving average rainfall was ≥27 mm, but only 1.8-fold when rainfall was <27 mm, during January to June. Thus, autochthonous DF incidence responded differently to weather factors and imported DF cases in Townsville and Cairns. Imported DF cases may also trigger and enhance local outbreaks under favorable climatic conditions.
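As a purely illustrative sketch of the modelling idea (a generic regression tree on lagged moving averages; the file and column names are hypothetical, and this is not the paper's exact time-series model):

```python
# Sketch: regress weekly autochthonous case counts on lagged moving averages
# of weather variables and recent imported cases, so that a shallow tree
# exposes threshold splits (e.g. max temperature >= 32 C) directly.
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

df = pd.read_csv("weekly_data.csv")  # hypothetical weekly series

df["tmax_ma_lag3"] = df["tmax"].rolling(2).mean().shift(3)       # 3-week lagged MA
df["tmin_ma_lag4"] = df["tmin"].rolling(2).mean().shift(4)       # 4-week lagged MA
df["imported_prev2"] = df["imported"].rolling(2).sum().shift(1)  # previous 2 weeks

X = df[["tmax_ma_lag3", "tmin_ma_lag4", "imported_prev2"]].dropna()
y = df.loc[X.index, "autochthonous"]

tree = DecisionTreeRegressor(max_depth=3).fit(X, y)
```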
Abstract:
This paper uses innovative content analysis techniques to map how the death of Oscar Pistorius' girlfriend, Reeva Steenkamp, was framed in Twitter conversations. Around 1.5 million posts from a two-week timeframe are analyzed with a combination of syntactic and semantic methods. The analysis is grounded in the frame analysis perspective and differs from sentiment analysis. Instead of looking for explicit evaluations, such as "he is guilty" or "he is innocent", we show through the results how opinions can be identified from complex articulations of more implicit symbolic devices, such as repeatedly mentioned examples and metaphors. Different frames are adopted by users as more information about the case is revealed: from a more episodic one, heavily used at the very beginning, to more systemic approaches highlighting the association of the event with urban violence, gun control issues, and violence against women. A detailed timeline of the discussions is provided.
Abstract:
Terrorists usually target high-occupancy iconic and public buildings using vehicle-borne incendiary devices in order to claim a maximum number of lives and cause extensive damage to public property. While initial casualties are due to the direct shock of the explosion, collapse of structural elements may greatly increase the total figure. Most of these buildings have been, or are being, built without consideration of their vulnerability to such events. Therefore, vulnerability and residual capacity assessment of buildings subjected to deliberately exploded bombs is important for developing mitigation strategies that protect the buildings' occupants and the property. Explosive loads and their effects on buildings have therefore attracted significant attention in the recent past, and comprehensive and economical design strategies must be developed for future construction. This research investigates the response and damage of reinforced concrete (RC) framed buildings, together with their load-bearing key structural components, under a near-field blast event. Finite element method (FEM) based analysis was used to investigate the structural framing system and components for global stability, followed by a rigorous analysis of key structural components for damage evaluation, using the codes SAP2000 and LS-DYNA respectively. The research involved four important areas of structural engineering: blast load determination, numerical modelling with FEM techniques, material performance under high strain rates, and non-linear dynamic structural analysis. The response and damage of an RC framed building under different blast load scenarios were investigated. The blast influence region for a two-dimensional RC frame was investigated for different load conditions, and the critical region for each loading case was identified. Two design methods are recommended for RC columns to provide superior residual capacities: detailing RC columns with multi-layer steel reinforcement cages, and composite columns including a central structural steel core. Both are intended to provide post-blast gravity load resisting capacity, compared with a typical RC column, against catastrophic collapse. Overall, this research broadens the current knowledge of blast and residual capacity analysis of RC framed structures and recommends methods to evaluate and mitigate blast impact on key elements of multi-storey buildings.
Abstract:
The validity of using rainfall characteristics as lumped parameters for investigating pollutant wash-off processes, such as first flush occurrence, is questionable. This research study introduces an innovative concept of using sector parameters to investigate the relationship between the pollutant wash-off process and different sectors of the runoff hydrograph and rainfall hyetograph. The research outcomes indicated that rainfall depth and rainfall intensity are the two key rainfall characteristics which influence the wash-off process, compared to the antecedent dry period. Additionally, the rainfall pattern also plays a critical role in the wash-off process, independently of the catchment characteristics. The knowledge created through this research study provides the ability to select appropriate rainfall events for stormwater quality treatment design based on the required treatment outcomes, such as the need to target different sectors of the runoff hydrograph or different pollutant species. The study outcomes can also contribute to enhancing stormwater quality modelling and prediction, given that conventional approaches to stormwater quality estimation are primarily based on rainfall intensity rather than considering other rainfall parameters, or are solely based on stochastic approaches irrespective of the characteristics of the rainfall event.
Abstract:
Standard Monte Carlo (sMC) simulation models have been widely used in AEC industry research to address system uncertainties. Although the benefits of probabilistic simulation analyses over deterministic methods are well documented, the sMC simulation technique is quite sensitive to the probability distributions of the input variables. This phenomenon becomes highly pronounced when the region of interest within the joint probability distribution (a function of the input variables) is small. In such cases, the standard Monte Carlo approach is often impractical from a computational standpoint. In this paper, a comparative analysis of standard Monte Carlo simulation to Markov Chain Monte Carlo with subset simulation (MCMC/ss) is presented. The MCMC/ss technique constitutes a more complex simulation method (relative to sMC), wherein a structured sampling algorithm is employed in place of completely randomized sampling. Consequently, gains in computational efficiency can be made. The two simulation methods are compared via theoretical case studies.
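A minimal sketch of the computational issue (the limit state function below is a stand-in, not from the paper): with standard Monte Carlo, estimating a small failure probability p requires far more than 1/p samples before the estimator's coefficient of variation becomes acceptable, which is what motivates structured sampling schemes such as MCMC/ss.

```python
# Sketch: standard Monte Carlo on a rare event. With p ~ 3e-5, even a
# million samples yields only a handful of "hits" and a large estimator
# coefficient of variation.
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    return 4.0 - x  # stand-in limit state: "failure" when g(x) < 0

n = 1_000_000
x = rng.standard_normal(n)
p_hat = np.mean(g(x) < 0.0)  # estimates P(X > 4) for a standard normal

# CoV of the sMC estimator: sqrt((1 - p) / (n * p)); huge unless n >> 1/p.
cov = np.sqrt((1.0 - p_hat) / (n * p_hat)) if p_hat > 0 else np.inf
print(p_hat, cov)
```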
Abstract:
This research analyses the extent of damage to buildings in Brisbane, Ipswich and Grantham during the recent Eastern Australia flooding and explores the role that planning and design/construction regulations played in these failures. It highlights weaknesses in the current systems and proposes effective solutions to mitigate future damage and financial loss under current or future climates. 2010 and early 2011 saw major flooding throughout much of Eastern Australia. Queensland and Victoria were particularly hard hit, with insured losses in these states reaching $2.5 billion and many thousands of homes inundated. The Queensland cities of Brisbane and Ipswich were the worst affected; around two-thirds of all inundated properties/buildings were in these two areas. Other local government areas to record high levels of inundation were Central Highlands and Rockhampton Regional Councils in Queensland, and Buloke, Campaspe, Central Goldfields and Loddon in Victoria. Flash flooding was a problem in a number of Victorian councils, but the Lockyer Valley west of Ipswich suffered the most extensive damage, with 19 lives lost and more than 100 homes completely destroyed. In all, more than 28,000 properties were inundated in Queensland and around 2,500 buildings affected in Victoria. Of the residential properties affected in Brisbane, around 90% were in areas developed prior to the introduction of floodplain development controls, with many also having suffered inundation during the 1974 floods. The project developed a predictive model for estimating flood loss and occupant displacement. This model can now be used for flood risk assessments or rapid assessment of impacts following a flood event.
Abstract:
Consumers of whole foods, such as fruits, demand consistently high quality and seek varieties with enhanced health properties, convenience or novel taste. We have raised the polyphenolic content of apple by genetic engineering of the anthocyanin pathway using the apple transcription factor MYB10. These apples have very high concentrations of foliar, flower and fruit anthocyanins, especially in the fruit peel. Independent lines were examined for impacts on tree growth, photosynthesis and fruit characteristics. Fruit were analysed for changes in metabolite and transcript levels. Fruit were also used in taste trials to study the consumer perception of such a novel apple. No negative taste attributes were associated with the elevated anthocyanins. Modification with this one gene provides near isogenic material and allows us to examine the effects on an established cultivar, with a view to enhancing consumer appeal independently of other fruit qualities.
Abstract:
With the growing size and variety of social media files on the web, it is becoming critical to efficiently organize them into clusters for further processing. This paper presents a novel scalable constrained document clustering method that harnesses the power of search engines capable of dealing with large text data. Instead of calculating the distance between the documents and all of the clusters' centroids, a neighborhood of best cluster candidates is chosen using a document ranking scheme. To make the method faster and less memory dependent, in-memory and in-database processing are combined in a semi-incremental manner. The method has been extensively tested in the social event detection application. Empirical analysis shows that the proposed method is efficient in both computation and memory usage while producing notable accuracy.
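A hedged sketch of the candidate-neighbourhood idea (not the authors' implementation; a TF-IDF ranking stands in for the search engine, and the documents are toy data):

```python
# Sketch: rank clusters for each document and score only the top-k
# candidates, instead of computing distances to every centroid.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = ["flood damage report photos", "music concert video upload"]
centroid_texts = ["flood storm weather", "music concert festival", "sports match"]

vec = TfidfVectorizer().fit(docs + centroid_texts)
D, C = vec.transform(docs), vec.transform(centroid_texts)

k = 2  # size of the candidate neighbourhood
for i, row in enumerate(cosine_similarity(D, C)):
    candidates = np.argsort(row)[::-1][:k]  # ranking step stands in for the engine
    best = candidates[0]                    # nearest centroid among the candidates
    print(f"doc {i} -> cluster {int(best)}")
```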
Abstract:
Proanthocyanidins (PAs) are products of the flavonoid pathway, which also leads to the production of anthocyanins and flavonols. Many flavonoids have antioxidant properties and may have beneficial effects for human health. PAs are found in the seeds and fruits of many plants. In apple fruit (Malus × domestica Borkh.), the flavonoid biosynthetic pathway is most active in the skin, with the flavan-3-ols catechin and epicatechin acting as the initiating units for the synthesis of PA polymers. This study examined the genes involved in the production of PAs in three apple cultivars: two heritage cultivars, Hetlina and Devonshire Quarrenden, and a commercial cultivar, Royal Gala. HPLC analysis showed that tree-ripe fruit from Hetlina and Devonshire Quarrenden had a higher phenolic content than Royal Gala. Epicatechin and catechin biosynthesis is under the control of the biosynthetic enzymes anthocyanidin reductase (ANR) and leucoanthocyanidin reductase (LAR1), respectively. Counter-intuitively, real-time quantitative PCR analysis showed that the expression levels of Royal Gala LAR1 and ANR were significantly higher than those of both Devonshire Quarrenden and Hetlina. This suggests that a compensatory feedback mechanism may be active, whereby low concentrations of PAs may induce higher expression of gene transcripts. Further investigation is required into the regulation of these key enzymes in apple.
Abstract:
Motivated by the analysis of the Australian Grain Insect Resistance Database (AGIRD), we develop a Bayesian hurdle modelling approach to assess trends in strong resistance of stored grain insects to phosphine over time. The binary response variable from AGIRD, indicating presence or absence of strong resistance, is characterized by a majority of absence observations, and the hurdle model is a two-step approach that is useful when analyzing such a binary response dataset. The proposed hurdle model first utilizes Bayesian classification trees to identify covariates and covariate levels pertaining to possible presence or absence of strong resistance. Second, generalized additive models (GAMs) with spike-and-slab priors for variable selection are fitted to the subset of the dataset, identified by the Bayesian classification tree, in which presence of strong resistance is possible. From the GAM we assess trends, biosecurity issues and site-specific variables influencing the presence of strong resistance using a variable selection approach. The proposed Bayesian hurdle model is compared to its frequentist counterpart, and also to a naive Bayesian approach which fits a GAM to the entire dataset. The Bayesian hurdle model has the benefit of providing a set of good trees for use in the first step and appears to provide enough flexibility to represent the influence of variables on strong resistance compared to the frequentist model, while also capturing the subtle changes in trend that are missed by the frequentist and naive Bayesian models.
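An illustrative two-step hurdle sketch using frequentist stand-ins (a classification tree plus a spline model replace the paper's Bayesian trees and spike-and-slab GAM; all file and column names are hypothetical):

```python
# Step 1: a tree flags regions of covariate space where strong resistance
# may be present. Step 2: a smooth model is fitted only to that subset.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer

df = pd.read_csv("agird.csv")        # hypothetical AGIRD extract
X = df[["year", "dose", "site_id"]]  # hypothetical numeric covariates
y = df["strong_resistance"]          # 0/1, mostly 0

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
maybe = tree.predict_proba(X)[:, 1] > 0  # leaves containing any presence

# Spline-on-year logistic model as a stand-in for the second-stage GAM
# (assumes the flagged subset still contains both classes).
gam_like = make_pipeline(SplineTransformer(degree=3, n_knots=5),
                         LogisticRegression(max_iter=1000))
gam_like.fit(X.loc[maybe, ["year"]], y[maybe])
```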
Abstract:
Organisations are constantly seeking new ways to improve operational efficiencies. This research study investigates a novel way to identify potential efficiency gains in business operations by observing how they were carried out in the past and then exploring better ways of executing them, taking into account trade-offs between time, cost and resource utilisation. This paper demonstrates how such trade-offs can be incorporated in the assessment of alternative process execution scenarios by making use of a cost environment. A genetic algorithm-based approach is proposed to explore and assess alternative process execution scenarios, where the objective function is represented by a comprehensive cost structure that captures different process dimensions. Experiments conducted with different variants of the genetic algorithm evaluate the approach's feasibility. The findings demonstrate that a genetic algorithm-based approach is able to use cost reduction as a way to identify improved execution scenarios in terms of reduced case durations and increased resource utilisation. The ultimate aim is to use the cost-related insights gained from such improved scenarios to put forward recommendations for reducing process-related cost within organisations.
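A minimal genetic-algorithm sketch of the search loop (the scenario encoding and cost function below are invented placeholders; the paper's comprehensive cost structure is not reproduced):

```python
# Sketch: evolve task-to-resource assignments under a toy cost objective
# that trades off duration against resource utilisation.
import random

random.seed(0)
N_TASKS, N_RESOURCES, POP, GENS = 8, 3, 30, 50

def cost(assign):
    # Placeholder objective standing in for the comprehensive cost structure.
    duration = sum(t % N_RESOURCES != r for t, r in enumerate(assign))
    utilisation = len(set(assign))
    return duration + 0.5 * (N_RESOURCES - utilisation)

def crossover(p1, p2):
    cut = random.randrange(1, N_TASKS)
    return p1[:cut] + p2[cut:]

def mutate(assign):
    a = list(assign)
    a[random.randrange(N_TASKS)] = random.randrange(N_RESOURCES)
    return a

pop = [[random.randrange(N_RESOURCES) for _ in range(N_TASKS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=cost)       # elitist selection: keep the cheaper half
    parents = pop[:POP // 2]
    pop = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                     for _ in range(POP - len(parents))]

best = min(pop, key=cost)
print(best, cost(best))
```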