891 results for event tree analysis
Abstract:
The economic loss caused by storm surge disasters is much higher than that caused by any other marine disaster in China, with the loss from severe storm surge disasters being the highest. Statistics show that there have been 62 typhoon landfalls on the east and southeast coasts of China since 1990, three of which, occurring in 1992, 1994 and 1997, respectively, caused the most severe damage. The direct economic losses from these events were 9.3, 17.0 and 30 billion yuan (RMB, or about 1.7, 2.6 and 3.8 billion USD, respectively), far greater than the average annual loss of 5.5 billion yuan (RMB) during the 1989-1991 period. This paper makes a comparative analysis of the damage caused by the three events and presents an overview of the progress of precautions against storm surge disasters in China. The suggested countermeasures to mitigate the losses from severe storm surge disasters in China are as follows: (1) raise awareness throughout society of the need for precautions against severe storm surge disasters; (2) work out a new plan for building sea walls; (3) improve and perfect the available warning and disaster relief command systems; (4) develop insurance services in order to promptly mitigate the losses caused by severe storm surge disaster events.
Abstract:
The cold-water event along the southeast coast of the United States in the summer of 2003 is studied using satellite data combined with in situ observations. The analysis suggests that the cooling was produced by wind-driven coastal upwelling, which broke the thermocline barrier in the summer of 2003. The strong and persistent southwesterly winds in the summer of 2003 played an important role in lifting the bottom isotherms up to the surface and away from the coast, generating persistent surface cooling in July-August 2003. Once the thermocline barrier was broken, the stratification in the nearshore region was weakened substantially, allowing further coastal cooling of large magnitude by episodic southerly wind bursts or the passage of coastally trapped waves with periods of a few days. Because of the strong thermocline barrier in summer, these short-period winds or waves would otherwise have had no effect on the surface temperature were it not for the low-frequency cooling produced by the persistent southwesterly winds.
Abstract:
1. Complete sequences of the 1140-base-pair cytochrome b gene were obtained from 133 specimens collected at nine localities, including the inflow drainage system, isolated lakes and the outflow drainage system of the Qinghai-Tibetan Plateau, to assess genetic diversity and to infer the population history of the freshwater fish Schizopygopsis pylzovi.
2. Nucleotide diversities (pi) were moderate (0.0024-0.0045) in populations from the outflow drainage system and Tuosuo Lake, but low (0.0018-0.0021) in populations from the Qaidam Basin. The low intra-population variability is probably related to palaeoenvironmental fluctuations in the Qaidam Basin, suggesting that these populations have experienced severe bottleneck events in their history.
3. Phylogenetic tree topologies indicate that individuals from different populations did not form reciprocally monophyletic groups, but populations from adjacent drainages clustered geographically. Most population pairwise F-ST tests were significant, the exceptions being the pairwise tests between Tuosu Lake and Tuosuo Lake in the north-west of the Qinghai-Tibetan Plateau. Analysis of molecular variance (AMOVA) indicates that significant genetic variation was explained within and among catchments, not among specific boundaries or between the inflow and outflow drainage systems.
4. Nested clade phylogeographical analysis indicates that historical processes were very important in shaping the observed geographical structuring of S. pylzovi, and the contemporary population structure and differentiation of S. pylzovi may be consistent with the historical tectonic events that occurred during the uplift of the Qinghai-Tibetan Plateau. Fluctuations of the ecogeographical environment and the formation of the major hydrography might have promoted contiguous range expansion of freshwater fish populations, whereas geological barriers among drainages have resulted in population fragmentation and restricted gene flow among populations.
5. The significantly large negative F-s value (-24.91, P < 0.01) of Fu's F-s test and the unimodal mismatch distribution indicate that S. pylzovi underwent a sudden population expansion after the historical tectonic event of the Gonghe Movement.
6. The results of this study indicate that each population from the Qinghai-Tibetan Plateau should be managed and conserved separately and that efforts should be directed towards preserving the genetic integrity of each group.
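To make the reported diversity statistic concrete, here is a minimal sketch of how nucleotide diversity (pi), the average proportion of sites that differ between pairs of aligned sequences, can be computed; the toy haplotypes and the function name are hypothetical and not drawn from the study.

```python
from itertools import combinations

def nucleotide_diversity(sequences):
    """Average pairwise proportion of differing sites (pi) for an aligned set."""
    if len(sequences) < 2:
        return 0.0
    length = len(sequences[0])
    diffs = [
        sum(a != b for a, b in zip(s1, s2)) / length
        for s1, s2 in combinations(sequences, 2)
    ]
    return sum(diffs) / len(diffs)

# Toy aligned haplotypes (hypothetical, not from the study)
haplotypes = ["ACGTACGT", "ACGTACGA", "ACGAACGT"]
print(round(nucleotide_diversity(haplotypes), 4))   # 0.1667
```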
Abstract:
A digitization-based production model is proposed that uses control charts, fault tree analysis and expert knowledge to support real-time monitoring and diagnosis of the manufacturing process. The model improves the reliability of the fault diagnosis system and provides a practically operable visual modelling tool. The online statistical process control system developed can respond dynamically to changes in the manufacturing process based on the monitoring of production events. Using the visual modelling tool, fault trees are built from expert experience, and the diagnostic rule base of the expert system is generated automatically from the fault trees, so that diagnostic knowledge is acquired automatically. The system was applied to the inspection and fault diagnosis of an automobile gearbox assembly process, verifying the effectiveness of the method.
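As a rough sketch of the fault-tree-to-rule-base idea summarised above, and not the paper's actual implementation, the following fragment encodes a tiny fault tree with AND/OR gates and checks which top events are implied by a set of monitored production events; all node and event names are hypothetical.

```python
# Minimal sketch: evaluate a fault tree against monitored events and
# report a diagnosis when a top event is triggered. Node names are hypothetical.

FAULT_TREE = {
    "gear_noise":   ("OR",  ["bearing_wear", "misalignment"]),
    "misalignment": ("AND", ["shaft_offset", "torque_spike"]),
}

def evaluate(node, events, tree=FAULT_TREE):
    """Return True if the (basic or intermediate) event is active."""
    if node not in tree:          # basic event: read directly from monitoring data
        return node in events
    gate, children = tree[node]
    results = [evaluate(child, events, tree) for child in children]
    return all(results) if gate == "AND" else any(results)

# Diagnostic rules derived from the tree's top events
monitored = {"shaft_offset", "torque_spike"}
for top_event in ("gear_noise",):
    if evaluate(top_event, monitored):
        print(f"Rule fired: conditions for '{top_event}' are satisfied")
```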
Abstract:
The sedimentary-volcanic tuff (locally called "green-bean rock") formed during the early Middle Triassic volcanic event in Guizhou Province is thin, stable, widespread, short in formation time and predominantly green in color. The green-bean rock is a perfect indicator for stratigraphic division. Its petrographic and geochemical features are unique, and it is composed mainly of glassy fragments and subordinately of crystal fragments and volcanic ash balls. Analysis of the major and trace elements and rare-earth elements (REE), together with the related diagrams, leads us to believe that the green-bean rock is acidic volcanic material of the calc-alkaline series formed in the Indosinian orogenic belt on the Sino-Vietnam border, which was transported through the atmosphere to tectonically stable areas and deposited there as sedimentary-volcanic rocks. From the age of the green-bean rock, it is deduced that the age of the Middle-Lower Triassic boundary, which is overlain by the sedimentary-volcanic tuff, is about 247 Ma.
Abstract:
Urquhart, C., Durbin, J. & Spink, S. (2004). Training needs analysis of healthcare library staff, undertaken for South Yorkshire Workforce Development Confederation. Aberystwyth: Department of Information Studies, University of Wales Aberystwyth. Sponsorship: South Yorkshire WDC (NHS)
Abstract:
Wydział Historyczny: Instytut Prahistorii (Faculty of History: Institute of Prehistory)
Abstract:
A number of problems in network operations and engineering call for new methods of traffic analysis. While most existing traffic analysis methods are fundamentally temporal, there is a clear need for the analysis of traffic across multiple network links — that is, for spatial traffic analysis. In this paper we give examples of problems that can be addressed via spatial traffic analysis. We then propose a formal approach to spatial traffic analysis based on the wavelet transform. Our approach (graph wavelets) generalizes the traditional wavelet transform so that it can be applied to data elements connected via an arbitrary graph topology. We explore the necessary and desirable properties of this approach and consider some of its possible realizations. We then apply graph wavelets to measurements from an operating network. Our results show that graph wavelets are very useful for our motivating problems; for example, they can be used to form highly summarized views of an entire network's traffic load, to gain insight into a network's global traffic response to a link failure, and to localize the extent of a failure event within the network.
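One simple way to picture a graph wavelet coefficient of the kind described above is as a difference between the average signal in a node's inner hop-neighborhood and the average over the surrounding ring of nodes. The sketch below implements that simplified view (an illustration only, not the authors' exact construction), using networkx for hop distances; the toy topology and load values are invented.

```python
import networkx as nx

def graph_wavelet_coeff(G, signal, center, inner=1, outer=2):
    """Difference between the mean signal within `inner` hops of `center`
    and the mean signal on the ring between `inner` and `outer` hops.
    A simplified, illustrative stand-in for a graph wavelet coefficient."""
    dist = nx.single_source_shortest_path_length(G, center, cutoff=outer)
    inner_vals = [signal[v] for v, d in dist.items() if d <= inner]
    ring_vals = [signal[v] for v, d in dist.items() if inner < d <= outer]
    if not ring_vals:
        return 0.0
    return sum(inner_vals) / len(inner_vals) - sum(ring_vals) / len(ring_vals)

# Toy example: traffic load (here attached to nodes) on a small line topology
G = nx.path_graph(6)                      # nodes 0..5 in a line
load = {0: 1.0, 1: 1.2, 2: 5.0, 3: 4.8, 4: 1.1, 5: 0.9}
print(round(graph_wavelet_coeff(G, load, center=2), 3))   # large value: localized load spike
```

A large coefficient at a given center and scale flags a locally anomalous load pattern, which is the kind of summarized, spatially localized view the paper motivates.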
Abstract:
The problem of discovering frequent arrangements of temporal intervals is studied. It is assumed that the database consists of sequences of events, where an event occurs during a time-interval. The goal is to mine temporal arrangements of event intervals that appear frequently in the database. The motivation of this work is the observation that in practice most events are not instantaneous but occur over a period of time and different events may occur concurrently. Thus, there are many practical applications that require mining such temporal correlations between intervals including the linguistic analysis of annotated data from American Sign Language as well as network and biological data. Two efficient methods to find frequent arrangements of temporal intervals are described; the first one is tree-based and uses depth first search to mine the set of frequent arrangements, whereas the second one is prefix-based. The above methods apply efficient pruning techniques that include a set of constraints consisting of regular expressions and gap constraints that add user-controlled focus into the mining process. Moreover, based on the extracted patterns a standard method for mining association rules is employed that applies different interestingness measures to evaluate the significance of the discovered patterns and rules. The performance of the proposed algorithms is evaluated and compared with other approaches on real (American Sign Language annotations and network data) and large synthetic datasets.
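The brute-force sketch below is not the tree-based or prefix-based algorithm from the paper; it only illustrates the mining goal by counting coarse Allen-style relations between pairs of event intervals across sequences and keeping those that meet a minimum support. The event labels and data are made up.

```python
from collections import Counter
from itertools import combinations

def relation(a, b):
    """Coarse Allen-style relation between two intervals (start, end), a starting first."""
    if a[1] < b[0]:
        return "before"
    if a[1] == b[0]:
        return "meets"
    if a[0] <= b[0] and a[1] >= b[1]:
        return "contains"
    return "overlaps"

def frequent_arrangements(sequences, min_support):
    """Count (label_a, label_b, relation) patterns over all sequences."""
    counts = Counter()
    for seq in sequences:                       # seq: list of (label, start, end)
        seen = set()
        for (la, sa, ea), (lb, sb, eb) in combinations(sorted(seq, key=lambda e: e[1:]), 2):
            seen.add((la, lb, relation((sa, ea), (sb, eb))))
        counts.update(seen)                     # count each pattern once per sequence
    return {p: c for p, c in counts.items() if c >= min_support}

# Toy event-interval sequences (hypothetical labels)
data = [
    [("A", 0, 4), ("B", 2, 6), ("C", 7, 9)],
    [("A", 1, 3), ("B", 2, 5), ("C", 6, 8)],
]
print(frequent_arrangements(data, min_support=2))
```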
Abstract:
Background: We conducted a survival analysis of all confirmed cases of adult tuberculosis (TB) treated in Cork City, Ireland. The aim of this study was to estimate survival time (ST), including the median survival time, and to assess the association and impact of covariates (TB risk factors) on event status and ST. The outcome of the survival analysis is reported in this paper. Methods: We used a retrospective cohort study design to review the records of 647 bacteriologically confirmed TB patients from two teaching hospitals (mean age 49 years, range 18–112). We collected information on potential risk factors for all confirmed cases of TB treated between 2008 and 2012. For the survival analysis, the outcome of interest was 'treatment failure' or 'death' (whichever came first). A univariate descriptive analysis was conducted using the non-parametric Kaplan-Meier (KM) method to estimate overall survival (OS), while the Cox proportional hazards model was used for the multivariate analysis to determine possible associations of predictor variables and to obtain adjusted hazard ratios. The p value was set at <0.05 and the log-likelihood ratio test at >0.10. Data were analysed using SPSS version 15.0. Results: There was no significant difference in the survival curves of male and female patients (log-rank statistic = 0.194, df = 1, p = 0.66) or among age groups (log-rank statistic = 1.337, df = 3, p = 0.72). The mean overall survival (OS) was 209 days (95% CI: 92–346) and the median was 51 days (95% CI: 35.7–66). The mean ST for women was 385 days (95% CI: 76.6–694) and for men 69 days (95% CI: 48.8–88.5). Multivariate Cox regression showed that patients with a history of drug misuse had 2.2 times the hazard of those without. Smokers and alcohol drinkers had a hazard ratio of 1.8, patients born in a country of high endemicity (BICHE) had a hazard ratio of 6.3, and HIV co-infection carried a hazard ratio of 1.2. Conclusion: There was no significant difference in survival curves between male and female patients or among age groups. Women had a longer ST than men, but men had a higher hazard rate than women. Anti-TNF and immunosuppressive medication and diabetes were found to be associated with longer ST, while alcohol, smoking, RICHE and BICHE were associated with shorter ST.
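For readers unfamiliar with the Kaplan-Meier estimator behind the overall-survival figures above, here is a minimal, self-contained sketch; the study itself used SPSS, and the follow-up times below are invented rather than study data.

```python
def kaplan_meier(durations, events):
    """Kaplan-Meier estimate: S(t) is the product over event times t_i <= t of (1 - d_i / n_i),
    where d_i is the number of events at t_i and n_i the number still at risk."""
    survival, curve = 1.0, []
    for t in sorted(set(durations)):
        d = sum(1 for dur, e in zip(durations, events) if dur == t and e == 1)
        n = sum(1 for dur in durations if dur >= t)
        if d > 0:
            survival *= 1.0 - d / n
            curve.append((t, round(survival, 3)))
    return curve

# Invented follow-up times (days) and event indicators (1 = failure/death, 0 = censored)
days = [20, 35, 35, 51, 70, 90]
observed = [1, 1, 0, 1, 0, 1]
print(kaplan_meier(days, observed))   # [(20, 0.833), (35, 0.667), (51, 0.444), (90, 0.0)]
```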
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to grow by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats those characteristics as the dominant factor affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is whether a generic solution can be identified for the monitoring and analysis of data that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner. The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating those techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near-real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: first, because it is complex and no comprehensive solution exists; second, because it requires tight interaction with domain experts and thus the handling of subjective knowledge and inference; and third, given the dearth of neurophysiologists, because there is a real-world need to provide a solution for this domain.
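As a hedged illustration of the production/interpretation/consumption workflow with a maintainable provenance record (not the dissertation's actual platform; the step and function names are hypothetical), the sketch below wraps each analysis step so that its inputs, outputs and code identity are logged for later third-party verification.

```python
import hashlib
import json
import time

PROVENANCE = []   # append-only record of every analysis step

def digest(obj):
    """Short, reproducible hash of any JSON-serialisable value."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()[:12]

def tracked_step(name):
    """Decorator: run an analysis step and log what went in and what came out."""
    def wrap(fn):
        def inner(data):
            result = fn(data)
            PROVENANCE.append({"step": name, "function": fn.__name__,
                               "input": digest(data), "output": digest(result),
                               "time": time.time()})
            return result
        return inner
    return wrap

@tracked_step("smooth")
def moving_average(samples):
    """Simple 3-point moving average over a list of numbers."""
    return [round(sum(samples[i:i + 3]) / 3, 3) for i in range(len(samples) - 2)]

raw = [1.0, 2.0, 6.0, 2.0, 1.0]          # hypothetical sensor samples
smoothed = moving_average(raw)
print(smoothed)
print(json.dumps(PROVENANCE, indent=2))   # auditable trail of the analysis
```

Because the record captures only hashes and identifiers rather than analysis-specific state, the same trail works regardless of which analysis technique produced a given result, which is the technique-agnostic property the abstract argues for.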
Abstract:
Twitter has changed the dynamic of the academic conference. Before Twitter, delegate participation was primarily dependent on attendance and feedback was limited to the post-event survey. With Twitter, delegates have become active participants: they pass comment, share reactions and critique presentations, all the while generating a running commentary. This study examines this phenomenon using the Academic & Special Libraries (A&SL) conference 2015 (hashtag #asl2015) as a case study. A post-conference survey asked delegates how and why they used Twitter at #asl2015, and a content and conceptual analysis of tweets was conducted using Topsy and Storify. This analysis examined how delegates interacted with presentations, which sessions generated the most activity on the timeline and the type of content shared. Actual tweet activity and volume per presentation were compared with survey responses. Finally, recommendations on Twitter engagement for conference organisers and presenters are provided.
Abstract:
BACKGROUND: Methicillin-resistant Staphylococcus aureus (MRSA) is a common cause of complicated skin and skin-structure infection (cSSSI). Increasing antimicrobial resistance in cSSSI has led to a need for new safe and effective therapies. Ceftaroline was evaluated as treatment for cSSSI in 2 identical phase 3 clinical trials, the pooled analysis of which is presented here. The primary objective of each trial was to determine the noninferiority of the clinical cure rate achieved with ceftaroline monotherapy, compared with that achieved with vancomycin plus aztreonam combination therapy, in the clinically evaluable (CE) and modified intent-to-treat (MITT) patient populations. METHODS: Adult patients with cSSSI requiring intravenous therapy received ceftaroline (600 mg every 12 h) or vancomycin plus aztreonam (1 g each every 12 h) for 5-14 days. RESULTS: Of 1378 patients enrolled in both trials, 693 received ceftaroline and 685 received vancomycin plus aztreonam. Baseline characteristics of the treatment groups were comparable. Clinical cure rates were similar for ceftaroline and vancomycin plus aztreonam in the CE (91.6% vs 92.7%) and MITT (85.9% vs 85.5%) populations, respectively, as well as in patients infected with MRSA (93.4% vs 94.3%). The rates of adverse events, discontinuations because of an adverse event, serious adverse events, and death also were similar between treatment groups. CONCLUSIONS: Ceftaroline achieved high clinical cure rates, was efficacious against cSSSI caused by MRSA and other common cSSSI pathogens, and was well tolerated, with a safety profile consistent with the cephalosporin class. Ceftaroline has the potential to provide a monotherapy alternative for the treatment of cSSSI. TRIAL REGISTRATION: ClinicalTrials.gov identifiers: NCT00424190 for CANVAS 1 and NCT00423657 for CANVAS 2.
Abstract:
To investigate the neural systems that contribute to the formation of complex, self-relevant emotional memories, dedicated fans of rival college basketball teams watched a competitive game while undergoing functional magnetic resonance imaging (fMRI). During a subsequent recognition memory task, participants were shown video clips depicting plays of the game, stemming either from previously viewed game segments (targets) or from non-viewed portions of the same game (foils). After an old-new judgment, participants provided emotional valence and intensity ratings of the clips. A data-driven approach was first used to decompose the fMRI signal acquired during free viewing of the game into spatially independent components. Correlations were then calculated between the identified components and post-scanning emotion ratings for successfully encoded targets. Two components were correlated with intensity ratings, including temporal lobe regions implicated in memory and emotional functions, such as the hippocampus and amygdala, as well as a midline fronto-cingulo-parietal network implicated in social cognition and self-relevant processing. These data were supported by a general linear model analysis, which revealed additional valence effects in fronto-striatal-insular regions when plays were divided into positive and negative events according to the fan's perspective. Overall, these findings contribute to our understanding of how emotional factors impact distributed neural systems to successfully encode dynamic, personally relevant event sequences.
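Below is a toy sketch of the data-driven decomposition-and-correlation step described above, using scikit-learn's FastICA on synthetic data rather than the study's fMRI pipeline; the "ratings" regressor and all array shapes are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Synthetic "signal": 200 time points x 50 voxels, mixing two hidden sources
time_courses = np.column_stack([np.sin(np.linspace(0, 20, 200)),
                                rng.standard_normal(200)])
mixing = rng.standard_normal((2, 50))
data = time_courses @ mixing + 0.1 * rng.standard_normal((200, 50))

# Decompose into independent components (data-driven step)
ica = FastICA(n_components=2, random_state=0)
component_timecourses = ica.fit_transform(data)     # shape (200, 2)

# Correlate each component's time course with a hypothetical intensity-rating regressor
ratings = np.sin(np.linspace(0, 20, 200))
for i in range(component_timecourses.shape[1]):
    r = np.corrcoef(component_timecourses[:, i], ratings)[0, 1]
    print(f"component {i}: r = {r:.2f}")
```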
Abstract:
Mitchell et al. argue that divergence-time estimates for our avian phylogeny were too young because of an "inappropriate" maximum age constraint for the most recent common ancestor of modern birds and that, as a result, most modern bird orders diverged before the Cretaceous-Paleogene mass extinction event 66 million years ago instead of after. However, their interpretations of the fossil record and timetrees are incorrect.