3 results for event tree analysis
in CORA - Cork Open Research Archive - University College Cork - Ireland
Abstract:
Background: We conducted a survival analysis of all confirmed cases of adult tuberculosis (TB) patients treated in Cork City, Ireland. The aim of this study was to estimate survival time (ST), including the median survival time, and to assess the association and impact of covariates (TB risk factors) on event status and ST. The outcome of the survival analysis is reported in this paper. Methods: We used a retrospective cohort study design to review data on 647 bacteriologically confirmed TB patients from the medical records of two teaching hospitals (mean age 49 years, range 18–112). We collected information on potential risk factors for all confirmed cases of TB treated between 2008 and 2012. For the survival analysis, the outcome of interest was ‘treatment failure’ or ‘death’ (whichever came first). A univariate descriptive analysis was conducted using a non-parametric procedure, the Kaplan-Meier (KM) method, to estimate overall survival (OS), while the Cox proportional hazards model was used for the multivariate analysis to determine possible associations of predictor variables and to obtain adjusted hazard ratios. The p value was set at <0.05 and the log likelihood ratio test at >0.10. Data were analysed using SPSS version 15.0. Results: There was no significant difference in the survival curves of male and female patients (log rank statistic = 0.194, df = 1, p = 0.66) or among age groups (log rank statistic = 1.337, df = 3, p = 0.72). The mean overall survival (OS) was 209 days (95% CI: 92–346) and the median was 51 days (95% CI: 35.7–66). The mean ST for women was 385 days (95% CI: 76.6–694) and for men 69 days (95% CI: 48.8–88.5). Multivariate Cox regression showed that patients with a history of drug misuse had 2.2 times the hazard of those without. Smokers and alcohol drinkers had a hazard ratio of 1.8, patients born in a country of high endemicity (BICHE) had a hazard ratio of 6.3, and HIV co-infection had a hazard ratio of 1.2.
Conclusion: There was no significant difference in the survival curves between male and female patients or among age groups. Women had a longer ST than men, but men had a higher hazard rate than women. Anti-TNF medication, immunosuppressive medication and diabetes were found to be associated with longer ST, while alcohol, smoking, RICHE and BICHE were associated with shorter ST.
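The abstract's central tool, the Kaplan-Meier product-limit estimator, can be sketched in a few lines. This is a minimal illustration on a toy cohort, not the study's data or SPSS procedure; the function name and the sample times are invented for the example.

```python
from typing import List, Tuple

def kaplan_meier(data: List[Tuple[float, bool]]) -> List[Tuple[float, float]]:
    """Kaplan-Meier product-limit estimate of the survival function S(t).

    `data` holds (time, event) pairs; event=True means the endpoint
    (e.g. death or treatment failure) occurred, False means censored.
    Returns (time, S(t)) points at each distinct event time.
    """
    data = sorted(data)
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = 0
        removed = 0
        # Group all subjects whose time equals t (events and censorings).
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]  # True counts as 1
            removed += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / n_at_risk  # multiply in the step factor
            curve.append((t, surv))
        n_at_risk -= removed
    return curve

# Toy cohort: follow-up times in days, True = event observed, False = censored.
toy = [(5, True), (8, False), (12, True), (12, True), (20, False), (23, True)]
print(kaplan_meier(toy))
```

The median survival time reported in the abstract is simply the first time at which this estimated curve drops to 0.5 or below.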
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2⁷⁰ bytes) and this figure is expected to have grown by a factor of 10 to 44 zettabytes by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising the system described above have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. This raises the question: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?
The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near real-time capturing and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, and thus the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
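The production-interpretation-consumption workflow with a maintainable provenance record can be sketched as a small pipeline that hashes data before and after each step, so a third party can verify how each result was derived. This is a hypothetical illustration, not the dissertation's actual platform; all class and function names are invented.

```python
import hashlib
import json
import time
from dataclasses import dataclass, field
from typing import Any, Callable, List

def _digest(obj) -> str:
    """Stable short fingerprint of any JSON-serialisable value."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()[:12]

@dataclass
class ProvenanceStep:
    name: str           # which analysis step ran
    input_digest: str   # fingerprint of the data it consumed
    output_digest: str  # fingerprint of the data it produced
    timestamp: float

@dataclass
class Workflow:
    """Runs analysis steps over raw data, keeping a provenance log so
    derived conclusions can be independently re-verified."""
    steps: List[Callable[[Any], Any]] = field(default_factory=list)
    log: List[ProvenanceStep] = field(default_factory=list)

    def add(self, fn: Callable[[Any], Any]) -> "Workflow":
        self.steps.append(fn)
        return self

    def run(self, data: Any) -> Any:
        for fn in self.steps:
            before = _digest(data)
            data = fn(data)
            self.log.append(ProvenanceStep(fn.__name__, before, _digest(data), time.time()))
        return data

# Hypothetical analyses over a raw sample stream.
def clean(xs): return [x for x in xs if x is not None]
def summarise(xs): return {"n": len(xs), "mean": sum(xs) / len(xs)}

wf = Workflow().add(clean).add(summarise)
result = wf.run([1.0, None, 2.0, 3.0])
print(result)  # {'n': 3, 'mean': 2.0}
for step in wf.log:
    print(step.name, step.input_digest, "->", step.output_digest)
```

Because each step is an opaque callable, different analysis techniques can be swapped in over the same raw data while the provenance log records exactly what was applied, in what order, to what input.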
Abstract:
Twitter has changed the dynamic of the academic conference. Before Twitter, delegate participation was primarily dependent on attendance and feedback was limited to a post-event survey. With Twitter, delegates have become active participants. They pass comment, share reactions and critique presentations, all the while generating a running commentary. This study examines this phenomenon using the Academic & Special Libraries (A&SL) conference 2015 (hashtag #asl2015) as a case study. A post-conference survey was undertaken asking delegates how and why they used Twitter at #asl2015. A content and conceptual analysis of tweets was conducted using Topsy and Storify. This analysis examined how delegates interacted with presentations, which sessions generated the most activity on the timeline and the type of content shared. Actual tweet activity and volume per presentation were compared to survey responses. Finally, recommendations on Twitter engagement for conference organisers and presenters are provided.
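The per-presentation tweet tally underlying this kind of analysis is a simple frequency count. The sketch below assumes tweets have already been harvested (e.g. via Topsy, as in the study) into records with a session label and a retweet flag; the field names and sample data are invented for illustration.

```python
from collections import Counter

# Hypothetical tweet records harvested for the conference hashtag.
tweets = [
    {"user": "a", "session": "Keynote", "retweet": False},
    {"user": "b", "session": "Keynote", "retweet": True},
    {"user": "c", "session": "Open Access panel", "retweet": False},
    {"user": "a", "session": "Keynote", "retweet": False},
]

# Total timeline activity per session, and original (non-retweet) activity.
per_session = Counter(t["session"] for t in tweets)
original = Counter(t["session"] for t in tweets if not t["retweet"])

print(per_session.most_common())  # sessions ranked by timeline activity
print(original.most_common())
```

Ranking sessions by `most_common()` gives exactly the "which sessions generated the most activity" view that the study compares against the survey responses.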