415 results for Event Log Mining
Abstract:
Term-based approaches can extract many features from text documents, but most include noise. Many popular text-mining strategies have been adapted to reduce noisy information in the extracted features; however, text-mining techniques still suffer from the low-frequency problem, where useful specific terms appear too rarely to be weighted reliably. The key issue is how to discover relevant features in text documents that fulfil user information needs. To address this issue, we propose a new method to extract specific features from user relevance feedback. The proposed approach comprises two stages. The first stage extracts topics (or patterns) from text documents to focus on interesting topics. In the second stage, topics are deployed to lower-level terms to address the low-frequency problem and find specific terms. The specific terms are determined based on their appearances in relevance feedback and their distribution over topics or higher-level patterns. We test the proposed method with extensive experiments on the Reuters Corpus Volume 1 dataset and TREC topics. The results show that the proposed approach significantly outperforms state-of-the-art models.
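The two-stage idea in this abstract can be sketched in a few lines. The following is a toy illustration under stated assumptions - the document names, the choice of a document's term set as a "pattern", and the spread-based scoring rule are all illustrative, not the paper's actual algorithm:

```python
from collections import Counter

# Stage 1 below treats each relevant document's term set as one discovered
# "topic/pattern"; stage 2 scores individual terms by their spread across
# those patterns, so that specific terms are not penalized for low raw
# frequency. All data and the scoring rule are illustrative assumptions.

relevant_docs = [
    ["engine", "fuel", "turbine"],
    ["engine", "turbine", "blade"],
    ["fuel", "cost", "market"],
]

# Stage 1: each document's term set stands in for one extracted pattern.
patterns = [set(doc) for doc in relevant_docs]

# Stage 2: weight a term by the fraction of patterns it occurs in.
term_weight = Counter()
for pattern in patterns:
    for term in pattern:
        term_weight[term] += 1 / len(patterns)

ranked = sorted(term_weight.items(), key=lambda kv: -kv[1])
```

Here "engine" and "turbine" outrank "cost" because they recur across patterns, even though all have low absolute frequency - the intuition behind deploying topics to lower-level terms.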
Abstract:
An important aspect of decision support systems is applying sophisticated, flexible statistical models to real datasets and communicating the results to decision makers in interpretable ways. An important class of problem is the modelling of incidence, such as fire or disease. Models of incidence known as point processes or Cox processes are particularly challenging because they are ‘doubly stochastic’, i.e. obtaining the probability mass function of incident counts requires two integrals to be evaluated. Existing approaches either use simple models that obtain predictions from plug-in point estimates and do not distinguish between Cox processes and density estimation, but do use sophisticated 3D visualization for interpretation; or they employ sophisticated non-parametric Bayesian Cox process models but do not use visualization to render complex spatio-temporal forecasts interpretable. The contribution here is to fill this gap by inferring predictive distributions of log-Gaussian Cox processes and rendering them using state-of-the-art 3D visualization techniques. This requires performing inference on an approximation of the model over a large discretized grid and adapting an existing spatial-diurnal kernel to the log-Gaussian Cox process context.
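The ‘doubly stochastic’ structure and the grid approximation mentioned in this abstract can be written down compactly. The following is the standard textbook formulation of a log-Gaussian Cox process (generic symbols, not necessarily the paper's notation):

```latex
% Latent log-intensity is a Gaussian process:
f(s) \sim \mathcal{GP}\bigl(\mu(s),\, k(s, s')\bigr), \qquad \lambda(s) = \exp\bigl(f(s)\bigr)
% Counts in a region A are Poisson given the (random) intensity:
N(A) \mid \lambda \;\sim\; \mathrm{Poisson}\!\left(\int_A \lambda(s)\,\mathrm{d}s\right)
% Discretized-grid approximation over cells s_1,\dots,s_m of area \Delta:
\int_A \lambda(s)\,\mathrm{d}s \;\approx\; \Delta \sum_{i=1}^{m} \exp\bigl(f(s_i)\bigr)
```

The two integrals that make the model doubly stochastic are the intensity integral above and the expectation over the Gaussian process posterior; the grid approximation turns the first into a finite sum so inference becomes tractable at scale.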
Abstract:
To this day, standard-model realizations of (lossy) trapdoor functions from discrete-log-type assumptions require large public key sizes, e.g., about Θ(λ²) group elements for a reduction from the decisional Diffie-Hellman assumption (where λ is a security parameter). We propose two realizations of lossy trapdoor functions that achieve a public key size of only Θ(λ) group elements in bilinear groups, with a reduction from the decisional Bilinear Diffie-Hellman assumption. Our first construction achieves this result at the expense of a long common reference string of Θ(λ²) elements, albeit one reusable across multiple LTDF instantiations. Our second scheme also achieves public keys of size Θ(λ), entirely in the standard model and in particular without any reference string, at the cost of a slightly more involved construction. The main technical novelty, developed for the second scheme, is a compact encoding technique for generating compressed representations of certain sequences of group elements for the public parameters.
Abstract:
Sharing photos through mobile devices has great potential for creating shared experiences of social events between co-located as well as remote participants. In order to design novel event-sharing tools, we need to develop an in-depth understanding of current practices surrounding these so-called ‘event photos’ - photos about, and taken during, different social events such as weddings, picnics, and music concerts. We studied people’s practices related to event photos through in-depth interviews, guided home visits and naturalistic observations. Our results show four major themes describing practices surrounding event photos: 1) representing events, 2) significant moments, 3) situated activities through photos, and 4) collectivism and roles of participants.
Abstract:
The recent floods in south-east Queensland have focused policy, academic and community attention on the challenges associated with severe weather events (SWE), specifically pre-disaster preparation, disaster response and post-disaster community resilience. Financially, the cost of SWE was put at $9 billion in the 2011 Australian Federal Budget (Swan 2011); psychologically and emotionally, the impact on individual mental health and community wellbeing is also significant but more difficult to quantify. Recent estimates suggest that as many as one in five people affected will subsequently experience major emotional distress (Bonanno et al. 2010). With climate change predicted to increase the frequency and intensity of a wide range of SWE in Australia (Garnaut 2011; The Climate Institute 2011), there is an urgent and critical need to ensure that the unique psychological and social needs of more vulnerable community members - such as older residents - are better understood and integrated into disaster preparedness and response policy, planning and protocols. Navigating the complex dynamics of SWE can be particularly challenging for older adults, whose disaster experience is frequently magnified by a wide array of cumulative and interactive stressors that intertwine to make them uniquely vulnerable to significant short- and long-term adverse effects. This article provides a brief introduction to the current literature in this area and highlights a gap in the research relating to communication tools during and after severe weather events.
Abstract:
Event report on the Open Access and Research 2013 conference, which focused on recent developments and the strategic advantages they bring to the research sector.
Abstract:
Standard Monte Carlo (sMC) simulation models have been widely used in AEC industry research to address system uncertainties. Although the benefits of probabilistic simulation analyses over deterministic methods are well documented, the sMC simulation technique is quite sensitive to the probability distributions of the input variables. This phenomenon becomes highly pronounced when the region of interest within the joint probability distribution (a function of the input variables) is small. In such cases, the standard Monte Carlo approach is often impractical from a computational standpoint. In this paper, a comparative analysis of standard Monte Carlo simulation to Markov Chain Monte Carlo with subset simulation (MCMC/ss) is presented. The MCMC/ss technique constitutes a more complex simulation method (relative to sMC), wherein a structured sampling algorithm is employed in place of completely randomized sampling. Consequently, gains in computational efficiency can be made. The two simulation methods are compared via theoretical case studies.
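The contrast this abstract draws - plain Monte Carlo becoming impractical when the region of interest is small, versus a structured MCMC/subset-simulation sampler - can be illustrated on a textbook rare-event problem. The sketch below estimates P(X > b) for a standard normal X; the target, sample sizes and the simple random-walk Metropolis kernel are illustrative assumptions, not the paper's case studies:

```python
import math
import random

def subset_simulation(b, n=1000, p0=0.1, seed=1):
    """Estimate P(X > b), X ~ N(0, 1), via subset simulation.

    Plain sMC with n = 1000 samples would expect only ~1 exceedance when
    the true probability is ~1e-3, so its estimate is extremely noisy;
    subset simulation reaches the rare region through a chain of more
    probable intermediate events, each of conditional probability ~p0.
    """
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(n)]  # level 0: plain MC
    prob = 1.0
    for _ in range(20):  # cap on the number of levels
        xs.sort()
        k = int(n * (1 - p0))
        bi = xs[k]  # intermediate threshold: the (1 - p0) sample quantile
        if bi >= b:
            # final level: count samples beyond the actual threshold
            return prob * sum(1 for x in xs if x > b) / n
        prob *= p0
        seeds = xs[k:]            # the n*p0 samples that passed bi
        steps = n // len(seeds)   # chain length per seed
        xs = []
        for s in seeds:
            x = s
            for _ in range(steps):
                cand = x + rng.gauss(0.0, 1.0)
                # Metropolis step targeting N(0,1) restricted to {x > bi}
                if cand > bi and rng.random() < math.exp((x * x - cand * cand) / 2):
                    x = cand
                xs.append(x)
        # xs again holds n (correlated) samples, now conditional on X > bi
    return prob  # fallback: level cap reached

p_hat = subset_simulation(3.0)  # true value is about 1.35e-3
```

The structured sampler spends its budget inside nested, progressively rarer subsets, which is the source of the computational-efficiency gain the abstract refers to.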
Abstract:
The practices and public reputation of mining have been changing over time. In the past, mining operations frequently stood accused of being socially and environmentally disruptive, whereas mining today invests heavily in ‘socially responsible’ and ‘sustainable’ business practices. Changes such as these can be witnessed internationally as well as in places like Western Australia (WA), where the mining sector has matured into an economic pillar of the state, and indeed the nation in the context of the recent resources boom. This paper explores the role of mining in WA, presenting a multi-disciplinary perspective on the sector's contribution to sustainable development in the state. The perspectives offered here are drawn from community-based research and the associated academic literature as well as data derived from government sources and the not-for-profit sector. Findings suggest that despite noteworthy attitudinal and operational improvements in the industry, social, economic and environmental problem areas remain. As mining in WA is expected to grow in the years to come, these problem areas require the attention of business and government alike to ensure the long-term sustainability of development as well as people and place.
Abstract:
This paper presents an analysis of media reports of Australian women in mine management. It argues that a dominant storyline in the texts is one of gender change; in fact, a ‘feminine revolution’ is said to have occurred in the mining industry and corporate Australia more generally. Despite this celebratory and transformative discourse the female mine managers interviewed in the media texts seek to distance themselves from women/female identity/femininity and take up a script of gender neutrality. It is demonstrated, however, that this script is saturated with the assumptions and definitions of managerial masculinity.
Abstract:
This paper draws upon Hubbard's (1999, p. 57) term ‘scary heterosexualities’, that is, non-normative heterosexuality, in the context of the rural, drawing on data from fieldwork in the remote Western Australian mining town of Kalgoorlie. Our focus is ‘the skimpie’ – a female barmaid who serves in her underwear and who, in both historical and contemporary times, is strongly associated with rural mining communities. Interviews with skimpies and local residents, as well as participant observation, reveal how potential fears and anxieties about skimpies are managed. We identify the discursive and spatial processes by which skimpie work is contained in Kalgoorlie, so that the potential scariness ‘the skimpie’ represents to the rural is muted and buttressed in terms of a more conventional and less threatening rural heterosexuality.
Abstract:
Despite ongoing ‘boom’ conditions in the Australian mining industry, women remain substantially and unevenly under-represented in the sector, as is the case in other resource-dependent countries. Building on the literature critiquing business-case rationales and strategies as a means to achieve women’s equality in the workplace, we examine the business case for employing more women as advanced by the Australian mining industry. Specifically, we apply a discourse analysis to seven substantial, publicly available documents produced by the industry’s national and state peak organizations between 2005 and 2013. Our study makes two contributions. First, we map the features of the business case at the sectoral rather than firm or workplace level and examine its public mobilization. Second, we identify the construction and deployment of a normative identity – ‘the ideal mining woman’ – as a key outcome of this business-case discourse. Crucially, women are therein positioned as individually responsible for gender equality in the workplace.
Abstract:
Objective: While many jurisdictions internationally now require learner drivers to complete a specified number of hours of supervised driving practice before being able to drive unaccompanied, very few require learners to record this practice in a log book and present it to the licensing authority. Learner drivers in most Australian jurisdictions, however, must complete a log book recording their practice, thereby confirming to the licensing authority that they have met the mandated hours-of-practice requirement. These log books facilitate the management and enforcement of minimum supervised-driving requirements. Method: Parents of learner drivers in two Australian states, Queensland and New South Wales, completed an online survey assessing a range of factors, including their perceptions of the accuracy of their child’s learner log book and the effectiveness of the log book system. Results: The large majority of parents believe that their child’s learner log book is accurate. However, they generally rate the log book system as only moderately effective for measuring the number of hours of supervised practice a learner driver has completed. Conclusions: These results suggest a paradox: many parents may believe that others are not as diligent in their use of log books as they are, or that the system is too open to misuse. Given that many parents report their child’s log book as accurate, this study has important implications for the development and ongoing monitoring of hours-of-practice requirements in graduated driver licensing systems.
Abstract:
Automated process discovery techniques aim at extracting process models from information system logs. Existing techniques in this space are effective when applied to relatively small or regular logs, but generate spaghetti-like and sometimes inaccurate models when confronted with logs of high variability. In previous work, trace clustering has been applied in an attempt to reduce the size and complexity of automatically discovered process models. The idea is to split the log into clusters and to discover one model per cluster. This leads to a collection of process models – each one representing a variant of the business process – as opposed to an all-encompassing model. Still, models produced in this way may exhibit unacceptably high complexity and low fitness. In this setting, this paper presents a two-way divide-and-conquer process discovery technique, wherein the discovered process models are split on the one hand by variants and on the other hand hierarchically by means of subprocess extraction. Splitting is performed in a controlled manner in order to achieve user-defined complexity or fitness thresholds. Experiments on real-life logs show that the technique produces collections of models substantially smaller than those extracted by existing trace clustering techniques, while allowing the user to control the fitness of the resulting models.
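The "split the log, then discover one model per cluster" idea described in this abstract can be sketched minimally. In the toy version below, traces are clustered naively by their activity sets and a directly-follows relation stands in for a discovered process model - both simplifications are illustrative assumptions, not the paper's actual clustering or discovery algorithm:

```python
from collections import defaultdict

# An event log as a list of traces; each trace is a sequence of activities.
log = [
    ["a", "b", "c"],
    ["a", "c", "b"],
    ["a", "b", "c"],
    ["x", "y"],
    ["x", "y", "y"],
]

# Cluster traces - here, naively, by the set of activities they contain.
clusters = defaultdict(list)
for trace in log:
    clusters[frozenset(trace)].append(trace)

def directly_follows(traces):
    """'Discover' a model for one cluster as its directly-follows relation."""
    dfg = set()
    for t in traces:
        for a, b in zip(t, t[1:]):
            dfg.add((a, b))
    return dfg

# One model per cluster, instead of a single all-encompassing model.
models = {key: directly_follows(traces) for key, traces in clusters.items()}
```

Each cluster's model only has to explain one variant of the behaviour, which is why the per-cluster models stay smaller and cleaner than a single model mined from the whole log.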
Abstract:
Dose-finding designs estimate the dose level of a drug based on observed adverse events. Relatedness of the adverse event to the drug has been generally ignored in all proposed design methodologies. These designs assume that the adverse events observed during a trial are definitely related to the drug, which can lead to flawed dose-level estimation. We incorporate adverse event relatedness into the so-called continual reassessment method. Adverse events that have ‘doubtful’ or ‘possible’ relationships to the drug are modelled using a two-parameter logistic model with an additive probability mass. Adverse events ‘probably’ or ‘definitely’ related to the drug are modelled using a cumulative logistic model. To search for the maximum tolerated dose, we use the maximum estimated toxicity probability of these two adverse event relatedness categories. We conduct a simulation study that illustrates the characteristics of the design under various scenarios. This article demonstrates that adverse event relatedness is important for improved dose estimation. It opens up further research pathways into continual reassessment design methodologies.
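The core continual-reassessment mechanics this abstract builds on can be illustrated briefly: a logistic dose-toxicity model yields an estimated toxicity probability per dose, and the recommended dose is the one whose estimate is closest to the target rate. The parameter values and dose labels below are illustrative assumptions, and the paper's relatedness-specific models (additive probability mass, cumulative logistic) are not reproduced:

```python
import math

def logistic_tox(dose_label, intercept=-3.0, slope=1.5):
    """Two-parameter logistic model for P(toxicity | dose).

    In a real CRM trial, intercept/slope would be re-estimated from the
    accumulating adverse-event data after each cohort; here they are fixed
    illustrative values.
    """
    return 1.0 / (1.0 + math.exp(-(intercept + slope * dose_label)))

doses = [0.5, 1.0, 1.5, 2.0, 2.5]  # standardized dose labels (illustrative)
target = 0.25                       # target toxicity probability

# Estimated toxicity per dose; the recommended dose is the one closest
# to the target. The paper's method would take, per dose, the maximum of
# two such estimates (one per adverse-event relatedness category).
est = {d: logistic_tox(d) for d in doses}
recommended = min(doses, key=lambda d: abs(est[d] - target))
```

Treating all adverse events as drug-related inflates these estimates, which is the flawed-dose-estimation problem the relatedness categories are designed to correct.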
Abstract:
This study is a step towards improving the performance of discovering useful knowledge – in this case, association rules – in databases. The thesis proposed an approach that uses granules instead of patterns to represent knowledge implicitly contained in relational databases, and a multi-tier structure to interpret association rules in terms of granules. Association mappings were proposed for the construction of the multi-tier structure. With these tools, association rules can be assessed quickly, and meaningless association rules can be identified according to the association mappings. The experimental results indicated that the proposed approach is promising.
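The basic granule representation this abstract relies on can be sketched simply: rows with identical attribute values collapse into one granule with a count, and rule support/confidence can then be read off the granule counts instead of rescanning rows. The table, attributes and rule below are illustrative assumptions; the thesis's multi-tier structure and association mappings are not reproduced:

```python
from collections import Counter

# A tiny relational table: (age_group, income, buys). Duplicate rows are
# deliberate - granulation collapses them.
rows = [
    ("young", "high", "yes"),
    ("young", "high", "yes"),
    ("young", "low", "no"),
    ("old", "high", "no"),
]

granules = Counter(rows)  # each distinct row becomes a granule with a count
n = sum(granules.values())

# Support and confidence for the rule (age=young) -> (buys=yes),
# computed from granule counts alone, without touching the raw rows again.
ante = sum(c for g, c in granules.items() if g[0] == "young")
both = sum(c for g, c in granules.items() if g[0] == "young" and g[2] == "yes")
support, confidence = both / n, both / ante
```

Because the number of granules is bounded by the number of distinct value combinations rather than the number of rows, rule assessment over granules scales with the data's diversity, not its size.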