Abstract:
One of the fastest growing industries – aviation – faces serious and compounding challenges in maintaining healthy relationships with community stakeholders. One area in aviation creating community conflict is noise pollution. However, the factors that affect a community's noise annoyance are currently poorly conceptualized. More importantly, the way community needs and expectations could be incorporated into airport governance has been inadequately framed to address the issue of aircraft noise. This paper proposes the utility of adopting an integrated strategic asset management (ISAM) framework [1] to explore the dynamic nature of the relationships between an airport and its surrounding area. The case of the Gold Coast Airport (OOL) operator and community stakeholders is used. The paper begins with an overview of the ISAM framework in the context of airport governance and sustainable development – as a way to find a balance between economic opportunities and societal concerns through stakeholder engagement. Next, an exploratory case study is adopted as a method to explore noise-related complaints, complainants, and possible causes. Following this, the paper reviews three approaches to community stakeholder engagement in Australia, Japan, and the UK and discusses their implications in the context of OOL. The paper concludes with the contention that airport governance is likely to be much more effective with the adoption of the ISAM framework than without it.
Abstract:
Process-Aware Information Systems (PAISs) support executions of operational processes that involve people, resources, and software applications on the basis of process models. Process models describe vast, often infinite, amounts of process instances, i.e., workflows supported by the systems. With the increasing adoption of PAISs, large process model repositories emerged in companies and public organizations. These repositories constitute significant information resources. Accurate and efficient retrieval of process models and/or process instances from such repositories is interesting for multiple reasons, e.g., searching for similar models/instances, filtering, reuse, standardization, process compliance checking, verification of formal properties, etc. This paper proposes a technique for indexing process models that relies on their alternative representations, called untanglings. We show the use of untanglings for retrieval of process models based on process instances that they specify via a solution to the total executability problem. Experiments with industrial process models testify that the proposed retrieval approach is up to three orders of magnitude faster than the state of the art.
Abstract:
Background We sought to determine whether or not there are differences in disease progression after radical or nonradical (debulking) surgical procedures for malignant pleural mesothelioma. Methods Over a 49-month period, 132 patients with malignant pleural mesothelioma underwent surgery. Fifty-three underwent extrapleural pneumonectomy and 79 underwent nonradical procedures. Time to evidence of clinical disease progression was recorded, as was the site(s) of that disease. Results One hundred nineteen patients were evaluable, of whom 59% (22 radical; 48 nonradical) had disease progression. Overall 30-day mortality was 8.5% (7.5% radical; 9% nonradical). The median time to overall disease progression was considerably longer after extrapleural pneumonectomy than debulking surgery (319 days vs 197 days, p = 0.019), as was the time to local disease progression (631 days vs 218 days, p = 0.0018). There was no preponderance of earlier stage disease in the radical surgery group. There was a trend toward prolonged survival in those undergoing radical surgery, but no significant difference between the groups (497 days vs 324 days, p = 0.079). In those who had extrapleural pneumonectomy, time-to-disease progression significantly decreased with N2 disease compared with N0/1 involvement (197 days vs 358 days, p = 0.02). Conclusions Extrapleural pneumonectomy may be preferable to debulking surgery in malignant pleural mesothelioma to delay disease progression and give greater control of local disease. Involvement of N2 nodes is associated with accelerated disease progression and is therefore a contraindication to extrapleural pneumonectomy. © 2004 by The Society of Thoracic Surgeons.
Abstract:
Osteocytes are the mature cells within bone and act as its mechanosensors. The mechanical properties of osteocytes play an important role in fulfilling these functions. However, little research has been done to investigate the mechanical deformation properties of single osteocytes. Atomic Force Microscopy (AFM) is a state-of-the-art experimental technique for high-resolution imaging of tissues, cells and other surfaces, as well as for probing the mechanical properties of samples both qualitatively and quantitatively. In this paper, an AFM-based experimental study is first used to obtain force-indentation curves of single round osteocytes. A porohyperelastic (PHE) model of a single osteocyte is then developed using inverse finite element analysis (FEA) to identify and extract mechanical properties from the experimental results. It was found that the PHE model is a good candidate for biomechanics studies of osteocytes.
Abstract:
Non-linear finite deformations of articular cartilage under physiological loading conditions can be attributed to hyperelastic behavior. This paper presents experimental results of indentation tests in finite deformation and proposes a new empirically based generalized hyperelastic constitutive model that accounts for strain-rate dependency in humeral head cartilage tissues. The generalized model is based on existing hyperelastic constitutive relationships that are extensively used to represent biological tissues in the biomechanical literature. The experimental results were obtained for three loading velocities, corresponding to low (1×10⁻³ s⁻¹), moderate, and high (1×10⁻¹ s⁻¹) strain-rates, which represent physiological loading rates experienced in daily activities such as lifting, holding objects and sporting activities. Hyperelastic material parameters were identified by a nonlinear curve fitting procedure. Analysis demonstrated that the material behavior of cartilage can be effectively decoupled into strain-rate independent (elastic) and dependent parts. Further, experiments conducted using different indenters indicated that the parameters obtained are significantly affected by indenter size, potentially due to structural inhomogeneity of the tissue. The hyperelastic constitutive model developed in this paper opens a new avenue for the exploration of the material properties of cartilage tissues.
Abstract:
Association rule mining is a technique widely used when querying databases, especially transactional ones, in order to obtain useful associations or correlations among sets of items. Much work has been done focusing on efficiency, effectiveness and redundancy. There has also been a focus on the quality of rules from single-level datasets, with many interestingness measures proposed. However, with multi-level datasets now being common, there is a lack of interestingness measures developed for multi-level and cross-level rules. Single-level measures do not take into account the hierarchy found in a multi-level dataset. This leaves the Support-Confidence approach, which does not consider the hierarchy anyway and has other drawbacks, as one of the few measures available. In this chapter we propose two approaches which measure multi-level association rules to help evaluate their interestingness by considering the database's underlying taxonomy. These measures of diversity and peculiarity can be used to help identify those rules from multi-level datasets that are potentially useful.
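As a toy illustration of the Support-Confidence baseline discussed in this abstract, and of how a taxonomy-aware measure might behave differently, the following sketch computes support, confidence, and a crude diversity-style score for a rule. The transactions, item names, and two-level taxonomy are all invented for illustration; this is not the chapter's actual diversity or peculiarity measure.

```python
transactions = [
    {"milk", "bread"},
    {"milk", "bread", "butter"},
    {"bread"},
    {"milk", "butter"},
]

# Hypothetical two-level taxonomy: item -> top-level category
taxonomy = {"milk": "dairy", "butter": "dairy", "bread": "bakery"}

def support(itemset):
    """Fraction of transactions containing every item in itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """Estimated P(consequent | antecedent) over the transactions."""
    return support(antecedent | consequent) / support(antecedent)

def diversity(rule_items):
    """Crude diversity proxy: fraction of item pairs in the rule whose
    items come from different top-level taxonomy categories."""
    pairs = [(a, b) for i, a in enumerate(rule_items)
             for b in rule_items[i + 1:]]
    if not pairs:
        return 0.0
    return sum(taxonomy[a] != taxonomy[b] for a, b in pairs) / len(pairs)

# Rule {milk} -> {bread}
s = support({"milk", "bread"})
c = confidence({"milk"}, {"bread"})
d = diversity(["milk", "bread"])
```

A support-confidence ranking would treat {milk} → {bread} and {milk} → {butter} purely by their counts; a taxonomy-aware score like the one above additionally rewards the first rule for crossing category boundaries.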
Abstract:
Literacy in dance involves conscious awareness of cognitive, aesthetic and physical activity along with the skills to articulate these activities as required in any given context. Dance literacy, perhaps uniquely, also entails unconscious, tacit, embodied knowledge within the holistic body, a corporeality: knowledge which is physically experienced but only articulated in the dance. The essence of this corporeality has a transcendent quality which contributes to the universality of dance. The degrees to which a dancer’s awareness is refined, the physical activity articulated and the embodied knowledge universal, will define the level of development of the dancer’s literacy. This literacy can be learned, though not every body and mind has equal capacity for development. If we wish to develop dance literacy, qualitatively encompassing more than dance technique, the art of learning must be carefully cultivated to allow the art of dance to flourish. The pathways of learning dance are individuated; transcendence is realised through the common experience that what we are learning is coming from within.
Abstract:
Guaranteeing the quality of extracted features that describe relevant knowledge to users or topics is a challenge because of the large number of extracted features. Most popular term-based feature selection methods suffer from extracting noisy features that are irrelevant to the user's needs. One popular alternative is to extract phrases or n-grams to describe the relevant knowledge. However, extracted n-grams and phrases usually contain a lot of noise. This paper proposes a method for reducing the noise in n-grams. The method first extracts more specific features (terms) to remove noisy features. It then uses an extended random set to accurately weight n-grams based on their distribution in the documents and the distribution of their constituent terms within the n-grams. The proposed approach not only reduces the number of extracted n-grams but also improves performance. Experimental results on the Reuters Corpus Volume 1 (RCV1) data collection and TREC topics show that the proposed method significantly outperforms state-of-the-art methods underpinned by Okapi BM25, tf*idf and Rocchio.
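For context on the baselines mentioned at the end of this abstract, a minimal version of tf*idf weighting applied to n-gram features can be sketched as follows. The document collection and n-grams are invented, and this is only the comparison baseline, not the paper's extended-random-set weighting.

```python
import math
from collections import Counter

# Toy collection: each document is a bag of unigram and bigram features
docs = [
    ["stock", "market", "stock market", "index"],
    ["stock", "price", "stock price"],
    ["weather", "forecast", "weather forecast"],
]

def tf_idf(term, doc):
    """Standard tf*idf: raw term frequency in the document times the
    log inverse document frequency over the collection."""
    tf = Counter(doc)[term]
    df = sum(term in d for d in docs)
    return tf * math.log(len(docs) / df)

w_bigram = tf_idf("stock market", docs[0])   # rare bigram, high idf
w_unigram = tf_idf("stock", docs[0])         # common unigram, lower idf
```

The point of the comparison is that plain tf*idf scores an n-gram independently of the terms it contains, whereas the paper's weighting also exploits how those terms are distributed across the extracted n-grams.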
Abstract:
Business processes depend on human resources and managers must regularly evaluate the performance of their employees based on a number of measures, some of which are subjective in nature. As modern organisations use information systems to automate their business processes and record information about processes’ executions in event logs, it now becomes possible to get objective information about resource behaviours by analysing data recorded in event logs. We present an extensible framework for extracting knowledge from event logs about the behaviour of a human resource and for analysing the dynamics of this behaviour over time. The framework is fully automated and implements a predefined set of behavioural indicators for human resources. It also provides a means for organisations to define their own behavioural indicators, using the conventional Structured Query Language, and a means to analyse the dynamics of these indicators. The framework's applicability is demonstrated using an event log from a German bank.
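Since the framework described above lets organisations define behavioural indicators in conventional SQL over an event log, one such indicator might count activities completed per resource per month. The schema, table name, and data below are invented for illustration; they are not taken from the paper or the German bank log.

```python
import sqlite3

# Hypothetical minimal event-log schema
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE event_log (case_id TEXT, activity TEXT, "
    "resource TEXT, ts TEXT)"
)
conn.executemany("INSERT INTO event_log VALUES (?, ?, ?, ?)", [
    ("c1", "A", "alice", "2016-01-05"),
    ("c1", "B", "bob",   "2016-01-06"),
    ("c2", "A", "alice", "2016-02-03"),
    ("c2", "B", "alice", "2016-02-04"),
])

# Behavioural indicator: activities completed per resource per month,
# expressed as an ordinary SQL query over the event log
rows = conn.execute("""
    SELECT resource, substr(ts, 1, 7) AS month, COUNT(*) AS n
    FROM event_log
    GROUP BY resource, month
    ORDER BY resource, month
""").fetchall()
```

Evaluating the same query over successive months yields a time series per resource, which is the kind of dynamics analysis the framework automates.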
Abstract:
This book examines the interface between religion, charity law and human rights. It does so by treating the Church of England and its current circumstances as a timely case study providing an opportunity to examine the tensions that have now become such a characteristic feature of that interface. Firstly, it suggests that the Church is the primary source of canon law principles that have played a formative role in shaping civic morality throughout the common law jurisdictions: the history of their emergence and enforcement by the State in post-Reformation England is recorded and assessed. Secondly, it reveals that of such principles those of greatest weight were associated with matters of sexuality: in particular, for centuries, family law was formulated and applied with regard for the sanctity of the heterosexual marital family which provided the only legally permissible context for any form of sexual relationship. Thirdly, given that history, it identifies and assesses the particular implications that now arise for the Church as a consequence of recent charity law reform outcomes and human rights case law developments: a comparative analysis of religion related case law is provided. Finally, following an outline of the structure and organizational functions of the Church, a detailed analysis is undertaken of its success in engaging with these issues in the context of the Lambeth Conferences, the wider Anglican Communion and in the ill-fated Covenant initiative. From the perspective of the dilemmas currently challenging the moral authority of the Church of England, this book identifies and explores the contemporary ‘moral imperatives’ or red line issues that now threaten the coherence of Christian religions in most leading common law nations. Gay marriage and abortion are among the host of morally charged and deeply divisive topics demanding a reasoned response and leadership from religious bodies. 
Attention is given to the judicial interpretation and evaluation of these and other issues that now undermine the traditional role of the Church of England. As the interface between religion, charity law and human rights becomes steadily more fractious, with religious fundamentalism and discrimination acquiring a higher profile, there is now a pressing need for a more balanced relationship between those with and those without religious beliefs. This book will be an invaluable aid in starting the process of achieving a triangulated relationship between the principles of canon law, charity law and human rights law.
Abstract:
Computer vision is increasingly becoming interested in the rapid estimation of object detectors. The canonical strategy of using Hard Negative Mining to train a Support Vector Machine is slow, since the large negative set must be traversed at least once per detector. Recent work has demonstrated that, with an assumption of signal stationarity, Linear Discriminant Analysis is able to learn comparable detectors without ever revisiting the negative set. Even with this insight, the time to learn a detector can still be on the order of minutes. Correlation filters, on the other hand, can produce a detector in under a second. However, this involves the unnatural assumption that the statistics are periodic, and requires the negative set to be re-sampled per detector size. These two methods differ chiefly in the structure which they impose on the covariance matrix of all examples. This paper is a comparative study which develops techniques (i) to assume periodic statistics without needing to revisit the negative set and (ii) to accelerate the estimation of detectors with aperiodic statistics. It is experimentally verified that periodicity is detrimental.
Abstract:
Proxy re-encryption (PRE) is a highly useful cryptographic primitive whereby Alice and Bob can endow a proxy with the capacity to change ciphertext recipients from Alice to Bob, without the proxy itself being able to decrypt, thereby providing delegation of decryption authority. Key-private PRE (KP-PRE) specifies an additional level of confidentiality, requiring pseudo-random proxy keys that leak no information on the identity of the delegators and delegatees. In this paper, we propose a CPA-secure KP-PRE scheme in the standard model (which we then transform into a CCA-secure scheme in the random oracle model). Both schemes enjoy highly desirable properties such as uni-directionality and multi-hop delegation. Unlike (the few) prior constructions of PRE and KP-PRE that typically rely on bilinear maps under ad hoc assumptions, security of our construction is based on the hardness of the standard Learning-With-Errors (LWE) problem, itself reducible from worst-case lattice hard problems that are conjectured immune to quantum cryptanalysis, or “post-quantum”. Of independent interest, we further examine the practical hardness of the LWE assumption, using Kannan’s exhaustive search algorithm coupled with pruning techniques. This leads to state-of-the-art parameters not only for our scheme, but also for a number of other primitives based on LWE published in the literature.
Abstract:
This work is motivated by the desire to covertly track mobile targets, either animal or human, in previously unmapped outdoor natural environments using off-road robotic platforms with a non-negligible acoustic signature. The use of robots for stealthy surveillance is not new. Many studies exist, but they only consider the navigation problem of maintaining visual covertness. However, robotic systems also have a significant acoustic footprint from the onboard sensors, motors, computers and cooling systems, and also from the wheels interacting with the terrain during motion. All of these can jeopardise any visual covertness. In this work, we experimentally explore the concepts of opportunistically utilizing naturally occurring sounds within outdoor environments to mask the motion of a robot, and of remaining visually covert whilst maintaining constant observation of the target. Our experiments in a constrained outdoor built environment demonstrate the effectiveness of the concept by showing a reduced acoustic signature as perceived by a mobile target, allowing the robot to covertly navigate to opportunistic vantage points for observation.
Abstract:
In this paper we present a unified sequential Monte Carlo (SMC) framework for performing sequential experimental design for discriminating between a set of models. The model discrimination utility that we advocate is fully Bayesian and based upon the mutual information. SMC provides a convenient way to estimate the mutual information. Our experience suggests that the approach works well on either discrete or continuous sets of models and outperforms other model discrimination approaches.
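To make the mutual-information utility concrete, the sketch below evaluates it exactly for a toy binary-outcome design problem with two candidate models, and picks the design that best discriminates between them. The two logistic dose-response models, the prior, and the design grid are all invented for illustration; the paper's SMC machinery for estimating this quantity in harder settings is not reproduced here.

```python
import math

def p_success(model, d):
    """Hypothetical success probabilities under two candidate models
    at design (dose) d."""
    if model == 0:
        return 1 / (1 + math.exp(-(d - 2)))
    return 1 / (1 + math.exp(-2 * (d - 3)))

prior = [0.5, 0.5]  # prior model probabilities

def mutual_information(d):
    """I(M; Y) at design d for a binary outcome Y, computed exactly:
    sum over models m and outcomes y of p(m) p(y|m,d) log[p(y|m,d)/p(y|d)]."""
    p_y1 = sum(prior[m] * p_success(m, d) for m in range(2))  # marginal
    mi = 0.0
    for m in range(2):
        for y_prob, marg in ((p_success(m, d), p_y1),
                             (1 - p_success(m, d), 1 - p_y1)):
            if y_prob > 0:
                mi += prior[m] * y_prob * math.log(y_prob / marg)
    return mi

# Choose the design on a grid that maximises the discrimination utility
grid = [x * 0.5 for x in range(13)]  # doses 0.0 .. 6.0
best_mi, best_d = max((mutual_information(d), d) for d in grid)
```

Here the utility can be computed in closed form because the outcome is binary; with continuous outcomes or intractable likelihoods, the SMC approximation advocated in the paper replaces this exact sum.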