161 results for AD-HOC NETWORKS
Abstract:
A crucial task in contractor prequalification is to establish a set of decision criteria through which the capabilities of contractors are measured and judged. However, in the UK, there are no nationwide standards or guidelines governing the selection of decision criteria for contractor prequalification. The decision criteria are usually established by individual clients on an ad hoc basis. This paper investigates the divergence of decision criteria used by different client and consultant organisations in contractor prequalification through a large empirical survey conducted in the UK. The results indicate that there are significant differences in the selection and use of decision criteria for prequalification.
Abstract:
As the systematic investigation of Twitter as a communications platform continues, the question of developing reliable comparative metrics for the evaluation of public, communicative phenomena on Twitter becomes paramount. What is necessary here is the establishment of an accepted standard for the quantitative description of user activities on Twitter. Such a standard needs to be flexible enough to apply to a wide range of communicative situations, such as the evaluation of individual users’ and groups of users’ Twitter communication strategies, the examination of communicative patterns within hashtags and other identifiable ad hoc publics on Twitter (Bruns & Burgess, 2011), and even the analysis of very large datasets of everyday interactions on the platform. A framework for the quantitative analysis of Twitter communication would enable researchers in different areas (e.g., communication studies, sociology, information systems) to adapt its methodological approaches and conduct analyses of their own. Besides yielding general findings about communication structures on Twitter, large amounts of data might be used to better understand issues or events retrospectively, to detect issues or events at an early stage, or even to predict certain real-world developments (e.g., election results; cf. Tumasjan, Sprenger, Sandner, & Welpe, 2010, for an early attempt to do so).
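As an illustration of the kind of comparative metric such a standard might define, the following minimal Python sketch computes per-user activity shares from a list of tweets. The tweet fields and metric names are assumptions for illustration only, not the metrics defined in the paper.

```python
from collections import Counter

# Illustrative tweet records; field names are assumed, not from the paper.
tweets = [
    {"user": "alice", "text": "@bob interesting point", "is_retweet": False},
    {"user": "bob", "text": "RT @alice ...", "is_retweet": True},
    {"user": "alice", "text": "new post on #adhoc publics", "is_retweet": False},
]

def activity_metrics(tweets):
    total = Counter(t["user"] for t in tweets)
    retweets = Counter(t["user"] for t in tweets if t["is_retweet"])
    replies = Counter(t["user"] for t in tweets if t["text"].startswith("@"))
    # Shares of each tweet type per user, comparable across users and datasets.
    return {
        u: {
            "tweets": n,
            "retweet_share": retweets[u] / n,
            "reply_share": replies[u] / n,
        }
        for u, n in total.items()
    }

print(activity_metrics(tweets))
```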
Abstract:
Over the past decade, social media have gone through a process of legitimation and official adoption, and they are now becoming embedded as part of the official communications apparatus of many commercial and public-sector organisations, in turn providing platforms like Twitter with their own sources of legitimacy. Arguably, the demonstrated utility of social media platforms and tools in times of crisis, from civil unrest and violent crime through to natural disasters like bushfires, earthquakes, and floods, has been a crucial driver of this newfound legitimacy. In the mid-2000s, user-created content and ‘Web 2.0’ platforms were already known to play a role in crisis communication; back then, the involvement of extra-institutional actors in providing and sharing information around such events involved distributed, ad hoc, or niche platforms (like Flickr), and was more likely to be framed as ‘citizen journalism’ or ‘crowdsourcing’ (see, for example, Liu, Palen, Sutton, Hughes, & Vieweg, 2008, on the then-emerging role of photo-sharing in disasters). Since then, the dramatically increased take-up of mainstream social media platforms like Facebook and Twitter means that the pool of potential participants in online crisis communication has broadened to include a much larger proportion of the general population, as well as traditional media and official emergency response organisations.
Abstract:
Australia lacks a satisfactory, national paradigm for assessing legal capacity in the context of testamentary, enduring power of attorney and advance care directive documents. Capacity assessments are currently conducted on an ad hoc basis by legal and/or medical professionals, and the reliability of the assessment process depends on the skill set and mutual understanding of the legal and/or medical professional conducting it. The prevalence of diseases such as dementia, which impair cognition, is growing; this increasingly necessitates collaboration between the legal and medical professions when assessing the effect of mentally disabling conditions upon legal capacity. Miscommunication and lack of understanding between the legal and medical professionals involved could impede the development of a satisfactory paradigm. This article discusses legal capacity assessment in Australia and how to strengthen the relationship between the legal and medical professionals involved in capacity assessments. The development of a national paradigm would promote consistency and transparency of process, helping to improve the professional relationship and to uphold the principles of autonomy, participation and dignity.
Abstract:
Proxy re-encryption (PRE) is a highly useful cryptographic primitive whereby Alice and Bob can endow a proxy with the capacity to change ciphertext recipients from Alice to Bob, without the proxy itself being able to decrypt, thereby providing delegation of decryption authority. Key-private PRE (KP-PRE) specifies an additional level of confidentiality, requiring pseudo-random proxy keys that leak no information on the identity of the delegators and delegatees. In this paper, we propose a CPA-secure KP-PRE scheme in the standard model (which we then transform into a CCA-secure scheme in the random oracle model). Both schemes enjoy highly desirable properties such as uni-directionality and multi-hop delegation. Unlike the few prior constructions of PRE and KP-PRE, which typically rely on bilinear maps under ad hoc assumptions, the security of our construction is based on the hardness of the standard Learning-With-Errors (LWE) problem, itself reducible from worst-case lattice problems that are conjectured immune to quantum cryptanalysis, or “post-quantum”. Of independent interest, we further examine the practical hardness of the LWE assumption, using Kannan’s exhaustive search algorithm coupled with pruning techniques. This leads to state-of-the-art parameters not only for our scheme, but also for a number of other LWE-based primitives published in the literature.
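For background, the following toy Python sketch illustrates Regev-style encryption from the LWE assumption, the hardness foundation named above. It is plain LWE bit encryption, not the proposed PRE scheme, and the parameters are illustrative assumptions that are far too small for real security.

```python
import numpy as np

# Toy Regev-style LWE encryption (illustrative parameters only; real schemes,
# including lattice-based PRE, require much larger, carefully chosen values).
rng = np.random.default_rng(0)
n, m, q = 16, 128, 4093  # dimension, number of samples, modulus (assumed)

# Key generation: secret s; public key (A, b = A s + e mod q) with small noise e.
s = rng.integers(0, q, n)
A = rng.integers(0, q, (m, n))
e = rng.integers(-2, 3, m)
b = (A @ s + e) % q

def encrypt(bit):
    r = rng.integers(0, 2, m)          # random 0/1 combination of LWE samples
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q   # encode the bit at 0 or q/2
    return u, v

def decrypt(u, v):
    # v - <u, s> = r.e + bit*(q/2): near 0 for bit 0, near q/2 for bit 1.
    d = (v - u @ s) % q
    return int(min(d, q - d) > q // 4)

assert all(decrypt(*encrypt(bit)) == bit for bit in (0, 1, 1, 0))
```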
Abstract:
The operation of Autonomous Underwater Vehicles (AUVs) within underwater sensor network fields provides an opportunity to reuse the network infrastructure for long baseline localisation of the AUV. Computationally efficient localisation can be accomplished using off-the-shelf hardware that is comparatively inexpensive and may already be deployed in the environment for monitoring purposes. This paper describes the development of a particle-filter-based localisation system implemented onboard an AUV in real time, using ranging information obtained from an ad hoc underwater sensor network. An experimental demonstration of this approach was conducted in a lake, with results illustrating network communication and localisation performance.
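A minimal Python sketch of particle-filter localisation from acoustic ranges to fixed sensor nodes, in the spirit of the approach described above. The anchor positions, noise levels, motion jitter, and resampling details are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
nodes = np.array([[0.0, 0.0], [100.0, 0.0], [50.0, 80.0]])  # anchor positions (m), assumed
true_pos = np.array([40.0, 30.0])
sigma_r = 2.0                     # assumed acoustic ranging noise std dev (m)

N = 2000
particles = rng.uniform([0, 0], [100, 80], (N, 2))  # uniform prior over the field
weights = np.full(N, 1.0 / N)

for node in nodes:
    z = np.linalg.norm(true_pos - node) + rng.normal(0, sigma_r)  # measured range
    pred = np.linalg.norm(particles - node, axis=1)               # predicted ranges
    weights *= np.exp(-0.5 * ((z - pred) / sigma_r) ** 2)         # Gaussian likelihood
    weights /= weights.sum()
    # Systematic resampling to avoid weight degeneracy.
    u = (rng.random() + np.arange(N)) / N
    idx = np.minimum(np.searchsorted(np.cumsum(weights), u), N - 1)
    particles = particles[idx] + rng.normal(0, 0.5, (N, 2))       # small roughening jitter
    weights = np.full(N, 1.0 / N)

print("estimate:", particles.mean(axis=0), "truth:", true_pos)
```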
Abstract:
The use of the Sengstaken–Blakemore tube as a life-saving treatment for bleeding oesophageal varices is slowly becoming the least preferred method, possibly due to the potential complications associated with its placement. Nursing practice pertaining to the care of this patient group appears ad hoc and reliant on local knowledge and experience rather than recognised evidence of best practice. This paper therefore focuses on the application of Lewin's transitional change theory to introduce a change in nursing practice: a guideline to enhance the care of patients with a Sengstaken–Blakemore tube in situ within a general intensive care unit. This approach identified some of the complexities surrounding the change process, including the driving forces that must be harnessed and the restraining forces that must be minimised if the adoption of change is to be successful.
Abstract:
We revisit the venerable question of access credentials management, which concerns the techniques that we, humans with limited memory, must employ to safeguard our various access keys and tokens in a connected world. Although many existing solutions can be employed to protect a long secret using a short password, those solutions typically require certain assumptions on the distribution of the secret and/or the password, and are helpful against only a subset of the possible attackers. After briefly reviewing a variety of approaches, we propose a user-centric comprehensive model to capture the possible threats posed by online and offline attackers, from the outside and the inside, against the security of both the plaintext and the password. We then propose a few very simple protocols, adapted in particular from the Ford-Kaliski server-assisted password generator and the Boldyreva unique blind signature, that provide the best protection against all kinds of threats, for all distributions of secrets. We also quantify the concrete security of our approach in terms of online and offline password guesses made by outsiders and insiders, in the random-oracle model. The main contribution of this paper lies not in the technical novelty of the proposed solution, but in the identification of the problem and its model. Our results have an immediate and practical application for the real world: they show how to implement single-sign-on stateless roaming authentication for the internet, in an ad hoc, user-driven fashion that requires no change to protocols or infrastructure.
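For illustration, a minimal Python sketch of the blinded-exponentiation pattern behind Ford-Kaliski-style server-assisted password hardening, one of the building blocks named above. The group parameters are toy values and the flow is an assumption for exposition, not the paper's exact protocols.

```python
import hashlib
import secrets

# Toy parameters for illustration only: a real deployment would use a large
# safe-prime or elliptic-curve group, not a 10-bit modulus.
p = 1019                 # safe prime: p = 2q + 1
q = (p - 1) // 2         # prime order of the quadratic-residue subgroup

def hash_to_group(password: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(password).digest(), "big")
    return pow(h % p, 2, p)  # squaring maps into the order-q subgroup

# The server holds a long-term secret exponent d; it never sees the password.
d = secrets.randbelow(q - 1) + 1

def client_blind(password: bytes):
    r = secrets.randbelow(q - 1) + 1
    return pow(hash_to_group(password), r, p), r

def server_evaluate(x: int) -> int:
    return pow(x, d, p)

def client_unblind(y: int, r: int) -> int:
    r_inv = pow(r, -1, q)    # exponent arithmetic is modulo the group order q
    return pow(y, r_inv, p)

# The hardened secret H(pwd)^d needs the server's help to recompute, yet the
# blinding factor r hides both the password and the result from the server.
x, r = client_blind(b"correct horse battery staple")
hardened = client_unblind(server_evaluate(x), r)
assert hardened == pow(hash_to_group(b"correct horse battery staple"), d, p)
```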
Abstract:
Ranking documents according to the Probability Ranking Principle has been theoretically shown to guarantee optimal retrieval effectiveness in tasks such as ad hoc document retrieval. This ranking strategy assumes independence among document relevance assessments. This assumption, however, often does not hold, for example in scenarios where redundancy in the retrieved documents is of major concern, as is the case in the sub-topic retrieval task. In this chapter, we propose a new ranking strategy for sub-topic retrieval that builds upon interdependent document relevance and topic-oriented models. With respect to the topic-oriented model, we investigate both static and dynamic clustering techniques, aiming to group topically similar documents. Evidence from clusters is then combined with information about document dependencies to form a new document ranking. We compare and contrast the proposed method against state-of-the-art approaches, such as Maximal Marginal Relevance, Portfolio Theory for Information Retrieval, and standard cluster-based diversification strategies. The empirical investigation is performed on the ImageCLEF 2009 Photo Retrieval collection, where images are assessed with respect to sub-topics of a more general query topic. The experimental results show that our approaches outperform the state-of-the-art strategies with respect to a number of diversity measures.
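For reference, a minimal Python sketch of Maximal Marginal Relevance, one of the baselines named above: each step greedily selects the document that best trades off query relevance against redundancy with the documents already chosen. The relevance scores, similarity matrix, and trade-off parameter are illustrative assumptions.

```python
def mmr_rank(relevance, similarity, lam=0.5):
    """Greedy MMR re-ranking: lam weights relevance against redundancy."""
    selected, remaining = [], list(range(len(relevance)))
    while remaining:
        def marginal(i):
            # Redundancy = max similarity to any already-selected document.
            redundancy = max((similarity[i][j] for j in selected), default=0.0)
            return lam * relevance[i] - (1 - lam) * redundancy
        best = max(remaining, key=marginal)
        selected.append(best)
        remaining.remove(best)
    return selected

relevance = [0.9, 0.85, 0.4]
similarity = [[1.0, 0.95, 0.1],
              [0.95, 1.0, 0.2],
              [0.1, 0.2, 1.0]]
# Doc 1 is a near-duplicate of doc 0, so the diverse doc 2 is ranked second.
print(mmr_rank(relevance, similarity))  # -> [0, 2, 1]
```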
Abstract:
In this thesis we investigate the use of quantum probability theory for ranking documents. Quantum probability theory is used to estimate the probability of relevance of a document given a user's query. We posit that quantum probability theory can lead to a better estimation of the probability of a document being relevant to a user's query than the common approach, i.e. the Probability Ranking Principle (PRP), which is based upon Kolmogorovian probability theory. Following our hypothesis, we formulate an analogy between the document retrieval scenario and a physical scenario, that of the double-slit experiment. Through the analogy, we propose a novel ranking approach, the quantum probability ranking principle (qPRP). Key to our proposal is the presence of quantum interference. Mathematically, this is the statistical deviation between empirical observations and the expected values predicted by the Kolmogorovian rule of additivity of probabilities of disjoint events in configurations such as that of the double-slit experiment. We propose an interpretation of quantum interference in the document ranking scenario, and examine how quantum interference can be effectively estimated for document retrieval. To validate our proposal and to gain more insight into approaches for document ranking, we (1) analyse PRP, qPRP and other ranking approaches, exposing the assumptions underlying their ranking criteria and formulating the conditions for the optimality of the two ranking principles, (2) empirically compare three ranking principles (i.e. PRP, interactive PRP, and qPRP) and two state-of-the-art ranking strategies in two retrieval scenarios, those of ad hoc retrieval and diversity retrieval, (3) analytically contrast the ranking criteria of the examined approaches, exposing similarities and differences, and (4) study the ranking behaviours of approaches alternative to PRP in terms of the kinematics they impose on relevant documents, i.e. by considering the extent and direction of the movements of relevant documents across the rankings recorded when comparing PRP against its alternatives. Our findings show that the effectiveness of the examined ranking approaches strongly depends upon the evaluation context. In the traditional evaluation context of ad hoc retrieval, PRP is empirically shown to be better than or comparable to alternative ranking approaches. However, when we turn to evaluation contexts that account for interdependent document relevance (i.e. when the relevance of a document is assessed also with respect to other retrieved documents, as is the case in the diversity retrieval scenario), the use of quantum probability theory, and thus of qPRP, is shown to improve retrieval and ranking effectiveness over the traditional PRP and alternative ranking strategies, such as Maximal Marginal Relevance, Portfolio Theory, and interactive PRP. This work represents a significant step forward regarding the use of quantum theory in information retrieval. It demonstrates that the application of quantum theory to problems within information retrieval can lead to improvements both in modelling power and retrieval effectiveness, allowing the construction of models that capture the complexity of information retrieval situations. Furthermore, the thesis opens up a number of lines for future research.
These include: (1) investigating estimations and approximations of quantum interference in qPRP; (2) exploiting complex numbers for the representation of documents and queries; and (3) applying the concepts underlying qPRP to tasks other than document ranking.
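For reference, the Kolmogorovian additivity rule and the interference term that qPRP exploits, in the double-slit-style configuration described above (a standard formulation; the phase symbol θ is notation assumed here):

```latex
% Kolmogorov: for disjoint events A and B (the two "slits"),
%   P(A \cup B) = P(A) + P(B).
% Quantum probability adds an interference term, the deviation from
% additivity that qPRP exploits; \theta is the phase difference between
% the events' probability amplitudes:
\[
  p_{AB} \;=\; p_A + p_B \;+\; \underbrace{2\sqrt{p_A\,p_B}\,\cos\theta}_{\text{interference term}}
\]
```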
Abstract:
At the end of the first decade of the twenty-first century, there is unprecedented awareness of the need for a transformation in development, to meet the needs of the present while also preserving the ability of future generations to meet their own needs. However, within engineering, educators still tend to regard such development as an ‘aspect’ of engineering rather than an overarching meta-context, with ad hoc and highly variable references to topics. Furthermore, within a milieu of interpretations there can appear to be conflicting needs for achieving sustainable development, which can be confusing for students and educators alike. Different articulations of sustainable development can create dilemmas around conflicting needs for designers and researchers, at the level of specific designs and (sub-)disciplinary analysis. Hence sustainability issues need to be addressed at a meta-level using a whole-of-system approach, so that decisions regarding these dilemmas can be made. With this appreciation, and in light of curriculum renewal challenges that also exist in engineering education, this paper considers how educators might take the next step to move from sustainable development being an interesting ‘aspect’ of the curriculum, to sustainable development as a meta-context for curriculum renewal. It is concluded that capacity building for such strategic considerations is critical in engineering education.
Abstract:
With increasing signs of climate change and the influence of national and international carbon-related laws and agreements, governments all over the world are grappling with how to rapidly transition to low-carbon living. This includes adapting to the impacts of climate change that are very likely to be experienced due to current emission levels (including extreme weather and sea level changes), and mitigating further growth in greenhouse gas emissions that is likely to result in further impacts. Internationally, the concept of ‘Biophilic Urbanism’, a term coined by Professors Tim Beatley and Peter Newman to refer to the use of natural elements as design features in urban landscapes, is emerging as a key component in addressing such climate change challenges in rapidly growing urban contexts. However, the economics of incorporating such options is not well understood and requires further attention to underpin a mainstreaming of biophilic urbanism. Indeed, there appears to be an ad hoc, reactionary approach to creating economic arguments for or against the design, installation or maintenance of natural elements such as green walls, green roofs, streetscapes, and parklands. With this issue in mind, this paper overviews research, undertaken as part of an industry collaborative research project, that considers the potential for using a number of environmental economic valuation techniques, which have evolved over the last several decades in agricultural and resource economics, to systematically assess the economic value of biophilic elements in the urban context. Considering existing literature on environmental economic valuation techniques, the paper highlights opportunities for creating a standardised language for valuing biophilic elements. The conclusions have implications for expanding the field of environmental economic valuation to support the economic evaluation and planning of the greater use of natural elements in cities. Insights are also noted for the more mature fields of agricultural and resource economics.
Abstract:
A significant reduction in global greenhouse gas (GHG) emissions is a priority, and the preservation of existing building stock presents a significant opportunity to reduce the carbon footprint of our built environment. Within this ‘wicked’ problem context, and moving beyond the ad hoc and incremental performance improvements that have been made to date, collaborative and multidisciplinary efforts are required to find rapid and transformational solutions. Design has emerged as a strategic and redirective practice, and lessons can therefore be learned about transformation and potentially applied in the built environment. The purpose of this paper is to discuss a pragmatic and novel research approach for undertaking such applied design-driven research. This paper begins with a discussion of key contributions from design science (rational) and action research (reflective) philosophies in creating an emerging methodological ‘hybrid design approach’. This research approach is then discussed in relation to its application to specific research exploring the processes, methods and lessons from design in heritage building retrofit projects. Drawing on both industry and academic knowledge to ensure relevance and rigour, it is anticipated that the hybrid design approach will be useful for others tackling such complex wicked problems that require context-specific solutions.
Abstract:
The Comment by Mayers and Reiter criticizes our work on two counts. Firstly, it is claimed that the quantum decoherence effects that we report in consequence of our experimental analysis of neutron Compton scattering from H in gaseous H2 are not, as we maintain, outside the framework of conventional neutron scattering theory. Secondly, it is claimed that we did not really observe such effects, owing to a faulty analysis of the experimental data, which are claimed to be in agreement with conventional theory. We focus in this response on the critical issue of the reliability of our experimental results and analysis. Using the same standard Vesuvio instrument programs used by Mayers et al., we show that, if the experimental results for H in gaseous H2 are in agreement with conventional theory, then those for D in gaseous D2 obtained in the same way cannot be, and vice versa. We expose a flaw in the calibration methodology used by Mayers et al. that leads to the present disagreement over the behaviour of H, namely the ad hoc adjustment of the measured H peak positions in time-of-flight (TOF) during the calibration of Vesuvio so that agreement is obtained with the expectation of conventional theory. We briefly address the question of the necessity to apply the theory of open quantum systems.