916 results for uses


Relevance:

10.00%

Publisher:

Abstract:

This book offers a fundamental challenge to a variety of theoretical, social, and political paradigms, ranging from law and justice studies to popular culture, linguistics to political activism. Developing the intellectual project initiated in Queering Paradigms, this volume extends queer theorizing in challenging new directions and uses queer insights to explore, trouble, and interrogate the social, political, and intellectual agendas that pervade (and are often taken for granted within) public discourses and academic disciplines. The contributing authors include queer theorists, socio-linguists, sociologists, political activists, educators, social workers and criminologists. Together, they contribute not only to the ongoing process of theorizing queerly, but also to the critique and reformulation of their respective disciplines.

Relevance:

10.00%

Publisher:

Abstract:

A Wireless Sensor Network (WSN) is a set of sensors that are integrated with a physical environment. These sensors are small in size, and capable of sensing physical phenomena and processing them. They communicate in a multihop manner, due to a short radio range, to form an ad hoc network capable of reporting network activities to a data collection sink. Recent advances in WSNs have led to several new promising applications, including habitat monitoring, military target tracking, natural disaster relief, and health monitoring. Current sensor nodes, such as the MICA2, use a 16-bit, 8 MHz Texas Instruments MSP430 micro-controller with only 10 KB of RAM, 128 KB of program space and 512 KB of external flash memory to store measurement data, and are powered by two AA batteries. Due to these unique specifications and a lack of tamper-resistant hardware, devising security protocols for WSNs is complex. Previous studies show that data transmission consumes much more energy than computation. Data aggregation can greatly help to reduce this consumption by eliminating redundant data. However, aggregators are under the threat of various types of attacks. Among them, node compromise is usually considered one of the most challenging for the security of WSNs. In a node compromise attack, an adversary physically tampers with a node in order to extract its cryptographic secrets. This attack can be very harmful depending on the security architecture of the network. For example, when an aggregator node is compromised, it is easy for the adversary to change the aggregation result and inject false data into the WSN. The contributions of this thesis to the area of secure data aggregation are manifold. We firstly define security for data aggregation in WSNs. In contrast with existing secure data aggregation definitions, the proposed definition covers the unique characteristics that WSNs have. Secondly, we analyze the relationship between security services and adversarial models considered in existing secure data aggregation schemes in order to provide a general framework of required security services. Thirdly, we analyze existing cryptographic-based and reputation-based secure data aggregation schemes. This analysis covers the security services provided by these schemes and their robustness against attacks. Fourthly, we propose a robust reputation-based secure data aggregation scheme for WSNs. This scheme minimizes the use of heavy cryptographic mechanisms. The security advantages provided by this scheme are realized by integrating aggregation functionalities with (i) a reputation system, (ii) estimation theory, and (iii) a change detection mechanism. We have shown that this addition helps defend against most of the security attacks discussed in this thesis, including the On-Off attack. Finally, we propose a secure key management scheme to distribute essential pairwise and group keys among the sensor nodes. The proposed scheme combines Lamport's reverse hash chain with a usual (forward) hash chain to provide both past and future key secrecy. The proposal avoids delivering the whole value of a new group key during a group key update; instead, only half of the value is transmitted from the network manager to the sensor nodes. This way, the compromise of a pairwise key alone does not lead to the compromise of the group key. The new pairwise key in our scheme is determined by Diffie-Hellman based key agreement.
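The thesis's exact key management construction is not given in this abstract; the following is a minimal sketch, assuming SHA-256 and illustrative parameters, of the general idea of combining a forward hash chain with a Lamport-style reverse hash chain so that knowledge of the current group key reveals neither past nor future keys.

# Hedged sketch (not the thesis's construction): group keys derived from a forward
# hash chain combined with a Lamport-style reverse (backward) hash chain.
import hashlib
import os

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def make_reverse_chain(seed: bytes, length: int) -> list[bytes]:
    # Lamport reverse chain: values are generated by repeated hashing,
    # but released to the network in the REVERSE order of generation,
    # so the next released value cannot be predicted from the current one.
    chain = [seed]
    for _ in range(length - 1):
        chain.append(h(chain[-1]))
    return list(reversed(chain))

def group_keys(forward_seed: bytes, reverse_seed: bytes, epochs: int) -> list[bytes]:
    keys = []
    fwd = forward_seed
    rev = make_reverse_chain(reverse_seed, epochs)
    for i in range(epochs):
        fwd = h(fwd)                  # forward chain: earlier values cannot be recovered (past secrecy)
        keys.append(h(fwd + rev[i]))  # combine both chain values into this epoch's group key
    return keys

if __name__ == "__main__":
    ks = group_keys(os.urandom(16), os.urandom(16), epochs=5)
    print([k.hex()[:16] for k in ks])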

Relevance:

10.00%

Publisher:

Abstract:

Analyzing security protocols has been an ongoing area of research in recent years. Different types of tools have been developed to make the analysis process more precise, fast and easy. These tools treat security protocols as black boxes that cannot easily be composed, which makes it difficult or impossible to perform low-level analysis or to combine different tools with each other. This research uses Coloured Petri Nets (CPN) to analyze the OSAP trusted computing protocol. The OSAP protocol is modeled at different levels and analyzed using the state space method. The produced model can be combined with other trusted computing protocols in future work.
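As a rough illustration of the state space method mentioned above (and not of the actual OSAP CPN model), the following sketch enumerates the reachable states of a toy two-party transition system by breadth-first search; the states and transitions are invented for illustration only.

# Hedged sketch: state space (reachability) exploration on a toy transition system.
from collections import deque

def successors(state):
    # Toy "protocol" state: (client_phase, server_phase); transitions are illustrative only.
    c, s = state
    nexts = []
    if c == "idle":
        nexts.append(("request_sent", s))
    if c == "request_sent" and s == "idle":
        nexts.append((c, "session_open"))
    if s == "session_open":
        nexts.append(("done", "session_open"))
    return nexts

def explore(initial):
    # Breadth-first enumeration of all reachable states (the state space).
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

print(explore(("idle", "idle")))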

Relevance:

10.00%

Publisher:

Abstract:

This article explores the use of probabilistic classification, namely finite mixture modelling, for the identification of complex disease phenotypes, given cross-sectional data. In particular, it focuses on posterior probabilities of subgroup membership, a standard output of finite mixture modelling, and how the quantification of uncertainty in these probabilities can lead to more detailed analyses. Using a Bayesian approach, we describe two practical uses of this uncertainty: (i) as a means of describing a person's membership of a single or multiple latent subgroups and (ii) as a means of describing identified subgroups by patient-centred covariates not included in model estimation. These proposed uses are demonstrated on a case study in Parkinson's disease (PD), where latent subgroups are identified using multiple symptoms from the Unified Parkinson's Disease Rating Scale (UPDRS).
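The article's Bayesian mixture model is not reproduced here; as a hedged illustration of posterior subgroup-membership probabilities, the following sketch fits a finite mixture with scikit-learn's GaussianMixture on simulated symptom scores and flags observations whose membership is ambiguous.

# Hedged sketch: finite mixture fit and posterior membership probabilities
# on simulated data (not the article's model or data).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated "symptom" data from two latent subgroups (purely illustrative).
X = np.vstack([rng.normal(0.0, 1.0, size=(100, 4)),
               rng.normal(2.5, 1.0, size=(100, 4))])

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
posterior = gmm.predict_proba(X)   # each row: P(subgroup k | observation)

# Uncertainty in membership: observations whose largest posterior is well below 1
# could be described as belonging to multiple latent subgroups.
uncertain = posterior.max(axis=1) < 0.8
print(f"{uncertain.sum()} of {len(X)} observations have ambiguous subgroup membership")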

Relevance:

10.00%

Publisher:

Abstract:

Many music programs in Australia deliver a United States (US) package created by the Recreational Music-Making Movement, founded by Karl Bruhn and Barry Bittman. This quasi-formal group of music makers, academics and practitioners uses the logic of decentralised global networks to connect with local musicians, offering them benefits associated with their Recreational Music Program (RMP). These RMPs encapsulate the broad goals of the movement, developed in the US during the 1980s, and are now available as a package, endorsed by the National Association of Music Merchants (NAMM), for music retailers and community organisations to deliver locally (Bittman et al., 2003). High participation rates in RMPs have historically been documented amongst baby boomers with disposable income. Yet the Australian programs increasingly target marginalised groups and associated funding sources, which in turn has lowered the costs of participation. This chapter documents how Australian manifestations of RMPs presently report on the benefits of participation to attract cross-sector funding. It seeks to show the diversity of participants who claim to have developed and accessed resources that improve their capacity for resilience through recreational music performance events. We identify funding issues pertaining to partnerships between local agencies and state governments that have begun to commission such music programs. Our assessment of eight Australian RMPs covers all additional music groups implemented since the first program, their purposes and costs, the skills and coping strategies that participants developed, and how organisers have reported on resources and outcomes and attracted funding. We represent these features through a summary table, standard descriptive statistics and commentaries from participants and organisers.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a method for measuring the in-bucket payload volume on a dragline excavator for the purpose of estimating the material's bulk density in real time. Knowledge of the payload's bulk density can provide feedback to mine planning and scheduling to improve blasting and therefore provide a more uniform bulk density across the excavation site. This allows a single optimal bucket size to be used for maximum overburden removal per dig, in turn reducing costs and emissions in dragline operation and maintenance. The proposed solution uses a range-bearing laser to locate and scan full buckets between the lift and dump stages of the dragline cycle. The bucket is segmented from the scene using cluster analysis, and the pose of the bucket is calculated using the Iterative Closest Point (ICP) algorithm. Payload points are identified using a known model and subsequently converted into a height grid for volume estimation. Results from both scaled and full-scale implementations show that this method can achieve an accuracy above 95%.
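As a hedged sketch of the height-grid step described above (not the paper's implementation, and with an assumed grid cell size and simulated points), the following converts segmented payload points into a height grid and integrates cell heights into a volume estimate.

# Hedged sketch: payload points -> height grid -> volume estimate.
import numpy as np

def volume_from_points(points: np.ndarray, cell: float = 0.1) -> float:
    """points: (N, 3) array of x, y, z payload points above the bucket floor (z = 0)."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    ix = np.floor((x - x.min()) / cell).astype(int)
    iy = np.floor((y - y.min()) / cell).astype(int)
    # Height grid: keep the maximum z in each cell (the visible payload surface).
    grid = np.zeros((ix.max() + 1, iy.max() + 1))
    np.maximum.at(grid, (ix, iy), z)
    # Volume = sum over cells of (cell area x surface height).
    return float(grid.sum() * cell * cell)

# Toy usage: a 1 m x 1 m patch of material with its surface near 0.5 m.
rng = np.random.default_rng(1)
pts = rng.uniform([0, 0, 0.4], [1, 1, 0.5], size=(5000, 3))
print(f"estimated volume: {volume_from_points(pts):.3f} m^3")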

Relevance:

10.00%

Publisher:

Abstract:

This paper presents an image-based visual servoing system intended for tracking and obtaining scientific observations of the HIFiRE vehicles. The primary aim of this tracking platform is to acquire and track the thermal signature emitted from the surface of the vehicle during the re-entry phase of the mission using an infra-red camera. The implemented visual servoing scheme uses a classical image-based approach to identify and track the target through visual kinematic control. The paper uses simulation and experimental results to show the tracking performance of the system under visual feedback. The current implementation and control techniques to further improve the performance of the system are also discussed.
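The paper's controller is not specified in this abstract; the sketch below shows the classical image-based visual servoing law such schemes build on, in which the camera velocity command is the image-feature error mapped through the pseudo-inverse of the interaction (image Jacobian) matrix. The point coordinates, depths and gain are illustrative assumptions.

# Hedged sketch of a classical image-based visual servoing (IBVS) control law.
import numpy as np

def interaction_matrix(x: float, y: float, Z: float) -> np.ndarray:
    # Standard 2x6 interaction matrix for a normalised image point (x, y) at depth Z.
    return np.array([
        [-1 / Z, 0, x / Z, x * y, -(1 + x**2), y],
        [0, -1 / Z, y / Z, 1 + y**2, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    # Stack per-point interaction matrices and apply v = -gain * pinv(L) * error.
    L = np.vstack([interaction_matrix(x, y, Z) for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(desired)).ravel()
    return -gain * np.linalg.pinv(L) @ error

# Toy usage: drive one tracked point toward the image centre.
v = ibvs_velocity(features=[(0.10, -0.05)], desired=[(0.0, 0.0)], depths=[10.0])
print(v)  # 6-vector: translational and angular camera velocity command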

Relevance:

10.00%

Publisher:

Abstract:

A model to predict the buildup of mainly traffic-generated volatile organic compounds or VOCs (toluene, ethylbenzene, ortho-xylene, meta-xylene, and para-xylene) on urban road surfaces is presented. The model required three traffic parameters, namely average daily traffic (ADT), volume to capacity ratio (V/C), and surface texture depth (STD), and two chemical parameters, namely total suspended solids (TSS) and total organic carbon (TOC), as predictor variables. Principal component analysis and two-phase factor analysis were performed to characterize the model calibration parameters. Traffic congestion was found to be the underlying cause of traffic-related VOC buildup on urban roads. The model calibration was optimized using orthogonal experimental design. Partial least squares regression was used for model prediction. It was found that a better-optimized orthogonal design could be achieved by including the latent factors of the data matrix in the design. The model performed fairly accurately for three different land uses as well as five different particle size fractions. The relative prediction errors were 10-40% for the different size fractions and 28-40% for the different land uses, while the coefficients of variation of the predicted inter-site VOC concentrations were in the range of 25-45% for the different size fractions. Considering the sizes of the data matrices, these coefficients of variation were within the acceptable inter-laboratory range for analytes at ppb concentration levels.
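As a hedged illustration of the partial least squares prediction step (not the calibrated model itself), the following sketch fits a PLSRegression on simulated data, with predictors named after those in the abstract.

# Hedged sketch: PLS regression with traffic and chemical predictors on simulated data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(42)
n = 60
# Predictors (illustrative): ADT, V/C, surface texture depth, TSS, TOC.
X = rng.normal(size=(n, 5))
# Response: VOC build-up, simulated to depend mostly on "congestion" and TOC.
y = 1.5 * X[:, 1] + 0.8 * X[:, 4] + rng.normal(scale=0.3, size=n)

pls = PLSRegression(n_components=2).fit(X, y)
y_hat = pls.predict(X).ravel()
rel_error = np.abs(y_hat - y).mean() / np.abs(y).mean()
print(f"mean relative prediction error: {rel_error:.2%}")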

Relevance:

10.00%

Publisher:

Abstract:

Intelligent agents are an advanced technology utilized in Web Intelligence. When searching for information in a distributed Web environment, information is retrieved by multi-agents on the client side and fused on the broker side. Current information fusion techniques rely on cooperation among agents to provide statistics. Such techniques are computationally expensive and unrealistic in the real world. In this paper, we introduce a model that uses a world ontology constructed from the Dewey Decimal Classification to acquire user profiles. By searching with specific and exhaustive user profiles, information fusion techniques no longer rely on the statistics provided by agents. The model has been successfully evaluated using the large INEX data set simulating the distributed Web environment.
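As a rough sketch of how an ontology-backed user profile can replace agent-supplied statistics (assuming a tiny, hypothetical slice of the Dewey Decimal Classification rather than the paper's world ontology), the following scores documents by expanding their subjects to ancestor classes and summing the profile weights they cover.

# Hedged sketch: ontology-backed user profile scoring; subject tree and weights
# are illustrative, not the paper's ontology.
DDC_PARENT = {
    "006.3": "006",   # Artificial intelligence -> Special computer methods
    "006": "000",     # -> Computer science, information & general works
    "025.04": "025",  # Information storage and retrieval systems
    "025": "020",     # -> Library and information sciences
}

def expand(subject: str) -> set[str]:
    # A subject implies all of its ancestors in the classification tree.
    out = {subject}
    while subject in DDC_PARENT:
        subject = DDC_PARENT[subject]
        out.add(subject)
    return out

def score(profile: dict[str, float], doc_subjects: list[str]) -> float:
    # Sum profile weights over every subject the document's topics expand to.
    covered = set().union(*(expand(s) for s in doc_subjects))
    return sum(w for subj, w in profile.items() if subj in covered)

user_profile = {"006.3": 1.0, "006": 0.5, "025.04": 0.8}
print(score(user_profile, ["006.3"]))    # AI document: matches 006.3 and its parent 006
print(score(user_profile, ["025.04"]))   # IR document: matches 025.04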

Relevance:

10.00%

Publisher:

Abstract:

Relevance Feedback (RF) has proven very effective for improving retrieval accuracy. Adaptive information filtering (AIF) technology has benefited from the improvements achieved across all the tasks involved over the last decades. A difficult problem in AIF is how to update the system with new feedback efficiently and effectively. In current feedback methods, the updating processes focus on updating system parameters. In this paper, we develop a new approach, Adaptive Relevance Features Discovery (ARFD). It automatically updates the system's knowledge based on a sliding window over positive and negative feedback in order to solve a non-monotonic problem efficiently. Some of the new training documents are selected using the knowledge that the system has already obtained. Specific features are then extracted from the selected training documents. Different methods are used to merge and revise the weights of features in a vector space. The new model is designed for Relevance Features Discovery (RFD), a pattern-mining based approach that uses negative relevance feedback to improve the quality of features extracted from positive feedback. Learning algorithms are also proposed to implement this approach on Reuters Corpus Volume 1 and TREC topics. Experiments show that the proposed approach works efficiently and achieves encouraging performance.
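The ARFD algorithm itself is not reproduced here; as a minimal sketch of the sliding-window idea, the following keeps recent positive and negative feedback documents in fixed-length windows and revises term weights so that terms supported by negative feedback are discounted. The window size and weighting rule are assumptions.

# Hedged sketch: sliding-window relevance feedback and term-weight revision.
from collections import Counter, deque

WINDOW = 50  # illustrative window size

positives = deque(maxlen=WINDOW)  # each item: list of terms from a relevant document
negatives = deque(maxlen=WINDOW)  # each item: list of terms from a non-relevant document

def update(doc_terms, relevant):
    (positives if relevant else negatives).append(doc_terms)

def feature_weights():
    pos = Counter(t for doc in positives for t in doc)
    neg = Counter(t for doc in negatives for t in doc)
    # Weight: support in positive feedback, discounted by support in negative feedback.
    return {t: pos[t] / (1 + neg[t]) for t in pos}

update(["wireless", "sensor", "security"], relevant=True)
update(["wireless", "router", "sale"], relevant=False)
print(feature_weights())   # "wireless" is discounted; "sensor" and "security" are not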

Relevance:

10.00%

Publisher:

Abstract:

This study uses dosimetry film measurements and Monte Carlo simulations to investigate the accuracy of type-a (pencil-beam) dose calculations for predicting the radiation doses delivered during stereotactic radiotherapy treatments of the brain. It is shown that when evaluating doses in a water phantom, the type-a algorithm provides dose predictions that are accurate to within a clinically relevant criterion, gamma(3%, 3 mm), but these predictions are nonetheless subtly different from the results of evaluating doses from the same fields using radiochromic film and Monte Carlo simulations. An analysis of a clinical meningioma treatment suggests that when predicting stereotactic radiotherapy doses to the brain, the inaccuracies of the type-a algorithm can be exacerbated by inadequate evaluation of the effects of nearby bone or air, resulting in dose differences of up to 10% for individual fields. The results of this study indicate the possible advantage of using Monte Carlo calculations, as well as measurements with high-spatial-resolution media, to verify type-a predictions of the dose delivered in cranial treatments.
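As a hedged, one-dimensional illustration of the gamma(3%, 3 mm) comparison mentioned above (clinical gamma analysis is performed in two or three dimensions with dedicated tools), the following sketch computes a global gamma index between a reference and an evaluated dose profile.

# Hedged sketch: 1-D global gamma index with 3% dose and 3 mm distance criteria.
import numpy as np

def gamma_index(ref, evl, spacing_mm, dose_tol=0.03, dist_mm=3.0):
    """Return the gamma value at each reference point (<= 1 means 'pass')."""
    x = np.arange(len(ref)) * spacing_mm
    gammas = np.empty(len(ref))
    for i, (xi, di) in enumerate(zip(x, ref)):
        dose_diff = (evl - di) / (dose_tol * ref.max())  # global normalisation
        dist = (x - xi) / dist_mm
        gammas[i] = np.sqrt(dose_diff**2 + dist**2).min()
    return gammas

ref = np.exp(-0.5 * ((np.arange(100) - 50) / 10) ** 2)   # reference dose profile
evl = np.exp(-0.5 * ((np.arange(100) - 51) / 10) ** 2)   # slightly shifted evaluated profile
g = gamma_index(ref, evl, spacing_mm=1.0)
print(f"gamma pass rate: {(g <= 1).mean():.1%}")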

Relevance:

10.00%

Publisher:

Abstract:

This paper considers the functions of Greek mythology in general, and the Theseus and the Minotaur myth in particular, in two contemporary texts of adolescent masculinity: Rick Riordan's Percy Jackson series (2005-2009) and Matt Ottley's Requiem for a Beast: A Work for Image, Word and Music (2007). These texts reveal the ongoing flexibility of mythic texts to be pressed into the service of shoring up or challenging currently hegemonic ideologies of self and state. Both Riordan and Ottley make a variety of intertextual uses of classical hero plots in order to facilitate their own narrative explorations of contemporary adolescent men coming of age. These intertextual gestures might easily be read as gestures of alignment with narrative traditions and authority which simultaneously confer legitimacy on Riordan and Ottley, on their texts, and, by extension, on their readers. However, when read in juxtaposition, it is clear that while Riordan and Ottley may use classical mythology to articulate similarly gendered adolescence, they produce divergent visions of nationed adolescence.

Relevance:

10.00%

Publisher:

Abstract:

It is commonly accepted that contemporary schoolchildren live in a world that is intensely visual and commercially motivated, where what is imagined and what is experienced intermingle. Because of this, contemporary education should encourage a child to make reference to, and connection with, their out-of-school life. The core critical underpinnings of curriculum-based arts appreciation and theory hinge on educators and students taking a historical look at the ways artists have engaged with, and made comment upon, their contemporary societies. My article uses this premise to argue for the need to persist with pushing for critique of/through the visual, and for that critique to be delivered as an active process via the arts classroom rather than as visual literacy, here regarded as a more passive process for interpreting and understanding visual material. The article asserts that visual arts lessons are best placed to provide students fully with such critique because they help students to develop a critical eye, an interpretive lens often used by artists to view, analyse and independently navigate and respond to contemporary society.

Relevance:

10.00%

Publisher:

Abstract:

Road-deposited solids are a mix of pollutants originating from a range of anthropogenic sources common to urban land uses and soil inputs from surrounding areas. These particles accumulate potentially toxic pollutants, thereby posing a threat to receiving waters. Reliable estimation of the sources of particulate pollutants in build-up and quantification of particle composition are important for the development of best management practices for stormwater quality mitigation. The research study analysed build-up pollutants from sixteen different urban road surfaces and soil from four background locations. The road surfaces were selected from residential, industrial and commercial land uses in four suburbs of Gold Coast, Australia. Collected build-up samples were analysed for solids load, organic matter and mineralogy. The soil samples were analysed for mineralogy. Quantitative and qualitative analysis of the mineralogical data, along with multivariate data analysis, was employed to identify the relative source contributions to road-deposited solids. The build-up load on road surfaces in different suburbs showed significant differences due to the nature of anthropogenic activities, road texture depth and antecedent dry period. Analysis revealed that build-up pollutants consist primarily of soil-derived minerals (60%), with the remainder composed of traffic-generated pollutants and organic matter. Major mineral components detected were quartz and potential clay-forming minerals such as albite, microcline, chlorite and muscovite. An average of 40-50% of build-up pollutants by weight was made up of quartz. Comparison of the mineral component of build-up pollutants with background soil samples indicated that the minerals primarily originate from surrounding soils. About 2.2% of build-up pollutants was organic matter, which originates largely from plant matter. Traffic-related pollutants, which are potentially toxic to the receiving water environment, represented about 30% of the build-up pollutants at the study sites.
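As a loose illustration of the multivariate source-identification step (not the study's analysis, and with a simulated composition matrix), the following sketch applies principal component analysis to mineral and pollutant compositions so that soil-dominated and traffic-dominated samples separate along the first component.

# Hedged sketch: PCA on simulated composition data to separate source contributions.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
# Columns (illustrative): quartz, albite, microcline, chlorite, muscovite, organic matter, traffic metals.
soil_like = rng.normal([45, 15, 10, 8, 7, 2, 1], 2, size=(10, 7))
traffic_like = rng.normal([20, 5, 4, 3, 3, 5, 30], 2, size=(6, 7))
X = np.vstack([soil_like, traffic_like])

pca = PCA(n_components=2).fit(X)
scores = pca.transform(X)
print("explained variance ratio:", pca.explained_variance_ratio_.round(2))
# The first component separates soil-dominated from traffic-dominated samples.
print(scores[:, 0].round(1))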

Relevance:

10.00%

Publisher:

Abstract:

Sustainability has been a major factor in and determinant of commercial property design, construction, retro-fitting and landlord and tenant requirements over the last decade, supported by the introduction of rating tools such as NABERS and GreenStar and the recently mandated Building Energy Efficiency Certificate (BEEC). However, the movement to sustainable and energy-efficient housing has not been established for the same period, and although mandatory building regulations have been in place for new residential housing construction since 2004, improving the sustainability and energy efficiency of housing constructed prior to 2004 has not been mandatory. Residential dwelling energy efficiency and rating schemes introduced in Australia over the past decade include BASIX, NatHERS, First rate, ACTHERS and the Building Code of Australia, and these have applied to new dwelling construction. At both national and state level, the use of energy efficiency schemes for existing residential dwellings has been voluntary and, despite significant cash incentives, has not always been successful or achieved widespread take-up. In 2010, the Queensland Government regulated that all homes offered for sale, whether new or existing dwellings, require the seller to provide a sustainability declaration that details the sustainability measures associated with the dwelling being sold. The purpose of this declaration is to inform buyers and increase community awareness of home sustainability features. This paper uses an extensive review of real estate marketing material, together with a comprehensive survey of real estate agents, to analyse the current market compliance with, awareness of and acceptance of existing green housing regulations, and the importance that residential property owners and purchasers place on energy-efficient and sustainable housing. The findings indicate that there is still little community awareness of, or concern for, sustainable housing features when making home purchase decisions.