960 results for Syntactic Projection


Relevance:

10.00%

Publisher:

Abstract:

Robust hashing is an emerging field concerned with hashing data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, and non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed for deterministic (non-changing) inputs such as files and passwords. For such data types a one-bit or one-character change can be significant, so the hashing process is sensitive to any change in the input. Unfortunately, in certain applications input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For these data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are based not on cryptographic methods but on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction, followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, which is detrimental to the security properties required of hash functions. Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, an essential requirement for non-invertibility, and is designed to produce features better suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
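The pipeline described above (feature extraction, linear randomization, quantization, binary encoding) is easy to sketch. Below is a minimal, purely illustrative Python sketch, not the dissertation's HOS/Radon method: a key-seeded random projection stands in for the linear randomization stage, and a median threshold stands in for a learnt quantizer, the very component whose training and information leakage the dissertation analyses. All names and parameters are hypothetical.

```python
import numpy as np

def robust_hash(features: np.ndarray, key: int, n_bits: int = 64) -> np.ndarray:
    """Toy robust hash: key-seeded random projection, then 1-bit quantization.

    `features` is a real-valued vector already extracted from the image
    (e.g. transform-domain statistics). The linear projection is the
    randomization stage; the median threshold plays the role of a learnt
    quantizer -- the component whose training/leakage the work analyses.
    """
    rng = np.random.default_rng(key)               # secret key seeds the projection
    proj = rng.standard_normal((n_bits, features.size))
    randomized = proj @ features                   # compressive, key-dependent, linear
    threshold = np.median(randomized)              # stand-in for a trained threshold
    return (randomized > threshold).astype(np.uint8)

def hash_distance(h1: np.ndarray, h2: np.ndarray) -> float:
    """Normalized Hamming distance: small for perceptually similar inputs."""
    return float(np.mean(h1 != h2))

# A minor perturbation (e.g. compression noise) should barely move the hash,
# whereas a cryptographic hash would change completely.
feats = np.random.default_rng(7).random(256)
print(hash_distance(robust_hash(feats, key=42), robust_hash(feats + 0.01, key=42)))
```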

Relevance:

10.00%

Publisher:

Abstract:

Reliability analysis is crucial to reducing the unexpected downtime, severe failures and ever-tightening maintenance budgets of engineering assets. Hazard-based reliability methods are of particular interest because hazard reflects the current health status of engineering assets and their imminent failure risks. Most existing hazard models were constructed using statistical methods. However, these methods rest largely on two assumptions: that the baseline failure distribution accurately describes the population concerned, and that the effects of covariates on hazards take an assumed form. These two assumptions may be difficult to satisfy and can therefore compromise the effectiveness of hazard models in application. To address this issue, a non-linear hazard modelling approach is developed in this research using neural networks (NNs), resulting in neural network hazard models (NNHMs), to deal with the limitations that these two assumptions impose on statistical models. With the success of failure-prevention efforts, less failure history is available for reliability analysis. Involving condition data or covariates is a natural solution to this challenge. A critical issue in involving covariates in reliability analysis is that complete and consistent covariate data are often unavailable in reality, due to inconsistent measuring frequencies across multiple covariates, sensor failure, and sparse intrusive measurements. This problem has not been studied adequately in current reliability applications. This research therefore investigates the incomplete-covariates problem in reliability analysis. Typical approaches to handling incomplete covariates were studied to assess their performance and their effects on reliability analysis results. Since these existing approaches can underestimate the variance in regressions and introduce extra uncertainty into reliability analysis, the developed NNHMs are extended to handle incomplete covariates as an integral part of the model. The extended NNHMs have been validated using simulated bearing data and real data from a liquefied natural gas pump. The results demonstrate that the new approach outperforms typical approaches to handling incomplete covariates. Another problem in reliability analysis is that future covariates of engineering assets are generally unavailable. In existing practice for multi-step reliability analysis, historical covariates are used to estimate future covariates. Covariates of engineering assets, however, are often subject to substantial fluctuation under the influence of both engineering degradation and changes in environmental settings. Commonly used covariate extrapolation methods are therefore unsuitable because of error accumulation and uncertainty propagation. To overcome this difficulty, instead of directly extrapolating covariate values, this research projects covariate states. The estimated covariate states and the unknown covariate values in future running steps of assets constitute an incomplete covariate set, which is then analysed by the extended NNHMs. A new assessment function is also proposed to evaluate the risks of underestimated and overestimated reliability analysis results. A case study using field data from a paper and pulp mill demonstrates that this new multi-step reliability analysis procedure generates more accurate analysis results.
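As a rough illustration of the NNHM idea, the sketch below shows how a neural network can map time and condition covariates to a hazard without assuming a baseline failure distribution or a fixed covariate-effect form. The one-hidden-layer architecture, softplus output and mean imputation (one of the "typical" incomplete-covariate treatments mentioned above) are all assumptions for illustration; this is not the thesis's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def nn_hazard(t, covariates, W1, b1, W2, b2):
    """One-hidden-layer network mapping (time, covariates) -> hazard.

    The softplus output keeps the hazard non-negative. No baseline failure
    distribution and no fixed covariate-effect form is assumed -- the two
    statistical-model assumptions the NNHM approach avoids.
    """
    x = np.concatenate(([t / 100.0], covariates))  # crude input scaling
    h = np.tanh(W1 @ x + b1)                       # non-linear hidden layer
    z = float(W2 @ h + b2)
    return float(np.log1p(np.exp(z)))              # softplus > 0

# Illustrative shapes: 2 condition-monitoring covariates, 8 hidden units.
W1, b1 = rng.standard_normal((8, 3)), rng.standard_normal(8)
W2, b2 = rng.standard_normal(8), 0.0

# Incomplete covariates: impute a missing sensor reading (NaN) with a
# historical mean -- one of the "typical approaches" compared above.
cov = np.array([0.7, np.nan])
cov = np.where(np.isnan(cov), 0.5, cov)            # assumed mean 0.5
print(nn_hazard(100.0, cov, W1, b1, W2, b2))
```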

Relevance:

10.00%

Publisher:

Abstract:

While there are many similarities between the languages of the various workflow management systems, there are also significant differences. One particular area of difference arises because different systems impose different syntactic restrictions. In such cases, business analysts have to choose between conforming to the language in their specifications or transforming these specifications afterwards. The latter option is preferable, as it allows for a separation of concerns. In this paper we investigate to what extent such transformations are possible in the context of various syntactic restrictions (the most restrictive of which will be referred to as structured workflows). We also provide deep insight into the consequences, particularly in terms of expressive power, of imposing such restrictions.
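As a toy illustration of this kind of transformation, the sketch below restructures a goto-style, mid-loop exit into an equivalent structured (single-entry/single-exit) form using an auxiliary boolean, one standard device in the structuring literature; whether the paper uses this particular construction is an assumption, and the example is deliberately minimal.

```python
# Unstructured workflow fragment (goto-style): a mid-sequence failure jumps
# straight to error handling, skipping finalisation.
#
# Structured equivalent: an auxiliary boolean confines control flow to
# single-entry/single-exit blocks, at the cost of an extra variable and test.

def run_structured(tasks):
    """Run tasks in order until one fails; finalise only on a clean run."""
    failed = False                      # auxiliary variable replacing the jump
    i = 0
    while i < len(tasks) and not failed:
        if not tasks[i]():              # each task reports success/failure
            failed = True               # instead of 'break out to the handler'
        i += 1
    if failed:
        print("error handling")
    else:
        print("finalising")

run_structured([lambda: True, lambda: False, lambda: True])  # -> error handling
```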

Relevance:

10.00%

Publisher:

Abstract:

“The Cube” is a unique facility that combines 48 large multi-touch screens and very large-scale projection surfaces to form one of the world’s largest interactive learning and engagement spaces. The Cube is part of the Queensland University of Technology’s (QUT) newly established Science and Engineering Centre, designed to showcase QUT’s teaching and research capabilities in the STEM (Science, Technology, Engineering, and Mathematics) disciplines. In this application paper we describe the Cube: its technical capabilities, design rationale and practical day-to-day operations, supporting up to 70,000 visitors per week. Essential to the Cube’s operation are five interactive applications designed and developed in tandem with the Cube’s technical infrastructure. Each of the Cube’s launch applications was designed and delivered by an independent team, while the overall vision of the Cube was shepherded by a small executive team. The diversity of design, implementation and integration approaches pursued by these five teams provides insight into the challenges, and opportunities, presented when working with large distributed interaction technologies. We describe each of these applications in order to discuss the different challenges and user needs they address, the types of interaction they support and how they utilise the capabilities of the Cube facility.

Relevance:

10.00%

Publisher:

Abstract:

Metabonomics, a newer “omics” technique, shows promise for validating Chinese medicines and the compatibility of Chinese formulas. The present study explored the excretion pattern of low-molecular-mass metabolites in a male Wistar-derived rat model of kidney yin deficiency induced with thyroxine and reserpine, as well as the therapeutic effect of Liu Wei Di Huang Wan (LW) and its separated prescriptions, a classic traditional Chinese medicine formula for treating kidney yin deficiency in China. The study used ultra-performance liquid chromatography/electrospray ionization synapt high-definition mass spectrometry (UPLC/ESI-SYNAPT-HDMS) in both negative and positive electrospray ionization (ESI) modes. Blood biochemistry was also examined to identify specific changes in kidney yin deficiency. Distinct changes in the pattern of metabolites resulting from daily administration of thyroxine and reserpine were observed by UPLC-HDMS combined with principal component analysis (PCA). According to the PCA score plots, the metabolic profiles returned to their baseline pattern after treatment with LW. Altogether, the current metabonomic approach based on UPLC-HDMS and orthogonal projection to latent structures discriminant analysis (OPLS-DA) indicated 20 ions (14 in the negative mode, 8 in the positive mode, and 2 in both) as “differentiating metabolites”.
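A minimal sketch of the PCA step on a metabolite intensity table follows, using synthetic data and hypothetical group sizes; the real analysis was run on UPLC-HDMS ion intensities, and the "restored to baseline" finding corresponds to treated samples moving back toward the controls in the score plot.

```python
import numpy as np
from sklearn.decomposition import PCA

# Rows = rat samples, columns = UPLC-HDMS ion intensities (synthetic stand-in
# for the real negative/positive-ESI feature tables).
rng = np.random.default_rng(1)
control = rng.normal(0.0, 1.0, size=(10, 200))
model   = rng.normal(0.8, 1.0, size=(10, 200))   # thyroxine/reserpine group
treated = rng.normal(0.2, 1.0, size=(10, 200))   # LW-treated group

X = np.vstack([control, model, treated])
X = (X - X.mean(axis=0)) / X.std(axis=0)         # autoscale before PCA

scores = PCA(n_components=2).fit_transform(X)
# In the score plot, treated samples sitting close to the controls rather
# than the model group is the "restored to baseline" pattern reported above.
print(scores.shape)   # (30, 2)
```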

Relevance:

10.00%

Publisher:

Abstract:

Climate change is expected to be one of the biggest global health threats of the 21st century. In response to changes in climate and associated extreme events, public health adaptation has become imperative. This thesis examined several key issues in this emerging research field. It aimed to identify climate-health (particularly temperature-health) relationships, develop quantitative models that can be used to project the future health impacts of climate change, and thereby help formulate adaptation strategies for dealing with climate-related health risks and reducing vulnerability. The research questions addressed were: (1) What are the barriers to public health adaptation to climate change, and what are the research priorities in this emerging field? (2) What models and frameworks can be used to project future temperature-related mortality under different climate change scenarios? (3) What is the actual burden of temperature-related mortality, and what are the impacts of climate change on the future burden of disease? (4) Can we develop public health adaptation strategies to manage the health effects of temperature in response to climate change?

Using a literature review, I discussed how public health organisations should implement and manage the process of planned adaptation. This review showed that public health adaptation can operate at two levels: building adaptive capacity and implementing adaptation actions. However, there are constraints and barriers to adaptation arising from uncertainty, cost, technological limits, institutional arrangements, deficits of social capital, and individual perception of risks. The opportunities for planning and implementing public health adaptation rely on effective strategies to overcome likely barriers. I proposed that high priority be given to multidisciplinary research on the assessment of the potential health effects of climate change, projections of future health impacts under different climate and socio-economic scenarios, identification of the health co-benefits of climate change policies, and evaluation of cost-effective public health adaptation options.

Heat-related mortality is the most direct and most significant potential impact of climate change on human health. I therefore conducted a systematic review of research and methods for projecting future heat-related mortality under different climate change scenarios. The review showed that climate change is likely to result in a substantial increase in heat-related mortality. Projecting heat-related mortality requires an understanding of historical temperature-mortality relationships and consideration of future changes in climate, population and acclimatisation. Further research is needed to provide a stronger theoretical framework for mortality projections, including a better understanding of socio-economic development, adaptation strategies, land-use patterns, air pollution and mortality displacement.

Most previous studies were designed to examine temperature-related excess deaths or mortality risks. However, if most temperature-related deaths occur in the very elderly, who have only a short life expectancy, the burden of temperature on mortality would have less public health importance. To guide policy decisions and resource allocation, it is desirable to know the actual burden of temperature-related mortality. To achieve this, I used years of life lost to provide a new measure of the health effects of temperature. I conducted a time-series analysis to estimate the years of life lost associated with changes in season and temperature in Brisbane, Australia, and projected the future temperature-related years of life lost attributable to climate change. This study showed that the association between temperature and years of life lost was U-shaped, with increased years of life lost on cold and hot days. Temperature-related years of life lost will worsen greatly if future climate change exceeds a 2 °C increase and no adaptation to higher temperatures occurs.

Excess mortality during prolonged extreme temperatures is often greater than that predicted by a smoothed temperature-mortality association, because sustained periods of extreme temperatures produce an extra effect beyond that predicted by daily temperatures. To better estimate the burden of extreme temperatures, I estimated their effects on years of life lost due to cardiovascular disease using data from Brisbane, Australia. The results showed that the association between daily mean temperature and years of life lost due to cardiovascular disease was U-shaped, with the fewest years of life lost at 24 °C (the 75th percentile of daily mean temperature in Brisbane), rising progressively as temperatures become hotter or colder. There were significant added effects of heat waves, but no added effects of cold spells.

Finally, public health adaptation to hot weather is necessary and pressing. I discussed how to manage the health effects of temperature, especially in the context of climate change. Strategies to minimise the health effects of high temperatures and climate change fall into two categories: reducing heat exposure and managing the health effects of high temperatures. However, policy decisions require information on specific adaptations, together with their expected costs and benefits, so more research is needed to evaluate cost-effective adaptation options.

In summary, this thesis adds to the large body of literature on the impacts of temperature and climate change on human health. It improves our understanding of the temperature-health relationship and of how this relationship will change as temperatures increase. Although the research is limited to one city, which restricts the generalisability of the findings, the methods and approaches developed in this thesis will be useful to other researchers studying temperature-health relationships and climate change impacts. The results may be helpful for decision-makers who develop public health adaptation strategies to minimise the health effects of extreme temperatures and climate change.
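As a minimal illustration of recovering a U-shaped temperature-outcome association, the sketch below fits a quadratic to synthetic daily data whose minimum is placed near the 24 °C reported above; the thesis's actual analyses used time-series regression with smoothers, so this is a simplification.

```python
import numpy as np

# Synthetic daily series: years of life lost (YLL) made U-shaped in daily
# mean temperature, with the minimum placed near 24 degC (the 75th
# percentile of daily mean temperature reported for Brisbane).
rng = np.random.default_rng(2)
temp = rng.uniform(10, 32, size=3650)
yll = 50 + 0.6 * (temp - 24.0) ** 2 + rng.normal(0, 5, size=temp.size)

# A quadratic fit recovers the turning point: aT^2 + bT + c is minimised
# at T = -b / (2a).
a, b, c = np.polyfit(temp, yll, deg=2)
print(f"estimated minimum-YLL temperature: {-b / (2 * a):.1f} degC")
```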

Relevance:

10.00%

Publisher:

Abstract:

Daylighting devices are important components of any climate-responsive façade system. However, the evolution of parametric CAD systems and digital fabrication has had an impact on architectural form, shifting regular forms towards complex geometries. Architectural and engineering integration of daylighting devices in envelopes with complex geometries is a challenge in terms of design and performance evaluation. The purpose of this paper is to assess the daylight performance of a building with a climate-responsive envelope of complex geometry that integrates shading devices into the façade. The case study is based on the Esplanade buildings in Singapore. Climate-based daylight metrics such as Daylight Availability and Useful Daylight Illuminance are used. The DIVA (daylight simulation) and Grasshopper (parametric analysis) plug-ins for Rhinoceros were employed to examine the range of performance possibilities. Parameters such as the dimension, inclination, projected shadow and shape of the device were varied in order to maximize Daylight Availability and Useful Daylight Illuminance while minimizing glare probability. While orientation did not have a great impact on the results, the aperture of the shading devices did: devices with a projection of 1.75 m to 2.00 m performed best, achieving target lighting levels without glare problems.
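The parametric sweep logic is straightforward to sketch outside Grasshopper. In the hypothetical Python stand-in below, daylight_metrics is a placeholder for a DIVA/Radiance evaluation (its formula is invented), and the loop selects the best-performing shading configuration subject to a glare cap, mirroring the maximize-daylight-while-limiting-glare search described above.

```python
import itertools

def daylight_metrics(projection_m, tilt_deg):
    """Placeholder for one DIVA/Radiance run: returns (useful daylight
    illuminance score, glare probability). The formula is invented."""
    udi = 100 - abs(projection_m - 1.9) * 40 - abs(tilt_deg - 30) * 0.5
    glare = max(0.0, 0.45 - projection_m * 0.12)
    return udi, glare

best = None
for proj, tilt in itertools.product([1.5, 1.75, 2.0, 2.25], [0, 15, 30, 45]):
    udi, glare = daylight_metrics(proj, tilt)
    if glare <= 0.35 and (best is None or udi > best[0]):  # glare cap first
        best = (udi, proj, tilt)

print(best)   # best (score, projection in m, tilt in degrees) within the cap
```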

Relevance:

10.00%

Publisher:

Abstract:

‘Ghost Wash’ unveils the past in a contemporary context. It is a blend of video projection, sound, music and performance that reconstructs the anger, the angularity and the angst of Brisbane music from the late 1970s through the 1980s. The music is contained within an ongoing story about Brisbane's music history.

Relevance:

10.00%

Publisher:

Abstract:

This body of photographic work was created, first, to explore a new approach to practice-led research that uses an “action genre” approach to reflective practice (Lemke, 1995) and, second, to visually explore human interaction with the fundamental element of life: water. The first aim rests on the contention that to understand the meanings inherent in photographs we cannot look merely at the end result; it is essential to keep looking at the actions of practitioners, and the influences upon them, to determine how external influences affect the meaning potential of editorial photographs (Grayson, 2012). WATER therefore provides an ideal platform to reflect upon the actions and influences involved in creating work within the photographic genre of photojournalism. It enables the practitioner to reflect on each stage of production to gain a better understanding of how external influences shape the narrative potential of the images created. Photographers creating images experience multi-faceted influences that, in turn, play a part in constructing and presenting the narrative potential of editorial photographs: there is an important relationship between professional photographers and the technical, cultural, economic and institutional forces that impinge upon all stages of production and publication. Therefore, to understand the meanings inherent in the photographs within WATER, I do not look merely at the end result. The project provides a case study examining my actions in the field, and the influences upon me, to determine how external influences affect the meaning potential of these photographs (Grayson, 2012). As a result, this project adds to the body of scholarship around the definition of photojournalism and how it has adapted to the current media environment, and provides scope for further research into emerging genres within editorial photography, such as citizen photojournalism.

Concurrently, the photographs themselves were created to visually explore how a humanistic desire to interact with the natural form of water persists even within a modern cosmopolitan life lived around it. Taking a photojournalistic approach to this phenomenon, the images were created by capturing moments as they happened, with no posing or setting up of images. This serendipitous approach to the photographic medium at least allows the practitioner to attempt to direct the subjectivity contained explicitly in photographs. The result is a series of images that extends the visual dialogue around the role of water within modern humanistic lifestyles and how it remains an integral part of our society’s behaviors, documenting this relationship at this time of modern development.

The resulting works were exhibited and published as part of the Head On Photo Festival, Australia’s largest photo festival and the world’s second largest, in Sydney, 20-24 May 2013. The WATER series was curated by three Magnum members: Ian Berry, Eli Reed and Chris Steele-Perkins. Magnum is a highly regarded international photographic co-operative with editorial offices in New York, London, Paris and Tokyo. The works were projected as part of the official festival programme, presented to both members of the public and Sydney’s photography professionals. In addition, a sample of images from the WATER series was chosen for inclusion in the Magnum-published hardcover book.

References

Grayson, Louise. 2012. “Editorial photographs and patterns of practice.” Journalism Practice. http://www.tandfonline.com/doi/abs/10.1080/17512786.2012.726836#.UbZN-L--1RQ

Lemke, Jay. 1995. Textual Politics: Discourse and Social Dynamics. London: Taylor & Francis.

Relevance:

10.00%

Publisher:

Abstract:

Weather variables, mainly temperature and humidity, influence vectors, viruses, human biology and ecology, and consequently the intensity and distribution of vector-borne diseases. There is evidence that warmer temperatures due to climate change will influence dengue transmission. However, long-term scenario-based projections are yet to be developed. Here, we assessed the impact of weather variability on dengue transmission in the megacity of Dhaka, Bangladesh, and projected the future dengue risk attributable to climate change. Our results show that weather variables, particularly temperature and humidity, were positively associated with dengue transmission, with effects observed at a lag of four months. Assuming a temperature increase of 3.3 °C with no adaptation measures and no changes in socio-economic conditions, we project an increase of 16,030 dengue cases in Dhaka by the end of this century. This information may help public health authorities prepare for the likely increase in dengue due to climate change, and the modelling framework used in this study may be applicable to dengue projection in other cities.
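A minimal sketch of the kind of lagged count regression such a study might use follows, with synthetic monthly data and a four-month lag as reported above; the paper's actual modelling framework is not specified here, so this Poisson GLM is an assumption for illustration.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical monthly series for Dhaka: dengue counts regressed on
# temperature and humidity lagged by four months.
rng = np.random.default_rng(3)
n, lag = 120, 4
month = np.arange(n)
temp = 25 + 5 * np.sin(2 * np.pi * month / 12) + rng.normal(0, 1, n)
humid = 70 + 10 * np.sin(2 * np.pi * month / 12 - 1) + rng.normal(0, 2, n)

t_lag, h_lag = temp[:-lag], humid[:-lag]                  # weather 4 months earlier
cases = rng.poisson(np.exp(0.05 * t_lag + 0.02 * h_lag))  # synthetic counts

X = sm.add_constant(np.column_stack([t_lag, h_lag]))
fit = sm.GLM(cases, X, family=sm.families.Poisson()).fit()
print(fit.params)   # positive slopes = positive association, as reported
```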

Relevance:

10.00%

Publisher:

Abstract:

This work aims to promote integrity in autonomous perceptual systems, with a focus on outdoor unmanned ground vehicles equipped with a camera and a 2D laser range finder. A method to check for inconsistencies between the data provided by these two heterogeneous sensors is proposed and discussed. First, uncertainties in the estimated transformation between the laser and camera frames are evaluated and propagated up to the projection of the laser points onto the image. Then, for each laser scan-camera image pair acquired, the information at the corners of the laser scan is compared with the content of the image, yielding a likelihood of correspondence. The result of this process is used to validate segments of the laser scan that are found to be consistent with the image, while inconsistent segments are rejected. Experimental results illustrate how this technique can improve the reliability of perception in challenging environmental conditions, such as in the presence of airborne dust.
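The projection-with-uncertainty step can be sketched compactly. Below is a minimal pinhole projection of a laser point into the image with first-order propagation of extrinsic-translation uncertainty to pixel coordinates; the intrinsics, extrinsics and covariances are invented, and the real method also handles rotation uncertainty and the scan-corner comparison.

```python
import numpy as np

K = np.array([[700.0, 0.0, 320.0],      # hypothetical camera intrinsics
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])

def project(p_laser, R, t):
    """Pinhole projection of a laser-frame point into pixel coordinates."""
    p_cam = R @ p_laser + t
    u = K @ (p_cam / p_cam[2])
    return u[:2], p_cam

def pixel_covariance(p_laser, R, t, cov_t):
    """First-order propagation of extrinsic-translation uncertainty to the
    image: Sigma_uv = J cov_t J^T, with J = d(u,v)/dt = d(u,v)/dp_cam."""
    _, p = project(p_laser, R, t)
    x, y, z = p
    fx, fy = K[0, 0], K[1, 1]
    J = np.array([[fx / z, 0.0, -fx * x / z**2],
                  [0.0, fy / z, -fy * y / z**2]])
    return J @ cov_t @ J.T

R, t = np.eye(3), np.array([0.1, 0.0, 0.2])   # assumed laser->camera extrinsics
cov_t = np.diag([1e-4, 1e-4, 4e-4])           # calibration uncertainty (m^2)
p = np.array([1.0, 0.5, 8.0])                 # one laser point, metres
uv, _ = project(p, R, t)
print(uv)
print(pixel_covariance(p, R, t, cov_t))       # pixel-space uncertainty ellipse
```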

Relevance:

10.00%

Publisher:

Abstract:

This paper uses innovative content analysis techniques to map how the death of Oscar Pistorius’ girlfriend, Reeva Steenkamp, was framed in Twitter conversations. Around 1.5 million posts from a two-week timeframe are analyzed with a combination of syntactic and semantic methods. The analysis is grounded in the frame analysis perspective and differs from sentiment analysis: instead of looking for explicit evaluations, such as “he is guilty” or “he is innocent”, the results show how opinions can be identified through complex articulations of more implicit symbolic devices, such as repeatedly mentioned examples and metaphors. Different frames are adopted by users as more information about the case is revealed: from a more episodic frame, dominant at the very beginning, to more systemic frames highlighting the association of the event with urban violence, gun control issues, and violence against women. A detailed timeline of the discussions is provided.
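A toy illustration of tallying frame-indicator vocabulary over tweets is given below; the indicator terms are hypothetical, and the study's actual syntactic/semantic pipeline is considerably richer than this keyword match.

```python
import re
from collections import Counter

# Hypothetical indicator vocabularies for two of the frames reported above.
FRAMES = {
    "episodic": {"shooting", "bathroom", "shots", "girlfriend"},
    "systemic": {"gun", "control", "violence", "women", "crime"},
}

def frame_tally(tweets):
    """Count how many tweets match each frame's indicator vocabulary."""
    tally = Counter()
    for text in tweets:
        tokens = set(re.findall(r"[a-z']+", text.lower()))
        for frame, vocab in FRAMES.items():
            if tokens & vocab:
                tally[frame] += 1
    return tally

print(frame_tally([
    "Shots fired in the bathroom, girlfriend dead",
    "This is about gun control and violence against women",
]))
```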

Relevance:

10.00%

Publisher:

Abstract:

Genomic sequences are fundamentally text documents, admitting various representations according to need and tokenization. Gene expression depends crucially on the binding of enzymes to the DNA sequence at small, poorly conserved binding sites, limiting the utility of standard pattern search. However, one may exploit the regular syntactic structure of the enzyme's component proteins and the corresponding binding sites, framing the problem as one of detecting grammatically correct genomic phrases. In this paper we propose new kernels based on weighted tree structures, traversing the paths within them to capture the features which underpin the task. Experimentally, we find that these kernels provide performance comparable with state-of-the-art approaches for this problem, while offering significant computational advantages over earlier methods. The methods proposed may be applied to a broad range of sequence or tree-structured data in molecular biology and other domains.
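As a simplified sketch of a weighted-tree path kernel (not necessarily the paper's exact construction), the code below represents trees as nested tuples, collects depth-decayed root-to-node label paths, and takes the inner product of the resulting feature maps.

```python
from collections import Counter

def label_paths(tree, prefix=(), weights=None, decay=0.5, depth=0):
    """Collect depth-decayed root-to-node label paths from a nested-tuple
    tree of the form (label, child, child, ...)."""
    if weights is None:
        weights = Counter()
    label, *children = tree
    path = prefix + (label,)
    weights[path] += decay ** depth        # deeper paths contribute less
    for child in children:
        label_paths(child, path, weights, decay, depth + 1)
    return weights

def path_kernel(t1, t2):
    """Inner product of the two trees' weighted path feature maps --
    a valid kernel, since it is an explicit feature-space dot product."""
    w1, w2 = label_paths(t1), label_paths(t2)
    return sum(w1[p] * w2[p] for p in w1.keys() & w2.keys())

# Tiny parse-like trees over genomic "phrases" (labels are illustrative).
t1 = ("S", ("NP", ("site",)), ("VP", ("binds",), ("NP", ("enzyme",))))
t2 = ("S", ("NP", ("site",)), ("VP", ("binds",)))
print(path_kernel(t1, t2))   # shared weighted paths -> 1.625
```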

Relevance:

10.00%

Publisher:

Abstract:

There has been intense debate about climatic impacts on the transmission of malaria. It is vitally important to accurately project the future impacts of climate change on malaria in order to support effective policy-making and intervention activity concerning malaria control and prevention. This paper critically reviews the published literature and examines both key findings and methodological issues in projecting the future impacts of climate change on malaria transmission. A literature search was conducted using the electronic databases MEDLINE, Web of Science and PubMed. The projected impacts of climate change on malaria transmission were spatially heterogeneous and somewhat inconsistent. The variation in results may be explained by the interaction of climatic factors and malaria transmission cycles, variations in projection frameworks, and uncertainties about future socioecological (including climate) changes. Current knowledge gaps are identified, future research directions are proposed and public health implications are assessed. Improving the understanding of the dynamic effects of climate on malaria transmission cycles, advancing modelling techniques and incorporating uncertainties about future socioecological changes are critical for projecting the impact of climate change on malaria transmission.

Relevance:

10.00%

Publisher:

Abstract:

Young children are thought to be particularly sensitive to heat waves, but relatively little research attention has been paid to this field to date. A systematic review was conducted to elucidate the relationship between heat waves and children's health. Literature published up to August 2012 was identified using the following MeSH terms and keywords: “heatwave”, “heat wave”, “child health”, “morbidity”, “hospital admission”, “emergency department visit”, “family practice”, “primary health care”, “death” and “mortality”. Of the 628 publications identified, 12 met the selection criteria. The existing literature does not consistently suggest that mortality among children increases significantly during heat waves, although more heat-related deaths were reported among infants. Exposure to heat waves in the perinatal period may pose a threat to children's health. Pediatric diseases and conditions associated with heat waves include renal disease, respiratory disease, electrolyte imbalance and fever. Future research should focus on developing a consistent definition of a heat wave from a children's health perspective, identifying the best measure of children's exposure to heat waves, exploring sensitive outcome measures to quantify the impact of heat waves on children, evaluating the possible impacts of heat waves on children's birth outcomes, and understanding the differences in vulnerability to heat waves among children of different ages and from countries of different income levels. Projection of the children's disease burden caused by heat waves under climate change scenarios, and development of effective heat wave mitigation and adaptation strategies that incorporate other child-protective health measures, are also strongly recommended.