886 results for Projection
Abstract:
Robust hashing is an emerging field concerned with hashing data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, and non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed for deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant, so the hashing process is sensitive to any change in the input. Unfortunately, in certain applications input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are based not on cryptographic methods but on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction, followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, which is detrimental to the security properties required of hash functions. Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training on both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, an essential requirement for non-invertibility. The method is also designed to produce features more suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded, and displays improved hashing performance compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
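As a hedged illustration of the generic pipeline this abstract describes (feature extraction, linear randomization, quantization and binary encoding), the following Python sketch hashes an image using block-mean features and a key-dependent random projection. It is not the dissertation's HOS/Radon method; the block size, projection and median threshold are illustrative assumptions only.

```python
# Minimal sketch of a generic linear-randomization robust hash (not the HOS method).
import numpy as np

def robust_hash(image: np.ndarray, n_bits: int = 64, seed: int = 0) -> np.ndarray:
    rng = np.random.default_rng(seed)                 # key-dependent randomness
    # Feature extraction: coarse 8x8 block means, robust to small perturbations.
    h, w = image.shape
    blocks = image[: h - h % 8, : w - w % 8].reshape(h // 8, 8, w // 8, 8)
    features = blocks.mean(axis=(1, 3)).ravel()
    # Randomization: compressive linear random projection.
    P = rng.standard_normal((n_bits, features.size))
    projected = P @ features
    # Quantization and encoding: binarize against the median threshold.
    return (projected > np.median(projected)).astype(np.uint8)
```

Because the features vary only slightly under minor image changes, hashes of perceptually similar images differ in few bits, while the projection keyed by `seed` prevents direct inversion.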
Abstract:
Reliability analysis is crucial to reducing the unexpected downtime, severe failures and ever-tightening maintenance budgets of engineering assets. Hazard-based reliability methods are of particular interest because hazard reflects the current health status of engineering assets and their imminent failure risks. Most existing hazard models were constructed using statistical methods. However, these methods rest largely on two assumptions: that the baseline failure distribution is accurate for the population concerned, and that the effects of covariates on hazards take an assumed form. These two assumptions may be difficult to satisfy and can therefore compromise the effectiveness of hazard models in application. To address this issue, a non-linear hazard modelling approach is developed in this research using neural networks (NNs), resulting in neural network hazard models (NNHMs) that avoid the limitations imposed by the two assumptions of statistical models. With the success of failure prevention efforts, less failure history is available for reliability analysis. Involving condition data, or covariates, is a natural solution to this challenge. A critical issue in involving covariates in reliability analysis is that complete and consistent covariate data are often unavailable in reality, owing to inconsistent measuring frequencies of multiple covariates, sensor failure, and sparse intrusive measurements. This problem has not been studied adequately in current reliability applications. This research therefore investigates the incomplete covariates problem in reliability analysis. Typical approaches to handling incomplete covariates were studied to assess their performance and their effects on reliability analysis results. Since these existing approaches can underestimate the variance in regressions and introduce extra uncertainties into reliability analysis, the developed NNHMs are extended to handle incomplete covariates as an integral part of the model. The extended versions of the NNHMs have been validated using simulated bearing data and real data from a liquefied natural gas pump. The results demonstrate that the new approach outperforms the typical approaches to handling incomplete covariates. Another problem in reliability analysis is that future covariates of engineering assets are generally unavailable. In existing practice for multi-step reliability analysis, historical covariates are used to estimate future covariates. Covariates of engineering assets, however, are often subject to substantial fluctuation owing to the influence of both engineering degradation and changes in environmental settings. The commonly used covariate extrapolation methods are thus unsuitable because of error accumulation and uncertainty propagation. To overcome this difficulty, instead of directly extrapolating covariate values, this research projects covariate states. The estimated covariate states and the unknown covariate values in future running steps of assets constitute an incomplete covariate set, which is then analysed by the extended NNHMs. A new assessment function is also proposed to evaluate the risks of underestimated and overestimated reliability analysis results. A case study using field data from a paper and pulp mill demonstrates that this new multi-step reliability analysis procedure generates more accurate analysis results.
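The following is a minimal, hypothetical sketch of the idea behind an NNHM: a small neural network maps working age and condition covariates to a non-negative hazard, imposing neither a baseline failure distribution nor a fixed form for covariate effects. The architecture, random weights and softplus output are illustrative only; in practice the weights would be trained on failure and condition-monitoring data.

```python
# Hedged sketch of a neural-network hazard model (illustrative, untrained).
import numpy as np

def nn_hazard(t, covariates, W1, b1, W2, b2):
    """Hazard at working age t given a covariate vector; always non-negative."""
    x = np.concatenate(([t], covariates))    # input: age plus condition covariates
    hidden = np.tanh(W1 @ x + b1)            # non-linear hidden layer, no assumed form
    return float(np.logaddexp(0.0, W2 @ hidden + b2))  # softplus keeps hazard >= 0

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((8, 4)), np.zeros(8)  # age + 3 covariates -> 8 hidden units
W2, b2 = rng.standard_normal(8), 0.0
print(nn_hazard(100.0, np.array([0.2, 1.3, 0.7]), W1, b1, W2, b2))
```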
Abstract:
“The Cube” is a unique facility that combines 48 large multi-touch screens and very large-scale projection surfaces to form one of the world’s largest interactive learning and engagement spaces. The Cube facility is part of the Queensland University of Technology’s (QUT) newly established Science and Engineering Centre, designed to showcase QUT’s teaching and research capabilities in the STEM (Science, Technology, Engineering, and Mathematics) disciplines. In this application paper we describe the Cube: its technical capabilities, design rationale and practical day-to-day operations supporting up to 70,000 visitors per week. Essential to the Cube’s operation are five interactive applications designed and developed in tandem with the Cube’s technical infrastructure. Each of the Cube’s launch applications was designed and delivered by an independent team, while the overall vision of the Cube was shepherded by a small executive team. The diversity of design, implementation and integration approaches pursued by these five teams provides insight into the challenges, and opportunities, presented when working with large distributed interaction technologies. We describe each of these applications in order to discuss the different challenges and user needs they address, the types of interactions they support, and how they utilise the capabilities of the Cube facility.
Abstract:
The promise of metabonomics, a new "omics" technique, for validating Chinese medicines and the compatibility of Chinese formulas has been appreciated. The present study explored the excretion pattern of low-molecular-mass metabolites in a male Wistar-derived rat model of kidney yin deficiency induced with thyroxine and reserpine, as well as the therapeutic effect of Liu Wei Di Huang Wan (LW), a classic traditional Chinese medicine formula for treating kidney yin deficiency in China, and its separated prescriptions. The study utilized ultra-performance liquid chromatography/electrospray ionization synapt high-definition mass spectrometry (UPLC/ESI-SYNAPT-HDMS) in both negative and positive electrospray ionization (ESI) modes. At the same time, blood biochemistry was examined to identify specific changes in kidney yin deficiency. Distinct changes in the pattern of metabolites resulting from daily administration of thyroxine and reserpine were observed by UPLC-HDMS combined with principal component analysis (PCA). According to the PCA score plots, the changes in metabolic profiling were restored to their baseline values after treatment with LW. Altogether, the current metabonomic approach based on UPLC-HDMS and orthogonal projection to latent structures discriminant analysis (OPLS-DA) indicated 20 ions (14 in the negative mode, 8 in the positive mode, and 2 in both) as "differentiating metabolites".
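As a minimal sketch of the pattern-recognition step described above, the snippet below runs PCA on a samples-by-ion intensity matrix to produce the kind of score plot used to judge group separation and metabolic-profile recovery. The data shape and simulated values are purely illustrative.

```python
# Hedged sketch: PCA score computation on a metabolite intensity matrix.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.random((20, 500))                      # 20 rat samples x 500 ion intensities
X_scaled = StandardScaler().fit_transform(X)   # scale each ion to unit variance
scores = PCA(n_components=2).fit_transform(X_scaled)
# Plotting scores[:, 0] against scores[:, 1], coloured by group (control, model,
# LW-treated), gives the PCA score plot used to assess profile restoration.
print(scores.shape)                            # (20, 2)
```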
The health effects of temperature: current estimates, future projections, and adaptation strategies
Abstract:
Climate change is expected to be one of the biggest global health threats of the 21st century. In response to changes in climate and the associated extreme events, public health adaptation has become imperative. This thesis examined several key issues in this emerging research field. It aimed to identify climate-health (particularly temperature-health) relationships, then develop quantitative models that can be used to project the future health impacts of climate change, and thereby help formulate adaptation strategies for dealing with climate-related health risks and reducing vulnerability. The research questions addressed by this thesis were: (1) What are the barriers to public health adaptation to climate change, and what are the research priorities in this emerging field? (2) What models and frameworks can be used to project future temperature-related mortality under different climate change scenarios? (3) What is the actual burden of temperature-related mortality, and what are the impacts of climate change on the future burden of disease? (4) Can we develop public health adaptation strategies to manage the health effects of temperature in response to climate change? Using a literature review, I discussed how public health organisations should implement and manage the process of planned adaptation. This review showed that public health adaptation can operate at two levels: building adaptive capacity and implementing adaptation actions. However, there are constraints and barriers to adaptation arising from uncertainty, cost, technological limits, institutional arrangements, deficits of social capital, and individual perceptions of risk. The opportunities for planning and implementing public health adaptation rely on effective strategies to overcome likely barriers. I proposed that high priority should be given to multidisciplinary research on the assessment of the potential health effects of climate change, projections of future health impacts under different climate and socio-economic scenarios, identification of the health co-benefits of climate change policies, and evaluation of cost-effective public health adaptation options. Heat-related mortality is the most direct and significant potential impact of climate change on human health. I therefore conducted a systematic review of research and methods for projecting future heat-related mortality under different climate change scenarios. The review showed that climate change is likely to result in a substantial increase in heat-related mortality. Projecting heat-related mortality requires an understanding of historical temperature-mortality relationships and consideration of future changes in climate, population and acclimatisation. Further research is needed to provide a stronger theoretical framework for mortality projections, including a better understanding of socio-economic development, adaptation strategies, land-use patterns, air pollution and mortality displacement. Most previous studies were designed to examine temperature-related excess deaths or mortality risks. However, if most temperature-related deaths occur in the very elderly, who have only a short life expectancy, then the burden of temperature on mortality has less public health importance. To guide policy decisions and resource allocation, it is desirable to know the actual burden of temperature-related mortality. To achieve this, I used years of life lost to provide a new measure of the health effects of temperature.
I conducted a time-series analysis to estimate the years of life lost associated with changes in season and temperature in Brisbane, Australia, and projected the future temperature-related years of life lost attributable to climate change. This study showed that the association between temperature and years of life lost was U-shaped, with increased years of life lost on cold and hot days. Temperature-related years of life lost will worsen greatly if future climate change exceeds a 2 °C increase and there is no adaptation to higher temperatures. The excess mortality during prolonged extreme temperatures is often greater than that predicted from the smoothed temperature-mortality association, because sustained periods of extreme temperatures produce an extra effect beyond that predicted by daily temperatures alone. To better estimate the burden of extreme temperatures, I estimated their effects on years of life lost due to cardiovascular disease using data from Brisbane, Australia. The results showed that the association between daily mean temperature and years of life lost due to cardiovascular disease was U-shaped, with the lowest years of life lost at 24 °C (the 75th percentile of daily mean temperature in Brisbane), rising progressively as temperatures become hotter or colder. There were significant added effects of heat waves, but no added effects of cold spells. Finally, public health adaptation to hot weather is necessary and pressing. I discussed how to manage the health effects of temperature, especially in the context of climate change. Strategies to minimise the health effects of high temperatures and climate change fall into two categories: reducing heat exposure and managing the health effects of high temperatures. However, policy decisions need information on specific adaptations, together with their expected costs and benefits; more research is therefore needed to evaluate cost-effective adaptation options. In summary, this thesis adds to the large body of literature on the impacts of temperature and climate change on human health. It improves our understanding of the temperature-health relationship and how this relationship will change as temperatures increase. Although the research is limited to one city, which restricts the generalisability of the findings, the methods and approaches developed in this thesis will be useful to other researchers studying temperature-health relationships and climate change impacts. The results may be helpful for decision-makers developing public health adaptation strategies to minimise the health effects of extreme temperatures and climate change.
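To make the U-shaped association concrete, the sketch below fits a simple quadratic of daily mean temperature to simulated daily years of life lost and recovers the temperature of minimum burden. The thesis used time-series regression with more careful confounder control; the data and quadratic form here are illustrative assumptions only.

```python
# Hedged sketch: recovering a U-shaped temperature-YLL curve from simulated data.
import numpy as np

rng = np.random.default_rng(0)
temp = rng.uniform(10, 32, 1000)                            # daily mean temperature (deg C)
yll = 50 + 0.8 * (temp - 24) ** 2 + rng.normal(0, 5, 1000)  # simulated daily YLL
a, b, c = np.polyfit(temp, yll, deg=2)                      # fit YLL = a*T^2 + b*T + c
print(f"minimum YLL near {-b / (2 * a):.1f} deg C")         # ~24 by construction
```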
Abstract:
Daylight devices are important components of any climate-responsive façade system. However, the evolution of parametric CAD systems and digital fabrication has influenced architectural form, shifting regular forms toward complex geometries. The architectural and engineering integration of daylight devices in envelopes with complex geometries is a challenge in terms of both design and performance evaluation. The purpose of this paper is to assess the daylight performance of a building with a climate-responsive envelope of complex geometry that integrates shading devices into the façade. The case study is based on the Esplanade buildings in Singapore. Climate-based daylight metrics such as Daylight Availability and Useful Daylight Illuminance are used. DIVA (daylight simulation) and Grasshopper (parametric analysis) plug-ins for Rhinoceros were employed to examine the range of performance possibilities. Parameters such as the dimension, inclination, projected shadows and shape of the device were varied in order to maximize Daylight Availability and Useful Daylight Illuminance while minimizing glare probability. While orientation did not have a great impact on the results, the aperture of the shading devices did: shading devices with a projection of 1.75 m to 2.00 m performed best, achieving target lighting levels without issues of glare.
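The parametric analysis described above amounts to a constrained search over shading-device parameters. The generic sketch below sweeps device depth and tilt, maximizing a daylight metric subject to a glare cap; the `evaluate` function is a purely hypothetical stand-in for a DIVA/Radiance simulation call, and its toy response surface is not from the paper.

```python
# Hedged sketch of a parametric daylight search; evaluate() is hypothetical.
from itertools import product

def evaluate(depth_m: float, tilt_deg: float) -> tuple[float, float]:
    """Hypothetical stand-in for a simulation, returning (UDI score, glare index)."""
    udi = 100 - abs(depth_m - 1.9) * 40 - abs(tilt_deg - 20) * 0.5
    glare = max(0.0, 0.4 - depth_m * 0.1)
    return udi, glare

best = max(
    (p for p in product([1.5, 1.75, 2.0, 2.25], [0, 10, 20, 30])
     if evaluate(*p)[1] < 0.35),               # enforce the glare constraint first
    key=lambda p: evaluate(*p)[0],             # then maximize useful daylight
)
print(best)                                    # with this toy model: (2.0, 20)
```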
Abstract:
‘Ghost Wash’ unveils the past in a contemporary context. It is a blend of video projection, sound, music and performance that reconstructs the anger, the angularity, and the angst of Brisbane music from the late 1970s through the 1980s. The music is contained within an ongoing story about Brisbane's music history.
Abstract:
This body of photographic work was created firstly to explore a new approach to practice-led research that uses an “action genre” approach to reflective practice (Lemke), and secondly to visually explore human interaction with the fundamental element of life: water. The first aim is based on the contention that to understand the meanings inherent in photographs we cannot look merely at the end result; it is essential to keep looking at the actions of practitioners, and the influences upon them, to determine how external influences affect the meaning potential of editorial photographs (Grayson, 2012). WATER therefore provides an ideal platform to reflect upon the actions and influences involved in creating work within the photographic genre of photojournalism. It enables the practitioner to reflect on each stage of production to gain a better understanding of how external influences affect the narrative potential of the images created. Photographers experience multi-faceted influences that, in turn, shape the construction and presentation of the narrative potential of editorial photographs. There is an important relationship between professional photographers and the technical, cultural, economic and institutional forces that impinge upon all stages of production and publication, and reflecting on this relationship yields a greater understanding of those forces. Therefore, to understand the meanings inherent in the photographs within WATER, I do not look merely at the end result. The project provides a case study examining my actions in the field, and the influences upon me, to determine how external influences affect the meaning potential of these photographs (Grayson, 2012). As a result, this project adds to the body of scholarship around the definition of photojournalism and how it has adapted to the current media environment, and provides scope for further research into emerging genres within editorial photography, such as citizen photojournalism. Concurrently, the photographs themselves were created to visually explore how a humanistic desire to interact with the natural form of water persists even within a modern cosmopolitan life lived around it. Taking a photojournalistic approach to this phenomenon, the images were created by “capturing moments as they happened”, with no posing or setting up of images. This serendipitous approach to the photographic medium gives the practitioner at least some means of directing the subjectivity inherent in photographs. The result is a series of images that extends the visual dialogue around the role of water in modern humanistic lifestyles and shows how it remains an integral part of our society's behaviors. It captures important moments that document this relationship at this time of modern development. The resulting works were exhibited and published as part of the Head On Photo Festival (Sydney, 20-24 May 2013), Australia's largest photo festival and the world's second largest. The WATER series was curated by three Magnum members: Ian Berry, Eli Reed and Chris Steele-Perkins. Magnum is a highly regarded international photographic co-operative with editorial offices in New York, London, Paris and Tokyo. The works were projected as part of the official festival programme, presented to both members of the public and Sydney's photography professionals.
In addition, a sample of images from the WATER series was chosen for inclusion in the Magnum-published hardcover book.

References:
Grayson, Louise. 2012. “Editorial photographs and patterns of practice.” Journalism Practice. http://www.tandfonline.com/doi/abs/10.1080/17512786.2012.726836#.UbZN-L--1RQ
Lemke, Jay. 1995. Textual Politics: Discourse and Social Dynamics. London: Taylor & Francis.
Abstract:
Weather variables, mainly temperature and humidity, influence vectors, viruses, human biology and ecology, and consequently the intensity and distribution of vector-borne diseases. There is evidence that warmer temperatures due to climate change will influence dengue transmission. However, long-term scenario-based projections are yet to be developed. Here, we assessed the impact of weather variability on dengue transmission in the megacity of Dhaka, Bangladesh, and projected the future dengue risk attributable to climate change. Our results show that weather variables, particularly temperature and humidity, were positively associated with dengue transmission, with effects observed at a lag of four months. We project that, assuming a temperature increase of 3.3 °C with no adaptation measures and no changes in socio-economic conditions, there will be an increase of 16,030 dengue cases in Dhaka by the end of this century. This information may help public health authorities prepare for the likely increase in dengue due to climate change. The modelling framework used in this study may be applicable to dengue projection in other cities.
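The sketch below illustrates one common way to estimate this kind of lagged weather-dengue association: a Poisson regression of monthly case counts on temperature and humidity lagged by four months. The simulated data, coefficients and single-lag structure are illustrative assumptions; the paper's actual model specification may differ.

```python
# Hedged sketch: Poisson regression of dengue counts on 4-month-lagged weather.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
months = np.arange(120)                               # ten years of monthly data
temp = 25 + 5 * np.sin(2 * np.pi * months / 12)       # simulated temperature (deg C)
humid = 70 + 10 * np.cos(2 * np.pi * months / 12)     # simulated relative humidity (%)
lag = 4                                               # four-month lag, as reported
mu = np.exp(0.5 + 0.08 * temp[:-lag] + 0.02 * humid[:-lag])
cases = rng.poisson(mu)                               # simulated counts, months lag..119
X = sm.add_constant(np.column_stack([temp[:-lag], humid[:-lag]]))
fit = sm.GLM(cases, X, family=sm.families.Poisson()).fit()
print(fit.params)       # positive weather coefficients indicate positive association
```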
Abstract:
This work aims to promote integrity in autonomous perceptual systems, focusing on outdoor unmanned ground vehicles equipped with a camera and a 2D laser range finder. A method to check for inconsistencies between the data provided by these two heterogeneous sensors is proposed and discussed. First, uncertainties in the estimated transformation between the laser and camera frames are evaluated and propagated through to the projection of the laser points onto the image. Then, for each acquired pair of laser scan and camera image, the information at the corners of the laser scan is compared with the content of the image, yielding a likelihood of correspondence. The result of this process is used to validate segments of the laser scan that are consistent with the image, while inconsistent segments are rejected. Experimental results illustrate how this technique can improve the reliability of perception in challenging environmental conditions, such as in the presence of airborne dust.
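The core geometric step, projecting laser points into the image via the estimated extrinsic calibration, can be sketched as below. The paper additionally propagates calibration uncertainty to these pixel coordinates; that step is omitted here, and all matrices are illustrative placeholders.

```python
# Hedged sketch: pinhole projection of laser points onto a camera image.
import numpy as np

def project_laser_points(pts_laser: np.ndarray, R: np.ndarray, t: np.ndarray,
                         K: np.ndarray) -> np.ndarray:
    """Project (N, 3) laser-frame points to (N, 2) pixel coordinates.

    A 2D scan is lifted to 3D by setting z = 0 in the scan plane.
    R, t: estimated laser-to-camera rotation and translation; K: camera intrinsics.
    """
    pts_cam = (R @ pts_laser.T).T + t    # laser frame -> camera frame (extrinsics)
    uvw = (K @ pts_cam.T).T              # pinhole projection (intrinsics)
    return uvw[:, :2] / uvw[:, 2:3]      # divide by depth to get pixel coordinates
```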
Abstract:
There has been intense debate about climatic impacts on the transmission of malaria. It is vitally important to accurately project the future impacts of climate change on malaria to support effective policy-making and intervention activities concerning malaria control and prevention. This paper critically reviewed the published literature and examined both key findings and methodological issues in projecting the future impacts of climate change on malaria transmission. A literature search was conducted using the electronic databases MEDLINE, Web of Science and PubMed. The projected impacts of climate change on malaria transmission were spatially heterogeneous and somewhat inconsistent. The variation in results may be explained by the interaction of climatic factors and malaria transmission cycles, variations in projection frameworks, and uncertainties in future socio-ecological (including climate) changes. Current knowledge gaps are identified, future research directions are proposed and public health implications are assessed. Improving the understanding of the dynamic effects of climate on malaria transmission cycles, advancing modelling techniques, and incorporating uncertainties in future socio-ecological changes are critical for projecting the impact of climate change on malaria transmission.
Abstract:
Young children are thought to be particularly sensitive to heat waves, but relatively little research attention has been paid to this field to date. A systematic review was conducted to elucidate the relationship between heat waves and children's health. Literature published up to August 2012 was identified using the following MeSH terms and keywords: “heatwave”, “heat wave”, “child health”, “morbidity”, “hospital admission”, “emergency department visit”, “family practice”, “primary health care”, “death” and “mortality”. Of the 628 publications identified, 12 met the selection criteria. The existing literature does not consistently suggest that mortality among children increases significantly during heat waves, although infants were associated with more heat-related deaths. Exposure to heat waves in the perinatal period may pose a threat to children's health. Pediatric diseases or conditions associated with heat waves include renal disease, respiratory disease, electrolyte imbalance and fever. Future research should focus on developing a consistent definition of a heat wave from a children's-health perspective, identifying the best measure of children's exposure to heat waves, exploring sensitive outcome measures to quantify the impact of heat waves on children, evaluating the possible impacts of heat waves on children's birth outcomes, and understanding the differences in vulnerability among children of different ages and from countries of different income levels. Projection of children's disease burden caused by heat waves under climate change scenarios, and development of effective heat wave mitigation and adaptation strategies that incorporate other child-protective health measures, are also strongly recommended.
Abstract:
Background: Many studies have found associations between climatic conditions and dengue transmission. However, there is debate about the future impacts of climate change on dengue transmission. This paper reviewed epidemiological evidence on the relationship between climate and dengue, with a focus on quantitative methods for assessing the potential impacts of climate change on global dengue transmission.
Methods: A literature search was conducted in October 2012 using the electronic databases PubMed, Scopus, ScienceDirect, ProQuest, and Web of Science. The search focused on peer-reviewed journal articles published in English from January 1991 through October 2012.
Results: Sixteen studies met the inclusion criteria, and most showed that the transmission of dengue is highly sensitive to climatic conditions, especially temperature, rainfall and relative humidity. Studies of the potential impacts of climate change on dengue indicate increased climatic suitability for transmission and an expansion of the geographic regions at risk during this century. A variety of quantitative modelling approaches were used in the studies. Several key methodological issues and current knowledge gaps were identified through this review.
Conclusions: It is important to assemble spatio-temporal patterns of dengue transmission compatible with long-term data on climate and other socio-ecological changes, as this would advance projections of dengue risks associated with climate change.
Keywords: Climate; Dengue; Models; Projection; Scenarios
Abstract:
Working primarily within the natural landscape, this practice-led research project explored connections between the artist's visual and perceptual experience of a journey or place while simultaneously emphasizing the capacity for digital media to create a perceptual dissonance. By exploring concepts of time, viewpoint, duration of sequences and the manipulation of traditional constructs of stop-frame animation, the practical work created a cognitive awareness of the elements of the journey through optical sensations. The work allowed an opportunity to reflect on the nature of visual experience and its mediation through images. The project recontextualized the selected mediums of still photography, animation and projection within contemporary display modes of multiple screen installations by analysing relationships between the experienced and the perceived. The resulting works added to current discourse on the interstices between still and moving imagery in a digital world.
Abstract:
Recent advances suggest that encoding images through Symmetric Positive Definite (SPD) matrices and then interpreting such matrices as points on Riemannian manifolds can lead to increased classification performance. Taking into account manifold geometry is typically done via (1) embedding the manifolds in tangent spaces, or (2) embedding into Reproducing Kernel Hilbert Spaces (RKHS). While embedding into tangent spaces allows the use of existing Euclidean-based learning algorithms, manifold shape is only approximated which can cause loss of discriminatory information. The RKHS approach retains more of the manifold structure, but may require non-trivial effort to kernelise Euclidean-based learning algorithms. In contrast to the above approaches, in this paper we offer a novel solution that allows SPD matrices to be used with unmodified Euclidean-based learning algorithms, with the true manifold shape well-preserved. Specifically, we propose to project SPD matrices using a set of random projection hyperplanes over RKHS into a random projection space, which leads to representing each matrix as a vector of projection coefficients. Experiments on face recognition, person re-identification and texture classification show that the proposed approach outperforms several recent methods, such as Tensor Sparse Coding, Histogram Plus Epitome, Riemannian Locality Preserving Projection and Relational Divergence Classification.
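To make the proposed representation concrete, the sketch below represents an SPD matrix by random projections in a kernel space: kernel similarities to a training set are combined with random hyperplane weights, yielding an ordinary Euclidean feature vector. The log-Euclidean Gaussian kernel, sampling scheme and normalisation here are assumptions for illustration; the paper's exact construction may differ.

```python
# Hedged sketch: random projection of SPD matrices over an RKHS.
import numpy as np
from scipy.linalg import logm

def log_euclidean_kernel(A, B, sigma=1.0):
    """Gaussian kernel on the log-Euclidean distance between SPD matrices."""
    d = np.linalg.norm(logm(A) - logm(B), "fro")
    return np.exp(-d ** 2 / (2 * sigma ** 2))

def rkhs_random_projection(X, train, n_proj=16, seed=0):
    """Represent SPD matrix X as a vector of random projection coefficients."""
    rng = np.random.default_rng(seed)
    k = np.array([log_euclidean_kernel(X, T) for T in train])  # similarities to train set
    W = rng.standard_normal((n_proj, len(train)))              # random hyperplane weights
    return W @ k   # Euclidean vector, usable by unmodified learning algorithms
```

Once every SPD matrix is mapped to such a coefficient vector, any standard Euclidean classifier (nearest neighbour, SVM, and so on) can be applied without kernelisation.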