229 results for Multidimensional projection


Relevance: 10.00%

Abstract:

Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, and non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed for deterministic (non-changing) inputs such as files and passwords. For such data types a one-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction, followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. To preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security properties required of hash functions.
Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, an essential requirement for non-invertibility, and is designed to produce features better suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
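The pipeline described above (feature extraction, a linear randomization stage, then quantization and binary encoding) can be sketched as follows. This is a minimal illustration of the generic scheme, not the dissertation's specific algorithm: the key-seeded Gaussian projection and the median threshold are assumptions chosen for simplicity.

```python
import numpy as np

def robust_hash(features, key, n_bits=64):
    """Sketch of a generic robust hash: a key-seeded random projection
    (the linear randomization stage), then threshold quantization and
    binary encoding. Illustrative only, not the dissertation's algorithm."""
    rng = np.random.default_rng(key)                  # secret key seeds the projection
    P = rng.standard_normal((n_bits, features.size))  # compressive linear randomization
    projected = P @ features
    threshold = np.median(projected)                  # a simple quantization threshold
    return (projected > threshold).astype(np.uint8)   # binary hash output

# Two slightly perturbed inputs (e.g. features of an image before and
# after mild re-compression) should yield nearly identical hashes:
x = np.random.default_rng(0).standard_normal(256)
noise = 1e-3 * np.random.default_rng(1).standard_normal(256)
h1 = robust_hash(x, key=42)
h2 = robust_hash(x + noise, key=42)
hamming = int((h1 != h2).sum())   # small Hamming distance expected
```

Comparison then happens in the hash domain via Hamming distance, which is exactly what makes a linear randomization stage problematic for security: the projection preserves enough structure to be robust, but also enough to leak information.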

Relevance: 10.00%

Abstract:

The development of user expertise is a strategic imperative for organizations in hyper-competitive markets. This paper conceptualizes, operationalizes and validates user expertise in contemporary Information Systems (IS) as a formative, multidimensional index. Such a validated and widely accepted index would facilitate the progression of past research on user competence and IS efficacy to complex contemporary IS, while at the same time providing a benchmark against which organizations can track their user expertise. The validation involved three separate studies, including exploratory and confirmatory phases, using data from 244 respondents.

Relevance: 10.00%

Abstract:

Reliability analysis is crucial to reducing the unexpected downtime, severe failures and ever-tightening maintenance budgets of engineering assets. Hazard-based reliability methods are of particular interest because hazard reflects the current health status of engineering assets and their imminent failure risks. Most existing hazard models were constructed using statistical methods. However, these methods rest largely on two assumptions: that the baseline failure distribution is accurate for the population concerned, and that the assumed form of the effects of covariates on hazards holds. These assumptions may be difficult to satisfy and can therefore compromise the effectiveness of hazard models in application. To address this issue, a non-linear hazard modelling approach is developed in this research using neural networks (NNs), resulting in neural network hazard models (NNHMs), to overcome the limitations these two assumptions impose on statistical models. As failure prevention efforts succeed, less failure history becomes available for reliability analysis. Involving condition data or covariates is a natural solution to this challenge. A critical issue in involving covariates in reliability analysis is that complete and consistent covariate data are often unavailable in reality, due to inconsistent measuring frequencies of multiple covariates, sensor failure, and sparse intrusive measurements. This problem has not been studied adequately in current reliability applications. This research therefore investigates the incomplete-covariates problem in reliability analysis. Typical approaches to handling incomplete covariates have been studied to investigate their performance and their effects on reliability analysis results.
Since these existing approaches can underestimate the variance in regressions and introduce extra uncertainty into reliability analysis, the developed NNHMs are extended to handle incomplete covariates as an integral part of the model. The extended NNHMs have been validated using simulated bearing data and real data from a liquefied natural gas pump. The results demonstrate that the new approach outperforms the typical approaches to handling incomplete covariates. Another problem in reliability analysis is that future covariates of engineering assets are generally unavailable. In existing practice for multi-step reliability analysis, historical covariates are used to estimate future covariates. Covariates of engineering assets, however, are often subject to substantial fluctuation under the influence of both engineering degradation and changes in environmental settings. Commonly used covariate extrapolation methods are thus unsuitable because of error accumulation and uncertainty propagation. To overcome this difficulty, instead of directly extrapolating covariate values, this research projects covariate states. The estimated covariate states and the unknown covariate values at future running steps of assets constitute an incomplete covariate set, which is then analysed by the extended NNHMs. A new assessment function is also proposed to evaluate the risks of underestimated and overestimated reliability analysis results. A case study using field data from a paper and pulp mill demonstrates that this new multi-step reliability analysis procedure generates more accurate results.
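The two statistical assumptions criticised above are made explicit by the classic proportional hazards form h(t|z) = h0(t)·exp(β·z): a parametric baseline hazard and a fixed log-linear covariate effect. The sketch below is a minimal numeric illustration of that conventional model (the Weibull baseline and all parameter values are assumptions for illustration, not from the thesis), showing the structure the NNHMs are designed to relax.

```python
import numpy as np

def weibull_baseline_hazard(t, shape=2.0, scale=100.0):
    """Assumption 1: a parametric baseline failure distribution
    (a Weibull form, chosen here purely for illustration)."""
    return (shape / scale) * (t / scale) ** (shape - 1.0)

def proportional_hazard(t, z, beta):
    """Assumption 2: covariates act multiplicatively (log-linearly) on
    the baseline hazard -- the restriction the NNHMs aim to relax."""
    return weibull_baseline_hazard(t) * np.exp(np.dot(beta, z))

t = 50.0                     # operating time
z = np.array([0.8, 1.2])     # hypothetical condition covariates (e.g. vibration, temperature)
beta = np.array([0.5, 0.3])  # hypothetical covariate coefficients
h = proportional_hazard(t, z, beta)   # hazard raised above baseline by the covariates
```

An NNHM, by contrast, replaces both the fixed baseline and the log-linear covariate term with a learned non-linear mapping from (t, z) to hazard.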

Relevance: 10.00%

Abstract:

This paper reports on a conceptual model from a larger research effort proceeding from a central interest in the importance of assessing the IS-Support provided to key-user groups. The study conceptualises a new multidimensional IS-Support construct with four dimensions: training, documentation, assistance and authorisation, which together form the overarching construct, IS-Support. We argue that a holistic measure for assessing IS-Support should consist of dimensions and measures that together assess the variety of support provided to IS key-user groups. The proposed IS-Support construct is defined as the support that IS key-user groups receive to increase their capabilities in utilising information systems within the organisation. Through two interrelated phases, a conceptualisation phase and a validation phase, the measurement model is rigorously hypothesised and validated; the IS-Support model proposed in this study is intended to exhibit the characteristics of analytic theory.

Relevance: 10.00%

Abstract:

“The Cube” is a unique facility that combines 48 large multi-touch screens and very large-scale projection surfaces to form one of the world’s largest interactive learning and engagement spaces. The Cube is part of the Queensland University of Technology’s (QUT) newly established Science and Engineering Centre, designed to showcase QUT’s teaching and research capabilities in the STEM (Science, Technology, Engineering, and Mathematics) disciplines. In this application paper we describe the Cube, its technical capabilities, its design rationale and its practical day-to-day operations, which support up to 70,000 visitors per week. Essential to the Cube’s operation are five interactive applications designed and developed in tandem with the Cube’s technical infrastructure. Each of the Cube’s launch applications was designed and delivered by an independent team, while the overall vision of the Cube was shepherded by a small executive team. The diversity of design, implementation and integration approaches pursued by these five teams provides some insight into the challenges, and opportunities, presented when working with large distributed interaction technologies. We describe each of these applications in order to discuss the different challenges and user needs they address, the types of interactions they support and how they utilise the capabilities of the Cube facility.

Relevance: 10.00%

Abstract:

Metabonomics, a new "omics" technique, shows promise for validating Chinese medicines and the compatibility of Chinese formulas. The present study explored the excretion pattern of low-molecular-mass metabolites in a male Wistar-derived rat model of kidney yin deficiency induced with thyroxine and reserpine, as well as the therapeutic effect of Liu Wei Di Huang Wan (LW), a classic traditional Chinese medicine formula for treating kidney yin deficiency in China, and its separated prescriptions. The study used ultra-performance liquid chromatography/electrospray ionization synapt high-definition mass spectrometry (UPLC/ESI-SYNAPT-HDMS) in both negative and positive electrospray ionization (ESI) modes. At the same time, blood biochemistry was examined to identify specific changes associated with kidney yin deficiency. Distinct changes in the pattern of metabolites resulting from daily administration of thyroxine and reserpine were observed by UPLC-HDMS combined with principal component analysis (PCA). According to the PCA score plots, the changes in metabolic profiling were restored to baseline after treatment with LW. Altogether, the current metabonomic approach based on UPLC-HDMS and orthogonal projection to latent structures discriminant analysis (OPLS-DA) identified 20 ions (14 in the negative mode, 8 in the positive mode, and 2 in both) as "differentiating metabolites".
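The PCA step used above to separate metabolic profiles can be sketched with a mean-centred SVD. The tiny matrix below is a hypothetical stand-in for a samples-by-spectral-features intensity table; the group sizes and peak shifts are invented for illustration.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """PCA scores via SVD of the mean-centred matrix
    (rows = samples, columns = spectral intensity variables)."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :n_components] * s[:n_components]

# Hypothetical intensity table: 4 control and 4 model-group spectra,
# with a few "differentiating metabolite" peaks shifted in the model group.
rng = np.random.default_rng(0)
controls = rng.normal(0.0, 0.1, size=(4, 50))
models = rng.normal(0.0, 0.1, size=(4, 50))
models[:, :5] += 1.0
X = np.vstack([controls, models])
scores = pca_scores(X)
# The two groups separate along the first principal component:
separation = abs(scores[4:, 0].mean() - scores[:4, 0].mean())
```

Plotting the first two score columns against each other gives the "PCA score plot" referred to in the abstract; group separation along a component points to the variables (ions) that differentiate the groups.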

Relevance: 10.00%

Abstract:

Receiving emotional support has consistently been shown to be an important factor associated with mental health, but little research has investigated giving support in addition to receiving it, or the types of support that predict well-being. This paper investigates the relationship between giving and receiving instrumental and emotional social support and psychological well-being during and following a natural disaster. A survey administered between four and six months after fatal floods was conducted with 200 community members, comprising men (n = 68) and women (n = 132) aged between 17 and 87 years. Social support experiences were assessed using the 2-Way Social Support Scale (2-Way SSS; Shakespeare-Finch & Obst, 2011), and eudemonic well-being was measured using the Psychological Well-Being Scale (PWBS; Ryff & Keyes, 1995). Hierarchical multiple regression analyses were used to examine expected relationships and to explore the differential effects of the four factors of the 2-Way SSS. Results indicated that social support had significant positive associations with domains of psychological well-being, especially interpersonal relationships. Receiving and giving emotional support were the strongest unique predictors of psychological well-being; however, receiving instrumental support predicted less autonomy. The results highlight the importance of measuring social support as a multidimensional construct and affirm that disaster response policy and practice should address emotional as well as instrumental needs in order to promote individual and community psychosocial health following a flooding crisis.
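Hierarchical multiple regression, the analysis used above, enters predictor blocks in steps and examines the incremental variance explained. A minimal sketch on fully synthetic data (the variable names mirror the study's factors, but the data and effect sizes are invented assumptions):

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Hypothetical data in which well-being is driven mainly by emotional support:
rng = np.random.default_rng(3)
n = 200
receive_emotional = rng.normal(size=n)
give_emotional = rng.normal(size=n)
receive_instrumental = rng.normal(size=n)
wellbeing = (0.6 * receive_emotional + 0.4 * give_emotional
             + rng.normal(scale=0.5, size=n))

# Step 1: instrumental support alone; Step 2: add the emotional support factors.
r2_step1 = r_squared(receive_instrumental.reshape(-1, 1), wellbeing)
r2_step2 = r_squared(np.column_stack([receive_instrumental,
                                      receive_emotional, give_emotional]), wellbeing)
delta_r2 = r2_step2 - r2_step1   # incremental variance explained by emotional support
```

The change in R² between steps is what identifies which support factors contribute uniquely to well-being beyond the earlier block.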

Relevance: 10.00%

Abstract:

Purpose: A knowledge-based urban development needs to be sustainable and therefore requires ecological planning strategies to ensure a better quality of its services. The purpose of this paper is to present an innovative approach for monitoring the sustainability of urban services and to help policy-making authorities revise current planning and development practices towards more effective solutions. The paper introduces a new assessment tool, the Micro-level Urban-ecosystem Sustainability IndeX (MUSIX), which provides a quantitative measure of urban sustainability in a local context.

Design/methodology/approach: A multi-method research approach was employed in the construction of the MUSIX. Qualitative research was conducted through an interpretive and critical literature review to develop the theoretical framework and select indicators. Quantitative research was conducted through statistical and spatial analyses for data collection, processing and model application.

Findings/results: MUSIX was tested in a pilot study site and provided information on the main environmental impacts arising from rapid urban development and population growth. On that basis, key ecological planning strategies were recommended to guide the preparation and assessment of development and local area plans.

Research limitations/implications: This study provides fundamental information that assists developers, planners and policy-makers in investigating the multidimensional nature of sustainability at the local level by capturing the environmental pressures, and their driving forces, in highly developed urban areas.

Originality/value: This study measures the sustainability of urban development plans by providing data analysis and interpretation of results in a new spatial data unit.

Relevance: 10.00%

Abstract:

Mathematical models of mosquito-borne pathogen transmission originated in the early twentieth century to provide insights into how to most effectively combat malaria. The foundations of the Ross–Macdonald theory were established by 1970. Since then, there has been a growing interest in reducing the public health burden of mosquito-borne pathogens and an expanding use of models to guide their control. To assess how theory has changed to confront evolving public health challenges, we compiled a bibliography of 325 publications from 1970 through 2010 that included at least one mathematical model of mosquito-borne pathogen transmission and then used a 79-part questionnaire to classify each of 388 associated models according to its biological assumptions. As a composite measure to interpret the multidimensional results of our survey, we assigned a numerical value to each model that measured its similarity to 15 core assumptions of the Ross–Macdonald model. Although the analysis illustrated a growing acknowledgement of geographical, ecological and epidemiological complexities in modelling transmission, most models during the past 40 years closely resemble the Ross–Macdonald model. Modern theory would benefit from an expansion around the concepts of heterogeneous mosquito biting, poorly mixed mosquito-host encounters, spatial heterogeneity and temporal variation in the transmission process.
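The 15 core assumptions referenced above collapse into the classic Ross–Macdonald expression for the basic reproduction number, R0 = m·a²·b·c·e^(−g·n) / (r·g). A small sketch with illustrative parameter values (the numbers are assumptions for demonstration, not estimates from any study):

```python
import math

def ross_macdonald_r0(m, a, b, c, g, n, r):
    """Basic reproduction number under the classic Ross-Macdonald assumptions:
    m mosquitoes per human, human-biting rate a, transmission efficiencies
    b (mosquito to human) and c (human to mosquito), mosquito death rate g,
    extrinsic incubation period n days, human recovery rate r."""
    return (m * a ** 2 * b * c * math.exp(-g * n)) / (r * g)

# Illustrative parameter values (not taken from any specific field study):
r0 = ross_macdonald_r0(m=10, a=0.3, b=0.5, c=0.5, g=0.1, n=10, r=0.01)
# r0 > 1 means the pathogen can invade; control aims to push r0 below 1.
```

The squared biting rate and the exponential survival term are why vector control that raises mosquito mortality g or reduces biting a is so effective under this model; the heterogeneities the survey calls for (non-uniform biting, poor mixing, spatial structure) are precisely what this homogeneous formula leaves out.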

Relevance: 10.00%

Abstract:

Food is a multidimensional construct. It has social, cultural, economic, psychological, emotional, biological, and political dimensions. It is both a material object and a catalyst for a range of social and cultural action. Richly implicated in the social and cultural milieu, food is a central marker of culture and society. Yet little is known about the messages and knowledges about food in the school curriculum. Popular debates around food in schools are largely concerned with biomedical issues of obesity, exercise and nutrition. This is a study of the sociological dimensions of food-related messages, practices and knowledge formations in the primary school curriculum. It uses an exploratory, qualitative case study methodology to identify and examine the food activities of a Year 5 class in a Queensland school. Data were gathered over a two-year period using observation, documentation and interview methods. Food was found to be an integral part of the primary school's activity. It had economic, symbolic, pedagogic, and instrumental value. Messages about food were found in the official, enacted and hidden curricula, which were framed by a food governance framework of legislation, procedures and norms. In the school studied, food knowledge was commodified as part of a political economy that centred on an 'eat more' message. Certain foods were privileged over others, while myths about energy, fruit, fruit juice and sugar shaped student dispositions, values, norms and action. There was little engagement with the cognitive and behavioural dimensions of food and nutrition. The thesis concludes with recommendations for a wholesale reconsideration of food in schools as curricular content and knowledge.

Relevance: 10.00%

Abstract:

Over the past two to three decades, our understanding of poverty has broadened from a narrow focus on income and consumption to a multidimensional notion encompassing education, health, social and political participation, personal security and freedom, and environmental quality. It thus covers not just low income, but lack of access to services, resources and skills; vulnerability; insecurity; and voicelessness and powerlessness. Multidimensional poverty is a determinant of health risks, health-seeking behaviour, health care access and health outcomes. As the analysis of health outcomes becomes more refined, it is increasingly apparent that the impressive gains in health experienced over recent decades are unevenly distributed. Aggregate indicators, whether at the global, regional or national level, often mask striking variations in health outcomes between men and women, and between rich and poor, both across and within countries...

Relevance: 10.00%

Abstract:

Spectroscopic studies of complex clinical fluids have made a more holistic approach to their chemical analysis increasingly popular and widely employed. The efficient and effective interpretation of multidimensional spectroscopic data relies on many chemometric techniques, one group of which comprises the so-called correlation analysis methods. Typical of these techniques are two-dimensional correlation analysis and statistical total correlation spectroscopy (STOCSY). While the former has largely been applied to optical spectroscopic analysis, STOCSY was developed for, and has been applied almost exclusively to, NMR metabonomic studies. Using a 1H NMR study of human blood plasma from subjects recovering from exhaustive exercise trials, the basic concepts and applications of these techniques are examined. Typical information from their application to NMR-based metabonomics is presented, and their value in aiding interpretation of NMR data obtained from biological systems is illustrated. Major energy metabolites are identified in the NMR spectra, and the dynamics of their appearance in, and removal from, plasma during exercise recovery are illustrated and discussed. The complementary nature of two-dimensional correlation analysis and statistical total correlation spectroscopy is highlighted.
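At its core, one-dimensional STOCSY computes the correlation of a chosen "driver" spectral variable with every other variable across a set of spectra, so that peaks from the same molecule light up together. A minimal sketch on synthetic data (the spectra, peak positions and concentrations are invented for illustration):

```python
import numpy as np

def stocsy(X, driver_idx):
    """One-dimensional STOCSY: Pearson correlation of a chosen 'driver'
    spectral variable with every variable, computed across samples
    (rows = spectra, columns = chemical-shift variables)."""
    Xc = X - X.mean(axis=0)
    d = Xc[:, driver_idx]
    cov = Xc.T @ d
    norms = np.sqrt((Xc ** 2).sum(axis=0)) * np.sqrt(d @ d)
    return cov / norms

# Hypothetical spectra: variables 3 and 7 belong to the same molecule,
# so their intensities co-vary with concentration across samples.
rng = np.random.default_rng(1)
conc = rng.uniform(0.5, 2.0, size=20)
X = rng.normal(0.0, 0.05, size=(20, 10))
X[:, 3] += conc
X[:, 7] += 0.8 * conc
corr = stocsy(X, driver_idx=3)   # peaks at variables 3 and 7 stand out
```

Plotting this correlation vector along the chemical-shift axis is what reveals which resonances move together, and hence which peaks share a molecular origin.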

Relevance: 10.00%

Abstract:

Climate change is expected to be one of the biggest global health threats in the 21st century. In response to changes in climate and associated extreme events, public health adaptation has become imperative. This thesis examined several key issues in this emerging research field. The thesis aimed to identify climate-health (particularly temperature-health) relationships, then develop quantitative models that can be used to project future health impacts of climate change, and thereby help formulate adaptation strategies for dealing with climate-related health risks and reducing vulnerability. The research questions addressed by this thesis were: (1) What are the barriers to public health adaptation to climate change, and what are the research priorities in this emerging field? (2) What models and frameworks can be used to project future temperature-related mortality under different climate change scenarios? (3) What is the actual burden of temperature-related mortality, and what are the impacts of climate change on the future burden of disease? (4) Can we develop public health adaptation strategies to manage the health effects of temperature in response to climate change? Using a literature review, I discussed how public health organisations should implement and manage the process of planned adaptation. This review showed that public health adaptation can operate at two levels: building adaptive capacity and implementing adaptation actions. However, there are constraints and barriers to adaptation arising from uncertainty, cost, technological limits, institutional arrangements, deficits of social capital, and individual perceptions of risk. The opportunities for planning and implementing public health adaptation rely on effective strategies to overcome these likely barriers.
I proposed that high priority should be given to multidisciplinary research on the assessment of potential health effects of climate change, projections of future health impacts under different climate and socio-economic scenarios, identification of health co-benefits of climate change policies, and evaluation of cost-effective public health adaptation options. Heat-related mortality is the most direct and highly significant potential climate change impact on human health. I therefore conducted a systematic review of research and methods for projecting future heat-related mortality under different climate change scenarios. The review showed that climate change is likely to result in a substantial increase in heat-related mortality. Projecting heat-related mortality requires understanding of historical temperature-mortality relationships and consideration of future changes in climate, population and acclimatisation. Further research is needed to provide a stronger theoretical framework for mortality projections, including a better understanding of socio-economic development, adaptation strategies, land-use patterns, air pollution and mortality displacement. Most previous studies were designed to examine temperature-related excess deaths or mortality risks. However, if most temperature-related deaths occur in the very elderly, who have only a short life expectancy, then the burden of temperature on mortality has less public health importance. To guide policy decisions and resource allocation, it is desirable to know the actual burden of temperature-related mortality. To achieve this, I used years of life lost to provide a new measure of the health effects of temperature. I conducted a time-series analysis to estimate the years of life lost associated with changes in season and temperature in Brisbane, Australia, and projected the future temperature-related years of life lost attributable to climate change.
This study showed that the association between temperature and years of life lost was U-shaped, with increased years of life lost on cold and hot days. Temperature-related years of life lost will worsen greatly if future climate change goes beyond a 2 °C increase and there is no adaptation to higher temperatures. The excess mortality during prolonged extreme temperatures is often greater than that predicted from a smoothed temperature-mortality association, because sustained periods of extreme temperatures produce an extra effect beyond that predicted by daily temperatures alone. To better estimate the burden of extreme temperatures, I estimated their effects on years of life lost due to cardiovascular disease using data from Brisbane, Australia. The results showed that the association between daily mean temperature and years of life lost due to cardiovascular disease was U-shaped, with the lowest years of life lost at 24 °C (the 75th percentile of daily mean temperature in Brisbane), rising progressively as temperatures become hotter or colder. There were significant added effects of heat waves, but no added effects of cold spells. Finally, public health adaptation to hot weather is necessary and pressing. I discussed how to manage the health effects of temperature, especially in the context of climate change. Strategies to minimise the health effects of high temperatures and climate change fall into two categories: reducing heat exposure and managing the health effects of high temperatures. However, policy decisions need information on specific adaptations, together with their expected costs and benefits; therefore, more research is needed to evaluate cost-effective adaptation options. In summary, this thesis adds to the large body of literature on the impacts of temperature and climate change on human health. It improves our understanding of the temperature-health relationship, and of how this relationship will change as temperatures increase.
Although the research is limited to one city, which restricts the generalisability of the findings, the methods and approaches developed in this thesis will be useful to other researchers studying temperature-health relationships and climate change impacts. The results may be helpful for decision-makers who develop public health adaptation strategies to minimise the health effects of extreme temperatures and climate change.
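The U-shaped exposure-response described in this abstract can be illustrated by fitting a simple quadratic to daily data and reading off the temperature of minimum loss. The series below is synthetic, with an assumed optimum of 24 °C standing in for the Brisbane estimate; the thesis itself uses more flexible time-series models, not a plain quadratic.

```python
import numpy as np

# Synthetic daily series: years of life lost rising on either side of an
# assumed optimum of 24 degC (all values are illustrative, not Brisbane data).
rng = np.random.default_rng(2)
temp = rng.uniform(10.0, 32.0, size=365)
yll = 50.0 + 0.6 * (temp - 24.0) ** 2 + rng.normal(0.0, 5.0, size=365)

# Fit yll ~ a*T^2 + b*T + c and locate the temperature of minimum YLL.
a, b, c = np.polyfit(temp, yll, deg=2)
t_min = -b / (2.0 * a)   # vertex of the fitted U-shaped curve
```

A positive leading coefficient confirms the U shape, and the vertex plays the role of the optimum temperature above and below which years of life lost increase.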

Relevance: 10.00%

Abstract:

Over the past few decades a major paradigm shift has occurred in the conceptualisation of chronic pain as a complex multidimensional phenomenon. Yet pain experienced by individuals with a primary disability continues to be understood largely through a traditional biomedical model, despite its inherent limitations. This is reflected in the body of literature on the topic, which is primarily driven by positivist assumptions and the search for etiologic pain mechanisms. Conversely, little is known about the experiences of, and meanings attributed to, disability-related pain. The purpose of this paper is therefore to discuss the use of focus group methodology in elucidating the meanings and experiences of this population. Here, a distinction is made between the focus group as a method and focus group research as a methodology. Typically, the focus group is presented as a seemingly atheoretical method of research. Drawing on research into the impact of chronic pain in people with multiple sclerosis, this paper seeks to theorise the focus group in arguing for the methodological congruence of focus group research and the study of pain experience. It is argued that the contributions of group interaction and shared experiences in focus group discussions produce data and insights less accessible through more structured research methods. It is concluded that a biopsychosocial perspective on chronic pain may only ever be appreciated when the person-in-context is the unit of investigation.

Relevance: 10.00%

Abstract:

Daylight devices are important components of any climate-responsive façade system. However, the evolution of parametric CAD systems and digital fabrication has had such an impact on architectural form that regular forms are shifting to complex geometries. The architectural and engineering integration of daylight devices in envelopes with complex geometries is a challenge in terms of both design and performance evaluation. The purpose of this paper is to assess the daylight performance of a building with a climate-responsive envelope of complex geometry that integrates shading devices in the façade. The case study is based on the Esplanade buildings in Singapore. Climate-based daylight metrics, such as Daylight Availability and Useful Daylight Illuminance, are used. The DIVA (daylight simulation) and Grasshopper (parametric analysis) plug-ins for Rhinoceros were employed to examine the range of performance possibilities. Parameters such as the dimensions, inclination, projected shadows and shape of the devices were varied in order to maximise Daylight Availability and Useful Daylight Illuminance while minimising glare probability. While orientation did not have a great impact on the results, the aperture of the shading devices did, showing that shading devices with a projection of 1.75 m to 2.00 m performed best, achieving target lighting levels without issues of glare.
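Useful Daylight Illuminance, one of the climate-based metrics used above, is simply the fraction of occupied hours in which simulated workplane illuminance falls within a "useful" band. The sketch below assumes the commonly cited 100-2000 lux limits; the paper's exact thresholds may differ (a 100-3000 lux variant is also in use), and the readings are hypothetical.

```python
import numpy as np

def useful_daylight_illuminance(lux, lower=100.0, upper=2000.0):
    """Fraction of occupied hours with workplane illuminance inside the
    'useful' band. The 100-2000 lux limits follow one commonly cited UDI
    scheme; other studies use 100-3000 lux."""
    lux = np.asarray(lux, dtype=float)
    return float(np.mean((lux >= lower) & (lux <= upper)))

# Hypothetical hourly illuminance readings at one sensor point:
readings = [50, 150, 800, 1500, 2500, 3200, 600, 90]
udi = useful_daylight_illuminance(readings)   # 4 of 8 hours fall in the band
```

Hours below the band indicate insufficient daylight, while hours above it flag potential glare and overheating, which is why the shading-device aperture trades UDI against glare probability in the parametric study.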