Abstract:
This thesis is devoted to the study of linear relationships in symmetric block ciphers. A block cipher is designed so that the ciphertext is produced as a nonlinear function of the plaintext and secret master key. However, linear relationships within the cipher can still exist if the texts and components of the cipher are manipulated in a number of ways, as shown in this thesis. There are four main contributions of this thesis. The first contribution is the extension of the applicability of integral attacks from word-based to bit-based block ciphers. Integral attacks exploit the linear relationship between texts at intermediate stages of encryption. This relationship can be used to recover subkey bits in a key recovery attack. In principle, integral attacks can be applied to bit-based block ciphers. However, specific tools to define the attack on these ciphers are not available. This problem is addressed in this thesis by introducing a refined set of notations to describe the attack. The bit-pattern-based integral attack is successfully demonstrated on reduced-round variants of the block ciphers Noekeon, Present and Serpent. The second contribution is the discovery of a very small system of equations that describes the LEX-AES stream cipher. LEX-AES is based heavily on the 128-bit-key (16-byte) Advanced Encryption Standard (AES) block cipher. In one instance, the system contains 21 equations and 17 unknown bytes. This is very close to the upper limit for an exhaustive key search, which is 16 bytes. One only needs to acquire 36 bytes of keystream to generate the equations. Therefore, the security of this cipher depends on the difficulty of solving this small system of equations. The third contribution is the proposal of an alternative method to measure diffusion in the linear transformation of Substitution-Permutation-Network (SPN) block ciphers. Currently, the branch number is widely used for this purpose. It is useful for estimating the possible success of differential and linear attacks on a particular SPN cipher. However, the measure does not give information on the number of input bits that are left unchanged by the transformation when producing the output bits. The new measure introduced in this thesis is intended to complement the current branch number technique. The measure is based on fixed points and simple linear relationships between the input and output words of the linear transformation. The measure represents the average fraction of input words to a linear diffusion transformation that are not effectively changed by the transformation. This measure is applied to the block ciphers AES, ARIA, Serpent and Present. It is shown that except for Serpent, the linear transformations used in the block ciphers examined do not behave as expected for a random linear transformation. The fourth contribution is the identification of linear paths in the nonlinear round function of the SMS4 block cipher. The SMS4 block cipher is used as a standard in the Chinese Wireless LAN Wired Authentication and Privacy Infrastructure (WAPI) and hence, the round function should exhibit a high level of nonlinearity. However, the findings in this thesis on the existence of linear relationships show that this is not the case. It is shown that in some exceptional cases, the first four rounds of SMS4 are effectively linear. In these cases, the effective number of rounds for SMS4 is reduced by four, from 32 to 28.
The findings raise questions about the security provided by SMS4, and might provide clues on the existence of a flaw in the design of the cipher.
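To make the balanced-sum property that integral attacks exploit concrete, here is a minimal Python sketch on a hypothetical toy byte cipher (the S-box, subkey and block size are invented for illustration; this is not the thesis's notation or any real cipher): when one input byte takes every value from 0 to 255 while the others stay constant, the XOR sum of each output byte after a bijective key-XOR-and-S-box layer is zero.

```python
# Toy illustration of the "balanced" integral property: one active byte
# running over all 256 values forces every output byte of a bijective
# byte-wise layer to XOR-sum to zero over the whole set of texts.
import functools

# Hypothetical bijective 8-bit S-box (affine toy, not from a real cipher).
SBOX = [((7 * x) + 3) % 256 for x in range(256)]  # bijective: gcd(7, 256) = 1

def round_fn(block, key):
    """One toy round: XOR the subkey, then apply the S-box byte-wise."""
    return [SBOX[b ^ k] for b, k in zip(block, key)]

key = [0x3A, 0x5C, 0x11, 0xF0]          # hypothetical subkey
# Delta-set: byte 0 is "active" (all 256 values), bytes 1..3 are constant.
texts = [[v, 0x22, 0x33, 0x44] for v in range(256)]
outputs = [round_fn(t, key) for t in texts]

for pos in range(4):
    xor_sum = functools.reduce(lambda a, b: a ^ b, (o[pos] for o in outputs))
    print(f"byte {pos}: XOR sum = {xor_sum:#04x}")  # 0x00 everywhere: balanced
```

A key-recovery attack guesses subkey bytes, partially decrypts to such a balanced intermediate stage, and discards any guess for which the XOR sum is non-zero.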
Abstract:
The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on the computation, transmission, and storage costs. This decomposition structure is based on analysis of information packing performance of several decompositions, two-dimensional power spectral density, effect of each frequency band on the reconstructed image, and the human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criteria as well as the sensitivities in human vision. To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, the lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a non-optimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training and multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all of the source vectors without the need to project these on the lattice outermost shell, while it properly maintains a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only the cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training and multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images.
For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structural-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
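As an aside on the generalized Gaussian model mentioned above, the sketch below fits the shape parameter of a generalized Gaussian to subband coefficients using the classic moment-ratio estimator; this is a standard alternative, not necessarily the least-squares formulation developed in the thesis, and the subband data here is synthetic.

```python
# Sketch: estimate the generalized Gaussian shape parameter of a wavelet
# subband from its first two absolute moments (moment-ratio method).
import numpy as np
from scipy.optimize import brentq
from scipy.special import gamma

def moment_ratio(beta):
    # For a zero-mean generalized Gaussian with shape parameter beta:
    # E|x| / sqrt(E[x^2]) = Gamma(2/beta) / sqrt(Gamma(1/beta) * Gamma(3/beta))
    return gamma(2.0 / beta) / np.sqrt(gamma(1.0 / beta) * gamma(3.0 / beta))

def fit_ggd_shape(coeffs):
    r = np.mean(np.abs(coeffs)) / np.sqrt(np.mean(coeffs ** 2))
    # moment_ratio is monotone in beta, so a bracketed root-find inverts it.
    return brentq(lambda b: moment_ratio(b) - r, 0.05, 10.0)

rng = np.random.default_rng(0)
subband = rng.laplace(scale=1.0, size=50_000)  # Laplacian = GGD with beta = 1
print(f"estimated shape parameter: {fit_ggd_shape(subband):.3f}")  # close to 1
```

In a codec along the lines described above, the fitted model parameters would then inform quantizer design and the bit-allocation stage.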
Abstract:
During the past decade, a significant amount of research has been conducted internationally with the aim of developing, implementing, and verifying "advanced analysis" methods suitable for non-linear analysis and design of steel frame structures. Application of these methods permits comprehensive assessment of the actual failure modes and ultimate strengths of structural systems in practical design situations, without resort to simplified elastic methods of analysis and semi-empirical specification equations. Advanced analysis has the potential to extend the creativity of structural engineers and simplify the design process, while ensuring greater economy and more uniform safety with respect to the ultimate limit state. The application of advanced analysis methods has previously been restricted to steel frames comprising only members with compact cross-sections that are not subject to the effects of local buckling. This precluded the use of advanced analysis in the design of steel frames comprising a significant proportion of the most commonly used Australian sections, which are non-compact and subject to the effects of local buckling. This thesis contains a detailed description of research conducted over the past three years in an attempt to extend the scope of advanced analysis by developing methods that include the effects of local buckling in a non-linear analysis formulation, suitable for practical design of steel frames comprising non-compact sections. Two alternative concentrated plasticity formulations are presented in this thesis: the refined plastic hinge method and the pseudo plastic zone method. Both methods implicitly account for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling. The accuracy and precision of the methods for the analysis of steel frames comprising non-compact sections have been established by comparison with a comprehensive range of analytical benchmark frame solutions. Both the refined plastic hinge and pseudo plastic zone methods are more accurate and precise than the conventional individual member design methods based on elastic analysis and specification equations. For example, the pseudo plastic zone method predicts the ultimate strength of the analytical benchmark frames with an average conservative error of less than one percent, and has an acceptable maximum unconservative error of less than five percent. The pseudo plastic zone model can allow the design capacity to be increased by up to 30 percent for simple frames, mainly due to the consideration of inelastic redistribution. The benefits may be even more significant for complex frames with significant redundancy, which provides greater scope for inelastic redistribution. The analytical benchmark frame solutions were obtained using a distributed plasticity shell finite element model. A detailed description of this model and the results of all 120 benchmark analyses are provided. The model explicitly accounts for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling. Its accuracy was verified by comparison with a variety of analytical solutions and the results of three large-scale experimental tests of steel frames comprising non-compact sections. A description of the experimental method and test results is also provided.
Abstract:
This research investigated the impact of Education Queensland's employment policy and practices for beginning secondary teachers appointed on temporary engagement. The context was the public secondary school sector within the state of Queensland, Australia. The study was set within a context of the changing nature of work from full-time permanent employment towards casual, fixed-term contracts, temporary and part-time employment, a trend reflected in the employment patterns for teachers within Australia. Two broad categories of literature relating to the research problem of this thesis were reviewed, namely the beginning teacher and permanency or tenure. The focus in the research literature on beginning teachers was the professional experiences of teachers within the classroom and school. There was a paucity of research that considered the working and industrial conditions of temporary employment for beginning teachers or the personal and professional implications of this form of employment. The review of the context and literature was conceptualised as a Beginning Temporary Teacher Theoretical Framework which served to inform the study. Using a qualitative case study methodology, the research techniques employed for the thesis were semi-structured interview and document analysis. A simultaneously conducted research project in which the researcher participated entitled 'Winning the Lottery? Beginning Teachers on Temporary Engagement' foregrounded this thesis in terms of refining the research question, contributing to the literature and in the selection of the participants. For this case study the perspectives of four distinct yet inter-related categories of professionals were sought. These included four beginning secondary teachers, three school administrators, a Senior Personnel Officer with Education Queensland, and a representative from the Queensland Teachers' Union. The research findings indicated that none of the beginning teachers or other professionals viewed starting a career in teaching on temporary engagement as the ideal. The negative features identified were the differential treatment received and the high level of uncertainty associated with temporary employment. Differential treatment tended to indicate 'less' entitlements, in terms of access to induction and professional development, recreational and sick leave, acceptance by and expectations of other colleagues, and avenues of redress in grievance cases. Moreover, interviews indicated a high level of uncertainty in terms of starting within the teaching profession, commencing at a new school, and a regular income. In addition, frequent changes in schools and/or cohorts of students exacerbated levels of uncertainty. The beginning teachers reported significantly decreased motivation, self-esteem and sense of belonging, and increased stress levels. There was an even more marked negative impact on those beginning teachers who had experienced a higher number of temporary engagements and schools in their first year of teaching. Conversely, strong staff support and a reasonable length of time in the one school improved the quality of the beginning teachers' experiences. The overall impact of being on temporary engagement resulted in delayed permanent position appointments, decreased commitment to particular schools and to Education Queensland as the employing authority, and for two of the beginning teachers, it produced a desire to seek alternative employment. 
The implementation of Education Queensland's policies relating to working conditions and entitlements for these temporary beginning teachers at the school level was revealed to be less than satisfactory. There was a tendency towards 'just-in-time' management of the beginning teacher on temporary engagement. The beginning teachers received 'less-than-messages' about access to and use of departmental documentation, support through induction and professional development, and their transition from temporary to permanent employment. To ensure a more systematic, supportive and inclusive process for managing the temporary beginning teacher, a conceptual framework entitled 'Continuums of Tension' was developed. The four continuums included permanent employment - temporary employment; system perspective - individual perspective; teaching as a profession - teaching as a job; and the permanent beginning teacher - university graduate. The general principles of the human resource policies of Education Queensland were based on a commitment to permanent employment, a system's perspective, viewing teaching as a profession and a homogeneous group of permanent beginning teachers. Contrasting with this, the beginning teacher on temporary engagement tended to operate from the position of temporary employment and a perspective that was individually based. Their priorities therefore included the 'occupational' aspects of being a temporary teacher striving to become permanent. Thus there existed a tension or contradiction between the general principles of human resource policies within Education Queensland and the employment experiences of beginning teachers on temporary engagement. The study proposed three actions for resolution to address the aforementioned tensions. The actions included: (a) the effective provision and targeted communication of information; (b) support, induction and professional development; and (c) a coordinated approach between Education Queensland, the Queensland Teachers' Union, the universities and the beginning teacher. These actions are further refined to include: (a) an induction kit to support the individual through the pre-employment to permanent employee phases, (b) an extrapolation of the roles and responsibilities of Education Queensland personnel charged with supporting the beginning temporary teacher, and (c) a series of recommendations to effect a coordinated approach amongst the key stakeholders. The theoretical and conceptual frameworks have provided a means of addressing the identified needs of the beginning teacher on temporary engagement. As such, this study has contributed to the research literature on teacher employment and professionalism and aims to provide a beginning temporary teacher with managed professional and occupational support.
Abstract:
Conifers are resistant to attack from a large number of potential herbivores or pathogens. Previous molecular and biochemical characterization of selected conifer defence systems supports a model of multigenic, constitutive and induced defences that act on invading insects via physical, chemical, biochemical or ecological (multitrophic) mechanisms. However, the genomic foundation of the complex defence and resistance mechanisms of conifers is largely unknown. As part of a genomics strategy to characterize inducible defences and possible resistance mechanisms of conifers against insect herbivory, we developed a cDNA microarray building upon a new spruce (Picea spp.) expressed sequence tag resource. This first-generation spruce cDNA microarray contains 9720 cDNA elements representing c. 5500 unique genes. We used this array to monitor gene expression in Sitka spruce (Picea sitchensis) bark in response to herbivory by white pine weevils (Pissodes strobi, Curculionidae) or wounding, and in young shoot tips in response to western spruce budworm (Choristoneura occidentalis, Lepidoptera) feeding. Weevils are stem-boring insects that feed on phloem, while budworms are foliage-feeding larvae that consume needles and young shoot tips. Both insect species and the wounding treatment caused substantial changes of the host plant transcriptome, detected in each case by differential gene expression of several thousand array elements at 1 or 2 days after the onset of treatment. Overall, there was considerable overlap among differentially expressed gene sets from these three stress treatments. Functional classification of the induced transcripts revealed genes with roles in general plant defence, octadecanoid and ethylene signalling, transport, secondary metabolism, and transcriptional regulation. Several genes involved in primary metabolic processes such as photosynthesis were down-regulated upon insect feeding or wounding, fitting with the concept of dynamic resource allocation in plant defence. Refined expression analysis using gene-specific primers and real-time PCR for selected transcripts was in agreement with microarray results for most genes tested. This study provides the first large-scale survey of insect-induced defence transcripts in a gymnosperm and provides a platform for functional investigation of plant-insect interactions in spruce. Induction of spruce genes of octadecanoid and ethylene signalling, terpenoid biosynthesis, and phenolic secondary metabolism is discussed in more detail.
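For readers unfamiliar with how differentially expressed genes are called from such array data, here is a deliberately simplified Python sketch on synthetic data (a per-gene t-test with Benjamini-Hochberg correction); the study's actual microarray analysis pipeline is not reproduced here.

```python
# Simplified differential-expression call: per-gene t-test across replicate
# arrays, then Benjamini-Hochberg control of the false discovery rate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_genes, n_reps = 5000, 4
control = rng.normal(0.0, 1.0, size=(n_genes, n_reps))
treated = rng.normal(0.0, 1.0, size=(n_genes, n_reps))
treated[:100] += 2.0                     # spike in 100 truly induced genes

t, p = stats.ttest_ind(treated, control, axis=1)

# Benjamini-Hochberg: reject the k smallest p-values, where k is the
# largest rank with p_(k) <= (k / n_genes) * q, at q = 0.05.
order = np.argsort(p)
adjusted = p[order] * n_genes / (np.arange(n_genes) + 1)
passing = np.nonzero(adjusted <= 0.05)[0]
n_sig = passing.max() + 1 if passing.size else 0
print(f"genes called differentially expressed: {n_sig}")
```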
Abstract:
The rapid growth in the number of online services leads to an increasing number of different digital identities each user needs to manage. As a result, many people feel overloaded with credentials, which in turn negatively impacts their ability to manage them securely. Passwords are perhaps the most common type of credential used today. To avoid the tedious task of remembering difficult passwords, users often behave less securely by using weak, low-entropy passwords. Weak passwords and bad password habits represent security threats to online services. Some solutions have been developed to eliminate the need for users to create and manage passwords. A typical solution is based on giving the user a hardware token that generates one-time passwords (OTPs), i.e. passwords for single-session or single-transaction use. Unfortunately, most of these solutions do not satisfy scalability and/or usability requirements, or they are simply insecure. In this paper, we propose a scalable OTP solution using mobile phones, based on trusted computing technology, that combines enhanced usability with strong security.
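For background, a widely used way to generate such one-time passwords is the counter-based HOTP construction of RFC 4226, sketched below in Python; this illustrates the OTP primitive only, not the paper's trusted-computing protocol, and the key shown is the RFC's published test key.

```python
# Minimal HOTP (RFC 4226): HMAC-SHA1 over a big-endian counter,
# dynamically truncated to a short decimal one-time password.
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Derive a one-time password from a shared secret and a counter."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"12345678901234567890"   # RFC 4226 test key
for counter in range(3):           # RFC test vectors: 755224, 287082, 359152
    print(counter, hotp(secret, counter))
```

Time-based variants (TOTP) replace the counter with a time step, and a scheme like the one proposed in the paper would additionally bind the secret to the phone's trusted hardware.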
Abstract:
Background This research addresses the development of a digital stethoscope for use with a telehealth communications network to allow doctors to examine patients remotely (a digital telehealth stethoscope). A telehealth stethoscope would allow remote auscultation of patients who do not live near a major hospital. Travelling from remote areas to major hospitals is expensive for patients, and a telehealth stethoscope could result in significant cost savings. Using a stethoscope requires great skill. To design a telehealth stethoscope that meets doctors’ expectations, the use of existing stethoscopes in clinical contexts must be examined. Method Observations were conducted of 30 anaesthetic preadmission consultations. The observations were videotaped. Interactions between the doctor, the patient and non-human elements in the consultation were “coded” to transform the video into data. The data were analysed to reveal essential aspects of the interactions. Results The analysis has shown that the doctor controls the interaction during auscultation. The conduct of auscultation draws heavily on the doctor’s tacit knowledge, allowing the doctor to treat the acoustic stethoscope as infrastructure – that is, the stethoscope sinks into the background and becomes completely transparent in use. Conclusion Two important, and related, implications for the design of a telehealth stethoscope have arisen from this research. First, as a telehealth stethoscope will be a shared device, doctors will not be able to make use of their existing expertise in using their own stethoscopes. Very simply, a telehealth stethoscope will sound different to a doctor’s own stethoscope. Second, the collaborative interaction required to use a telehealth stethoscope will have to be invented and refined. A telehealth stethoscope will need to be carefully designed to address these issues and result in successful use. This research challenges the concept of a telehealth stethoscope by raising questions about the ease and confidence with which doctors could use such a device.
Abstract:
This workshop explores innovative approaches to understanding and cultivating sustainable food culture in urban environments via human-computer interaction (HCI) design and ubiquitous technologies. We perceive the city as an intersecting network of people, place, and technology in constant transformation. Our 2009 OZCHI workshop, Hungry 24/7? HCI Design for Sustainable Food Culture, opened a new space for discussion on this intersection amongst researchers and practitioners from diverse backgrounds including academia, government, industry, and not-for-profit organisations. Building on this past success, the new instalment of the workshop series takes a more refined view of mobile human-food interaction and the role of interactive media in engaging citizens to cultivate more sustainable everyday human-food interactions on the go. Interactive media in this sense is distributed, pervasive, and embedded in the city as a network. The workshop addresses the environmental, health, and social domains of sustainability by bringing together insights across disciplines to discuss conceptual and design approaches for orchestrating the mobility and interaction of people and food in the city as a network of people, place, technology, and food.
Abstract:
Lack of a universally accepted and comprehensive taxonomy of cybercrime seriously impedes international efforts to accurately identify, report and monitor cybercrime trends. There is, not surprisingly, a corresponding disconnect internationally on the cybercrime legislation front, a much more serious problem and one which the International Telecommunication Union (ITU) says requires 'the urgent attention of all nations'. Yet, and despite the existence of the Council of Europe Convention on Cybercrime, a proposal for a global cybercrime treaty was rejected by the United Nations (UN) as recently as April 2010. This paper presents a refined and comprehensive taxonomy of cybercrime and demonstrates its utility for widespread use. It analyses how the USA, the UK, Australia and the UAE align with the CoE Convention and finds that more needs to be done to achieve conformance. We conclude with an analysis of the approaches used in Australia, in Queensland, and in the UAE, in Abu Dhabi, to fight cybercrime and identify a number of shared problems.
Abstract:
The theory of nonlinear dynamic systems provides some new methods to handle complex systems. Chaos theory offers new concepts, algorithms and methods for processing, enhancing and analyzing the measured signals. In recent years, researchers have been applying the concepts from this theory to bio-signal analysis. In this work, the complex dynamics of bio-signals such as the electrocardiogram (ECG) and electroencephalogram (EEG) are analyzed using the tools of nonlinear systems theory. In the modern industrialized countries, several hundred thousand people die every year due to sudden cardiac death. The ECG is an important biosignal representing the sum total of millions of cardiac cell depolarization potentials. It contains important insight into the state of health and nature of the disease afflicting the heart. Heart rate variability (HRV) refers to the regulation of the sinoatrial node, the natural pacemaker of the heart, by the sympathetic and parasympathetic branches of the autonomic nervous system. Heart rate variability analysis is an important tool to observe the heart's ability to respond to normal regulatory impulses that affect its rhythm. A computer-based intelligent system for analysis of cardiac states is very useful in diagnostics and disease management. Like many bio-signals, HRV signals are non-linear in nature. Higher order spectral analysis (HOS) is known to be a good tool for the analysis of non-linear systems and provides good noise immunity. In this work, we studied the HOS of the HRV signals of normal heartbeat and four classes of arrhythmia. This thesis presents some general characteristics for each of these classes of HRV signals in the bispectrum and bicoherence plots. Several features were extracted from the HOS and subjected to an Analysis of Variance (ANOVA) test. The results are very promising for cardiac arrhythmia classification, with a number of features yielding a p-value < 0.02 in the ANOVA test. An automated intelligent system for the identification of cardiac health is very useful in healthcare technology. In this work, seven features were extracted from the heart rate signals using HOS and fed to a support vector machine (SVM) for classification. The performance evaluation protocol in this thesis uses 330 subjects consisting of five different kinds of cardiac disease conditions. The classifier achieved a sensitivity of 90% and a specificity of 89%. This system is ready to run on larger data sets. In EEG analysis, the search for hidden information for identification of seizures has a long history. Epilepsy is a pathological condition characterized by spontaneous and unforeseeable occurrence of seizures, during which the perception or behavior of patients is disturbed. An automatic early detection of the seizure onsets would help the patients and observers to take appropriate precautions. Various methods have been proposed to predict the onset of seizures based on EEG recordings. The use of nonlinear features motivated by higher order spectra has been reported to be a promising approach to differentiate between normal, background (pre-ictal) and epileptic EEG signals. In this work, these features are used to train both a Gaussian mixture model (GMM) classifier and a Support Vector Machine (SVM) classifier. Results show that the classifiers were able to achieve 93.11% and 92.67% classification accuracy, respectively, with selected HOS based features. About 2 hours of EEG recordings from 10 patients were used in this study.
This thesis introduces unique bispectrum and bicoherence plots for various cardiac conditions and for normal, background and epileptic EEG signals. These plots reveal distinct patterns. The patterns are useful for visual interpretation by those without a deep understanding of spectral analysis, such as medical practitioners. The thesis includes original contributions in extracting features from HRV and EEG signals using HOS and entropy, in analyzing the statistical properties of such features on real data and in automated classification using these features with GMM and SVM classifiers.
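To illustrate the kind of pipeline described above, here is a hedged Python sketch: a direct (segment-averaged) bispectrum estimate, two simple HOS-derived features, and an SVM evaluated by cross-validation. The signals are synthetic stand-ins with and without quadratic phase coupling; the feature set and data are illustrative, not the thesis's.

```python
# Sketch: bispectrum-based features separating phase-coupled from
# uncoupled signals with a support vector machine.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def bispectrum(x, nfft=64):
    """Direct bispectrum estimate averaged over non-overlapping segments."""
    segs = x[: len(x) // nfft * nfft].reshape(-1, nfft)
    X = np.fft.fft(segs - segs.mean(axis=1, keepdims=True), axis=1)
    k = np.arange(nfft // 4)                  # a corner of the principal domain
    B = np.zeros((k.size, k.size), dtype=complex)
    for i in k:
        B[i] = (X[:, i, None] * X[:, k] * np.conj(X[:, i + k])).mean(axis=0)
    return np.abs(B)

def hos_features(x):
    B = bispectrum(x)
    p = B / B.sum()
    entropy = -(p * np.log(p + 1e-12)).sum()  # bispectral entropy
    return [B.mean(), entropy]

rng = np.random.default_rng(2)

def make_signal(coupled, n=2048):
    t = np.arange(n)
    f1, f2 = 8 / 64, 3 / 64                   # bin-aligned for nfft = 64
    s = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
    if coupled:                               # quadratic phase coupling at f1+f2
        s += 0.5 * np.cos(2 * np.pi * (f1 + f2) * t)
    return s + rng.normal(0.0, 0.5, n)

X = np.array([hos_features(make_signal(c)) for c in [0, 1] * 40])
y = np.array([0, 1] * 40)
print("cross-validated accuracy:", cross_val_score(SVC(), X, y, cv=5).mean())
```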
Abstract:
Aim Australian residential aged care does not have a system of quality assessment related to clinical outcomes, or comprehensive quality benchmarking. The Residential Care Quality Assessment was developed to fill this gap, and this paper discusses the process by which preliminary benchmarks representing high and low quality were developed for it. Methods Data were collected from all residents (n = 498) of nine facilities. Numerator–denominator analysis of clinical outcomes occurred at a facility level, with rank-ordered results circulated to an expert panel. The panel identified threshold scores to indicate excellent and questionable care quality, and refined these through a Delphi process. Results Clinical outcomes varied both within and between facilities; agreed thresholds for excellent and poor outcomes were finalised after three Delphi rounds. Conclusion Use of the Residential Care Quality Assessment provides a concrete means of monitoring care quality and allows benchmarking across facilities; its regular use could contribute to improved care outcomes within residential aged care in Australia.
Abstract:
This paper presents a multiscale study using the coupled Meshless technique/Molecular Dynamics (M2) method for exploring the deformation mechanism of mono-crystalline metal (with a focus on copper) under uniaxial tension. In M2, an advanced transition algorithm using transition particles is employed to ensure the compatibility of both displacements and their gradients, and an effective local quasi-continuum approach is also applied to obtain the equivalent continuum strain energy density based on the atomistic potentials and the Cauchy-Born rule. The key parameters used in M2 are first investigated using a benchmark problem. M2 is then applied to the multiscale simulation of a mono-crystalline copper bar. It is found that mono-crystalline copper has very good elongation properties, and that its ultimate strength and Young's modulus are much higher than those obtained at the macro-scale.
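As a toy illustration of the Cauchy-Born rule underlying the local quasi-continuum step, the sketch below computes a continuum strain energy density by deforming the neighbour vectors of a perfect lattice with a uniform deformation gradient F and summing an atomistic pair potential. A 2D Lennard-Jones square lattice stands in for copper and its many-body potential, so the numbers are purely illustrative.

```python
# Cauchy-Born sketch: under a homogeneous deformation gradient F, each
# lattice vector r maps to F @ r; the strain energy density is the deformed
# atomistic energy per reference unit-cell area.
import numpy as np

def lj(r, eps=1.0, sigma=1.0):
    """Lennard-Jones pair potential."""
    return 4.0 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

def energy_density(F, a=1.12246, cutoff=2.8):
    """Energy per reference unit-cell area for a 2D square lattice under F."""
    n = int(np.ceil(cutoff / a)) + 1
    w = 0.0
    for i in range(-n, n + 1):
        for j in range(-n, n + 1):
            if i == 0 and j == 0:
                continue
            r = np.linalg.norm(F @ (a * np.array([i, j], dtype=float)))
            if r < cutoff:
                w += 0.5 * lj(r)     # half: each bond is shared by two atoms
    return w / a ** 2                # reference unit-cell area

identity = np.eye(2)
for stretch in (1.00, 1.01, 1.02):
    F = np.diag([stretch, 1.0])      # uniaxial stretch along x
    dW = energy_density(F) - energy_density(identity)
    print(f"stretch {stretch:.2f}: delta W = {dW:+.4f}")
```

Differentiating this energy density with respect to F (numerically or analytically) yields the continuum stress used in the coarse-scale region.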
Abstract:
Generative systems are now being proposed for addressing major ecological problems. The Complex Urban Systems Project (CUSP), founded in 2008 at the Queensland University of Technology, emphasises the ecological significance of the generative global networking of urban environments. It argues that the natural planetary systems for balancing global ecology are no longer able to respond sufficiently rapidly to the ecological damage caused by humankind, and by dense urban conurbations in particular, as evidenced by impacts such as climate change. The proposal of this research project is to provide a high-speed generative nervous system for the planet by connecting major cities globally to interact directly with natural ecosystems to engender rapid ecological response. This would be achieved by active interactions of the global urban network with the natural ecosystem, in accordance with the ecological principle of entropy. The key goal is to achieve ecologically positive cities by activating self-organising cities capable of full integration into natural ecosystems, and to network the cities globally to provide the planet with a nervous system.
Abstract:
This paper argues for a model of open system design for sustainable architecture, based on a thermodynamic framework of entropy as an evolutionary paradigm. The framework can be simplified to the statement that an open system evolves in a non-linear pattern from a far-from-equilibrium state towards a non-equilibrium state of entropy balance, which is a highly ordered organization of the system in which order comes out of chaos. This paper is work in progress on a PhD research project which aims to propose building information modelling for the optimization and adaptation of buildings' environmental performance as an alternative sustainable design program in architecture. It will be used for efficient distribution and consumption of energy and material resources in buildings throughout their life cycle, with the active involvement of the end-users and the physical constraints of the natural environment.
Abstract:
Background: In response to the need for more comprehensive quality assessment within Australian residential aged care facilities, the Clinical Care Indicator (CCI) Tool was developed to collect outcome data as a means of making inferences about quality. A national trial of its effectiveness and a Brisbane-based trial of its use within the quality improvement context determined that the CCI Tool represented a potentially valuable addition to the Australian aged care system. This document describes the next phase in the CCI Tool's development, the aims of which were to establish validity and reliability of the CCI Tool, and to develop quality indicator thresholds (benchmarks) for use in Australia. The CCI Tool is now known as the ResCareQA (Residential Care Quality Assessment). Methods: The study aims were achieved through a combination of quantitative data analysis and expert panel consultations using a modified Delphi process. The expert panel consisted of experienced aged care clinicians, managers, and academics; they were initially consulted to determine face and content validity of the ResCareQA, and later to develop thresholds of quality. To analyse its psychometric properties, ResCareQA forms were completed for all residents (N = 498) of nine aged care facilities throughout Queensland. Kappa statistics were used to assess inter-rater and test-retest reliability, and Cronbach's alpha coefficient was calculated to determine internal consistency. For concurrent validity, equivalent items on the ResCareQA and the Resident Classification Scales (RCS) were compared using Spearman's rank order correlations, while discriminative validity was assessed using the known-groups technique, comparing ResCareQA results between groups with differing care needs, as well as between male and female residents. Rank-ordered facility results for each clinical care indicator (CCI) were circulated to the panel; upper and lower thresholds for each CCI were nominated by panel members and refined through a Delphi process. These thresholds indicate excellent care at one extreme and questionable care at the other. Results: Minor modifications were made to the assessment, and it was renamed the ResCareQA. Agreement on its content was reached after two Delphi rounds; the final version contains 24 questions across four domains, enabling generation of 36 CCIs. Both test-retest and inter-rater reliability were sound, with median kappa values of 0.74 (test-retest) and 0.91 (inter-rater); internal consistency was not as strong, with a Cronbach's alpha of 0.46. Because the ResCareQA does not provide a single combined score, comparisons for concurrent validity were made with the RCS on an item-by-item basis, with most resultant correlations being quite low. Discriminative validity analyses, however, revealed highly significant differences in the total number of CCIs between high care and low care groups (t(199) = 10.77, p < 0.001), while the differences between male and female residents were not significant (t(414) = 0.56, p = 0.58). Clinical outcomes varied both within and between facilities; agreed upper and lower thresholds were finalised after three Delphi rounds. Conclusions: The ResCareQA provides a comprehensive, easily administered means of monitoring quality in residential aged care facilities that can be reliably used on multiple occasions. The relatively modest internal consistency score was likely due to the multi-factorial nature of quality, and the absence of an aggregate result for the assessment.
Measurement of concurrent validity proved difficult in the absence of a gold standard, but the sound discriminative validity results suggest that the ResCareQA has acceptable validity and could be confidently used as an indication of care quality within Australian residential aged care facilities. The thresholds, while preliminary due to the small sample size, enable users to make judgements about quality within and between facilities. It is therefore recommended that the ResCareQA be adopted for wider use.
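For concreteness, the two reliability statistics reported above can be computed as in the following Python sketch; the data here are toy values, not the study's, and the ResCareQA scoring itself is not reproduced.

```python
# Toy computation of Cohen's kappa (inter-rater agreement) and
# Cronbach's alpha (internal consistency) with plain numpy.
import numpy as np

def cohens_kappa(r1, r2):
    categories = np.union1d(r1, r2)
    p_observed = np.mean(r1 == r2)
    p_chance = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in categories)
    return (p_observed - p_chance) / (1.0 - p_chance)

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of item scores."""
    k = items.shape[1]
    sum_item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - sum_item_vars / total_var)

rater1 = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 1])   # toy binary ratings
rater2 = np.array([1, 0, 1, 0, 0, 1, 0, 1, 1, 1])
print(f"kappa = {cohens_kappa(rater1, rater2):.2f}")

rng = np.random.default_rng(3)
latent = rng.normal(size=(100, 1))                   # shared underlying trait
items = latent + rng.normal(scale=0.8, size=(100, 6))
print(f"alpha = {cronbach_alpha(items):.2f}")
```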