93 results for DENSITY ANALYSIS
Abstract:
Areal bone mineral density (aBMD) is the most common surrogate measurement for assessing the bone strength of the proximal femur associated with osteoporosis. Additional factors, however, contribute to the overall strength of the proximal femur, primarily the anatomical geometry. Finite element analysis (FEA) is an effective and widely used computer-based simulation technique for modeling mechanical loading of various engineering structures, providing predictions of displacement and induced stress distribution due to the applied load. FEA is therefore inherently dependent upon both density and anatomical geometry. FEA may be performed on both three-dimensional and two-dimensional models of the proximal femur derived from radiographic images, from which the mechanical stiffness may be predicted. It was examined whether the outcome measures of two-dimensional FEA of X-ray images (FEXI) and three-dimensional FEA, the computed stiffness of the proximal femur, were more sensitive than aBMD to changes in trabecular bone density and femur geometry. It is assumed that if an outcome measure follows known trends with changes in density and geometric parameters, then an increased sensitivity will be indicative of an improved prediction of bone strength. All three outcome measures increased non-linearly with trabecular bone density, increased linearly with cortical shell thickness and neck width, decreased linearly with neck length, and were relatively insensitive to neck-shaft angle. For femoral head radius, aBMD was relatively insensitive, with two-dimensional FEXI and three-dimensional FEA demonstrating a non-linear increase and decrease in sensitivity, respectively. For neck anteversion, aBMD decreased non-linearly, whereas both two-dimensional FEXI and three-dimensional FEA demonstrated a parabolic-type relationship, with maximum stiffness achieved at an angle of approximately 15°.
Multi-parameter analysis showed that all three outcome measures demonstrated their highest sensitivity to a change in cortical thickness. When changes in all input parameters were considered simultaneously, three- and two-dimensional FEA had statistically equal sensitivities (0.41±0.20 and 0.42±0.16 respectively, p = ns) that were significantly higher than the sensitivity of aBMD (0.24±0.07; p = 0.014 and 0.002 for three-dimensional and two-dimensional FEA respectively). This simulation study suggests that, since mechanical integrity and FEA are inherently dependent upon anatomical geometry, FEXI stiffness, being derived from conventional two-dimensional radiographic images, may provide an improvement in the prediction of bone strength of the proximal femur over that currently provided by aBMD.
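The sensitivity concept at the heart of this comparison can be illustrated with a minimal finite element sketch. The bar element, its material and geometric values, and the 1% perturbation below are illustrative assumptions, not the study's femur models; the sketch only shows how a normalized sensitivity (relative change in stiffness per relative change in a parameter) recovers the kind of linear trends described above.

```python
# Minimal sketch (assumed values): normalized sensitivity of an FE-computed
# stiffness outcome to a geometric parameter, via a small relative perturbation.

def bar_stiffness(E, A, L):
    """Axial stiffness (N/mm) of a two-node bar element: k = E * A / L."""
    return E * A / L

def normalized_sensitivity(f, params, name, delta=0.01):
    """(dk / k) / (dp / p): relative stiffness change per relative parameter change."""
    base = f(**params)
    perturbed = dict(params, **{name: params[name] * (1 + delta)})
    return (f(**perturbed) - base) / base / delta

# Hypothetical cortical-bone-like values: E in MPa, A in mm^2, L in mm.
params = {"E": 17000.0, "A": 600.0, "L": 100.0}
sens_area = normalized_sensitivity(bar_stiffness, params, "A")
sens_length = normalized_sensitivity(bar_stiffness, params, "L")
print(f"sensitivity to cross-section: {sens_area:+.3f}")   # stiffness rises with A
print(f"sensitivity to length: {sens_length:+.3f}")        # stiffness falls with L
```

Under the study's assumption, a measure whose sensitivity magnitude is consistently larger across such perturbations is the better predictor of strength.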
Abstract:
Bone mineral density (BMD) is currently the preferred surrogate for bone strength in clinical practice. Finite element analysis (FEA) is a computer simulation technique that can predict the deformation of a structure when a load is applied, providing a measure of stiffness (N mm⁻¹). Finite element analysis of X-ray images (3D-FEXI) is an FEA technique whose analysis is derived from a single 2D radiographic image. This ex-vivo study demonstrates that 3D-FEXI derived from a conventional 2D radiographic image has the potential to significantly increase the accuracy of failure load assessment of the proximal femur compared with that currently achieved with BMD.
Abstract:
The refractive error of a human eye varies across the pupil and therefore may be treated as a random variable. The probability distribution of this random variable provides a means for assessing the main refractive properties of the eye without the necessity of a traditional functional representation of wavefront aberrations. To demonstrate this approach, the statistical properties of refractive error maps are investigated. Closed-form expressions are derived for the probability density function (PDF) and its statistical moments for the general case of rotationally symmetric aberrations. A closed-form expression for the PDF of a general non-rotationally symmetric wavefront aberration is difficult to derive. However, for specific cases, such as astigmatism, a closed-form expression of the PDF can be obtained. Further, interpretation of the distribution of the refractive error map, as well as its moments, is provided for a range of wavefront aberrations measured in real eyes. These are evaluated using kernel density and sample moment estimators. It is concluded that the refractive error domain allows non-functional analysis of wavefront aberrations based on simple statistics in the form of sample moments. Clinicians may find this approach to wavefront analysis easier to interpret due to the clinical familiarity and intuitive appeal of refractive error maps.
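The non-functional analysis described here can be sketched numerically. The astigmatism-only refractive error map RE(θ) = C·cos(2θ) and the parameter values below are illustrative assumptions; the sketch shows only the generic machinery of kernel density and sample moment estimation named in the abstract.

```python
# Hedged sketch: treat the refractive error across the pupil as a random
# variable, estimate its PDF with a Gaussian kernel density estimator and
# summarize it with sample moments. Map and values below are assumed.
import math
import random
import statistics

random.seed(0)
C = 1.0  # dioptres of astigmatism (assumed)
# Sample the refractive error RE(theta) = C*cos(2*theta) at random pupil angles.
samples = [C * math.cos(2 * random.uniform(0, 2 * math.pi)) for _ in range(5000)]

mean = statistics.fmean(samples)
var = statistics.pvariance(samples)  # theory gives C**2 / 2 for this map

def kde(x, data, h=0.05):
    """Gaussian kernel density estimate of the refractive error PDF at x."""
    return sum(math.exp(-0.5 * ((x - d) / h) ** 2) for d in data) / (
        len(data) * h * math.sqrt(2 * math.pi))

print(f"sample mean {mean:.3f} D, sample variance {var:.3f} D^2")
print(f"KDE at 0 D: {kde(0.0, samples):.3f}")  # arcsine-type PDF: density peaks near +/-C
```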
Abstract:
There is a need in industry for a commodity polyethylene film with controllable degradation properties that will degrade in an environmentally neutral way, for applications such as shopping bags and packaging film. Additives such as starch have been shown to accelerate the degradation of plastic films; however, control of degradation is required so that the film will retain its mechanical properties during storage and use, and then degrade when no longer required. By the addition of a photocatalyst it is hoped that the polymer film will break down with exposure to sunlight. Furthermore, it is desired that the polymer film will degrade in the dark, after a short initial exposure to sunlight. Research has been undertaken into the photo- and thermo-oxidative degradation processes of 25 µm thick LLDPE (linear low density polyethylene) film containing titania from different manufacturers. Films were aged in a suntest or in an oven at 50 °C, and the oxidation product formation was followed using IR spectroscopy. Degussa P25, Kronos 1002, and various organic-modified and doped titanias of the types Sachtleben Hombitan and Huntsman Tioxide incorporated into LLDPE films were assessed for photoactivity. Degussa P25 was found to be the most photoactive with UVA and UVC exposure. Surface modification of titania was found to reduce photoactivity. Crystal phase is thought to be among the most important factors when assessing the photoactivity of titania as a photocatalyst for degradation. Pre-irradiation with UVA or UVC for 24 hours of the film containing 3% Degussa P25 titania prior to aging in an oven resulted in embrittlement in ca. 200 days. The multivariate data analysis technique PCA (principal component analysis) was used as an exploratory tool to investigate the IR spectral data. Oxidation products formed in similar relative concentrations across all samples, confirming that titania was catalysing the oxidation of the LLDPE film without changing the oxidation pathway.
PCA was also employed to compare rates of degradation in different films. PCA enabled the discovery of water vapour trapped inside cavities formed by oxidation caused by titania particles. Imaging ATR/FTIR spectroscopy with high lateral resolution was used in a novel experiment to examine the heterogeneous nature of oxidation of a model polymer compound caused by the presence of titania particles. A model polymer containing Degussa P25 titania was solvent cast onto the internal reflection element of the imaging ATR/FTIR instrument and the oxidation under UVC was examined over time. Sensitisation of 5 µm domains by titania resulted in areas of relatively high oxidation product concentration. The suitability of transmission IR with a synchrotron light source for the study of polymer film oxidation was assessed at the Australian Synchrotron in Melbourne, Australia. Challenges such as interference fringes and poor signal-to-noise ratio need to be addressed before this can become a routine technique.
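As an illustration of PCA as an exploratory tool on spectral series, the following sketch builds synthetic IR-like spectra with a growing carbonyl-type band (all values assumed, not the thesis data) and extracts the first principal component by power iteration; the PC1 scores then order the samples by degradation extent.

```python
# Hedged sketch: PCA on synthetic "IR spectra" where a carbonyl-like band
# grows with ageing time. Centre the data, find PC1 by power iteration,
# and inspect the scores. Band position, width and noise are assumptions.
import math
import random

random.seed(3)
wavenumbers = [1600 + 5 * i for i in range(60)]  # cm^-1 grid near the carbonyl region

def spectrum(age):
    """Synthetic spectrum: band at ~1715 cm^-1 growing linearly with age, plus noise."""
    return [age * math.exp(-((w - 1715) / 20) ** 2) + random.gauss(0, 0.01)
            for w in wavenumbers]

ages = [0, 1, 2, 3, 4, 5]
X = [spectrum(a) for a in ages]
means = [sum(col) / len(X) for col in zip(*X)]
Xc = [[x - m for x, m in zip(row, means)] for row in X]  # mean-centred spectra

# Power iteration for the leading principal component (loadings vector).
pc = [1.0] * len(wavenumbers)
for _ in range(100):
    scores = [sum(r[j] * pc[j] for j in range(len(pc))) for r in Xc]
    pc = [sum(scores[i] * Xc[i][j] for i in range(len(Xc))) for j in range(len(pc))]
    norm = math.sqrt(sum(v * v for v in pc))
    pc = [v / norm for v in pc]

scores = [sum(r[j] * pc[j] for j in range(len(pc))) for r in Xc]
print("PC1 scores track ageing:", [round(s, 2) for s in scores])
```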
Abstract:
The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on the computation, transmission, and storage costs. This decomposition structure is based on analysis of the information packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean-square criterion as well as the sensitivities in human vision. To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least-squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise-shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach.
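The generalized Gaussian modelling step can be illustrated with a simpler stand-in for the thesis's least-squares estimator: matching the moment ratio E|x|²/(E|x|)² to its closed form Γ(1/β)Γ(3/β)/Γ(2/β)² and solving for the shape parameter β by bisection. This moment-matching variant is an assumed illustration, not the thesis's algorithm.

```python
# Hedged sketch: estimate the generalized Gaussian shape parameter beta of
# "wavelet coefficients" by moment matching. For beta = 2 (Gaussian) the
# moment ratio equals pi/2; the ratio decreases monotonically in beta.
import math
import random

def moment_ratio(b):
    """E|x|^2 / (E|x|)^2 for a generalized Gaussian with shape parameter b."""
    return math.gamma(1.0 / b) * math.gamma(3.0 / b) / math.gamma(2.0 / b) ** 2

def fit_shape(samples, lo=0.2, hi=6.0):
    """Bisection on b; moment_ratio(b) decreases as b increases."""
    n = len(samples)
    m1 = sum(abs(x) for x in samples) / n
    m2 = sum(x * x for x in samples) / n
    target = m2 / m1 ** 2
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if moment_ratio(mid) > target:
            lo = mid  # ratio still too large -> true shape is larger
        else:
            hi = mid
    return 0.5 * (lo + hi)

random.seed(1)
coeffs = [random.gauss(0.0, 1.0) for _ in range(50000)]  # Gaussian stand-in data
beta = fit_shape(coeffs)
print(f"fitted shape parameter: {beta:.2f}")  # true value is 2 for Gaussian data
```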
In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training and multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all of the source vectors without the need to project these on the lattice outermost shell, while it properly maintains a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only the cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training and multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images. For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high quality reconstructed images with better compression ratios than other available algorithms.
To evaluate the proposed algorithms, their objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structural-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
Abstract:
Unsaturated soil mechanics is receiving increasing attention from researchers as well as from practicing engineers. However, the sophisticated devices and long testing times required to measure unsaturated soil properties have kept geotechnical engineers from applying unsaturated soil mechanics to practical geotechnical problems. The application of conventional laboratory devices, with some modifications, to measure unsaturated soil properties can promote the application of unsaturated soil mechanics in engineering practice. Therefore, in the present study, a conventional direct shear device was modified to measure unsaturated shear strength parameters at low suction. In particular, for the analysis of rain-induced slope failures, it is important to measure unsaturated shear strength parameters at low suction, where slopes become unstable. The modified device was used to measure the unsaturated shear strength of two silty soils at low suction values (0–50 kPa) that were achieved by following the drying path and wetting path of the soil-water characteristic curves (SWCCs) of the soils. The results revealed that the internal friction angle of the soil was not significantly affected by suction or by the drying-wetting SWCCs of the soils. The apparent cohesion of the soil increased at a decreasing rate as the suction increased. Further, the apparent cohesion obtained from soil in wetting was greater than that obtained from soil in drying. Shear stress-shear displacement curves obtained from soil specimens subjected to the same net normal stress and different suction values showed a higher initial stiffness and a greater peak stress as the suction increased. In addition, it was observed that soil became more dilative with increasing suction. A soil in wetting exhibited slightly higher peak shear stress and more contractive volume change behaviour than soil in drying at the same net normal stress and suction.
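The quantities measured here are often interpreted through the extended Mohr-Coulomb criterion, τ = c′ + (σ − uₐ)·tan φ′ + (uₐ − u_w)·tan φᵇ, in which suction adds an apparent-cohesion term. The sketch below uses assumed parameter values and a constant φᵇ (the study actually observed the suction contribution growing at a decreasing rate), purely to illustrate the relationship.

```python
# Hedged sketch of the extended Mohr-Coulomb criterion for unsaturated shear
# strength. All parameter values (c', phi', phi_b) are assumed, not the study's.
import math

def shear_strength(net_stress_kpa, suction_kpa, c=5.0, phi_deg=30.0, phi_b_deg=15.0):
    """tau = c' + (sigma - u_a)*tan(phi') + (u_a - u_w)*tan(phi_b), in kPa."""
    phi = math.radians(phi_deg)
    phi_b = math.radians(phi_b_deg)
    apparent_cohesion = c + suction_kpa * math.tan(phi_b)  # grows with suction
    return apparent_cohesion + net_stress_kpa * math.tan(phi)

# Strength over the suction range studied (0-50 kPa) at 100 kPa net normal stress.
for s in (0, 25, 50):
    print(f"suction {s:2d} kPa -> strength {shear_strength(100.0, s):.1f} kPa")
```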
Abstract:
The Centre for Subtropical Design at QUT, in partnership with the Queensland Government and Brisbane City Council, conducts research focused on 'best practice' outcomes for higher density urban living environments in the subtropics through the study of typical urban residential typologies and urban design. The aim of the research is to inform and illustrate best practice subtropical design principles to policy makers and development industry professionals to stimulate climate-responsive outcomes. The Centre for Subtropical Design recently sought project-specific funding from the Queensland Department of Infrastructure and Planning (DIP) to investigate residential typologies for sustainable subtropical urban communities, based on transit-oriented development principles and outcomes for areas around public transport nodes. A development site within the Fitzgibbon Urban Development Area, close to a rail and bus transport corridor, provided a case study location for this project. Four design-led multi-disciplinary creative teams participated in a Design Charrette and have produced concept drawings and propositions on a range of options, or prototypes. Analysis of selected prototypes has been undertaken to determine their environmental, economic and social performance. This Project Report discusses the scope of the project funded by DIP in terms of activities undertaken to date, and deliverables achieved. A subsequent Research Report will discuss the detailed findings of the analysis.
Abstract:
In June 2009 the Centre for Subtropical Design at the Queensland University of Technology conducted a design charrette to research design concepts for liveable subtropical neighbourhoods characterised by higher-density, mixed-use, family-oriented housing. Subsequent analysis of the proposed designs evaluated how well these typologies support economic, environmental and social sustainability. The study was led by Ms Rosemary Kennedy, Director of the Centre for Subtropical Design, and QUT School of Design Adjunct Professor Peter Richards, Chair of the Centre for Subtropical Design Board and director of Deicke Richards Architects and Urban Designers.
Abstract:
Humankind has been dealing with all kinds of disasters since the dawn of time. The risk and impact of disasters producing mass casualties worldwide is increasing, due partly to global warming as well as to increased population growth, increased density and the aging population. China, as a country with a large population, vast territory, and complex climatic and geographical conditions, has been plagued by all kinds of disasters. Disaster health management has traditionally been a relatively arcane discipline within public health. However, SARS, avian influenza, earthquakes and floods, along with the need to be better prepared for the Olympic Games in China, have brought disasters, their management and their potential for large-scale health consequences on populations to the attention of the public, the government and the international community alike. As a result, significant improvements were made to the disaster management policy framework, along with changes to systems and structures to incorporate an improved disaster management focus. This involved the upgrade of the Centres for Disease Control and Prevention (CDC) throughout China to monitor and better control the health consequences, particularly of infectious disease outbreaks. However, as seen in the Southern China snowstorm and the Wenchuan earthquake of 2008, there remains a lack of integrated disaster management and efficient medical rescue, which has been costly in terms of economics and health for China. In the context of a very large and complex country, there is a need to better understand whether these changes have resulted in effective management of the health impacts of such incidents. To date, the health consequences of disasters, particularly in China, have not been a major focus of study. The main aim of this study is to analyse and evaluate disaster health management policy in China and, in particular, its ability to effectively manage the health consequences of disasters.
Flood has been selected for this study as it is a common and significant disaster type in China and throughout the world. This information will then be used to guide conceptual understanding of the health consequences of floods. A secondary aim of the study is to compare disaster health management in China and Australia, as these countries differ in their length of experience in having a formalised policy response. The final aim of the study is to determine the extent to which Walt and Gilson's (1994) model of policy explains how disaster management policy in China was developed and implemented from SARS in 2003 to the present day. This study has utilised a case study methodology. A document analysis and literature search of Chinese and English sources was undertaken to analyse and produce a chronology of disaster health management policy in China. Additionally, three detailed case studies of flood health management in China were undertaken, along with three case studies in Australia, in order to examine the policy response and any health consequences stemming from the floods. A total of 30 key international disaster health management experts were surveyed to identify fundamental elements and principles of a successful policy framework for disaster health management. Key policy ingredients were identified from the literature, the case studies and the survey of experts. Walt and Gilson's (1994) policy model, which focuses on the actors, content, context and process of policy, was found to be a useful model for analysing disaster health management policy development and implementation in China. This thesis is divided into four parts. Part 1 is a brief overview of the issues and context to set the scene. Part 2 examines the conceptual and operational context, including the international literature, government documents and the operational environment for disaster health management in China. Part 3 examines primary sources of information to inform the analysis.
This involves two key studies:
• A comparative analysis of the management of floods in China and Australia
• A survey of international experts in the field of disaster management, so as to inform the evaluation of the policy framework in existence in China and the criteria upon which the expression of that policy could be evaluated
Part 4 describes the key outcomes of this research, which include:
• A conceptual framework for describing the health consequences of floods
• A conceptual framework for disaster health management
• An evaluation of the disaster health management policy and its implementation in China
The research outcomes clearly identified that the most significant improvements are to be derived from improvements in the generic management of disasters, rather than the health aspects alone. Thus, the key findings and recommendations tend to focus on generic issues. The key findings of this research include the following:
• The health consequences of floods may be described in terms of time as 'immediate', 'medium term' and 'long term', and also in relation to causation as 'direct' and 'indirect' consequences of the flood. These two aspects form a matrix which in turn guides management responses.
• Disaster health management in China requires a more comprehensive response throughout the cycle of prevention, preparedness, response and recovery, but it also requires a more concentrated effort on policy implementation to ensure the translation of the policy framework into effective incident management.
• The policy framework in China is largely of international standard, with a sound legislative base. In addition, the development of the Centres for Disease Control and Prevention has provided the basis for a systematic approach to health consequence management. However, the key weaknesses in the current system include:
o The lack of a key central structure to provide the infrastructure with vital support for policy development, implementation and evaluation
o The lack of well-prepared local response teams similar to local government based volunteer groups in Australia
• The system lacks structures to coordinate government action at the local level. The result of this is a poorly coordinated local response and a lack of clarity regarding the point at which escalation of the response to higher levels of government is advisable. These result in higher levels of risk and negative health impacts.
The key recommendations arising from this study are:
1. Disaster health management policy in China should be enhanced by incorporating disaster management considerations into policy development, and by requiring a disaster management risk analysis and disaster management impact statement for development proposals.
2. China should transform existing organizations to establish a central organization similar to the Federal Emergency Management Agency (FEMA) in the USA or Emergency Management Australia (EMA) in Australia. This organization would be responsible for leading nationwide preparedness through planning, standards development, education and incident evaluation, and for providing operational support to national and local government bodies in the event of a major incident.
3. China should review national and local plans to reflect consistency in planning, and to emphasize the advantages of the integrated planning process.
4. Enhance community resilience through community education and the development of a local volunteer organization. China should develop a national strategy which sets direction and standards in regard to education and training, and requires system testing through exercises. Other initiatives may include the development of a local volunteer capability with appropriate training to assist professional response agencies such as police and fire services in a major incident. An existing organisation such as the Communist Party may be an appropriate structure to provide this response in a cost-effective manner.
5. Continue development of professional emergency services, particularly ambulance services, to ensure an effective infrastructure is in place to support the emergency response in disasters.
6. Funding for disaster health management should be enhanced, not only from government but also from other sources such as donations and insurance. It is necessary to provide a more transparent mechanism to ensure the funding is disseminated according to the needs of the people affected.
7. Emphasis should be placed on prevention and preparedness, especially on effective disaster warnings.
8. China should develop local disaster health management infrastructure utilising existing resources wherever possible. Strategies for enhancing local infrastructure could include the identification of local resources (including military resources) which could be made available to support disaster responses. It should develop operational procedures to access those resources.
Implementation of these recommendations should better position China to reduce the significant health consequences experienced each year from major incidents such as floods, and to provide an increased level of confidence to the community about the country's capacity to manage such events.
Abstract:
In public places, crowd size may be an indicator of congestion, delay, instability, or of abnormal events, such as a fight, riot or emergency. Crowd related information can also provide important business intelligence such as the distribution of people throughout spaces, throughput rates, and local densities. A major drawback of many crowd counting approaches is their reliance on large numbers of holistic features, training data requirements of hundreds or thousands of frames per camera, and that each camera must be trained separately. This makes deployment in large multi-camera environments such as shopping centres very costly and difficult. In this chapter, we present a novel scene-invariant crowd counting algorithm that uses local features to monitor crowd size. The use of local features allows the proposed algorithm to calculate local occupancy statistics, scale to conditions which are unseen in the training data, and be trained on significantly less data. Scene invariance is achieved through the use of camera calibration, allowing the system to be trained on one or more viewpoints and then deployed on any number of new cameras for testing without further training. A pre-trained system could then be used as a ‘turn-key’ solution for crowd counting across a wide range of environments, eliminating many of the costly barriers to deployment which currently exist.
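The scene-invariance idea can be sketched as follows: each frame is reduced to a local-feature total, normalized by a calibration-derived pixels-per-person factor for that viewpoint, so one regressor transfers to unseen cameras. All numbers below (pixel areas, noise level, counts) are synthetic assumptions, not the chapter's features or datasets.

```python
# Hedged sketch: train a crowd-size regressor on one camera using calibration-
# normalized features, then apply it to a new camera without retraining.
import random

random.seed(2)

def frame_feature(count, pixels_per_person):
    """Synthetic local-feature total (e.g. foreground area) for a frame."""
    noise = random.gauss(0, 0.05 * count * pixels_per_person)
    return count * pixels_per_person + noise

# Training camera: calibration says ~400 px per person at this viewpoint (assumed).
train = [(c, frame_feature(c, 400.0) / 400.0) for c in range(5, 50)]

# Least-squares slope through the origin: count ~= slope * normalized feature.
slope = sum(c * f for c, f in train) / sum(f * f for _, f in train)

# New camera with different calibration (150 px/person): only the
# calibration-based normalization changes; the regressor is reused as-is.
obs = frame_feature(30, 150.0) / 150.0
est = slope * obs
print(f"estimated crowd size on unseen camera: {est:.1f}")
```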
Abstract:
Navigational collisions are a major safety concern in many seaports. Despite recent advances in port navigational safety research, little is known about harbor pilots' perception of collision risks in anchorages. This study attempts to model such risks by employing a hierarchical ordered probit model, calibrated using data collected through a risk perception survey conducted on Singapore port pilots. The hierarchical model is found to be useful for accounting for correlations in risks perceived by individual pilots. Results show higher perceived risks in anchorages attached to intersections and to local and international fairways, becoming more critical at night. Lesser risks are perceived in anchorages featuring a shoreline boundary, higher water depth, a lower density of stationary ships, cardinal marks and isolated danger marks. Pilotage experience shows a negative effect on perceived risks. This study indicates that hierarchical modeling would be useful for treating correlations in navigational safety data.
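The ordered probit machinery behind such a model can be sketched as follows. Thresholds, covariates and coefficients below are illustrative assumptions, not the calibrated Singapore estimates, and the hierarchical (pilot-level random effect) layer is omitted for brevity: a latent risk index is mapped to ordered categories through cut points, with category probabilities given by the normal CDF.

```python
# Hedged sketch of an ordered probit: latent index -> ordered risk categories.
# Cut points and coefficients are assumed for illustration only.
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def category_probs(latent, cuts=(-0.5, 0.5, 1.5)):
    """P(risk category k) for the ordered categories split by the cut points."""
    cdf = [norm_cdf(c - latent) for c in cuts]
    return [cdf[0]] + [cdf[i] - cdf[i - 1] for i in range(1, len(cdf))] + [1 - cdf[-1]]

# Illustrative latent index: a night indicator raises perceived risk,
# pilotage experience (years) lowers it. Coefficients are assumptions.
latent = 0.8 * 1 - 0.03 * 10
probs = category_probs(latent)
print("category probabilities:", [round(p, 3) for p in probs])
```

Raising the latent index (e.g. an anchorage at an intersection, at night) shifts probability mass toward the higher risk categories, which is the pattern the survey results describe.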
Abstract:
We present a formalism for the analysis of sensitivity of nuclear magnetic resonance pulse sequences to variations of pulse sequence parameters, such as radiofrequency pulses, gradient pulses or evolution delays. The formalism enables the calculation of compact, analytic expressions for the derivatives of the density matrix and the observed signal with respect to the parameters varied. The analysis is based on two constructs computed in the course of modified density-matrix simulations: the error interrogation operators and error commutators. The approach presented is consequently named the Error Commutator Formalism (ECF). It is used to evaluate the sensitivity of the density matrix to parameter variation based on the simulations carried out for the ideal parameters, obviating the need for finite-difference calculations of signal errors. The ECF analysis therefore carries a computational cost comparable to a single density-matrix or product-operator simulation. Its application is illustrated using a number of examples from basic NMR spectroscopy. We show that the strength of the ECF is its ability to provide analytic insights into the propagation of errors through pulse sequences and the behaviour of signal errors under phase cycling. Furthermore, the approach is algorithmic and easily amenable to implementation in the form of a programming code. It is envisaged that it could be incorporated into standard NMR product-operator simulation packages.
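The contrast the formalism exploits, analytic derivatives of the observed signal versus finite-difference re-simulation, can be seen in a one-spin toy example. This is not the ECF itself (no error interrogation operators or commutators are constructed), just a minimal density-matrix simulation, under assumed conventions, showing that the derivative of the detected signal with respect to a pulse flip angle matches its finite-difference estimate.

```python
# Hedged toy example: propagate the density matrix of a single spin through an
# x-pulse of flip angle theta and compare the analytic signal derivative with a
# finite-difference estimate. Conventions (signs, detection operator) assumed.
import math

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def dagger(A):
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def rot_x(theta):
    """Pulse propagator exp(-i * theta * Ix) for a spin-1/2."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return [[c, -1j * s], [-1j * s, c]]

def signal(theta):
    rho0 = [[0.5, 0], [0, -0.5]]  # equilibrium density matrix, Iz
    R = rot_x(theta)
    rho = matmul(matmul(R, rho0), dagger(R))
    return 2 * rho[0][1].imag  # amplitude of the -Iy coherence: equals sin(theta)

theta = 0.7
analytic = math.cos(theta)  # d sin(theta) / d theta, known in closed form
fd = (signal(theta + 1e-6) - signal(theta - 1e-6)) / 2e-6
print(f"analytic {analytic:.6f}  finite-difference {fd:.6f}")
```

The ECF's advantage, as the abstract notes, is obtaining such derivatives from a single simulation at the ideal parameters rather than repeated perturbed simulations.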