49 results for Correlation algorithm
in University of Queensland eSpace - Australia
Abstract:
The modelling of inpatient length of stay (LOS) has important implications in health care studies. Finite mixture distributions are usually used to model the heterogeneous LOS distribution, due to a certain proportion of patients sustaining a longer stay. However, as the morbidity data are collected from hospitals, observations clustered within the same hospital are often correlated. The generalized linear mixed model approach is adopted to accommodate the inherent correlation via unobservable random effects. An EM algorithm is developed to obtain residual maximum quasi-likelihood estimation. The proposed hierarchical mixture regression approach enables the identification and assessment of factors influencing the long-stay proportion and the LOS for the long-stay patient subgroup. A neonatal LOS data set is used for illustration. (C) 2003 Elsevier Science Ltd. All rights reserved.
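A minimal sketch of the mixture idea only: an EM fit of a two-component lognormal mixture to synthetic LOS data, separating a short-stay majority from a long-stay subgroup. It omits the hierarchical random effects and the residual maximum quasi-likelihood estimation described in the abstract, and all data and parameter values are illustrative assumptions.

```python
# EM for a two-component lognormal mixture of length-of-stay (LOS) data.
# Simplified illustration of the mixture idea; not the paper's hierarchical
# generalized linear mixed model.
import numpy as np

rng = np.random.default_rng(0)
# Synthetic LOS: a short-stay majority and a long-stay subgroup (assumed).
los = np.concatenate([rng.lognormal(1.0, 0.4, 400), rng.lognormal(2.5, 0.5, 100)])
x = np.log(los)

# Initial guesses: long-stay mixing proportion, log-scale means and sds.
pi, mu, sd = 0.5, np.array([0.5, 2.0]), np.array([0.5, 0.5])

for _ in range(200):
    # E-step: responsibility of the long-stay component for each observation.
    p_short = (1 - pi) * np.exp(-(x - mu[0]) ** 2 / (2 * sd[0] ** 2)) / sd[0]
    p_long = pi * np.exp(-(x - mu[1]) ** 2 / (2 * sd[1] ** 2)) / sd[1]
    r = p_long / (p_short + p_long)
    # M-step: update mixing proportion and component parameters.
    pi = r.mean()
    mu = np.array([np.average(x, weights=1 - r), np.average(x, weights=r)])
    sd = np.sqrt([np.average((x - mu[0]) ** 2, weights=1 - r),
                  np.average((x - mu[1]) ** 2, weights=r)])

print(f"estimated long-stay proportion: {pi:.2f}")
```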
Abstract:
We introduce a new second-order method of texture analysis called Adaptive Multi-Scale Grey Level Co-occurrence Matrix (AMSGLCM), based on the well-known Grey Level Co-occurrence Matrix (GLCM) method. The method deviates significantly from GLCM in that features are extracted, not via a fixed 2D weighting function of co-occurrence matrix elements, but by a variable summation of matrix elements in 3D localized neighborhoods. We subsequently present a new methodology for extracting optimized, highly discriminant features from these localized areas using adaptive Gaussian weighting functions. Genetic Algorithm (GA) optimization is used to produce a set of features whose classification worth is evaluated by discriminatory power and feature correlation considerations. We critically appraised the performance of our method and GLCM in pairwise classification of images from visually similar texture classes, captured from Markov Random Field (MRF) synthesized, natural, and biological origins. In these cross-validated classification trials, our method demonstrated significant benefits over GLCM, including increased feature discriminatory power, automatic feature adaptability, and significantly improved classification performance.
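For contrast with the adaptive multi-scale variant described above, the sketch below computes a conventional grey level co-occurrence matrix and two classic features (contrast and correlation). The AMSGLCM itself (3D localized neighbourhoods, adaptive Gaussian weights, GA-selected features) is not reproduced; the offset, grey-level count and test image are assumptions.

```python
# Conventional GLCM and two classic second-order texture features.
import numpy as np

def glcm(image, dx=1, dy=0, levels=8):
    """Normalised co-occurrence counts for pixel pairs at offset (dy, dx)."""
    m = np.zeros((levels, levels))
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[image[y, x], image[y + dy, x + dx]] += 1
    return m / m.sum()

def contrast_and_correlation(p):
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)
    mi, mj = np.sum(i * p), np.sum(j * p)
    si = np.sqrt(np.sum(p * (i - mi) ** 2))
    sj = np.sqrt(np.sum(p * (j - mj) ** 2))
    correlation = np.sum(p * (i - mi) * (j - mj)) / (si * sj)
    return contrast, correlation

img = np.random.default_rng(1).integers(0, 8, size=(64, 64))   # placeholder texture
print(contrast_and_correlation(glcm(img)))
```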
Abstract:
Endothelial dysfunction is an early key event of atherogenesis. Both fitness level and exercise intervention have been shown to positively influence endothelial function. In a cross-sectional study of 47 children, the relationship between habitual physical activity and flow-mediated dilation (FMD) of the brachial artery was explored. Habitual physical activity levels (PALs) were assessed using a validated stable isotope technique, and FMD of the brachial artery was measured via high-resolution ultrasound. The results showed that habitual physical activity significantly correlated with FMD (r=0.39, P=0.007), and remained the most influential variable on dilation in multivariate analysis. Although both fitness level and exercise intervention have previously been shown to positively influence FMD, this is the first time that a relationship with normal PALs has been investigated, especially at such a young age. These data support the concept that physical activity exerts its protective effect on cardiovascular health via the endothelium and add further emphasis to the importance of physical activity in childhood.
Abstract:
Recently Adams and Bischof (1994) proposed a novel region growing algorithm for segmenting intensity images. The inputs to the algorithm are the intensity image and a set of seeds - individual points or connected components - that identify the individual regions to be segmented. The algorithm grows these seed regions until all of the image pixels have been assimilated. Unfortunately the algorithm is inherently dependent on the order of pixel processing. This means, for example, that raster order processing and anti-raster order processing do not, in general, lead to the same tessellation. In this paper we propose an improved seeded region growing algorithm that retains the advantages of the Adams and Bischof algorithm - fast execution, robust segmentation, and no tuning parameters - but is pixel order independent. (C) 1997 Elsevier Science B.V.
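A minimal sketch of seeded region growing in the spirit of Adams and Bischof (1994): candidate boundary pixels sit in a priority queue ordered by their difference from the mean of the adjoining region. The tie-breaking refinements that make the improved algorithm pixel-order independent are not shown; the example image and seeds are placeholders.

```python
# Priority-queue seeded region growing (basic Adams-Bischof scheme).
import heapq
import numpy as np

def seeded_region_growing(image, seeds):
    """seeds: dict of nonzero label -> list of (row, col) seed pixels."""
    labels = np.zeros(image.shape, dtype=int)
    sums, counts = {}, {}
    heap, tick = [], 0

    def push_neighbours(r, c, lab):
        nonlocal tick
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < image.shape[0] and 0 <= nc < image.shape[1] and labels[nr, nc] == 0:
                delta = abs(image[nr, nc] - sums[lab] / counts[lab])
                heapq.heappush(heap, (delta, tick, nr, nc, lab))
                tick += 1

    for lab, pts in seeds.items():
        sums[lab], counts[lab] = 0.0, 0
        for (r, c) in pts:
            labels[r, c] = lab
            sums[lab] += image[r, c]
            counts[lab] += 1
    for lab, pts in seeds.items():
        for (r, c) in pts:
            push_neighbours(r, c, lab)

    while heap:                                   # grow until all pixels assimilated
        _, _, r, c, lab = heapq.heappop(heap)
        if labels[r, c]:
            continue
        labels[r, c] = lab
        sums[lab] += image[r, c]
        counts[lab] += 1
        push_neighbours(r, c, lab)
    return labels

img = np.array([[1., 1., 9.], [1., 5., 9.], [1., 9., 9.]])
print(seeded_region_growing(img, {1: [(0, 0)], 2: [(0, 2)]}))
```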
Abstract:
Transpiration efficiency, W, the ratio of plant carbon produced to water transpired, and carbon isotope discrimination of leaf dry matter, Delta(d), were measured together on 30 lines of the C-4 species Sorghum bicolor in the glasshouse and on eight lines grown in the field. In the glasshouse, the mean W observed was 4.9 mmol C mol(-1) H2O and the range was 0.8 mmol C mol(-1) H2O. The mean Delta(d) was 3.0 parts per thousand and the observed range was 0.4 parts per thousand. In the field, the mean W was lower at 2.8 mmol C mol(-1) H2O and the mean Delta(d) was 4.6 parts per thousand. Significant positive correlations between W and Delta(d) were observed for plants grown in the glasshouse and in the field. The observed correlations were consistent with theory, opposite to those for C-3 species, and showed that variation in Delta(d) was an integrated measure of long-term variation in the ratio of intercellular to ambient CO2 partial pressure, p(i)/p(a). Detailed gas exchange measurements of carbon isotope discrimination during CO2 uptake, Delta(A), and p(i)/p(a) were made on leaves of eight S. bicolor lines. The observed relationship between Delta(A) and p(i)/p(a) was linear with a negative slope of 3.7 parts per thousand in Delta(A) for a unit change in p(i)/p(a). The slope of this linear relationship between Delta(A) and p(i)/p(a) in C-4 species is dependent on the leakiness of the CO2 concentrating mechanism of the C-4 pathway. We estimated the leakiness (defined as the fraction of CO2 released in the bundle sheath by C-4 acid decarboxylations which is lost by leakage) to be 0.2. We conclude that, although the variation in Delta(d) observed in the 30 lines of S. bicolor is smaller than that commonly observed in C-3 species, it also reflects variation in transpiration efficiency, W. Among the eight lines examined in detail and in the environments used, there was considerable genotype x environment interaction.
Abstract:
Motivation: Prediction methods for identifying binding peptides could minimize the number of peptides required to be synthesized and assayed, and thereby facilitate the identification of potential T-cell epitopes. We developed a bioinformatic method for the prediction of peptide binding to MHC class II molecules. Results: Experimental binding data and expert knowledge of anchor positions and binding motifs were combined with an evolutionary algorithm (EA) and an artificial neural network (ANN): binding data extraction --> peptide alignment --> ANN training and classification. This method, termed PERUN, was implemented for the prediction of peptides that bind to HLA-DR4(B1*0401). The respective positive predictive values of PERUN predictions of high-, moderate-, low- and zero-affinity binders were assessed as 0.8, 0.7, 0.5 and 0.8 by cross-validation, and 1.0, 0.8, 0.3 and 0.7 by experimental binding. This illustrates the synergy between experimentation and computer modeling, and its application to the identification of potential immunotherapeutic peptides.
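A minimal sketch of the final classification step only: aligned 9-mer peptide cores are one-hot encoded and a simple classifier is fitted. The evolutionary-algorithm alignment and the actual PERUN network are not reproduced; the peptides, labels and a plain logistic regression used as a stand-in for the ANN are all illustrative assumptions.

```python
# One-hot encoding of 9-mer peptide cores plus a logistic-regression stand-in
# for the ANN classification step.
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"

def encode(peptide):
    """One-hot encoding of a 9-residue peptide core (9 x 20 -> flat vector)."""
    v = np.zeros((9, len(AA)))
    for i, aa in enumerate(peptide):
        v[i, AA.index(aa)] = 1.0
    return v.ravel()

peptides = ["FVKQNAAAL", "YARFQSQTT", "GILGFVFTL", "AAAAAAAAA"]   # placeholders
labels = np.array([1, 0, 1, 0])          # 1 = binder, 0 = non-binder (placeholder)
X = np.stack([encode(p) for p in peptides])

w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * X.T @ (p - labels) / len(labels)   # gradient descent step
    b -= 0.5 * (p - labels).mean()

print(1.0 / (1.0 + np.exp(-(X @ w + b))))         # predicted binding probabilities
```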
Abstract:
To translate and transfer solution data between two totally different meshes (i.e. mesh 1 and mesh 2), a consistent point-searching algorithm for solution interpolation in unstructured meshes consisting of 4-node bilinear quadrilateral elements is presented in this paper. The proposed algorithm has the following significant advantages: (1) The use of a point-searching strategy allows a point in one mesh to be accurately related to an element (containing this point) in another mesh. Thus, to translate/transfer the solution of any particular point from mesh 2 to mesh 1, only one element in mesh 2 needs to be inversely mapped. This certainly minimizes the number of elements to which the inverse mapping is applied. In this regard, the present algorithm is very effective and efficient. (2) Analytical solutions to the local coordinates of any point in a four-node quadrilateral element, which are derived in a rigorous mathematical manner in the context of this paper, make it possible to carry out the inverse mapping process very effectively and efficiently. (3) The use of consistent interpolation enables the interpolated solution to be compatible with the original solution and therefore guarantees an interpolated solution of extremely high accuracy. After the mathematical formulations of the algorithm are presented, the algorithm is tested and validated through a challenging problem. The related results from the test problem have demonstrated the generality, accuracy, effectiveness, efficiency and robustness of the proposed consistent point-searching algorithm. Copyright (C) 1999 John Wiley & Sons, Ltd.
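A minimal sketch of the inverse-mapping and interpolation step for a 4-node bilinear quadrilateral: recover the local coordinates (xi, eta) of a physical point, then evaluate the shape functions to interpolate nodal solution values. The paper derives closed-form expressions for the local coordinates; a Newton iteration is used here instead as an illustrative alternative, and the element and nodal values are assumed.

```python
# Inverse mapping of a point into a 4-node bilinear quadrilateral and
# consistent interpolation of nodal values at that point.
import numpy as np

def shape_functions(xi, eta):
    return 0.25 * np.array([(1 - xi) * (1 - eta), (1 + xi) * (1 - eta),
                            (1 + xi) * (1 + eta), (1 - xi) * (1 + eta)])

def inverse_map(nodes, point, iters=20):
    """nodes: (4, 2) element corner coordinates; point: (2,) physical point."""
    xi = eta = 0.0
    for _ in range(iters):
        N = shape_functions(xi, eta)
        dN_dxi = 0.25 * np.array([-(1 - eta), (1 - eta), (1 + eta), -(1 + eta)])
        dN_deta = 0.25 * np.array([-(1 - xi), -(1 + xi), (1 + xi), (1 - xi)])
        residual = N @ nodes - point                       # mapped point minus target
        J = np.column_stack((dN_dxi @ nodes, dN_deta @ nodes))
        xi, eta = np.array([xi, eta]) - np.linalg.solve(J, residual)
    return xi, eta

nodes = np.array([[0., 0.], [2., 0.], [2., 1.], [0., 1.]])
values = np.array([0., 2., 3., 1.])                # nodal solution to interpolate
xi, eta = inverse_map(nodes, np.array([0.5, 0.25]))
print(shape_functions(xi, eta) @ values)           # interpolated solution value
```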
Abstract:
OBJECTIVE: To evaluate a diagnostic algorithm for pulmonary tuberculosis based on smear microscopy and objective response to a trial of antibiotics. SETTING: Adult medical wards, Hlabisa Hospital, South Africa, 1996-1997. METHODS: Adults with chronic chest symptoms and an abnormal chest X-ray had sputum examined for Ziehl-Neelsen stained acid-fast bacilli by light microscopy. Those with negative smears were treated with amoxycillin for 5 days and assessed. Those who had not improved were treated with erythromycin for 5 days and reassessed. Response was compared with mycobacterial culture. RESULTS: Of 280 suspects who completed the diagnostic pathway, 160 (57%) had a positive smear, 46 (17%) responded to amoxycillin, 34 (12%) responded to erythromycin and 40 (14%) were treated as smear-negative tuberculosis. The sensitivity (89%) and specificity (84%) of the full algorithm for culture-positive tuberculosis were high. However, 11 patients (positive predictive value [PPV] 95%) were incorrectly diagnosed with tuberculosis, and 24 cases of tuberculosis (negative predictive value [NPV] 70%) were not identified. NPV improved to 75% when anaemia was included as a predictor. Algorithm performance was independent of human immunodeficiency virus status. CONCLUSION: A smear microscopy plus trial-of-antibiotics algorithm applied to a selected group of tuberculosis suspects may increase diagnostic accuracy in district hospitals in developing countries.
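A minimal sketch of the performance measures quoted in the abstract (sensitivity, specificity, PPV, NPV) computed from a 2 x 2 table of algorithm result against culture. The counts below are placeholders, not the study's data.

```python
# Diagnostic performance measures from a 2 x 2 table.
def diagnostic_performance(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # culture-positives correctly identified
        "specificity": tn / (tn + fp),   # culture-negatives correctly excluded
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

print(diagnostic_performance(tp=90, fp=10, fn=12, tn=48))   # placeholder counts
```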
Abstract:
Bosonized q-vertex operators related to the four-dimensional evaluation modules of the quantum affine superalgebra U-q[sl(2|1)-hat] are constructed for arbitrary level k=alpha, where alpha not equal 0,-1 is a complex parameter appearing in the four-dimensional evaluation representations. They are intertwiners among the level-alpha highest weight Fock-Wakimoto modules. Screening currents which commute with the action of U-q[sl(2|1)-hat] up to total differences are presented. Integral formulas for N-point functions of type I and type II q-vertex operators are proposed. (C) 2000 American Institute of Physics. [S0022-2488(00)00608-3].
Abstract:
Background The aim of this study was to examine ecological correlations between age-adjusted all-cause mortality rates in Australian statistical divisions and (1) the proportion of residents who self-identify as Indigenous, (2) remoteness, and (3) socio-economic deprivation. Methods All-cause mortality rates for 57 statistical divisions were calculated and directly standardized to the 1997 Australian population in 5-year age groups using Australian Bureau of Statistics (ABS) data. The proportion of residents who self-identified as Indigenous was obtained from the 1996 Census. Remoteness was measured using ARIA (Accessibility/Remoteness Index of Australia) values. Socio-economic deprivation was measured using SEIFA (Socio-Economic Indexes for Areas) values from the ABS. Results Age-standardized all-cause mortality varies twofold, from 5.7 to 11.3 per 1000, across Australian statistical divisions. The strongest correlation was between Indigenous status and mortality (r = 0.69, p < 0.001). The correlation between remoteness and mortality was modest (r = 0.39, p = 0.002), as was the correlation between socio-economic deprivation and mortality (r = -0.42, p = 0.001). Excluding the three divisions with the highest mortality, a multiple regression model using the logarithm of the adjusted mortality rate as the dependent variable showed that the partial correlation (and hence proportion of the variance explained) for Indigenous status was 0.03 (9 per cent; p = 0.03), for SEIFA score was -0.17 (3 per cent; p = 0.22), and for remoteness was -0.22 (5 per cent; p = 0.13). Collectively, the three variables studied explain 13 per cent of the variability in mortality. Conclusions Ecological correlation exists between all-cause mortality, Indigenous status, remoteness and disadvantage across Australia. The strongest correlation is with Indigenous status, and correlation with all three characteristics is weak when the three statistical divisions with the highest mortality rates are excluded. Intervention targeted at these three statistical divisions could reduce much of the variability in mortality in Australia.
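A minimal sketch of the kind of ecological analysis described: Pearson correlations of each area-level variable with the age-standardised mortality rate, and a multiple regression on log mortality. The data below are synthetic placeholders with the same structure, not the study's statistical divisions.

```python
# Area-level correlations and multiple regression on log mortality (synthetic data).
import numpy as np

rng = np.random.default_rng(4)
n = 57                                            # statistical divisions
indigenous = rng.uniform(0, 0.3, n)               # proportion Indigenous (assumed)
remoteness = rng.uniform(0, 12, n)                # ARIA-like score (assumed)
seifa = rng.normal(1000, 50, n)                   # SEIFA-like index (assumed)
mortality = (7 + 8 * indigenous + 0.1 * remoteness
             - 0.005 * (seifa - 1000) + rng.normal(0, 0.5, n))   # deaths per 1000

for name, x in [("Indigenous", indigenous), ("remoteness", remoteness), ("SEIFA", seifa)]:
    print(name, "r =", round(np.corrcoef(x, mortality)[0, 1], 2))

# Multiple regression of log mortality on the three predictors.
X = np.column_stack([np.ones(n), indigenous, remoteness, seifa])
beta, *_ = np.linalg.lstsq(X, np.log(mortality), rcond=None)
print("regression coefficients:", beta)
```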
Abstract:
In this paper, the minimum-order stable recursive filter design problem is proposed and investigated. This problem plays an important role in pipelined implementations in signal processing. Here, the existence of a high-order stable recursive filter is proved theoretically, and an upper bound for the highest order of stable filters is given. Then the minimum-order stable linear predictor is obtained via solving an optimization problem. In this paper, the popular genetic algorithm approach is adopted since it is a heuristic probabilistic optimization technique and has been widely used in engineering designs. Finally, an illustrative example is used to show the effectiveness of the proposed algorithm.
Abstract:
An equivalent algorithm is proposed to simulate thermal effects of the magma intrusion in geological systems, which are composed of porous rocks. Based on the physical and mathematical equivalence, the original magma solidification problem with a moving boundary between the rock and intruded magma is transformed into a new problem without the moving boundary but with a physically equivalent heat source. From the analysis of an ideal solidification model, the physically equivalent heat source has been determined in this paper. The major advantage in using the proposed equivalent algorithm is that the fixed finite element mesh with a variable integration time step can be employed to simulate the thermal effect of the intruded magma solidification using the conventional finite element method. The related numerical results have demonstrated the correctness and usefulness of the proposed equivalent algorithm for simulating the thermal effect of the intruded magma solidification in geological systems. (C) 2003 Elsevier B.V. All rights reserved.
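A minimal 1D finite-difference sketch of the equivalent-heat-source idea: latent heat released over the magma solidification interval is treated as an apparent heat capacity on a fixed grid, so no moving rock/magma boundary needs to be tracked. The grid, material values and the linear-release assumption are illustrative; the paper's finite element formulation is not reproduced.

```python
# Fixed-grid cooling of an intrusion with latent heat as an equivalent source.
import numpy as np

nx, dx = 100, 10.0                               # cells, grid spacing [m]
dt, steps = 1.0e7, 1000                          # time step [s], number of steps
kappa = 1e-6                                     # thermal diffusivity [m^2/s]
L, c = 3.5e5, 1.0e3                              # latent heat [J/kg], heat capacity [J/kg/K]
T_sol, T_liq = 900.0, 1200.0                     # solidus / liquidus [deg C]

T = np.full(nx, 25.0)                            # country rock temperature
T[40:60] = T_liq                                 # intruded magma, initially at liquidus

for _ in range(steps):
    lap = np.zeros(nx)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx ** 2
    dT = kappa * dt * lap
    # Equivalent source: within the solidification interval, cooling is buffered
    # by latent heat release (apparent heat capacity c + L / (T_liq - T_sol)).
    in_mush = (T > T_sol) & (T < T_liq)
    dT[in_mush] /= 1.0 + L / (c * (T_liq - T_sol))
    T += dT

print("peak temperature after cooling:", T.max())
```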
Abstract:
Functional magnetic resonance imaging (FMRI) analysis methods can be quite generally divided into hypothesis-driven and data-driven approaches. The former are utilised in the majority of FMRI studies, where a specific haemodynamic response is modelled utilising knowledge of event timing during the scan, and is tested against the data using a t test or a correlation analysis. These approaches often lack the flexibility to account for variability in haemodynamic response across subjects and brain regions, which is of specific interest in high temporal resolution event-related studies. Current data-driven approaches attempt to identify components of interest in the data, but currently do not utilise any physiological information for the discrimination of these components. Here we present a hypothesis-driven approach that is an extension of Friman's maximum correlation modelling method (NeuroImage 16, 454-464, 2002), specifically focused on discriminating the temporal characteristics of event-related haemodynamic activity. Test analyses, on both simulated and real event-related FMRI data, will be presented.