949 results for normal coordinate analysis


Relevance:

30.00%

Publisher:

Abstract:

CONCLUSIONS: The new HSA protocol used in mfVEP testing can be applied to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. The protocol provides information about focal visual field differences across the horizontal midline, which can be used to differentiate between glaucoma and normal subjects. Sensitivity and specificity of the mfVEP test were very promising and correlated with other anatomical changes accompanying glaucomatous field loss. PURPOSE: Multifocal visual evoked potential (mfVEP) is a newly introduced method for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some successfully detected field defects comparable to standard automated perimetry (SAP) visual field assessment, while others were less informative and required further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. METHODS: Three groups were tested: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects underwent two standard Humphrey field analyzer (HFA) 24-2 tests and a single mfVEP test in one session. The mfVEP results were analysed with the new protocol, the hemifield sector analysis (HSA) protocol; the HFA results were analysed with the standard grading system. RESULTS: There was a statistically significant difference in mean signal-to-noise ratio between the three groups (ANOVA, p < 0.001, 95% confidence interval).
The superior-inferior hemifield difference was statistically significant in all 11 sectors in the glaucoma group (t-test, p < 0.001), partially significant in 5/11 sectors in the glaucoma suspect group (t-test, p < 0.01), and not significant in most sectors of the normal group (1/11 sectors significant; t-test, p < 0.9). Sensitivity and specificity of the HSA protocol were 97% and 86%, respectively, for detecting glaucoma, and 89% and 79% for glaucoma suspects.

Relevance:

30.00%

Publisher:

Abstract:

Fluorescence spectroscopy has recently become more common in clinical medicine. However, there are still many unresolved issues related to the methodology and implementation of instruments with this technology. In this study, we aimed to assess individual variability of fluorescence parameters of endogenous markers (NADH, FAD, etc.) measured by fluorescence spectroscopy (FS) in situ and to analyse the factors that lead to a significant scatter of results. Most studied fluorophores have an acceptable scatter of values (mostly up to 30%) for diagnostic purposes. Here we provide evidence that the level of blood volume in tissue impacts FS data, with a significant inverse correlation. The distribution of the fluorescence intensity and of the fluorescent contrast coefficient values follows a normal distribution for most of the studied fluorophores and for the redox ratio. The effects of various physiological (different content of skin melanin) and technical (characteristics of optical filters) factors on the measurement results were additionally studied. The data on the variability of the measurement results in FS should be considered when interpreting diagnostic parameters, as well as when developing new algorithms for data processing and new FS devices.

Relevance:

30.00%

Publisher:

Abstract:

Data fluctuation in multiple measurements of Laser Induced Breakdown Spectroscopy (LIBS) greatly affects the accuracy of quantitative analysis. A new LIBS quantitative analysis method based on the Robust Least Squares Support Vector Machine (RLS-SVM) regression model is proposed. The usual way to enhance the analysis accuracy is to improve the quality and consistency of the emission signal, such as by averaging the spectral signals or spectrum standardization over a number of laser shots. The proposed method focuses more on how to enhance the robustness of the quantitative analysis regression model. The proposed RLS-SVM regression model originates from the Weighted Least Squares Support Vector Machine (WLS-SVM) but has an improved segmented weighting function and residual error calculation according to the statistical distribution of measured spectral data. Through the improved segmented weighting function, the information on the spectral data in the normal distribution will be retained in the regression model while the information on the outliers will be restrained or removed. Copper elemental concentration analysis experiments of 16 certified standard brass samples were carried out. The average value of relative standard deviation obtained from the RLS-SVM model was 3.06% and the root mean square error was 1.537%. The experimental results showed that the proposed method achieved better prediction accuracy and better modeling robustness compared with the quantitative analysis methods based on Partial Least Squares (PLS) regression, standard Support Vector Machine (SVM) and WLS-SVM. It was also demonstrated that the improved weighting function had better comprehensive performance in model robustness and convergence speed, compared with the four known weighting functions.
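The segmented weighting idea can be illustrated with the classic Hampel-type function used in WLS-SVM, from which the paper's improved variant is derived; the thresholds `c1`, `c2` and the floor weight below are illustrative assumptions, not the authors' fitted values:

```python
import numpy as np

def hampel_weights(residuals, c1=2.5, c2=3.0):
    """Segmented (Hampel-type) weighting as in WLS-SVM: residuals near
    the centre of the distribution keep full weight, moderate outliers
    are down-weighted linearly, and gross outliers get a small fixed
    weight, so spectral outliers barely influence the regression."""
    # Robust scale estimate via the median absolute deviation (MAD).
    s = 1.483 * np.median(np.abs(residuals - np.median(residuals)))
    r = np.abs(residuals) / max(s, 1e-12)
    w = np.ones_like(r)
    mid = (r > c1) & (r <= c2)
    w[mid] = (c2 - r[mid]) / (c2 - c1)   # linear taper
    w[r > c2] = 1e-4                     # near-zero weight for gross outliers
    return w
```

Retraining the LS-SVM with these weights keeps the information from spectra in the normal bulk of the distribution while suppressing outlier shots, which is the robustness mechanism the abstract describes.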

Relevance:

30.00%

Publisher:

Abstract:

Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some successfully detected field defects comparable to standard automated perimetry (SAP) visual field assessment, while others were less informative and required further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. The purpose of this study was to examine the benefit of adding the mfVEP hemifield intersector analysis protocol to the standard HFA test when glaucomatous visual field loss is suspected. Three groups were tested: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects underwent two standard Humphrey visual field (HFA) 24-2 tests, optical coherence tomography of the optic nerve head, and a single mfVEP test in one session. The mfVEP results were analysed with the new protocol, the hemifield sector analysis (HSA) protocol. Retinal nerve fibre layer (RNFL) thickness was recorded to identify subjects with suspicious RNFL loss. The hemifield intersector analysis showed that the signal-to-noise ratio (SNR) difference between superior and inferior hemifields was statistically significant between the three groups (ANOVA, p < 0.001, 95% CI). The superior-inferior difference was statistically significant in all 11 sectors in the glaucoma group (t-test, p < 0.001), partially significant in 5/11 sectors in the glaucoma suspect group (t-test, p < 0.01), and not significant in most sectors of the normal group (only 1/11 significant; t-test, p < 0.9). Sensitivity and specificity of the HSA protocol were 97% and 86%, respectively, for detecting glaucoma, and 89% and 79% for glaucoma suspects.
Combining SAP and mfVEP results in subjects with suspicious glaucomatous visual field defects, identified by low RNFL thickness, is beneficial in confirming early visual field defects. The new HSA protocol used in mfVEP testing can detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Used alongside SAP analysis, the protocol provides information about focal visual field differences across the horizontal midline and confirms suspicious field defects. Sensitivity and specificity of the mfVEP test were very promising and correlated with other anatomical changes accompanying glaucomatous field loss. The intersector analysis protocol can detect early field changes not detected by the standard HFA test.
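The reported sensitivity and specificity follow from standard confusion-matrix arithmetic; a minimal sketch (the counts below are hypothetical, chosen only to illustrate the calculation, not taken from the study's raw data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN): fraction of diseased eyes flagged.
    Specificity = TN / (TN + FP): fraction of normal eyes cleared."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 35 of 36 glaucoma eyes flagged, 33 of 38 normals cleared.
sens, spec = sensitivity_specificity(tp=35, fn=1, tn=33, fp=5)
```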

Relevance:

30.00%

Publisher:

Abstract:

Metrology processes are integral to manufacturing systems and can entail considerable financial investment in coordinate measuring systems. However, generic methodologies to quantify their economic value are lacking in today's industry. To address this problem, a mathematical model is proposed in this paper by statistical deductive reasoning, defining the relationships between the process capability index, measurement uncertainty and the tolerance band. The correctness of the model is demonstrated by a case study. Finally, several comments and suggestions on evaluating and maximizing the benefits of metrology investment are given.
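The capability/uncertainty/tolerance relationship can be sketched under one common simplifying assumption (independent, normally distributed measurement error inflating the apparent process spread); this is an illustration of the kind of relationship the paper models, not its full derivation:

```python
import math

def observed_cp(tolerance_band, process_sigma, meas_sigma):
    """Observed process capability index when measurement error adds
    variance to the apparent spread:
        sigma_obs^2 = sigma_process^2 + sigma_meas^2
        Cp_obs      = T / (6 * sigma_obs)
    Assumes independent, zero-mean normal errors (a simplification)."""
    sigma_obs = math.sqrt(process_sigma**2 + meas_sigma**2)
    return tolerance_band / (6.0 * sigma_obs)
```

A more accurate coordinate measuring system (smaller `meas_sigma`) raises the observed Cp toward the true Cp, which is one way investment in metrology translates into demonstrable capability.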

Relevance:

30.00%

Publisher:

Abstract:

Spamming has been a widespread problem for social networks. In recent years there has been increasing interest in anti-spamming analysis for microblogs such as Twitter. In this paper we present systematic research on spamming in Sina Weibo, currently a dominant microblogging service provider in China. Our research objectives are to understand the specific spamming behaviours in Sina Weibo and to find approaches to identify and block spammers based on spamming-behaviour classifiers. To begin the analysis of spamming behaviours, we devised several effective methods to collect a large set of spammer samples, including proactive honeypots and crawlers, keyword-based searching, and buying spammer samples directly from online merchants. Processing the database associated with these samples, we found three representative spamming behaviours: aggressive advertising, repeated duplicate reposting and aggressive following. We extracted various features and compared the behaviours of spammers and legitimate users with regard to these features, finding that spamming and normal behaviours have distinct characteristics. Based on these findings we designed an automatic online spammer-identification system. Tests with real data demonstrated that the system can effectively detect spamming behaviours and identify spammers in Sina Weibo.
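The three reported behaviours suggest simple per-account features; a toy scoring sketch with illustrative thresholds (not the paper's fitted classifier; the feature names and cut-offs are assumptions):

```python
def spam_score(posts, following, followers):
    """Toy score over the three behaviours the study reports:
    aggressive advertising (URL-heavy posts), repeated duplicate
    reposting (few unique posts), and aggressive following (high
    following-to-follower ratio). Thresholds are illustrative only."""
    url_ratio = sum('http' in p for p in posts) / max(len(posts), 1)
    dup_ratio = 1.0 - len(set(posts)) / max(len(posts), 1)
    follow_ratio = following / max(followers, 1)
    score = 0
    if url_ratio > 0.5:
        score += 1
    if dup_ratio > 0.5:
        score += 1
    if follow_ratio > 10:
        score += 1
    return score  # 0 = normal-looking, 3 = strongly spammer-like
```

A production system would replace the hard thresholds with a trained classifier over these and other features, as the abstract describes.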

Relevance:

30.00%

Publisher:

Abstract:

Purpose: To investigate the use of MRIA for quantitative characterisation of subretinal fibrosis secondary to nAMD. Methods: MRIA images of the posterior pole were acquired over 4 months from 20 eyes, including those with inactive subretinal fibrosis and those being treated with ranibizumab for nAMD. Changes in the morphology of the macula affected by nAMD were modelled, and reflectance spectra at the MRIA acquisition wavelengths (507, 525, 552, 585, 596, 611 and 650 nm) were computed using Monte Carlo simulation. Quantitative indicators of fibrosis were derived by matching image spectra to the model spectra of known morphological properties. Results: The model spectra were comparable to the image spectra, both normal and pathological. The key morphological changes that the model associated with nAMD were gliosis of the IS-OS junction, a decrease in retinal blood and a decrease in RPE melanin. However, these changes were not specific to fibrosis, and none of the quantitative indicators showed a unique association with the degree of fibrosis. Moderate correlations were found with the clinical assessment, but not with the treatment program. Conclusion: MRIA can distinguish subretinal fibrosis from healthy tissue. The methods used show high sensitivity but low specificity, being unable to distinguish scarring from other abnormalities such as atrophy. Quantification of scarring was not achieved with the wavelengths used, due to the complex structural changes to retinal tissues in the process of nAMD. Further studies, incorporating other wavelengths, will establish whether MRIA has a role in the assessment of subretinal fibrosis in the context of retinal and choroidal pathology.

Relevance:

30.00%

Publisher:

Abstract:

Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduce additional difficulty for assembly accuracy and error estimation. Planar surfaces as key product characteristics are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-scale volume products. To evaluate the accuracy of the assembly system, an error propagation model for measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, the general coordinate vector is adopted to represent the position of the components. The error transmission functions are simplified into a linear model, and the coordinates of the reference points are composed by theoretical value and random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final coordination accuracy is mainly determined by measurement error of the planar surface in small components. To reduce the uncertainty of the plane measurement, an evaluation index of measurement strategy is presented. This index reflects the distribution of the sampling point set and can be calculated by an inertia moment matrix. Finally, a practical application is introduced for validating the evaluation index.
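The linearised error-propagation step under the normality assumption can be sketched as follows; the Jacobian `A` stands in for the paper's simplified error transmission functions, and the numeric values are illustrative:

```python
import numpy as np

def propagate_covariance(jacobian, source_cov):
    """Linearised error propagation: if the assembly coordinate vector
    is approximated by y = A x, and the measurement and fixture errors
    in x are zero-mean normal with covariance Sigma_x, then the output
    covariance is Sigma_y = A Sigma_x A^T."""
    A = np.asarray(jacobian, dtype=float)
    Sx = np.asarray(source_cov, dtype=float)
    return A @ Sx @ A.T
```

Comparing the diagonal of `Sigma_y` for different error sources is one way to see, as the paper reports, that planar-surface measurement error dominates the final coordination accuracy.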

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: Pregnancy may provide a 'teachable moment' for positive health behaviour change, as a time when women are both motivated towards health and in regular contact with health care professionals. This study aimed to investigate whether women's experiences of pregnancy indicate that they would be receptive to behaviour change during this period. DESIGN: Qualitative interview study. METHODS: Using interpretative phenomenological analysis, this study details how seven women made decisions about their physical activity and dietary behaviour during their first pregnancy. RESULTS: Two women had required fertility treatment to conceive. Their behaviour was driven by anxiety and a drive to minimize potential risks to the pregnancy, including detailed information seeking and strict adherence to diet and physical activity recommendations. However, the majority of women described behaviour change as 'automatic', adopting a new lifestyle immediately upon discovering their pregnancy. Diet and physical activity were influenced by what these women perceived to be normal or acceptable during pregnancy (largely based on observations of others) and by internal drivers, including bodily signals and a desire to retain some of their pre-pregnancy self-identity. More reasoned assessments regarding benefits for them and their baby were less prevalent and less influential. CONCLUSIONS: Findings suggest that for women who conceived relatively easily, diet and physical activity behaviour during pregnancy is primarily based upon a combination of automatic judgements, physical sensations, and perceptions of what pregnant women are supposed to do. Health professionals and other credible sources appear to exert less influence. As such, pregnancy alone may not create a 'teachable moment'. Statement of contribution: What is already known on this subject? Significant life events can be cues to action in relation to health behaviour change.
However, much of the empirical research in this area has focused on negative health experiences such as receiving a false-positive screening result and hospitalization, and in relation to unequivocally negative behaviours such as smoking. It is often suggested that pregnancy, as a major life event, is a 'teachable moment' (TM) for lifestyle behaviour change due to an increase in motivation towards health and regular contact with health professionals. However, there is limited evidence for the utility of the TM model in predicting or promoting behaviour change. What does this study add? Two groups of women emerged from our study: the women who had experienced difficulties in conceiving and had received fertility treatment, and those who had conceived without intervention. The former group's experience of pregnancy was characterized by a sense of vulnerability and anxiety over sustaining the pregnancy which influenced every choice they made about their diet and physical activity. For the latter group, decisions about diet and physical activity were made immediately upon discovering their pregnancy, based upon a combination of automatic judgements, physical sensations, and perceptions of what is normal or 'good' for pregnancy. Among women with relatively trouble-free conception and pregnancy experiences, the necessary conditions may not be present to create a 'teachable moment'. This is due to a combination of a reliance on non-reflective decision-making, perception of low risk, and little change in affective response or self-concept.

Relevance:

30.00%

Publisher:

Abstract:

Prices of U.S. Treasury securities vary over time and across maturities. When the market in Treasurys is sufficiently complete and frictionless, these prices may be modeled by a function of time and maturity. A cross-section of this function for fixed time is called the yield curve; the aggregate of these cross-sections is the evolution of the yield curve. This dissertation studies aspects of this evolution.

There are two complementary approaches to the study of yield curve evolution here: principal components analysis and wavelet analysis. In both approaches the time and maturity variables are discretized. In principal components analysis the vectors of yield curve shifts are viewed as observations of a multivariate normal distribution. The resulting covariance matrix is diagonalized, and the resulting eigenvalues and eigenvectors (the principal components) are used to draw inferences about the yield curve evolution.

In wavelet analysis, the vectors of shifts are resolved into hierarchies of localized fundamental shifts (wavelets) that leave specified global properties invariant (average change and duration change). The hierarchies relate to the degree of localization, with movements restricted to a single maturity at the base and general movements at the apex. Second-generation wavelet techniques allow better adaptation of the model to economic observables. Statistically, the wavelet approach is inherently nonparametric, while the wavelets themselves are better adapted to describing a complete market.

Principal components analysis provides information on the dimension of the yield curve process. While there is no clear demarcation between operative factors and noise, the top six principal components pick up 99% of total interest rate variation 95% of the time. An economically justified basis for this process is hard to find; for example, a simple linear model will not suffice for the first principal component, and the shape of this component is nonstationary.

Wavelet analysis works more directly with yield curve observations than principal components analysis. In fact, the complete process from bond data to multiresolution is presented, including the dedicated Perl programs and the details of the portfolio metrics and the specially adapted wavelet construction. The result is more robust statistics, which provide balance to the more fragile principal components analysis.
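The principal-components step described above (diagonalize the covariance of yield curve shifts, then read off explained variance) can be sketched as follows, here on synthetic data rather than the dissertation's Treasury data:

```python
import numpy as np

def yield_curve_pcs(shifts):
    """Principal components of yield-curve shifts. Rows are observation
    dates, columns are maturities. Returns eigenvalues in descending
    order, the fraction of total variance each explains, and the
    corresponding eigenvectors (the principal components)."""
    X = np.asarray(shifts, dtype=float)
    X = X - X.mean(axis=0)                  # center the shift vectors
    cov = np.cov(X, rowvar=False)           # maturities x maturities
    evals, evecs = np.linalg.eigh(cov)      # eigh: ascending order
    order = np.argsort(evals)[::-1]         # re-sort descending
    evals = evals[order]
    return evals, evals / evals.sum(), evecs[:, order]
```

On real curve data the leading components are conventionally interpreted as level, slope and curvature; the dissertation's finding that six components capture 99% of variation corresponds to a rapidly decaying eigenvalue spectrum.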

Relevance:

30.00%

Publisher:

Abstract:

The microarray technology provides a high-throughput technique to study gene expression. Microarrays can help us diagnose different types of cancers, understand biological processes, assess host responses to drugs and pathogens, find markers for specific diseases, and much more. Microarray experiments generate large amounts of data; thus, effective data processing and analysis are critical for making reliable inferences from the data.

The first part of the dissertation addresses the problem of finding an optimal set of genes (biomarkers) to classify a set of samples as diseased or normal. Three statistical gene selection methods (GS, GS-NR, and GS-PCA) were developed to identify a set of genes that best differentiate between samples. A comparative study of different classification tools was performed, and the best combinations of gene selection methods and classifiers for multi-class cancer classification were identified. For most of the benchmark cancer data sets, the gene selection method proposed in this dissertation, GS, outperformed the other gene selection methods. Classifiers based on Random Forests, neural network ensembles, and K-nearest neighbors (KNN) showed consistently good performance. A striking commonality among these classifiers is that they all use a committee-based approach, suggesting that ensemble classification methods are superior.

The same biological problem may be studied at different research labs and/or performed using different lab protocols or samples. In such situations, it is important to combine results from these efforts. The second part of the dissertation addresses the problem of pooling the results from different independent experiments to obtain improved results. Four statistical pooling techniques (Fisher's inverse chi-square method, the Logit method, Stouffer's Z-transform method, and the Liptak-Stouffer weighted Z-method) were investigated. These pooling techniques were applied to the problem of identifying cell cycle-regulated genes in two different yeast species, yielding improved sets of cell cycle-regulated genes. The last part of the dissertation explores the effectiveness of wavelet data transforms for the task of clustering. Discrete wavelet transforms, with an appropriate choice of wavelet bases, were shown to be effective in producing clusters that were biologically more meaningful.
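The first of the pooling techniques mentioned, Fisher's inverse chi-square method, is simple to state: under the joint null, X = -2 Σ ln p_i follows a chi-square distribution with 2k degrees of freedom. A self-contained sketch (the chi-square survival function has a closed form for even degrees of freedom, so no stats library is needed):

```python
import math

def fisher_pool(pvalues):
    """Fisher's inverse chi-square method for combining k independent
    p-values: X = -2 * sum(ln p_i) ~ chi-square with 2k df under the
    joint null. For even df the survival function is
    P(X > x) = exp(-x/2) * sum_{i<k} (x/2)^i / i!."""
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= (x / 2.0) / i   # next term of the truncated series
        total += term
    return math.exp(-x / 2.0) * total
```

The other three methods (Logit, Stouffer, Liptak-Stouffer) differ mainly in how they transform the p-values before summing, and in whether studies are weighted.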

Relevance:

30.00%

Publisher:

Abstract:

In this study we have identified key genes that are critical in the development of astrocytic tumors. Meta-analysis of microarray studies comparing normal tissue to astrocytoma revealed a set of 646 genes differentially expressed in the majority of astrocytomas. Reverse engineering of these 646 genes using Bayesian network analysis produced a gene network for each grade of astrocytoma (Grade I-IV), and 'key genes' within each grade were identified. The genes found to be most influential in the development of the highest grade of astrocytoma, glioblastoma multiforme, were: COL4A1, EGFR, BTF3, MPP2, RAB31, CDK4, CD99, ANXA2, TOP2A, and SERBP1. All of these genes were up-regulated except MPP2 (down-regulated). These 10 genes were able to predict tumor status with 96-100% confidence using logistic regression, cross-validation, and support vector machine analysis. The Markov blanket genes interact with NF-κB, ERK, MAPK, VEGF, growth hormone and collagen to produce a network whose top biological functions are cancer, neurological disease, and cellular movement. Three of the 10 genes in particular (EGFR, COL4A1, and CDK4) appeared to be potential 'hubs of activity'. Modified expression of these 10 Markov blanket genes increases the lifetime risk of developing glioblastoma compared to the normal population, and the risk estimates increased dramatically with the joint effects of 4 or more genes: joint interaction of 4, 5, 6, 7, 8, 9 or 10 Markov blanket genes produced increases of 9, 13, 20.9, 26.7, 52.8, 53.2, 78.1 or 85.9%, respectively, in the lifetime risk of developing glioblastoma relative to the normal population. In summary, it appears that modified expression of several 'key genes' may be required for the development of glioblastoma. Further studies are needed to validate these 'key genes' as useful tools for early detection and novel therapeutic options for these tumors.

Relevance:

30.00%

Publisher:

Abstract:

C-reactive protein (CRP), a normally occurring human plasma protein, may become elevated as much as 1,000-fold during disease states involving acute inflammation or tissue damage. Through its binding to phosphorylcholine in the presence of calcium, CRP has been shown to potentiate the activation of complement, stimulate phagocytosis and opsonize certain microorganisms. Utilizing a flow cytometric functional ligand-binding assay, I have demonstrated that a monocyte population in human peripheral blood and specific human-derived myelomonocytic cell lines reproducibly bind an evolutionarily conserved conformational pentraxin epitope on human CRP through a mechanism that does not involve its ligand, phosphorylcholine.

A variety of cell lines at different stages of differentiation were examined. The monocytic cell line THP-1 bound the most CRP, followed by U937 and KG-1a cells. The HL-60 cell line was induced towards either the granulocyte or the monocyte pathway with DMSO or PMA, respectively. Untreated HL-60 cells and DMSO-treated cells did not bind CRP, while cells treated with PMA showed increased binding of CRP, similar to U937 cells. T-cell- and B-cell-derived lines were negative.

Inhibition studies with Limulin and human SAP demonstrated that the binding site is a conserved pentraxin epitope. The calcium requirement for binding indicated that the cells recognize a conformational form of CRP. Phosphorylcholine did not inhibit the reaction, so the possibility that CRP had bound to damaged membranes with exposed PC sites was discounted.

A study of 81 normal donors using flow cytometry demonstrated that a majority of peripheral blood monocytes (67.9 ± 1.3, mean ± SEM) bound CRP. The percentage of binding was normally distributed and not affected by gender, age or ethnicity. Whole blood obtained from donors representing a variety of disease states showed a significant reduction in the level of CRP bound by monocytes in those donors classified with infection, inflammation or cancer. This reduction in the monocyte populations binding CRP did not correlate with the concentration of plasma CRP. The ability of monocytes to specifically bind CRP, combined with the binding reactivity of the protein itself to a variety of phosphorylcholine-containing substances, may represent an important bridge between innate and adaptive immunity.

Relevance:

30.00%

Publisher:

Abstract:

The adverse health effects of long-term exposure to lead are well established, with major uptake into the human body occurring mainly through oral ingestion by young children. Lead-based paint was frequently used in homes built before 1978, particularly in inner-city areas, and minority populations experience the effects of lead poisoning disproportionately.

Lead-based paint abatement is costly. In the United States, residents of about 400,000 homes, occupied by 900,000 young children, lack the means to correct lead-based paint hazards. The magnitude of this problem demands research on affordable methods of hazard control. One such method is encapsulation, defined as any covering or coating that acts as a permanent barrier between the lead-based paint surface and the environment.

Two encapsulants were tested for reliability and effective life span through an accelerated lifetime experiment that applied stresses exceeding those encountered under normal use conditions. The resulting time-to-failure data were used to extrapolate the failure time under normal use, and statistical analysis and models of the test data allow forecasting of long-term reliability relative to the 20-year encapsulation requirement. Typical housing material specimens simulating walls and doors coated with lead-based paint were overstressed before encapsulation; a second, un-aged set was also tested. Specimens were monitored after the stress test with a surface chemical testing pad to identify the presence of lead breaking through the encapsulant.

Graphical analysis proposed by Shapiro and Meeker and the general log-linear model developed by Cox were used to obtain results. Findings for the 80%-reliability time to failure varied, with close to 21 years of life under normal use conditions for encapsulant A; application of product A on the aged gypsum and aged wood substrates yielded slightly lower times. Encapsulant B had an 80%-reliable life of 19.78 years. This study reveals that encapsulation technologies can offer safe and effective control of lead-based paint hazards and may be less expensive than other options. The U.S. Department of Health and Human Services and the CDC are committed to eliminating childhood lead poisoning by 2010. This ambitious target is feasible, provided there is an efficient application of innovative technology, a goal to which this study aims to contribute.
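The extrapolation step can be sketched under simplifying assumptions (a lognormal life distribution and a known, constant acceleration factor); this is a stand-in illustration, not the study's Shapiro-Meeker graphical analysis or Cox log-linear model, and the failure times and acceleration factor below are hypothetical:

```python
import math

def b20_life_normal_use(stress_failure_times, acceleration_factor, z20=-0.8416):
    """Extrapolate the 80%-reliable (B20) life under normal use from
    accelerated-test failure times. Assumes lognormal life and a
    constant acceleration factor; z20 is the standard-normal 20th
    percentile, so exp(mu + z20*sd) is the stress-level B20 life."""
    logs = [math.log(t) for t in stress_failure_times]
    n = len(logs)
    mu = sum(logs) / n
    sd = math.sqrt(sum((x - mu) ** 2 for x in logs) / (n - 1))
    b20_stress = math.exp(mu + z20 * sd)        # 20th percentile under stress
    return acceleration_factor * b20_stress     # scale back to use conditions
```

The 80%-reliable life is the time by which at most 20% of specimens are expected to fail, which is how the study's ~21-year and 19.78-year figures should be read against the 20-year requirement.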

Relevance:

30.00%

Publisher:

Abstract:

The Convention on Biological Diversity (CBD) was created in 1992 to coordinate global governments to protect biological resources. The CBD has three goals: protection of biodiversity, achievement of sustainable use of biodiversity, and facilitation of equitable sharing of the benefits of biological resources. The goal of protecting biological resources has remained both controversial and difficult to implement, and this study focused on it. The research was designed to examine how globally constructed environmental policies are adapted by national governments and then passed down to the local levels where actual implementation takes place. The effectiveness of such policies depends on the extent of actual implementation at local levels; compliance was therefore divided into and examined at three levels: global, national and local. The study developed criteria to measure compliance at each of these levels, and both qualitative and quantitative methods were used to analyze compliance and implementation. The study was guided by three questions broadly examining the critical factors that most influence the implementation of biodiversity protection policies at the global, national and local levels. Findings show that despite an overall biodiversity deficit of 0.9 hectares per person, global compliance with the CBD goals is currently at 35%. Compliance is lowest at the local level (14%), somewhat better at the national level (50%), and much better at the international level (64%). Compliance appears higher at the national and international levels because compliance there consists largely of paperwork and policy formulation. If implementation at local levels continues to produce such low compliance, overall conservation outcomes can only get worse than they are at present. There are numerous weaknesses and capacity challenges that countries have yet to address in their plans. To increase local-level compliance, the study recommends a set of robust policies that build local capacity, incentivize local resource owners, and implement biodiversity protection programs aligned with local needs and aspirations.