887 results for "analysis with NMR"


Relevance:

100.00%

Publisher:

Abstract:

Reliability analysis has several important engineering applications. Designers and operators of equipment are often interested in the probability of the equipment operating successfully to a given age - this probability is known as the equipment's reliability at that age. Reliability information is also important to those charged with maintaining an item of equipment, as it enables them to model and evaluate alternative maintenance policies for the equipment. In each case, information on failures and survivals of a typical sample of items is used to estimate the required probabilities as a function of the item's age, this process being one of many applications of the statistical techniques known as distribution fitting. In most engineering applications, the estimation procedure must deal with samples containing survivors (suspensions or censorings); this thesis focuses on several graphical estimation methods that are widely used for analysing such samples. Although these methods have been in use for many years, they share a common shortcoming: none of them is continuously sensitive to changes in the ages of the suspensions, and we show that the resulting reliability estimates are therefore more pessimistic than necessary. We use a simple example to show that the existing graphical methods take no account of any service recorded by suspensions beyond their respective previous failures, and that this behaviour is inconsistent with one's intuitive expectations. In the course of this thesis, we demonstrate that the existing methods are only justified under restricted conditions. We present several improved methods and demonstrate that each of them overcomes the problem described above, while reducing to one of the existing methods where this is justified. Each of the improved methods thus provides a realistic set of reliability estimates for general (unrestricted) censored samples. Several related variations on these improved methods are also presented and justified.
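To make the shortcoming concrete, the following is a minimal sketch of one of the classical graphical methods the thesis critiques - Johnson's rank adjustment followed by Benard's median-rank approximation - applied to a small censored sample. The ages are invented for illustration. Note that the rank increment depends only on how many suspensions precede each failure, not on their exact ages: moving the suspension at 340 to 555 changes nothing, which is precisely the insensitivity the thesis addresses.

```python
# Johnson's rank adjustment with Benard's median ranks for a censored
# sample; ages and the sample itself are illustrative assumptions.
import math

# (age, is_failure) pairs; True = failure, False = suspension (survivor).
sample = [(150, True), (340, False), (560, True), (800, False), (910, True)]

sample.sort(key=lambda item: item[0])
n = len(sample)

prev_order = 0.0
points = []
for position, (age, is_failure) in enumerate(sample, start=1):
    if not is_failure:
        continue  # suspensions shift later ranks but get no plotting point
    # Johnson's rank increment: suspensions before this failure inflate
    # the adjusted order number of the failure.
    increment = (n + 1 - prev_order) / (1 + (n - position + 1))
    prev_order += increment
    median_rank = (prev_order - 0.3) / (n + 0.4)  # Benard's approximation
    # Weibull-paper coordinates: ln(age) vs ln(-ln(1 - F))
    points.append((math.log(age), math.log(-math.log(1 - median_rank))))

for x, y in points:
    print(f"x = {x:.3f}, y = {y:.3f}")
```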

Relevance:

100.00%

Publisher:

Abstract:

Texture analysis and textural cues have been applied to image classification, segmentation and pattern recognition. Dominant texture descriptors include directionality, coarseness and line-likeness. In this dissertation a class of textures known as particulate textures is defined: textures that are predominantly coarse or blob-like. The set of features that characterise particulate textures differs from that of classical textures; these features are micro-texture, macro-texture, size, shape and compaction. Classical texture analysis techniques do not adequately capture particulate texture features. This gap is identified and new methods for analysing particulate textures are proposed. The levels of complexity in particulate textures are also presented, ranging from the simplest images, where blob-like particles are easily isolated from their background, to the more complex images, where the particles and the background are not easily separable or the particles are occluded. Simple particulate images can be analysed for particle shapes and sizes. Complex particulate texture images, on the other hand, often permit only the estimation of particle dimensions. Real-life applications of particulate textures are reviewed, including applications to sedimentology, granulometry and road surface texture analysis. A new framework for computation of particulate shape is proposed, and a granulometric approach for particle size estimation based on edge detection is developed which can be adapted to the gray levels of the images by varying its parameters. This study binds visual texture analysis and road surface macrotexture in a theoretical framework, making it possible to apply monocular imaging techniques to road surface texture analysis. Results from the application of the developed algorithm to road surface macrotexture are compared with results based on Fourier spectra, the autocorrelation function and wavelet decomposition, indicating the superior performance of the proposed technique. The influence of image acquisition conditions such as illumination and camera angle on the results was systematically analysed. Experimental data were collected from over 5 km of road in Brisbane, and the estimated coarseness along the road was compared with laser profilometer measurements. A coefficient of determination (R^2) exceeding 0.9 was obtained when correlating the proposed imaging technique with the state-of-the-art Sensor Measured Texture Depth (SMTD) obtained using laser profilometers.
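As background to the granulometric approach, here is a minimal sketch of a classical morphological granulometry (pattern spectrum), a standard baseline for this kind of particle-size estimation; the dissertation's own edge-detection-based estimator is not reproduced, and the synthetic blob image is purely illustrative.

```python
# Morphological granulometry: open the image with structuring elements of
# increasing size and track how much "volume" each scale removes.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
# Synthetic particulate image: random bright blobs on a dark background.
image = np.zeros((128, 128))
for _ in range(40):
    r, c = rng.integers(10, 118, size=2)
    radius = rng.integers(2, 7)
    rr, cc = np.ogrid[:128, :128]
    image[(rr - r) ** 2 + (cc - c) ** 2 <= radius ** 2] = 1.0

# Particles smaller than the structuring element are removed by the
# opening, so the retained volume drops as the element grows.
sizes = range(1, 16)
volumes = [ndimage.grey_opening(image, size=(s, s)).sum() for s in sizes]

# The negative discrete derivative is the pattern spectrum: mass removed
# at each scale, i.e. an estimate of the particle-size distribution.
spectrum = -np.diff(volumes)
dominant = sizes[int(np.argmax(spectrum)) + 1]
print("estimated dominant particle scale:", dominant)
```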

Relevance:

100.00%

Publisher:

Abstract:

This paper studies the missing covariate problem, which is often encountered in survival analysis. Three covariate imputation methods are employed in the study, and the effectiveness of each method is evaluated within the hazard prediction framework. Data from a typical engineering asset are used in the case study. Covariate values in some time steps are deliberately discarded to generate an incomplete covariate set. It is found that although the mean imputation method is simpler than the others for solving missing covariate problems, the results it produces can differ substantially from the real values of the missing covariates. This study also shows that, in general, results obtained from the regression method are more accurate than those of the mean imputation method, but at the cost of higher computational expense. The Gaussian Mixture Model (GMM) method is found to be the most effective of the three in terms of both computational efficiency and prediction accuracy.
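The three strategies can be sketched as follows on a toy covariate series. The data, variable names and the two-component mixture are illustrative assumptions, and the paper's hazard-prediction evaluation is not reproduced; only the imputation step is shown.

```python
# Mean, regression and GMM imputation of deliberately discarded covariate
# values, compared by RMSE against the known truth.
import numpy as np
from scipy.stats import norm
from sklearn.linear_model import LinearRegression
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
t = np.arange(200.0)
cov_a = 0.05 * t + rng.normal(0, 1, 200)           # fully observed covariate
cov_b = 0.8 * cov_a + rng.normal(0, 0.5, 200)      # covariate with gaps
missing = rng.choice(200, size=40, replace=False)  # deliberately discarded
observed = np.setdiff1d(np.arange(200), missing)
truth = cov_b[missing]

# 1) Mean imputation: replace every gap with the observed mean.
mean_fill = np.full(missing.size, cov_b[observed].mean())

# 2) Regression imputation: predict the gaps from the complete covariate.
reg = LinearRegression().fit(cov_a[observed, None], cov_b[observed])
reg_fill = reg.predict(cov_a[missing, None])

# 3) GMM imputation: fit a joint mixture to the complete pairs, then fill
#    each gap with the conditional mean of cov_b given cov_a.
gmm = GaussianMixture(n_components=2, random_state=0)
gmm.fit(np.column_stack([cov_a[observed], cov_b[observed]]))

def gmm_impute(a):
    # component responsibilities from the marginal density of cov_a
    w = gmm.weights_ * np.array([norm.pdf(a, m[0], np.sqrt(c[0, 0]))
                                 for m, c in zip(gmm.means_, gmm.covariances_)])
    w /= w.sum()
    # per-component conditional mean of cov_b given cov_a = a
    conds = np.array([m[1] + c[1, 0] / c[0, 0] * (a - m[0])
                      for m, c in zip(gmm.means_, gmm.covariances_)])
    return float(w @ conds)

gmm_fill = np.array([gmm_impute(a) for a in cov_a[missing]])

for name, fill in (("mean", mean_fill), ("regression", reg_fill), ("GMM", gmm_fill)):
    print(f"{name:10s} RMSE: {np.sqrt(np.mean((fill - truth) ** 2)):.3f}")
```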

Relevance:

100.00%

Publisher:

Abstract:

In the field of process mining, the use of event logs for the purpose of root cause analysis is increasingly studied. In such an analysis, the availability of attributes/features that may explain the root cause of some phenomenon is crucial. Currently, these attributes are obtained from raw event logs more or less on a case-by-case basis: there is still no generalized, systematic approach that captures this process. This paper proposes a systematic approach to enrich and transform event logs in order to obtain the attributes required for root cause analysis using classical data mining techniques, in particular classification techniques. The approach is formalized and its applicability has been validated using both self-generated and publicly available logs.
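A minimal sketch of the enrich-transform-classify idea described above, assuming an invented event log and a "delayed case" phenomenon; the paper's formalisation is not reproduced.

```python
# Flatten an event log into one row per case with derived attributes,
# then hand the table to an off-the-shelf classifier.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

events = pd.DataFrame({
    "case":     [1, 1, 1, 2, 2, 3, 3, 3, 3],
    "activity": ["register", "check", "pay", "register", "pay",
                 "register", "check", "check", "pay"],
    "resource": ["ann", "bob", "ann", "ann", "ann", "bob", "bob", "bob", "ann"],
    "timestamp": pd.to_datetime([
        "2024-01-01 09:00", "2024-01-01 10:00", "2024-01-02 09:00",
        "2024-01-01 09:30", "2024-01-01 11:00",
        "2024-01-02 09:00", "2024-01-02 12:00", "2024-01-03 09:00",
        "2024-01-04 17:00"]),
})

# Enrichment: derive case-level attributes from the raw events.
cases = events.groupby("case").agg(
    n_events=("activity", "size"),
    n_checks=("activity", lambda a: (a == "check").sum()),
    n_resources=("resource", "nunique"),
    duration_h=("timestamp", lambda t: (t.max() - t.min()).total_seconds() / 3600),
)
cases["delayed"] = cases["duration_h"] > 24  # phenomenon under analysis

# Classification: a decision tree exposes which attributes explain delays.
tree = DecisionTreeClassifier(max_depth=2).fit(
    cases[["n_events", "n_checks", "n_resources"]], cases["delayed"])
print(tree.feature_importances_)
```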

Relevance:

100.00%

Publisher:

Abstract:

This chapter examines why policy decision-makers opt for command and control environmental regulation despite the availability of a plethora of market-based instruments that are more efficient and cost-effective. Interestingly, Sri Lanka has adopted a wholly command and control system, under both its pre- and post-liberalisation economic policies. The chapter first examines the merits and demerits of command and control and market-based approaches, and then looks at Sri Lanka's extensive environmental regulatory framework. It then examines why the country has gone down the path of inflexible regulatory measures and become entrenched in them. Various hypotheses are discussed and empirical evidence is provided. The chapter also discusses the consequences of an environmentally slack economy and the policy implications stemming from adopting a wholly regulatory approach. The chapter concludes with a discussion of the main results.

Relevance:

100.00%

Publisher:

Abstract:

Reliability analysis is crucial to reducing unexpected downtime and severe failures and to meeting ever-tightening maintenance budgets for engineering assets. Hazard-based reliability methods are of particular interest, as hazard reflects the current health status of engineering assets and their imminent failure risks. Most existing hazard models were constructed using statistical methods. However, these methods rest largely on two assumptions: that the baseline failure distribution is accurate for the population concerned, and that the effects of covariates on hazard take an assumed form. These two assumptions may be difficult to satisfy and can therefore compromise the effectiveness of hazard models in application. To address this issue, a non-linear hazard modelling approach is developed in this research using neural networks (NNs), resulting in neural network hazard models (NNHMs), to deal with the limitations imposed by the two assumptions of statistical models.

With the success of failure prevention efforts, less failure history becomes available for reliability analysis. Involving condition data, or covariates, is a natural solution to this challenge. A critical issue in involving covariates in reliability analysis is that complete and consistent covariate data are often unavailable in reality, owing to inconsistent measuring frequencies across multiple covariates, sensor failures, and sparse intrusive measurements. This problem has not been studied adequately in current reliability applications. This research therefore investigates the incomplete covariate problem in reliability analysis. Typical approaches to handling incomplete covariates were studied to investigate their performance and their effects on reliability analysis results. Since these existing approaches can underestimate the variance in regressions and introduce extra uncertainty into reliability analysis, the developed NNHMs are extended to handle incomplete covariates as an integral part of the model. The extended versions of the NNHMs have been validated using simulated bearing data and real data from a liquefied natural gas pump. The results demonstrate that the new approach outperforms the typical approaches to handling incomplete covariates.

Another problem in reliability analysis is that future covariates of engineering assets are generally unavailable. In existing practice for multi-step reliability analysis, historical covariates are used to estimate future covariates. Covariates of engineering assets, however, are often subject to substantial fluctuation under the influence of both engineering degradation and changes in environmental settings, so the commonly used covariate extrapolation methods are unsuitable because of error accumulation and uncertainty propagation. To overcome this difficulty, instead of directly extrapolating covariate values, this research projects covariate states. The estimated covariate states and the unknown covariate values at future running steps of an asset constitute an incomplete covariate set, which is then analysed by the extended NNHMs. A new assessment function is also proposed to evaluate the risks of underestimated and overestimated reliability analysis results. A case study using field data from a paper and pulp mill demonstrates that this new multi-step reliability analysis procedure generates more accurate analysis results.
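The core NNHM idea can be sketched as a discrete-time hazard model: expand each asset's history into one row per operating interval and fit a small network whose predicted event probability is the interval hazard given age and covariates. Everything below - the synthetic degradation data, the network size and the use of scikit-learn's MLPClassifier - is an illustrative assumption, not the thesis's architecture or its incomplete-covariate extension.

```python
# Discrete-time neural hazard model on person-period (asset-interval) data.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(2)
rows, labels = [], []
for _ in range(300):  # simulated assets
    vibration = rng.normal(1.0, 0.2)                 # condition covariate
    for age in range(1, 50):
        vibration += abs(rng.normal(0.02, 0.02))     # degradation drift
        hazard = 1 - np.exp(-0.002 * age * vibration)
        failed = rng.random() < hazard
        rows.append([age, vibration])
        labels.append(failed)
        if failed:
            break  # asset leaves the risk set at failure

net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(np.array(rows), np.array(labels))

# Predicted hazard for a 30-step-old asset at two vibration levels: no
# baseline distribution or covariate-effect form is assumed by the net.
print(net.predict_proba([[30, 1.0], [30, 2.5]])[:, 1])
```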

Relevance:

100.00%

Publisher:

Abstract:

Spectroscopic studies of complex clinical fluids have made more holistic approaches to their chemical analysis increasingly popular and widely employed. The efficient and effective interpretation of multidimensional spectroscopic data relies on many chemometric techniques, one group of which comprises the so-called correlation analysis methods. Typical of these techniques are two-dimensional correlation analysis and statistical total correlation spectroscopy (STOCSY). Whilst the former has largely been applied to optical spectroscopic analysis, STOCSY was developed for, and has been applied almost exclusively to, NMR metabonomic studies. Using a 1H NMR study of human blood plasma from subjects recovering from exhaustive exercise trials, the basic concepts and applications of these techniques are examined. Typical information from their application to NMR-based metabonomics is presented, and their value in aiding the interpretation of NMR data obtained from biological systems is illustrated. Major energy metabolites are identified in the NMR spectra, and the dynamics of their appearance in and removal from plasma during exercise recovery are illustrated and discussed. The complementary nature of two-dimensional correlation analysis and statistical total correlation spectroscopy is highlighted.
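The STOCSY calculation itself is simple to sketch: correlate the intensity at a chosen driver chemical-shift point with every other point across the set of spectra. The synthetic two-peak spectra below are illustrative stand-ins for the processed 1H NMR plasma spectra.

```python
# STOCSY: correlation of a driver variable against all spectral variables.
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_points = 32, 500
concentrations = rng.uniform(0, 1, n_samples)   # hidden metabolite levels

# Two correlated peaks of one metabolite (e.g. a doublet) plus noise.
X = rng.normal(0, 0.05, (n_samples, n_points))
for centre in (120, 310):
    X[:, centre - 3:centre + 3] += concentrations[:, None]

driver = 120  # chemical-shift index of the peak we drive from
r = np.corrcoef(X, rowvar=False)[driver]  # driver vs every spectral point

print("points with |r| > 0.8:", np.where(np.abs(r) > 0.8)[0])
# The second peak near index 310 lights up, revealing the structural
# connectivity that STOCSY exploits for assignment.
```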

Relevance:

100.00%

Publisher:

Abstract:

Objective: To explore the characteristics of the regional distribution of cancer deaths in Shandong Province using principal components analysis. Methods: Principal components analyses with covariance matrices were carried out in SAS on the age-adjusted mortality rates and percentage compositions of 20 types of cancer in 22 counties (cities) for 2004-2006. Results: Over 90% of the total information was captured by the top three principal components, and the first principal component alone represented more than half of the overall regional variance. The first component mainly reflected area differences in esophageal cancer; the second mainly reflected area differences in lung cancer, together with stomach and liver cancer. The first principal component scores showed a clear trend, with western areas scoring higher and eastern areas lower. Based on the top two components, the 22 counties (cities) could be divided into several geographical clusters. Conclusion: The overall regional variation in cancer mortality in Shandong is dominated by several major cancers, including esophageal, lung, stomach and liver cancer; among them, esophageal cancer makes the largest contribution. If the range of counties (cities) analysed could be widened further, the characteristics of the regional distribution of cancer mortality could be examined more thoroughly.
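A minimal sketch of the covariance-matrix PCA used in the study, with random placeholders standing in for the 22 x 20 table of county mortality rates:

```python
# PCA via eigendecomposition of the covariance matrix of a
# counties-by-cancers rate table (placeholder data).
import numpy as np

rng = np.random.default_rng(4)
rates = rng.gamma(2.0, 5.0, size=(22, 20))   # 22 counties x 20 cancer types

cov = np.cov(rates, rowvar=False)            # 20 x 20 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]            # sort components by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
print("variance explained by top 3 components:", explained[:3].sum())

# County scores on the first component: centre, then project.
scores = (rates - rates.mean(axis=0)) @ eigvecs[:, 0]
print("first-component score per county:", np.round(scores, 2))
```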

Relevance:

100.00%

Publisher:

Abstract:

Background: There is currently no early predictive marker of survival for patients receiving chemotherapy for malignant pleural mesothelioma (MPM). Tumour response may be predictive of overall survival (OS), though this has not been explored. We have therefore undertaken a combined analysis of OS, from a 42-day landmark, of 526 patients receiving systemic therapy for MPM. We also validate published progression-free survival rates (PFSRs) and a progression-free survival (PFS) prognostic-index model. Methods: The analyses included nine MPM clinical trials, incorporating six European Organisation for Research and Treatment of Cancer (EORTC) studies. OS from the landmark (day 42 after the start of treatment) was analysed by tumour response. The PFSR analysis used data from six non-EORTC MPM clinical trials. Prognostic-index validation was performed on one non-EORTC dataset with available survival data. Results: Median OS from the landmark was 12.8 months for patients with partial response (PR), 9.4 months for stable disease (SD) and 3.4 months for progressive disease (PD). Both PR and SD were associated with longer OS from the landmark than disease progression (both p < 0.0001). PFSRs for platinum-based combination therapies were consistent with published ranges indicating significant clinical activity. Effective separation between the PFS and OS curves validated the EORTC prognostic model based on histology, stage and performance status. Conclusion: Response to chemotherapy is associated with significantly longer OS from the landmark in patients with MPM.
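A minimal sketch of the landmark approach, assuming invented survival data: patients who die before day 42 are excluded, the clock restarts at the landmark, and Kaplan-Meier estimates are compared across response groups. The hand-rolled estimator below ignores tie-handling subtleties.

```python
# Landmark survival analysis with a simple Kaplan-Meier estimator.
import numpy as np

def kaplan_meier(time, event):
    """Return event times and the KM survival estimate at each."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    surv, out_t, out_s = 1.0, [], []
    n = len(time)
    for i, (t, d) in enumerate(zip(time, event)):
        at_risk = n - i
        if d:
            surv *= 1 - 1 / at_risk
            out_t.append(t)
            out_s.append(surv)
    return np.array(out_t), np.array(out_s)

rng = np.random.default_rng(5)
os_days = rng.exponential(300, 100)             # overall survival from day 0
died = rng.random(100) < 0.8                    # False = censored
response = rng.choice(["PR", "SD", "PD"], 100)  # day-42 tumour response

landmark = 42
alive = os_days > landmark                      # landmark condition
for group in ("PR", "SD", "PD"):
    mask = alive & (response == group)
    t, s = kaplan_meier(os_days[mask] - landmark, died[mask])
    median = t[s <= 0.5][0] if np.any(s <= 0.5) else np.nan
    print(group, "median OS from landmark (days):", round(float(median), 1))
```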

Relevance:

100.00%

Publisher:

Abstract:

The method of generalized estimating equations (GEE) is a popular tool for analysing longitudinal (panel) data. Often, the covariates collected are time-dependent in nature, for example, age, relapse status, monthly income. When using GEE to analyse longitudinal data with time-dependent covariates, crucial assumptions about the covariates are necessary for valid inferences to be drawn. When those assumptions do not hold or cannot be verified, Pepe and Anderson (1994, Communications in Statistics - Simulation and Computation 23, 939-951) advocated using an independence working correlation assumption in the GEE model as a robust approach. However, using GEE with the independence correlation assumption may lead to significant efficiency loss (Fitzmaurice, 1995, Biometrics 51, 309-317). In this article, we propose a method that extracts additional information from the estimating equations that are excluded by the independence assumption. The method always includes the estimating equations under the independence assumption, and the contribution from the remaining estimating equations is weighted according to the likelihood of each equation being a consistent estimating equation and the information it carries. We apply the method to a longitudinal study of the health of a group of Filipino children.
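Fitting a GEE under the independence working correlation is straightforward in practice; a minimal sketch using statsmodels and synthetic longitudinal data follows. The article's weighting of the additional estimating equations is not implemented here, and the variable names are invented.

```python
# GEE with an independence working correlation for a binary outcome
# measured repeatedly on each child, with a time-dependent covariate.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(6)
n_children, n_visits = 50, 4
df = pd.DataFrame({
    "child": np.repeat(np.arange(n_children), n_visits),
    "visit": np.tile(np.arange(n_visits), n_children),
})
df["weight_z"] = rng.normal(0, 1, len(df)) + 0.1 * df["visit"]  # time-dependent
df["ill"] = (rng.random(len(df)) < 1 / (1 + np.exp(0.5 * df["weight_z"]))).astype(int)

model = sm.GEE(
    df["ill"],
    sm.add_constant(df[["weight_z", "visit"]]),
    groups=df["child"],
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Independence(),  # robust working assumption
)
print(model.fit().summary().tables[1])
```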

Relevance:

100.00%

Publisher:

Abstract:

This article measures the productivity of Japanese prefectures from 1991 to 2002, taking CO2 emissions into consideration, and examines the factors that affect productivity. We use data envelopment analysis and measure the Luenberger productivity indicator, incorporating CO2 emissions into the analysis. Our results show that productivity decreased over the period of investigation. According to the results of generalized method of moments estimation, the operating rate, the share of energy-intensive industries and social capital significantly affect productivity.
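A heavily simplified sketch of the directional distance function that underlies Luenberger-type productivity measurement with an undesirable output: expand the good output and contract CO2 simultaneously, here for a single period under constant returns to scale with made-up data. The full Luenberger indicator differences such scores across periods and technologies, and the weak-disposability treatment here is only one common convention.

```python
# Directional distance function via linear programming: for each unit,
# find the largest beta such that output can grow by beta*y while CO2
# shrinks by beta*b within the enveloped technology.
import numpy as np
from scipy.optimize import linprog

# columns: input index, good output (GDP), bad output (CO2); made-up data
x = np.array([5.0, 6.0, 8.0, 4.0])
y = np.array([10.0, 14.0, 15.0, 7.0])
b = np.array([3.0, 4.0, 7.0, 2.0])
n = len(x)

def ddf(k):
    """Directional distance for unit k, direction (0, y_k, -b_k)."""
    c = np.zeros(n + 1)
    c[-1] = -1.0                                 # maximise beta
    A_ub = [np.append(x, 0.0),                   # sum(lam*x) <= x_k
            np.append(-y, y[k])]                 # sum(lam*y) >= (1+beta)*y_k
    b_ub = [x[k], -y[k]]
    A_eq = [np.append(b, b[k])]                  # sum(lam*b) == (1-beta)*b_k
    b_eq = [b[k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * n + [(None, None)])
    return res.x[-1]

for k in range(n):
    print(f"unit {k}: inefficiency beta = {ddf(k):.3f}")
```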

Relevance:

100.00%

Publisher:

Abstract:

Engineers must have a deep and accurate conceptual understanding of their field, and concept inventories (CIs) are one method of assessing conceptual understanding and providing formative feedback. Current CI tests use multiple choice questions (MCQs) to identify misconceptions and have undergone reliability and validity testing. However, they do not readily provide diagnostic information about students' reasoning and therefore do not effectively point to specific actions that can be taken to improve student learning. We piloted the textual component of our diagnostic CI on electrical engineering students using items from the signals and systems CI. We then analysed the textual responses using automated lexical analysis software, to test the effectiveness of this type of software, and interviewed the students about their experience with the textual component. Results from the automated text analysis revealed that students held both incorrect and correct ideas in certain conceptual areas, and provided indications of student misconceptions. User feedback also revealed that the textual component helps students assess and reflect on their own understanding.
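One simple form of automated lexical analysis can be sketched as scoring each textual response against reference statements of the correct conception and of a known misconception. The sentences, categories and TF-IDF/cosine approach below are illustrative assumptions, not the software or items used in the study.

```python
# Lexical scoring of free-text responses against conception references.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

references = [
    "convolution in time corresponds to multiplication in frequency",  # correct
    "convolution just multiplies the two signals point by point",      # misconception
]
responses = [
    "convolving the signals is the same as multiplying their spectra",
    "you multiply the signals together sample by sample",
    "the output is the input scaled by the system gain",
]

vec = TfidfVectorizer().fit(references + responses)
sims = cosine_similarity(vec.transform(responses), vec.transform(references))

for text, (s_correct, s_misc) in zip(responses, sims):
    label = "correct-like" if s_correct > s_misc else "misconception-like"
    print(f"{label:20s} {text}")
```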