930 results for Analysis tools
Abstract:
A thesis submitted in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Information Systems.
Abstract:
Presented at the 23rd International Conference on Real-Time Networks and Systems (RTNS 2015), Main Track, 4-6 November 2015, Lille, France.
Abstract:
Structural analysis involves defining the model and selecting the analysis type. The model should represent the stiffness, mass and loads of the structure. Structures can be represented using simplified models, such as lumped mass models, or advanced models based on the Finite Element Method (FEM) and the Discrete Element Method (DEM). Depending on the characteristics of the structure, different types of analysis can be used, such as limit analysis, linear and non-linear static analysis, and linear and non-linear dynamic analysis. Unreinforced masonry structures present low tensile strength, and linear analyses therefore seem inadequate for assessing their structural behaviour. On the other hand, static and dynamic non-linear analyses are complex, since they involve large computational time and demand advanced knowledge from the practitioner: non-linear analysis requires expertise in material properties, analysis tools and the interpretation of results. Limit analysis with macro-blocks can be regarded as a more practical method for estimating the maximum load capacity of a structure. Furthermore, limit analysis requires a reduced number of parameters, which is an advantage for the assessment of ancient and historical masonry structures, given the difficulty of obtaining reliable data.
Abstract:
This study utilised recent developments in forensic aromatic hydrocarbon fingerprint analysis to characterise and identify specific biogenic, pyrogenic and petrogenic contamination. The fingerprinting and data interpretation techniques discussed include the recognition of:
• the distribution patterns of hydrocarbons (alkylated naphthalene, phenanthrene, dibenzothiophene, fluorene, chrysene and phenol isomers),
• "source-specific marker" compounds (individual saturated hydrocarbons, including n-alkanes, n-C5 through n-C40),
• selected benzene, toluene, ethylbenzene and xylene isomers (BTEX),
• the recalcitrant isoprenoids pristane and phytane, and
• diagnostic ratios of specific petroleum/non-petroleum constituents, together with the application of various statistical and numerical analysis tools.
An unknown sample from the Irish Environmental Protection Agency (EPA) submitted for origin characterisation was analysed by gas chromatography using both flame ionisation and mass spectral detection, in comparison with known reference materials. The percentages of the individual polycyclic aromatic hydrocarbons (PAHs) and biomarker concentrations in the unknown sample were normalised to the sum of the analytes, and the results were compared with those for a range of reference materials. In addition to the determination of conventional diagnostic PAH and biomarker ratios, a number of "source-specific marker" isomeric PAHs within the same alkylation levels were determined, and their relative abundance ratios were computed in order to definitively identify and differentiate the various sources. Statistical logarithmic star plots were generated from both sets of data to give a pictorial representation of the comparison between the unknown sample and the reference products.
The study successfully characterised the unknown sample as contaminated with coal tar and clearly demonstrates the future role of compound ratio analysis (CORAT) in identifying possible source contaminants.
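The normalisation and diagnostic-ratio steps described above can be sketched briefly. This is an illustrative outline only: the compound names and concentration values below are invented, not data from the study, and the Fl/Py interpretation is a commonly cited rule of thumb rather than the study's own criterion.

```python
# Sketch: normalise analyte concentrations to their sum and compute a
# diagnostic ratio of two PAHs. All names and numbers are illustrative.

def normalise(concs):
    """Express each analyte as a percentage of the summed analytes."""
    total = sum(concs.values())
    return {name: 100.0 * c / total for name, c in concs.items()}

def diagnostic_ratio(concs, a, b):
    """Relative-abundance ratio of two marker compounds."""
    return concs[a] / concs[b]

sample = {"phenanthrene": 120.0, "anthracene": 40.0,
          "fluoranthene": 90.0, "pyrene": 50.0}
pct = normalise(sample)                                  # sums to 100%
ratio = diagnostic_ratio(sample, "fluoranthene", "pyrene")
# A fluoranthene/pyrene ratio above 1 is often read as pyrogenic
# (e.g. coal tar) rather than petrogenic input.
```

Comparing such ratios between the unknown sample and each reference material is what the star plots then visualise.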
Abstract:
Introduction: Coordination is a strategy chosen by the central nervous system to control movement and maintain stability during gait. Coordinated multi-joint movements require a complex interaction between nervous outputs, biomechanical constraints, and proprioception. Quantitatively understanding and modeling gait coordination remains a challenge, and surgeons lack a way to model and appreciate the coordination of patients before and after lower-limb surgery. Patients alter their gait patterns and kinematic synergies when they walk faster or slower than normal speed, so as to maintain stability and minimize the energy cost of locomotion. The goal of this study was to provide a dynamical-systems approach to quantitatively describe human gait coordination and to apply it to patients before and after total knee arthroplasty. Methods: A new method for quantitative analysis of interjoint coordination during gait was designed, providing a general model that captures the whole dynamics and reveals the kinematic synergies at various walking speeds. The proposed model imposes a relationship among the lower-limb joint angles (hips and knees) to parameterize the dynamics of locomotion of each individual. An integration of analysis tools such as harmonic analysis, principal component analysis, and artificial neural networks helped overcome the high dimensionality, temporal dependence, and non-linear relationships of the gait patterns. Ten patients were studied using an ambulatory gait device (Physilog®). Each participant was asked to perform two 30 m walking trials at three different speeds and to complete an EQ-5D questionnaire, the WOMAC, and the Knee Society Score. Lower-limb rotations were measured by four miniature angular rate sensors mounted on each shank and thigh.
The outcomes of the eight patients undergoing total knee arthroplasty, recorded pre-operatively and post-operatively at 6 weeks, 3 months, 6 months, and 1 year, were compared with those of two age-matched healthy subjects. Results: The new method provided coordination scores at various walking speeds, ranging from 0 to 10. It determined the overall coordination of the lower limbs as well as the contribution of each joint to the total coordination. The differences between pre-operative and post-operative coordination values correlated with improvements in the subjective outcome scores. Although the study group was small, the results showed a new way to objectively quantify the gait coordination of patients undergoing total knee arthroplasty using only portable body-fixed sensors. Conclusion: A new method for objective gait coordination analysis has been developed, with very encouraging results regarding the objective outcome of lower-limb surgery.
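One way the principal-component step above can yield a bounded coordination score is to take the share of joint-angle variance captured by the first principal component. This is a simplified illustration under that assumption, not the authors' exact model; the synthetic hip/knee trajectories stand in for the sensor data.

```python
import numpy as np

# Illustrative sketch: a coordination index from PCA on joint angles.
# Highly coordinated gait -> joint angles covary -> one dominant component.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)                 # one gait cycle
hip_l = np.sin(t)
knee_l = 0.8 * np.sin(t) + 0.05 * rng.standard_normal(200)
hip_r = np.sin(t + np.pi)                          # contralateral, antiphase
knee_r = 0.8 * np.sin(t + np.pi) + 0.05 * rng.standard_normal(200)

X = np.column_stack([hip_l, knee_l, hip_r, knee_r])
X = X - X.mean(axis=0)                             # centre each joint angle
cov = np.cov(X, rowvar=False)
eigvals = np.linalg.eigvalsh(cov)[::-1]            # descending eigenvalues
coordination = eigvals[0] / eigvals.sum()          # variance share of PC1
score = 10.0 * coordination                        # map to a 0-10 scale
```

Less coordinated gait spreads variance over more components, lowering the score.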
Abstract:
This paper describes methods to analyze the brain's electric fields recorded with multichannel electroencephalography (EEG) and demonstrates their implementation in the software CARTOOL. It focuses on the analysis of the spatial properties of these fields and on the quantitative assessment of changes in field topography across time, experimental conditions, or populations. Topographic analyses are advantageous because they are reference-independent and thus yield statistically unambiguous results. Neurophysiologically, differences in topography directly indicate changes in the configuration of the active neuronal sources in the brain. We describe global measures of field strength and field similarity, temporal segmentation based on topographic variations, topographic analysis in the frequency domain, topographic statistical analysis, and source imaging based on distributed inverse solutions. All analysis methods are implemented in CARTOOL, a freely available academic software package. Besides providing these analysis tools, CARTOOL is specifically designed to visualize the data and the analysis results using three-dimensional display routines that allow rapid manipulation and animation of 3D images. CARTOOL is therefore a helpful tool for researchers as well as clinicians who wish to interpret multichannel EEG and evoked potentials in a global, comprehensive, and unambiguous way.
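The global measures of field strength and field similarity mentioned above are commonly formulated as Global Field Power (the spatial standard deviation of the average-referenced map) and Global Map Dissimilarity (the GFP of the difference between two strength-normalised maps). The sketch below uses those standard formulations; the four-electrode maps are toy data, not EEG from the paper.

```python
import numpy as np

def gfp(v):
    """Global Field Power: spatial std. dev. of the average-referenced map."""
    v = v - v.mean()                  # remove the reference (average ref.)
    return np.sqrt((v ** 2).mean())

def gmd(u, v):
    """Global Map Dissimilarity between two strength-normalised maps."""
    u = u - u.mean()
    v = v - v.mean()
    u = u / gfp(u)                    # normalise away field strength
    v = v / gfp(v)
    return gfp(u - v)                 # 0 = identical topography

# Two time points over 4 electrodes: same topography, doubled strength.
maps = np.array([[1.0, -1.0, 0.5, -0.5],
                 [2.0, -2.0, 1.0, -1.0]])
```

Because both measures subtract the spatial mean first, they are independent of the recording reference, which is why the topographic statistics built on them are unambiguous.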
Abstract:
Although many larger Iowa cities have staff traffic engineers who have a dedicated interest in safety, smaller jurisdictions do not. Rural agencies and small communities must rely on consultants, if available, or local staff to identify locations with a high number of crashes and to devise mitigating measures. However, smaller agencies in Iowa have other available options to receive assistance in obtaining and interpreting crash data. These options are addressed in this manual. Many proposed road improvements or alternatives can be evaluated using methods that do not require in-depth engineering analysis. The Iowa Department of Transportation (DOT) supported developing this manual to provide a tool that assists communities and rural agencies in identifying and analyzing local roadway-related traffic safety concerns. In the past, a limited number of traffic safety professionals had access to adequate tools and training to evaluate potential safety problems quickly and efficiently and select possible solutions. Present-day programs and information are much more conducive to the widespread dissemination of crash data, mapping, data comparison, and alternative selections and comparisons. Information is available and in formats that do not require specialized training to understand and use. This manual describes several methods for reviewing crash data at a given location, identifying possible contributing causes, selecting countermeasures, and conducting economic analyses for the proposed mitigation. The Federal Highway Administration (FHWA) has also developed other analysis tools, which are described in the manual. This manual can also serve as a reference for traffic engineers and other analysts.
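The economic analysis of a proposed mitigation mentioned above typically reduces to a benefit-cost calculation. The sketch below shows that shape; the crash cost, crash reduction factor, discount rate, and project cost are placeholder figures, not values from the manual.

```python
# Illustrative benefit-cost analysis of a safety countermeasure.
# All numbers are invented placeholders.

def present_worth(annual_amount, rate, years):
    """Present worth of a uniform annual series (the P/A factor)."""
    return annual_amount * (1 - (1 + rate) ** -years) / rate

annual_crash_cost = 120_000.0   # estimated crash cost at the site per year
crash_reduction = 0.30          # assumed 30% reduction from countermeasure
annual_benefit = annual_crash_cost * crash_reduction

benefit = present_worth(annual_benefit, rate=0.04, years=10)
cost = 150_000.0                # assumed installation cost
bc_ratio = benefit / cost       # > 1.0 suggests the project pays for itself
```

Comparing `bc_ratio` across candidate countermeasures is a simple way for a small agency to rank alternatives without in-depth engineering analysis.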
Abstract:
The aim of the study is to analyse the financial performance of 74 pulp and paper companies using key ratios describing profitability, liquidity, solvency and value creation. The theoretical part of the study presents the tools of business analysis, after which the financial ratios are introduced. The empirical part reviews the 2005 ratios at company level. To examine changes in the ratios over the long term, the companies are grouped by geographical location and business orientation. The study is descriptive. The ratios reveal the structural change currently under way in the pulp and paper industry. South American companies, which benefit from a new and cost-efficient raw material base, have moved closer to value creation, while most North American companies, once the industry's leading value creators, are now value destroyers. The industry also suffers from low profitability, which affects North American companies the most. At the same time, South American companies have improved their profitability, which further underlines the ongoing transformation.
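The distinction between value creators and value destroyers can be made concrete with Economic Value Added, one common gauge of value creation: a firm creates value only when its operating profit exceeds the charge for the capital it employs. The figures below are invented for illustration and are not from the study.

```python
# Illustrative sketch of Economic Value Added (EVA); all figures invented.

def eva(nopat, invested_capital, wacc):
    """EVA = NOPAT minus the capital charge; positive => value creation."""
    return nopat - wacc * invested_capital

# Two hypothetical firms with the same capital base and cost of capital:
value_creator = eva(nopat=180.0, invested_capital=1500.0, wacc=0.09)
value_destroyer = eva(nopat=90.0, invested_capital=1500.0, wacc=0.09)
```

On this measure a firm can be profitable in accounting terms and still destroy value, which is the pattern the abstract describes for much of the North American industry.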
Abstract:
Raw measurement data does not always immediately convey useful information, but applying statistical analysis tools to the data can improve the situation. Data analysis offers benefits such as gaining meaningful insight from the dataset, basing critical decisions on the findings, and ruling out human bias through proper statistical treatment. In this thesis we analyze data from an industrial mineral processing plant with the aim of studying the possibility of forecasting the quality of the final product, given by one variable, with a model based on the other variables. For the study, tools such as Qlucore Omics Explorer (QOE) and sparse Bayesian regression (SB) are used. Linear regression is then used to build a model based on the subset of variables that have the most significant weights in the SB model. The results obtained from QOE show that the variable representing the desired final product does not correlate with the other variables. For SB and linear regression, the results show that both models built on 1-day averaged data seriously underestimate the variance of the true data, whereas the two models built on 1-month averaged data are reliable and explain a larger proportion of the variability in the available data, making them suitable for prediction purposes. However, no single model fits the whole available dataset well; for future work it is therefore proposed either to build piecewise non-linear regression models on the same dataset, or for the plant to provide another dataset, collected in a more systematic fashion than the present data, for further analysis.
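The final modelling step described above, ordinary least squares on a subset of selected variables, can be sketched as follows. The synthetic data stands in for the plant's averaged process variables; the true coefficients and noise level are assumptions for the demonstration.

```python
import numpy as np

# Sketch: fit a linear model on selected predictors and check fit quality.
rng = np.random.default_rng(42)
n = 30                                         # e.g. 30 monthly averages
X = rng.standard_normal((n, 3))                # three selected process variables
true_w = np.array([2.0, -1.0, 0.5])            # assumed "true" weights
y = X @ true_w + 0.1 * rng.standard_normal(n)  # product-quality variable

A = np.column_stack([np.ones(n), X])           # prepend an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares solution
pred = A @ coef
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

Comparing the variance of `pred` against the variance of `y` is exactly the check that exposed the underestimation problem on the 1-day averaged data.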
Abstract:
The purpose of this study was to examine the disability discourses present in the Ontario elementary school curriculum. The study used a critical social analysis perspective to apply a textual discourse analysis to the Planning [title of subject] Programs for Students with Special Education Needs (PPSSEN) section of the curriculum. Using Parker's (1992) seven criteria for distinguishing discourses, five main discourses were identified: independent, dependent, legal, scientific, and agency discourses. The second step of this research was to situate and discuss these five discourses in relation to three diverse texts: Paulo Freire's (2008) Pedagogy of the Oppressed, Scheper-Hughes and Lovell's (1987) Psychiatry Inside Out: Selected Writings of Franco Basaglia, and Aronowitz and Giroux's (1985) Education Under Siege: The Conservative, Liberal and Radical Debate over Schooling. These perspectives served as analysis tools to further examine the dominant disability discourses. The texts provided support in three major areas: dialectics, critical education and structural conditions of power, and the language of traditional roles and responsibilities. The findings and discussion presented in this project carry significant implications for anyone involved with students with disabilities in any education system.
Abstract:
This research on access barriers for the poor living with chronic diseases in India has three objectives: 1) to assess whether the goals, objectives, instruments and target population, as formulated in India's current national health policies, address the main access barriers for the poor with chronic diseases; 2) to assess the types of levers and instruments identified by India's national health policies to remove these access barriers; and 3) to assess whether these policies have improved over time with respect to the supply of chronic-disease care for the population, and more specifically for the poor. Using Ritchie and Spencer's (1993) Framework Approach, a qualitative content analysis of Indian national health policies was carried out. First, a conceptual framework of access barriers to chronic-disease care for the poor in India was built from a review of the scientific literature. The policies were then sampled in India in 2009. A thematic framework and an index were generated to build the analysis tools and code the content. Finally, the analyses were performed using this index together with charts, maps, a question grid and case studies. The analysis compared the access barriers originally identified in the thematic framework with those identified through the content analysis of each policy. This research shows that Indian national health policies tackle a number of access barriers for the poor, notably by improving health services in the public sector, improving the population's knowledge, and scaling up certain chronic-disease interventions.
On the other hand, among the access barriers identified in the thematic framework, those related to the cost of chronic-disease treatment, the unaffordability of primary health care for many individuals, and people's ability to pay received the least attention. Moreover, when the timing of each policy's formulation is considered, efforts to scale up interventions and the supply of care for physical chronic diseases appear to be more recent. In addition, the poor are not targeted by chronic-disease actions. The risk of marginalising them further is substantial, given the economic, demographic and epidemiological transition currently transforming the country and the demand for health services.
Abstract:
This thesis covers various aspects of the modelling and analysis of finite-mean time series with symmetric stable distributed innovations. Time series analysis based on Box-Jenkins methods is the most popular approach, in which the models are linear and the errors Gaussian. We highlight the limitations of classical time series analysis tools, explore some generalised tools, and organise the approach in parallel with the classical setup. The thesis mainly studies estimation and prediction in the signal-plus-noise model, where both the signal and the noise are assumed to follow models with symmetric stable innovations. The thesis begins with motivating examples and application areas of alpha-stable time series models. Classical time series analysis and the corresponding theory based on finite-variance models are discussed extensively in the second chapter, which also surveys existing theory and methods for infinite-variance models. The third chapter presents a linear filtering method for computing the filter weights assigned to the observations when estimating an unobserved signal in a general noisy environment; both the signal and the noise are treated as stationary processes with infinite-variance innovations. Semi-infinite, doubly infinite and asymmetric signal extraction filters are derived based on a minimum dispersion criterion. Finite-length filters based on Kalman-Levy filters are developed and the pattern of the filter weights identified. Simulation studies show that the proposed methods are competent in signal extraction for processes with infinite variance. Parameter estimation of autoregressive signals observed in a symmetric stable noise environment is discussed in the fourth chapter, using higher-order Yule-Walker type estimation based on the autocovariation function; the methods are illustrated by simulation and by an application to sea surface temperature data.
We increase the number of Yule-Walker equations and propose an ordinary least squares estimate of the autoregressive parameters. The singularity problem of the autocovariation matrix is addressed, and a modified version of the generalised Yule-Walker method is derived using singular value decomposition. The fifth chapter introduces the partial covariation function as a tool for stable time series analysis where the covariance or partial covariance is ill defined. Asymptotic results for the partial autocovariation are studied, and its application to model identification for stable autoregressive models is discussed. We generalise the Durbin-Levinson algorithm to infinite-variance models in terms of the partial autocovariation function and introduce a new information criterion for consistent order estimation of stable autoregressive models. Chapter six explores the application of these techniques in signal processing: frequency estimation for sinusoidal signals observed in a symmetric stable noisy environment. A parametric spectrum analysis and a frequency estimate based on the power transfer function are introduced, the power transfer function being estimated via the modified generalised Yule-Walker approach. Another important problem in statistical signal processing is identifying the number of sinusoidal components in an observed signal; a modified version of the proposed information criterion is used for this purpose.
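The classical Yule-Walker idea behind the estimators above can be sketched for the finite-variance case. Note the caveat: for infinite-variance stable innovations the autocovariance is undefined, which is precisely why the thesis replaces it with the autocovariation function; the sketch below shows only the classical analogue, on a simulated AR(1) series.

```python
import numpy as np

# Classical Yule-Walker sketch for an AR(1) process (finite-variance case):
# x_t = phi * x_{t-1} + e_t, with phi estimated as acov(1) / acov(0).
rng = np.random.default_rng(1)
phi = 0.7
n = 5000
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

def acov(series, lag):
    """Biased sample autocovariance at the given lag."""
    c = series - series.mean()
    return (c[lag:] * c[:len(c) - lag]).mean()

phi_hat = acov(x, 1) / acov(x, 0)   # Yule-Walker estimate of phi
```

Adding equations at higher lags and solving them by least squares, as the thesis does with autocovariations in place of autocovariances, follows the same pattern with an overdetermined system.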
Abstract:
Successful classification, information retrieval and image analysis tools are intimately related to the quality of the features employed in the process. Pixel intensities, color, texture and shape are generally the basis from which most features are computed in these fields. This paper presents a novel shape-based feature extraction approach in which an image is decomposed into multiple contours, which are then characterized by Fourier descriptors. Unlike traditional approaches, we make use of topological knowledge to generate well-defined closed contours, which serve as efficient signatures for image retrieval. The method has been evaluated in the context of CBIR and image analysis. The results show that the multi-contour decomposition, as opposed to single-contour shape information, introduces a significant improvement in discrimination power. (c) 2008 Elsevier B.V. All rights reserved.
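The Fourier-descriptor step can be sketched in its standard form: a closed contour is treated as a sequence of complex numbers, transformed with the FFT, and normalised so that the signature is invariant to translation (drop the DC term) and scale (divide by the first harmonic). This is the textbook construction, offered as a plausible reading of the paper's signature, not its exact formulation.

```python
import numpy as np

def fourier_descriptors(contour, k=8):
    """Translation/scale/rotation-invariant signature of a closed contour."""
    z = contour[:, 0] + 1j * contour[:, 1]   # points as complex numbers
    F = np.fft.fft(z)
    F[0] = 0.0                               # drop DC: translation invariance
    F = F / np.abs(F[1])                     # scale invariance
    return np.abs(F[1:k + 1])                # magnitudes: rotation invariance

theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
moved = 3.0 * circle + np.array([5.0, -2.0])  # scaled and shifted copy

d1 = fourier_descriptors(circle)
d2 = fourier_descriptors(moved)               # same shape, same signature
```

With one such signature per extracted contour, an image is represented by a small set of vectors that can be matched for retrieval.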
Abstract:
Recent advances in CMOS technology have allowed the fabrication of transistors with submicron dimensions, making possible the integration of tens of millions of devices in a single chip that can be used to build very complex electronic systems. This increase in design complexity has created a need for more efficient verification tools that incorporate more appropriate physical and computational models. Timing verification aims to determine whether the timing constraints imposed on the design can be satisfied. It can be performed by circuit simulation or by timing analysis. Although simulation tends to furnish the most accurate estimates, it has the drawback of being stimuli-dependent: to ensure that the critical situation is taken into account, one would have to exercise all possible input patterns, which is infeasible given the complexity of current designs. To circumvent this problem, designers rely on timing analysis. Timing analysis is an input-independent verification approach that models each combinational block of a circuit as a directed acyclic graph, which is used to estimate the critical delay. The first timing analysis tools used only circuit topology information to estimate circuit delay, and are thus referred to as topological timing analyzers. However, this method may yield overly pessimistic delay estimates, since the longest paths in the graph may not be able to propagate a transition, that is, they may be false. Functional timing analysis, in turn, considers not only circuit topology but also the temporal and functional relations between circuit elements.
Functional timing analysis tools may differ in three respects: the set of sensitization conditions required to declare a path sensitizable (the so-called path sensitization criterion), the number of paths handled simultaneously, and the method used to determine whether the sensitization conditions are satisfiable. Currently, the two most efficient approaches test the sensitizability of entire sets of paths at a time: one is based on automatic test pattern generation (ATPG) techniques, and the other translates the timing analysis problem into a satisfiability (SAT) problem. Although timing analysis has been studied exhaustively over the last fifteen years, some specific topics have not yet received the required attention. One such topic is the applicability of functional timing analysis to circuits containing complex gates, which is the basic concern of this thesis. In addition, as a necessary step to set the scene, a detailed and systematic study of functional timing analysis is also presented.
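The topological timing analysis baseline described above amounts to a longest-path computation over the DAG. A minimal sketch follows; the gate names and unit delays are invented, and, as the abstract notes, the result is pessimistic because the topologically longest path may be false (unsensitizable).

```python
from collections import defaultdict

def critical_delay(edges, delays, inputs, outputs):
    """Longest-path arrival time over a DAG, processed in topological order."""
    graph = defaultdict(list)
    indeg = defaultdict(int)
    for u, v in edges:
        graph[u].append(v)
        indeg[v] += 1
    arrival = {n: delays[n] for n in inputs}   # inputs arrive at their delay
    ready = [n for n in delays if indeg[n] == 0]
    while ready:
        u = ready.pop()
        for v in graph[u]:                     # relax each fan-out edge
            arrival[v] = max(arrival.get(v, 0), arrival[u] + delays[v])
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    return max(arrival[o] for o in outputs)

# Toy netlist: two primary inputs feeding three gates (delays invented).
delays = {"a": 0, "b": 0, "g1": 2, "g2": 3, "g3": 1}
edges = [("a", "g1"), ("b", "g1"), ("b", "g2"), ("g1", "g3"), ("g2", "g3")]
worst = critical_delay(edges, delays, inputs=["a", "b"], outputs=["g3"])
```

Functional timing analysis refines this estimate by discarding paths whose sensitization conditions are unsatisfiable.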
Abstract:
Static analysis tools report software defects that may or may not be detected by other verification methods. Two challenges complicating the adoption of these tools are spurious false positive warnings and legitimate warnings that are not acted on. This paper reports automated support to help address these challenges using logistic regression models that predict the foregoing types of warnings from signals in the warnings and implicated code. Because examining many potential signaling factors in large software development settings can be expensive, we use a screening methodology to quickly discard factors with low predictive power and cost-effectively build predictive models. Our empirical evaluation indicates that these models can achieve high accuracy in predicting accurate and actionable static analysis warnings, and suggests that the models are competitive with alternative models built without screening.
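The shape of the logistic regression models described above can be sketched as follows. The two signals used here (warning priority and file churn) and the toy labels are invented stand-ins for the paper's screened factors, and the fit uses plain stochastic gradient descent rather than the paper's methodology.

```python
import math

# Sketch: logistic regression scoring warnings as actionable (1) or not (0).
# Feature signals and data are illustrative placeholders.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit(X, y, lr=0.5, epochs=2000):
    """Fit weights (intercept first) by stochastic gradient descent."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = yi - p                       # gradient of the log-likelihood
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

# Signals per warning: [priority flag, recent file churn]; label 1 = acted on.
X = [[1, 0.9], [1, 0.8], [0, 0.7], [0, 0.1], [0, 0.2], [1, 0.1]]
y = [1, 1, 1, 0, 0, 0]
w = fit(X, y)

def predict(x):
    """Probability that a warning with signals x is actionable."""
    return sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], x)))
```

Screening would discard candidate signals whose coefficients add little predictive power before fitting the final model on the survivors.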