928 results for automation of fit analysis


Relevance:

100.00%

Publisher:

Abstract:

Temperature, pressure, gas stoichiometry, and residence time were varied to control the yield and product distribution of the palladium-catalyzed aminocarbonylation of aromatic bromides in both a silicon microreactor and a packed-bed tubular reactor. Automation of the system set points and product sampling enabled facile and repeatable reaction analysis with minimal operator supervision. The reaction was observed to fall into two temperature regimes. An automated system was used to screen steady-state conditions for offline analysis by gas chromatography in order to fit a reaction rate model. Additionally, a transient temperature-ramp method with online infrared analysis allowed more rapid determination of the activation energy of the lower temperature regime. The entire reaction, spanning both regimes, was modeled in good agreement with the experimental data.
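
As a rough illustration of the kind of rate analysis described above, the sketch below fits the Arrhenius relation to steady-state rate constants by linear regression of ln k against 1/T. The temperatures, rate constants and first-order form are hypothetical placeholders, not values or the model from the paper.

```python
# Hedged sketch: estimating an activation energy from steady-state rate data
# via the Arrhenius relation k = A * exp(-Ea / (R * T)).
# All numbers below are illustrative, not taken from the paper.
import numpy as np

R = 8.314  # J mol^-1 K^-1

T = np.array([353.0, 373.0, 393.0, 413.0])   # K, hypothetical reactor set points
k = np.array([0.012, 0.031, 0.074, 0.160])   # 1/s, hypothetical apparent rate constants

# Linearized fit: ln k = ln A - (Ea / R) * (1 / T)
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R          # J/mol
A = np.exp(intercept)    # 1/s
print(f"Ea = {Ea / 1000:.1f} kJ/mol, A = {A:.3g} 1/s")
```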

Relevance:

100.00%

Publisher:

Abstract:

Purpose: To assess the validity and repeatability of objective compared to subjective contact lens fit analysis. Methods: Thirty-five subjects (aged 22.0 ± 3.0 years) wore two different soft contact lens designs. Four lens-fit variables (centration, horizontal lag, post-blink movement in up-gaze and push-up recovery speed) were assessed subjectively (four observers) and objectively from captured slit-lamp biomicroscopy images and video. The analysis was repeated a week later. Results: The average of the four experienced observers was compared to the objective measures, although centration, movement on blink, lag and push-up recovery speed all varied significantly between observers (p < 0.001). Horizontal lens centration was on average close to central as assessed both objectively and subjectively (p > 0.05). The 95% confidence interval of subjective repeatability was better than that of objective assessment (±0.128 mm versus ±0.168 mm, p = 0.417), but utilised only 78% of the objective range. Vertical centration assessed objectively showed a slight inferior decentration (0.371 ± 0.381 mm) with good inter- and intrasession repeatability (p > 0.05). Movement on blink was estimated lower subjectively than measured objectively (0.269 ± 0.179 mm versus 0.352 ± 0.355 mm; p = 0.035) and had better repeatability (±0.124 mm versus ±0.314 mm 95% confidence interval), although it covered only 47% of the objective range. Horizontal lag was estimated lower subjectively (0.562 ± 0.259 mm) than measured objectively (0.708 ± 0.374 mm, p < 0.001), had poorer repeatability (±0.132 mm versus ±0.089 mm 95% confidence interval) and covered a smaller range (63%). Subjective categorisation of push-up recovery speed showed reasonable differentiation relative to objective measurement (p < 0.001). Conclusions: Objective image analysis allows an accurate, reliable and repeatable assessment of soft contact lens fit characteristics, making it a useful tool for research and for optimisation of lens fit in clinical practice.
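
As an illustration of how a "±x mm" repeatability figure like those quoted above can be obtained, the sketch below computes a Bland-Altman style 95% repeatability limit (1.96 times the standard deviation of inter-session differences) for one lens-fit variable. The visit data are hypothetical, and the paper's exact repeatability statistic may differ.

```python
# Hedged sketch: inter-session repeatability of one lens-fit variable
# (e.g. movement on blink, in mm). Arrays are hypothetical.
import numpy as np

visit1 = np.array([0.30, 0.25, 0.40, 0.35, 0.28, 0.45, 0.33, 0.38])  # mm
visit2 = np.array([0.28, 0.27, 0.36, 0.40, 0.25, 0.47, 0.30, 0.41])  # mm

diff = visit2 - visit1
bias = diff.mean()                         # mean inter-session difference
repeatability = 1.96 * diff.std(ddof=1)    # 95% limit, quoted as +/- x mm
print(f"bias = {bias:.3f} mm, 95% repeatability = +/-{repeatability:.3f} mm")
```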

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study was to develop a methodology using Raman hyperspectral imaging and chemometric methods for the identification of pre- and post-blast explosive residues on banknote surfaces. The explosives studied were of military, commercial and propellant use. After acquisition of the hyperspectral images, independent component analysis (ICA) was applied to extract the pure spectra and the spatial distribution of the corresponding image constituents. The performance of the methodology was evaluated by the explained variance and lack of fit of the models, by comparing the ICA-recovered spectra with reference spectra using correlation coefficients, and by checking for rotational ambiguity in the ICA solutions. The methodology was applied to forensic samples to solve an automated teller machine explosion case. ICA proved to be a suitable curve-resolution method, achieving performance equivalent to multivariate curve resolution with alternating least squares (MCR-ALS). At low concentrations, however, MCR-ALS showed limitations and did not provide the correct solution. The detection limit of the methodology presented in this study was 50 μg cm⁻².
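
The sketch below illustrates the general ICA step described in the abstract: unfolding a hyperspectral cube so that each pixel is one spectrum, then recovering component spectra and their spatial distributions with FastICA. The file name, cube shape and number of components are hypothetical, and the paper's preprocessing and ambiguity checks are not reproduced.

```python
# Hedged sketch: pixel-wise ICA on a Raman hyperspectral cube.
import numpy as np
from sklearn.decomposition import FastICA

cube = np.load("raman_cube.npy")            # shape (rows, cols, n_shifts); hypothetical file
rows, cols, n_shifts = cube.shape
X = cube.reshape(rows * cols, n_shifts)     # one spectrum per pixel

ica = FastICA(n_components=3, random_state=0)
scores = ica.fit_transform(X)               # per-pixel contributions (up to sign and scale)
spectra = ica.mixing_.T                     # recovered component spectra (n_components, n_shifts)

# Spatial distribution map of each recovered constituent
concentration_maps = scores.reshape(rows, cols, -1)
```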

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: The main goal of this study was to develop and compare two techniques for classifying specific types of corneal shape when Zernike coefficients are used as inputs: a feed-forward artificial neural network (NN) and discriminant analysis (DA). METHODS: The inputs for both the NN and DA were the first 15 standard Zernike coefficients of 80 previously classified corneal elevation data files from an Eyesys System 2000 Videokeratograph (VK) installed at the Departamento de Oftalmologia of the Escola Paulista de Medicina, São Paulo. The NN had five output neurons, each associated with one of five typical corneal shapes: keratoconus, with-the-rule astigmatism, against-the-rule astigmatism, "regular" or "normal" shape, and post-PRK. RESULTS: The NN and DA responses were analyzed statistically in terms of precision ([true positive + true negative]/total number of cases). Mean overall results across all cases were 94% for the NN and 84.8% for DA. CONCLUSION: Although the database was relatively small, the results obtained in the present study indicate that Zernike polynomials as descriptors of corneal shape may be reliable input data for diagnostic automation of VK maps, using either NN or DA.
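
For readers who want a concrete starting point, the sketch below compares a small feed-forward network with linear discriminant analysis on a matrix of 15 Zernike coefficients per eye, reporting cross-validated accuracy (the abstract's [true positive + true negative]/total measure). The file names, labels and network size are hypothetical; the study's actual training setup is not specified in the abstract.

```python
# Hedged sketch: NN versus DA classification of corneal shapes from Zernike coefficients.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X = np.loadtxt("zernike_coeffs.csv", delimiter=",")   # shape (80, 15); hypothetical file
y = np.loadtxt("labels.csv", dtype=int)               # 5 classes: keratoconus, WTR, ATR, normal, post-PRK

nn = make_pipeline(StandardScaler(),
                   MLPClassifier(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
da = LinearDiscriminantAnalysis()

print("NN accuracy:", cross_val_score(nn, X, y, cv=5).mean())
print("DA accuracy:", cross_val_score(da, X, y, cv=5).mean())
```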

Relevance:

100.00%

Publisher:

Abstract:

This research aims at a better understanding of the organizational culture(s) of the judiciary in Switzerland by analysing what 'good justice' means in this country today. It seeks to clarify whether, and to what extent, the expectations of 'good justice' held by judicial actors (judges without managerial experience) and by managerial actors (court managers) are similar, and to describe the managerial implications that may follow. As judges are at the heart of the judicial organization and exert a strong influence on other groups of actors (Sullivan, Warren et al. 1994), the congruence of their expectations with those of court managers is at the centre of the analysis. In addition, referring to the conceptual worlds of Boltanski and Thévenot (1991), we analyse how close these expectations are to management-oriented values. We found that almost half of the expectations are common to the two groups examined, and that the most frequently cited ones are compatible with new public management (NPM) concepts. On the other hand, the expectations shared exclusively by judges relate to the human side of justice, whereas those specific to court managers focus on the way justice functions.

Relevance:

100.00%

Publisher:

Abstract:

Standard methods for the analysis of linear latent variable models often rely on the assumption that the vector of observed variables is normally distributed. This normality assumption (NA) plays a crucial role in assessing optimality of estimates, in computing standard errors, and in designing an asymptotic chi-square goodness-of-fit test. The asymptotic validity of NA inferences when the data deviate from normality has been called asymptotic robustness. In the present paper we extend previous work on asymptotic robustness to a general context of multi-sample analysis of linear latent variable models, with a latent component of the model allowed to be fixed across (hypothetical) sample replications, and with the asymptotic covariance matrix of the sample moments not necessarily finite. We show that, under certain conditions, the matrix $\Gamma$ of asymptotic variances of the analyzed sample moments can be substituted by a matrix $\Omega$ that is a function only of the cross-product moments of the observed variables. The main advantage of this is that inferences based on $\Omega$ are readily available in standard software for covariance structure analysis and do not require computing sample fourth-order moments. An illustration with simulated data in the context of regression with errors in variables is presented.
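
As a hedged illustration of the kind of second-moment-only matrix referred to above (not necessarily the paper's general $\Omega$), the normal-theory asymptotic covariance of the vectorized sample covariances depends only on $\Sigma$:

```latex
% Illustrative special case; notation may differ from the paper's.
\[
  \sqrt{n}\,\bigl(s - \sigma\bigr) \xrightarrow{d} N(0, \Gamma), \qquad
  \Gamma_{\mathrm{NT}} = 2\, D_p^{+} (\Sigma \otimes \Sigma) D_p^{+\prime},
\]
% where $s = \mathrm{vech}(S)$, $\sigma = \mathrm{vech}(\Sigma)$ and $D_p^{+}$ is the
% Moore-Penrose inverse of the duplication matrix. Asymptotic-robustness results
% give conditions under which inference based on such a matrix, computable from
% cross-product moments alone, remains valid for non-normal data.
```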

Relevance:

100.00%

Publisher:

Abstract:

Structural equation models are widely used in economic, social and behavioral studies to analyze linear interrelationships among variables, some of which may be unobservable or subject to measurement error. Alternative estimation methods that exploit different distributional assumptions are now available. The present paper deals with issues of asymptotic statistical inference, such as the evaluation of standard errors of estimates and chi-square goodness-of-fit statistics, in the general context of mean and covariance structures. The emphasis is on drawing correct statistical inferences regardless of the distribution of the data and the method of estimation employed. A (distribution-free) consistent estimate of $\Gamma$, the matrix of asymptotic variances of the vector of sample second-order moments, is used to compute robust standard errors and a robust chi-square goodness-of-fit statistic. Simple modifications of the usual estimate of $\Gamma$ also permit correct inferences in the case of multi-stage complex samples. We also discuss the conditions under which, regardless of the distribution of the data, one can rely on the usual (non-robust) inferential statistics. Finally, a multivariate regression model with errors in variables is used to illustrate, by means of simulated data, various theoretical aspects of the paper.
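
A minimal sketch of the distribution-free machinery the abstract refers to, written in standard notation that may differ from the paper's: $\hat\Gamma$ is estimated from centered cross-products of the data and plugged into a sandwich formula for robust standard errors.

```latex
% Hedged sketch; $\Delta = \partial\sigma(\theta)/\partial\theta'$ and $V$ is the
% weight matrix implied by the chosen estimation method.
\[
  d_i = \mathrm{vech}\!\bigl((x_i - \bar{x})(x_i - \bar{x})'\bigr), \qquad
  \hat{\Gamma} = \frac{1}{n}\sum_{i=1}^{n} (d_i - \bar{d})(d_i - \bar{d})',
\]
\[
  \widehat{\operatorname{avar}}(\hat{\theta}) =
  \frac{1}{n}\,(\hat\Delta' V \hat\Delta)^{-1}\hat\Delta' V \hat{\Gamma}\, V \hat\Delta\,
  (\hat\Delta' V \hat\Delta)^{-1}.
\]
```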

Relevance:

100.00%

Publisher:

Abstract:

We extend to score, Wald and difference test statistics the scaled and adjusted corrections to goodness-of-fit test statistics developed in Satorra and Bentler (1988a,b). The theory is framed in the general context of multisample analysis of moment structures, under general conditions on the distribution of the observable variables. Computational issues, as well as the relation of the scaled and corrected statistics to the asymptotically robust ones, are discussed. A Monte Carlo study illustrates the comparative performance of corrected score test statistics in finite samples.
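
In commonly used notation (which may differ from the paper's), the scaling and adjustment being extended here can be written as follows, where $T$ is a fit statistic with $d$ degrees of freedom, $\hat U$ is the usual residual weight matrix and $\hat\Gamma$ is as above:

```latex
% Hedged sketch of the Satorra-Bentler scaled and adjusted statistics.
\[
  \bar{T} = \frac{T}{\hat{c}}, \qquad
  \hat{c} = \frac{\operatorname{tr}(\hat{U}\hat{\Gamma})}{d}
  \quad \text{(mean-scaled statistic, referred to } \chi^2_d\text{)},
\]
\[
  \tilde{T} = \frac{d'\,T}{\operatorname{tr}(\hat{U}\hat{\Gamma})}, \qquad
  d' = \frac{\bigl[\operatorname{tr}(\hat{U}\hat{\Gamma})\bigr]^{2}}
            {\operatorname{tr}\!\bigl[(\hat{U}\hat{\Gamma})^{2}\bigr]}
  \quad \text{(mean- and variance-adjusted, referred to } \chi^2_{d'}\text{)}.
\]
```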

Relevance:

100.00%

Publisher:

Abstract:

This work is devoted to the problem of reconstructing the basis weight structure of a paper web with black-box techniques. The data analyzed come from a real paper machine and were collected by an off-line scanner. The principal mathematical tool used in this work is Autoregressive Moving Average (ARMA) modelling. When coupled with the Discrete Fourier Transform (DFT), it gives a very flexible and interesting tool for analyzing properties of the paper web. In a simplified version of our algorithm, ARMA and the DFT are used independently to represent the given signal, but the final goal is to combine the two. The Ljung-Box Q-statistic lack-of-fit test, combined with the root mean squared error coefficient, provides a tool to separate significant signals from noise.
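
The sketch below illustrates the pipeline the abstract outlines: fit an ARMA model to a scanned basis-weight profile, apply the Ljung-Box lack-of-fit test and an RMSE check to the residuals, and inspect the DFT spectrum. The signal is synthetic and the model order is an arbitrary choice, not the thesis's.

```python
# Hedged sketch: ARMA fit, Ljung-Box residual test and DFT on a toy profile.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(0)
n = 512
t = np.arange(n)
signal = 0.4 * np.sin(2 * np.pi * t / 64) + rng.normal(scale=0.2, size=n)  # synthetic profile

model = ARIMA(signal, order=(2, 0, 1)).fit()   # ARMA(2,1) via ARIMA with d = 0
resid = model.resid

lb = acorr_ljungbox(resid, lags=[20])          # lack-of-fit (whiteness) test on residuals
rmse = np.sqrt(np.mean(resid ** 2))
print(lb, "RMSE:", rmse)

spectrum = np.abs(np.fft.rfft(signal)) / n     # DFT amplitude spectrum
freqs = np.fft.rfftfreq(n, d=1.0)              # cycles per sample
```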

Relevance:

100.00%

Publisher:

Abstract:

In this study the performance measurement, a part of the research and development of the RNC, was improved by implementing counter testing in the Nokia Automation System. The automation of counter testing is a feature the customer ordered, because performing counter testing manually is rather complex. The objective was to implement an automated counter testing system which, once configured correctly, would run the tests and perform the analysis. The requirements for counter testing were first studied. Whether automation of the feature was feasible was investigated in meetings with the customer, and the basic functionality required for the automation was outlined. The technologies used in the architecture of the Nokia Automation System were then studied. Based on the results of this study, a new technology, wxWidgets, was introduced; it was necessary to facilitate implementing the required feature. Finally, the counter testing feature was specified and implemented. The result of this study is the automated counter testing method developed as a new feature for the Nokia Automation System. The feature meets the specifications and requirements set by the customer. Execution of counter testing is fully automated; only the configuration of test cases is done by the user. The customer has presented new requests to develop the feature further, and the Nokia Automation System developers plan to implement them in the near future. The study describes the implementation of the counter testing feature, and its results give guidelines for further development.

Relevance:

100.00%

Publisher:

Abstract:

Premenstrual syndrome and premenstrual dysphoric disorder (PMDD) seem to form a severity continuum with no clear-cut boundary. However, since the American Psychiatric Association proposed the research criteria for PMDD in 1994, there has been no agreement about the symptomatic constellation that constitutes this syndrome. The objective of the present study was to establish the core latent structure of PMDD symptoms in a non-clinical sample. Data on PMDD symptoms were obtained from 632 regularly menstruating college students (mean age 24.4 years, SD 5.9, range 17 to 49). For the first random half (N = 316) we performed principal component analysis (PCA), and for the remaining half (N = 316) we tested three theory-derived competing models of PMDD by confirmatory factor analysis. PCA allowed us to extract two correlated factors, i.e., a dysphoric-somatic factor and a behavioral-impairment factor. The two-dimensional latent model derived from the PCA showed the best overall fit among the three models tested by confirmatory factor analysis (χ²(53) = 64.39, P = 0.13; goodness-of-fit index = 0.96; adjusted goodness-of-fit index = 0.95; root mean square residual = 0.05; root mean square error of approximation = 0.03, 90%CI = 0.00 to 0.05; Akaike's information criterion = -41.61). The items "out of control" and "physical symptoms" loaded conspicuously on the first factor, and "interpersonal impairment" loaded higher on the second factor. The construct validity of PMDD was accounted for by two highly correlated dimensions. These results support the argument for focusing on the core psychopathological dimension of PMDD in future studies.
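
A minimal sketch of the exploratory half of such an analysis, assuming a hypothetical item matrix for the first random half of the sample; no rotation is applied here, and the study's exact PCA settings are not given in the abstract.

```python
# Hedged sketch: two-component PCA of standardized symptom ratings.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

items = np.loadtxt("pmdd_items.csv", delimiter=",")   # shape (316, n_items); hypothetical file
Z = StandardScaler().fit_transform(items)

pca = PCA(n_components=2).fit(Z)
# Component loadings: eigenvectors scaled by the square root of their eigenvalues.
loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
print("explained variance ratio:", pca.explained_variance_ratio_)
print(loadings.round(2))
```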

Relevance:

100.00%

Publisher:

Abstract:

Mass transfer kinetics in osmotic dehydration is usually modeled by Fick's law, empirical models or probabilistic models. The aim of this study was to determine the applicability of the Peleg model to describe mass transfer during osmotic dehydration of mackerel (Scomber japonicus) slices at different temperatures. Osmotic dehydration was performed on mackerel slices by cooking-infusion in solutions with glycerol and salt (a_w = 0.64) at 50, 70, and 90 °C. The Peleg rate constant K1 (h (g/g dm)⁻¹) varied with temperature from 0.761 to 0.396 for water loss, from 5.260 to 2.947 for salt gain, and from 0.854 to 0.566 for glycerol uptake. In all cases it followed the Arrhenius relationship (R² > 0.86). The Ea values obtained were 16.14, 14.21, and 10.12 kJ/mol for water, salt, and glycerol, respectively. The statistical parameters that qualify the goodness of fit (R² > 0.91 and RMSE < 0.086) indicate promising applicability of the Peleg model.
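
For illustration, the sketch below fits the Peleg equation to a hypothetical water-loss curve and reports K1, K2, R² and RMSE; the data points are invented, and the exact fitting procedure used in the study is not specified in the abstract.

```python
# Hedged sketch: Peleg model M(t) = M0 +/- t / (K1 + K2 * t), fitted here to the
# change relative to M0 so the M0 term drops out. Data are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0.5, 1, 2, 3, 4, 6, 8])                        # h
wl = np.array([0.08, 0.14, 0.22, 0.27, 0.30, 0.34, 0.36])    # g water lost / g dry matter

def peleg(t, K1, K2):
    return t / (K1 + K2 * t)

(K1, K2), _ = curve_fit(peleg, t, wl, p0=(1.0, 2.0))

pred = peleg(t, K1, K2)
ss_res = np.sum((wl - pred) ** 2)
ss_tot = np.sum((wl - wl.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
rmse = np.sqrt(np.mean((wl - pred) ** 2))
print(f"K1 = {K1:.3f} h (g/g dm)^-1, K2 = {K2:.3f} (g/g dm)^-1, R2 = {r2:.3f}, RMSE = {rmse:.3f}")
```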

Relevance:

100.00%

Publisher:

Abstract:

A method is presented for determining the time to first division of individual bacterial cells growing on agar media. Bacteria were inoculated onto agar-coated slides and viewed by phase-contrast microscopy. Digital images of the growing bacteria were captured at intervals and the time to first division estimated by calculating the "box area ratio". This is the area of the smallest rectangle that can be drawn around an object, divided by the area of the object itself. The box area ratios of cells were found to increase suddenly during growth at a time that correlated with cell division as estimated by visual inspection of the digital images. This was caused by a change in the orientation of the two daughter cells that occurred when sufficient flexibility arose at their point of attachment. This method was used successfully to generate lag time distributions for populations of Escherichia coli, Listeria monocytogenes and Pseudomonas aeruginosa, but did not work with the coccoid organism Staphylococcus aureus. This method provides an objective measure of the time to first cell division, whilst automation of the data processing allows a large number of cells to be examined per experiment.
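
A minimal sketch of the box area ratio computation, using an axis-aligned bounding box from scikit-image as a simplification (the paper's "smallest rectangle" may be rotated); image acquisition and thresholding are omitted.

```python
# Hedged sketch: bounding-box area divided by object area for each object
# in a binary mask derived from a phase-contrast image.
import numpy as np
from skimage.measure import label, regionprops

def box_area_ratios(binary_image):
    """Return the box area ratio of every connected object in a binary mask."""
    ratios = []
    for prop in regionprops(label(binary_image)):
        min_r, min_c, max_r, max_c = prop.bbox
        bbox_area = (max_r - min_r) * (max_c - min_c)
        ratios.append(bbox_area / prop.area)   # jumps when daughter cells reorient
    return ratios

# Toy example: a single 3x10 "rod" gives a ratio of 1.0.
mask = np.zeros((20, 20), dtype=bool)
mask[5:8, 4:14] = True
print(box_area_ratios(mask))
```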

Relevance:

100.00%

Publisher:

Abstract:

Reliability analysis of probabilistic forecasts, in particular through the rank histogram or Talagrand diagram, is revisited. Two shortcomings are pointed out: first, a uniform rank histogram is only a necessary condition for reliability; second, if the forecast is assumed to be reliable, an indication is needed of how far a histogram is expected to deviate from uniformity merely due to randomness. Concerning the first shortcoming, it is suggested that forecasts be grouped or stratified along suitable criteria and that reliability be analyzed individually for each forecast stratum. A reliable forecast should have uniform histograms for all individual forecast strata, not only for all forecasts as a whole. As to the second shortcoming, instead of the observed frequencies, the probability of the observed frequency is plotted, providing an indication of the likelihood of the result under the hypothesis that the forecast is reliable. Furthermore, a goodness-of-fit statistic is discussed which is essentially the reliability term of the Ignorance score. The discussed tools are applied to medium-range forecasts of 2 m temperature anomalies at several locations and lead times. The forecasts are stratified along the expected ranked probability score, and those forecasts with a high expected score turn out to be particularly unreliable.
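
The sketch below builds a rank (Talagrand) histogram for a synthetic ensemble and tests it for uniformity with a chi-square goodness-of-fit statistic; following the abstract's first point, the same check would be repeated within each forecast stratum rather than only on the pooled set. The ensemble size and data are arbitrary.

```python
# Hedged sketch: rank histogram and uniformity test for an ensemble forecast.
import numpy as np
from scipy.stats import chisquare

rng = np.random.default_rng(1)
n_cases, n_members = 2000, 10
ensemble = rng.normal(size=(n_cases, n_members))
obs = rng.normal(size=n_cases)                 # "reliable" case: obs drawn from the same distribution

# Rank of each observation within its ensemble (0 .. n_members); ties ignored here.
ranks = np.sum(ensemble < obs[:, None], axis=1)
counts = np.bincount(ranks, minlength=n_members + 1)

# Under reliability, each of the n_members + 1 ranks is equally likely.
stat, pval = chisquare(counts)
print(counts, f"chi2 = {stat:.1f}, p = {pval:.3f}")
```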