990 results for Testing Procedure


Relevance:

60.00%

Publisher:

Abstract:

Gesture-based applications have particular characteristics, since users interact in a natural way, much as they do in the non-digital world. This places new requirements on the software design process. This paper presents a software development process model for such applications, covering requirement specification, design, implementation, and testing procedures. The steps and activities of the proposed model were exercised through a case study: a puzzle game in which the puzzle is completed when all pieces of a painting are correctly positioned by drag-and-drop actions performed with the user's hand gestures. The paper also reports the results of applying a heuristic evaluation to this game. © 2012 IEEE.
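
The completion rule described above (the puzzle is done when every piece has been correctly positioned by a drag-and-drop hand gesture) is easy to capture in a unit-testable form. The sketch below is a hypothetical illustration in Python, not the authors' implementation; the class, the snap tolerance and the example coordinates are all assumptions.

```python
# Minimal sketch (hypothetical): checking that a dragged puzzle piece has
# "snapped" to its target slot -- the kind of check the testing stage of a
# gesture-based puzzle could automate.
from dataclasses import dataclass

SNAP_TOLERANCE = 20.0  # pixels; illustrative value, not taken from the paper


@dataclass
class Piece:
    piece_id: int
    x: float          # current position driven by the hand-gesture drag
    y: float
    target_x: float   # correct position within the painting
    target_y: float

    def is_placed(self) -> bool:
        """A piece counts as placed when dropped close enough to its slot."""
        return (abs(self.x - self.target_x) <= SNAP_TOLERANCE and
                abs(self.y - self.target_y) <= SNAP_TOLERANCE)


def puzzle_completed(pieces: list[Piece]) -> bool:
    """The puzzle is complete when every piece is correctly positioned."""
    return all(p.is_placed() for p in pieces)


if __name__ == "__main__":
    pieces = [Piece(1, 101.0, 99.0, 100.0, 100.0),
              Piece(2, 300.0, 40.0, 200.0, 200.0)]
    print(puzzle_completed(pieces))  # False: piece 2 has not been dropped in place
```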

Relevance:

60.00%

Publisher:

Abstract:

The State of Michigan is striving to eliminate bovine tuberculosis (TB) infection among free-ranging white-tailed deer in the northeastern Lower Peninsula of the state. Aggressive reduction in overall deer population abundance may help to further reduce TB prevalence, but this course of action is unacceptable to many hunters and landowners. Targeted culling of sick deer would likely be far more acceptable to these stakeholders, so in the winter of 2003 the Michigan Department of Natural Resources pilot-trialed a new strategy based on live-trapping and TB-testing of wild deer. The field study was conducted in a township with relatively high TB prevalence within Deer Management Unit 452 in the northeastern Lower Peninsula. Over a 2-month trapping period, 119 individual deer were live-trapped, blood sampled, fitted with a radio-collar, and released. A total of 31 of these deer were subsequently classified as TB-suspect by at least one of the five blood tests employed (although there was a low level of agreement among the tests). A delay in testing meant that only six of these suspect deer were culled by sharpshooters before the pre-programmed release of their radio-collars, after which they could no longer be located. Mycobacterium bovis was cultured from one of these six suspect deer; the other five were negative on culture. All target deer were located to within shooting range with 1-2 days of effort, and all the radio-collars on the apparently healthy deer dropped off after the intended 90-day interval and were thereafter recovered for re-use. There was considerable support for this pilot project among hunters, farmers, state and federal agriculture agencies, the media and the general public, so we recommend that further field trials be undertaken using this technique. The initial focus of these trials should be on improving the efficacy and reliability of the blood testing procedure.

Relevance:

60.00%

Publisher:

Abstract:

Constructing a 3D surface model from sparse-point data is a nontrivial task. Here, we report an accurate and robust approach for reconstructing a surface model of the proximal femur from sparse-point data and a dense-point distribution model (DPDM). The problem is formulated as a three-stage optimal estimation process. The first stage, affine registration, iteratively estimates a scale and a rigid transformation between the mean surface model of the DPDM and the sparse input points. The estimation results of the first stage are used to establish point correspondences for the second stage, statistical instantiation, which stably instantiates a surface model from the DPDM using a statistical approach. This surface model is then fed to the third stage, kernel-based deformation, which further refines the surface model. Outliers are handled by consistently employing the least trimmed squares (LTS) approach with a roughly estimated outlier rate in all three stages. If an optimal value of the outlier rate is preferred, we propose a hypothesis testing procedure to estimate it automatically. We present our validations using four experiments: (1) a leave-one-out experiment; (2) an experiment evaluating the present approach for handling pathology; (3) an experiment evaluating the present approach for handling outliers; and (4) an experiment reconstructing surface models of seven dry cadaver femurs using clinically relevant data, both without and with added noise. Our validation results demonstrate the robust performance of the present approach in handling outliers, pathology, and noise. An average 95-percentile error of 1.7-2.3 mm was found when the present approach was used to reconstruct surface models of the cadaver femurs from sparse-point data with noise added.
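
The outlier handling named in the abstract, least trimmed squares (LTS), keeps only the fraction of points with the smallest residuals in each fitting step. The sketch below illustrates that principle on a plain line fit with an assumed trimming fraction; it is not the authors' three-stage femur reconstruction.

```python
import numpy as np

def lts_line_fit(x, y, outlier_rate=0.2, n_iter=20):
    """Least-trimmed-squares style fit of y = a*x + b.

    Illustrative only: repeatedly fit on the subset of points with the
    smallest squared residuals, keeping a (1 - outlier_rate) fraction.
    """
    n_keep = max(2, int(round((1.0 - outlier_rate) * len(x))))
    keep = np.arange(len(x))                     # start with all points
    a = b = 0.0
    for _ in range(n_iter):
        a, b = np.polyfit(x[keep], y[keep], 1)
        residuals = (y - (a * x + b)) ** 2
        keep = np.argsort(residuals)[:n_keep]    # trim the worst residuals
    return a, b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 50)
    y = 2.0 * x + 1.0 + rng.normal(0, 0.1, 50)
    y[:5] += 20.0                                # inject a few gross outliers
    print(lts_line_fit(x, y, outlier_rate=0.2))  # close to (2.0, 1.0)
```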

Relevance:

60.00%

Publisher:

Abstract:

Fastener grade steels with varying alloy contents and heat treatments were used to measure changes in resistance to hydrogen assisted cracking. The testing procedure compared the fracture strength of notched tension specimens in air with the threshold stress values obtained during hydrogen charging, using a rising step load procedure. Bainitic structures improved resistance by 10-20% compared to tempered martensite structures. Dual phase steels with a tempered martensite matrix and 20% ferrite were more susceptible and notch sensitive. High strength, fully pearlitic structures showed an improvement in resistance. Carbon content, per se, had no effect on the resistance of steel to hydrogen assisted cracking. Chromium had a deleterious effect, whereas the other alloying elements studied caused little change in hydrogen assisted cracking susceptibility.
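
The comparison underlying the reported percentages amounts to a ratio of the threshold stress sustained during hydrogen charging to the notched fracture stress measured in air. The sketch below only illustrates that bookkeeping; the function name and the numbers are hypothetical, not data from the study.

```python
def hydrogen_resistance_ratio(threshold_stress_hydrogen_mpa: float,
                              notch_fracture_stress_air_mpa: float) -> float:
    """Fraction of the air fracture strength retained under hydrogen charging.

    A higher ratio indicates better resistance to hydrogen assisted cracking.
    """
    return threshold_stress_hydrogen_mpa / notch_fracture_stress_air_mpa


if __name__ == "__main__":
    # Hypothetical numbers for two microstructures, not measurements from the study.
    baseline = hydrogen_resistance_ratio(900.0, 1500.0)    # tempered-martensite-like
    improved = hydrogen_resistance_ratio(1050.0, 1500.0)   # bainite-like
    print(baseline, improved, (improved - baseline) / baseline)  # ~17% relative gain
```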

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND In 2006, bluetongue virus serotype 8 (BTV-8) was detected for the first time in central Europe. Measures to control the infection in livestock were implemented in Switzerland, but the question was raised whether free-ranging wildlife could be a maintenance host for BTV-8. Furthermore, Toggenburg orbivirus (TOV), considered a potential 25th BTV serotype, was detected in 2007 in domestic goats in Switzerland, and wild ruminants were considered a potential source of infection. To assess the prevalences of BTV-8 and TOV infections in wildlife, we conducted a serological and virological survey in red deer, roe deer, Alpine chamois and Alpine ibex between 2009 and 2011. Because samples originating from wildlife carcasses are often of poor quality, we also documented the influence of hemolysis on test results and evaluated the usefulness of confirmatory tests. RESULTS Ten out of 1,898 animals (0.5%, 95% confidence interval 0.3-1.0%) had detectable antibodies against BTV-8, and BTV-8 RNA was found in two chamois and one roe deer (0.3%, 0.1-0.8%). Seroprevalence was highest among red deer, and the majority of positive wild animals were sampled close to areas where outbreaks had been reported in livestock. Most samples were hemolytic, and the range of the optical density percentage values obtained in the screening test increased with increasing hemolysis. Confirmatory tests significantly increased the specificity of the testing procedure and proved to be applicable even to poor-quality samples. Nearly all samples confirmed as positive had an optical density percentage value greater than 50% in the ELISA screening. CONCLUSIONS Prevalence of BTV-8 infection was low, and none of the tested animals were positive for TOV. Currently, wild ruminants are apparently not a reservoir for these viruses in Switzerland. However, we report for the first time BTV-8 RNA in Alpine chamois. This animal was found at high altitude and far from any domestic outbreak, which suggests that the virus could spread into and through the Alps. Regarding testing procedures, hemolysis did not significantly affect test results, but confirmatory tests proved necessary to obtain reliable prevalence estimates. The cut-off value recommended by the manufacturer for the screening test was applicable to wildlife samples.
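
The prevalence figures quoted (e.g. 10 positives out of 1,898 animals, 95% CI 0.3-1.0%) can be reproduced with a standard binomial interval. The sketch below uses the Wilson score interval as one common choice; the abstract does not say which interval the authors computed.

```python
import math

def wilson_interval(positives: int, n: int, z: float = 1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = positives / n
    denom = 1.0 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

if __name__ == "__main__":
    lo, hi = wilson_interval(10, 1898)
    print(f"seroprevalence 95% CI: {lo:.2%} - {hi:.2%}")  # roughly 0.3% - 1.0%
```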

Relevance:

60.00%

Publisher:

Abstract:

This paper examines the mean-reverting property of real exchange rates. Earlier studies have generally not been able to reject the null hypothesis of a unit root in real exchange rates, especially for the post-Bretton Woods floating period, implying that long-run purchasing power parity does not hold. More recent studies, especially those using panel unit-root tests, have found more favorable results. However, Karlsson and Löthgren (2000) and others have recently pointed out several potential pitfalls of panel unit-root tests, so the panel results are suggestive but far from conclusive. Moreover, consistent individual-country time series evidence supporting long-run purchasing power parity remains scarce. In this paper, we test for long memory using Lo's (1991) modified rescaled range test and the rescaled variance test of Giraitis, Kokoszka, Leipus, and Teyssière (2003). Our testing procedure provides a non-parametric alternative to the parametric tests commonly used in this literature. Our data set consists of monthly observations from April 1973 to April 2001 for the G-7 countries in the OECD. The two tests give conflicting results when we use U.S. dollar real exchange rates. However, when non-U.S. dollar real exchange rates are used, we find only two cases out of fifteen where the null hypothesis of a unit root with short-term dependence can be rejected in favor of the alternative of long-term dependence using the modified rescaled range test, and only one case when using the rescaled variance test. Our results therefore provide a contrast to the recent favorable panel unit-root test results.
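
Lo's modified rescaled range statistic replaces the sample standard deviation in the classical R/S denominator with a Bartlett-weighted long-run variance estimate, so that short-term dependence does not masquerade as long memory. A compact sketch is given below; the lag choice q and the example series are illustrative, and this is not the authors' code.

```python
import numpy as np

def lo_modified_rs(x, q):
    """Lo's (1991) modified rescaled range statistic V = R / (sigma_hat * sqrt(n)).

    The denominator uses Bartlett-weighted autocovariances up to lag q;
    q = 0 recovers the classical rescaled range statistic.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    d = x - x.mean()
    partial = np.cumsum(d)
    r = partial.max() - partial.min()            # range of the partial sums
    s2 = np.mean(d ** 2)                         # sample variance
    for j in range(1, q + 1):                    # add weighted autocovariances
        w = 1.0 - j / (q + 1.0)
        gamma_j = np.sum(d[j:] * d[:-j]) / n
        s2 += 2.0 * w * gamma_j
    return r / (np.sqrt(s2) * np.sqrt(n))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    series = rng.normal(size=337)       # about the length of Apr 1973 - Apr 2001 monthly data
    print(lo_modified_rs(series, q=4))  # compare against Lo's tabulated critical values
```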

Relevance:

60.00%

Publisher:

Abstract:

Interaction effects are an important scientific interest in many areas of research. A common approach for investigating the interaction effect of two continuous covariates on a response variable is a cross-product term in multiple linear regression. In epidemiological studies, the two-way analysis of variance (ANOVA) type of method has also been used to examine the interaction effect by replacing the continuous covariates with their discretized levels. However, the implications of the model assumptions of either approach have not been examined, and statistical validation has focused only on the general methods, not specifically on the interaction effect. In this dissertation, we investigated the validity of both approaches based on their mathematical assumptions for non-skewed data. We showed that linear regression may not be an appropriate model when an interaction effect exists, because it implies a highly skewed distribution for the response variable. We also showed that the normality and constant-variance assumptions required by ANOVA are not satisfied in the model where the continuous covariates are replaced with their discretized levels; therefore, naïve application of the ANOVA method may lead to an incorrect conclusion. Given the problems identified above, we proposed a novel method, modified from the traditional ANOVA approach, to rigorously evaluate the interaction effect. The analytical expression of the interaction effect was derived based on the conditional distribution of the response variable given the discretized continuous covariates. A testing procedure that combines the p-values from each level of the discretized covariates was developed to test the overall significance of the interaction effect. According to the simulation study, the proposed method is more powerful than least squares regression and the ANOVA method in detecting the interaction effect when the data come from a trivariate normal distribution. The proposed method was applied to a dataset from the National Institute of Neurological Disorders and Stroke (NINDS) tissue plasminogen activator (t-PA) stroke trial, and a baseline age-by-weight interaction effect was found to be significant in predicting the change from baseline in NIHSS at Month 3 among patients who received t-PA therapy.
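
The proposed testing procedure combines the p-values obtained at each level of the discretized covariates into a single overall test of the interaction effect. The abstract does not state which combination rule is used, so the sketch below uses Fisher's method purely as a familiar stand-in.

```python
import numpy as np
from scipy import stats

def fisher_combined_pvalue(pvalues):
    """Fisher's method: -2 * sum(log p_i) follows a chi-square distribution with
    2k degrees of freedom under the global null (independent per-level tests)."""
    p = np.asarray(pvalues, dtype=float)
    statistic = -2.0 * np.log(p).sum()
    return stats.chi2.sf(statistic, df=2 * len(p))

if __name__ == "__main__":
    # Hypothetical per-level p-values for the interaction effect.
    print(fisher_combined_pvalue([0.04, 0.20, 0.03, 0.15]))
```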

Relevance:

60.00%

Publisher:

Abstract:

Most studies of differential gene expression have been conducted between two given conditions. The two-condition experimental (TCE) approach is simple in that all genes detected display a common differential expression pattern responsive to a common two-condition difference. Consequently, genes that are differentially expressed under conditions other than the given two are undetectable with the TCE approach. To address this problem, we propose a new approach called the multiple-condition experiment (MCE) without replication and develop the corresponding statistical methods, including inference of pairs of conditions for genes, new t-statistics, and a generalized multiple-testing method applicable to any multiple-testing procedure via a control parameter C. We applied these statistical methods to our real MCE data from breast cancer cell lines and found that 85 percent of gene-expression variation was caused by genotypic effects and genotype-ANAX1 overexpression interactions, which agrees well with our expectations. We also applied our methods to the adenoma dataset of Notterman et al. and identified 93 differentially expressed genes that could not be found with the TCE approach. The MCE approach is a conceptual breakthrough in many respects: (a) many conditions of interest can be studied simultaneously; (b) the study of associations between differential gene expression and conditions becomes easy; (c) it can provide more precise information for the molecular classification and diagnosis of tumors; and (d) it can save investigators considerable experimental resources and time.
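
The paper's generalized multiple-testing method with control parameter C is specific to that work and is not reproduced here. As a point of reference, the sketch below shows the standard Benjamini-Hochberg step-up procedure that such generalizations are typically built around.

```python
import numpy as np

def benjamini_hochberg(pvalues, alpha=0.05):
    """Benjamini-Hochberg step-up procedure; returns a boolean 'rejected' mask."""
    p = np.asarray(pvalues, dtype=float)
    m = len(p)
    order = np.argsort(p)
    thresholds = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresholds
    rejected = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()       # largest rank passing its threshold
        rejected[order[:k + 1]] = True       # reject all hypotheses up to that rank
    return rejected

if __name__ == "__main__":
    pvals = [0.001, 0.008, 0.039, 0.041, 0.2, 0.6]
    print(benjamini_hochberg(pvals, alpha=0.05))
```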

Relevance:

60.00%

Publisher:

Abstract:

Multi-center clinical trials are very common in the development of new drugs and devices. One concern in such trials is the effect of individual investigational sites enrolling small numbers of patients on the overall result: can the presence of small centers cause an ineffective treatment to appear effective when the treatment-by-center interaction is not statistically significant? In this research, simulations are used to study the effect that centers enrolling few patients may have on the analysis of clinical trial data. A multi-center clinical trial with 20 sites is simulated to investigate the effect of a new treatment in comparison to a placebo. Twelve of these 20 investigational sites are considered small, each enrolling fewer than four patients per treatment group. Three clinical trials are simulated, with sample sizes of 100, 170 and 300. The simulated data are generated with various characteristics, one in which the treatment should be considered effective and another where the treatment is not effective. Qualitative interactions are also produced within the small sites to further investigate the effect of small centers under various conditions. Standard analysis of variance methods and the "sometimes-pool" testing procedure are applied to the simulated data. One model includes treatment and center effects and the treatment-by-center interaction; another model includes the treatment effect alone. These analyses are used to determine the power to detect treatment-by-center interactions and the probability of type I error. We find it is difficult to detect treatment-by-center interactions when only a few investigational sites enrolling a limited number of patients participate in the interaction. However, we find no increased risk of type I error in these situations. In a pooled analysis, when the treatment is not effective, the probability of finding a significant treatment effect in the absence of a significant treatment-by-center interaction is well within standard limits of type I error.
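
A stripped-down version of the simulation set-up described (20 centers, most of them small, two arms, a two-way model with a treatment-by-center interaction) can be written in a few lines. The effect sizes, variances and center sizes below are assumptions for illustration, not the dissertation's settings.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

def simulate_trial(rng, n_centers=20, n_small=12, small_n=3, large_n=15,
                   treatment_effect=0.0):
    """One simulated trial in which 'n_small' centers enrol few patients per arm."""
    rows = []
    for c in range(n_centers):
        per_arm = small_n if c < n_small else large_n
        for arm in ("placebo", "treatment"):
            effect = treatment_effect if arm == "treatment" else 0.0
            for value in rng.normal(loc=effect, scale=1.0, size=per_arm):
                rows.append({"center": f"c{c}", "arm": arm, "y": value})
    return pd.DataFrame(rows)

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    df = simulate_trial(rng, treatment_effect=0.0)      # null case: ineffective treatment
    model = smf.ols("y ~ C(arm) * C(center)", data=df).fit()
    print(anova_lm(model, typ=2)[["F", "PR(>F)"]])      # treatment, center, interaction
```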

Relevance:

60.00%

Publisher:

Abstract:

The aim of automatic pathological voice detection systems is to serve as tools for medical specialists, supporting a more objective, less invasive and improved diagnosis of diseases. The gold standard for such systems is an optimized representation of the spectral envelope for characterization, either cepstral coefficients derived from the mel-scaled Fourier spectral envelope (Mel-Frequency Cepstral Coefficients) or from an all-pole estimation (Linear Prediction Coding Cepstral Coefficients), followed by Gaussian Mixture Models for classification. However, recently proposed GMM-based classifiers and nuisance mitigation techniques, such as those employed in speaker recognition, have not been widely considered in pathology detection tasks. The present work tests whether the use of such speaker recognition tools can improve the performance of pathology detection systems, specifically for the automatic detection of Obstructive Sleep Apnea. The testing procedure employs an Obstructive Sleep Apnea database in conjunction with GMM-based classifiers in search of better performance. The results show that an improved performance can be obtained with this approach.
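
A minimal version of the baseline pipeline the abstract describes, frame-level cepstral features modelled by one Gaussian Mixture Model per class, might look as follows. This is a generic sketch assuming librosa for MFCC extraction and scikit-learn for the GMMs; it is not the authors' system, and the Obstructive Sleep Apnea database is not available here.

```python
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

def mfcc_features(wav_path, n_mfcc=13):
    """Frame-level MFCCs (spectral-envelope features) for one recording."""
    y, sr = librosa.load(wav_path, sr=None)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T   # shape (frames, n_mfcc)

def train_gmm(feature_arrays, n_components=16):
    """Fit one GMM per class on pooled frame-level features."""
    X = np.vstack(feature_arrays)
    return GaussianMixture(n_components=n_components, covariance_type="diag").fit(X)

def classify(wav_path, gmm_healthy, gmm_pathological):
    """Likelihood-ratio decision between the two class models."""
    feats = mfcc_features(wav_path)
    llr = gmm_pathological.score(feats) - gmm_healthy.score(feats)
    return ("pathological" if llr > 0.0 else "healthy"), llr
```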

Relevance:

60.00%

Publisher:

Abstract:

The focus of this thesis is to explore the deformational behavior of large rock fill materials used as rock mattress foundations for gravity caisson structures. The determination of the compressibility of large granular media rests on (i) laboratory testing, (ii) analysis of the in situ performance of rock mattress foundations for caissons, and (iii) numerical modelling. First, the results of the large-scale laboratory research program, conducted at the Geotechnical Laboratory of the Center for Studies and Experimentation for Public Works (CEDEX) to determine the deformability of large rock fill materials, are presented. The testing procedure was specifically designed to reproduce the loading sequence of in situ rubble mound foundations. A thorough analysis of the laboratory results suggests a range of compressibility for large granular media; given the scarcity of currently available information on large rock fill deformability, these values serve as a reference. Second, the research provides a procedure for evaluating the in situ deformational behavior of rock fill. Data collected from monitoring two caisson-type quays in Spain with similar structural and construction characteristics were used to study in situ rock mattress foundations, and careful interpretation of these data yields a range of in situ deformability. Published assessments of the in situ behavior of rock mattress foundations for caissons are very limited, so these results contribute to the knowledge of their field behavior. The findings also indicate an appreciable difference between the laboratory and the in situ behavior of materials from rock mattress foundations for caissons; the dissimilarities between laboratory and in situ moduli of deformation are examined in detail, and correlations between the two are proposed. Finally, a numerical model based on the syncretic method of Perucho (2004, 2008) is presented to predict the deformational behavior of large granular media. In practice, the method requires the determination of microparameters that control the macroscopic properties, which demands an extensive calibration effort; the calibration was carried out using the large-scale laboratory results described above. The resulting numerical tool is both versatile and attractive, as it reasonably predicts the compressibility of large rock fill materials.

Relevance:

60.00%

Publisher:

Abstract:

Financial management emerged at the beginning of the 19th century, together with the consolidation of large corporations and the formation of the American national markets, whereas in Brazil the first studies date from the second half of the 20th century. Since then the country has consolidated several centers of research excellence, formed a significant group of senior researchers and expanded the research areas in the field; even so, few studies attempt to portray the characteristics of scientific productivity in Finance. Seeking to contribute to a better understanding of the productive behavior of this area, the present research studies its scientific output, materialized in the form of digital articles published in 24 respected national journals classified in the Qualis/CAPES strata A2, B1 and B2 of the Administration, Accounting and Tourism area. To this end, Bradford's Law, Price's Law of Elitism and Lotka's Law are applied. Bradford's Law identifies three productivity zones, with a core formed by three journals, one of which is classified in the Qualis/CAPES B2 stratum, which highlights the limitation of a sample defined solely by the Qualis/CAPES classification. For Price's Law of Elitism, whether by straight or complete counting, we did not identify elite behavior of the kind predicted by the theory, and a large number of authors have only one publication. Applying the Generalized Inverse Power model, fitted by Ordinary Least Squares (OLS), we found that researcher productivity measured by straight counting conforms to Lotka's Law at the α = 0.01 significance level; with complete counting, however, we cannot confirm the hypothesis of homogeneity of the distributions. Moreover, in both counts the productivity parameter n is greater than 2, so the productivity of finance researchers is lower than that predicted by the theory.
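
Fitting the generalized inverse power (Lotka) model by OLS amounts to a straight-line regression on the log-log scale, log y_x = log C - n log x, where y_x is the number of authors with exactly x articles. A minimal sketch with made-up counts (not the study's data) is shown below; an estimated n greater than 2 corresponds to the lower-than-theoretical productivity reported.

```python
import numpy as np

def fit_lotka(authors_with_x_papers):
    """Fit y_x = C / x**n by OLS on the log-log scale; returns (C, n)."""
    x = np.array(sorted(authors_with_x_papers))                       # papers per author: 1, 2, 3, ...
    y = np.array([authors_with_x_papers[k] for k in sorted(authors_with_x_papers)])
    slope, intercept = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(intercept), -slope                                  # C and the exponent n

if __name__ == "__main__":
    # Hypothetical distribution: number of authors with exactly x articles.
    counts = {1: 200, 2: 55, 3: 24, 4: 14, 5: 8}
    C, n = fit_lotka(counts)
    print(f"C = {C:.1f}, n = {n:.2f}")   # n > 2 indicates lower productivity than Lotka's law
```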

Relevance:

60.00%

Publisher:

Abstract:

The problem of measuring the similarity of biological signals is considered in this article. The dynamic time warping (DTW) algorithm is examined as a possible solution. A short overview of this algorithm and its modifications is given, and a testing procedure for the different modifications of DTW, based on artificial test signals, is presented.
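
As a baseline against which modified versions can be compared on artificial signals, the classical DTW recurrence can be written directly. This is a generic textbook sketch, not the article's code; the test signals are arbitrary.

```python
import numpy as np

def dtw_distance(a, b):
    """Classical dynamic time warping distance between two 1-D signals."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three allowed warping steps
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

if __name__ == "__main__":
    t = np.linspace(0, 2 * np.pi, 60)
    reference = np.sin(t)
    warped = np.sin(t ** 1.1 / (2 * np.pi) ** 0.1)    # artificially time-warped copy
    shifted = np.sin(t) + 1.0                         # same shape, offset amplitude
    print(dtw_distance(reference, warped))            # small: the warping is absorbed
    print(dtw_distance(reference, shifted))           # larger: amplitude difference remains
```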

Relevance:

60.00%

Publisher:

Abstract:

Some of the problems arising from the inherent instability of emulsions are discussed. Aspects of emulsion stability are described, and particular attention is given to the influence of the chemical nature of the dispersed phase on adsorbed film structure and stability. Emulsion stability has been measured by a photomicrographic technique. Electrophoresis, interfacial tension and droplet rest-time data were also obtained. Emulsions were prepared using a range of oils, including aliphatic and aromatic hydrocarbons, dispersed in a solution of sodium dodecyl sulphate. In some cases a small amount of alkane or alkanol was incorporated into the oil phase. In general the findings agree with the classical view that the stability of oil-in-water emulsions is favoured by a closely packed interfacial film and an appreciable electric charge on the droplets. The inclusion of a non-ionic alcohol leads to enhanced stability, presumably owing to the formation of a "mixed" interfacial film which is more closely packed and probably more coherent than that of the anionic surfactant alone. In some instances differences in stability cannot be accounted for simply by differences in interfacial adsorption or droplet charge. Alternative explanations are discussed, and it is postulated that the coarsening of emulsions may occur not only by coalescence but also through the migration of oil from small droplets to larger ones by molecular diffusion. The viability of using the coalescence rates of droplets at a plane interface as a guide to emulsion stability has also been investigated. The construction of a suitable apparatus and the development of a standard testing procedure are described. Coalescence-time distributions may be correlated by equations similar to those presented by other workers, or by an analysis based upon the log-normal function. Stability parameters for a range of oils are discussed in terms of differences in film drainage and the nature of the interfacial film. Despite some broad correlations, there is generally poor agreement between droplet and emulsion stabilities. It is concluded that hydrodynamic factors largely determine droplet stability in the systems studied; consequently, droplet rest-time measurements do not provide a reliable indication of emulsion stability.
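
The log-normal analysis of coalescence-time (rest-time) distributions mentioned above can be sketched with standard tools; the synthetic rest times below are placeholders, not measurements from the study.

```python
import numpy as np
from scipy import stats

def fit_lognormal_rest_times(rest_times_s):
    """Fit a log-normal distribution to droplet rest times and report its median."""
    shape, loc, scale = stats.lognorm.fit(rest_times_s, floc=0.0)  # fix location at zero
    median = scale                     # with floc=0 the median equals exp(mu) = scale
    return shape, scale, median

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    rest_times = rng.lognormal(mean=2.0, sigma=0.6, size=200)   # synthetic rest times, seconds
    shape, scale, median = fit_lognormal_rest_times(rest_times)
    print(f"sigma ~ {shape:.2f}, median rest time ~ {median:.1f} s")  # near exp(2.0) ~ 7.4 s
```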