981 results for Component method


Relevance: 70.00%

Abstract:

This paper presents a novel segmentation method for cuboidal cell nuclei in images of prostate tissue stained with hematoxylin and eosin. The proposed method segments normal, hyperplastic, and cancerous prostate images in three steps: pre-processing, segmentation of cuboidal cell nuclei, and post-processing. The pre-processing step applies contrast stretching to the red (R) channel to enhance the contrast of the cuboidal cell nuclei. The second step applies global thresholding based on minimum cross entropy to generate a binary image of candidate regions for cuboidal cell nuclei. In the post-processing step, false positives are removed using the connected component method. The proposed segmentation method was applied to an image bank of 105 samples, and its sensitivity, specificity, and accuracy were compared with those of other segmentation approaches available in the specialized literature. The results are promising and demonstrate that the proposed method segments cuboidal cell nuclei with a mean accuracy of 97%. © 2013 Elsevier Ltd. All rights reserved.
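The three-step pipeline described above can be sketched with standard image-processing primitives. This is an illustrative reconstruction, not the authors' implementation: the minimum-area value and the assumption that nuclei appear darker than the surrounding tissue are my own, and `threshold_li` is scikit-image's implementation of Li's minimum cross-entropy thresholding.

```python
import numpy as np
from skimage import exposure, filters, measure, morphology

def segment_nuclei(rgb, min_area=50):
    """Sketch of the three-step nuclei segmentation (hypothetical parameters)."""
    # Step 1, pre-processing: contrast stretching of the red (R) channel.
    r = exposure.rescale_intensity(rgb[..., 0].astype(float), out_range=(0, 1))
    # Step 2, segmentation: global threshold by minimum cross entropy (Li's
    # method); nuclei are assumed darker than the background after staining.
    t = filters.threshold_li(r)
    binary = r < t
    # Step 3, post-processing: connected-component filtering of small false
    # positives, then labelling of the surviving candidate regions.
    binary = morphology.remove_small_objects(binary, min_size=min_area)
    return measure.label(binary)
```

Applied to an H&E image array of shape `(H, W, 3)`, this returns an integer label image with one label per retained nucleus candidate.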

Relevance: 70.00%

Abstract:

Background and Objective. Ever since the human development index was published in 1990 by the United Nations Development Programme (UNDP), researchers have searched for and comparatively studied more effective methods to measure human development. Published in 1999, Lai's "Temporal analysis of human development indicators: principal component approach" provided a valuable statistical approach to human development analysis. The study presented in this thesis extends Lai's 1999 research.

Methods. I used the weighted principal component method on the human development indicators to measure and analyze the progress of human development in about 180 countries around the world from 1999 to 2010. The association between the main principal component obtained in the study and the human development index reported by the UNDP was estimated with Spearman's rank correlation coefficient. The main principal component was then applied to quantify temporal changes in the human development of selected countries by the proposed Z-test.

Results. The weighted means of all three human development indicators (health, knowledge, and standard of living) increased from 1999 to 2010. The weighted standard deviation of GDP per capita also increased across years, indicating rising inequality in the standard of living among countries. The ranking of low-development countries by the main principal component (MPC) is very similar to that by the human development index (HDI). Considerable discrepancy between the MPC and HDI rankings was found among high-development countries, with countries of high GDP per capita shifted to higher ranks. The Spearman's rank correlation coefficients between the main principal component and the human development index were all around 0.99. All of the above results are very close to the outcomes in Lai's 1999 report. The Z-test on the temporal analysis of the main principal components from 1999 to 2010 was statistically significant for Qatar, but not for the other selected countries, such as Brazil, Russia, India, China, and the U.S.A.

Conclusion. To synthesize the multi-dimensional measurement of human development into a single index, the weighted principal component method provides a good model for comprehensive ranking and measurement. The weighted main principal component index is more objective because it uses national populations as weights, more effective when the analysis spans time and space, and more flexible when the set of countries reporting to the system changes from year to year. The index generated using the weighted main principal component therefore has some advantages over the human development index in the UNDP reports.
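The core computation, a population-weighted first principal component of a countries-by-indicators matrix, can be sketched from the weighted covariance matrix as follows. The function name, and the data and weights in any use of it, are illustrative rather than taken from the thesis.

```python
import numpy as np

def weighted_pc_scores(X, w):
    """Scores on the main (first) principal component of indicator matrix X
    (rows: countries, columns: indicators), with rows weighted by w
    (e.g. population shares). Inputs here are hypothetical."""
    w = np.asarray(w, dtype=float)
    w = w / w.sum()                       # normalize weights to sum to 1
    mu = w @ X                            # weighted mean of each indicator
    Xc = X - mu                           # center the indicators
    cov = (Xc * w[:, None]).T @ Xc        # weighted covariance matrix
    vals, vecs = np.linalg.eigh(cov)      # eigendecomposition (ascending order)
    pc1 = vecs[:, -1]                     # eigenvector of the largest eigenvalue
    return Xc @ pc1                       # projection: main principal component
```

Ranking countries by these scores and comparing against the HDI ranking with Spearman's rank correlation reproduces the kind of analysis described above.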

Relevance: 60.00%

Abstract:

The force distribution inside a dovetail joint is complex: wood is loaded simultaneously in different directions on the several connected surfaces. The analytical solutions available for analyzing the behavior of these carpentry joints rely on the mechanical properties of wood. In particular, the stiffness properties of wood under compression are crucial for the force equilibrium. Simulations showed that the stiffness values assigned to the springs normally assumed in the analytical models have a great influence on the bearing capacity and stiffness of dovetail joints, with important consequences for the stress distribution over the overall structure. In a wide experimental campaign, the compression properties of the wood species most common in existing timber structures were determined. A solved example of a dovetail joint is then presented, assuming different wood species and the corresponding strength and stiffness values obtained in the tests.
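To show how the assumed stiffness values enter such a model, the bearing surfaces can be idealized as linear springs acting in parallel. Everything below is a hypothetical single-direction simplification: the moduli, areas, and depths are invented, and a real dovetail model couples several loading directions.

```python
def spring_stiffness(E, A, t):
    """Axial stiffness k = E*A/t of one bearing surface idealized as a spring.
    E: modulus of elasticity in compression, A: contact area, t: deformed depth.
    """
    return E * A / t

# Hypothetical dovetail notch with two bearing surfaces sharing one
# displacement (springs in parallel); stiffer surfaces carry more load.
springs = [spring_stiffness(E, A, t) for E, A, t in [
    (9.0e9, 0.004, 0.05),   # notch front: compression parallel to the grain
    (0.5e9, 0.006, 0.05),   # heel: compression perpendicular to the grain
]]
k_total = sum(springs)       # parallel springs: stiffnesses add
F = 10e3                     # applied force [N]
u = F / k_total              # common joint displacement [m]
shares = [k * u / F for k in springs]   # fraction of F carried by each surface
```

Changing a species' compression modulus changes the load shares, which is why the experimentally determined stiffness values matter for the stress distribution.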

Relevance: 60.00%

Abstract:

Objectives. The general objective of this study is to determine whether the Occupational Cognitive Failures Questionnaire (OCFQ), developed by Allahyari T. et al. (2011), has cross-cultural validity and could be a reliable and valid instrument, adaptable to the Colombian cultural context, for assessing cognitive failures in the workplace. Methodology. The translation, adaptation, and validation of the Occupational Cognitive Failures Questionnaire (OCFQ) for the Colombian cultural context were carried out following the World Health Organization (WHO) recommendations for the translation and adaptation of instruments, followed by an evaluation of the reliability and validity of the adapted instrument, in four stages: Stage 1, translation and back-translation; Stage 2, debriefing and readability analysis; Stage 3, content validity, using the Content Validity Index (CVI); and Stage 4, evaluation of metric properties. Construct validity was assessed by factor analysis using the principal component method with varimax rotation; internal consistency and temporal stability were evaluated with Cronbach's alpha (α) and test-retest, respectively. Results. The OCFQ was adapted to the Colombian cultural context; the readability analysis determined that, according to its grade on the Inflesz scale, the questionnaire is quite easy to read. Starting from the original 30-item version, a new 25-item version was obtained, since five items were rejected after the content validity evaluation. The Content Validity Index for the final version of the adapted OCFQ is acceptable (CVI = 0.84). The metric tests show that the final version of the adapted OCFQ has good internal consistency (α = 0.90), and the intraclass correlation coefficient (ICC) was 0.91, showing very good temporal stability. Factor analysis established four factors for the OCFQ, explaining 47% of the total variance. Conclusion. Assessing cognitive failures in the workplace requires a valid and reliable tool. According to the results of this study, the adapted OCFQ could be a suitable instrument for measuring cognitive failures in the workplace in industrial plants in the Colombian cultural context.
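The internal-consistency statistic evaluated in Stage 4 can be computed directly from an item-score matrix. The function below is the standard Cronbach's alpha formula; the data used with it here are synthetic, not the study's.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an items matrix (rows: respondents, cols: items).
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                       # number of items
    item_vars = items.var(axis=0, ddof=1)    # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' totals
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Values around 0.90, as reported for the adapted OCFQ, indicate that the items measure a common construct consistently.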

Relevance: 60.00%

Abstract:

Testing is nowadays the most widely used technique to validate software and assess its quality. It is integrated into all practical software development methodologies and plays a crucial role in the success of any software project. From the smallest units of code to the most complex components, their integration into a software system, and later deployment, all pieces of a software product must be tested thoroughly before the product can be released to a production environment. The main limitation of software testing is that it remains a mostly manual task, representing a large fraction of the total development cost. In this scenario, test automation is paramount to alleviate such high costs. Test case generation (TCG) is the process of automatically generating test inputs that achieve high coverage of the system under test. Among a wide variety of approaches to TCG, this thesis focuses on structural (white-box) TCG, where one of the most successful enabling techniques is symbolic execution. In symbolic execution, the program under test is executed with its input arguments being symbolic expressions rather than concrete values. This thesis relies on a previously developed constraint-based TCG framework for imperative object-oriented programs (e.g., Java), in which the imperative program under test is first translated into an equivalent constraint logic program, and the translated program is then symbolically executed using the standard evaluation mechanisms of constraint logic programming (CLP), extended with special treatment for dynamically allocated data structures.
Improving the scalability and efficiency of symbolic execution constitutes a major challenge. It is well known that symbolic execution quickly becomes impractical due to the large number of paths that must be explored and the size of the constraints that must be handled. Moreover, symbolic-execution-based TCG tends to produce an unnecessarily large number of test cases when applied to medium or large programs. The contributions of this dissertation can be summarized as follows. (1) A compositional approach to CLP-based TCG is developed which overcomes the inter-procedural path explosion by separately analyzing each component (method) of the program under test, storing the results as method summaries, and incrementally reusing them to obtain whole-program results. A similar compositional strategy, based on program specialization (partial evaluation), is also developed for the state-of-the-art symbolic execution tool Symbolic PathFinder (SPF). (2) Resource-driven TCG is proposed as a methodology that uses resource consumption information to drive symbolic execution towards those parts of the program under test that comply with a user-provided resource policy, avoiding the exploration of the parts that violate it. (3) A generic methodology is proposed to guide symbolic execution towards the most interesting parts of a program, using abstractions as oracles to steer the execution according to structural selection criteria. (4) A new heap-constraint solver is proposed which efficiently handles heap-related constraints and aliasing of references during symbolic execution, and greatly outperforms the standard technique known as lazy initialization. (5) All of the above techniques have been implemented in the PET system (the compositional approach also in the SPF tool). Experimental evaluation has confirmed that they considerably improve the scalability and efficiency of symbolic execution and test case generation.
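The core idea of symbolic-execution-based TCG — enumerate paths with their path conditions, then solve the conditions for concrete inputs — can be illustrated with a toy example. Everything below is illustrative: the hand-built branch tree stands in for the CLP translation of a real program, and brute-force search stands in for the constraint solver.

```python
from itertools import product

# Branch structure of a toy program under test:
# internal node = (predicate, source text, true-subtree, false-subtree);
# a string leaf names the path's outcome.
tree = (lambda x, y: x > y, "x > y",
        (lambda x, y: x > 0, "x > 0", "ret A", "ret B"),
        "ret C")

def paths(node, cond=()):
    """Symbolically enumerate root-to-leaf paths with their path conditions."""
    if isinstance(node, str):
        yield cond, node
        return
    pred, text, t_branch, f_branch = node
    yield from paths(t_branch, cond + ((pred, text, True),))
    yield from paths(f_branch, cond + ((pred, text, False),))

def find_input(cond, domain=range(-3, 4)):
    """Concrete input satisfying a path condition; brute-force search over a
    small domain stands in for a real constraint solver."""
    for x, y in product(domain, repeat=2):
        if all(bool(pred(x, y)) == want for pred, _, want in cond):
            return (x, y)
    return None  # infeasible path within the searched domain

# One test input per feasible path = a path-coverage test suite.
test_suite = [(find_input(cond), leaf) for cond, leaf in paths(tree)]
```

The scalability problems the thesis attacks are visible even here: the number of paths grows exponentially with branch depth, which motivates compositional summaries and guided exploration.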

Relevance: 60.00%

Abstract:

This thesis addresses two main objectives. The first involved stereochemical studies of chiral 4,6-diamino-1-aryl-1,2-dihydro-s-triazines and an investigation of how the different conformations of these stereoisomers may affect their binding affinity to the enzyme dihydrofolate reductase (DHFR). The ortho-substituted 1-aryl-1,2-dihydro-s-triazines were synthesised by the three component method. An ortho-substitution at the C6' position was observed when meta-azidocycloguanil was decomposed in acid. The ortho-substituent restricts free rotation, which gives rise to atropisomerism. Ortho-substituted 4,6-diamino-1-aryl-2-ethyl-1,2-dihydro-2-methyl-s-triazine contains two elements of chirality and therefore exists as four stereoisomers: (S,aR), (R,aS), (R,aR) and (S,aS). The energy barriers to rotation of these compounds were calculated with the semi-empirical molecular orbital program MOPAC and were found to be in excess of 23 kcal/mol. The diastereoisomers were resolved and enriched by C18 reversed phase h.p.l.c. Nuclear Overhauser effect experiments revealed that (S,aR) and (R,aS) were the more stable pair of stereoisomers and therefore existed as the major components. The minor diastereoisomers showed greater binding affinity for rat liver DHFR in an in vitro assay. The second objective entailed investigating the possibility of retaining DHFR inhibitory activity by replacing the classical diamino heterocyclic moiety with an amidinyl group. 4-Benzylamino-3-nitro-N,N-dimethyl-phenylamidine was synthesised in two steps. One of the two phenylamidines showed weak inhibition of rat liver DHFR. This weak activity may be due to the failure of the inhibitor molecule to form strong hydrogen bonds with residue Glu-30 at the active site of the enzyme.

Relevance: 60.00%

Abstract:

Classification schemes undergo revision. However, in a networked environment revisions can be used to add dimensionality to classification. This dimensionality can be used to help explain conceptual warrant, explain the shift from disciplinary to multidisciplinary knowledge production, and as a component method of domain analysis. Further, subject ontogeny might be used in cooperative networked projects like digital preservation, online access tools, and interoperability frameworks.

Relevance: 60.00%

Abstract:

This paper aims to establish possible tourism demand scenarios for European travellers to Portugal based on their relationship with changing population structures. A combination of the EUROBAROMETER 370 report ("Attitudes of Europeans towards Tourism in 2013") and the cohort-component method for population projections allows the development of different possible tourism demand scenarios. According to the European report, individuals who travelled in 2013 were most likely to live in a household with two or more individuals. Thus, if elderly couples stay together until later in life and in better physiological shape, it is possible that the number of elderly individuals travelling for tourism purposes will increase in the near future. If tourists from developing countries can be expected to be younger, owing to their demographic dynamics, than those from developed countries, where the ageing population is growing fast, then the percentage of the elderly among tourists can be expected to increase. Furthermore, the 2013 European report found that the combination of socio-demographic variables such as age, population, gender, household dimension, country of residence, and trip purpose explained tourism demand scenarios for Portugal, confirming that seniors and families are of paramount importance for the destination. The literature lacks discussion of the future effects of demography and of the role of an ageing population in tourism demand choice patterns, and we aim to help fill this gap. We therefore believe this paper contributes to the literature by introducing a new field of discussion on the importance of demographic changes in shaping travel trends.
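One step of the cohort-component projection method used above can be sketched in a minimal single-sex form. The age groups, survival ratios, fertility rates, and migration figures in any use of this function are hypothetical, and the open-ended oldest age group is simplified away.

```python
import numpy as np

def cohort_component_step(pop, survival, fertility, migration):
    """One projection period of the cohort-component method.
    pop[i]:       persons in age group i
    survival[i]:  share of group i alive in group i+1 next period
    fertility[i]: births per person in group i over the period
    migration[i]: net migrants entering group i during the period"""
    nxt = np.zeros_like(pop, dtype=float)
    nxt[1:] = pop[:-1] * survival[:-1]   # surviving cohorts age one group
    nxt[0] = np.sum(pop * fertility)     # newborn cohort from age-specific fertility
    return nxt + migration               # add net migration by age group
```

Iterating this step with scenario-specific rates yields the alternative future population structures from which tourism demand scenarios can be derived.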


Relevance: 40.00%

Abstract:

Modular modelling, dynamics simulation, multibodies, O(N) method, closed loops, post-stabilization

Relevance: 40.00%

Abstract:

Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience, so automatic detection of these EEG patterns would greatly assist quantitative analysis and interpretation. We present a method that allows automatic detection of epileptiform events and discrimination of them from eye blinks, based on features derived using a novel application of independent component analysis. The algorithm was trained and cross-validated on seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 ± 22% at a specificity of 86 ± 7% (mean ± SD). With feature extraction by PCA or classification of raw data, specificity fell to 76% and 74%, respectively, at the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% with a concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool intended to be generalized to the simultaneous classification of many waveforms in the EEG.
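The unmixing step at the heart of such a feature extractor can be sketched with FastICA. This toy example separates two synthetic sources from two mixed channels; it is not the authors' trained detector — real EEG channels, eye-blink handling, and the classifier are all omitted, and the signals are invented.

```python
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 1, 500)
sources = np.c_[np.sin(2 * np.pi * 5 * t),             # smooth "background" rhythm
                np.sign(np.sin(2 * np.pi * 3 * t))]    # sharp "event-like" source
mixing = np.array([[1.0, 0.5],
                   [0.4, 1.0]])                        # unknown mixing into channels
mixed = sources @ mixing.T                             # two observed channels

ica = FastICA(n_components=2, random_state=0)
estimated = ica.fit_transform(mixed)                   # recovered independent components
```

Features for the epileptiform-vs-eye-blink classifier would then be derived from such recovered components rather than from the raw channels.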

Relevance: 40.00%

Abstract:

A modified Bargmann-Wigner method is used to derive (6s + 1)-component wave equations. The relation between different forms of these equations is shown.

Relevance: 40.00%

Abstract:

In this paper, a novel method for power quality signal decomposition is proposed based on independent component analysis (ICA). The method aims to decompose the power system signal (voltage or current) into components that provide more specific information about the different disturbances occurring simultaneously in a multiple-disturbance situation. ICA is originally a multichannel technique; here, however, it is used to blindly separate the disturbances present in a single measured signal (single channel). A filter-bank preprocessing step for the ICA is therefore proposed. The proposed method was applied to synthetic data, simulated data, and actual power system signals, showing very good performance. A comparison with the decomposition provided by the discrete wavelet transform shows that the proposed method achieved better decoupling on the analyzed data. (C) 2012 Elsevier Ltd. All rights reserved.
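The preprocessing idea — turning one measured channel into several band-limited channels so that a multichannel technique like ICA can be applied — can be sketched as follows. The band edges and filter order are hypothetical choices, not those of the paper.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def filter_bank(x, fs, bands):
    """Split a single-channel signal x (sampled at fs Hz) into one band-limited
    channel per (low, high) passband, producing a pseudo-multichannel input
    suitable for ICA. Filter order 4 is an illustrative choice."""
    channels = []
    for low, high in bands:
        sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
        channels.append(sosfiltfilt(sos, x))   # zero-phase band-pass filtering
    return np.vstack(channels)
```

For a 50/60 Hz power signal, bands would typically be placed around the fundamental, low-order harmonics, and higher-frequency transient ranges before handing the stacked channels to ICA.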

Relevance: 40.00%

Abstract:

In this short review, we provide some new insights into the material synthesis and characterization of modern multi-component superconducting oxides. Two different approaches, the high-pressure, high-temperature method and ceramic combinatorial chemistry, are reported and applied to several typical examples. First, we highlight the key role of extreme conditions in the growth of Fe-based superconductors, where careful control of the composition-structure relation is vital for understanding the microscopic physics. The availability of high-quality LnFeAsO (Ln = lanthanide) single crystals with substitution of O by F, Sm by Th, Fe by Co, and As by P allowed us to measure intrinsic and anisotropic superconducting properties such as Hc2 and Jc. Furthermore, we demonstrate that combinatorial ceramic chemistry is an efficient way to search for new superconducting compounds. A single-sample synthesis concept based on multi-element ceramic mixtures can produce a variety of local products; such a system requires local probe analyses and separation techniques to identify compounds of interest. We present the results obtained from random mixtures of Ca, Sr, Ba, La, Zr, Pb, Tl, Y, Bi, and Cu oxides reacted under different conditions. By adding Zr but removing Tl, Y, and Bi, bulk superconductivity was enhanced up to about 122 K.