25 results for "Tradução automática" (machine translation)


Relevance:

20.00%

Publisher:

Abstract:

Millon describes the normal personality by means of adaptation styles that are effective in normal environments, and personality disorders as maladaptive functioning styles. To operationalize his theoretical model, Millon built several instruments, including the Millon Clinical Multiaxial Inventory III (MCMI-III), a self-report inventory composed of 175 true-or-false items, containing four verification scales and other scales that evaluate 14 personality patterns and 10 clinical syndromes. The Substance Dependence scale (T) is placed among the Clinical Syndromes scales. This research is justified by the lack of a Brazilian instrument to assess psychopathological aspects of personality; it aims to translate and semantically adapt the MCMI-III to the Brazilian context, to examine validity evidence for the Substance Dependence scale, and to develop a computer application to assist in the evaluation of assessment results. To this end, data were collected from 2,588 individuals, male and female, aged between 18 and 85 years, belonging to either a clinical or a non-clinical group, who took part in the survey over the internet or in person. Respondents completed the MCMI-III and a socio-demographic questionnaire, and a subgroup also answered the Goldberg General Health Questionnaire (GHQ). Besides descriptive statistics, we performed analyses using Student's t test, principal component analysis, and internal consistency. Despite difficulties in translating very specific English terms, the assessment by judges who are experts on Millon's theory, together with the back-translation, attested to the adequacy of the Brazilian version. Factor analysis grouped the translated T scale items into three factors (impairment of social activities, lack of impulse control, and oppositional behavior), with a single item loading on a fourth factor (apparently related to seeking pleasurable stimuli).

Cronbach's alpha for this set of items was 0.82, indicating acceptable scale reliability. The data analysis showed a distinction of scores between clinical and non-clinical groups and between men and women; a relationship between high scores on the T scale and the other scales; differences in the scores of drug users according to the substance they declared using; and a relationship between high T scores and the identification of disorder or risk on the GHQ mental health factor, indicating the instrument's adequate sensitivity in identifying psychopathologies and the relationships among the different disorders and psychopathological personality patterns. Although further studies are needed to develop the score transformation factors, the computerized scoring tool proved adequate.
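The internal-consistency figure reported above (a Cronbach's alpha of 0.82) can be computed directly from the item-response matrix. The sketch below is a minimal illustrative implementation, not the software developed in the study; it uses the usual sample-variance convention (dividing by n - 1).

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for a scale.

    item_scores: one list per item, each holding the scores of all
    respondents on that item (e.g. 0/1 for true-false items)."""
    k = len(item_scores)          # number of items
    n = len(item_scores[0])       # number of respondents

    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_var = sum(variance(item) for item in item_scores)
    # total score of each respondent across all items
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return (k / (k - 1)) * (1 - sum_item_var / variance(totals))
```

Two identical item columns yield an alpha of 1 (perfect consistency), while items whose disagreements cancel out in the totals drive alpha toward 0.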

Relevance:

20.00%

Publisher:

Abstract:

This work presents an extension of the haRVey prover aimed at verifying proof obligations generated according to the B method. The B method of software development covers the specification, design, and implementation phases of the software life cycle. In the verification context, the proof tools Prioni, Z/EVES, and Atelier-B/Click'n'Prove stand out. They support formalisms for checking the satisfiability of formulas in axiomatic set theory and can therefore be applied to the B method. SMT checking consists of checking the satisfiability of quantifier-free first-order formulas with respect to a decidable theory. The SMT-checking approach implemented by the automatic theorem prover haRVey is presented; it adopts a theory of arrays, which cannot express all the constructs required by set-based specifications. Thus, to extend SMT checking to set theories, the Zermelo-Fraenkel (ZFC) and von Neumann-Bernays-Gödel (NBG) set theories stand out. Given that the SMT-checking approach implemented in haRVey requires a finite theory and can be extended to undecidable theories, NBG is a suitable option for expanding haRVey's deductive capability to set theory. Thus, by mapping the set operators provided by the B language to classes of the NBG theory, an alternative SMT-checking approach applicable to the B method is obtained.

Relevance:

20.00%

Publisher:

Abstract:

Some programs may have their input data specified by formal context-free grammars. This formalization facilitates the use of tools that systematize and raise the quality of the testing process. Within this category of programs, compilers were the first to use this kind of tool to automate their tests. In this work we present an approach for defining tests from the formal description of a program's inputs. Sentence generation is performed taking into account the syntactic aspects defined by the input specification, that is, the grammar. For optimization, coverage criteria are used to limit the number of tests without diminishing their quality. Our approach uses these criteria to drive generation so that it produces sentences satisfying a specific coverage criterion. The approach is based on the Lua language, relying heavily on its coroutines and its dynamic construction of functions. With these resources, we propose a simple and compact implementation that can be optimized and controlled in different ways in order to satisfy the different implemented coverage criteria. To simplify the use of our tool, the EBNF notation was adopted for the specification of the inputs; its parser was specified in the Meta-Environment tool for rapid prototyping.
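The core idea of coverage-driven sentence generation can be sketched compactly. The toy grammar, terminals, and rule-coverage criterion below are illustrative, not taken from the thesis, and Python generators stand in for the Lua coroutines the tool actually uses: a sentence is kept only if it exercises a production rule not yet covered, and generation stops once every rule has been used.

```python
import itertools

# A toy grammar standing in for an EBNF input specification (hypothetical).
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["det", "noun"], ["noun"]],
    "VP": [["verb"], ["verb", "NP"]],
}
TERMINALS = {"det": "the", "noun": "cat", "verb": "sees"}

def expand(symbol, depth=4):
    """Yield (sentence, used-rules) for every derivation of `symbol` up to
    `depth`; generators play the role coroutines play in the Lua tool."""
    if symbol in TERMINALS:
        yield [TERMINALS[symbol]], set()
        return
    if depth == 0:
        return
    for i, rhs in enumerate(GRAMMAR[symbol]):
        for combo in itertools.product(*(list(expand(s, depth - 1)) for s in rhs)):
            words = [w for sent, _ in combo for w in sent]
            used = {(symbol, i)}.union(*(u for _, u in combo))
            yield words, used

def rule_coverage_suite(start="S"):
    """Keep a sentence only if it covers a new production rule; stop once
    every rule of the grammar has been exercised."""
    goal = {(nt, i) for nt, alts in GRAMMAR.items() for i in range(len(alts))}
    covered, suite = set(), []
    for sentence, used in expand(start):
        if used - covered:
            suite.append(" ".join(sentence))
            covered |= used
        if covered >= goal:
            break
    return suite
```

For this grammar the suite needs only three sentences to touch all five production rules, illustrating how the criterion limits the number of tests.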

Relevance:

20.00%

Publisher:

Abstract:

This dissertation aims at extending the JCircus tool, a translator of formal specifications into code that receives a Circus specification as input and translates it into Java code. Circus is a formal language whose syntax is based on those of Z and CSP. The code generated by JCircus uses JCSP, a Java API that implements CSP primitives. As JCSP does not implement all of CSP's primitives, the translation strategy from Circus to Java is not trivial: some CSP primitives, like parallelism, external choice, communication, and multi-synchronization, are only partially supported. As an additional goal, this dissertation also develops a tool for testing JCSP programs, called JCSPUnit, which will be included in the new version of JCircus. The extended version of JCircus will be called JCircus 2.0.

Relevance:

20.00%

Publisher:

Abstract:

Typically, Web services contain only syntactic information that describes their interfaces. Due to this lack of semantic descriptions, service composition becomes a difficult task. To solve this problem, Web services can exploit ontologies for the semantic definition of a service's interface, thus facilitating the automation of service discovery, publication, mediation, invocation, and composition. However, ontology languages such as OWL-S have constructs that are not easy to understand, even for Web developers, and the existing tools that support their use expose many details that make them difficult to manipulate. This paper presents an MDD tool called AutoWebS (Automatic Generation of Semantic Web Services) for developing OWL-S semantic Web services. AutoWebS uses an approach based on UML profiles and model transformations for the automatic generation of Web services and their semantic descriptions. AutoWebS offers an environment that provides many of the features required to model, implement, compile, and deploy semantic Web services.

Relevance:

20.00%

Publisher:

Abstract:

The widespread growth in the use of smart cards (by banks, transport services, cell phones, etc.) has brought an important need that must be addressed: tools that can be used to verify such cards, so as to guarantee the correctness of their software. As the vast majority of cards developed nowadays use JavaCard technology as their software layer, the use of the Java Modeling Language (JML) to specify their programs appears as a natural solution. JML is a formal language tailored to Java. It was inspired by methodologies from Larch and Eiffel and has been widely adopted as the de facto language for specifying Java programs. Various tools that make use of JML have already been developed, covering a wide range of functionality such as runtime and static checking. But the static-checking tools that exist so far are not fully automated, and those that are do not offer an adequate level of soundness and completeness. Our objective is to contribute a set of techniques that can be used to accomplish fully automated and trustworthy verification of JavaCard applets. In this work we present the first steps in this direction. Using a software platform comprising Krakatoa, Why, and haRVey, we developed a set of techniques to reduce the size of the theory necessary to verify the specifications. These techniques yielded very good results, with gains of almost 100% in all tested cases, and proved valuable not only for this problem but for most real-world problems related to automatic verification.
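JML attaches pre- and postconditions (requires/ensures clauses) to Java methods. The runtime-checking half of that idea can be sketched in Python with a decorator; this is only an illustrative analogue, not the Krakatoa/Why/haRVey pipeline (which verifies such clauses statically), and the `debit` example below is hypothetical.

```python
import functools

def contract(pre=None, post=None):
    """JML-style requires/ensures clauses as a decorator (runtime checking
    only; static verification is out of scope for this sketch)."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), f"precondition of {fn.__name__} violated"
            result = fn(*args, **kwargs)
            if post is not None:
                assert post(result, *args, **kwargs), f"postcondition of {fn.__name__} violated"
            return result
        return inner
    return wrap

# Hypothetical JavaCard-like balance update with requires/ensures clauses:
# requires amount >= 0 && balance >= amount; ensures result == balance - amount.
@contract(pre=lambda balance, amount: amount >= 0 and balance >= amount,
          post=lambda result, balance, amount: result == balance - amount)
def debit(balance, amount):
    return balance - amount
```

A call violating the precondition (debiting more than the balance) fails immediately, mirroring how a JML runtime checker flags contract violations.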

Relevance:

20.00%

Publisher:

Abstract:

The northern portion of Rio Grande do Norte State is characterized by intense coastal dynamics affecting areas whose ecosystems have moderate to high environmental sensitivity. The main socioeconomic activities of the state are installed in this region: the salt industry, shrimp farming, fruit growing, and the oil industry. The oil industry suffers the effects of coastal dynamics, which cause problems such as erosion and the exposure of wells and pipelines along the shore. Hence the interest in monitoring such modifications, in search of an understanding of the changes that cause environmental impacts, with the purpose of detecting and assessing the areas most vulnerable to these variations. Coastal areas under the influence of the oil industry are highly vulnerable and sensitive in case of accidents involving oil spills in their vicinity. Therefore, geoenvironmental monitoring of the region was established with the aim of evaluating the evolution of the entire coastal area and checking the sensitivity of each site to the presence of oil. The goal of this work was the implementation of a computer system that meets the needs of inserting and visualizing thematic maps for the generation of Environmental Vulnerability maps, using Business Intelligence (BI) techniques over vector information previously stored in the database. The fundamental design interest was to implement a scalable system that serves diverse fields of study and is suitable for generating vulnerability maps online, automating the methodology so as to facilitate data manipulation and deliver fast results for real-time operational decision-making. To develop the geographic database it was necessary to produce a conceptual model of the selected data, and the Web system was built using the PostgreSQL database system, its spatial extension PostGIS, the Glassfish web server, and GeoServer to display maps on the Web.
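Generating a vulnerability map from thematic layers typically reduces to a weighted overlay of per-cell scores. The sketch below shows that core computation on plain grids; the layers, weights, and index scale are hypothetical, and in the actual system the spatial data would live in PostGIS rather than in Python lists.

```python
def vulnerability_map(layers, weights):
    """Weighted overlay: combine thematic raster layers (equal-sized grids
    of per-cell sensitivity scores) into one vulnerability-index grid."""
    rows, cols = len(layers[0]), len(layers[0][0])
    index = [[0.0] * cols for _ in range(rows)]
    for layer, weight in zip(layers, weights):
        for r in range(rows):
            for c in range(cols):
                index[r][c] += weight * layer[r][c]
    return index
```

For example, an erosion layer and a land-use layer, each scored per cell, can be averaged with equal weights to produce the combined index consumed by the map viewer.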

Relevance:

20.00%

Publisher:

Abstract:

Objective: To translate the Mobility Assessment Tool Physical Activity (MAT-PA) and evaluate its psychometric properties in Brazilian community-dwelling older adults. Methods: This is a study of the translation, cultural adaptation, and accuracy of the MAT-PA instrument, in which 329 community-dwelling adults aged 60 years or over were evaluated. Participants completed an assessment form composed of: a socio-demographic and perceived-health questionnaire; a physical evaluation; the Leganés Cognitive Test (PCL); the Center for Epidemiologic Studies Depression Scale (CES-D); the International Physical Activity Questionnaire (IPAQ); and the Mobility Assessment Tool Physical Activity (MAT-PA). From this total sample, 42 participants wore an accelerometer for 8 days. To verify the test-retest reliability of the MAT-PA, the instrument was re-administered to 34 participants 8 days after the first evaluation. The statistical analysis used Spearman's correlation, the intraclass correlation coefficient, Cronbach's alpha coefficient, Bland-Altman analysis, and the paired t test. Results: The correlations of the IPAQ and accelerometer data with the MAT-PA total score were significant, with Spearman correlation coefficients of 0.13 and 0.41, respectively. Reliability was also analyzed, with the following measures: internal consistency, by Cronbach's alpha coefficient (α = 0.70); and test-retest agreement, by the intraclass correlation coefficient (ICC = 0.53; p < 0.001). Conclusion: The Brazilian version of the Mobility Assessment Tool Physical Activity (MAT-PA) proved to be a valid and reliable method for assessing physical activity in older adults.

Relevance:

20.00%

Publisher:

Abstract:

This research studies the application of syntagmatic analysis of texts written in Brazilian Portuguese as a methodology for the automatic creation of extractive summaries. Automatic summarization, a topic within natural language processing (NLP), studies ways in which the computer can autonomously construct summaries of texts. We start from the presupposition that teaching the computer how a language is structured, in our case Brazilian Portuguese, will help in discovering the most relevant sentences and, consequently, in building extractive summaries with higher informativeness. In this study, we propose a summarization method that automatically performs the syntagmatic analysis of texts and, through it, builds an automatic summary. The phrases that make up the syntactic structures are used to analyze the sentences of the text, and the count of these elements determines whether or not a sentence will be included in the generated summary.
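The count-and-select mechanism described above can be sketched in a few lines. For illustration, literal key-phrase counting stands in for true syntagmatic analysis (which would require a Portuguese parser), and the sentence splitter, key-phrase list, and English sample text below are all hypothetical.

```python
import re

def summarize(text, key_phrases, n=2):
    """Rank sentences by how many key phrase elements they contain and
    return the top n in their original order (extractive summary)."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s.strip()]
    def score(sentence):
        low = sentence.lower()
        return sum(low.count(p) for p in key_phrases)
    # Sort indices by descending score, breaking ties by position in the text.
    ranked = sorted(range(len(sentences)), key=lambda i: (-score(sentences[i]), i))
    return [sentences[i] for i in sorted(ranked[:n])]
```

Sentences rich in the counted elements are selected; the rest are dropped, which is exactly the inclusion decision the abstract describes.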

Relevance:

20.00%

Publisher:

Abstract:

Automatic detection of blood components is an important topic in the field of hematology. Segmentation is an important stage because it allows components to be grouped into common areas and processed separately, and leukocyte differential classification enables each type to be analyzed on its own. With automatic segmentation and differential classification, this work contributes to the analysis of blood components by providing tools that reduce manual labor while increasing accuracy and efficiency. Using digital image processing techniques combined with a generic and automatic fuzzy approach, this work proposes two Fuzzy Inference Systems, named I and II, for the automatic segmentation of blood components and the differential classification of leukocytes, respectively, in microscopic images of blood smears. Using Fuzzy Inference System I, the proposed technique segments the image into four regions: the leukocyte's nucleus and cytoplasm, the erythrocyte area, and the plasma area; using Fuzzy Inference System II on the segmented leukocyte (nucleus and cytoplasm), it classifies leukocytes into five types: basophils, eosinophils, lymphocytes, monocytes, and neutrophils. For testing, 530 images containing microscopic samples of blood smears prepared with different staining methods were used. The images were processed, and their accuracy indices against Gold Standards were calculated and compared with manual results and with other results found in the literature for the same problems. Regarding segmentation, the technique developed showed accuracy of 97.31% for leukocytes, 95.39% for erythrocytes, and 95.06% for blood plasma. As for the differential classification, accuracy varied between 92.98% and 98.39% across the different leukocyte types.

In addition to providing automatic segmentation and differential classification, the proposed technique also contributes to the definition of new descriptors and to the construction of an image database covering various hematological staining processes.
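The fuzzy step at the heart of such a segmentation can be illustrated with triangular membership functions over pixel intensity. The membership shapes, thresholds, and region names below are hypothetical (the actual Fuzzy Inference Systems I and II use richer rule bases over color images), but the pattern, fuzzify each pixel, then pick the region of highest membership, is the same.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical grayscale intensity model: dark nuclei, mid-gray cytoplasm,
# bright plasma (illustrative thresholds, not the thesis's rule base).
MEMBERSHIPS = {
    "nucleus":   lambda v: tri(v, -1, 0, 90),
    "cytoplasm": lambda v: tri(v, 60, 128, 200),
    "plasma":    lambda v: tri(v, 170, 255, 256),
}

def classify_pixel(value):
    """Assign the region with the highest membership degree (a crisp
    winner-takes-all defuzzification)."""
    return max(MEMBERSHIPS, key=lambda region: MEMBERSHIPS[region](value))
```

Overlapping memberships (here between 60 and 90, and between 170 and 200) are what make the approach robust to staining variation: the boundary is decided by degree, not by a hard threshold.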