999 results for Natural language processing (Computing)


Relevance:

30.00%

Publisher:

Abstract:

The development of interactive systems involves several professionals, and the integration between them normally relies on common artifacts, such as models, that drive the development process. In the model-driven development approach, the interaction model is an artifact that covers most aspects of what the user can do, and how, while interacting with the system. Furthermore, the interaction model may be used to identify usability problems at design time. The central problem addressed by this thesis is therefore twofold: first, interaction modeling, from a perspective that helps the designer make explicit, to the developer who will implement the interface, the aspects related to the interaction process; second, the early identification of usability problems, which aims to reduce the final cost of the application. To achieve these goals, this work presents (i) the ALaDIM language, which helps the designer in the conception, representation and validation of interactive message models; (ii) the ALaDIM editor, built using the EMF (Eclipse Modeling Framework) and the technologies standardized by the OMG (Object Management Group); and (iii) the ALaDIM inspection method, which allows the early identification of usability problems using ALaDIM models. The ALaDIM language and editor were respectively specified and implemented using the OMG standards and can be used in MDA (Model Driven Architecture) activities. Beyond that, we evaluated both the ALaDIM language and the editor using a CDN (Cognitive Dimensions of Notations) analysis. Finally, this work reports an experiment that validated the ALaDIM inspection method.
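
As context for such an editor, EMF tooling is typically driven by an Ecore metamodel. A tiny programmatic fragment in Java follows; it is purely illustrative, with hypothetical ALaDIM-flavored names, and is not the thesis's actual metamodel:

```java
import org.eclipse.emf.ecore.EAttribute;
import org.eclipse.emf.ecore.EClass;
import org.eclipse.emf.ecore.EPackage;
import org.eclipse.emf.ecore.EcoreFactory;
import org.eclipse.emf.ecore.EcorePackage;

// Illustrative-only fragment: building a tiny Ecore metamodel with the EMF
// API. The package, class and attribute names are hypothetical.
public class MiniMetamodel {
    public static EPackage build() {
        EcoreFactory f = EcoreFactory.eINSTANCE;

        EPackage pkg = f.createEPackage();
        pkg.setName("aladim");
        pkg.setNsURI("http://example.org/aladim"); // hypothetical namespace

        EClass message = f.createEClass();
        message.setName("InteractiveMessage");

        EAttribute name = f.createEAttribute();
        name.setName("name");
        name.setEType(EcorePackage.Literals.ESTRING);
        message.getEStructuralFeatures().add(name);

        pkg.getEClassifiers().add(message);
        return pkg;
    }
}
```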

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To describe the performance of children with Specific Language Impairment (SLI) in reading, writing, arithmetic, phonological awareness and auditory sequential memory tasks, and to verify whether there is a positive association between the tests that assess school learning and those that assess information processing. METHODS: Twenty subjects diagnosed with SLI, aged 7 to 12 years, were given the Teste de Desempenho Escolar (TDE) and two tests that assess information processing (the Perfil de Habilidades Fonológicas and the Auditory Sequential Memory subtest of the Illinois Test of Psycholinguistic Abilities - ITPA). RESULTS: Most subjects showed impairment in all of the tests. The associations between the group's performance in the different tests show that metaphonological ability had a statistically significant association with reading (p=0.02) and writing (p=0.02) abilities. In turn, auditory sequential memory was significantly associated only with arithmetic (p=0.0003). CONCLUSION: School performance, as well as phonological awareness and short-term memory abilities, was impaired in most of the subjects assessed, with positive associations between the short-term memory test and the arithmetic test, and between the phonological awareness test and the reading and writing tests. In this context, these findings support the use of intervention programs based on psycholinguistic models, which suggest individualized strategies for developing metaphonological abilities.

Relevance:

30.00%

Publisher:

Abstract:

In scientific computing, data must be as precise and accurate as possible; however, the imprecision of the input data of this kind of computation may come from measurements obtained by equipment that provides truncated or rounded values, so that computations over these data produce imprecise results. The most common errors in scientific computing are truncation errors, which arise when infinite data are truncated or interrupted, and rounding errors, which are responsible for the imprecision of computations over finite sequences of arithmetic operations. To address this kind of problem, Moore introduced interval mathematics in the 1960s, defining a data type that makes it possible to work with continuous data and even to bound the maximum size of the error. Interval mathematics is a way out of this issue, since it allows automatic error control and analysis. However, the algebraic properties of intervals are not the same as those of the real numbers, even though the reals can be seen as degenerate intervals and the algebraic properties of degenerate intervals are exactly those of the reals. Starting from this, and considering algebraic specification techniques, a language is needed that can implement an auxiliary notion of equivalence introduced by Santiago [6] which "simulates" the algebraic properties of the real numbers over intervals. CASL, the Common Algebraic Specification Language [1], is an algebraic specification language for the description of functional requirements and modular software designs, developed by CoFI, the Common Framework Initiative [2], since 1996. The development of CASL is still in progress and represents a joint effort of leading researchers in algebraic specification to create a standard for the area. This dissertation presents a CASL specification of the interval type, equipped with Moore arithmetic, so that it can extend systems that manipulate continuous data, making possible not only the control and analysis of approximation errors but also the algebraic verification of properties of such systems. The interval specification presented here was built from the specification of the rational numbers proposed by Mossakowski in 2001 [3] and introduces the notion of local equality proposed by Santiago [6, 5, 4].
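
For reference, Moore's basic interval operations can be sketched in Java. This is a minimal illustration of the arithmetic being specified, not part of the dissertation's CASL specification, and the class and method names are ours:

```java
// Minimal sketch of Moore's interval arithmetic (illustrative only; a
// faithful implementation would also apply outward/directed rounding to
// the endpoint computations so that results always enclose the true value).
public final class Interval {
    public final double lo, hi; // invariant: lo <= hi

    public Interval(double lo, double hi) {
        if (lo > hi) throw new IllegalArgumentException("lo > hi");
        this.lo = lo; this.hi = hi;
    }

    // [a,b] + [c,d] = [a+c, b+d]
    public Interval add(Interval o) { return new Interval(lo + o.lo, hi + o.hi); }

    // [a,b] - [c,d] = [a-d, b-c]
    public Interval sub(Interval o) { return new Interval(lo - o.hi, hi - o.lo); }

    // [a,b] * [c,d] = [min of endpoint products, max of endpoint products]
    public Interval mul(Interval o) {
        double[] p = { lo * o.lo, lo * o.hi, hi * o.lo, hi * o.hi };
        double min = p[0], max = p[0];
        for (double v : p) { min = Math.min(min, v); max = Math.max(max, v); }
        return new Interval(min, max);
    }

    @Override public String toString() { return "[" + lo + ", " + hi + "]"; }
}
```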

Relevance:

30.00%

Publisher:

Abstract:

This work presents JFloat, a software implementation of the IEEE-754 standard for binary floating-point arithmetic. JFloat was built to provide features not implemented in Java, specifically directed rounding support, a feature that is important for Java-XSC, a project developed in this department. Furthermore, Java programs should be portable when using floating-point operations, since IEEE-754 specifies that programs should behave exactly the same on every configuration; however, it was observed that Java programs using the native floating-point types may be machine and operating system dependent. JFloat is a possible solution to that problem as well.
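
Java's primitive float and double operations always use round-to-nearest and expose no rounding-mode control, which is the gap JFloat addresses. One workaround style can be illustrated with java.math.BigDecimal; this is purely our example and says nothing about JFloat's actual API:

```java
import java.math.BigDecimal;
import java.math.MathContext;
import java.math.RoundingMode;

// Illustration only: emulating directed rounding with BigDecimal.
// Native double arithmetic offers no rounding-mode control in Java.
public class DirectedRoundingDemo {
    public static void main(String[] args) {
        BigDecimal a = new BigDecimal("1");
        BigDecimal b = new BigDecimal("3");
        // Quotient rounded toward -infinity and toward +infinity,
        // both at 20 significant digits.
        BigDecimal down = a.divide(b, new MathContext(20, RoundingMode.FLOOR));
        BigDecimal up   = a.divide(b, new MathContext(20, RoundingMode.CEILING));
        System.out.println("1/3 rounded down: " + down);
        System.out.println("1/3 rounded up:   " + up);
    }
}
```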

Relevance:

30.00%

Publisher:

Abstract:

The constant increase in the complexity of computer applications demands ever more powerful hardware to support them. With processors' operating frequencies reaching their limit, the most viable solution is the use of parallelism. The concept of MPSoCs (Multi-Processor Systems-on-Chip) builds on parallelism techniques and on the progressive growth of transistor integration capacity in a single chip. MPSoCs will eventually become a cheaper and faster alternative to supercomputers and clusters, and applications developed for those high-performance systems will migrate to computers equipped with MPSoCs containing dozens to hundreds of computation cores. In particular, applications in the area of oil and natural gas exploration, also characterized by the high processing capacity they require, would benefit greatly from such systems. This work evaluates a traditional and complex application of the oil and gas industry, known as reservoir simulation, on an integrated computational system in a single chip with hundreds of functional units. Since the STORM (MPSoC Directory-Based Platform) platform already had a shared memory model, a new distributed memory model was developed, and a message passing library following the MPI standard was implemented.
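
As a conceptual illustration of MPI-style point-to-point messaging, blocking send and receive can be mimicked in Java with bounded queues. This sketch is ours; the thesis's library targets the STORM platform and follows the MPI standard rather than this toy model:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Conceptual analogue of MPI-style blocking send/recv between two "cores",
// modeled here as threads exchanging messages over a bounded channel.
public class MessagePassingDemo {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<int[]> channel = new ArrayBlockingQueue<>(1);

        Thread worker = new Thread(() -> {
            try {
                int[] block = channel.take(); // like MPI_Recv: blocks until data arrives
                int sum = 0;
                for (int v : block) sum += v;
                System.out.println("worker received block, sum = " + sum);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        worker.start();

        channel.put(new int[] {1, 2, 3, 4});  // like MPI_Send: blocks if channel is full
        worker.join();
    }
}
```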

Relevance:

30.00%

Publisher:

Abstract:

Research in Requirements Engineering has been growing in the last few years. Researchers are concerned with a set of open issues, such as communication among the several user profiles involved in software engineering, scope definition, and volatility and traceability issues. To cope with these issues, existing works concentrate on (i) defining processes to collect clients' specifications in order to solve scope issues; (ii) defining models to represent requirements, addressing communication and traceability issues; and (iii) working on mechanisms and processes to be applied to requirements modeling in order to facilitate requirements evolution and maintenance, addressing volatility and traceability issues. We propose an iterative model-driven process to solve these issues, based on a double-layered CIM to communicate requirements-related knowledge to a wider range of stakeholders. We also present a tool to help the requirements engineer throughout the RE process. Finally, we present a case study to illustrate the benefits and usage of the process and tool.

Relevance:

30.00%

Publisher:

Abstract:

Nowadays many electronic devices support digital video; examples include cell phones, digital cameras, video cameras and digital televisions. However, raw video comprises a huge amount of data, millions of bits, when represented just as captured. Storing it in this primary form would require a huge amount of disk space, and transmitting it would require a huge bandwidth. Video compression is therefore essential to make the storage and transmission of this information possible. Motion estimation is a technique used in video coders that exploits the temporal redundancy present in video sequences to reduce the amount of data necessary to represent the information. This work presents a hardware architecture of a motion estimation module for high-resolution video according to the H.264/AVC standard. H.264/AVC is the most advanced video coding standard, with several new features that allow it to achieve high compression rates. The architecture presented in this work was developed to provide high data reuse; the adopted data reuse scheme reduces the bandwidth required to perform motion estimation. Motion estimation is the task responsible for the largest share of the gains obtained with the H.264/AVC standard, so this module is essential for the final coder's performance. This work is part of the Rede H.264 project, which aims to develop Brazilian technology for the Brazilian Digital Television System.
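
For context, the core of block-based motion estimation is a search that minimizes a distortion metric, commonly the sum of absolute differences (SAD), between a block of the current frame and candidate blocks in a reference frame. A minimal full-search sketch in Java follows; it is our software illustration, whereas the thesis implements the search in hardware:

```java
// Minimal full-search block matching using the sum of absolute differences (SAD).
public class BlockMatching {

    // SAD between an n x n block at (bx,by) in cur and at (rx,ry) in ref.
    static int sad(int[][] cur, int[][] ref, int bx, int by, int rx, int ry, int n) {
        int sum = 0;
        for (int y = 0; y < n; y++)
            for (int x = 0; x < n; x++)
                sum += Math.abs(cur[by + y][bx + x] - ref[ry + y][rx + x]);
        return sum;
    }

    // Exhaustively search a +/-range window around (bx,by) in the reference
    // frame, returning the motion vector {dx, dy} with minimum SAD.
    static int[] fullSearch(int[][] cur, int[][] ref, int bx, int by, int n, int range) {
        int bestDx = 0, bestDy = 0, bestSad = Integer.MAX_VALUE;
        for (int dy = -range; dy <= range; dy++) {
            for (int dx = -range; dx <= range; dx++) {
                int rx = bx + dx, ry = by + dy;
                if (rx < 0 || ry < 0 || ry + n > ref.length || rx + n > ref[0].length)
                    continue; // candidate block falls outside the frame
                int s = sad(cur, ref, bx, by, rx, ry, n);
                if (s < bestSad) { bestSad = s; bestDx = dx; bestDy = dy; }
            }
        }
        return new int[] { bestDx, bestDy };
    }
}
```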

Relevance:

30.00%

Publisher:

Abstract:

This work presents the concept, design and implementation of an MP-SoC platform named STORM (MP-SoC DirecTory-Based PlatfORM). Currently the platform is composed of the following modules: SPARC V8 processor, GPOP processor, cache module, memory module, directory module and two different models of Network-on-Chip, NoCX4 and Obese Tree. All modules were implemented in SystemC, simulated and validated, individually and in groups, and their descriptions are presented in detail. To program the platform in C, a SPARC assembler was implemented, fully compatible with the assembly code generated by gcc. For parallel programming, a library for mutex management was implemented, relying on the assembler's support. A total of 10 simulations of increasing complexity are presented to validate the concepts; they include real parallel applications such as matrix multiplication, Mergesort, KMP, motion estimation and 2D DCT.
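
For illustration, the essence of such a mutex library is an atomic test-and-set loop. A conceptual Java analogue is sketched below; the platform's version is built on SPARC assembly support rather than on Java primitives:

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Conceptual spinlock analogue of a test-and-set mutex. On an MP-SoC this
// would rest on the processor's atomic instructions; here we use Java's
// AtomicBoolean purely for illustration.
public class SpinLock {
    private final AtomicBoolean locked = new AtomicBoolean(false);

    public void lock() {
        // Spin until we atomically flip the flag from false to true.
        while (!locked.compareAndSet(false, true)) {
            Thread.onSpinWait(); // hint to the runtime that we are busy-waiting
        }
    }

    public void unlock() {
        locked.set(false);
    }
}
```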

Relevance:

30.00%

Publisher:

Abstract:

The academic community and the software industry have shown, in recent years, substantial interest in approaches and technologies related to model-driven development (MDD). At the same time, industry continues its relentless pursuit of technologies that raise productivity and quality in the development of software products. This work explores those two statements through an experiment carried out using MDD technology, evaluating its use in solving a real problem in the security context of enterprise systems. By building and using a tool, a visual DSL named CALV3, inspired by the software factory approach (a synergy between software product lines, domain-specific languages and MDD), we evaluate the gains in abstraction and productivity through a systematic case study conducted with a development team. The results and lessons learned from the evaluation of this tool within industry are the main contributions of this work.

Relevance:

30.00%

Publisher:

Abstract:

The field of Wireless Sensor and Actuator Networks (WSAN) is growing fast and has attracted the interest of both the research community and industry, because of factors such as the applicability of these networks in different domains (aviation, civil engineering, medicine, and others). Moreover, advances in wireless communication and the reduction in the size of hardware components have also contributed to the fast spread of these networks. However, several challenges and open issues still need to be tackled for WSANs to achieve their full potential. The development of WSAN systems is one of the most relevant of these challenges, given the number of variables involved in the process. Currently, a broad range of WSAN platforms and low-level programming languages are available for building WSAN systems, so developers must deal with the details of different sensor platforms and the low-level programming abstractions of sensor operating systems on one hand, while also needing specific (high-level) knowledge of the distinct application domains on the other. Therefore, in order to decouple the handling of these two levels of knowledge and make the development of WSAN systems easier, we propose LWiSSy (Domain Language for Wireless Sensor and Actuator Networks Systems), a domain-specific language (DSL) for WSAN. The use of DSLs raises the level of abstraction when programming such systems and modularizes system construction into several steps. Thus, LWiSSy allows domain experts to contribute directly to the development of WSANs without knowledge of low-level sensor platforms, and network experts to program sensor nodes to meet application requirements without specific knowledge of the application domain. Additionally, LWiSSy enables the decomposition of the system into different levels of abstraction according to structural and behavioral features and granularities (network, node-group and single-node level programming).

Relevance:

30.00%

Publisher:

Abstract:

Web services are software components, accessible via the Internet, that provide functionality to be used by applications. Today, it is natural to reuse third-party services to compose new services. This composition process can occur in two styles, called orchestration and choreography. A choreography represents a collaboration between services which know their partners in the composition in order to achieve the desired functionality. An orchestration, on the other hand, has a central process (the orchestrator) that coordinates all application operations. Our work is placed in this latter context, proposing an abstract model for running service orchestrations. For this purpose, a graph reduction machine is defined for the execution of service orchestrations specified in a variant of the PEWS composition language. Moreover, a prototype of this machine (in Java) was built as a proof of concept.
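
To give a flavor of reduction-based execution of a composition, consider a toy term language with sequential and parallel composition over service invocations. The sketch below (Java 21+) is entirely ours; PEWS and the thesis's reduction machine are considerably richer:

```java
import java.util.List;

// Toy reduction of a service-composition tree: Seq runs operands in order,
// Par runs them side by side, Invoke is a leaf service call.
public class OrchestrationDemo {
    sealed interface Term permits Invoke, Seq, Par {}
    record Invoke(String service) implements Term {}
    record Seq(Term left, Term right) implements Term {}
    record Par(Term left, Term right) implements Term {}

    // Reduce a term to completion, printing each service invocation.
    static void reduce(Term t) {
        switch (t) {
            case Invoke i -> System.out.println("invoke " + i.service());
            case Seq s -> { reduce(s.left()); reduce(s.right()); }
            case Par p -> List.of(p.left(), p.right())
                              .parallelStream()
                              .forEach(OrchestrationDemo::reduce);
        }
    }

    public static void main(String[] args) {
        // (bookFlight ; (bookHotel || bookCar)) ; sendInvoice
        Term plan = new Seq(new Seq(new Invoke("bookFlight"),
                                    new Par(new Invoke("bookHotel"),
                                            new Invoke("bookCar"))),
                            new Invoke("sendInvoice"));
        reduce(plan);
    }
}
```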

Relevance:

30.00%

Publisher:

Abstract:

The widespread growth in the use of smart cards (by banks, transport services, cell phone operators, etc.) has brought an important need that must be addressed: tools that can be used to verify such cards, so as to guarantee the correctness of their software. As the vast majority of cards being developed nowadays use the JavaCard technology as their software layer, the use of the Java Modeling Language (JML) to specify their programs appears as a natural solution. JML is a formal language tailored to Java. It was inspired by methodologies from Larch and Eiffel, and has been widely adopted as the de facto language for specifying Java-related programs. Various tools that make use of JML have already been developed, covering a wide range of functionality, such as runtime and static checking. But the existing tools for static checking are not fully automated, and those that are do not offer an adequate level of soundness and completeness. Our objective is to contribute a series of techniques for the fully automated and confident verification of JavaCard applets, and in this work we present the first steps in that direction. Using a software platform comprising Krakatoa, Why and haRVey, we developed a set of techniques to reduce the size of the theory necessary to verify the specifications. These techniques have yielded very good results, with gains of almost 100% in all tested cases, and have proved valuable not only here but in most real-world problems related to automatic verification.
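
For reference, a JML specification annotates Java source with preconditions, postconditions and invariants written in special comments, which tools like static checkers can then verify. A small example in the usual JML style (our illustration, not taken from the thesis):

```java
// A small JML-annotated class: preconditions (requires), postconditions
// (ensures) and an object invariant, written in JML's //@ comment syntax.
public class Purse {
    private int balance;

    //@ invariant balance >= 0;

    //@ requires amount > 0;
    //@ ensures balance == \old(balance) + amount;
    public void deposit(int amount) {
        balance += amount;
    }

    //@ requires amount > 0 && amount <= balance;
    //@ ensures balance == \old(balance) - amount;
    public void withdraw(int amount) {
        balance -= amount;
    }
}
```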

Relevance:

30.00%

Publisher:

Abstract:

This work consists of the study of two important problems arising from the operations of the petroleum and natural gas industries. The first, the pipe dimensioning problem on constrained gas distribution networks, consists in finding the least-cost combination of diameters, from a discrete set of commercially available ones, for the pipes of a given gas network, such that minimum pressure requirements at each demand node and upstream pipe conditions are respected. The second, the piston pump unit routing problem, comes from the need to define the routes of a piston pump unit visiting a number of non-emergent wells in onshore fields, i.e., wells that do not have enough pressure to make the oil emerge to the surface. The periodic version of this problem takes the wells' refilling equation into account to provide more accurate long-term planning. Besides the mathematical formulation of both problems, an exact algorithm and a tabu search were developed for the first problem, and a theoretical bound and a ProtoGene transgenetic algorithm were developed for the second. The main concepts of the metaheuristics are presented along with the details of their application to the cited problems. The results obtained for both applications are promising when compared to theoretical bounds and alternative solutions, both in solution quality and in running time.
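
As background on the metaheuristic, tabu search keeps a short-term memory (the tabu list) that forbids recently visited solutions, allowing the search to escape local optima. The generic skeleton below is our sketch of the technique's structure, not the thesis's implementation:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;
import java.util.function.Function;

// Generic tabu search skeleton: move to the best non-tabu neighbor each
// iteration, with an aspiration criterion that overrides the tabu status
// of a move improving on the best solution found so far.
public class TabuSearch<S> {
    public S search(S start,
                    Function<S, List<S>> neighbors,  // neighborhood generator
                    Function<S, Double> cost,        // objective to minimize
                    int tabuTenure, int maxIters) {
        S current = start, best = start;
        Deque<S> tabu = new ArrayDeque<>();
        for (int it = 0; it < maxIters; it++) {
            S bestMove = null;
            for (S cand : neighbors.apply(current)) {
                boolean aspiration = cost.apply(cand) < cost.apply(best);
                if (tabu.contains(cand) && !aspiration) continue; // tabu move
                if (bestMove == null || cost.apply(cand) < cost.apply(bestMove))
                    bestMove = cand;
            }
            if (bestMove == null) break;              // neighborhood exhausted
            current = bestMove;
            tabu.addLast(current);                    // remember the move
            if (tabu.size() > tabuTenure) tabu.removeFirst();
            if (cost.apply(current) < cost.apply(best)) best = current;
        }
        return best;
    }
}
```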

Relevance:

30.00%

Publisher:

Abstract:

This work presents a hybrid transgenetic algorithm for the solution of a Natural Gas Distribution Network Configuration Problem. Configuring such networks requires defining a layout along which the pipes must be laid to serve the customers. This work studies a way of connecting the customers in a network with a tree-shaped architecture. The objective is to minimize the cost of building the network, even if some customers that yield no profit are left unserved. This problem can be formulated computationally as the Prize-Collecting Steiner Tree Problem, a combinatorial optimization problem in the NP-hard class. This work presents a heuristic algorithm for the problem, using the approach called Transgenetic Algorithms, which fall within the category of evolutionary algorithms. A primal-dual algorithm is used to generate initial solutions, and path-relinking is used as an intensifier.
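
For reference, a standard formulation of the Prize-Collecting Steiner Tree objective, with edge costs and node prizes treated as penalties for the vertices left unconnected, can be written as:

```latex
% Prize-Collecting Steiner Tree: over a graph G = (V, E) with edge costs
% c_e and vertex prizes \pi_v, choose a tree T minimizing construction
% cost plus the forgone prizes of the vertices left out of the tree.
\min_{T \subseteq G,\; T \text{ a tree}} \;
  \sum_{e \in E(T)} c_e \;+\; \sum_{v \in V \setminus V(T)} \pi_v
```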

Relevance:

30.00%

Publisher:

Abstract:

The degradation of natural resources is perhaps the main problem of the Brazilian semi-arid region, and this degradation results mainly from soil loss caused by the erosive process. Seeking a better understanding of this problem, environmental modeling has been employed with the goal of identifying and proposing solutions to soil degradation. In this sense, this work applies the Universal Soil Loss Equation (USLE, known in Brazil as EUPS), developed in the United States during the 1950s, combined with geoprocessing tools, remote sensing information and Geographic Information Systems (GIS). The study area is the Riacho Passagem watershed, located in the western region of the state of Rio Grande do Norte; the watershed covers 221.7 km² and lies in the semi-arid zone of the Brazilian Northeast. The methodology consists of assembling the USLE variables in the GIS environment using satellite images, bibliographic surveys and field work. The RAMPA model was used to determine slope lengths, and statistical adjustments were made to adapt the USLE to the conditions of the study area, improving the work and the results generated by the model. Finally, a pseudo-language was developed in LEGAL (Linguagem Espacial para Geoprocessamento Algébrico), available in the SPRING software, version 5.1.2, to support the processing of the information contained in the database underlying the USLE. The results show that it is first necessary to delimit the dry and rainy seasons precisely, information that is fundamental to the USLE, since the work seeks to quantify soil loss by water erosion. The RAMPA model proved satisfactory, with high potential for determining slope lengths from radar images. Regarding the behavior of slope lengths in the watershed, a small variation was observed in the eastern portion, with longer slopes near the outlet. After applying the model, the maximum soil loss was 88 t/ha per year, with hotspots located on the NEOSSOLOS LITÓLICOS, and the minimum was 0.01 t/ha per year, in the domain of the LATOSSOLOS and NEOSSOLOS FLÚVICOS. Erosion reduces the soil profile, mainly in the NEOSSOLOS LITÓLICOS, altering the water balance and consequently increasing soil temperature, which can trigger desertification. The results and methodology of this work can be applied in the pursuit of sustainable development in the Brazilian semi-arid region, helping to understand the relationship between land use and the carrying capacity of the natural environment.
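
For reference, the USLE estimates average annual soil loss as the product of six factors:

```latex
% Universal Soil Loss Equation (USLE / EUPS):
%   A = predicted average annual soil loss (e.g., t/ha per year)
%   R = rainfall erosivity, K = soil erodibility,
%   L = slope length, S = slope steepness,
%   C = cover-management factor, P = support-practice factor
A = R \cdot K \cdot L \cdot S \cdot C \cdot P
```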