992 results for memory access complexity


Relevance:

30.00%

Publisher:

Abstract:

Episodic memory impairment is a well-recognized feature of mesial temporal lobe epilepsy. Semantic memory has received much less attention in this patient population. In this study, semantic memory aspects (word-picture matching, word definition, confrontation and responsive naming, and word list generation) in 19 patients with left (LMTS) and right (RMTS) temporal lobe epilepsy secondary to mesial temporal sclerosis (MTS) were compared with those of normal controls. Patients with LMTS showed impaired performance in word definition (compared to controls and RMTS) and in responsive naming (compared to controls). RMTS and LMTS patients performed worse than controls in word-picture matching. Both patients with left and right mesial temporal lobe epilepsy performed worse than controls in word list generation and in confrontation naming tests. Attentional-executive dysfunction may have contributed to these deficits. We conclude that patients with left and right MTS display impaired aspects of semantic knowledge. A better understanding of semantic processing difficulties in these patients will provide better insight into the difficulties with activities of daily living in this patient population. (C) 2007 Elsevier Inc. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

The brain is a complex system that, in the normal condition, has emergent properties like those associated with activity-dependent plasticity in learning and memory, and in pathological situations, manifests abnormal long-term phenomena like the epilepsies. Data from our laboratory and from the literature were classified qualitatively as sources of complexity and emergent properties from behavior to electrophysiological, cellular, molecular, and computational levels. We used such models as brainstem-dependent acute audiogenic seizures and forebrain-dependent kindled audiogenic seizures. Additionally, we used chemical or electrical experimental models of temporal lobe epilepsy that induce status epilepticus with behavioral, anatomical, and molecular sequelae such as spontaneous recurrent seizures and long-term plastic changes. Current computational neuroscience tools will help the interpretation, storage, and sharing of the exponential growth of information derived from those studies. These strategies are considered solutions to deal with the complexity of brain pathologies such as the epilepsies. (C) 2008 Elsevier Inc. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

C. L. Isaac and A. R. Mayes (1999a, 1999b) compared forgetting rates in amnesic patients and normal participants across a range of memory tasks. Although the results are complex, many of them appear to be replicable and there are several commendable features to the design and analysis. Nevertheless, the authors largely ignored 2 relevant literatures: the traditional literature on proactive inhibition/interference and the formal analyses of the complexity of the bindings (associations) required for memory tasks. It is shown how the empirical results and conceptual analyses in these literatures are needed to guide the choice of task, the design of experiments, and the interpretation of results for amnesic patients and normal participants.

Relevance:

30.00%

Publisher:

Abstract:

During a naming task, time pressure and a manipulation of the proportion of related prime-target pairs were used to induce subjects to generate an expectation to the prime. On some trials, the presented target was orthographically and generally phonologically similar to the expected target. The expectancy manipulation was barely detectable in the priming data but was clearly evident on a final recognition test. In addition, the recognition data showed that the nearly simultaneous activation of an expectation and sensory information derived from the orthographically and phonologically similar target produced a false memory. It is argued that this represents a blend memory.

Relevance:

30.00%

Publisher:

Abstract:

Three experiments investigated the effect of complexity on children's understanding of a beam balance. In nonconflict problems, weights or distances varied, while the other was held constant. In conflict items, both weight and distance varied, and items were of three kinds: weight dominant, distance dominant, or balance (in which neither was dominant). In Experiment 1, 2-year-old children succeeded on nonconflict-weight and nonconflict-distance problems. This result was replicated in Experiment 2, but performance on conflict items did not exceed chance. In Experiment 3, 3- and 4-year-olds succeeded on all except conflict balance problems, while 5- and 6-year-olds succeeded on all problem types. The results were interpreted in terms of relational complexity theory. Children aged 2 to 4 years succeeded on problems that entailed binary relations, but 5- and 6-year-olds also succeeded on problems that entailed ternary relations. Ternary relations tasks from other domains-transitivity and class inclusion-accounted for 93% of the age-related variance in balance scale scores. (C) 2002 Elsevier Science (USA).
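To make the binary/ternary distinction concrete, the beam-balance items can be modeled by comparing torques (weight x distance): non-conflict items vary only one dimension per side, so the comparison reduces to a binary relation, while conflict items require integrating both dimensions on each side, a ternary relation. The function below is an illustrative model, not material from the study:

```python
# Illustrative sketch (not from the study): the beam tips toward the side
# with the greater torque (weight * distance). Non-conflict items hold one
# dimension constant; conflict items force integration of both.

def beam_tilt(left_weight, left_dist, right_weight, right_dist):
    """Return 'left', 'right', or 'balance' from torque = weight * distance."""
    left_torque = left_weight * left_dist
    right_torque = right_weight * right_dist
    if left_torque > right_torque:
        return "left"
    if right_torque > left_torque:
        return "right"
    return "balance"

# Non-conflict weight item: distances equal, only weight varies (binary relation).
print(beam_tilt(3, 2, 1, 2))   # left

# Conflict-distance item: the heavier side loses because distance dominates.
print(beam_tilt(2, 1, 1, 4))   # right
```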

Relevance:

30.00%

Publisher:

Abstract:

Two experiments tested predictions from a theory in which processing load depends on relational complexity (RC), the number of variables related in a single decision. Tasks from six domains (transitivity, hierarchical classification, class inclusion, cardinality, relative-clause sentence comprehension, and hypothesis testing) were administered to children aged 3-8 years. Complexity analyses indicated that the domains entailed ternary relations (three variables). Simpler binary-relation (two variables) items were included for each domain. Thus RC was manipulated with other factors tightly controlled. Results indicated that (i) ternary-relation items were more difficult than comparable binary-relation items, (ii) the RC manipulation was sensitive to age-related changes, (iii) ternary relations were processed at a median age of 5 years, (iv) cross-task correlations were positive, with all tasks loading on a single factor (RC), (v) RC factor scores accounted for 80% (88%) of age-related variance in fluid intelligence (compositionality of sets), (vi) binary- and ternary-relation items formed separate complexity classes, and (vii) the RC approach to defining cognitive complexity is applicable to different content domains. (C) 2002 Elsevier Science (USA). All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

In today's society, the phenomenon of aging has acquired the status of a social problem, attracting growing attention and concern and leading to an increasing number of studies dedicated to the elderly. The lack of domestic, family, or social support often leads the elderly to nursing homes. Institutionalization is in many cases the only opportunity to have access to health care and quality of life. Aging is also associated with a higher prevalence of chronic diseases that require long-term medication, sometimes for life. Frequently, the onset of multiple pathologies at the same time requires different therapies, and the phenomenon of polypharmacy (five or more drugs daily) can occur. Moreover, the slowing of physiological and cognitive mechanisms associated with these chronic diseases can interfere, on the one hand, with the pharmacokinetics of many medications and, on the other hand, with the ability to follow the therapeutic regimen. All of these realities contribute to an increase in pharmacotherapeutic complexity, decreasing the adherence to and effectiveness of treatment. The pharmacotherapeutic complexity of an individual's regimen combines different characteristics of the drug therapy, such as the number of medications used, dosage forms, dosing frequency, and additional directions. It can be measured by the Medication Regimen Complexity Index (MRCI), originally validated in English.
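A regimen-complexity score of this kind aggregates the factors the abstract lists. The sketch below is a hypothetical, simplified illustration with invented weights; it is not the validated MRCI scoring scheme:

```python
# Hypothetical, simplified regimen-complexity score. The factors mirror
# those listed in the abstract (number of drugs, dosage forms, dosing
# frequency, additional directions), but the weights are invented for
# illustration and do NOT reproduce the validated MRCI.

def regimen_complexity(drugs):
    """drugs: list of dicts with 'form', 'doses_per_day', 'extra_directions'."""
    score = 0.0
    for d in drugs:
        score += 1.0                                      # each medication adds baseline load
        score += 0.5 if d["form"] != "tablet" else 0.0    # non-tablet forms weigh more
        score += 0.5 * d["doses_per_day"]                 # more frequent dosing adds complexity
        score += 0.5 * d["extra_directions"]              # e.g. "take with food"
    return score

def is_polypharmacy(drugs):
    """Polypharmacy as defined in the abstract: five or more drugs daily."""
    return len(drugs) >= 5

regimen = [
    {"form": "tablet", "doses_per_day": 2, "extra_directions": 1},
    {"form": "inhaler", "doses_per_day": 4, "extra_directions": 0},
]
print(regimen_complexity(regimen))   # 6.0
print(is_polypharmacy(regimen))      # False
```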

Relevance:

30.00%

Publisher:

Abstract:

The growing complexity of electronic systems, combined with developments in packaging technologies, has led to the miniaturization of integrated circuits, causing difficulties and limitations in fault diagnosis and detection and drastically reducing the applicability of ICT equipment. The Boundary Scan infrastructure described in the IEEE 1149.1 standard "Test Access Port and Boundary-Scan Architecture", approved in 1990, emerged as a way to deal with this problem. Besides being technically viable and economically attractive for defect diagnosis, it also supports other applications. SVF arose from the desire to encourage independent vendors to adopt the IEEE 1149.1 standard; it is an ASCII format whose purpose is to send stimuli and wait for their responses, according to a data mask, based on the IEEE 1149.1 standard. The incorporation of Boundary Scan into integrated circuits is currently expanding rapidly and consequently enjoys strong market adoption. In this context, the objective of this dissertation is the development of a boundary scan controller that implements an interface with a PC and enables control and monitoring of the application of tests to a PCB. The architecture of the developed controller contains an input Memory module, a TAP Controller, and an output Memory. The controller was implemented on an FPGA, a reconfigurable logic device consisting of logic blocks and an interconnection network, both configurable, which allow the user to implement a wide variety of digital functions. Using an FPGA has the advantage of making the controller versatile and easy to modify, and of allowing additional controllers to be placed inside the FPGA.
The communication and synchronization protocol between the various modules was developed, allowing control and monitoring of the stimuli sent to and received from the PCB, executed automatically by the TAP Controller software in accordance with the IEEE 1149.1 standard. The proposed solution was validated by simulation using the Xilinx simulator. All signals that make up the controller were analyzed, and the correct operation of all its modules was verified. This solution executes all the sequences required (sending stimuli) to carry out tests on the PCB. It receives and stores the resulting data, subsequently sending it to the output memory. This work led to the conclusion that electronic component designs will tend to be described at a higher level of abstraction, relying increasingly on hardware description languages, for which VHDL is an excellent programming tool. The developed controller will be a very useful and versatile tool for testing PCBs and for other functionality provided by BS infrastructures.
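The core of any TAP Controller is the 16-state state machine defined by IEEE 1149.1, driven by the TMS signal on each TCK edge. It can be sketched as a transition table; state names follow the standard, while the Python encoding below is only an illustration of the behavior a hardware implementation must reproduce:

```python
# The 16-state IEEE 1149.1 TAP state machine, encoded as a transition
# table: (current_state, TMS) -> next_state, one entry per TCK edge.
TAP_TRANSITIONS = {
    ("Test-Logic-Reset", 0): "Run-Test/Idle",
    ("Test-Logic-Reset", 1): "Test-Logic-Reset",
    ("Run-Test/Idle", 0): "Run-Test/Idle",
    ("Run-Test/Idle", 1): "Select-DR-Scan",
    ("Select-DR-Scan", 0): "Capture-DR",
    ("Select-DR-Scan", 1): "Select-IR-Scan",
    ("Capture-DR", 0): "Shift-DR",
    ("Capture-DR", 1): "Exit1-DR",
    ("Shift-DR", 0): "Shift-DR",
    ("Shift-DR", 1): "Exit1-DR",
    ("Exit1-DR", 0): "Pause-DR",
    ("Exit1-DR", 1): "Update-DR",
    ("Pause-DR", 0): "Pause-DR",
    ("Pause-DR", 1): "Exit2-DR",
    ("Exit2-DR", 0): "Shift-DR",
    ("Exit2-DR", 1): "Update-DR",
    ("Update-DR", 0): "Run-Test/Idle",
    ("Update-DR", 1): "Select-DR-Scan",
    ("Select-IR-Scan", 0): "Capture-IR",
    ("Select-IR-Scan", 1): "Test-Logic-Reset",
    ("Capture-IR", 0): "Shift-IR",
    ("Capture-IR", 1): "Exit1-IR",
    ("Shift-IR", 0): "Shift-IR",
    ("Shift-IR", 1): "Exit1-IR",
    ("Exit1-IR", 0): "Pause-IR",
    ("Exit1-IR", 1): "Update-IR",
    ("Pause-IR", 0): "Pause-IR",
    ("Pause-IR", 1): "Exit2-IR",
    ("Exit2-IR", 0): "Shift-IR",
    ("Exit2-IR", 1): "Update-IR",
    ("Update-IR", 0): "Run-Test/Idle",
    ("Update-IR", 1): "Select-DR-Scan",
}

def tap_step(state, tms_bits):
    """Advance the TAP state machine through a sequence of TMS bits."""
    for tms in tms_bits:
        state = TAP_TRANSITIONS[(state, tms)]
    return state

# Five TCK cycles with TMS held high reach Test-Logic-Reset from any state.
print(tap_step("Shift-DR", [1, 1, 1, 1, 1]))  # Test-Logic-Reset
```

This table is what an SVF player ultimately drives: each SVF instruction expands into a TMS/TDI sequence that walks the machine into Shift-DR or Shift-IR, shifts the vector, and returns to a stable state.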

Relevance:

30.00%

Publisher:

Abstract:

The complexity of computer systems has been increasing, and the use of computer systems and online services is now part of our daily work tools. In this context, the Internet plays a prominent role in universities by allowing students and teachers to interact more easily. The Internet and Web-based education offer remote access to any information regardless of location or time. Consequently, anyone with an Internet connection can obtain information on a given topic from the leading experts, gaining significant advantages. Remote laboratories are a highly valued solution for connecting technology and human resources in environments that may be separated in time or space. The creation of this type of laboratory, and its practical usefulness, is only possible because emerging communication technologies have contributed very significantly to improving their availability at a distance. Remote laboratories are indispensable for engineering research involving scarce or large-scale resources. Based on this concept, a remote laboratory was developed for engineering students who need to test digital circuits on a configurable hardware development board, allowing this resource to be used more efficiently. The work consisted of creating a low-cost remote laboratory based on open source programming languages, using as its processing unit an ASUS router running the OpenWrt firmware, a Linux distribution for embedded systems. This remote laboratory allows digital circuits to be tested on a configurable hardware development board in real time, using the JTAG interface.
A distinctive feature of the developed laboratory is that its processing unit is a router. Using a router as a server is a very unusual solution in the implementation of remote laboratories. Compared with an ordinary computer, this router has far less processing power and memory, although the tests performed showed that its performance fully met expectations.

Relevance:

30.00%

Publisher:

Abstract:

Recent trends in chip architectures, with higher numbers of heterogeneous cores and non-uniform memory/non-coherent caches, bring renewed attention to the use of Software Transactional Memory (STM) as a fundamental building block for developing parallel applications. Nevertheless, although STM promises to ease concurrent and parallel software development, it relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts on the responsiveness and timing guarantees required by embedded real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we assess the use of STM in the development of embedded real-time software, defending that the amount of contention can be reduced if read-only transactions access recent consistent data snapshots, progressing in a wait-free manner. We show how the required number of versions of a shared object can be calculated for a set of tasks. We also outline an algorithm to manage conflicts between update transactions that prevents starvation.
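The idea of letting read-only transactions progress wait-free over recent consistent snapshots can be sketched with a multiversioned shared object: writers commit new versions under a lock, while readers scan the version list without ever blocking. This is an illustrative sketch of the general multiversion technique, not the paper's algorithm or its bound on the required number of versions:

```python
import threading

class VersionedObject:
    """Keeps the last k committed versions so read-only transactions can
    read a consistent snapshot wait-free. Sketch of the general idea,
    not the paper's algorithm."""

    def __init__(self, initial, k):
        self.k = k
        self.versions = [(0, initial)]  # (commit_timestamp, value), ascending
        self.lock = threading.Lock()    # serializes writers; readers never block

    def commit(self, timestamp, value):
        with self.lock:
            self.versions.append((timestamp, value))
            if len(self.versions) > self.k:
                self.versions.pop(0)    # reclaim the oldest version

    def read(self, snapshot_ts):
        # Wait-free read: latest version with commit timestamp <= snapshot_ts.
        for ts, value in reversed(self.versions):
            if ts <= snapshot_ts:
                return value
        raise RuntimeError("snapshot too old: version already reclaimed")

obj = VersionedObject("v0", k=3)
obj.commit(1, "v1")
obj.commit(2, "v2")
print(obj.read(1))   # v1 -- a reader that started at time 1 still sees v1
```

Choosing k large enough that no in-flight read-only transaction ever hits the "snapshot too old" case is exactly the per-task-set calculation the paper addresses.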

Relevance:

30.00%

Publisher:

Abstract:

Search optimization methods are needed to solve optimization problems where the objective function and/or constraint functions may be non-differentiable or non-convex, or where their analytical expressions cannot be determined, either due to their complexity or their cost (monetary, computational, time, ...). Many optimization problems in engineering and other fields have these characteristics, because function values can result from experimental or simulation processes, can be modelled by functions with complex expressions or by noisy functions, and it is impossible or very difficult to calculate their derivatives. Direct search optimization methods use only function values and do not need any derivatives or approximations of them. In this work we present a Java API that includes several derivative-free methods and algorithms to solve constrained and unconstrained optimization problems. Traditional API access, by installing it on the developer's and/or user's computer, and remote access to it using Web Services are both presented. Remote access to the API has the advantage of always providing access to its latest version. For users who simply want a tool to solve nonlinear optimization problems and do not want to integrate these methods into applications, two applications were also developed: one a standalone Java application and the other a Web-based application, both using the developed API.
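To illustrate the kind of method such an API provides, a minimal direct search (compass search) fits in a few lines. The sketch below is written in Python and is not taken from the Java API itself; it only shows that these methods need nothing beyond function evaluations:

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Minimal direct-search sketch: probe +/- step along each coordinate,
    move on improvement, halve the step otherwise. Uses only function
    values, never derivatives."""
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                ft = f(trial)
                if ft < fx:            # accept any improving probe
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            step /= 2.0                # refine the mesh when no probe helps
            if step < tol:
                break
    return x, fx

# A smooth test function works too, even though no gradient is used.
x, fx = compass_search(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2, [0.0, 0.0])
print(x)   # close to [1, -2]
```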

Relevance:

30.00%

Publisher:

Abstract:

Even though Software Transactional Memory (STM) is one of the most promising approaches to simplify concurrent programming, current STM implementations incur significant overheads that render them impractical for many real-sized programs. The key insight of this work is that we do not need to use the same costly barriers for all the memory managed by a real-sized application: since only a small fraction of the memory is under contention, lightweight barriers may be used for the rest. In this work, we propose a new solution based on adaptive object metadata (AOM) to promote the use of a fast path to access objects that are not under contention. We show that this approach makes the performance of an STM competitive with the best fine-grained lock-based approaches in some of the more challenging benchmarks. (C) 2015 Elsevier Inc. All rights reserved.
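The fast-path idea can be sketched as follows: each object starts with a compact layout and no transactional metadata, and only when contention is detected is it inflated to an extended layout that supports full STM barriers. The classes and names below are illustrative assumptions, not the paper's implementation:

```python
class Transaction:
    """Minimal transaction context: records reads for later validation."""
    def __init__(self, start_ts):
        self.start_ts = start_ts
        self.read_set = []

class ExtendedMetadata:
    """Full STM metadata: a version number checked and logged by barriers."""
    def __init__(self, value):
        self.value = value
        self.version = 0

    def read(self, tx):
        tx.read_set.append((self, self.version))  # log for commit-time validation
        return self.value

class AdaptiveObject:
    """Adaptive object metadata: compact layout (fast path) until the
    object comes under contention, then inflated to extended metadata."""
    def __init__(self, value):
        self.value = value
        self.extended = None           # None => compact layout

    def read(self, tx):
        if self.extended is None:
            return self.value          # fast path: no barrier bookkeeping
        return self.extended.read(tx)  # slow path: full STM read barrier

    def inflate(self):
        if self.extended is None:      # first detected conflict: add metadata
            self.extended = ExtendedMetadata(self.value)

tx = Transaction(start_ts=0)
obj = AdaptiveObject(42)
print(obj.read(tx), tx.read_set)   # 42 [] -- fast path logs nothing
obj.inflate()
print(obj.read(tx))                # 42 -- now via the full read barrier
```

The point of the design is that uncontended reads cost a single null check, which is why the approach can approach fine-grained locking performance.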

Relevance:

30.00%

Publisher:

Abstract:

8th International Workshop on Multiple Access Communications (MACOM2015), Helsinki, Finland.