993 results for Memory management


Relevance: 30.00%

Publisher:

Abstract:

Verbal working memory and emotional self-regulation are impaired in Bipolar Disorder (BD). Our aim was to investigate the effect of Lamotrigine (LTG), which is effective in the clinical management of BD, on the neural circuits subserving working memory and emotional processing. Functional Magnetic Resonance Imaging data from 12 stable BD patients were used to detect LTG-induced changes as the differences in brain activity between drug-free and post-LTG monotherapy conditions during a verbal working memory task (N-back sequential letter task) and an angry facial affect recognition task. For both tasks, LTG monotherapy compared to baseline was associated with increased activation mostly within the prefrontal cortex and cingulate gyrus, in regions normally engaged in verbal working memory and emotional processing. Therefore, LTG monotherapy in BD patients may enhance cortical function within neural circuits involved in memory and emotional self-regulation. © 2007 Elsevier B.V. and ECNP.

Relevance: 30.00%

Publisher:

Abstract:

The International Cooperation Agency working in Colombia (identified in this article as IDEA) is one of the most important such agencies in Colombian society, with programs that support gender rights, human rights, justice and peace, scholarships, aboriginal populations, youth, Afro-descendant populations, economic development in communities, and environmental development. The problem identified stems from a diversified offer of services, collaboration and social intervention, which requires diverse groups of people with multiple agendas, ways of supporting their mandates, disciplines, and professional competences. Knowledge creation and the growth and sustainability of the organization can be endangered by a silo culture and the resulting reduced leverage of the separate groups' capabilities. Organizational memory is largely formed by the tacit knowledge of the organization's members, given the value of the accumulated experience that this kind of social work implies. Its loss is therefore a strategic and operational risk when most problem interventions rely on direct work in the socio-economic field and on living real experiences with communities. The knowledge management solution presented in this article starts, first, with the identification of the people and groups concerned and the creation of a knowledge map as a means to strengthen the ties between organizational members; second, with the introduction of a content management system designed to support the documentation and knowledge-sharing processes; and third, with a methodology for adapting a Balanced Scorecard to the knowledge management processes. These three main steps lead to a knowledge management “solution” that has been implemented in the organization, comprising three components: a knowledge management system, training support and the promotion of cultural change.

Relevance: 30.00%

Publisher:

Abstract:

The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service-Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that would help substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated the capacity requirements of client virtual machines (VMs) renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are manifold. Cloud users can size their VMs appropriately and pay only for the resources that they need; service providers can also offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients will pay exactly for the performance they actually experience; administrators, on the other hand, will be able to maximize their total revenue by utilizing application performance models and SLAs. This thesis made the following contributions. First, we identified resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Network and Support Vector Machine, for accurately modeling the performance of virtualized applications. Moreover, we suggested and evaluated modeling optimizations necessary to improve prediction accuracy when using these tools. Third, we presented an approach to optimal VM sizing employing the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm which maximizes the SLA-generated revenue for a data center.
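As a rough illustration of the kind of performance modeling the thesis describes, the sketch below fits a Support Vector Machine regressor to synthetic resource-allocation data. The features, the synthetic response-time function and the scikit-learn setup are assumptions for illustration, not the thesis's actual experiment.

```python
# Hypothetical sketch: predicting application performance from resource
# allocations with an SVM regressor. The data is synthetic; the thesis's
# real features, workloads and tuning are not reproduced here.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Assumed features per VM: CPU cap (%), memory (GB), I/O bandwidth (MB/s)
X = rng.uniform([10, 1, 10], [100, 16, 200], size=(200, 3))
# Synthetic response time: degrades as any single resource becomes scarce
y = 50 / X[:, 0] + 8 / X[:, 1] + 100 / X[:, 2] + rng.normal(0, 0.05, 200)

# Standardize features, then fit an RBF-kernel SVR on a training split
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X[:150], y[:150])
pred = model.predict(X[150:])
print(pred.shape)  # (50,)
```

A trained model of this shape is what makes the charging-by-performance idea workable: given a candidate allocation, the provider can predict the response time before committing resources.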

Relevance: 30.00%

Publisher:

Abstract:

In recent years, the adaptation of Wireless Sensor Networks (WSNs) to application areas requiring mobility has increased the security threats against the confidentiality, integrity and privacy of information, as well as against network connectivity. Since key management plays an important role in securing both information and connectivity, a proper authentication and key management scheme is required in mobility-enabled applications, where the authentication of a node with the network is a critical issue. In this paper, we present an authentication and key management scheme supporting node mobility in a heterogeneous WSN consisting of several low-capability sensor nodes and a few high-capability sensor nodes. We analyze our proposed solution analytically using MATLAB and by simulation (OMNET++ simulator) to show that it has lower memory requirements and better network connectivity and resilience against attacks compared to some existing schemes. We also propose two levels of authentication methods for the mobile sensor nodes to support secure authentication and key establishment.
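The general idea of symmetric authentication and session-key establishment between a mobile node and a high-capability node can be sketched as below. The challenge-response handshake, nonce sizes and HMAC-based derivation are illustrative assumptions only; they do not reproduce the paper's actual two-level scheme.

```python
# Hypothetical sketch of challenge-response authentication and session-key
# derivation between a mobile sensor node and a cluster head sharing a
# pre-distributed key K. Illustrative only, not the paper's scheme.
import hashlib
import hmac
import secrets

K = secrets.token_bytes(16)  # pre-distributed pairwise key

def respond(key: bytes, challenge: bytes) -> bytes:
    """Prove knowledge of `key` without revealing it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def session_key(key: bytes, n_node: bytes, n_head: bytes) -> bytes:
    """Derive a fresh session key from both parties' nonces."""
    return hmac.new(key, b"session" + n_node + n_head, hashlib.sha256).digest()

# Cluster head challenges the mobile node with a fresh nonce
n_head = secrets.token_bytes(8)
tag = respond(K, n_head)                              # computed by the node
assert hmac.compare_digest(tag, respond(K, n_head))   # verified by the head

# Both sides derive the same session key from exchanged nonces
n_node = secrets.token_bytes(8)
sk_node = session_key(K, n_node, n_head)
sk_head = session_key(K, n_node, n_head)
assert sk_node == sk_head
print(len(sk_node))  # 32
```

Schemes of this symmetric flavor are attractive in WSNs precisely because HMAC over a hash is far cheaper in memory and computation than public-key operations on low-capability nodes.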

Relevance: 30.00%

Publisher:

Abstract:

In today’s big data world, data is being produced in massive volumes, at great velocity and from a variety of different sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (the Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are increasingly used to derive value out of this big data. A large portion of this data is stored and processed in the Cloud due to the several advantages the Cloud provides, such as scalability, elasticity, availability, low cost of ownership and the overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for the application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully.
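The workload-aware partitioning challenge described above can be illustrated with a toy greedy heuristic that co-locates frequently co-accessed tuples so that fewer transactions span partitions. The workload, pair counting and greedy merge below are illustrative assumptions, not the dissertation's actual algorithm.

```python
# Hypothetical sketch of workload-aware data placement: count how often
# pairs of tuples are touched by the same transaction, then greedily place
# heavily co-accessed tuples on the same partition.
from collections import Counter
from itertools import combinations

# Each transaction lists the tuple ids it touches (toy workload).
workload = [
    {"a", "b"}, {"a", "b"}, {"a", "b"},
    {"c", "d"}, {"c", "d"},
    {"b", "c"},
]

# Count co-access frequency for every pair of tuples.
co_access = Counter()
for txn in workload:
    for pair in combinations(sorted(txn), 2):
        co_access[pair] += 1

def greedy_partition(pairs, n_partitions=2):
    """Greedily merge the most co-accessed tuples into the same partition."""
    partition, next_id = {}, 0
    for (u, v), _ in pairs.most_common():
        if u not in partition and v not in partition and next_id < n_partitions:
            partition[u] = partition[v] = next_id
            next_id += 1
        elif u in partition and v not in partition:
            partition[v] = partition[u]
        elif v in partition and u not in partition:
            partition[u] = partition[v]
    return partition

placement = greedy_partition(co_access)
for txn in workload:                     # fallback for never-paired tuples
    for x in txn:
        placement.setdefault(x, 0)

# A transaction is "distributed" if it touches more than one partition.
distributed = sum(1 for t in workload if len({placement[x] for x in t}) > 1)
print(placement, distributed)
```

On this toy workload the heuristic co-locates a with b and c with d, leaving only the single {b, c} transaction distributed, which is the kind of reduction the placement component targets.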
I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has traditionally been used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing size (progressive samples) for exploratory querying, which provides data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! delivers early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
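The neighborhood-centric idea can be sketched in miniature: the user program receives a whole 1-hop ego network rather than the state of a single vertex. The tiny graph and the clustering-coefficient task below are illustrative assumptions, not NSCALE's API or execution model.

```python
# Hypothetical sketch of neighborhood-centric processing: extract each
# vertex's 1-hop ego network (subgraph) and apply a user function to it.
from collections import defaultdict

edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]
adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def ego_network(center):
    """Vertices and internal edges of the 1-hop neighborhood of `center`."""
    nodes = {center} | adj[center]
    sub = [(u, v) for u, v in edges if u in nodes and v in nodes]
    return nodes, sub

def local_clustering(center):
    """A user program that reasons over the whole subgraph, not one vertex."""
    nbrs = adj[center]
    k = len(nbrs)
    if k < 2:
        return 0.0
    _, sub = ego_network(center)
    links = sum(1 for u, v in sub if u in nbrs and v in nbrs)
    return 2 * links / (k * (k - 1))

print({v: round(local_clustering(v), 2) for v in sorted(adj)})
# -> {0: 1.0, 1: 1.0, 2: 0.33, 3: 0.0, 4: 0.0}
```

In a vertex-centric framework this task would need several message-passing rounds just to learn which neighbors are linked to each other; handing the program the subgraph directly is the overhead reduction the abstract argues for.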

Relevance: 30.00%

Publisher:

Abstract:

Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organizations. The processes of preservation planning and evaluation are fundamentally implicit and share similar complexity. Both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in terms of diversity, but are integral to the establishment of classes of risk exposure, and the planning and deployment of appropriate preservation strategies. We explore several research objectives within the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, the development of an appropriate methodology for risk management, the evaluation of existing preservation evaluation approaches and metrics, the structuring of best practice knowledge and lastly the demonstration of a range of tools that utilise our findings. We describe a mixed methodology that uses interview and survey, extensive content analysis, practical case study and iterative software and ontology development. We build on a robust foundation, the development of the Digital Repository Audit Method Based on Risk Assessment. 
We summarise the extent of the challenge facing the digital preservation community (and, by extension, users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, increasing complexity, and the increasing onus on communities with no formal training to offer assurances of data management and sustainability. Collectively, they imply a challenge that demands an intuitive and adaptable means of evaluating digital preservation efforts. The need for individuals and organisations to validate the legitimacy of their own efforts is particularly prioritised. We introduce our approach, based on risk management. Risk is an expression of the likelihood of a negative outcome, and of the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity: a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risks can be characterised according to associated goals, activities, responsibilities and policies in terms of both their manifestation and their mitigation; they can be deconstructed into atomic units, and responsibility for their resolution delegated appropriately. We continue to describe how the manifestation of risks typically spans an entire organisational environment, and how risk, as the focus of our analysis, safeguards against omissions that may occur when pursuing functional, departmental or role-based assessment. We discuss the importance of relating risk factors, through the risks themselves or associated system elements. Doing so will yield the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community.
We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses through the formal evaluation of memory institutions in the UK, US and continental Europe. Furthermore, we showcase a series of applications that use the fruits of this research as their intellectual foundation. Finally, we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are drawn by revisiting legacy studies and exposing the resource and associated applications to evaluation by the digital preservation community.
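As a loose illustration of the risk characterisation described above (likelihood, impact, delegated ownership, and relations to other organisational elements), the sketch below defines a toy risk record and ranks risks by a likelihood-times-impact score. The field names and scoring are assumptions for illustration and are not drawn from the PORRO ontology.

```python
# Hypothetical sketch: a risk record with likelihood, impact, an owner to
# whom resolution is delegated, and links to related elements. Field names
# are illustrative, not taken from PORRO.
from dataclasses import dataclass, field

@dataclass
class Risk:
    name: str
    likelihood: float                     # probability of occurrence, 0..1
    impact: float                         # severity if it occurs, 0..1
    owner: str                            # delegated responsibility
    related: list[str] = field(default_factory=list)

    def exposure(self) -> float:
        """Simple likelihood-times-impact score used for ranking."""
        return self.likelihood * self.impact

risks = [
    Risk("format obsolescence", 0.6, 0.9, "preservation team"),
    Risk("storage media failure", 0.3, 0.8, "IT operations"),
    Risk("loss of staff expertise", 0.4, 0.5, "management"),
]
# Highest-exposure risks first, each with its delegated owner
for r in sorted(risks, key=Risk.exposure, reverse=True):
    print(f"{r.name}: {r.exposure():.2f} -> {r.owner}")
```

Even a scoring this crude shows how deconstructed, owned risk records support prioritisation across an organisation rather than within one department.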


Relevance: 20.00%

Publisher:

Abstract:

Previous research has shown that crotamine, a toxin isolated from the venom of Crotalus durissus terrificus, induces the release of acetylcholine and dopamine in the central nervous system of rats. These neurotransmitters are important modulators of memory processes. Therefore, in this study we investigated the effects of crotamine infusion on the persistence of memory in rats. We verified that intrahippocampal infusion of crotamine (1 μg/μl; 1 μl/side) improved the persistence of object recognition and aversive memory. On the other hand, intrahippocampal infusion of the toxin did not alter locomotor and exploratory activities, anxiety or pain threshold. These results point to the prospect of using crotamine as a potential pharmacological tool to treat diseases involving memory impairment, although more research is still needed to better elucidate the effects of crotamine on the hippocampus and memory.

Relevance: 20.00%

Publisher:

Abstract:

Ca(2+)/calmodulin-dependent protein kinase II (CaMKII) functions both in the regulation of insulin secretion and in neurotransmitter release through common downstream mediators. Therefore, we hypothesized that pancreatic β-cells acquire and store the information contained in calcium pulses as a form of metabolic memory, just as neurons store cognitive information. To test this hypothesis, we developed a novel paradigm of pulsed exposure of β-cells to intervals of high glucose, followed by a 24-h consolidation period to eliminate any acute metabolic effects. Strikingly, β-cells exposed to this high-glucose pulse paradigm exhibited significantly stronger insulin secretion. This metabolic memory was entirely dependent on CaMKII. Metabolic memory was reflected at the protein level by increased expression of proteins involved in glucose sensing and Ca(2+)-dependent vesicle secretion, and by elevated levels of the key β-cell transcription factor MAFA. In summary, like neurons, human and mouse β-cells are able to acquire and retrieve information.

Relevance: 20.00%

Publisher:

Abstract:

In Brazil, malaria remains a disease of major epidemiological importance because of the high number of cases in the Amazonian Region. Plasmodium spp infections during pregnancy are a significant public health problem, with substantial risks for the pregnant woman, the foetus and the newborn child. In Brazil, the control of malaria during pregnancy is primarily achieved by prompt and effective treatment of acute episodes. Thus, to assure rapid diagnosis and treatment for pregnant women with malaria, one of the strategies recommended by the World Health Organization for low-transmission areas and part of a strategy of the Ministry of Health, the National Malaria Control Program has focused on integrative measures for women's and reproductive health. Here, we discuss the approach to the prevention and management of malaria during pregnancy in Brazil over the last 10 years (2003-2012), using morbidity data from the Malaria Health Information System. Improving the efficiency and quality of healthcare and education, and consolidating prevention programmes, will be challenges in the control of malaria during pregnancy in the next decade.

Relevance: 20.00%

Publisher:

Abstract:

Thoracic injuries in general are of great importance due to their high incidence and high mortality. Thoracic impalement injuries are rare but severe due to the combination of cause, effect and result. This study's primary objective is to report the case of a young man who was impaled by a two-wheeled horse carriage shaft while crashing his motorcycle in a rural zone. An EMT-B team was called to the crash scene and found a conscious patient sustaining a severe impalement injury to the left hemithorax, suspended over the floor by the axial skeleton with the carriage shaft passing across his left chest. As a secondary objective, a literature review of thoracic impalement injuries is performed. Cases of thoracic impalement injury require unique and individualized care based on injury severity and the affected organs. Reported protocols for managing impalement injuries are entirely anecdotal, with no uniformity in the approach to and management of impaled patients. In penetrating trauma, it is essential not to remove the impaled object, so that possible vascular lesions remain tamponaded by the object, avoiding major bleeding and exsanguinating haemorrhage. Severely impaled thoracic patients should be transferred to a specialist centre for trauma care, as these lesions typically require complex multidisciplinary treatment. High-energy thoracic impalement injuries are rare and carry a high mortality rate due to the complexity of the trauma and associated injuries such as thoracic wall and lung lesions. Modern medicine still seems limited in cases of such seriousness, not always with satisfactory results.


Relevance: 20.00%

Publisher:

Abstract:

To develop recommendations for the diagnosis, management and treatment of lupus nephritis in Brazil. Extensive literature review with a selection of papers based on the strength of scientific evidence and the opinion of the members of the Commission on Systemic Lupus Erythematosus of the Brazilian Society of Rheumatology. 1) Renal biopsy should be performed whenever possible and whenever this procedure is indicated; when it is not possible, treatment should be guided by the inferred histologic class. 2) Ideally, measures and precautions should be implemented before starting treatment, with emphasis on the risk of infection. 3) The risks and benefits of treatment should be shared with the patient and his/her family. 4) The use of hydroxychloroquine (preferably) or chloroquine diphosphate is recommended for all patients (unless contraindicated) during the induction and maintenance phases. 5) The effectiveness of treatment should be evaluated with objective response criteria (complete remission/partial remission/refractoriness). 6) ACE inhibitors and/or ARBs are recommended as antiproteinuric agents for all patients (unless contraindicated). 7) The identification of clinical and/or laboratory signs suggestive of proliferative or membranous glomerulonephritis should prompt immediate implementation of specific therapy, including steroids and an immunosuppressive agent, even when histological confirmation is not possible. 8) Immunosuppressives must be used for at least 36 months, but these medications can be kept for longer periods; their discontinuation should only occur once the patient achieves and maintains a sustained and complete remission. 9) Lupus nephritis should be considered refractory when full or partial remission is not achieved after 12 months of appropriate treatment, in which case a new renal biopsy should be considered to help identify the cause of refractoriness and guide the therapeutic decision.

Relevance: 20.00%

Publisher:

Abstract:

Lingual thyroid gland is a rare clinical entity. An ectopic thyroid gland located at the base of the tongue may present with symptoms such as dysphagia, dysphonia, and upper airway obstruction. We present the case of an 8-year-old girl with a lingual thyroid who presented with dysphagia and a foreign body sensation in the throat. The diagnosis was reached through clinical examination, thyroid scintigraphy with Tc(99m) and ultrasound. A laryngoscopy was performed, which confirmed a spherical mass at the base of the tongue. Investigation should include thyroid function tests; in this case we observed subclinical hypothyroidism. There are different types of surgical approaches for the treatment of this condition; however, treatment with levothyroxine sodium allowed the stabilization of TSH levels and clinical improvement of symptoms over a 2-year follow-up.

Relevance: 20.00%

Publisher:

Abstract:

Dysphagia is relatively common in individuals with neurological disorders. The aim was to describe swallowing management and investigate factors associated with swallowing in a case series of patients with Parkinson's disease. This is a long-term study with 24 patients, observed over a five-year period (2006-2011). They underwent Fiberoptic Endoscopic Evaluation of Swallowing, the Functional Oral Intake Scale and therapeutic intervention every three months. In the therapeutic intervention they received orientation on exercises to improve swallowing. The Chi-square, Kruskal-Wallis and Fisher's tests were used, and the time to improvement or worsening of swallowing was described by Kaplan-Meier analysis. During the follow-up, ten patients improved, five stayed the same and nine worsened in swallowing functionality. The median time to improvement was ten months; worsening was preceded by a median of 33 months of follow-up. No factor was associated with improvement or worsening of swallowing. The maneuvers most frequently indicated in the therapeutic intervention were: chin tuck, bolus consistency, bolus effect, tongue strengthening, multiple swallows and vocal exercises. Swallowing management was characterized by swallowing assessment every three months, with the indication of compensatory and rehabilitation maneuvers aiming to maintain oral feeding without risks. There was no factor associated with swallowing functionality in this case series.