929 results for Source Code Analysis
Abstract:
Objective: Gelastic seizures are a frequent and well-established manifestation of the epilepsy associated with hypothalamic hamartomas. Scalp EEG recordings very seldom demonstrate clear spike activity, so information about the ictal epileptic dynamics is limited. In this work, we try to isolate epileptic rhythms in gelastic seizures and study their generators. Methods: We extracted rhythmic activity from scalp EEG recordings of gelastic seizures using decomposition into independent components (ICA) in three patients, two with hypothalamic hamartomas and one with no hypothalamic lesion. Time analysis of these rhythms and inverse source analysis were performed to recover their foci of origin and temporal dynamics. Results: In the two patients with hypothalamic hamartomas, consistent ictal delta (2–3 Hz) rhythms were present, with subcortical generators in both and a superficial one in a single patient. The latter pattern was observed in the patient with no hypothalamic hamartoma visible on MRI. The deep generators activated earlier than the superficial ones, suggesting a consistent subcortical origin of the rhythmic activity. Conclusions: Our data are compatible with early and brief epileptic generators in deep subcortical regions and more superficial ones activating later. Significance: Gelastic seizures express rhythms on scalp EEG compatible with epileptic activity originating in subcortical generators and secondarily involving cortical ones.
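The pipeline described above (blind separation of scalp EEG into independent components, followed by selection of rhythmic components before source analysis) can be sketched as follows. This is a minimal, illustrative Python sketch and not the authors' code; the array `eeg` (channels × samples), the sampling rate `fs` and the 2–3 Hz delta band are assumptions taken from the abstract.

```python
# Minimal sketch: unmix multichannel scalp EEG with ICA and rank the
# components by the fraction of their power inside the ictal delta band.
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import FastICA

def extract_rhythmic_components(eeg, fs, band=(2.0, 3.0)):
    """eeg: array of shape (n_channels, n_samples). Returns component
    time courses, their scalp topographies and a band-power score."""
    ica = FastICA(random_state=0)
    sources = ica.fit_transform(eeg.T).T            # (n_components, n_samples)
    scores = []
    for s in sources:
        f, pxx = welch(s, fs=fs, nperseg=int(4 * fs))
        in_band = (f >= band[0]) & (f <= band[1])
        scores.append(pxx[in_band].sum() / pxx.sum())
    order = np.argsort(scores)[::-1]
    # ica.mixing_[:, i] is the scalp topography of component i; this is the
    # map that a source-analysis step (e.g. sLORETA) would localize.
    return sources[order], ica.mixing_[:, order], np.asarray(scores)[order]
```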
Abstract:
Objective: The Panayiotopoulos type of idiopathic occipital epilepsy has peculiar and easily recognizable ictal symptoms, which are associated with complex and variable spike activity over the posterior scalp areas. These characteristics of the spikes have prevented localization of the particular brain regions originating the clinical manifestations. We studied spike activity in this epilepsy to determine its brain generators. Methods: The EEGs of 5 patients (ages 7–9) were recorded; spikes were submitted to blind decomposition into independent components (ICs), which were in turn submitted to source analysis (sLORETA), revealing the spike generators. Coherence analysis evaluated the dynamics of the components. Results: Several ICs were recovered for posterior spikes, in contrast to central spikes, which produced a single one. Coherence analysis supports a model in which epileptic activity originates near the lateral occipital area and spreads to cortical temporal or parietal areas. Conclusions: Posterior spikes demonstrate rapid spread of epileptic activity to nearby lobes, starting in the lateral occipital area. In contrast, central spikes remain localized in the rolandic fissure. Significance: Rapid spread of posterior epileptic activity in the Panayiotopoulos type of occipital lobe epilepsy is responsible for the variable and poorly localized spike EEG. The lateral occipital cortex is the primary generator of the epileptic activity.
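The coherence analysis mentioned in the Methods can be illustrated with a short, hedged sketch: pairwise magnitude-squared coherence between independent-component time courses, averaged within a frequency band. The array `sources` (components × samples) and the band limits are assumptions, not the authors' parameters.

```python
# Illustrative sketch: band-averaged pairwise coherence between IC time courses.
import numpy as np
from scipy.signal import coherence

def pairwise_coherence(sources, fs, band=(1.0, 30.0)):
    """sources: array of shape (n_components, n_samples) sampled at fs Hz.
    Returns a symmetric matrix of band-averaged coherence values."""
    n = sources.shape[0]
    cmat = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            f, cxy = coherence(sources[i], sources[j], fs=fs, nperseg=int(2 * fs))
            in_band = (f >= band[0]) & (f <= band[1])
            cmat[i, j] = cmat[j, i] = cxy[in_band].mean()
    return cmat
```

Strongly coherent components whose estimated sources lie in different lobes are the kind of evidence that supports a spread model such as the one described above.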
Abstract:
Objective: The epilepsies associated with the tuberous sclerosis complex (TSC) are very often refractory to medical therapy. Surgery for epilepsy is an effective alternative when a critical link can be established between the localization of seizure onset on the scalp and a particular cortical tuber. In this study we analysed ictal and interictal EEG to strengthen this link. Methods: The ictal and interictal recordings of four patients with TSC undergoing surgery for epilepsy were submitted to independent component analysis (ICA), followed by source analysis using the sLORETA algorithm. The localizations obtained for the ictal EEG and for the averaged interictal spikes were compared. Results: The ICA of ictal EEG produced consistent results across different events, and there was good agreement with the tubers that were successfully removed in three of the four patients (one patient refused surgery). In some patients there was a large discrepancy between the localizations of ictal and interictal sources; the interictal activity produced more widespread source localizations. Conclusions: The use of ICA of ictal EEG followed by source analysis methods in four cases of epilepsy and TSC was able to localize the epileptic generators very near the lesions successfully removed in surgery for epilepsy. Significance: The ICA of ictal EEG events may be a useful add-on to the tools used to establish the connection between epileptic scalp activity and the cortical tubers originating it, in patients with TSC considered for epilepsy surgery.
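One simple way to quantify the agreement (or discrepancy) between ictal and interictal localizations described above is the distance between the peaks of the two source maps. The sketch below is illustrative only; `grid`, `ictal_map` and `interictal_map` are assumed inputs (source-space coordinates in mm and per-voxel sLORETA-like magnitudes), not part of the study.

```python
# Illustrative sketch: separation between the maxima of two source maps.
import numpy as np

def peak_separation(grid, ictal_map, interictal_map):
    """grid: (n_voxels, 3) coordinates in mm; the maps are per-voxel
    activation magnitudes. Returns the peak-to-peak distance in mm."""
    p_ictal = grid[np.argmax(np.abs(ictal_map))]
    p_interictal = grid[np.argmax(np.abs(interictal_map))]
    return float(np.linalg.norm(p_ictal - p_interictal))
```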
Abstract:
A Work Project, presented as part of the requirements for the award of a Master's Degree in Management from the NOVA – School of Business and Economics
Abstract:
Concurrent programming is a difficult and error-prone task because the programmer must reason about multiple threads of execution and their possible interleavings. A concurrent program must synchronize the concurrent accesses to shared memory regions, but this is not enough to prevent all anomalies that can arise in a concurrent setting. The programmer can misidentify the scope of the regions of code that need to be atomic, resulting in atomicity violations and failing to ensure the correct behavior of the program. Executing a sequence of atomic operations may lead to incorrect results when these operations are co-related; in this case, the programmer may be required to enforce the sequential execution of those operations as a whole to avoid atomicity violations. This situation is especially common when the developer makes use of services from third-party packages or modules. This thesis proposes a methodology, based on design by contract, to specify which sequences of operations must be executed atomically. We developed an analysis that statically verifies that a client of a module respects its contract, allowing the programmer to identify the source of possible atomicity violations.
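The kind of violation described above can be illustrated with a short, hedged sketch (the class, method names and contract below are hypothetical, not taken from the thesis): each module operation is atomic on its own, yet a client that composes two co-related operations without making the whole sequence atomic can still lose updates.

```python
# Illustrative sketch of an atomicity violation between co-related operations.
import threading

class Account:
    """Hypothetical module: each operation is individually atomic."""
    def __init__(self, balance=0):
        self._balance = balance
        self._lock = threading.Lock()

    def get_balance(self):              # atomic read
        with self._lock:
            return self._balance

    def set_balance(self, value):       # atomic write
        with self._lock:
            self._balance = value

# Informal contract: "get_balance; set_balance" must run as one atomic sequence.
SEQUENCE_LOCK = threading.Lock()

def buggy_deposit(acc, amount):
    # Violates the contract: another thread can interleave between the calls.
    acc.set_balance(acc.get_balance() + amount)

def correct_deposit(acc, amount):
    # Respects the contract: the whole co-related sequence is atomic.
    with SEQUENCE_LOCK:
        acc.set_balance(acc.get_balance() + amount)

if __name__ == "__main__":
    acc = Account()
    workers = [threading.Thread(target=lambda: [buggy_deposit(acc, 1) for _ in range(10_000)])
               for _ in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print("expected 40000, got", acc.get_balance())   # often less than 40000
```

A static analysis like the one proposed would flag `buggy_deposit` because the call sequence it issues does not match the module's declared atomic sequence.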
Abstract:
One of the biggest challenges facing humanity is global warming and, consequently, climate change. Even though there has been increasing public awareness and investment from numerous countries in renewable energies, fossil fuels are, and will continue to be in the near future, the main source of energy. Carbon capture and storage (CCS) is believed to be a serious measure to mitigate CO2 concentrations. Briefly, CCS consists of capturing CO2 from the atmosphere or from stationary emission sources and transporting and storing it via mineral carbonation, in oceans or in geological media. The latter is referred to as carbon capture and geological storage (CCGS) and is considered the most promising of all solutions. Generally it involves a storage formation (e.g. depleted oil reservoirs and deep saline aquifers) and a sealing formation (commonly termed caprock in the oil industry). The present study concerns the injection of CO2 into deep aquifers; regardless of injection conditions, temperature gradients between the carbon dioxide and the storage formation are likely to occur. Should the CO2 temperature be lower than that of the storage formation, a contractive behaviour of the reservoir and caprock is expected. This contraction can result in the opening of new paths or the re-opening of fractures, favouring leakage and compromising the CCGS project. During CO2 injection, coupled thermo-hydro-mechanical phenomena occur which, due to their complexity, hamper the assessment of their relative influences. For this purpose, several analyses were carried out to evaluate these influences, focusing on the thermal contractive behaviour. It was concluded that, depending on the mechanical and thermal properties of the aquifer–seal pair, the sealing caprock can undergo significant decreases in effective stress.
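As a rough, hedged illustration of why the thermal contraction matters (not a result from the thesis), linear thermoelasticity for a laterally constrained layer gives a horizontal stress reduction of about E·α·ΔT/(1 − ν) when the rock is cooled by ΔT; all parameter values below are generic assumptions.

```python
# Order-of-magnitude sketch of the thermoelastic stress drop caused by cooling.
E = 10e9      # Young's modulus of the caprock, Pa (assumed)
nu = 0.25     # Poisson's ratio (assumed)
alpha = 1e-5  # linear thermal expansion coefficient, 1/K (assumed)
dT = 30.0     # cooling near the injector, K (assumed)

d_sigma = E * alpha * dT / (1 - nu)   # horizontal stress reduction, Pa
print(f"stress drop ~ {d_sigma / 1e6:.1f} MPa")   # ~4 MPa for these values
```

A drop of a few MPa in total stress, with pore pressure unchanged, translates into a comparable drop in effective stress, which is why fracture re-opening in the caprock is a concern.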
Abstract:
The benefits of long-term monitoring have drawn considerable attention in healthcare. Since the acquired data provide an important source of information for clinicians and researchers, long-term monitoring studies have become frequent. However, long-term monitoring can result in massive datasets, which makes the analysis of the acquired biosignals a challenge. In this case, visualization, which is a key point in signal analysis, presents several limitations, and the handling of annotations, on which some machine learning algorithms depend, turns out to be a complex task. To overcome these problems, a novel web-based application for fast and user-friendly biosignal visualization and annotation was developed. This was made possible through the study and implementation of a visualization model. The main process of this model, the visualization process, comprised the definition of the domain problem, the abstraction design, the development of a multilevel visualization, and the study and choice of the visualization techniques that best communicate the information carried by the data. In a second process, the visual encoding variables were the study target. Finally, improved interaction and exploration techniques were implemented, among which annotation handling stands out. Three case studies are presented and discussed, and a usability study supports the reliability of the implemented work.
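A common building block for this kind of multilevel visualization of massive biosignals is min/max downsampling: pre-computing per-block envelopes so that any zoom level transfers a bounded number of points. The sketch below is illustrative (not code from the application), assuming a 1-D NumPy array `signal`.

```python
# Illustrative sketch: min/max envelope downsampling for long biosignals.
import numpy as np

def minmax_downsample(signal, n_bins):
    """Split `signal` into n_bins blocks and return per-block (mins, maxs),
    which preserves peaks that plain decimation would drop."""
    usable = len(signal) - len(signal) % n_bins
    blocks = signal[:usable].reshape(n_bins, -1)
    return blocks.min(axis=1), blocks.max(axis=1)

# Example: a 24 h ECG sampled at 1000 Hz (86.4 million samples) reduced to
# 2000 bins is drawn with only 4000 points, regardless of recording length.
```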
Abstract:
The purpose of this work is to understand the internal and external structure in which the company operates and to provide an idea of the strategic actions needed to accomplish its organizational objectives. Strategic-analysis software was employed to build up phase one and phase two. Phase one involved analysing the internal and external factors that influence the company, understanding its core competences and the factors that influence the market, and identifying strengths and weaknesses. Phase two consisted of providing an idea of the company's real competitive position and suggesting a development strategy: given the possible limitations in the external factors, the company should carefully analyse some of the opportunities present in the industry overseas to continue developing its business and increase its profitability. Furthermore, a source of competitive advantage was found in its outbound logistics, which could serve as a differentiator from its competitors.
Abstract:
Introduction: Polymerase chain reaction (PCR) may offer an alternative diagnostic option when clinical signs and symptoms suggest visceral leishmaniasis (VL) but microscopic scanning and serological tests provide negative results. PCR using urine is sensitive enough to diagnose human VL; however, DNA quality is a crucial factor for successful amplification. Methods: A comparative performance evaluation of methods for extracting DNA from the urine of patients with VL, using two commercially available extraction kits and two phenol-chloroform protocols, was conducted to determine which method produces the highest-quality DNA suitable for PCR amplification, as well as which is the most sensitive, fastest and least expensive. All commercially available kits were able to shorten the duration of DNA extraction. Results: With regard to detection limits, both phenol:chloroform extraction and the QIAamp DNA Mini Kit provided good results (0.1 pg of DNA) for the extraction of DNA from a parasite smaller than Leishmania (Leishmania) infantum (<100 fg of DNA). However, among 11 urine samples from subjects with VL, better performance was achieved with the phenol:chloroform method (8/11) relative to the QIAamp DNA Mini Kit (4/11), with a greater number of positive samples detected at a lower cost using PCR. Conclusion: Our results demonstrate that phenol:chloroform with an ethanol precipitation prior to extraction is the most efficient method in terms of yield and cost, using urine as a non-invasive source of DNA and providing an alternative diagnostic method at a low cost.
Abstract:
User-generated advertising has changed the world of advertising and the strategies used by marketers. Many researchers have explored the dimensions of source credibility in traditional media and online advertising; however, little previous research has explored the dimensions of source credibility in the context of user-generated advertising. This exploratory study aims to investigate the different dimensions of source credibility in the case of user-generated advertising. More precisely, this study explores the following factors: (1) objectivity, (2) trustworthiness, (3) expertise, (4) familiarity, (5) attractiveness and (6) frequency. The results suggest that some of the dimensions of source credibility (objectivity, trustworthiness, expertise, familiarity and attractiveness) remain the same in the case of user-generated advertising. Additionally, a new dimension, reputation, is added to the factors that explain source credibility. Furthermore, the analysis suggests that the dimension “frequency” is not an explanatory factor of credibility in the case of user-generated advertising. The study also suggests that companies using user-generated advertising as part of their overall marketing strategy should focus on objectivity, trustworthiness, expertise, attractiveness and reputation when selecting the users who will communicate sponsored user-generated advertisements.
Abstract:
This thesis project concentrated on both the study and treatment of an early 20th-century male portrait in oil from the Colecção Caixa Geral de Depósitos, Lisbon, Portugal. The portrait of Januário Correia de Almeida exhibits a tear (approximately 4.0 cm by 2.3 cm) associated with paint loss on the upper right side, where it is possible to observe an unusually thick size layer (approximately 50 microns) and an open-weave mesh canvas. Size layers made from animal glue remain subject to severe dimensional changes due to changes in relative humidity (RH), thereby affecting the stability of the painting. In this case, the response of the size layer to moisture is minimal and the painting is largely uncracked, with very little active flaking. This suggests that the size layer has undergone pre-treatment to render it unresponsive to moisture or water. Reconstructions based on late nineteenth-century recipes using historically appropriate materials are used to explore various options for modifying the characteristics of gelatine, some of which may relate to the portrait's size layer. The thesis is separated into two parts. Part 1 describes the history, condition, materials and techniques of the painting; it also details the treatment of Januário Correia de Almeida, as well as the choices made and problems encountered during the treatment. Part 2 discusses the history of commercial gelatine production, the choice of the appropriate animal source from which to extract the collagen used to produce reconstructions of the portrait's size layer, and the characterization of selected reconstructions. The execution of a shallow textured infill led to one publication and one presentation: abstract accepted for presentation and publication at the International Meeting on Retouching of Cultural Heritage (RECH3), Francisco Brites, Leslie Carlyle and Raquel Marques, “Hand building a Low Profile Textured Fill for a Large Loss”.
Abstract:
The present case study concerns the analysis of the sale of Banif Mais, the sub-holding of the Banif Group for specialized credit activity, taking into account the bank's financial situation in 2014. In 2011, Portugal was subjected to an external financing programme, carried out by the Troika, which imposed very restrictive measures on the financial sector. Subsequently, Banif was not able to achieve the required results and had to resort to Government financing, remaining under a recapitalization plan since 2012.
Abstract:
Master's dissertation in Informatics Engineering
Abstract:
Master's dissertation in Molecular Genetics
Abstract:
PhD thesis in Biomedical Engineering