996 results for ease


Relevance: 10.00%

Abstract:

Postgraduate project/dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Dental Medicine.

Relevance: 10.00%

Abstract:

Dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Communication Sciences, branch of Marketing and Advertising.

Relevance: 10.00%

Abstract:

The proliferation of inexpensive workstations and networks has prompted several researchers to use such distributed systems for parallel computing. Attempts have been made to offer a shared-memory programming model on such distributed-memory computers. Most systems provide a shared memory that is coherent, in that all processes that use it agree on the order of all memory events. This dissertation explores the possibility of a significant improvement in the performance of some applications when they use non-coherent memory. First, a new formal model to describe existing non-coherent memories is developed. I use this model to prove that certain problems can be solved using asynchronous iterative algorithms on shared memory in which the coherence constraints are substantially relaxed. In the course of the development of the model I discovered a new type of non-coherent behavior called Local Consistency. Second, a programming model, Mermera, is proposed. It provides programmers with a choice of hierarchically related non-coherent behaviors along with one coherent behavior. Thus, one can trade off the ease of programming with coherent memory against improved performance with non-coherent memory. As an example, I present a program to solve a linear system of equations using an asynchronous iterative algorithm. This program uses all the behaviors offered by Mermera. Third, I describe the implementation of Mermera on a BBN Butterfly TC2000 and on a network of workstations. The performance of a version of the equation-solving program that uses all the behaviors of Mermera is compared with that of a version that uses coherent behavior only. For a system of 1000 equations the former exhibits at least a 5-fold improvement in convergence time over the latter.
The version using coherent behavior only does not benefit from employing more than one workstation to solve the problem while the program using non-coherent behavior continues to achieve improved performance as the number of workstations is increased from 1 to 6. This measurement corroborates our belief that non-coherent shared memory can be a performance boon for some applications.
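The iterative scheme the abstract describes can be sketched in miniature (a hypothetical illustration, not Mermera code; the matrix, vector and function names are invented). In the non-coherent setting each process would read possibly stale entries of x; for a diagonally dominant A the iteration converges regardless, which is the property such algorithms exploit:

```python
import numpy as np

def jacobi(A, b, iters=100):
    """Plain Jacobi iteration for Ax = b (synchronous stand-in for the
    asynchronous version, where each process may read stale x entries)."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(iters):
        x_new = np.empty(n)
        for i in range(n):
            # off-diagonal row sum; in the async setting x may be stale
            s = A[i] @ x - A[i, i] * x[i]
            x_new[i] = (b[i] - s) / A[i, i]
        x = x_new
    return x

A = np.array([[4.0, 1.0], [2.0, 5.0]])  # diagonally dominant toy system
b = np.array([6.0, 9.0])
print(jacobi(A, b))  # converges toward the exact solution [7/6, 4/3]
```

Relaxing coherence lets each worker proceed with whatever values of x it last saw, trading exact iteration order for less synchronisation, which is where the reported speed-up comes from.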

Relevance: 10.00%

Abstract:

Despite the peer-to-peer community's obvious wish to have its systems adopted, specific mechanisms to facilitate incremental adoption have not yet received the same level of attention as the many other practical concerns associated with these systems. This paper argues that ease of adoption should be elevated to a first-class concern and accordingly presents HOLD, a front-end to existing DHTs that is optimized for incremental adoption. Specifically, HOLD is backwards-compatible: it leverages DNS to provide a key-based routing service to existing Internet hosts without requiring them to install any software. This paper also presents applications that could benefit from HOLD as well as the trade-offs that accompany HOLD. Early implementation experience suggests that HOLD is practical.
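The backwards-compatible routing idea can be sketched as follows (a speculative illustration only: HOLD's actual naming scheme is not specified here, and the gateway domain hold.example.net is invented). A legacy host encodes the DHT key as an ordinary DNS name, so the name server fronting the DHT can answer with the address of the node responsible for the key, with no new software on the client:

```python
import hashlib

def key_to_dns_name(key: str, gateway: str = "hold.example.net") -> str:
    """Encode a DHT key as a DNS name under a (hypothetical) gateway zone.
    Any unmodified Internet host can then resolve it with a normal lookup."""
    digest = hashlib.sha1(key.encode()).hexdigest()  # 40 hex chars
    return f"{digest}.{gateway}"

name = key_to_dns_name("my-object")
print(name)
# A legacy client would then simply call socket.gethostbyname(name),
# leaving all DHT-specific logic on the server side.
```

The design choice is the one the paper argues for: adoption cost is pushed onto the gateway operator, not onto every participating host.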

Relevance: 10.00%

Abstract:

Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically-realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML based schema, and multiple levels of granularity within a modern object oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multi-compartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local-field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plug-in development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform.

Relevance: 10.00%

Abstract:

Cerium dioxide (ceria) nanoparticles have been the subject of intense academic and industrial interest. Ceria has a host of applications; academic interest largely stems from its use in the modern automotive catalyst, but it is also of interest in many other application areas, notably as the abrasive in chemical-mechanical planarisation of silicon substrates. Recently, ceria has been the focus of research investigating the health effects of nanoparticles. Importantly, the role of non-stoichiometry in ceria nanoparticles is implicated in their biochemistry. Ceria has well-understood non-stoichiometry based around the ease of formation of anion vacancies, and these can form ordered superstructures based on the fluorite lattice structure exhibited by ceria. The anion vacancies are associated with localised or small-polaron states formed by the electrons that remain after oxygen desorption. In simple terms, these electrons combine with Ce4+ states to form Ce3+ states, whose larger ionic radius is associated with a lattice expansion compared to stoichiometric CeO2. This is a very simplistic explanation, and greater defect-chemistry complexity is suggested by more recent work. Various authors have shown that vacancies are mobile and may result in vacancy clustering. Ceria nanoparticles are of particular interest because of the high activity and surface area of small particulates. The sensitivity of the cerium electronic band structure to its environment suggests that changes in the properties of ceria particles at nanoscale dimensions might be expected. Notably, many authors report a lattice expansion with reducing particle size (largely confined to sub-10 nm particles). Most authors attribute the increased lattice dimensions to the presence of a stable Ce2O3-type surface layer at small nanoparticle dimensions. However, our understanding of oxide nanoparticles is limited, and their full, quantitative characterisation poses serious challenges.
In our own series of chemical preparations we see little evidence of a consistent model emerging to explain lattice parameter changes with nanoparticle size. Based on these results and a review of the literature, it is worthwhile asking whether a model of surface-enhanced defect concentration is consistent with known cerium/cerium oxide chemistries, whether it is applicable to a range of different synthesis methods, and whether a more consistent description is possible. In Chapter one the science of cerium oxide is outlined, including the crystal structure, the defect chemistry and the different oxidation states available. The uses and applications of cerium oxide are also discussed, as well as modelling of the lattice parameter and doping of the ceria lattice. Chapter two describes both the synthesis techniques and the analytical methods employed in this research. Chapter three focuses on high-surface-area ceria nanoparticles and how these were prepared using a citrate sol-gel precipitation method. The particle size was varied by calcining the ceria powders at different temperatures. X-ray diffraction methods were used to determine their lattice parameters. The particle sizes were also assessed using transmission electron microscopy (TEM), scanning electron microscopy (SEM) and BET; the lattice parameter was found to decrease with decreasing particle size. The results are discussed in light of the role played by surface tension effects. Chapter four describes the morphological and structural characterization of crystalline CeO2 nanoparticles prepared by forward and reverse precipitation techniques and compares these by powder X-ray diffraction (PXRD), nitrogen adsorption (BET) and high-resolution transmission electron microscopy (HRTEM) analysis. The two routes give quite different materials, although in both cases the products are essentially highly crystalline, dense particulates.
It was found that the reverse precipitation technique gave the smallest crystallites with the narrowest size dispersion. This route also gave as-synthesised materials with higher surface areas. HRTEM confirmed the observations made from PXRD data and showed that the two methods resulted in quite different morphologies and surface chemistries. The forward route gives products with significantly greater densities of Ce3+ species compared to the reverse route. The data are explained using known precipitation chemistry and kinetic effects. Chapter five centres on the addition of terbia to ceria, investigated using XRD, XRF, XPS and TEM. Good solid solutions were formed across the entire composition range, and there was no evidence for the formation of mixed phases or surface segregation over either the composition or the temperature range investigated. Both Tb3+ and Tb4+ ions exist within the solid solution, and the ratios of these cations are consistent with the addition of Tb8O15 to the fluorite ceria structure across a wide range of compositions. Local regions of anion-vacancy ordering may be visible for small crystallites. There is no evidence of significant Ce3+ ion concentrations formed at the surface or in the bulk by the addition of terbia. The lattice parameter of these materials was seen to decrease with decreasing crystallite size, consistent with increased surface tension effects at small dimensions. Chapter six reviews size-related lattice parameter changes and surface defects in ceria nanocrystals. Ceria (CeO2) has many important applications, notably in catalysis, and many of its uses rely on generating nanodimensioned particles. Ceria has important redox chemistry, in which Ce4+ cations can be reversibly reduced to Ce3+ cations with associated anion vacancies. The significantly larger size of Ce3+ (compared with Ce4+) has been shown to result in lattice expansion.
Many authors have observed lattice expansion in nanodimensioned crystals (nanocrystals), and this has been attributed to the presence of stabilized Ce3+/anion-vacancy combinations in these systems. Experimental results presented here show (i) that significant but complex changes in the lattice parameter with size can occur in 2-500 nm crystallites, (ii) that there is a definitive relationship between defect chemistry and the lattice parameter in ceria nanocrystals, and (iii) that the stabilizing mechanism for the Ce3+/anion-vacancy defects at the surface of ceria nanocrystals is determined by the size, the surface status and the analysis conditions. In this work, both lattice expansion and a more unusual lattice contraction in ultrafine nanocrystals are observed. The lattice deformations seen can be described as a function of both the anion vacancy (hydroxyl) concentration in the nanocrystal and the magnitude of the additional pressure imposed by the surface tension on the crystal. The expansion of lattice parameters in ceria nanocrystals is attributed to a number of factors, most notably the presence of any hydroxyl moieties in the materials. Thus, a very careful understanding of the synthesis, combined with characterization, is required to understand the surface chemistry of ceria nanocrystals.
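The reduction chemistry sketched above is conventionally written in Kröger-Vink notation (standard defect-chemistry shorthand, not taken from the thesis itself): a lattice oxygen leaves as gas, creating a doubly positively charged anion vacancy, and the two electrons left behind localise on cerium as Ce3+:

```latex
\mathrm{O_O^{\times}} + 2\,\mathrm{Ce_{Ce}^{\times}}
\;\longrightarrow\;
\mathrm{V_O^{\bullet\bullet}} + 2\,\mathrm{Ce_{Ce}'} + \tfrac{1}{2}\,\mathrm{O_2(g)}
```

Equivalently, 2Ce4+ + O2- → 2Ce3+ + V_O + ½O2; the larger Ce3+ radius is what links this reaction to the lattice expansion discussed in the text.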

Relevance: 10.00%

Abstract:

Electronic signal processing systems currently employed at core internet routers require huge amounts of power to operate, and they may be unable to continue to satisfy consumer demand for more bandwidth without an inordinate increase in cost, size and/or energy consumption. Optical signal processing techniques may be deployed in next-generation optical networks for simple tasks such as wavelength conversion, demultiplexing and format conversion at high speed (≥100 Gb/s) to alleviate the pressure on existing core router infrastructure. To implement optical signal processing functionalities, it is necessary to exploit the nonlinear optical properties of suitable materials such as III-V semiconductor compounds, silicon, periodically-poled lithium niobate (PPLN), highly nonlinear fibre (HNLF) or chalcogenide glasses. However, nonlinear optical (NLO) components such as semiconductor optical amplifiers (SOAs), electroabsorption modulators (EAMs) and silicon nanowires are the most promising candidates as all-optical switching elements vis-à-vis ease of integration, device footprint and energy consumption. This PhD thesis presents the amplitude and phase dynamics in a range of device configurations containing SOAs, EAMs and/or silicon nanowires to support the design of all-optical switching elements for deployment in next-generation optical networks. Time-resolved pump-probe spectroscopy using 3 ps pulses from mode-locked laser sources was utilized to accurately measure the carrier dynamics in the device(s) under test. The research work falls into four main topics: (a) a long SOA, (b) the concatenated SOA-EAM-SOA (CSES) configuration, (c) silicon nanowires embedded in SU8 polymer and (d) a custom-epitaxy EAM with fast carrier sweep-out dynamics.
The principal aim was to identify the optimum operation conditions for each of these NLO device configurations to enhance their switching capability and to assess their potential for various optical signal processing functionalities. All of the NLO device configurations investigated in this thesis are compact and suitable for monolithic and/or hybrid integration.

Relevance: 10.00%

Abstract:

The electroencephalogram (EEG) is a medical technology that is used in the monitoring of the brain and in the diagnosis of many neurological illnesses. Although coarse in its precision, the EEG is a non-invasive tool that requires minimal set-up time, and is suitably unobtrusive and mobile to allow continuous monitoring of the patient, in either clinical or domestic environments. Consequently, the EEG is the current tool-of-choice with which to continuously monitor the brain where temporal resolution, ease-of-use and mobility are important. Traditionally, EEG data are examined by a trained clinician who identifies neurological events of interest. However, recent advances in signal processing and machine learning techniques have allowed the automated detection of neurological events for many medical applications. In doing so, the burden of work on the clinician has been significantly reduced, improving the response time to illness and allowing the relevant medical treatment to be administered within minutes rather than hours. However, as typical EEG signals are of the order of microvolts (μV), contamination by signals arising from sources other than the brain is frequent. These extra-cerebral sources, known as artefacts, can significantly distort the EEG signal, making its interpretation difficult, and can dramatically degrade automated neurological event detection performance. This thesis therefore contributes to the further improvement of automated neurological event detection systems by identifying some of the major obstacles to deploying these EEG systems in ambulatory and clinical environments, so that EEG technologies can emerge from the laboratory towards real-world settings, where they can have a real impact on the lives of patients.
In this context, the thesis tackles three major problems in EEG monitoring, namely: (i) the problem of head-movement artefacts in ambulatory EEG, (ii) the high numbers of false detections in state-of-the-art, automated, epileptiform activity detection systems and (iii) false detections in state-of-the-art, automated neonatal seizure detection systems. To accomplish this, the thesis employs a wide range of statistical, signal processing and machine learning techniques drawn from mathematics, engineering and computer science. The first body of work outlined in this thesis proposes a system to automatically detect head-movement artefacts in ambulatory EEG, using supervised machine learning classifiers to do so. The resulting head-movement artefact detection system is the first of its kind and offers accurate detection of head-movement artefacts in ambulatory EEG. Subsequently, additional physiological signals, in the form of gyroscopes, are used to detect head-movements, bringing additional information to the head-movement artefact detection task. A framework for combining EEG and gyroscope signals is then developed, offering improved head-movement artefact detection. The artefact detection methods developed for ambulatory EEG are subsequently adapted for use in an automated epileptiform activity detection system. Information from support vector machine classifiers used to detect epileptiform activity is fused with information from artefact-specific detection classifiers in order to significantly reduce the number of false detections in the epileptiform activity detection system. By this means, epileptiform activity detection which compares favourably with other state-of-the-art systems is achieved.
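The classifier-fusion step can be sketched in miniature (hypothetical scores and thresholds; the thesis's actual SVM features and decision rules are not reproduced here). A detection from the epileptiform classifier is suppressed whenever an artefact-specific classifier attributes the same epoch to movement artefact:

```python
def fuse(epileptiform_score, artefact_score,
         seizure_thresh=0.7, artefact_thresh=0.6):
    """Accept an epoch only if it looks epileptiform AND clean of artefact.
    All scores/thresholds are invented for illustration."""
    return epileptiform_score >= seizure_thresh and artefact_score < artefact_thresh

epochs = [
    {"t": 0, "seiz": 0.9, "art": 0.2},  # genuine-looking discharge
    {"t": 1, "seiz": 0.8, "art": 0.9},  # head-movement artefact mimicking one
    {"t": 2, "seiz": 0.3, "art": 0.1},  # background activity
]
detections = [e["t"] for e in epochs if fuse(e["seiz"], e["art"])]
print(detections)  # [0] -- the artefact-contaminated epoch is rejected
```

The veto on epoch 1 is exactly the mechanism by which artefact-specific classifiers reduce false detections without touching genuine events.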
Finally, the problem of false detections in automated neonatal seizure detection is approached in an alternative manner: blind source separation techniques, complemented with information from additional physiological signals, are used to remove respiration artefact from the EEG. In utilising these methods, some encouraging advances have been made in detecting and removing respiration artefacts from the neonatal EEG, and in doing so the performance of the underlying diagnostic technology is improved, bringing its deployment in the real-world clinical domain one step closer.

Relevance: 10.00%

Abstract:

The main objective of this thesis is a critical analysis of the evolution of criminal justice systems over the past decade, with special attention to the fight against transnational terrorism. It is evident to any observer that this threat, and the associated risk that terrorism entails, has changed significantly over the past decade. This perception has generated responses, often radical ones, from States, as they have committed themselves to guaranteeing the safety of their populations and to easing a growing sentiment of social panic. This thesis seeks to analyse the characteristics of this new threat and the responses that States have developed in the fight against terrorism since 9/11, responses which have called into question some of the essential principles and values of their own legal systems. In this sense, freedom and security are placed into perspective through the analysis of the specific antiterrorist legal reforms of five different States: Israel, Portugal, Spain, the United Kingdom and the United States of America. In light of those antiterrorist reforms, it is then asked whether it is possible to speak of the emergence of a new system of criminal justice (and of a process of convergence between common law and civil law systems), built upon a control-and-prevention security framework significantly different from traditional models. Finally, this research project has the fundamental objective of contributing to a better understanding of the economic, social and civilisational costs of those legal reforms as regards human rights, the rule of law and democracy in modern States.

Relevance: 10.00%

Abstract:

Open environments involve distributed entities interacting with each other in an open manner. Many distributed entities are unknown to each other but need to collaborate and share resources in a secure fashion. Usually resource owners alone decide who is trusted to access their resources. Since resource owners in open environments do not have a complete picture of all trusted entities, trust management frameworks are used to ensure that only authorized entities will access requested resources. Every trust management system has limitations, and these limitations can be exploited by malicious entities. One vulnerability is the lack of a globally unique interpretation for permission specifications. This limitation means that a malicious entity which receives a permission in one domain may misuse the permission in another domain via some deceptive but apparently authorized route; this malicious behaviour is called subterfuge. This thesis develops a secure approach, Subterfuge Safe Trust Management (SSTM), that prevents subterfuge by malicious entities. SSTM employs the Subterfuge Safe Authorization Language (SSAL), which uses the idea of a local permission with a globally unique interpretation (localPermission) to resolve the misinterpretation of permissions. We model and implement SSAL with an ontology-based approach, SSALO, which provides a generic representation of the knowledge related to an SSAL-based security policy. SSALO enables the integration of heterogeneous security policies, which is useful for secure cooperation among principals in open environments where each principal may have a different security policy with a different implementation. Another advantage of an ontology-based approach is the Open World Assumption, whereby reasoning over an existing security policy is easily extended to include further security policies that might be discovered in an open distributed environment.
We add two extra SSAL rules to support dynamic coalition formation and secure cooperation among coalitions. Secure federation of cloud computing platforms and secure federation of XMPP servers are presented as case studies of SSTM. The results show that SSTM provides robust accountability for the use of permissions in federation. It is also shown that SSAL is a suitable policy language for expressing subterfuge-safe policy statements, owing to its well-defined semantics, ease of use, and integrability.
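The localPermission idea can be illustrated with a short sketch (invented names, not SSAL syntax): a permission is never a bare string, but is qualified by the globally unique identity of the domain that defined it, so a permission granted in one domain cannot be replayed against another:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LocalPermission:
    """A local permission bound to a globally unique issuer identity
    (e.g. a public key or domain name), giving it one interpretation."""
    issuer: str
    permission: str

def authorize(granted: set, issuer: str, permission: str) -> bool:
    # A request is honoured only against the issuer that defined it.
    return LocalPermission(issuer, permission) in granted

granted = {LocalPermission("alice.example", "read:db")}
print(authorize(granted, "alice.example", "read:db"))  # True
print(authorize(granted, "bob.example", "read:db"))    # False: no subterfuge
```

Because "read:db" only ever means "read:db as defined by alice.example", the deceptive cross-domain route the abstract describes is closed off by construction.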

Relevance: 10.00%

Abstract:

More and more often, universities decide to implement integrated learning management systems. Nevertheless, these technological deployments are not trouble-free, and are achieved with varying degrees of success and user satisfaction (Valenduc, 2000). This is why the present study aims to identify the factors influencing learning management system satisfaction and acceptance among students. The technology acceptance model of Wixom and Todd (2005) studies information system acceptance through user satisfaction, and has the benefit of incorporating several ergonomic factors. More precisely, the survey based on this model investigates behavioral attitudes towards the system, perceived ease of use and perceived usefulness, as well as system satisfaction and information satisfaction, and also incorporates two groups of factors that separately affect the two types of satisfaction. The study was conducted on a representative sample of 593 students from a Brussels university which had recently implemented an integrated learning management system. The results show, on the one hand, the impact of system reliability, accessibility, flexibility, layout and the functionalities offered on system satisfaction, and, on the other hand, the impact of information accuracy, intelligibility, relevance, exhaustiveness and currency on information satisfaction. In conclusion, the results indicate the applicability of the theoretical model to learning management systems, and also highlight the importance of each aforementioned factor for the successful implementation of such a system in universities.

Relevance: 10.00%

Abstract:

Gemstone Team MICE (Modifying and Improving Computer Ergonomics)

Relevance: 10.00%

Abstract:

How do separate neural networks interact to support complex cognitive processes such as remembrance of the personal past? Autobiographical memory (AM) retrieval recruits a consistent pattern of activation that potentially comprises multiple neural networks. However, it is unclear how such large-scale neural networks interact and are modulated by properties of the memory retrieval process. In the present functional MRI (fMRI) study, we combined independent component analysis (ICA) and dynamic causal modeling (DCM) to understand the neural networks supporting AM retrieval. ICA revealed four task-related components consistent with the previous literature: 1) medial prefrontal cortex (PFC) network, associated with self-referential processes, 2) medial temporal lobe (MTL) network, associated with memory, 3) frontoparietal network, associated with strategic search, and 4) cingulooperculum network, associated with goal maintenance. DCM analysis revealed that the medial PFC network drove activation within the system, consistent with the importance of this network to AM retrieval. Additionally, memory accessibility and recollection uniquely altered connectivity between these neural networks. Recollection modulated the influence of the medial PFC on the MTL network during elaboration, suggesting that greater connectivity among subsystems of the default network supports greater re-experience. In contrast, memory accessibility modulated the influence of frontoparietal and MTL networks on the medial PFC network, suggesting that ease of retrieval involves greater fluency among the multiple networks contributing to AM. These results show the integration between neural networks supporting AM retrieval and the modulation of network connectivity by behavior.
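The ICA step can be sketched in miniature (a toy, not the study's pipeline: two synthetic independent time courses, an invented mixing matrix, and |excess kurtosis| as the non-Gaussianity contrast; real spatial ICA of fMRI operates on thousands of voxels). Whitening followed by a rotation search recovers the components:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
s1 = np.sign(rng.standard_normal(n))   # sub-Gaussian "network" time course
s2 = rng.laplace(size=n)               # super-Gaussian time course
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]]) # unknown mixing (the measured data)
X = A @ S

# Whiten the mixtures: zero mean, identity covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = np.diag(d ** -0.5) @ E.T @ X

def kurt(y):
    """Excess kurtosis: 0 for Gaussian, negative for s1, positive for s2."""
    return np.mean(y**4) / np.mean(y**2)**2 - 3.0

def rot(t):
    return np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])

# After whitening, only a rotation remains: grid-search the angle that
# maximises total non-Gaussianity of the two projections.
best = max(np.linspace(0, np.pi / 2, 180),
           key=lambda t: sum(abs(kurt(y)) for y in rot(t) @ Z))
Y = rot(best) @ Z
print([round(kurt(y), 1) for y in Y])  # one near -2 (sign source), one clearly positive
```

Methods such as FastICA replace the grid search with a fixed-point iteration, but the decomposition into independent task-related components rests on the same non-Gaussianity principle.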