839 results for Ease of use


Relevance:

100.00%

Publisher:

Abstract:

Making use of very detailed neurophysiological, anatomical, and behavioral data to build biologically-realistic computational models of animal behavior is often a difficult task. Until recently, many software packages have tried to resolve this mismatched granularity with different approaches. This paper presents KInNeSS, the KDE Integrated NeuroSimulation Software environment, as an alternative solution to bridge the gap between data and model behavior. This open source neural simulation software package provides an expandable framework incorporating features such as ease of use, scalability, an XML based schema, and multiple levels of granularity within a modern object oriented programming design. KInNeSS is best suited to simulate networks of hundreds to thousands of branched multi-compartmental neurons with biophysical properties such as membrane potential, voltage-gated and ligand-gated channels, the presence of gap junctions or ionic diffusion, neuromodulation channel gating, the mechanism for habituative or depressive synapses, axonal delays, and synaptic plasticity. KInNeSS outputs include compartment membrane voltage, spikes, local-field potentials, and current source densities, as well as visualization of the behavior of a simulated agent. An explanation of the modeling philosophy and plug-in development is also presented. Further development of KInNeSS is ongoing with the ultimate goal of creating a modular framework that will help researchers across different disciplines to effectively collaborate using a modern neural simulation platform.
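
As a rough sketch of the kind of dynamics such a simulator integrates, the code below advances one compartment's membrane potential with a leak current and a single voltage-gated current; all constants, kinetics and names are illustrative placeholders, not KInNeSS's actual API or defaults:

```python
import numpy as np

# Sketch: forward-Euler update of one compartment's membrane potential with a
# leak current and one voltage-gated K+ current. Constants are illustrative.
C = 1.0                        # membrane capacitance (uF/cm^2)
g_leak, E_leak = 0.1, -65.0    # leak conductance (mS/cm^2) and reversal (mV)
g_k, E_k = 36.0, -77.0         # voltage-gated conductance and reversal

def n_inf(v):
    """Steady-state activation of the gating variable (illustrative kinetics)."""
    return 1.0 / (1.0 + np.exp(-(v + 55.0) / 10.0))

def simulate(i_ext=10.0, dt=0.01, steps=5000):
    """Return the membrane-potential trace under a constant injected current."""
    v, n = -65.0, n_inf(-65.0)
    trace = np.empty(steps)
    for t in range(steps):
        n += dt * (n_inf(v) - n) / 5.0             # gating relaxes, tau = 5 ms
        i_ion = g_leak * (v - E_leak) + g_k * n**4 * (v - E_k)
        v += dt * (i_ext - i_ion) / C              # C dV/dt = I_ext - I_ion
        trace[t] = v
    return trace
```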


Relevance:

100.00%

Publisher:

Abstract:

The electroencephalogram (EEG) is a medical technology that is used in the monitoring of the brain and in the diagnosis of many neurological illnesses. Although coarse in its precision, the EEG is a non-invasive tool that requires minimal set-up times, and is suitably unobtrusive and mobile to allow continuous monitoring of the patient, either in clinical or domestic environments. Consequently, the EEG is the current tool of choice with which to continuously monitor the brain where temporal resolution, ease of use and mobility are important. Traditionally, EEG data are examined by a trained clinician who identifies neurological events of interest. However, recent advances in signal processing and machine learning techniques have allowed the automated detection of neurological events for many medical applications. In doing so, the burden of work on the clinician has been significantly reduced, improving the response time to illness, and allowing the relevant medical treatment to be administered within minutes rather than hours. However, as typical EEG signals are of the order of microvolts (μV), contamination by signals arising from sources other than the brain is frequent. These extra-cerebral sources, known as artefacts, can significantly distort the EEG signal, making its interpretation difficult, and can dramatically degrade the classification performance of automated neurological event detection. This thesis, therefore, contributes to the further improvement of automated neurological event detection systems by identifying some of the major obstacles to deploying these EEG systems in ambulatory and clinical environments, so that EEG technologies can emerge from the laboratory towards real-world settings, where they can have a real impact on the lives of patients. In this context, the thesis tackles three major problems in EEG monitoring, namely: (i) the problem of head-movement artefacts in ambulatory EEG, (ii) the high numbers of false detections in state-of-the-art, automated, epileptiform activity detection systems and (iii) false detections in state-of-the-art, automated neonatal seizure detection systems. To accomplish this, the thesis employs a wide range of statistical, signal processing and machine learning techniques drawn from mathematics, engineering and computer science. The first body of work outlined in this thesis proposes a system to automatically detect head-movement artefacts in ambulatory EEG, using supervised machine learning classifiers to do so. The resulting head-movement artefact detection system is the first of its kind and offers accurate detection of head-movement artefacts in ambulatory EEG. Subsequently, additional signals, in the form of gyroscopes, are used to detect head movements and, in doing so, bring additional information to the head-movement artefact detection task. A framework for combining EEG and gyroscope signals is then developed, offering improved head-movement artefact detection. The artefact detection methods developed for ambulatory EEG are subsequently adapted for use in an automated epileptiform activity detection system. Information from support vector machine classifiers used to detect epileptiform activity is fused with information from artefact-specific detection classifiers in order to significantly reduce the number of false detections in the epileptiform activity detection system. By this means, epileptiform activity detection that compares favourably with other state-of-the-art systems is achieved.
Finally, the problem of false detections in automated neonatal seizure detection is approached in an alternative manner; blind source separation techniques, complemented with information from additional physiological signals, are used to remove respiration artefact from the EEG. In utilising these methods, some encouraging advances have been made in detecting and removing respiration artefacts from the neonatal EEG, and in doing so, the performance of the underlying diagnostic technology is improved, bringing its deployment in the real-world clinical domain one step closer.
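
As a schematic of the classifier-fusion idea described above, the sketch below vetoes event detections on segments that a separately trained artefact classifier flags; the features, labels and thresholds are placeholders, not the thesis's actual pipeline:

```python
import numpy as np
from sklearn.svm import SVC

# Sketch: suppress epileptiform detections on segments flagged as artefact.
# X rows are per-segment feature vectors; labels are random placeholders.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 8))
y_event = rng.integers(0, 2, 200)      # 1 = epileptiform activity present
y_artefact = rng.integers(0, 2, 200)   # 1 = head-movement artefact present

event_clf = SVC(probability=True).fit(X_train, y_event)
artefact_clf = SVC(probability=True).fit(X_train, y_artefact)

def detect(segments):
    """Fusion rule: report an event only where the artefact score is low."""
    p_event = event_clf.predict_proba(segments)[:, 1]
    p_artefact = artefact_clf.predict_proba(segments)[:, 1]
    return (p_event > 0.5) & (p_artefact < 0.5)    # thresholds illustrative

flags = detect(rng.normal(size=(20, 8)))
```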

Relevance:

100.00%

Publisher:

Abstract:

More and more often, universities decide to implement integrated learning management systems. Nevertheless, these technological developments are not realized without difficulty, and are achieved with varying degrees of success and user satisfaction (Valenduc, 2000). This is why the present study aims at identifying the factors influencing learning management system satisfaction and acceptance among students. The technology acceptance model developed by Wixom and Todd (2005) studies information system acceptance through user satisfaction, and has the benefit of incorporating several ergonomic factors. More precisely, the survey, based on this model, investigates behavioral attitudes towards the system, perceived ease of use, perceived usefulness, system satisfaction and information satisfaction, and also incorporates two groups of factors affecting the two types of satisfaction separately. The study was conducted on a representative sample of 593 students from a Brussels university which had recently implemented an integrated learning management system. The results show, on the one hand, the impact of system reliability, accessibility, flexibility, layout and offered functionalities on system satisfaction, and, on the other hand, the impact of information accuracy, intelligibility, relevance, completeness and timeliness on information satisfaction. In conclusion, the results indicate the applicability of the theoretical model to learning management systems, and also highlight the importance of each aforementioned factor for a successful implementation of such a system in universities.
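
To make the chain of effects concrete, the sketch below estimates two links of such a model (system satisfaction to perceived ease of use to attitude) with ordinary least squares on simulated responses; this is a simplified stand-in for Wixom and Todd's full structural model, and every variable here is synthetic:

```python
import numpy as np

# Synthetic responses for n = 593 students (matching the study's sample size);
# the true path coefficients 0.6 and 0.5 are arbitrary.
rng = np.random.default_rng(1)
n = 593
sys_sat = rng.normal(size=n)                             # system satisfaction
ease = 0.6 * sys_sat + rng.normal(scale=0.8, size=n)     # perceived ease of use
attitude = 0.5 * ease + rng.normal(scale=0.8, size=n)    # attitude toward use

def ols_slope(x, y):
    """Slope of y ~ 1 + x via least squares."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

print("satisfaction -> ease of use:", round(ols_slope(sys_sat, ease), 3))
print("ease of use  -> attitude:   ", round(ols_slope(ease, attitude), 3))
```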

Relevance:

100.00%

Publisher:

Abstract:

The present paper reports the results of a study aiming to describe the attitudes of teachers in adult continuing education in the Autonomous Community of Andalusia (Spain) towards the use and integration of information and communication technologies (ICT) in the educational centres they work in, while identifying the factors that favour the development of good practice. It is a descriptive mixed-methods study, with information collected through a questionnaire and in-depth interviews. A total of 172 teachers in adult education were surveyed, as well as 18 head teachers and coordinators. For questionnaire validation the expert judgment technique was used, with experts selected by the «expert competence coefficient» or «K coefficient» procedure. To improve its psychometric properties, construct validity was determined by means of Varimax factor analysis with maximum likelihood extraction (two factors were extracted). Reliability was established with Cronbach's alpha (0.88). The interview guide was also validated by this group of experts. Results point out, on the one hand, that teachers hold positive attitudes towards ICT, regarding both ICT's role in professional development and its ease of use and access. On the other hand, among the most important factors for ICT-supported good educational practice is ICT's capacity to favour personalized work.
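
For reference, the reported reliability statistic can be computed directly from the item-response matrix; a minimal sketch, with simulated Likert-style responses standing in for the questionnaire data:

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) matrix of questionnaire scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the total score
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Simulated 1-5 Likert responses from 172 respondents to 20 items sharing
# one latent factor, so alpha should come out high.
rng = np.random.default_rng(2)
latent = rng.normal(size=(172, 1))
responses = np.clip(np.round(3 + latent + rng.normal(scale=0.7, size=(172, 20))), 1, 5)
print(cronbach_alpha(responses))
```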

Relevance:

100.00%

Publisher:

Abstract:

We propose a new method for estimating the covariance matrix of a multivariate time series of financial returns. The method is based on estimating sample covariances from overlapping windows of observations, which are then appropriately weighted to obtain the final covariance estimate. We extend the idea of (model) covariance averaging offered in the covariance shrinkage approach by means of greater ease of use, flexibility and robustness in averaging information over different data segments. The suggested approach does not suffer from the curse of dimensionality and can be used without problems of either approximation or any demand for numerical optimization.
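
A minimal sketch of the windowed-averaging idea; the exponential-decay weighting toward older windows is an illustrative choice, not necessarily the weighting proposed in the paper:

```python
import numpy as np

def averaged_covariance(returns, window=60, step=10, decay=0.97):
    """Weighted average of sample covariances from overlapping windows.

    returns: (T, N) matrix of asset returns. Newer windows receive more
    weight via exponential decay (an illustrative weighting choice).
    """
    T, N = returns.shape
    starts = list(range(0, T - window + 1, step))
    cov = np.zeros((N, N))
    total = 0.0
    for i, s in enumerate(starts):
        w = decay ** (len(starts) - 1 - i)     # most recent window gets w = 1
        cov += w * np.cov(returns[s:s + window], rowvar=False)
        total += w
    return cov / total

# Usage with simulated returns for 10 assets over 500 days:
sigma = averaged_covariance(np.random.default_rng(3).normal(size=(500, 10)))
```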

Relevance:

100.00%

Publisher:

Abstract:

Master's dissertation presented to the Instituto de Contabilidade e Administração do Porto for the degree of Master in Digital Marketing, under the supervision of Mestre António da Silva Vieira.

Relevance:

100.00%

Publisher:

Abstract:

Thesis submitted to obtain the degree of Master in Electronic and Telecommunications Engineering.

Relevance:

100.00%

Publisher:

Abstract:

Master's dissertation presented to the Instituto Superior de Contabilidade e Administração do Porto for the degree of Master in Digital Marketing, under the supervision of Mestre Paulo Gonçalves and Doutora Madalena Vilas Boas. This version does not include the criticisms and suggestions of the jury members.

Relevance:

100.00%

Publisher:

Abstract:

Research in crowdfunding is an emerging priority within the field of entrepreneurship. Hundreds of platforms nowadays provide multiple crowdfunding schemes intended to make it easier for entrepreneurs and others to collect money from the crowd. However, only a few campaigns become successful, as others do not reach the pre-established funding goal. It is thus necessary to keep improving our understanding of the dynamics of these platforms and the factors that explain success. Information asymmetry has been shown to be a delicate issue, as people perceive quality in different manners. This research therefore aims to understand which components of perceived quality most influence investment decisions. Mainly entrepreneurship and marketing theories were explored along the way. The research follows a causal approach in which nineteen hypotheses are tested. An experimental survey was conducted, and data were collected from 127 people who were asked to evaluate one of the most important pieces of any crowdfunding campaign, the pitch video, and consequently to invest in the presented products.

Relevance:

100.00%

Publisher:

Abstract:

Methoxypyrazines are aroma-active compounds found in many wine varietals. These compounds can be of grape-derived nature or can be introduced into wines via Coccinellidae beetles. Regardless of their origin, methoxypyrazines can have either a beneficial role for wine quality, contributing to the specificity of certain wine varietals (Cabernet sauvignon, Cabernet franc, Sauvignon blanc), or a detrimental role, particularly at higher concentrations, resulting in overpowering green, unripe and herbaceous notes. When methoxypyrazines of exogenous nature are responsible for these unpleasant characteristics, wines are considered to be affected by what is generally known as ladybug taint (LBT). This work is a collection of studies seeking to create a sensitive analytical method for the detection and quantification of methoxypyrazines in wines; to investigate the role of different Coccinellidae species in the tainting of wines with LBT and identify the main compounds in ladybug-tainted wines responsible for the typical green herbaceous characteristics; to determine the human detection threshold of 2,5-dimethyl-3-methoxypyrazine in wines as well as investigate its contribution to the aroma of wines; and finally to survey methoxypyrazine concentrations in a large set of wines from around the world. In the first study, an analytical method for the detection and quantification of methoxypyrazines in wines was created and validated. The method employs multidimensional gas chromatography coupled with mass spectrometry to detect four different methoxypyrazines (2,5-dimethyl-3-methoxypyrazine, isobutyl methoxypyrazine, secbutyl methoxypyrazine and isopropyl methoxypyrazine) in wine. The low limits of detection for the compounds of interest, improved separation and isolation capabilities, good validation data, as well as the ease of use recommend this method as a good alternative to the existing analytical methods for methoxypyrazine detection in wine. In the second study, the capacity of two Coccinellidae species found in many wine regions, Harmonia axyridis and Coccinella septempunctata, to taint wines is evaluated. Coccinella septempunctata is shown to be as capable of causing LBT in wines as Harmonia axyridis. Dimethyl methoxypyrazine, previously thought to be of exogenous nature only (from Coccinellidae haemolymph), is also detected in control (untainted) wines. The main odor-active compounds in LBT wines are investigated through Aroma Extract Dilution Analysis. These compounds are identified as isopropyl methoxypyrazine and sec- and iso-butyl methoxypyrazine. In the third study, the human detection threshold for dimethyl methoxypyrazine in wine is established to be 31 ng/L in the orthonasal modality and 70 ng/L retronasally. When wines spiked with various amounts of dimethyl methoxypyrazine are evaluated sensorially, dimethyl methoxypyrazine causes significant detrimental effects to wine aroma at a concentration of 120 ng/L. The final study examines methoxypyrazine (dimethyl methoxypyrazine, isopropyl methoxypyrazine, secbutyl methoxypyrazine and isobutyl methoxypyrazine) concentrations in 187 wines from around the world. Dimethyl methoxypyrazine is detected in the majority of the red wines tested. Data are interpreted through statistical analyses. A new measure for predicting greenness/herbaceousness in wines, the methoxypyrazine "total impact factor", is proposed.
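
As an illustration of how a detection threshold of this kind can be estimated, the sketch below fits a logistic psychometric curve to panel detection rates and reads off the 50% point; the data, concentrations and fitting choice are hypothetical, and the thesis's actual threshold methodology may differ:

```python
import numpy as np
from scipy.optimize import curve_fit

# Simulated panel data: fraction of assessors detecting the compound at each
# spiked concentration (ng/L). Values are placeholders, not the thesis's data.
conc = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
p_detect = np.array([0.08, 0.15, 0.35, 0.60, 0.85, 0.97])

def psychometric(c, c50, slope):
    """Logistic curve in log-concentration; c50 is the 50% detection point."""
    return 1.0 / (1.0 + np.exp(-slope * (np.log(c) - np.log(c50))))

(c50, slope), _ = curve_fit(psychometric, conc, p_detect, p0=[30.0, 2.0])
print(f"estimated detection threshold ~ {c50:.0f} ng/L")
```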

Relevance:

100.00%

Publisher:

Abstract:

Cardiovascular diseases remain a major cause of mortality and morbidity in developed societies. The search for predictive determinants of vascular events remains a topical challenge, given the rising cost of medical care and the growth of the affected populations, notably with the westernization of emerging countries such as India, Brazil and China. For thirty years, nuclear cardiology has held an essential place in the arsenal of diagnostic and prognostic methods for heart disease. Moreover, new advances will allow earlier and more precise screening for cardiac and peripheral atherosclerotic disease, both in affected populations and in primary prevention. In this thesis, we present two new approaches in nuclear cardiology. Endothelial dysfunction is considered the earliest pathological signal of atherosclerosis. Traditional cardiovascular risk factors impair endothelial function and can initiate the atherosclerotic process even in the absence of a physical endothelial lesion. Quantification of coronary endothelial function is therefore of definite interest as an early biomarker of coronary artery disease. Isotopic plethysmography, a methodology developed during this study cycle, quantifies peripheral endothelial function, which is correlated with coronary endothelial function. This methodology is demonstrated in the first manuscript (Harel et al., Physiol Meas., 2007). Radiolabelling of erythrocytes allows measurement of arterial flow in the upper limb during local reactive hyperemia. This new procedure was validated against strain-gauge plethysmography in a cohort of 26 patients. It showed excellent reproducibility (intraclass correlation coefficient = 0.89). Moreover, the arterial flow measured during the hyperemic response correlated with measurements obtained by the reference method (r = 0.87). The second manuscript lays out the basis of infrared spectroscopy as a methodology for measuring arterial flow and quantifying the hyperemic response (Harel et al., Physiol Meas., 2008). This study used a protocol of simultaneous triple measurements with strain-gauge, radioisotopic and infrared-spectroscopy methods. The spectroscopic technique was shown to be accurate and reproducible for measuring forearm arterial flow. This new procedure offered undeniable advantages in terms of artifact reduction and ease of use. The second part of my thesis concerns the analysis of cardiac contraction synchrony. Indeed, more than 30% of patients receiving resynchronization therapy show no clinical improvement. Moreover, this non-response rate is even higher when morphological criteria of response to resynchronization (reduction of end-systolic volume) are used. There is therefore an urgent need to develop a reliable and precise methodology for measuring cardiac dynamics. The third manuscript lays out the basis of a new radioisotopic technique for quantifying the left ventricular ejection fraction (Harel et al., J Nucl Cardiol., 2007).
The study, conducted on 202 patients, showed an excellent correlation (r = 0.84) with the reference method (planar ventriculography). Comparison with the QBS software (Cedars-Sinai) showed a lower standard deviation of the bias (7.44% vs 9.36%). Moreover, with our methodology the bias showed no correlation with the magnitude of the parameter, unlike the alternative software. The fourth manuscript concerned the quantification of left intraventricular dyssynchrony (Harel et al., J Nucl Cardiol, 2008). A new three-dimensional parameter (CHI: contraction homogeneity index) (median 73.8%; IQR 58.7%-84.9%) made it possible to integrate the amplitude and synchrony components of ventricular contraction. This parameter was validated by comparison with the standard deviation of the phase histogram (SDΦ) (median 28.2°; IQR 17.5°-46.8°) obtained by planar ventriculography in a study of 235 patients. These four manuscripts, already published in the specialized scientific literature, summarize a fraction of the research work we carried out during the last three years. This work falls within two major axes of development of 21st-century cardiology.
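
As a sketch of the agreement statistics quoted above (Pearson correlation, mean bias and standard deviation of the bias between two measurement methods), with simulated placeholder data:

```python
import numpy as np

# Simulated paired measurements for 202 patients (placeholder values only).
rng = np.random.default_rng(4)
reference = rng.normal(loc=50.0, scale=10.0, size=202)    # reference method
candidate = reference + rng.normal(loc=1.0, scale=7.0, size=202)

r = np.corrcoef(reference, candidate)[0, 1]               # Pearson correlation
bias = candidate - reference
print(f"r = {r:.2f}, mean bias = {bias.mean():.2f}, "
      f"SD of bias = {bias.std(ddof=1):.2f}")
```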

Relevance:

100.00%

Publisher:

Abstract:

This thesis argues that functions should be transparent during the metaprogramming phase. Metaprogramming is intended to give the programmer the ability to extend the compiler. In a functional programming style, however, the logic of a program resides in the definitions of the various functions composing it. Since functions are generally opaque, the impossibility of accessing this logic limits the possible applications of the metaprogramming phase. We illustrate the advantages that transparent functions bring to metaprogramming, giving in particular the example of symbolic computation and an example of new optimizations that become possible. We also show that function transparency makes it possible to bridge the gap between a program's datatypes and its functions. We further study what the presence of transparent functions implies for a language, concentrating on aspects related to their implementation, their performance, and their ease of use. We illustrate our points with Abitbol, a language designed specifically for metaprogramming.
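
To make the idea concrete outside Abitbol, the Python sketch below treats a function as transparent by recovering its definition as an abstract syntax tree and letting a metaprogram rewrite it; this only illustrates transparency, not Abitbol's actual mechanism (and it assumes the code runs from a file so the source is retrievable):

```python
import ast
import inspect

def f(x):
    return x * x + 3 * x

# A "transparent" view of f: its definition recovered as an AST from source.
tree = ast.parse(inspect.getsource(f))

class DoubleConstants(ast.NodeTransformer):
    """Toy metaprogram: rewrite every numeric literal c into 2*c."""
    def visit_Constant(self, node):
        if isinstance(node.value, (int, float)):
            return ast.Constant(value=2 * node.value)
        return node

new_tree = ast.fix_missing_locations(DoubleConstants().visit(tree))
namespace = {}
exec(compile(new_tree, "<transformed>", "exec"), namespace)
print(namespace["f"](5))   # x*x + 6*x at x=5 -> 55, from the rewritten definition
```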

Relevance:

100.00%

Publisher:

Abstract:

The Indian economy has witnessed stellar growth over the last few years, with rapid developments on the infrastructural and business fronts during the growth period. Internet adoption among Indians has been increasing over the last decade. Indian banks have also risen to the occasion by offering new delivery channels to their customers, and internet banking is one such new channel that has become available to Indian customers. Customer acceptance of internet banking has been good so far. In this study, the researcher conducted a qualitative and quantitative investigation of internet banking customer acceptance among Indians, and tried to identify the important factors that affect customers' behavioral intention towards internet banking. The researcher also proposes a research model, extended from the Technology Acceptance Model, for predicting internet banking acceptance. The findings of the study should be useful for Indian banks in planning and upgrading their internet banking services. Banks could increase internet banking adoption by making their customers aware of the usefulness of the service. The study shows that the variable perceived usefulness has a positive influence on internet banking use; internet banking acceptance would therefore increase when customers find it more useful. Banks should plan their marketing campaigns taking this factor into consideration: proper marketing communications that increase consumer awareness would result in better acceptance of internet banking. The variable perceived ease of use also had a positive influence on internet banking use, meaning that customers would increase their internet banking usage when they find it easier to use. Banks should therefore try to make their internet banking site and interface easier to use, and could also consider providing practical training sessions for customers at their branches on the usage of the internet banking interface.

Relevance:

100.00%

Publisher:

Abstract:

The process of developing software that takes advantage of multiple processors is commonly referred to as parallel programming. For various reasons, this process is much harder than the sequential case. For decades, parallel programming has been a problem for a small niche only: engineers working on parallelizing mostly numerical applications in High Performance Computing. This has changed with the advent of multi-core processors in mainstream computer architectures. Parallel programming nowadays becomes a problem for a much larger group of developers. The main objective of this thesis was to find ways to make parallel programming easier for them. Different aims were identified in order to reach the objective: research the state of the art of parallel programming today, improve the education of software developers about the topic, and provide programmers with powerful abstractions to make their work easier. To reach these aims, several key steps were taken. To start with, a survey was conducted among parallel programmers to find out about the state of the art. More than 250 people participated, yielding results about the parallel programming systems and languages in use, as well as about common problems with these systems. Furthermore, a study was conducted in university classes on parallel programming. It resulted in a list of frequently made mistakes that were analyzed and used to create a programmers' checklist to help avoid them in the future. For programmers' education, an online resource was set up to collect experiences and knowledge in the field of parallel programming, called the Parawiki. Another key step in this direction was the creation of the Thinking Parallel weblog, where more than 50,000 readers to date have read essays on the topic. For the third aim (powerful abstractions), it was decided to concentrate on one parallel programming system: OpenMP. Its ease of use and high level of abstraction were the most important reasons for this decision. Two different research directions were pursued. The first one resulted in a parallel library called AthenaMP. It contains so-called generic components, derived from design patterns for parallel programming. These include functionality to enhance the locks provided by OpenMP, to perform operations on large amounts of data (data-parallel programming), and to enable the implementation of irregular algorithms using task pools. AthenaMP itself serves a triple role: the components are well-documented and can be used directly in programs, it enables developers to study the source code and learn from it, and it is possible for compiler writers to use it as a testing ground for their OpenMP compilers. The second research direction was targeted at changing the OpenMP specification to make the system more powerful. The main contributions here were a proposal to enable thread cancellation and a proposal to avoid busy waiting. Both were implemented in a research compiler, shown to be useful in example applications, and proposed to the OpenMP Language Committee.
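
As a language-neutral illustration of the task-pool pattern mentioned above (AthenaMP implements it for OpenMP/C++; this Python sketch only shows the shape of the abstraction), workers drain a shared queue while processing items that may spawn further items:

```python
from queue import Queue
from threading import Thread

# Task pool for an irregular algorithm: processing one item may create new
# items, so the total amount of work is unknown up front. Illustrative
# pattern only, not AthenaMP's API.
results = []
tasks = Queue()

def worker():
    while True:
        item = tasks.get()
        if item is None:              # sentinel: shut this worker down
            tasks.task_done()
            return
        results.append(item * item)   # "process" the task (GIL makes append safe)
        if item < 10:
            tasks.put(item + 1)       # irregular: work creates more work
        tasks.task_done()

tasks.put(0)
threads = [Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
tasks.join()                          # block until all spawned work is finished
for _ in threads:
    tasks.put(None)                   # one sentinel per worker
for t in threads:
    t.join()
print(sorted(results))                # squares of 0..10
```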