957 results for interpretazione automatica, machine interpreting, sperimentazione, sperimentazione in contesto reale
Abstract:
Play is a concept that accompanies the lives of countless animal species in different forms, ways and times. Humans discover play within the first months of life. Gamification has emerged in recent years with the goal of improving people's emotional state as they carry out everyday tasks. The term denotes the integration of game-design techniques into non-game contexts. It consists of designing with particular attention to user engagement, so that users can experience the emotions typical of play: pride in their own actions, whatever those may be. The fields of application are countless. This thesis focuses on the corporate context, and on data-entry tasks in particular, with the aim of creating a complete platform, composed of software tools and game elements, that can increase employees' engagement in their work. This type of activity was chosen because it consists of tasks that are easy to measure and, at the same time, unexciting for the employee, being highly mechanical and repetitive. Experimentation in this setting therefore makes it possible to assess with mathematical certainty whether the improvements that gamification techniques bring to employees' state of mind also increase productivity, thereby verifying whether a gamified platform can be self-sustaining in a corporate setting. The thesis culminates in the design of a complete system, composed of software and non-software activities, which the employees evaluated through a questionnaire. The platform received good marks; to have the potential to become a success story, it mainly needs richer content and the professional contribution of an expert game designer.
Abstract:
We experimentally demonstrate a 7-dB reduction of the nonlinearity penalty in 40-Gb/s CO-OFDM transmission over 2000 km using support vector machine regression-based equalization. Simulation in WDM-CO-OFDM shows up to 12-dB enhancement in Q-factor compared with linear equalization.
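A minimal sketch of the equalization idea, assuming scikit-learn: two SVM regressors map received I/Q samples back toward the transmitted constellation. The toy channel model and all parameters below are illustrative stand-ins, not the experimental CO-OFDM setup.

```python
# SVR-based nonlinear equalization sketch; the channel is a toy stand-in.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Toy 16-QAM symbols distorted by a power-dependent phase rotation plus noise,
# standing in for fiber nonlinearity.
levels = np.array([-3.0, -1.0, 1.0, 3.0])
tx = rng.choice(levels, 2000) + 1j * rng.choice(levels, 2000)
rx = tx * np.exp(0.02j * np.abs(tx) ** 2) \
     + 0.1 * (rng.standard_normal(2000) + 1j * rng.standard_normal(2000))

X = np.column_stack([rx.real, rx.imag])            # received I/Q as features
svr_i = SVR(kernel="rbf", C=10.0).fit(X, tx.real)  # one regressor per quadrature
svr_q = SVR(kernel="rbf", C=10.0).fit(X, tx.imag)
eq = svr_i.predict(X) + 1j * svr_q.predict(X)      # equalized constellation

print("MSE before:", round(float(np.mean(np.abs(rx - tx) ** 2)), 4),
      "after:", round(float(np.mean(np.abs(eq - tx) ** 2)), 4))
```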
Abstract:
Computational intelligence support for decision making is becoming increasingly popular and essential among medical professionals. Moreover, with modern medical devices capable of communicating with ICT, the resulting models can easily find practical translation into software. Machine learning solutions for medicine range from the robust but opaque paradigms of support vector machines and neural networks to the equally performant, yet more comprehensible, decision trees and rule-based models. So how can such different techniques be combined so that the professional obtains the whole spectrum of their particular advantages? The presented approaches have been conceived for various medical problems, while permanently bearing in mind the balance between good accuracy and an understandable interpretation of the decision, in order to truly establish a trustworthy 'artificial' second opinion for the medical expert.
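As one hedged illustration of pairing an opaque, accurate paradigm with a comprehensible one (not necessarily the combination scheme developed in the cited work), a shallow decision tree can be trained to mimic an SVM's predictions, yielding readable rules with measurable fidelity to the opaque model:

```python
# Model-distillation sketch: an SVM as the accurate "opaque" expert, a shallow
# decision tree trained on the SVM's own predictions as the readable surrogate.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_breast_cancer()
X_tr, X_te, y_tr, y_te = train_test_split(data.data, data.target, random_state=0)

svm = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)           # opaque expert
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)  # readable mimic
surrogate.fit(X_tr, svm.predict(X_tr))                           # learn the SVM's decisions

print("SVM accuracy:      ", svm.score(X_te, y_te))
print("Surrogate fidelity:", surrogate.score(X_te, svm.predict(X_te)))
print(export_text(surrogate, feature_names=list(data.feature_names)))
```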
Abstract:
Machine Learning is proving to be a technology with incredible potential across the most diverse sectors. Its techniques and algorithms enable far more effective data analysis than in the past. The insurance industry is also experimenting with the adoption of Machine Learning solutions, and several directions of innovation are following from it, from streamlining internal processes to offering products that adapt to customers' needs. This thesis work was carried out during an internship at Unisalute S.p.A., Italy's leading health insurer. The critical issue identified was the overestimation of the capital to be set aside as a reserve against obligations to the insured: this immobilized capital subtracts resources from more profitable medium- and long-term investments, so estimating it appropriately is valuable. Within Unisalute's IT department, I worked on the design and implementation of a Machine Learning model that predicts whether a newly opened claim will be settled or not. Providing the offices responsible for setting reserves with this additional data-driven estimate would be a considerable support. The design of the Machine Learning model was structured as a Data Pipeline containing the most efficient methodologies for data preprocessing and modeling. The implementation used Python as the programming language; the dataset, obtained through extractions and integrations from several Oracle databases, has a cardinality of over 4 million instances described by 32 variables. After hyperparameter tuning and several training runs, an accuracy of 86% was reached, which is considered more than satisfactory in this domain, and previously unknown factors contributing to claim settleability emerged.
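A minimal sketch of the kind of pipeline described, assuming scikit-learn and pandas; the column names, toy data, model family, and search grid are hypothetical stand-ins, not Unisalute's actual system:

```python
# Preprocessing + classifier + hyperparameter tuning, as a Pipeline.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical stand-in for the claims dataset (the real one has 4M+ rows, 32 variables).
df = pd.DataFrame({
    "claimed_amount": [120.0, 80.5, 300.0, 45.0] * 50,
    "insured_age":    [34, 51, 29, 63] * 50,
    "care_type":      ["dental", "hospital", "dental", "outpatient"] * 50,
    "settled":        [1, 0, 1, 0] * 50,
})
numeric_cols, categorical_cols = ["claimed_amount", "insured_age"], ["care_type"]

preprocess = ColumnTransformer([
    ("num", StandardScaler(), numeric_cols),
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical_cols),
])
pipe = Pipeline([("prep", preprocess),
                 ("clf", RandomForestClassifier(random_state=0))])

# Hyperparameter tuning via cross-validated grid search, scored on accuracy.
search = GridSearchCV(pipe, {"clf__n_estimators": [100, 300],
                             "clf__max_depth": [8, None]},
                      scoring="accuracy", cv=3)
search.fit(df[numeric_cols + categorical_cols], df["settled"])
print("best params:", search.best_params_, "cv accuracy:", round(search.best_score_, 3))
```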
Abstract:
Monitoring based on guided acoustic emissions (AE) has established itself as one of the most reliable techniques in the Non-Destructive Testing of planar structures, thanks to its implementation simplicity, low cost, non-invasiveness, and the possibility of building a system that operates continuously and in real time through networks of permanently installed sensors, with no need for periodic inspections. In this context, machine learning's ability to detect hidden patterns in the recorded raw signals can be exploited to extract information useful to the application at hand. Running the models on-edge, i.e. at the acquisition point, overcomes the limitations imposed by centralized data processing, with considerable advantages in terms of energy consumption, response latency, and data integrity. To this end, however, it is necessary to develop models compatible with the tight hardware constraints of the low-cost devices typically employed. This work examines several types of artificial neural networks for extracting the time of arrival (ToA) of an acoustic emission within a time sequence, in particular convolutional networks (CNNs) and a more recent variant, CapsNets based on routing by agreement. Detecting the ToAs of the same event in signals acquired at different spatial positions makes it possible to localize the source that generated it. The size of these models allows inference to be run directly on edge devices. The results confirm that deep learning techniques are more robust than traditional statistical methods in coping with different kinds of disturbance, particularly in the most critical scenarios in terms of signal-to-noise ratio.
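A minimal sketch of a compact 1-D convolutional network that regresses the ToA from a raw waveform window, assuming PyTorch; the architecture and sizes are illustrative, not the exact networks evaluated in the thesis:

```python
# Tiny 1-D CNN that maps a raw AE window to a ToA expressed as a fraction
# of the window length; sized with edge deployment in mind.
import torch
import torch.nn as nn

class ToANet(nn.Module):
    def __init__(self, n_samples=1024):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(8, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.head = nn.Linear(16 * (n_samples // 16), 1)

    def forward(self, x):                  # x: (batch, 1, n_samples)
        z = self.features(x)
        return torch.sigmoid(self.head(z.flatten(1))).squeeze(1)

model = ToANet()
wave = torch.randn(4, 1, 1024)             # four dummy AE windows
print(model(wave).shape, sum(p.numel() for p in model.parameters()), "parameters")
```

The parameter count printed at the end is the figure that matters for on-edge inference: a network of this size holds a few thousand parameters, i.e. a few kilobytes in float32.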
Abstract:
Age-related changes in running kinematics have been reported in the literature using classical inferential statistics. However, this approach has been hampered by the large number of biomechanical gait variables reported and, consequently, by the lack of differences presented in these studies. Data mining techniques have been applied in recent biomedical studies to solve this problem using a more general approach. In the present work, we re-analyzed lower extremity running kinematic data of 17 young and 17 elderly male runners using the Support Vector Machine (SVM) classification approach. In total, 31 kinematic variables were extracted to train the classification algorithm and test the generalized performance. The results revealed different accuracy rates across the three kernel methods adopted in the classifier, with the linear kernel performing best. A subsequent forward feature selection algorithm demonstrated that, with only six features, the linear-kernel SVM achieved a 100% classification rate, showing that these features provided powerful combined information to distinguish the age groups. The results of the present work demonstrate the potential of applying this approach to improve knowledge about age-related differences in running gait biomechanics and encourage the use of the SVM in other clinical contexts. (C) 2010 Elsevier Ltd. All rights reserved.
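A hedged sketch of the analysis pattern described, assuming scikit-learn; the synthetic data below stand in for the study's 31 kinematic variables measured on 17 young and 17 elderly runners:

```python
# Linear-kernel SVM with forward feature selection on synthetic stand-in data.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((34, 31))          # 34 runners x 31 kinematic variables
y = np.repeat([0, 1], 17)                  # 0 = young, 1 = elderly
X[y == 1, :6] += 1.5                       # make six features informative

svm = SVC(kernel="linear")
selector = SequentialFeatureSelector(svm, n_features_to_select=6,
                                     direction="forward", cv=5)
selector.fit(StandardScaler().fit_transform(X), y)
print("selected features:", np.flatnonzero(selector.get_support()))

pipe = make_pipeline(StandardScaler(), svm)
print("cv accuracy on selected:",
      cross_val_score(pipe, X[:, selector.get_support()], y, cv=5).mean())
```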
Abstract:
Neonatal screening for congenital adrenal hyperplasia (CAH) is useful in diagnosing the salt-wasting form (SW). However, interpreting positive results in asymptomatic newborns is difficult. The main objective is to analyze genotyping as a confirmatory test in children with positive neonatal results. Patients comprised 23 children with CAH and 19 asymptomatic infants with persistently elevated 17-hydroxyprogesterone (17OHP) levels. The CYP21A2 gene was sequenced and genotypes were grouped according to the enzymatic activity of the less severe allele: A1 null, A2 < 2%, B 3-7%, C > 20%. Twenty-one children with neonatal symptoms and/or 17OHP levels > 80 ng/ml carried group A genotypes, except two virilized girls (17OHP < 50 ng/ml) without CAH genotypes. Patients carrying SW genotypes (A1, A2) and low serum sodium levels presented with neonatal 17OHP > 200 ng/ml. Three asymptomatic boys carried simple virilizing genotypes (A2 and B): in two, symptoms began at 18 months; another two asymptomatic boys had nonclassical genotypes (C). The remaining 14 patients did not present CAH genotypes, and their 17OHP levels normalized by 14 months of age. Molecular analysis is useful as a confirmatory test for CAH, mainly in boys. It can predict the clinical course, identify false positives and help distinguish between clinical forms of CAH.
Abstract:
Dissertation to obtain the degree of Master in Electrical and Computer Engineering
Abstract:
Background. Accurate quantification of the prevalence of human immunodeficiency virus type 1 (HIV-1) drug resistance in patients who are receiving antiretroviral therapy (ART) is difficult, and results from previous studies vary. We attempted to assess the prevalence and dynamics of resistance in a highly representative patient cohort from Switzerland. Methods. On the basis of genotypic resistance test results and clinical data, we grouped patients according to their risk of harboring resistant viruses. Estimates of resistance prevalence were calculated on the basis of either the proportion of individuals with a virologic failure or confirmed drug resistance (lower estimate) or the frequency-weighted average of risk group-specific probabilities for the presence of drug resistance mutations (upper estimate). Results. Lower and upper estimates of drug resistance prevalence in 8064 ART-exposed patients were 50% and 57% in 1999 and 37% and 45% in 2007, respectively. This decrease was driven by 2 mechanisms: loss to follow-up or death of high-risk patients exposed to mono- or dual-nucleoside reverse-transcriptase inhibitor therapy (lower estimates range from 72% to 75%) and continued enrollment of low-risk patients who were taking combination ART containing boosted protease inhibitors or nonnucleoside reverse-transcriptase inhibitors as first-line therapy (lower estimates range from 7% to 12%). A subset of 4184 participants (52%) had 1 study visit per year during 2002-2007. In this subset, lower and upper estimates increased from 45% to 49% and from 52% to 55%, respectively. Yearly increases in prevalence were becoming smaller in later years. Conclusions. Contrary to earlier predictions, in situations of free access to drugs, close monitoring, and rapid introduction of new potent therapies, the emergence of drug-resistant viruses can be minimized at the population level. Moreover, this study demonstrates the necessity of interpreting time trends in the context of evolving cohort populations.
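As a small illustration of the upper-estimate calculation (a frequency-weighted average of risk-group-specific probabilities of harboring resistance mutations), with entirely hypothetical group sizes and probabilities:

```python
# Frequency-weighted average over risk groups; all numbers are hypothetical.
risk_groups = {
    # name: (number of patients, assumed probability of resistance mutations)
    "mono/dual NRTI-exposed":          (2000, 0.75),
    "failing combination ART":         (1500, 0.60),
    "suppressed on boosted PI/NNRTI":  (4564, 0.10),
}

total = sum(n for n, _ in risk_groups.values())
upper_estimate = sum(n * p for n, p in risk_groups.values()) / total
print(f"upper estimate of resistance prevalence: {upper_estimate:.1%}")
```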
Abstract:
The progressive development of Alzheimer's disease (AD)-related lesions such as neurofibrillary tangles, amyloid deposits and synaptic loss within the cerebral cortex is a main event of brain aging. Recent neuropathologic studies strongly suggest that the clinical diagnosis of dementia depends more on the severity and topography of pathologic changes than on the presence of a qualitative marker. However, several methodological problems such as selection biases, case-control designs, density-based measures, and masking effects of concomitant pathologies should be taken into account when interpreting these data. In recent years, the use of stereologic counting has made it possible to define reliably the cognitive impact of AD lesions in the human brain. Unlike fibrillar amyloid deposits, which are poorly or not at all related to dementia severity, this method documented that total neurofibrillary tangle and neuron numbers in the CA1 field are the best correlates of cognitive deterioration in brain aging. Loss of dendritic spines in neocortical but not hippocampal areas makes a modest but independent contribution to dementia. In contrast, the importance of early dendritic and axonal tau-related pathologic changes such as neuropil threads remains doubtful. Despite this progress, neuronal pathology and synaptic loss in cases with pure AD pathology cannot explain more than 50% of clinical severity. The present review discusses the complex structure/function relationships in brain aging and AD within the theoretical framework of the functional neuropathology of brain aging.
Abstract:
Grid is a hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational resources. Grid enables access to the resources but does not guarantee any quality of service. Moreover, Grid does not provide performance isolation: one user's job can affect the performance of another user's job. Another problem is that Grid users belong to the scientific community and their jobs require specific, customized software environments. Providing the perfect environment to the user is very difficult in Grid because of its dispersed and heterogeneous nature. Cloud computing provides full customization and control, but there is no simple procedure for submitting user jobs as in Grid. Grid computing can provide customized resources and performance to the user through virtualization: a virtual machine can join the Grid as an execution node, or it can be submitted as a job with the user's jobs inside. Where the first method gives quality of service and performance isolation, the second additionally provides customization and administration. In this thesis, a solution is proposed to enable virtual machine reuse, providing performance isolation together with customization and administration; the same virtual machine can be used for several jobs. In the proposed solution, customized virtual machines join the Grid pool on user request, and two scenarios are described to achieve this goal. In the first scenario, users submit their customized virtual machine as a job; the virtual machine joins the Grid pool when it is powered on. In the second scenario, user-customized virtual machines are preconfigured on the execution system and join the Grid pool on user request. Condor and VMware Server are used to deploy and test the scenarios. Condor supports virtual machine jobs: scenario 1 is deployed using the Condor VM universe, while the second scenario uses the VMware-VIX API to script the powering on and off of the remote virtual machines. The experimental results show that, because scenario 2 does not need to transfer the virtual machine image, the image becomes live in the pool faster. In scenario 1, the virtual machine runs as a Condor job, so it is easy to administer; its only pitfall is the network traffic.
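A hedged sketch of the scenario 2 control flow in Python, shelling out to VMware's vmrun utility (distributed with the VIX API) rather than using the VIX bindings the thesis scripts employed; the product type, paths, and any host credentials are placeholders:

```python
# Scenario 2 sketch: power preconfigured VMs on/off on user request.
import subprocess

VMRUN = "vmrun"                       # assumes vmrun is on the PATH
HOST_TYPE = "server"                  # e.g. "ws" for Workstation, "server" for VMware Server
VMX = "/vms/grid-node-01/node.vmx"    # placeholder path to a preconfigured VM

def set_vm_power(vmx_path: str, on: bool) -> None:
    """Start or stop a VM; once running, it joins the Condor pool on its own."""
    action = ["start", vmx_path] if on else ["stop", vmx_path, "soft"]
    subprocess.run([VMRUN, "-T", HOST_TYPE] + action, check=True)

# set_vm_power(VMX, on=True)    # user request arrives: bring the execution node up
# set_vm_power(VMX, on=False)   # jobs finished: release the node's resources
```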
Abstract:
This study describes major electrocardiogram (ECG) measurements and diagnoses in a population of African individuals; most reference data have been collected in Caucasian populations, and evidence exists for interethnic differences in ECG findings. The study was conducted in the Seychelles islands (Indian Ocean) and included 709 black individuals (343 men and 366 women) aged 25 to 64 years randomly selected from the general population. Resting ECGs were recorded using a validated ECG unit equipped with measurement and interpretation software (Cardiovit AT-6, Schiller, Switzerland). The epidemiology of 14 basic ECG measurements, 6 composite criteria for left ventricular hypertrophy and 19 specific ECG diagnoses including abnormal rhythms, conduction abnormalities, repolarization abnormalities, and myocardial infarction was examined. Substantial gender and age differences were found for several ECG parameters. Moreover, tracings recorded in African individuals of the Seychelles differed in many respects from those collected similarly in Caucasian populations. For instance, heart rate was approximately 5 beats per minute lower in the African individuals than in selected Caucasian populations, the prevalence of first-degree atrio-ventricular block was especially high (4.8%), and the average Sokolow-Lyon voltage was markedly higher in African individuals of the Seychelles than in black and white Americans. The integrated interpretation software detected "old myocardial infarction" in 3.8% of men and 0% of women and "old myocardial infarction possible" in 6.1% and 3%, respectively. Cardiac infarction injury scores are also provided. In conclusion, the study provides reference values for ECG findings in a specific population of people of African descent and stresses the need to systematically consider gender, age, and ethnicity when interpreting ECG tracings in individuals.
Abstract:
This project sets out to identify and analyze the effects of the introduction of the Internet in Catalan schools (primary and secondary education). The aim is to show how the network is used in this setting and to what extent it contributes to the emergence, in schools, of a new culture adapted to the needs of the network society. To this end, the project develops its lines of analysis around the process of Internet adoption in three main directions: pedagogical practice, the forms of organization and management of schools, and their links with the community and the territory. This research was carried out by the ENS (Education and Network Society) research group. Taking a comparative perspective, the group's work aims to contribute, on the basis of empirical data, to interpreting the transformation of non-university education within the parameters established by today's society.
Abstract:
The standard one-machine scheduling problem consists in scheduling a set of jobs on one machine that can handle only one job at a time, minimizing the maximum lateness. Each job becomes available for processing at its release date, requires a known processing time and, after processing finishes, is delivered after a certain time. There may also be precedence constraints between pairs of jobs, requiring that the first job be completed before the second can start. An extension of this problem assigns a time interval between the processing of the jobs linked by a precedence constraint, known as finish-start time-lags. In the presence of these constraints, the problem is NP-hard even if preemption is allowed. In this work, we consider a special case of the one-machine preemptive scheduling problem with time-lags, where the time-lags have a chain form, and propose a polynomial algorithm to solve it. The algorithm consists of a polynomial number of calls to the preemptive version of the Longest Tail Heuristic. One application of the method is to obtain lower bounds for NP-hard one-machine and job-shop scheduling problems. We present some computational results of this application, followed by some conclusions.
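A minimal sketch of the building block the algorithm calls repeatedly, the preemptive Longest Tail Heuristic: at every moment, run the released job with the largest tail, preempting the current job when a newly released job has a strictly larger tail. For a single machine with release dates and tails (and no time-lags), this rule is optimal, which is what makes it useful for the lower bounds mentioned above.

```python
# Preemptive Longest Tail Heuristic (preemptive Schrage rule) sketch.
import heapq

def preemptive_longest_tail(jobs):
    """jobs: list of (release, processing, tail). Returns max completion+tail."""
    jobs = sorted(jobs)                        # by release date
    ready, t, i, best = [], 0, 0, 0            # ready: max-heap of (-tail, remaining)
    while i < len(jobs) or ready:
        if not ready:                          # machine idle: jump to next release
            t = max(t, jobs[i][0])
        while i < len(jobs) and jobs[i][0] <= t:
            heapq.heappush(ready, (-jobs[i][2], jobs[i][1]))
            i += 1
        neg_q, rem = heapq.heappop(ready)      # largest tail among released jobs
        run = rem if i == len(jobs) else min(rem, jobs[i][0] - t)
        t += run                               # run until done or next release
        if run < rem:
            heapq.heappush(ready, (neg_q, rem - run))  # preempted: keep remainder
        else:
            best = max(best, t - neg_q)        # job completes; add its tail
    return best

print(preemptive_longest_tail([(0, 4, 5), (1, 2, 8), (3, 3, 1)]))  # -> 11
```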
Abstract:
BACKGROUND: One central concept in evolutionary ecology is that current and residual reproductive values are negatively linked by the so-called cost of reproduction. Previous studies examining the nature of this cost suggested a possible involvement of oxidative stress resulting from the imbalance between pro- and anti-oxidant processes. Still, data remain conflicting, probably because, although oxidative damage increases during reproduction, high systemic levels of oxidative stress might also constrain parental investment in reproduction. Here, we investigated variation in oxidative balance (i.e. oxidative damage and antioxidant defences) over the course of reproduction by comparing female laboratory mice that were or were not rearing pups. RESULTS: A significant increase in oxidative damage over time was only observed in females caring for offspring, whereas antioxidant defences increased over time regardless of reproductive status. Interestingly, oxidative damage measured prior to reproduction was negatively associated with litter size at birth (constraint), whereas damage measured after reproduction was positively related to litter size at weaning (cost). CONCLUSIONS: Overall, our correlative results and the review of the literature describing the links between reproduction and oxidative stress underline the importance of timing/dynamics when studying and interpreting oxidative balance in relation to reproduction. Our study highlights the duality (constraint and cost) of oxidative stress in life-history trade-offs, thus supporting the theory that oxidative stress plays a key role in life-history evolution.