908 results for Task-to-core mapping
Abstract:
We report the performance of a group of adult dyslexics and matched controls in an array-matching task where two strings of either consonants or symbols are presented side by side and have to be judged as the same or different. The arrays may differ in either the order or the identity of two adjacent characters. This task does not require naming – which has been argued to be the cause of dyslexics’ difficulty in processing visual arrays – but instead has a strong serial component, as demonstrated by the fact that, in both groups, reaction times (RTs) increase monotonically with the position of a mismatch. The dyslexics are clearly impaired in all conditions, and performance in the identity conditions predicts performance across orthographic tasks even after age, performance IQ and phonology are partialled out. Moreover, the shapes of the serial position curves are revealing of the underlying impairment. In the dyslexics, RTs increase with position at the same rate as in the controls (the lines are parallel), ruling out reduced processing speed or difficulties in shifting attention. Instead, error rates show a catastrophic increase for positions which are either searched later or more subject to interference. These results are consistent with a reduction in the attentional capacity needed in a serial task to bind together identity and positional information. This capacity is best seen as a reduction in the number of spotlights into which attention can be split to process information at different locations, rather than as a more generic reduction of resources, which would also affect processing the details of single objects.
Abstract:
Vaccine design is highly suited to the application of in silico techniques, for both the discovery and development of new and existing vaccines. Here, we discuss computational contributions to epitope mapping and reverse vaccinology, two techniques central to the new discipline of immunomics. Also discussed are methods to improve the efficiency of vaccination, such as codon optimization and adjuvant discovery in addition to the identification of allergenic proteins. We also review current software developed to facilitate vaccine design.
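One of the efficiency techniques mentioned above, codon optimization, can be sketched in a few lines: synonymous codons are swapped for the ones the expression host uses most frequently. The codon-usage preferences and sequences below are illustrative assumptions, not real organism data.

```python
# Hypothetical sketch of codon optimization: re-encode each codon of a
# coding sequence with the host's preferred synonymous codon.
# The preference table below is illustrative, not a real usage table.

# Assumed host-preferred codon per amino acid (single-letter code)
PREFERRED_CODON = {
    "L": "CTG",  # leucine
    "S": "AGC",  # serine
    "K": "AAA",  # lysine
}

# Standard-genetic-code translation for the codons used in this demo
CODON_TO_AA = {
    "CTA": "L", "CTG": "L", "TTA": "L",
    "TCT": "S", "AGC": "S",
    "AAG": "K", "AAA": "K",
}

def optimize(dna: str) -> str:
    """Replace every codon with the host-preferred synonymous codon."""
    codons = [dna[i:i + 3] for i in range(0, len(dna), 3)]
    return "".join(PREFERRED_CODON[CODON_TO_AA[c]] for c in codons)

print(optimize("CTATCTAAG"))  # L-S-K re-encoded as CTGAGCAAA
```

The protein sequence is unchanged; only the codon spelling is adjusted, which is the essence of the technique the abstract refers to.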
An agent approach to improving radio frequency identification enabled Returnable Transport Equipment
Abstract:
Returnable transport equipment (RTE) such as pallets forms an integral part of the supply chain, and poor management leads to costly losses. Companies often address this matter by outsourcing the management of RTE to logistics service providers (LSPs). LSPs are faced with the task of providing logistical expertise to reduce RTE-related waste whilst differentiating their own services to remain competitive. In the current challenging economic climate, the role of the LSP in delivering innovative ways to achieve competitive advantage has never been so important. It is reported that applying radio frequency identification (RFID) to RTE enables LSPs such as DHL to gain competitive advantage and offer clients improvements such as loss reduction, process efficiency improvement and effective security. However, the increased visibility and functionality of RFID-enabled RTE requires further investigation with regard to decision-making. The distributed nature of the RTE network favours a decentralised decision-making format. Agents are an effective way to represent objects from the bottom up, capturing their behaviour and enabling localised decision-making. Therefore, an agent-based system is proposed to represent the RTE network and utilise the visibility and data gathered from RFID tags. Two types of agents are developed to represent the trucks and the RTE, with bespoke rules and algorithms to facilitate negotiations. The aim is to create schedules that integrate RTE pick-ups as the trucks return to the depot. The findings assert that:
- agent-based modelling provides an autonomous tool that is effective in modelling RFID-enabled RTE in a decentralised manner, utilising the real-time data facility;
- the RFID-enabled RTE model developed enables autonomous agent interaction, which leads to a feasible schedule integrating both forward and reverse flows for each RTE batch;
- the RTE agent scheduling algorithm developed promotes the utilisation of RTE by including an automatic return flow for each batch of RTE, whilst considering fleet costs and utilisation rates;
- the research conducted contributes an agent-based platform which LSPs can use to assess the most appropriate strategies to implement for RTE network improvement for each of their clients.
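The kind of rule such an RTE agent might apply can be sketched as follows. This is a minimal illustration, not the thesis's actual algorithm; the greedy capacity-fill rule, function name and figures are all assumptions.

```python
# Hypothetical sketch of an RTE agent's scheduling rule: when a truck
# heads back to the depot, append RTE pick-ups that fit its spare
# capacity, so forward deliveries and reverse RTE flows share one trip.

def plan_return_pickups(spare_capacity: int, rte_batches: list[tuple[str, int]]):
    """Greedily add RTE batches (site, pallet count) to the return leg."""
    schedule, load = [], 0
    # Favour larger batches first to raise RTE utilisation per trip
    for site, pallets in sorted(rte_batches, key=lambda b: -b[1]):
        if load + pallets <= spare_capacity:
            schedule.append(site)
            load += pallets
    return schedule, load

batches = [("store_A", 6), ("store_B", 10), ("store_C", 5)]
print(plan_return_pickups(12, batches))  # → (['store_B'], 10)
```

A real agent system would negotiate pick-ups between truck and RTE agents rather than decide unilaterally, but the capacity/utilisation trade-off shown here is the core of the return-flow integration described above.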
Abstract:
Fibre-to-the-premises (FTTP) has long been sought as the ultimate solution to satisfy the demand for broadband access for the foreseeable future, offering a distance-independent data rate within the access network reach. However, currently deployed FTTP networks have in most cases only replaced the transmission medium, without improving the overall architecture, resulting in deployments that are only cost-efficient in densely populated areas (effectively increasing the digital divide). In addition, the large potential increase in access capacity cannot be matched by a similar increase in core capacity at competitive cost, effectively moving the bottleneck from access to core. DISCUS is a European Integrated Project that, building on optical-centric solutions such as Long-Reach Passive Optical access and a flat optical core, aims to deliver a cost-effective architecture for ubiquitous broadband services. One of the key features of the project is its end-to-end approach, which promises to deliver a complete network design and a conclusive analysis of its economic viability. © 2013 IEEE.
Abstract:
We present our approach to real-time service-oriented scheduling problems with the objective of maximizing the total system utility. Unlike traditional utility accrual scheduling problems, in which each task is associated with only a single time utility function (TUF), we associate two different TUFs, a profit TUF and a penalty TUF, with each task, to model real-time services that not only reward early completions but also penalize abortions or deadline misses. The scheduling heuristics we propose in this paper judiciously accept, schedule, and abort real-time services when necessary to maximize the accrued utility. Our extensive experimental results show that our proposed algorithms can significantly outperform traditional scheduling algorithms such as Earliest Deadline First (EDF), traditional utility accrual (UA) scheduling algorithms, and an earlier scheduling approach based on a similar model.
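The dual-TUF idea above can be made concrete with a small sketch. This is not the paper's algorithm; the linear profit shape, the acceptance rule and all numbers are assumptions chosen only to illustrate how a profit TUF and a penalty TUF jointly drive accept/abort decisions.

```python
# Minimal sketch of the dual-TUF model: a profit TUF rewards early
# completion, while a penalty TUF is charged on abortion or deadline miss.

def profit_tuf(completion_time: float, deadline: float, max_profit: float) -> float:
    """Linearly decaying reward for finishing before the deadline."""
    if completion_time > deadline:
        return 0.0
    return max_profit * (1.0 - completion_time / deadline)

def accrued_utility(completion_time, deadline, max_profit, penalty, completed):
    """Profit if completed on time; otherwise the (negative) penalty."""
    if completed and completion_time <= deadline:
        return profit_tuf(completion_time, deadline, max_profit)
    return -penalty

def worth_accepting(p_finish, completion_time, deadline, max_profit, penalty):
    """Accept only if expected profit outweighs the expected abort penalty."""
    expected = (p_finish * profit_tuf(completion_time, deadline, max_profit)
                - (1.0 - p_finish) * penalty)
    return expected > 0.0

# A task with a 90% chance of finishing at t=2 (deadline 10) is worth it:
print(worth_accepting(0.9, 2.0, 10.0, 100.0, 50.0))  # → True
```

The asymmetry is the point: a single-TUF scheduler would treat an abort as zero utility, whereas here aborting a half-served request actively costs the system, which is why admission control must weigh both curves.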
Abstract:
Alarming statistics show that only 10.2 percent of companies listed on the Swedish stock exchange have achieved gender equality in their top management. The fact is that women are discriminated against, since men dominate these positions of power. The study is of a qualitative nature and aims to achieve a deeper understanding of, and contribute knowledge on, how gender-equal companies have achieved gender diversity in their top management. Some of Sweden's highest-ranking business leaders were interviewed in order to obtain their views, and those of the companies they represent, on what the most important requirements for this achievement have been. The study's main result shows that strong core values and a strong corporate culture are the basic, required conditions for a successful gender equality strategy. A deliberate or emergent strategy can then be successfully implemented, and it is mainly the impact of structural barriers that determines which strategy a company uses. For a deliberate strategy, the following measures are, in addition to core values and corporate culture, crucial: commitment towards gender equality, a specific plan with clear objectives, and a consciously objective recruitment process. Around these two factors and three measures, the study also identified a specific order that must be followed in order to achieve gender diversity in top management. These findings aim to contribute, in the near future, to a more gender-equal Sweden.
Abstract:
Objective: Caffeine has been shown to affect certain areas of cognition, but for executive functioning the research is limited and inconsistent. One reason could be the need for a more sensitive measure to detect the effects of caffeine on executive function. This study used a new non-immersive virtual reality assessment of executive functions known as JEF© (the Jansari Assessment of Executive Function) alongside the ‘classic’ Stroop Colour-Word task to assess the effects of a normal dose of caffeinated coffee on executive function. Method: Using a double-blind, counterbalanced, within-participants procedure, 43 participants were administered either a caffeinated or a decaffeinated coffee and completed the JEF© and Stroop tasks, as well as a subjective mood scale and blood pressure measurements, pre- and post-condition, on two separate occasions a week apart. JEF© yields measures for eight separate aspects of executive functions, in addition to a total average score. Results: Findings indicate that performance was significantly improved on planning, creative thinking, and event-, time- and action-based prospective memory, as well as the total JEF© score, following caffeinated coffee relative to decaffeinated coffee. The caffeinated beverage significantly decreased reaction times on the Stroop task, but there was no effect on Stroop interference. Conclusion: The results provide further support for the effects of a caffeinated beverage on cognitive functioning. In particular, the study demonstrates the ability of JEF© to detect the effects of caffeine across a number of executive functioning constructs that were not shown by the Stroop task, suggesting that executive functioning improvements resulting from a ‘typical’ dose of caffeine may only be detected by the use of more real-world, ecologically valid tasks.
Abstract:
In this master’s thesis, I examine the development of writer-characters and metafiction from John Irving’s The World According to Garp to Last Night in Twisted River, and how this development relates to the shift from late twentieth-century postmodern literary theory to twenty-first-century post-postmodern literary theory. The purpose of my study is to determine how the prominently postmodern feature of metafiction, created through the writer-character’s stories-within-stories, has changed in form and function between the two novels, published thirty years apart, and what this indicates for future post-postmodern theory. I establish my theoretical framework on the development of metafiction largely on late twentieth-century models of author and authorship as discussed by Roland Barthes, Wayne Booth and Michel Foucault. I base my close analysis of metafiction mostly on Linda Hutcheon’s model of overt and covert metafiction. At the end of my study, I examine Irving’s later novel through Suzanne Rohr’s models of reality constitution and fictional reality. The analysis of the two novels focuses on excerpts that feature the writer-characters, their stories-within-stories, the novels’ other characters, and the narrators’ evaluations of these. I draw examples from both novels, and I explain my choice of focus at the beginning of each section. Through this, I establish a method of analysis that best illustrates the development as a continuum from pre-existing postmodern models and theories to the formation of new post-postmodern theory. Based on my findings, the thesis argues that twenty-first-century literary theory has moved away from the postmodern overt deconstruction of the narrative and its meaning. New post-postmodern literary theory reacquires the previously deconstructed boundaries that define reality and truth and re-establishes them as having intrinsic value that cannot be disputed.
In establishing fictional reality as self-governing and non-intrudable, post-postmodern theory takes a stance against postmodern nihilism, which indicates the re-founded, unquestionable value of the text’s reality. To continue mapping other possible features of future post-postmodern theory, I recommend further analysis focused solely on John Irving’s novels published in the twenty-first century.
Abstract:
Glutamine synthetase (GS) is a vital enzyme for the assimilation of ammonia into amino acids in higher plants. In legumes, GS plays a crucial role in the assimilation of the ammonium released by nitrogen-fixing bacteria in root nodules, constituting an important metabolic knob controlling the nitrogen (N) assimilatory pathways. To identify new regulators of nodule metabolism, we profiled the transcriptome of Medicago truncatula nodules impaired in N assimilation by specifically inhibiting GS activity using phosphinothricin (PPT). Global transcript expression of nodules collected before and after PPT addition (4, 8, and 24 h) was assessed using Affymetrix M. truncatula GeneChip arrays. Hundreds of genes were regulated at the three time points, illustrating the dramatic alterations in cell metabolism that are imposed on the nodules upon GS inhibition. The data indicate that GS inhibition triggers a fast plant defense response, induces premature nodule senescence, and promotes loss of root nodule identity. Consecutive metabolic changes were identified at the three time points analyzed. The results point to a fast repression of asparagine synthesis and of the glycolytic pathway, and to the synthesis of glutamate via reactions alternative to the GS/GOGAT cycle. Several genes potentially involved in the molecular surveillance of internal organic N availability are identified, and a number of transporters potentially important for nodule functioning are pinpointed. The data provided by this study contribute to the mapping of regulatory and metabolic networks involved in root nodule functioning and highlight candidate modulators for functional analysis.
Abstract:
Phonation distortion leaves relevant marks in a speaker's biometric profile, and dysphonic voice production may therefore be used for biometrical speaker characterization. In the present paper, phonation features derived from glottal source (GS) parameterization, after vocal tract inversion, are proposed for dysphonic voice characterization in Speaker Verification tasks. The glottal source derived parameters are matched in a forensic evaluation framework defining a distance-based metric specification. The phonation segments used in the study are derived from fillers, long vowels, and other phonation segments produced in spontaneous telephone conversations. Phonated segments from a telephonic database of 100 male Spanish native speakers are combined in a 10-fold cross-validation task to produce the set of quality measurements outlined in the paper. Shimmer, the mucosal wave correlate, vocal fold cover biomechanical parameter unbalance, and a subset of the GS cepstral profile produce accuracy rates as high as 99.57% for a wide threshold interval (62.08-75.04%). An Equal Error Rate of 0.64% can be achieved. The proposed metric framework is shown to behave more fairly than classical likelihood ratios in supporting the hypothesis of the defense vs. that of the prosecution, thus offering more reliable evaluation scoring. Possible applications are Speaker Verification and Dysphonic Voice Grading.
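The Equal Error Rate (EER) quoted above is the standard verification operating point where the false-acceptance rate equals the false-rejection rate. A minimal sketch of how it is estimated from genuine and impostor score lists (the scores below are made up for illustration; this is not the paper's scoring pipeline):

```python
# Illustrative EER computation: sweep decision thresholds over the pooled
# scores and pick the point where FAR and FRR are closest.

def far_frr(threshold, genuine, impostor):
    """False-acceptance and false-rejection rates at a given threshold."""
    fa = sum(s >= threshold for s in impostor) / len(impostor)
    fr = sum(s < threshold for s in genuine) / len(genuine)
    return fa, fr

def eer(genuine, impostor):
    """Return the error rate at the threshold minimizing |FAR - FRR|."""
    candidates = sorted(set(genuine) | set(impostor))
    best = min(candidates,
               key=lambda t: abs(far_frr(t, genuine, impostor)[0]
                                 - far_frr(t, genuine, impostor)[1]))
    fa, fr = far_frr(best, genuine, impostor)
    return (fa + fr) / 2.0

genuine = [0.9, 0.8, 0.5, 0.6]    # same-speaker trial scores (made up)
impostor = [0.4, 0.65, 0.3, 0.2]  # different-speaker trial scores (made up)
print(eer(genuine, impostor))  # → 0.25
```

With real systems the score lists are large and the two rates cross smoothly, so the EER is read off at the crossing point; a 0.64% EER as reported means both error types fall below one percent simultaneously.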
Abstract:
When newly graduated nurses, known as candidates for admission to the nursing profession (CEPI), are being integrated, they frequently rely on the experience of their nurse colleagues to guide them in the care they provide (Ballem & McIntosh, 2014; Fink, Krugman, Casey, & Goode, 2008). This type of collaboration enables knowledge transfer (D’Amour, 2002; Lavoie-Tremblay, Wright, Desforges, & Drevniok, 2008) and increases the quality of the care provided (Pfaff, Baxter, & Ploeg, 2013). However, this collaboration can be more difficult to initiate on certain care units (Thrysoe, Hounsgaard, Dohn, & Wagner, 2012). As the available literature focuses mainly on how novice nurses experience it, the experience of the other nurses with regard to this phenomenon remains little known. This exploratory qualitative study, inspired by the grounded theory approach, aimed to explore the experience of nurses on the care team with regard to intraprofessional collaboration during the integration of CEPIs in a hospital setting. Interviews conducted with eight nurses were analysed following the grounded theory method. The results of this research led to a schematization of nurses' experience of collaboration during the integration of CEPIs. This schematization underlines the importance of collaboration during the different periods of CEPI integration, as well as the complementarity of nursing roles within the care team, including the assistant head nurse, the preceptor and the staff nurse. The outcome of this collaboration is autonomy in the task and entry into the team. Based on this schematization, recommendations were formulated for research, education, management and practice.
Abstract:
This thesis is composed of a collection of works written in the period 2019-2022, whose aim is to find Artificial Intelligence (AI) and Machine Learning methodologies to detect and classify patterns and rules in argumentative and legal texts. We describe our approach as “hybrid”, since we aimed at designing hybrid combinations of symbolic and sub-symbolic AI, involving both “top-down” structured knowledge and “bottom-up” data-driven knowledge. A first group of works is dedicated to the classification of argumentative patterns. Following the Waltonian model of argument and the related theory of Argumentation Schemes, these works focused on the detection of argumentative support and opposition, showing that argumentative evidence can be classified at fine-grained levels without resorting to highly engineered features. To show this, our methods involved not only traditional approaches such as TF-IDF, but also novel methods based on Tree Kernel algorithms. After the encouraging results of this first phase, we explored some emerging methodologies promoted by actors like Google, which have deeply changed NLP since 2018-19, namely Transfer Learning and language models. These new methodologies markedly improved our previous results, providing us with best-performing NLP tools. Using Transfer Learning, we also performed a Sequence Labelling task to recognize the exact span of argumentative components (i.e., claims and premises), thus connecting portions of natural language to portions of arguments (i.e., to the logical-inferential dimension). The last part of our work was dedicated to the employment of Transfer Learning methods for the detection of rules and deontic modalities. In this case, we explored a hybrid approach which combines structured knowledge coming from two LegalXML formats (i.e., Akoma Ntoso and LegalRuleML) with sub-symbolic knowledge coming from pre-trained (and then fine-tuned) neural architectures.
Abstract:
Salient stimuli, like sudden changes in the environment or emotional stimuli, generate a priority signal that captures attention even when they are task-irrelevant. However, to achieve goal-driven behavior, we need to ignore them and avoid being distracted. It is generally agreed that top-down factors can help us filter out distractors. A fundamental question is how, and at which stage of processing, the rejection of distractors is achieved. Two circumstances under which the allocation of attention to distractors is supposed to be prevented are when distractors occur at an unattended location (as determined by the deployment of endogenous spatial attention) and when the amount of visual working memory resources is reduced by an ongoing task. The present thesis focuses on the impact of these factors on three sources of distraction, namely auditory and visual onsets (Experiments 1 and 2, respectively) and pleasant scenes (Experiment 3). In the first two studies we recorded neural correlates of distractor processing (i.e., Event-Related Potentials), whereas in the last study we used interference effects on behavior (i.e., a slowing of response times on a simultaneous task) to index distraction. Endogenous spatial attention reduced distraction by auditory stimuli and eliminated distraction by visual onsets. In contrast, visual working memory load only affected the processing of visual onsets. Emotional interference persisted even when scenes always occurred at unattended locations and when visual working memory was loaded. Altogether, these findings indicate that the ability to detect the location of salient task-irrelevant sounds and to identify the affective significance of natural scenes is preserved even when the amount of visual working memory resources is reduced by an ongoing task and when endogenous attention is directed elsewhere.
However, these results also indicate that the processing of auditory and visual distractors is not entirely automatic.
Abstract:
Establishing the most stable structures of eight-membered rings is a challenging task in the field of conformational analysis. In this work, a series of 2-halocyclooctanones (including the fluorine, chlorine, bromine and iodine derivatives) were synthesized and submitted to conformational studies using a combination of theoretical calculations and infrared spectroscopy. For each compound, four conformations were identified as the most important ones; these conformations are derived from the chair-boat conformation of cyclooctanone. The pseudo-equatorial (with respect to the halogen) conformer is preferred in vacuum and in low-polarity solvents for the chlorine, bromine and iodine derivatives. For 2-fluorocyclooctanone, the preferred conformation in vacuum is pseudo-axial. In acetonitrile, the pseudo-axial conformer becomes the most stable for the chlorine derivative. According to NBO calculations, the conformational preference is not dictated by electron delocalization, but by classical electrostatic repulsions.