839 results for coincident timing task
Abstract:
In recent years, the study of psychopathy has gained considerable prominence within scientific research, with new studies continually being carried out that aim at the efficient detection of psychopathy. Portugal is no exception. However, although research in the field of psychopathy already exists, studies and properly validated instruments for its assessment in the Portuguese population are still scarce, so continued work and progress in this area is needed. It is in this context that the idea arises of applying the Wason Selection Task (a conditional-reasoning card task) together with the PCL:SV in the Portuguese population. It is believed that the Wason Selection Task can complement the application of the PCL:SV, since it is a logic test whose results are more reliable, insofar as it is less permeable to social desirability effects.
Abstract:
Wireless communication technologies have become widely adopted, appearing in heterogeneous applications ranging from tracking victims, responders and equipment in disaster scenarios to machine health monitoring in networked manufacturing systems. Very often, applications demand a strictly bounded timing response, which, in distributed systems, is generally highly dependent on the performance of the underlying communication technology. These systems are said to have real-time timeliness requirements, since data communication must be conducted within predefined temporal bounds, whose unfulfillment may compromise the correct behavior of the system and cause economic losses or endanger human lives.

The potential adoption of wireless technologies for an increasingly broad range of application scenarios has made the operational requirements more complex and heterogeneous than they were for wired technologies. In step with this trend, there is an increasing demand for cost-effective distributed systems with improved deployment, maintenance and adaptation features. These systems tend to require operational flexibility, which can only be ensured if the underlying communication technology provides both time- and event-triggered data transmission services while supporting on-line, on-the-fly parameter modification.

Generally, wireless-enabled applications have deployment requirements that can only be addressed through the use of batteries and/or energy-harvesting mechanisms for power supply. These applications usually have stringent autonomy requirements and demand a small form factor, which hinders the use of large batteries. As the communication support may represent a significant part of the energy requirements of a station, the use of power-hungry technologies is not adequate. Hence, in such applications, low-range technologies have been widely adopted. In fact, although low-range technologies provide smaller data rates, they spend just a fraction of the energy of their higher-power counterparts.

The timeliness requirements of data communications can, in general, be met by ensuring the availability of the medium for any station initiating a transmission. In controlled (closed) environments this can be guaranteed, as there is strict regulation of which stations are installed in the area and for which purpose. In open environments, however, this is hard to control because no a priori knowledge is available of which stations and technologies may contend for the medium at any given instant. Hence, the support of wireless real-time communications in unmanaged scenarios is a highly challenging task.

Wireless low-power technologies have been the focus of a large research effort, for example in the Wireless Sensor Network domain. Although bringing extended autonomy to battery-powered stations, such technologies are known to be negatively influenced by similar technologies contending for the medium and, especially, by technologies using higher-power transmissions over the same frequency bands. A frequency band that is becoming increasingly crowded with competing technologies is the 2.4 GHz Industrial, Scientific and Medical band, encompassing, for example, Bluetooth and ZigBee, two low-power communication standards that form the basis of several real-time protocols.
Although these technologies employ mechanisms to improve their coexistence, they are still vulnerable to transmissions from uncoordinated stations using similar technologies or to higher-power technologies such as Wi-Fi, which hinders the support of wireless dependable real-time communications in open environments.

The Wireless Flexible Time-Triggered Protocol (WFTT) is a master/multi-slave protocol that builds on the flexibility and timeliness provided by the FTT paradigm and on the deterministic medium capture and maintenance provided by the bandjacking technique. This dissertation presents the WFTT protocol and argues that it allows supporting wireless real-time communication services with high dependability requirements in open environments where multiple contention-based technologies may dispute the medium access. In addition, it claims that it is feasible to provide wireless communications that are simultaneously flexible and timely in open environments.

The WFTT protocol was inspired by the FTT paradigm, from which higher-layer services, such as admission control, have been ported. After realizing that bandjacking was an effective technique to ensure medium access and maintenance in open environments crowded with contention-based communication technologies, it was recognized that the mechanism could be used to devise a wireless medium access protocol that could bring the features offered by the FTT paradigm to the wireless domain. The performance of the WFTT protocol is reported in this dissertation with a description of the implemented devices, the test-bed and a discussion of the obtained results.
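Since WFTT borrows its time-triggered structure from the FTT paradigm, the minimal sketch below illustrates the general idea of a master broadcasting a trigger message at the start of each elementary cycle. All names and values (ELEMENTARY_CYCLE_S, SYNC_TABLE, build_trigger_message) are illustrative assumptions, not taken from the dissertation or its implementation.

# Illustrative sketch of an FTT-style master loop (not the WFTT implementation).
# The master divides time into fixed-length Elementary Cycles (ECs); at the start
# of each EC it broadcasts a trigger message listing the synchronous messages to
# be produced in that EC, then leaves the remaining window for other traffic.
import time

ELEMENTARY_CYCLE_S = 0.010          # hypothetical 10 ms elementary cycle
SYNC_TABLE = {                      # message id -> period in ECs (illustrative values)
    1: 1,                           # every EC
    2: 4,                           # every 4th EC
}

def build_trigger_message(ec_number):
    """Select the synchronous messages scheduled for this EC."""
    return [mid for mid, period in SYNC_TABLE.items() if ec_number % period == 0]

def broadcast(trigger):
    print("EC trigger:", trigger)   # stand-in for the actual wireless broadcast

def master_loop(num_cycles=5):
    for ec in range(num_cycles):
        start = time.monotonic()
        broadcast(build_trigger_message(ec))
        # the synchronous window and then the asynchronous window would follow here
        time.sleep(max(0.0, ELEMENTARY_CYCLE_S - (time.monotonic() - start)))

if __name__ == "__main__":
    master_loop()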
Abstract:
Studies examined the potential use of Virtual Environments (VEs) in teaching historical chronology to 127 children of primary school age (8–9 years). The use of passive fly-through VEs had been found, in an earlier study, to be disadvantageous with this age group when tested for their subsequent ability to place displayed sequential events in correct chronological order. All VEs in the present studies included active challenge, previously shown to enhance learning in older participants. Primary school children in the UK (all frequent computer users) were tested using UK historical materials, but no significant difference was found between the three conditions (Paper, PowerPoint and VE) with minimal pre-training. However, excellent (error-free) learning occurred when children were allowed greater exploration prior to training in the VE. In Ukraine, with children having much less computer familiarity, training in a VE (depicting Ukrainian history) produced better learning than PowerPoint, but no better than the Paper condition. The results confirmed the benefit of using challenge in a VE with primary-age children, but only with adequate prior familiarisation with the medium. Familiarity may reduce working memory load and increase children's spatial memory capacity for acquiring sequential temporal-spatial information from virtual displays.
Keywords: timeline, chronographics
Abstract:
Sound localization can be defined as the ability to identify the position of an input sound source and is considered a powerful aspect of mammalian perception. For low-frequency sounds, i.e., in the range 270 Hz to 1.5 kHz, the mammalian auditory pathway achieves this by extracting the Interaural Time Difference (ITD) between the sound signals received by the left and right ears. This processing is performed in a region of the brain known as the Medial Superior Olive (MSO). This paper presents a Spiking Neural Network (SNN) based model of the MSO. The network model is trained using the Spike Timing Dependent Plasticity (STDP) learning rule on experimentally observed Head-Related Transfer Function data from an adult domestic cat. The results presented demonstrate how the proposed SNN model is able to perform sound localization with an accuracy of 91.82% when an error tolerance of ±10 degrees is used. For angular resolutions down to 2.5 degrees, it is demonstrated that software-based simulations of the model incur significant computation times. The paper therefore also addresses a preliminary implementation on a Field Programmable Gate Array based hardware platform to accelerate system performance.
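As a point of reference for the STDP rule mentioned above, here is a minimal sketch of the standard pair-based exponential STDP weight update. The amplitudes and time constants (A_PLUS, A_MINUS, TAU_PLUS, TAU_MINUS) are illustrative placeholders, not the values used in the MSO model.

# Minimal sketch of a pair-based STDP weight update (parameter values are
# illustrative, not those of the MSO model described above).
import math

A_PLUS, A_MINUS = 0.01, 0.012   # potentiation / depression amplitudes (assumed)
TAU_PLUS = TAU_MINUS = 20.0     # time constants in ms (assumed)

def stdp_delta_w(t_pre, t_post):
    """Weight change for a single pre/post spike pair (spike times in ms)."""
    dt = t_post - t_pre
    if dt >= 0:                 # pre before post -> potentiation
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    return -A_MINUS * math.exp(dt / TAU_MINUS)   # post before pre -> depression

# Example: a pre-synaptic spike 3 ms before the post-synaptic spike strengthens the synapse.
print(stdp_delta_w(t_pre=10.0, t_post=13.0))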
Abstract:
Master's dissertation in Corporate Finance (Finanças Empresariais), Faculdade de Economia, Universidade do Algarve, 2010
Abstract:
Doctoral thesis in Medicine (Gynaecology and Obstetrics), Universidade de Lisboa, Faculdade de Medicina, 2014
Abstract:
Doctoral thesis (joint supervision), Geology (Internal Geodynamics), Faculdade de Ciências da Universidade de Lisboa, Faculté des Sciences d'Orsay-Université Paris-Sud, 2014
Abstract:
In light of heightened interest in the response of pollen phenology to temperature, we investigated recent changes to the onset of Betula (birch) pollen seasons in central and southern England, including a test of predicted advancement of the Betula pollen season for London. We calculated onset of birch pollen seasons using daily airborne pollen data obtained at London, Plymouth and Worcester, determined trends in the start of the pollen season and compared timing of the birch pollen season with observed temperature patterns for the period 1995–2010. We found no overall change in the onset of birch pollen in the study period although there was evidence that the response to temperature was nonlinear and that a lower asymptotic start of the pollen season may exist. The start of the birch pollen season was strongly correlated with March mean temperature. These results reinforce previous findings showing that the timing of the birch pollen season in the UK is particularly sensitive to spring temperatures. The climate relationship shown here persists over both longer decadal-scale trends and shorter, seasonal trends as well as during periods of ‘sign-switching’ when cooler spring temperatures result in later start dates. These attributes, combined with the wide geographical coverage of airborne pollen monitoring sites, some with records extending back several decades, provide a powerful tool for the detection of climate change impacts, although local site factors and the requirement for winter chilling may be confounding factors.
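For readers who want to reproduce this kind of analysis, a minimal sketch of correlating season onset with March mean temperature follows. The numbers are invented placeholders, not the London, Plymouth or Worcester data analysed in the study.

# Illustrative only: correlating birch pollen season onset (day of year) with
# March mean temperature. The values below are invented placeholders.
from scipy.stats import pearsonr

march_mean_temp_c = [7.1, 6.0, 8.3, 5.5, 7.8, 6.9]   # hypothetical yearly values
onset_day_of_year = [98, 105, 92, 110, 95, 100]      # hypothetical onset dates

r, p = pearsonr(march_mean_temp_c, onset_day_of_year)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")           # warmer Marches -> earlier onset gives r < 0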
Abstract:
Indices of post-awakening cortisol secretion (PACS) include the rise in cortisol (cortisol awakening response: CAR) and overall cortisol concentrations (e.g. area under the curve with reference to ground: AUCg) in the first 30-45 min. Both are commonly investigated in relation to psychosocial variables. Although sampling within the domestic setting is ecologically valid, participant non-adherence to the required timing protocol results in erroneous measurement of PACS, and this may explain discrepancies in the literature linking these measures to trait well-being (TWB). We have previously shown that delays of little over 5 min (between awakening and the start of sampling) result in erroneous CAR estimates. In this study, we report for the first time on the negative impact of sample timing inaccuracy (verified by electronic monitoring) on the efficacy to detect significant relationships between PACS and TWB when measured in the domestic setting. Healthy females (N = 49, 20.5 ± 2.8 years) selected for differences in TWB collected saliva samples (S1-S4) on 4 days at 0, 15, 30 and 45 min post awakening, to determine PACS. Adherence to the sampling protocol was objectively monitored using a combination of electronic estimates of awakening (actigraphy) and sampling times (track caps). Relationships between PACS and TWB were found to depend on sample timing accuracy. Lower TWB was associated with higher post-awakening cortisol AUCg in proportion to the mean sample timing accuracy (p < .005). There was no association between TWB and the CAR, even taking into account sample timing accuracy. These results highlight the importance of careful electronic monitoring of participant adherence for measurement of PACS in the domestic setting. Mean sample timing inaccuracy, mainly associated with delays of >5 min between awakening and collection of sample 1 (median = 8 min delay), negatively impacts the sensitivity of analyses to detect associations between PACS and TWB.
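As an aside on the AUCg measure referred to above, it is commonly computed with the trapezoid rule over the post-awakening samples. The sketch below assumes samples at 0, 15, 30 and 45 min and is a generic formulation, not necessarily the exact computation used in this study.

# Sketch of the area under the curve with reference to ground (AUCg) for
# post-awakening cortisol, using the trapezoid rule (a common formulation).

def auc_ground(concentrations, times_min=(0, 15, 30, 45)):
    """Trapezoidal AUCg for cortisol samples taken at the given minutes post awakening."""
    auc = 0.0
    for i in range(len(times_min) - 1):
        dt = times_min[i + 1] - times_min[i]
        auc += dt * (concentrations[i] + concentrations[i + 1]) / 2.0
    return auc

# Example: S1-S4 in nmol/L; the CAR would instead be computed as the rise from S1.
print(auc_ground([5.2, 9.8, 12.4, 10.1]))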
Abstract:
The paper concerns the moral status of persons for the purposes of rights-holding and duty-bearing. Developing from Gewirth’s argument to the Principle of Generic Consistency (PGC) and Beyleveld et al.’s Principle of Precautionary Reasoning, I argue in favour of a capacity-based assessment of the task competencies required for choice-rights and certain duties (within the Hohfeldian analytic). Unlike other, traditional, theories of rights, I claim that precautionary reasoning as to agentic status holds the base justification for rights-holding. If this is the basis for generic legal rights, then the contingent argument must be used to explain communities of rights. Much in the same way as two ‘normal’ adult agents may not have equal rights to be an aeroplane pilot, not all adults hold the same task competencies in relation to the exercise of the generic rights to freedom derived from the PGC. In this paper, I set out to consider the rights held by children, persons suffering from mental illness and generic ‘full’ agents. In mapping the developing ‘portfolio’ of rights and duties that a person carries during their life we might better understand the legal relations of those who do not ostensibly fulfil the criteria of ‘full’ agent.
Abstract:
Consider the problem of assigning real-time tasks on a heterogeneous multiprocessor platform comprising two different types of processors; such a platform is referred to as a two-type platform. We present two linearithmic time-complexity algorithms, SA and SA-P, each providing the following guarantee. For a given two-type platform and a given task set, if there exists a feasible task-to-processor-type assignment such that tasks can be scheduled to meet deadlines by allowing them to migrate only between processors of the same type, then (i) using SA, it is guaranteed to find such a feasible task-to-processor-type assignment, where the same restriction on task migration applies, but given a platform in which processors are 1+α/2 times faster, and (ii) using SA-P, it is guaranteed to find a feasible task-to-processor assignment, where tasks are not allowed to migrate between processors, but given a platform in which processors are 1+α times faster, where 0 < α ≤ 1. The parameter α is a property of the task set; it is the maximum utilization of any task, which is less than or equal to 1.
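Read symbolically (the notation below is ours, not necessarily the paper's), with $u_i$ denoting the utilization of task $\tau_i$ in the task set $\tau$:

\[
\alpha \;=\; \max_{\tau_i \in \tau} u_i, \qquad 0 < \alpha \le 1 .
\]

With this α, the speed-up factors quoted above are $1+\alpha/2$ for SA (intra-type-migrative assignment) and $1+\alpha$ for SA-P (fully partitioned, non-migrative assignment).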
Abstract:
A large part of the power dissipation in a system is generated by I/O devices. Increasingly, these devices provide power-saving mechanisms, inter alia to enhance battery life. While I/O device scheduling for real-time systems has been studied in the past, the use of energy resources by these scheduling algorithms can still be improved, as they were crafted assuming a very large device transition overhead. Technology enhancements have allowed hardware vendors to reduce the device transition overhead and energy consumption. We propose an intra-task device scheduling algorithm for real-time systems that allows devices to be shut down while ensuring system schedulability. Our results show an energy gain of up to 90% when compared to the techniques proposed in the state of the art.
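To give a flavour of the kind of decision an intra-task device scheduler has to make, the sketch below checks whether switching a device off during a known idle interval saves energy and still leaves time to wake it up again. All parameter names and values are hypothetical, and the schedulability check performed by the paper's algorithm is not modelled here.

# Illustrative shut-down test for an I/O device over a known idle interval.
# This is a generic break-even check, not the algorithm proposed in the paper.

def worth_shutting_down(idle_s, p_active_w, p_sleep_w, e_transition_j, t_transition_s):
    """Return True if sleeping over the idle interval saves energy and the
    device can complete its sleep/wake transition before it is needed again."""
    if idle_s <= t_transition_s:          # cannot complete sleep + wake in time
        return False
    e_stay_on = p_active_w * idle_s
    e_sleep = p_sleep_w * (idle_s - t_transition_s) + e_transition_j
    return e_sleep < e_stay_on

print(worth_shutting_down(idle_s=0.5, p_active_w=0.8, p_sleep_w=0.05,
                          e_transition_j=0.1, t_transition_s=0.02))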
Abstract:
In embedded systems, the timing behaviour of the control mechanisms is sometimes of critical importance for operational safety. These high-criticality systems require strict compliance with the offline-predicted task execution time. The execution of a task, when subject to preemption, may vary significantly in comparison to its non-preemptive execution. Hence, when preemptive scheduling is required to operate the workload, preemption delay estimation is of paramount importance. In this paper, a preemption delay estimation method for floating non-preemptive scheduling policies is presented. This work builds on [1], extending the model and optimising it considerably. The preemption delay function is substantially tightened in the context of WCET analysis. Moreover, additional information is provided in the form of an extrinsic cache-miss function, which enables the method to provide a solution in situations where the non-preemptive regions are small. Finally, experimental results from the implementation of the proposed solutions in Heptane are provided for real benchmarks, validating the significance of this work.
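The role played by such a preemption delay bound can be illustrated by the usual way the worst-case execution time is inflated for schedulability analysis (a generic formulation, not the specific method of this paper):

\[
C_i' \;=\; C_i \;+\; n_i \cdot \gamma_i ,
\]

where $C_i$ is the non-preemptive WCET of task $\tau_i$, $n_i$ an upper bound on the number of preemptions it may suffer, and $\gamma_i$ a bound on the cache-related preemption delay incurred per preemption.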