43 results for Task-Based Instruction (TBI)


Relevance:

30.00%

Publisher:

Abstract:

Inferring population admixture from genetic data and quantifying it is a difficult but crucial task in evolutionary and conservation biology. Unfortunately, state-of-the-art probabilistic approaches are computationally demanding. Effectively exploiting the computational power of modern multiprocessor systems can thus have a positive impact on Monte Carlo-based simulation of admixture modeling. A novel parallel approach is briefly described, and promising results on its Message Passing Interface (MPI)-based C implementation are reported.
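The abstract only sketches the parallel approach, so as a generic illustration of how Monte Carlo simulation is split across workers, the following Python sketch combines independent sampling streams (a hypothetical pi-estimation stand-in, not the paper's admixture model or its MPI code):

```python
import numpy as np

def monte_carlo_worker(seed_seq, n_samples):
    """One independent Monte Carlo stream (stand-in for an MPI rank)."""
    rng = np.random.default_rng(seed_seq)
    x, y = rng.random(n_samples), rng.random(n_samples)
    return np.count_nonzero(x * x + y * y <= 1.0)

def parallel_pi_estimate(n_workers=4, n_per_worker=25_000):
    # Spawn statistically independent child streams, one per "rank",
    # then combine the partial counts -- the same decomposition an
    # MPI implementation would distribute over real processes.
    streams = np.random.SeedSequence(42).spawn(n_workers)
    hits = sum(monte_carlo_worker(s, n_per_worker) for s in streams)
    return 4.0 * hits / (n_workers * n_per_worker)
```

Because each stream is independent, the partial results can be computed on separate processors and merged with a single reduction at the end.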

Relevance:

30.00%

Publisher:

Abstract:

The hazards associated with high voltage three phase inverters and the rotating shafts of large electrical machines have resulted in most engineering courses covering these topics being predominantly theoretical. This paper describes a set of purpose built, low voltage and low cost teaching equipment which allows the “hands-on” instruction of three phase inverters and rotating machines. By using low voltages, the student can experiment freely with the motors and inverter and can access all of the current and voltage waveforms, which until now could only be studied in text books or observed as part of laboratory demonstrations. Both the motor and the inverter designs are optimized for teaching purposes, cost around $25 and can be made with minimal effort.

Relevance:

30.00%

Publisher:

Abstract:

The hazards associated with high-voltage three-phase inverters and high-powered large electrical machines have resulted in most engineering courses covering three-phase machines and drives only theoretically. This paper describes a set of purpose-built, low-voltage, and low-cost teaching equipment that allows the hands-on instruction of three-phase inverters and rotating machines. The motivation for moving towards a system running at low voltages is that students can safely experiment freely with the motors and inverter. The students can also access all of the current and voltage waveforms, which until now could only be studied in textbooks or observed as part of laboratory demonstrations. Both the motor and the inverter designs are intended for teaching purposes and require minimal effort and cost.

Relevance:

30.00%

Publisher:

Abstract:

Individuals with dysphagia may be prescribed thickened fluids to promote a safer and more successful swallow. Starch-based thickening agents are often employed; however, these exhibit great variation in consistency. The aim of this study was to compare viscosity and the rheological parameters complex modulus (G*), viscous modulus (G″), and elastic modulus (G′) over a range of physiological shear rates. UK commercially available dysphagia products at “custard” consistency were examined. Commercially available starch-based dysphagia products were prepared according to manufacturers’ instructions; the viscosity and rheological parameters were tested on a CVOR Rheometer. At a measured shear rate of 50 s−1, all products fell within the viscosity limits defined according to the National Dysphagia Diet Task Force guidelines. However, at lower shear rates, large variations in viscosity were observed. Rheological parameters G*, G′, and G″ also demonstrated considerable differences in both overall strength and rheological behavior between different batches of the same product and different product types. The large range in consistency and changes in the overall structure of the starch-based products over a range of physiological shear rates show that patients could be receiving fluids with very different characteristics from those advised. This could have detrimental effects on their ability to swallow.
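For readers unfamiliar with the quantities being compared, the standard textbook rheology relations can be sketched in Python (the formulas are generic and the parameter values illustrative; they are not the paper's measured data):

```python
import math

def complex_modulus(g_elastic, g_viscous):
    """|G*| = sqrt(G'^2 + G''^2): the overall resistance to deformation,
    combining the elastic (G') and viscous (G'') moduli."""
    return math.hypot(g_elastic, g_viscous)

def power_law_viscosity(k, n, shear_rate):
    """Apparent viscosity of a power-law fluid: eta = K * rate**(n - 1).
    Starch thickeners are shear-thinning (n < 1), so apparent viscosity
    falls as the shear rate rises towards swallowing conditions."""
    return k * shear_rate ** (n - 1)
```

Comparing `power_law_viscosity` at 1 s−1 and 50 s−1 for a shear-thinning fluid shows why two products matching the guideline limits at 50 s−1 can still differ widely at lower shear rates.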

Relevance:

30.00%

Publisher:

Abstract:

The utility of an "ecologically rational" recognition-based decision rule in multichoice decision problems is analyzed, varying the type of judgment required (greater or lesser). The maximum size and range of a counterintuitive advantage associated with recognition-based judgment (the "less-is-more effect") is identified for a range of cue validity values. Greater ranges of the less-is-more effect occur when participants are asked which is the greatest of m choices (m > 2) than which is the least. Less-is-more effects also have greater range for larger values of m. This implies that the classic two-alternative forced choice task, as studied by Goldstein and Gigerenzer (2002), may not be the most appropriate test case for less-is-more effects.
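The two-alternative baseline that this study generalizes can be stated compactly. The Python sketch below uses the standard accuracy decomposition from Goldstein and Gigerenzer (2002); the validity values are illustrative, not taken from the paper:

```python
def recognition_accuracy(n, N, alpha, beta):
    """Expected two-alternative accuracy when n of N objects are
    recognized; alpha = recognition validity, beta = knowledge validity.
    Pairs split three ways: both recognized (use knowledge), exactly
    one recognized (use the recognition heuristic), neither (guess)."""
    pairs = N * (N - 1) / 2.0
    p_both = (n * (n - 1) / 2.0) / pairs
    p_one = (n * (N - n)) / pairs
    p_none = ((N - n) * (N - n - 1) / 2.0) / pairs
    return p_both * beta + p_one * alpha + p_none * 0.5
```

When recognition validity exceeds knowledge validity (alpha > beta), accuracy peaks before everything is recognized: with alpha = 0.8 and beta = 0.6, recognizing 50 of 100 objects beats recognizing all 100 — the less-is-more effect.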

Relevance:

30.00%

Publisher:

Abstract:

In this study, for the first time, prospective memory was investigated in 11 school-aged children with autism spectrum disorders and 11 matched neurotypical controls. A computerised time-based prospective memory task was embedded in a visuospatial working memory test and required participants to remember to respond to certain target times. Controls had significantly more correct prospective memory responses than the autism spectrum group. Moreover, controls checked the time more often and increased time-monitoring more steeply as the target times approached. These differences in time-checking may suggest that prospective memory in autism spectrum disorders is affected by reduced self-initiated processing as indicated by reduced task monitoring.

Relevance:

30.00%

Publisher:

Abstract:

Frequency recognition is an important task in many engineering fields such as audio signal processing and telecommunications engineering, for example in applications like Dual-Tone Multi-Frequency (DTMF) detection or the recognition of the carrier frequency of a Global Positioning System (GPS) signal. This paper presents results of investigations on several common Fourier transform-based frequency recognition algorithms implemented in real time on a Texas Instruments (TI) TMS320C6713 Digital Signal Processor (DSP) core. In addition, suitable metrics are evaluated in order to ascertain which of these selected algorithms is appropriate for audio signal processing.
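The abstract does not list the benchmarked algorithms, but the simplest Fourier-based frequency recognizer — picking the largest-magnitude FFT bin — can be sketched in Python (the 697 Hz DTMF row tone and the 8 kHz sample rate below are illustrative choices, not the paper's test setup):

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the frequency (Hz) of the strongest FFT bin."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

# One second of a 697 Hz tone (a DTMF row frequency) sampled at 8 kHz;
# a 1 s window gives 1 Hz bin spacing, so the tone lands on an exact bin.
fs = 8000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 697 * t)
```

Real-time DSP implementations, such as those on the TMS320C6713, typically trade this brute-force FFT search against cheaper single-bin methods when only a few known frequencies (as in DTMF) must be detected.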

Relevance:

30.00%

Publisher:

Abstract:

In the past decade, airborne Light Detection And Ranging (LIDAR) has been recognised by both the commercial and public sectors as a reliable and accurate source for land surveying in environmental, engineering and civil applications. Commonly, the first task in investigating LIDAR point clouds is to separate ground and object points. Skewness Balancing has been proven to be an efficient non-parametric unsupervised classification algorithm to address this challenge. Initially developed for moderate terrain, the algorithm needs to be adapted to handle sloped terrain. This paper addresses the difficulty of object and ground point separation in LIDAR data in hilly terrain. A case study has been carried out on a LIDAR data set that is diverse in terms of data provider, resolution and LIDAR echo. Several sites in urban and rural areas with man-made structures and vegetation in moderate and hilly terrain have been investigated, and three categories have been identified. An urban scene with a river bank has been selected for deeper investigation to extend the existing algorithm. The results show that an iterative use of Skewness Balancing is suitable for sloped terrain.
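The core of Skewness Balancing, as described in the literature on the algorithm, is to peel off the highest returns until the height distribution is no longer positively skewed. The Python sketch below is a simplified single-pass illustration, not the paper's extended iterative variant for sloped terrain:

```python
import numpy as np

def skewness(z):
    """Sample skewness: third central moment over cubed std deviation."""
    z = np.asarray(z, dtype=float)
    s = z.std()
    return 0.0 if s == 0 else float(np.mean((z - z.mean()) ** 3) / s ** 3)

def skewness_balancing(heights):
    """Label each point as object (True) or ground (False): remove the
    highest remaining point while the height sample is positively
    skewed; whatever remains is treated as ground."""
    heights = np.asarray(heights, dtype=float)
    order = np.argsort(heights)            # indices, ascending by height
    is_object = np.zeros(len(heights), dtype=bool)
    k = len(heights)
    while k > 1 and skewness(heights[order[:k]]) > 0:
        k -= 1
        is_object[order[k]] = True         # strip the current maximum
    return is_object
```

On sloped terrain the ground itself skews the height distribution, which is why a single pass misclassifies points there and the paper applies the balancing step iteratively.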

Relevance:

30.00%

Publisher:

Abstract:

This paper derives an efficient algorithm for constructing sparse kernel density (SKD) estimates. The algorithm first selects a very small subset of significant kernels using an orthogonal forward regression (OFR) procedure based on the D-optimality experimental design criterion. The weights of the resulting sparse kernel model are then calculated using a modified multiplicative nonnegative quadratic programming algorithm. Unlike most of the SKD estimators, the proposed D-optimality regression approach is an unsupervised construction algorithm and it does not require an empirical desired response for the kernel selection task. The strength of the D-optimality OFR lies in the fact that the algorithm automatically selects a small subset of the most significant kernels related to the largest eigenvalues of the kernel design matrix, which accounts for most of the energy of the kernel training data; this also guarantees the most accurate kernel weight estimate. The proposed method is also computationally attractive, in comparison with many existing SKD construction algorithms. Extensive numerical investigation demonstrates the ability of this regression-based approach to efficiently construct a very sparse kernel density estimate with excellent test accuracy, and our results show that the proposed method compares favourably with other existing sparse methods, in terms of test accuracy, model sparsity and complexity, for constructing kernel density estimates.
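As a simplified illustration of the selection idea only (not the paper's orthogonal forward regression or its MNQP weight estimation), a greedy D-optimal choice of kernel centres picks, at each step, the kernel that maximizes the determinant of the selected Gram submatrix:

```python
import numpy as np

def gaussian_gram(x, width=1.0):
    """Gram matrix of Gaussian kernels centred on the 1-D points x."""
    d = x[:, None] - x[None, :]
    return np.exp(-d ** 2 / (2.0 * width ** 2))

def greedy_d_optimal(K, m):
    """Greedily pick m kernel indices, each step maximizing det(K[S, S])
    of the selected submatrix (the D-optimality criterion)."""
    selected = []
    for _ in range(m):
        best_j, best_det = -1, -np.inf
        for j in range(K.shape[0]):
            if j in selected:
                continue
            idx = np.ix_(selected + [j], selected + [j])
            det = np.linalg.det(K[idx])
            if det > best_det:
                best_j, best_det = j, det
        selected.append(best_j)
    return selected
```

The determinant rewards kernels that are nearly orthogonal to those already chosen, which is why the criterion favours the directions of largest eigenvalues of the design matrix and spreads the sparse centres over the data.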

Relevance:

30.00%

Publisher:

Abstract:

Recent research in multi-agent systems incorporates fault tolerance concepts, but does not explore the extension and implementation of such ideas for large scale parallel computing systems. The work reported in this paper investigates a swarm array computing approach, namely 'Intelligent Agents'. A task to be executed on a parallel computing system is decomposed into sub-tasks and mapped onto agents that traverse an abstracted hardware layer. The agents intercommunicate across processors to share information in the event of a predicted core/processor failure and to successfully complete the task. The feasibility of the approach is validated by implementation of a parallel reduction algorithm on a computer cluster using the Message Passing Interface.

Relevance:

30.00%

Publisher:

Abstract:

Recent research in multi-agent systems incorporates fault tolerance concepts, but does not explore the extension and implementation of such ideas for large scale parallel computing systems. The work reported in this paper investigates a swarm array computing approach, namely 'Intelligent Agents'. A task to be executed on a parallel computing system is decomposed into sub-tasks and mapped onto agents that traverse an abstracted hardware layer. The agents intercommunicate across processors to share information in the event of a predicted core/processor failure and to successfully complete the task. The feasibility of the approach is validated by simulations on an FPGA using a multi-agent simulator, and by implementation of a parallel reduction algorithm on a computer cluster using the Message Passing Interface.

Relevance:

30.00%

Publisher:

Abstract:

Recent research in multi-agent systems incorporates fault tolerance concepts. However, the research does not explore the extension and implementation of such ideas for large scale parallel computing systems. The work reported in this paper investigates a swarm array computing approach, namely ‘Intelligent Agents’. In the approach considered, a task to be executed on a parallel computing system is decomposed into sub-tasks and mapped onto agents that traverse an abstracted hardware layer. The agents intercommunicate across processors to share information in the event of a predicted core/processor failure and to successfully complete the task. The agents hence contribute towards fault tolerance and towards building reliable systems. The feasibility of the approach is validated by simulations on an FPGA using a multi-agent simulator and by implementation of a parallel reduction algorithm on a computer cluster using the Message Passing Interface.
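The validation case, a parallel reduction, follows a standard pattern. As a hypothetical sketch (simulated here with plain Python lists rather than MPI processes or the paper's agents), the tree-shaped combine that an `MPI_Reduce` performs across ranks looks like this:

```python
def tree_reduce(values, op=lambda a, b: a + b):
    """Combine per-rank partial results pairwise, halving the number of
    active 'ranks' each round, as in a binomial-tree MPI_Reduce."""
    active = list(values)
    while len(active) > 1:
        nxt = []
        for i in range(0, len(active) - 1, 2):
            nxt.append(op(active[i], active[i + 1]))
        if len(active) % 2:        # odd rank count: carry the last value
            nxt.append(active[-1])
        active = nxt
    return active[0]
```

With 8 ranks the result arrives in 3 pairwise rounds instead of 7 sequential combines; in the agent-based scheme described above, an agent on a core predicted to fail would migrate its partial value to a healthy processor before the next round, which is what makes the reduction fault-tolerant.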

Relevance:

30.00%

Publisher:

Abstract:

The National Grid Company plc owns and operates the electricity transmission network in England and Wales, the day-to-day running of the network being carried out by teams of engineers within the national control room. The task of monitoring and operating the transmission network involves the transfer of large amounts of data and a high degree of cooperation between these engineers. The purpose of the research detailed in this paper is to investigate the use of interfacing techniques within the control room scenario, in particular the development of an agent-based architecture for the support of cooperative tasks. The proposed architecture revolves around the use of interface and user supervisor agents. Primarily, these agents are responsible for the flow of information to and from individual users and user groups. The agents are also responsible for tackling the synchronisation and control issues arising during the completion of cooperative tasks. In this paper, a novel approach to human-computer interaction (HCI) for power systems incorporating an embedded agent infrastructure is presented. The agent architectures used to form the base of the cooperative task support system are discussed, as is the nature of the support system and the tasks it is intended to support.

Relevance:

30.00%

Publisher:

Abstract:

The current study investigated the influence of encoding modality and cue-action relatedness on prospective memory (PM) performance in young and older adults using a modified version of the Virtual Week task. Participants encoded regular and irregular intentions either verbally or by physically performing the action during encoding. For half of the intentions there was a close semantic relation between the retrieval cue and the intended action, while for the remaining intentions the cue and action were semantically unrelated. For irregular tasks, both age groups showed superior PM for related intentions compared to unrelated intentions in both encoding conditions. While older adults retrieved fewer irregular intentions than young adults after verbal encoding, there was no age difference following enactment. Possible mechanisms of enactment and relatedness effects are discussed in the context of current theories of event-based PM.