832 results for ICGS (Electronic computer system)
Abstract:
The ability of millimetre wave and terahertz systems to penetrate clothing is well known; that the transmission of clothing and the reflectivity of the body vary as a function of frequency is less so. Several instruments have now been developed to exploit this capability. The choice of operating frequency, however, has often been driven by the maturity and cost of the enabling technology rather than by a sound systems engineering approach. Top-level user and systems requirements have been derived to inform the development of design concepts. Emerging micro- and nanotechnology concepts have been reviewed, and we have demonstrated how these can be evaluated against these requirements by simulation using OpenFX, an open-source suite of 3D tools for modelling, animation and visualization which has been modified for use at millimetre waves. © 2012 SPIE.
Abstract:
In this paper, a data-driven orthogonal basis function approach is proposed for non-parametric FIR nonlinear system identification. The basis functions are not fixed a priori; they automatically match the structure of the unknown system. This eliminates the problem of blindly choosing basis functions without a priori structural information. Further, based on the proposed basis functions, approaches are proposed for model order determination and regressor selection, together with their theoretical justifications. © 2008 IEEE.
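As an illustration of the general idea (not the authors' algorithm), the sketch below builds polynomial regressors from lagged inputs, orthogonalizes them against the observed data with a QR decomposition, and ranks each orthogonal direction by the fraction of output energy it captures, in the spirit of orthogonal least squares. The toy system and all signals are assumptions.

```python
# Minimal sketch (not the authors' algorithm): data-adapted orthogonal basis
# via QR on candidate regressors, ranked by captured output energy.
import numpy as np

def lagged_matrix(u, n_lags):
    """Stack u(t), u(t-1), ..., u(t-n_lags+1) as columns."""
    T = len(u) - n_lags + 1
    return np.column_stack([u[i:i + T] for i in range(n_lags)][::-1])

rng = np.random.default_rng(0)
u = rng.uniform(-1, 1, 500)
X = lagged_matrix(u, 3)                                            # u(t), u(t-1), u(t-2)
y = 0.8 * X[:, 0] - 0.5 * X[:, 1] ** 2 + 0.3 * X[:, 0] * X[:, 2]   # toy unknown system
y += 0.01 * rng.standard_normal(len(y))

# Candidate regressors: all monomials up to degree 2 in the lagged inputs.
cands = [X[:, i] for i in range(3)]
cands += [X[:, i] * X[:, j] for i in range(3) for j in range(i, 3)]
Phi = np.column_stack(cands)

# QR yields an orthogonal basis adapted to the observed data.
Q, R = np.linalg.qr(Phi)
g = Q.T @ y                        # coefficients in the orthogonal basis
err = g ** 2 / (y @ y)             # fraction of output energy per direction
print("energy ratios per candidate:", np.round(err, 3))
```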
Abstract:
This work presents a novel approach for human action recognition based on the combination of computer vision techniques with common-sense knowledge and reasoning capabilities. The emphasis of this work is on how common-sense knowledge can be leveraged by vision-based human action recognition so that nonsensical errors can be corrected at the understanding stage. The proposed framework is to be deployed in a realistic environment in which humans behave rationally, that is, motivated by an aim or a reason. © 2012 Springer-Verlag.
Abstract:
This paper presents a novel method for monitoring transport infrastructure such as pavements and bridges through the analysis of vehicle accelerations. An algorithm is developed for the identification of dynamic vehicle-bridge interaction forces using the vehicle response. Moving force identification theory is applied to a vehicle model in order to identify these dynamic forces between the vehicle and the road and/or bridge. A coupled half-car vehicle-bridge interaction model is used in theoretical simulations to test the effectiveness of the approach in identifying the forces. The potential of the method to identify the global bending stiffness of the bridge and to predict the pavement roughness is presented. The method is tested for a range of bridge spans using theoretical simulations, and the influences of road roughness and signal noise on the accuracy of the results are investigated.
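A minimal sketch of the regularized least-squares step at the heart of moving force identification follows: recovering a force history f from noisy responses a = Hf, where H is a Toeplitz convolution matrix built from an assumed impulse response. The impulse response, noise level and regularization weight are illustrative assumptions; the paper's coupled half-car vehicle-bridge model is not reproduced.

```python
# Minimal sketch of force identification as regularized deconvolution.
import numpy as np
from scipy.linalg import toeplitz

n = 200
t = np.linspace(0, 2, n)
h = np.exp(-5 * t) * np.cos(20 * t)            # assumed impulse response (illustrative)
H = toeplitz(h, np.zeros(n))                   # convolution as lower-triangular matrix

f_true = 1.0 + 0.3 * np.sin(4 * np.pi * t)     # illustrative dynamic wheel force
rng = np.random.default_rng(1)
a = H @ f_true + 0.01 * rng.standard_normal(n) # noisy measured response

# Tikhonov (L2) regularization: f_hat = argmin ||H f - a||^2 + lam ||f||^2
lam = 1e-3
f_hat = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ a)
print("relative error:", np.linalg.norm(f_hat - f_true) / np.linalg.norm(f_true))
```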
Abstract:
There is a dearth of evidence on student preferences for computer-based testing versus testing via student response systems for summative assessment in undergraduate education. This quantitative study compared the preference and acceptability of computer-based testing and a student response system for completing multiple choice questions in undergraduate nursing education. After using both computer-based testing and a student response system to complete multiple choice questions, 192 first-year undergraduate nursing students rated their preferences and attitudes towards the two approaches. Seventy-four percent felt the student response system was easy to use, although fifty-six percent felt it took more time than computer-based testing to become familiar with. Sixty percent felt computer-based testing was more user-friendly. Seventy percent of students would prefer to take a summative multiple choice question exam via computer-based testing, although fifty percent would be happy to take it using the student response system. The results are useful for undergraduate educators in relation to students' preferences for using computer-based testing or a student response system to undertake a summative multiple choice question exam.
Abstract:
Thesis submitted in fulfilment of the requirements for the Degree of Master in Electronic and Telecommunications Engineering
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatment by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach, but it requires computational assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each program. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentrations (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user-friendly. Programs vary in complexity and might not fit all healthcare settings; each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user-friendliness, data storage capability and report generation.
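To make the a priori versus a posteriori distinction concrete, here is a minimal sketch of a MAP-Bayesian dose adjustment under an assumed one-compartment, steady-state IV bolus model. The priors, doses, target and measured level are all illustrative; this is not a reproduction of any of the surveyed programs.

```python
# Minimal MAP-Bayesian TDM sketch (illustrative values throughout).
import numpy as np
from scipy.optimize import minimize_scalar

dose, tau, V = 500.0, 12.0, 30.0      # mg, h, L (V assumed known for simplicity)
t_obs, c_obs = 6.0, 9.0               # sampling time (h) and measured level (mg/L)
cl_pop, omega = 4.0, 0.3              # population clearance (L/h), log-normal SD
sigma = 1.0                           # residual (assay) error SD, mg/L

def conc(cl, t):
    """Steady-state concentration after an IV bolus, one-compartment model."""
    ke = cl / V
    return (dose / V) * np.exp(-ke * t) / (1 - np.exp(-ke * tau))

def neg_log_posterior(log_cl):
    cl = np.exp(log_cl)
    prior = (log_cl - np.log(cl_pop)) ** 2 / (2 * omega ** 2)
    lik = (c_obs - conc(cl, t_obs)) ** 2 / (2 * sigma ** 2)
    return prior + lik

res = minimize_scalar(neg_log_posterior, bounds=(np.log(0.5), np.log(20.0)),
                      method="bounded")
cl_map = np.exp(res.x)

# A priori vs a posteriori dose to reach a 5 mg/L trough (level at t = tau).
target = 5.0
for label, cl in [("a priori", cl_pop), ("a posteriori", cl_map)]:
    new_dose = dose * target / conc(cl, tau)   # conc is linear in dose
    print(f"{label}: CL = {cl:.2f} L/h, suggested dose = {new_dose:.0f} mg")
```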
Abstract:
The "Java Intelligent Tutoring System" (JITS) research project focused on designing, constructing, and determining the effectiveness of an Intelligent Tutoring System for beginner Java programming students at the postsecondary level. The participants in this research were students in the School of Applied Computing and Engineering Sciences at Sheridan College. This research involved consistently gathering input from students and instructors using JITS as it developed. A cyclic process of designing, developing, testing, and refinement was used in the construction of JITS to ensure that it adequately meets the needs of students and instructors. The second objective of this dissertation was to determine the effectiveness of learning within this environment. The main findings indicate that JITS is a richly interactive ITS that engages students in Java programming problems. JITS is equipped with a sophisticated personalized feedback mechanism that models and supports each student in his/her learning style. The assessment component involved two main quantitative experiments to determine the effectiveness of JITS in terms of student performance. In both experiments a statistically significant difference was found between the control group and the experimental group (i.e., the JITS group). The main effect for Test (i.e., pre- and posttest), F(1, 35) = 119.43, p < .001, was qualified by a Test by Group interaction, F(1, 35) = 4.98, p < .05, and a Test by Time interaction, F(1, 35) = 43.82, p < .001. Similar findings were found in the second experiment: the Test by Group interaction revealed F(1, 92) = 5.36, p < .025. In both experiments the JITS groups outperformed the corresponding control groups at posttest.
Abstract:
A method using L-cysteine for the determination of arsenous acid (As(III)), arsenic acid (As(V)), monomethylarsonic acid (MMAA), and dimethylarsinic acid (DMAA) by hydride generation was demonstrated. The instrument used was a d.c. plasma atomic emission spectrometer (DCP-AES). Complete recovery was reported for As(III), As(V), and DMAA, while 86% recovery was reported for MMAA. Detection limits, expressed as arsenic for the species listed above, were determined to be 1.2, 0.8, 1.1, and 1.0 ng·mL⁻¹, respectively. Precision values, at 50 ng·mL⁻¹ arsenic concentration, were 1.8%, 2.5%, 2.6% and 2.6% relative standard deviation, respectively. The L-cysteine reagent was compared directly with the conventional hydride generation technique, which uses a potassium iodide-hydrochloric acid medium. Compared with the conventional method, L-cysteine gave similar recoveries for As(III), slightly better recoveries for As(V) and MMAA, and significantly better recoveries for DMAA. In addition, tall and sharp peak shapes were observed for all four species when using L-cysteine. The arsenic speciation method involved separation by ion-exchange high-performance liquid chromatography (HPLC) with on-line hydride generation using the L-cysteine reagent and measurement by DCP-AES. Total analysis time per sample was 12 min, while the time between the start of subsequent runs was approximately 20 min. A binary gradient elution program, which incorporated two eluents (0.01 and 0.5 mM trisodium citrate, both containing 5% methanol (v/v) and both at a pH of approximately 9), was used during the separation by HPLC. Recoveries of the four species, measured as peak area and normalized against As(III), were 88%, 29%, and 40% for DMAA, MMAA and As(V), respectively. Resolution factors between adjacent analyte peaks were 1.1 for As(III) and DMAA, 1.3 for DMAA and MMAA, and 8.6 for MMAA and As(V). During the arsenic speciation study, signals from the d.c. plasma optical system were measured using a new photon-signal integrating device. This photon integrator, developed and built in this laboratory, was based on a previously published design which was further modified to reflect currently available hardware. It was interfaced to a personal computer through an A/D converter and has adjustable threshold settings and an adjustable post-gain device.
Abstract:
Given a heterogeneous relation algebra R, it is well known that the algebra of matrices with coefficients from R is a relation algebra with relational sums that is not necessarily finite. When a relational product exists or the point axiom is given, we can represent the relation algebra by concrete binary relations between sets, which means the algebra may be seen as an algebra of Boolean matrices. However, it is not possible to represent every relation algebra this way: it is well known that the smallest relation algebra that is not representable has only 16 elements, and such an algebra cannot be put in Boolean matrix form [15]. In [15, 16] it was shown that every relation algebra R with relational sums and sub-objects is equivalent to an algebra of matrices over a suitable basis. This basis is given by the integral objects of R and is, compared to R, much smaller. The aim of my thesis is to develop a system called ReAlM - Relation Algebra Manipulator - that is capable of visualizing computations in arbitrary relation algebras using the matrix approach.
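For the representable case, the Boolean matrix view mentioned above is easy to make concrete. The following sketch (with illustrative contents) encodes relations between finite sets as Boolean matrices, with composition as Boolean matrix product, converse as transposition, and the lattice meet/join as elementwise operations.

```python
# Minimal sketch: concrete binary relations as Boolean matrices.
import numpy as np

R = np.array([[1, 0, 1],
              [0, 1, 0]], dtype=bool)      # R between A (|A|=2) and B (|B|=3)
S = np.array([[0, 1],
              [1, 1],
              [0, 0]], dtype=bool)         # S between B and C (|C|=2)
T = np.array([[0, 0, 1],
              [1, 1, 0]], dtype=bool)      # T between A and B, same type as R

composition = (R.astype(int) @ S.astype(int)) > 0   # R;S between A and C
converse = R.T                                      # R˘ between B and A
join, meet = R | T, R & T                           # lattice operations on A-B relations

print(composition.astype(int))   # [[0 1]
                                 #  [1 1]]
```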
Abstract:
The Robocup Rescue Simulation System (RCRSS) is a dynamic system of multi-agent interaction, simulating a large-scale urban disaster scenario. Teams of rescue agents are charged with the tasks of minimizing civilian casualties and infrastructure damage while competing against limitations on time, communication, and awareness. This thesis provides the first known attempt to apply Genetic Programming (GP) to the development of behaviours necessary to perform well in the RCRSS. Specifically, this thesis studies the suitability of GP to evolve the operational behaviours required of each type of rescue agent in the RCRSS. The system developed is evaluated in terms of the consistency with which expected solutions are the target of convergence, as well as by comparison to previous competition results. The results indicate that GP is capable of converging to some forms of expected behaviour, but that additional evolution of strategizing behaviours must be performed in order to become competitive. An enhancement to the standard GP algorithm is proposed which is shown to simplify the initial search space, allowing evolution to occur much more quickly. In addition, two forms of population are employed and compared in terms of their apparent effects on the evolution of control structures for intelligent rescue agents, as sketched below. The first is a single population in which each individual comprises three distinct trees for the respective control of three types of agents; the second is a set of three co-evolving subpopulations, one for each type of agent. Multiple populations of cooperating individuals appear to achieve higher proficiencies in training, but testing on unseen instances raises the issue of overfitting.
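A minimal sketch of the two population structures, with GP trees abbreviated to opaque placeholders and hypothetical agent-type names; the evolutionary loop and the RCRSS evaluation are not modeled.

```python
# Minimal sketch of the two population encodings (illustrative only).
import random

AGENT_TYPES = ("ambulance", "fire_brigade", "police")   # hypothetical names

def random_tree():
    """Placeholder for a randomly generated GP behaviour tree."""
    return object()

# Structure 1: one population; each individual carries three trees,
# one controlling each agent type.
single_pop = [{t: random_tree() for t in AGENT_TYPES} for _ in range(100)]

# Structure 2: three co-evolving subpopulations, one per agent type.
# A team is assembled by drawing one individual from each subpopulation
# for joint evaluation in the simulator.
subpops = {t: [random_tree() for _ in range(100)] for t in AGENT_TYPES}
team = {t: random.choice(subpops[t]) for t in AGENT_TYPES}
```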
Abstract:
Please consult the paper edition of this thesis, available on the 5th Floor of the Library at Call Number: Z 9999 E38 D56 1992.
Abstract:
The use of formal methods is increasingly common in software development, and type systems are the most successful formal method. The advancement of formal methods presents new challenges as well as new opportunities. One challenge is to ensure that a compiler preserves the semantics of programs, so that the properties guaranteed about the source code also hold for the executable code. This thesis presents a compiler that translates a higher-order functional language with polymorphism into a typed assembly language, whose main property is that type preservation is verified automatically by means of type annotations on the compiler's code. Our compiler implements the code transformations essential for a higher-order functional language, namely CPS conversion, closure conversion and code generation. We present the details of the strongly typed representations of the intermediate languages and the constraints they impose on the implementation of the code transformations. Our goal is to guarantee type preservation with a minimum of annotations, and without compromising the overall modularity and readability of the compiler's code. This goal is largely achieved in the treatment of the core features of the language (the "simple types"), in contrast to the treatment of polymorphism, which still requires substantial work to satisfy the type checker.
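As a small illustration of the first of these transformations, here is a sketch of a call-by-value (Plotkin-style) CPS conversion over an untyped lambda calculus; the thesis's strongly typed intermediate representations and type annotations are not reproduced.

```python
# Minimal sketch of call-by-value CPS conversion (untyped lambda calculus):
#   [[x]]     = \k. k x
#   [[\x.e]]  = \k. k (\x. [[e]])
#   [[e1 e2]] = \k. [[e1]] (\f. [[e2]] (\v. f v k))
from dataclasses import dataclass
import itertools

@dataclass
class Var:
    name: str

@dataclass
class Lam:
    param: str
    body: object

@dataclass
class App:
    fun: object
    arg: object

_counter = itertools.count()

def fresh(prefix):
    """Generate a fresh variable name."""
    return f"{prefix}{next(_counter)}"

def cps(e):
    k = fresh("k")
    if isinstance(e, Var):
        return Lam(k, App(Var(k), e))
    if isinstance(e, Lam):
        return Lam(k, App(Var(k), Lam(e.param, cps(e.body))))
    if isinstance(e, App):
        f, v = fresh("f"), fresh("v")
        return Lam(k, App(cps(e.fun),
                          Lam(f, App(cps(e.arg),
                                     Lam(v, App(App(Var(f), Var(v)), Var(k)))))))
    raise TypeError(f"unknown term: {e!r}")

print(cps(App(Lam("x", Var("x")), Var("y"))))   # CPS form of (\x. x) y
```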
Abstract:
Atrial fibrillation, the most common clinical arrhythmia, affects 2.3 million patients in North America. To study its mechanisms and potential therapies, animal models of atrial fibrillation have been developed. High-density epicardial electrical mapping is a well-established experimental technique for following atrial activity in vivo in response to electrical stimulation, remodelling, arrhythmias or modulation of the autonomic nervous system. In regions that are not accessible by epicardial mapping, non-contact endocardial mapping performed with a balloon-shaped catheter could provide a more complete description of atrial activity. In this study, an experiment in the dog was designed and analysed. Electro-anatomical reconstruction, epicardial mapping (103 electrodes), non-contact endocardial mapping (2048 virtual electrodes computed from a 64-channel balloon catheter) and direct-contact endocardial recordings were performed simultaneously. The recording systems were also simulated in a mathematical model of a canine right atrium. In the simulations and the experiments (after suppression of the atrioventricular node), activation maps were computed during sinus rhythm. Repolarization was assessed by measuring the area under the atrial T wave (ATa), a marker of the repolarization gradient. The results show an epicardial-endocardial correlation coefficient of 0.8 (experiment) and 0.96 (simulation) between activation maps, and a correlation coefficient of 0.57 (experiment) and 0.92 (simulation) between ATa values. Non-contact endocardial mapping appears to be a useful experimental tool for extracting information beyond the regions covered by the epicardial recording plaques.
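The map-comparison step reported above reduces to a Pearson correlation between activation times sampled at matched sites; a minimal sketch follows, with illustrative arrays rather than the study's data.

```python
# Minimal sketch: correlating epicardial and endocardial activation maps.
import numpy as np

epi_activation = np.array([12.0, 15.5, 19.0, 24.5, 31.0, 38.5])   # ms, matched sites
endo_activation = np.array([11.0, 16.0, 20.5, 23.0, 33.5, 37.0])  # ms, same sites

r = np.corrcoef(epi_activation, endo_activation)[0, 1]
print(f"epicardial-endocardial correlation: r = {r:.2f}")
```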