792 results for computer-based technology


Relevance: 90.00%

Abstract:

An Automatic Vehicle Location (AVL) system is a computer-based vehicle tracking system capable of determining a vehicle's location in real time. As a major technology of the Advanced Public Transportation System (APTS), AVL systems have been widely deployed by transit agencies for purposes such as real-time operation monitoring, computer-aided dispatching, and arrival time prediction. AVL systems make available a large amount of transit performance data that are valuable for transit performance management and planning. However, the difficulty of extracting useful information from the huge spatial-temporal database has hindered off-line applications of AVL data.

In this study, a data mining process comprising data integration, cluster analysis, and multiple regression is proposed. The AVL-generated data are first integrated into a Geographic Information System (GIS) platform. A model-based clustering method is employed to investigate the spatial and temporal patterns of transit travel speeds, which can readily be translated into travel times. Transit speed variations along route segments are identified. Transit service periods, such as the morning peak, mid-day, afternoon peak, and evening, are determined from analyses of travel speed variations at different times of day. Seasonal patterns of transit performance are investigated using analysis of variance (ANOVA). Travel speed models based on the clustered time-of-day intervals are developed using factors identified as having significant effects on speed in different periods.

Transit performance was found to vary across seasons and time-of-day periods, and the geographic location of a route segment also plays a role in this variation. The results of this research indicate that advanced data mining techniques have good potential to provide automated means of assisting transit agencies in service planning, scheduling, and operations control.
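The model-based clustering step can be sketched with a minimal two-component Gaussian mixture fitted by EM; the travel speeds below are simulated for illustration and are not taken from the study's AVL data:

```python
import numpy as np

# Hypothetical AVL-derived travel speeds (km/h): a slow (peak) and a
# fast (off-peak) regime, mimicking time-of-day clusters.
rng = np.random.default_rng(0)
speeds = np.concatenate([rng.normal(18, 2, 200),   # peak-period speeds
                         rng.normal(35, 3, 200)])  # off-peak speeds

# Minimal EM for a two-component 1-D Gaussian mixture, a simplified
# stand-in for full model-based clustering.
mu = np.array([speeds.min(), speeds.max()], dtype=float)
sigma = np.array([speeds.std(), speeds.std()])
pi = np.array([0.5, 0.5])
for _ in range(100):
    # E-step: posterior responsibility of each component for each speed
    dens = pi * np.exp(-0.5 * ((speeds[:, None] - mu) / sigma) ** 2) / sigma
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: update mixture weights, means, and standard deviations
    nk = resp.sum(axis=0)
    pi = nk / len(speeds)
    mu = (resp * speeds[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (speeds[:, None] - mu) ** 2).sum(axis=0) / nk)

print(sorted(mu.round(1)))  # estimated mean speed of each regime
```

A full model-based analysis would also select the number of components (for example by BIC); here it is fixed at two for brevity.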

Relevance: 90.00%

Abstract:

The need to provide computers with the ability to distinguish the affective state of their users is a major requirement for the practical implementation of affective computing concepts. This dissertation proposes the application of signal processing methods to physiological signals to extract features that learning pattern recognition systems can process to provide cues about a person's affective state. In particular, combining physiological information sensed non-invasively from a user's left hand with pupil diameter information from an eye-tracking system may give a computer an awareness of its user's affective responses in the course of human-computer interactions. In this study an integrated hardware-software setup was developed to achieve automatic assessment of the affective status of a computer user. A computer-based "Paced Stroop Test" was designed as a stimulus to elicit emotional stress in the subject during the experiment. Four signals were monitored and analyzed to differentiate affective states in the user: the Galvanic Skin Response (GSR), the Blood Volume Pulse (BVP), the Skin Temperature (ST), and the Pupil Diameter (PD). Several signal processing techniques were applied to the collected signals to extract their most relevant features, which were then analyzed with learning classification systems to accomplish the affective state identification. Three learning algorithms, Naïve Bayes, Decision Tree, and Support Vector Machine, were applied to this identification process and their levels of classification accuracy were compared. The results indicate that the monitored physiological signals do, in fact, have a strong correlation with changes in the emotional states of the experimental subjects. They also reveal that the inclusion of pupil diameter information significantly improved the performance of the emotion recognition system.
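As an illustration of the classification stage, here is a minimal Gaussian Naïve Bayes (one of the three algorithms compared) applied to invented feature vectors; the signal values and class separations are assumptions, not the dissertation's measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-trial feature vectors: [GSR level, BVP amplitude,
# skin temperature, pupil diameter]. "Stressed" trials are simulated
# with higher GSR and pupil diameter, in line with the findings above.
relaxed  = rng.normal([2.0, 1.0, 33.0, 3.5], 0.3, size=(100, 4))
stressed = rng.normal([3.5, 0.8, 32.5, 4.5], 0.3, size=(100, 4))
X = np.vstack([relaxed, stressed])
y = np.array([0] * 100 + [1] * 100)   # 0 = relaxed, 1 = stressed

def fit_gnb(X, y):
    """Per-class mean, variance, and prior for Gaussian Naive Bayes."""
    return {c: (X[y == c].mean(0), X[y == c].var(0), (y == c).mean())
            for c in np.unique(y)}

def predict_gnb(params, X):
    """Pick the class with the highest Gaussian log-likelihood + log-prior."""
    scores = [(-0.5 * (np.log(2 * np.pi * var) + (X - mu) ** 2 / var)).sum(1)
              + np.log(prior)
              for mu, var, prior in params.values()]
    return np.argmax(np.column_stack(scores), axis=1)

params = fit_gnb(X, y)
accuracy = (predict_gnb(params, X) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

On real physiological data the features would first pass through the signal-conditioning steps the abstract describes, and accuracy would be estimated on held-out trials rather than the training set.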

Relevance: 90.00%

Abstract:

Research on the mechanisms and processes underlying navigation has traditionally been limited by the practical problems of setting up and controlling navigation in a real-world setting. Thanks to advances in technology, a growing number of researchers are making use of computer-based virtual environments to draw inferences about real-world navigation. However, little research has been done on factors affecting human–computer interactions in navigation tasks. In this study female students completed a virtual route learning task and filled out a battery of questionnaires, which determined levels of computer experience, wayfinding anxiety, neuroticism, extraversion, psychoticism and immersive tendencies, as well as their preference for a route or survey strategy. Scores on personality traits and individual differences were then correlated with the time taken to complete the navigation task, the length of path travelled, the velocity of the virtual walk and the number of errors. Navigation performance was significantly influenced by wayfinding anxiety, psychoticism, involvement and overall immersive tendencies, and was improved in those participants who adopted a survey strategy. In other words, navigation in virtual environments is affected not only by navigational strategy but also by an individual's personality and other factors such as level of experience with computers. An understanding of these differences is crucial before performance in virtual environments can be generalised to real-world navigational performance.
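The correlational analysis can be illustrated in a few lines; the anxiety scores and completion times below are simulated under the assumption, consistent with the findings, that higher wayfinding anxiety goes with slower navigation:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: wayfinding-anxiety questionnaire scores and the
# time (s) each participant needed to finish the virtual route task.
anxiety = rng.normal(50, 10, 40)
completion_time = 120 + 2.0 * anxiety + rng.normal(0, 15, 40)

# Pearson correlation between a trait score and a performance measure,
# the kind of analysis used to relate the two in the study above.
r = np.corrcoef(anxiety, completion_time)[0, 1]
print(f"r = {r:.2f}")  # positive: higher anxiety, slower navigation
```

The same call would be repeated for each trait-measure pair (path length, velocity, errors), typically with a multiple-comparison correction.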

Relevance: 90.00%

Abstract:

If patients at risk of admission or readmission to hospital or other forms of care could be identified and offered suitable early interventions, their lives and long-term health might be improved by reducing the chances of future admission or readmission to care and, hopefully, the cost of their care reduced. Considerable work has been carried out in this area, especially in the USA and the UK, leading for instance to the development of tools such as PARR, PARR-30, and the Combined Predictive Model for prediction of emergency readmission or admission to acute care. Here we perform a structured review of the academic and grey literature on predictive risk tools for social care utilisation, as well as admission and readmission to general hospitals and psychiatric hospitals. This is the first phase of a project in partnership with Docobo Ltd and funded by Innovate UK, in which we seek to develop novel predictive risk tools and dashboards to assist commissioners in Clinical Commissioning Groups with the triangulation of the intelligence available from routinely collected data, to optimise integrated care and better understand the complex needs of individuals.
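Tools such as PARR are, at their core, regression models that turn routinely collected variables into an admission-risk score. A minimal sketch with invented features and coefficients (not the actual PARR variables or weights):

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented patient features: [age/100, prior emergency admissions,
# number of long-term conditions] -- the kind of routinely collected
# variables such tools combine into a risk score.
n = 500
X = np.column_stack([rng.uniform(0.2, 0.9, n),
                     rng.poisson(1.0, n),
                     rng.poisson(2.0, n)]).astype(float)
true_w = np.array([1.5, 0.8, 0.5])            # assumed effect sizes
p_true = 1 / (1 + np.exp(-(X @ true_w - 2.5)))
y = rng.binomial(1, p_true)                   # 1 = readmitted

# Logistic regression fitted by plain gradient descent: a minimal
# sketch of one common way such predictive risk models are built.
w, b = np.zeros(3), 0.0
for _ in range(2000):
    pred = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.5 * X.T @ (pred - y) / n
    b -= 0.5 * (pred - y).mean()

risk = 1 / (1 + np.exp(-(X @ w + b)))
print(f"base rate {y.mean():.2f}, mean predicted risk {risk.mean():.2f}")
```

Deployed tools validate such models on held-out cohorts and report discrimination (e.g., a ROC curve) rather than training fit.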

Relevance: 90.00%

Abstract:

Computer game technology is poised to make a significant impact on the way our youngsters will learn. Our youngsters are 'Digital Natives', immersed in digital technologies, especially computer games, and they expect to use these technologies in learning contexts. This expectation, and our response as educators, may change classroom practice and inform curriculum developments. This chapter approaches these issues head on. Starting from a review of current educational issues and an evaluation of educational theory and instructional design principles, a new theoretical approach to the construction of "Educational Immersive Environments" (EIEs) is proposed. Elements of this approach are applied to the development of an EIE to support Literacy Education in UK Primary Schools, and an evaluation of a trial within a UK Primary School is discussed. Conclusions from both the theoretical development and the evaluation suggest how future teacher-practitioners may embrace both the technology and our approach to develop their own learning resources.

Relevance: 90.00%

Abstract:

The usefulness of computers in schools and training has been undisputed for some years now. There is currently disagreement, however, about which tasks computers can take on independently. When the assumption of teaching functions by computer-based tutoring systems is assessed, shortcomings frequently have to be noted. The aim of the present work is to identify, starting from current practical implementations of computer-based tutoring systems, different classes of central teaching competencies (student modelling, subject knowledge, and instructional activities in the narrower sense). Within each class, the global capabilities of the tutoring systems and the necessary, complementary activities of human tutors are determined. The resulting classification scheme allows both the categorization of typical tutoring systems and the identification of specific competencies that should receive greater attention in future teacher and trainer training. (DIPF/Orig.)

Relevance: 90.00%

Abstract:

The surge of interest in graphene, as epitomized by the 2010 Nobel Prize in Physics, is attributed to its extraordinary properties. Graphene is ultrathin, mechanically tough, and has amenable surface chemistry. These features make graphene and graphene-based nanostructures ideal candidates for molecular mass manipulation. Controllable and programmable molecular mass manipulation is crucial for enabling future graphene-based applications, yet it is challenging to achieve. This dissertation studies several aspects of molecular mass manipulation: mass transportation, patterning, and storage. For molecular mass transportation, two methods based on carbon nanoscrolls are demonstrated to be effective: torsional-buckling-instability-assisted transportation and surface-energy-induced radial shrinkage. To achieve more controllable transportation, a fundamental law of directional transport of molecular mass by straining the basal graphene is studied. For molecular mass patterning, we reveal a barrier effect of line defects in graphene, which can confine and pattern molecules within a domain of desirable geometry; such a strategy makes controllable patterning feasible for various types of molecules. For molecular mass storage, we propose a novel partially hydrogenated bilayer graphene structure with a large capacity for mass uptake. Mass release can be achieved by simply stretching the structure, so uptake and release are reversible; this kind of structure is crucial for enabling hydrogen-fuel-based technology. Lastly, spontaneous nanofluidic channel formation enabled by patterned hydrogenation is studied; this novel strategy enables programmable channel formation with pre-defined complex geometry.

Relevance: 90.00%

Abstract:

Part 11: Reference and Conceptual Models

Relevance: 90.00%

Abstract:

In knowledge technology work, as expressed by the scope of this conference, there are a number of communities, each uncovering new methods, theories, and practices. The Library and Information Science (LIS) community is one such community. This community, through tradition and innovation, theories and practice, organizes knowledge and develops knowledge technologies formed by iterative research hewn to the values of equal access and discovery for all. The Information Modeling community is another contributor to knowledge technologies. It concerns itself with the construction of symbolic models that capture the meaning of information and organize it in ways that are computer-based but human-understandable. A recent paper that examines certain assumptions in information modeling builds a bridge between these two communities, offering a forum for a discussion of common aims from a common perspective. In a June 2000 article, Parsons and Wand separate classes from instances in information modeling in order to free instances from what they call the "tyranny" of classes. They attribute a number of problems in information modeling to inherent classification, that is, to disregard for the fact that instances can be conceptualized independently of any class assignment. By separating instances from classes, Parsons and Wand strike a sonorous chord with classification theory as understood in LIS. In the practice community and in the publications of LIS, faceted classification has shifted the paradigm of knowledge organization theory in the twentieth century. Here, with the proposal of inherent classification and the resulting layered information modeling, a clear line joins the LIS classification theory community and the information modeling community. Both communities have their eyes turned toward networked resource discovery, and with this conceptual conjunction a new paradigmatic conversation can take place.
Parsons and Wand propose that the layered information model can facilitate schema integration, schema evolution, and interoperability. These three spheres in information modeling have their own connotations, but they are not distant from the aims of classification research in LIS. In this new conceptual conjunction, established by Parsons and Wand, information modeling through the layered information model can expand the horizons of classification theory beyond LIS, promoting a cross-fertilization of ideas on the interoperability of subject access tools such as classification schemes, thesauri, taxonomies, and ontologies. This paper examines the common ground between the layered information model and faceted classification, establishing a vocabulary and outlining some common principles. It then turns to the issue of schema, the horizons of conventional classification, and the differences between Information Modeling and Library and Information Science. Finally, a framework is proposed that deploys an interpretation of the layered information modeling approach in a knowledge technologies context. In order to design subject access systems that will integrate, evolve, and interoperate in a networked environment, knowledge organization specialists must consider a semantic class independence like the one Parsons and Wand propose for information modeling.
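To make the instance/class separation concrete, here is a small illustrative sketch (in Python, with invented data; Parsons and Wand's own formalism is more general): instances exist as bare property bearers, and classes are layered on top as predicates over properties, so classification never constrains what an instance can record.

```python
# Layer 1: instances, conceptualized independently of any class.
# Each instance is just a bearer of whatever properties it has.
instances = [
    {"id": 1, "title": "Intro to Classification", "shelved": True},
    {"id": 2, "title": "Facet Analysis", "format": "ebook"},
]

# Layer 2: classes defined afterwards as predicates over properties
# (views over instances). An instance may satisfy several, or none.
classes = {
    "PhysicalItem": lambda i: i.get("shelved", False),
    "DigitalItem": lambda i: i.get("format") == "ebook",
}

def members(cls):
    """Return the ids of the instances that satisfy a class predicate."""
    return [i["id"] for i in instances if classes[cls](i)]

print(members("PhysicalItem"))  # [1]
print(members("DigitalItem"))   # [2]
```

Adding a new class here is a purely schema-level change: no instance needs to be touched, which is the schema-evolution benefit the layered model claims.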

Relevance: 90.00%

Abstract:

Computational journalism is an emerging practice that consists of applying methods and tools borrowed from computer science to the gathering, processing, analysis, or online presentation of information, while respecting the fundamental values of journalism (Diakopoulos, 2010). This research characterizes computational journalism as practised in Quebec in 2015 through a series of semi-structured interviews with 30 participants: practitioners, of course, but also managers of the main news organizations where computational journalism is or has been practised, as well as computing professionals, to provide an outside perspective. It highlights, in particular, two attitudes towards this practice, expressed through boundary work. On one side are proponents of a segregation between journalism and computing; on the other, advocates of a hybridization of the two disciplines. For the latter, computational journalism occupies a distinct professional territory at the boundary between journalism and computing. This research describes in detail the motivations, skills, and activities of Quebec's computational journalists and argues that they are contributing to a certain "re-professionalization" of journalism. Keywords: journalism; computing; technology; computational journalism; data journalism; datajournalism; computer-assisted reporting; professionalism; professional identity; professional skills; boundary work; innovation in journalism; Canada; Quebec

Relevance: 90.00%

Abstract:

At the intersection of biology, chemistry, and engineering, biosensors are a multidisciplinary innovation that provides a cost-effective alternative to traditional laboratory techniques. Owing to their advantages, biosensors are used in medical diagnostics, environmental monitoring, food safety, and many other fields. The first part of the thesis surveys the state of the art of paper-based immunosensors with bioluminescent (BL) and chemiluminescent (CL) detection. The use of biospecific assays combined with CL detection and paper-based technology offers an optimal approach to creating analytical tools for on-site applications, and we have focused on the specific areas that need further attention to ensure the future practical implementation of these methods in routine analyses. The subsequent part of the thesis addresses the development of an autonomous lab-on-chip platform for performing chemiluminescence-based bioassays in the space environment, exploiting a CubeSat platform for astrobiological investigations. An origami-inspired microfluidic paper-based analytical device was developed with the purpose of assessing its performance in space and evaluating its functionality and the resilience of the (bio)molecules when exposed to a radiation-rich environment. Subsequently, we designed a paper-based assay to detect traces of ovalbumin in food samples, creating a user-friendly immunosensing platform. To this purpose, we developed an origami device that exploits a competitive immunoassay coupled with chemiluminescence detection and magnetic microbeads used to immobilize ovalbumin on paper. Finally, with the aim of exploring the use of biomimetic materials, a hydrogel-based chemiluminescence biosensor for the detection of H2O2 and glucose was developed. A guanosine hydrogel was prepared and loaded with luminol and hemin, mimicking DNAzyme activity.

Subsequently, the hydrogel was modified by incorporating the glucose oxidase enzyme to enable glucose biosensing. The emitted photons were detected using a portable device equipped with a smartphone CMOS (complementary metal-oxide-semiconductor) camera for CL emission detection.
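As an illustration of how such a chemiluminescent readout is typically quantified, the sketch below fits a linear calibration curve to invented intensity values (not measurements from the thesis) and inverts it for an unknown sample:

```python
import numpy as np

# Hypothetical calibration data for a CL glucose assay: glucose
# concentration (mM) vs measured light intensity (arbitrary units).
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
intensity = np.array([12.0, 118.0, 230.0, 445.0, 905.0, 1790.0])

# Least-squares linear calibration: intensity = slope * conc + intercept
slope, intercept = np.polyfit(conc, intensity, 1)

# Quantify an unknown sample from its measured CL intensity.
unknown_intensity = 560.0
unknown_conc = (unknown_intensity - intercept) / slope
print(f"estimated glucose: {unknown_conc:.2f} mM")
```

Real assays also check the linear range and limit of detection, and would average replicate intensity readings from the smartphone camera.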

Relevance: 80.00%

Abstract:

Despite the frequent use of stepping motors in robotics, automation, and a variety of precision instruments, they can hardly be found in rotational viscometers. This paper proposes the use of a stepping motor to drive a conventional constant-shear-rate laboratory rotational viscometer, avoiding the use of a velocity sensor and a gearbox and thus simplifying the instrument design. To investigate this driving technique, a commercial rotating viscometer has been adapted to be driven by a bipolar stepping motor, which is controlled via a personal computer. Special circuitry has been added to microstep the stepping motor at selectable step sizes and to condition the torque signal. Tests have been carried out using the prototype to produce flow curves for two standard Newtonian fluids (920 and 12 560 mPa·s, both at 25 °C). The flow curves have been obtained by employing several distinct microstep sizes within the shear rate range of 50-500 s⁻¹. The results indicate the feasibility of the proposed driving technique.
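The driving principle can be sketched numerically: with a stepping motor the spindle speed follows directly from the microstep pulse rate, which is why no velocity sensor or gearbox is needed; torque and geometry then give the viscosity. All numbers below (motor, geometry, torque) are invented for illustration, not taken from the paper:

```python
import math

# Illustrative motor: 1.8-degree full steps, 16 microsteps per step.
full_step_deg = 1.8
microsteps = 16
pulse_rate_hz = 3200          # microstep pulses per second

# Spindle angular velocity is set purely by the pulse rate.
omega = pulse_rate_hz * math.radians(full_step_deg) / microsteps  # rad/s

# Concentric-cylinder (Couette) relations for a Newtonian fluid;
# the geometry and torque values are hypothetical.
r_bob, r_cup, height = 0.012, 0.013, 0.03            # metres
shear_rate = 2 * omega * r_cup**2 / (r_cup**2 - r_bob**2)  # 1/s, at bob
torque = 2.1e-4                                      # N*m, torque signal
shear_stress = torque / (2 * math.pi * r_bob**2 * height)  # Pa, at bob
viscosity_mpas = 1000 * shear_stress / shear_rate
print(f"omega = {omega:.2f} rad/s, viscosity = {viscosity_mpas:.0f} mPa·s")
```

Changing the microstep size or pulse rate sweeps the shear rate, which is how a flow curve over 50-500 s⁻¹ would be traced.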

Relevance: 80.00%

Abstract:

In mapping the evolutionary process of online news and the socio-cultural factors determining this development, this paper has a dual purpose. First, in reworking the definition of "online communication", it argues that despite its seemingly sudden emergence in the 1990s, the history of online news started in the early days of the telegraph and ran through the development of the telephone and the fax machine before becoming computer-based in the 1980s and Web-based in the 1990s. Second, merging macro-perspectives on the dynamics of media evolution by DeFleur and Ball-Rokeach (1989) and Winston (1998), the paper consolidates a critical point for thinking about new media development: something technically feasible will not always be socially accepted and/or demanded. From a producer-centric perspective, the birth and development of pre-Web online news forms were more or less driven by the traditional media's sometimes excessive hype about the power of new technologies. However, placing such an emphasis on technological potential at the expense of its social conditions can be not only misleading but also detrimental to the development of new media, including the potential of today's online news.

Relevance: 80.00%

Abstract:

This project concerns the implementation of microcomputing training processes carried out by Dataprev using a self-instructional methodology, CBT (Computer-Based Training). The methodology was adopted in the TREINABEM programme, which made considerable cost savings possible and fostered a cultural shift towards autonomous, continuous learning.

Relevance: 80.00%

Abstract:

The present work reports on practical cooperation between two universities, one in Hungary and one in Portugal. Students from Portugal remotely access an experimental facility that is physically located in Hungary. The cooperation between these higher education establishments allowed the development and testing of a Remote Laboratory at BME. This paper reports on the characteristics and initial testing of the Thermocouples Rise Time Measurement System and provides information on its development and on students' feedback.
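The quantity the remote experiment measures can be illustrated on a simulated first-order thermocouple response; the time constant below is an assumption, not the real sensor's:

```python
import numpy as np

# Simulated thermocouple step response: first-order system with a
# hypothetical time constant tau, standing in for the sampled signal.
tau = 0.8                              # seconds (assumed)
t = np.linspace(0, 10, 2001)           # sampling instants, seconds
temp = 1 - np.exp(-t / tau)            # normalized response, 0 -> 1

# 10%-90% rise time: the interval between the first crossings of the
# 10% and 90% levels of the final value.
t10 = t[np.argmax(temp >= 0.10)]
t90 = t[np.argmax(temp >= 0.90)]
rise_time = t90 - t10
print(f"rise time: {rise_time:.2f} s")  # analytically tau * ln(9)
```

On the real remotely sampled data the same threshold-crossing logic applies, with the final value estimated from the settled portion of the record.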