792 results for computer-based technology
Abstract:
Technological advances in recent decades have transformed the external resources of vocational counseling, occupational information, and the assessment of clients. Most computer systems follow a behaviorist-cognitive approach; however, the use of vocational counseling software is not exclusive to any one conceptual approach. Computers are introduced into education from primary school onwards, and counselors and other educators are expected to use these systems. The attitudes of counselors range from enthusiastic acceptance to complete refusal, and many counselors fear that computers will replace them. An underlying theory holds that counseling is based on the counselor-client interaction; a computer-client interaction alone cannot be considered vocational counseling. Counseling has five basic aims: prevention, assistance, education and development, service to diverse groups, and research. The most relevant trends in computer-based counseling are computer-based tests and questionnaires, adaptive development, computerized information, vocational counseling systems, and research. The basic aims and the potential role of computers in achieving them are discussed. Today's vocational counselors can use computer technology to link the past of the profession to its promising future. In view of these premises we have developed two computer systems that assist the vocational counseling process: the "Professional Interests Questionnaire, Computer Version" and the "Computer-based System of Vocational Counseling".
Abstract:
EPICS (Experimental Physics and Industrial Control System) is a set of software tools and applications that provide a software infrastructure for building distributed data acquisition and control systems. The use of such systems is increasing in large physics experiments such as ITER, ESS, and FREIA, where advanced data acquisition systems based on FPGA technology such as FlexRIO are being used ever more frequently. In the particular case of ITER (International Thermonuclear Experimental Reactor), the instrumentation and control system is supported by CCS (CODAC Core System), based on the RHEL (Red Hat Enterprise Linux) operating system, and by the plant design specifications, in which every CCS element is defined, whether hardware, firmware, or software. This final degree project applies the methodology proposed in "Implementation of Intelligent Data Acquisition Systems for Fusion Experiments using EPICS and FlexRIO Technology" (Sanz et al. [1]). The final objective is to provide a document describing the process followed, together with the source code of the data acquisition system produced. The proposed methodology comprises two distinct stages. The first consists of modelling the hardware with graphical design tools such as LabVIEW FPGA; the model is then implemented in the FlexRIO device. In the second stage the design cycle is completed by creating an EPICS controller that manages the device through a generic device support layer named NDS (Nominal Device Support). This layer integrates the developed data acquisition system into CCS as an EPICS interface to the system. The use of FlexRIO technology entails the use of LabVIEW and LabVIEW FPGA as the programming and hardware description languages, respectively.
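As a purely illustrative sketch (not part of the project deliverables), the fragment below shows how a client might interact with process variables (PVs) published by the EPICS layer of such a data acquisition device; the PV names and the use of the pyepics package are assumptions for the example.

    # Hedged sketch: reading a hypothetical waveform PV published by the
    # EPICS IOC of a FlexRIO-based acquisition device (requires pyepics).
    from epics import caget, caput

    PV_RATE = "DAQ:CH0:SampleRate"   # hypothetical configuration PV
    PV_WAVE = "DAQ:CH0:Waveform"     # hypothetical waveform PV

    caput(PV_RATE, 10000.0)          # request a 10 kS/s acquisition rate
    data = caget(PV_WAVE)            # fetch the most recent acquired block
    if data is not None:
        print(f"received {len(data)} samples")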
Abstract:
The role of technology management in achieving improved manufacturing performance has been receiving increased attention as enterprises become more exposed to competition from around the world. In the modern market for manufactured goods the demand is now for more product variety, better quality, shorter delivery and greater flexibility, while the financial and environmental cost of resources has become an urgent concern to manufacturing managers. This issue of the International Journal of Technology Management addresses the question of how the diffusion, implementation and management of technology can improve the performance of manufacturing industries. The authors come from a large number of different countries and their contributions cover a wide range of topics within this general theme. Some papers are conceptual; others report on research carried out in a range of industries including steel production, iron founding, electronics, robotics, machinery, precision engineering, metal working and motor manufacture. In some cases they describe situations in specific countries. Several are based on presentations made at the UK Operations Management Association's Sixth International Conference, held at Aston University, where the conference theme was 'Achieving Competitive Edge: Getting Ahead Through Technology and People'. The first two papers deal with questions of advanced manufacturing technology implementation and management. Firstly, Beatty describes a three-year longitudinal field study carried out in ten Canadian manufacturing companies using CAD/CAM and CIM systems. Her findings relate to speed of implementation, choice of system type, the role of individuals in implementation, and organization and job design. This is followed by a paper by Bessant, in which he argues that a more strategic approach should be taken towards the management of technology in the 1990s and beyond. The paper also considers the capabilities necessary to deploy advanced manufacturing technology as a strategic resource and the way such capabilities might be developed within the firm. These two papers, which deal largely with the implementation of hardware, are supplemented by Samson and Sohal's contribution, in which they argue that a much wider perspective should be adopted, based on a new approach to manufacturing strategy formulation. Technology transfer is the topic of the following two papers. Pohlen again takes the case of advanced manufacturing technology and reports on his research into the factors contributing to the successful realisation of AMT transfer. The paper by Lee then provides a more detailed account of technology transfer in the foundry industry. Using a case study of a firm which has implemented a number of transferred innovations, a model is illustrated in which the 'performance gap' can be identified and closed. The diffusion of technology is addressed in the next two papers. In the first of these, by Lowe and Sim, the managerial technologies of 'Just in Time' and 'Manufacturing Resource Planning' (MRP II) are examined. A study is described from which a number of factors are found to influence the adoption process, including rate of diffusion and size. Dahlin then considers the case of a specific item of hardware technology, the industrial robot. Her paper reviews the history of robot diffusion since the early 1960s and then attempts to predict how the industry will develop in the future.
The following two papers deal with the future of manufacturing in a more general sense. The future implementation of advanced manufacturing technology is explored by de Haan and Peters, who describe the results of their Dutch Delphi forecasting study conducted among a panel of experts including scientists, consultants, users and suppliers of AMT. Busby and Fan then consider a type of organisational model, 'the extended manufacturing enterprise', which would represent a distinct alternative to pure market-led and command structures by exploiting the shared knowledge of suppliers and customers. The three country-based papers consider strategic issues relating to manufacturing technology. In a paper based on investigations conducted in China, He, Liff and Steward report their findings from strategy analyses carried out in the steel and watch industries with a view to assessing technology needs and organizational change requirements. This is followed by Tang and Nam's paper, which examines the machinery industry in Korea and its emerging importance as a key sector in the Korean economy. In his paper focusing on Venezuela, Ernst then considers the particular problem of how that country can address falling oil revenues. He sees manufacturing as an important contributor to Venezuela's future economy and proposes a means whereby government and private enterprise can co-operate in developing the manufacturing sector. The last six papers all deal with specific topics relating to the management of manufacturing. Firstly, Youssef looks at the question of manufacturing flexibility, introducing and testing a conceptual model that relates computer-based technologies to flexibility. Dangerfield's paper, which follows, is based on research conducted in the steel industry. He considers the question of scale and proposes a modelling approach for determining the plant configuration necessary to meet market demand. Engstrom presents the results of a detailed investigation into the need to reorganise material flow where group assembly of products has been adopted. Sherwood, Guerrier and Dale then report the findings of a study into the effectiveness of Quality Circle implementation. Stillwagon and Burns consider how manufacturing competitiveness can be improved in individual firms, describing how the application of 'human performance engineering' can be used to motivate individual performance as well as to integrate organizational goals. Finally, Sohal, Lewis and Samson describe, using a case study example, how just-in-time control can be applied within the context of computer numerically controlled flexible machining lines. The papers in this issue of the International Journal of Technology Management cover a wide range of topics relating to the general question of improving manufacturing performance through the dissemination, implementation and management of technology. Although they differ markedly in content and approach, they share the collective aim of addressing the concepts, principles and practices which provide a better understanding of the technology of manufacturing and assist in achieving and maintaining a competitive edge.
Abstract:
The No Child Left Behind Act of 2001 (NCLB) brought many significant changes to American schools, including greater access to technology. Through an extensive literature review of the relationship between technology leadership and student achievement, five major themes emerged from the data that support the need for more effective computer-based education in schools.
Abstract:
The purpose of this research was to investigate the relationship of computer anxiety to selected demographic variables: learning styles, age, gender, ethnicity, teaching/professional areas, educational level, and school types among vocational-technical educators. The subjects (n = 202) were randomly selected vocational-technical educators from the Dade County Public School System, Florida, stratified across teaching/professional areas. All subjects received the same survey package in the spring of 1996. Subjects self-reported their learning style and level of computer anxiety by completing Kolb's Learning Style Inventory (LSI) and Oetting's Computer Anxiety Scale (COMPAS, Short Form). Subjects' general demographic information and their experience with computers were collected through a self-reported Participant Inventory Form. The distribution of scores suggested that some educators (25%) experienced some overall computer anxiety. There were significant correlations between computer-related experience, as indicated by self-ranked computer competence and computer-based training, and computer anxiety. One-way analyses of variance (ANOVA) indicated no significant differences between computer anxiety and/or computer-related experiences and learning style, age, or ethnicity. There were significant differences between educational level, teaching area, school type, and computer anxiety and/or computer-related experiences. T-tests indicated significant differences between gender and computer-related experiences; however, there was no difference between gender and computer anxiety. Analyses of covariance (ANCOVA) were performed for each independent variable on computer anxiety, with computer-related experiences (self-ranked computer competence and computer-based training) as the respective covariates. There were significant main effects of educational level and school type on computer anxiety. All other variables had no significant effect on computer anxiety. ANCOVA also revealed that the effect of learning style on computer anxiety varied notably. All analyses were conducted at the .05 level of significance.
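As a hedged illustration of the analysis design described above (a sketch with hypothetical file and variable names, not the study's actual data or code), an ANCOVA of computer anxiety on school type with the computer-experience measures as covariates could be expressed in Python as:

    # Sketch of an ANCOVA with computer-experience covariates; the CSV file
    # and column names are hypothetical placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    df = pd.read_csv("educators.csv")  # one row per surveyed educator

    model = smf.ols(
        "anxiety ~ C(school_type) + competence + cbt_experience", data=df
    ).fit()
    print(anova_lm(model, typ=2))      # Type II tests, alpha = .05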
Abstract:
Computer game technology produces compelling ‘immersive environments’ where our digitally-native youth play and explore. Players absorb visual, auditory and other signs and process these in real time, making rapid choices on how to move through the game-space to experience ‘meaningful play’. How can immersive environments be designed to elicit perception and understanding of signs? In this paper we explore game design and gameplay from a semiotic perspective, focusing on the creation of meaning for players as they play the game. We propose a theory of game design based on semiotics.
Abstract:
Computer games are significant since they embody our youngsters' engagement with contemporary culture, including both play and education. These games rely heavily on visuals, systems of signs and expression based on concepts and principles of Art and Architecture. We are researching a new genre of computer games, 'Educational Immersive Environments' (EIEs), to provide educational materials suitable for the school classroom. Close collaboration with subject teachers is necessary, but we feel a specific need to engage with the practicing artist, the art theoretician and historian. Our EIEs are loaded with multimedia (but especially visual) signs which act to direct the learner and provide the 'game-play' experience, forming semiotic systems. We suggest the hypothesis that computer games are a space of deconstruction and reconstruction (DeRe): when players enter the game, their physical world and their culture are torn apart; they move in a semiotic system which serves to reconstruct an alternate reality where disbelief is suspended. The semiotic system draws heavily on visuals which direct the players' interactions and produce motivating gameplay. These can establish a reconstructed culture and an emerging game narrative. We have recently tested our hypothesis and used it in developing design principles for computer game designers. Yet there are outstanding issues concerning the nature of the visuals used in computer games, and so questions for contemporary artists. Currently, the computer game industry employs artists in a 'classical' role in the production of concept sketches, storyboards and 3D content. But this is based on a specification from the client which restricts the artist's intellectual freedom. Our DeRe hypothesis places the artist at the generative centre, to inform the game designer how art may shape our DeRe semiotic spaces. This must of course begin with the artist's understanding of DeRe at a time when our 'identities are becoming increasingly fractured, networked, virtualized and distributed'. We hope to persuade artists to engage with the medium of computer game technology to explore these issues. In particular, we pose several questions to the artist: (i) How can particular 'periods' in art history be used to inform the design of computer games? (ii) How can specific artistic elements or devices be used to design 'signs' to guide the player through the game? (iii) How can visual material be integrated with other semiotic strata such as text and audio?
Abstract:
The power of computer game technology is currently being harnessed to produce "serious games". These "games" are targeted at the education and training marketplace and employ key game-engine components, such as the graphics and physics engines, to produce realistic "digital-world" simulations of the real "physical world". Many approaches are driven by the technology and often lack consideration of a firm pedagogical underpinning. The authors believe that analysis and deployment of the technological and pedagogical dimensions should occur together, with the pedagogical dimension providing the lead. This chapter explores the relationship between these two dimensions and how "pedagogy may inform the use of technology", that is, how various learning theories may be mapped onto the affordances of computer game engines. Autonomous and collaborative learning approaches are discussed. The design of a serious game is broken down into spatial and temporal elements. The spatial dimension is related to theories of knowledge structures, especially "concept maps". The temporal dimension is related to "experiential learning", especially the approach of Kolb. The multi-player aspect of serious games is related to theories of "collaborative learning", which is broken down into a discussion of "discourse" versus "dialogue". Several general guiding principles are explored, such as the use of "metaphor" (including metaphors of space, embodiment, systems thinking, the internet and emergence). The topological design of a serious game is also highlighted. The discussion of pedagogy is related to various serious games we have recently produced and researched, and is presented in the hope of informing the "serious game community".
Abstract:
Creative ways of utilising renewable energy sources for electricity generation, especially in remote areas and particularly in countries dependent on imported energy, while increasing energy security and reducing the cost of such isolated off-grid systems, have become an urgent necessity for the effective strategic planning of energy systems. The aim of this research project was to design and implement a new decision support framework for the optimal design of hybrid micro-grids considering different types of technologies, where the design objective is to minimise the total cost of the hybrid micro-grid while satisfying the required electric demand. A comprehensive review of existing analytical decision support tools and of the literature on hybrid power systems (HPS) identified the gaps and the necessary conceptual parts of an analytical decision support framework. As a result, this research proposes an Iterative Analytical Design Framework (IADF) and its implementation for the optimal design of an off-grid renewable energy based hybrid smart micro-grid (OGREH-SμG) with intra- and inter-grid (μG2μG and μG2G) synchronization capabilities and a novel storage technique. The modelling, design and simulations were conducted using the HOMER Energy and MATLAB/Simulink energy planning and design software platforms. The design, experimental proof of concept, verification and simulation of a new storage concept incorporating a Hydrogen Peroxide (H2O2) fuel cell are also reported. The implementation of the smart components, consisting of a Raspberry Pi devised and programmed for the semi-smart energy management framework (a novel control strategy including synchronization capabilities) of the OGREH-SμG, is also detailed. The hybrid μG was designed and implemented as a case study for the Bayir/Jordan area. This research provides an alternative decision support tool for solving renewable energy integration: finding the optimal number, type and size of components to configure the hybrid μG. In addition, this research formulates a linear cost function to mathematically verify the computer-based simulations and fine-tune the solutions in the iterative framework, and concludes that such solutions converge to a correct optimal approximation when the properties of the problem are considered. The investigation demonstrates that the reported OGREH-SμG design, which incorporates wind and solar powered generation complemented by batteries, two fuel cell units and a diesel generator, and which is able to synchronize with other micro-grids, is an effective and optimal way of utilising indigenous renewable energy to electrify developing countries with fewer resources in a sustainable way, with minimum impact on the environment and reductions in GHG emissions. The dissertation concludes with suggested future extensions to this work.
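To illustrate how a linear cost function of this kind can be verified numerically (a minimal sketch with invented coefficients, not the study's actual model or data), the sizing problem can be posed as a small linear program:

    # Hedged sketch: minimise total cost of hybrid micro-grid components
    # subject to covering a fixed demand; all numbers are hypothetical.
    from scipy.optimize import linprog

    # Decision variables: installed capacity (kW) of [wind, PV, fuel cell, diesel].
    cost = [1200.0, 900.0, 3000.0, 400.0]     # assumed cost per kW
    availability = [0.35, 0.25, 0.90, 0.95]   # assumed average availability

    # linprog solves min c@x with A_ub @ x <= b_ub, so the demand-coverage
    # constraint (availability-weighted capacity >= 150 kW) is negated.
    A_ub = [[-a for a in availability]]
    b_ub = [-150.0]

    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
    print(res.x, res.fun)                     # optimal capacities and total cost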
Abstract:
Introduction: Current advances in frame modeling and computer software allow stereotactic procedures to be performed with great accuracy and minimal risk of neural tissue or vascular injury. Case Report: In this report we combine a previously described minimally invasive stereotactic technique with state-of-the-art 3D computer guidance technology to successfully treat a 55-year-old patient with an arachnoid cyst obstructing the aqueduct of Sylvius. We provide detailed technical information and discuss how this technique overcomes previous limitations on stereotactic manipulation of the aqueductal region. We further discuss current advances in neuroendoscopy for treating obstructive hydrocephalus and make comparisons with our proposed technique. Conclusion: We advocate that this technique is not only capable of treating this pathology but also has the advantage of enabling the reestablishment of physiological CSF flow, thus preventing future brainstem compression by cyst enlargement.
Abstract:
The personal computer revolution has resulted in the widespread availability of low-cost image analysis hardware. At the same time, new graphic file formats have made it possible to handle and display images at resolutions beyond the capability of the human eye. Consequently, there has been a significant research effort in recent years aimed at making use of these hardware and software technologies for flotation plant monitoring. Computer-based vision technology is now moving out of the research laboratory and into the plant to become a useful means of monitoring and controlling flotation performance at the cell level. This paper discusses the metallurgical parameters that influence surface froth appearance and examines the progress that has been made in image analysis of flotation froths. The texture spectrum and pixel tracing techniques developed at the Julius Kruttschnitt Mineral Research Centre are described in detail. The commercial implementation, JKFrothCam, is one of a number of froth image analysis systems now reaching maturity. In plants where it is installed, JKFrothCam has shown a number of performance benefits. Flotation runs more consistently, meeting product specifications while maintaining high recoveries. The system has also shown secondary benefits in that reagent costs have been significantly reduced as a result of improved flotation control. (C) 2002 Elsevier Science B.V. All rights reserved.
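To give a flavour of the texture spectrum approach (a simplified sketch of the classical texture-unit formulation, not the JKMRC or JKFrothCam implementation), each pixel's 3x3 neighbourhood is ternary-coded against its centre and the resulting texture units are histogrammed:

    # Hedged sketch of a texture spectrum: histogram of ternary-coded 3x3
    # texture units over a greyscale image (unit values 0 .. 3**8 - 1).
    import numpy as np

    def texture_spectrum(img):
        h, w = img.shape
        center = img[1:h-1, 1:w-1]
        units = np.zeros_like(center, dtype=np.int64)
        # Eight neighbour offsets, clockwise from the top-left corner.
        offsets = [(-1,-1), (-1,0), (-1,1), (0,1), (1,1), (1,0), (1,-1), (0,-1)]
        for k, (dy, dx) in enumerate(offsets):
            nb = img[1+dy:h-1+dy, 1+dx:w-1+dx]
            e = np.where(nb > center, 2, np.where(nb == center, 1, 0))
            units += e * 3**k              # ternary code weighted by position
        return np.bincount(units.ravel(), minlength=3**8)

    spec = texture_spectrum(np.random.randint(0, 256, (64, 64)))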
Abstract:
Distance learners are self-directed learners traditionally taught via study books, collections of readings, and exercises to test understanding of learning packages. Despite advances in e-Learning environments and computer-based teaching interfaces, distance learners still lack opportunities to participate in exercises and debates available to classroom learners, particularly through non-text based learning techniques. Effective distance teaching requires flexible learning opportunities. Using arguments developed in interpretation literature, we argue that effective distance learning must also be Entertaining, Relevant, Organised, Thematic, Involving and Creative—E.R.O.T.I.C. (after Ham, 1992). We discuss an experiment undertaken with distance learners at The University of Queensland Gatton Campus, where we initiated an E.R.O.T.I.C. external teaching package aimed at engaging distance learners but using multimedia, including but not limited to text-based learning tools. Student responses to non-text media were positive.
Abstract:
In 2002, an integrated basic science course was introduced into the Bachelor of Dental Sciences programme at the University of Queensland, Australia. Learning activities for the Metabolism and Nutrition unit within this integrated course included lectures, problem-based learning tutorials, computer-based self-directed learning exercises and practicals. To support student learning and assist students to develop the skills necessary to become lifelong learners, an extensive bank of formative assessment questions was set up using the commercially available package, WebCT®. Questions included short-answer, multiple-choice and extended matching questions. As significant staff time was involved in setting up the question database, the extent to which students used the formative assessment and their perceptions of its usefulness to their learning were evaluated to determine whether formative assessment should be extended to other units within the course. More than 90% of the class completed formative assessment tasks associated with learning activities scheduled in the first two weeks of the block, but this declined to less than 50% by the fourth and final week of the block. Patterns of usage of the formative assessment were also compared in students who scored in the top 10% for all assessment for the semester with those who scored in the lowest 10%. High-performing students accessed the Web-based formative assessment about twice as often as those who scored in the lowest band. However, marks for the formative assessment tests did not differ significantly between the two groups. In a questionnaire that was administered at the completion of the block, students rated the formative assessment highly, with 80% regarding it as being helpful for their learning. In conclusion, although substantial staff time was required to set up the question database, this appeared to be justified by the positive responses of the students.
Abstract:
Group decision making plays an important role in organizations, especially in the present-day economy that demands high-quality, yet quick decisions. Group decision-support systems (GDSSs) are interactive computer-based environments that support concerted, coordinated team efforts toward the completion of joint tasks. The need for collaborative work in organizations has led to the development of a set of general collaborative computer-supported technologies and specific GDSSs that support distributed groups (in time and space) in various domains. However, each person is unique and has different reactions to various arguments. Many times a disagreement arises because of the way we began arguing, not because of the content itself. Nevertheless, emotion, mood, and personality factors have not yet been addressed in GDSSs, despite how strongly they influence results. Our group's previous work considered the roles that emotion and mood play in decision making. In this article, we reformulate these factors and include personality as well. Thus, this work incorporates personality, emotion, and mood in the negotiation process of an argument-based group decision-making process. Our main goal in this work is to improve the negotiation process through argumentation using the affective characteristics of the involved participants. Each participant agent represents a group decision member. This representation lets us simulate people with different personalities. The discussion process between group members (agents) is conducted through the exchange of persuasive arguments. Although our multiagent architecture model [4] includes two types of agents—the facilitator and the participant—this article focuses on the emotional, personality, and argumentation components of the participant agent.
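As a hedged sketch of what a participant agent with affective state might look like (the class, traits and weighting below are invented for illustration and are not the authors' architecture), argument selection can be biased by personality and mood:

    # Hedged sketch: a participant agent that picks a persuasive argument
    # whose strength matches its affective state; the weighting is invented.
    from dataclasses import dataclass

    @dataclass
    class ParticipantAgent:
        name: str
        agreeableness: float   # personality trait in [0, 1]
        mood: float            # current mood in [-1, 1]

        def select_argument(self, arguments):
            # Agreeable agents in a good mood favour softer appeals;
            # disagreeable agents in a bad mood escalate to stronger ones.
            target = 1.0 - self.agreeableness * (0.5 + self.mood / 2)
            return min(arguments, key=lambda a: abs(a[1] - target))

    agent = ParticipantAgent("p1", agreeableness=0.8, mood=0.4)
    print(agent.select_argument(
        [("appeal", 0.2), ("counterexample", 0.5), ("threat", 0.9)]))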
Abstract:
In recent years human society has evolved from the "industrial society age" into the "knowledge society age". This means that knowledge media support has migrated from "pen and paper" to computer-based Information Systems. As a result, Ergonomics has assumed increasing importance as a science/technology that deals with the problem of adapting work to the human, namely in terms of Usability. This paper presents some relevant Ergonomics, Usability and User-centred Design concepts regarding Information Systems.