911 results for DMS (Computer system)
Abstract:
The technological advances of recent decades have transformed the external resources of vocational counseling, occupational information and client assessment. Most computer systems follow a behaviorist-cognitive approach; however, the use of vocational counseling software is not exclusive to any one conceptual approach. Computers are introduced into education from primary school onward, and counselors and other educators are expected to use these systems. The attitude of counselors ranges from enthusiastic acceptance to outright refusal, and many counselors fear that computers will replace them. An underlying theory holds that counseling is based on the counselor-client interaction, and that a computer-client interaction therefore cannot be considered vocational counseling. Counseling has five basic aims: prevention, assistance, education and development, service to diverse groups, and research. The most relevant trends in computer-based counseling are computer-based tests and questionnaires, adaptive development, computerized information, vocational counseling systems, and research. These basic aims and the potential role of computers in achieving them are discussed. Today's vocational counselors can use computer technology to link the past of our profession to its promising future. On these premises we have developed two computer systems that assist the vocational counseling process: the "Professional Interests Questionnaire, Computer Version" and the "Computer-based System of Vocational Counseling".
Abstract:
The ubiquitous marine trace gas dimethyl sulfide (DMS) comprises the greatest natural source of sulfur to the atmosphere and is a key player in atmospheric chemistry and climate. We explore the short-term response of DMS production and cycling and that of its algal precursor dimethyl sulfoniopropionate (DMSP) to elevated carbon dioxide (CO2) and ocean acidification (OA) in five 96 h shipboard bioassay experiments. Experiments were performed in June and July 2011, using water collected from contrasting sites in NW European waters (Outer Hebrides, Irish Sea, Bay of Biscay, North Sea). Concentrations of DMS and DMSP, alongside rates of DMSP synthesis and DMS production and consumption, were determined during all experiments for ambient CO2 and three high-CO2 treatments (550, 750, 1000 µatm). In general, the response to OA throughout this region showed little variation, despite encompassing a range of biological and biogeochemical conditions. We observed consistent and marked increases in DMS concentrations relative to ambient controls (110% (28-223%) at 550 µatm, 153% (56-295%) at 750 µatm and 225% (79-413%) at 1000 µatm), and decreases in DMSP concentrations (28% (18-40%) at 550 µatm, 44% (18-64%) at 750 µatm and 52% (24-72%) at 1000 µatm). Significant decreases in DMSP synthesis rate constants (µDMSP /d) and DMSP production rates (nmol/d) were observed in two experiments (7-90% decrease), whilst the response under high CO2 from the remaining experiments was generally indistinguishable from ambient controls. Rates of bacterial DMS gross consumption and production gave weak and inconsistent responses to high CO2. The variables and rates we report increase our understanding of the processes behind the response to OA. This could provide the opportunity to improve upon mesocosm-derived empirical modelling relationships and to move towards a mechanistic approach for predicting future DMS concentrations.
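The treatment effects reported above are relative changes against the ambient-control incubations. As an illustration only, the minimal sketch below shows how such percentage responses can be computed from paired treatment/control concentrations; the numbers are hypothetical, not the cruise data.

```python
# Illustrative only: percentage response of DMS to high-CO2 treatments,
# computed relative to the ambient control. All values are hypothetical.

def percent_change(treatment, control):
    """Relative change of a treatment mean vs. the ambient control, in %."""
    return 100.0 * (treatment - control) / control

# Hypothetical 96 h mean DMS concentrations (nmol/L) for one bioassay
control_dms = 3.0
treatments = {550: 6.3, 750: 7.6, 1000: 9.8}   # µatm CO2

for pco2, dms in treatments.items():
    print(f"{pco2} µatm: {percent_change(dms, control_dms):+.0f}% vs control")
```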
Abstract:
The wet bulk density is one of the most important physical and geological properties of marine sediments. The density is directly connected with the sedimentation history and with several other sediment properties, and knowledge of the fine-scale density-depth structure is the basis for many model calculations in both sedimentological and palaeoclimatic research. A density measurement system was designed and built at the Alfred Wegener Institute in Bremerhaven for measuring the wet bulk density of sediment cores with high resolution in a non-destructive way. The density is determined by measuring the absorption of gamma rays in the sediment, a principle that has been used since the 1950s in materials research and in the geosciences. In the present case, Cs-137 is used as the radioactive source and the intensity is measured by a detector system (scintillator and photomultiplier). Density values are obtainable in both longitudinal core sections and planar cross-sections (the latter as a function of the axial rotation angle). Special studies on inhomogeneity can be carried out with core rotation, which also makes the detection of ice-rafted debris (IRD) possible. The processes that run the density measurement system are computer controlled. Besides the absorption measurement, the core diameter at every measurement point is determined with a potentiometric system, and the acquired data are stored on a personal computer. Before starting routine measurements on the sediment cores, a few experiments concerning the statistical behaviour of the gamma-ray signal and its accuracy were carried out; these established, among other things, the optimum operational parameters. A high spatial resolution in the mm range is possible with the 4 mm thin gamma-ray beam. Within five seconds the wet bulk density can be determined with an absolute accuracy of 1%. A comparison between data measured with the new system and conventional measurements on core samples after core splitting shows agreement within ±5% for most values. For this thesis, density determinations were carried out on ten sediment cores. A few sediment characteristics are obtainable from the standard measurements alone, without core rotation. In addition to differences and steps in the absolute density range, variations in the "frequency" of the density-depth structure can be detected thanks to the close spatial measurement interval and high resolution. Examples from measurements with small (9°) and large (90°) angle increments show that abrupt and smooth transitional changes of sediment layers, as well as ice-rafted debris of various sizes, can be detected and clearly distinguished. After the presentation of the wet bulk density results, a comparison with data from other investigations was made. Measurements of the electrical resistivity correlate very well with the density data because both parameters are closely related to the porosity of the sediment. Additionally, results from measurements of the magnetic susceptibility and from ultrasonic wave velocity investigations were considered for an integrative interpretation. The correlation of these two parameters with the wet bulk density data is strongly dependent on the local (environmental) conditions. Finally, the densities were compared with recordings from sediment-echographic soundings and an X-ray computer tomography analysis. The individual results of all investigations were then combined into an accurate picture of the core.
Problems of ambiguity, which exist when just one parameter is determined alone, can be reduced according to the number of parameters and sedimentary characteristics measured. The important role of the density data among the parameters of such an integrated interpretation is evident, given the high resolution of the measurement, the excellent accuracy, and its key position among the methods and parameters concerning marine sediments.
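The underlying relation is gamma-ray attenuation following the Beer-Lambert law: the transmitted intensity falls off exponentially with the product of mass attenuation coefficient, density and path length, so the density can be recovered from the intensity ratio. A minimal sketch of that inversion follows; the attenuation coefficient and the count rates are illustrative assumptions, not the instrument's calibration values.

```python
import math

def wet_bulk_density(I, I0, path_length_cm, mu_mass=0.077):
    """
    Wet bulk density from gamma-ray attenuation (Beer-Lambert law):
        I = I0 * exp(-mu_mass * rho * d)  =>  rho = ln(I0 / I) / (mu_mass * d)

    I              : count rate measured behind the core
    I0             : reference count rate without sediment (after calibration)
    path_length_cm : sediment path length d (core diameter) in cm
    mu_mass        : mass attenuation coefficient in cm^2/g at the Cs-137 line
                     (662 keV); ~0.077 cm^2/g is a typical literature value,
                     used here purely as an assumption
    """
    return math.log(I0 / I) / (mu_mass * path_length_cm)

# Hypothetical example: 10 cm core, counts attenuated from 20000 to 5500 cps
print(f"rho = {wet_bulk_density(5500, 20000, 10.0):.2f} g/cm^3")
```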
Abstract:
The technique of Abstract Interpretation has allowed the development of very sophisticated global program analyses which are at the same time provably correct and practical. We present in a tutorial fashion a novel program development framework which uses abstract interpretation as a fundamental tool. The framework uses modular, incremental abstract interpretation to obtain information about the program. This information is used to validate programs, to detect bugs with respect to partial specifications written using assertions (in the program itself and/or in system libraries), to generate and simplify run-time tests, and to perform high-level program transformations such as multiple abstract specialization, parallelization, and resource usage control, all in a provably correct way. In the case of validation and debugging, the assertions can refer to a variety of program points such as procedure entry, procedure exit, points within procedures, or global computations. The system can reason with much richer information than, for example, traditional types. This includes data structure shape (including pointer sharing), bounds on data structure sizes, and other operational variable instantiation properties, as well as procedure-level properties such as determinacy, termination, nonfailure, and bounds on resource consumption (time or space cost). CiaoPP, the preprocessor of the Ciao multi-paradigm programming system, which implements the described functionality, will be used to illustrate the fundamental ideas.
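CiaoPP itself analyses (constraint) logic programs with its own assertion language, but the core idea of abstract interpretation can be illustrated independently. The toy sketch below is not CiaoPP code; it is a generic sign-domain analysis over a tiny expression, with all names invented for illustration, showing how a program is evaluated over an abstract domain instead of concrete values to prove a property for every input.

```python
# Toy abstract interpretation over the sign domain {NEG, ZERO, POS, TOP}.
# Purely illustrative; unrelated to CiaoPP's actual implementation.

NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"

def abs_const(n):
    return ZERO if n == 0 else (POS if n > 0 else NEG)

def abs_add(a, b):
    if ZERO in (a, b):
        return b if a == ZERO else a
    return a if a == b else TOP          # pos + neg (or any disagreement) is unknown

def abs_mul(a, b):
    if ZERO in (a, b):
        return ZERO
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG

# Abstractly evaluate e = (x * x) + 1: the analysis proves e > 0 for any input x
def analyse(x_sign):
    return abs_add(abs_mul(x_sign, x_sign), abs_const(1))

assert all(analyse(s) == POS for s in (NEG, ZERO, POS))  # property holds for every x
```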
Abstract:
This paper presents a blended learning approach and a study evaluating instruction in a software engineering-related course unit as part of an undergraduate engineering degree program in computing. In the past, the course unit had a lecture-based format. In view of student underachievement and the high course unit dropout rate, a distance-learning system was deployed, in which students were allowed to choose between a distance-learning approach driven by a moderate constructivist instructional model and a blended-learning approach. The results of this experience are presented, with the aim of showing the effectiveness of the teaching/learning system deployed compared to the lecture-based system previously in place. The grades earned by students under the new system, following the distance-learning and blended-learning courses, are compared statistically to the grades attained in earlier years in the traditional face-to-face (lecture-based) classroom setting.
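The abstract states only that the grades were compared statistically; the specific test is not named here. As a generic illustration of such a comparison, the sketch below uses hypothetical grade vectors and a Mann-Whitney U test chosen purely as an example of a distribution-free comparison, not as the study's actual method.

```python
# Illustrative comparison of two grade distributions (hypothetical data);
# the actual statistical test used in the study is not specified in this abstract.
from scipy.stats import mannwhitneyu

grades_lecture_based = [3.1, 4.5, 5.0, 2.8, 6.2, 4.0, 3.7, 5.5]   # earlier cohorts
grades_blended       = [5.9, 6.4, 7.1, 4.8, 6.0, 7.5, 5.2, 6.8]   # new system

stat, p_value = mannwhitneyu(grades_blended, grades_lecture_based,
                             alternative="greater")
print(f"U = {stat:.1f}, one-sided p = {p_value:.3f}")
```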
Abstract:
This article presents a multi-agent expert system (SMAF) that allows the input of incidents occurring in different elements of the telecommunications area. SMAF interacts with experts and general users, and each agent interacts with the whole agent community, recording the incidents and their solutions in a knowledge base, without analysing their causes. The incidents are expressed using keywords taken from natural language (originally Spanish), and their main concepts are recorded with the severities stated by the users. The system then searches for the best solution for each incident, aided by a human operator and a notion of distance between incidents.
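The abstract mentions a distance notion between keyword-described incidents but does not define it. As a purely illustrative stand-in, the sketch below uses Jaccard distance between keyword sets to rank previously solved incidents; the incident data and the choice of metric are assumptions, not SMAF's actual design.

```python
# Illustrative keyword-based incident matching (Jaccard distance).
# The actual distance notion used by SMAF is not specified in the abstract.

def jaccard_distance(a: set, b: set) -> float:
    """1 - |A ∩ B| / |A ∪ B|: 0.0 for identical keyword sets, 1.0 for disjoint ones."""
    return 1.0 - len(a & b) / len(a | b) if (a or b) else 0.0

knowledge_base = {
    "reset edge router":      {"router", "link", "down", "bgp"},
    "replace optical module": {"fiber", "link", "degraded", "optical"},
    "restart dns service":    {"dns", "resolution", "timeout"},
}

new_incident = {"link", "down", "router"}
best = min(knowledge_base.items(),
           key=lambda kv: jaccard_distance(new_incident, kv[1]))
print("Suggested solution:", best[0])
```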
Abstract:
A small Positron Emission Tomography demonstrator based on LYSO slabs and Silicon Photomultiplier matrices is under construction at the University and INFN of Pisa. In this paper we present the characterization results of the read-out electronics and of the detection system. Two SiPM matrices, each composed of 8 × 8 SiPM pixels with 1.5 mm pitch, have been coupled one-to-one to a LYSO crystal array. Custom front-end ASICs were used to read the 64 channels of each matrix. Data from each front-end were multiplexed and sent to a DAQ board for digital conversion; a motherboard collects the data and communicates with a host computer through a USB port. Specific tests were carried out on the system in order to assess its performance. Furthermore, we have measured some of the most important parameters of the system for PET applications.
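In a PET system, annihilation events are identified by pairing hits from opposing detector heads that fall within a narrow coincidence time window. The sketch below is a generic illustration of that step only; the window width, timestamps and data layout are hypothetical and do not describe the demonstrator's DAQ format.

```python
# Illustrative coincidence sorting for a two-head PET demonstrator.
# Timestamps (ns) and the 20 ns window are hypothetical example values.

COINCIDENCE_WINDOW_NS = 20.0

head_a = [(12.0, 5), (105.3, 17), (250.1, 40)]   # (timestamp_ns, pixel_id)
head_b = [(14.5, 61), (180.0, 9), (251.0, 22)]

def find_coincidences(a_hits, b_hits, window=COINCIDENCE_WINDOW_NS):
    pairs = []
    for ta, pa in a_hits:
        for tb, pb in b_hits:
            if abs(ta - tb) <= window:
                pairs.append((pa, pb, abs(ta - tb)))
    return pairs

for pa, pb, dt in find_coincidences(head_a, head_b):
    print(f"pixel {pa} <-> pixel {pb}, |dt| = {dt:.1f} ns")
```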
Abstract:
BACKGROUND: Antiretroviral therapy has changed the natural history of human immunodeficiency virus (HIV) infection in developed countries, where it has become a chronic disease. This clinical scenario requires a new approach to simplify follow-up appointments and facilitate access to healthcare professionals. METHODOLOGY: We developed a new internet-based home care model covering the entire management of chronic HIV-infected patients. This was called Virtual Hospital. We report the results of a prospective randomised study performed over two years, comparing standard care received by HIV-infected patients with Virtual Hospital care. HIV-infected patients with access to a computer and broadband were randomised to be monitored either through Virtual Hospital (Arm I) or through standard care at the day hospital (Arm II). After one year of follow-up, patients switched their care to the other arm. Virtual Hospital offered four main services: Virtual Consultations, Telepharmacy, Virtual Library and Virtual Community. A technical and clinical evaluation of Virtual Hospital was carried out. FINDINGS: Of the 83 randomised patients, 42 were monitored during the first year through Virtual Hospital (Arm I) and 41 through standard care (Arm II). Baseline characteristics of patients were similar in the two arms. The level of technical satisfaction with the virtual system was high: 85% of patients considered that Virtual Hospital improved their access to clinical data and they felt comfortable with the videoconference system. Neither clinical parameters [level of CD4+ T lymphocytes, proportion of patients with an undetectable viral load (p = 0.21) and compliance levels >90% (p = 0.58)] nor the evaluation of quality of life or psychological questionnaires changed significantly between the two types of care. CONCLUSIONS: Virtual Hospital is a feasible and safe tool for the multidisciplinary home care of chronic HIV patients. Telemedicine should be considered as an appropriate support service for the management of chronic HIV infection. TRIAL REGISTRATION: ClinicalTrials.gov: NCT01117675.
Abstract:
This article presents the model of a multi-agent system (SMAF) whose objectives are the input of fuzzy incidents, with the different severity degrees that the human experts assign to them, and the subsequent search for and suggestion of solutions. The solutions are later confirmed or rejected by the users. This model was designed, implemented and tested in the telecommunications field, with heterogeneous agents in a cooperative model. In the design, different abstraction levels were considered, according to the agents' objectives, the ways they carry them out and the environment in which they act. Each agent is modeled with a different spectrum of the knowledge base.
Abstract:
A novel HCPV nonimaging concentrator concept with high concentration (>500×) is presented. It combines a commercial concentration GaInP/GaInAs/Ge 3J cell with a Back-Point-Contact (BPC) concentration silicon cell for efficient spectral utilization, and uses external confinement techniques for recovering the 3J cell's reflection. The primary optical element (POE) is a flat Fresnel lens and the secondary optical element (SOE) is a free-form RXI-type concentrator with a band-pass filter embedded in it, both POE and SOE performing Köhler integration to produce light homogenization. The band-pass filter sends the IR photons in the 900–1200 nm band to the silicon cell. Computer simulations predict that four-terminal designs could achieve ~46% added cell efficiencies using commercial 39% 3J and 26% Si cells. A first proof-of-concept receiver prototype has been manufactured using a simpler optical architecture (with a lower concentration, ~100×, and lower simulated added efficiency), and experimental measurements have shown up to 39.8% 4J receiver efficiency using a 3J cell with a peak efficiency of 36.9%.
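In a four-terminal receiver the two cells are operated independently, so the "added efficiency" is simply the sum of each cell's electrical output referenced to the same total incident optical power. The arithmetic sketch below illustrates only that definition; all power values and the 39%/7% split are hypothetical, not the paper's simulation results.

```python
# Added efficiency of a four-terminal spectrum-splitting receiver:
# each cell's output power is referenced to the same incident optical power.
# All numbers below are hypothetical illustrations, not measured values.

incident_optical_power_w = 10.0     # concentrated sunlight reaching the receiver
p_out_3j_w = 3.9                    # electrical output of the 3J cell
p_out_si_w = 0.7                    # electrical output of the BPC Si cell (900-1200 nm band)

eta_3j    = p_out_3j_w / incident_optical_power_w
eta_si    = p_out_si_w / incident_optical_power_w
eta_added = eta_3j + eta_si

print(f"3J: {eta_3j:.1%}  Si: {eta_si:.1%}  added: {eta_added:.1%}")   # 39.0% + 7.0% = 46.0%
```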
Abstract:
Industrial applications of computer vision sometimes require the detection of atypical objects that occur as small groups of pixels in digital images. These objects are difficult to single out because they are small and randomly distributed. In this work we propose an image segmentation method using a novel Ant System-based Clustering Algorithm (ASCA). ASCA models the foraging behaviour of ants, which move through the data space searching for high data-density regions and leave pheromone trails on their path. The pheromone map is used to identify the exact number of clusters and to assign the pixels to these clusters using the pheromone gradient. We applied ASCA to the detection of microcalcifications in digital mammograms and compared its performance with state-of-the-art clustering algorithms such as the 1D Self-Organizing Map, k-Means, Fuzzy c-Means and Possibilistic Fuzzy c-Means. The main advantage of ASCA is that the number of clusters does not need to be known a priori. The experimental results show that ASCA is more efficient than the other algorithms in detecting small clusters of atypical data.
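To make the idea concrete, the sketch below is a heavily simplified illustration in the spirit of ASCA, not the authors' algorithm: ants perform density-biased walks over the data points and deposit pheromone, and cluster centres are read off the pheromone peaks. The data, parameters and the way peaks are selected are all assumptions.

```python
# Simplified illustration of ant-system clustering in the spirit of ASCA
# (not the authors' exact algorithm). All parameter values are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "pixel intensity" data: two dense groups plus sparse outliers
data = np.concatenate([rng.normal(0.2, 0.02, 200),
                       rng.normal(0.8, 0.02, 200),
                       rng.uniform(0.0, 1.0, 20)])

n_ants, n_steps, evaporation = 30, 200, 0.95
pheromone = np.zeros(len(data))

def local_density(i, radius=0.05):
    """Number of data points within `radius` of point i (crude density estimate)."""
    return np.sum(np.abs(data - data[i]) < radius)

ants = rng.integers(0, len(data), n_ants)           # each ant sits on a data point
for _ in range(n_steps):
    pheromone *= evaporation                        # pheromone evaporation
    for k, i in enumerate(ants):
        j = rng.integers(0, len(data))              # candidate move
        if local_density(j) >= local_density(i):    # prefer denser neighbourhoods
            ants[k] = j
        pheromone[ants[k]] += 1.0                   # deposit pheromone

# Read two well-separated pheromone peaks as cluster centres (toy data has 2 clusters)
order = np.argsort(pheromone)[::-1]
c1 = data[order[0]]
c2 = data[next(i for i in order if abs(data[i] - c1) > 0.2)]
centres = np.array([c1, c2])
labels = np.argmin(np.abs(data[:, None] - centres[None, :]), axis=1)
print("centres:", np.round(np.sort(centres), 2), "cluster sizes:", np.bincount(labels))
```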
Abstract:
This paper outlines an automatic computer vision system for the identification of Avena sterilis, a weed that grows in cereal crops. The final goal is to reduce the quantity of herbicide to be sprayed, an important and necessary step for precision agriculture: only areas where the presence of weeds is significant should be sprayed. The main problems in the identification of this kind of weed are its spectral signature, similar to that of the crop, and its irregular distribution in the field. A new strategy involving two processes has been designed: image segmentation and decision making. The image segmentation combines basic image processing techniques in order to extract cells from the image as the low-level units. Each cell is described by two area-based attributes measuring the relations between crop and weeds. The decision making is based on Support Vector Machines and determines whether a cell must be sprayed. The main findings of this paper lie in the combination of the segmentation and Support Vector Machine decision processes. Another important contribution of this approach is the minimal memory and computing power required by the system compared with previous works. The performance of the method is illustrated by a comparative analysis against some existing strategies.
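A minimal sketch of the decision stage, under stated assumptions: each segmented cell is reduced to two area-based attributes (here invented as weed-cover and crop-cover ratios) and an SVM decides whether to spray. The feature definitions and training data are hypothetical, not the paper's.

```python
# Illustrative spray/no-spray decision with an SVM over two area-based
# cell attributes. Feature definitions and training data are hypothetical.
from sklearn.svm import SVC

# Each row: (weed_area_ratio, crop_area_ratio) for one grid cell
X_train = [
    [0.05, 0.90], [0.02, 0.85], [0.08, 0.70],   # mostly crop    -> do not spray (0)
    [0.40, 0.30], [0.55, 0.20], [0.35, 0.45],   # weed-infested  -> spray (1)
]
y_train = [0, 0, 0, 1, 1, 1]

clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)

new_cells = [[0.04, 0.88], [0.50, 0.25]]
for cell, decision in zip(new_cells, clf.predict(new_cells)):
    print(cell, "-> spray" if decision else "-> no spray")
```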
Abstract:
Compilation techniques such as those portrayed by the Warren Abstract Machine (WAM) have greatly improved the speed of execution of logic programs. The research presented herein is geared towards providing additional performance to logic programs through the use of parallelism, while preserving the conventional semantics of logic languages. Two areas to which special attention is given are the preservation of sequential performance and storage efficiency, and the use of low-overhead mechanisms for controlling parallel execution. Accordingly, the techniques used for supporting parallelism are efficient extensions of those which have brought high inferencing speeds to sequential implementations. At a lower level, special attention is also given to design and simulation detail and to the architectural implications of the execution model behavior. This paper offers an overview of the basic concepts and techniques used in the parallel design, the simulation tools used, and some of the results obtained to date.
Abstract:
Multichannel audio has advanced by leaps and bounds in recent years, not only in playback techniques but also in recording techniques. This project therefore brings both together: a microphone array, the EigenMike32 from MH Acoustics, and a playback system based on Wave Field Synthesis technology, installed by Iosono at the Jade Hochschule in Oldenburg. To link these two points of the audio chain, two different kinds of encoding are proposed: the direct reproduction of the EigenMike32's horizontal-plane capsules, and third-order Ambisonics (High Order Ambisonics, HOA), an encoding technique based on spherical harmonics that simulates the acoustic field rather than the individual sound sources. Both were developed in the Matlab environment, supported by the Isophonics script collection called Spatial Audio Matlab Toolbox. To evaluate them, a series of listening tests was carried out in which they were compared with recordings made simultaneously with a Dummy Head, which is assumed to be the method closest to the way we hear. These tests also included recordings and encodings made with a Schoeps Double MS (DMS) set-up, which are explained in the project "3D audio rendering through Ambisonics techniques: from multi-microphone recordings (DMS Schoeps) to a WFS system, through Matlab". Each test consisted of a battery of 4 audio samples, repeated 4 times for each recorded situation (a conversation, a class, a street and a college canteen, or Mensa).
The results were unexpected: the third-order HOA encoding scored below the Good rating, possibly because material intended for a three-dimensional array was reproduced through a two-dimensional one. The encoding that simply extracted the horizontal-plane microphones, on the other hand, maintained a Good rating in all situations. It is concluded that HOA should continue to be tested with deeper knowledge of spherical harmonics, while the other, much simpler encoder can be used for situations of limited spatial complexity.
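Third-order Ambisonics encodes a sound field into (3 + 1)² = 16 spherical-harmonic channels. The sketch below is illustrative only (not the project's Matlab code): it computes the encoding gains of a single plane wave from a given direction, building real spherical harmonics from SciPy's complex ones; the normalisation convention is an assumption.

```python
# Illustrative third-order Ambisonics encoding of a single plane wave
# (ACN channel ordering, real spherical harmonics from SciPy's complex ones).
# Not the project's Matlab code; the normalisation convention is an assumption.
import numpy as np
from scipy.special import sph_harm

def real_sph_harm(m, n, azimuth, polar):
    """Real-valued spherical harmonic Y_{n,m} built from SciPy's complex sph_harm."""
    if m > 0:
        return np.sqrt(2) * (-1) ** m * sph_harm(m, n, azimuth, polar).real
    if m < 0:
        return np.sqrt(2) * (-1) ** m * sph_harm(-m, n, azimuth, polar).imag
    return sph_harm(0, n, azimuth, polar).real

def hoa_encoding_gains(azimuth, elevation, order=3):
    """16 encoding gains (order 3) for a plane wave arriving from (azimuth, elevation)."""
    polar = np.pi / 2 - elevation          # SciPy expects the polar angle
    gains = []
    for n in range(order + 1):
        for m in range(-n, n + 1):         # ACN ordering: channel index = n^2 + n + m
            gains.append(real_sph_harm(m, n, azimuth, polar))
    return np.array(gains)

g = hoa_encoding_gains(azimuth=np.radians(45), elevation=0.0)
print(len(g), "channels; first four (W, Y, Z, X) ~", np.round(g[:4], 3))
```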
Abstract:
The main objective of this project is the study of parameters such as body temperature and the electrocardiogram (ECG) signal for the diagnosis of stress, which is nowadays a frequent problem in society; several studies relate these parameters and their levels to possible cases of stress and anxiety. ECG techniques make it possible to monitor the heart and detect anomalies in the patient, but data-collection systems based on wired sensors placed on the patient restrict mobility, so the wires are eliminated and the Bluetooth protocol is used to transmit and receive the data between stations. Three ECG electrodes are placed on the right arm, the left arm and the left leg, forming the Einthoven triangle and providing the heart signal; an additional sensor on a finger measures the patient's body temperature so that certain anomalies can be detected. Depending on the values of these parameters it is possible to recognise stress levels. All sensors are connected to a module with a microcontroller that conditions, combines and digitises the analogue signals; its electronics also help to handle problems such as noise and interference. A Bluetooth transmitter in the module then sends the digital signal wirelessly to a receiver located nearby: a dongle connected to a computer through a serial port. A program on the computer, written entirely in Java and implementing the HCI protocol, creates and manages the Bluetooth connections between the devices, receives the data sent from the module and, if a connection is interrupted, processes the collected data as far as possible. An event mechanism fires whenever data arrive at the receiver; the data are then interpreted and saved to a .bin file for later use, such as plotting and parameter analysis. The .bin format was chosen for its small size: although it is more laborious to work with, it is much more efficient than a .txt file, which in this case could occupy several megabytes.
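The receiving application described above is written in Java; the sketch below is only a minimal Python illustration of the same receiver-side idea: read the bytes arriving from the dongle on a serial port and append them to a .bin file for later analysis. The port name, baud rate and chunk size are assumptions.

```python
# Illustrative receiver-side sketch (the original application is written in Java):
# read bytes arriving from the Bluetooth dongle on a serial port and append them
# to a .bin file for later plotting/analysis. Port and baud rate are assumptions.
import serial   # pyserial

PORT, BAUD = "/dev/ttyUSB0", 115200        # hypothetical serial-port settings

with serial.Serial(PORT, BAUD, timeout=1) as link, open("ecg_session.bin", "ab") as out:
    try:
        while True:
            chunk = link.read(256)         # returns b"" when the read times out
            if chunk:
                out.write(chunk)           # raw samples stored in compact binary form
                out.flush()
    except KeyboardInterrupt:
        pass                               # stop recording with Ctrl+C
```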