928 results for Computer input-output equipment
Abstract:
Graduate program in Mechanical Engineering - FEG
Abstract:
The study analyzed contours of the thoracic region of patients undergoing radiotherapy for breast tumors at the Hospital Manuel de Abreu in Bauru (institution 1) and the Hospital da Faculdade de Medicina in Botucatu (institution 2). Isodose curves corresponding to the patient contours were prepared and presented to the radiotherapists of both hospital services, enabling the choice of the isodose curve that provides the best distribution of radiation dose in the irradiated volume. Some contours were digitized at one institution and sent to institution 2 for preparation of the isodose curves; the curves plotted at each institution for the same contour were then compared, showing that the distance-curve methodology is feasible and reliable, while also streamlining the routine handling of the isodose plans provided by different radiation equipment. The exposure time determined using the selected isodose curve was compared with the value obtained from the percentage depth dose (PDP) at the midpoint of the line separating the internal and external fields; the difference between the two methods was around 2.4%. A study of the angle of the radiation beam at the entry field (breast-air region) showed that, once the tangent angle of the incident beam is known, one can estimate the angle of the wedge filter used in some procedures to make the dose uniform within the irradiated volume and to compensate for missing tissue in the treatment volume. A comparison between isodose curves produced manually and curves obtained with a two-dimensional computer system showed that the computer system provides additional information on the dose gradient within the irradiated volume, besides reducing the time spent preparing the isodose curves.
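As a hedged illustration of the second timing method mentioned above, the sketch below computes a beam-on time from a prescribed dose, a machine dose rate, and the percentage depth dose at the field midline. The function name, units, and all numeric values are illustrative assumptions, not data from the study.

```python
# Illustrative sketch: exposure time from the percentage depth dose (PDP/PDD).
# All numbers are placeholders, not values from the study.

def exposure_time(prescribed_dose_cGy: float,
                  dose_rate_cGy_per_min: float,
                  pdd_percent: float) -> float:
    """Minutes of beam-on time needed so that the point where the beam
    delivers pdd_percent of its maximum dose receives the prescribed dose."""
    dose_at_dmax = prescribed_dose_cGy / (pdd_percent / 100.0)
    return dose_at_dmax / dose_rate_cGy_per_min

# Example: 200 cGy prescribed, 100 cGy/min machine, 85 % depth dose at midline.
t = exposure_time(200.0, 100.0, 85.0)
print(f"exposure time: {t:.2f} min")   # the study found the isodose-based
                                       # method differed by about 2.4 %
```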
Abstract:
In this work, an interval predictive type-2 fuzzy computational model was developed using the type-2 fuzzy MATLAB toolbox, with the goal of estimating the number of hospital admissions of patients with respiratory diseases. The motivation for this model is to support decision making in hospital environments where there are not enough physicians, staff, or equipment available to provide the care the population needs. The work began with a study of fuzzy logic, fuzzy inference systems, and the fuzzy toolbox. A real database provided by the Departamento de Informática do Sistema Único de Saúde (DATASUS) and the Companhia de Tecnologia de Saneamento Básico (CETESB) made it possible to build the model. The database comprises the daily number of patients admitted with respiratory diseases to the public hospital in São José dos Campos during 2009, together with factors such as PM10, SO2, wind, and humidity. These factors were used as the input variables, from which the model produces the daily number of admissions as its output variable. The data were analyzed with a type-2 Mamdani fuzzy control method. The performance of the model developed in this work was then compared with the performance of the same model using type-1 fuzzy logic. Finally, the validity of the models was assessed with the ROC curve.
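The abstract describes a Mamdani fuzzy system mapping pollution and weather variables to daily admissions. As a hedged sketch of the type-1 baseline the work compares against (the study itself used MATLAB's type-2 fuzzy toolbox), the snippet below uses the scikit-fuzzy library in Python; the universes, membership functions, and rules are invented placeholders, not those of the model.

```python
# Minimal type-1 Mamdani sketch with scikit-fuzzy; placeholder membership
# functions and rules, not the study's actual model.
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

pm10 = ctrl.Antecedent(np.arange(0, 151, 1), 'pm10')          # ug/m3
humidity = ctrl.Antecedent(np.arange(0, 101, 1), 'humidity')  # %
admissions = ctrl.Consequent(np.arange(0, 51, 1), 'admissions')

pm10['low'] = fuzz.trimf(pm10.universe, [0, 0, 60])
pm10['high'] = fuzz.trimf(pm10.universe, [40, 150, 150])
humidity['low'] = fuzz.trimf(humidity.universe, [0, 0, 50])
humidity['high'] = fuzz.trimf(humidity.universe, [40, 100, 100])
admissions['few'] = fuzz.trimf(admissions.universe, [0, 0, 20])
admissions['many'] = fuzz.trimf(admissions.universe, [15, 50, 50])

rules = [ctrl.Rule(pm10['high'] & humidity['low'], admissions['many']),
         ctrl.Rule(pm10['low'] & humidity['high'], admissions['few'])]
sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input['pm10'], sim.input['humidity'] = 90.0, 30.0
sim.compute()
print(sim.output['admissions'])   # defuzzified daily-admissions estimate
```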
Abstract:
The energy analysis developed in this study contributes to understanding the dynamics of the organic coffee production system, in particular by assessing the independence of this system from industrialized inputs, and thereby provides information about the sustainability of that production system. The technical itineraries used in this study consist of the energy expenditures of coffee cultivation, according to the type, source, and form of the energy inputs, agricultural machines, equipment, and labor employed in that production system. These expenditures, converted into energy units, quantified the input energy, while the organic coffee production, measured in kilograms of processed coffee beans, was the output energy. Primary data were obtained from organic coffee producers in the southern region of Minas Gerais State, Brazil, in 2011. The energy balance identified was positive, since the estimated output energy was 626,465 MJ/ha and the energy expenditure was 112,998 MJ/ha over the useful life of the crop.
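For the energy-balance arithmetic reported above, a minimal check using the abstract's own figures:

```python
# Energy balance of the organic coffee system, using the abstract's figures.
output_energy = 626_465   # MJ/ha, processed coffee beans over crop useful life
input_energy = 112_998    # MJ/ha, cultivation inputs, machines, labor

print(output_energy - input_energy)            # net gain: 513467 MJ/ha
print(round(output_energy / input_energy, 2))  # output/input ratio: ~5.54
```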
Abstract:
This paper analyzed the energy flow of a route currently designed to transport ethanol from the Midwest region of Brazil for export, more precisely from the city of Aparecida do Taboado (MS) to the port of São Sebastião (SP). The route studied combines a single mode in two segments, pipeline to pipeline. The direct and indirect energy involved in the operations was used to account for the energy inputs and outputs of the system. The energy input and output variables were diesel fuel, lubricants, greases, the indirect energy embodied in machinery and equipment, the energy consumption of labor, and the energy consumed in the depreciation and maintenance of roads. We found that this route has a specific energy consumption of 0.14 MJ km⁻¹ m⁻³. The Net Energy Gain (GEl), the global Energy Efficiency (EEg), and the Renewable Energy Balance (BEr), the energy indicators adopted in this study, were respectively 1,585,958,977.00 MJ, 200.72, and 1,593,900,000.00 MJ.
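A hedged sketch of how such indicators are commonly computed from energy totals; the definitions below (net gain = output minus input, global efficiency = output over input, renewable balance = renewable output minus non-renewable input) are standard in agricultural energy analysis and are assumptions here, not necessarily the paper's exact formulas.

```python
# Common energy-flow indicators (assumed definitions, not the paper's own).
def energy_indicators(output_mj, input_mj, renewable_out_mj, nonrenewable_in_mj):
    net_energy_gain = output_mj - input_mj                     # GEl
    global_efficiency = output_mj / input_mj                   # EEg
    renewable_balance = renewable_out_mj - nonrenewable_in_mj  # BEr
    return net_energy_gain, global_efficiency, renewable_balance

def specific_consumption(input_mj, distance_km, volume_m3):
    """Input energy normalized by transport work, in MJ km^-1 m^-3."""
    return input_mj / (distance_km * volume_m3)
```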
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
A transparent (wide-area) wavelength-routed optical network may be constructed by using wavelength cross-connect switches connected together by fiber to form an arbitrary mesh structure. The network is accessed through electronic stations that are attached to some of these cross-connects. These wavelength cross-connect switches have the property that they may configure themselves into unspecified states. Each input port of a switch is always connected to some output port of the switch whether or not such a connection is required for the purpose of information transfer. Due to the presence of these unspecified states, there exists the possibility of setting up unintended all-optical cycles in the network (viz., a loop with no terminating electronics in it). If such a cycle contains amplifiers [e.g., Erbium-Doped Fiber Amplifiers (EDFAs)], there exists the possibility that the net loop gain is greater than the net loop loss. The amplified spontaneous emission (ASE) noise from amplifiers can build up in such a feedback loop to saturate the amplifiers and result in oscillations of the ASE noise in the loop. Such all-optical cycles as defined above (and hereafter referred to as “white” cycles) must be eliminated from an optical network in order for the network to perform any useful operation. Furthermore, for the realistic case in which the wavelength cross-connects result in signal crosstalk, there is a possibility of having closed cycles with oscillating crosstalk signals. We examine algorithms that set up new transparent optical connections upon request while avoiding the creation of such cycles in the network. These algorithms attempt to find a route for a connection and then (in a post-processing fashion) configure switches such that white cycles that might get created would automatically get eliminated. In addition, our call-set-up algorithms can avoid the possibility of crosstalk cycles.
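As a hedged illustration of the post-processing step described above, the sketch below detects a closed transparent cycle in a directed graph whose nodes are switch ports; the graph encoding and function name are assumptions for illustration, not the paper's algorithm or data structures.

```python
# Sketch: detect a closed all-optical ("white") cycle in a port-connection map.
# successor maps each port to the port it feeds via switch states and fibers,
# or to None when the path terminates in electronics (assumed encoding).
def find_white_cycle(successor):
    visited = set()
    for start in successor:
        if start in visited:
            continue
        path, pos = [], {}
        node = start
        while node is not None and node not in visited:
            if node in pos:                 # walked back onto this path: cycle
                return path[pos[node]:]
            pos[node] = len(path)
            path.append(node)
            node = successor.get(node)
        visited.update(path)
    return None

# Ports A -> B -> C -> A form a loop with no terminating electronics.
print(find_white_cycle({'A': 'B', 'B': 'C', 'C': 'A', 'D': None}))
```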
Abstract:
In this report, a new automated optical test for the next generation of photonic integrated circuits (PICs) is provided through the design and assessment of a test bed. After a brief analysis of the critical problems of current optical tests, the main test features are defined: automation and flexibility, a relaxed alignment procedure, speed-up of the entire test, and data reliability. After studying various solutions, the test-bed components are defined to be a lens array, a photodetector array, and a software controller. Each device is studied and calibrated, and the spatial resolution and the robustness against interference at the photodetector array are evaluated. The software is programmed to manage both the PIC input and the photodetector array output, as well as the data analysis. The test is validated by analyzing a state-of-the-art 16-port PIC: the waveguide locations, current versus power, and time-spatial power distribution are measured, as well as the optical continuity of an entire path through the PIC. Complexity, alignment tolerance, and measurement time are also discussed.
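A minimal sketch of the kind of automated sweep such a software controller performs; the set_current and read_power callables are hypothetical stand-ins for real instrument drivers, not an API from the report.

```python
# Hypothetical current-vs-power sweep; set_current and read_power stand in
# for real instrument drivers (assumptions, not the report's software).
import time

def sweep_current_vs_power(set_current, read_power, currents_mA, settle_s=0.05):
    """Drive the PIC at each bias current and record the power seen by
    every photodetector channel."""
    results = []
    for i_mA in currents_mA:
        set_current(i_mA)
        time.sleep(settle_s)                  # let the source settle
        results.append((i_mA, read_power()))  # read_power() -> list per port
    return results
```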
Abstract:
This thesis aimed at addressing some of the issues that, at the state of the art, prevent P300-based brain-computer interface (BCI) systems from moving from research laboratories to end users' homes. An innovative asynchronous classifier has been defined and validated. It relies on the introduction of a set of thresholds in the classifier, and these thresholds have been assessed considering the distributions of the score values relating to target stimuli, non-target stimuli, and epochs of voluntary no-control. With the asynchronous classifier, a P300-based BCI system can adapt its speed to the current state of the user and can automatically suspend control when the user diverts attention from the stimulation interface. Since EEG signals are non-stationary and show inherent variability, in order to make long-term use of BCI possible it is important to track changes in ongoing EEG activity and to adapt the BCI model parameters accordingly. To this aim, the asynchronous classifier has been subsequently improved by introducing a self-calibration algorithm for the continuous and unsupervised recalibration of the subjective control parameters. Finally, an index for the online monitoring of EEG quality has been defined and validated in order to detect potential problems and system failures. This thesis ends with the description of a translational work involving end users (people with amyotrophic lateral sclerosis, ALS). Focusing on the concepts of the user-centered design approach, the phases relating to the design, development, and validation of an innovative assistive device are described. The proposed assistive technology (AT) has been specifically designed to meet the needs of people with ALS during the different phases of the disease (i.e., the degree of motor impairment). Indeed, the AT can be accessed with several input devices, either conventional (mouse, touchscreen) or alternative (switches, head tracker), up to a P300-based BCI.
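A hedged sketch of the thresholding idea described above: the classifier scores of the candidate items are compared against a subject-specific threshold, and the system emits no selection when no score clears it. The threshold value and score format are invented for illustration.

```python
# Sketch of an asynchronous P300 decision rule: select an item only when the
# best classifier score clears a subject-specific threshold; otherwise treat
# the epoch as voluntary no-control. Values are illustrative.
def asynchronous_decision(scores, threshold):
    """scores: dict mapping each stimulus/item to its classifier score."""
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None   # None = suspend output

print(asynchronous_decision({'A': 0.2, 'B': 1.7, 'C': 0.4}, 1.0))  # 'B'
print(asynchronous_decision({'A': 0.2, 'B': 0.6, 'C': 0.4}, 1.0))  # None
```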
Abstract:
The monitoring of cognitive functions aims at gaining information about the current cognitive state of the user by decoding brain signals. In recent years, this approach has provided valuable information about the cognitive aspects of human interaction with the external world. From this consideration, researchers started to consider passive applications of brain-computer interfaces (BCIs) in order to provide a novel input modality for technical systems based solely on brain activity. The objective of this thesis is to demonstrate how passive BCI applications can be used to assess the mental states of users in order to improve human-machine interaction. Two main studies have been proposed. The first investigates whether morphological variations of event-related potentials (ERPs) can be used to predict users' mental states (e.g., attentional resources, mental workload) during different reactive BCI tasks (e.g., P300-based BCIs), and whether this information can predict the subjects' performance on those tasks. In the second study, a passive BCI system able to estimate the mental workload of the user online, relying on the combination of EEG and ECG biosignals, is proposed. The latter study was performed by simulating an operative scenario in which errors or lapses in performance could have significant consequences. The results showed that the proposed system can estimate the subjects' mental workload online, discriminating three different difficulty levels of the tasks with high reliability.
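A hedged sketch of one common way to discriminate workload levels from combined EEG and ECG features (a standard classifier over band-power and heart-rate features); the feature choices, the synthetic data, and the use of scikit-learn are assumptions for illustration, not the thesis' pipeline.

```python
# Illustrative three-level workload classifier on placeholder EEG/ECG features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Placeholder features per trial: [frontal theta power, alpha power, heart rate].
X = np.vstack([rng.normal(loc=lvl, size=(30, 3)) for lvl in (0.0, 1.0, 2.0)])
y = np.repeat([0, 1, 2], 30)                     # low / medium / high workload

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel='rbf'))
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```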
Abstract:
A field of computational neuroscience develops mathematical models to describe neuronal systems, with the aim of better understanding the nervous system. Historically, the integrate-and-fire model, developed by Lapicque in 1907, was the first model describing a neuron. In 1952 Hodgkin and Huxley [8] described the so-called Hodgkin-Huxley model in the article “A Quantitative Description of Membrane Current and Its Application to Conduction and Excitation in Nerve”. The Hodgkin-Huxley model is one of the most successful and widely used biological neuron models. Based on experimental data from the squid giant axon, Hodgkin and Huxley developed their mathematical model as a four-dimensional system of first-order ordinary differential equations. One of these equations characterizes the membrane potential as a process in time, whereas the other three depict the opening and closing states of the sodium and potassium ion channels. The rate of change of the membrane potential is proportional to the sum of the ionic currents flowing across the membrane and an externally applied current. For various types of external input the membrane potential behaves differently. This thesis considers the following three types of input: (i) Rinzel and Miller [15] calculated an interval of amplitudes for a constant applied current in which the membrane potential spikes repetitively; (ii) Aihara, Matsumoto and Ikegaya [1] showed that, depending on the amplitude and frequency of a periodic applied current, the membrane potential responds periodically; (iii) Izhikevich [12] stated that brief pulses of positive and negative current with different amplitudes and frequencies can lead to a periodic response of the membrane potential. In chapter 1 the Hodgkin-Huxley model is introduced following Izhikevich [12]. Besides the definition of the model, several biological and physiological notes are made, and further concepts are described by examples. Moreover, the numerical methods used to solve the equations of the Hodgkin-Huxley model in the computer simulations of chapters 2 and 3 are presented. In chapter 2 the statements for the three different inputs (i), (ii) and (iii) are verified, and the periodic behavior for inputs (ii) and (iii) is investigated. In chapter 3 the inputs are embedded in an Ornstein-Uhlenbeck process to see the influence of noise on the results of chapter 2.
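Since the abstract summarizes the model's structure, a minimal simulation sketch may help: the code below integrates the four Hodgkin-Huxley equations with forward Euler under a constant applied current, i.e., input type (i) above. The parameter values are the standard squid-axon set from the literature, not necessarily those used in the thesis.

```python
# Minimal Hodgkin-Huxley simulation with a constant applied current
# (forward Euler; standard squid-axon parameters, modern sign convention).
import numpy as np

C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3     # uF/cm^2 and mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.4           # reversal potentials, mV

def a_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def b_m(V): return 4.0 * np.exp(-(V + 65.0) / 18.0)
def a_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def b_h(V): return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def a_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def b_n(V): return 0.125 * np.exp(-(V + 65.0) / 80.0)

dt, T, I_ext = 0.01, 100.0, 10.0           # ms, ms, uA/cm^2 (constant input)
V, m, h, n = -65.0, 0.05, 0.6, 0.32        # typical resting-state values
spikes = 0
for _ in range(int(T / dt)):
    I_ion = gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK) + gL * (V - EL)
    V_new = V + dt * (I_ext - I_ion) / C        # membrane equation
    m += dt * (a_m(V) * (1 - m) - b_m(V) * m)   # gating variables
    h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
    n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
    if V < 0.0 <= V_new:                   # upward zero crossing = spike
        spikes += 1
    V = V_new
print(f"spikes in {T:.0f} ms: {spikes}")   # repetitive spiking at this I_ext
```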
Abstract:
A brain-computer interface (BCI) is a direct communication system between the brain and an external device that does not depend on the brain's normal output pathways of peripheral nerves or muscles. The signal generated by the user is acquired by dedicated sensors, then processed and classified, extracting the information of interest, which is then used to produce an output returned to the user as feedback. BCI technology finds interesting applications in the biomedical field, where it can be of great help to people affected by paralysis, but other uses should not be excluded. This thesis focuses in particular on the hardware components of a brain-computer interface, analyzing the merits and drawbacks of the various options: above all the choice of the equipment for detecting brain activity and of the mechanisms through which BCI users can interact with the surrounding environment (the so-called actuators). The choices are made taking the users' needs into account, so as to reduce costs and risks while increasing the number of users who can actually benefit from a brain-computer interface.
Abstract:
A brain-computer interface is a direct link between the brain and a machine, whether a computer or any other external device, without the use of muscles. Thanks to sensors applied to the scalp, the patient's brain signals are detected, processed, classified (by means of a computer), and finally sent as output to an external device. Through the use of BCIs, people with severe motor or communication disabilities (for example, ALS patients or people affected by locked-in syndrome) have the possibility of improving their quality of life. The goal of this thesis is to provide an overview of the brain-computer interface field, presenting the existing types and attempting a critical analysis of the pros and cons of each application, with particular attention to the use of electroencephalography as the tool for acquiring the interface's input signals.
Abstract:
A patient-specific surface model of the proximal femur plays an important role in planning and supporting various computer-assisted surgical procedures including total hip replacement, hip resurfacing, and osteotomy of the proximal femur. The common approach to derive 3D models of the proximal femur is to use imaging techniques such as computed tomography (CT) or magnetic resonance imaging (MRI). However, the high logistic effort, the extra radiation (CT-imaging), and the large quantity of data to be acquired and processed make them less functional. In this paper, we present an integrated approach using a multi-level point distribution model (ML-PDM) to reconstruct a patient-specific model of the proximal femur from intra-operatively available sparse data. Results of experiments performed on dry cadaveric bones using dozens of 3D points are presented, as well as experiments using a limited number of 2D X-ray images, which demonstrate promising accuracy of the present approach.
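A hedged sketch of the core fitting step in a point distribution model: given a mean shape and PCA modes learned from training femurs, solve a least-squares problem for the mode weights that best explain the sparse digitized points. Point-to-vertex correspondences are assumed known here, which the full ML-PDM pipeline has to establish itself.

```python
# Sketch: fit PCA shape modes to sparse corresponding 3D points (least squares).
# Correspondences between digitized points and model vertices are assumed.
import numpy as np

def fit_pdm(mean_shape, modes, sparse_idx, sparse_pts):
    """mean_shape: (N, 3); modes: (K, N, 3) PCA modes; sparse_idx: indices of
    the digitized model vertices; sparse_pts: (len(sparse_idx), 3) points."""
    A = modes[:, sparse_idx, :].reshape(len(modes), -1).T   # (3*S, K)
    b = (sparse_pts - mean_shape[sparse_idx]).ravel()       # residual to mean
    w, *_ = np.linalg.lstsq(A, b, rcond=None)               # mode weights
    return mean_shape + np.tensordot(w, modes, axes=1)      # full (N, 3) surface
```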