Abstract:
Children are an especially vulnerable population, particularly with respect to drug administration. Neonatal and pediatric patients are estimated to be at least three times more vulnerable than adults to harm from adverse events and medication errors. This framework is intended to provide a Clinical Decision Support System based on a prototype already tested in a real environment. The framework will include features such as the preparation of Total Parenteral Nutrition prescriptions, tables of pediatric and neonatal emergency drugs, clinical morbidity and mortality scores, anthropometry percentiles (weight, length/height, head circumference and BMI), utilities to support medical decisions on the treatment of neonatal jaundice and anemia, support for technical procedures, and other calculators and widely used tools. The solution under development is an extension of the INTCare project. The main goal is to make this functionality available at all times during clinical practice and also outside the hospital environment, for dissemination, education and the simulation of hypothetical situations. A further aim is to develop an area for the study and analysis of information and the extraction of knowledge from the data collected through the use of the system. This paper presents the architecture, its requirements and functionalities, and a SWOT analysis of the proposed solution.
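The anthropometry percentile utilities mentioned above can be illustrated with a minimal, hedged sketch: growth references such as the WHO standards are commonly expressed as LMS parameters, from which a measurement is converted to a z-score and then to a percentile. The function names and the LMS values below are illustrative assumptions, not taken from the framework or from real reference tables.

```python
# Illustrative sketch only: LMS-based z-score/percentile computation of the kind a
# pediatric anthropometry module might use. The L, M, S values are placeholders,
# not real WHO reference data, and the function names are hypothetical.
from math import erf, log, sqrt

def lms_zscore(value: float, L: float, M: float, S: float) -> float:
    """Convert a measurement to a z-score using the LMS method."""
    if L == 0:
        return log(value / M) / S
    return ((value / M) ** L - 1.0) / (L * S)

def zscore_to_percentile(z: float) -> float:
    """Standard normal CDF expressed as a percentile (0-100)."""
    return 50.0 * (1.0 + erf(z / sqrt(2.0)))

if __name__ == "__main__":
    # Placeholder LMS parameters for a single age/sex entry (illustrative values only).
    L_ref, M_ref, S_ref = 0.3, 16.0, 0.08   # e.g. a BMI reference at some age
    bmi = 17.2
    z = lms_zscore(bmi, L_ref, M_ref, S_ref)
    print(f"z-score = {z:.2f}, percentile = {zscore_to_percentile(z):.1f}")
```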
Abstract:
The MAP-i Doctoral Program of the Universities of Minho, Aveiro and Porto.
Abstract:
Kidney failure means that the kidneys have unexpectedly stopped functioning; once chronic disease is detected, the presence or degree of kidney dysfunction and its progression must be assessed, and the underlying syndrome has to be diagnosed. Although the patient's history and physical examination are good practice, key information has to be obtained from the estimation of the glomerular filtration rate and the analysis of serum biomarkers. Chronic kidney disease denotes abnormal kidney function and/or structure, and there is evidence that treatment may prevent or delay its progression, either by reducing or preventing the development of associated complications, namely hypertension, obesity, diabetes mellitus, and cardiovascular complications. Acute kidney injury appears abruptly, with a rapid deterioration of renal function, but is often reversible if it is recognized early and treated promptly. In both situations, i.e., acute kidney injury and chronic kidney disease, an early intervention can significantly improve the prognosis. The assessment of these pathologies is therefore mandatory, although it is hard to accomplish with traditional methodologies and existing problem-solving tools. Hence, in this work, we focus on the development of a hybrid decision support system whose knowledge representation and reasoning procedures are based on Logic Programming, allowing one to handle incomplete, unknown, and even contradictory information, complemented with a computational approach centered on Artificial Neural Networks in order to weigh the Degree-of-Confidence one has in such an outcome. The present study involved 558 patients with a mean age of 51.7 years, and chronic kidney disease was observed in 175 cases. The dataset comprises twenty-four variables, grouped into five main categories. The proposed model showed good performance in the diagnosis of chronic kidney disease, with sensitivity and specificity ranging between 93.1-94.9% and 91.9-94.2%, respectively.
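A minimal sketch of the Artificial Neural Network side of such a hybrid system is given below, together with the sensitivity/specificity computation reported in the abstract. The Logic Programming knowledge representation and reasoning layer is not reproduced; scikit-learn, the synthetic data, and all parameter choices are assumptions for illustration only.

```python
# Minimal sketch of the ANN classification step on synthetic data; only the
# sensitivity/specificity evaluation mirrors the figures reported in the abstract.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n_patients, n_features = 558, 24          # mirrors the study's dimensions
X = rng.normal(size=(n_patients, n_features))
# Synthetic label: a simple function of two features plus noise (illustrative only).
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_patients) > 0.8).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(12,), max_iter=2000, random_state=0).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
sensitivity = tp / (tp + fn)   # true positive rate
specificity = tn / (tn + fp)   # true negative rate
print(f"sensitivity = {sensitivity:.3f}, specificity = {specificity:.3f}")
```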
Abstract:
This paper presents a computing simulation of an offshore wind energy system that takes into account the influence of marine wave action on the floating platform. The wind energy system has a variable-speed turbine equipped with a permanent magnet synchronous generator and a full-power five-level converter, injecting energy into the electric grid through a high-voltage alternating current link. A reduction of the voltage unbalance in the DC-link capacitors of the five-level converter is proposed through a strategic selection of the output voltage vectors. The drive train of the wind energy system is described by a two-mass model that includes the dynamics of the floating platform. A case study is presented and the quality of the energy injected into the electric grid is assessed and discussed.
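For reference, a commonly used formulation of a two-mass drive-train model is sketched below; the symbols and the exact form are assumptions and need not match the formulation adopted in the paper.

```latex
% Illustrative two-mass drive-train equations (symbols are assumptions):
% J_t, J_g turbine/generator inertias; T_t aerodynamic torque; T_g generator torque;
% K_s shaft stiffness; D_s shaft damping; theta_s shaft twist angle.
\[
\begin{aligned}
J_t \,\dot{\omega}_t &= T_t - K_s\,\theta_s - D_s\,(\omega_t - \omega_g) \\
J_g \,\dot{\omega}_g &= K_s\,\theta_s + D_s\,(\omega_t - \omega_g) - T_g \\
\dot{\theta}_s &= \omega_t - \omega_g
\end{aligned}
\]
```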
Abstract:
This workshop aims at stimulating children's oral language skills by involving them in playing and creating different language games and activities using the t-stories interface. The interface allows audio to be recorded and played on the stories' modules, as well as recorded and played through identification with NFC tags, which can be used as stickers on objects, paper, or other materials and placed in different locations. After the presentation of t-stories by the workshop facilitators, children will have the opportunity to explore the interface on their own; they will then be asked to participate in different language games in which they actively create their own content. Afterwards, children will be challenged to imagine and create activities for their peers.
Abstract:
OBJECTIVE: In this study we aim to characterize a sample of 85 pregnant crack addicts admitted for detoxification in a psychiatric inpatient unit. METHOD: Cross-sectional study. Sociodemographic, clinical, obstetric and lifestyle information was evaluated. RESULTS: Age of onset for crack use varied from 11 to 35 years (median = 21). Approximately 25% of the patients smoked more than 20 crack rocks in a typical day of use (median = 10; min-max = 1-100). Tobacco (89.4%), alcohol (63.5%) and marijuana (51.8%) were the drugs other than crack most commonly used. Robbery was reported by 32 patients (41.2%), imprisonment by 21 (24.7%), trading sex for money/drugs by 38 (44.7%), and leaving home by 33 (38.8%); 15.3% were positive for HIV, 5.9% for HCV, 1.2% for HBV and 8.2% for syphilis. After discharge from the psychiatric unit, only 25% of the sample followed the proposed treatment in the chemical dependency outpatient service. CONCLUSION: High rates of risky behaviors for STDs, as well as high rates of maternal HIV and syphilis, were found. Moreover, the high rates of concurrent use of other drugs and involvement in illegal activities reflect these patients' chaotic lifestyles. Prevention and intervention programs need to be developed to address the multifactorial nature of this problem.
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
The development of ubiquitous computing (ubicomp) environments raises several challenges in terms of their evaluation. Ubicomp virtual reality prototyping tools enable users to experience the system to be developed and are of great help in facing those challenges, as they support developers in assessing the consequences of a design decision in the early phases of development. Given the situated nature of ubicomp environments, a particular issue to consider is the level of realism provided by the prototypes. This work presents a case study in which two ubicomp prototypes featuring different levels of immersion (desktop-based versus CAVE-based) were developed and compared. The goal was to determine the cost/benefit relation of both solutions, which one provided better user experience results, and whether simpler solutions provide the same user experience results as more elaborate ones.
Abstract:
In this study, a mathematical model for the production of fructo-oligosaccharides (FOS) by Aureobasidium pullulans is developed. The model contains a relatively large set of unknown parameters, and the identification problem is analyzed using both simulation data and experimental data. Batch experiments were not sufficiently informative to uniquely estimate all the unknown parameters; thus, additional experiments have to be performed in fed-batch mode to supply the missing information.
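The identification workflow described above can be sketched as follows: simulate a candidate kinetic model and fit its parameters to (here synthetic) batch data by nonlinear least squares. The toy substrate-to-product Monod-type model, the scipy usage, and the parameter names are illustrative assumptions; the actual FOS model is not reproduced in the abstract.

```python
# Generic sketch of a parameter-identification workflow with a toy kinetic model
# standing in for the real FOS model; all names and values are illustrative.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def model(t, y, vmax, km):
    s, p = y                                   # substrate, product concentrations
    rate = vmax * s / (km + s)                 # Monod/Michaelis-Menten-type kinetics
    return [-rate, rate]

def simulate(params, t_eval, y0):
    vmax, km = params
    sol = solve_ivp(model, (t_eval[0], t_eval[-1]), y0, t_eval=t_eval, args=(vmax, km))
    return sol.y

# Synthetic "batch experiment" data generated from known parameters plus noise.
t_obs = np.linspace(0, 10, 15)
y0 = [50.0, 0.0]
true_params = (4.0, 8.0)
rng = np.random.default_rng(1)
data = simulate(true_params, t_obs, y0) + rng.normal(scale=0.5, size=(2, t_obs.size))

def residuals(params):
    return (simulate(params, t_obs, y0) - data).ravel()

fit = least_squares(residuals, x0=[1.0, 1.0], bounds=([0, 0], [np.inf, np.inf]))
print("estimated vmax, km:", fit.x)
```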
Abstract:
OBJECTIVE: To evaluate the characteristics of patients receiving medical care in the Ambulatory of Hypertension of the Emergency Department, Division of Cardiology, and in the Emergency Unit of the Clinical Hospital of the Ribeirão Preto Medical School. METHODS: Using a protocol, we compared the care provided to the same hypertensive patients on different occasions in the 2 different places. The characteristics of 62 patients, 29 of them men, with a mean age of 57 years, were analyzed between January 1996 and December 1997. RESULTS: The care of these patients resulted in different medical treatment regardless of their clinical features and blood pressure levels. In the Emergency Unit, 97% presented with symptoms, and 64.5% received medication to rapidly reduce blood pressure; in 50% of the cases, nifedipine SL was the chosen medication. Patients who attended the Ambulatory of Hypertension presenting similar features, or in some cases even higher blood pressure levels, were not prescribed medication for a rapid reduction of blood pressure at any of the appointments. CONCLUSION: The therapeutic approach to patients with high blood pressure levels, symptomatic or asymptomatic, depended on the place of treatment. In the Emergency Unit, the conduct was, in the majority of cases, to decrease blood pressure immediately, whereas in the Ambulatory of Hypertension the same blood pressure levels, in the same individuals, resulted in therapeutic adjustment with nonpharmacological management. These results show the need to reconsider the concept of hypertensive crises and its therapeutic implications.
Abstract:
OBJECTIVE: To evaluate the clinical and evolutive characteristics of patients admitted to an intensive care unit after cardiopulmonary resuscitation, identifying prognostic factors for survival. METHODS: A retrospective study of 136 patients admitted between 1995 and 1999 to an intensive care unit, evaluating clinical conditions, mechanisms and causes of cardiopulmonary arrest, and their relation to hospital mortality. RESULTS: A 76% mortality rate, independent of age and sex, was observed. Asystole was the most frequent mechanism of arrest, and isolated respiratory arrest was the least frequent. Cardiac failure, need for mechanical ventilation, cirrhosis and previous stroke were clinically significant (p < 0.01) factors for death. CONCLUSION: Prognostic factors can supplement the physician's decision as to whether or not a patient will benefit from cardiopulmonary resuscitation.
Abstract:
One of the central themes of the project concerns the nature of computer science. The recent emergence of this discipline, together with its hybrid origin as both a formal science and a technological discipline, means that its characterization is not yet complete, let alone agreed upon by the scientists in the field. In the paper Three Paradigms of Computer Science, A. Eden presents three admittedly exaggerated positions on how to understand the object of study (ontology), the working methods (methodology), and the structure of the theory and the justifications of computing knowledge (epistemology): the so-called rationalist position, based on the idea that programs are logical formulas and that the way of working is deductive; the technocratic position, which presents computer science as an engineering discipline; and the position there called scientific, which assimilates computing to the empirical sciences. Some of the problems of computer science are related to questions in the philosophy of mathematics, in particular the relation between abstract entities and the world. However, the prescriptive character of the axioms and theorems of programming theories may allow alternative interpretations and would strongly call into question the possibility of regarding computer science as an empirical science, at least in the traditional sense. On the other hand, the kind of analysis of computer science proposed in this project may contribute new ideas for thinking about problems in the philosophy of mathematics. An example of such possible contributions can be seen in Arkoudas' paper Computers, Justification, and Mathematical Knowledge, which sheds new light on the problem of the meaning of mathematical proofs. The objectives of the project are: to characterize the field of computer science; to evaluate the ontological, epistemological and methodological foundations of current computer science; and to analyze the relations between the different heuristic and epistemic perspectives and programming practices.
Abstract:
Today's advances in high-performance computing are driven by the parallel processing capabilities of available hardware architectures. These architectures enable the acceleration of algorithms when the algorithms are properly parallelized and exploit the specific processing power of the underlying architecture. Most current general-purpose processors integrate several processor cores on a single chip, resulting in what is known as a Symmetric Multiprocessing (SMP) unit; nowadays even desktop computers use multicore processors, and the industry trend is to increase the number of integrated processor cores as technology matures. On the other hand, Graphics Processor Units (GPU), originally designed to handle only video processing, have emerged as interesting alternatives for algorithm acceleration; currently available GPUs can run on the order of 200 to 400 threads in parallel. Scientific computing can be implemented on this hardware thanks to the programmability of new GPUs, which have been denoted General Processing Graphics Processor Units (GPGPU). However, GPGPUs offer little memory compared with general-purpose processors, so the implementation of algorithms needs to be addressed carefully. Finally, Field Programmable Gate Arrays (FPGA) are programmable devices that can implement hardware logic with low latency, high parallelism and deep pipelines. These devices can be used to implement specific algorithms that need to run at very high speeds; however, they are harder to program than software approaches and debugging is typically time-consuming. In this context, where several alternatives for speeding up algorithms are available, our work aims at determining the main features of these architectures and developing the know-how required to accelerate algorithm execution on them, identifying which algorithms fit best on a given architecture and how the architectures can be combined so that they complement one another. Specifically, starting from the hardware characteristics, we aim to determine the properties a parallel algorithm must have in order to be accelerated; the characteristics of the parallel algorithms will in turn determine which of these new types of hardware is most suitable for their implementation. In particular, we will take into account the level of data dependency, the need for synchronization during parallel processing, the size of the data to be processed, and the complexity of parallel programming on each type of hardware.
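Of the three targets discussed above, the SMP case is the easiest to illustrate; the hedged sketch below shows data-parallel execution of an embarrassingly parallel kernel across the available cores using Python's standard multiprocessing module. The choice of language, workload, and chunking is illustrative and not part of the project.

```python
# Illustrative sketch of SMP data parallelism only; GPGPU and FPGA implementations
# require vendor-specific toolchains. The workload and chunking are arbitrary examples.
from multiprocessing import Pool, cpu_count

def partial_sum(bounds):
    """A CPU-bound kernel with no data dependencies between chunks."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, cpu_count()
    step = n // workers
    chunks = [(w * step, n if w == workers - 1 else (w + 1) * step) for w in range(workers)]
    with Pool(processes=workers) as pool:
        total = sum(pool.map(partial_sum, chunks))   # chunks run on separate cores
    print(total)
```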
Abstract:
Monitoring, object-orientation, real-time, execution-time, scheduling
Abstract:
Background: Recent studies have suggested that B-type Natriuretic Peptide (BNP) is an important predictor of ischemia and death in patients with suspected acute coronary syndrome. Increased levels of BNP are seen after episodes of myocardial ischemia and may be related to future adverse events. Objectives: To determine the prognostic value of BNP for major cardiac events and to evaluate its association with ischemic findings on myocardial perfusion scintigraphy (MPS). Methods: This retrospective study included 125 patients admitted to the chest pain unit between 2002 and 2006 who had their BNP levels measured on admission and underwent MPS for risk stratification. BNP values were compared with the results of the MPS. The chi-square test was used for qualitative variables and the Student t test for quantitative variables. Survival curves were constructed using the Kaplan-Meier method and analyzed by Cox regression. The significance level was 5%. Results: The mean age was 63.9 ± 13.8 years, and males represented 51.2% of the sample. Ischemia was found in 44% of the MPS. The mean BNP level was higher in patients with ischemia than in patients with non-ischemic MPS (188.3 ± 208.7 versus 131.8 ± 88.6; p = 0.003). A BNP level greater than 80 pg/mL was the strongest predictor of ischemia on MPS (sensitivity = 60%, specificity = 70%, accuracy = 66%, PPV = 61%, NPV = 70%) and predicted medium-term mortality (RR = 7.29, 95% CI: 0.90-58.6; p = 0.045) independently of the presence of ischemia. Conclusions: BNP levels are associated with ischemic MPS findings and an adverse prognosis in patients presenting with acute chest pain to the emergency room, thus providing important prognostic information about an unfavorable clinical outcome.