914 results for Monitoring system


Relevance:

30.00%

Publisher:

Abstract:

Our efforts are directed towards understanding the coscheduling mechanism in a NOW system when a parallel job is executed jointly with local workloads, balancing parallel performance against local interactive response. Explicit and implicit coscheduling techniques have been implemented in a PVM-Linux NOW (or cluster). Furthermore, dynamic coscheduling remains an open question when parallel jobs are executed in a non-dedicated cluster. A basic model for dynamic coscheduling in cluster systems is presented in this paper, and a dynamic coscheduling algorithm for this model is proposed. The applicability of this algorithm has been proved and its performance analyzed by simulation. Finally, a new tool (named Monito) for monitoring the different message queues in such environments is presented. The main aim of implementing this facility is to provide a means of capturing the bottlenecks and overheads of the communication system in a PVM-Linux cluster.
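The kind of queue monitoring Monito performs can be illustrated with a small sketch (the queue names and sample values below are hypothetical, not from the paper): sample the length of each communication queue over time and flag the one with the highest mean occupancy as the likely bottleneck.

```python
# Illustrative sketch of message-queue occupancy monitoring; the queue that
# stays fullest on average is the most likely communication bottleneck.
from statistics import mean

def bottleneck(samples):
    """samples: dict mapping queue name -> list of sampled queue lengths.
    Returns the name of the busiest queue and the per-queue mean lengths."""
    means = {name: mean(lengths) for name, lengths in samples.items()}
    return max(means, key=means.get), means

# Example with invented data: the PVM daemon's receive queue dominates.
name, _ = bottleneck({
    "pvmd_send": [0, 1, 0, 2, 1],
    "pvmd_recv": [5, 7, 6, 8, 9],
    "task_frag": [1, 0, 1, 1, 0],
})
# name == "pvmd_recv"
```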

Relevance:

30.00%

Publisher:

Abstract:

The control of the correct application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully or semi-autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, so the specialized domain agents are independent of the negotiation processes, and autonomous system agents perform the monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, which is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity and authentication during the process of exchanging information between agents.

Relevance:

30.00%

Publisher:

Abstract:

Dynamic adaptations of one's behavior by means of performance monitoring are a central function of the human executive system and show considerable interindividual variation. Converging evidence from electrophysiological and neuroimaging studies in both animals and humans hints at the importance of the dopaminergic system for the regulation of performance monitoring. Here, we studied the impact of two polymorphisms affecting dopaminergic functioning in the prefrontal cortex [catechol-O-methyltransferase (COMT) Val108/158Met and dopamine D4 receptor (DRD4) single-nucleotide polymorphism (SNP)-521] on neurophysiological correlates of performance monitoring. We applied a modified version of a standard flanker task with an embedded stop-signal task to tap into the different functions involved, particularly error monitoring, conflict detection and inhibitory processes. Participants homozygous for the DRD4 T allele produced an increased error-related negativity after both choice errors and failed inhibitions compared with C-homozygotes. This was associated with pronounced compensatory behavior reflected in higher post-error slowing. No group differences were seen in the incompatibility N2, suggesting distinct effects of the DRD4 polymorphism on error monitoring processes. Additionally, participants homozygous for the COMT Val allele, with a thereby diminished prefrontal dopaminergic level, revealed increased prefrontal processing related to inhibitory functions, reflected in the enhanced stop-signal-related components N2 and P3a. The results extend previous findings from mainly behavioral and neuroimaging data on the relationship between dopaminergic genes and executive functions, and suggest possible underlying mechanisms for the previously proposed association between these dopaminergic polymorphisms and psychiatric disorders such as schizophrenia or attention deficit hyperactivity disorder.
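Post-error slowing, the compensatory behavior mentioned above, is conventionally computed from trial-level data as the mean reaction time after errors minus the mean reaction time after correct responses. A minimal sketch with invented data:

```python
def post_error_slowing(rts, correct):
    """Mean RT on correct trials that follow an error, minus mean RT on
    correct trials that follow a correct response (positive = slowing)."""
    after_err = [rts[i + 1] for i in range(len(rts) - 1)
                 if not correct[i] and correct[i + 1]]
    after_cor = [rts[i + 1] for i in range(len(rts) - 1)
                 if correct[i] and correct[i + 1]]
    return (sum(after_err) / len(after_err)
            - sum(after_cor) / len(after_cor))

# Invented flanker-task data: RTs in ms, the third trial is an error.
rts = [400, 380, 350, 520, 510, 400]
correct = [True, True, False, True, True, True]
pes = post_error_slowing(rts, correct)  # 520 - (380 + 510 + 400)/3 = 90.0
```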

Relevance:

30.00%

Publisher:

Abstract:

An important task in environmental monitoring is to assess the current state of the environment and the changes caused to it by human activity, and to analyze and identify the relationships between them. Environmental change can be managed by collecting and analyzing data. This master's thesis studies changes observed in aquatic vegetation using remotely sensed measurement data and image analysis methods. Aerial images of Lake Saimaa, the largest lake in Finland, taken in 1996 and 1999 were used for the environmental monitoring. The first step of the image analysis is geometric correction, whose purpose is to align the images and register them to the same coordinate system. The second step is to match the corresponding local areas and to detect changes in the vegetation. Various approaches to vegetation detection were used, including supervised and unsupervised classification methods. The study used real, noisy measurement data, and the experiments conducted on it gave good results regarding the success of the study.
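After geometric correction, the simplest unsupervised change-detection step can be sketched as per-pixel differencing of the two co-registered images. This is a toy illustration, not the thesis's actual classifier, and the threshold value is an assumption:

```python
def change_mask(img_1996, img_1999, threshold=20):
    """Mark pixels whose intensity changed by more than `threshold` between
    two co-registered images (given as equal-sized 2-D lists)."""
    return [[abs(a - b) > threshold for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_1996, img_1999)]

# Toy 2x3 images: only one pixel in the top row changed markedly.
mask = change_mask([[10, 10, 10], [10, 10, 10]],
                   [[12, 90, 10], [10, 15, 10]])
# mask == [[False, True, False], [False, False, False]]
```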

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: Adequate empirical antibiotic dose selection for critically ill burn patients is difficult due to extreme variability in drug pharmacokinetics. Therapeutic drug monitoring (TDM) may aid antibiotic prescription and implementation of initial empirical antimicrobial dosage recommendations. This study evaluated how gradual TDM introduction altered empirical dosages of meropenem and imipenem/cilastatin in our burn ICU. METHODS: Imipenem/cilastatin and meropenem use and daily empirical dosage at a five-bed burn ICU were analyzed retrospectively. Data for all burn admissions between 2001 and 2011 were extracted from the hospital's computerized information system. For each patient receiving a carbapenem, episodes of infection were reviewed and scored according to predefined criteria. Carbapenem trough serum levels were characterized. Prior to May 2007, TDM was available only by special request. Real-time carbapenem TDM was introduced in June 2007; it was initially available weekly and has been available 4 days a week since 2010. RESULTS: Of 365 patients, 229 (63%) received antibiotics (109 received carbapenems). Of 23 TDM determinations for imipenem/cilastatin, none exceeded the predefined upper limit and 11 (47.8%) were insufficient; the number of TDM requests was correlated with daily dose (r=0.7). Similar numbers of inappropriate meropenem trough levels (30.4%) were below and above the upper limit. Real-time TDM introduction increased the empirical dose of imipenem/cilastatin, but not meropenem. CONCLUSIONS: Real-time carbapenem TDM availability significantly altered the empirical daily dosage of imipenem/cilastatin at our burn ICU. Further studies are needed to evaluate the individual impact of TDM-based antibiotic adjustment on infection outcomes in these patients.
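The reported relationship between the number of TDM requests and the daily dose (r = 0.7) is an ordinary Pearson correlation; for reference, it can be computed as follows. The data in the example is invented, not taken from the study:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented monthly TDM request counts vs. mean daily dose (g/day):
r = pearson_r([2, 4, 5, 7, 9], [1.5, 2.0, 2.1, 2.6, 3.0])
```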

Relevance:

30.00%

Publisher:

Abstract:

The need for this work arose from problems encountered in message server applications. Message server applications are used to send and receive messages in a paper industry sales and distribution system from geographically separated paper mills. The proper functioning of the message server applications is essential for the functioning of the whole system, because these servers process thousands of messages containing significant system information every day. In this work, possible implementation techniques were studied, and based on these studies, tools for testing and monitoring message server applications were implemented. In the study of application architecture techniques, the investigation was limited to three-tier architecture, in particular the TUXEDO(TM) system, because the implemented application is used in a distributed application environment. For the client implementation, XML technology and Microsoft Visual C++ used with TietoEnator's Phobos Interactive C++ class library were studied and compared. These techniques were examined to the extent needed for viewing message strings. XML database techniques were studied as a possible alternative to a database and application server technique. The first goal of the work was to implement a tool for testing message server applications. The second goal was to implement a tool for monitoring the correctness of message contents. The third goal was to analyze the existing message error monitoring application and develop it further. As the result of this master's thesis, an application for testing and monitoring message server applications was implemented. Of the studied client techniques, MS Visual C++ with the Phobos Interactive C++ class library was chosen as the implementation technique because of its familiarity. The three-tier TUXEDO(TM) technique was chosen as the application architecture. In addition, improvements to the existing message error monitoring functions were identified. The studied implementation techniques are general and can be used when implementing similar applications for similar application environments.

Relevance:

30.00%

Publisher:

Abstract:

The increase in seafood production, especially in mariculture worldwide, has brought out the need for continued monitoring of shellfish production areas in order to ensure safety for human consumption. The purpose of this research was to evaluate contamination by pathogenic protozoa, viruses and bacteria in oysters before and after a UV depuration procedure and in brackish waters at all stages of cultivation and treatment, and to enumerate microbiological indicators of fecal contamination from the production site up to the depuration site in an oyster cooperative located in a Southeastern estuarine area of Brazil. Oysters and brackish water were collected monthly from September 2009 to November 2010. Four sampling sites were selected for enteropathogen analysis: site 1, oyster growth; site 2, catchment water (before the UV depuration procedure); site 3, filtration stage of water treatment (only for protozoa analysis); and site 4, the oysters' depuration tank. Three microbiological indicators were examined at sites 1, 2 and 4. The following pathogenic microorganisms were searched for: Giardia cysts, Cryptosporidium oocysts, Human Adenovirus (HAdV), Hepatitis A virus (HAV), Human Norovirus (HNoV) (genogroups I and II), JC strain Polyomavirus (JCPyV) and Salmonella sp. Analysis consisted of molecular detection (qPCR) for viruses (oyster and water samples); immunomagnetic separation followed by a direct immunofluorescence assay for Cryptosporidium oocysts and Giardia cysts, with additional molecular detection (PCR) for the latter (oyster and water samples); and a commercial kit (Reveal-Neogen®) for Salmonella analysis (oysters). Giardia was the most prevalent pathogen in all sites where it was detected: 36.3%, 18.1%, 36.3% and 27.2% of water from sites 1, 2, 3 and 4 respectively; 36.3% of oysters from site 1 and 54.5% of depurated oysters were harboring Giardia cysts. The large majority of contaminated samples were classified as Giardia duodenalis. HAdV was detected in water and oysters from the growth site, and HNoV GI in two batches of oysters (site 1) at huge concentrations (2.11 x 10(13) and 3.10 x 10(12) gc/g). In the depuration tank site, Salmonella sp., HAV (4.84 x 10(3)) and HNoV GII (7.97 x 10(14)) were each detected once in different batches of oysters. Cryptosporidium spp. oocysts were present in 9.0% of water samples from site 4. These results reflect the contamination of oysters even when UV depuration procedures are employed in this shellfish treatment plant. Moreover, a molecular comprehension of the sources of contamination is necessary to develop an efficient management strategy, allied to shellfish treatment improvement, to prevent foodborne illnesses. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

The nucleus accumbens (Nacc) has been proposed to act as a limbic-motor interface. Here, using invasive intraoperative recordings in an awake patient suffering from obsessive-compulsive disorder (OCD), we demonstrate that its activity is modulated by the quality of the subject's performance in a choice reaction time task designed to tap action monitoring processes. Action monitoring, that is, error detection and correction, is thought to be supported by a system involving the dopaminergic midbrain, the basal ganglia, and the medial prefrontal cortex. In surface electrophysiological recordings, action monitoring is indexed by an error-related negativity (ERN) appearing time-locked to the erroneous responses and emanating from the medial frontal cortex. In preoperative scalp recordings the patient's ERN was found to be significantly increased compared to a large (n = 83) normal sample, suggesting enhanced action monitoring processes. Intraoperatively, error-related modulations were obtained from the Nacc but not from a site 5 mm above it. Importantly, cross-correlation analysis showed that error-related activity in the Nacc preceded surface activity by 40 ms. We propose that the Nacc is involved in action monitoring, possibly by using error signals from the dopaminergic midbrain to adjust the relative impact of limbic and prefrontal inputs on frontal control systems in order to optimize goal-directed behavior.
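The cross-correlation analysis used to show that Nacc activity precedes the surface ERN can be sketched as a lag search over sampled signals (the signals below are toy data; a 40 ms lead corresponds to the returned lag multiplied by the sampling interval):

```python
def best_lag(x, y, max_lag):
    """Lag (in samples) at which x and y are maximally correlated;
    a positive result means x leads y by that many samples."""
    def score(lag):
        return sum(x[i] * y[i + lag] for i in range(len(x))
                   if 0 <= i + lag < len(y))
    return max(range(-max_lag, max_lag + 1), key=score)

# Toy example: the surface signal is the deep signal delayed by 2 samples.
nacc    = [0, 0, 1, 2, 1, 0, 0, 0]
surface = [0, 0, 0, 0, 1, 2, 1, 0]
lag = best_lag(nacc, surface, 4)  # 2: nacc leads surface by 2 samples
```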

Relevance:

30.00%

Publisher:

Abstract:

The occurrence and removal of 81 representative Pharmaceutical Active Compounds (PhACs) were assessed in a municipal WWTP located in a highly industrialized area, with partial water reuse after UV tertiary treatment and discharge to a Mediterranean river. Water monitoring was performed in an integrated way at different points in the WWTP and the river along three seasons. Consistent differences between therapeutic classes were observed in terms of influent concentration, removal efficiency and seasonal variation. Conventional (primary and secondary) treatment was unable to completely remove numerous compounds, and UV-based tertiary treatment played a complementary role for some of them. The influence of industrial activity was highlighted in terms of PhAC presence and seasonal distribution. Even if the impact of the WWTP effluent on the studied river appeared to be minor overall, PhACs proved to be widespread pollutants in river waters. Contamination can be particularly critical in summer in water-scarce areas, when the water flow decreases considerably.

Relevance:

30.00%

Publisher:

Abstract:

Workflow management systems aim at the controlled execution of complex application processes in distributed and heterogeneous environments. These systems will shape the structure of information systems in business and non-business environments. E-business and system integration are fertile soil for workflow (WF) and groupware tools. This thesis aims to study WF and groupware tools in order to gather in-house knowledge of WF so as to better utilize WF solutions in the future, and to focus on SAP Business Workflow in order to find a global solution for Application Link Enabling (ALE) support for system integration. Piloting this solution in Nokia collects experience of the SAP R/3 WF tool for other development projects in the future. The literature part of this study guides the reader into the world of business process automation, providing a general description of the history, use and potential of WF and groupware software. The empirical part begins with the background of the case study, describing the IT environment that initiated the case: the introduction of SAP R/3 in Nokia, the communication technique in use, and the WF tool. The case study is focused on one solution built with SAP Business Workflow. This study provides a concept for monitoring communication between ERP systems and for increasing the quality of system integration. The case study describes a way to create a support model for ALE/EDI interfaces. The support model includes the monitoring organization and the workflow processes for solving the most common IDoc-related errors.

Relevance:

30.00%

Publisher:

Abstract:

Especially in global enterprises, key data is fragmented across multiple Enterprise Resource Planning (ERP) systems. Thus the data is inconsistent, fragmented and redundant across the various systems. Master Data Management (MDM) is a concept which creates cross-references between customers, suppliers and business units, and enables corporate hierarchies and structures. The overall goal of MDM is the ability to create an enterprise-wide consistent data model, which enables analyzing and reporting customer and supplier data. The goal of the study was to define the properties and success factors of a master data system. The theoretical background was based on literature, and the case consisted of enterprise-specific needs and demands. The theoretical part presents the concept, background and principles of MDM and then the phases of system planning and an implementation project. The case consists of the background, a definition of the as-is situation, the definition of the project and the evaluation criteria, and concludes with the key results of the thesis. The final chapter, Conclusions, combines common principles with the results of the case. The case part ended up dividing the important factors of the system into success factors, technical requirements and business benefits. To clarify the project and find funding for it, the business benefits have to be defined and their realization has to be monitored. The thesis found six success factors for the MDM system: a well-defined business case; data management and monitoring; data models and structures defined and maintained; customer and supplier data governance, delivery and quality; commitment; and continuous communication with the business. Technical requirements emerged several times during the thesis and therefore cannot be ignored in the project. The Conclusions chapter goes through these factors on a general level. The success factors and technical requirements are related to the essentials of MDM: governance, action and quality. That chapter can be used as guidance in a master data management project.

Relevance:

30.00%

Publisher:

Abstract:

The present paper reports a prototype of an autonomously controlled bacteria concentrator with a user-friendly interface for bench-top applications. It is based on a micro-fluidic lab-on-a-chip and its associated custom instrumentation, which consists of a dielectrophoretic actuator to pre-concentrate the sample and an impedance analyser to measure concentrated bacteria levels. The system is composed of a single micro-fluidic chamber with interdigitated electrodes and custom instrumentation electronics. The prototype is supported by a real-time platform connected to a remote computer, which automatically controls the system and displays the impedance data used to monitor the status of bacteria accumulation on-chip. The system automates the whole concentration operation. Performance has been studied for controlled volumes of Escherichia coli (E. coli) samples injected into the micro-fluidic chip at a constant flow rate of 10 μL/min. A media conductivity correcting protocol has been developed, as preliminary results showed distortion of the impedance analyser measurements produced by variations of the bacterial media conductivity through time. With the correcting protocol, the measured impedance values were related to the quantity of concentrated bacteria with a correlation of 0.988 and a coefficient of variation of 3.1%. The feasibility of automated on-chip concentration of E. coli using the miniaturized system has been demonstrated. Furthermore, the impedance monitoring protocol was adjusted and optimized to handle changes in the electrical properties of the bacteria media over time.
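The idea behind a media-conductivity correction can be illustrated with a minimal sketch (the function, its arguments, and the proportional-rescaling model are assumptions for illustration; the paper's actual correction model is not reproduced here): the measured chip impedance is rescaled by the drift of a bacteria-free media reference, so that slow conductivity changes are not mistaken for accumulated bacteria.

```python
def corrected_impedance(z_measured, z_media_now, z_media_start):
    """Hypothetical correction: rescale the measured chip impedance by the
    drift of the bacteria-free media impedance relative to its value at the
    start of the run."""
    return z_measured * (z_media_start / z_media_now)

# If the media impedance drifted from 100 to 90 Ohm while the chip reads
# 90 Ohm, the correction attributes the whole drop to the media:
z = corrected_impedance(90.0, 90.0, 100.0)  # 100.0
```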

Relevance:

30.00%

Publisher:

Abstract:

Centrifugal pumps are widely used in industrial and municipal applications, and they are an important end-use application of electric energy. However, in many cases centrifugal pumps operate with a significantly lower energy efficiency than they actually could, which typically increases the pump's energy consumption and the resulting energy costs. Typical reasons for this are the incorrect dimensioning of the pumping system components and the inefficiency of the applied pump control method. Besides the increase in energy costs, inefficient operation may increase the risk of a pump failure and thereby the maintenance costs. In the worst case, a pump failure may lead to a process shutdown, accruing additional costs. Nowadays, centrifugal pumps are often controlled by adjusting their rotational speed, which affects the resulting flow rate and output pressure of the pumped fluid. Typically, the speed control is realised with a frequency converter that allows the control of the rotational speed of an induction motor. Since a frequency converter can estimate the motor rotational speed and shaft torque without external measurement sensors on the motor shaft, it also allows the development and use of sensorless methods for the estimation of the pump operation. Still today, however, the monitoring of pump operation is based on additional measurements and visual check-ups, which may not be suitable for determining the energy efficiency of the pump operation. This doctoral thesis concentrates on methods that allow the use of a frequency converter as a monitoring and analysis device for a centrifugal pump. Firstly, the determination of energy-efficiency- and reliability-based limits for the recommendable operating region of a variable-speed-driven centrifugal pump is discussed with a case study for the laboratory pumping system. Then, three model-based estimation methods for the pump operating location are studied, and their accuracy is determined by laboratory tests.
In addition, a novel method to detect the occurrence of cavitation or flow recirculation in a centrifugal pump with a frequency converter is introduced. Its sensitivity compared with known cavitation detection methods is evaluated, and its applicability is verified by laboratory measurements on three different pumps and using two different frequency converters. The main focus of this thesis is on radial-flow end-suction centrifugal pumps, but the studied methods may also be applicable to mixed and axial flow centrifugal pumps, if allowed by their characteristics.
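One family of model-based operating-location estimates can be sketched with the pump affinity laws: the converter's shaft-power estimate is referred to nominal speed (P scales with n³), the nominal-speed power curve is inverted to obtain flow, and the flow is scaled back (Q scales with n). The linear P(Q) curve and its coefficients below are stand-ins for a real characteristic curve, not data from the thesis:

```python
def flow_from_power(p_meas, n, n0, a, b):
    """QP-curve flow estimation sketch. p_meas: shaft power estimated by the
    frequency converter at rotational speed n (n0 = nominal speed);
    a, b: coefficients of a linearized nominal-speed power curve
    P0 = a + b*Q0 (illustrative values only)."""
    p0 = p_meas * (n0 / n) ** 3   # affinity law: P scales with speed cubed
    q0 = (p0 - a) / b             # invert the nominal-speed curve
    return q0 * (n / n0)          # affinity law: Q scales with speed

# At half the nominal speed, an estimated 0.375 kW maps to 2.0 units of
# flow with the made-up curve P0 = 1.0 + 0.5*Q0:
q = flow_from_power(0.375, 725, 1450, 1.0, 0.5)  # 2.0
```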

Relevance:

30.00%

Publisher:

Abstract:

This thesis was done as part of the FuncMama project, a joint project between the Technical Research Centre of Finland (VTT), the University of Oulu (OY), Lappeenranta University of Technology (LUT) and Finnish industrial partners. The main goal of the project is to manufacture electrical and mechanical components from mixed materials using laser sintering. The aim of this study was to create laser-sintered pieces from a ceramic material and to monitor the sintering event using a spectrometer. A spectrometer is a device capable of recording the intensity of different wavelengths as a function of time. In this study, the monitoring of laser sintering was captured with equipment consisting of an Ocean Optics spectrometer, an optical fibre and an optical lens (detector head). Light from the sintering process first hits the lens system, which guides it into the optical fibre. The optical fibre transmits the light from the sintering process to the spectrometer, where the intensity levels of the wavelengths are detected. The optical lens of the spectrometer was rigidly mounted and did not move along with the laser beam. The data collected with the spectrometer from the laser sintering process was converted with the Excel spreadsheet program for evaluation of the results. The laser equipment used was an IPG Photonics pulsed fibre laser. The laser parameters were kept mainly constant during the experimental part and only the sintering speed was changed. That way it was possible to find differences in the monitoring results without too many parameters mixing together and affecting the conclusions. The sintered parts had one layer and a size of 5 x 5 mm. The material was CT2000 tape manufactured by Heraeus, which was later post-processed to powder. Monitoring of different sintering speeds was tested using the CT2000 reference powder.
Moreover, tests of how different materials affect the process monitoring were done by adding a foreign powder, Du Pont 951, which had suffered in re-grinding and which was more reactive than CT2000. Adding foreign material simulates a situation where two materials are accidentally mixed together, and it was studied whether this can be seen with the spectrometer. It was concluded in this study that with the spectrometer it is possible to detect changes between different laser sintering speeds. When the sintering speed is lowered, the intensity level of the light from the process is higher. This is a result of a higher temperature at the sintering spot, which can be noticed with the spectrometer. This indicates that it could be possible to use the spectrometer as a tool for process observation, and it supports the idea of a system that can help in setting up the process parameter window. Another important conclusion was how well the addition of foreign material could be seen with the spectrometer. When the second material was added, a significant rise in intensity level could be noticed in the part where the foreign material was mixed. This indicates that it is possible to see whether there are variations in the material or whether several materials are mixed together. Spectrometric monitoring of laser sintering could be a useful tool for process window observation and temperature control of the sintering process, for example when the process window for a specific material is experimentally determined to obtain the wanted properties and a satisfying sintering speed. If the data is constantly recorded, the results can show faults in the part texture between layers. Differences between the monitoring data and the experimentally determined values can then indicate changes in the material caused by material faults or by wrong process parameters. The results of this study show that the spectrometer could be one possible tool for monitoring, but to reach the point where all of this is possible, much more research is needed.
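The conclusion that mixed-in foreign material shows up as an intensity rise suggests a simple automated check. The sketch below flags scan positions whose recorded intensity clearly exceeds the median of the trace; the 1.5x threshold is an assumption for illustration, not a value from the study:

```python
def flag_foreign(intensities, factor=1.5):
    """Return indices of scan positions whose spectrometer intensity exceeds
    `factor` times the median of the whole trace."""
    ordered = sorted(intensities)
    median = ordered[len(ordered) // 2]
    return [i for i, v in enumerate(intensities) if v > factor * median]

# A contaminated spot in the middle of an otherwise steady intensity trace:
hits = flag_foreign([10, 11, 10, 30, 10, 11, 10])  # [3]
```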

Relevance:

30.00%

Publisher:

Abstract:

This study presents an evaluation of a pilot multistage filtration (MSF) system with different dosages, 131 mg L-1 and 106 mg L-1, of the natural coagulant extracted from Moringa oleifera seeds in the pre-filtration and slow filtration stages, respectively. The system comprised a dynamic pre-filter unit, two upflow filters in parallel and four slow filters in parallel, one of which had its filter media altered. The performance of the system was evaluated by monitoring water quality parameters such as turbidity, apparent color and slow filter head loss. The stages that received the coagulant solution had better treatment efficiency than the stages without it. However, the direct application of the coagulant solution in the slow filter caused rapid clogging of the non-woven blanket and a shorter filter run length.