999 results for microbiological monitoring
Abstract:
A novel sensing technique for in situ monitoring of the rate of pulsed laser deposition (PLD) of metal thin films has been developed. This optical-fibre-based sensor works on the principle of evanescent-wave penetration of waveguide modes into the unclad portion of a multimode fibre. The utility of the sensor is demonstrated for PLD of silver thin films, in which a Q-switched Nd:YAG laser irradiates a silver target under the conditions required for thin-film preparation. This paper describes the performance and characteristics of the sensor and shows how the device can be used as an effective tool for monitoring the deposition rate of silver thin films. The fibre-optic sensor is very simple, inexpensive and highly sensitive compared with existing techniques for measuring thin-film deposition rates.
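The abstract gives no governing relation; as background, the penetration depth of an evanescent field at a core-medium interface is given by the standard expression (a textbook result, not taken from this paper):

    d_p = \frac{\lambda}{2\pi\sqrt{n_1^{2}\sin^{2}\theta - n_2^{2}}}

where \lambda is the wavelength, n_1 the core refractive index, n_2 the refractive index of the medium over the unclad region, and \theta the internal angle of incidence (above the critical angle). As silver accumulates on the unclad section it perturbs this evanescent field, which plausibly modulates the transmitted intensity that the sensor reads out.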
Abstract:
Modern computer systems are plagued with stability and security problems: applications lose data, web servers are hacked, and systems crash under heavy load. Many of these problems or anomalies arise from rare program behaviour caused by attacks or errors. A substantial percentage of web-based attacks are due to buffer overflows. Many methods have been devised to detect and prevent the anomalous situations that arise from buffer overflows. The current state of the art in anomaly detection is relatively primitive, depending mainly on static code checking to handle buffer-overflow attacks; for protection, stack guards and heap guards are also widely used. This dissertation proposes an anomaly detection system based on the frequencies of system calls in the system-call trace. System-call traces represented as frequency sequences are profiled using sequence sets. A sequence set is identified by the starting sequence and the frequencies of specific system calls. The deviation of the current input sequence from the corresponding normal profile in the frequency pattern of system calls is computed and expressed as an anomaly score. A simple Bayesian model is used for accurate detection. Experimental results are reported which show that the frequency of system calls, represented using sequence sets, captures the normal behaviour of programs under normal conditions of usage. This captured behaviour allows the system to detect anomalies with a low rate of false positives. Data are presented which show that a Bayesian network over frequency variations responds effectively to induced buffer overflows. It can also help administrators detect deviations in program flow introduced by errors.
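The dissertation's exact profiling and scoring rules are not given in the abstract; the following minimal Python sketch only illustrates the general idea of scoring a trace's deviation from a normal system-call frequency profile (the function names, the absolute-deviation score, and the toy traces are illustrative assumptions, not the author's method):

    from collections import Counter

    def build_profile(training_traces):
        # Average per-call relative frequency over traces of normal runs.
        profile = Counter()
        for trace in training_traces:
            counts = Counter(trace)
            total = len(trace)
            for call, n in counts.items():
                profile[call] += n / total
        for call in profile:
            profile[call] /= len(training_traces)
        return profile

    def anomaly_score(trace, profile):
        # Sum of absolute deviations between observed and expected frequencies.
        counts = Counter(trace)
        total = len(trace)
        calls = set(profile) | set(counts)
        return sum(abs(counts[c] / total - profile.get(c, 0.0)) for c in calls)

    # Illustrative usage with made-up traces.
    normal = [["open", "read", "read", "close"], ["open", "read", "close"]]
    profile = build_profile(normal)
    suspect = ["open", "execve", "execve", "write", "close"]
    print(anomaly_score(suspect, profile))  # large deviation -> possible anomaly

A real system along the abstract's lines would feed such deviations into a Bayesian model rather than thresholding the raw score directly.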
Abstract:
The evolution of wireless sensor network technology has enabled the development of advanced systems for real-time monitoring. Wireless sensor networks are increasingly being used for precision agriculture. Their advantages in agriculture include distributed data collection and monitoring, and the monitoring and control of climate, irrigation and nutrient supply, thereby decreasing the cost of production and increasing its efficiency. This paper describes the application of a wireless sensor network for crop monitoring in the paddy fields of Kuttanad, a region of Kerala, the southern state of India.
Abstract:
The effect of delayed icing on the microbial quality and shelf-life of Hilsa toli was studied. Fish iced in the rigor condition had a shelf-life of 11 days irrespective of the state of rigor, whereas fish procured from the landing centre had a shelf-life of only 8 days and showed the presence of coagulase-positive staphylococci, faecal streptococci and E. coli. The total bacterial count was low in all the samples and increased after spoilage.
Abstract:
The microalgal community, as primary producers, plays a significant role in the biotic and abiotic interactions of any aquatic ecosystem. Whenever a community is exposed to a pollutant, responses can occur because individuals acclimate to pollutant-caused changes, selection can favour resistant genotypes within a population, and selection among species can result in changes in community structure. The microalgal communities of industrial effluent treatment systems are continuously exposed to pollutants, yet little data is available on the structure and seasonal variation of the microalgal community of industrial effluent holding ponds, especially for a complex effluent such as that of a refinery. The aim of the present study was to investigate the annual variation in the ecology, biomass, productivity and community structure of the algal community of a refinery effluent holding pond. The results showed the pond to be a eutrophic system with a resistant microalgal community exhibiting distinct seasonal variation in species composition.
Abstract:
Corrosion represents one of the largest through-life cost components of ships. Ship owners and operators recognize that combating corrosion significantly impacts a vessel's reliability, availability and through-life costs. The primary objective of this paper is to review inspection and monitoring systems and life-cycle management with respect to corrosion control of ships, and to develop the concept of "Corrosion Health" (CH), which quantifies the extent of corrosion at any point in a ship's operational life. A systems approach is presented in which the ship structure is treated as a corrosion system divided into several corrosion zones with distinct characteristics. Criteria for assessing corrosion condition are listed. A CH rating system that represents a complex corrosion condition with a single numerical score, along with recommendations for repair/maintenance actions, is also discussed.
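The paper's actual rating scheme is not specified in the abstract; as a sketch of the systems approach it describes, per-zone scores rolled up into one number, something like the following could be imagined (the zone names, weights, and 0-100 scale are entirely hypothetical):

    # Hypothetical Corrosion Health (CH) roll-up: each corrosion zone is scored
    # 0-100 from inspection data, then weighted by structural criticality.
    ZONE_WEIGHTS = {"ballast_tanks": 0.35, "hull_exterior": 0.30,
                    "deck": 0.20, "superstructure": 0.15}

    def corrosion_health(zone_scores):
        # Weighted mean of per-zone scores; higher means healthier.
        return sum(ZONE_WEIGHTS[z] * s for z, s in zone_scores.items())

    scores = {"ballast_tanks": 55, "hull_exterior": 78,
              "deck": 90, "superstructure": 95}
    print(f"CH = {corrosion_health(scores):.1f}")
    # A low CH (e.g. below some agreed band) would map to a
    # repair/maintenance recommendation in the paper's scheme.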
Abstract:
In psycholinguistic research it is a widespread assumption that the evaluation of information with respect to its truth or plausibility (epistemic validation; Richter, Schroeder & Wöhrmann, 2009) is a strategic, optional process that follows comprehension (e.g., Gilbert, 1991; Gilbert, Krull & Malone, 1990; Gilbert, Tafarodi & Malone, 1993; Herbert & Kübler, 2011). A growing number of studies, however, challenge this two-step model of comprehension and validation directly or indirectly. In particular, findings on Stroop-like stimulus-response compatibility effects that arise when positive and negative responses must be given orthogonally to the task-irrelevant truth value of sentences (e.g., a positive response after reading a false sentence, or a negative response after reading a true sentence; the epistemic Stroop effect, Richter et al., 2009) suggest that readers perform a non-strategic check of the validity of information already during comprehension. Building on these findings, the aim of this dissertation was a further examination of the assumption that comprehension involves a non-strategic, routinized, knowledge-based validation process (epistemic monitoring; Richter et al., 2009). To this end, three empirical studies with different emphases were conducted. Study 1 investigated whether evidence of epistemic monitoring can also be found for information that is not unambiguously true or false but merely more or less plausible. Using the epistemic Stroop paradigm of Richter et al. (2009), a compatibility effect of task-irrelevant plausibility on the latencies of positive and negative responses was demonstrated in two different experimental tasks, indicating that epistemic monitoring is also sensitive to gradual differences in how well information fits world knowledge. Moreover, the results show that the epistemic Stroop effect is indeed driven by plausibility and not by the differing predictability of plausible and implausible information. Study 2 tested the hypothesis that epistemic monitoring does not require an evaluative mindset. In contrast to findings by other authors (Wiswede, Koranyi, Müller, Langner, & Rothermund, 2013), this study found a compatibility effect of task-irrelevant truth value on response latencies in a completely non-evaluative task. The results suggest that epistemic monitoring does not depend on an evaluative mindset, but possibly on the depth of processing. Study 3 examined the relationship between comprehension and validation by investigating the online effects of plausibility and predictability on eye movements during the reading of short texts. In addition, the potential modulation of these effects by epistemic markers signalling the certainty of information (e.g., certainly or perhaps) was examined. Consistent with the assumption of a fast, non-strategic epistemic monitoring process, interactive effects of plausibility and the presence of epistemic markers emerged on indicators of early comprehension processes. This suggests that the communicated certainty of information is taken into account by the monitoring process. Overall, the findings argue against conceptualizing comprehension and validation as non-overlapping stages of information processing. Rather, an evaluation of truth or plausibility based on world knowledge appears to be, at least to some extent, an obligatory and non-strategic component of language comprehension. Implications of the findings for current models of language comprehension and recommendations for further research on the relationship between comprehension and validation are outlined.
Abstract:
The first discussion of compositional data analysis is attributable to Karl Pearson, in 1897. However, notwithstanding the recent developments on the algebraic structure of the simplex, more than twenty years after Aitchison's idea of log-transformations of closed data, the scientific literature is again full of statistical treatments of this type of data using traditional methodologies. This is particularly true in environmental geochemistry, where, besides the closure problem, the spatial structure (dependence) of the data has to be considered. In this work we propose the use of log-contrast values, obtained by a simplicial principal component analysis, as indicators of given environmental conditions. The investigation of the log-contrast frequency distributions makes it possible to identify the statistical laws able to generate the values and to govern their variability. The changes, when compared, for example, with the mean values of the random variables assumed as models, or with other reference parameters, allow monitors to be defined for assessing the extent of possible environmental contamination. A case study on running and ground waters from the Chiavenna Valley (Northern Italy), using Na+, K+, Ca2+, Mg2+, HCO3-, SO42- and Cl- concentrations, is illustrated.
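The abstract does not define the transformation; in Aitchison's framework (a standard definition, not specific to this paper) a log-contrast of a composition \mathbf{x} = (x_1, \ldots, x_D) is a linear combination of log-parts whose coefficients sum to zero,

    \sum_{i=1}^{D} a_i \ln x_i, \qquad \sum_{i=1}^{D} a_i = 0,

and simplicial principal component analysis is commonly carried out on the centred log-ratio (clr) transform,

    \operatorname{clr}(\mathbf{x}) = \left(\ln\frac{x_1}{g(\mathbf{x})}, \ldots, \ln\frac{x_D}{g(\mathbf{x})}\right), \qquad g(\mathbf{x}) = \Bigl(\prod_{i=1}^{D} x_i\Bigr)^{1/D},

whose principal components yield log-contrast scores of the kind whose frequency distributions the study investigates.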
Abstract:
The purpose of resource management is the efficient and effective use of network resources, for instance bandwidth. In this article, a connection-oriented network scenario is considered, where a certain amount of bandwidth is reserved for each label switched path (LSP), a logical path in an MPLS or GMPLS environment. Assuming there is also some form of admission control (explicit or implicit), these environments typically provide quality of service (QoS) guarantees. It can happen that some LSPs become busy and reject connections while other LSPs are under-utilised. We propose a distributed lightweight monitoring technique based on threshold values, whose objective is to detect congestion as it occurs in an LSP and raise the corresponding alarm, which in turn triggers a dynamic bandwidth reallocation mechanism.
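The article's exact threshold scheme is not detailed in the abstract; a minimal sketch of the idea, per-LSP counters compared against a rejection-ratio threshold that raises an alarm meant to trigger reallocation, might look like this (the class, threshold value, and window size are illustrative assumptions):

    # Hypothetical per-LSP congestion detector based on a rejection-ratio threshold.
    class LspMonitor:
        def __init__(self, lsp_id, threshold=0.1, window=100):
            self.lsp_id = lsp_id
            self.threshold = threshold  # alarm if >10% of recent requests rejected
            self.window = window
            self.outcomes = []          # True = accepted, False = rejected

        def record(self, accepted):
            self.outcomes.append(accepted)
            self.outcomes = self.outcomes[-self.window:]
            rejected = self.outcomes.count(False)
            if (len(self.outcomes) == self.window
                    and rejected / self.window > self.threshold):
                self.alarm()

        def alarm(self):
            # In the article's scheme this would trigger dynamic bandwidth
            # reallocation; this sketch fires on every violating window.
            print(f"Congestion alarm on LSP {self.lsp_id}")

    mon = LspMonitor("lsp-7")
    for i in range(200):
        mon.record(accepted=(i % 5 != 0))  # 20% rejections -> alarm fires

Keeping the state per LSP and purely local is what makes such a technique distributed and lightweight, consistent with the article's stated objective.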
Abstract:
Abstract taken from the author.
Abstract:
This training video is intended to familiarise researchers and technicians working with potentially airborne pathogens with the correct and safe use of Microbiological Safety Cabinets. The video also provides instruction on cleaning, disinfection and fumigation regimes; maintenance and testing regimes; and the commissioning and decommissioning requirements of such Local Exhaust Ventilation (LEV) systems. It is in Windows Media Video format, which requires a free media player such as Windows Media Player or VLC Media Player (http://www.videolan.org/vlc/) to watch.
Abstract:
Wednesday 23rd April 2014. Speaker(s): Willi Hasselbring. Organiser: Leslie Carr. Time: 23/04/2014 14:00-15:00. Location: B32/3077. File size: 802 MB.
Abstract: The internal behavior of large-scale software systems cannot be determined on the basis of static (e.g., source code) analysis alone. Kieker provides complementary dynamic analysis capabilities, i.e., monitoring/profiling and analyzing a software system's runtime behavior. Application Performance Monitoring is concerned with continuously observing a software system's performance-specific runtime behavior, including analyses such as assessing service-level compliance or detecting and diagnosing performance problems. Architecture Discovery is concerned with extracting architectural information from an existing software system, covering both structural and behavioral aspects, such as identifying architectural entities (e.g., components and classes) and their interactions (e.g., local or remote procedure calls). In addition to the Architecture Discovery of Java systems, Kieker supports Architecture Discovery for other platforms, including legacy systems implemented, for instance, in C#, C++, Visual Basic 6, COBOL or Perl. Thanks to Kieker's extensible architecture it is easy to implement and use custom extensions and plugins. Kieker was designed for continuous monitoring in production systems, inducing only a very low overhead, which has been evaluated in extensive benchmark experiments. Please refer to http://kieker-monitoring.net/ for more information.
Abstract:
Within laboratory quality-control activities, the final results for a given analyte are considered intermediate products, given the importance attached to quality assurance as the ultimate purpose of quality-management programmes. This view requires the establishment of comprehensive instruments for detecting events such as cross-contamination, and the adoption of measures to keep the analytical workflow from being affected. Objective: the main objective was to establish a system for monitoring and controlling cross-contamination in the food microbiology testing laboratory. Materials and methods: the methodology consisted of developing flow charts for the procedures controlling the populations of aerobic mesophiles and moulds arising from contamination of environments, surfaces, sterile material and culture media. These charts included a decision tree, designed to carry out control actions based on tolerance intervals established as an objective tool for decisions that bring the counts of the microbial populations in question back to normal. Results: the strictest alert limits were obtained for the populations of aerobic mesophiles and moulds in the various controls, except for the environment of the media-preparation area and the limits corresponding to sterile material. Conclusion: the process developed complemented the laboratory's internal quality-control system by providing an objective means of closing non-conformities due to cross-contamination.
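The study derives its tolerance intervals from its own historical counts; purely as an illustration of the decision-tree logic described, a two-tier alert/action rule over monitoring counts could be sketched as follows (the limits convention, the data, and the classification labels are assumptions, not the study's values):

    import statistics

    def control_limits(historical_counts):
        # Alert at mean + 2 SD, action at mean + 3 SD (a common QC convention).
        mean = statistics.mean(historical_counts)
        sd = statistics.stdev(historical_counts)
        return mean + 2 * sd, mean + 3 * sd

    def classify(count, alert, action):
        if count > action:
            return "out of control: stop and investigate cross-contamination"
        if count > alert:
            return "alert: apply corrective action, monitor next runs"
        return "in control"

    # Made-up aerobic mesophile counts (CFU/plate) from environmental monitoring.
    history = [3, 5, 2, 4, 6, 3, 5, 4]
    alert, action = control_limits(history)
    print(classify(9, alert, action))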
Abstract:
Abstract based on that of the journal.