976 results for information units
Abstract:
This paper proposes a new approach to optimal phasor measurement unit placement for fault location on electric power distribution systems, using the Greedy Randomized Adaptive Search Procedure (GRASP) metaheuristic and Monte Carlo simulation. The proposed placement model is a general methodology that can be used to place devices that record voltage sag magnitudes for any fault location algorithm using voltage information measured at a limited set of nodes along the feeder. An overhead, three-phase, three-wire, 13.8 kV, 134-node, real-life feeder model is used to evaluate the algorithm. Tests show that the results of the fault location methodology were improved by the optimized meter placement identified with this approach. © 2011 IEEE.
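The abstract does not spell out the paper's exact formulation, but the general GRASP pattern it relies on (greedy randomized construction from a restricted candidate list, followed by local search, repeated over many restarts) can be sketched as follows. The `score` callback, RCL size, and iteration count are illustrative assumptions; `score` stands in for the Monte Carlo evaluation of fault-location accuracy for a candidate meter placement.

```python
import random

def grasp(candidates, k, score, iterations=50, rcl_size=3, seed=0):
    """Generic GRASP: repeat greedy-randomized construction plus local
    search, keeping the best placement of k meters found overall."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(iterations):
        # Construction: at each step pick randomly from a restricted
        # candidate list (RCL) of the best-scoring additions.
        placement = set()
        while len(placement) < k:
            ranked = sorted((c for c in candidates if c not in placement),
                            key=lambda c: score(placement | {c}), reverse=True)
            placement.add(rng.choice(ranked[:rcl_size]))
        # Local search: swap a chosen node for an unchosen one while
        # that improves the score.
        improved = True
        while improved:
            improved = False
            for out in list(placement):
                for cand in candidates:
                    if cand in placement:
                        continue
                    trial = (placement - {out}) | {cand}
                    if score(trial) > score(placement):
                        placement, improved = trial, True
                        break
                if improved:
                    break
        s = score(placement)
        if s > best_score:
            best, best_score = placement, s
    return best, best_score
```

The randomized construction diversifies starting points, while the local search intensifies around each of them; the balance is tuned by `rcl_size`.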
Abstract:
Includes bibliography
Abstract:
The control of molecular architectures has been exploited in layer-by-layer (LbL) films deposited on Au interdigitated electrodes, thus forming an electronic tongue (e-tongue) system that reached an unprecedented high sensitivity (down to 10⁻¹² M) in detecting catechol. Such high sensitivity was made possible by using units containing the enzyme tyrosinase, which interacted specifically with catechol, and by processing impedance spectroscopy data with information visualization methods. These latter methods, including the parallel coordinates technique, were also useful for identifying the major contributors to the high distinguishing ability toward catechol. Among several film architectures tested, the most efficient had a tyrosinase layer deposited atop LbL films of alternating layers of dioctadecyldimethylammonium bromide (DODAB) and 1,2-dipalmitoyl-sn-3-glycero-phospho-rac-(1-glycerol) (DPPG), viz., (DODAB/DPPG)5/DODAB/Tyr. The latter represents a more suitable medium for immobilizing tyrosinase than conventional polyelectrolytes. Furthermore, the distinction was more effective at low frequencies, where double-layer effects at the film/liquid interface dominate the electrical response. Because the optimization of film architectures based on information visualization is completely generic, the approach presented here may be extended to designing architectures for other types of applications in addition to sensing and biosensing. © 2013 American Chemical Society.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
In this paper we discuss the detection of glucose and triglycerides using information visualization methods to process impedance spectroscopy data. The sensing units contained either lipase or glucose oxidase immobilized in layer-by-layer (LbL) films deposited onto interdigitated electrodes. The optimization consisted of identifying which part of the electrical response, and which combination of sensing units, yielded the best distinguishing ability. It is shown that complete separation can be obtained for a range of concentrations of glucose and triglyceride when the interactive document map (IDMAP) technique is used to project the data into a two-dimensional plot. Most importantly, the optimization procedure can be extended to other types of biosensors, thus increasing the versatility of analysis provided by tailored molecular architectures exploited with various detection principles. (C) 2012 Elsevier B.V. All rights reserved.
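IDMAP itself is a specific projection technique; as a minimal stand-in for this kind of dimensionality reduction, the sketch below projects high-dimensional sensing-unit responses (rows = samples, columns = e.g. impedance values per frequency) onto two principal components. The function name and data shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def project_2d(spectra):
    """Project high-dimensional sensor responses onto their first two
    principal components -- a simple stand-in for 2D projections such
    as IDMAP, useful to inspect whether analyte concentrations form
    separable clusters."""
    X = np.asarray(spectra, dtype=float)
    X = X - X.mean(axis=0)                # center each feature
    # SVD of the centered data gives the principal axes; keep the two
    # leading right-singular vectors as projection directions.
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:2].T                    # shape (n_samples, 2)
```

If two concentration groups are well separated in the full impedance space, they remain separated in the 2D plot, which is the property the optimization in the paper exploits.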
Abstract:
Over the last three decades, remote sensing and GIS have become increasingly important in the geosciences as means of improving conventional methods of data collection and map production. The present work deals with the application of remote sensing and geographic information systems (GIS) to geomorphological investigations. Combining the two techniques has above all made it possible to capture geomorphological forms both in overview and in detail. Topographic and geological maps, satellite images, and climate data serve as the basis of this work. The thesis consists of six chapters. The first chapter gives a general overview of the study area, describing its morphological units, its climatic conditions (in particular the aridity indices of the coastal and mountain landscapes), and its settlement pattern. Chapter 2 deals with the regional geology and stratigraphy of the study area; an attempt is made to identify the main formations using ETM satellite images, applying the following methods: colour band composites, image rationing, and supervised classification. Chapter 3 describes the structurally controlled surface forms in order to clarify the interaction between tectonics and geomorphological processes. It covers the various methods, for example image processing, used to reliably interpret the lineaments present in the mountain body; special filter methods are applied to map the most important lineaments. Chapter 4 presents an attempt to derive the drainage network automatically from processed SRTM satellite data. It discusses in detail the extent to which the quality of small-scale SRTM data is comparable to large-scale topographic maps in these processing steps.
Furthermore, hydrological parameters are derived through a qualitative and quantitative analysis of the discharge regime of individual wadis. The origin of the drainage systems is interpreted on the basis of geomorphological and geological evidence. Chapter 5 deals with assessing the hazard of episodic wadi floods. The probability of their annual occurrence, and of strong floods at intervals of several years, is traced back historically to 1921. The role of rain-bearing depressions that develop over the Red Sea and can generate runoff is examined using the Inverse Distance Weighted (IDW) method. Other rain-bringing weather situations are also considered using Meteosat infrared images. Particular attention is paid to the period 1990-1997, during which heavy rainfall events triggered wadi floods. Flood events and flood levels are determined from hydrographic data (gauge measurements). Land use and settlement structure in a wadi's catchment area are also taken into account. Chapter 6 deals with the different coastal forms on the western side of the Red Sea, for example erosional, constructional, and submerged forms. The final part addresses the stratigraphy and dating of submarine terraces on coral reefs, as well as a comparison with similar terraces on the Egyptian Red Sea coast west and east of the Sinai Peninsula.
Abstract:
The present study examined the feasibility and quality of drug distribution of oral medicines in single-dose blister packs per dispensed dosage form (EVA). The study was conducted as an open, comparative, prospective, multicentre patient study. Diovan®, CoDiovan®, and amlodipine in EVA packaging were available as study medication. The distribution error rate in the EVA and control groups was the primary endpoint. Patient knowledge, patient satisfaction, the practicability of the EVA system, and the satisfaction of the nursing staff were evaluated with questionnaires. In total, 2070 valid tablet dispensings to 332 patients in six different hospitals were examined. A distribution error rate of 1.8% was found in the EVA group and 0.7% in the control group. Of the patient questionnaires, 292 could be evaluated; the results showed an insufficient level of patient knowledge about their current oral medicines. In the 80 completed nursing-staff questionnaires, over 80% stated that dispensing errors could be detected more easily with the EVA system. In summary, the elevated error rate in the EVA group compared with the control group was caused by several confounding factors. Overall, a very positive response to the EVA system was observed among both patients and nursing staff.
Abstract:
SMARTDIAB is a platform designed to support the monitoring, management, and treatment of patients with type 1 diabetes mellitus (T1DM), by combining state-of-the-art approaches in the fields of database (DB) technologies, communications, simulation algorithms, and data mining. SMARTDIAB consists mainly of two units: 1) the patient unit (PU); and 2) the patient management unit (PMU), which communicate with each other for data exchange. The PMU can be accessed by the PU through the internet using devices such as PCs/laptops with direct internet access or mobile phones via a Wi-Fi/General Packet Radio Service access network. The PU consists of an insulin pump for subcutaneous insulin infusion to the patient and a continuous glucose measurement system. These devices, running a user-friendly application, gather patient-related information and transmit it to the PMU. The PMU consists of a diabetes data management system (DDMS), a decision support system (DSS) that provides risk assessment for long-term diabetes complications, and an insulin infusion advisory system (IIAS), which reside on a Web server. The DDMS can be accessed by both medical personnel and patients, with appropriate security access rights and front-end interfaces. The DDMS, apart from being used for data storage/retrieval, also provides advanced tools for the intelligent processing of the patient's data, supporting the physician in decision making regarding the patient's treatment. The IIAS is used to close the loop between the insulin pump and the continuous glucose monitoring system, by providing the pump with the appropriate insulin infusion rate in order to keep the patient's glucose levels within predefined limits. The pilot version of SMARTDIAB has already been implemented, while evaluation of the platform in a clinical environment is in progress.
Abstract:
Soil erosion on sloping agricultural land poses a serious problem for the environment, as well as for production. In areas with highly erodible soils, such as those in loess zones, application of soil and water conservation measures is crucial to sustain agricultural yields and to prevent or reduce land degradation. The present study, carried out in Faizabad, Tajikistan, was designed to evaluate the potential of local conservation measures on cropland using a spatial modelling approach to provide decision-making support for the planning of spatially explicit sustainable land use. A sampling design to support comparative analysis between well-conserved units and other field units was established in order to estimate factors that determine water erosion, according to the Revised Universal Soil Loss Equation (RUSLE). Such factor-based approaches allow ready application using a geographic information system (GIS) and facilitate straightforward scenario modelling in areas with limited data resources. The study showed first that assessment of erosion and conservation in an area with inhomogeneous vegetation cover requires the integration of plot-based cover. Plot-based vegetation cover can be effectively derived from high-resolution satellite imagery, providing a useful basis for plot-wise conservation planning. Furthermore, thorough field assessments showed that 25.7% of current total cropland is covered by conservation measures (terracing, agroforestry and perennial herbaceous fodder). Assessment of the effectiveness of these local measures, combined with the RUSLE calculations, revealed that current average soil loss could be reduced through low-cost measures such as contouring (by 11%), fodder plants (by 16%), and drainage ditches (by 53%). More expensive measures such as terracing and agroforestry can reduce erosion by as much as 63% (for agroforestry) and 93% (for agroforestry combined with terracing). 
Indeed, scenario runs for different levels of tolerable erosion rates showed that more cost-intensive and technologically advanced measures would lead to greater reduction of soil loss. However, given economic conditions in Tajikistan, it seems advisable to support the spread of low-cost, labour-extensive measures.
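RUSLE is a simple multiplicative factor model, which is what makes the scenario analysis above straightforward: each conservation measure is expressed as a change in one or more factors. A minimal sketch (the factor values in the usage note are illustrative, not the study's calibrated values):

```python
def rusle_soil_loss(R, K, LS, C, P):
    """Revised Universal Soil Loss Equation: predicted average annual
    soil loss A (e.g. t/ha/yr) as the product of rainfall erosivity R,
    soil erodibility K, combined slope length/steepness LS,
    cover-management C, and support-practice P factors."""
    return R * K * LS * C * P
```

For example, a measure such as contouring is modelled by lowering the P factor: with illustrative values `rusle_soil_loss(100, 0.3, 1.2, 0.5, 1.0)` gives a baseline of 18.0, and reducing P to 0.6 cuts the predicted loss by 40%, independently of the other factors.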
Abstract:
According to Bandura (1997), efficacy beliefs are a primary determinant of motivation. Still, very little is known about the processes through which people integrate situational factors to form efficacy beliefs (Myers & Feltz, 2007). The aim of this study was to gain insight into the cognitive construction of subjective group-efficacy beliefs. Only with a sound understanding of those processes is there a sufficient base to derive psychological interventions aimed at group-efficacy beliefs. According to cognitive theories (e.g., Miller, Galanter, & Pribram, 1973), individual group-efficacy beliefs can be seen as the result of a comparison between the demands of a group task and the resources of the performing group. At the center of this comparison are internally represented structures of the group task and plans to perform it. The empirical plausibility of this notion was tested using functional measurement theory (Anderson, 1981). Twenty-three students (M = 23.30 years; SD = 3.39; 35% female) of the University of Bern repeatedly judged the efficacy of groups in different group tasks. The groups consisted of the subject and one or two further fictitious group members, who were manipulated in their level (low, medium, high) of task-relevant abilities. Data obtained from multiple full factorial designs were structured with individuals as second-level units and analyzed using mixed linear models. The task-relevant abilities of group members, specified as fixed factors, all had highly significant effects on subjects' group-efficacy judgments. The effect sizes of the ability factors were found to depend on the respective abilities' importance in a given task. In additive tasks (Steiner, 1972) group resources were integrated in a linear fashion, whereas significant interactions between factors were obtained in interdependent tasks.
The results also showed that people take into account other group members' efficacy beliefs when forming their own group-efficacy beliefs. The results support the notion that personal group-efficacy beliefs are obtained by comparing the demands of a task with the performing group's resources. Psychological factors such as other team members' efficacy beliefs are thereby considered task-relevant resources and affect subjective group-efficacy beliefs. This latter finding underlines the adequacy of multidimensional measures. While the validity of collective efficacy measures is usually estimated by how well they predict performance, the results of this study suggest a more internal validity criterion. It is concluded that Information Integration Theory holds potential to further our understanding of people's cognitive functioning in sport-relevant situations.
Abstract:
IPOD Leg 49 recovered basalts from 9 holes at 7 sites along 3 transects across the Mid-Atlantic Ridge: 63°N (Reykjanes), 45°N and 36°N (FAMOUS area). This has provided further information on the nature of mantle heterogeneity in the North Atlantic by enabling studies to be made of the variation of basalt composition with depth and with time near critical areas (Iceland and the Azores) where deep mantle plumes are thought to exist. Over 150 samples have been analysed for up to 40 major and trace elements and the results used to place constraints on the petrogenesis of the erupted basalts and hence on the geochemical nature of their source regions. It is apparent that few of the recovered basalts have the geochemical characteristics of typical "depleted" mid-ocean ridge basalts (MORB). An unusually wide range of basalt compositions may be erupted at a single site: the range of rare earth patterns within the short section cored at Site 413, for instance, encompasses the total variation of REE patterns previously reported from the FAMOUS area. Nevertheless it is possible to account for most of the compositional variation at a single site by partial melting processes (including dynamic melting) and fractional crystallization. Partial melting mechanisms seem to be the dominant processes relating basalt compositions, particularly at 36°N and 45°N, suggesting that long-lived sub-axial magma chambers may not be a consistent feature of the slow-spreading Mid-Atlantic Ridge. Comparisons of basalts erupted at the same ridge segment for periods of the order of 35 m.y. (now lying along the same mantle flow line) do show some significant inter-site differences in Rb/Sr, Ce/Yb, 87Sr/86Sr, etc., which cannot be accounted for by fractionation mechanisms and which must reflect heterogeneities in the mantle source.
However when hygromagmatophile (HYG) trace element levels and ratios are considered, it is the constancy or consistency of these HYG ratios which is the more remarkable, implying that the mantle source feeding a particular ridge segment was uniform with respect to these elements for periods of the order of 35 m.y. and probably since the opening of the Atlantic. Yet these HYG element ratios at 63°N are very different from those at 45°N and 36°N and significantly different from the values at 22°N and in "MORB". The observed variations are difficult to reconcile with current concepts of mantle plumes and binary mixing models. The mantle is certainly heterogeneous, but there is not simply an "enriched" and a "depleted" source, but rather a range of sources heterogeneous on different scales for different elements - to an extent and volume depending on previous depletion/enrichment events. HYG element ratios offer the best method of defining compositionally different mantle segments since they are little modified by the fractionation processes associated with basalt generation.
Abstract:
-panels- provides a quick way to count the number of panels (groups) in a dataset and display some basic information about the sizes of the panels. Furthermore, -panels- can be used as a prefix command to other Stata commands to apply them to panel units instead of individual observations. This is useful, for example, if you want to compute frequency distributions or summary statistics for panel characteristics.
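The counting part of -panels- can be mimicked in a few lines of Python; the sketch below is a rough analogue only (the row format and returned fields are assumptions, and the Stata command's prefix usage is not reproduced here):

```python
from collections import Counter

def panels(rows, key):
    """Rough analogue of Stata's -panels-: count the panels (groups)
    defined by `key` in a list of dict-like rows and summarise the
    distribution of panel sizes."""
    sizes = Counter(row[key] for row in rows)  # observations per panel id
    counts = list(sizes.values())
    return {
        "n_panels": len(counts),
        "min_size": min(counts),
        "avg_size": sum(counts) / len(counts),
        "max_size": max(counts),
    }
```

For an unbalanced panel with ids 1, 1, 2, 2, 2, 3 this reports 3 panels with sizes ranging from 1 to 3, which is the kind of summary -panels- prints.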
Abstract:
The analysis of large amounts of data is a field with many years of research behind it, centred on extracting significant values so that data become easier to understand and interpret. The analysis of interdependence between time series is an important field of research, mainly as a result of advances in the characterization of dynamical systems from the signals they produce. In medicine, many studies try to understand the behaviour of the brain, its mode of operation, and its internal connections. The human brain comprises approximately 10^11 neurons, each of which makes about 10^3 synaptic connections. This huge number of connections between individual processing elements provides the fundamental substrate for neuronal ensembles to become transiently synchronized or functionally connected. A similarly complex network configuration and dynamics can also be found at the macroscopic scales of systems neuroscience and brain imaging. The emergence of dynamically coupled cell assemblies represents the neurophysiological substrate for cognitive functions such as perception, learning, and thinking. Understanding the complex network organization of the brain on the basis of neuroimaging data represents one of the most impervious challenges for systems neuroscience. Brain connectivity is an elusive concept that refers to different interrelated aspects of brain organization: structural connectivity, functional connectivity (FC), and effective connectivity (EC). Structural connectivity refers to a network of physical connections linking sets of neurons; it is the anatomical structure of brain networks. FC, by contrast, refers to the statistical dependence between the signals stemming from two distinct units within a nervous system, while EC refers to the causal interactions between them. This research opens the door to tackling diseases related to the brain, such as Parkinson's disease, senile dementia, and mild cognitive impairment. One of the most important projects associated with research on Alzheimer's and other diseases is the European project called Blue Brain. The Center for Biomedical Technology (CTB) of Universidad Politecnica de Madrid (UPM) forms part of the project. The CTB researchers have developed a magnetoencephalography (MEG) data processing tool that allows data to be visualised and analysed in an intuitive way. This tool is named HERMES, and it is presented in this document.
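As a minimal illustration of functional connectivity in this statistical sense, the sketch below computes a pairwise correlation matrix between recorded signals. HERMES implements many richer interdependence measures (phase synchronization, Granger causality, and others), so this is only a stand-in for the simplest case:

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def fc_matrix(signals):
    """Functional-connectivity matrix: pairwise statistical dependence
    (here, plain linear correlation) between recorded time series,
    e.g. MEG channels."""
    n = len(signals)
    return [[pearson(signals[i], signals[j]) for j in range(n)]
            for i in range(n)]
```

Two channels that rise and fall together get an entry near +1, anticorrelated channels near -1; note that such statistical dependence says nothing by itself about causal direction, which is what distinguishes FC from EC.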