925 results for OPTICAL PERFORMANCE MONITORING


Relevance: 30.00%

Abstract:

[EN] We propose four algorithms for computing the inverse optical flow between two images. We assume that the forward optical flow has already been obtained and we need to estimate the flow in the backward direction. The forward and backward flows can be related through a warping formula, which allows us to propose very efficient algorithms. These are presented in increasing order of complexity. The proposed methods provide high accuracy with low memory requirements and low running times; in general, the processing reduces to one or two image passes. Typically, when objects move in a sequence, some regions may appear or disappear. Finding the inverse flow in these situations is difficult and, in some cases, a correct solution cannot be obtained. Our algorithms deal with occlusions easily and reliably, whereas disocclusions have to be handled in a post-processing step, for which we propose three filling approaches. In the experimental results, we use standard synthetic sequences to study the performance of the proposed methods and show that they yield very accurate solutions. We also analyze the performance of the filling strategies.
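The warping relation mentioned above can be illustrated with a short sketch: if the forward flow maps x to x + u(x), then the backward flow at the warped position is approximately -u(x). The code below is only a minimal illustration of that relation, not the authors' four algorithms; the nearest-neighbour splatting, the last-write-wins collision rule and the neighbourhood-average disocclusion filling are assumptions made for brevity.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def inverse_flow(forward):
    """Estimate the backward flow from a forward flow field.

    forward: (H, W, 2) array with the (dx, dy) displacement of each pixel.
    Returns an (H, W, 2) backward flow; pixels never reached by the forward
    warp (disocclusions) are filled with a local average as a crude post-process.
    """
    H, W, _ = forward.shape
    backward = np.zeros_like(forward, dtype=float)
    hits = np.zeros((H, W), dtype=bool)

    ys, xs = np.mgrid[0:H, 0:W]
    tx = np.rint(xs + forward[..., 0]).astype(int)   # target column of each pixel
    ty = np.rint(ys + forward[..., 1]).astype(int)   # target row of each pixel
    inside = (tx >= 0) & (tx < W) & (ty >= 0) & (ty < H)

    # Splat the negated forward vector onto the warped position.
    # On collisions (occlusions) the last write wins -- a deliberate simplification.
    backward[ty[inside], tx[inside]] = -forward[inside]
    hits[ty[inside], tx[inside]] = True

    # Fill disocclusions (never-written pixels) with a neighbourhood average.
    for c in range(2):
        num = uniform_filter(backward[..., c] * hits, size=5)
        den = uniform_filter(hits.astype(float), size=5)
        fill = np.divide(num, den, out=np.zeros_like(num), where=den > 0)
        backward[..., c] = np.where(hits, backward[..., c], fill)
    return backward
```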

Relevance: 30.00%

Abstract:

[EN] We analyze the discontinuity preserving problem in TV-L1 optical flow methods. These methods typically create rounded effects at flow boundaries, which usually do not coincide with object contours. A simple strategy to overcome this problem consists of inhibiting the diffusion at high image gradients. In this work, we first introduce a general framework for TV regularizers in optical flow and relate it to some standard approaches. Our survey takes into account several methods that use decreasing functions to mitigate the diffusion at image contours. However, this kind of strategy may produce instabilities in the estimation of the optical flow. Hence, we study the problem of instabilities and show that it actually arises from an ill-posed formulation. From this study, different schemes to solve the problem emerge. One of these consists of separating the pure TV process from the mitigating strategy; this scheme has been used in previous work, and we demonstrate here that it performs well. Furthermore, we propose two alternatives to avoid the instability problems: (i) a fully automatic approach that solves the problem based on the information of the whole image; (ii) a semi-automatic approach that takes into account the image gradients in a close neighborhood, adapting the parameter at each position. In the experimental results, we present a detailed study and comparison of the different alternatives. These methods provide very good results, especially for sequences with a few dominant gradients. Additionally, a surprising effect of these approaches is that they can cope with occlusions. This can be easily achieved by using strong regularizations and high penalizations at image contours.
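As an illustration of the kind of functional covered by such a survey, a TV-L1 optical flow energy with a gradient-dependent weight can be written as below; the exponential form of g is a common choice assumed here for concreteness rather than the exact formulation analyzed in the work.

```latex
E(u) \;=\; \int_{\Omega} g\big(|\nabla I_0|\big)\,\big(|\nabla u_1| + |\nabla u_2|\big)\,dx
\;+\; \lambda \int_{\Omega} \big|\,I_1\big(x+u(x)\big) - I_0(x)\,\big|\,dx,
\qquad g(s) = e^{-\alpha s^{\beta}},
```

where u = (u1, u2) is the flow field, λ weights the data attachment term, and the decreasing function g inhibits diffusion where the gradient of I0 is strong; it is this kind of weighting whose ill-posedness and possible decoupling from the pure TV term the abstract discusses.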

Relevance: 30.00%

Abstract:

Although recovery is often described as the least studied and documented phase of the Emergency Management Cycle, a wide literature is available describing the characteristics and sub-phases of this process. Previous works do not allow an overall perspective to be gained, because recovery has not been monitored systematically and consistently using advanced technologies such as remote sensing and GIS. Taking into consideration the key role of remote sensing in response and damage assessment, this thesis aims to verify the appropriateness of such advanced monitoring techniques for detecting recovery advancements over time, with close attention to the main characteristics of the study event: the Hurricane Katrina storm surge. Based on multi-source, multi-sensor and multi-temporal data, the post-Katrina recovery was analysed using both a qualitative and a quantitative approach. The first phase was dedicated to the investigation of the relation between urban types, damage and recovery state, with reference to geographical and technological parameters. Damage and recovery scales were proposed to review critical observations on remarkable surge-induced effects on various typologies of structures, analyzed at a per-building level. This wide-ranging investigation allowed a new understanding of the distinctive features of the recovery process. A quantitative analysis was employed to develop methodological procedures suited to recognizing and monitoring the distribution, timing and characteristics of recovery activities in the study area. Promising results, gained by applying supervised classification algorithms to detect the localization and distribution of blue tarps, proved that this methodology may help the analyst in the detection and monitoring of recovery activities in areas affected by medium damage. The study found that Mahalanobis distance was the classifier that provided the most accurate results in localising blue roofs, with 93.7% of blue roofs classified correctly and a producer accuracy of 70%; it was also the classifier least sensitive to spectral signature alteration. The application of dissimilarity textural classification to satellite imagery demonstrated the suitability of this technique for the detection of debris distribution and for the monitoring of demolition and reconstruction activities in the study area. Linking these geographically extensive techniques with expert per-building interpretation of advanced-technology ground surveys provides a multi-faceted view of the physical recovery process. Remote sensing and GIS technologies, combined with an advanced ground survey approach, provide extremely valuable capabilities for monitoring recovery activities and may constitute a technical basis to guide aid organizations and local governments in recovery management.
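The classifier that the study found most accurate, Mahalanobis distance, assigns each pixel to the class whose training statistics it is closest to in the Mahalanobis sense. The sketch below is a generic, minimal version of such a classifier; the class names, array shapes and the commented usage are assumptions for illustration and not the thesis's actual processing chain.

```python
import numpy as np

def fit_mahalanobis(training_pixels):
    """training_pixels: dict mapping class name -> (N, bands) array of samples."""
    stats = {}
    for label_name, samples in training_pixels.items():
        mean = samples.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(samples, rowvar=False))
        stats[label_name] = (mean, cov_inv)
    return stats

def classify(image, stats):
    """image: (H, W, bands) array. Returns an (H, W) array of class labels."""
    H, W, B = image.shape
    pixels = image.reshape(-1, B)
    labels = list(stats)
    dists = np.empty((pixels.shape[0], len(labels)))
    for j, label_name in enumerate(labels):
        mean, cov_inv = stats[label_name]
        d = pixels - mean
        # Squared Mahalanobis distance of every pixel to this class.
        dists[:, j] = np.einsum('ij,jk,ik->i', d, cov_inv, d)
    best = dists.argmin(axis=1)
    return np.array(labels, dtype=object)[best].reshape(H, W)

# Hypothetical usage with training samples for 'blue_roof' vs. 'other':
# stats = fit_mahalanobis({'blue_roof': roof_px, 'other': other_px})
# label_map = classify(scene, stats)
```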

Relevance: 30.00%

Abstract:

The motivation for the work presented in this thesis is to retrieve profile information for the atmospheric trace constituents nitrogen dioxide (NO2) and ozone (O3) in the lower troposphere from remote sensing measurements. The remote sensing technique used, referred to as Multiple AXis Differential Optical Absorption Spectroscopy (MAX-DOAS), is a recent technique that represents a significant advance on the well-established DOAS, especially as concerns the study of tropospheric trace constituents. NO2 is an important trace gas in the lower troposphere because it is involved in the production of tropospheric ozone; ozone and nitrogen dioxide are key factors in determining air quality, with consequences, for example, for human health and the growth of vegetation. To understand NO2 and ozone chemistry in more detail, not only the ground-level concentrations but also the vertical distribution must be known. In fact, the budget of nitrogen oxides and ozone in the atmosphere is determined both by local emissions and by non-local chemical and dynamical processes (i.e. diffusion and transport at various scales) that greatly affect their vertical and temporal distribution; a tool to resolve the vertical profile information is therefore essential. Useful measurement techniques for atmospheric trace species should fulfill at least two main requirements. First, they must be sufficiently sensitive to detect the species under consideration at their ambient concentration levels. Second, they must be specific, which means that the results of the measurement of a particular species must be neither positively nor negatively influenced by any other trace species simultaneously present in the probed volume of air. Air monitoring by spectroscopic techniques has proven to be a very useful tool for fulfilling these requirements while offering a number of other desirable properties. During the last decades, many such instruments have been developed, based on the absorption properties of the constituents in various regions of the electromagnetic spectrum, ranging from the far infrared to the ultraviolet. Among them, Differential Optical Absorption Spectroscopy (DOAS) has played an important role. DOAS is an established remote sensing technique for probing atmospheric trace gases, which identifies and quantifies them by taking advantage of their molecular absorption structures in the near-UV and visible wavelengths of the electromagnetic spectrum (from 0.25 μm to 0.75 μm). Passive DOAS, in particular, can detect the presence of a trace gas in terms of its concentration integrated over the atmospheric path from the sun to the receiver (the so-called slant column density). The receiver can be located on the ground, or on board an aircraft or a satellite platform. Passive DOAS therefore has a flexible measurement configuration that allows multiple applications. The ability to properly interpret passive DOAS measurements of atmospheric constituents depends crucially on how well the optical path of the light collected by the system is understood. This is because the final product of DOAS is the concentration of a particular species integrated along the path that radiation covers in the atmosphere. This path is not known a priori and can only be evaluated by Radiative Transfer Models (RTMs).
These models are used to calculate the so-called vertical column density of a given trace gas, obtained by dividing the measured slant column density by the so-called air mass factor (AMF), which quantifies the enhancement of the light path length within the absorber layers. In the case of the standard DOAS set-up, in which radiation is collected along the vertical direction (zenith-sky DOAS), calculations of the air mass factor have been made using “simple” single-scattering radiative transfer models. This configuration has its highest sensitivity in the stratosphere, in particular during twilight. This is the result of the large enhancement of the stratospheric light path at dawn and dusk combined with a relatively short tropospheric path. In order to increase the sensitivity of the instrument to tropospheric signals, measurements with the telescope pointing towards the horizon (off-axis DOAS) have to be performed. In these circumstances, the light path in the lower layers can become very long, necessitating the use of radiative transfer models that include multiple scattering and a full treatment of atmospheric sphericity and refraction. In this thesis, a recent development of the well-established DOAS technique is described, referred to as Multiple AXis Differential Optical Absorption Spectroscopy (MAX-DOAS). MAX-DOAS consists of the simultaneous use of several off-axis directions near the horizon: with this configuration, not only is the sensitivity to tropospheric trace gases greatly improved, but vertical profile information can also be retrieved by combining the simultaneous off-axis measurements with sophisticated RTM calculations and inversion techniques. In particular, there is a need for an RTM capable of dealing with all the processes occurring along the light path, supporting all the DOAS geometries used, and treating multiple scattering events with the varying phase functions involved. To achieve these multiple goals, a statistical approach based on the Monte Carlo technique should be used. A Monte Carlo RTM generates an ensemble of random photon paths between the light source and the detector, and uses these paths to reconstruct a remote sensing measurement. Within the present study, the Monte Carlo radiative transfer model PROMSAR (PROcessing of Multi-Scattered Atmospheric Radiation) has been developed and used to correctly interpret the slant column densities obtained from MAX-DOAS measurements. In order to derive the vertical concentration profile of a trace gas from its slant column measurement, the AMF is only one part of the quantitative retrieval process. One indispensable requirement is a robust approach to inverting the measurements and obtaining the unknown concentrations, the air mass factors being known. For this purpose, the present thesis uses the Chahine relaxation method. Ground-based multiple-axis DOAS, combined with appropriate radiative transfer models and inversion techniques, is a promising tool for atmospheric studies in the lower troposphere and boundary layer, including the retrieval of profile information with a good degree of vertical resolution. This thesis has presented an application of this powerful, comprehensive tool to the study of a preserved natural Mediterranean area (the Castel Porziano Estate, located 20 km south-west of Rome) where pollution is transported from remote sources.
Application of this tool in densely populated or industrial areas is beginning to look particularly fruitful and represents an important subject for future studies.
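To make the inversion step concrete, the sketch below shows the general shape of a Chahine-type relaxation: each off-axis geometry is associated with the altitude layer where its box air mass factor peaks, and that layer's concentration is rescaled by the ratio of measured to modelled slant column. The pairing rule, the convergence criterion and all variable names are assumptions for illustration; the thesis's implementation, driven by PROMSAR air mass factors, may differ in detail.

```python
import numpy as np

def chahine_retrieval(scd_meas, box_amf, dz, x0, n_iter=50, tol=1e-3):
    """Chahine-style relaxation for a trace-gas concentration profile.

    scd_meas : (J,) measured slant column densities, one per viewing geometry
    box_amf  : (J, L) box air mass factors (geometry x layer), e.g. from an RTM
    dz       : (L,) layer thicknesses
    x0       : (L,) first-guess concentration profile
    """
    x = x0.astype(float).copy()
    # Associate each geometry with the layer where its box AMF peaks.
    peak_layer = box_amf.argmax(axis=1)
    for _ in range(n_iter):
        scd_calc = box_amf @ (x * dz)      # forward model: SCD_j = sum_l AMF_jl * x_l * dz_l
        ratio = scd_meas / scd_calc
        x_new = x.copy()
        for j, l in enumerate(peak_layer):
            x_new[l] *= ratio[j]           # relax the layer associated with geometry j
        if np.max(np.abs(x_new - x) / np.maximum(x, 1e-30)) < tol:
            return x_new
        x = x_new
    return x
```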

Relevance: 30.00%

Abstract:

Nowadays, computing is migrating from traditional high-performance and distributed computing to pervasive and utility computing based on heterogeneous networks and clients. The current trend suggests that future IT services will rely on distributed resources and on fast communication of heterogeneous contents. The success of this new range of services is directly linked to the effectiveness of the infrastructure in delivering them. The communication infrastructure will be an aggregation of different technologies, even though the current trend suggests the emergence of a single IP-based transport service. Optical networking is a key technology for answering the increasing requests for dynamic bandwidth allocation and for configuring multiple topologies over the same physical-layer infrastructure; however, optical networks today are still far from allowing network services to be configured and offered directly, and they need to be enriched with more user-oriented functionalities. Current Control Plane architectures only facilitate efficient end-to-end connectivity provisioning and cannot meet future network service requirements, e.g. the coordinated control of resources. The overall objective of this work is to improve the usability and accessibility of the services provided by the optical network. More precisely, the definition of a service-oriented architecture is the enabling technology that allows user applications to benefit from advanced services over an underlying dynamic optical layer. The definition of a service-oriented networking architecture based on advanced optical network technologies facilitates access by users and applications to abstracted levels of information regarding the offered advanced network services. This thesis faces the problem of defining a Service Oriented Architecture and its relevant building blocks, protocols and languages. In particular, this work has focused on the use of the SIP protocol as an inter-layer signalling protocol, which defines the Session Plane in conjunction with the Network Resource Description language. On the other hand, an advanced optical network must accommodate high data bandwidth with different granularities. Currently, two main technologies are emerging to promote the development of the future optical transport network: Optical Burst Switching and Optical Packet Switching. Both technologies promise to provide all-optical burst or packet switching instead of the current circuit switching. However, the electronic domain is still present in the scheduler's forwarding and routing decisions. Because of the high optical transmission rates, the burst or packet scheduler faces a difficult challenge; consequently, a high-performance, timing-focused design of both the memory and the forwarding logic is needed. This open issue has been faced in this thesis by proposing a highly efficient implementation of a burst and packet scheduler. The main novelty of the proposed implementation is that the scheduling problem is turned into the simple calculation of a min/max function whose complexity is almost independent of the traffic conditions.
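As an illustration of how channel scheduling can reduce to a min/max computation, the sketch below implements a simple horizon-based scheduler in the spirit of the textbook LAUC (Latest Available Unused Channel) rule: for each incoming burst it selects, among the wavelength channels whose horizon does not exceed the burst's arrival time, the one with the maximum horizon. This is only an illustration of the min/max idea and is not the hardware design proposed in the thesis.

```python
class HorizonScheduler:
    """Keep one 'horizon' (time at which the channel becomes free) per wavelength."""

    def __init__(self, n_channels):
        self.horizon = [0.0] * n_channels

    def schedule(self, arrival_time, duration):
        """Return the chosen channel index, or None if the burst must be dropped."""
        # Channels already free when the burst arrives.
        eligible = [c for c, h in enumerate(self.horizon) if h <= arrival_time]
        if not eligible:
            return None                      # no void-filling in this simple variant
        # LAUC rule: pick the eligible channel with the *maximum* horizon,
        # i.e. the one that leaves the smallest idle gap before the burst.
        best = max(eligible, key=lambda c: self.horizon[c])
        self.horizon[best] = arrival_time + duration
        return best

# Example: two channels, three bursts given as (arrival, duration).
sched = HorizonScheduler(n_channels=2)
for arrival, dur in [(0.0, 5.0), (1.0, 3.0), (2.0, 4.0)]:
    print(sched.schedule(arrival, dur))      # -> 0, 1, None (third burst dropped)
```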

Relevance: 30.00%

Abstract:

Clusters have increasingly become an essential part of policy discourses at all levels (EU, national, regional) dealing with regional development, competitiveness, innovation, entrepreneurship and SMEs. These impressive efforts to promote the concept of clusters in the policy-making arena have been accompanied by much less academic and scientific research investigating the actual economic performance of firms in clusters and the design and execution of cluster policies, or going beyond single case studies towards a more methodologically integrated and comparative approach to the study of clusters and their real-world impact. The theoretical background is far from consolidated, and there is a variety of methodologies and approaches for studying and interpreting this phenomenon, while at the same time there is little comparability among studies of actual cluster performance. The conceptual framework of clustering suggests that clusters affect performance, but theory makes little prediction as to the ultimate distribution of the value they create. This thesis takes the case of Eastern European countries for two reasons. One is that clusters, as coopetitive environments, are a new phenomenon there, since the previous centrally planned system did not allow for such forms of firm organization. The other is that, as new EU member states, these countries have been subject to the increased popularization of the cluster policy approach by the European Commission, especially in the framework of the National Reform Programmes related to the Lisbon objectives. The originality of the work lies in the fact that, starting from an overview of theoretical contributions on clustering, it offers a comparative empirical study of clusters in transition countries; there have been very few attempts in the literature to examine cluster performance in a comparative cross-country perspective. It adds an analysis of cluster policies and their implementation, or lack thereof, as a way to examine how the cluster concept has been introduced into transition economies. Our findings show that the implementation of cluster policies does vary across countries, with some having embraced the approach more than others. The specific modes of implementation, however, are very similar, based mostly on soft measures such as funding for cluster initiatives, usually directed towards the creation of cluster management structures or cluster facilitators. They are essentially founded on the common assumption that the added value of clusters lies in the creation of linkages among firms, human capital, skills and knowledge at the local level, most often perceived as the regional level. Often, geographical proximity is not a required element in the application process, and cluster applications are very similar to network membership. Cluster mapping is rarely a factor in the selection of cluster initiatives for funding, and the related questions of critical mass and expected outcomes are not considered. In fact, monitoring and evaluation are not elements of the cluster policy cycle that have received much attention. Bulgaria and the Czech Republic are the countries that have implemented cluster policies most decisively; Hungary and Poland have made significant efforts, while Slovakia and Romania have used cluster initiatives only sporadically and not systematically.
When examining whether firms located within regional clusters in fact perform better and are more efficient than similar firms outside clusters, we do find positive results across countries and across sectors. The only country where location in a cluster has a negative impact is the Czech Republic.

Relevance: 30.00%

Abstract:

Satellite remote sensing has proved to be an effective support for the timely detection and monitoring of marine oil pollution, mainly due to illegal ship discharges. In this context, we have developed a new methodology and technique for optical oil spill detection that makes use of MODIS L2 and MERIS L1B top-of-atmosphere (TOA) reflectance imagery, for the first time in a highly automated way. The main idea is to combine the wide swaths and short revisit times of optical sensors with the SAR observations generally used in oil spill monitoring; this arises from the need to overcome the reduced coverage and long revisit time of SAR over the monitored area. This is now feasible given the higher spatial resolution of MODIS and MERIS with respect to older sensors (250-300 m vs. 1 km), which permits the identification of the smaller spills deriving from illicit discharges at sea. The procedure to obtain identifiable spills in optical reflectance images involves the removal of the natural oceanic and atmospheric variability, in order to enhance the oil-water contrast; image clustering, whose purpose is to segment any oil spill possibly present in the image; and, finally, the application of a set of criteria to eliminate features that merely resemble spills (look-alikes). The final result is a classification of oil spill candidate regions by means of a score based on the above criteria.
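The three-stage procedure described above (variability removal, clustering, look-alike elimination) might be prototyped roughly as in the following sketch. The large-window median filter for the background, the two-cluster k-means segmentation and the area/contrast criteria are assumptions chosen for brevity; the actual algorithm and its scoring of candidate regions are more elaborate.

```python
import numpy as np
from scipy.ndimage import median_filter, label
from sklearn.cluster import KMeans

def detect_candidate_spills(reflectance, min_area_px=50, min_contrast=0.01):
    """reflectance: 2-D TOA reflectance band. Returns a labelled mask of candidate spills."""
    # 1) Remove large-scale oceanic/atmospheric variability to enhance oil-water contrast.
    background = median_filter(reflectance, size=51)
    anomaly = reflectance - background

    # 2) Cluster pixels into two classes; here oil is assumed to appear as a dark anomaly.
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(anomaly.reshape(-1, 1))
    dark_cluster = int(np.argmin(km.cluster_centers_.ravel()))
    dark_mask = km.labels_.reshape(anomaly.shape) == dark_cluster

    # 3) Simple look-alike elimination: keep connected regions that are large
    #    and sufficiently darker than their surroundings.
    regions, n = label(dark_mask)
    keep = np.zeros_like(dark_mask)
    for r in range(1, n + 1):
        region = regions == r
        if region.sum() >= min_area_px and -anomaly[region].mean() >= min_contrast:
            keep |= region
    return label(keep)[0]
```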

Relevance: 30.00%

Abstract:

Over the last three decades, sensors based on the phenomenon of surface plasmon resonance have proven particularly suitable for real-time thin-film characterization, gas detection, biomolecular interaction examination, and as a supplement to electrochemical methods. Systems based on prism coupling have been combined with fluorescence detection under the name of surface plasmon fluorescence spectroscopy to increase sensitivity even further. Alternatively, metal gratings can be employed to momentum-match photons to surface plasmons. The real-time monitoring of binding reactions has not yet been reported for the combination of fluorescence detection and grating coupling. Grating-based systems promise more competitive products because of reduced operating costs and offer benefits for device engineering. This thesis comprises a comprehensive study of the suitability of grating coupling for fluorescence-based analyte detection. Fundamental properties of grating-coupled surface plasmon fluorescence spectroscopy are described, as well as issues related to the commercial realization of the method. Several new experimental techniques are introduced and demonstrated in order to optimize performance in certain areas and improve upon the capabilities of prism-based systems. Holographically fabricated gratings are characterized by atomic force microscopy and optical methods, aided by simulations, and the profile parameters responsible for efficient coupling are analyzed. The directional emission of fluorophores immobilized on a grating surface is studied in detail, including the magnitude and geometry of the fluorescence emission pattern for different grating constants and polarizations. Additionally, the separation between the minimum of the reflected intensity and the position of maximum fluorescence excitation is examined. One of the key requirements for the commercial feasibility of grating coupling is the cheap and faithful mass production of disposable samples from a given master grating. To address this, the replication of gratings by a simple hot-embossing method is demonstrated with good reproducibility. The in-situ fluorescence detection of analyte immobilization and affinity measurements using grating coupling are described for the first time. The physical factors related to the sensitivity of the technique are assessed and its lower limit of detection is determined for an exemplary assay. Particular attention is paid to the contribution of bulk fluorophores to the total signal in terms of the magnitude and polarization of incident and emitted light. Emission from the bulk can be a limiting factor for experiments with certain assay formats. For that reason, a novel optical method, based on the modulation of both the polarization and the intensity of the incident beam, is introduced and demonstrated to be capable of eliminating this contribution.
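For orientation, the momentum matching that a grating of period Λ provides between incident light and the surface plasmon is the standard textbook relation below; it is quoted here for context and is not taken from the thesis.

```latex
k_{\mathrm{SP}} \;=\; \frac{\omega}{c}\sqrt{\frac{\varepsilon_m\,\varepsilon_d}{\varepsilon_m+\varepsilon_d}}
\;=\; \frac{\omega}{c}\,n_d \sin\theta \;+\; m\,\frac{2\pi}{\Lambda},
\qquad m = \pm 1, \pm 2, \ldots
```

Here εm and εd are the dielectric functions of the metal and the adjacent dielectric, n_d is the refractive index of the medium of incidence, θ the angle of incidence and m the diffraction order; the grating vector 2π/Λ supplies the momentum that a flat surface cannot.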

Relevance: 30.00%

Abstract:

Physicochemical experimental techniques combined with the specificity of a biological recognition system have resulted in a variety of new analytical devices known as biosensors. Biosensors are under intensive development worldwide because they have many potential applications, e.g. in the fields of clinical diagnostics, food analysis, and environmental monitoring. Much effort is spent on the development of highly sensitive sensor platforms to study interactions on the molecular scale. In the first part, this thesis focuses on exploring the biosensing applications of nanoporous gold (NPG) membranes. NPG with randomly distributed nanopores (pore sizes less than 50 nm) is discussed here. The NPG membrane shows unique plasmonic features, i.e. it supports both propagating and localized surface plasmon resonance modes (p-SPR and l-SPR, respectively), both offering sensitive probing of the local refractive index variation on and in the NPG. Surface refractive index sensors have an inherent advantage over fluorescence-based optical biosensors, which require a chromophoric group or other luminescent label to transduce the binding event. In the second part, gold/silica composite inverse opals with macroporous structures were investigated with bio- or chemical sensing applications in mind. These samples combined the advantages of a larger available gold surface area with a regular and highly ordered grating structure. The plasmon signal was less noisy in these ordered substrate structures than in the random pore structures of the NPG samples. In the third part of the thesis, surface plasmon resonance (SPR) spectroscopy was applied to probe the protein-protein interaction of the calcium-binding protein centrin with the heterotrimeric G-protein transducin on a newly designed sensor platform. SPR spectroscopy was intended to elucidate how the binding of centrin to transducin is regulated, with a view to understanding centrin function in photoreceptor cells.

Relevance: 30.00%

Abstract:

This thesis proposes design methods and test tools for optical systems intended for use in an industrial environment, where not only precision and reliability but also ease of use are important. The approach has been conceived to be as general as possible, although in the present work the design of a portable device for automatic identification applications has been studied, since this doctorate was funded by Datalogic Scanning Group s.r.l., a world-class producer of barcode readers. The main functional components of the complete device are the electro-optical imaging, illumination and pattern-generator systems. For the electro-optical imaging system, a characterization tool and an analysis tool have been developed to check whether the desired performance of the system has been achieved. Moreover, two design tools for optimizing the imaging system have been implemented. The first optimizes just the core of the system, the optical part, improving its performance while ignoring all other contributions and generating a good starting point for the optimization of the whole complex system. The second optimizes the system taking into account its behavior with a model as close as possible to reality, including optics, electronics and detection. For the illumination and pattern-generator systems, two further tools have been implemented. The first allows the design of free-form lenses, described by an arbitrary analytical function and excited by an incoherent source, and is able to provide custom illumination conditions for all kinds of applications. The second consists of a new method to design Diffractive Optical Elements excited by a coherent source for large pattern angles using the Iterative Fourier Transform Algorithm. Validation of the design tools has been obtained, whenever possible, by comparing the performance of the designed systems with that of fabricated prototypes; in other cases simulations have been used.
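The Iterative Fourier Transform Algorithm mentioned for DOE design alternates between the element plane and the far field, enforcing the known constraint in each plane. The sketch below is a generic, phase-only, Gerchberg-Saxton-style IFTA included to illustrate the principle; the method developed in the thesis for large pattern angles involves refinements not shown here, and the example target is hypothetical.

```python
import numpy as np

def ifta_phase_only(target_intensity, n_iter=100, seed=0):
    """Compute a phase-only DOE whose far field approximates target_intensity.

    target_intensity: 2-D array with the desired far-field intensity pattern.
    Returns the DOE phase (same shape), in radians.
    """
    rng = np.random.default_rng(seed)
    target_amp = np.sqrt(target_intensity / target_intensity.sum())
    # Start from unit amplitude with a random phase in the DOE plane.
    field = np.exp(1j * rng.uniform(0, 2 * np.pi, target_amp.shape))
    for _ in range(n_iter):
        far = np.fft.fft2(field)
        # Far-field constraint: impose the target amplitude, keep the phase.
        far = target_amp * np.exp(1j * np.angle(far))
        field = np.fft.ifft2(far)
        # DOE-plane constraint: phase-only element illuminated by a uniform beam.
        field = np.exp(1j * np.angle(field))
    return np.angle(field)

# Hypothetical usage: a 256x256 target with a bright square in the centre.
target = np.zeros((256, 256))
target[96:160, 96:160] = 1.0
doe_phase = ifta_phase_only(target)
```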

Relevance: 30.00%

Abstract:

The objective of this study is to provide empirical evidence on how ownership structure and owner identity affect performance in the banking industry, using a panel of Indonesian banks over the period 2000–2009. Firstly, we analysed the impact of the presence of multiple blockholders on bank ownership structure and performance. Building on multiple agency and principal-principal theories, we investigated whether the presence of, and the dispersion of shares across, blockholders with different identities (i.e. central and regional government, families, foreign banks and financial institutions) affected bank performance in terms of profitability and efficiency. We found that the number of blockholders has a negative effect on banks' performance, while blockholder concentration has a positive effect. Moreover, we observed that the dispersion of ownership across different types of blockholders has a negative effect on banks' performance. We interpret these results as evidence that, when heterogeneous blockholders are present, the disadvantage arising from conflicts of interest between blockholders seems to outweigh the advantage of the additional monitoring provided by additional blockholders. Secondly, we conducted a joint analysis of the static, selection, and dynamic effects of different types of ownership on banks' performance. We found that regional banks and foreign banks have higher profitability and efficiency than domestic private banks. In the short run, foreign acquisitions and domestic M&As reduce the level of overhead costs, while in the long run they increase the Net Interest Margin (NIM). Further, we analysed the determinants of the NIM to assess the impact of ownership on bank business orientation. Our findings support our prediction that the determinants of the NIM differ according to the type of bank ownership. We also observed that banks that experienced changes in ownership, such as foreign-acquired banks, exhibit different interest margin determinants from domestic or foreign banks that did not experience ownership rearrangements.
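One common way to operationalize a joint static/selection/dynamic analysis of ownership effects is a panel specification of the general form below; this equation is an illustrative assumption about the modelling approach, not the exact regression estimated in the study.

```latex
\text{Perf}_{it} \;=\; \beta_0
\;+\; \sum_{k} \beta^{\text{static}}_{k}\,\text{Own}^{k}_{it}
\;+\; \sum_{k} \beta^{\text{select}}_{k}\,\text{WillChange}^{k}_{i}
\;+\; \sum_{k}\sum_{s \ge 0} \beta^{\text{dynamic}}_{k,s}\,\text{Changed}^{k}_{i,t-s}
\;+\; \gamma' X_{it} \;+\; \varepsilon_{it}
```

Here Perf is a profitability or efficiency measure for bank i in year t, the Own dummies capture static ownership-type effects, the WillChange dummies capture selection (banks that later undergo an ownership change of type k), the Changed terms trace the dynamic effect s years after such a change, and X collects control variables.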

Relevance: 30.00%

Abstract:

The optical resonances of metallic nanoparticles placed at nanometer distances from a metal plane were investigated. At certain wavelengths, these “sphere-on-plane” systems become resonant with the incident electromagnetic field, and huge field enhancements are predicted, localized in the small gap created between the nanoparticle and the plane. An experimental architecture to fabricate sphere-on-plane systems was successfully achieved in which, in addition to the commonly used alkanethiols, polyphenylene dendrimers were used as molecular spacers to separate the metallic nanoparticles from the metal planes. They allow for a defined nanoparticle-plane separation, and some are functionalized with a chromophore core, which is thereby positioned exactly in the gap. The metal planes used in the system architecture consisted of evaporated thin films of gold or silver. Evaporated gold or silver films have a smooth interface with their substrate and a rougher top surface. To investigate the influence of surface roughness on the optical response of such a film, two gold films with a smooth and a rough side were prepared to be as similar as possible. Surface plasmons were excited in the Kretschmann configuration both on the rough and on the smooth side. Each individual measurement could be well modeled by a single gold film; to describe both sides consistently, however, the film has to be modeled as two layers with significantly different optical constants. The smooth side, although polycrystalline, had an optical response very similar to that of a monocrystalline surface, while for the rough side the standard response of evaporated gold was retrieved. For investigations of thin, non-absorbing dielectric films, though, this heterogeneity introduces only a negligible error. To determine the resonant wavelength of the sphere-on-plane systems, a strategy was developed based on multi-wavelength surface plasmon spectroscopy experiments in the Kretschmann configuration. The resonant behavior of the system led to characteristic changes in the surface plasmon dispersion. A quantitative analysis was performed by calculating the polarisability per unit area, α/A, treating the sphere-on-plane systems as an effective layer. This approach completely avoids the ambiguity in the determination of the thickness and optical response of thin films in surface plasmon spectroscopy: equal area densities of polarisable units yielded identical responses irrespective of the thickness of the layer in which they are distributed. The parameter range in which the evaluation of surface plasmon data in terms of α/A is applicable was determined for a typical experimental situation. It was shown that this analysis yields reasonable quantitative agreement with a simple theoretical model of the sphere-on-plane resonators and reproduces the results of standard extinction experiments, while having a higher information content and a significantly increased signal-to-noise ratio. With the objective of acquiring a better quantitative understanding of the dependence of the resonance wavelength on the geometry of the sphere-on-plane systems, different systems were fabricated in which the gold nanoparticle size, the type of spacer and the ambient medium were varied, and the resonance wavelength of each system was determined. The gold nanoparticle radius was varied in the range from 10 nm to 80 nm. It could be shown that the polyphenylene dendrimers can be used as molecular spacers to fabricate systems that support gap resonances.
The resonance wavelength of the systems could be tuned in the optical region between 550 nm and 800 nm. Based on a simple analytical model, a quantitative analysis was developed to relate the systems' geometry to the resonance wavelength, and surprisingly good agreement of this simple model with the experiment was found without any adjustable parameters. The key feature ascribed to sphere-on-plane systems is a very large electromagnetic field localized in volumes on the nanometer scale. Experiments towards a quantitative understanding of the field enhancements taking place in the gap of the sphere-on-plane systems were carried out by monitoring the increase in fluorescence of a metal-supported monolayer of a dye-loaded dendrimer upon decoration of the surface with nanoparticles. The metal used (gold or silver), the mean colloid size and the surface roughness were varied. Large silver crystallites on evaporated silver surfaces led to the most pronounced fluorescence enhancements, on the order of 10⁴. They constitute a very promising sample architecture for the study of field enhancements.
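To give the α/A analysis a concrete form: for a dilute layer of identical polarizable units with N units on an area A, the thin-film limit makes the optical response depend only on the polarizability per unit area through an effective surface susceptibility. The expressions below are standard textbook relations quoted to fix the notation, not the thesis's derivation; in the actual sphere-on-plane system the gap coupling shifts α away from the isolated-sphere value.

```latex
\alpha(\omega) \;=\; 4\pi\varepsilon_0 R^{3}\,
  \frac{\varepsilon_{\mathrm{metal}}(\omega)-\varepsilon_d}{\varepsilon_{\mathrm{metal}}(\omega)+2\varepsilon_d}
\quad\text{(isolated sphere of radius } R \text{ in a medium } \varepsilon_d\text{)},
\qquad
\chi_s \;=\; \frac{n_s\,\alpha}{\varepsilon_0}, \quad n_s = \frac{N}{A},
```

so that two layers with the same polarizability per unit area produce the same surface susceptibility χ_s, and hence the same shift of the surface plasmon dispersion, regardless of the thickness over which the polarizable units are distributed.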

Relevance: 30.00%

Abstract:

Therapeutic drug monitoring (TDM) comprises the measurement of drug concentrations in blood and relates the results to the patient's clinical presentation, on the assumption that blood concentrations correlate better with the effect than the dose does. This also holds for antidepressants. Prerequisites for guiding therapy by TDM are the availability of valid laboratory assays and the correct application of the procedure in the clinic. The aim of this work was to analyze and improve the use of TDM in the treatment of depression. In a first step, a high-performance liquid chromatography (HPLC) method with column switching and spectrophotometric detection was established for the newly approved antidepressant duloxetine and applied to patients for TDM. Analysis of 280 patient samples showed that duloxetine concentrations of 60 to 120 ng/ml were associated with good clinical response and a low risk of adverse effects. Regarding its interaction potential, duloxetine proved to be a weak inhibitor of the cytochrome P450 (CYP) isoenzyme 2D6 compared with other antidepressants, with no indication of clinical relevance. In a second step, a method was to be developed with which as many different antidepressants as possible, including their metabolites, could be measured. To this end, an HPLC method with ultraviolet (UV) detection was developed that allowed the quantitative analysis of ten antidepressant and, additionally, two antipsychotic substances within 25 minutes with adequate precision and accuracy (both above 85%) and sensitivity. Column switching enabled automated analysis of blood plasma or serum; interfering matrix components could be removed on a pre-column without prior sample preparation. The cost- and time-effective procedure was a clear improvement for handling samples in routine laboratory work and thus for the TDM of antidepressants. Analysis of the clinical use of TDM identified a number of application errors. An attempt was therefore made to improve the clinical application of TDM of antidepressants by switching from largely manual documentation to electronic processing, and the effect of this intervention was examined. A laboratory computer system was introduced with which the process from sample receipt to reporting of the results to the wards was handled electronically, and the use of TDM was examined before and after the changeover. The changeover was well accepted by the treating physicians. The laboratory system allowed cumulative retrieval of results and a display of each patient's course of treatment, including previous hospital stays. However, the implementation of the system had only a minor influence on the quality of TDM use. Many test requests were erroneous both before and after the introduction of the system; for example, measurements were frequently requested before steady state had been reached. The speed of sample processing was unchanged compared with the previously manual workflow, as was the analytical quality in terms of accuracy and precision. Explicit recommendations regarding the dosing strategy of the requested substances were frequently not followed.
What did shorten, however, was the mean latency with which a dose adjustment followed the reporting of the laboratory result. Overall, this work has contributed to improving the therapeutic drug monitoring of antidepressants. In clinical practice, however, further interventions are necessary to minimize application errors in the TDM of antidepressants.

Relevance: 30.00%

Abstract:

The success of schizophrenia treatment depends largely on the patient's response to antipsychotic medication. Which drug and which dose are effective in an individual patient can currently only be judged after several weeks of treatment. One reason for variable treatment response is variable plasma concentrations of the antipsychotics. The aim of this work was to examine to what extent treatment outcome can be predicted at an early stage of treatment by objective symptom assessment, and which factors influence the high variability of antipsychotic blood levels.

An 18-month naturalistic clinical study in schizophrenic patients was conducted to answer the following questions: can treatment response be predicted, and which instruments are suitable for this? Psychopathology was assessed weekly using two rating scales (Brief Psychiatric Rating Scale, BPRS, and Clinical Global Impressions, CGI) to evaluate the improvement of symptoms over the course of 8 weeks. In parallel, serum concentrations of the antipsychotics were measured. Objective symptom assessment by BPRS or CGI proved suitable for predicting treatment response. Relative to the start of treatment, early symptom reduction was highly predictive of later treatment failure or response. A reduction of more than 36.5% on the BPRS scale in week 2 was identified as a significant threshold for non-response: patients whose symptom improvement remained below this threshold had an 11.2-fold higher probability of not responding to their medication at the end of the study than patients who improved by at least 36.5%. Other factors such as age, sex, duration of illness or number of hospitalizations had no influence on the prediction of treatment response. Therapeutic antipsychotic levels had a positive influence on the response rate: in patients with levels within the therapeutic range, response was faster and the response rate higher than in those whose levels lay outside the usual therapeutic ranges.

An important prerequisite for the use of TDM is the availability of a precise, reproducible, time- and cost-saving analytical method for the quantitative determination of the substances under investigation. Such a method was developed and validated for the detection of haloperidol; an HPLC method with column switching proved suitable for TDM.

Based on the results of the clinical study on response prediction, it was examined which factors influence the variability of the pharmacokinetics of antipsychotics, since pharmacokinetic variability is one reason for absent or insufficient response. The influence of the galenic formulation on drug release and the influence of inflammatory processes on the metabolism of an antipsychotic were investigated by retrospective analysis of patient data.

The analysis of 247 serum levels from patients treated with paliperidone in the OROS® formulation, a newly introduced extended-release form, showed that the intraindividual variability (coefficient of variation) of paliperidone trough levels was 35%, comparable to that of non-retarded risperidone (32%, p = n.s.). The extended-release formulation thus had no variance-reducing effect on the trough levels of the antipsychotic. The drug concentration range was 21-55 ng/ml, which also corresponded closely to the therapeutic range of risperidone (20-60 ng/ml).

Inflammatory processes can alter the metabolism of drugs; so far this had been demonstrated for drugs metabolized via CYP1A2. The analysis of 84 patient serum levels carried out here showed that the metabolism of quetiapine was impaired during an inflammatory process, probably through inhibition of CYP3A4. This suggests that drugs metabolized via CYP3A4 may also have their pharmacokinetics impaired during an inflammatory process in the body. For this reason, particular attention should be paid to adverse effects during an infection under quetiapine therapy, and the serum level should be monitored during this period to protect the patient from possible adverse effects or even intoxication.

The findings of this work show that, in the treatment of schizophrenic patients with antipsychotics, the measurement of psychopathology is suitable for predicting treatment response, and the measurement of blood levels is suitable for identifying the factors underlying pharmacokinetic variability. Objective symptom assessment and therapeutic drug monitoring are therefore instruments that should be used to guide antipsychotic pharmacotherapy.
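The 36.5% BPRS-reduction threshold at week 2 reported above translates into a very simple early-warning rule, sketched below for illustration only; it uses the raw BPRS total score and the threshold quoted in the abstract, and is not the study's statistical analysis.

```python
def predicted_non_responder(bprs_baseline, bprs_week2, threshold_pct=36.5):
    """Flag patients whose early BPRS improvement falls below the reported threshold.

    bprs_baseline, bprs_week2: BPRS total scores at treatment start and at week 2.
    Returns True if the relative reduction is below threshold_pct, i.e. the patient
    is predicted (per the reported 11.2-fold odds) not to respond at study end.
    """
    reduction_pct = 100.0 * (bprs_baseline - bprs_week2) / bprs_baseline
    return reduction_pct < threshold_pct

# Example: a drop from 58 to 45 points is a ~22% reduction -> predicted non-responder.
print(predicted_non_responder(58, 45))   # True
```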

Relevance: 30.00%

Abstract:

Rapid and sensitive detection of chemical and biological analytes is becoming increasingly important in areas such as medical diagnostics, food control and environmental monitoring. Optical biosensors based on surface plasmon resonance (SPR) and optical waveguide spectroscopy have been extensively pushed forward in these fields. In this study, we combine SPR, surface plasmon-enhanced fluorescence spectroscopy (SPFS) and optical waveguide spectroscopy with hydrogel thin films for highly sensitive detection of molecular analytes.

A novel biosensor based on SPFS, advanced through the excitation of long-range surface plasmons (LRSPs), is reported in this study. LRSPs are special surface plasmon waves propagating along thin metal films with orders of magnitude higher electromagnetic field intensity and lower damping than conventional SPs. Therefore, their excitation on the sensor surface provides a further increased fluorescence signal. An inhibition immunoassay based on LRSP-enhanced fluorescence spectroscopy (LRSP-FS) was developed for the detection of aflatoxin M1 (AFM1) in milk. The biosensor allowed for the detection of AFM1 in milk at concentrations as low as 0.6 pg mL⁻¹, which is about two orders of magnitude lower than the maximum AFM1 residue level in milk stipulated by European Commission legislation.

In addition, LRSPs probe the medium adjacent to the metallic surface with a more extended evanescent field than regular SPs. Therefore, three-dimensional binding matrices with thicknesses of up to a micrometer have been proposed for the immobilization of biomolecular recognition elements at large surface densities, which allows the whole evanescent field of the LRSP to be exploited. A photocrosslinkable carboxymethyl dextran (PCDM) hydrogel thin film is used as a binding matrix and is applied to the detection of free prostate-specific antigen (f-PSA) based on LRSP-FS and a sandwich immunoassay. We show that this approach allows the detection of f-PSA in the low femtomolar range, which is approximately four orders of magnitude lower than that for the direct detection of f-PSA based on the monitoring of binding-induced refractive index changes.

However, a three-dimensional hydrogel binding matrix with micrometer thickness can also serve as an optical waveguide. Based on the measurement of binding-induced refractive index changes, hydrogel optical waveguide spectroscopy (HOWS) is reported as a label-free biosensor. This biosensor is implemented using an SPR optical setup in which a carboxylated poly(N-isopropylacrylamide) (PNIPAAm) hydrogel film is attached to a metallic surface and modified with protein catcher molecules. Compared to a regular SPR biosensor with a thiol self-assembled monolayer (SAM), HOWS provides an order of magnitude better resolution in the refractive index measurements and an enlarged binding capacity, owing to its low damping and large swelling ratio, respectively. A model immunoassay experiment revealed that HOWS allowed the detection of IgG molecules with a 10 pM limit of detection (LOD), five-fold lower than that achieved for SPR with a thiol SAM. For the high-capacity hydrogel matrix, the affinity binding was mass-transport limited.

The mass transport of target molecules to the sensor surface can play as critical a role as the chemical reaction itself. In order to overcome the diffusion-limited mass transfer, magnetic iron oxide nanoparticles were employed. The magnetic nanoparticles (MNPs) can serve both as labels providing enhancement of the refractive index changes and as "vehicles" for rapidly delivering the analytes from the sample solution to the SPR sensor surface under a gradient magnetic field. A model sandwich assay for the detection of β human chorionic gonadotropin (βhCG) was carried out on a gold sensor surface with a metallic diffraction grating structure supporting the excitation of SPs. Various detection formats were compared, including a) direct detection, b) a sandwich assay, and an MNP immunoassay c) without and d) with an applied magnetic field. The results show that the highly sensitive MNP immunoassay improves the LOD for the detection of βhCG by five orders of magnitude with respect to direct detection.