951 results for Over-voltage problem


Relevance: 30.00%

Abstract:

Plasma physics is currently an important and active area of physics research. Its applications range from the study of interstellar and cosmic plasmas, such as stars, nebulae and the intergalactic medium, to more down-to-earth uses such as microchip production and lighting devices. The contact between a metallic surface and a plasma is of particular interest: the dynamics of the interface formed between an unperturbed plasma and a metallic surface is of great importance in problems such as ion implantation in silicon wafers, plasma etching, the charging of a spacecraft crossing the ionosphere, and plasma diagnostics by means of Langmuir probes. Langmuir probes are widely used as a plasma diagnostic method across many technological and industrial applications, some of which have just been mentioned. Their use is also very popular in plasma physics research, since they are one of the few diagnostic techniques that provide local information about the plasma. They are routinely implemented in equipment ranging from low-temperature laboratory plasmas to fusion plasmas in devices such as tokamaks and stellarators. The most common probe geometry is cylindrical, and the main quantity used to diagnose the plasma is the current collected by the probe when it is biased to a given potential. Diagnostics based on the ion current collected by the probe are of special interest, since they produce a much smaller perturbation of the plasma than those based on the electron current.
Given this popularity, it is not surprising that great effort has gone into developing a theoretical model that explains the behaviour of a Langmuir probe immersed in a plasma. The first theories allowing plasma parameters to be diagnosed from the ion current collected by a Langmuir probe date back to the first half of the twentieth century. Since then, improvements to these models and the development of new ones have been a constant in plasma physics research. Nevertheless, it is still not clear how ions approach the probe surface. The two main, and opposing, widely accepted approaches to the problem are the radial and the orbital theories; the difficulty is that they predict different values for the ion current. Experiments have produced results in agreement with both theories and, more importantly, a transition between the two regimes has recently been observed. Most of the progress in understanding how ions fall from the plasma towards the probe surface has been made within fluid dynamics or kinetic theory. Alternatively, the problem can be tackled with particle simulations. Their main advantage over fluid or kinetic models is that they provide much more information about the microscopic details of particle motion, and complex interactions between particles are relatively easy to introduce. These advantages do not come for free, however: particle simulations demand very large computational resources, which makes parallel processing techniques practically mandatory. This gap in the understanding of Langmuir probes is what motivates our work.
Our approach, and the main goal of this work, has been to develop a particle simulation that allows us to study the problem of a Langmuir probe immersed in a plasma and biased negatively with respect to it. Such a simulation lets us study the behaviour of ions in the vicinity of a cylindrical Langmuir probe, and shed light on the experimentally observed transition between the radial and orbital theories. After this introductory section, the rest of the thesis is divided into three parts, as follows. The first part establishes the theoretical foundations of Langmuir probes: it begins with a general introduction to the problem and to the use of Langmuir probes as a plasma diagnostic method, followed by an extensive literature review of the theories that give the ion current collected by a probe. The second part explains the details of the particle simulations developed during our research, together with the results obtained with them. It includes an introduction to the theory underlying this type of particle simulation and the parallelization techniques used in our codes; the remainder is divided into two chapters, one for each of the geometries considered in our simulations (planar and cylindrical). This part also discusses our findings on the transition between radial and orbital ion behaviour around a cylindrical Langmuir probe. Finally, the third part presents a summary of the work, briefly listing the results of our research together with some conclusions.
It closes with a series of future perspectives and extensions for the codes developed.
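The radial-versus-orbital distinction described above can be illustrated with a single-particle sketch: integrate one ion's motion in the plane perpendicular to a cylindrical probe under an assumed attractive potential and check whether it falls onto the probe or misses it. This is a toy illustration only, not the thesis's particle simulation; the Debye-shielded potential form, the unit charge/mass normalization, and all parameter values are assumptions.

```python
import numpy as np

def trace_ion(r0, v0, probe_radius=1.0, lam=5.0, phi0=-10.0,
              dt=1e-3, steps=50_000):
    """Leapfrog-integrate a single ion (unit charge and mass) in the
    plane perpendicular to a cylindrical probe, under an assumed
    shielded potential phi(r) = phi0 * (a/r) * exp(-(r - a)/lam).
    Returns 'collected' if the ion reaches the probe surface within
    the integration window, else 'not collected'."""
    pos = np.array(r0, dtype=float)
    vel = np.array(v0, dtype=float)

    def accel(p):
        r = np.linalg.norm(p)
        # radial field: a = -dphi/dr * r_hat (attractive for phi0 < 0)
        dphi = phi0 * np.exp(-(r - probe_radius) / lam) * (
            -probe_radius / (lam * r) - probe_radius / r**2)
        return -dphi * p / r

    vel += 0.5 * dt * accel(pos)        # half kick to start leapfrog
    for _ in range(steps):
        pos += dt * vel                 # drift
        if np.linalg.norm(pos) <= probe_radius:
            return "collected"
        vel += dt * accel(pos)          # kick
    return "not collected"
```

An ion released at rest falls radially onto the probe, whereas the same ion given enough tangential velocity (angular momentum) orbits past it; the transition between the two collection regimes is precisely what the radial and orbital theories treat differently.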

Relevance: 30.00%

Abstract:

This thesis presents an investigation of endoscopic optical coherence tomography (OCT). As a noninvasive imaging modality, OCT has emerged as an increasingly important diagnostic tool for many clinical applications. Despite its many merits, such as high resolution and depth resolvability, a major limitation is its relatively shallow penetration depth in tissue (about 2-3 mm), caused mainly by tissue scattering and absorption. To overcome this limitation, many different endoscopic OCT systems have been developed. A minimally invasive endoscope brings the OCT probing beam into the close vicinity of the tissue of interest and bypasses the scattering of intervening tissues, so that it can collect the reflected light signal from the desired depth and provide a clear image of the physiological structure of the region, which cannot be resolved by traditional OCT. In this thesis, three endoscope designs have been studied. While they rely on vastly different principles, they all converge on solving this long-standing problem.

A hand-held endoscope with manual scanning is explored first. When a user holds a hand-held endoscope to examine samples, the movement of the device provides a natural scanning motion. We proposed and implemented an optical tracking system to estimate and record the trajectory of the device. By registering each OCT axial scan with the spatial information obtained from the tracking system, one can simply 'paint' a desired volume and obtain an arbitrary scanning pattern by manually waving the endoscope over the region of interest. The accuracy of the tracking system was measured to be about 10 microns, which is comparable to the lateral resolution of most OCT systems. A targeted phantom sample and biological samples were manually scanned, and the reconstructed images validated the method.
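The registration step described above amounts to binning each tracked A-scan into a common volume grid. The sketch below shows a minimal version of that idea; the function name, the nearest-voxel binning, and the averaging of repeated visits are illustrative choices, not the thesis's actual reconstruction code.

```python
import numpy as np

def paint_volume(ascans, positions, grid_shape=(64, 64, 256),
                 voxel_size=0.01):
    """Register freehand OCT A-scans into a common volume.
    ascans:    (N, depth) array of axial scans
    positions: (N, 2) tracked lateral (x, y) probe positions
    Each A-scan is dropped into the voxel column nearest its tracked
    position; columns visited more than once are averaged."""
    vol = np.zeros(grid_shape)
    hits = np.zeros(grid_shape[:2])
    for scan, (x, y) in zip(ascans, positions):
        i = int(round(x / voxel_size))
        j = int(round(y / voxel_size))
        if 0 <= i < grid_shape[0] and 0 <= j < grid_shape[1]:
            vol[i, j, :] += scan[:grid_shape[2]]
            hits[i, j] += 1
    nz = hits > 0
    vol[nz] /= hits[nz][:, None]      # average repeated visits
    return vol
```

In practice the tracked pose would also carry orientation, and scattered-data interpolation would replace the nearest-voxel drop, but the averaging of overlapping scans is what turns an arbitrary hand-drawn trajectory into a usable volume.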

Next, we investigated a mechanical way to steer the beam in an OCT endoscope, termed paired-angle-rotation scanning (PARS). This concept was proposed by my colleague, and we developed the technology further by enhancing the longevity of the device, reducing the diameter of the probe, and shrinking the form factor of the hand-piece. Several families of probes have been designed and fabricated, with a range of optical performance characteristics. They have been applied to different applications, including collector-channel examination for glaucoma stent implantation and vitreous remnant detection during live animal vitrectomy.

Lastly, a novel scanning method with no moving parts has been devised. This approach is based on the electro-optic (EO) effect of a KTN crystal. With Ohmic contact at the electrodes, the KTN crystal exhibits a special mode of the EO effect, termed the space-charge-controlled electro-optic effect, in which carrier electrons are injected into the material via the Ohmic contact. Applying a high voltage across the material builds a linear phase profile in this mode, which in turn deflects the light beam passing through. We constructed a relay telescope to adapt the KTN deflector to a bench-top OCT scanning system. One of the major technical challenges for this system is the strong chromatic dispersion of the KTN crystal within the wavelength band of the OCT system. We investigated its impact on the acquired OCT images and proposed a new approach to estimate and compensate the actual dispersion. Compared with traditional methods, the new method is more computationally efficient and accurate. Biological samples were scanned with this KTN-based system, and the acquired images demonstrated the feasibility of using it in an endoscopy setting. Above all, my research aims to provide solutions for implementing an OCT endoscope. As the technology evolves from manual, to mechanical, to electrical approaches, different solutions are presented. Since each has its own advantages and disadvantages, one has to determine the actual requirements and select the best fit for a specific application.
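Numerical dispersion compensation of the kind discussed above is usually applied as a phase correction to the spectral fringe signal before the inverse FFT that forms the A-scan. The sketch below shows that generic operation; the polynomial phase model and the idea of choosing coefficients by maximising image sharpness are standard OCT practice and stand in for, rather than reproduce, the thesis's estimator.

```python
import numpy as np

def compensate_dispersion(spectrum, k, a2, a3):
    """Cancel 2nd/3rd-order dispersion in a spectral-domain OCT
    interferogram. `spectrum` is the complex fringe signal sampled at
    wavenumbers `k`; a2, a3 are dispersion coefficients (in practice
    found by maximising the sharpness of the resulting image)."""
    kc = k - k.mean()                     # expand phase about band centre
    phase = a2 * kc**2 + a3 * kc**3       # mismatch phase to remove
    corrected = spectrum * np.exp(-1j * phase)
    return np.abs(np.fft.ifft(corrected)) # A-scan magnitude
```

With the correct coefficients, the quadratic and cubic phase that smear a reflector over many depth bins are removed and the point-spread function collapses back towards the transform limit.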

Relevance: 30.00%

Abstract:

Matrix power converters are used for transforming one alternating-current power supply into another, with a different peak voltage and frequency. There are three input lines, with sinusoidally varying voltages that are 120° out of phase with one another, and the output is to be delivered as a similar three-phase supply. The matrix converter switches rapidly, connecting each output line in sequence to each of the input lines, so as to synthesize the prescribed output voltages. The switching is carried out at high frequency, and it is of practical importance to know the frequency spectra of the output voltages and of the input and output currents. In this paper we determine these spectra using a new method, which has significant advantages over the prior default method (a multiple Fourier series technique) and leads to a considerably more direct calculation. In particular, the determination of the input current spectrum is feasible here, whereas it would be a significantly more daunting procedure using the prior method.
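Written out explicitly, the balanced three-phase input described above, and the switched synthesis of each output line, take the following standard form (the symbols $V_i$, $\omega_i$ and the switching functions $s_{jk}$ are generic matrix-converter notation, not necessarily the paper's):

```latex
v_k(t) = V_i \cos\!\left(\omega_i t - \tfrac{2\pi}{3}(k-1)\right), \qquad k = 1, 2, 3,
\qquad
v_j^{\mathrm{out}}(t) = \sum_{k=1}^{3} s_{jk}(t)\, v_k(t),
\quad s_{jk}(t) \in \{0,1\}, \quad \sum_{k=1}^{3} s_{jk}(t) = 1 .
```

The constraint that exactly one $s_{jk}$ is closed at any instant encodes the rule that each output line is connected to exactly one input line, so the supply is never short-circuited; the spectra of interest are those of the switched products $s_{jk}(t)\,v_k(t)$.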

Relevance: 30.00%

Abstract:

How can we calculate earthquake magnitudes when the signal is clipped and over-run? When a volcano is very active, the seismic record may saturate (i.e., the full amplitude of the signal is not recorded) or be over-run (i.e., the end of one event is covered by the start of a new event). The duration, and sometimes the amplitude, of an earthquake signal are necessary for determining event magnitudes; thus, it may be impossible to calculate earthquake magnitudes when a volcano is very active. This problem is most likely to occur at volcanoes with limited networks of short-period seismometers. This study outlines two methods for calculating earthquake magnitudes when events are clipped and over-run. The first method entails modeling the shape of earthquake codas as a power-law function and extrapolating duration from the decay of the function. The second method draws relations between clipped duration (i.e., the length of time a signal is clipped) and the full duration. These methods allow magnitudes to be determined to within 0.2 to 0.4 magnitude units. This error is within the range of analyst hand-picks and within the acceptable limits of uncertainty when quickly quantifying volcanic energy release during volcanic crises. Most importantly, these estimates can be made when data are clipped or over-run. These methods were developed with data from the initial stages of the 2004-2008 eruption at Mount St. Helens. Mount St. Helens is a well-studied volcano with many instruments placed at varying distances from the vent, which makes the 2004-2008 eruption a good setting in which to calibrate and refine methodologies that can then be applied to volcanoes with limited networks.
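The first method above, fitting a power-law decay to the usable part of the coda and extrapolating to the noise floor, can be sketched in a few lines. The fit in log-log space, the noise-level cutoff defining the end of the event, and the duration-magnitude coefficients are all illustrative choices (duration-magnitude coefficients vary by network), not the study's calibrated values.

```python
import numpy as np

def extrapolate_duration(t, amp, noise_level):
    """Estimate full event duration from a clipped/over-run coda.
    Fit the coda decay as a power law A(t) = A0 * t**p (least squares
    in log-log space on the unclipped samples, p < 0 for a decaying
    coda), then solve A(t_end) = noise_level for the time at which
    the signal fades into the noise.
    t, amp: times (s, > 0) and amplitudes of the usable samples."""
    p, logA0 = np.polyfit(np.log(t), np.log(amp), 1)
    t_end = np.exp((np.log(noise_level) - logA0) / p)
    return t_end

def coda_magnitude(duration_s):
    """Duration magnitude from a generic relation of the common form
    Md = 2*log10(duration) - 0.87; the coefficients are examples only
    and must be calibrated per network."""
    return 2.0 * np.log10(duration_s) - 0.87
```

Because only the unclipped portion of the coda enters the fit, the estimate is available even when the peak of the record is saturated or the onset of the next event truncates the tail.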

Relevance: 30.00%

Abstract:

We consider the Cauchy problem for the Laplace equation in 3-dimensional doubly-connected domains, that is the reconstruction of a harmonic function from knowledge of the function values and normal derivative on the outer of two closed boundary surfaces. We employ the alternating iterative method, which is a regularizing procedure for the stable determination of the solution. In each iteration step, mixed boundary value problems are solved. The solution to each mixed problem is represented as a sum of two single-layer potentials giving two unknown densities (one for each of the two boundary surfaces) to determine; matching the given boundary data gives a system of boundary integral equations to be solved for the densities. For the discretisation, Weinert's method [24] is employed, which generates a Galerkin-type procedure for the numerical solution via rewriting the boundary integrals over the unit sphere and expanding the densities in terms of spherical harmonics. Numerical results are included as well.

Relevance: 30.00%

Abstract:

Several decision and control tasks involve networks of cyber-physical systems that need to be coordinated and controlled according to a fully-distributed paradigm involving only local communications, without any central unit. This thesis focuses on distributed optimization and games over networks from a system-theoretical perspective. In the addressed frameworks, we consider agents communicating only with neighbors and running distributed algorithms with optimization-oriented goals. The distinctive feature of this thesis is to interpret these algorithms as dynamical systems and, thus, to resort to powerful system-theoretical tools for both their analysis and design. We first address the so-called consensus optimization setup. In this context, we provide an original system-theoretical analysis of the well-known Gradient Tracking algorithm in the general case of nonconvex objective functions. Then, inspired by this method, we provide and study a series of extensions to improve performance and to deal with more challenging settings, e.g., the derivative-free framework or the online one. Subsequently, we tackle the recently emerged framework named distributed aggregative optimization. For this setup, we develop and analyze novel schemes to handle (i) online instances of the problem, (ii) "personalized" optimization frameworks, and (iii) feedback optimization settings. Finally, we adopt a system-theoretical approach to address aggregative games over networks, both in the presence and in the absence of linear coupling constraints among the decision variables of the players. In this context, we design and inspect novel fully-distributed algorithms, based on tracking mechanisms, that outperform state-of-the-art methods in finding the Nash equilibrium of the game.
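The Gradient Tracking algorithm analysed in the thesis has a well-known textbook form: each agent mixes its state with its neighbors' through a doubly-stochastic matrix and follows a local tracker of the global gradient. The sketch below implements that generic scheme (not the thesis's code); the step size, mixing matrix and stopping rule are illustrative.

```python
import numpy as np

def gradient_tracking(grads, W, x0, alpha=0.05, iters=500):
    """Textbook Gradient Tracking over a network.
    grads: list of per-agent gradient functions grad f_i
    W:     doubly-stochastic mixing matrix of the communication graph
    x0:    (n_agents, dim) initial states
    Each agent i runs, with s_i tracking the average gradient:
        x_i+ = sum_j W_ij x_j - alpha * s_i
        s_i+ = sum_j W_ij s_j + grad f_i(x_i+) - grad f_i(x_i)."""
    x = x0.astype(float).copy()
    s = np.array([g(x[i]) for i, g in enumerate(grads)])  # tracker init
    for _ in range(iters):
        x_new = W @ x - alpha * s
        s = (W @ s
             + np.array([g(x_new[i]) for i, g in enumerate(grads)])
             - np.array([g(x[i]) for i, g in enumerate(grads)]))
        x = x_new
    return x
```

Viewed as a dynamical system, the iteration couples a consensus subsystem (the mixing by W) with a tracking subsystem (the s-update, whose column sum is conserved), which is exactly the structure the system-theoretical analysis exploits.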

Relevance: 30.00%

Abstract:

In recent years, modernization in the energy sector has focused on smart engineering advances, which entails designing complex software and hardware for variable-voltage digital substations. A digital substation comprises electrical and auxiliary devices, control and monitoring devices, computers, and control software. Intelligent measurement systems in digital substations use digital instrument transformers and IEC 61850-compliant information exchange protocols. Digital instrument transformers intended for real-time high-voltage measurement should combine advanced digital, measuring, information, and communication technologies; they should also be inexpensive, small, light, and fire- and explosion-safe. Such smaller and lighter transformers allow long-distance transmission of an optical signal that measures direct or alternating current, although the high cost of optical converters remains a problem. To improve measurement accuracy, amorphous alloys are used in the magnetic circuits, together with compensating feedback. Large-scale voltage converters can be made cheaper by using resistive, capacitive, or hybrid voltage dividers. In existing electronic voltage transformers, the voltage divider output is generally on the low-voltage side, which simplifies the power supply arrangement. Combining current and voltage transformers in a single device reduces equipment size, installation effort, and maintenance costs: the combined device costs less than the two separately. To increase the accuracy of commercial power metering, current and voltage converters should be integrated into digital instrument transformers so that simultaneous analogue-to-digital samples are obtained; multichannel ADC microcircuits with a synchronous conversion start naturally support such parallel sampling. Digital instrument transformers are designed to adapt to substation operating conditions and environmental variables, particularly ambient temperature, and an embedded microprocessor performs auto-diagnosis and auto-calibration of the proposed digital instrument transformer.

Relevance: 30.00%

Abstract:

In this paper, a joint location-inventory model is proposed that simultaneously optimises strategic supply chain design decisions, such as facility location and customer allocation to facilities, and tactical-operational inventory management and production scheduling decisions. All this is analysed in a context of demand uncertainty and supply uncertainty. While demand uncertainty stems from potential fluctuations in customer demands over time, supply-side uncertainty is associated with the risk of "disruption" to which facilities may be subject, caused by external factors such as natural disasters, strikes, changes of ownership and information technology security incidents. The proposed model is formulated as a non-linear mixed integer programming problem that minimises the expected total cost, which includes four basic cost items: the fixed cost of locating facilities at candidate sites, the cost of transport from facilities to customers, the cost of working inventory, and the cost of safety stock. Next, since the optimisation problem is very complex and only a very limited number of instances can be evaluated exactly, a "matheuristic" solution approach is presented. This approach has a twofold objective: on the one hand, it considers a larger number of facilities and customers within the network, in order to reproduce a supply chain configuration that more closely reflects a real-world context; on the other hand, it generates a starting solution and performs a series of iterations to try to improve it. Thanks to this algorithm, it was possible to obtain a solution with a lower total system cost than the initial solution. The study concludes with some reflections and a description of possible future developments.
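The four cost items listed above are commonly assembled, in the classic joint location-inventory literature, into an objective of the following shape; this is a generic sketch of that family of models, not necessarily the paper's exact formulation (here $Y_j$ opens facility $j$, $X_{ij}$ assigns customer $i$ to facility $j$, $d_i$ and $\sigma_i^2$ are demand means and variances, and $L_j$ is the replenishment lead time):

```latex
\min_{X,\,Y}\;\; \sum_{j} f_j\, Y_j
  \;+\; \sum_{j}\sum_{i} c_{ij}\, d_i\, X_{ij}
  \;+\; \sum_{j} K_j \sqrt{\sum_{i} d_i\, X_{ij}}
  \;+\; \sum_{j} z_\alpha \sqrt{L_j \sum_{i} \sigma_i^2\, X_{ij}}
```

The square-root terms, which pool cycle stock and safety stock across the customers assigned to a facility, are what make the problem non-linear and motivate the matheuristic approach.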

Relevance: 20.00%

Abstract:

Resource specialisation, although a fundamental component of ecological theory, is employed in disparate ways. Most definitions derive from simple counts of resource species. We build on recent advances in ecophylogenetics and null model analysis to propose a concept of specialisation that comprises affinities among resources as well as their co-occurrence with consumers. In the distance-based specialisation index (DSI), specialisation is measured as relatedness (phylogenetic or otherwise) of resources, scaled by the null expectation of random use of locally available resources. Thus, specialists use significantly clustered sets of resources, whereas generalists use over-dispersed resources. Intermediate species are classed as indiscriminate consumers. The effectiveness of this approach was assessed with differentially restricted null models, applied to a data set of 168 herbivorous insect species and their hosts. Incorporation of plant relatedness and relative abundance greatly improved specialisation measures compared to taxon counts or simpler null models, which overestimate the fraction of specialists, a problem compounded by insufficient sampling effort. This framework disambiguates the concept of specialisation with an explicit measure applicable to any mode of affinity among resource classes, and is also linked to ecological and evolutionary processes. This will enable a more rigorous deployment of ecological specialisation in empirical and theoretical studies.
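The DSI described above is, in essence, a standardised effect size: the relatedness of the resources a consumer uses, compared against a null that redraws the same number of interaction records in proportion to local resource availability. The sketch below captures that logic in simplified form; the abundance-weighted null and the sign convention (negative = clustered = specialist) follow the abstract, but the exact published index has further refinements.

```python
import numpy as np

def dsi(used, dist, abundance, n_null=999, rng=None):
    """Distance-based specialisation sketch.
    used:      resource indices used, with repetition = interaction records
    dist:      (R, R) pairwise (e.g. phylogenetic) distance matrix
    abundance: local availability weights for the null draws
    Returns (observed - null mean) / null sd of the mean pairwise
    distance among used resources; strongly negative values indicate
    a significantly clustered (specialist) resource set."""
    rng = np.random.default_rng(rng)

    def mpd(idx):
        sub = dist[np.ix_(idx, idx)]
        iu = np.triu_indices(len(idx), k=1)
        return sub[iu].mean()

    obs = mpd(used)
    p = np.asarray(abundance, float)
    p /= p.sum()
    null = np.array([mpd(rng.choice(len(p), size=len(used), p=p))
                     for _ in range(n_null)])
    return (obs - null.mean()) / null.std()
```

A consumer whose records fall within one clade scores well below zero, a consumer spread across distant clades scores above it, and intermediate scores correspond to the 'indiscriminate' class of the framework.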

Relevance: 20.00%

Abstract:

Obesity is currently considered a major public health problem worldwide, having already reached epidemic proportions according to the World Health Organization. Excess weight is the major risk factor associated with various diseases, such as type 2 diabetes mellitus, hypertension, dyslipidemia and osteometabolic diseases, including osteoporosis and osteoarthritis. Osteoarthritis is the most prevalent rheumatic disease and the leading cause of physical disability and reduced quality of life in the population over 65 years of age. It mainly involves the weight-bearing joints - the knees and hips. However, along with the rise in obesity, its prevalence is increasing even in other joints, such as the hands. It is therefore assumed that the influence of obesity on the development of osteoarthritis extends beyond mechanical overload. The purpose of this review was to examine the possible mechanisms linking the genesis and development of these two diseases. Increased fat mass is directly proportional to excessive consumption of saturated fatty acids, which is responsible for a systemic low-grade inflammatory condition and for insulin and leptin resistance. At high levels, leptin assumes inflammatory characteristics and acts on the articular cartilage, triggering the inflammatory process and altering the homeostasis of this tissue, with consequent degeneration. We conclude that obesity is a risk factor for osteoarthritis and that physical activity and changes in diet composition can reverse the inflammatory state and leptin resistance, slowing the progression or preventing the onset of osteoarthritis.

Relevance: 20.00%

Abstract:

In Brazil, malaria remains a disease of major epidemiological importance because of the high number of cases in the Amazonian Region. Plasmodium spp infections during pregnancy are a significant public health problem, with substantial risks for the pregnant woman, the foetus and the newborn child. In Brazil, the control of malaria during pregnancy is achieved primarily by prompt and effective treatment of acute episodes. Thus, to assure rapid diagnosis and treatment for pregnant women with malaria, one of the strategies recommended by the World Health Organization for low-transmission areas, and as part of a Ministry of Health strategy, the National Malaria Control Programme has focused on measures integrated with women's and reproductive health. Here, we discuss the approach to the prevention and management of malaria during pregnancy in Brazil over the last 10 years (2003-2012), using morbidity data from the Malaria Health Information System. Improving the efficiency and quality of healthcare and education, and consolidating prevention programmes, will be the challenges in the control of malaria during pregnancy in the next decade.

Relevance: 20.00%

Abstract:

In this paper, we address the problem of picking a subset of bids in a general combinatorial auction so as to maximize the overall profit under the first-price model. This winner determination problem assumes that a single bidding round is held to determine both the winners and the prices to be paid. We introduce six variants of biased random-key genetic algorithms for this problem. Three of them use a novel initialization technique that employs solutions of intermediate linear programming relaxations of an exact mixed integer linear programming model as initial chromosomes of the population. An experimental evaluation compares the effectiveness of the proposed algorithms with the standard mixed integer linear programming formulation, a specialized exact algorithm, and the best-performing heuristics proposed for this problem. The proposed algorithms are competitive and offer strong results, mainly for large-scale auctions.
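A biased random-key GA for winner determination pairs a chromosome of random keys with a problem-specific decoder: sorting bids by key gives an insertion order, and a greedy pass accepts each bid whose items are still unsold. The sketch below shows that generic BRKGA shape with illustrative parameter values; it is not one of the paper's six variants and omits, for instance, the LP-relaxation-based initialization.

```python
import random

def decode(keys, bids):
    """Map a random-key chromosome to a feasible auction outcome:
    visit bids in key order, accept any bid whose item set does not
    intersect items already sold. bids: list of (item_set, price)."""
    order = sorted(range(len(bids)), key=lambda i: keys[i])
    sold, profit, winners = set(), 0.0, []
    for i in order:
        items, price = bids[i]
        if not (items & sold):
            sold |= items
            profit += price
            winners.append(i)
    return profit, winners

def brkga(bids, pop=50, gens=200, elite=0.2, mutant=0.1, rho=0.7, seed=0):
    """Minimal biased random-key GA: copy the elite, inject random
    mutants, and fill the rest by biased uniform crossover that takes
    each key from an elite parent with probability rho."""
    rnd = random.Random(seed)
    n = len(bids)
    P = [[rnd.random() for _ in range(n)] for _ in range(pop)]
    ne, nm = int(elite * pop), int(mutant * pop)
    for _ in range(gens):
        P.sort(key=lambda c: -decode(c, bids)[0])   # best first
        nxt = P[:ne] + [[rnd.random() for _ in range(n)]
                        for _ in range(nm)]
        while len(nxt) < pop:
            e, o = rnd.choice(P[:ne]), rnd.choice(P[ne:])
            nxt.append([e[k] if rnd.random() < rho else o[k]
                        for k in range(n)])
        P = nxt
    best = max(P, key=lambda c: decode(c, bids)[0])
    return decode(best, bids)
```

Because all problem knowledge lives in the decoder, every chromosome decodes to a feasible allocation, which is what makes the random-key scheme convenient for a constraint-heavy problem like winner determination.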

Relevance: 20.00%

Abstract:

Health economic evaluations require estimates of expected survival from patients receiving different interventions, often over a lifetime. However, data on the patients of interest are typically only available for a much shorter follow-up time, from randomised trials or cohorts. Previous work showed how to use general population mortality to improve extrapolations of the short-term data, assuming a constant additive or multiplicative effect on the hazards for all-cause mortality for study patients relative to the general population. A more plausible assumption may be a constant effect on the hazard for the specific cause of death targeted by the treatments. To address this problem, we use independent parametric survival models for cause-specific mortality among the general population. Because causes of death are unobserved for the patients of interest, a polyhazard model is used to express their all-cause mortality as a sum of latent cause-specific hazards. Assuming proportional cause-specific hazards between the general and study populations then allows us to extrapolate mortality of the patients of interest to the long term. A Bayesian framework is used to jointly model all sources of data. By simulation, we show that ignoring cause-specific hazards leads to biased estimates of mean survival when the proportion of deaths due to the cause of interest changes through time. The methods are applied to an evaluation of implantable cardioverter defibrillators for the prevention of sudden cardiac death among patients with cardiac arrhythmia. After accounting for cause-specific mortality, substantial differences are seen in estimates of life years gained from implantable cardioverter defibrillators.
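In generic notation, the polyhazard construction referred to above writes all-cause mortality as a sum of latent cause-specific hazards, with a proportionality assumption linking the study and general populations (the symbols here are illustrative, not the paper's):

```latex
h(t) \;=\; \sum_{k=1}^{K} h_k(t),
\qquad
S(t) \;=\; \exp\!\Big(-\sum_{k=1}^{K} \int_0^t h_k(u)\,\mathrm{d}u\Big),
\qquad
h_k^{\text{study}}(t) \;=\; \gamma_k\, h_k^{\text{pop}}(t).
```

Only the cause targeted by treatment carries a hazard ratio $\gamma_k$ materially different from one; extrapolated mean survival is then obtained by integrating $S(t)$ over the long term, which is why mis-specifying the cause mix biases the estimate when the share of deaths due to the cause of interest changes over time.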

Relevance: 20.00%

Abstract:

A new platinum(II) complex with the amino acid L-tryptophan (trp), named Pt-trp, was synthesized and characterized. Elemental, thermogravimetric and ESI-QTOF mass spectrometric analyses led to the composition [Pt(C11H11N2O2)2]·6H2O. Infrared spectroscopic data indicate the coordination of trp to Pt(II) through the oxygen of the carboxylate group and through the nitrogen atom of the amino group. The ¹³C CP/MAS NMR spectroscopic data confirm coordination through the oxygen atom of the carboxylate group, while the ¹⁵N CP/MAS NMR data confirm coordination of the nitrogen of the NH2 group to the metal. Density functional theory (DFT) studies were applied to evaluate the cis and trans coordination modes of trp to platinum(II); the trans isomer was shown to be energetically more stable than the cis one. The Pt-trp complex was evaluated as a cytotoxic agent against the SK-Mel 103 (human melanoma) and Panc-1 (human pancreatic carcinoma) cell lines and was shown to be cytotoxic against both.

Relevance: 20.00%

Abstract:

Ecological science contributes to solving a broad range of environmental problems. However, lack of ecological literacy in practice often limits application of this knowledge. In this paper, we highlight a critical but often overlooked demand on ecological literacy: to enable professionals of various careers to apply scientific knowledge when faced with environmental problems. Current university courses on ecology often fail to persuade students that ecological science provides important tools for environmental problem solving. We propose problem-based learning to improve the understanding of ecological science and its usefulness for real-world environmental issues that professionals in careers as diverse as engineering, public health, architecture, social sciences, or management will address. Courses should set clear learning objectives for cognitive skills they expect students to acquire. Thus, professionals in different fields will be enabled to improve environmental decision-making processes and to participate effectively in multidisciplinary work groups charged with tackling environmental issues.