907 results for Reactive Power Control


Relevance: 30.00%

Publisher:

Abstract:

Among the design goals for future networks, including next-generation networks, are energy consumption and connectivity. These two goals are especially relevant for constrained networks such as Wireless Sensor Networks (WSNs), which consist of devices with low or very low processing capabilities that depend on batteries for their operation; optimizing energy consumption is therefore essential. Several proposals have been made for optimizing energy consumption in this kind of network. Perhaps the best known are those based on the coordinated scheduling of active and sleep intervals, which is indeed one of the most effective ways to extend battery lifetime. The proposal presented in this work instead uses a probabilistic approach to control the connectivity of the network. The underlying idea is that a network is highly likely to remain connected if every node has at least a minimum number of neighbors. By using a mechanism that maintains that number, we expect to preserve connectivity with lower energy consumption than would be required if a fixed transmission power were used to achieve similar connectivity. To be efficient, the mechanism must have the smallest possible footprint on the devices on which it runs, so a self-adaptive system based on fuzzy-logic control is proposed. This work includes the design and implementation of the described system, which has been validated in a real deployment. The results confirm that configurations exist in which good connectivity can be maintained while saving energy compared to the use of a fixed transmission power providing similar connectivity.
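The neighbor-count control loop described in this abstract can be sketched in a few lines. This is an illustrative reconstruction under invented parameters (a target of 4 neighbors, triangular memberships, ±2 dBm output singletons), not the authors' actual controller:

```python
# Illustrative sketch: adjust transmission power from the observed neighbor
# count with a tiny fuzzy controller. Target count, membership widths, and
# the +/-2 dBm output singletons are invented for this example.

def fuzzy_power_step(neighbors, target=4):
    """Return a transmission-power adjustment (dBm) from the neighbor count."""
    error = neighbors - target
    # Triangular membership degrees for "too few", "ok", and "too many".
    too_few = max(0.0, min(1.0, -error / 2.0))
    too_many = max(0.0, min(1.0, error / 2.0))
    ok = max(0.0, 1.0 - abs(error) / 2.0)
    # Rules: too few neighbors -> raise power; too many -> lower it; ok -> hold.
    # Defuzzify as a weighted average of the output singletons (+2, 0, -2).
    total = too_few + ok + too_many
    return (2.0 * too_few - 2.0 * too_many) / total

power_dbm = 0.0
for observed in [1, 2, 4, 6, 4]:
    power_dbm += fuzzy_power_step(observed)
```

Because each node only needs its own neighbor count and a handful of arithmetic operations, a controller of this shape has the small footprint the abstract calls for.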

Relevance: 30.00%

Publisher:

Abstract:

Using fixed-point arithmetic is one of the most common design choices for systems where area, power, or throughput are heavily constrained. To produce implementations where cost is minimized without negatively impacting the accuracy of the results, a careful assignment of word-lengths is required. Finding the optimal combination of fixed-point word-lengths for a given system is a combinatorial NP-hard problem to which developers devote between 25% and 50% of the design-cycle time. Reconfigurable hardware platforms such as FPGAs also benefit from the advantages of fixed-point arithmetic, as it compensates for their slower clock frequencies and less efficient area utilization with respect to ASICs. As FPGAs become commonly used for scientific computation, designs constantly grow larger and more complex, up to the point where they cannot be handled efficiently by current signal and quantization-noise modelling and word-length optimization methodologies. In this Ph.D. thesis we explore different aspects of the quantization problem and present new methodologies for each of them.
Techniques based on interval extensions have made it possible to obtain accurate models of signal and quantization-noise propagation in systems with non-linear operations. We take this approach a step further by introducing elements of Multi-Element Generalized Polynomial Chaos (ME-gPC) and combining them with a state-of-the-art methodology based on statistical Modified Affine Arithmetic (MAA) in order to model systems that contain control-flow structures. Our methodology produces the different execution paths automatically, determines the regions of the input domain that will exercise them, and extracts the system's statistical moments from the partial results. We use this technique to estimate both the dynamic range and the round-off noise in systems with the aforementioned control-flow structures, and we show the accuracy of our approach, which in some case studies with non-linear operators deviates by only 0.04% from the simulation-based reference values. A known drawback of techniques based on interval extensions is the combinatorial explosion of terms as the size of the targeted system grows, which leads to scalability problems. To address this issue we present a clustered noise-injection technique that groups the signals in the system, introduces the noise sources for each group independently, and then combines the results at the end. In this way, the number of noise sources in the system at any given time is kept under control and the combinatorial explosion is minimized. We also present a multi-way partitioning algorithm aimed at minimizing the deviation of the results due to the loss of correlation between noise terms, in order to keep the results as accurate as possible.
This thesis also covers the development of word-length optimization methodologies based on Monte-Carlo simulations that run in reasonable times. We present two novel techniques that reduce execution time from different angles. First, the interpolative method applies a simple but precise interpolator to estimate the sensitivity of each signal, which is later used to guide the optimization effort. Second, the incremental method revolves around the fact that, although we strictly need to guarantee a given confidence level for the final results of the optimization process, we can use more relaxed levels, which imply a considerably smaller number of samples per simulation, in the initial stages of the search, when we are still far from the optimized solution. Through these two approaches we demonstrate that the execution time of classical greedy search algorithms can be accelerated by factors of up to ×240 for small/medium-sized problems.
Finally, this work introduces HOPLITE, an automated, flexible, and modular quantization framework that includes implementations of the previous techniques and is publicly available. Its aim is to offer developers and researchers a common ground for easily prototyping and verifying new techniques for system modelling and word-length optimization. We describe its workflow, justify the design decisions taken, explain its public API, and give a step-by-step demonstration of its operation. We also show, through a simple example, how new extensions should be connected to the existing interfaces in order to expand and improve the capabilities of HOPLITE.
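As a toy illustration of the kind of greedy word-length search this thesis accelerates, the sketch below shaves bits off each signal while an error budget holds. The cost and error models are invented stand-ins (total bits, and 2^-w noise per signal); they are not HOPLITE's API:

```python
# Toy greedy word-length descent, in the spirit of the classical search
# that the thesis speeds up. Cost and error models are illustrative only.

def cost(wl):
    return sum(wl.values())  # total bits as a crude area proxy

def quant_error(wl):
    # Assumed model: each signal contributes roughly 2^-w quantization noise.
    return sum(2.0 ** -w for w in wl.values())

def greedy_wl_descent(signals, start=16, max_error=1e-3):
    wl = {s: start for s in signals}
    improved = True
    while improved:
        improved = False
        for s in signals:
            wl[s] -= 1  # try shaving one bit off this signal
            if wl[s] > 0 and quant_error(wl) <= max_error:
                improved = True
            else:
                wl[s] += 1  # infeasible move: revert
    return wl

wl = greedy_wl_descent(["x", "acc", "coef"])
```

Each step of a real search requires evaluating the error model, which is why replacing full Monte-Carlo evaluations with interpolated sensitivities or relaxed confidence levels, as the thesis does, pays off directly in run time.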

Relevance: 30.00%

Publisher:

Abstract:

Human ether-a-gogo related gene (HERG) K+ channels are key elements in the control of cell excitability in both the cardiovascular and the central nervous systems. For this reason, the possible modulation by reactive oxygen species (ROS) of HERG and other cloned K+ channels expressed in Xenopus oocytes has been explored in the present study. Exposure of Xenopus oocytes to an extracellular solution containing FeSO4 (25–100 μM) and ascorbic acid (50–200 μM) (Fe/Asc) increased both malondialdehyde content and 2′,7′-dichlorofluorescin fluorescence, two indexes of ROS production. Oocyte perfusion with Fe/Asc caused a 50% increase of the outward K+ currents carried by HERG channels, whereas inward currents were not modified. This ROS-induced increase in HERG outward K+ currents was due to a depolarizing shift of the voltage-dependence of channel inactivation, with no change in channel activation. No effect of Fe/Asc was observed on the expressed K+ currents carried by other K+ channels such as bEAG, rDRK1, and mIRK1. Fe/Asc-induced stimulation of HERG outward currents was completely prevented by perfusion of the oocytes with a ROS scavenger mixture (containing 1,000 units/ml catalase, 200 ng/ml superoxide dismutase, and 2 mM mannitol). Furthermore, the scavenger mixture also was able to reduce HERG outward currents in resting conditions by 30%, an effect mimicked by catalase alone. In conclusion, the present results seem to suggest that changes in ROS production can specifically influence K+ currents carried by the HERG channels.

Relevance: 30.00%

Publisher:

Abstract:

Reactive oxygen intermediates (ROI) play a critical role in the defense of plants against invading pathogens. Produced during the “oxidative burst,” they are thought to activate programmed cell death (PCD) and induce antimicrobial defenses such as pathogenesis-related proteins. It was shown recently that during the interaction of plants with pathogens, the expression of ROI-detoxifying enzymes such as ascorbate peroxidase (APX) and catalase (CAT) is suppressed. It was suggested that this suppression, occurring upon pathogen recognition and coinciding with an enhanced rate of ROI production, plays a key role in elevating cellular ROI levels, thereby potentiating the induction of PCD and other defenses. To examine the relationship between the suppression of antioxidative mechanisms and the induction of PCD and other defenses during pathogen attack, we studied the interaction between transgenic antisense tobacco plants with reduced APX or CAT and a bacterial pathogen that triggers the hypersensitive response. Transgenic plants with reduced capability to detoxify ROI (i.e., antisense APX or CAT) were found to be hyperresponsive to pathogen attack. They activated PCD in response to low amounts of pathogens that did not trigger the activation of PCD in control plants. Our findings support the hypothesis that suppression of ROI-scavenging enzymes during the hypersensitive response plays an important role in enhancing pathogen-induced PCD.

Relevance: 30.00%

Publisher:

Abstract:

There has been much debate on the contribution of processes such as the persistence of antigens, cross-reactive stimulation, homeostasis, competition between different lineages of lymphocytes, and the rate of cell turnover on the duration of immune memory and the maintenance of the immune repertoire. We use simple mathematical models to investigate the contributions of these various processes to the longevity of immune memory (defined as the rate of decline of the population of antigen-specific memory cells). The models we develop incorporate a large repertoire of immune cells, each lineage having distinct antigenic specificities, and describe the dynamics of the individual lineages and total population of cells. Our results suggest that, if homeostatic control regulates the total population of memory cells, then, for a wide range of parameters, immune memory will be long-lived in the absence of persistent antigen (T1/2 > 1 year). We also show that the longevity of memory in this situation will be insensitive to the relative rates of cross-reactive stimulation, the rate of turnover of immune cells, and the functional form of the term for the maintenance of homeostasis.
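One minimal way to write down such a model is a set of lineage equations sharing a homeostatically controlled division rate. The form and all parameter values below are invented for illustration; they are not the authors' exact equations:

```python
# Illustrative toy model: every lineage M_i divides at the homeostatically
# controlled per-capita rate r*(1 - N/K), where N is the total memory
# population, and dies at rate d. Parameters are invented for the sketch.

def simulate(m0, r=1.0, d=0.5, K=1000.0, dt=0.01, steps=20000):
    m = list(m0)
    for _ in range(steps):
        total = sum(m)
        per_capita = r * (1.0 - total / K) - d  # division minus death
        m = [x + dt * per_capita * x for x in m]  # forward-Euler step
    return m

# A large "background" pool plus a small antigen-specific lineage:
final = simulate([900.0, 10.0])
# The total settles at K*(1 - d/r) = 500; once there, the specific lineage
# shrinks only in proportion to the whole pool and is then maintained
# indefinitely, i.e. memory is long-lived without persistent antigen.
```

In this toy version the antigen-specific lineage never decays faster than the pool as a whole, which mirrors the paper's conclusion that homeostatic control of the total population yields long-lived memory.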

Relevance: 30.00%

Publisher:

Abstract:

Photosynthesis, biological nitrogen fixation, and carbon dioxide assimilation are three fundamental biological processes catalyzed by photosynthetic bacteria. In the present study, it is shown that mutant strains of the nonsulfur purple photosynthetic bacteria Rhodospirillum rubrum and Rhodobacter sphaeroides, containing a blockage in the primary CO2 assimilatory pathway, derepress the synthesis of components of the nitrogen fixation enzyme complex and abrogate normal control mechanisms. The absence of the Calvin–Benson–Bassham (CBB) reductive pentose phosphate CO2 fixation pathway removes an important route for the dissipation of excess reducing power. Thus, the mutant strains develop alternative means to remove these reducing equivalents, resulting in the synthesis of large amounts of nitrogenase even in the presence of ammonia. This response is under the control of a global two-component signal transduction system previously found to regulate photosystem biosynthesis and the transcription of genes required for CO2 fixation through the CBB pathway and alternative routes. In addition, this two-component system directly controls the ability of these bacteria to grow under nitrogen-fixing conditions. These results indicate that there is a molecular link between the CBB pathway and the nitrogen fixation process, allowing the cell to overcome powerful control mechanisms to remove excess reducing power generated by photosynthesis and carbon metabolism. Furthermore, these results suggest that the two-component system integrates the expression of genes required for the three processes of photosynthesis, nitrogen fixation, and carbon dioxide fixation.

Relevance: 30.00%

Publisher:

Abstract:

The serpins are a family of proteinase inhibitors that play a central role in the control of proteolytic cascades. Their inhibitory mechanism depends on the intramolecular insertion of the reactive loop into β-sheet A after cleavage by the target proteinase. Point mutations within the protein can allow aberrant conformational transitions characterized by β-strand exchange between the reactive loop of one molecule and β-sheet A of another. These loop-sheet polymers result in diseases as varied as cirrhosis, emphysema, angio-oedema, and thrombosis, and we recently have shown that they underlie an early-onset dementia. We report here the biochemical characteristics and crystal structure of a naturally occurring variant (Leu-55–Pro) of the plasma serpin α1-antichymotrypsin trapped as an inactive intermediate. The structure demonstrates a serpin configuration with partial insertion of the reactive loop into β-sheet A. The lower part of the sheet is filled by the last turn of F-helix and the loop that links it to s3A. This conformation matches that of proposed intermediates on the pathway to complex and polymer formation in the serpins. In particular, this intermediate, along with the latent and polymerized conformations, explains the loss of activity of plasma α1-antichymotrypsin associated with chronic obstructive pulmonary disease in patients with the Leu-55–Pro mutation.

Relevance: 30.00%

Publisher:

Abstract:

Objective: To determine if exposure to benzodiazepines during the first trimester of pregnancy increases risk of major malformations or cleft lip or palate.

Relevance: 30.00%

Publisher:

Abstract:

Darwin observed that multiple, lowly organized, rudimentary, or exaggerated structures show increased relative variability. However, the cellular basis for these laws has never been investigated. Some animals, such as the nematode Caenorhabditis elegans, are famous for having organs that possess the same number of cells in all individuals, a property known as eutely. But for most multicellular creatures, the extent of cell number variability is unknown. Here we estimate variability in organ cell number for a variety of animals, plants, slime moulds, and volvocine algae. We find that the mean and variance in cell number obey a power law with an exponent of 2, comparable to Taylor's law in ecological processes. Relative cell number variability, as measured by the coefficient of variation, differs widely across taxa and tissues, but is generally independent of mean cell number among homologous tissues of closely related species. We show that the power law for cell number variability can be explained by stochastic branching process models based on the properties of cell lineages. We also identify taxa in which the precision of developmental control appears to have evolved. We propose that the scale independence of relative cell number variability is maintained by natural selection.
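The branching-process explanation of the exponent-2 power law can be illustrated with a minimal Galton-Watson simulation. The offspring rule and all parameters here are invented for the sketch, not taken from the paper's lineage data:

```python
import random

# Sketch: a Galton-Watson branching process in which each cell leaves
# 2 offspring with probability p (else 1). For a fixed per-division noise,
# Var(N) grows in proportion to E(N)^2 as generations accumulate, so
# var/mean^2 is roughly constant across "organs" of very different sizes.

def organ_size(generations, p=0.8, rng=random):
    n = 1
    for _ in range(generations):
        n = sum(2 if rng.random() < p else 1 for _ in range(n))
    return n

def mean_var(generations, trials=2000, seed=1):
    rng = random.Random(seed)
    sizes = [organ_size(generations, rng=rng) for _ in range(trials)]
    m = sum(sizes) / trials
    v = sum((s - m) ** 2 for s in sizes) / (trials - 1)
    return m, v

m4, v4 = mean_var(4)   # "small organ": mean cell number ~ 1.8**4
m8, v8 = mean_var(8)   # "large organ": mean cell number ~ 1.8**8
```

Doubling the number of generations multiplies the mean by roughly 1.8^4 while the variance grows by roughly the square of that factor, which is the Taylor's-law scaling with exponent 2 that the paper reports across taxa.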

Relevance: 30.00%

Publisher:

Abstract:

Nonlinear analyses of infant heart rhythms reveal a marked rise in the complexity of the electrocardiogram with maturation. We find that normal mature infants (gestation ≥ 35 weeks) have complex and distinctly nonlinear heart rhythms (consistent with recent reports for healthy adults) but that such nonlinearity is lacking in preterm infants (gestation ≥ 27 weeks), where parasympathetic-sympathetic interaction and function are presumed to be less well developed. Our study further shows that infants with clinical brain death and those treated with atropine exhibit a similar lack of nonlinear feedback control. These three lines of evidence support the hypothesis championed by Goldberger et al. [Goldberger, A.L., Rigney, D.R. & West, B.J. (1990) Sci. Am. 262, 43-49] that autonomic nervous system control underlies the nonlinearity and possible chaos of normal heart rhythms. This report demonstrates the acquisition of nonlinear heart rate dynamics and possible chaos in developing human infants and its loss in brain death and with the administration of atropine. It parallels earlier work documenting changes in the variability of heart rhythms in each of these cases and suggests that nonlinearity may provide additional power in characterizing physiological states.
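A simple example of the kind of nonlinear statistic involved is time-reversal asymmetry. The sketch below applies it to synthetic stand-in series (a time-reversible linear process versus a chaotic map), not to heart-rate data, and is an illustration rather than the authors' method:

```python
import random

# Time-reversal asymmetry of the increments, E[(x_t - x_{t-1})^3],
# normalized by the increment variance. Linear Gaussian processes are
# time-reversible (index near zero); many nonlinear systems are not.

def trev(x):
    d = [a - b for a, b in zip(x[1:], x[:-1])]
    m2 = sum(v * v for v in d) / len(d)
    m3 = sum(v ** 3 for v in d) / len(d)
    return m3 / m2 ** 1.5

rng = random.Random(0)

# A linear AR(1) process: time-reversible, so trev stays near zero.
ar = [0.0]
for _ in range(20000):
    ar.append(0.9 * ar[-1] + rng.gauss(0.0, 1.0))

# The chaotic logistic map: strongly time-irreversible.
lg = [0.4]
for _ in range(20000):
    lg.append(3.9 * lg[-1] * (1.0 - lg[-1]))
```

A statistic of this kind distinguishes a genuinely nonlinear signal from a linear one with the same variability, which is the distinction the paper draws between mature and preterm (or atropine-treated) heart rhythms.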

Relevance: 30.00%

Publisher:

Abstract:

The object of this doctoral thesis is to analyze the political and administrative purpose served by the reform of a vital sector of State powers, the administration of Justice, within the framework of delegative democracy. It also analyzes whether State reform in a diminished or non-liberal setting increases or improves the conditions of democracy in a given situation, based on the constitutional "what should be", or whether what occurs is a process of "seizure" of the functions of the State, which becomes an institutional risk. Finally, we examine the real and effective existence of a horizontal accountability process through the use of institutional resources, which would evidence the existence of an incomplete model of democracy. This analysis involves the relationship between two institutions of public administration: State reform, as an act of change in State structure intended to qualitatively improve the outcomes and outputs of public policies and, in sum, to make the system work better. As will be examined later, in Latin America this was the State's response to three crises: the fiscal crisis, the crisis of government intervention, and the crisis of the bureaucratic form of administration. Within that scheme, this thesis examines the current state of the art in public administration science to show that, in a delegative democracy, instruments of this type disregard the constitutive elements of democracy and serve, especially in critical areas of the administration, to allow Power to set aside Law. This research seeks to contribute to an area seldom analyzed in public administration doctrine in light of the theory of law: the connection between the prior conditions, or principal inputs, of the execution of a reform in a democracy and, on the other hand, the effects of introducing a reform within changing models of democracy and new conceptions of the rule of law.
A review of the literature on State reform shows that no prior work has addressed the conditions the political system must meet for a reform to operate while respecting fundamental rights as an object of the procedure. Furthermore, no analysis has been found of the structural change of strategic areas of State services with regard to its effect on democratic practice and its outcome in an open society...

Relevance: 30.00%

Publisher:

Abstract:

With no written record, the religious beliefs of the Pre-Columbian Mochica civilization are much of a mystery. This paper attempts to decipher the position of the deceased Mochicans, also known as ancestors, within the society as a whole. It discusses the ways in which we can use multiple sources of information, archaeological, iconographic, ethnohistoric, and ethnographic, to learn about the various aspects of Mochica culture. Specifically, I will use these methods of collecting data to examine how the Mochica viewed their deceased and to argue that part of the Mochica religious system granted their dead a supernatural ability to control human and agricultural fertility. This power would give Mochican ancestors a significant place within the society.

Relevance: 30.00%

Publisher:

Abstract:

Standards reduce production costs and increase products' value to consumers. Standards, however, entail risks of anti-competitive abuse. After the adoption of a standard, the chosen technology normally lacks credible substitutes. The owner of the patented technology might thus have additional market power relative to locked-in licensees, and might exploit this power to charge higher access rates. In the economic literature this phenomenon is referred to as 'hold-up'. To reduce the risk of hold-up, standard-setting organisations often require patent holders to disclose their standard-essential patents before the adoption of the standard and to commit to license on fair, reasonable and non-discriminatory (FRAND) terms. The European Commission normally investigates unfair-pricing abuse in a standard-setting context if a patent holder who committed to FRAND ex ante is suspected of not abiding by it ex post. However, this approach risks ignoring a number of potential abuses that are likely to harm welfare. That can happen if, for example, ex post a licensee is able to impose excessively low access rates ('reverse hold-up'), or if a patent holder acquires additional market power thanks to the standard but its essential patents are not encumbered by FRAND commitments, for instance because the patent holder did not directly participate in the standard-setting process and was therefore not required by the standard-setting organisation to commit to FRAND ex ante. A consistent Commission policy capable of tackling all sources of harm should be enforced regardless of whether FRAND commitments are given. Antitrust enforcement should hinge on the identification of a distortion in the bargaining process around technology access prices, one that is determined by the adoption of the standard and is not attributable to the pro-competitive merits of any of the involved players.

Relevance: 30.00%

Publisher:

Abstract:

Doctoral thesis, Pharmacy (Cellular and Molecular Biology), Universidade de Lisboa, Faculdade de Farmácia, 2016