859 results for Voltage control in distribution systems


Relevance: 100.00%

Abstract:

Access control is a fundamental concern in any system that manages resources, e.g., operating systems, file systems, databases and communications systems. The problem we address is how to specify, enforce, and implement access control in distributed environments. This problem occurs in many applications such as management of distributed project resources, e-newspaper and pay-TV subscription services. Starting from an access relation between users and resources, we derive a user hierarchy, a resource hierarchy, and a unified hierarchy. The unified hierarchy is then used to specify the access relation in a way that is compact and that allows efficient queries. It is also used in cryptographic schemes that enforce the access relation. We introduce three cryptography-based hierarchical schemes, which can effectively enforce and implement access control and are designed for distributed environments because they do not need the presence of a central authority (except perhaps for set-up).
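
As a rough sketch of the first step, the following Python snippet derives a user hierarchy from a toy access relation by merging users with identical resource sets into classes and ordering the classes by containment of their resource sets; the example relation, the variable names, and the construction itself are illustrative assumptions, and the paper's unified hierarchy and cryptographic key-assignment schemes are not reproduced here.

```python
from collections import defaultdict

# Toy access relation: user -> set of resources the user may access (illustrative only).
access = {
    "alice": {"r1", "r2", "r3"},
    "bob":   {"r1", "r2"},
    "carol": {"r1", "r2"},
    "dave":  {"r1"},
}

# Step 1: merge users with identical resource sets into user classes.
classes = defaultdict(set)
for user, resources in access.items():
    classes[frozenset(resources)].add(user)

# Step 2: order the classes by containment of their resource sets
# (transitive edges are kept for simplicity; a Hasse diagram would remove them).
edges = [(hi, lo) for hi in classes for lo in classes if lo < hi]

for hi, lo in edges:
    print(sorted(classes[hi]), "can access everything", sorted(classes[lo]), "can")
```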

Relevance: 100.00%

Abstract:

In fluids and plasmas with zonal flow reversed shear, a peculiar kind of transport barrier appears in the shearless region, one associated with its own characteristic route of transition to chaos. These barriers have been identified in symplectic nontwist maps that model such zonal flows. We use the so-called standard nontwist map, a paradigmatic example of nontwist systems, to analyze the parameter dependence of the transport through a broken shearless barrier. By varying a suitable control parameter, we identify the onset of structures with high stickiness that give rise to an effective barrier near the broken shearless curve. Moreover, we show how these stickiness structures, and the concomitant transport reduction in the shearless region, are determined by a homoclinic tangle of the remaining dominant twin island chains. We use the finite-time rotation number, a recently proposed diagnostic, to identify transport barriers that separate different regions of stickiness. The identified barriers are comparable to those obtained by using finite-time Lyapunov exponents.
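
For reference, the standard nontwist map and the finite-time rotation number diagnostic can be sketched as follows; the map is written in its usual two-parameter form, while the parameter values and the scan over initial conditions are illustrative choices rather than those analyzed in the paper.

```python
import numpy as np

def snm_step(x, y, a, b):
    """One iteration of the standard nontwist map; x is kept on the real line (lifted)."""
    y_new = y - b * np.sin(2.0 * np.pi * x)
    x_new = x + a * (1.0 - y_new ** 2)
    return x_new, y_new

def finite_time_rotation_number(x0, y0, a, b, n_iter=1000):
    """Finite-time rotation number: average winding of the lifted x coordinate over n_iter steps."""
    x, y = x0, y0
    for _ in range(n_iter):
        x, y = snm_step(x, y, a, b)
    return (x - x0) / n_iter

# Scan initial conditions across the shearless region (parameter values are illustrative only).
a, b = 0.615, 0.4
ys = np.linspace(-1.0, 1.0, 201)
omegas = [finite_time_rotation_number(0.3, y0, a, b) for y0 in ys]
print(min(omegas), max(omegas))
```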

Relevance: 100.00%

Abstract:

Background: Non-adherence to treatment has been identified as the main cause of uncontrolled blood pressure (BP), and may represent a greater risk in older individuals. Objective: The aim of this study was to evaluate and compare the rate of adherence to hypertension treatment using different methods, to estimate the BP control rate, and to determine whether there is an association between BP control and adherence. Methods: Treatment adherence was evaluated in older patients with hypertension followed up in public primary health care, using four methods: the Morisky-Green test (reference), the Attitude regarding the Medication Intake questionnaire (AMI), an evaluation of adherence by the nurse in the office (Nurse Adherence Evaluation - NAE), and at home (Home Adherence Evaluation - HAE). Salt intake was estimated by 24-hour urinary sodium excretion. BP control was assessed by awake ambulatory blood pressure monitoring. Results: Concordance between the Morisky-Green test and AMI (Kappa=0.27) or NAE (Kappa=0.05) was poor. There was moderate concordance between the Morisky-Green test and HAE. Eighty percent had controlled BP, including 42% with a white-coat effect. The group with lower salt excretion reported avoiding salt intake more often (p<0.001) and had better medication adherence (p<0.001) than the group with higher salt excretion. Conclusion: The evaluated tests did not show good concordance with the Morisky-Green test. Adherence to hypertension treatment was low; however, there was a high rate of BP control when subjects with the white-coat effect were included in the analysis. (Arq Bras Cardiol 2012;99(1):636-641)
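
A minimal sketch of the concordance computation, assuming dichotomous adherent/non-adherent classifications and using scikit-learn's Cohen's kappa; the vectors below are illustrative placeholders, not study data.

```python
from sklearn.metrics import cohen_kappa_score

# Two adherence classifications for the same patients (1 = adherent, 0 = non-adherent).
morisky = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]   # reference method (placeholder values)
ami     = [1, 1, 1, 0, 0, 1, 0, 1, 1, 0]   # questionnaire under comparison (placeholder values)

print("kappa =", round(cohen_kappa_score(morisky, ami), 2))
```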

Relevance: 100.00%

Abstract:

Model predictive control (MPC) applications in the process industry usually deal with process systems that show time delays (dead times) between the system inputs and outputs. Also, in many industrial applications of MPC, integrating outputs resulting from liquid level control or recycle streams need to be considered as controlled outputs. Conventional MPC packages can be applied to time-delay systems, but stability of the closed-loop system will depend on the tuning parameters of the controller and cannot be guaranteed even in the nominal case. In this work, a state-space model based on the analytical step-response model is extended to the case of integrating systems with time delays. This model is applied to the development of two versions of a nominally stable MPC, which is designed for the practical scenario in which one has targets for some of the inputs and/or outputs that may be unreachable, and zone control (or interval tracking) for the remaining outputs. The controller is tested through simulation of a multivariable industrial reactor system. (C) 2012 Elsevier Ltd. All rights reserved.
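
As a generic illustration of how a step-response model of a delayed process enters a predictive controller, the sketch below implements an unconstrained dynamic-matrix-control move computation for a first-order-plus-dead-time process; it is not the nominally stable, zone-control MPC of the paper, and the process parameters, horizons, and move penalty are arbitrary assumptions.

```python
import numpy as np

def step_response_fopdt(K=1.0, tau=10.0, delay=5, n=60, dt=1.0):
    """Step-response coefficients of a first-order-plus-dead-time process (illustrative model)."""
    t = np.arange(1, n + 1) * dt
    g = K * (1.0 - np.exp(-(t - delay * dt) / tau))
    g[t <= delay * dt] = 0.0          # output does not move until the dead time has elapsed
    return g

g = step_response_fopdt()
P, M, lam = 30, 5, 0.1                # prediction horizon, control horizon, move penalty

# Dynamic matrix: column j holds the step response shifted by j samples.
G = np.zeros((P, M))
for j in range(M):
    G[j:, j] = g[: P - j]

K_dmc = np.linalg.solve(G.T @ G + lam * np.eye(M), G.T)   # unconstrained DMC gain

# One control computation: correct the predicted free response toward the set point.
free_response = np.zeros(P)           # predicted output if no further moves are made (placeholder)
setpoint = np.ones(P)
du = K_dmc @ (setpoint - free_response)   # planned input moves; only du[0] is applied
print("first control move:", du[0])
```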

Relevance: 100.00%

Abstract:

A systematic approach to modeling nonlinear systems using norm-bounded linear differential inclusions (NLDIs) is proposed in this paper. The resulting NLDI model is suitable for the application of linear control design techniques and, therefore, it is possible to fulfill certain specifications for the underlying nonlinear system, within an operating region of interest in the state space, using a linear controller designed for this NLDI model. Hence, a procedure to design a dynamic output feedback controller for the NLDI model is also proposed in this paper. One of the main contributions of the proposed modeling and control approach is the use of the mean-value theorem to represent the nonlinear system by a linear parameter-varying model, which is then mapped into a polytopic linear differential inclusion (PLDI) within the region of interest. To avoid the combinatorial problem that is inherent in polytopic models for medium- and large-sized systems, the PLDI is transformed into an NLDI, and the whole process is carried out ensuring that all trajectories of the underlying nonlinear system are also trajectories of the resulting NLDI within the operating region of interest. Furthermore, it is also possible to choose a particular structure for the NLDI parameters to reduce the conservatism in the representation of the nonlinear system by the NLDI model, and this feature is also one important contribution of this paper. Once the NLDI representation of the nonlinear system is obtained, the paper proposes the application of a linear control design method to this representation. The design is based on quadratic Lyapunov functions and formulated as a search problem over a set of bilinear matrix inequalities (BMIs), which is solved using a two-step separation procedure that maps the BMIs into a set of corresponding linear matrix inequalities. Two numerical examples are given to demonstrate the effectiveness of the proposed approach.
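
A toy illustration of the quadratic Lyapunov idea for a two-vertex polytopic LDI is sketched below: a single matrix P > 0 is computed from one vertex via a Lyapunov equation and then merely checked against the other vertex; the paper instead solves the design problem jointly through BMIs mapped into LMIs and also designs an output-feedback controller, none of which is reproduced here, and the vertex matrices are placeholders.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Two vertex systems of a hypothetical polytopic linear differential inclusion.
A1 = np.array([[0.0, 1.0], [-2.0, -1.0]])
A2 = np.array([[0.0, 1.0], [-2.5, -1.2]])

# Solve A1' P + P A1 = -I for P (symmetric and positive definite when A1 is Hurwitz).
P = solve_continuous_lyapunov(A1.T, -np.eye(2))

def decreases_along(A, P):
    """True if A' P + P A is negative definite, i.e. V(x) = x' P x decays along x' = A x."""
    M = A.T @ P + P @ A
    return np.all(np.linalg.eigvalsh(M) < 0)

common_clf = np.all(np.linalg.eigvalsh(P) > 0) and all(decreases_along(A, P) for A in (A1, A2))
print("P =\n", P)
print("common quadratic Lyapunov function found:", common_clf)
```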

Relevance: 100.00%

Abstract:

The first part of my thesis presents an overview of the different approaches used in the past two decades in the attempt to forecast epileptic seizures on the basis of intracranial and scalp EEG. Past research has revealed some value of linear and nonlinear algorithms in detecting EEG features that change over the different phases of the epileptic cycle. However, their exact value for seizure prediction, in terms of sensitivity and specificity, is still debated and has to be evaluated. In particular, the monitored EEG features may fluctuate with the vigilance state and lead to false alarms. Recently, such a dependency on vigilance states has been reported for some seizure prediction methods, suggesting a reduced reliability. An additional factor limiting application and validation of most seizure-prediction techniques is their computational load. For the first time, the reliability of permutation entropy (PE) was verified for seizure prediction on scalp EEG data, while simultaneously controlling for its dependency on different vigilance states. PE was recently introduced as an extremely fast and robust complexity measure for chaotic time series and is thus suitable for online application, even in portable systems. The capability of PE to distinguish between preictal and interictal states was assessed using Receiver Operating Characteristic (ROC) analysis. Correlation analysis was used to assess the dependency of PE on vigilance states. Scalp EEG data from two right temporal lobe epilepsy (RTLE) patients and from one patient with right frontal lobe epilepsy were analysed. The last patient was included only in the correlation analysis, since no datasets including seizures were available for him. The ROC analysis showed a good separability of interictal and preictal phases for both RTLE patients, suggesting that PE could be sensitive to EEG modifications, not visible on visual inspection, that might occur well in advance of the EEG and clinical onset of seizures. However, the simultaneous assessment of the changes in vigilance showed that: a) all seizures occurred in association with a transition of vigilance states; b) PE was sensitive in detecting different vigilance states, independently of seizure occurrence. Due to the limitations of the datasets, these results cannot rule out the capability of PE to detect preictal states. However, the good separability between preictal and interictal phases might depend exclusively on the coincidence of epileptic seizure onset with a transition from a state of low vigilance to a state of increased vigilance. The dependency of PE on vigilance state is an original finding, not previously reported in the literature, and suggests the possibility of classifying vigilance states by means of PE in an automatic and objective way. The second part of my thesis describes a novel behavioral task based on motor imagery skills, first introduced in Bruzzo et al. (2007), designed to study mental simulation of biological and non-biological movement in paranoid schizophrenics (PS). Immediately after the presentation of a real movement, participants had to imagine or re-enact the very same movement. By key release and key press, respectively, participants indicated when they started and ended the mental simulation or the re-enactment, making it feasible to measure the duration of the simulated or re-enacted movements.
The proportional error between the duration of the re-enacted/simulated movement and the template movement was compared between conditions, as well as between PS and healthy subjects. Results revealed a double dissociation between the mechanisms of mental simulation involved in biological and non-biological movement simulation: PS made large errors when simulating biological movements while being more accurate than healthy subjects when simulating non-biological movements, whereas healthy subjects showed the opposite pattern, making errors during simulation of non-biological movements but being most accurate during simulation of biological movements. However, the good timing precision during re-enactment of the movements in all conditions and in both groups of participants suggests that perception, memory and attention, as well as motor control processes, were not affected. Based upon a long history of literature reporting the existence of psychotic episodes in epileptic patients, a longitudinal study, using a slightly modified behavioral paradigm, was carried out with two RTLE patients, one patient with idiopathic generalized epilepsy, and one patient with extratemporal lobe epilepsy. Results provide strong evidence for the possibility of predicting upcoming seizures in RTLE patients behaviorally. The last part of the thesis validates a behavioural strategy based on neurobiofeedback training to voluntarily control seizures and reduce their frequency. Three epileptic patients were included in this study. The biofeedback was based on monitoring of slow cortical potentials (SCPs) extracted online from scalp EEG. Patients were trained to produce positive shifts of SCPs. After a training phase, patients were monitored for 6 months in order to validate the ability of the learned strategy to reduce seizure frequency. Two of the three refractory epileptic patients recruited for this study showed improvements in self-management and a reduction of ictal episodes, even six months after the last training session.
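
For reference, permutation entropy in the sense of Bandt and Pompe (2002) can be sketched in a few lines; the embedding order, the delay, and the example signals below are illustrative choices, not the settings used on the patients' EEG.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Permutation entropy of a 1-D signal (Bandt & Pompe, 2002)."""
    x = np.asarray(x, dtype=float)
    n_patterns = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n_patterns):
        window = x[i : i + order * delay : delay]
        pattern = tuple(np.argsort(window))          # ordinal pattern of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n_patterns
    pe = -np.sum(p * np.log(p))
    return pe / np.log(factorial(order)) if normalize else pe

# Example: white noise has PE close to 1, a slow regular oscillation much lower.
rng = np.random.default_rng(0)
print(permutation_entropy(rng.standard_normal(5000)))           # close to 1
print(permutation_entropy(np.sin(np.linspace(0, 100, 5000))))   # well below 1
```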

Relevance: 100.00%

Abstract:

We investigate the statics and dynamics of a glassy, non-entangled, short bead-spring polymer melt with molecular dynamics simulations. Temperature ranges from slightly above the mode-coupling critical temperature to the liquid regime where features of a glassy liquid are absent. Our aim is to work out the polymer-specific effects on the relaxation and particle correlation. We find the intra-chain static structure unaffected by temperature; it depends only on the distance of monomers along the backbone. In contrast, the distinct inter-chain structure shows pronounced site-dependence effects at the length scales of the chain and the nearest-neighbor distance. There, we also find the strongest temperature dependence, which drives the glass transition. Both the site-averaged coupling of the monomer and center of mass (CM) and the CM-CM coupling are weak and presumably not responsible for a peak in the coherent relaxation time at the chain's length scale. Chains rather emerge as soft, easily interpenetrating objects. Three-particle correlations are well reproduced by the convolution approximation, with the exception of model-dependent deviations. In the spatially heterogeneous dynamics of our system we identify highly mobile monomers which tend to follow each other in one-dimensional paths forming "strings". These strings have an exponential length distribution and are generally short compared to the chain length. Thus, a relaxation mechanism in which neighboring mobile monomers move along the backbone of the chain seems unlikely. However, the correlation of bonded neighbors is enhanced. When liquids are confined between two surfaces in relative sliding motion, kinetic friction is observed. We study a generic model setup by molecular dynamics simulations for a wide range of sliding speeds, temperatures, loads, and lubricant coverings for simple and molecular fluids. Instabilities in the particle trajectories are identified as the origin of kinetic friction. They lead to high particle velocities of fluid atoms which are gradually dissipated, resulting in a friction force. In commensurate systems fluid atoms follow continuous trajectories for sub-monolayer coverings and, consequently, friction vanishes at low sliding speeds. For incommensurate systems the velocity probability distribution exhibits approximately exponential tails. We connect this velocity distribution to the kinetic friction force, which reaches a constant value at low sliding speeds. This approach agrees well with the friction obtained directly from simulations and explains Amontons' law on the microscopic level. Molecular bonds in commensurate systems lead to incommensurate behavior, but do not change the qualitative behavior of incommensurate systems. However, crossed chains form stable load-bearing asperities which strongly increase friction.

Relevance: 100.00%

Abstract:

Synthetic biology is a young field of applied research that aims to design and build artificial biological devices useful for human applications. Chapter 1 of the thesis presents how synthetic biology emerged in past years and how the development of the Registry of Standard Biological Parts aimed to introduce one practical starting solution for applying the basics of engineering to molecular biology. The same chapter recalls how biological parts can make up a genetic program, the molecular cloning technique useful for this purpose, and gives an overview of the mathematical modeling adopted to describe gene circuit behavior. Although the design of gene circuits has become feasible, the increasing complexity of gene networks calls for a rational approach to designing gene circuits. A bottom-up approach was proposed, suggesting that the behavior of a complicated system can be predicted from the features of its parts. The option to use modular parts in large-scale networks will be facilitated by a detailed and shared characterization of their functional properties. Such a prediction requires well-characterized mathematical models of the parts and of how they behave when assembled together. Chapter 2 describes the feasibility of the bottom-up approach in the design of a synthetic program in Escherichia coli bacterial cells. The rational design of gene networks is, however, far from being established. The synthetic biology approach can use mathematical formalism to identify biological information not assessable with experimental measurements. In this context, chapter 3 describes the design of a synthetic sensor for identifying molecules of interest inside eukaryotic cells. The Registry of Standard Parts collects standard and modular biological parts. To spread the use of BioBricks, the iGEM competition was started. The ICM Laboratory, where Francesca Ceroni completed her Ph.D., participated with teams of students, and Chapter 4 summarizes the projects developed.

Relevance: 100.00%

Abstract:

Pearls are an amazing example of calcium carbonate biomineralization. They show a classic brick-and-mortar internal structure in which the predominant inorganic part is composed of aragonite and vaterite tablets. The organic matrix is arranged in concentric layers tightly associated with the mineral structures. Freshwater cultured pearls (FWCPs) and the nacreous shell layers of the Chinese mussel Hyriopsis cumingii were demineralized using an ion exchange resin in order to isolate the organic matrix. From both starting materials a soluble fraction was obtained and further analyzed. The major component of the soluble extracts was a similar glycoprotein with a molecular weight of about 48 kDa in pearls and 44 kDa in shells. Immunolocalization showed its wide distribution in the organic sheet surrounding the calcium carbonate tablets of the nacre and in the interlamellar and intertabular matrix. These acidic glycoproteins, also contained inside the aragonite platelets, are direct regulators of biomineralization processes, participating in calcium carbonate precipitation from the nucleation step onward. Selective calcium carbonate polymorph precipitation was performed using the two extracts. The polysaccharide moiety was shown to be a crucial factor in polymorph selection. In particular, the higher content of sugar groups found in the pearl extract was responsible for the stabilization of the high-energy vaterite polymorph during the in vitro precipitation assay, while irregular calcite was obtained using the shell protein. Furthermore, these polypeptides showed a carbonic anhydrase activity that, even if not directly involved in polymorph determination, is an essential regulator of CaCO3 formation by means of carbonate anion production. The structural and functional characterization of the proteins included in biocomposites gives important hints for understanding the complicated process of biomineralization. A better knowledge of this natural mechanism can offer new strategies for producing environmentally friendly materials with controlled structures and enhanced chemical-physical features.

Relevance: 100.00%

Abstract:

A new control scheme is presented in this thesis. Based on the NonLinear Geometric Approach, the proposed Active Control System represents a new way to view reconfigurable controllers for aerospace applications. The presence of the Diagnosis module (providing the estimation of generic signals which, depending on the case, can be faults, disturbances or system parameters), the main feature of the proposed Active Control System, is a characteristic shared by three well-known control schemes: Active Fault Tolerant Control, Indirect Adaptive Control and Active Disturbance Rejection Control. The standard NonLinear Geometric Approach (NLGA) has been thoroughly investigated and then extended to broaden its applicability to more complex models. The standard NLGA procedure has been modified to take into account feasible and estimable sets of unknown signals. Furthermore, the application of the singular perturbations approximation has led to the solution of detection and isolation problems in scenarios too complex to be solved by the standard NLGA. The estimation process has also been improved, where multiple redundant measurements are available, by the introduction of a new algorithm, here called "Least Squares - Sliding Mode". It guarantees optimality, in the sense of least squares, and finite estimation time, in the sense of the sliding mode. The Active Control System concept has been formalized in two controllers: a nonlinear backstepping controller and a nonlinear composite controller. Particularly interesting is the integration, in the controller design, of the estimates coming from the Diagnosis module. Stability proofs are provided for both control schemes. Finally, different aerospace applications are presented to show the applicability and the effectiveness of the proposed NLGA-based Active Control System.

Relevance: 100.00%

Abstract:

Fully controlled liquid injection and flow in hydrophobic polydimethylsiloxane (PDMS) two-dimensional microchannel arrays based on on-chip integrated, low-voltage-driven micropumps are demonstrated. Our architecture exploits the surface-acoustic-wave (SAW) induced counterflow mechanism and the effect of nebulization anisotropies at crossing areas owing to lateral propagating SAWs. We show that by selectively exciting single or multiple SAWs, fluids can be drawn from their reservoirs and moved towards selected positions of a microchannel grid. Splitting of the main liquid flow is also demonstrated by exploiting multiple SAW beams. As a demonstrator, we show simultaneous filling of two orthogonal microchannels. The present results show that SAW micropumps are good candidates for truly integrated on-chip fluidic networks allowing liquid control in arbitrarily shaped two-dimensional microchannel arrays.

Relevance: 100.00%

Abstract:

For virtually all hospitals, utilization rates are a critical managerial indicator of efficiency and are determined in part by turnover time. Turnover time is defined as the time elapsed between surgeries, during which the operating room is cleaned and prepared for the next surgery. Lengthier turnover times result in lower utilization rates, thereby hindering hospitals' ability to maximize the number of patients that can be attended to. In this thesis, we analyze operating room data from a two-year period provided by Evangelical Community Hospital in Lewisburg, Pennsylvania, to understand the variability of the turnover process. From the recorded data provided, we derive our best estimation of turnover time. Recognizing the importance of being able to properly model turnover times in order to improve the accuracy of scheduling, we seek to fit distributions to the set of turnover times. We find that log-normal and log-logistic distributions are well suited to turnover times, although further research must validate this finding. We propose that the choice of distribution depends on the hospital and, as a result, a hospital must choose whether to use the log-normal or the log-logistic distribution. Next, we use statistical tests to identify variables that may potentially influence turnover time. We find that there does not appear to be a correlation between surgery time and turnover time across doctors. However, there are statistically significant differences between the mean turnover times across doctors. The final component of our research entails analyzing and explaining the benefits of introducing control charts as a quality control mechanism for monitoring turnover times in hospitals. Although widely instituted in other industries, control charts are not widely adopted in healthcare environments, despite their potential benefits. A major component of our work is the development of control charts to monitor the stability of turnover times. These charts can be easily instituted in hospitals to reduce the variability of turnover times. Overall, our analysis uses operations research techniques to analyze turnover times and identify means of lowering the mean turnover time and the variability in turnover times. We provide valuable insight into a component of the surgery process that has received little attention, but can significantly affect utilization rates in hospitals. Most critically, an ability to more accurately predict turnover times and a better understanding of the sources of variability can result in improved scheduling and heightened hospital staff and patient satisfaction. We hope that our findings can apply to many other hospital settings.
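
The distribution-fitting step can be sketched as follows, comparing a log-normal and a log-logistic fit (scipy's fisk distribution) by AIC and a Kolmogorov-Smirnov test; the turnover times in the sketch are synthetic placeholders, not the hospital's records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
turnover = rng.lognormal(mean=3.3, sigma=0.4, size=300)   # placeholder turnover times (minutes)

# scipy's 'fisk' distribution is the log-logistic distribution.
fits = {
    "log-normal":   (stats.lognorm, stats.lognorm.fit(turnover, floc=0)),
    "log-logistic": (stats.fisk,    stats.fisk.fit(turnover, floc=0)),
}

for name, (dist, params) in fits.items():
    loglik = np.sum(dist.logpdf(turnover, *params))
    aic = 2 * (len(params) - 1) - 2 * loglik              # loc fixed at 0, so one fewer free parameter
    ks = stats.kstest(turnover, dist.cdf, args=params)
    print(f"{name}: AIC={aic:.1f}, KS p-value={ks.pvalue:.3f}")
```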

Relevance: 100.00%

Abstract:

Water distribution systems are life-saving facilities that are especially important in the recovery after earthquakes. In this paper, a framework for the seismic serviceability of water systems is discussed that includes the fragility evaluation of the water sources of water distribution networks, and a case study on the performance of a water system under different levels of seismic hazard is presented. The seismic serviceability of a water supply system provided by EPANET is evaluated under various levels of seismic hazard. The assessment process is based on hydraulic analysis and Monte Carlo simulations, implemented with the empirical fragility data provided by the American Lifeline Alliance (ALA, 2001) for both pipelines and water facilities. Represented by the Seismic Serviceability Index (Cornell University, 2008), the serviceability of the water distribution system is evaluated for earthquakes with return periods of 72 years, 475 years, and 2475 years. The system serviceability at each level of earthquake hazard is compared with and without considering the seismic fragility of the water source. The results show that the seismic serviceability of the water system decreases as the return period of the seismic hazard grows, and that it decreases further when the seismic fragility of the water source is considered. The results reveal the importance of considering the seismic fragility of water sources, and the growing dependence of water system performance on the seismic resilience of the water source under severe earthquakes.
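
A heavily simplified sketch of the Monte Carlo serviceability loop is given below: each trial samples which components survive from assumed failure probabilities, a placeholder stands in for the hydraulic analysis, and the Seismic Serviceability Index is taken as the mean ratio of satisfied to required demand. The failure probabilities, demands, and hydraulic step are all illustrative assumptions; the actual study used EPANET hydraulic analyses and ALA (2001) fragility data, which are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

pipe_fail_prob = np.array([0.05, 0.10, 0.02, 0.08])   # per-pipe failure probability (placeholder)
source_fail_prob = 0.15                               # water-source fragility at this hazard level (placeholder)
node_demand = np.array([10.0, 20.0, 15.0])            # required nodal demands (placeholder)

def satisfied_demand(pipes_ok, source_ok):
    """Placeholder for the hydraulic analysis: satisfied demand scales with surviving capacity."""
    if not source_ok:
        return np.zeros_like(node_demand)
    return node_demand * pipes_ok.mean()

n_trials = 10_000
ratios = np.empty(n_trials)
for k in range(n_trials):
    pipes_ok = rng.random(pipe_fail_prob.size) > pipe_fail_prob   # sample surviving pipes
    source_ok = rng.random() > source_fail_prob                   # sample surviving source
    ratios[k] = satisfied_demand(pipes_ok, source_ok).sum() / node_demand.sum()

print("Seismic Serviceability Index ~", ratios.mean())
```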

Relevance: 100.00%

Abstract:

This research evaluated an Intelligent Compaction (IC) unit on the M-189 highway reconstruction project at Iron River, Michigan. The results from the IC unit were compared to several traditional compaction measurement devices, including the Nuclear Density Gauge (NDG), Geogauge, Light Weight Deflectometer (LWD), Dynamic Cone Penetrometer (DCP), and Modified Clegg Hammer (MCH). The research collected point-measurement data at 30 test locations on a test section, on both the final Class II sand base layer and the 22A gravel layer. These point measurements were compared with the IC measurement values (ICMVs) on a point-to-point basis through linear regression analysis. Poor correlations were obtained between the ICMVs and the point measurements of the different devices using simple regression analysis. Factors contributing to the weak correlation include soil heterogeneity, variation in IC roller operating parameters, in-place moisture content, the narrow measurement ranges of the compaction devices, and the support conditions of the underlying layers. After incorporating some of these factors into a multiple regression analysis, the strength of the correlation improved significantly, especially on the stiffer gravel layer. Measurements were also studied from an overall distribution perspective in terms of average, measurement range, standard deviation, and coefficient of variation. Based on the data analysis, on-site project observation, and a literature review, conclusions were drawn on how IC performed with regard to compaction control on the M-189 reconstruction project.
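
The point-to-point comparison can be sketched as below: a simple regression of a device measurement on the ICMV, followed by a multiple regression that also includes moisture content; the data are random placeholders rather than the M-189 measurements, and the variable names are assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
icmv = rng.uniform(20, 60, 30)                       # roller intelligent-compaction measurement values
moisture = rng.uniform(4, 10, 30)                    # in-place moisture content (%)
lwd_modulus = 0.8 * icmv - 3.0 * moisture + rng.normal(0, 4, 30)   # e.g. an LWD modulus (synthetic)

# Simple regression: device measurement on ICMV only.
simple = sm.OLS(lwd_modulus, sm.add_constant(icmv)).fit()

# Multiple regression: add moisture content as an explanatory variable.
multiple = sm.OLS(lwd_modulus, sm.add_constant(np.column_stack([icmv, moisture]))).fit()

print("simple regression R^2:  ", round(simple.rsquared, 2))
print("multiple regression R^2:", round(multiple.rsquared, 2))
```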

Relevance: 100.00%

Abstract:

Tourette Syndrome begins in childhood and is characterized by uncontrollable repetitive actions like neck craning or hopping and noises such as sniffing or chirping. Worst in early adolescence, these tics wax and wane in severity and occur in bouts unpredictably, often drawing unwanted attention from bystanders. Making matters worse, over half of children with Tourette Syndrome also suffer from comorbid, or concurrent, disorders such as attention deficit hyperactivity disorder (ADHD) and obsessive-compulsive disorder (OCD). These disorders introduce anxious thoughts, impulsivity, inattention, and mood variability that further disrupt children with Tourette Syndrome from focusing and performing well at school and home. Thus, deficits in the cognitive control functions of response inhibition, response generation, and working memory have long been ascribed to Tourette Syndrome. Yet, without considering the effect of medication, age, and comorbidity, this is a premature attribution. This study used an infrared eye tracking camera and various computer tasks requiring eye movement responses to evaluate response inhibition, response generation, and working memory in Tourette Syndrome. This study, the first to control for medication, age, and comorbidity, enrolled 39 unmedicated children with Tourette Syndrome and 29 typically developing peers aged 10-16 years who completed reflexive and voluntary eye movement tasks and diagnostic rating scales to assess symptom severities of Tourette Syndrome, ADHD, and OCD. Children with Tourette Syndrome and comorbid ADHD and/or OCD, but not children with Tourette Syndrome only, took longer to respond and made more errors and distracted eye movements compared to typically-developing children, displaying cognitive control deficits. However, increasing symptom severities of Tourette Syndrome, ADHD, and OCD correlated with one another. Thus, cognitive control deficits were not specific to Tourette Syndrome patients with comorbid conditions, but rather increase with increasing tic severity, suggesting that a majority of Tourette Syndrome patients, regardless of a clinical diagnosis of ADHD and/or OCD, have symptoms of cognitive control deficits at some level. Therefore, clinicians should evaluate and counsel all families of children with Tourette Syndrome, with or without currently diagnosed ADHD and/or OCD, about the functional ramifications of comorbid symptoms and that they may wax and wane with tic severity.