933 results for Automation and control


Relevance: 100.00%

Abstract:

BACKGROUND. Exposure to xenoestrogens during pregnancy may disturb the development and function of the male sexual organs. OBJECTIVE. In this study we aimed to determine whether the combined effect of environmental estrogens, measured as the total effective xenoestrogen burden (TEXB), is a risk factor for male urogenital malformations. METHODS. In a case-control study nested in a mother-child cohort (n = 702) established at Granada University Hospital, we compared 50 newborns with a diagnosis of cryptorchidism and/or hypospadias with 114 boys without malformations, matched by gestational age, date of birth, and parity. Controls did not differ from the total cohort in confounding variables. TEXB and the levels of 16 organochlorine pesticides were measured in placenta tissue. Characteristics of parents, pregnancy, and birth were gathered by questionnaire. We used conditional and unconditional regression models to estimate odds ratios (ORs) and 95% confidence intervals (CIs). RESULTS. TEXB from organohalogenated compounds was detectable in 72% and 54% of case and control placentas, respectively. Compared with controls, cases had an OR for detectable versus non-detectable TEXB of 2.82 (95% CI, 1.10-7.24). More pesticides were detected in cases than in controls (9.34 +/- 3.19 vs. 6.97 +/- 3.93). After adjusting for potential confounders in the conditional regression analysis, the ORs for detectable levels of individual pesticides were: o,p'-DDT, 2.25 (95% CI, 1.03-4.89); p,p'-DDT, 2.63 (95% CI, 1.21-5.72); lindane, 3.38 (95% CI, 1.36-8.38); mirex, 2.85 (95% CI, 1.22-6.66); and endosulfan alpha, 2.19 (95% CI, 0.99-4.82). Mothers' engagement in agriculture (OR = 3.47; 95% CI, 1.33-9.03), fathers' occupational exposure to xenoestrogens (OR = 2.98; 95% CI, 1.11-8.01), and a history of previous stillbirths (OR = 4.20; 95% CI, 1.11-16.66) were also associated with the risk of malformations. CONCLUSIONS. We found an increased risk of male urogenital malformations related to the combined effect of environmental estrogens in the placenta.
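
As an aside on the arithmetic behind such estimates, the sketch below computes a crude odds ratio and 95% confidence interval from a 2x2 table (Woolf's logit method), using hypothetical cell counts back-calculated from the detection percentages above. The study's own estimate of 2.82 came from conditional regression on matched sets with confounder adjustment, which a crude table cannot reproduce.

    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Crude OR and 95% CI for a 2x2 table (Woolf's logit method).
        a/b: exposed/unexposed cases; c/d: exposed/unexposed controls."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of ln(OR)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # 72% of 50 cases (36) and 54% of 114 controls (62) had detectable TEXB
    print(odds_ratio_ci(a=36, b=14, c=62, d=52))  # crude OR around 2.2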

Relevance: 100.00%

Abstract:

It has been estimated that more than 70% of all medical activity depends directly on the information provided by analytical data. Substantial technological advances have taken place recently, allowing a previously unimagined number of analytical samples to be processed while offering high-quality results. Concurrently, yet more new diagnostic determinations have been introduced - all of which has led to a significant increase in the prescription of analytical parameters. This increased workload has placed great pressure on the laboratory with respect to health costs. Today's manager of the Clinical Laboratory (CL) has had to examine cost control as well as rationing - meaning that the CL's focus has not been strictly metrological, as if it were purely a system producing results, but has instead had to concentrate on efficiency and efficacy. By applying re-engineering criteria, an emphasis has been placed on improved organisation and operating practice within the CL, focussing on the current criteria of the Integrated Management Areas where the technical and human resources are brought together. This re-engineering has been based on the concepts of consolidating and integrating the analytical platforms, while differentiating the production areas (CORE Laboratory) from the information areas. With these concepts in mind, the automated handling of virology, along with serology in general, follows the same criteria as the rest of the operating methodology in the Clinical Laboratory.

Relevance: 100.00%

Abstract:

BACKGROUND: Insulin-like growth factor-I (IGF-I) and C-reactive protein (CRP) may be positively associated with the risk of epithelial ovarian cancer (EOC) but no previous studies have investigated their associations with non-epithelial ovarian cancers (NEOC). METHODS: A case-control study was nested within the Finnish Maternity Cohort. Case subjects were 58 women diagnosed with sex cord-stromal tumors (SCST) and 30 with germ cell tumors (GCT) after recruitment. Control subjects (144 for SCST and 74 for GCT) were matched for age, parity, and date of blood donation of the index case. RESULTS: Doubling of IGF-I concentration was not related to maternal risk of either SCST (OR 0.97, 95% CI 0.58-1.62) or GCT (OR 1.13, 95% CI 0.51-2.51). Similarly, doubling of CRP concentrations was not related to maternal risk of either SCST (OR 1.10, 95% CI 0.85-1.43) or GCT (OR 0.93, 95% CI 0.68-1.28). CONCLUSIONS: Pre-diagnostic IGF-I and CRP concentrations during the first trimester of pregnancy were not associated with increased risk of NEOC in the mother. Risk factors for NEOC may differ from those of EOC.
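
For context, an odds ratio "per doubling" of a biomarker is typically obtained by entering the concentration on a log2 scale, so that a one-unit increase in the predictor corresponds to a twofold increase in concentration. A minimal sketch on simulated (null) data, assuming statsmodels is available; this is not the cohort's matched analysis:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    igf = rng.lognormal(mean=4.0, sigma=0.5, size=500)  # simulated IGF-I levels
    x = np.log2(igf)                      # log2 scale: +1 unit = doubling
    y = rng.binomial(1, 0.2, size=500)    # simulated case status, unrelated to x

    fit = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
    print(np.exp(fit.params[1]))          # OR per doubling (about 1 here)
    print(np.exp(fit.conf_int()[1]))      # its 95% CI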

Relevance: 100.00%

Abstract:

Background: Coronary microvascular dysfunction (CMD) is associated with cardiovascular events in type 2 diabetes mellitus (T2DM). Optimal glycaemic control does not always preclude future events. We sought to assess the effect of the current HbA1c target on coronary microcirculatory function and to identify predictive factors for CMD in T2DM patients. Methods: We studied 100 patients with T2DM and 214 patients without T2DM, all of them with a history of chest pain, non-obstructive angiograms and a direct assessment of the increase in coronary blood flow in response to adenosine and acetylcholine coronary infusion, for evaluation of endothelium-independent and endothelium-dependent CMD, respectively. Patients with T2DM were categorized as having optimal (HbA1c < 7 %) vs. suboptimal (HbA1c ≥ 7 %) glycaemic control at the time of catheterization. Results: Baseline characteristics and coronary endothelial function parameters differed significantly between T2DM patients and the control group. The prevalence of endothelium-independent CMD (29.8 vs. 39.6 %, p = 0.40) and of endothelium-dependent CMD (61.7 vs. 62.2 %, p = 1.00) was similar in patients with optimal vs. suboptimal glycaemic control. Age (OR 1.10; 95 % CI 1.04–1.18; p < 0.001) and female gender (OR 3.87; 95 % CI 1.45–11.4; p < 0.01) were significantly associated with endothelium-independent CMD, whereas glomerular filtration rate (OR 0.97; 95 % CI 0.95–0.99; p < 0.05) was significantly associated with endothelium-dependent CMD. Optimal glycaemic control was not associated with endothelium-independent (OR 0.60, 95 % CI 0.23–1.46; p = 0.26) or endothelium-dependent CMD (OR 0.99, 95 % CI 0.43–2.24; p = 0.98). Conclusions: The current HbA1c target does not predict better coronary microcirculatory function in T2DM patients. The appropriate strategy for prevention of CMD in T2DM patients remains to be established. Keywords: Endothelial dysfunction; Diabetes mellitus; Coronary microcirculation

Relevance: 100.00%

Abstract:

Stylized facts regarding the industrial process include emphases on obtaining information about and control over the quality of raw materials. We provide a model that establishes conditions under which informed control involves ensuring uniformity in inputs and increased uniformity encourages more extensive processing. We show when the Boltzmann-Shannon entropy statistic is an appropriate measure of uniformity.
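
As an illustration, a minimal sketch of the entropy statistic applied to a batch of raw-material quality grades (hypothetical data): entropy is 0 bits for perfectly uniform input and log2(k) bits for k equally frequent grades, so lower values indicate greater uniformity.

    import math
    from collections import Counter

    def shannon_entropy(labels):
        """Shannon entropy (bits) of a sample of quality grades."""
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    print(shannon_entropy(["A"] * 98 + ["B"] * 2))    # near-uniform lot: ~0.14 bits
    print(shannon_entropy(["A", "B", "C", "D"] * 25)) # maximally mixed: 2.0 bits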

Relevance: 100.00%

Abstract:

The choice of suitable places for female mosquitoes to lay their eggs is a key factor in the survival of the immature stages (eggs and larvae), and this knowledge is especially important for the control of disease vectors. The selection of an oviposition site involves a set of chemical, visual, olfactory and tactile cues that act on the female before she lays her eggs, helping her locate adequate sites. The present paper reviews the literature on the main aspects of semiochemicals in mosquito oviposition, to aid understanding of their mechanisms and to assess their potential as a tool for the monitoring and control of the Culicidae.

Relevance: 100.00%

Abstract:

The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used for both industrial automation and business applications owing to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely the industrial automation industry behind PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike with CISC processors, the RISC processor-architecture business is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating-system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply systems based on vertically integrated stacks of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability on a very narrow customer base, thanks to strong technology-enabled customer lock-in and the customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation technology (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build that the industrial automation market will, in due course, face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes along which industrial automation could advance, taking into account the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance: 100.00%

Abstract:

OBJECTIVE: To provide information on the effects of alcohol and tobacco on laryngeal cancer and its subsites. METHODS: This was a case-control study conducted between 1992 and 2000 in northern Italy and Switzerland. A total of 527 cases of incident squamous-cell carcinoma of the larynx and 1297 hospital controls, frequency-matched with cases on age, sex, and area of residence, were included. Odds ratios (ORs) and corresponding 95% confidence intervals (CIs) were estimated using multiple logistic regression. RESULTS: In comparison with never smokers, ORs were 19.8 for current smokers and 7.0 for ex-smokers. The risk increased with the number of cigarettes (OR = 42.9 for ≥ 25 cigarettes/day) and with the duration of smoking (OR = 37.2 for ≥ 40 years). For alcohol, the risk increased with the number of drinks (OR = 5.9 for ≥ 56 drinks per week). Combined alcohol and tobacco consumption showed a multiplicative (OR = 177) rather than an additive risk. For current smokers and current drinkers the risk was higher for the supraglottis (ORs 54.9 and 2.6, respectively) than for the glottis (ORs 7.4 and 1.8) and other subsites (ORs 10.9 and 1.9). CONCLUSIONS: Our study shows that both cigarette smoking and alcohol drinking are independent risk factors for laryngeal cancer. Heavy consumption of alcohol and cigarettes produced a multiplicative, rather than additive, risk increase, possibly suggesting biological synergy.
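
The multiplicative-versus-additive distinction can be checked with simple arithmetic on the single-exposure estimates reported above: under a multiplicative model the expected joint OR is the product of the separate ORs, while under an additive model it is roughly their sum minus one.

    or_smoking = 19.8   # current smokers vs. never smokers
    or_alcohol = 5.9    # >= 56 drinks/week vs. light drinking
    or_joint = 177.0    # observed for combined heavy exposure

    print(or_smoking * or_alcohol)      # multiplicative expectation: ~116.8
    print(or_smoking + or_alcohol - 1)  # additive expectation: ~24.7
    # The observed joint OR (177) far exceeds the additive expectation and
    # even the multiplicative one, consistent with biological synergy.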

Relevance: 100.00%

Abstract:

Understanding the complexity of cancer depends on an elucidation of the underlying regulatory networks, at the cellular and intercellular levels and in their temporal dimension. This Opinion article focuses on the multilevel crosstalk between the Notch pathway and the p53 and p63 pathways. These two coordinated signalling modules are at the interface of external damaging signals and control of stem cell potential and differentiation. Positive or negative reciprocal regulation of the two pathways can vary with cell type and cancer stage. Therefore, selective or combined targeting of the two pathways could improve the efficacy and reduce the toxicity of cancer therapies.

Relevance: 100.00%

Abstract:

Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatological studies. As historians of the Earth, "reconstructers" try to decipher its past. Since they have known that continents move, geologists have been trying to retrace their evolution through the ages. If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly across the oceanic realm but are included in a larger set comprising, all at once, continental and oceanic crust: the tectonic plates. Unfortunately, for historical as well as technical reasons, this idea has still not received a sufficient echo in the reconstruction community. However, we are intimately convinced that, by applying specific methods and principles, we can escape the traditional "Wegenerian" approach to, at last, reach real plate tectonics. The main aim of this work is to defend this point of view by exposing, with all necessary details, our methods and tools. Starting from the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a new methodology placing the tectonic plates and their kinematics at the centre of the problem.
Using assemblies of continents (referred to as "key assemblies") as anchor points distributed across the whole scope of our study (ranging from the Eocene back to the Cambrian), we develop geodynamic scenarios leading from one to the next, from the past towards the present. In between, the lithospheric plates are progressively reconstructed by adding/removing oceanic material (symbolized by synthetic isochrons) to the major continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements through the ages are the plate boundaries, which are preserved through time, follow a consistent geodynamic evolution, and always form an interconnected network through space. This "dynamic plate boundaries" approach integrates multiple factors, among them plate buoyancy, ridge spreading rates, subsidence patterns, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It thus offers good control on plate kinematics and provides severe constraints for the model. This multi-source approach requires efficient data organization and management. Prior to this study, the critical mass of necessary data had become a barely surmountable obstacle. GIS (Geographic Information Systems) and geodatabases are informatics tools specifically devoted to the management, storage and analysis of spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS we were able to convert this mass of scattered data into valuable geodynamic information easily accessible for the creation of the reconstructions. At the same time, with specially developed tools, we both facilitated the reconstruction work (task automation) and enhanced the model by greatly strengthening the kinematic control of plate motions through the creation of plate velocity models. Based on the 340 newly defined terranes, we developed a set of 35 reconstructions, each associated with its own velocity model. With this unique dataset we can now tackle major issues of modern geology, such as sea-level variations and climate change. We began by addressing another major (and not definitively resolved!) issue of modern tectonics: the mechanisms driving plate motions. We observed that, throughout the Earth's history, plate rotation poles (describing plate motions across the Earth's surface) tend to be distributed along a band going from the Northern Pacific through northern South America, the Central Atlantic, North Africa and Central Asia up to Japan. Basically, this distribution signifies that plates tend to escape this median plane. In the absence of a methodological bias that we have not identified, we interpreted this phenomenon as reflecting the secular influence of the Moon on plate motions. The oceanic realm is the cornerstone of our model, and we attached particular interest to reconstructing it in great detail. In this model, the oceanic crust is preserved from one reconstruction to the next. The crustal material is symbolized by synthetic isochrons whose ages we know. We also reconstructed the margins (active or passive), the mid-ocean ridges and the intra-oceanic subduction zones. Using this very detailed dataset, we were able to develop unique 3-D bathymetric models offering far better precision than previously existing ones.
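
For readers outside the field, plate motions such as those encoded in the velocity models above are conventionally expressed as finite rotations about an Euler pole. A minimal, illustrative sketch using Rodrigues' rotation formula (not the PaleoDyn implementation):

    import numpy as np

    def rotate(point_lonlat, pole_lonlat, angle_deg):
        """Rotate a point on the unit sphere about an Euler pole."""
        def to_xyz(lon, lat):
            lon, lat = np.radians([lon, lat])
            return np.array([np.cos(lat) * np.cos(lon),
                             np.cos(lat) * np.sin(lon),
                             np.sin(lat)])
        p = to_xyz(*point_lonlat)
        k = to_xyz(*pole_lonlat)   # rotation axis (unit vector)
        t = np.radians(angle_deg)
        # Rodrigues' formula: r = p cos t + (k x p) sin t + k (k.p)(1 - cos t)
        r = p * np.cos(t) + np.cross(k, p) * np.sin(t) + k * np.dot(k, p) * (1 - np.cos(t))
        return np.degrees(np.arctan2(r[1], r[0])), np.degrees(np.arcsin(r[2]))

    # Move a point 10 degrees about a hypothetical pole at (30E, 60N)
    print(rotate((0.0, 0.0), (30.0, 60.0), 10.0))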

Relevance: 100.00%

Abstract:

Intensifying international competition is forcing automation system manufacturers to adopt new methods for improving the performance and flexibility of their systems. Agent technology has been proposed for use alongside existing automation systems as a response to the new challenges facing automation. Agents are autonomous, social actors that carry out the tasks assigned to them, and they offer a uniform framework for implementing advanced functions. With agent technology, an automation system can be made to operate flexibly and with fault tolerance. This thesis explains the ideas and concepts of agent technology, examines its suitability for the development of complex control systems, and identifies potential applications for it in a plate mill. The thesis also discusses the ideas that have led to the use of agent technology in automation systems, and describes the structure and test results of an agent-assisted example application. As a result of the study, several applications for agent technology in the plate mill were identified. The example application shows agent technology to be well suited for implementing advanced functions in automation systems.
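
As a rough illustration of the agent concept summarized above, a minimal sketch with hypothetical names and tasks; production agent platforms add messaging, directory and lifecycle services on top of this.

    class Agent:
        """An autonomous actor that perceives, decides and acts on its assigned task."""
        def __init__(self, name, task):
            self.name, self.task = name, task

        def step(self, observation):
            return self.task(observation)

    # Two cooperating agents supervising a hypothetical production line
    monitor = Agent("monitor", lambda temp: "alarm" if temp > 90 else "ok")
    recovery = Agent("recovery", lambda msg: "reroute line" if msg == "alarm" else "idle")

    status = monitor.step(observation=95)  # temperature reading from the process
    print(recovery.step(status))           # -> "reroute line"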

Relevance: 100.00%

Abstract:

A 10-year experience with our automated molecular diagnostic platform, which carries out 91 different real-time PCR assays, is described. Progress and future perspectives in molecular diagnostic microbiology are reviewed: why automation is important; how our platform was implemented; how homemade PCRs were developed; and the advantages/disadvantages of homemade PCRs, including the critical aspects of troubleshooting and the need to further reduce the turnaround time for specific samples, at least in defined clinical settings such as emergencies. The future of molecular diagnosis depends on automation, and, in a novel perspective, it is now time to fully acknowledge the true contribution of molecular diagnostics and to reconsider the indications for PCR, also using these tests as first-line assays.

Relevance: 100.00%

Abstract:

Broadcasting systems are networks in which the transmission is received by several terminals. Broadcast receivers are generally passive devices in the network, meaning that they do not interact with the transmitter. Providing a certain Quality of Service (QoS) for the receivers in a heterogeneous reception environment with no feedback is not an easy task. Forward error control coding can be used for protection against transmission errors to enhance the QoS of broadcast services. For good performance in terrestrial wireless networks, diversity should be utilized, and diversity is obtained by applying interleaving together with the forward error correction codes. This dissertation studies the design and analysis of forward error control and control signaling for providing QoS in wireless broadcasting systems. Control signaling is used in broadcasting networks to give the receiver the necessary information on how to connect to the network itself and how to receive the services being transmitted. Usually control signaling is transmitted through a dedicated path in the system, so the relationship of the signaling and service data paths should be considered early in the design phase. Modeling and simulations are used in the case studies of this dissertation to study this relationship. The dissertation begins with a survey of the broadcasting environment and the mechanisms for providing QoS in it. Case studies then present the analysis and design of such mechanisms in real systems. The first case study analyzes mechanisms for providing QoS at the DVB-H link layer, considering the signaling and service data paths and their relationship; in particular, the performance of different service data decoding mechanisms and the selection of optimal signaling transmission parameters are presented. The second case study investigates the design of the signaling and service data paths for the more modern DVB-T2 physical layer. Furthermore, by comparing the performance of the signaling and service data paths through simulations, configuration guidelines for DVB-T2 physical layer signaling are given; these guidelines can prove useful when configuring DVB-T2 transmission networks. Finally, recommendations for the design of data and signaling paths are given based on the findings of the case studies. The requirements for the signaling design should be derived from the requirements for the main services, and they should generally be stricter, as signaling is the enabler of service reception.
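
As an aside on the interleaving mentioned above, a minimal block-interleaver sketch (illustrative only; the actual DVB-H and DVB-T2 interleavers are specified in the respective standards). Symbols are written into a matrix row by row and read out column by column, so a burst of channel errors is dispersed across many codewords:

    def interleave(symbols, rows, cols):
        """Write row-wise, read column-wise."""
        assert len(symbols) == rows * cols
        matrix = [symbols[r * cols:(r + 1) * cols] for r in range(rows)]
        return [matrix[r][c] for c in range(cols) for r in range(rows)]

    def deinterleave(symbols, rows, cols):
        return interleave(symbols, cols, rows)  # inverse: swap the dimensions

    data = list(range(12))
    tx = interleave(data, rows=3, cols=4)
    print(tx)                                        # [0, 4, 8, 1, 5, 9, ...]
    print(deinterleave(tx, rows=3, cols=4) == data)  # True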

Relevance: 100.00%

Abstract:

A centralized robust position controller for an electrically driven tooth belt drive is designed in this doctoral thesis. Both a cascaded control structure and a PID-based position controller are discussed. The performance and the limitations of the system are analyzed, and design principles for the mechanical structure and the control design are given. These design principles are also suitable for most motion control applications in which mechanical resonance frequencies and control loop delays are present. One of the major challenges in the design of a controller for machinery applications is that the values of the parameters in the system model (parametric uncertainty) or the system model itself (non-parametric uncertainty) are seldom known accurately in advance. In this thesis a systematic analysis of the parameter uncertainty of the linear tooth belt drive model is presented, and the effect of the variation of a single parameter on the performance of the total system is shown. The total variation of the model parameters is taken into account in the control design phase using Quantitative Feedback Theory (QFT). The thesis also introduces a new method for analyzing reference feedforward controllers using QFT. The performance of the designed controllers is verified by experimental measurements, which confirm the control design principles given in this thesis.
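
For context, a minimal discrete-time PID position loop of the kind referred to above, in textbook form with arbitrary gains on a unit-mass load; the thesis itself shapes its controller with QFT against the uncertain belt-drive model.

    class PID:
        """Discrete PID: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral, self.prev_error = 0.0, None

        def update(self, setpoint, measurement):
            error = setpoint - measurement
            self.integral += error * self.dt
            # No derivative term on the first call (avoids derivative kick).
            deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * deriv

    # Drive a unit-mass load toward position 1.0 (forward-Euler integration)
    pid = PID(kp=20.0, ki=5.0, kd=8.0, dt=0.001)
    pos, vel = 0.0, 0.0
    for _ in range(10000):                 # 10 s of simulated time
        force = pid.update(1.0, pos)
        vel += force * 0.001               # a = F/m with m = 1
        pos += vel * 0.001
    print(round(pos, 3))                   # settles toward the 1.0 setpoint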

Relevance: 100.00%

Abstract:

This paper studies the effect of time delay on the active non-linear control of dynamically loaded flexible structures. The behavior of non-linear systems under state feedback control, with a fixed time delay on the control force, is investigated. A control method based on non-linear optimal control, using a tensorial formulation and state feedback, is employed. The state equations and the control forces are expressed in polynomial form, and a performance index, quadratic in both the state vector and the control forces, is used. General polynomial representations of the non-linear control law are obtained and implemented for control algorithms up to the fifth order. This methodology is applied to systems with quadratic and cubic non-linearities. Strongly non-linear systems are tested, and the effectiveness of the control system including a delay in the application of the control forces is discussed. Numerical results indicate that the adopted control algorithm can be efficient for non-linear systems, chiefly in the presence of strong non-linearities, but that increasing the time delay reduces the efficiency of the control system. The numerical results emphasize the importance of considering time delay in the design of active structural control systems.
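
The destabilizing effect of actuation delay is easy to reproduce numerically. A minimal sketch with linear state feedback on a lightly damped oscillator and the delay implemented as a sample buffer; this is illustrative only, not the paper's tensorial non-linear controller.

    from collections import deque

    def residual(delay_steps, dt=0.001, t_end=10.0):
        """Oscillator x'' + 0.1 x' + x = u with delayed feedback u = -(4x + 2v)."""
        x, v = 1.0, 0.0
        buf = deque([0.0] * (delay_steps + 1), maxlen=delay_steps + 1)
        for _ in range(int(t_end / dt)):
            buf.append(-(4.0 * x + 2.0 * v))  # control computed now...
            u = buf[0]                        # ...but applied delay_steps later
            a = -x - 0.1 * v + u
            x, v = x + v * dt, v + a * dt
        return abs(x)

    print(residual(delay_steps=0))    # near zero: the delay-free loop converges
    print(residual(delay_steps=400))  # 0.4 s delay: far larger residual error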