37 results for Tidal power industry

at Université de Lausanne, Switzerland


Relevance: 30.00%

Abstract:

INTRODUCTION. Neurally Adjusted Ventilatory Assist (NAVA) is a new ventilatory mode in which ventilator settings are adjusted based on the electrical activity detected in the diaphragm (Eadi). This mode offers significant advantages over standard pressure support (PS) modes, since the ventilator input is determined directly by the patient's ventilatory demand. It is therefore expected that tidal volume (Vt) under NAVA would correlate better with Eadi than under PS, and would exhibit greater variability, reflecting the variability of the Eadi input to the ventilator. OBJECTIVES. To compare tidal volume variability in the PS and NAVA ventilation modes, and its correlation with patient ventilatory demand (as characterized by maximum Eadi). METHODS. A comparative study of patient-ventilator interaction was performed for 22 patients during standard PS with clinician-determined ventilator settings, and during NAVA, with the NAVA gain set to ensure the same peak airway pressure as the total pressure obtained in PS. A 20-min continuous recording was made in each ventilator mode. Respiratory rate, Vt and Eadi were recorded. Tidal volume variance and the Pearson correlation coefficient between Vt and Eadi were calculated for each patient. A periodogram was plotted for each ventilator mode and each patient, showing spectral power as a function of frequency, to assess variability. RESULTS. Median, lower-quartile and upper-quartile values for Vt variance and Vt/Eadi correlation are shown in Table 1. The NAVA cohort exhibits substantially greater correlation and variance than the PS cohort. Power spectra for Vt and Eadi are shown in Fig. 1 (PS and NAVA) for a typical patient. The enlarged section highlights how changes in Eadi are highly synchronized with NAVA ventilation, but less so with PS. CONCLUSIONS. With NAVA there is greater variability in tidal volume and greater correlation between tidal volume and diaphragmatic electrical activity than with PS. These results are consistent with the improved patient-ventilator synchrony reported in the literature.
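The per-patient analysis described above (variance, Pearson correlation, periodogram) can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' code: the function name, the breath-by-breath representation of the signals, and the sampling rate are all assumptions.

```python
# Hypothetical sketch of the per-patient analysis described above; the
# variable names (vt, eadi_peak) and the per-breath sampling are assumptions.
import numpy as np
from scipy.stats import pearsonr
from scipy.signal import periodogram

def analyze_patient(vt, eadi_peak, fs=1.0):
    """vt: breath-by-breath tidal volumes; eadi_peak: per-breath maximum Eadi;
    fs: sampling rate of the breath series (assumed 1 sample per breath)."""
    vt = np.asarray(vt, dtype=float)
    eadi_peak = np.asarray(eadi_peak, dtype=float)
    vt_variance = vt.var(ddof=1)            # tidal volume variance
    r, p = pearsonr(vt, eadi_peak)          # Vt/Eadi Pearson correlation
    freqs, power = periodogram(vt, fs=fs)   # spectral power vs. frequency
    return vt_variance, r, p, freqs, power
```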

Relevance: 30.00%

Abstract:

The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because they cost significantly less than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, industrial automation, embodied in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communication networks and hand-held devices. Already in the 1990s it was foreseen that IT and telecommunications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike CISC processors, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other kind of computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating-system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-per-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact through the SoC-based disruptions and the related software platforms in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability on a very narrow customer base, thanks to strong technology-enabled customer lock-in and to customers' high risk exposure, since production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels previously occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build for the industrial automation market to face, in due course, an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition among incumbents: first, through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development; second, through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry's actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes along which industrial automation could advance, taking into account the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and the related changes in control points and business models. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance: 30.00%

Abstract:

This article examines the role played by international technical standards in the globalisation of service activities. Various economic approaches consider the specific characteristics of service activities to be an obstacle to their offshoring, their industrialisation and their standardisation. In contrast to these approaches centred on the specificities of service activities, international political economy approaches emphasise the conflicting configurations of power at work in the internationalisation of service activities, beyond sectoral and national boundaries. The article examines the case of the call-centre sector and, more generally, of business process outsourcing (BPO) in India. Our findings suggest that technical standards matter in the sector studied, even though these types of services are conventionally identified as unlikely to be subject to standards. A political economy perspective on the standardisation of service activities highlights how questions of power invest technical standardisation with a more progressive dimension through the themes of the "worker", the "consumer" and the "environment".

This paper explores the role of international standards in the much-debated globalisation of the service economy. Various strands of economic analysis consider that core attributes of services affect their ability to be reliably delocalised, industrialised and standardised. In contrast, international political economy approaches draw attention to power configurations supporting conflicting uses of standards across industries and nations. The paper examines the case of the rising Indian service industry in customer centres and business process outsourcing to probe these opposing views. Our findings suggest that standards matter in types of services that conventional economic analyses identify as unlikely to be standardised, and that the standards used in the Indian BPO industry are widely accepted. Despite little conflict in actual definitions of market requirements, an international political economy perspective on service standardisation highlights the importance of potential power issues related to workers', consumers' and environmental concerns likely to be included in more progressive forms of standardisation.

Relevance: 20.00%

Abstract:

OBJECTIVE: To provide an update to the original Surviving Sepsis Campaign clinical management guidelines, "Surviving Sepsis Campaign Guidelines for Management of Severe Sepsis and Septic Shock," published in 2004. DESIGN: Modified Delphi method with a consensus conference of 55 international experts, several subsequent meetings of subgroups and key individuals, teleconferences, and electronic-based discussion among subgroups and among the entire committee. This process was conducted independently of any industry funding. METHODS: We used the Grades of Recommendation, Assessment, Development and Evaluation (GRADE) system to guide assessment of quality of evidence from high (A) to very low (D) and to determine the strength of recommendations. A strong recommendation (1) indicates that an intervention's desirable effects clearly outweigh its undesirable effects (risk, burden, cost) or clearly do not. Weak recommendations (2) indicate that the tradeoff between desirable and undesirable effects is less clear. The grade of strong or weak is considered of greater clinical importance than a difference in letter level of quality of evidence. In areas without complete agreement, a formal process of resolution was developed and applied. Recommendations are grouped into those directly targeting severe sepsis, recommendations targeting general care of the critically ill patient that are considered high priority in severe sepsis, and pediatric considerations. RESULTS: Key recommendations, listed by category, include: early goal-directed resuscitation of the septic patient during the first 6 hrs after recognition (1C); blood cultures before antibiotic therapy (1C); imaging studies performed promptly to confirm potential source of infection (1C); administration of broad-spectrum antibiotic therapy within 1 hr of diagnosis of septic shock (1B) and severe sepsis without septic shock (1D); reassessment of antibiotic therapy with microbiology and clinical data to narrow coverage, when appropriate (1C); a usual 7-10 days of antibiotic therapy guided by clinical response (1D); source control with attention to the balance of risks and benefits of the chosen method (1C); administration of either crystalloid or colloid fluid resuscitation (1B); fluid challenge to restore mean circulating filling pressure (1C); reduction in the rate of fluid administration with rising filling pressures and no improvement in tissue perfusion (1D); vasopressor preference for norepinephrine or dopamine to maintain an initial target of mean arterial pressure ≥ 65 mm Hg (1C); dobutamine inotropic therapy when cardiac output remains low despite fluid resuscitation and combined inotropic/vasopressor therapy (1C); stress-dose steroid therapy given only in septic shock after blood pressure is identified to be poorly responsive to fluid and vasopressor therapy (2C); and recombinant activated protein C in patients with severe sepsis and clinical assessment of high risk for death (2B, except 2C for postoperative patients).
In the absence of tissue hypoperfusion, coronary artery disease, or acute hemorrhage, target a hemoglobin of 7-9 g/dL (1B); a low tidal volume (1B) and limitation of inspiratory plateau pressure strategy (1C) for acute lung injury (ALI)/acute respiratory distress syndrome (ARDS); application of at least a minimal amount of positive end-expiratory pressure in acute lung injury (1C); head-of-bed elevation in mechanically ventilated patients unless contraindicated (1B); avoiding routine use of pulmonary artery catheters in ALI/ARDS (1A); to decrease days of mechanical ventilation and ICU length of stay, a conservative fluid strategy for patients with established ALI/ARDS who are not in shock (1C); protocols for weaning and sedation/analgesia (1B); using either intermittent bolus sedation or continuous infusion sedation with daily interruptions or lightening (1B); avoidance of neuromuscular blockers, if at all possible (1B); institution of glycemic control (1B), targeting a blood glucose < 150 mg/dL after initial stabilization (2C); equivalency of continuous veno-venous hemofiltration and intermittent hemodialysis (2B); prophylaxis for deep vein thrombosis (1A); use of stress ulcer prophylaxis to prevent upper gastrointestinal bleeding using H2 blockers (1A) or proton pump inhibitors (1B); and consideration of limitation of support where appropriate (1D). Recommendations specific to pediatric severe sepsis include: greater use of physical examination therapeutic end points (2C); dopamine as the first drug of choice for hypotension (2C); steroids only in children with suspected or proven adrenal insufficiency (2C); and a recommendation against the use of recombinant activated protein C in children (1B). CONCLUSIONS: There was strong agreement among a large cohort of international experts regarding many level 1 recommendations for the best current care of patients with severe sepsis. Evidence-based recommendations regarding the acute management of sepsis and septic shock are the first step toward improved outcomes for this important group of critically ill patients.

Relevance: 20.00%

Abstract:

Locating new wind farms is of crucial importance for the energy policies of the next decade. To select new locations, an accurate picture of the wind fields is necessary. However, characterizing wind fields is a difficult task: the phenomenon is highly nonlinear and tied to complex topographical features. In this paper, we propose both a nonparametric model to estimate wind speed at different time instants and a procedure to discover underrepresented topographic conditions, where new measuring stations could be added. Compared to space-filling techniques, the latter approach privileges optimization of the output space, locating potential new measuring sites through the uncertainty of the model itself.
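As an illustration of the site-selection idea, the sketch below uses a Gaussian process as a stand-in for the paper's nonparametric wind-speed model and ranks candidate locations by the model's own predictive uncertainty. The function name, the feature representation, and the choice of kernel are assumptions, not the authors' method.

```python
# Illustrative sketch only: a Gaussian process stands in for the paper's
# nonparametric model; candidate sites are ranked by predictive uncertainty.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def propose_new_stations(X_train, y_train, X_candidates, n_new=5):
    """X_train: topographic features of existing stations;
    y_train: measured wind speeds; X_candidates: features of candidate
    locations. Returns indices of the most uncertain candidates."""
    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
    gp.fit(X_train, y_train)
    _, std = gp.predict(X_candidates, return_std=True)
    # Highest predictive standard deviation marks the most under-represented
    # topographic conditions, where a new station is most informative.
    return np.argsort(std)[::-1][:n_new]
```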

Relevance: 20.00%

Abstract:

Neurally adjusted ventilatory assist (NAVA) is a ventilation assist mode that delivers pressure in proportion to the electrical activity of the diaphragm (Eadi). Compared to pressure support ventilation (PS), it improves patient-ventilator synchrony and should allow better expression of the patient's intrinsic respiratory variability. We hypothesize that NAVA provides better matching of ventilator tidal volume (Vt) to the patient's inspiratory demand. Twenty-two patients with acute respiratory failure, ventilated with PS, were included in the study. A comparative study was carried out between PS and NAVA, with the NAVA gain ensuring the same peak airway pressure as PS. Robust coefficients of variation (CVR) for Eadi and Vt were compared for each mode. The integral of Eadi (∫Eadi) was used to represent the patient's inspiratory demand. To evaluate the matching between tidal volume and patient demand, Range90, the 5-95% range of the Vt/∫Eadi ratio, was calculated to normalize and compare differences in demand within and between patients and modes. In this study, peak Eadi and ∫Eadi were correlated, with median correlation coefficients R > 0.95. Median ∫Eadi, Vt, neural inspiratory time (Ti_neural), inspiratory time (Ti) and peak inspiratory pressure (PIP) were similar in PS and NAVA, although individual patients had higher or lower values of ∫Eadi, Vt, Ti_neural, Ti and PIP in one mode or the other. CVR analysis showed greater Vt variability for NAVA (p < 0.005). Range90 was lower for NAVA than for PS in 21 of 22 patients. NAVA thus provided better matching of Vt to ∫Eadi in 21 of 22 patients, as well as greater Vt variability. These results held regardless of differences in ventilatory demand (Eadi) between patients and modes.
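A minimal sketch of the two summary statistics named above. The abstract does not give the exact definition of the robust coefficient of variation; interquartile range divided by median is one common robust choice and is used here purely as an assumption, as are all function and variable names.

```python
# Hedged sketch of the two per-patient statistics described in the abstract.
import numpy as np

def robust_cv(x):
    """Robust coefficient of variation: IQR / median (assumed definition)."""
    q25, q50, q75 = np.percentile(x, [25, 50, 75])
    return (q75 - q25) / q50

def range90(vt, eadi_integral):
    """Range90: spread between the 5th and 95th percentiles of the
    per-breath Vt / integral-of-Eadi ratio."""
    ratio = np.asarray(vt, float) / np.asarray(eadi_integral, float)
    p5, p95 = np.percentile(ratio, [5, 95])
    return p95 - p5
```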

Relevance: 20.00%

Abstract:

Introduction: In my thesis I argue that economic policy is all about economics and politics. Consequently, analysing and understanding economic policy ideally has at least two parts. The economics part is centred on the expected impact of a specific policy on the real economy, in terms of both efficiency and equity; its insights indicate the direction in which the fine-tuning of economic policies should go. However, fine-tuning of economic policies will most likely be subject to political constraints. That is why, in the politics part, a much better understanding can be gained by taking into account how the incentives of politicians and special interest groups, as well as the role played by different institutional features, affect the formation of economic policies. The first part and chapter of my thesis concentrates on the efficiency-related impact of economic policies: how does corporate income taxation in general, and corporate income tax progressivity in particular, affect the creation of new firms? Reduced progressivity and flat-rate taxes are in vogue. By 2009, 22 countries were operating flat-rate income tax systems, as were 7 US states and 14 Swiss cantons (for corporate income only). Tax reform proposals in the spirit of the "flat tax" model typically aim to reduce three parameters: the average tax burden, the progressivity of the tax schedule, and the complexity of the tax code. In joint work, Marius Brülhart and I explore the implications of changes in these three parameters for entrepreneurial activity, measured by counts of firm births in a panel of Swiss municipalities. Our results show that lower average tax rates and reduced complexity of the tax code promote firm births. Controlling for these effects, reduced progressivity inhibits firm births. Our reading of these results is that tax progressivity has an insurance effect that facilitates entrepreneurial risk taking. The positive effects of lower tax levels and reduced complexity are estimated to be significantly stronger than the negative effect of reduced progressivity. To the extent that firm births reflect desirable entrepreneurial dynamism, it is not the flattening of tax schedules that is key to successful tax reforms, but the lowering of average tax burdens and the simplification of tax codes. Flatness per se is of secondary importance and even appears to be detrimental to firm births. The second part of my thesis, corresponding to the second and third chapters, concentrates on how economic policies are formed. By the nature of the analysis, these two chapters draw on a broader literature than the first chapter. Both economists and political scientists have done extensive research on how economic policies are formed, and researchers in both disciplines have recognised the importance of special interest groups trying to influence policy-making through various channels. In general, economists base their analysis on a formal and microeconomically founded approach while abstracting from institutional details. In contrast, political scientists' frameworks are generally richer in institutional features but lack the theoretical rigour of economists' approaches. I start from the economist's point of view, but I borrow as much as possible from the findings of political science to gain a better understanding of how economic policies are formed in reality.
In the second chapter, I take a theoretical approach and focus on the institutional policy framework to explore how interactions between different political institutions affect the outcome of trade policy in the presence of lobbying by special interest groups. Standard political economy theory treats the government as a single institutional actor which sets tariffs by trading off social welfare against contributions from special interest groups seeking industry-specific protection from imports. However, these models lack important institutional features of reality. In my model, therefore, I split the government into a legislative and an executive branch, both of which can be lobbied by special interest groups. Furthermore, the legislature has the option to delegate its trade policy authority to the executive, and I allow the executive to compensate the legislature in exchange for delegation. Despite ample anecdotal evidence, bargaining over the delegation of trade policy authority has not yet been formally modelled in the literature. I show that delegation affects policy formation in that it leads to lower equilibrium tariffs compared to a standard model without delegation. I also show that delegation will only take place if the lobby is not strong enough to prevent it. Furthermore, the option to delegate increases the bargaining power of the legislature at the expense of the lobbies. The findings of this model can therefore shed light on why the U.S. Congress often delegates to the executive. In the final chapter of my thesis, my coauthor, Antonio Fidalgo, and I take a narrower approach and focus on the level of the individual politician to explore how connections to private firms and networks within parliament affect politicians' decision-making. Theories in the spirit of the model of the second chapter show how campaign contributions from lobbies to politicians can influence economic policies, and there is an abundant empirical literature analysing ties between firms and politicians based on campaign contributions. However, the evidence on the impact of campaign contributions is mixed, at best. In our paper, we analyse an alternative channel of influence: personal connections between politicians and firms through board membership. We identify a direct effect of board membership on individual politicians' voting behaviour and an indirect leverage effect when politicians with board connections influence non-connected peers. We assess the importance of these two effects using a vote in the Swiss parliament on a government bailout of the national airline, Swissair, in 2001, which serves as a natural experiment. We find that both the direct effect of connections to firms and the indirect leverage effect had a strong, positive impact on the probability that a politician supported the government bailout.

Relevance: 20.00%

Abstract:

The μ-calculus is an extension of modal logic with fixed-point operators. In this work we study the complexity of certain fragments of this logic from two different but closely related points of view: one syntactic (or combinatorial), the other topological. From the syntactic point of view, the properties definable in this formalism are classified according to the combinatorial complexity of the formulas of the logic, that is, according to the number of alternations of fixed-point operators. Comparing two sets of models thus amounts to comparing the syntactic complexity of the associated formulas. From the topological point of view, the properties definable in this logic are compared by means of continuous reductions, or according to their positions in the Borel or projective hierarchies. In the first part of this work we adopt the syntactic point of view in order to study the behaviour of the μ-calculus on restricted classes of models. In particular we show that: (1) on the class of symmetric and transitive models, the μ-calculus is as expressive as modal logic; (2) on the class of transitive models, every property definable by a μ-calculus formula is definable by a formula without alternation of fixed points; (3) on the class of reflexive models, for every n there is a property that can only be defined by a μ-calculus formula with at least n alternations of fixed points; (4) on the class of well-founded and transitive models, the μ-calculus is as expressive as modal logic. The fact that the μ-calculus is as expressive as modal logic on the class of well-founded and transitive models is well known; it is in fact a consequence of a fixed-point theorem proved independently by De Jongh and Sambin in the mid-1970s. The proof we give of the collapse of the expressive power of the μ-calculus on this class of models is nevertheless independent of that result. We then extend the language of the μ-calculus by allowing fixed-point operators to bind negative occurrences of free variables. By showing that this formalism is as expressive as the modal fragment, we are able to provide a new proof of the uniqueness-of-fixed-points theorem of Bernardi, De Jongh and Sambin, and a constructive proof of the existence theorem of De Jongh and Sambin. As far as transitive models are concerned, this time from the topological point of view, we prove that modal logic corresponds to the Borel fragment of the μ-calculus on this class of transition systems. In other words, we verify that every definable property of transitive models which, from the topological point of view, is a Borel property is necessarily a modal property, and conversely. This characterisation of the modal fragment follows from the fact that we are able to show that, modulo EF-bisimulation, a set of trees is definable in the temporal logic EF if and only if it is Borel. Since these two properties can be shown to coincide with an effective characterisation of definability in the logic EF for finitely branching trees given by Bojanczyk and Idziaszek [24], we obtain their decidability as a corollary. In a second part, we study the topological complexity of a sub-fragment of the alternation-free fragment of the μ-calculus.
We show that a set of trees is definable by a formula of this fragment with at least n alternations if and only if this property lies at least at the n-th level of the Borel hierarchy. In other words, we verify that for this fragment of the μ-calculus the topological and combinatorial points of view coincide. Moreover, we describe an effective procedure that computes, for every property definable in this language, its position in the Borel hierarchy, and hence the number of fixed-point alternations needed to define it. We then turn to the classification of sets of trees by continuous reduction, and give an effective description of the Wadge order on the class of sets of trees definable in the formalism under consideration. In particular, the hierarchy we obtain has height (ω^ω)^ω. We complete these results by describing an algorithm that computes the position in this hierarchy of every definable property.
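For orientation, the syntax of the modal μ-calculus referred to throughout can be written as follows; this is the standard textbook grammar, not notation specific to this thesis.

```latex
% Standard modal mu-calculus grammar; p is a proposition, X a fixpoint variable.
\[
  \varphi \;::=\; p \;\mid\; X \;\mid\; \neg\varphi \;\mid\; \varphi \land \varphi
  \;\mid\; \Diamond\varphi \;\mid\; \mu X.\,\varphi \;\mid\; \nu X.\,\varphi
\]
% In \mu X.\varphi and \nu X.\varphi the variable X must occur positively
% (under an even number of negations); the extension mentioned above is
% precisely the relaxation of this restriction to negative occurrences.
```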

Relevance: 20.00%

Abstract:

Power is a fundamental force in social relationships and is pervasive throughout various types of interactions. Although research has shown that the possession of power can change the powerholder, the full extent of power's consequences for individuals' decision-making capabilities and social interactions within organizations is not fully understood. The goal of this paper is to review, synthesize, and critique the literature on power, with a focus on its organizational and managerial implications. Specifically, we propose a definition of power that takes into account its three defining characteristics (having the discretion and the means to enforce one's will), and we summarize the extant literature on how power influences individuals' thoughts, emotions, and actions, in terms of both prosocial and antisocial outcomes. In addition, we highlight important moderators of power and describe ways in which it can be studied more rigorously, examining methodological issues and pitfalls in its measurement and manipulation. We also provide future research directions to motivate and guide the study of power by management scholars. Our aim is to present a thorough and parsimonious account of power's influence on individuals within an organizational context, and to provide a foundation that scholars can build upon as they continue to make consequential contributions to the study of power.