983 results for mechanism design


Relevance:

30.00%

Publisher:

Abstract:

The financial and economic crisis has hit Europe in its core. While the crisis may not have originated in the European Union, it has laid bare structural weaknesses in the EU’s policy framework. Both public finances and the banking sector have been heavily affected. For a long time, the EU failed to take sufficient account of the perverse link that existed between the two. Negative evolutions in one field of the crisis often dragged the other along in its downward spiral. In June 2012, in the early hours of yet another EU Summit, the leaders of the eurozone finally decided to address the link between the banking and sovereign debt crises. Faced with soaring public borrowing costs in Spain and Italy, they decided to allow for the direct European recapitalisation of banks when the Member State itself would no longer be in a position to do so. In exchange, supervision of the banking sector would be lifted to the European level by means of a Single Supervisory Mechanism. The Single Supervisory Mechanism, or SSM in the EU jargon, is a first step in the broader revision of policies towards banks in Europe. The eventual goal is the creation of a Banking Union, which is to carry out effective surveillance and – if needed – crisis management of the banking sector. The SSM is to rely on national supervisors and the ECB, with the ECB having final authority on the matter. The involvement of the latter made it clear that the SSM would be centred on the eurozone, while remaining open to other Member States willing to join. Due to the ongoing problems and the link between the creation of the SSM and the recapitalisation of banks, the SSM became one of the key legislative priorities of the EU. In December 2012, Member States reached an agreement on the design of the SSM. After discussions with the European Parliament (which were still ongoing at the time of writing), the process towards making the SSM operational can be initiated.
The goal is to have the SSM fully up and running in the first half of 2014. The decisions taken in June 2012 are likely to have had a bigger impact than the eurozone’s Heads of State and Government could have realised at the time, for two important reasons. On the one hand, creating the SSM necessitates a full Banking Union and therefore shared risk. On the other hand, the decisions improved the ECB’s perception of the willingness of governments to take far-reaching measures. This undoubtedly played a significant role in the creation of the Outright Monetary Transactions programme by the ECB, which has led to a substantial easing of the crisis in the short term. These short-term gains should now be matched with a stable long-term framework for bank supervision and crisis management. The agreement on the SSM should be the first step towards this goal. This paper provides an analysis of the SSM and its role in the creation of a Banking Union. The paper starts with a reminder of why the EU decided to put the SSM in place (§1) and the state of play of the ongoing negotiations on the SSM (§2). Subsequently, the supervisory responsibilities of the SSM are detailed, including its scope and the division of labour between the national supervisors and the ECB (§3). The internal functioning of the SSM (§4) and its relation to the other supervisors (§5) are discussed afterwards. As mentioned earlier, the SSM is part of a wider move towards a Banking Union. This paper therefore also sheds light on the other building blocks of this ambitious project (§6). The transition towards the Banking Union is important and will prove to be a bumpy ride. Before formulating a number of conclusions, this Working Paper therefore provides an overview of the planned road ahead (§7).

Relevance:

30.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

30.00%

Publisher:

Abstract:

"SAE J795."

Relevance:

30.00%

Publisher:

Abstract:

On cover: SAE standards. TR-9.

Relevance:

30.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

30.00%

Publisher:

Abstract:

Translation of Zur graphischen Statik der Maschinengetriebe.

Relevance:

30.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

30.00%

Publisher:

Abstract:

Living radical polymerization has allowed complex polymer architectures to be synthesized in bulk, solution, and water. The most versatile of these techniques is reversible addition-fragmentation chain transfer (RAFT), which allows a wide range of functional and nonfunctional polymers to be made with predictable molecular weight distributions (MWDs), ranging from very narrow to quite broad. Owing to the great complexity of the RAFT mechanism, how the kinetic parameters affect the rate of polymerization and the MWD is not obvious. The aim of this article is therefore to provide useful insights into the important kinetic parameters that control the rate of polymerization and the evolution of the MWD with conversion. We discuss how a change in the chain-transfer constant can affect the evolution of the MWD. It is shown how we can, in principle, use only one RAFT agent to obtain a polymer with any MWD. Retardation and inhibition are discussed in terms of (1) the reactivity of the leaving R group and (2) the intermediate radical termination model versus the slow fragmentation model. (c) 2005 Wiley Periodicals, Inc.
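The qualitative effect the abstract describes, that a higher chain-transfer constant narrows the MWD, can be illustrated with a deliberately simplified Monte Carlo sketch (a toy model, not the authors' kinetic treatment): one chain is active at a time, and the per-event exchange probability `p_exchange` stands in for the chain-transfer constant. Fast exchange shares growth time evenly across chains and drives the dispersity towards 1.

```python
import random

def raft_toy_mwd(n_chains=500, monomer_events=50_000, p_exchange=0.5, seed=0):
    """Toy Monte Carlo of degenerative chain transfer in RAFT.

    At each event the single active chain either adds a monomer unit or
    (with probability p_exchange) hands activity to a randomly chosen
    dormant chain. High p_exchange mimics a high chain-transfer constant.
    Returns the dispersity Mw/Mn of the resulting chain-length distribution.
    """
    rng = random.Random(seed)
    lengths = [0] * n_chains
    active, added = 0, 0
    while added < monomer_events:
        if rng.random() < p_exchange:
            active = rng.randrange(n_chains)    # RAFT exchange event
        else:
            lengths[active] += 1                # propagation event
            added += 1
    mn = sum(lengths) / n_chains                        # number average
    mw = sum(L * L for L in lengths) / sum(lengths)     # weight average
    return mw / mn                                      # dispersity (PDI)
```

Running the sketch with fast exchange (`p_exchange=0.9`) gives a dispersity close to 1, whereas slow exchange (`p_exchange=0.01`) leaves many chains barely grown and the distribution broad, consistent with the trend discussed in the article.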

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an approach for the optimal design of a fully regenerative dynamic dynamometer using genetic algorithms. The proposed dynamometer system includes an energy storage mechanism to adaptively absorb the energy variations that follow the dynamometer transients. This minimises the power-electronics capacity required at the mains power supply grid, which then needs only to compensate for the losses. The overall dynamometer is a complex dynamic system, and its design is a multi-objective problem that requires advanced optimisation techniques such as genetic algorithms. The case study of designing and simulating the dynamometer system indicates that the genetic-algorithm-based approach is able to locate the best available solution in terms of both system performance and computational cost.
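As a hedged illustration of the kind of search involved (a minimal sketch, not the paper's actual formulation), the code below minimises a hypothetical scalarised design cost, standing in for a trade-off such as power-electronics rating versus storage size, with a simple real-coded genetic algorithm using truncation selection, arithmetic crossover, and uniform mutation.

```python
import random

def genetic_minimise(fitness, bounds, pop_size=40, generations=80,
                     mutation_rate=0.2, seed=1):
    """Minimise `fitness` over the box `bounds` with a simple real-coded GA."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]                      # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # arithmetic crossover
            for i, (lo, hi) in enumerate(bounds):
                if rng.random() < mutation_rate:          # uniform mutation
                    child[i] = rng.uniform(lo, hi)
            children.append(child)
        pop = elite + children                            # elitism: best survive
    return min(pop, key=fitness)

# Hypothetical scalarised design cost with a made-up optimum at (3, 1);
# in practice the two terms would be competing design objectives.
def design_cost(x):
    return (x[0] - 3.0) ** 2 + (x[1] - 1.0) ** 2

best = genetic_minimise(design_cost, bounds=[(-10.0, 10.0), (-10.0, 10.0)])
```

Because the elite half of the population is carried over unchanged, the best design found can only improve from generation to generation, which is what makes this style of search robust for complex, simulation-driven objectives.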

Relevance:

30.00%

Publisher:

Abstract:

The requirement for systems to continue operating satisfactorily in the presence of faults has led to the development of techniques for the construction of fault-tolerant software. This thesis addresses the problem of error detection and recovery in distributed systems which consist of a set of communicating sequential processes. A method is presented for the `a priori' design of conversations for this class of distributed system. Petri nets are used to represent the state and to solve state reachability problems for concurrent systems. The dynamic behaviour of the system can be characterised by a state-change table derived from the state reachability tree. Systematic conversation generation is possible by defining a closed boundary on any branch of the state-change table. Relating the state-change table to process attributes ensures that all necessary processes are included in the conversation. The method also ensures properly nested conversations. An implementation of the conversation scheme using the concurrent language occam is proposed. The structure of the conversation is defined using the special features of occam. The proposed implementation gives a structure which is independent of the application and of the number of processes involved. Finally, the integrity of inter-process communications is investigated. The basic communication primitives used in message-passing systems are seen to have deficiencies when applied to systems with safety implications. Using a Petri net model, a boundary for a time-out mechanism is proposed which will increase the integrity of a system that involves inter-process communications.
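The reachability analysis underlying the state-change table can be sketched as a breadth-first exploration of markings; the sketch and the two-process synchronisation example below are hypothetical illustrations, not taken from the thesis.

```python
from collections import deque

def reachable_markings(places, transitions, initial):
    """Breadth-first exploration of a Petri net's reachability set.

    `transitions` maps a name to a (consume, produce) pair of dicts keyed
    by place name. Markings are tuples of token counts, one per place;
    the net is assumed bounded so the exploration terminates.
    """
    index = {p: i for i, p in enumerate(places)}
    seen = {initial}
    edges = []                      # (marking, transition, next marking)
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for name, (consume, produce) in transitions.items():
            # a transition is enabled if every input place has enough tokens
            if all(m[index[p]] >= n for p, n in consume.items()):
                nxt = list(m)
                for p, n in consume.items():
                    nxt[index[p]] -= n
                for p, n in produce.items():
                    nxt[index[p]] += n
                nxt = tuple(nxt)
                edges.append((m, name, nxt))
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return seen, edges

# Hypothetical example: two processes synchronise over one communication.
places = ("P_ready", "Q_ready", "synced")
transitions = {"comm": ({"P_ready": 1, "Q_ready": 1}, {"synced": 2})}
seen, edges = reachable_markings(places, transitions, (1, 1, 0))
```

The `edges` list is exactly the raw material for a state-change table: each entry records which transition moves the system from one global state to another, from which conversation boundaries can then be drawn.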

Relevance:

30.00%

Publisher:

Abstract:

The absence of a definitive approach to the design of manufacturing systems signifies the importance of a control mechanism to ensure the timely application of relevant design techniques. To provide effective control, design development needs to be continually assessed in relation to the required system performance, which can only be achieved analytically through computer simulation. This technique provides the only method of accurately replicating the highly complex and dynamic interrelationships inherent within manufacturing facilities and realistically predicting system behaviour. Owing to these unique capabilities, the application of computer simulation should support and encourage a thorough investigation of all alternative designs, allowing attention to focus specifically on critical design areas and enabling continuous assessment of system evolution. To achieve this, system analysis needs to be efficient in terms of data requirements and of both speed and accuracy of evaluation. To provide an effective control mechanism, a hierarchical or multi-level modelling procedure has therefore been developed, specifying the appropriate degree of evaluation support necessary at each phase of design. An underlying assumption of the proposal is that evaluation is quick, easy, and allows models to expand in line with design developments. However, current approaches to computer simulation are totally inappropriate for supporting this hierarchical evaluation. Implementation of computer simulation through traditional approaches is typically characterized by a requirement for very specialist expertise, a lengthy model development phase, and a correspondingly high expenditure, resulting in very little and rather inappropriate use of the technique. Simulation, when used, is generally only applied to check or verify a final design proposal. Rarely is the full potential of computer simulation utilized to aid, support or complement the manufacturing system design procedure.
To implement the proposed modelling procedure, the concept of a generic simulator was therefore adopted: such systems require no specialist expertise, instead facilitating quick and easy model creation, execution and modification through simple data inputs. Generic simulators have previously tended to be too restricted, lacking the flexibility necessary to be generally applicable to manufacturing systems. Development of the ATOMS manufacturing simulator, however, has proven that such systems can be relevant to a wide range of applications, besides verifying the benefits of multi-level modelling.
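A flavour of how light a data-driven evaluation step can be is given by the sketch below: a single machine fed from a queue, specified entirely by two rate inputs and returning one performance measure. This is a generic illustration of simulation-based design assessment, not a description of the ATOMS simulator itself.

```python
import random

def simulate_station(arrival_rate, service_rate, n_jobs, seed=0):
    """Mean time a job spends at one machine with a queue, assuming
    exponential inter-arrival and service times (an M/M/1 station)."""
    rng = random.Random(seed)
    clock = 0.0      # arrival time of the current job
    free_at = 0.0    # time the machine next becomes free
    total = 0.0
    for _ in range(n_jobs):
        clock += rng.expovariate(arrival_rate)   # next job arrives
        start = max(clock, free_at)              # waits if the machine is busy
        free_at = start + rng.expovariate(service_rate)
        total += free_at - clock                 # time in system
    return total / n_jobs
```

Even this tiny model exhibits the dynamic interrelationships the abstract refers to: raising the arrival rate from 50% to 80% of the machine's capacity sharply increases the mean time in system, a non-linear effect that a static capacity calculation would miss.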

Relevance:

30.00%

Publisher:

Abstract:

Mobile technology has been one of the major growth areas in computing over recent years (Urbaczewski, Valacich, & Jessup, 2003). Mobile devices are becoming increasingly diverse and are continuing to shrink in size and weight. Although this increases the portability of such devices, their usability tends to suffer. Users report high levels of frustration regarding interaction with mobile technologies, fuelled almost entirely by lack of usability (Venkatesh, Ramesh, & Massey, 2003). This will only worsen if interaction design for mobile technologies does not continue to receive increasing research attention. For the commercial benefit of mobility and mobile commerce (m-commerce) to be fully realized, users’ interaction experiences with mobile technology cannot be negative. To ensure this, it is imperative that we design the right types of mobile interaction (m-interaction); an important prerequisite for this is ensuring that users’ experience meets both their sensory and functional needs (Venkatesh, Ramesh, & Massey, 2003). Given the resource disparity between mobile and desktop technologies, successful electronic commerce (e-commerce) interface design and evaluation does not necessarily equate to successful m-commerce design and evaluation. It is, therefore, imperative that the specific needs of m-commerce are addressed, both in terms of design and evaluation. This chapter begins by exploring the complexities of designing interaction for mobile technology, highlighting the effect of context on the use of such technology. It then goes on to discuss how interaction design for mobile devices might evolve, introducing alternative interaction modalities that are likely to affect that future evolution.
It is impossible, within a single chapter, to consider each and every potential mechanism for interacting with mobile technologies; to provide a forward-looking flavor of what might be possible, this chapter focuses on some of the more novel methods of interaction and does not, therefore, look at the typical keyboard- and visual display-based interaction which, in essence, stems from the desktop interaction design paradigm. Finally, this chapter touches on issues associated with the effective evaluation of m-interaction and mobile application designs. By highlighting some of the issues and possibilities for novel m-interaction design and evaluation, we hope that future designers will be encouraged to “think out of the box” in terms of their designs and evaluation strategies.

Relevance:

30.00%

Publisher:

Abstract:

Background: Bacterial endotoxin is a potently inflammatory antigen that is abundant in the human gut. Endotoxin circulates at low concentrations in the blood of all healthy individuals, although elevated concentrations are associated with an increased risk of atherosclerosis. Objective: We sought to determine whether a high-fat meal or smoking increases plasma endotoxin concentrations and whether such concentrations are of physiologic relevance. Design: Plasma endotoxin and endotoxin neutralization capacity were measured for 4 h in 12 healthy men after no meal, 3 cigarettes, a high-fat meal, or a high-fat meal with 3 cigarettes by using the Limulus assay. Results: Baseline endotoxin concentrations were 8.2 pg/mL (interquartile range: 3.4–13.5 pg/mL) but increased significantly (P < 0.05) by ≈50% after a high-fat meal or after a high-fat meal with cigarettes but not after no meal or cigarettes alone. These results were validated by the observations that a high-fat meal with or without cigarettes, but not no meal or smoking, also significantly (P < 0.05) reduced plasma endotoxin neutralization capacity, which is an indirect measure of endotoxin exposure. Human monocytes, but not aortic endothelial cells, were responsive to transient (30 s) or low-dose (10 pg/mL) exposure to endotoxin. However, plasma from whole blood treated with as little as 10 pg endotoxin/mL increased the endothelial cell expression of E-selectin, at least partly via tumor necrosis factor-α–induced cellular activation. Conclusions: Low-grade endotoxemia may contribute to the postprandial inflammatory state and could represent a novel potential contributor to endothelial activation and the development of atherosclerosis.

Relevance:

30.00%

Publisher:

Abstract:

As a new medium for questionnaire delivery, the internet has the potential to revolutionise the survey process. Online (web-based) questionnaires provide several advantages over traditional survey methods in terms of cost, speed, appearance, flexibility, functionality, and usability [1, 2]. For instance, delivery is faster, responses are received more quickly, and data collection can be automated or accelerated [1-3]. Online questionnaires can also provide many capabilities not found in traditional paper-based questionnaires: they can include pop-up instructions and error messages; they can incorporate links; and it is possible to encode difficult skip patterns, making such patterns virtually invisible to respondents. Like many new technologies, however, online questionnaires face criticism despite their advantages. Typically, such criticisms focus on the vulnerability of online questionnaires to the four standard survey error types: namely, coverage, non-response, sampling, and measurement errors. Although, like all survey errors, coverage error (“the result of not allowing all members of the survey population to have an equal or nonzero chance of being sampled for participation in a survey” [2, pg. 9]) also affects traditional survey methods, it is currently exacerbated in online questionnaires as a result of the digital divide. That said, many developed countries have reported substantial increases in computer and internet access and/or are targeting this as part of their immediate infrastructural development [4, 5]. These trends indicate that familiarity with information technologies is increasing, and suggest that coverage error will rapidly diminish to an acceptable level (for the developed world at least) in the near future, and in so doing positively reinforce the advantages of online-questionnaire delivery.
The second error type – the non-response error – occurs when individuals fail to respond to the invitation to participate in a survey or abandon a questionnaire before it is completed. Given today’s societal trend towards self-administration [2], the former is inevitable, irrespective of delivery mechanism. Conversely, non-response as a consequence of questionnaire abandonment can be relatively easily addressed. Unlike traditional questionnaires, the delivery mechanism for online questionnaires makes estimation of questionnaire length and of the time required for completion difficult, thus increasing the likelihood of abandonment. By incorporating a range of features into the design of an online questionnaire, it is possible to facilitate such estimation – and indeed, to provide respondents with context-sensitive assistance during the response process – and thereby reduce abandonment while eliciting feelings of accomplishment [6]. For online questionnaires, sampling error (“the result of attempting to survey only some, and not all, of the units in the survey population” [2, pg. 9]) can arise when all but a small portion of the anticipated respondent set is alienated (and so fails to respond) as a result of, for example, disregard for varying connection speeds, bandwidth limitations, browser configurations, monitors, hardware, and user requirements during the questionnaire design process. Similarly, measurement errors (“the result of poor question wording or questions being presented in such a way that inaccurate or uninterpretable answers are obtained” [2, pg. 11]) will lead to respondents becoming confused and frustrated. Sampling, measurement, and non-response errors are all likely to occur when an online questionnaire is poorly designed. Individuals will answer questions incorrectly, abandon questionnaires, and may ultimately refuse to participate in future surveys; thus, the benefits of online questionnaire delivery will not be fully realized.
To prevent errors of this kind, and their consequences, it is extremely important that practical, comprehensive guidelines exist for the design of online questionnaires. Many design guidelines exist for paper-based questionnaires (e.g. [7-14]); the same is not true for the design of online questionnaires [2, 15, 16]. The research presented in this paper is a first attempt to address this discrepancy. Section 2 describes the derivation of a comprehensive set of guidelines for the design of online questionnaires and briefly (given space restrictions) outlines the essence of the guidelines themselves. Although online questionnaires reduce traditional delivery costs (e.g. paper, mail-out, and data entry), set-up costs can be high given the need either to adopt, and acquire training in, questionnaire development software or to secure the services of a web developer. Neither approach, however, guarantees a good questionnaire (often because the person designing the questionnaire lacks relevant knowledge of questionnaire design). Drawing on existing software evaluation techniques [17, 18], we assessed the extent to which current questionnaire development applications support our guidelines; Section 3 describes the framework used for the evaluation, and Section 4 discusses our findings. Finally, Section 5 concludes with a discussion of further work.