968 results for Single Intraoperative Application


Relevance:

30.00%

Publisher:

Abstract:

Effective enforcement of and compliance with EU law is not just a legal necessity; it is also of economic interest, since only then can the potential of the Single Market be fully exploited. Enforcement barriers generate unjustified costs, hindrances and uncertainty for cross-border business, and may deprive consumers of the full benefit of greater choice and/or cheaper offers. The EU has developed several types of enforcement effort (preventive initiatives, pre-infringement initiatives and formal infringement procedures). More recently, the emphasis has been placed on effective prevention. This CEPS Policy Brief analyses the functioning of one preventive mechanism (the 98/34 Directive) and assesses its capacity to detect and prevent technical and other barriers over the course of the last 25 years. Taking an empirical approach, it shows that this mechanism has successfully prevented thousands of new technical barriers from arising in the internal goods market.

Relevance:

30.00%

Publisher:

Abstract:

Problems in the banking system are at the core of the current crisis. The establishment of a banking union is a necessary (though not sufficient) condition for eventual crisis resolution that respects the integrity of the euro. The European Commission’s proposal for the establishment of a Single Supervisory Mechanism and related reform of the European Banking Authority (EBA) do not and cannot create a fully-fledged banking union, but represent a broadly adequate step on the basis of the leaders’ declaration of 29 June 2012 and of the decision to use Article 127(6) of the treaty as legal basis. The proposal rightly endows the European Central Bank (ECB) with broad authority over banks within the supervisory mechanism’s geographical perimeter; however, the status of non-euro area member states willing to participate in this mechanism, and the governance and decision-making processes of the ECB in this respect, call for further elaboration. Further adjustments are also desirable in the proposed reform of the EBA, even though they must probably retain a stopgap character pending the more substantial review planned in 2014.

Relevance:

30.00%

Publisher:

Abstract:

The constant-density Charney model describes the simplest unstable basic state with a planetary-vorticity gradient, which is uniform and positive, and baroclinicity that is manifest as a negative contribution to the potential-vorticity (PV) gradient at the ground and positive vertical wind shear. Together, these ingredients satisfy the necessary conditions for baroclinic instability. In Part I it was shown how baroclinic growth on a general zonal basic state can be viewed as the interaction of pairs of ‘counter-propagating Rossby waves’ (CRWs) that can be constructed from a growing normal mode and its decaying complex conjugate. In this paper the normal-mode solutions for the Charney model are studied from the CRW perspective. Clear parallels can be drawn between the most unstable modes of the Charney model and the Eady model, in which the CRWs can be derived independently of the normal modes. However, the dispersion curves for the two models are very different; the Eady model has a short-wave cut-off, while the Charney model is unstable at short wavelengths. Beyond its maximum growth rate the Charney model has a neutral point at finite wavelength (r=1). Thereafter follows a succession of unstable branches, each with weaker growth than the last, separated by neutral points at integer r, the so-called ‘Green branches’. A separate branch of westward-propagating neutral modes also originates from each neutral point. By approximating the lower CRW as a Rossby edge wave and the upper CRW structure as a single PV peak with a spread proportional to the Rossby scale height, the main features of the ‘Charney branch’ (0 < r < 1) can be explained.
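The contrast drawn here with the Eady model's short-wave cut-off follows from the standard Eady dispersion relation. As a purely illustrative aside (not code from the paper, which treats the Charney model), a few lines of Python reproduce the textbook Eady growth-rate curve, using the usual nondimensional wavenumber mu = k*N*H/f and a growth rate scaled by f*Lambda/N, where Lambda is the vertical shear:

    import numpy as np

    # Nondimensional Eady growth rate sigma / (f * Lambda / N) versus
    # mu = k * N * H / f. Instability requires mu/2 < coth(mu/2), i.e.
    # mu below the short-wave cut-off mu_c ~ 2.40; the Charney model,
    # by contrast, remains unstable at arbitrarily short wavelengths.
    def eady_growth(mu):
        x = mu / 2.0
        disc = (1.0 / np.tanh(x) - x) * (x - np.tanh(x))
        return np.sqrt(np.maximum(disc, 0.0))  # zero (neutral) past the cut-off

    mu = np.linspace(0.01, 4.0, 400)
    sigma = eady_growth(mu)
    print("max growth %.3f at mu = %.2f" % (sigma.max(), mu[sigma.argmax()]))
    # -> maximum ~0.31 near mu ~ 1.6, vanishing beyond mu_c ~ 2.40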

Relevance:

30.00%

Publisher:

Abstract:

Bioturbation at all scales, which tends to replace the primary fabric of a sediment by the ichnofabric (the overall fabric of a sediment that has been bioturbated), is now recognised as playing a major role in facies interpretation. The manner in which the substrate may be colonized, and the physical, chemical and ecological controls (grainsize, sedimentation rate, oxygenation, nutrition, salinity, ethology, community structure and succession), together with the several ways in which the substrate is tiered by bioturbators, are the factors and processes that determine the nature of the ichnofabric. Eleven main styles of substrate tiering are described, ranging from single, pioneer colonization to complex tiering under equilibria, their modification under environmental deterioration and amelioration, and diagenetic enhancement or obscuration. Ichnofabrics may be assessed by four attributes: primary sedimentary factors, Bioturbation Index (BI), burrow size and frequency, and ichnological diversity. Construction of tier and ichnofabric constituent diagrams aids visualization and comparison. The breaks or changes in colonization and style of tiering at key stratal surfaces accentuate the surfaces, and many reflect a major environmental shift of the trace-forming biota, due to change in hydrodynamic regime (leading to non-deposition and/or erosion and/or lithification), change in salinity regime, or subaerial exposure. The succession of gradational or abrupt changes in ichnofabric through genetically related successions, together with changes in colonization and tiering across event beds, may also be interpreted in terms of changes in environmental parameters. It is not the ichnotaxa per se that are important in discriminating between ichnofabrics, but rather the environmental conditions that determine the overall style of colonization. Fabrics composed of different ichnotaxa (and different taphonomies) but similar tier structure and ichnoguild may form in similar environments of different age or different latitude. Appreciation of colonization and tiering styles places ancient ichnofabrics on a sound process-related basis for environmental interpretation. (C) 2002 Elsevier Science B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

The calculation of accurate and reliable vibrational potential functions and normal co-ordinates is discussed, for those simple polyatomic molecules for which it may be possible. Such calculations should be corrected for the effects of anharmonicity and of resonance interactions between the vibrational states, and should be fitted to all the available information on all isotopic species: particularly the vibrational frequencies, Coriolis zeta constants and centrifugal distortion constants. The difficulties of making these corrections, and of making use of the observed data, are reviewed. A programme for the Ferranti Mercury Computer is described by means of which harmonic vibration frequencies and normal co-ordinate vectors, zeta factors and centrifugal distortion constants can be calculated, from a given force field and from given G-matrix elements, etc. The programme has been used on up to 5 × 5 secular equations, for which a single calculation and output of results takes approximately 1 min; it can readily be extended to larger determinants. The best methods of using such a programme and the possibility of reversing the direction of calculation are discussed. The methods are applied to calculating the best possible vibrational potential function for the methane molecule, making use of all the observed data.
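The central computation described, harmonic frequencies and normal co-ordinate vectors from a force field and G-matrix elements, is the classical Wilson GF secular problem |GF - lambda*E| = 0. A minimal Python sketch of that step follows; the 2x2 F and G matrices are hypothetical placeholders (not the methane values fitted in the paper), and the factor 1302.8 converts eigenvalues in mdyn/(Angstrom*amu) to wavenumbers in cm^-1:

    import numpy as np

    # Wilson GF method: eigenvalues lambda of the product G F give the
    # harmonic wavenumbers nu = 1302.8 * sqrt(lambda) for F in mdyn/Angstrom
    # and G in 1/amu. Placeholder 2x2 matrices, for illustration only.
    F = np.array([[5.0, 0.2],
                  [0.2, 0.6]])    # force-constant matrix (mdyn/Angstrom)
    G = np.array([[1.1, -0.1],
                  [-0.1, 2.3]])   # inverse kinetic-energy matrix (1/amu)

    lam, L = np.linalg.eig(G @ F) # columns of L: normal-co-ordinate vectors
    nu = 1302.8 * np.sqrt(lam.real)
    print(np.sort(nu))            # harmonic frequencies in cm^-1

The zeta factors and centrifugal distortion constants mentioned in the abstract are then derived from the same normal-co-ordinate vectors L, so a single diagonalisation of this kind underlies all the quantities the programme outputs.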

Relevance:

30.00%

Publisher:

Abstract:

Capillary electrophoresis (CE) offers the analyst a number of key advantages for the analysis of the components of foods. CE offers better resolution than, say, high-performance liquid chromatography (HPLC), and is more adept at the simultaneous separation of a number of components of different chemistries within a single matrix. In addition, CE requires less rigorous sample cleanup procedures than HPLC, while offering the same degree of automation. However, despite these advantages, CE remains under-utilized by food analysts. Therefore, this review consolidates and discusses the currently reported applications of CE that are relevant to the analysis of foods. Some discussion is also devoted to the development of these reported methods and to the advantages/disadvantages compared with the more usual methods for each particular analysis. It is the aim of this review to give practicing food analysts an overview of the current scope of CE.

Relevance:

30.00%

Publisher:

Abstract:

Two-stage designs offer substantial advantages for early phase II studies. The interim analysis following the first stage allows the study to be stopped for futility, or more positively, it might lead to early progression to the trials needed for late phase II and phase III. If the study is to continue to its second stage, then there is an opportunity for a revision of the total sample size. Two-stage designs have been implemented widely in oncology studies in which there is a single treatment arm and patient responses are binary. In this paper the case of two-arm comparative studies in which responses are quantitative is considered. This setting is common in therapeutic areas other than oncology. It will be assumed that observations are normally distributed, but that there is some doubt concerning their standard deviation, motivating the need for sample size review. The work reported has been motivated by a study in diabetic neuropathic pain, and the development of the design for that trial is described in detail. Copyright (C) 2008 John Wiley & Sons, Ltd.
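The interim revision of the total sample size can be illustrated with the standard fixed-design formula for a two-arm comparison of normal responses, n = 2*sigma^2*(z_{1-alpha/2} + z_{power})^2 / delta^2 per arm. The Python sketch below is generic; the planning figures are invented and are not those of the diabetic neuropathic pain trial described in the paper:

    import math
    from scipy.stats import norm

    def per_arm_n(sigma, delta, alpha=0.05, power=0.9):
        # n = 2 * sigma^2 * (z_{1-alpha/2} + z_{power})^2 / delta^2
        z = norm.ppf(1.0 - alpha / 2.0) + norm.ppf(power)
        return math.ceil(2.0 * (sigma * z / delta) ** 2)

    # Planning stage: assumed SD 10 for a clinically relevant difference of 5.
    print(per_arm_n(sigma=10.0, delta=5.0))   # -> 85 per arm
    # Interim review: first-stage data suggest SD nearer 13, so n is revised.
    print(per_arm_n(sigma=13.0, delta=5.0))   # -> 143 per arm

A naive recalculation like this ignores the effect of the data-dependent review on the type I error rate, which is precisely the kind of issue a formal two-stage design must address.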

Relevance:

30.00%

Publisher:

Abstract:

Space applications are challenged by the reliability of parallel computing systems (FPGAs) employed in spacecraft, owing to Single-Event Upsets. The work reported in this paper aims to achieve self-managing systems that are reliable for space applications by applying autonomic computing constructs to parallel computing systems. A novel technique, 'Swarm-Array Computing', inspired by swarm robotics and built on the foundations of autonomic and parallel computing, is proposed as a path to achieving autonomy. The constitution of swarm-array computing, comprising four constituents, namely the computing system, the problem/task, the swarm and the landscape, is considered. Three approaches that bind these constituents together are proposed. The feasibility of one of the three proposed approaches is validated on the SeSAm multi-agent simulator, with landscapes representing the computing space and problem generated using MATLAB.

Relevance:

30.00%

Publisher:

Abstract:

Different types of mental activity are utilised as input in Brain-Computer Interface (BCI) systems. One such activity type is based on Event-Related Potentials (ERPs). The characteristics of ERPs are not visible in single trials, so averaging over a number of trials is necessary before the signals become usable. An improvement in ERP-based BCI operation and system usability could be obtained if the use of single-trial ERP data were possible. The method of Independent Component Analysis (ICA) can be utilised to separate single-trial recordings of ERP data into components that correspond to ERP characteristics, background electroencephalogram (EEG) activity, and other components of non-cerebral origin. Choosing specific components and using them to reconstruct "denoised" single-trial data could improve the signal quality, thus allowing the successful use of single-trial data without the need for averaging. This paper assesses single-trial ERP signals reconstructed using a selection of estimated components from the application of ICA to the raw ERP data. Signal improvement is measured using Contrast-To-Noise measures. It was found that such analysis improves the signal quality in all single trials.
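The reconstruction step, zeroing the unwanted components and inverting the unmixing, can be sketched with scikit-learn's FastICA. Everything below is a placeholder: the random array stands in for a recorded trial, and the kept-component list is hypothetical, whereas the paper selects components by their correspondence to ERP characteristics:

    import numpy as np
    from sklearn.decomposition import FastICA

    # Sketch: decompose one trial (channels x samples) into independent
    # components, keep only those judged to carry ERP activity, and
    # reconstruct a "denoised" single trial from the reduced set.
    rng = np.random.default_rng(0)
    trial = rng.standard_normal((32, 512))    # placeholder single-trial EEG

    ica = FastICA(n_components=16, random_state=0, max_iter=1000)
    sources = ica.fit_transform(trial.T)      # shape: (samples, components)

    keep = [0, 3, 7]                          # hypothetical ERP-related components
    reduced = np.zeros_like(sources)
    reduced[:, keep] = sources[:, keep]
    denoised = ica.inverse_transform(reduced).T   # back to channels x samples

The signal-quality gain of the denoised trial over the raw one would then be quantified with a contrast-to-noise measure, as in the paper.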