53 results for Patched-conic approximation


Relevance: 10.00%

Abstract:

Within the framework of a retrospective study of the incidence of hip fractures in the canton of Vaud (Switzerland), all cases of hip fracture occurring among the resident population in 1986 and treated in the hospitals of the canton were identified from among five different information sources. Relevant data were then extracted from the medical records. At least two sources of information were used to identify cases in each hospital, among them the statistics of the Swiss Hospital Association (VESKA). These statistics were available for 9 of the 18 hospitals in the canton that participated in the study. The number of cases identified from the VESKA statistics was compared to the total number of cases for each hospital. For the 9 hospitals the number of cases in the VESKA statistics was 407, whereas, after having excluded diagnoses that were actually "status after fracture" and double entries, the total for these hospitals was 392, that is 4% less than the VESKA statistics indicate. It is concluded that the VESKA statistics provide a good approximation of the actual number of cases treated in these hospitals, with a tendency to overestimate this number. In order to use these statistics for calculating incidence figures, however, it is imperative that a greater proportion of all hospitals (50% presently in the canton, 35% nationwide) participate in these statistics.


Introduction: Osteoporosis (OP) is a systemic skeletal disease characterized by low bone mineral density (BMD) and micro-architectural (MA) deterioration. Clinical risk factors (CRF) are often used as an approximation of MA. MA can now, however, be evaluated in daily practice with the Trabecular Bone Score (TBS), a novel grey-level texture measurement reflecting bone micro-architecture, based on experimental variograms of 2D projection images. TBS is very simple to obtain by reanalyzing a lumbar DXA scan, and has proven diagnostic and prognostic value, partially independent of CRF and BMD. The aim of the OsteoLaus cohort is to combine, in daily practice, the CRF and the information given by DXA (BMD, TBS and vertebral fracture assessment (VFA)) to better identify women at high fracture risk. Method: The OsteoLaus cohort (1,400 women aged 50 to 80 years living in Lausanne, Switzerland) started in 2010. It is derived from the CoLaus cohort, started in Lausanne in 2003, whose main goal is to obtain information on the epidemiology and genetic determinants of cardiovascular risk in 6,700 men and women. CRF for OP, bone ultrasound of the heel, lumbar spine and hip BMD, VFA by DXA and MA evaluation by TBS are recorded in OsteoLaus. Preliminary results are reported here. Results: We included 631 women: mean age 67.4 ± 6.7 y, BMI 26.1 ± 4.6, mean lumbar spine BMD 0.943 ± 0.168 (T-score -1.4 SD), TBS 1.271 ± 0.103. As expected, the correlation between BMD and site-matched TBS is low (r² = 0.16). The prevalence of VFx grade 2/3, major OP Fx and all OP Fx is 8.4%, 17.0% and 26.0%, respectively. Age- and BMI-adjusted ORs (per SD decrease) for these fracture categories are 1.8 (1.2-2.5), 1.6 (1.2-2.1) and 1.3 (1.1-1.6) for BMD, and 2.0 (1.4-3.0), 1.9 (1.4-2.5) and 1.4 (1.1-1.7) for TBS, respectively. Only 32 to 37% of women with an OP Fx have a BMD < -2.5 SD or a TBS < 1.200; combining the two criteria (BMD < -2.5 SD or TBS < 1.200) identifies 54 to 60% of women with an osteoporotic Fx. Conclusion: In line with previously published studies, these preliminary results confirm the partial independence between BMD and TBS. More importantly, adding TBS to BMD significantly increases the identification of women with prevalent OP Fx who would have been misclassified by BMD alone. For the first time we can obtain complementary information about fracture (VFA), density (BMD), and micro- and macro-architecture (TBS & HAS) from a single simple, inexpensive device with low ionizing radiation: DXA. Such complementary information is very useful for the patient in daily practice and will likely have an impact on cost-effectiveness analyses.


Reliable quantification of the macromolecule signals in short echo-time H-1 MRS spectra is particularly important at high magnetic fields for an accurate quantification of metabolite concentrations (the neurochemical profile), owing to the effectively increased spectral resolution of the macromolecule components. The purpose of the present study was to assess two approaches to quantification that take the contribution of macromolecules into account in the quantification step. H-1 spectra were acquired on a 14.1 T/26 cm horizontal scanner on five rats using the ultra-short echo-time SPECIAL (spin echo full intensity acquired localization) spectroscopy sequence. Metabolite concentrations were estimated using LCModel, combined with a simulated basis set of metabolites using published spectral parameters and either the spectrum of macromolecules measured in vivo, using an inversion recovery technique, or a baseline simulated by the built-in spline function. The fitted spline function resulted in a smooth approximation of the in vivo macromolecules but, in accordance with previous studies using Subtract-QUEST, could not completely reproduce all the features of the in vivo spectrum of macromolecules at 14.1 T. As a consequence, the measured macromolecular 'baseline' led to a more accurate and reliable quantification at higher field strengths.


This thesis investigates the extremal properties of certain risk models of interest in various applications from insurance, finance and statistics. It develops along two principal lines. In the first part, we focus on two univariate risk models, namely a deflated risk model and a reinsurance risk model, and investigate their tail expansions under certain tail conditions on the common risks. The main results are illustrated by typical examples and numerical simulations, and the findings are then applied in insurance, for instance to approximations of Value-at-Risk and conditional tail expectations. The second part of this thesis is devoted to three bivariate models. The first model concerns bivariate censoring of extreme events. For this model, we first propose a class of estimators for both the tail dependence coefficient and the tail probability. These estimators are flexible thanks to a tuning parameter, and their asymptotic distributions are obtained under certain second-order bivariate slowly varying conditions on the model. We then give some examples and present a small Monte Carlo simulation study, followed by an application to a real insurance data set. The objective of the second bivariate risk model is the investigation of the tail dependence coefficient of bivariate skew slash distributions. Such skew slash distributions are widely useful in statistical applications; they are generated mainly by normal mean-variance mixtures and scaled skew-normal mixtures, which distinguish the tail dependence structure as shown by our principal results.
The third bivariate risk model concerns the approximation of the component-wise maxima of skew elliptical triangular arrays. The theoretical results are based on certain tail assumptions on the underlying random radius.


Cryo-electron microscopy of vitreous sections (CEMOVIS) has recently been shown to provide images of biological specimens with unprecedented quality and resolution. Cutting the sections, however, remains the major difficulty. Here, we examine the parameters influencing the quality of the sections and analyse the resulting artefacts, in particular knife marks, compression, crevasses, and chatter. We propose a model taking into account the interplay between viscous flow and fracture. We confirm that crevasses are formed on only one side of the section, and define conditions by which they can be avoided. Chatter is an effect of irregular compression due to friction of the section on the knife edge; conditions to prevent it are also explored. In the absence of crevasses and chatter, the bulk of the section is compressed approximately homogeneously. Within this approximation, it is possible to correct for compression in the bulk of the section by a simple linear transformation. A research program is proposed to test and refine our understanding of the sectioning process.
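The correction by "a simple linear transformation" can be sketched as an anisotropic rescaling of image coordinates. This is a minimal illustration, not the authors' procedure: the 70% compression factor and the point coordinates below are hypothetical, and a real correction would be estimated from the micrograph itself.

```python
import numpy as np

# Hypothetical measurement: the section is compressed to 70% of its
# original length along the cutting direction (x); y is unaffected.
compression = 0.70
correction = np.diag([1.0 / compression, 1.0])  # inverse of the compression map

pts_observed = np.array([[0.70, 0.0],
                         [1.40, 1.0],
                         [0.35, 2.0]])
pts_corrected = pts_observed @ correction.T     # undo the homogeneous compression
```

Because the compression is (approximately) homogeneous, a single scale factor applied to every coordinate suffices; no local deformation model is needed.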


The multiscale finite volume (MsFV) method has been developed to efficiently solve large heterogeneous problems (elliptic or parabolic); it is usually employed for pressure equations and delivers conservative flux fields to be used in transport problems. The method essentially relies on the hypothesis that the (fine-scale) problem can be reasonably described by a set of local solutions coupled by a conservative global (coarse-scale) problem. In most cases, the boundary conditions assigned for the local problems are satisfactory and the approximate conservative fluxes provided by the method are accurate. In numerically challenging cases, however, a more accurate localization is required to obtain a good approximation of the fine-scale solution. In this paper we develop a procedure to iteratively improve the boundary conditions of the local problems. The algorithm relies on the data structure of the MsFV method and employs a Krylov-subspace projection method to obtain an unconditionally stable scheme and accelerate convergence. Two variants are considered: in the first, only the MsFV operator is used; in the second, the MsFV operator is combined in a two-step method with an operator derived from the problem solved to construct the conservative flux field. The resulting iterative MsFV algorithms allow arbitrary reduction of the solution error without compromising the construction of a conservative flux field, which is guaranteed at any iteration. Since it converges to the exact solution, the method can be regarded as a linear solver. In this context, the schemes proposed here can be viewed as preconditioned versions of the Generalized Minimal Residual method (GMRES), with the peculiar characteristic that the residual on the coarse grid is zero at any iteration (thus conservative fluxes can be obtained).
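The core idea — an approximate operator applied iteratively as a residual correction drives the error to zero, so the scheme becomes a linear solver — can be illustrated with a generic stationary iteration. This sketch uses a plain Jacobi correction on a 1-D Poisson system, standing in for (but not implementing) the MsFV operator:

```python
import numpy as np

# 1-D Poisson system A x = b as a stand-in for a fine-scale pressure equation.
n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x_exact = np.linalg.solve(A, b)

# Stationary iteration: a cheap approximate inverse (here the diagonal of A)
# applied repeatedly to the residual reduces the error arbitrarily, which is
# why an iterated approximate scheme can be regarded as a linear solver.
d = np.diag(A)                       # diagonal "preconditioner"
x = np.zeros(n)
for _ in range(20000):
    x = x + (b - A @ x) / d          # residual correction step

err = np.linalg.norm(x - x_exact) / np.linalg.norm(x_exact)
```

Wrapping such a correction inside a Krylov method (as the paper does with GMRES) accelerates this convergence; the stationary loop above is only the simplest instance of the principle.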


This study examines biomedical research in Switzerland from an interpretative perspective. It looks at how scientific and institutional actors use the category "biomedical", at the meaning they give to it, and at the processes by which biomedical research is structured around these categorization issues. We hypothesized that "biomedical" could be considered a label, i.e., a discursive strategy actors use to position themselves, or could constitute a field, i.e., a strongly structured social space of research. To test these hypotheses, three analytical perspectives were adopted: topography, discourse and practices. First, we mapped biomedical research by identifying the actors (and their disciplinary affiliations) and the institutions that associate themselves with the term "biomedical", whether to describe institutions or research projects. The results of this analysis offer a first approximation of a research space and portray a loosely unified domain. The use of the category "biomedical" in researchers' projects is not confined to physicians and biologists but extends to representatives of other disciplines; physics, chemistry and the engineering sciences also occupy a very important place in this research space. Then, from a discursive perspective, we analyzed "biomedical" not only as a label but also as a boundary object that articulates different meanings, produces sense where research worlds might otherwise clash, and coordinates policies that were previously uncoordinated.
The analysis of the various definitions of "biomedical" confirmed the existence of a social space marked by great disciplinary diversity, yet articulated around a medical core and, more specifically, a (potential or actual) medical application. Moreover, there do not appear to be deep struggles over drawing clear boundaries around "biomedical". Finally, we studied the various activities of knowledge production (careers, funding, collaboration, publication, etc.). This analysis showed that the diversity of definitions and meanings actors attribute to the category "biomedical" is also anchored in the materiality of the sociotechnical networks in which researchers are embedded. These elements confirm the fragmentation and heterogeneity of the biomedical research space. Despite this fragmentation, we also showed that various public-policy measures and instruments aimed at organizing and regulating researchers' practices are being implemented. Nevertheless, and paradoxically, biomedical research does not constitute an object of science policy addressed by the political authorities, at least not under the category "biomedical". These different levels of analysis lead to the conclusion that the category "biomedical" is not sufficiently institutionalized, and that the degree of interaction among the researchers who use it is too weak, for "biomedical" to be considered a strongly organized and structured social space, that is, a field of biomedical research.
This is mainly because actors do not share the same definitions of what "biomedical" is (or should be), because their research practices belong to relatively separate worlds, and because this diversity does not give rise to strong struggles to impose a legitimate definition or dominant norms of scientific excellence. The analyses did, however, confirm the validity of "biomedical" as a label, since actors use the category to valorize their research practices and position themselves, even though other notions have emerged in recent years ("translational", "biotech", "medtech", personalized medicine, etc.). Ultimately, "biomedical" can be regarded as a plausible common language (a "boundary object"), resting both on the scientification of medicine and on the medicalization of the ("basic" and "technical") sciences, and aimed at improving the conditions of possibility for a fruitful dialogue between basic researchers and clinicians.


Analyzing functional data often leads to finding common factors, for which functional principal component analysis proves to be a useful tool to summarize and characterize the random variation in a function space. The representation in terms of eigenfunctions is optimal in the sense of L2 approximation. However, the eigenfunctions are not always directed towards an interesting and interpretable direction in the context of functional data and thus could obscure the underlying structure. To overcome this difficulty, an alternative to functional principal component analysis is proposed that produces directed components which may be more informative and easier to interpret. These structural components are similar to principal components, but are adapted to situations in which the domain of the function may be decomposed into disjoint intervals such that there is effectively independence between intervals and positive correlation within intervals. The approach is demonstrated with synthetic examples as well as real data. Properties for special cases are also studied.
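On a discretized grid, plain functional PCA amounts to an eigendecomposition of the sample covariance of the function values — the baseline the abstract's directed components improve on. A minimal synthetic sketch (the two-mode data-generating model below is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)                 # common evaluation grid
n = 200                                    # number of observed curves

# Synthetic functional data: two independent modes of variation plus noise.
scores = rng.normal(size=(n, 2)) * np.array([3.0, 1.0])
basis = np.vstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
X = scores @ basis + 0.05 * rng.normal(size=(n, t.size))

# Discretized functional PCA: SVD of the centered data matrix is equivalent
# to eigendecomposition of the sample covariance on the grid.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)            # fraction of variance per component
eigenfunctions = Vt[:2]                    # leading eigenfunctions on the grid
```

With interval-structured data (independent across disjoint intervals, positively correlated within), the proposed structural components would instead be estimated interval by interval; the global eigenfunctions above can mix intervals and thus obscure that structure.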


The fourth edition of the Wechsler Intelligence Scale for Children (WISC-IV) yields a Full Scale IQ (FSIQ) and four factor index scores: Verbal Comprehension, Perceptual Reasoning, Processing Speed and Working Memory. In 1998, Prifitera et al. advocated computing a General Ability Index (GAI) as an alternative to the FSIQ, based on the Verbal Comprehension and Perceptual Reasoning scores. The first study presented in this article aims to establish French-language norms for the WISC-IV GAI score, using a statistical approximation procedure. The second study examines the validity of these norms against data collected from a sample of 60 children. The correlation between FSIQ and GAI is 0.91, and the mean relative difference is 0.18 points. These norms make it possible to use the GAI score as an alternative to the FSIQ in certain diagnostic situations.


The "Europeanization" of non-EU countries' laws is predominantly seen as an "export" of the EU acquis, especially in the case of so-called "quasi-member" states such as Switzerland. Based on an examination of the Swiss experience, this paper highlights the flaws of this conceptualization: the Europeanization of Swiss Law is a highly differentiated phenomenon, encompassing several forms of approximation to EU Law. All of these forms fall short of an "export" of norms, and result in the creation of something new: a "Europeanized law" that is similar to, but qualitatively different from, EU Law. Another drawback of the "export" metaphor is the emphasis it places on the isomorphism of positive legislation. Europeanization goes deeper than that. As shown in this paper, it is a process of transformation involving not only positive law, but also legal thinking. The Swiss case demonstrates how significant such deeper transformations can be: the Europeanization of positive law has induced an alteration of the traditional canon of legal interpretation. It also demonstrates how problematic such transformations can be: the above-mentioned alteration has not given rise to a new and universally accepted canon of interpretation. This reflects the tension between the need for clear "rules of reference" for EU legal materials - which are required in order to restore coherence and predictability to an extensively Europeanized legal system - and the reluctance to give a legal value to foreign legal materials - which is rooted in a traditional understanding of the concept of "law". Such tension, in turn, shows what deep and difficult transformations are required in order to establish a viable model of legal integration outside supranational structures.


Abstract: This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed application that runs on top of P2P networks; typical examples are video streaming and file sharing. While attractive because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they run. Indeed, defining an application on top of a P2P network often means defining an application where peers contribute resources in exchange for their ability to use the application. In a P2P file sharing application, for example, while the user is downloading a file, the P2P application is in parallel serving that file to other users. Such peers may have limited hardware resources (e.g., CPU, bandwidth and memory), or the end-user may decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively. To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can be used as the basis of several P2P applications. Our data replication solution increases availability and reduces the communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. They typically offer reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer.
Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment; each protocol is evaluated through a set of simulations. The adaptiveness of our solutions lies in the fact that they take the constraints of the underlying system into account in a proactive manner. To model these constraints, we define an environment approximation algorithm that provides an approximate view of the system or part of it; this view includes the topology and the reliability of the components, expressed in probabilistic terms. To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays so as to maximize broadcast reliability. Here, broadcast reliability is expressed as a function of the reliability of the selected paths and of the use of available resources; these resources are modeled as quotas of messages reflecting the receiving and sending capacities of each node. To allow deployment in a large-scale system, we take the memory available at each process into account by limiting the view it has to maintain of the system. Using this partial view, we propose three scalable broadcast algorithms, based on a propagation overlay that tends towards the global tree overlay and adapts to some constraints of the underlying system. At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes the communication cost.
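The reliability of a tree overlay as "a function of the selected paths' reliability" can be sketched with a toy model. This is not the thesis's algorithm: the tiny tree, the node names, and the independence assumption for link failures are all illustrative.

```python
# Hypothetical tree overlay: child -> (parent, reliability of the link to parent).
tree = {
    "b": ("a", 0.99),
    "c": ("a", 0.95),
    "d": ("b", 0.90),
}

def delivery_probability(node, tree):
    """P(node receives a broadcast from the root), assuming independent
    link failures: the product of link reliabilities along the path."""
    p = 1.0
    while node in tree:            # walk up until we reach the root
        node, r = tree[node]
        p *= r
    return p

# Expected number of nodes reached (the root "a" always holds the message).
expected_reach = 1 + sum(delivery_probability(n, tree) for n in tree)
```

Comparing `expected_reach` across candidate overlays is one simple way to prefer the tree that maximizes broadcast reliability, before resource quotas are factored in.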


Swain corrects the chi-square overidentification test (i.e., the likelihood ratio test of fit) for structural equation models, with or without latent variables. The chi-square statistic is asymptotically correct; however, it does not behave as expected in small samples and/or when the model is complex (cf. Herzog, Boomsma, & Reinecke, 2007). Thus, particularly in situations where the ratio of sample size (n) to the number of parameters estimated (p) is relatively small (i.e., the p-to-n ratio is large), the chi-square test will tend to overreject correctly specified models. To obtain a closer approximation to the distribution of the chi-square statistic, Swain (1975) developed a correction: the chi-square statistic is multiplied by a scaling factor that converges to 1 asymptotically. The correction better approximates the chi-square distribution, resulting in more appropriate Type I error (rejection) rates (see Herzog & Boomsma, 2009; Herzog et al., 2007).
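The mechanics of such a correction are simple to demonstrate: shrinking the statistic by a factor below 1 raises the p-value and so reduces overrejection. Note the caveats in this sketch: `s = 0.93` is a placeholder, not Swain's published formula (which depends on n and the model size), and `chi2_sf` is a hypothetical stdlib-only Monte Carlo stand-in for a chi-square tail probability.

```python
import random

random.seed(1)
df, stat = 10, 20.0   # degrees of freedom and an uncorrected fit statistic
s = 0.93              # placeholder scaling factor; Swain's actual factor is a
                      # function of sample size and parameters, converging to 1

def chi2_sf(x, df, trials=100_000):
    """Monte Carlo tail probability P(chi2(df) > x), via sums of squared normals."""
    hits = 0
    for _ in range(trials):
        v = sum(random.gauss(0, 1) ** 2 for _ in range(df))
        if v > x:
            hits += 1
    return hits / trials

p_raw = chi2_sf(stat, df)
p_corrected = chi2_sf(s * stat, df)   # smaller statistic -> larger p-value
```

With the same nominal alpha, the corrected statistic rejects less often, which is exactly the behavior needed when the uncorrected test overrejects correctly specified models in small samples.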


Among the large number of granitic intrusions within the Dora-Maira massif, several main types can be distinguished. In this study we report field, petrographic and geochemical investigations as well as zircon typology and conventional U-Pb zircon dating of plutons representing these types. The main results are as follows: the Punta Muret augengneiss is a polymetamorphosed peraluminous granite of anatectic origin. It is 457 +/- 2 Ma old and represents one of the numerous Caledonian orthogneisses of the Alpine basement. All other dated granites are of Late Variscan age. The Cavour leucogranite is an evolved granite of probably calc-alkaline affiliation, dated at 304 +/- 2 Ma. The dioritic and granodioritic facies of the Malanaggio diorite (auct.) are typical calc-alkaline rocks, whose respective ages of 290 +/- 2 and 288 +/- 2 Ma overlap within errors. The Sangone and Freidour granite types have very similar alkali-calcic characteristics; their ages are poorly constrained between 267-279 and 268-283 Ma, respectively. The new data for the Dora-Maira granites are in keeping with models of the overall evolution of the Late- to Post-Variscan magmatism in the Alpine area, in terms of both age distribution and progressive geochemical evolution towards alkaline melts. To a first approximation, granitic rocks across the Variscan belt seem to become increasingly younger towards the internal (southern) parts of the orogen. A Carboniferous, distensive Basin and Range situation is thought to be responsible for the magmatic activity. This tectonic context is comparable to the back-arc opening of an active continental margin. The observed southward migration of the magmatism could be linked to the roll-back of the subducting Paleotethyan oceanic plate along the Variscan cordillera.


Contemporary research findings, notably those showing the importance of fluid reasoning, working memory (WMI) and processing speed (PSI) in cognitive functioning, led the developers of the WAIS-IV to introduce new subtests to strengthen the assessment of these cognitive dimensions. Interpretation of WAIS-IV scores now rests on four factor indices (VCI, PRI, WMI and PSI) as well as on the FSIQ. The developers state that one goal of the revision was to update the theoretical foundations of the scale. Yet the overall structure of the WAIS-IV only partially matches the model that commands consensus today, the Cattell-Horn-Carroll (CHC) model. For example, the WAIS-IV offers no fluid reasoning index, even though its developers stress the importance of this dimension in cognitive functioning. In this article, we propose French-language norms for five CHC composite scores for the WAIS-IV: fluid reasoning (Gf), comprehension-knowledge (Gc), visual processing (Gv), short-term memory (Gsm) and processing speed (Gs). These norms were established using a statistical approximation procedure. Like the CHC scores we proposed for the WISC-IV, these WAIS-IV norms allow clinicians to switch to an interpretation grid based on the dominant model and to use the five CHC composite scores alongside the four standard indices in normative and ipsative analyses.


To target pharmacological prevention, instruments giving an approximation of an individual patient's risk of developing postoperative delirium are available. In view of the variable clinical presentation, identifying patients in whom prophylaxis has failed (that is, who develop delirium) remains a challenge. Several bedside instruments are available for the routine ward and ICU setting. Several have been shown to have a high specificity and sensitivity when compared with the standard definitions according to DSM-IV-TR and ICD-10. The Confusion Assessment Method (CAM) and a version specifically developed for the intensive care setting (CAM-ICU) have emerged as a standard. However, alternatives allowing grading of the severity of delirium are also available. In many units, the approach to delirium follows a three-step strategy. Initially, non-pharmacological multicomponent strategies are used for primary prevention. As a second step, pharmacological prophylaxis may be added. Perioperative administration of haloperidol has been shown to reduce the severity, but not the incidence, of delirium. Perioperative administration of atypical antipsychotics has been shown to reduce the incidence of delirium in specific groups of patients. In patients with delirium, both symptomatic and causal treatment of delirium need to be considered. So far symptomatic treatment of delirium is primarily based on antipsychotics. Currently, cholinesterase inhibitors cannot be recommended and the data on dexmedetomidine are inconclusive. With the exception of alcohol-withdrawal delirium, there is no role for benzodiazepines in the treatment of delirium. It is unclear whether treating delirium prevents long-term sequelae.