61 results for Localised Approximation


Relevance: 10.00%

Abstract:

This thesis aims to investigate the extremal properties of certain risk models of interest in various applications from insurance, finance and statistics. It develops along two principal lines. In the first part, we focus on two univariate risk models, namely a deflated risk model and a reinsurance risk model, and investigate their tail expansions under certain tail conditions on the common risks. Our main results are illustrated by typical examples and numerical simulations. Finally, the findings are formulated into applications in insurance, for instance approximations of Value-at-Risk and conditional tail expectations. The second part of this thesis is devoted to the following three bivariate models. The first model is concerned with bivariate censoring of extreme events. For this model, we first propose a class of estimators for both the tail dependence coefficient and the tail probability. These estimators are flexible due to a tuning parameter, and their asymptotic distributions are obtained under certain second-order bivariate slowly varying conditions on the model. We then give some examples and present a small Monte Carlo simulation study, followed by an application to a real insurance data set.
The objective of our second bivariate risk model is the investigation of the tail dependence coefficients of bivariate skew slash distributions. Such skew slash distributions are widely useful in statistical applications; they are generated mainly by normal mean-variance mixtures and skew-normal scale mixtures, which distinguish the tail dependence structure, as shown by our principal results. The third bivariate risk model is concerned with the approximation of the component-wise maxima of skew elliptical triangular arrays. The theoretical results are based on certain tail assumptions on the underlying random radius.
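As an illustration of the kind of quantity estimated in the first bivariate model, the sketch below (Python) computes the standard nonparametric, rank-based estimate of the upper tail dependence coefficient. It is a generic textbook estimator, not the tuning-parameter class proposed in the thesis, and the simulated data and the threshold k are hypothetical.

import numpy as np

def upper_tail_dependence(x, y, k):
    """Rank-based estimate of the upper tail dependence coefficient:
    the proportion of observations whose two coordinates are both
    among their k largest values, divided by k."""
    n = len(x)
    rx = np.argsort(np.argsort(x))  # ranks 0..n-1
    ry = np.argsort(np.argsort(y))
    joint = np.sum((rx >= n - k) & (ry >= n - k))
    return joint / k

# Hypothetical usage on simulated data with positive dependence
rng = np.random.default_rng(0)
z = rng.standard_normal(10_000)
x = z + 0.3 * rng.standard_normal(10_000)
y = z + 0.3 * rng.standard_normal(10_000)
print(upper_tail_dependence(x, y, k=200))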

Relevance: 10.00%

Abstract:

Cryo-electron microscopy of vitreous sections (CEMOVIS) has recently been shown to provide images of biological specimens with unprecedented quality and resolution. Cutting the sections, however, remains the major difficulty. Here, we examine the parameters influencing the quality of the sections and analyse the resulting artefacts, in particular knife marks, compression, crevasses and chatter. We propose a model taking into account the interplay between viscous flow and fracture. We confirm that crevasses are formed on only one side of the section and define conditions under which they can be avoided. Chatter is an effect of irregular compression due to friction of the section on the knife edge, and conditions to prevent it are also explored. In the absence of crevasses and chatter, the bulk of the section is compressed approximately homogeneously. Within this approximation, it is possible to correct for compression in the bulk of the section by a simple linear transformation. A research program is proposed to test and refine our understanding of the sectioning process.
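The compression correction mentioned above amounts to an affine rescaling along the cutting direction. The sketch below applies such a correction to section coordinates; the assumption that cutting is along the x axis and the compression value are hypothetical, and the sketch is only meant to make the "simple linear transformation" concrete.

import numpy as np

def decompress(points, compression, axis=0):
    """Undo homogeneous compression along one axis.

    points      : (N, 2) array of section coordinates
    compression : measured ratio of compressed to original length
                  along the cutting direction (e.g. 0.7 for 30 % loss)
    axis        : cutting direction (0 = x, assumed here)
    """
    corrected = np.array(points, dtype=float)
    corrected[:, axis] /= compression  # stretch back to original length
    return corrected

# Hypothetical usage: a section compressed to 70 % of its length
pts = np.array([[1.0, 2.0], [3.5, 0.5]])
print(decompress(pts, compression=0.7))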

Relevance: 10.00%

Abstract:

The multiscale finite volume (MsFV) method has been developed to efficiently solve large heterogeneous (elliptic or parabolic) problems; it is usually employed for pressure equations and delivers conservative flux fields to be used in transport problems. The method essentially relies on the hypothesis that the (fine-scale) problem can be reasonably described by a set of local solutions coupled by a conservative global (coarse-scale) problem. In most cases, the boundary conditions assigned to the local problems are satisfactory and the approximate conservative fluxes provided by the method are accurate. In numerically challenging cases, however, a more accurate localization is required to obtain a good approximation of the fine-scale solution. In this paper we develop a procedure to iteratively improve the boundary conditions of the local problems. The algorithm relies on the data structure of the MsFV method and employs a Krylov-subspace projection method to obtain an unconditionally stable scheme and to accelerate convergence. Two variants are considered: in the first, only the MsFV operator is used; in the second, the MsFV operator is combined in a two-step method with an operator derived from the problem solved to construct the conservative flux field. The resulting iterative MsFV algorithms allow an arbitrary reduction of the solution error without compromising the construction of a conservative flux field, which is guaranteed at any iteration. Since it converges to the exact solution, the method can be regarded as a linear solver. In this context, the schemes proposed here can be viewed as preconditioned versions of the Generalized Minimal Residual method (GMRES), with the peculiar characteristic that the residual on the coarse grid is zero at any iteration (so that conservative fluxes can be obtained at every step).
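To make the "preconditioned GMRES" reading concrete, the sketch below runs GMRES with a user-supplied preconditioner wrapped as a LinearOperator. The simple Jacobi preconditioner and the one-dimensional test matrix are stand-ins chosen for brevity; the paper's MsFV operator is not reproduced here.

import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import LinearOperator, gmres

# Hypothetical 1D elliptic test problem (tridiagonal Laplacian)
n = 200
A = diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Stand-in preconditioner (Jacobi); in the paper's setting this role is
# played by the MsFV operator instead.
d_inv = 1.0 / A.diagonal()
M = LinearOperator((n, n), matvec=lambda r: d_inv * r)

x, info = gmres(A, b, M=M)
print(info, np.linalg.norm(A @ x - b))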

Relevance: 10.00%

Abstract:

This work discusses behavioural results obtained with rats in three different experimental paradigms: the Morris Water Maze (Morris, 1984), the Homing Board (Schenk, 1989) and the Radial Arm Maze (Olton and Samuelson, 1976). The first two tasks are spatial and allow place learning in controlled environments; the third is a behavioural task that contrasts two particular skills, elimination (based on working memory) and selection (based on reference memory). The discussion focuses on the navigation strategies used by the animals to solve the tasks and, more precisely, on the factors that can bias the choice of these strategies.
The environmental factor (controlled environment) and the cognitive factor (aging) are the variables studied here. Several commonly accepted hypotheses were challenged by our results. First, although space is usually assumed to be homogeneous (all spatial positions presenting the same degree of difficulty during open-field learning), this work establishes that a position associated with, but not adjacent to, one of three visual cues located at the periphery of the environment is more difficult to learn than a position situated between two of the three cues. Second, although it is generally accepted that learning a place in a rich environment requires the same type of information in the Morris water maze (a swimming task) as on the homing board (a walking task), we showed that spatial discrimination in the water maze cannot be supported by the three peripheral cues alone and requires the presence of at least one additional cue. Finally, aging studies have often shown that age reduces the cognitive capacities needed for spatial navigation, leading to a general performance deficit in senescent animals, whereas in our work aged rats were more efficient than adults in a particular food-collecting task. These experiments are part of a broader study testing the theoretical model proposed by Jacobs and Schenk (2003), according to which the cognitive map (Tolman, 1948; O'Keefe and Nadel, 1978) is encoded in the hippocampus by two complementary modules: the Dentate Gyrus-CA3, which processes a spatial frame based on directional and/or gradient cues (the bearing map), and the CA1-Subiculum, which processes local representations based on the relative positions of fixed elements of the environment (the sketch map).

Relevance: 10.00%

Abstract:

This study examines biomedical research in Switzerland from an interpretive perspective. It focuses on how scientific and institutional actors use the category "biomedical", the meaning they give it, and the processes by which biomedical research is structured around these issues of categorisation. We hypothesised that the "biomedical" could be regarded either as a label, that is, a discursive strategy actors use to position themselves, or as a field, that is, a strongly structured social space of research. To test these hypotheses, three analytical perspectives were adopted: topography, discourse and practices. First, we mapped biomedical research by identifying the actors (and their disciplinary affiliations) and the institutions that associate themselves with the term "biomedical", whether to describe institutions or research projects. The results of this analysis offer a first approximation of a research space and give the picture of a poorly unified domain. The use of the category "biomedical" in researchers' projects is not limited to physicians and biologists but also involves representatives of other disciplines; physics, chemistry and the engineering sciences likewise occupy a very important place in this research space. Then, from a discursive perspective, we analysed the "biomedical" not only as a label but also as a boundary object that articulates different meanings, produces meaning where research worlds might otherwise conflict, and coordinates policies that previously were not coordinated. The analysis of the various definitions of the "biomedical" confirmed the existence of a social space marked by great disciplinary diversity, yet organised around a medical core and, more specifically, around (potential or actual) medical application. Moreover, there do not appear to be deep struggles over establishing clear boundaries for the "biomedical". Finally, we studied the various activities of knowledge production (careers, funding, collaboration, publication, etc.). This analysis showed that the diversity of definitions and meanings actors attribute to the category "biomedical" is also anchored in the materiality of the sociotechnical networks in which researchers are embedded. These elements confirm the idea of a fragmented and heterogeneous biomedical research space. Despite this fragmentation, we also showed that various public policy measures and instruments aimed at organising and regulating researchers' practices have been implemented. Nevertheless, and paradoxically, biomedical research does not constitute an object of science policy addressed by the political authorities, at least not under the category "biomedical". These different levels of analysis lead to the conclusion that the category "biomedical" is not sufficiently institutionalised, and that the degree of interaction among the researchers who use it is too weak, for the "biomedical" to be considered a strongly organised and structured social space, that is, a field of biomedical research.
This is mainly because the actors do not share the same definitions of what the "biomedical" is (or should be), because their research practices belong to relatively separate worlds, and because this diversity does not give rise to strong struggles to impose a legitimate definition or dominant norms of scientific excellence. The analyses did, however, confirm the validity of the "biomedical" as a label, since actors use the category to valorise their research practices and to position themselves, even though other notions have emerged in recent years ("translational", "biotech", "medtech", personalised medicine, etc.). Ultimately, the "biomedical" can be regarded as a plausible common language (a "boundary object"), resting on both the scientification of medicine and the medicalisation of the ("basic" and "technical") sciences, and aimed at improving the conditions for a fruitful dialogue between basic researchers and clinicians.

Relevance: 10.00%

Abstract:

Analyzing functional data often leads to finding common factors, for which functional principal component analysis proves to be a useful tool to summarize and characterize the random variation in a function space. The representation in terms of eigenfunctions is optimal in the sense of L2 approximation. However, the eigenfunctions do not always point in directions that are interesting and interpretable in the context of the functional data and can thus obscure the underlying structure. To overcome this difficulty, an alternative to functional principal component analysis is proposed that produces directed components which may be more informative and easier to interpret. These structural components are similar to principal components, but are adapted to situations in which the domain of the function may be decomposed into disjoint intervals such that there is effectively independence between intervals and positive correlation within intervals. The approach is demonstrated with synthetic examples as well as real data. Properties for special cases are also studied.
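For readers less familiar with functional principal component analysis, the sketch below computes ordinary functional principal components from densely sampled curves via an eigendecomposition of the sample covariance. It illustrates the baseline that the proposed structural components are compared against, not the structural components themselves; the simulated curves are hypothetical.

import numpy as np

# Hypothetical sample of n curves observed on a common grid of m points
rng = np.random.default_rng(1)
n, m = 100, 50
t = np.linspace(0, 1, m)
scores = rng.standard_normal((n, 2))
curves = scores[:, [0]] * np.sin(2 * np.pi * t) + scores[:, [1]] * np.cos(2 * np.pi * t)
curves += 0.1 * rng.standard_normal((n, m))

# Functional PCA on the discretised curves: eigendecomposition of the
# sample covariance; the eigenvectors approximate the eigenfunctions.
centred = curves - curves.mean(axis=0)
cov = centred.T @ centred / (n - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

print("fraction of variance, first two components:",
      eigvals[:2] / eigvals.sum())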

Relevance: 10.00%

Abstract:

The cellular localisation of neurofilament triplet subunits was investigated in the rat neocortex. A subset of mainly pyramidal neurons showed colocalisation of subunit immunolabelling throughout the neocortex, including labelling with the antibody SMI32, which has been used extensively in other studies of the primate cortex as a selective cellular marker. Neurofilament-labelled neurons were principally localised to two or three cell layers in most cortical regions, but dramatically reduced labelling was present in areas such as the perirhinal cortex, anterior cingulate and a strip of cortex extending from caudal motor regions through the medial parietal region to secondary visual areas. However, quantitative analysis demonstrated a similar proportion (10-20%) of cells with neurofilament triplet labelling in regions of high or low labelling. Combining retrograde tracing with immunolabelling showed that cellular content of the neurofilament proteins was not correlated with the length of projection. Double labelling immunohistochemistry demonstrated that neurofilament content in axons was closely associated with myelination. Analysis of SMI32 labelling in development indicated that content of this epitope within cell bodies was associated with relatively late maturation, between postnatal days 14 and 21. This study is further evidence of a cell type-specific regulation of neurofilament proteins within neocortical neurons. Neurofilament triplet content may be more closely related to the degree of myelination, rather than the absolute length, of the projecting axon.

Relevance: 10.00%

Abstract:

The fourth edition of the Wechsler Intelligence Scale for Children (WISC-IV) provides a Full Scale IQ and four factor index scores: Verbal Comprehension, Perceptual Reasoning, Processing Speed and Working Memory. In 1998, Prifitera et al. recommended computing a General Ability Index (GAI) from the Verbal Comprehension and Perceptual Reasoning scores as an alternative to the Full Scale IQ (FSIQ). The first study presented in this article aims to establish French-language norms for the WISC-IV GAI using a statistical approximation procedure. The second study examines the validity of these norms by comparing them with data collected from a sample of 60 children. The correlation between FSIQ and GAI is 0.91 and the mean relative difference is 0.18 points. These norms make it possible to use the GAI as an alternative to the FSIQ in certain diagnostic situations.
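The abstract does not spell out the approximation procedure. As a purely illustrative sketch, the code below shows one standard way of deriving a composite from index scores, the linear-equating approach often attributed to Tellegen and Briggs (1967), in which a sum of standard scores is rescaled back to a mean of 100 and a standard deviation of 15 using the intercorrelation of the indices. The correlation and index values are hypothetical, and this is not presented as the procedure actually used in the study.

import math

def composite_standard_score(index_scores, r, mean=100.0, sd=15.0):
    """Convert standard scores (mean 100, SD 15) into a single composite
    on the same metric, given the average intercorrelation r of the
    component indices (linear-equating approximation)."""
    k = len(index_scores)
    z_sum = sum((s - mean) / sd for s in index_scores)
    # SD of a sum of k standardised scores with common correlation r
    sd_sum = math.sqrt(k + k * (k - 1) * r)
    return mean + sd * z_sum / sd_sum

# Hypothetical example: VCI = 108, PRI = 112, assumed correlation r = 0.6
print(round(composite_standard_score([108, 112], r=0.6)))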

Relevance: 10.00%

Abstract:

Although there is clinical evidence that carbon monoxide poisoning causes cardiac damage, the literature on the cardiac pathomorphology in such cases is scarce. We investigated the immunohistochemical expression of two known markers of fresh cardiac damage, fibronectin and the terminal complement complex C5b-9, in both cardiac ventricles in 26 cases of CO intoxication (study group: 15 ♀, 11 ♂, mean age 47 years, mean COHb level 65.9%, min. 51%, max. 85%), compared with 23 cases of hanging (4 ♀, 19 ♂, mean age 42 years) and 25 cases of myocardial infarction (13 ♀, 12 ♂, mean age 64 years). Fresh cardiac damage was detected with the fibronectin antibody in cases of CO poisoning and was predominantly localised in the right ventricle.

Relevance: 10.00%

Abstract:

Heart transplantation is the treatment of choice for many patients with end-stage heart failure. Its success, however, is limited by organ shortage, side effects of immunosuppressive drugs, and chronic rejection. Gene therapy is conceptually appealing for applications in transplantation, as the donor organ is genetically manipulated ex vivo before transplantation. Localised expression of immunomodulatory genes aims to create a state of immune privilege within the graft, which could eliminate the need for systemic immunosuppression. In this review, recent advances in the development of gene therapy in heart transplantation are discussed. Studies in animal models have demonstrated that genetic modification of the donor heart with immunomodulatory genes attenuates ischaemia-reperfusion injury and rejection. Alternatively, bone marrow-derived cells genetically engineered with donor-type major histocompatibility complex (MHC) class I or II promote donor-specific hyporesponsiveness. Genetic engineering of naïve T cells or dendritic cells may induce regulatory T cells and regulatory dendritic cells. Despite encouraging results in animal models, however, clinical gene therapy trials in heart transplantation have not yet been started. The best vector and gene to be delivered remain to be identified. Pre-clinical studies in non-human primates are needed. Nonetheless, the potential of gene therapy as an adjunct therapy in transplantation is essentially intact.

Relevance: 10.00%

Abstract:

The "Europeanization" of non-EU countries' laws is predominantly seen as an "export" of the EU acquis, especially in the case of so-called "quasi-member" states such as Switzerland. Based on an examination of the Swiss experience, this paper highlights the flaws of this conceptualization: the Europeanization of Swiss Law is a highly differentiated phenomenon, encompassing several forms of approximation to EU Law. All of these forms fall short of an "export" of norms, and result in the creation of something new: a "Europeanized law" that is similar to, but qualitatively different from, EU Law. Another drawback of the "export" metaphor is the emphasis it places on the isomorphism of positive legislation. Europeanization goes deeper than that. As shown in this paper, it is a process of transformation involving not only positive law, but also legal thinking. The Swiss case demonstrates how significant such deeper transformations can be: the Europeanization of positive law has induced an alteration of the traditional canon of legal interpretation. It also demonstrates how problematic such transformations can be: the above-mentioned alteration has not given rise to a new and universally accepted canon of interpretation. This reflects the tension between the need for clear "rules of reference" for EU legal materials - which are required in order to restore coherence and predictability to an extensively Europeanized legal system - and the reluctance to give a legal value to foreign legal materials - which is rooted in a traditional understanding of the concept of "law". Such tension, in turn, shows what deep and difficult transformations are required in order to establish a viable model of legal integration outside supranational structures.

Relevance: 10.00%

Abstract:

This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed application running on top of P2P networks; typical examples are video streaming and file sharing. While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they run. Indeed, defining an application on top of a P2P network often means defining an application where peers contribute resources in exchange for their ability to use the P2P application. For example, in a P2P file sharing application, while the user is downloading a file, the P2P application is in parallel serving that file to other users. Such peers may have limited hardware resources, e.g., CPU, bandwidth and memory, or the end user may decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively. To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can be used as the basis of several P2P applications. Our data replication solution increases availability and reduces the communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. They typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer. Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment, and each protocol is evaluated through a set of simulations. The adaptiveness of our solutions relies on the fact that they take the constraints of the underlying system into account in a proactive manner. To model these constraints, we define an environment approximation algorithm that provides an approximate view of the system or of part of it; this view includes the topology and the reliability of components, expressed in probabilistic terms. To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays so as to maximize broadcast reliability. Here, broadcast reliability is expressed as a function of the reliability of the selected paths and of the use of available resources. These resources are modeled in terms of quotas of messages reflecting the receiving and sending capacities of each node. To allow deployment in a large-scale system, we take into account the memory available at processes by limiting the view each process has to maintain about the system. Using this partial view, we propose three scalable broadcast algorithms, which are based on a propagation overlay that tends towards the global tree overlay and adapts to some constraints of the underlying system.
At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes the communication cost.
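As a minimal illustration of how broadcast reliability can be expressed as a function of the reliability of the selected paths, the sketch below computes, for a small tree overlay, the probability that a message sent by the root reaches each node when links fail independently. The topology and link probabilities are hypothetical, and the thesis's quota and memory constraints are not modelled.

# Tree overlay given as child -> (parent, probability that the link
# from parent to child delivers the message). Links are assumed to
# fail independently of one another.
tree = {
    "B": ("A", 0.95),
    "C": ("A", 0.90),
    "D": ("B", 0.99),
    "E": ("B", 0.80),
}

def delivery_probability(node, tree, root="A"):
    """Probability that a broadcast from the root reaches `node`:
    the product of link reliabilities along the unique root path."""
    p = 1.0
    while node != root:
        node, link_p = tree[node]
        p *= link_p
    return p

for n in ["B", "C", "D", "E"]:
    print(n, round(delivery_probability(n, tree), 3))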

Relevance: 10.00%

Abstract:

Specialised plant cell types often locally modify their cell walls as part of a developmental program, as do cells that are challenged by particular environmental conditions. Modifications can include the deposition of secondary cellulose, callose, cutin, suberin or lignin. Although the biosynthesis of cell wall components is increasingly well understood, little is known about the mechanisms that control the localised deposition of wall materials. During metaxylem vessel differentiation, site-specific cell wall deposition is locally prevented by the microtubule-depolymerising protein MIDD1, which disassembles the cytoskeleton and precludes the cellulose synthase complex from depositing cellulose. As a result, the metaxylem vessel secondary cell wall appears pitted. How MIDD1 is tethered at the plasma membrane and how other cell wall polymers are locally deposited remain elusive. Casparian strips in the root endodermis represent a further example of local cell wall deposition. The recent discovery of the Casparian Strip membrane domain Proteins (CASPs), which are located at the plasma membrane and are important for the site-specific deposition of lignin during Casparian strip development, establishes the root endodermis as an attractive model system for studying the mechanisms of localised cell wall modifications. How secondary modifications are modulated and monitored during development or in response to environmental changes is another question for which a complete picture is still missing.

Relevance: 10.00%

Abstract:

Swain corrects the chi-square overidentification test (i.e., the likelihood ratio test of fit) for structural equation models, with or without latent variables. The chi-square statistic is asymptotically correct; however, it does not behave as expected in small samples and/or when the model is complex (cf. Herzog, Boomsma, & Reinecke, 2007). Thus, particularly in situations where the ratio of sample size (n) to the number of parameters estimated (p) is relatively small (i.e., the p to n ratio is large), the chi-square test will tend to over-reject correctly specified models. To obtain a closer approximation to the distribution of the chi-square statistic, Swain (1975) developed a correction; this scaling factor, which converges to 1 asymptotically, is multiplied with the chi-square statistic. The correction better approximates the chi-square distribution, resulting in more appropriate Type I rejection rates (see Herzog & Boomsma, 2009; Herzog et al., 2007).
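A sketch of how such a scaling factor is applied is given below. The formula follows the commonly reproduced form of the Swain (1975) factor (see, e.g., Herzog & Boomsma, 2009); it is written from memory and should be checked against those sources before use, and the input values are hypothetical.

import math

def swain_correction(chi2, n, k, num_params):
    """Scale a maximum-likelihood chi-square fit statistic by the Swain
    (1975) factor as commonly reproduced (e.g. Herzog & Boomsma, 2009);
    some presentations use n - 1 in place of n.

    chi2       : uncorrected chi-square statistic
    n          : sample size
    k          : number of observed (manifest) variables
    num_params : number of freely estimated parameters
    """
    df = k * (k + 1) // 2 - num_params          # degrees of freedom
    q = (math.sqrt(1 + 4 * k * (k + 1) - 8 * num_params) - 1) / 2
    s = 1 - (k * (2 * k**2 + 3 * k - 1) - q * (2 * q**2 + 3 * q - 1)) / (12 * df * n)
    return s * chi2                             # s converges to 1 as n grows

# Hypothetical example: chi2 = 85.4, n = 120, 12 indicators, 30 parameters
print(round(swain_correction(85.4, 120, 12, 30), 2))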

Relevance: 10.00%

Abstract:

Few episodes of suspected infection observed in paediatric intensive care are classifiable without ambiguity by a priori defined criteria; most require additional expert judgement. Recently, we observed a high variability in antibiotic prescription rates that was not explained by the patients' clinical data or underlying diseases. We hypothesised that disagreement among experts in the adjudication of episodes of suspected infection could be one of the potential causes of this variability. During a 5-month period, we included all patients of a 19-bed multidisciplinary, tertiary, neonatal and paediatric intensive care unit in whom infection was clinically suspected and antibiotics were prescribed (n = 183). Three experts (two senior ICU physicians and a specialist in infectious diseases) were provided with all patient data and laboratory and microbiological findings. All experts classified episodes according to a priori defined criteria into proven sepsis, probable sepsis (negative cultures), localised infection and no infection. Episodes of proven viral infection and incomplete data sets were excluded. Of the remaining 167 episodes, 48 were classifiable by a priori criteria (n = 28 proven sepsis, n = 20 no infection). The three experts achieved only limited agreement beyond chance in the remaining 119 episodes (kappa = 0.32 overall, and kappa = 0.19 between the ICU physicians). Kappa measures the degree of agreement beyond what would be expected by chance alone, with 0 indicating the chance result and 1 indicating perfect agreement. CONCLUSION: Agreement of specialists in the hindsight adjudication of episodes of suspected infection is of questionable reliability.
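Since the conclusion hinges on the kappa statistic, the short sketch below computes Cohen's kappa for two raters from a contingency table of classification counts; the counts are hypothetical and unrelated to the study data.

import numpy as np

def cohens_kappa(table):
    """Cohen's kappa for two raters: (observed agreement - chance
    agreement) / (1 - chance agreement). `table` is an r x r
    contingency table of classification counts."""
    table = np.asarray(table, dtype=float)
    total = table.sum()
    p_observed = np.trace(table) / total
    p_chance = (table.sum(axis=1) * table.sum(axis=0)).sum() / total**2
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical 3-category example (e.g. proven sepsis / localised
# infection / no infection) rated by two experts
counts = [[20, 5, 2],
          [6, 15, 4],
          [3, 5, 10]]
print(round(cohens_kappa(counts), 2))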