50 results for ANALYTIC ULTRACENTRIFUGATION


Relevance: 10.00%

Abstract:

OBJECTIVE: The major source of hemolysis during cardiopulmonary bypass remains the cardiotomy suction, primarily because of the interaction between air and blood. The Smart suction system involves automatically controlled aspiration designed to avoid mixing blood with air. This study was set up to compare this recently designed suction system with a Cell Saver system, in order to investigate their effects on blood elements during prolonged intrathoracic aspiration. METHODS: In a calf model (n=10; mean weight, 69.3+/-4.5 kg), a standardized hole was created in the right atrium, allowing a blood loss of 100 ml/min, with a suction cannula placed in the chest cavity in a fixed position for 6 h. The blood was continuously aspirated with either the Smart suction system (five animals) or the Cell Saver system (five animals). Blood samples were taken hourly for blood cell counts and biochemistry. RESULTS: In the Smart suction group, red cell count, plasma protein and free hemoglobin levels remained stable, while platelet count showed a significant drop from the fifth hour onwards (prebypass: 683+/-201×10⁹/l; 5 h: 280+/-142×10⁹/l; P=0.046). In the Cell Saver group, there was a significant drop in the red cell count from the third hour onwards (prebypass: 8.6+/-0.9×10¹²/l; 6 h: 6.3+/-0.4×10¹²/l; P=0.02), in the platelet count from the first hour onwards (prebypass: 630+/-97×10⁹/l; 1 h: 224+/-75×10⁹/l; P<0.01), and in the plasma protein level from the first hour onwards (prebypass: 61.7+/-0.6 g/l; 1 h: 29.3+/-9.1 g/l; P<0.01). CONCLUSIONS: In this experimental set-up, the Smart suction system avoids damage to red cells and affects platelet count less than the Cell Saver system, which, like any suction device that mixes air and blood, induces substantial blood cell destruction as well as severe hypoproteinemia, with its metabolic, clotting and hemodynamic consequences.

Relevance: 10.00%

Abstract:

The topic of cardiorespiratory interactions is of extreme importance to the practicing intensivist. It also has a reputation for being intellectually challenging, due in part to the enormous volume of relevant, at times contradictory literature. Another source of difficulty is the need to simultaneously consider the interrelated functioning of several organ systems (not necessarily limited to the heart and lung), in other words, to adopt a systemic (as opposed to analytic) point of view. We believe that the proper understanding of a few simple physiological concepts is of great help in organizing knowledge in this field. The first part of this review will be devoted to demonstrating this point. The second part, to be published in a coming issue of Intensive Care Medicine, will apply these concepts to clinical situations. We hope that this text will be of some use, especially to intensivists in training, to demystify a field that many find intimidating.

Relevance: 10.00%

Abstract:

1. Few examples of habitat-modelling studies of rare and endangered species exist in the literature, although from a conservation perspective predicting their distribution would prove particularly useful. Paucity of data and lack of valid absences are the probable reasons for this shortcoming. Analytic solutions that accommodate the lack of absence data include ecological niche factor analysis (ENFA) and the use of generalized linear models (GLM) with simulated pseudo-absences. 2. In this study we tested a new approach to generating pseudo-absences, based on a preliminary ENFA habitat suitability (HS) map, for the endangered species Eryngium alpinum. This method of generating pseudo-absences was compared with two others: (i) use of a GLM with pseudo-absences generated entirely at random, and (ii) use of an ENFA alone. 3. The influence of two different spatial resolutions (i.e. grain) was also assessed, to tackle the dilemma of quality (grain) vs. quantity (number of occurrences). Each combination of the three above-mentioned methods with the two grains generated a distinct HS map. 4. Four evaluation measures were used to compare these HS maps: total deviance explained, best kappa, Gini coefficient and minimal predicted area (MPA). The last is a new evaluation criterion proposed in this study. 5. Results showed that (i) GLM models using ENFA-weighted pseudo-absences provide better results, except for the MPA value, and that (ii) quality (spatial resolution and locational accuracy) of the data appears to be more important than quantity (number of occurrences). Furthermore, the proposed MPA value is suggested as a useful measure of model evaluation when used to complement classical statistical measures. 6. Synthesis and applications. We suggest that the use of ENFA-weighted pseudo-absences is a possible way to enhance the quality of GLM-based potential distribution maps and that data quality (i.e. spatial resolution) prevails over quantity (i.e. number of data).
Increased accuracy of potential distribution maps could help to better delineate suitable areas for species protection and reintroduction.
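The ENFA-weighted pseudo-absence strategy described above can be sketched in a few lines. This is a hypothetical illustration, not the study's code: the habitat-suitability values and cell identifiers are invented, and sampling with probability proportional to 1 − HS is one plausible reading of "ENFA-weighted".

```python
import random

# Hypothetical sketch: draw pseudo-absences preferentially from cells that a
# preliminary ENFA habitat-suitability (HS) map scores as unsuitable.
# The HS values and cell ids below are invented for illustration.
random.seed(0)
hs = {"cell_a": 0.90, "cell_b": 0.60, "cell_c": 0.20, "cell_d": 0.05}

cells = list(hs)
weights = [1.0 - hs[c] for c in cells]  # low suitability -> high sampling weight
pseudo_absences = random.choices(cells, weights=weights, k=100)

# Cells with HS near 0 should dominate the pseudo-absence sample
print(pseudo_absences.count("cell_d") > pseudo_absences.count("cell_a"))
```

Such pseudo-absences would then be combined with the observed presences to fit the GLM.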

Relevance: 10.00%

Abstract:

OBJECTIVE: We examined the analytic validity of reported family history of hypertension and diabetes among siblings in the Seychelles. STUDY DESIGN AND SETTING: Four hundred four siblings from 73 families with at least two hypertensive persons were identified through a national hypertension register. Two gold standards were used prospectively. Sensitivity was the proportion of respondents who indicated the presence of disease in a sibling, given that the sibling reported being affected (personal history gold standard) or was clinically affected (clinical status gold standard). Specificity was the proportion of respondents who reported an unaffected sibling, given that the sibling reported being unaffected or was clinically unaffected. Respondents gave information on the disease status of their siblings in approximately two-thirds of instances. RESULTS: When a sibling history could be obtained (n=348 for hypertension, n=404 for diabetes), the sensitivity and specificity of the sibling history were, respectively, 90 and 55% for hypertension and 61 and 98% for diabetes using clinical status, and, respectively, 89 and 78% for hypertension and 53 and 98% for diabetes using personal history. CONCLUSION: The sibling history, when available, is a useful screening test to detect hypertension, but it is less useful for detecting diabetes.
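The two operating characteristics defined above can be made concrete with a short sketch. The data and function names are ours, invented for illustration; only the definitions follow the abstract:

```python
# Hypothetical sketch of the definitions used above: sensitivity and
# specificity of a reported sibling history against a gold standard.

def sensitivity(reports, gold):
    """Proportion of truly affected siblings reported as affected."""
    affected = [r for r, g in zip(reports, gold) if g]
    return sum(affected) / len(affected)

def specificity(reports, gold):
    """Proportion of truly unaffected siblings reported as unaffected."""
    unaffected = [not r for r, g in zip(reports, gold) if not g]
    return sum(unaffected) / len(unaffected)

# reports[i]: respondent says sibling i is affected; gold[i]: gold standard
reports = [True, True, False, True, False, False]
gold    = [True, True, True, False, False, False]
print(sensitivity(reports, gold))  # 2 of 3 affected siblings reported: 0.666...
print(specificity(reports, gold))  # 2 of 3 unaffected siblings reported: 0.666...
```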

Relevance: 10.00%

Abstract:

Glutamate transport through astrocytic excitatory amino-acid transporters (EAAT)-1 and EAAT-2 is paramount for neural homeostasis. EAAT-1 has been reported in secreted extracellular microvesicles (eMV, such as exosomes), and because the protein kinase C (PKC) family controls the sub-cellular distribution of EAATs, we explored whether PKCs drive EAATs into eMV. Using rat primary astrocytes, confocal immunofluorescence and ultracentrifugation on a sucrose gradient, we report here that PKC activation by phorbol myristate acetate (PMA) reorganizes EAAT-1 distribution and reduces functional [³H]-aspartate reuptake. Western blots show that EAAT-1 is present in eMV from astrocyte-conditioned medium, together with Na,K-ATPase and glutamine synthetase, all being further increased after PMA treatment. However, nanoparticle tracking analysis reveals that PKC activation did not change particle concentration. Functional analysis indicates that eMV have the capacity to take up [³H]-aspartate. In vivo, we demonstrate that the spinal astrocytic reaction induced by peripheral nerve lesion (spared nerve injury, SNI) is associated with phosphorylation of PKCδ together with an ipsilateral shift in EAAT distribution. Ex vivo, spinal explants from SNI rats release eMV with an increased content of Na,K-ATPase, EAAT-1 and EAAT-2. These data indicate PKC and cell activation as important regulators of EAAT-1 incorporation into eMV, and raise the possibility that microvesicular EAAT-1 may exert extracellular functions. Beyond a putative role in neuropathic pain, this phenomenon may be important for understanding neural homeostasis and a wide range of neurological diseases associated with astrocytic reaction, as well as non-neurological diseases linked to eMV release.

Relevance: 10.00%

Abstract:

Common acute lymphoblastic leukemia antigen detected by radioimmunoassay in the serum of patients with common acute lymphoblastic leukemia was found to be exclusively associated with the pellet of the serum samples obtained by ultracentrifugation at 100,000 × g. The pellets were shown to contain membrane vesicles or fragments, which were characterized by electron microscopy and determination of enzymatic activity. The pelleted fragments had an apparent diameter ranging between 60 and 260 nm and showed a trilaminar membrane structure. On freeze-fracture preparations, the fragments with a concave profile, corresponding to the external fracture face of the plasma membrane, displayed an intramembrane particle density (ranging from 0 to 750 particles per µm²) similar to that recorded on the corresponding fracture face of intact cells from the common lymphoblastic leukemia antigen-positive leukemic cell line Nalm-1, or of vesicles shed into the culture medium by Nalm-1 cells. Furthermore, analysis of the membrane enzyme marker 5'-nucleotidase in the pellets of patients' sera showed that the presence of this enzyme correlated with that of common lymphoblastic leukemia antigen, but the quantitative relationship between the two surface constituents was not linear. The results suggest that the two markers are located on the same membrane fragments, but that their individual distribution on the shed fragments is heterogeneous.

Relevance: 10.00%

Abstract:

Differential X-ray phase-contrast tomography (DPCT) refers to a class of promising methods for reconstructing the X-ray refractive index distribution of materials that present weak X-ray absorption contrast. The tomographic projection data in DPCT, from which an estimate of the refractive index distribution is reconstructed, correspond to one-dimensional (1D) derivatives of the two-dimensional (2D) Radon transform of the refractive index distribution. There is an important need for the development of iterative image reconstruction methods for DPCT that can yield useful images from few-view projection data, thereby mitigating the long data-acquisition times and large radiation doses associated with use of analytic reconstruction methods. In this work, we analyze the numerical and statistical properties of two classes of discrete imaging models that form the basis for iterative image reconstruction in DPCT. We also investigate the use of one of the models with a modern image reconstruction algorithm for performing few-view image reconstruction of a tissue specimen.
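The statement that DPCT data are 1D derivatives of the 2D Radon transform can be illustrated numerically. The sketch below is our own, under simplifying assumptions not taken from the paper: a parallel-beam geometry and a centered disk phantom of radius r with unit refractive-index decrement, whose Radon transform at detector offset s has the closed form p(s) = 2·sqrt(r² − s²) at every view angle.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the paper's code): for a centered
# disk phantom, the parallel-beam Radon transform has a closed form, and the
# differential phase-contrast data are its derivative along the detector axis.
r = 1.0
s = np.linspace(-1.5, 1.5, 601)                     # detector coordinate
p = 2.0 * np.sqrt(np.clip(r**2 - s**2, 0.0, None))  # Radon transform of the disk

# DPCT-style data: finite-difference derivative with respect to s
dp_ds = np.gradient(p, s)

mid = len(s) // 2
print(p[mid])      # peak of the projection, 2r at s = 0
print(dp_ds[mid])  # derivative vanishes at the center by symmetry
```

An iterative reconstruction method would then seek the refractive-index map whose simulated dp_ds best matches the measured few-view data.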

Relevance: 10.00%

Abstract:

Interventions based on motivational interviewing (MI) are widely used with a number of different clinical populations, and their efficacy is well established. However, clinicians' training has not traditionally been the focus of empirical investigation. We conducted a meta-analytic review of findings on clinicians' MI training and MI skills. Fifteen studies were included, involving 715 clinicians. Pre-post training effect sizes (ES) were calculated (13 studies), as well as group-contrast effect sizes (7 studies). Pre-post training comparisons showed medium to large effects of MI training, which are maintained over a short period of time. When compared with a control group, our results also suggested higher MI proficiency in professionals trained in MI than in untrained ones (medium ES). However, this estimate may be affected by publication bias and should therefore be considered with caution. Methodological limitations and potential sources of heterogeneity of the studies included in this meta-analysis are discussed.
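The pre-post effect size referred to above is conventionally a standardized mean change. A minimal sketch with invented numbers; the formula d = (M_post − M_pre) / SD_pre is one common variant, and the review may have used a different estimator:

```python
# Hypothetical sketch: standardized pre-post effect size
# d = (M_post - M_pre) / SD_pre. All numbers are invented.
def pre_post_d(m_pre, m_post, sd_pre):
    return (m_post - m_pre) / sd_pre

# e.g. an invented MI-skill score rising from 20 (SD 5) to 24 after training
print(pre_post_d(20.0, 24.0, 5.0))  # 0.8, a large effect by Cohen's benchmarks
```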

Relevance: 10.00%

Abstract:

By the work of H. Cartan (séminaire Cartan, 1954-55), it is well known that one can find elements of arbitrarily high torsion in the integral (co)homology groups of an Eilenberg-MacLane space K(G,n), where G is a non-trivial abelian group and n>1. The main goal of this work is to extend this result to H-spaces having more than one non-trivial homotopy group. In order to have an accurate hold on H. Cartan's result, we start by studying the duality between homology and cohomology of 2-local Eilenberg-MacLane spaces of finite type. This leads us to some improvements of H. Cartan's methods in this particular case. Our main result can be stated as follows. Let X be an H-space with two non-vanishing finite 2-torsion homotopy groups. Then X does not admit any exponent for its reduced integral graded (co)homology group. We construct a wide class of examples for which this result is a simple consequence of a topological feature, namely the existence of a weak retract X K(G,n) for some abelian group G and n>1. We also generalize our main result to more complicated stable two-stage Postnikov systems, using the Eilenberg-Moore spectral sequence and analytic methods involving Betti numbers and their asymptotic behaviour. Finally, we investigate the non-existence of homology exponents for finite Postnikov towers, and conjecture that Postnikov pieces do not admit any (co)homology exponent. This work also includes a presentation of the "Eilenberg-MacLane machine", a C++ program designed to compute explicitly all integral homology groups of Eilenberg-MacLane spaces.

It is always difficult for a mathematician to speak about his work. The difficulty lies in the fact that the objects he studies are abstract: one rarely runs into a vector space, an abelian category or a Laplace transform on the street corner! However, even if mathematical objects are hard to grasp for a non-mathematician, the methods used to study them are essentially the same as in the other scientific disciplines. Complex objects are broken down into simpler components. One draws up the list of the properties of mathematical objects, then classifies them into families of objects sharing a common feature. One looks for different but equivalent ways of formulating a problem. And so on. My work belongs to the mathematical field of algebraic topology. The ultimate goal of this discipline is to classify all topological spaces by means of algebra. This activity is comparable to that of an ornithologist (the topologist) who studies birds (topological spaces) through, say, binoculars (algebra). If he sees a small, arboreal, singing, nest-building bird, with four-toed feet, three toes pointing forward and one, bearing a strong claw, pointing backward, then he will conclude with certainty that it is a passerine. It will still remain to determine whether it is a sparrow, a blackbird or a nightingale.

Consider a few examples of topological spaces: a) a hollow cube, b) a sphere and c) a hollow torus (i.e. an inner tube). While any ordinary person sees three different figures here, the topologist sees only two! From his point of view, the cube and the sphere are not different, since they are homeomorphic: one can be transformed into the other continuously (it would suffice to blow into the cube to obtain the sphere). On the other hand, the sphere and the torus are not homeomorphic: knead the sphere any way you like (without tearing it), and you will never obtain the torus. There are infinitely many topological spaces and, contrary to what one might naively believe, deciding whether two of them are homeomorphic is in general very difficult. To try to solve this problem, topologists had the idea of bringing algebra into their reasoning. This was the birth of homotopy theory. The idea is, following a very specific recipe, to associate with every topological space an infinite collection of what algebraists call groups. The groups obtained in this way are called the homotopy groups of the topological space. Mathematicians first showed that two topological spaces that are homeomorphic (for example the cube and the sphere) have the same homotopy groups. One then speaks of invariants (homotopy groups are indeed invariant across homeomorphic topological spaces). Consequently, two topological spaces that do not have the same homotopy groups cannot possibly be homeomorphic. This is an excellent way of classifying topological spaces (think of the ornithologist who examines the birds' feet to decide whether he is dealing with a passerine or not).

My work concerns topological spaces that have only finitely many non-trivial homotopy groups. Such spaces are called finite Postnikov towers. We study their integral cohomology groups, another family of invariants alongside the homotopy groups. The size of a cohomology group is measured, in a certain sense, by the notion of exponent; a cohomology group that admits an exponent is relatively small. One of the main results of this work concerns the size of the cohomology groups of finite Postnikov towers, namely the following theorem: a 1-connected, 2-local H-space of finite type that has only one or two non-trivial homotopy groups has no exponent for its reduced integral graded cohomology group. Interpreted qualitatively, this says that the smaller a space is from the point of view of cohomology (i.e. if it has a cohomology exponent), the more interesting it is from the point of view of homotopy (i.e. it will have more than two non-trivial homotopy groups). It follows from my work that such spaces are very interesting, in the sense that they can have infinitely many non-trivial homotopy groups. Jean-Pierre Serre, Fields medalist in 1954, showed that all spheres of dimension >1 have infinitely many non-trivial homotopy groups. From spaces with a cohomology exponent to spheres, there is but one step...

Relevance: 10.00%

Abstract:

OBJECTIVE: This study aimed to analyze the complaints of patients, their relatives, and friends who consulted a hospital-based complaints center, the Espace Patients & Proches (EPP), so as to better understand the reasons that motivated them and their underlying expectations. METHODS: The study was based on the analysis of written accounts of the 253 situations that occurred during the first year of operation of the EPP. The accounts were analyzed qualitatively using an inductive, thematic analytic approach. RESULTS: We identified 372 different types of complaints and 28 main analytic themes. Five clustered themes emerged from the analysis of the interconnections among the core themes (N = number of accounts including a complaint related to each theme): (1) interpersonal relationship (N=160); (2) technical aspects of care (N=106); (3) health-care institution (N=69); (4) billing and insurance; (5) access to information (N=13). CONCLUSION: The main reason patients, their relatives, and friends came to the EPP related to the quality of the interpersonal relationship with health-care professionals. Such complaints were markedly more frequent than those concerning technical aspects of care. PRACTICE IMPLICATIONS: These results raise important questions concerning changing patient expectations as well as how hospitals integrate complaints into the process of quality health care.

Relevance: 10.00%

Abstract:

Within Data Envelopment Analysis, several alternative models allow for an environmental adjustment, and the majority of them deliver divergent results. Decision makers therefore face the difficult task of selecting the most suitable model. This study is performed to overcome this difficulty and thereby fills a research gap. First, a two-step web-based survey is conducted. It aims (1) to identify the selection criteria, (2) to prioritize and weight the selection criteria with respect to the goal of selecting the most suitable model, and (3) to collect preferences about which model best fulfils each selection criterion. Second, the Analytic Hierarchy Process is used to quantify the preferences expressed in the survey. Results show that the understandability, applicability and acceptability of the alternative models are valid selection criteria. The selection of the most suitable model depends on the preferences of the decision makers with regard to these criteria.
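The Analytic Hierarchy Process step can be sketched as follows. The pairwise comparison matrix below is invented for illustration (the three criteria are those named in the abstract); weights are taken as the normalized principal eigenvector, with Saaty's consistency check:

```python
import numpy as np

# Invented pairwise comparisons among the three selection criteria:
# understandability, applicability, acceptability (Saaty's 1-9 scale).
A = np.array([
    [1.0, 2.0, 3.0],
    [1/2, 1.0, 2.0],
    [1/3, 1/2, 1.0],
])

# Priority weights: principal eigenvector of A, normalized to sum to 1
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = vecs[:, k].real
w = w / w.sum()

# Consistency index CI = (lambda_max - n) / (n - 1); ratio CR = CI / RI(3)
lam = vals.real[k]
cr = ((lam - 3.0) / 2.0) / 0.58  # RI(3) = 0.58 in Saaty's random-index table
print(w)   # weights, largest for the criterion judged most important
print(cr)  # CR < 0.1 indicates acceptable consistency
```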

Relevance: 10.00%

Abstract:

Using meta-analytic methods on a sample of 74 studies, we explore the links between corporate political activity (CPA) and public policy outcomes, and between CPA and firm outcomes. We find that CPA has at best a weak effect and that it appears to be better at maintaining public policy than at changing it.

Relevance: 10.00%

Abstract:

The history of biodiversity is characterized by a continual replacement of branches in the tree of life. The rise and demise of these branches (clades) are ultimately determined by changes in speciation and extinction rates, often interpreted as a response to varying abiotic and biotic factors. However, understanding the relative importance of these factors remains a major challenge in evolutionary biology. Here we analyze the rich North American fossil record of the dog family Canidae and of other carnivores to tease apart the roles of competition, body size evolution, and climate change on the sequential replacement of three canid subfamilies (two of which have gone extinct). We develop a novel Bayesian analytic framework to show that competition from multiple carnivore clades successively drove the demise and replacement of the two extinct canid subfamilies by increasing their extinction rates and suppressing their speciation. Competitive effects have likely come from ecologically similar species from both canid and felid clades. These results imply that competition among entire clades, generally considered a rare process, can play a more substantial role than climate change and body size evolution in determining the sequential rise and decline of clades.

Relevance: 10.00%

Abstract:

This thesis brings together a series of meta-analyses, that is, analyses whose objects are analyses produced by sociologists (notably those resulting from the application of methods for treating interviews). The approach is reflexive, aimed at the concrete practices of sociologists, which are considered as activities governed by rules. An important part of this thesis is therefore devoted to the development of a "pragmatological" analytical tool (E. Durkheim) for studying such practices and the rules that govern them. To approach these rules, Wittgenstein-inspired analytic philosophy offers several important proposals. Rules are first seen as family-resemblance concepts: there is no common definition covering all rules. To study rules, it is therefore necessary to draw distinctions based on how they are used. One of these distinctions concerns the difference between constitutive rules and regulative rules: a constitutive rule creates a practice (e.g. marriage), while a regulative rule applies to activities that can exist without it (e.g. the rules of etiquette). The methodological activity of sociologists relies on, and is constrained by, these types of rules, which are essentially implicit. Through the description and codification of rules, this thesis aims to account for the normative character of methods in the analytical practices of sociology. Particular emphasis is placed on the logical limits established by constitutive rules, which render certain of the sociologist's actions impossible (rather than forbidden).

Relevance: 10.00%

Abstract:

BACKGROUND: Lithium augmentation of antidepressants for the treatment of unipolar major depression was one of the first adjunctive strategies based on a neuropharmacologic rationale. Randomized controlled trials supported its efficacy, but most trials added lithium to tricyclic antidepressants (TCAs). Despite its efficacy, lithium augmentation remains infrequently used. The current systematic review and meta-analysis examines the efficacy of lithium augmentation as an adjunct to second-generation antidepressants as well as to TCAs, and considers reasons for its infrequent use. METHOD: A systematic search of Medline and the Cochrane Clinical Trials database was performed. Randomized, placebo-controlled trials of lithium augmentation were selected, and a fixed-effects meta-analysis was performed. Odds ratios for response were calculated for each treatment-control contrast, for the trials grouped by type of initial antidepressant (TCA or second-generation antidepressant), and as a meta-analytic summary for all treatments combined. RESULTS: Nine trials including 237 patients were selected. The odds ratio for response to lithium vs. placebo across all contrasts combined was 2.89 (95% CI 1.65-5.05, z=3.72, p=0.0002). Heterogeneity was very low (I²=0%). Adjunctive lithium was effective with TCAs (7 contrasts) and with second-generation agents (3 contrasts). Discontinuation due to adverse events was infrequent and did not differ between lithium and placebo. LIMITATIONS: The meta-analysis is limited by the small size and number of trials and by limited data for treatment-resistant patients. CONCLUSIONS: Adjunctive lithium appears to be as effective with second-generation antidepressants as it was with the tricyclics.
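The fixed-effects pooling reported above can be sketched with the standard inverse-variance method on the log odds-ratio scale. The 2×2 counts below are invented, and the paper may have used a different weighting scheme (e.g. Mantel-Haenszel); this is only meant to show the mechanics:

```python
import math

# Invented trial data: (responders_lithium, n_lithium, responders_placebo, n_placebo)
trials = [(8, 17, 3, 17), (6, 11, 2, 12), (7, 15, 3, 14)]

num = den = 0.0
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c                 # non-responders in each arm
    log_or = math.log((a * d) / (b * c))  # log odds ratio for this trial
    var = 1/a + 1/b + 1/c + 1/d           # its approximate variance
    num += log_or / var                   # inverse-variance weighting
    den += 1 / var

pooled = num / den
se = math.sqrt(1 / den)
pooled_or = math.exp(pooled)
ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
print(pooled_or, ci)  # pooled odds ratio with 95% confidence interval
```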