967 results for transparency thresholds


Relevance: 10.00%

Abstract:

BACKGROUND: Most clinical guidelines recommend that AIDS-free, HIV-infected persons with CD4 cell counts below 0.350 × 10⁹ cells/L initiate combined antiretroviral therapy (cART), but the optimal CD4 cell count at which cART should be initiated remains a matter of debate. OBJECTIVE: To identify the optimal CD4 cell count at which cART should be initiated. DESIGN: Prospective observational data from the HIV-CAUSAL Collaboration and dynamic marginal structural models were used to compare cART initiation strategies for CD4 thresholds between 0.200 and 0.500 × 10⁹ cells/L. SETTING: HIV clinics in Europe and the Veterans Health Administration system in the United States. PATIENTS: 20,971 HIV-infected, therapy-naive persons with baseline CD4 cell counts at or above 0.500 × 10⁹ cells/L and no previous AIDS-defining illnesses, of whom 8392 had a CD4 cell count that decreased into the range of 0.200 to 0.499 × 10⁹ cells/L and were included in the analysis. MEASUREMENTS: Hazard ratios and survival proportions for all-cause mortality and a combined end point of AIDS-defining illness or death. RESULTS: Compared with initiating cART at the CD4 cell count threshold of 0.500 × 10⁹ cells/L, the mortality hazard ratio was 1.01 (95% CI, 0.84 to 1.22) for the 0.350 threshold and 1.20 (CI, 0.97 to 1.48) for the 0.200 threshold. The corresponding hazard ratios were 1.38 (CI, 1.23 to 1.56) and 1.90 (CI, 1.67 to 2.15), respectively, for the combined end point of AIDS-defining illness or death. LIMITATIONS: CD4 cell count at cART initiation was not randomized. Residual confounding may exist. CONCLUSION: Initiation of cART at a threshold CD4 count of 0.500 × 10⁹ cells/L increases AIDS-free survival. However, mortality did not vary substantially with the use of CD4 thresholds between 0.300 and 0.500 × 10⁹ cells/L.

Relevance: 10.00%

Abstract:

The development of forensic intelligence relies on the expression of suitable models that better represent the contribution of forensic intelligence in relation to the criminal justice system, policing and security. Such models assist in comparing and evaluating methods and new technologies, provide transparency and foster the development of new applications. Interestingly, strong similarities between two separate projects focusing on specific forensic science areas were recently observed. These observations have led to the induction of a general model (Part I) that could guide the use of any forensic science case data in an intelligence perspective. The present article builds upon this general approach by focusing on decisional and organisational issues. The article investigates the comparison process and evaluation system that lie at the heart of the forensic intelligence framework, advocating scientific decision criteria and a structured but flexible and dynamic architecture. These building blocks are crucial and clearly lie within the expertise of forensic scientists. However, they are only part of the problem. Forensic intelligence includes other blocks with their respective interactions, decision points and tensions (e.g. regarding how to guide detection and how to integrate forensic information with other information). Formalising these blocks identifies many questions and potential answers. Addressing these questions is essential for the progress of the discipline. Such a process requires clarifying the role and place of the forensic scientist within the whole process and their relationship to other stakeholders.

Relevance: 10.00%

Abstract:

Selected configuration interaction (SCI) for atomic and molecular electronic structure calculations is reformulated in a general framework encompassing all CI methods. The linked cluster expansion is used as an intermediate device to approximate CI coefficients B_K of disconnected configurations (those that can be expressed as products of combinations of singly and doubly excited ones) in terms of CI coefficients of lower-excited configurations, where each K is a linear combination of configuration state functions (CSFs) over all degenerate elements of K. Disconnected configurations up to sextuply excited ones are selected by Brown's energy formula, ΔE_K = (E − H_KK)B_K²/(1 − B_K²), with B_K determined from coefficients of singly and doubly excited configurations. The truncation energy error from disconnected configurations, ΔE_dis, is approximated by the sum of the ΔE_K of all discarded K. The remaining (connected) configurations are selected by thresholds based on natural orbital concepts. Given a model CI space M, a usual upper bound E_S is computed by CI in a selected space S, and E_M = E_S + ΔE_dis + δE, where δE is a residual error which can be calculated by well-defined sensitivity analyses. An SCI calculation on the Ne ground state featuring 1077 orbitals is presented. Convergence to within near-spectroscopic accuracy (0.5 cm⁻¹) is achieved in a model space M of 1.4 × 10⁹ CSFs (1.1 × 10¹² determinants) containing up to quadruply excited CSFs. Accurate energy contributions of quintuples and sextuples in a model space of 6.5 × 10¹² CSFs are obtained. The impact of SCI on various orbital methods is discussed. Since ΔE_dis can readily be calculated for very large basis sets without the need of a CI calculation, it can be used to estimate the orbital basis incompleteness error. A method for precise and efficient evaluation of E_S is taken up in a companion paper.
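Brown's selection formula lends itself to a direct numerical sketch. The energies and coefficients below are hypothetical illustrations, not values from the calculation described above; the code only shows how ΔE_K is evaluated per configuration and how the discarded contributions accumulate into the truncation error ΔE_dis:

```python
# Brown's energy formula: dE_K = (E - H_KK) * B_K^2 / (1 - B_K^2)

def brown_delta_e(e_total, h_kk, b_k):
    """Estimated energy contribution of configuration K (Brown's formula)."""
    return (e_total - h_kk) * b_k ** 2 / (1.0 - b_k ** 2)

e_total = -128.90  # hypothetical total energy (hartree)
configs = [
    {"h_kk": -128.10, "b_k": 0.010},   # hypothetical diagonal elements and
    {"h_kk": -127.95, "b_k": 0.002},   # CI coefficients, for illustration only
    {"h_kk": -127.80, "b_k": 0.0005},
]

threshold = 1e-6  # selection threshold (hartree)
kept, discarded_sum = [], 0.0
for c in configs:
    de = abs(brown_delta_e(e_total, c["h_kk"], c["b_k"]))
    if de >= threshold:
        kept.append(c)          # configuration enters the selected space S
    else:
        discarded_sum += de     # accumulates the truncation error dE_dis

print(f"kept {len(kept)} configurations; "
      f"estimated truncation error {discarded_sum:.2e} hartree")
```

Because ΔE_dis is just this running sum over discarded configurations, it can be evaluated without ever diagonalizing in the full space, which is the basis for the orbital-incompleteness estimate mentioned above.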

Relevance: 10.00%

Abstract:

A select-divide-and-conquer variational method to approximate configuration interaction (CI) is presented. Given an orthonormal set made up of occupied orbitals (Hartree-Fock or similar) and suitable correlation orbitals (natural or localized orbitals), a large N-electron target space S is split into subspaces S_0, S_1, S_2, ..., S_R. S_0, of dimension d_0, contains all configurations K with attributes (energy contributions, etc.) above thresholds T_0 = {T_0^egy, T_0^etc.}; the CI coefficients in S_0 remain always free to vary. S_1 accommodates K's with attributes above T_1 ≤ T_0. An eigenproblem of dimension d_0 + d_1 for S_0 + S_1 is solved first, after which the last d_1 rows and columns are contracted into a single row and column, thus freezing the last d_1 CI coefficients hereinafter. The process is repeated with successive S_j (j ≥ 2) chosen so that the corresponding CI matrices fit random access memory (RAM). Davidson's eigensolver is used R times. The final energy eigenvalue (lowest or excited one) is always above the corresponding exact eigenvalue in S. Threshold values {T_j; j = 0, 1, 2, ..., R} regulate accuracy; for large-dimensional S, high accuracy requires S_0 + S_1 to be solved outside RAM. From there on, however, usually a few Davidson iterations in RAM are needed for each step, so that Hamiltonian matrix-element evaluation becomes rate determining. One-μhartree accuracy is achieved for an eigenproblem of order 24 × 10⁶, involving 1.2 × 10¹² nonzero matrix elements, and 8.4 × 10⁹ Slater determinants.
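The contract-and-extend cycle described above can be illustrated on a toy matrix. The 8×8 symmetric matrix below is a random stand-in, not a real CI Hamiltonian, and `np.linalg.eigh` replaces Davidson's eigensolver for simplicity; the sketch performs one contraction step (S_0 free, S_1 frozen into a single composite vector, S_2 appended) and checks the variational bound on the resulting eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy symmetric "Hamiltonian" over an 8-configuration space (illustrative only).
n, d0, d1 = 8, 3, 3            # S_0: 3 configs, S_1: 3 configs, S_2: remaining 2
A = rng.normal(size=(n, n))
H = (A + A.T) / 2 - 3.0 * np.eye(n)
e_exact = np.linalg.eigvalsh(H)[0]

# Step 1: solve the eigenproblem in S_0 + S_1 only.
w, v = np.linalg.eigh(H[:d0 + d1, :d0 + d1])
c = v[:, 0]                    # lowest eigenvector in S_0 + S_1

# Step 2: contract the d1 frozen coefficients into one composite basis
# vector, keep S_0 free, and append the S_2 configurations.
u = np.zeros(n)
u[d0:d0 + d1] = c[d0:] / np.linalg.norm(c[d0:])
T = np.zeros((n, d0 + 1 + (n - d0 - d1)))
T[:d0, :d0] = np.eye(d0)       # S_0 coefficients stay free
T[:, d0] = u                   # frozen S_1 combination -> single column
T[d0 + d1:, d0 + 1:] = np.eye(n - d0 - d1)   # S_2 configurations

H_eff = T.T @ H @ T            # reduced matrix: 3 free + 1 contracted + 2 new
e_sdc = np.linalg.eigvalsh(H_eff)[0]

# Variational: the contracted eigenvalue never falls below the exact one,
# and adding S_2 can only lower it relative to the S_0 + S_1 solution.
print(e_exact, e_sdc, w[0])
```

In the real method this step is repeated R times with Davidson's solver, so only the current reduced matrix ever needs to fit in RAM.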

Relevance: 10.00%

Abstract:

Introduction: According to guidelines, patients with coronary artery disease (CAD) should undergo revascularization if myocardial ischemia is present. While coronary angiography (CXA) allows the morphological assessment of CAD, the fractional flow reserve (FFR) has proved to be a complementary invasive test to assess the functional significance of CAD, i.e. to detect ischemia. Perfusion cardiac magnetic resonance (CMR) has turned out to be a robust non-invasive technique to assess myocardial ischemia. The objective is to compare the cost-effectiveness ratio, defined as the costs per patient correctly diagnosed, of two algorithms used to diagnose hemodynamically significant CAD in relation to the pretest likelihood of CAD: 1) a CMR to assess ischemia before referring positive patients to CXA (CMR + CXA); 2) a CXA in all patients combined with an FFR test in patients with angiographically positive stenoses (CXA + FFR). Methods: The costs, evaluated from the health care system perspective in the Swiss, German, United Kingdom (UK) and United States (US) contexts, included public prices of the different tests considered as outpatient procedures, costs of complications and costs induced by diagnostic errors (false negatives). The effectiveness criterion was the ability to accurately identify a patient with significant CAD. Test performances used in the model were based on the clinical literature. Using a mathematical model, we compared the cost-effectiveness ratio for both algorithms for hypothetical patient cohorts with different pretest likelihoods of CAD. Results: The cost-effectiveness ratio decreased hyperbolically with increasing pretest likelihood of CAD for both strategies. CMR + CXA and CXA + FFR were equally cost-effective at a pretest likelihood of CAD of 62% in Switzerland, 67% in Germany, 83% in the UK and 84% in the US, with costs of CHF 5'794, EUR 1'472, £2'685 and $2'126 per patient correctly diagnosed. Below these thresholds, CMR + CXA showed lower costs per patient correctly diagnosed than CXA + FFR. Implications for the health care system, professionals, patients and society: These results facilitate decision making for the clinical use of new generations of imaging procedures to detect ischemia. They show to what extent the cost-effectiveness of diagnosing CAD depends on the prevalence of the disease.
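The shape of the result (cost per correct diagnosis falling hyperbolically with pretest likelihood) can be reproduced with a minimal sketch of a two-step testing strategy. All prices and test accuracies below are hypothetical placeholders, not the article's model inputs, and the effectiveness criterion is simplified to the probability that a diseased patient is correctly identified:

```python
# Sketch: cost per patient correctly diagnosed with significant CAD,
# as a function of pretest likelihood p. Hypothetical inputs throughout.

def cost_per_correct_diagnosis(p, cost1, cost2, sens1, spec1, sens2):
    """Two-step strategy: every patient gets test 1; only patients
    positive on test 1 proceed to (invasive) test 2."""
    frac_pos_first = p * sens1 + (1 - p) * (1 - spec1)
    expected_cost = cost1 + frac_pos_first * cost2
    # Effectiveness: diseased patient positive on both tests.
    effectiveness = p * sens1 * sens2
    return expected_cost / effectiveness

for p in (0.2, 0.4, 0.6, 0.8):
    cmr_cxa = cost_per_correct_diagnosis(p, 900, 2200, 0.89, 0.81, 0.99)
    cxa_ffr = cost_per_correct_diagnosis(p, 2200, 700, 0.99, 0.85, 0.94)
    print(f"pretest {p:.0%}: CMR+CXA {cmr_cxa:6.0f}, CXA+FFR {cxa_ffr:6.0f}")
```

Because the expected cost grows only linearly in p while the number of correct diagnoses is proportional to p, the ratio falls roughly as 1/p, which matches the hyperbolic decrease reported above; the crossover pretest likelihood then depends on the country-specific prices of each test.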

Relevance: 10.00%

Abstract:

Modern urban lifestyle encourages the prolongation of wakefulness, leaving less and less time for sleep. Although the exact functions of sleep remain one of the biggest mysteries in neuroscience, society is well aware of the negative consequences of sleep loss on human physical and mental health and performance. Enhancing sleep's recuperative functions might allow shortening sleep duration while preserving the beneficial effects of sleep. During sleep, brain activity oscillates across a continuum of frequencies. Individual oscillations have been suggested to underlie distinct functions for sleep and cognition. Gaining control over individual oscillations might allow boosting their specific functions. Sleep spindles are 11–15 Hz oscillations characteristic of light non-rapid-eye-movement sleep (NREMS) and have been proposed to play a role in memory consolidation and sleep protection against environmental stimuli. The reticular thalamic nucleus (nRt) has been identified as the major pacemaker of spindles. Intrinsic oscillatory burst discharge in nRt neurons, arising from the interplay of low-threshold (T-type) Ca2+ channels (T channels) and small-conductance type 2 (SK2) K+ channels (SK2 channels), underlies this pacemaking function. In the present work we investigated the impact of altered nRt bursting on spindle generation during sleep by studying mutant mice for SK2 channels and for CaV3.3 channels, a subtype of T channels. Using in vitro electrophysiology, I showed that nRt bursting was abolished in CaV3.3 knockout (CaV3.3 KO) mice. In contrast, in SK2 channel over-expressing (SK2-OE) nRt cells, intrinsic repetitive bursting was prolonged. Compared with wildtype (WT) littermates, altered nRt burst discharge led to weakened thalamic network oscillations in vitro in CaV3.3 KO mice, while oscillatory activity was prolonged in SK2-OE mice.
Sleep electroencephalographic recordings in CaV3.3 KO and SK2-OE mice revealed that reduced or potentiated nRt bursting, respectively, weakened or prolonged sleep spindle activity at the NREMS–REMS transition. Furthermore, SK2-OE mice showed more consolidated NREMS and increased arousal thresholds, two correlates of good sleep quality. This thesis work suggests that CaV3.3 and SK2 channels may be targeted in order to modulate sleep spindle activity. Furthermore, it proposes a novel function for spindles in NREMS consolidation. Finally, it provides evidence that sleep quality may be improved by promoting spindle activity, thereby supporting the hypothesis that sleep quality can be enhanced by modulating oscillatory activity in the brain.

Relevance: 10.00%

Abstract:

BACKGROUND: The diagnosis of hypertension in children is difficult because of the multiple sex-, age- and height-specific thresholds used to define elevated blood pressure (BP). The blood pressure-to-height ratio (BPHR) has been proposed to facilitate the identification of elevated BP in children. OBJECTIVE: We assessed the performance of BPHR at a single screening visit to identify children with hypertension, that is, sustained elevated BP. METHODS: In a school-based study conducted in Switzerland, BP was measured at up to three visits in 5207 children. Children were considered hypertensive if BP was elevated at all three visits. Sensitivity, specificity, negative predictive value (NPV) and positive predictive value (PPV) for the identification of hypertension were assessed for different thresholds of BPHR. The ability of BPHR at a single screening visit to discriminate between children with and without hypertension was evaluated with receiver operating characteristic (ROC) curve analyses. RESULTS: The prevalence of systolic/diastolic hypertension was 2.2%. Systolic BPHR identified hypertension better than diastolic BPHR (area under the ROC curve: 0.95 vs. 0.84). The best performance was obtained with a systolic BPHR threshold set at 0.80 mmHg/cm (sensitivity: 98%; specificity: 85%; PPV: 12%; NPV: 100%) and a diastolic BPHR threshold set at 0.45 mmHg/cm (sensitivity: 79%; specificity: 70%; PPV: 5%; NPV: 99%). The PPV was higher among tall or overweight children. CONCLUSION: BPHR at a single screening visit performed well in identifying hypertension in children, although the low prevalence of hypertension led to a low PPV.
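The seemingly paradoxical combination of high sensitivity/specificity and low PPV follows directly from Bayes' rule at 2.2% prevalence. The sketch below plugs in the figures reported in the abstract for the systolic threshold of 0.80 mmHg/cm:

```python
# PPV and NPV from sensitivity, specificity and prevalence (Bayes' rule).

def ppv(sens, spec, prev):
    true_pos = sens * prev
    false_pos = (1 - spec) * (1 - prev)
    return true_pos / (true_pos + false_pos)

def npv(sens, spec, prev):
    true_neg = spec * (1 - prev)
    false_neg = (1 - sens) * prev
    return true_neg / (true_neg + false_neg)

# Figures from the abstract: systolic BPHR threshold 0.80 mmHg/cm.
systolic_ppv = ppv(0.98, 0.85, 0.022)
systolic_npv = npv(0.98, 0.85, 0.022)
print(f"PPV {systolic_ppv:.1%}, NPV {systolic_npv:.2%}")
```

At 2.2% prevalence, the false positives from the 97.8% of non-hypertensive children swamp the true positives, giving a PPV of about 13% (consistent with the reported 12%) even though the NPV is effectively 100%.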

Relevance: 10.00%

Abstract:

This review paper reports the consensus of a technical workshop hosted by the European network NanoImpactNet (NIN). The workshop aimed to review the collective experience of working at the bench with manufactured nanomaterials (MNMs) and to recommend modifications to existing experimental methods and OECD protocols. Current procedures for cleaning glassware are appropriate for most MNMs, although interference with electrodes may occur. Maintaining exposure is more difficult with MNMs than with conventional chemicals. A metal salt control is recommended for experiments with metallic MNMs that may release free metal ions. Dispersing agents should be avoided, but if they must be used, then natural or synthetic dispersing agents are possible, and dispersion controls are essential. Time constraints and technology gaps indicate that full characterisation of test media during ecotoxicity tests is currently not practical. Details of electron microscopy, dark-field microscopy, a range of spectroscopic methods (EDX, XRD, XANES, EXAFS), light scattering techniques (DLS, SLS) and chromatography are discussed. The development of user-friendly software to predict particle behaviour in test media according to DLVO theory is in progress, and simple optical methods are available to estimate the settling behaviour of suspensions during experiments. However, for soil matrices such simple approaches may not be applicable. Alternatively, a critical body residue approach may be taken, in which body concentrations in organisms are related to effects and toxicity thresholds derived. For microbial assays, the cell wall is a formidable barrier to MNMs, and end points that rely on the test substance penetrating the cell may be insensitive. Instead, assays based on the cell envelope should be developed for MNMs. In algal growth tests, the abiotic factors that promote particle aggregation in the media (e.g. ionic strength) are also important in providing nutrients, and manipulation of the media to control the dispersion may also inhibit growth. Controls to quantify shading effects, and precise details of lighting regimes, shaking or mixing, should be reported in algal tests. Photosynthesis may be more sensitive than traditional growth end points for algae and plants. Tests with invertebrates should consider non-chemical toxicity from particle adherence to the organisms. The use of semi-static exposure methods with fish can reduce the logistical issues of waste water disposal and facilitate aspects of animal husbandry relevant to MNMs. There are concerns that the existing bioaccumulation tests are conceptually flawed for MNMs and that new tests are required. In vitro testing strategies, as exemplified by genotoxicity assays, can be modified for MNMs, but the risk of false negatives in some assays is highlighted. In conclusion, most protocols will require some modifications, and recommendations are made to aid the researcher at the bench.

Relevance: 10.00%

Abstract:

Background: The transcription factor NFAT5/TonEBP regulates the response of mammalian cells to hypertonicity. However, little is known about the physiopathologic tonicity thresholds that trigger its transcriptional activity in primary cells. Wilkins et al. recently developed a transgenic mouse carrying a luciferase reporter (9xNFAT-Luc) driven by a cluster of NFAT sites that was activated by calcineurin-dependent NFATc proteins. Since the NFAT site of this reporter was very similar to an optimal NFAT5 site, we tested whether this reporter could detect the activation of NFAT5 in transgenic cells. Results: The 9xNFAT-Luc reporter was activated by hypertonicity in an NFAT5-dependent manner in different types of non-transformed transgenic cells: lymphocytes, macrophages and fibroblasts. Activation of this reporter by the phorbol ester PMA plus ionomycin was independent of NFAT5 and mediated by NFATc proteins. Transcriptional activation of NFAT5 in T lymphocytes was detected at hypertonic conditions of 360–380 mOsm/kg (isotonic conditions being 300 mOsm/kg) and strongly induced at 400 mOsm/kg. Such levels have been recorded in the plasma of patients with osmoregulatory disorders and in mice deficient in aquaporins or the vasopressin receptor. The hypertonicity threshold required to activate NFAT5 was higher in bone marrow-derived macrophages (430 mOsm/kg) and embryonic fibroblasts (480 mOsm/kg). Activation of the 9xNFAT-Luc reporter by hypertonicity in lymphocytes was insensitive to the ERK inhibitor PD98059, partially inhibited by the PI3-kinase inhibitor wortmannin (0.5 μM) and the PKA inhibitor H89, and substantially downregulated by p38 inhibitors (SB203580 and SB202190) and by inhibition of PI3-kinase-related kinases with 25 μM LY294002. Sensitivity of the reporter to FK506 varied among cell types and was greater in primary T cells than in fibroblasts and macrophages. Conclusion: Our results indicate that NFAT5 is a sensitive responder to pathologic increases in extracellular tonicity in T lymphocytes. Activation of NFAT5 by hypertonicity in lymphocytes was mediated by a combination of signaling pathways that differed from those required in other cell types. We propose that the 9xNFAT-Luc transgenic mouse model might be useful for studying the physiopathological regulation of both NFAT5 and NFATc factors in primary cells.

Relevance: 10.00%

Abstract:

Maintenance of corneal transparency is crucial for vision and depends mainly on the endothelium, a non-proliferative monolayer of cells covering the inner part of the cornea. When endothelial cell density falls below a critical threshold, the barrier and "pump" functions of the endothelium are compromised, which results in corneal oedema and loss of visual acuity. The conventional treatment for such a severe disorder is corneal graft. Unfortunately, there is a worldwide shortage of donor corneas, necessitating improvement of tissue survival and storage after harvesting. Recently it was reported that the ROCK inhibitor Y-27632 promotes adhesion, inhibits apoptosis, increases the number of proliferating monkey corneal endothelial cells in vitro and enhances corneal endothelial wound healing both in vitro and in vivo in animal models. Using organ-cultured human corneas (N = 34), the effect of the ROCK inhibitor was evaluated in vitro and ex vivo. Toxicity, corneal endothelial cell density, cell proliferation, apoptosis, cell morphometry, adhesion and wound healing were evaluated by live/dead assay, standard cell counting, EdU labelling, and Ki67, Caspase3, ZO-1 and actin immunostaining. We demonstrated for the first time in human corneal endothelial cells, ex vivo and in vitro, that the ROCK inhibitor did not induce any toxic effect and did not alter cell viability. ROCK inhibitor treatment did not induce human corneal endothelial cell proliferation. However, the ROCK inhibitor significantly enhanced adhesion and wound healing. The present study shows that the selective ROCK inhibitor Y-27632 has no effect on the proliferative capacity of human corneal endothelial cells but alters cellular behaviour: it induces changes in cell shape, increases cell adhesion and enhances wound healing ex vivo and in vitro. Its absence of toxicity, as demonstrated herein, is relevant for its use in human therapy.

Relevance: 10.00%

Abstract:

Object The purpose of this study was to investigate whether diffusion tensor imaging (DTI) of the corticospinal tract (CST) is a reliable surrogate for intraoperative macrostimulation through the deep brain stimulation (DBS) leads. The authors hypothesized that the distance on MRI from the DBS lead to the CST as determined by DTI would correlate with intraoperative motor thresholds from macrostimulations through the same DBS lead. Methods The authors retrospectively reviewed pre- and postoperative MRI studies and intraoperative macrostimulation recordings in 17 patients with Parkinson disease (PD) treated by DBS. Preoperative DTI tractography of the CST was coregistered with postoperative MRI studies showing the position of the DBS leads. The shortest distance and the angle from each contact of each DBS lead to the CST were automatically calculated using software-based analysis. The distance measurements calculated for each contact were evaluated with respect to the intraoperative voltage thresholds that elicited a motor response at each contact. Results There was a nonsignificant trend for voltage thresholds to increase as the distance between the DBS leads and the CST increased. There was a significant correlation between the angle and the voltage, but the correlation was weak (coefficient of correlation [R] = 0.36). Conclusions Caution needs to be exercised when using DTI tractography information to guide DBS lead placement in patients with PD. Further studies are needed to compare DTI tractography measurements with other approaches such as microelectrode recordings and conventional intraoperative MRI-guided placement of DBS leads.

Relevance: 10.00%

Abstract:

Neutralizing antibodies are necessary and sufficient for protection against infection with vesicular stomatitis virus (VSV). The in vitro neutralization capacities and in vivo protective capacities of a panel of immunoglobulin G monoclonal antibodies to the glycoprotein of VSV were evaluated. In vitro, neutralizing activity correlated with avidity and with neutralization rate constant, a measure of on-rate. However, in vivo, protection was independent of immunoglobulin subclass, avidity, neutralization rate constant, and in vitro neutralizing activity; above a minimal avidity threshold, protection depended simply on a minimum serum concentration. These two biologically defined thresholds of antibody specificity offer hope for the development of adoptive therapy with neutralizing antibodies.

Relevance: 10.00%

Abstract:

Overdiagnosis is the diagnosis of an abnormality that is not associated with a substantial health hazard and that patients gain no benefit from knowing about. It is neither a misdiagnosis (diagnostic error) nor a false positive result (a positive test in the absence of a real abnormality). It mainly results from screening, the use of increasingly sensitive diagnostic tests, incidental findings on routine examinations, and the widening of diagnostic criteria to define a condition requiring an intervention. The blurring boundaries between risk and disease, physicians' fear of missing a diagnosis and patients' need for reassurance are further causes of overdiagnosis. Overdiagnosis often implies procedures to confirm or exclude the presence of the condition and is by definition associated with useless treatments and interventions, generating harm and costs without any benefit. Overdiagnosis also diverts healthcare professionals from caring for other health issues. Preventing overdiagnosis requires increasing the awareness of healthcare professionals and patients of its occurrence, avoiding unnecessary and untargeted diagnostic tests, and avoiding screening without demonstrated benefits. Furthermore, systematically accounting for the harms and benefits of screening and diagnostic tests, and determining risk-factor thresholds based on the expected absolute risk reduction, would also help prevent overdiagnosis.
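The point about basing risk-factor thresholds on expected absolute rather than relative risk reduction can be made numeric. The baseline risks and the 25% relative risk reduction below are hypothetical illustrations:

```python
# Absolute risk reduction (ARR) and number needed to treat (NNT).
# Hypothetical: the same relative effect across different baseline risks.

def absolute_risk_reduction(baseline_risk, relative_risk_reduction):
    return baseline_risk * relative_risk_reduction

def number_needed_to_treat(arr):
    return 1.0 / arr

rrr = 0.25  # hypothetical: treatment cuts events by 25% in relative terms
for baseline in (0.20, 0.05, 0.01):
    arr = absolute_risk_reduction(baseline, rrr)
    nnt = number_needed_to_treat(arr)
    print(f"baseline risk {baseline:.0%}: ARR {arr:.2%}, NNT {nnt:.0f}")
```

The relative effect is identical in all three rows, but at a 1% baseline risk the absolute benefit shrinks to 0.25% (hundreds treated per event avoided), which is why widening diagnostic criteria toward low-risk individuals drives overdiagnosis and overtreatment.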

Relevance: 10.00%

Abstract:

This is the first report of three cases of severe acute corneal graft rejection treated by transscleral methylprednisolone (Solumédrol) iontophoresis. The efficacy of the treatment was evaluated by corneal transparency, visual acuity and corneal inflammation parameters. Each patient was treated with Solumédrol iontophoresis once a day for 3 days, with topical corticotherapy reduced to three drops of dexamethasone per day. Iontophoresis was performed under topical anesthesia and lasted 3 minutes with a 1.5-mA current. The subjective and objective tolerance of iontophoresis was good, and no side effects were observed. Corneal transparency and visual acuity improved rapidly after the second iontophoresis procedure. These observations show that Solumédrol iontophoresis might be an alternative to pulse therapy in the treatment of corneal graft rejection. Further comparative studies are necessary to confirm these preliminary observations.

Relevance: 10.00%

Abstract:

Recognition systems play a key role in a range of biological processes, including mate choice, immune defence and altruistic behaviour. Social insects provide an excellent model for studying recognition systems because workers need to discriminate between nestmates and non-nestmates, enabling them to direct altruistic behaviour towards closer kin and to repel potential invaders. However, the level of aggression directed towards conspecific intruders can vary enormously, even among workers within the same colony. This is usually attributed to differences in the aggression thresholds of individuals or to workers having different roles within the colony. Recent evidence from the weaver ant Oecophylla smaragdina suggests that this does not tell the whole story. Here I propose a new model for nestmate recognition based on a vector template derived from both the individual's innate odour and the shared colony odour. This model accounts for the recent findings concerning weaver ants, and also provides an alternative explanation for why the level of aggression expressed by a colony decreases as the diversity within the colony increases, even when odour is well-mixed. The model makes additional predictions that are easily tested, and represents a significant advance in our conceptualisation of recognition systems.
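The proposed vector-template idea can be sketched as a toy computation: a worker's template combines its innate odour with the shared colony odour, and aggression is triggered when an encountered odour deviates from that template by more than the worker's threshold. All vectors, weights and thresholds below are hypothetical illustrations, not measured ant data:

```python
import math

def template(innate, colony, w_innate=0.4):
    """Worker's template: weighted blend of innate and colony odour vectors.
    The weight w_innate is a hypothetical parameter."""
    return [w_innate * i + (1 - w_innate) * c for i, c in zip(innate, colony)]

def distance(a, b):
    """Euclidean distance between two odour vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical 3-component cuticular-hydrocarbon profiles.
colony_odour = [0.5, 0.3, 0.2]
worker_innate = [0.6, 0.25, 0.15]
tmpl = template(worker_innate, colony_odour)

nestmate = [0.52, 0.29, 0.19]   # close to the shared colony odour
intruder = [0.1, 0.7, 0.2]      # conspecific from another colony
threshold = 0.15                # worker-specific aggression threshold

for name, odour in (("nestmate", nestmate), ("intruder", intruder)):
    action = "attack" if distance(odour, tmpl) > threshold else "accept"
    print(name, round(distance(odour, tmpl), 3), action)
```

In this framing, greater odour diversity within a colony pulls individual templates apart from the colony mean, so a fixed intruder odour falls within more workers' acceptance regions, which is consistent with the observation that colony-level aggression can decrease as within-colony diversity increases.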