928 results for Residual-based tests
Abstract:
To cite this article: Ponvert C, Perrin Y, Bados-Albiero A, Le Bourgeois M, Karila C, Delacourt C, Scheinmann P, De Blic J. Allergy to betalactam antibiotics in children: results of a 20-year study based on clinical history, skin and challenge tests. Pediatr Allergy Immunol 2011; 22: 411-418. ABSTRACT: Studies based on skin and challenge tests have shown that 12-60% of children with suspected betalactam hypersensitivity were allergic to betalactams. Responses in skin and challenge tests were studied in 1865 children with suspected betalactam allergy (i) to confirm or rule out the suspected diagnosis; (ii) to evaluate the diagnostic value of immediate and non-immediate responses in skin and challenge tests; (iii) to determine the frequency of betalactam allergy in those children; and (iv) to determine potential risk factors for betalactam allergy. The work-up was completed in 1431 children, of whom 227 (15.9%) were diagnosed allergic to betalactams. Betalactam hypersensitivity was diagnosed in 50 of the 162 (30.9%) children reporting immediate reactions and in 177 of the 1087 (16.7%) children reporting non-immediate reactions (p < 0.001). The likelihood of betalactam hypersensitivity was also significantly higher in children reporting anaphylaxis, serum sickness-like reactions, and (potentially) severe skin reactions such as acute generalized exanthematous pustulosis, Stevens-Johnson syndrome, and drug reaction with systemic symptoms than in other children (p < 0.001). Skin tests diagnosed 86% of immediate and 31.6% of non-immediate sensitizations. Cross-reactivity and/or cosensitization among betalactams was diagnosed in 76% and 14.7% of the children with immediate and non-immediate hypersensitivity, respectively.
The number of children diagnosed allergic to betalactams decreased with the time elapsed between the reaction and the work-up, probably because the majority of children with severe and worrying reactions were referred for allergological work-up more promptly than the other children. Sex, age, and atopy were not risk factors for betalactam hypersensitivity. In conclusion, we confirm in a large number of children that (i) only a few children with suspected betalactam hypersensitivity are allergic to betalactams; (ii) the likelihood of betalactam allergy increases with the earliness and/or severity of the reactions; (iii) although non-immediate-reading skin tests (intradermal and patch tests) may diagnose non-immediate sensitizations in children with non-immediate reactions to betalactams (especially maculopapular rashes and potentially severe skin reactions), their diagnostic value is far lower than that of immediate-reading skin tests, most non-immediate sensitizations to betalactams being diagnosed by means of challenge tests; (iv) cross-reactivity and/or cosensitizations among betalactams are much more frequent in children reporting immediate and/or anaphylactic reactions than in the other children; (v) age, sex and personal atopy are not significant risk factors for betalactam hypersensitivity; and (vi) the number of children with diagnosed betalactam allergy (especially of the immediate type) decreases with the time elapsed between the reaction and the allergological work-up. Finally, based on our experience, we also propose a practical diagnostic approach for children with suspected betalactam hypersensitivity.
Abstract:
Reconstruction of defects in the craniomaxillofacial (CMF) area has mainly been based on bone grafts or metallic fixing plates and screws. Particularly in the case of large calvarial and/or craniofacial defects caused by trauma, tumours or congenital malformations, there is a need for reliable reconstruction biomaterials, because bone grafts and metallic fixing systems do not completely fulfill the criteria for the best possible reconstruction methods in these complicated cases. In this series of studies, the usability of fibre-reinforced composite (FRC) was studied as a biostable, nonmetallic alternative material for reconstructing artificially created bone defects in the frontal and calvarial areas of rabbits. The experimental part of this work describes the different stages of the product development process, from the first in vitro tests with resin-impregnated fibre-reinforced composites to the in vivo animal studies in which this FRC was tested as an implant material for reconstructing bone defects of different sizes in the rabbit frontal and calvarial areas. In the first in vitro study, the FRC was polymerised in contact with bone or blood in the laboratory. The polymerised FRC samples were then incubated in water, which was analysed for residual monomer content by high performance liquid chromatography (HPLC). It was found that this in vitro polymerisation in contact with bone and blood did not markedly increase the residual monomer leaching from the FRC. In the second in vitro study, different adhesive systems were tested for fixing the implant to the bone surface. This was done to find an alternative implant fixing system to screws and pins. On the basis of this study, it was found that the surface of the calvarial bone needed both mechanical and chemical treatment before the resin-impregnated FRC could be properly fixed onto it.
In three animal studies performed with rabbit frontal bone defect and critical-size calvarial bone defect models, biological responses to the FRC implants were evaluated. On the basis of these evaluations, it can be concluded that the FRC, based on E-glass (electrical glass) fibres forming a porous fibre veil, enables the ingrowth of connective tissues into the inner structures of the material, as well as bone formation and mineralization inside the fibre veil. Bone formation could be enhanced by using bioactive glass granules fixed to the FRC implants. FRC-implanted bone defects healed partly; no total healing of defects was achieved. Biological responses to the resin-impregnated composite implants during the follow-up time, at a maximum of 12 weeks, seemed to depend on the polymerization time of the resin matrix of the FRC. Both of the studied resin systems used in the FRC were photopolymerised, and heat-induced post-polymerisation was additionally applied.
Abstract:
A regularization method based on the non-extensive maximum entropy principle is devised. Special emphasis is given to the q=1/2 case. We show that, when the residual principle is considered as a constraint, the q=1/2 generalized distribution of Tsallis yields a regularized solution for ill-conditioned problems. The regularized distribution so devised is endowed with a component that corresponds to the well-known regularized solution of Tikhonov (1977).
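The abstract relates the q=1/2 Tsallis maximum-entropy solution to classical Tikhonov regularization. As a point of comparison, here is a minimal sketch of standard Tikhonov (ridge) regularization for an ill-conditioned 2×2 system; the matrix, data, and regularization weight are illustrative choices, not taken from the paper:

```python
# Tikhonov (ridge) regularization for an ill-conditioned linear system:
# minimize ||Ax - b||^2 + lam * ||x||^2, solved via (A^T A + lam I) x = A^T b.

def tikhonov_2x2(A, b, lam):
    # Normal-equation matrix M = A^T A + lam * I (explicit 2x2 formulas).
    m11 = A[0][0] ** 2 + A[1][0] ** 2 + lam
    m12 = A[0][0] * A[0][1] + A[1][0] * A[1][1]
    m22 = A[0][1] ** 2 + A[1][1] ** 2 + lam
    # Right-hand side A^T b.
    c1 = A[0][0] * b[0] + A[1][0] * b[1]
    c2 = A[0][1] * b[0] + A[1][1] * b[1]
    # Solve the symmetric 2x2 system by Cramer's rule.
    det = m11 * m22 - m12 * m12
    return [(c1 * m22 - c2 * m12) / det, (m11 * c2 - m12 * c1) / det]

# Nearly singular design matrix (hypothetical example).
A = [[1.0, 1.0], [1.0, 1.0001]]
b = [2.0, 2.0001]          # consistent with the exact solution x = (1, 1)
x = tikhonov_2x2(A, b, lam=1e-6)
print(x)                   # stays close to (1, 1) despite the ill-conditioning
```

The small ridge term barely perturbs the well-conditioned directions but stabilizes the nearly singular one, which is the behaviour the residual principle is used to tune.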
Abstract:
Sobriety checkpoints are not usually randomly located by traffic authorities. As such, information provided by non-random alcohol tests cannot be used to infer the characteristics of the general driving population. In this paper a case study is presented in which the prevalence of alcohol-impaired driving is estimated for the general population of drivers. A stratified probabilistic sample was designed to represent vehicles circulating in non-urban areas of Catalonia (Spain), a region characterized by its complex transportation network and dense traffic around the metropolis of Barcelona. Random breath alcohol concentration tests were performed during spring 2012 on 7,596 drivers. The estimated prevalence of alcohol-impaired drivers was 1.29%, which is roughly a third of the rate obtained in non-random tests. Higher rates were found on weekends (1.90% on Saturdays, 4.29% on Sundays) and especially at night. The rate is higher for men (1.45%) than for women (0.64%), and the percentage of positive outcomes shows an increasing pattern with age. In vehicles with two occupants, the proportion of alcohol-impaired drivers is estimated at 2.62%, but when the driver was alone the rate drops to 0.84%, which might reflect the socialization of drinking habits. The results are compared with outcomes in previous surveys, showing a decreasing trend in the prevalence of alcohol-impaired drivers over time.
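A stratified design like the one described weights each stratum's observed positive rate by that stratum's share of circulating traffic. A minimal sketch of the estimator; the strata, weights, and counts below are hypothetical, not the study's figures:

```python
# Stratified prevalence estimate: weight each stratum's observed positive
# rate by its (known) share of the circulating-vehicle population.
# All strata, weights, and counts below are hypothetical.

strata = [
    # (label, population weight, drivers tested, positive tests)
    ("weekday",       0.70, 4000, 30),
    ("weekend day",   0.20, 2000, 40),
    ("weekend night", 0.10, 1596, 68),
]

# Weights must partition the driving population.
assert abs(sum(w for _, w, _, _ in strata) - 1.0) < 1e-9

prevalence = sum(w * pos / n for _, w, n, pos in strata)
print(f"estimated prevalence: {100 * prevalence:.2f}%")
```

Because checkpoints oversample high-risk strata, the unweighted positive rate would overstate prevalence; the weighting undoes that, which is why the random-sample estimate comes out well below the non-random one.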
Abstract:
In the context of multivariate linear regression (MLR) models, it is well known that commonly employed asymptotic test criteria are seriously biased towards overrejection. In this paper, we propose a general method for constructing exact tests of possibly nonlinear hypotheses on the coefficients of MLR systems. For the case of uniform linear hypotheses, we present exact distributional invariance results concerning several standard test criteria. These include Wilks' likelihood ratio (LR) criterion as well as trace and maximum root criteria. The normality assumption is not necessary for most of the results to hold. Implications for inference are two-fold. First, invariance to nuisance parameters entails that the technique of Monte Carlo tests can be applied on all these statistics to obtain exact tests of uniform linear hypotheses. Second, the invariance property of the latter statistic is exploited to derive general nuisance-parameter-free bounds on the distribution of the LR statistic for arbitrary hypotheses. Even though it may be difficult to compute these bounds analytically, they can easily be simulated, hence yielding exact bounds Monte Carlo tests. Illustrative simulation experiments show that the bounds are sufficiently tight to provide conclusive results with a high probability. Our findings illustrate the value of the bounds as a tool to be used in conjunction with more traditional simulation-based test methods (e.g., the parametric bootstrap) which may be applied when the bounds are not conclusive.
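The Monte Carlo test technique invoked here replaces an intractable null distribution by simulation: with N replications under the null, the p-value (1 + #{simulated ≥ observed}) / (N + 1) is exact for any pivotal statistic. A generic sketch; the statistic and null simulator chosen are illustrative, not the paper's MLR criteria:

```python
import random

def monte_carlo_pvalue(stat, data, simulate_null, n_rep=99, seed=0):
    """Exact Monte Carlo p-value for a pivotal statistic:
    p = (1 + #{simulated stat >= observed stat}) / (n_rep + 1)."""
    rng = random.Random(seed)
    observed = stat(data)
    exceed = sum(1 for _ in range(n_rep)
                 if stat(simulate_null(rng, len(data))) >= observed)
    return (1 + exceed) / (n_rep + 1)

# Illustrative use: testing mean zero (known unit variance) with the
# absolute sample mean as the statistic.
def abs_mean(xs):
    return abs(sum(xs) / len(xs))

def gaussian_null(rng, n):
    return [rng.gauss(0.0, 1.0) for _ in range(n)]

data = [0.3, -0.1, 0.4, 0.2, -0.2, 0.5, 0.1, 0.0]   # hypothetical sample
p = monte_carlo_pvalue(abs_mean, data, gaussian_null)
print(p)
```

With n_rep = 99 the p-value is a multiple of 1/100, and the test has exact level for any level that is a multiple of 1/100 — the same logic the paper applies to the Wilks, trace, and maximum root criteria, whose null distributions are nuisance-parameter-free.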
Abstract:
A wide range of tests for heteroskedasticity have been proposed in the econometric and statistics literature. Although a few exact homoskedasticity tests are available, the commonly employed procedures are quite generally based on asymptotic approximations which may not provide good size control in finite samples. A number of recent studies have sought to improve the reliability of common heteroskedasticity tests using Edgeworth, Bartlett, jackknife and bootstrap methods, yet these remain approximate. In this paper, we describe a solution to the problem of controlling the size of homoskedasticity tests in linear regression contexts. We study procedures based on the standard test statistics [e.g., the Goldfeld-Quandt, Glejser, Bartlett, Cochran, Hartley, Breusch-Pagan-Godfrey, White and Szroeter criteria] as well as tests for autoregressive conditional heteroskedasticity (ARCH-type models). We also suggest several extensions of the existing procedures (sup-type and combined test statistics) to allow for unknown breakpoints in the error variance. We exploit the technique of Monte Carlo tests to obtain provably exact p-values for both the standard and the newly suggested tests. We show that the MC test procedure conveniently solves intractable null distribution problems, in particular those raised by the sup-type and combined test statistics, as well as (when relevant) unidentified-nuisance-parameter problems under the null hypothesis. The method proposed works in exactly the same way with both Gaussian and non-Gaussian disturbance distributions [such as heavy-tailed or stable distributions]. The performance of the procedures is examined by simulation.
The Monte Carlo experiments conducted focus on: (1) ARCH, GARCH, and ARCH-in-mean alternatives; (2) the case where the variance increases monotonically with (i) one exogenous variable or (ii) the mean of the dependent variable; (3) grouped heteroskedasticity; and (4) breaks in variance at unknown points. We find that the proposed tests achieve perfect size control and have good power.
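As one concrete instance of the criteria listed above, the Goldfeld-Quandt statistic compares residual variances across the ordered sample; under homoskedastic Gaussian errors it is invariant to the regression coefficients and the error scale, so its null distribution can be simulated exactly in the Monte Carlo fashion the abstract describes. A sketch under those assumptions; the data-generating process is illustrative:

```python
import random

def ols_residuals(x, y):
    # Simple-regression residuals via closed-form least squares.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    a = my - b * mx
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]

def goldfeld_quandt(x, y, k):
    # Ratio of residual sums of squares: the k largest-x points vs the
    # k smallest-x points, each group fitted by its own regression.
    idx = sorted(range(len(x)), key=lambda i: x[i])
    lo, hi = idx[:k], idx[-k:]
    ssr = lambda ids: sum(r * r for r in
                          ols_residuals([x[i] for i in ids],
                                        [y[i] for i in ids]))
    return ssr(hi) / ssr(lo)

rng = random.Random(1)
x = [float(i) for i in range(30)]
# Hypothetical data whose error s.d. grows with x (heteroskedastic).
y = [1.0 + 0.5 * xi + rng.gauss(0.0, 1.0 + xi / 10.0) for xi in x]
obs = goldfeld_quandt(x, y, k=12)

# Monte Carlo p-value: under the homoskedastic Gaussian null the statistic
# depends only on the errors, so simulating raw N(0,1) errors in place of y
# reproduces its exact null distribution.
sims = [goldfeld_quandt(x, [rng.gauss(0.0, 1.0) for _ in x], 12)
        for _ in range(99)]
p = (1 + sum(s >= obs for s in sims)) / 100
print(obs, p)
```

The same simulation scheme extends directly to the sup-type and combined statistics, whose null distributions have no closed form.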
Abstract:
Time series models with conditionally heteroskedastic variances have become almost indispensable for modelling time series in the context of financial data. In many applications, checking for the existence of a relationship between two time series is an important issue. In this thesis, we generalize in several directions, and in a multivariate framework, the procedure developed by Cheung and Ng (1996) for examining causality in variance in the case of two univariate series. Building on the work of El Himdi and Roy (1997) and Duchesne (2004), we propose a test based on the cross-correlation matrices of the squared standardized residuals and of the cross-products of these residuals. Under the null hypothesis of no causality in variance, we establish that the test statistics converge in distribution to chi-square random variables. In a second approach, we define, as in Ling and Li (1997), a transformation of the residuals for each vector residual series. The test statistics are built from the cross-correlations of these transformed residuals. In both approaches, test statistics for individual lags are proposed, as well as portmanteau-type tests. This methodology is also used to determine the direction of the causality in variance. Simulation results show that the proposed tests have satisfactory empirical properties. An application with real data is also presented to illustrate the methods.
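For the univariate version of the Cheung-Ng procedure that this thesis generalizes, one examines cross-correlations of squared standardized residuals: n times the sum of the first M squared cross-correlations is compared to a chi-square distribution with M degrees of freedom. A minimal univariate sketch; the residual series are simulated placeholders, not fitted model output:

```python
import random

def cross_corr(u, v, lag):
    # Sample cross-correlation between u_t and v_{t-lag} (lag >= 0).
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return sum((u[t] - mu) * (v[t - lag] - mv)
               for t in range(lag, n)) / (su * sv)

def cheung_ng_stat(e1, e2, max_lag):
    # Portmanteau statistic on squared residuals:
    # S = n * sum_{k=1..M} r_k^2, approximately chi-square(M) under the
    # null of no causality in variance from series 2 to series 1.
    u = [a * a for a in e1]
    v = [b * b for b in e2]
    n = len(u)
    return n * sum(cross_corr(u, v, k) ** 2 for k in range(1, max_lag + 1))

rng = random.Random(7)
e1 = [rng.gauss(0.0, 1.0) for _ in range(500)]   # placeholder residuals
e2 = [rng.gauss(0.0, 1.0) for _ in range(500)]
S = cheung_ng_stat(e1, e2, max_lag=5)
print(S)   # under the null, roughly chi-square with 5 degrees of freedom
```

The thesis's multivariate extension replaces the scalar correlations with cross-correlation matrices of the vector residual series, but the portmanteau construction is the same.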
Abstract:
In this work, (outlier-)robust estimators and tests for the unknown parameter of a continuous density function are developed using the likelihood depth introduced by Mizera and Müller (2004). The methods developed are then applied to three different distributions. For one-dimensional parameters, the likelihood depth of a parameter in a data set is computed as the minimum of the fraction of the data for which the derivative of the log-likelihood function with respect to the parameter is non-negative and the fraction for which this derivative is non-positive. The parameter for which both fractions are equal therefore has the greatest depth. This parameter is initially chosen as the estimator, since the likelihood depth is intended to measure how well a parameter fits the data set. Asymptotically, the parameter with the greatest depth is the one for which the probability that the derivative of the log-likelihood function with respect to the parameter is non-negative for an observation equals one half. If this is not the case for the underlying parameter, the estimator based on the likelihood depth is biased. This work shows how this bias can be corrected so that the corrected estimators are consistent. To develop tests for the parameter, the simplex likelihood depth developed by Müller (2005), which is a U-statistic, is used. It turns out that, for the same distributions for which the likelihood depth yields biased estimators, the simplex likelihood depth is an unbiased U-statistic. In particular, its asymptotic distribution is therefore known, and tests for various hypotheses can be formulated. However, the shift in the depth leads to poor power of the corresponding test for some hypotheses. Corrected tests are therefore introduced, together with conditions under which they are consistent. The work consists of two parts.
The first part presents the general theory of the estimators and tests and proves their consistency. In the second part, the theory is applied to three different distributions: the Weibull distribution and the Gaussian and Gumbel copulas. This demonstrates how the methods of the first part can be used to derive (robust) consistent estimators and tests for the unknown parameter of a distribution. Overall, it is shown that robust estimators and tests can be found for the three distributions using likelihood depths. On uncontaminated data, existing standard methods are sometimes superior, but the advantage of the new methods shows in contaminated data and data with outliers.
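As a concrete one-dimensional illustration of the recipe above (the exponential distribution is our choice, not one of the thesis's three cases): for an Exp(λ) observation x the score is ∂/∂λ log(λ e^{-λx}) = 1/λ - x, so the depth of λ is min(#{x_i ≤ 1/λ}, #{x_i ≥ 1/λ})/n, maximized where 1/λ equals the sample median. Since P(X ≤ 1/λ) = 1 - e^{-1} ≠ 1/2, the max-depth estimator is biased exactly as the abstract describes, and because the median of Exp(λ) is ln 2 / λ, multiplying by ln 2 corrects it:

```python
import math
import random

def likelihood_depth_exp(lam, xs):
    # Depth of rate lam: the minimum of the data fractions with
    # non-negative and non-positive score 1/lam - x.
    n = len(xs)
    nonneg = sum(1 for x in xs if x <= 1.0 / lam) / n
    nonpos = sum(1 for x in xs if x >= 1.0 / lam) / n
    return min(nonneg, nonpos)

rng = random.Random(42)
true_lam = 2.0
xs = [rng.expovariate(true_lam) for _ in range(1000)]

# Depth is maximized where 1/lam sits at the sample median.
xs_sorted = sorted(xs)
median = 0.5 * (xs_sorted[499] + xs_sorted[500])
max_depth_est = 1.0 / median            # biased: converges to lam / ln 2
corrected_est = math.log(2.0) / median  # bias-corrected, consistent

print(max_depth_est, corrected_est)
```

The correction factor here is distribution-specific; the thesis derives the analogous corrections for the Weibull distribution and the two copula families. The median-based form also makes the outlier robustness visible: a few arbitrarily large observations barely move the estimate.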
Abstract:
Abstract taken from the publication.
Abstract:
Abstract taken from the publication.
Abstract:
The performance of a model-based diagnosis system can be affected by several uncertainty sources, such as model errors, uncertainty in measurements, and disturbances. This uncertainty can be handled by means of interval models. The aim of this thesis is to propose a methodology for fault detection, isolation and identification based on interval models. The methodology includes algorithms to obtain, in an automatic way, the symbolic expressions of the residual generators enhancing the structural isolability of the faults, in order to design the fault detection tests. These algorithms are based on the structural model of the system. The stages of fault detection, isolation, and identification are stated as constraint satisfaction problems in continuous domains and solved by means of interval-based consistency techniques. The qualitative fault isolation is enhanced by a reasoning in which the signs of the symptoms are derived from analytical redundancy relations or bond graph models of the system. An initial, empirical analysis of the differences between interval-based and statistical-based techniques is also presented in this thesis. The performance and efficiency of the contributions are illustrated through several application examples covering different levels of complexity.
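The detection stage described here amounts to a consistency check: a measurement is declared faulty only when it falls outside the output interval that the interval model predicts. A minimal sketch with a hypothetical static model y = a·u, a ∈ [0.9, 1.1]; the model structure and bounds are illustrative, not from the thesis:

```python
def predicted_interval(u, a_lo=0.9, a_hi=1.1):
    # Interval model y = a * u with uncertain parameter a in [a_lo, a_hi]:
    # the predicted output is the interval spanned by the parameter bounds.
    lo, hi = a_lo * u, a_hi * u
    return (min(lo, hi), max(lo, hi))

def fault_detected(u, y_measured):
    # Interval-based residual test: flag a fault only when the measurement
    # is inconsistent with every model in the interval family, which keeps
    # the false-alarm rate low under bounded model uncertainty.
    lo, hi = predicted_interval(u)
    return not (lo <= y_measured <= hi)

print(fault_detected(5.0, 5.2))   # False: 5.2 lies inside [4.5, 5.5]
print(fault_detected(5.0, 6.0))   # True: 6.0 is outside the interval
```

The interval-based consistency techniques in the thesis generalize this idea to dynamic models and to sets of residual generators, where pruning inconsistent value combinations is posed as a constraint satisfaction problem over continuous domains.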
Abstract:
The application of polymer-matrix composite materials reinforced with long fibres (FRP, Fiber Reinforced Plastic) is growing steadily owing to their good specific properties and design flexibility. One of the largest consumers is the aerospace industry, since the application of these materials brings clear economic and environmental benefits. When composite materials are used in structural components, a design programme is initiated that combines physical tests and analysis techniques. The development of reliable analysis tools that make it possible to understand the mechanical behaviour of the structure, as well as to replace many, though not all, of the physical tests, is of clear interest. Susceptibility to damage from out-of-plane impact loads is one of the most important aspects considered during the design process of composite structures. The lack of knowledge of the effects of impact on these structures is a factor limiting the use of these materials. Therefore, the development of virtual mechanical test models to analyse the impact resistance of a structure is of great interest, and even more so the prediction of the residual strength after impact. In this regard, the present work covers a wide range of analyses of low-velocity impact events on monolithic, flat, rectangular laminated composite plates with conventional stacking sequences. Given that the main objective of the present work is the prediction of the residual compressive strength, different tasks are carried out to support an adequate analysis of the problem.
The topics developed are: the analytical description of the impact, the design and execution of an experimental test plan, the formulation and implementation of constitutive models to describe the material behaviour, and the development of virtual tests based on finite element models in which the implemented constitutive models are used.
Abstract:
A pair of primers targeting the 16S-23S rDNA interspacer (ITS) of Brucella genetic sequences was designed in order to develop a polymerase chain reaction (PCR) putatively capable of amplifying DNA from any Brucella species. Nucleic acid extracts from whole blood from naive dogs were spiked with decreasing amounts of Brucella canis RM6/66 DNA and the resulting solutions were tested by PCR. In addition, the ability of the PCR to amplify Brucella spp. genetic sequences from naturally infected dogs was evaluated using 210 whole-blood samples of dogs from 19 kennels. The whole-blood samples collected were subjected to blood culture and PCR. Serodiagnosis was performed using the rapid slide agglutination test with and without 2-mercaptoethanol. The DNA from whole blood was extracted using proteinase K, sodium dodecyl sulphate and cetyl trimethyl ammonium bromide, followed by phenol-chloroform purification. The PCR was capable of detecting as little as 3.8 fg of Brucella DNA mixed with 450 ng of host DNA. Theoretically, 3.8 fg of Brucella DNA represents the total genomic mass of fewer than two bacterial cells. The PCR diagnostic sensitivity and specificity were 100%. From the results observed in the present study, we conclude that PCR could be used as a confirmatory test for the diagnosis of B. canis infection.
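The claim that 3.8 fg is under two genome equivalents can be checked with back-of-the-envelope arithmetic, assuming a Brucella genome of roughly 3.3 Mb and an average mass of about 650 g/mol per double-stranded base pair; both figures are our assumptions for illustration, not values stated in the paper:

```python
# Back-of-the-envelope check: how many Brucella genome copies is 3.8 fg?
AVOGADRO = 6.022e23          # base pairs per mole
BP_MASS_G_PER_MOL = 650.0    # approx. g/mol per double-stranded base pair
GENOME_BP = 3.3e6            # approx. Brucella genome size (assumption)

genome_mass_g = GENOME_BP * BP_MASS_G_PER_MOL / AVOGADRO
copies = 3.8e-15 / genome_mass_g
print(f"one genome ~ {genome_mass_g * 1e15:.2f} fg -> {copies:.2f} copies")
```

Under these assumptions one genome weighs about 3.6 fg, so 3.8 fg corresponds to just over one genome copy, consistent with the "fewer than two bacterial cells" statement.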
Abstract:
The purpose of this study was to evaluate the residual antibacterial activity of several calcium hydroxide [Ca(OH)2]-based pastes placed in root canals of dogs' teeth with induced chronic periapical lesions. Root canals were instrumented with the ProFile rotary system and filled with 4 pastes: G1 (n=16): Ca(OH)2 paste + anesthetic solution; G2 (n=20): Calen® paste + camphorated p-monochlorophenol (CMCP); G3 (n=18): Calen®; and G4 (n=18): Ca(OH)2 paste + 2% chlorhexidine digluconate. After 21 days, the pastes were removed with size 60 K-files and placed on Petri plates with agar inoculated with Micrococcus luteus ATCC 9341. Pastes that were not placed into root canals served as controls. After pre-diffusion, incubation and optimization, the inhibition zones of bacterial growth were measured and analyzed by the Mann-Whitney U test at a 5% significance level. All pastes showed residual antibacterial activity. The control samples had larger halos (p<0.05). The mean residual antibacterial activity halos in G1, G2, G3 and G4 were 7.6, 10.4, 17.7 and 21.4 mm, respectively. The inhibition zones of G4 were significantly larger than those of G1 and G2 (p<0.05). In conclusion, regardless of the vehicle and antiseptic, all Ca(OH)2-based pastes showed different degrees of measurable residual antibacterial activity. Furthermore, unlike CMCP, chlorhexidine significantly increased the antibacterial activity of Ca(OH)2.
Abstract:
The presence of residual endodontic sealer in the pulp chamber may cause discoloration of the dental crown and interfere with the adhesion of restorative materials. The aim of this study was to compare the efficacy of different solvents in removing residues of an epoxy resin-based sealer (AH Plus) from the dentin walls of the pulp chamber, by scanning electron microscopy (SEM). Forty-four bovine incisor dental crown fragments were treated with 17% EDTA and 2.5% NaOCl. Specimens received a coating of AH Plus and were left undisturbed for 5 min. Then, specimens were divided into four groups (n = 10) and cleaned with one of the following solutions: isopropyl alcohol, 95% ethanol, acetone solution, or amyl acetate solution. Negative controls (n = 2) did not receive AH Plus, while in positive controls (n = 2) the sealer was not removed. AH Plus removal was evaluated by SEM, and a score system was applied. Data were analyzed by the Kruskal-Wallis and Dunn tests. None of the solutions tested was able to completely remove AH Plus from the dentin of the pulp chamber. Amyl acetate performed better than 95% ethanol and isopropyl alcohol (p < 0.05), but not better than acetone (p > 0.05), in removing the sealer from dentin. No significant differences were observed between acetone, 95% ethanol, and isopropyl alcohol (p > 0.05). It was concluded that amyl acetate and acetone may be good options for cleaning the pulp chamber after obturation with AH Plus. SCANNING 35:17-21, 2013. © 2012 Wiley Periodicals, Inc.