838 results for cut-off value


Relevance:

100.00%

Publisher:

Abstract:

There has been growing recognition of a changing clinical presentation of celiac disease (CD), with the manifestation of milder symptoms. Serologic testing is widely used to screen patients with suspected CD and populations at risk. The aim of this retrospective analysis was to evaluate the clinical presentation of CD in childhood, assess the diagnostic value of serologic tests, and investigate the impact of IgA deficiency on diagnostic accuracy. We evaluated 206 consecutive children with suspected CD on the basis of clinical symptoms and positive serology results. Ninety-four (46%) had biopsy-proven CD. The median age at diagnosis of CD was 6.8 years; 15% of the children were <2 years of age. There was a higher incidence of CD in girls (p = 0.003). Iron deficiency and intestinal complaints were more frequent in children with CD than those without CD (61% vs. 33%, p = 0.0001 and 71% vs. 55%, p = 0.02, respectively), while failure to thrive was less common (35% vs. 53%, p = 0.02). The sensitivity of IgA tissue transglutaminase (IgA-tTG) was 0.98 when including all children and 1.00 after excluding children with selective IgA deficiency. The specificity of IgA-tTG was 0.73 using the recommended cut-off value of 20 IU, and this improved to 0.94 when using a higher cut-off value of 100 IU. All children with CD and relative IgA deficiency (IgA levels that are measurable but below the age reference [n = 8]) had elevated IgA-tTG. In conclusion, CD is frequently diagnosed in school-age children with relatively mild symptoms. The absence of intestinal symptoms does not preclude the diagnosis of CD; many children with CD do not report intestinal symptoms. While the sensitivity of IgA-tTG is excellent, its specificity is insufficient for the diagnostic confirmation of a disease requiring life-long dietary restrictions. Children with negative IgA-tTG and decreased but measurable IgA values are unlikely to have CD.


In the commentary by Zander et al., the authors appear concerned about the methods and results of our, at that time, unpublished sepsis trial evaluating hydroxyethyl starch (HES) and insulin therapy. Unfortunately, the authors' concerns are based on false assumptions about the design, conduct and modes of action of the compounds under investigation. For instance, in our study the HES solution was not used for maintenance of daily fluid requirements, so the assumption of the authors that this colloid was used "exclusively" is wrong. Moreover, the manufacturer of Hemohes, the HES product we used, gives no cut-off value for creatinine, thus the assumption that this cut-off value was "doubled" in our study is also incorrect. Other claims by the authors, such as that lactated solutions cause elevated lactate levels and iatrogenic hyperglycemia and increase O2 consumption, are unfounded. There is no randomized controlled trial supporting such a claim; it is consistent neither with our study data nor with any credible published sepsis guidelines, nor with routine practice worldwide. We fully support open scientific debate. Our study methods and results have now been published after a strict peer-reviewing process, and these data are now open to critical and constructive review. However, in our opinion this premature action, based on wrong assumptions and containing comments by representatives of pharmaceutical companies, does not contribute to a serious, unbiased scientific discourse.


BACKGROUND: T cells play a key role in delayed-type drug hypersensitivity reactions. Their reactivity can be assessed by their proliferation in response to the drug in the lymphocyte transformation test (LTT). However, the LTT imposes limitations in terms of practicability, and an alternative method that is easier to implement would be desirable. METHODS: Four months to 12 years after acute drug hypersensitivity reactions, CD69 upregulation on T cells of 15 patients and five healthy controls was analyzed by flow cytometry. RESULTS: All 15 LTT-positive patients showed a significant increase of CD69 expression on T cells after 48 h of drug stimulation, exclusively with the drugs incriminated in the drug hypersensitivities. A stimulation index of 2 as the cut-off value allowed discrimination between nonreactive and reactive T cells in both the LTT and CD69 upregulation. Between 0.5% and 3% of T cells showed CD69 upregulation. The reactive cell population consisted of a minority of truly drug-reactive T cells secreting cytokines and a larger number of bystander T cells activated by IL-2 and possibly other cytokines. CONCLUSIONS: CD69 upregulation was observed after 2 days in all patients with a positive LTT after 6 days, and thus appears to be a promising tool to identify drug-reactive T cells in the peripheral blood of patients with drug hypersensitivity reactions.


BACKGROUND Peak levels of troponin T (TnT) reliably predict morbidity and mortality after cardiac surgery. However, the therapeutic window to manage CABG-related in-hospital complications may close before the peak is reached. We investigated whether early TnT levels correlate equally well with complications after coronary artery bypass grafting (CABG) surgery. METHODS A 12-month consecutive series of patients undergoing elective isolated CABG procedures (mini-extracorporeal circuit, cardioplegic arrest) was analyzed. Logistic regression modeling was used to investigate whether TnT levels 6 to 8 hours after surgery were independently associated with in-hospital complications (post-operative myocardial infarction, stroke, new-onset renal insufficiency, intensive care unit (ICU) readmission, prolonged ICU stay (>48 hours), prolonged need for vasopressors (>24 hours), resuscitation or death). RESULTS A total of 290 patients, including 36 patients with complications, were analyzed. Early TnT levels (odds ratio (OR): 6.8, 95% confidence interval (CI): 2.2-21.4, P=.001), logistic EuroSCORE (OR: 1.2, 95% CI: 1.0-1.3, P=.007) and the need for vasopressors during the first 6 postoperative hours (OR: 2.7, 95% CI: 1.0-7.1, P=.05) were independently associated with the risk of complications. With consideration of vasopressor use during the first 6 postoperative hours, the sum of specificity (0.958) and sensitivity (0.417) of TnT for subsequent complications was highest at a TnT cut-off value of 0.8 ng/mL. CONCLUSION Early TnT levels may be useful to guide ICU management of CABG patients. They predict clinically relevant complications within a potential therapeutic window, particularly in patients requiring vasopressors during the first postoperative hours, although with only moderate sensitivity.
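The cut-off in the abstract above was chosen by maximizing the sum of sensitivity and specificity (Youden's J statistic). A minimal sketch of such a threshold scan, using made-up biomarker values rather than the study's data:

```python
# Hypothetical illustration: choose the cut-off that maximizes
# sensitivity + specificity (Youden's J), as in the TnT analysis above.
def best_cutoff(values, labels):
    """values: biomarker levels; labels: 1 = complication, 0 = none."""
    best = (None, -1.0)
    for c in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= c and y == 1)
        fn = sum(1 for v, y in zip(values, labels) if v < c and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < c and y == 0)
        fp = sum(1 for v, y in zip(values, labels) if v >= c and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        if sens + spec > best[1]:
            best = (c, sens + spec)
    return best  # (cut-off, sensitivity + specificity)
```

For instance, `best_cutoff([0.1, 0.2, 0.9, 1.0], [0, 0, 1, 1])` returns `(0.9, 2.0)`, the perfectly separating threshold. In practice the same scan underlies several of the cut-off choices reported in these abstracts.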


BACKGROUND In 2006, bluetongue virus serotype 8 (BTV-8) was detected for the first time in central Europe. Measures to control the infection in livestock were implemented in Switzerland, but the question was raised whether free-ranging wildlife could be a maintenance host for BTV-8. Furthermore, Toggenburg orbivirus (TOV), considered a potential 25th BTV serotype, was detected in 2007 in domestic goats in Switzerland, and wild ruminants were considered a potential source of infection. To assess the prevalence of BTV-8 and TOV infections in wildlife, we conducted a serological and virological survey in red deer, roe deer, Alpine chamois and Alpine ibex between 2009 and 2011. Because samples originating from wildlife carcasses are often of poor quality, we also documented the influence of hemolysis on test results and evaluated the usefulness of confirmatory tests. RESULTS Ten out of 1,898 animals (0.5%, 95% confidence interval 0.3-1.0%) had detectable antibodies against BTV-8, and BTV-8 RNA was found in two chamois and one roe deer (0.3%, 0.1-0.8%). Seroprevalence was highest among red deer, and the majority of positive wild animals were sampled close to areas where outbreaks had been reported in livestock. Most samples were hemolytic, and the range of optical density percentage values obtained in the screening test increased with increasing hemolysis. Confirmatory tests significantly increased the specificity of the testing procedure and proved applicable even to poor quality samples. Nearly all samples confirmed as positive had an optical density percentage value greater than 50% in the ELISA screening. CONCLUSIONS The prevalence of BTV-8 infection was low, and none of the tested animals were positive for TOV. Currently, wild ruminants are apparently not a reservoir for these viruses in Switzerland. However, we report BTV-8 RNA in Alpine chamois for the first time.
This animal was found at high altitude and far from a domestic outbreak, which suggests that the virus could spread into/through the Alps. Regarding testing procedures, hemolysis did not significantly affect test results but confirmatory tests proved to be necessary to obtain reliable prevalence estimates. The cut-off value recommended by the manufacturer for the screening test was applicable for wildlife samples.


Infrared thermography (IRT) was used to detect digital dermatitis (DD) prior to routine claw trimming. A total of 1192 IRT observations were collected from 149 cows on eight farms. All cows were housed in tie-stalls. The maximal surface temperatures of the coronary band (CB) region and skin (S) of the fore and rear feet (mean value of the maximal surface temperatures of both digits for each foot separately, CBmax and Smax) were assessed. Grouping was performed at the foot level (presence of DD, n=99; absence, n=304) or at the cow level (all four feet healthy, n=24, or at least one DD lesion on the rear feet, n=37). For individual cows (n=61), the IRT temperature difference was determined by subtracting the mean sum of CBmax and Smax of the rear feet from that of the fore feet. Feet with DD had higher CBmax and Smax (P<0.001) than healthy feet. Smax was significantly higher in feet with infectious DD lesions (M-stage: M2+M4; n=15) than in those with non-infectious M-lesions (M1+M3; n=84) (P=0.03), but this was not the case for CBmax (P=0.12). At the cow level, an optimal cut-off value for detecting DD of 0.99°C (IRT temperature difference between rear and front feet) yielded a sensitivity of 89.1% and a specificity of 66.6%. The results indicate that IRT may be a useful non-invasive diagnostic tool to screen for the presence of DD in dairy cows by measuring CBmax and Smax.


PURPOSE Blood loss and blood substitution are associated with higher morbidity after major abdominal surgery. During major liver resection, low local venous pressure has been shown to reduce blood loss. Ambiguity persists concerning the impact of local venous pressure on blood loss during open radical cystectomy. We aimed to determine the association between intraoperative blood loss and pelvic venous pressure (PVP) and to identify factors affecting PVP. MATERIAL AND METHODS In the frame of a single-center, double-blind, randomized trial, PVP was measured in 82 patients from a norepinephrine/low-volume group and in 81 from a control group with liberal hydration. For this secondary analysis, patients from each arm were stratified into subgroups with PVP <5 mmHg or ≥5 mmHg measured after cystectomy (the optimal cut-off value for discrimination of patients with relevant blood loss according to Youden's index). RESULTS Median blood loss was 800 ml [range: 300-1600] in 55/163 patients (34%) with PVP <5 mmHg and 1200 ml [400-3000] in 108/163 patients (66%) with PVP ≥5 mmHg (P<0.0001). A PVP <5 mmHg was measured in 42/82 patients (51%) in the norepinephrine/low-volume group and in 13/81 (16%) in the control group (P<0.0001). PVP dropped significantly after removal of abdominal packing and abdominal lifting in both groups at all time points (at the beginning and end of pelvic lymph node dissection and at the end of cystectomy) (P<0.0001). No correlation between PVP and central venous pressure could be detected. CONCLUSIONS Blood loss was significantly reduced in patients with low PVP. Factors affecting PVP were fluid management and abdominal packing.


Immunoglobulin A (IgA) serves as the basis of the secretory immune system by protecting the lining of mucosal sites from pathogens. In both humans and dogs, IgA deficiency (IgAD) is associated with recurrent infections of mucosal sites and immune-mediated diseases. Low concentrations of serum IgA have previously been reported to occur in a number of dog breeds but no generally accepted cut-off value has been established for canine IgAD. The current study represents the largest screening to date of IgA in dogs in terms of both number of dogs (n = 1267) and number of breeds studied (n = 22). Serum IgA concentrations were quantified by using capture ELISA and were found to vary widely between breeds. We also found IgA to be positively correlated with age (p < 0.0001). Apart from the two breeds previously reported as predisposed to low IgA (Shar-Pei and German shepherd), we identified six additional breeds in which ≥10% of all tested dogs had very low (<0.07 g/l) IgA concentrations (Hovawart, Norwegian elkhound, Nova Scotia duck tolling retriever, Bullterrier, Golden retriever and Labrador retriever). In addition, we discovered low IgA concentrations to be significantly associated with canine atopic dermatitis (CAD, p < 0.0001) and pancreatic acinar atrophy (PAA, p = 0.04) in German shepherds.


Instruments for on-farm determination of colostrum quality, such as refractometers and densimeters, are increasingly used on dairy farms. The colour of colostrum is also thought to reflect its quality: a paler, mature-milk-like colour is associated with a lower colostrum value in terms of general composition compared with a more yellowish and darker colour. The objective of this study was to investigate the relationships between colour measurement of colostrum using the CIELAB colour space (CIE L*=from white to black, a*=from red to green, b*=from yellow to blue, chroma value G=visually perceived colourfulness) and its composition. Dairy cow colostrum samples (n=117) obtained at 4·7±1·5 h after parturition were analysed for immunoglobulin G (IgG) by ELISA and for fat, protein and lactose by infrared spectroscopy. For colour measurements, a calibrated spectrophotometer was used. At a cut-off value of 50 mg IgG/ml, colour measurement had a sensitivity of 50·0%, a specificity of 49·5%, and a negative predictive value of 87·9%. Colostral IgG concentration was not correlated with the chroma value G, but was correlated with relative lightness L*. While milk fat content showed a relationship with the parameters L*, a*, b* and G from the colour measurement, milk protein content was not correlated with a*, but was with L*, b* and G. Lactose concentration in colostrum showed a relationship only with b* and G. In conclusion, parameters of the colour measurement showed clear relationships with colostral IgG, fat, protein and lactose concentrations in dairy cows. Implementation of colour-measuring devices in automatic milking systems and milking parlours might offer a means to assess colostrum quality as well as to detect abnormal milk.


BACKGROUND Intravenous anaesthetic drugs are the primary means of producing general anaesthesia in equine practice. The ideal drug for intravenous anaesthesia has high reliability and pharmacokinetic properties indicating short elimination and lack of accumulation when administered for prolonged periods. Induction of general anaesthesia with racemic ketamine preceded by profound sedation already has an established place in equine field anaesthesia. Because of potential advantages over racemic ketamine, S-ketamine has been employed in horses to induce general anaesthesia, but its optimal dose remains under investigation. The objective of this study was to evaluate whether 2.5 mg/kg S-ketamine could be used as a single intravenous bolus to provide short-term surgical anaesthesia in colts undergoing surgical castration, and to report its pharmacokinetic profile. RESULTS After premedication with romifidine and L-methadone, the combination of S-ketamine and diazepam achieved surgical anaesthesia in all 28 colts. Induction of anaesthesia as well as recovery were good to excellent in the majority of colts (n = 22 and 24, respectively). Seven horses required additional administration of S-ketamine to prolong the duration of surgical anaesthesia. Redosing did not compromise recovery quality. The plasma concentration of S-ketamine decreased rapidly after administration, following a two-compartmental model, suggesting considerable unchanged elimination of the parent compound into the urine alongside its conversion to S-norketamine. The observed plasma concentrations of S-ketamine at the time of first movement varied widely and did not support the definition of a clear cut-off value to predict the termination of the drug effect.
CONCLUSIONS The administration of 2.5 mg/kg IV S-ketamine after adequate premedication provided good quality of induction and recovery and a duration of action similar to what has been reported for racemic ketamine at a dose of 2.2 mg/kg. Until further investigations are available, close monitoring to adapt drug delivery is mandatory, particularly once the first 10 minutes after injection have elapsed. Taking into account the rapid elimination of S-ketamine, the significant inter-individual variability and the rapid loss of effect over a narrow range of concentrations, a sudden return of consciousness has to be foreseen.


The most commonly used method for formally assessing grapheme-colour synaesthesia (i.e., experiencing colours in response to letter and/or number stimuli) involves selecting colours from a large colour palette on several occasions and measuring the consistency of the colours selected. However, the ability to diagnose synaesthesia using this method depends on several factors that have not been directly contrasted. These include the type of colour space used (e.g., RGB, HSV, CIELUV, CIELAB) and different measures of consistency (e.g., city-block and Euclidean distance in colour space). This study aims to find the most reliable way of diagnosing grapheme-colour synaesthesia by maximising sensitivity (i.e., the ability of a test to identify true synaesthetes) and specificity (i.e., the ability of a test to identify true non-synaesthetes). We show, by applying ROC (receiver operating characteristic) analysis to binary classification of a large sample of self-declared synaesthetes and non-synaesthetes, that the consistency criterion (i.e., cut-off value) for diagnosing synaesthesia is considerably higher than the current standard in the field. We also show that methods based on the perceptual CIELUV and CIELAB colour models (rather than RGB and HSV colour representations) and Euclidean distances offer even greater sensitivity and specificity than most currently used measures. Together, these findings offer improved heuristics for the behavioural assessment of grapheme-colour synaesthesia.
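The consistency measure contrasted above can be sketched as the sum of pairwise distances between the colours a participant picks for the same grapheme across repeated trials; a low total indicates a consistent, synaesthete-like response, and the diagnostic cut-off is then applied to this score. A minimal illustration (the coordinates below are hypothetical CIELAB triples, not the study's data):

```python
import math

# Sketch of a Euclidean-distance consistency score in a 3D colour
# space such as CIELAB (L*, a*, b*): sum of pairwise distances
# between the colours chosen for one grapheme on repeated trials.
def consistency_score(picks):
    """picks: list of colour tuples for one grapheme; lower = more consistent."""
    total = 0.0
    for i in range(len(picks)):
        for j in range(i + 1, len(picks)):
            total += math.dist(picks[i], picks[j])
    return total
```

Identical picks give a score of 0; a candidate is classed as a synaesthete when the score falls below the chosen cut-off. City-block distance would simply replace `math.dist` with a coordinate-wise absolute difference sum.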


Random Forests™ is reported to be one of the most accurate classification algorithms for complex data analysis. It shows excellent performance even when most predictors are noisy and the number of variables is much larger than the number of observations. In this thesis, Random Forests was applied to a large-scale lung cancer case-control study. A novel way of automatically selecting prognostic factors was proposed, and a synthetic positive control was used to validate the Random Forests method. Throughout this study we showed that Random Forests can deal with a large number of weak input variables without overfitting and can account for non-additive interactions between these input variables. Random Forests can also be used for variable selection without being adversely affected by collinearities. Random Forests can handle large-scale data sets without rigorous data preprocessing and has a robust variable importance ranking measure. We propose a novel variable selection method in the context of Random Forests that uses the data noise level as the cut-off value to determine the subset of important predictors. This new approach enhanced the ability of the Random Forests algorithm to automatically identify important predictors in complex data. The cut-off value can also be adjusted based on the results of the synthetic positive control experiments. When the data set had a high variables-to-observations ratio, Random Forests complemented the established logistic regression. This study suggests that Random Forests is recommended for such high-dimensionality data: one can use Random Forests to select the important variables and then use logistic regression or Random Forests itself to estimate the effect size of the predictors and to classify new observations. We also found that the mean decrease in accuracy is a more reliable variable ranking measure than the mean decrease in Gini.


In population studies, most current methods focus on identifying one outcome-related SNP at a time by testing for differences of genotype frequencies between disease and healthy groups or among different population groups. However, testing a great number of SNPs simultaneously poses a multiple-testing problem and will give false-positive results. Although this problem can be dealt with effectively through several approaches, such as Bonferroni correction, permutation testing and false discovery rates, patterns of joint effects by several genes, each with a weak effect, might not be detectable. With the availability of high-throughput genotyping technology, searching for multiple scattered SNPs over the whole genome and modeling their joint effect on the target variable has become possible. Exhaustive search of all SNP subsets is computationally infeasible for millions of SNPs in a genome-wide study. Several effective feature selection methods combined with classification functions have been proposed to search for an optimal SNP subset in big data sets where the number of feature SNPs far exceeds the number of observations.

In this study, we took two steps to achieve this goal. First, we selected 1000 SNPs through an effective filter method; then we performed a feature selection wrapped around a classifier to identify an optimal SNP subset for predicting disease. We also developed a novel classification method, the sequential information bottleneck method, wrapped inside different search algorithms to identify an optimal subset of SNPs for classifying the outcome variable. This new method was compared with classical linear discriminant analysis in terms of classification performance. Finally, we performed a chi-square test to examine the relationship between each SNP and disease from another point of view.

In general, our results show that filtering features using the harmonic mean of sensitivity and specificity (HMSS) through linear discriminant analysis (LDA) is better than using LDA training accuracy or mutual information in our study. Our results also demonstrate that exhaustive search of a small subset (one SNP, two SNPs, or a 3-SNP subset based on the best 100 composite 2-SNPs) can find an optimal subset, and that further inclusion of more SNPs through a heuristic algorithm does not always increase the performance of SNP subsets. Although sequential forward floating selection can be applied to prevent the nesting effect of forward selection, it does not always outperform the latter, owing to overfitting from observing more complex subset states.

Our results also indicate that HMSS, as a criterion to evaluate the classification ability of a function, can be used on imbalanced data without modifying the original dataset, in contrast to classification accuracy. Our four studies suggest that the sequential information bottleneck (sIB), a new unsupervised technique, can be adopted to predict the outcome, and its ability to detect the target status is superior to that of traditional LDA in this study.

From our results, the best test HMSS for predicting CVD, stroke, CAD and psoriasis through sIB is 0.59406, 0.641815, 0.645315 and 0.678658, respectively. In terms of group prediction accuracy, the highest test accuracy of sIB for diagnosing a normal status among controls can reach 0.708999, 0.863216, 0.639918 and 0.850275, respectively, in the four studies if the test accuracy among cases is required to be at least 0.4. On the other hand, the highest test accuracy of sIB for diagnosing disease among cases can reach 0.748644, 0.789916, 0.705701 and 0.749436, respectively, in the four studies if the test accuracy among controls is required to be at least 0.4.

A further genome-wide association study through the chi-square test shows that no significant SNPs are detected at the cut-off level 9.09451E-08 in the Framingham Heart Study of CVD. Study results in WTCCC detect only two significant SNPs associated with CAD. In the genome-wide study of psoriasis, most of the top 20 SNP markers with impressive classification accuracy are also significantly associated with the disease through the chi-square test at the cut-off value 1.11E-07.

Although our classification methods can achieve high accuracy, complete descriptions of the classification results (95% confidence intervals or statistical tests of differences) require more cost-effective methods or a more efficient computing system, neither of which is currently feasible in our genome-wide study. We should also note that the purpose of this study is to identify subsets of SNPs with high prediction ability; SNPs with good discriminant power are not necessarily causal markers for the disease.
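The HMSS criterion used throughout the abstract above is simply the harmonic mean of sensitivity and specificity. Unlike raw accuracy, it collapses to zero whenever a classifier ignores one class entirely, which is why it suits imbalanced case-control data. A minimal sketch (the confusion-matrix counts are illustrative):

```python
# HMSS: harmonic mean of sensitivity and specificity, a balance-aware
# alternative to accuracy for imbalanced data (sketch, not thesis code).
def hmss(tp, fn, tn, fp):
    """tp/fn: cases correctly/incorrectly classified; tn/fp: controls."""
    sens = tp / (tp + fn) if tp + fn else 0.0
    spec = tn / (tn + fp) if tn + fp else 0.0
    return 2 * sens * spec / (sens + spec) if sens + spec else 0.0
```

A classifier that labels everything as control scores `hmss(0, 50, 100, 0) == 0.0` despite 67% accuracy, whereas a balanced classifier with sensitivity 0.5 and specificity 1.0 scores about 0.67.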


Atherosclerosis is a complex disease resulting from interactions of genetic and environmental risk factors, leading to heart failure and stroke. Using an atherosclerotic mouse model (ldlr-/-, apobec1-/-, designated LDb), we performed microarray analysis to identify the candidate genes and pathways most perturbed by changes in the following risk factors: genetics (control C57BL/6 vs. LDb mice), shear stress (lesion-prone vs. lesion-resistant regions in LDb mice), diet (chow vs. high-fat-fed LDb mice) and age (2-month-old vs. 8-month-old LDb mice).

Atherosclerotic lesion quantification and lipid profile studies were performed to assess the disease phenotype. A microarray study was performed on lesion-prone and lesion-resistant regions of each aorta. Briefly, 32 male C57BL/6 and LDb mice (n = 16 each) were fed either a chow or a high-fat diet and sacrificed at 2 or 8 months old, and RNA was isolated from the aortic lesion-prone and lesion-resistant segments. Using 64 Affymetrix Murine 430 2.0 chips, we profiled differentially expressed genes with cut-off values of FDR ≤ 0.15 for the t-test and q < 0.0001 for the ANOVA. The data were normalized using two normalization methods, invariant probe sets (loess) and quantile normalization; the statistical analysis was performed using t-tests and ANOVA; and pathway characterization was done using Pathway Express (Wayne State). The results identified the calcium signaling pathway as the most significantly overrepresented pathway, followed by focal adhesion. In the calcium signaling pathway, 56 of the 180 genes listed in the KEGG calcium signaling pathway were significantly differentially expressed. Nineteen of these genes were identified by both statistical tests, 11 were unique to the t-test, and 26 were unique to the ANOVA, using the cut-offs noted above.

In conclusion, this finding suggests that hypercholesterolemia drives disease progression by altering the expression of calcium channels and regulators, which subsequently results in cell differentiation, growth, adhesion, cytoskeletal change and death. Clinically, the calcium signaling pathway may therefore serve as an important target for future diagnostic and therapeutic intervention.
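An FDR cut-off like the FDR ≤ 0.15 gene filter above is commonly implemented with the Benjamini-Hochberg step-up procedure; whether this study used exactly that procedure is not stated, so the following is an illustrative sketch only:

```python
# Benjamini-Hochberg step-up procedure (illustrative): reject all
# hypotheses up to the largest rank k whose sorted p-value satisfies
# p_(k) <= k * fdr / m, controlling the false discovery rate at `fdr`.
def bh_reject(pvals, fdr=0.15):
    """Return indices of p-values significant at the given FDR level."""
    order = sorted(range(len(pvals)), key=lambda i: pvals[i])
    m = len(pvals)
    k = 0  # largest rank passing the BH threshold
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * fdr / m:
            k = rank
    return sorted(order[:k])
```

For example, `bh_reject([0.001, 0.9, 0.02, 0.03])` rejects the hypotheses at indices 0, 2 and 3, since 0.03 ≤ 3 · 0.15 / 4; a Bonferroni filter at the same nominal level would keep only index 0.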


Helicobacter pylori infection is frequently acquired during childhood. This microorganism is known to cause gastritis and duodenal ulcer in pediatric patients; however, most children remain completely asymptomatic to the infection. Currently there is no consensus in favor of treatment of H. pylori infection in asymptomatic children. The first-line treatment for this population is triple-medication therapy, including two antibacterial agents and one proton pump inhibitor, for a 2-week course. Eradication rates below 75% have been documented with this first-line therapy, but novel tinidazole-containing quadruple sequential therapies seem worth investigating. None of the previous studies on such therapy has been conducted in the United States of America. As part of an iron deficiency anemia study in asymptomatic H. pylori-infected children of El Paso, Texas, we conducted a secondary analysis of data collected in this trial to assess the effectiveness of this tinidazole-containing sequential quadruple therapy, compared to placebo, in clearing the infection. Subjects were selected from a group of asymptomatic children identified through household visits to 11,365 randomly selected dwelling units. After obtaining parental consent and child assent, a total of 1,821 children 3-10 years of age were screened; 235 were positive on a novel urine immunoglobulin G antibody test for H. pylori infection and were confirmed as infected using a 13C urea breath test, with a urea hydrolysis rate >10 μg/min as the cut-off value. Of those, 119 study subjects had a complete physical exam and baseline blood work and were randomly allocated to four groups, two of which received active H. pylori eradication medication alone or in combination with iron, while the other two received iron only or placebo only.
Follow-up visits to their houses were made to assess compliance and the occurrence of adverse events, and at 45+ days post-treatment a second urea breath test was performed to assess infection status. Effectiveness was primarily assessed on an intent-to-treat basis (i.e., according to treatment allocation), and the proportion of those who cleared their infection, using a cut-off value of >10 μg/min for the urea hydrolysis rate, was the primary outcome. We also conducted analyses on a per-protocol basis and according to the cytotoxin-associated gene A status of the infecting H. pylori strains, and we compared the rate of adverse events across the two arms. On intent-to-treat and per-protocol analyses, 44.3% and 52.9%, respectively, of the children receiving the novel quadruple sequential eradication therapy cleared their infection, compared to 12.2% and 15.4% in the arms receiving iron only or placebo only, respectively. These differences were statistically significant (p<0.001). The study medications were well accepted and safe. In conclusion, we found in this study population of mostly asymptomatic H. pylori-infected children, living in the US along the border with Mexico, that the quadruple sequential eradication therapy cleared the infection in only about half of the children receiving it. Research is needed to assess the antimicrobial susceptibility of the H. pylori strains infecting this population in order to formulate more effective therapies.