971 results for Immunologic Tests -- methods


Relevance:

30.00%

Publisher:

Abstract:

As an important Civil Engineering material, asphalt concrete (AC) is commonly used to build road surfaces, airports, and parking lots. With traditional laboratory tests and theoretical equations, it is a challenge to fully understand such a random composite material. Based on the discrete element method (DEM), this research seeks to develop and implement computer models as research approaches for improving understanding of AC microstructure-based mechanics. In this research, three categories of approaches were developed or employed to simulate the microstructures of AC materials: randomly generated models, idealized models, and image-based models. The image-based models were recommended for accurately predicting AC performance, while the other models were recommended as research tools to obtain deeper insight into AC microstructure-based mechanics. A viscoelastic micromechanical model was developed to capture viscoelastic interactions within the AC microstructure. Four types of constitutive models were built to address the four categories of interactions within an AC specimen. Each of the constitutive models consists of three parts representing three different interaction behaviors: a stiffness model (force-displacement relation), a bonding model (shear and tensile strengths), and a slip model (frictional property). Three techniques were developed to reduce the computational time for AC viscoelastic simulations. It was found that the computational time for typical three-dimensional models was significantly reduced from years or months to days or hours. Dynamic modulus and creep stiffness tests were simulated, and methodologies were developed to determine the viscoelastic parameters. It was found that the DE models could successfully predict dynamic modulus, phase angles, and creep stiffness over a wide range of frequencies, temperatures, and time spans.
Mineral aggregate morphology characteristics (sphericity, orientation, and angularity) were studied to investigate their impacts on AC creep stiffness. It was found that aggregate characteristics significantly impact creep stiffness. Pavement responses and pavement-vehicle interactions were investigated by simulating pavement sections under a rolling wheel. It was found that wheel acceleration, steady movement, and deceleration significantly impact contact forces. Additionally, a summary and recommendations were provided in the last chapter, and part of the computer code was provided in the appendices.
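The three-part constitutive model described above (stiffness, bonding, and slip) can be sketched for a single particle contact. This is a hypothetical illustration, not the thesis code; the function name, parameter names, and values are all assumptions:

```python
def contact_forces(overlap, shear_disp, k_n=1e6, k_s=5e5,
                   tensile_strength=1e3, shear_strength=1e3, mu=0.5):
    """One DEM contact combining the three sub-models.

    overlap > 0 means compression, overlap < 0 means separation.
    Returns (normal_force, shear_force, bond_intact).
    """
    f_n = k_n * overlap      # stiffness model: linear force-displacement relation
    f_s = k_s * shear_disp
    # bonding model: the bond fails in tension or shear beyond its strength
    bond_intact = (-f_n) <= tensile_strength and abs(f_s) <= shear_strength
    if not bond_intact:
        # slip model: a broken contact carries no tension and obeys
        # Coulomb friction, |f_s| <= mu * f_n
        f_n = max(f_n, 0.0)
        f_s = max(-mu * f_n, min(f_s, mu * f_n))
    return f_n, f_s, bond_intact
```

A viscoelastic version, as in the thesis, would replace the constant stiffnesses with time- or frequency-dependent moduli, which is where the reported computational cost arises.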

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Few data are available on the long-term immunologic response to antiretroviral therapy (ART) in resource-limited settings, where ART is being rapidly scaled up using a public health approach with a limited repertoire of drugs. OBJECTIVES: To describe the immunologic response to ART among patients in a network of cohorts from sub-Saharan Africa, Latin America, and Asia. STUDY POPULATION/METHODS: Treatment-naive patients aged 15 and older from 27 treatment programs were eligible. Multilevel linear mixed models were used to assess associations between predictor variables and CD4 cell count trajectories following ART initiation. RESULTS: Of 29 175 patients initiating ART, 8933 (31%) were excluded due to insufficient follow-up time and early loss to follow-up or death. The remaining 19 967 patients contributed 39 200 person-years on ART and 71 067 CD4 cell count measurements. The median baseline CD4 cell count was 114 cells/μl, with 35% having less than 100 cells/μl. Substantial intersite variation in baseline CD4 cell count was observed (range 61-181 cells/μl). Women had higher median baseline CD4 cell counts than men (121 vs. 104 cells/μl). The median CD4 cell count increased from 114 cells/μl at ART initiation to 230 [interquartile range (IQR) 144-338] at 6 months, 263 (IQR 175-376) at 1 year, 336 (IQR 224-472) at 2 years, 372 (IQR 242-537) at 3 years, 377 (IQR 221-561) at 4 years, and 395 (IQR 240-592) at 5 years. In multivariable models, baseline CD4 cell count was the most important determinant of subsequent CD4 cell count trajectories. CONCLUSION: These data demonstrate a robust and sustained CD4 response to ART among patients remaining on therapy. Public health and programmatic interventions leading to earlier HIV diagnosis and initiation of ART could substantially improve patient outcomes in resource-limited settings.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: Meta-analysis of studies of the accuracy of diagnostic tests currently uses a variety of methods. Statistically rigorous hierarchical models require expertise and sophisticated software. We assessed whether any of the simpler methods can in practice give adequately accurate and reliable results. STUDY DESIGN AND SETTING: We reviewed six methods for meta-analysis of diagnostic accuracy: four simple commonly used methods (simple pooling, separate random-effects meta-analyses of sensitivity and specificity, separate meta-analyses of positive and negative likelihood ratios, and the Littenberg-Moses summary receiver operating characteristic [ROC] curve) and two more statistically rigorous approaches using hierarchical models (bivariate random-effects meta-analysis and hierarchical summary ROC curve analysis). We applied the methods to data from a sample of eight systematic reviews chosen to illustrate a variety of patterns of results. RESULTS: In each meta-analysis, there was substantial heterogeneity between the results of different studies. Simple pooling of results gave misleading summary estimates of sensitivity and specificity in some meta-analyses, and the Littenberg-Moses method produced summary ROC curves that diverged from those produced by more rigorous methods in some situations. CONCLUSION: The closely related hierarchical summary ROC curve or bivariate models should be used as the standard method for meta-analysis of diagnostic accuracy.
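Two of the reviewed approaches, simple pooling and a separate random-effects meta-analysis of sensitivity, can be contrasted in a short sketch. This is an illustrative implementation of the standard DerSimonian-Laird estimator on the logit scale, with made-up study counts, not data from the review:

```python
import math

def simple_pooled_sensitivity(studies):
    """Naive pooling: add up (tp, fn) counts as if they came from one study."""
    tp = sum(t for t, f in studies)
    fn = sum(f for t, f in studies)
    return tp / (tp + fn)

def random_effects_sensitivity(studies):
    """DerSimonian-Laird random-effects meta-analysis of logit sensitivity."""
    y, v = [], []
    for tp, fn in studies:
        p = (tp + 0.5) / (tp + fn + 1.0)            # continuity correction
        y.append(math.log(p / (1.0 - p)))            # logit sensitivity
        v.append(1.0 / (tp + 0.5) + 1.0 / (fn + 0.5))
    w = [1.0 / vi for vi in v]
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)          # between-study variance
    w_re = [1.0 / (vi + tau2) for vi in v]
    mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    return 1.0 / (1.0 + math.exp(-mu))               # back to probability scale

# Illustrative (tp, fn) counts from three heterogeneous studies: the large
# third study dominates the naive pool but not the random-effects summary.
studies = [(90, 10), (40, 60), (800, 200)]
```

Note that even separate random-effects summaries of sensitivity and specificity ignore their negative correlation across studies; modeling that correlation jointly is precisely what the bivariate and hierarchical summary ROC approaches add.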

Relevance:

30.00%

Publisher:

Abstract:

The authors describe the design, fabrication, and testing of a passive wireless sensor platform utilizing low-cost commercial surface acoustic wave (SAW) filters and sensors. Polyimide and polyethylene terephthalate sheets are used as substrates to create a flexible sensor tag that can be applied to curved surfaces. A microfabricated antenna is integrated on the substrate in order to create a compact form factor. The sensor tags are fabricated using 315 MHz SAW filters and photodiodes and tested with the aid of a fiber-coupled tungsten lamp. Microwave energy transmitted from a network analyzer is used to interrogate the sensor tag. Due to an electrical impedance mismatch at the SAW filter and sensor, energy is reflected at the sensor load and reradiated from the integrated antenna. By selecting sensors that change electrical impedance based on environmental conditions, the sensor state can be inferred through measurement of the reflected energy profile. Testing has shown that a calibrated system utilizing this type of sensor tag can detect distinct light levels wirelessly and passively. The authors also demonstrate simultaneous operation of two light-detecting tags with different center passbands. Ranging tests show that the sensor tags can operate at a distance of at least 3.6 m.
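The readout principle, energy reflected at an impedance mismatch, reduces to the standard reflection coefficient. A minimal sketch with illustrative (assumed) impedance values, not measurements from the paper:

```python
def reflection_coefficient(z_load, z0=50.0):
    """Voltage reflection coefficient at an impedance mismatch.

    The fraction of incident power reflected back toward the antenna
    (and reradiated by the tag) is |gamma| ** 2.
    """
    return (z_load - z0) / (z_load + z0)

# A matched load reflects nothing; a sensor whose impedance shifts with
# light level modulates the reflected power the interrogator measures.
matched = reflection_coefficient(50.0)       # matched sensor state
gamma = reflection_coefficient(200.0)        # mismatched sensor state
reflected_power = abs(gamma) ** 2
```

Calibrating measured reflected power against known environmental conditions is what lets the system map a reflection profile back to a sensor state.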

Relevance:

30.00%

Publisher:

Abstract:

The James Lind Library (www.jameslindlibrary.org) has been established to improve public and professional general knowledge about fair tests of treatments in healthcare and their history. Its foundation was laid ten years ago at the Royal College of Physicians of Edinburgh, and its administrative centre is in the College's Sibbald Library, one of the most important collections of historic medical manuscripts, papers and books in the world. The James Lind Library is a website that introduces visitors to the principles of fair tests of treatments, with a series of short, illustrated essays, which are currently available in English, Arabic, Chinese, French, Portuguese, Russian and Spanish. A 100-page book, Testing Treatments, is now available free through the website in English, Arabic, and Spanish. To illustrate the evolution of ideas related to fair tests of treatments from 2000 BC to the present, the James Lind Library contains key passages and images from manuscripts, books and journal articles, many of them accompanied by commentaries, biographies, portraits and other relevant documents and images, including audio and video files. New material is added to the website continuously, as relevant new records are identified and as methods for testing treatments evolve. A multinational, multilingual editorial team oversees the development of the website, which currently receives tens of thousands of visitors every month.

Relevance:

30.00%

Publisher:

Abstract:

The diagnosis of a drug hypersensitivity reaction (DHR) is a challenging task because multiple and complex mechanisms are involved. A better understanding of the immunologic pathomechanisms of DHRs and rapid progress in cell-based in vitro tests can help to tailor the correct diagnostic strategy to individual patients with different clinical manifestations of drug allergy. Thus, drug hypersensitivity diagnosis needs to rely on a combination of medical history and different in vivo and in vitro tests. In this article, the authors discuss current in vitro techniques, the most recent findings, and promising new tools in the diagnosis of T-cell-mediated drug hypersensitivity.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND AND OBJECTIVES Quantitative sensory testing (QST) is widely used to investigate peripheral and central sensitization. However, the comparative performance of different QST modalities for diagnostic or prognostic purposes is unclear. We explored the discriminative ability of different quantitative sensory tests in distinguishing between patients with chronic neck pain and pain-free control subjects and ranked these tests according to the extent of their association with pain hypersensitivity. METHODS We performed a case-control study in 40 patients and 300 control subjects. Twenty-six tests, including different modalities of pressure, heat, cold, and electrical stimulation, were used. As measures of discrimination, we estimated receiver operating characteristic curves and likelihood ratios. RESULTS The following quantitative sensory tests displayed the best discriminative value: (1) pressure pain threshold at the site of the most severe neck pain (fitted area under the receiver operating characteristic curve, 0.92), (2) reflex threshold to single electrical stimulation (0.90), (3) pain threshold to single electrical stimulation (0.89), (4) pain threshold to repeated electrical stimulation (0.87), and (5) pressure pain tolerance threshold at the site of the most severe neck pain (0.86). Only the first three could be used for both ruling in and ruling out pain hypersensitivity. CONCLUSIONS Pressure stimulation at the site of the most severe pain and parameters of electrical stimulation were the most appropriate QST modalities to distinguish between patients with chronic neck pain and asymptomatic control subjects. These findings may be used to select the tests in future diagnostic and longitudinal prognostic studies on patients with neck pain and to optimize the assessment of localized and spreading sensitization in chronic pain patients.
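The "ruling in and ruling out" criterion above maps onto positive and negative likelihood ratios. A minimal sketch of the standard definitions, with illustrative sensitivity/specificity values rather than the study's estimates:

```python
def likelihood_ratios(sensitivity, specificity):
    """LR+ = sens / (1 - spec); LR- = (1 - sens) / spec.

    By a common rule of thumb, a large LR+ (e.g. > 10) helps rule a
    condition in and a small LR- (e.g. < 0.1) helps rule it out; a test
    useful for both, like the top three above, needs both properties.
    """
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg
```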

Relevance:

30.00%

Publisher:

Abstract:

In this note, we show that an extension of a test for perfect ranking in a balanced ranked set sample given by Li and Balakrishnan (2008) to the multi-cycle case turns out to be equivalent to the test statistic proposed by Frey et al. (2007). This provides an alternative interpretation and motivation for their test statistic.

Relevance:

30.00%

Publisher:

Abstract:

In the past two decades, we have observed a rapid increase of infections due to multidrug-resistant Enterobacteriaceae. Regrettably, these isolates possess genes encoding extended-spectrum β-lactamases (e.g., blaCTX-M, blaTEM, blaSHV) or plasmid-mediated AmpCs (e.g., blaCMY) that confer resistance to last-generation cephalosporins. Furthermore, other resistance traits against quinolones (e.g., mutations in gyrA and parC, qnr elements) and aminoglycosides (e.g., aminoglycoside-modifying enzymes and 16S rRNA methylases) are also frequently co-associated. Even more concerning is the rapid increase of Enterobacteriaceae carrying genes conferring resistance to carbapenems (e.g., blaKPC, blaNDM). The spread of these pathogens therefore imperils our antibiotic options. Unfortunately, standard microbiological procedures require several days to isolate the responsible pathogen and to provide correct antimicrobial susceptibility test results. This delay impedes the rapid implementation of adequate antimicrobial treatment and infection control countermeasures. Thus, there is emerging interest in the earlier and more sensitive detection of resistance mechanisms. Modern non-phenotypic tests are promising in this respect and can hence influence both clinical outcome and healthcare costs. In this review, we present a summary of the most advanced methods (e.g., next-generation DNA sequencing, multiplex PCRs, real-time PCRs, microarrays, MALDI-TOF MS, and PCR/ESI MS) presently available for the rapid detection of antibiotic resistance genes in Enterobacteriaceae. Taking into account speed, manageability, accuracy, versatility, and costs, the possible settings of application (research, clinic, and epidemiology) of these methods and their advantages over standard phenotypic methods are discussed.

Relevance:

30.00%

Publisher:

Abstract:

Methods are described for working with Nosema apis and Nosema ceranae in the field and in the laboratory. For fieldwork, different sampling methods are described, both to determine colony-level infections at a given point in time and to follow the temporal infection dynamics. Suggestions are made for how to standardise field trials for evaluating treatments and disease impact. The laboratory methods described include different means for determining colony-level and individual-bee infection levels, and methods for species determination, including light microscopy, electron microscopy, and molecular methods (PCR). Suggestions are made for how to standardise cage trials, and different inoculation methods for infecting bees are described, including control methods for spore viability. A cell culture system for in vitro rearing of Nosema spp. is described. Finally, how to conduct different types of experiments is described, including infectious dose, dose effects, course of infection, and longevity tests.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Although free eye testing is available in the UK from a nationwide network of optometrists, there is evidence of unrecognised, tractable vision loss amongst older people. A recent review identified this unmet need as a priority for further investigation, highlighting the need to understand public perceptions of eye services and barriers to service access and utilisation. This paper aims to identify risk factors for (1) having poor vision and (2) not having had an eyesight check among community-dwelling older people without an established ophthalmological diagnosis. METHODS Secondary analysis of self-reported data from the ProAge trial. 1792 people without a known ophthalmological diagnosis were recruited from three group practices in London. RESULTS Almost two in ten people in this population of older individuals without known ophthalmological diagnoses had self-reported vision loss, and more than a third of them had not had an eye test in the previous 12 months. In this sample, those with limited education, depressed mood, need for help with instrumental and basic activities of daily living (IADLs and BADLs), and subjective memory complaints were at increased risk of fair or poor self-reported vision. Individuals with basic education only were at increased risk of not having had an eye test in the previous 12 months (OR 1.52, 95% CI 1.17-1.98, p=0.002), as were those with no, or only one, chronic condition (OR 1.85, 95% CI 1.38-2.48, p<0.001). CONCLUSIONS Self-reported poor vision in older people without ophthalmological diagnoses is associated with other functional losses, with having no or only one chronic condition, and with depression. This pattern of disorders may be the basis for case finding in general practice. Low educational attainment is an independent determinant of not having had eye tests, as well as a factor associated with undiagnosed vision loss.
There are other factors, not identified in this study, that determine uptake of eye testing in those with self-reported vision loss. Further exploration is needed to identify these factors and lead towards effective case finding.
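The odds ratios and confidence intervals reported above follow the usual 2x2-table computation. A sketch of the odds ratio with a Wald confidence interval, using made-up counts rather than the trial's data:

```python
import math

def odds_ratio_wald_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% Wald CI from a 2x2 table.

    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome.
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Illustrative counts: 20/80 "no test" among the basic-education group,
# 10/90 among the rest.
or_, lo, hi = odds_ratio_wald_ci(20, 80, 10, 90)
```

In practice a study like this one would adjust for covariates in a logistic regression model rather than use raw 2x2 counts; the sketch only shows where an OR and its CI come from.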

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND While the assessment of analytical precision within medical laboratories has received much attention in scientific enquiry, the degree of variation between laboratories, as well as its sources, remains incompletely understood. In this study, we quantified the variance components when performing coagulation tests with identical analytical platforms in different laboratories and computed intraclass correlation coefficients (ICCs) for each coagulation test. METHODS Data from eight laboratories measuring fibrinogen twice in twenty healthy subjects with one of three different platforms, together with single measurements of prothrombin time (PT) and coagulation factors II, V, VII, VIII, IX, X, XI and XIII, were analysed. By platform, the variance components of (i) the subjects, (ii) the laboratory and the technician and (iii) the total variance were obtained for fibrinogen, as well as (i) and (iii) for the remaining factors, using ANOVA. RESULTS The variability of fibrinogen measurements within a laboratory ranged from 0.02 to 0.04; the variability between laboratories ranged from 0.006 to 0.097. The ICC for fibrinogen ranged from 0.37 to 0.66 between the platforms, and for PT from 0.19 to 0.80. For the remaining factors the ICCs ranged from 0.04 (FII) to 0.93 (FVIII). CONCLUSIONS Variance components that could be attributed to technicians or laboratory procedures were substantial, led to disappointingly low intraclass correlation coefficients for several factors, and were pronounced for some of the platforms. Our findings call for sustained efforts to raise the level of standardization of the structures and procedures involved in the quantification of coagulation factors.
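The link between the reported variance components and the ICC is direct: the ICC is the share of total variance attributable to true subject differences. A minimal sketch; the numbers below are illustrative, on the scale of the fibrinogen ranges above, not the study's actual estimates:

```python
def intraclass_correlation(var_subject, var_lab_technician, var_error):
    """ICC: subject variance over total variance.

    Large laboratory/technician variance dilutes the ICC even when
    within-laboratory precision is good.
    """
    total = var_subject + var_lab_technician + var_error
    return var_subject / total

# The same subject variance yields very different ICCs depending on the
# between-laboratory component (compare the 0.006-0.097 range reported):
icc_low_lab_var = intraclass_correlation(0.06, 0.006, 0.02)
icc_high_lab_var = intraclass_correlation(0.06, 0.097, 0.02)
```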

Relevance:

30.00%

Publisher:

Abstract:

Congestive heart failure has long been one of the most serious medical conditions in the United States, where it accounts for 6.5 million days of hospitalization each year. One important goal of heart-failure therapy is to inhibit the progression of congestive heart failure through pharmacologic and device-based therapies. Therefore, there have been efforts to develop device-based therapies aimed at improving cardiac reserve and optimizing pump function to meet metabolic requirements. The course of congestive heart failure is often worsened by other conditions that have an impact on outcomes, including new-onset arrhythmias, ischemia and infarction, valvulopathy, decompensation, end-organ damage, and therapeutic refractoriness. The onset of such conditions is sometimes heralded by subtle pathophysiologic changes, and the timely identification of these changes may promote the use of preventive measures. Consequently, device-based methods could in the future have an important role in the timely identification of the subtle pathophysiologic changes associated with congestive heart failure.

Relevance:

30.00%

Publisher:

Abstract:

Linkage disequilibrium methods can be used to find genes influencing quantitative trait variation in humans. Linkage disequilibrium methods can require smaller sample sizes than linkage equilibrium methods, such as the variance component approach, to find loci with a specific effect size. The increase in power comes at the expense of requiring more markers to be typed to scan the entire genome. This thesis compares different linkage disequilibrium methods to determine which factors influence the power to detect disequilibrium. The costs of disequilibrium and equilibrium tests were compared to determine whether the savings in phenotyping costs when using disequilibrium methods outweigh the additional genotyping costs. Nine linkage disequilibrium tests were examined by simulation. Five tests involve selecting isolated unrelated individuals, while four involve the selection of parent-child trios (TDT-based tests). All nine tests were found to identify disequilibrium at the correct significance level in Hardy-Weinberg populations. Increasing linked genetic variance and trait allele frequency increased the power to detect disequilibrium, while increasing the number of generations and the distance between marker and trait loci decreased it. Discordant sampling was used for several of the tests; the more stringent the sampling, the greater the power to detect disequilibrium in a sample of a given size. The power to detect disequilibrium was not affected by the presence of polygenic effects. When the trait locus had more than two trait alleles, the power of the tests maximized to less than one.
For the simulation methods used here, when there were more than two trait alleles there was a probability, equal to 1 minus the heterozygosity of the marker locus, that both trait alleles were in disequilibrium with the same marker allele, resulting in the marker being uninformative for disequilibrium. The five tests using isolated unrelated individuals were found to have excess error rates when there was disequilibrium due to population admixture. Increased error rates also resulted from increased unlinked major gene effects, discordant trait allele frequency, and increased disequilibrium. Polygenic effects did not affect the error rates. Tests based on the TDT (Transmission Disequilibrium Test) were not liable to any increase in error rates. For all sample ascertainment costs, for recent mutations (<100 generations), linkage disequilibrium tests were less expensive to carry out than the variance component test. Candidate gene scans saved even more money. The use of recently admixed populations also decreased the cost of performing a linkage disequilibrium test.
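The TDT's robustness to admixture noted above comes from its form: it is a McNemar-type test on allele transmissions from heterozygous parents, so population-level allele-frequency differences cancel out. A minimal sketch of the classic statistic, with illustrative counts:

```python
def tdt_statistic(transmitted, untransmitted):
    """Transmission Disequilibrium Test statistic.

    transmitted (b) and untransmitted (c) count heterozygous parents who
    did or did not pass the candidate allele to an affected child;
    (b - c)^2 / (b + c) is approximately chi-square with 1 df under the
    null of no linkage and association, which is why admixture alone
    cannot inflate its error rate.
    """
    b, c = transmitted, untransmitted
    return (b - c) ** 2 / (b + c)
```

For example, 60 transmissions against 40 non-transmissions gives a statistic of 4.0, just past the 3.84 critical value at the 5% level.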