26 results for Laboratory assessment
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
The first part of this three-part review on the relevance of laboratory testing of composites and adhesives deals with approval requirements for composite materials. We compare the in vivo and in vitro literature data and discuss the relevance of in vitro analyses. The standardized ISO protocols are presented, with a focus on the evaluation of physical parameters. Each test follows a standardized protocol that describes the entire test set-up. The tests analyse flexural strength, depth of cure, susceptibility to ambient light, color stability, water sorption and solubility, and radiopacity. Some tests have a clinical correlation. A high flexural strength, for instance, decreases the risk of fractures of the marginal ridge in posterior restorations and incisal edge build-ups of restored anterior teeth. Other tests do not have a clinical correlation, or the threshold values are too low, which results in the approval of materials that show inferior clinical properties (e.g., radiopacity). It is advantageous to know the test set-ups and the ideal threshold values to correctly interpret the material data. Overall, however, laboratory assessment alone cannot ensure the clinical success of a product.
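As a concrete illustration of the flexural-strength evaluation mentioned above, the sketch below applies the standard three-point-bending relation sigma = 3Fl/(2bh^2); the specimen dimensions and fracture load are illustrative values, not data from the review.

```python
# Flexural strength from a three-point bending test (sketch).
# sigma = 3 * F * l / (2 * b * h**2) is the standard three-point-bend relation;
# the bar dimensions and load below are illustrative only.

def flexural_strength(load_n: float, span_mm: float,
                      width_mm: float, height_mm: float) -> float:
    """Return flexural strength in MPa (N/mm^2)."""
    return 3.0 * load_n * span_mm / (2.0 * width_mm * height_mm ** 2)

if __name__ == "__main__":
    # Example: a 2 x 2 x 25 mm bar loaded over a 20 mm span, fracturing at 100 N.
    sigma = flexural_strength(load_n=100.0, span_mm=20.0,
                              width_mm=2.0, height_mm=2.0)
    print(f"Flexural strength: {sigma:.1f} MPa")  # 375.0 MPa
```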
Abstract:
Biomarkers are currently best used as mechanistic "signposts" rather than as "traffic lights" in the environmental risk assessment of endocrine-disrupting chemicals (EDCs). In field studies, biomarkers of exposure [e.g., vitellogenin (VTG) induction in male fish] are powerful tools for tracking single substances and mixtures of concern. Biomarkers also provide linkage between field and laboratory data, thereby playing an important role in directing the need for and design of fish chronic tests for EDCs. It is the adverse effect end points (e.g., altered development, growth, and/or reproduction) from such tests that are most valuable for calculating an adverse-effect NOEC (no observed effect concentration) or adverse-effect EC10 (effective concentration for a 10% response) and subsequently deriving predicted no effect concentrations (PNECs). With current uncertainties, biomarker-derived NOEC or EC10 data should not be used in isolation to derive PNECs. In the future, however, there may be scope to increasingly use biomarker data in environmental decision making, if plausible linkages can be made across levels of organization such that adverse outcomes might be envisaged relative to biomarker responses. For biomarkers to fulfil their potential, they should be mechanistically relevant and reproducible (as measured by interlaboratory comparisons of the same protocol). VTG is a good example of such a biomarker in that it provides insight into the mode of action (estrogenicity) that is vital to fish reproductive health. Interlaboratory reproducibility data for VTG are also encouraging; recent comparisons (using the same immunoassay protocol) have provided coefficients of variation (CVs) of 38-55% (comparable to published CVs of 19-58% for fish survival and growth end points used in regulatory test guidelines). While concern over environmental xenoestrogens has led to the evaluation of reproductive biomarkers in fish, it must be remembered that many substances act via diverse mechanisms of action, such that the environmental risk assessment for EDCs is a broad and complex issue. Also, biomarkers such as secondary sexual characteristics, gonadosomatic indices, plasma steroids, and gonadal histology have significant potential for guiding interspecies assessments of EDCs and designing fish chronic tests. To strengthen the utility of EDC biomarkers in fish, we need to establish a historical control database (also considering natural variability) to help differentiate between statistically detectable and biologically significant responses. In conclusion, as research continues to develop a range of useful EDC biomarkers, environmental decision making needs to move forward, and the "biomarkers as signposts" approach is proposed as a pragmatic way forward in the current risk assessment of EDCs.
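To make the NOEC-to-PNEC step and the interlaboratory CV figures concrete, here is a minimal sketch; the assessment factor of 10 and the VTG values are assumptions for illustration, not values from the paper.

```python
import statistics

def pnec_from_noec(noec_ug_per_l: float, assessment_factor: float = 10.0) -> float:
    """Derive a predicted no effect concentration from a chronic NOEC by
    dividing by an assessment factor (the factor of 10 is illustrative)."""
    return noec_ug_per_l / assessment_factor

def coefficient_of_variation(values: list[float]) -> float:
    """Interlaboratory CV (%) = sample standard deviation / mean * 100."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

if __name__ == "__main__":
    print(pnec_from_noec(32.0))  # e.g. NOEC 32 ug/L, AF 10 -> PNEC 3.2 ug/L
    # Hypothetical VTG results for one sample reported by several laboratories:
    vtg = [120.0, 95.0, 160.0, 80.0, 140.0]
    print(f"CV = {coefficient_of_variation(vtg):.0f}%")
```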
Abstract:
Alveolar echinococcosis (AE), caused by the cestode Echinococcus multilocularis, is a severe zoonotic disease found in temperate and arctic regions of the northern hemisphere. Even though the transmission patterns observed in different geographical areas are heterogeneous, the nuclear and mitochondrial targets usually used for the genotyping of E. multilocularis have revealed only marked genetic homogeneity in this species. We used microsatellite sequences, because of their high typing resolution, to explore the genetic diversity of E. multilocularis. Four microsatellite targets (EmsJ, EmsK, and EmsB, which were designed in our laboratory, and NAK1, selected from the literature) were tested on a panel of 76 E. multilocularis samples (larval and adult stages) obtained from Alaska, Canada, Europe, and Asia. Genetic diversity for each target was assessed by size polymorphism analysis. With the EmsJ and EmsK targets, two alleles were found for each locus, yielding two and three genotypes, respectively, discriminating European isolates from the other groups. With NAK1, five alleles were found, yielding seven genotypes, including those specific to Tibetan and Alaskan isolates. The EmsB target, a tandem-repeated multilocus microsatellite, yielded 17 alleles showing a complex pattern. Hierarchical clustering analyses were performed with the EmsB findings, and 29 genotypes were identified. Owing to its higher genetic polymorphism, EmsB exhibited a higher discriminatory power than the other targets. The complex EmsB pattern was able to discriminate isolates at a regional and sectoral level, while avoiding overdistinction. EmsB will be used to assess the putative emergence of E. multilocularis in Europe.
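A minimal sketch of hierarchical clustering of multilocus microsatellite profiles, in the spirit of the EmsB analysis described above; the fragment profiles, distance metric, linkage method and cut-off are illustrative assumptions, not the authors' exact protocol.

```python
# Hierarchical clustering of microsatellite profiles (sketch).
# Rows are isolates; columns are normalised peak heights of fragment sizes.
# The numbers below are invented for illustration only.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

profiles = np.array([
    [0.9, 0.1, 0.0, 0.0],
    [0.8, 0.2, 0.0, 0.0],
    [0.1, 0.1, 0.7, 0.1],
    [0.0, 0.2, 0.6, 0.2],
])

distances = pdist(profiles, metric="euclidean")
tree = linkage(distances, method="average")          # UPGMA-style clustering
genotype_groups = fcluster(tree, t=0.5, criterion="distance")
print(genotype_groups)  # e.g. two clusters grouping the similar profiles
```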
Abstract:
BACKGROUND: Knowledge of the number of recent HIV infections is important for epidemiologic surveillance. Over the past decade, approaches have been developed to estimate this number by testing HIV-seropositive specimens with assays that discriminate the lower concentration and avidity of HIV antibodies in early infection. We have investigated whether this "recency" information can also be gained from an HIV confirmatory assay. METHODS AND FINDINGS: The ability of a line immunoassay (INNO-LIA HIV I/II Score, Innogenetics) to distinguish recent from older HIV-1 infection was evaluated in comparison with the Calypte HIV-1 BED Incidence enzyme immunoassay (BED-EIA). Both tests were conducted prospectively in all HIV infections newly diagnosed in Switzerland from July 2005 to June 2006. Clinical and laboratory information indicative of recent or older infection was obtained from physicians at the time of HIV diagnosis and used as the reference standard. BED-EIA and various recency algorithms utilizing the antibody reaction to INNO-LIA's five HIV-1 antigen bands were evaluated by logistic regression analysis. A total of 765 HIV-1 infections, 748 (97.8%) with complete test results, were newly diagnosed during the study. A negative or indeterminate HIV antibody assay at diagnosis, symptoms of primary HIV infection, or a negative HIV test during the past 12 mo classified 195 infections (26.1%) as recent (≤12 mo). Symptoms of CDC stages B or C classified 161 infections (21.5%) as older, and 392 patients with no symptoms remained unclassified. BED-EIA classified 65% of the 195 recent infections as recent and 80% of the 161 older infections as older. Two INNO-LIA algorithms showed 50% and 40% sensitivity combined with 95% and 99% specificity, respectively. Estimation of recent infection in the entire study population, based on actual results of the three tests and adjusted for each test's sensitivity and specificity, yielded 37% for BED-EIA compared to 35% and 33% for the two INNO-LIA algorithms. Window-based estimation with BED-EIA yielded 41% (95% confidence interval 36%-46%). CONCLUSIONS: Recency information can be extracted from INNO-LIA-based confirmatory testing at no additional cost. This method should improve epidemiologic surveillance in countries that routinely use INNO-LIA for HIV confirmation.
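Adjusting an apparent fraction of "recent" results for a recency algorithm's sensitivity and specificity can be sketched with the Rogan-Gladen estimator; the raw fraction below is hypothetical, and this is a generic correction rather than the authors' exact estimation procedure.

```python
def adjusted_proportion_recent(apparent: float, sensitivity: float,
                               specificity: float) -> float:
    """Rogan-Gladen correction: adjust the raw fraction of 'recent' test
    results for the imperfect sensitivity and specificity of the assay."""
    return (apparent + specificity - 1.0) / (sensitivity + specificity - 1.0)

if __name__ == "__main__":
    # Hypothetical raw result: 21% of confirmatory tests flagged as recent,
    # using an algorithm with 50% sensitivity and 95% specificity.
    print(f"{adjusted_proportion_recent(0.21, 0.50, 0.95):.0%}")  # ~36%
```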
Abstract:
Efficient planning of soil conservation measures requires, first, understanding the impact of soil erosion on soil fertility with regard to local land cover classes and, second, identifying hot spots of soil erosion and bright spots of soil conservation in a spatially explicit manner. Soil organic carbon (SOC) is an important indicator of soil fertility. The aim of this study was to conduct a spatial assessment of erosion and its impact on SOC for specific land cover classes. Input data consisted of extensive ground truth, a digital elevation model and Landsat 7 imagery from two different seasons. Soil spectral reflectance readings were taken from soil samples in the laboratory and calibrated with results of SOC chemical analysis using regression tree modelling. The resulting model statistics for soil degradation assessments are promising (R² = 0.71, RMSEV = 0.32). Since the area includes rugged terrain and small agricultural plots, the decision tree models allowed mapping of land cover classes, soil erosion incidence and SOC content classes at an acceptable level of accuracy for preliminary studies. The various datasets were linked in the hot-bright spot matrix, which was developed to combine soil erosion incidence information and SOC content levels (for uniform land cover classes) in a scatter plot. The quadrants of the plot show different stages of degradation, from well conserved land to hot spots of soil degradation. The approach helps to gain a better understanding of the impact of soil erosion on soil fertility and to identify hot and bright spots in a spatially explicit manner. The results show distinctly lower SOC content levels on large parts of the test areas, where annual crop cultivation was dominant in the 1990s and where cultivation has now been abandoned. On the other hand, there are strong indications that afforestations and fruit orchards established in the 1980s have been successful in conserving soil resources.
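A minimal sketch of calibrating SOC against spectral data with a regression tree, analogous in spirit to the modelling described above; the synthetic spectra, band count and tree depth are assumptions for illustration, not the study's data.

```python
# Regression-tree calibration of soil organic carbon (SOC) against
# spectral reflectance (sketch). Spectra and SOC values are random stand-ins.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
spectra = rng.uniform(0.0, 1.0, size=(200, 10))        # 10 reflectance bands
soc = 3.0 * spectra[:, 2] - 1.5 * spectra[:, 7] + rng.normal(0, 0.1, 200)

X_cal, X_val, y_cal, y_val = train_test_split(spectra, soc, test_size=0.33,
                                              random_state=0)
model = DecisionTreeRegressor(max_depth=4).fit(X_cal, y_cal)
pred = model.predict(X_val)
rmse = mean_squared_error(y_val, pred) ** 0.5          # validation RMSE
print(f"R2 = {r2_score(y_val, pred):.2f}, RMSE = {rmse:.2f}")
```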
Abstract:
Soil degradation is a major problem in the agriculturally dominated country of Tajikistan, which makes it necessary to determine and monitor the state of soils. For this purpose a soil spectral library was established, as it enables the determination of soil properties with relatively low costs and effort. A total of 1465 soil samples were collected from three 10 × 10 km test sites in western Tajikistan. The diffuse reflectance of the samples was measured with a FieldSpec PRO FR from ASD in the spectral range from 380 to 2500 nm in the laboratory. In total, 166 samples were selected based on their spectral information and analysed for total C and N, organic C, pH, CaCO₃, extractable P, exchangeable Ca, Mg and K, and the clay, silt and sand fractions. Multiple linear regression was used to set up the models. Two thirds of the chemically analysed samples were used to calibrate the models, and one third was used for hold-out validation. Very good prediction accuracy was obtained for total C (R² = 0.76, RMSEP = 4.36 g kg⁻¹), total N (R² = 0.83, RMSEP = 0.30 g kg⁻¹) and organic C (R² = 0.81, RMSEP = 3.30 g kg⁻¹), and good accuracy for pH (R² = 0.61, RMSEP = 0.157) and CaCO₃ (R² = 0.72, RMSEP = 4.63%). No models could be developed for extractable P, exchangeable Ca, Mg and K, and the clay, silt and sand fractions. It can be concluded that the spectral library approach has a high potential to substitute for standard laboratory methods where rapid and inexpensive analysis is required.
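The calibration/validation scheme described above (two thirds for calibration, one third for hold-out validation, reporting R² and RMSEP) can be sketched as follows; the synthetic reflectance values and wavelength count are assumptions, not the study's spectra.

```python
# Multiple linear regression with a 2/3 calibration, 1/3 hold-out split (sketch).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
reflectance = rng.uniform(0.0, 1.0, size=(166, 5))      # selected wavelengths
total_c = 20.0 * reflectance[:, 0] - 8.0 * reflectance[:, 3] + rng.normal(0, 2, 166)

X_cal, X_val, y_cal, y_val = train_test_split(reflectance, total_c,
                                              test_size=1 / 3, random_state=1)
model = LinearRegression().fit(X_cal, y_cal)
pred = model.predict(X_val)
rmsep = float(np.sqrt(np.mean((y_val - pred) ** 2)))    # root mean square error of prediction
print(f"R2 = {r2_score(y_val, pred):.2f}, RMSEP = {rmsep:.2f} g/kg")
```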
Abstract:
BACKGROUND: Patients with chemotherapy-related neutropenia and fever are usually hospitalized and treated with empirical intravenous broad-spectrum antibiotic regimens. Early diagnosis of sepsis in children with febrile neutropenia remains difficult due to non-specific clinical and laboratory signs of infection. We aimed to analyze whether IL-6 and IL-8 could define a group of patients at low risk of septicemia. METHODS: A prospective study was performed to assess the potential value of IL-6, IL-8 and C-reactive protein serum levels to predict severe bacterial infection or bacteremia in febrile neutropenic children with cancer during chemotherapy. Statistical tests used were the Friedman test, Wilcoxon test, Kruskal-Wallis H test, Mann-Whitney U test and receiver operating characteristic (ROC) analysis. RESULTS: The analysis of cytokine levels measured at the onset of fever indicated that IL-6 and IL-8 are useful for defining a group of patients at low risk of sepsis. In predicting bacteremia or severe bacterial infection, IL-6 was the best predictor, with an optimum cut-off level of 42 pg/ml showing high sensitivity (90%) and specificity (85%). CONCLUSION: These findings may have clinical implications for risk-based antimicrobial treatment strategies.
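Choosing a cytokine cut-off from a ROC curve, for instance by maximising Youden's index (sensitivity + specificity - 1), can be sketched as below; the IL-6 values and infection labels are invented, and the 42 pg/ml threshold in the abstract comes from the authors' own data.

```python
# ROC-based cut-off selection via Youden's index (sketch with invented data).
import numpy as np
from sklearn.metrics import roc_curve

il6 = np.array([5, 12, 20, 30, 38, 45, 60, 80, 150, 300], dtype=float)
severe_infection = np.array([0, 0, 0, 0, 0, 1, 0, 1, 1, 1])

fpr, tpr, thresholds = roc_curve(severe_infection, il6)
best = np.argmax(tpr - fpr)                    # maximise Youden's J
print(f"Optimal cut-off ~ {thresholds[best]:.0f} pg/ml, "
      f"sensitivity {tpr[best]:.0%}, specificity {1 - fpr[best]:.0%}")
```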
Abstract:
BACKGROUND: Renovascular vasoconstriction in patients with hepatorenal syndrome can be quantified by the renal arterial resistance index (RI). We investigated the value of RI measurement in the detection of renal function impairment in patients with different stages of chronic liver disease. METHODS: Subjects were divided into 4 groups: 21 patients with liver cirrhosis and ascites, 25 patients with liver cirrhosis without ascites, 35 patients with fatty liver disease, and 78 control subjects. All subjects underwent abdominal ultrasound examination with renal RI measurement, which was correlated with laboratory results for renal function. RESULTS: RI was significantly higher in ascitic than in non-ascitic cirrhotic patients (0.74 vs. 0.67, p<0.01) and in non-ascitic patients with liver cirrhosis than in control subjects (0.67 vs. 0.62, p<0.01). Elevated RI levels were found in 48% (19/40) of patients with liver cirrhosis and normal serum creatinine concentration. There were no significant differences in RI levels between patients with fatty liver disease and controls (0.63 vs. 0.62). CONCLUSIONS: Intrarenal RI measurement is a predictor of renal vasoconstriction and serves to detect early renal function impairment in cirrhotic patients. An elevated RI may be taken into account in the clinical management of these patients.
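The renal arterial resistance index is computed from Doppler velocities as RI = (peak systolic velocity - end-diastolic velocity) / peak systolic velocity; the velocities in the sketch below are illustrative, and 0.70 is used only as a commonly cited upper reference limit.

```python
def resistance_index(peak_systolic_velocity: float,
                     end_diastolic_velocity: float) -> float:
    """Renal arterial resistance index from Doppler velocities:
    RI = (PSV - EDV) / PSV (velocities in the same unit, e.g. cm/s)."""
    return (peak_systolic_velocity - end_diastolic_velocity) / peak_systolic_velocity

if __name__ == "__main__":
    # Illustrative velocities; 0.70 is a commonly cited upper limit of normal.
    ri = resistance_index(peak_systolic_velocity=40.0, end_diastolic_velocity=10.0)
    print(f"RI = {ri:.2f}{' (elevated)' if ri > 0.70 else ''}")
```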
Abstract:
OBJECTIVE: The aim of this study was to establish and validate a three-dimensional imaging protocol for the assessment of computed tomography (CT) scans of abdominal aortic aneurysms in UK EVAR trials patients. Quality control and repeatability of anatomical measurements are important for the validity of any core laboratory. METHODS: Three different observers performed anatomical measurements on 50 preoperative CT scans of aortic aneurysms using the Vitrea 2 three-dimensional post-imaging software in a core laboratory setting. We assessed intra- and inter-observer repeatability of measurements, the time required for collection of measurements, 3 different levels of automation and 3 different automated criteria for measurement of neck length. RESULTS: None of the automated neck length measurements demonstrated sufficient accuracy, and it was necessary to check the important automated landmarks. Good intra-observer and limited inter-observer agreement were achieved with three-dimensional assessment. Complete assessment of the aneurysm and iliac arteries took an average (SD) of 17.2 (4.1) minutes. CONCLUSIONS: Aortic aneurysm anatomy can be assessed reliably and quickly using three-dimensional assessment, but for scans of limited quality, manual checking of important landmarks remains necessary. Using a set protocol, agreement between observers is satisfactory but not as good as within observers.
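Observer repeatability of such measurements is often summarised with the Bland-Altman bias and 95% limits of agreement; the sketch below uses invented neck-length measurements and shows a generic approach, not the core laboratory's exact statistics.

```python
# Bland-Altman style agreement between two measurement sets (sketch):
# e.g. the same observer measuring neck length twice, or two observers.
import numpy as np

obs_a = np.array([24.1, 18.5, 30.2, 22.0, 27.4])   # neck length, mm (invented)
obs_b = np.array([23.6, 19.0, 29.5, 22.8, 26.9])

diff = obs_a - obs_b
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)               # half-width of 95% limits
print(f"bias = {bias:.2f} mm, limits of agreement = "
      f"{bias - half_width:.2f} to {bias + half_width:.2f} mm")
```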
Abstract:
The widespread species Escherichia coli includes a broad variety of different types, ranging from highly pathogenic strains causing worldwide outbreaks of severe disease to avirulent isolates which are part of the normal intestinal flora or which are well characterized and safe laboratory strains. The pathogenicity of a given E. coli strain is mainly determined by specific virulence factors, which include adhesins, invasins, toxins and capsules. They are often organized in large genetic blocks either on the chromosome ('pathogenicity islands'), on large plasmids or on phages and can be transmitted horizontally between strains. In this review we summarize the current knowledge of the virulence attributes which determine the pathogenic potential of E. coli strains and the methodology available to assess the virulence of E. coli isolates. We also focus on a recently developed procedure based on a broad-range detection system for E. coli-specific virulence genes that makes it possible to determine the potential pathogenicity and its nature in E. coli strains from various sources. This allows determination of the pathotype of E. coli strains in medical diagnostics, assessment of the virulence and health risks of E. coli contaminating water, food and the environment, and study of potential reservoirs of virulence genes which might contribute to the emergence of new forms of pathogenic E. coli.
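Purely as an illustration of gene-based pathotyping, the sketch below maps the presence or absence of a few well-known virulence marker genes (stx1/stx2 for Shiga toxins, eae for intimin, elt/est for enterotoxins) to pathotypes; the detection system discussed in the review covers a far broader gene panel, and this simplified mapping is not its actual decision logic.

```python
# Simplified, illustrative pathotype assignment from detected marker genes.

def pathotype(detected_genes: set[str]) -> str:
    if detected_genes & {"stx1", "stx2"}:
        return "STEC/EHEC (Shiga toxin-producing)"
    if "eae" in detected_genes:
        return "EPEC (attaching/effacing, no Shiga toxin)"
    if detected_genes & {"elt", "est"}:
        return "ETEC (enterotoxigenic)"
    return "no pathotype assigned from this marker set"

print(pathotype({"stx2", "eae"}))   # -> STEC/EHEC
print(pathotype(set()))             # -> no pathotype assigned
```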
Abstract:
While the pathology peer review/pathology working group (PWG) model has long been used in mammalian toxicologic pathology to ensure the accuracy, consistency, and objectivity of histopathology data, application of this paradigm to ecotoxicological studies has thus far been limited. In the current project, the PWG approach was used to evaluate histopathologic sections of gills, liver, kidney, and/or intestines from three previously published studies of diclofenac in trout, among which there was substantial variation in the reported histopathologic findings. The main objectives of this review process were to investigate and potentially reconcile these interstudy differences and, based on the results, to establish an appropriate no observed effect concentration (NOEC). Following a complete examination of all histologic sections and original diagnoses by a single experienced fish pathologist (pathology peer review), a two-day PWG session was conducted to allow members of a four-person expert panel to determine the extent of treatment-related findings in each of the three trout studies. The PWG was conducted according to the United States Environmental Protection Agency (US EPA) Pesticide Regulation (PR) 94-5 (EPA Pesticide Regulation, 1994). In accordance with standard procedures, the PWG review was conducted by the non-voting chairperson in a manner intended to minimize bias, and thus during the evaluation, the four voting panelists were unaware of the treatment group status of individual fish and the original diagnoses associated with the histologic sections. Based on the results of this review, findings related to diclofenac exposure included minimally to slightly increased thickening of the gill filament tips in fish exposed to the highest concentration tested (1,000 μg/L), plus a previously undiagnosed finding, decreased hepatic glycogen, which also occurred at the 1,000 μg/L dose level. The panel found little evidence to support other reported effects of diclofenac in trout, and thus the overall NOEC was determined to be >320 μg/L. By consensus, the PWG panel was able to identify diagnostic inconsistencies among and within the three prior studies; therefore, this exercise demonstrated the value of the pathology peer review/PWG approach for assessing the reliability of histopathology results that may be used by regulatory agencies for risk assessment.
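A generic way to derive a NOEC from incidence data is to compare each exposure group against the control, for instance with Fisher's exact test, and take the highest concentration without a significant increase; the counts below are invented, and this is not the PWG panel's actual statistical procedure, whose NOEC (>320 μg/L) rested on the consensus histopathology review.

```python
# Generic NOEC determination from lesion incidence data (sketch, invented counts).
from scipy.stats import fisher_exact

n_per_group = 12
control_affected = 1
groups = {32: 1, 100: 2, 320: 2, 1000: 9}       # ug/L -> affected fish

noec = None
for conc, affected in sorted(groups.items()):
    table = [[affected, n_per_group - affected],
             [control_affected, n_per_group - control_affected]]
    _, p = fisher_exact(table, alternative="greater")
    if p < 0.05:                                # significant increase vs control
        break
    noec = conc
print(f"NOEC = {noec} ug/L")
```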
Abstract:
Alveolar echinococcosis (AE) in humans is a parasitic disease characterized by severe damage to the liver and occasionally other organs. AE is caused by infection with the metacestode (larval) stage of the fox tapeworm Echinococcus multilocularis, which usually infects small rodents as natural intermediate hosts. Conventionally, human AE is treated chemotherapeutically with mebendazole or albendazole. There is, however, still a need for improved chemotherapeutic options. Primary in vivo studies on drugs of interest are commonly performed in small laboratory animals such as mice and Mongolian jirds, and in most cases a secondary infection model is used, whereby E. multilocularis metacestodes are directly injected into the peritoneal cavity or into the liver. Disadvantages of this methodological approach include the risk of injury to organs during inoculation and, most notably, limited macroscopic (visible) assessment of treatment efficacy. Thus, in order to monitor the efficacy of chemotherapeutic treatment, animals have to be euthanized and the parasite tissue dissected. In the present study, mice were infected with E. multilocularis metacestodes through the subcutaneous route and were then subjected to chemotherapy employing albendazole. Serological responses to infection were comparatively assessed in mice infected by the conventional intraperitoneal route. We demonstrate that the subcutaneous infection model for secondary AE facilitates the assessment of the progress of infection and drug treatment in the live animal.