913 results for MS-based methods


Relevance: 80.00%

Abstract:

Toxoplasma gondii is an obligate intracellular parasite capable of infecting virtually all warm-blooded species, including humans, but cats are the only definitive hosts. Humans and animals acquire T. gondii infection by ingesting food or water contaminated with sporulated oocysts or by ingesting tissue cysts containing bradyzoites. Toxoplasmosis has the highest human incidence among zoonotic parasitic diseases, yet it is still considered an underreported zoonosis. The importance of primary T. gondii infection in livestock lies in the parasite's ability to produce tissue cysts in infected animals, which may represent important sources of infection for humans; consumption of undercooked mutton and pork is considered a major source of human T. gondii infection. The first aim of this thesis was to develop a rapid and sensitive in-house indirect ELISA for the detection of antibodies against T. gondii in sheep sera. ROC-curve analysis showed high discriminatory power (AUC=0.999), with high sensitivity (99.4%) and specificity (99.8%). The ELISA was used to test a batch of 375 sheep sera collected in the Forlì-Cesena district. The overall prevalence was estimated at 41.9%, demonstrating that T. gondii infection is widespread in sheep reared in the district. Since the epidemiological impact of the waterborne transmission route of T. gondii to humans is now thought to be more significant than previously believed, the second aim of the thesis was to evaluate PCR-based methods for detecting T. gondii DNA in raw and finished drinking water samples collected in Scotland. Samples were tested using a quantitative PCR targeting the 529 bp repetitive element. Only one raw water sample (0.3%) out of the 358 examined tested positive for T. gondii, providing no evidence that tap water is a source of Toxoplasma infection in Scotland.
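
As a rough illustration of the ROC analysis described above (not the thesis code, and with simulated optical densities standing in for real sera), AUC, sensitivity and specificity at a Youden-optimal cut-off could be derived as follows:

```python
# Minimal sketch: ROC analysis of an indirect ELISA from optical-density (OD)
# readings of reference-negative and reference-positive sera. Data are simulated.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
od_neg = rng.normal(0.15, 0.05, 200)          # OD values, reference-negative sera
od_pos = rng.normal(0.90, 0.20, 150)          # OD values, reference-positive sera
y_true = np.r_[np.zeros_like(od_neg), np.ones_like(od_pos)]
y_score = np.r_[od_neg, od_pos]

auc = roc_auc_score(y_true, y_score)
fpr, tpr, thresholds = roc_curve(y_true, y_score)
best = np.argmax(tpr - fpr)                   # Youden index for cut-off choice
print(f"AUC = {auc:.3f}")
print(f"cut-off = {thresholds[best]:.2f}, "
      f"sensitivity = {tpr[best]:.3f}, specificity = {1 - fpr[best]:.3f}")
```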

Relevance: 80.00%

Abstract:

Over the past ten years, the cross-correlation of long time series of ambient seismic noise (ASN) has been widely adopted to extract the surface-wave part of the Green's function (GF). This stochastic procedure relies on the assumption that the ASN wavefield is diffuse and stationary. At frequencies below 1 Hz, the ASN is mainly composed of surface waves, whose origin is attributed to the sea-wave climate. Consequently, marked directional properties may be observed, which call for an accurate investigation of the location and temporal evolution of the ASN sources before attempting any GF retrieval. Within this general context, this thesis investigates the feasibility and robustness of noise-based methods for imaging complex geological structures at the local (∼10-50 km) scale. The study focuses on the analysis of an extended (11-month) seismological data set collected at the Larderello-Travale geothermal field (Italy), an area for which the underground geological structures are well constrained thanks to decades of geothermal exploration. Focusing on the secondary microseism band (SM; f > 0.1 Hz), I first investigate the spectral features and the kinematic properties of the noise wavefield using beamforming analysis, highlighting a marked variability with time and frequency. In the 0.1-0.3 Hz band and during spring and summer, the SM waves propagate with high apparent velocities and from well-defined directions, likely associated with ocean storms in the southern hemisphere. Conversely, at frequencies above 0.3 Hz the distribution of back-azimuths is more scattered, indicating that this frequency band is the most appropriate for the application of stochastic techniques. For this latter frequency interval, I tested two correlation-based methods, acting in the time (NCF) and frequency (modified SPAC) domains and yielding estimates of the group- and phase-velocity dispersions, respectively. The velocity data provided by the two methods are markedly discordant; comparison with independent geological and geophysical constraints suggests that the NCF results are more robust and reliable.
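
A minimal sketch of the kind of noise cross-correlation stacking implied above, under simplifying assumptions (synthetic day-long records, spectral whitening only; the actual processing parameters of the thesis are not known here):

```python
# Daily cross-correlation of ambient-noise records from two stations, spectrally
# whitened and stacked; the time-symmetric part of the stack approximates the
# inter-station surface-wave Green's function.
import numpy as np

def whiten(x):
    spec = np.fft.rfft(x)
    return np.fft.irfft(spec / (np.abs(spec) + 1e-12), len(x))

def daily_ncf(a, b):
    """Frequency-domain cross-correlation of two same-length day records."""
    fa, fb = np.fft.rfft(whiten(a)), np.fft.rfft(whiten(b))
    return np.fft.fftshift(np.fft.irfft(fa * np.conj(fb), len(a)))

rng = np.random.default_rng(1)
days = [rng.standard_normal((2, 86400)) for _ in range(30)]   # synthetic data
stack = np.mean([daily_ncf(a, b) for a, b in days], axis=0)   # 30-day stack
print(stack.shape)
```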

Relevance: 80.00%

Abstract:

In this thesis, new advances in the development of spectroscopy-based methods for the characterization of heritage materials have been achieved. Regarding FTIR spectroscopy, new approaches aimed at exploiting the near- and far-IR regions for the characterization of inorganic and organic materials have been tested. Paint cross-sections have been analysed by FTIR spectroscopy in the NIR range, and an ad hoc chemometric approach has been developed for processing the resulting hyperspectral maps. Moreover, a new method for the characterization of calcite based on the use of grinding curves has been set up in both the MIR and the far-IR regions. Calcite is a material widely used in cultural heritage, and this spectroscopic approach is an efficient and rapid tool to distinguish between different calcite samples. Different enhanced vibrational techniques for the characterization of dyed fibres have also been tested. First, a SEIRA (Surface Enhanced Infra-Red Absorption) protocol was optimised, allowing the analysis of colorant micro-extracts thanks to the enhancement produced by the addition of gold nanoparticles. These preliminary studies led to the identification of a new enhanced FTIR method, named ATR/RAIRS, which reaches lower detection limits. Regarding Raman microscopy, the research followed two lines, both aimed at avoiding the use of colloidal solutions. AgI-based supports, obtained by deposition on gold-coated glass slides, have been developed and tested by spotting colorant solutions; a SERS spectrum can be obtained thanks to the photoreduction that the laser induces on the silver salt. Moreover, these supports can be used for the TLC separation of a mixture of colorants, and analyses by both Raman/SERS and ATR/RAIRS can then be performed successfully. Finally, a photoreduction method for the on-fibre analysis of colorants without the need for any extraction has been optimised.
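
As an illustrative sketch only (the thesis's own chemometric workflow is not specified here), a hyperspectral NIR map of a paint cross-section could be reduced to false-colour score maps with a plain PCA on the unfolded data cube:

```python
# PCA on the unfolded (pixels x wavenumbers) hyperspectral cube, then refold
# the component scores into images. The cube below is synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
ny, nx, nwav = 40, 60, 300                 # map size and number of NIR channels
cube = rng.random((ny, nx, nwav))          # synthetic hyperspectral data cube

X = cube.reshape(-1, nwav)
X = X - X.mean(axis=0)                     # mean-centre each wavenumber channel
scores = PCA(n_components=3).fit_transform(X)
score_maps = scores.reshape(ny, nx, 3)     # one false-colour map per component
print(score_maps.shape)
```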

Relevance: 80.00%

Abstract:

Movement analysis carried out in laboratory settings is a powerful but costly solution, since it requires dedicated instrumentation, space and personnel. Recently, new technologies such as magnetic and inertial measurement units (MIMUs) have become widely accepted as tools for the assessment of human motion in clinical and research settings. They are relatively easy to use and potentially suitable for estimating gait kinematic features, including spatio-temporal parameters. The objective of this thesis is the development and clinical testing of robust MIMU-based methods for assessing gait spatio-temporal parameters that are applicable across a number of different pathological gait patterns. First, given the need for a solution as unobtrusive as possible, the validity of the single-unit approach was explored: methods reported in the literature for estimating gait temporal parameters from a single unit attached to the trunk were comparatively evaluated, first in normal gait and then in different pathological gait conditions. The second part of the research then addressed the development of new methods for estimating gait spatio-temporal parameters using shank-worn MIMUs in different groups of pathological subjects. In addition to the conventional gait parameters, new methods for estimating changes in the direction of progression were explored. Finally, a new hardware solution and the relevant methodology for estimating inter-foot distance during walking were proposed. The technical validation of the proposed methods against a gold standard, at different walking speeds and along different paths, showed that two MIMUs attached to the lower limbs, combined with a robust method, guarantee much higher accuracy in determining gait spatio-temporal parameters. In conclusion, the proposed methods can be reliably applied to various abnormal gaits, in some cases with an accuracy comparable to that obtained in normal gait.
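
A simplified, hedged sketch of how stride times can be pulled from a shank-worn gyroscope signal (a generic mid-swing peak-picking approach, not the specific methods developed in the thesis; the signal is synthetic):

```python
# Stride times estimated from the mid-swing peaks of a shank gyroscope's
# sagittal angular velocity. A real recording would replace the synthetic signal.
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                        # sampling frequency, Hz
t = np.arange(0, 20, 1 / fs)
stride_period = 1.1                               # s, assumed for the synthetic signal
rng = np.random.default_rng(3)
omega = 4.0 * np.sin(2 * np.pi * t / stride_period) + 0.2 * rng.standard_normal(t.size)

# Mid-swing shows up as a prominent positive peak once per stride.
peaks, _ = find_peaks(omega, height=2.0, distance=int(0.6 * fs))
stride_times = np.diff(peaks) / fs
print(f"mean stride time: {stride_times.mean():.2f} s, "
      f"cadence: {60 / stride_times.mean():.1f} strides/min")
```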

Relevance: 80.00%

Abstract:

As a large and long-lived species with high economic value, restricted spawning areas and short spawning periods, the Atlantic bluefin tuna (BFT; Thunnus thynnus) is particularly susceptible to over-exploitation. Although BFT have been targeted by fisheries in the Mediterranean Sea for thousands of years, only in recent decades has the exploitation rate reached far beyond sustainable levels. An understanding of the population structure, spatial dynamics, exploitation rates and the environmental variables that affect BFT is crucial for the conservation of the species. The aims of this PhD project were 1) to assess the accuracy of larval identification methods, 2) to determine the genetic structure of modern BFT populations, 3) to assess the self-recruitment rate in the Gulf of Mexico and Mediterranean spawning areas, 4) to estimate the immigration rate of BFT to feeding aggregations from the various spawning areas, and 5) to develop tools capable of investigating the temporal stability of population structuring in the Mediterranean Sea. Several weaknesses of modern morphology-based taxonomy are reviewed, including the demographic decline of expert taxonomists, flawed identification keys, the reluctance of the taxonomic community to embrace advances in digital communications, and a general scarcity of modern user-friendly materials. Barcoding of scombrid larvae revealed important differences in the accuracy of the taxonomic identifications carried out by different ichthyoplanktologists following morphology-based methods. Using a genotyping-by-sequencing approach, a panel of 95 SNPs was developed and used to characterize the population structuring of BFT and the composition of adult feeding aggregations. Using novel molecular techniques, DNA was extracted from bluefin tuna vertebrae excavated from late Iron Age and ancient Roman settlements, from Byzantine-era Constantinople, and from a 20th-century collection. A second panel of 96 SNPs was developed to genotype historical and modern samples in order to elucidate changes in population structuring and in the allele frequencies of loci associated with selective traits.
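
For illustration only (not the project's actual analysis pipeline), allele frequencies and a basic per-locus Fst between two hypothetical spawning-area samples could be computed from 0/1/2-coded SNP genotypes like this:

```python
# Basic Fst sketch, F_ST = (H_T - H_S) / H_T, from diploid SNP genotypes coded
# as copies of the alternate allele in two hypothetical population samples.
import numpy as np

rng = np.random.default_rng(4)
med = rng.integers(0, 3, size=(60, 95))     # 60 Mediterranean fish x 95 SNPs
gom = rng.integers(0, 3, size=(40, 95))     # 40 Gulf of Mexico fish x 95 SNPs

def freq(g):
    return g.mean(axis=0) / 2.0             # alternate-allele frequency per SNP

p1, p2 = freq(med), freq(gom)
p_bar = (p1 + p2) / 2.0
h_t = 2 * p_bar * (1 - p_bar)               # expected total heterozygosity
h_s = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2.0
fst = np.where(h_t > 0, (h_t - h_s) / h_t, 0.0)
print(f"mean per-locus Fst: {fst.mean():.4f}")
```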

Relevance: 80.00%

Abstract:

This work tests recent methodological developments from the field of numerical integration for the approximate evaluation of state-space models. The resulting algorithms are compared with the popular simulation-based approximation methods with respect to their approximation quality.
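
As a minimal illustration of the contrast being drawn (the actual models and algorithms of the thesis are not specified here), a Gaussian expectation of the type that appears in nonlinear state-space filtering can be evaluated by Gauss-Hermite quadrature and compared with a plain Monte Carlo approximation:

```python
# E[g(x)] with x ~ N(mu, sigma^2): deterministic quadrature vs simulation.
import numpy as np

mu, sigma = 0.5, 1.2
g = lambda x: np.exp(np.sin(x))             # some nonlinear measurement function

# Gauss-Hermite integrates exp(-t^2) f(t); substitute x = mu + sqrt(2)*sigma*t.
nodes, weights = np.polynomial.hermite.hermgauss(20)
gh = np.sum(weights * g(mu + np.sqrt(2) * sigma * nodes)) / np.sqrt(np.pi)

mc = g(np.random.default_rng(5).normal(mu, sigma, 100_000)).mean()
print(f"Gauss-Hermite (20 nodes): {gh:.6f}   Monte Carlo (1e5 draws): {mc:.6f}")
```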

Relevance: 80.00%

Abstract:

Robust and accurate identification of intervertebral discs from low-resolution, sparse MRI scans is essential for automated scan planning of MRI spine examinations. This paper presents a graphical-model-based solution for detecting both the positions and the orientations of intervertebral discs from such scans. Compared with existing graphical-model-based methods, the proposed method requires no training stage or training data, and it can automatically determine the number of vertebrae visible in the image. Experiments on 25 low-resolution, sparse spine MRI data sets verified its performance.
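
A generic sketch of MAP inference in a chain-structured graphical model by dynamic programming, of the kind such detection methods build on (this is not the paper's actual formulation; the scores and spacing prior below are hypothetical):

```python
# Exact max-sum dynamic programming on a chain of disc variables.
import numpy as np

def map_chain(unary, pairwise):
    """unary: (n, K) log-scores per disc and candidate; pairwise: (K, K)
    log-prior between consecutive discs. Returns the best candidate per disc."""
    n, K = unary.shape
    msg = unary[0].copy()
    back = np.zeros((n, K), dtype=int)
    for i in range(1, n):
        scores = msg[:, None] + pairwise          # combine with previous disc
        back[i] = scores.argmax(axis=0)
        msg = scores.max(axis=0) + unary[i]
    path = np.empty(n, dtype=int)
    path[-1] = msg.argmax()
    for i in range(n - 1, 0, -1):
        path[i - 1] = back[i, path[i]]
    return path

# Hypothetical example: 6 discs, 50 candidate axial positions each; the pairwise
# term penalises implausible spacing between consecutive disc positions.
rng = np.random.default_rng(6)
unary = rng.random((6, 50))
positions = np.arange(50, dtype=float)
pairwise = -0.5 * (positions[None, :] - positions[:, None] - 8.0) ** 2 / 4.0
print(map_chain(unary, pairwise))
```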

Relevance: 80.00%

Abstract:

Over the last decade, translational science has come into the focus of academic medicine, and significant intellectual and financial efforts have been made to initiate a multitude of bench-to-bedside projects. The quest for suitable biomarkers that will significantly change clinical practice has become one of the biggest challenges in translational medicine. Quantitative measurement of proteins is a critical step in biomarker discovery. Assessing a large number of potential protein biomarkers in a statistically significant number of samples and controls still constitutes a major technical hurdle. Multiplexed analysis offers significant advantages regarding time, reagent cost, sample requirements and the amount of data that can be generated. The two contemporary approaches in multiplexed and quantitative biomarker validation, antibody-based immunoassays and MS-based multiple (or selected) reaction monitoring, are based on different assay principles and instrument requirements. Both approaches have their own advantages and disadvantages and therefore play complementary roles in the multi-stage biomarker verification and validation process. In this review, we discuss quantitative immunoassay and multiple reaction monitoring/selected reaction monitoring assay principles and development. We also discuss how to choose an appropriate platform, judge assay performance, and obtain reliable, quantitative results for translational research and clinical applications in the biomarker field.

Relevance: 80.00%

Abstract:

Most butterfly monitoring protocols rely on counts along transects (Pollard walks) to generate species abundance indices and track population trends. It is still too often ignored that a population count results from two processes: the biological process (true abundance) and the statistical process (our ability to properly quantify abundance). Because individual detectability tends to vary in space (e.g., among sites) and time (e.g., among years), it remains unclear whether index counts truly reflect population sizes and trends. This study compares capture-mark-recapture (absolute abundance) and count-index (relative abundance) monitoring methods in three species (Maculinea nausithous and Iolana iolas: Lycaenidae; Minois dryas: Satyridae) in contrasting habitat types. We demonstrate that intraspecific variability in individual detectability under standard monitoring conditions is probably the rule rather than the exception, which calls into question the reliability of count-based indices for estimating and comparing population abundances. Our results suggest that the accuracy of count-based methods depends heavily on the ecology and behavior of the target species, as well as on the type of habitat in which surveys take place. Monitoring programs designed to assess the abundance and trends of butterfly populations should incorporate a measure of detectability. We discuss the relative advantages and drawbacks of current monitoring methods and analytical approaches with respect to the characteristics of the species under scrutiny and the resources available.
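
A toy simulation (illustrative only, not the study's analysis) showing why a raw count index underestimates abundance when detectability is below one, compared with a two-session capture-mark-recapture estimate:

```python
# Pollard-walk-style count index vs a Chapman capture-mark-recapture estimate
# of absolute abundance, for a population with imperfect detectability p.
import numpy as np

rng = np.random.default_rng(7)
N, p = 500, 0.4                               # true population size, detection prob.

count_index = rng.binomial(N, p)              # single transect-style count

marked = rng.binomial(N, p)                   # session 1: captured and marked
caught2 = rng.random(N) < p                   # session 2 captures
is_marked = np.arange(N) < marked             # first `marked` animals carry marks
recaptured = int(np.sum(caught2 & is_marked))
n2 = int(np.sum(caught2))
chapman = (marked + 1) * (n2 + 1) / (recaptured + 1) - 1

print(f"true N = {N}, count index = {count_index}, CMR estimate = {chapman:.0f}")
```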

Relevance: 80.00%

Abstract:

Lung stereology has a long and successful tradition. From mice to men, the application of new stereological methods at several levels (alveoli, parenchymal cells, organelles, proteins) has led to new insights into normal lung architecture, parenchymal remodelling in emphysema-like pathology, alveolar type II cell hyperplasia and hypertrophy, intracellular surfactant alterations, and the distribution of surfactant proteins. The Euler number of the network of alveolar openings, estimated using physical disectors at the light microscopic level, is an unbiased and direct estimate of alveolar number. Surfactant-producing alveolar type II cells can be counted, and sampled for local size estimation, with physical disectors at high-magnification light microscopy. The number of their surfactant storage organelles, the lamellar bodies, can be estimated using physical disectors at the EM level. By immunoelectron microscopy, surfactant protein distribution can be analysed with the relative labelling index. Together with the well-established classical stereological methods, these design-based methods now allow for a complete quantitative phenotype analysis in lung development and disease, including the structural characterization of gene-manipulated mice, at the light and electron microscopic level.
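
A minimal numerical sketch of the physical disector principle mentioned above, using made-up counts (the relation N_V = ΣQ⁻ / (n·a·h), with counting-frame area a and disector height h, is standard; the values are purely illustrative):

```python
# Number estimation with physical disectors: Q_minus are objects counted in the
# reference section but not in the look-up section of each disector pair.
q_minus = [3, 5, 2, 4, 6, 3, 4, 5]      # counts from 8 disector pairs (illustrative)
a = 10_000.0                            # counting-frame area, um^2
h = 3.0                                 # disector height, um
v_ref = 2.5e9                           # reference (lung) volume, um^3

n_v = sum(q_minus) / (len(q_minus) * a * h)    # numerical density, 1/um^3
n_total = n_v * v_ref                          # total number estimate
print(f"N_V = {n_v:.3e} per um^3, N = {n_total:.3e}")
```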

Relevance: 80.00%

Abstract:

Professor Sir David R. Cox (DRC) is widely acknowledged as among the most important scientists of the second half of the twentieth century. He inherited the mantle of statistical science from Pearson and Fisher, advanced their ideas, and translated statistical theory into practice so as to forever change the application of statistics in many fields, but especially biology and medicine. The logistic and proportional hazards models, which he substantially developed, are arguably among the most influential biostatistical methods in current practice. This paper looks forward over the period from DRC's 80th to 90th birthdays to speculate about the future of biostatistics, drawing lessons from DRC's contributions along the way. We consider "Cox's model" (CM) of biostatistics, an approach to statistical science that: formulates scientific questions or quantities in terms of parameters gamma in probability models f(y; gamma) that represent, in a parsimonious fashion, the underlying scientific mechanisms (Cox, 1997); partitions the parameters gamma = (theta, eta) into a subset of interest theta and other "nuisance parameters" eta necessary to complete the probability distribution (Cox and Hinkley, 1974); develops methods of inference about the scientific quantities that depend as little as possible upon the nuisance parameters (Barndorff-Nielsen and Cox, 1989); and thinks critically about the appropriate conditional distribution on which to base inferences. We briefly review exciting biomedical and public health challenges that are capable of driving statistical developments in the next decade. We discuss the statistical models and model-based inferences central to the CM approach, contrasting them with the computationally intensive strategies for prediction and inference advocated by Breiman and others (e.g. Breiman, 2001) and with more traditional design-based methods of inference (Fisher, 1935). We discuss the hierarchical (multi-level) model as an example of the future challenges and opportunities for model-based inference. We then consider the role of conditional inference, a second key element of the CM. Recent examples from genetics are used to illustrate these ideas. Finally, the paper examines causal inference and statistical computing, two other topics we believe will be central to biostatistics research and practice in the coming decade. Throughout the paper, we attempt to indicate how DRC's work and the "Cox Model" have set a standard of excellence to which all can aspire in the future.
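
As a standard textbook illustration of the "parameter of interest versus nuisance parameter" idea described above (not quoted from the paper), Cox's proportional hazards model eliminates the infinite-dimensional baseline hazard through the partial likelihood:

```latex
% Interest parameter beta; nuisance baseline hazard lambda_0, removed by conditioning.
\lambda(t \mid x) = \lambda_0(t)\,\exp(x^\top \beta), \qquad
L_{\mathrm{part}}(\beta) = \prod_{i:\,\delta_i = 1}
  \frac{\exp(x_i^\top \beta)}{\sum_{j \in R(t_i)} \exp(x_j^\top \beta)}
```

Here delta_i indicates an observed event and R(t_i) is the risk set at time t_i; the baseline hazard lambda_0 drops out of the partial likelihood entirely.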

Relevance: 80.00%

Abstract:

BACKGROUND: In contrast to RIA, recently available ELISAs offer the potential for fully automated analysis of adiponectin. To date, studies reporting on the diagnostic characteristics of ELISAs and investigating the relationship between ELISA- and RIA-based methods are rare. METHODS: We therefore established and evaluated a fully automated platform (BEP 2000; Dade-Behring, Switzerland) for the determination of adiponectin levels in serum by two different ELISA methods (a competitive human adiponectin ELISA and a high-sensitivity human adiponectin sandwich ELISA; both Biovendor, Czech Republic). As a reference method, we also employed a human adiponectin RIA (Linco Research, USA). Samples from 150 patients routinely presenting to our cardiology unit were tested. RESULTS: ELISA measurements could be accomplished in less than 3 h, whereas the RIA took 24 h. The ELISAs were evaluated for precision, analytical sensitivity and specificity, linearity on dilution, and spiking recovery. In the investigated patients, type 2 diabetes, higher age and male gender were significantly associated with lower serum adiponectin concentrations. Correlations between the ELISA methods and the RIA were strong (competitive ELISA, r=0.82; sandwich ELISA, r=0.92; both p<0.001). However, Deming regression and Bland-Altman analysis indicated a lack of agreement among the three methods, preventing direct comparison of results. The equations of the regression lines are: competitive ELISA = 1.48 x RIA - 0.88; high-sensitivity sandwich ELISA = 0.77 x RIA + 1.01. CONCLUSIONS: Fully automated measurement of adiponectin by ELISA is feasible and substantially more rapid than RIA. The investigated ELISA test systems seem to exhibit analytical characteristics allowing for clinical application. In addition, there is a strong correlation between the ELISA methods and the RIA. These findings might promote a more widespread use of adiponectin measurements in clinical research.
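
A small sketch (simulated paired data, not the study's measurements) of the two method-comparison tools used above: Deming regression with an assumed error-variance ratio of 1, and Bland-Altman limits of agreement:

```python
# Deming regression and Bland-Altman analysis for comparing an ELISA with an
# RIA reference method.
import numpy as np

rng = np.random.default_rng(8)
ria = rng.uniform(2, 30, 150)                          # adiponectin, ug/mL
elisa = 1.5 * ria - 0.9 + rng.normal(0, 1.5, 150)      # systematically biased ELISA

def deming(x, y, lam=1.0):
    """Deming regression slope/intercept; lam is the ratio of error variances."""
    sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    slope = (syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2
             + 4 * lam * sxy ** 2)) / (2 * sxy)
    return slope, y.mean() - slope * x.mean()

slope, intercept = deming(ria, elisa)
diff = elisa - ria
print(f"Deming: ELISA = {slope:.2f} x RIA + {intercept:.2f}")
print(f"Bland-Altman bias = {diff.mean():.2f}, limits of agreement = "
      f"{diff.mean() - 1.96 * diff.std(ddof=1):.2f} to "
      f"{diff.mean() + 1.96 * diff.std(ddof=1):.2f}")
```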

Relevance: 80.00%

Abstract:

Simulation-based assessment is a popular and frequently necessary approach to the evaluation of statistical procedures. Sometimes overlooked is the ability to take advantage of underlying mathematical relations, and we focus on this aspect. We show how to take advantage of large-sample theory when conducting a simulation, using the analysis of genomic data as a motivating example. The approach uses convergence results to provide an approximation to smaller-sample results that are otherwise available only by simulation. We consider evaluating and comparing a variety of ranking-based methods for identifying the most highly associated SNPs in a genome-wide association study, derive integral-equation representations of the pre-posterior distribution of percentiles produced by three ranking methods, and provide examples comparing performance. These results are of interest in their own right and set the framework for a more extensive set of comparisons.
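
For orientation only, here is a brute-force version of the kind of quantity the paper approximates with large-sample results instead of simulation: the percentile attained by one truly associated SNP when all SNPs are ranked by p-value (all parameters below are invented):

```python
# Simulated distribution of the rank percentile of a truly associated SNP
# among null SNPs in a GWAS-like setting.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
n_snps, n_reps, effect_z = 10_000, 500, 4.0

percentiles = np.empty(n_reps)
for r in range(n_reps):
    z = rng.standard_normal(n_snps)        # null SNP test statistics
    z[0] += effect_z                       # one truly associated SNP
    p = 2 * stats.norm.sf(np.abs(z))       # two-sided p-values
    rank = (p < p[0]).sum() + 1            # rank of the associated SNP (1 = best)
    percentiles[r] = rank / n_snps

print(f"median percentile of the associated SNP: {np.median(percentiles):.4f}")
```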

Relevance: 80.00%

Abstract:

This book will serve as a foundation for a variety of useful applications of graph theory to computer vision, pattern recognition, and related areas. It covers a representative set of novel graph-theoretic methods for complex computer vision and pattern recognition tasks. The first part of the book presents the application of graph theory to low-level processing of digital images, such as a new method for partitioning a given image into a hierarchy of homogeneous areas using graph pyramids, and a study of the relationship between graph theory and digital topology. Part II presents graph-theoretic learning algorithms for high-level computer vision and pattern recognition applications, including a survey of graph-based methodologies for pattern recognition and computer vision, a presentation of a series of computationally efficient algorithms for testing graph isomorphism and related graph matching tasks in pattern recognition, and a new graph distance measure for solving graph matching problems. Finally, Part III provides detailed descriptions of several applications of graph-based methods to real-world pattern recognition tasks. It includes a critical review of the main graph-based and structural methods for fingerprint classification, a new method to visualize time series of graphs, and potential applications in computer network monitoring and abnormal event detection.
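
A small, hedged example of two graph-matching primitives of the kind surveyed in Part II (exact isomorphism testing and an inexact graph distance), using the networkx library on toy graphs; this is not code from the book:

```python
import networkx as nx

g1 = nx.cycle_graph(5)
g2 = nx.relabel_nodes(nx.cycle_graph(5), {i: chr(97 + i) for i in range(5)})
g3 = nx.path_graph(5)

print(nx.is_isomorphic(g1, g2))              # True: same structure, relabelled nodes
print(nx.is_isomorphic(g1, g3))              # False: cycle vs path
print(nx.graph_edit_distance(g1, g3))        # minimal edit cost, here 1.0 (one edge)
```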

Relevance: 80.00%

Abstract:

BACKGROUND: Blood-brain barrier (BBB) breakdown is an early event in the pathogenesis of multiple sclerosis (MS). In a previous study we found a direct stabilization of barrier characteristics after treatment of bovine brain capillary endothelial cells (BCECs) with human recombinant interferon-beta-1a (IFN-beta-1a) in an in vitro BBB model. In the present study we examined the effect of human recombinant IFN-beta-1a on the barrier properties of BCECs derived from four different species, including humans, to predict the treatment efficacy of IFN-beta-1a in MS patients. METHODS: We used primary bovine and porcine BCECs, as well as human and murine BCEC cell lines. We investigated the influence of human recombinant IFN-beta-1a on the paracellular permeability for 3H-inulin and 14C-sucrose across monolayers of bovine, human, and murine BCECs. In addition, the transendothelial electrical resistance (TEER) was determined in in vitro systems using porcine and murine BCECs. RESULTS: We found a stabilizing effect on the barrier characteristics of BCECs after pretreatment with IFN-beta-1a in all applied in vitro models: addition of IFN-beta-1a resulted in a significant decrease of the paracellular permeability across monolayers of human, bovine, and murine BCECs. Furthermore, the TEER was significantly increased after pretreatment of porcine and murine BCECs with IFN-beta-1a. CONCLUSION: Our data suggest that BBB stabilization by IFN-beta-1a may contribute to its beneficial effects in the treatment of MS. A human in vitro BBB model might be useful as a bioassay for testing the treatment efficacy of drugs in MS.
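
As a minimal illustration (made-up numbers, not data from the study), the apparent permeability coefficient typically reported for paracellular tracers such as 14C-sucrose across an endothelial monolayer is computed as P_app = (dQ/dt) / (A·C0):

```python
# Apparent permeability coefficient of a paracellular tracer across a BCEC monolayer.
dq_dt = 0.05 / 3600        # tracer flux into the receiver chamber, nmol/s
area = 1.12                # insert membrane area, cm^2
c0 = 10.0                  # initial donor concentration, nmol/mL = nmol/cm^3

p_app = dq_dt / (area * c0)    # cm/s
print(f"P_app = {p_app:.2e} cm/s")
```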