913 results for MS-based methods


Relevance:

100.00%

Publisher:

Abstract:

This study aimed to evaluate the effectiveness of fluorescence-based methods (DIAGNOdent, LF; DIAGNOdent pen, LFpen; and VistaProof fluorescence camera, FC) in detecting demineralization and remineralization on smooth surfaces in situ. Ten volunteers wore acrylic palatal appliances, each containing 6 enamel blocks, which were demineralized for 14 days by exposure to a 20% sucrose solution; 3 of the blocks were then remineralized for 7 days with a fluoride dentifrice. Two examiners evaluated sixty enamel blocks at baseline and after demineralization, and 30 blocks after remineralization, using LF, LFpen and FC. The blocks were then submitted to surface microhardness (SMH) and cross-sectional microhardness analysis, and the integrated loss of surface hardness (ΔKHN) was calculated. The intraclass correlation coefficient for interexaminer reproducibility ranged from 0.21 (FC) to 0.86 (LFpen). SMH, LF and LFpen values presented significant differences among the three phases; however, FC fluorescence values showed no significant difference between the demineralization and remineralization phases. Fluorescence values for baseline, demineralized and remineralized enamel were, respectively, 5.4 ± 1.0, 9.2 ± 2.2 and 7.0 ± 1.5 for LF; 10.5 ± 2.0, 15.0 ± 3.2 and 12.5 ± 2.9 for LFpen; and 1.0 ± 0.0, 1.0 ± 0.1 and 1.0 ± 0.1 for FC. SMH and ΔKHN showed significant differences between the demineralization and remineralization phases. There was a significant negative correlation between SMH and both LF and LFpen in the remineralization phase. In conclusion, the LF and LFpen devices were effective in detecting demineralization and remineralization provoked in situ on smooth surfaces.
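The interexaminer reproducibility figures quoted above are intraclass correlation coefficients. As a minimal sketch of how such a coefficient can be computed, assuming a two-way random-effects, single-measure model (ICC(2,1)) and purely illustrative paired examiner readings (not the study's data):

```python
import numpy as np

def icc_2_1(ratings):
    """Two-way random-effects, single-measure ICC(2,1).

    `ratings` has shape (n_subjects, n_raters)."""
    n, k = ratings.shape
    grand = ratings.mean()
    ssr = k * np.sum((ratings.mean(axis=1) - grand) ** 2)  # between-subjects
    ssc = n * np.sum((ratings.mean(axis=0) - grand) ** 2)  # between-raters
    sse = np.sum((ratings - grand) ** 2) - ssr - ssc       # residual
    msr, msc = ssr / (n - 1), ssc / (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Purely illustrative paired readings of six blocks by two examiners
scores = np.array([[ 5.2,  5.5],
                   [ 9.1,  9.4],
                   [ 7.0,  6.8],
                   [10.4, 10.9],
                   [15.1, 14.6],
                   [12.3, 12.8]])
icc = icc_2_1(scores)
```

The study's values (0.21 for FC, 0.86 for LFpen) would come from applying the same kind of formula to the actual paired readings.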

Although there has been a significant decrease in caries prevalence in developed countries, the slower progression of dental caries requires methods capable of detecting and quantifying lesions at an early stage. The aim of this study was to evaluate the effectiveness of fluorescence-based methods (DIAGNOdent 2095 laser fluorescence device [LF], DIAGNOdent 2190 pen [LFpen], and VistaProof fluorescence camera [FC]) in monitoring the progression of noncavitated caries-like lesions on smooth surfaces. Caries-like lesions were developed in 60 blocks of bovine enamel using a bacterial model of Streptococcus mutans and Lactobacillus acidophilus. Enamel blocks were evaluated by two independent examiners using the LF, LFpen, and FC at baseline (phase I), after the first cariogenic challenge (eight days) (phase II), and after the second cariogenic challenge (a further eight days) (phase III). Blocks were then submitted to surface microhardness (SMH) and cross-sectional microhardness analyses. The intraclass correlation coefficient for intra- and interexaminer reproducibility ranged from 0.49 (FC) to 0.94 (LF/LFpen). SMH values decreased and fluorescence values increased significantly across the three phases. Higher values for sensitivity, specificity, and area under the receiver operating characteristic curve were observed for FC (phase II) and LFpen (phase III). A significant correlation was found between fluorescence values and SMH in all phases, and with integrated loss of surface hardness (ΔKHN) in phase III. In conclusion, fluorescence-based methods were effective in monitoring noncavitated caries-like lesions on smooth surfaces, with moderate correlation with SMH, allowing differentiation between sound and demineralized enamel.
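The sensitivity, specificity, and area-under-the-ROC-curve figures mentioned above can be sketched as follows, assuming hypothetical fluorescence scores and a microhardness-derived gold standard (none of these numbers come from the study):

```python
import numpy as np

def sens_spec(scores, truth, threshold):
    """Sensitivity/specificity of the rule `score >= threshold` against binary truth."""
    pred = scores >= threshold
    sens = np.sum(pred & (truth == 1)) / np.sum(truth == 1)   # true-positive rate
    spec = np.sum(~pred & (truth == 0)) / np.sum(truth == 0)  # true-negative rate
    return sens, spec

def auc(scores, truth):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs ranked correctly, ties counting half."""
    pos, neg = scores[truth == 1], scores[truth == 0]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# Hypothetical readings: 1 = demineralized (by microhardness), 0 = sound
truth = np.array([0, 0, 0, 0, 1, 1, 1, 1])
scores = np.array([5.1, 5.9, 6.2, 7.0, 8.8, 9.5, 10.1, 12.0])
sens, spec = sens_spec(scores, truth, 8.0)
area = auc(scores, truth)
```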

We present a technique to reconstruct the electromagnetic properties of a medium or a set of objects buried inside it from boundary measurements when applying electric currents through a set of electrodes. The electromagnetic parameters may be recovered by means of a gradient method without a priori information on the background. The shape, location and size of objects, when present, are determined by a topological derivative-based iterative procedure. The combination of both strategies allows improved reconstructions of the objects and their properties, assuming a known background.
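As an illustration of the gradient strategy described, here is a toy sketch that recovers three conductivity-like parameters from synthetic boundary data under a known linear forward model; the small linear operator stands in for the full electromagnetic solver, and all values are illustrative:

```python
import numpy as np

# Toy version of the gradient reconstruction: recover parameters sigma
# from boundary measurements d generated by a known forward model F.
F = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [1., 1., 0.],
              [1., 2., 1.]])
sigma_true = np.array([1.0, 2.5, 0.7])
d = F @ sigma_true                 # synthetic boundary measurements

sigma = np.zeros(3)                # start with no a priori background information
for _ in range(300):
    grad = F.T @ (F @ sigma - d)   # gradient of the misfit 0.5 * ||F sigma - d||^2
    sigma -= 0.05 * grad           # fixed-step gradient descent
```

In the actual technique the forward model is nonlinear and the gradient is obtained from the governing equations, but the descent loop has the same structure.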

Many inflammatory diseases have an oxidative aetiology, which leads to oxidative damage to biomolecules, including proteins. It is now increasingly recognized that oxidative post-translational modifications (oxPTMs) of proteins affect cell signalling and behaviour, and can contribute to pathology. Moreover, oxidized proteins have potential as biomarkers for inflammatory diseases. Although many assays for generic protein oxidation and for breakdown products of protein oxidation are available, only advanced tandem mass spectrometry approaches have the power to localize specific oxPTMs in identified proteins. While much work has been carried out using untargeted or discovery mass spectrometry approaches, identification of oxPTMs in disease has benefited from the development of sophisticated targeted or semi-targeted scanning routines, combined with chemical labeling and enrichment approaches. Nevertheless, many potential pitfalls exist that can result in incorrect identifications. This review explains the limitations, advantages and challenges of all of these approaches to detecting oxidatively modified proteins, and provides an update on recent literature in which they have been used to detect and quantify protein oxidation in disease.

This paper advances a philosophically informed rationale for the broader, reflexive and practical application of arts-based methods to benefit research, practice and pedagogy. It addresses the complexity and diversity of learning and knowing, foregrounding a cohabitative position and recognition of a plurality of research approaches, tailored and responsive to context. Appreciation of art and aesthetic experience is situated in the everyday, underpinned by multi-layered exemplars of pragmatic visual-arts narrative inquiry undertaken in the third, creative and communications sectors. Discussion considers semi-guided use of arts-based methods as a conduit for topic engagement, reflection and intersubjective agreement, alongside observation and interpretation of organically employed approaches used by participants within daily norms. Techniques span handcrafted (drawing), digital (photography), hybrid (cartooning), performance dimensions (improvised installations) and music (metaphor and structure). The process of creation, the artefact/outcome produced and experiences of consummation are all significant, with specific reflexivity impacts. Exploring methodology and epistemology, both the "doing" and its interpretation are explicated to inform method selection, replication, utility, evaluation and development of cross-media skills literacy. Approaches are found engaging, accessible and empowering, with nuanced capabilities to alter relationships with phenomena, experiences and people. By building a discursive space that reduces barriers, emancipation, interaction, polyphony, letting-go and the progressive unfolding of thoughts are supported, benefiting ways of knowing, narrative (re)construction, sensory perception and capacities to act. This can also present underexplored researcher risks with respect to emotion work, self-disclosure, identity and agenda.
The paper therefore elucidates complex, intricate relationships between form and content, the represented and the representation or performance, researcher and participant, and the self and other. This benefits understanding of phenomena including personal experience, sensitive issues, empowerment, identity, transition and liminality. Observations are relevant to qualitative and mixed methods researchers and a multidisciplinary audience, with explicit identification of challenges, opportunities and implications.

Mass spectrometry (MS)-based proteomics has seen significant technical advances during the past two decades, and mass spectrometry has become a central tool in many biosciences. Despite the popularity of MS-based methods, handling the systematic non-biological variation in the data remains a common problem. This biasing variation can arise from several sources, ranging from sample handling to differences caused by the instrumentation. Normalization is the procedure that aims to account for this biasing variation and make samples comparable. Many normalization methods commonly used in proteomics have been adapted from the DNA-microarray world. Studies comparing normalization methods on proteomics data sets using variability measures exist. However, a more thorough comparison, looking at the quantitative and qualitative differences in the performance of the different normalization methods and at their ability to preserve the true differential expression signal of proteins, is lacking. In this thesis, several popular and widely used normalization methods (linear regression normalization, local regression normalization, variance stabilizing normalization, quantile normalization, median central tendency normalization, and variants of some of the aforementioned methods), representing different normalization strategies, are compared and evaluated with a benchmark spike-in proteomics data set. The normalization methods are evaluated in several ways: their performance is assessed qualitatively and quantitatively on a global scale and in pairwise comparisons of sample groups. In addition, it is investigated whether performing the normalization globally on the whole data, or pairwise for the comparison pairs examined, affects the performance of a normalization method in normalizing the data and preserving the true differential expression signal.
In this thesis, both major and minor differences in the performance of the different normalization methods were found. The way in which the normalization was performed (global normalization of the whole data or pairwise normalization of the comparison pair) also affected the performance of some of the methods in pairwise comparisons, and differences among variants of the same methods were observed.
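One of the strategies compared, quantile normalization, forces all samples to share a common intensity distribution. A minimal numpy sketch, with hypothetical intensities and no special handling of ties:

```python
import numpy as np

def quantile_normalize(x):
    """Quantile-normalize the columns (samples) of a proteins-by-samples matrix.

    Each entry is replaced by the mean of the sorted columns at that entry's
    within-column rank, so all columns end up with an identical distribution.
    (Minimal sketch: ties are broken arbitrarily rather than averaged.)"""
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)  # per-column rank of each entry
    reference = np.sort(x, axis=0).mean(axis=1)        # mean sorted distribution
    return reference[ranks]

# Three hypothetical samples with different overall intensity levels
data = np.array([[5.0, 4.0, 3.0],
                 [2.0, 1.0, 4.0],
                 [3.0, 4.5, 6.0],
                 [4.0, 2.0, 7.0]])
normed = quantile_normalize(data)
```

After normalization, every sample (column) has exactly the same set of sorted values, which is the defining property of the method.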

Recent years have seen massive growth in wearable technology; everything can be smart: phones, watches, glasses, shirts, etc. These technologies are prevalent in various fields, from wellness/sports/fitness to the healthcare domain. The spread of this phenomenon led the World Health Organization to define the term 'mHealth' as "medical and public health practice supported by mobile devices, such as mobile phones, patient monitoring devices, personal digital assistants, and other wireless devices". Furthermore, mHealth solutions are well suited to real-time wearable biofeedback (BF) systems: sensors in the body area network connected to a processing unit (smartphone) and a feedback device (loudspeaker) that measure human functions and return them to the user as a (bio)feedback signal. During the COVID-19 pandemic, this transformation of the healthcare system was dramatically accelerated by new clinical demands, including the need to prevent hospital surges and to assure continuity of clinical care services, allowing pervasive healthcare. Now more than ever, the integration of mHealth technologies will be the basis of this new era of clinical practice. In this scenario, this PhD thesis's primary goal is to investigate new and innovative mHealth solutions for the assessment and rehabilitation of different neuromotor functions and diseases. For clinical assessment, there is a need to overcome the limitations of subjective clinical scales. By creating new pervasive and self-administrable mHealth solutions, this thesis investigates the possibility of employing innovative systems for objective clinical evaluation. For rehabilitation, we explored the clinical feasibility and effectiveness of mHealth systems. In particular, we developed innovative mHealth solutions with BF capability to allow tailored rehabilitation.
The main goal of an mHealth system should be to improve the person's quality of life, increasing or maintaining their autonomy and independence. To this end, inclusive design principles might be crucial, next to the technical and technological ones, to improve the usability of mHealth systems.
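The biofeedback loop described above (body-worn sensor, processing unit, feedback device) can be sketched as a simple threshold-driven pipeline; the tilt samples, window length, and limit below are all hypothetical, chosen only to show the structure:

```python
# Minimal sketch of a wearable biofeedback loop: a stream of (hypothetical)
# trunk-tilt samples from a body-worn sensor is smoothed by the processing
# unit, and a feedback cue is flagged whenever the smoothed value exceeds a
# target limit -- the loop structure, not a clinical implementation.
from collections import deque

def biofeedback(samples, window=5, limit=10.0):
    """Yield (smoothed_value, cue_on) for each incoming sensor sample."""
    buf = deque(maxlen=window)
    for s in samples:
        buf.append(s)
        smoothed = sum(buf) / len(buf)    # moving-average filter
        yield smoothed, smoothed > limit  # cue when the limit is exceeded

tilt_deg = [2, 3, 4, 12, 14, 15, 13, 5, 4, 3]  # hypothetical tilt angles
events = list(biofeedback(tilt_deg))
```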

Nowadays robotic applications are widespread and most manipulation tasks are efficiently solved. However, Deformable Objects (DOs) still represent a huge limitation for robots. The main difficulty in DO manipulation is dealing with shape and dynamics uncertainties, which prevent the use of model-based approaches (since they are excessively computationally complex) and make sensory data difficult to interpret. This thesis reports the research activities aimed at addressing some applications in robotic manipulation and sensing of Deformable Linear Objects (DLOs), with particular focus on electric wires. In all of the works, a significant effort was made to devise an effective strategy for analyzing sensory signals with various machine learning algorithms. The first part of the document concerns wire terminals: detection, grasping, and insertion. First, a pipeline that integrates vision and tactile sensing is developed; then further improvements are proposed for each module. A novel procedure is proposed to gather and label massive amounts of training images for object detection with minimal human intervention. Together with this strategy, we extend a generic object detector based on Convolutional Neural Networks for orientation prediction. The insertion task is also extended by developing a closed-loop control capable of guiding the insertion of a longer and curved segment of wire through a hole, where the contact forces are estimated by means of a Recurrent Neural Network. In the second part of the thesis, the interest shifts to the DLO shape. Robotic reshaping of a DLO is addressed by means of a sequence of pick-and-place primitives, while a decision-making process driven by visual data learns the optimal grasping locations, exploiting Deep Q-learning, and finds the best releasing point. The success of the solution relies on a reliable interpretation of the DLO shape; for this reason, further developments are made in visual segmentation.
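The grasp-selection step relies on Deep Q-learning; the underlying update rule can be illustrated in tabular form on a toy chain task, a stand-in for choosing among grasping locations (states, actions, and rewards are all illustrative):

```python
import random

# Tabular sketch of the Q-learning rule behind the grasp-selection policy:
#   Q(s, a) += alpha * (r + gamma * max_a' Q(s', a') - Q(s, a))
# Toy chain: states 0..3, action 1 moves right (reward 1 on reaching the
# goal), action 0 stays put. Deep Q-learning replaces the table with a
# neural network over visual input; the update target is the same.
random.seed(0)
n_states, actions, goal = 4, [0, 1], 3
Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, eps = 0.5, 0.9, 0.2

for _ in range(500):                       # training episodes
    s = 0
    while s != goal:
        if random.random() < eps:          # epsilon-greedy exploration
            a = random.choice(actions)
        else:
            a = max(actions, key=lambda act: Q[(s, act)])
        s_next = min(s + 1, goal) if a == 1 else s
        r = 1.0 if s_next == goal else 0.0
        best_next = max(Q[(s_next, b)] for b in actions)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s_next
```

After training, moving toward the goal dominates staying put in every non-goal state, i.e. the learned Q-values encode the optimal choice.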

DIGE is a protein labelling and separation technique allowing quantitative proteomics of two or more samples by optical fluorescence detection of differentially labelled proteins that are electrophoretically separated on the same gel. DIGE is an alternative to quantitation by MS-based methodologies and can circumvent their analytical limitations in areas such as intact protein analysis, (linear) detection over a wide range of protein abundances and, theoretically, applications where extreme sensitivity is needed. Thus, in quantitative proteomics DIGE is usually complementary to MS-based quantitation and has some distinct advantages. This review describes the basics of DIGE and its unique properties and compares it to MS-based methods in quantitative protein expression analysis.

In the past decade, several major food safety crises originated from problems with feed. Consequently, there is an urgent need for early detection of fraudulent adulteration and contamination in the feed chain. Strategies are presented for two specific cases, viz. adulteration of (i) soybean meal with melamine and other types of adulterants/contaminants and (ii) vegetable oils with mineral oil, transformer oil or other oils. These strategies comprise screening at the feed mill or port of entry with non-destructive spectroscopic methods (NIRS and Raman), followed by post-screening and confirmation in the laboratory with MS-based methods. The spectroscopic techniques are suitable for on-site and on-line applications. Currently they are suited to detecting fraudulent adulteration at relatively high levels, but not to detecting low-level contamination. The potential use of these strategies for non-targeted analysis is demonstrated.

Here, we examine morphological changes in cortical thickness of patients with Alzheimer's disease (AD) using image analysis algorithms for brain structure segmentation, and study automatic classification of AD patients using cortical and volumetric data. Cortical thickness of AD patients (n = 14) was measured using MRI cortical surface-based analysis and compared with healthy subjects (n = 20). Data were analyzed using an automated algorithm for tissue segmentation and classification. A Support Vector Machine (SVM) was applied to the volumetric measurements of subcortical and cortical structures to separate AD patients from controls. The group analysis showed cortical thickness reduction in the superior temporal lobe, parahippocampal gyrus, and entorhinal cortex in both hemispheres. We also found cortical thinning in the isthmus of the cingulate gyrus and the middle temporal gyrus in the right hemisphere, as well as a reduction of the cortical mantle in areas previously shown to be associated with AD. We also confirmed that automatic classification algorithms (SVM) can help distinguish AD patients from healthy controls. Moreover, the same areas implicated in the pathogenesis of AD were the main parameters driving the classification algorithm. While the patient sample used in this study was relatively small, we expect that using a database of regional volumes derived from MRI scans of a large number of subjects will increase the power of SVM-based AD patient identification.
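The SVM classification step can be sketched as follows, assuming synthetic "regional volume" features in place of the study's MRI-derived measurements; a linear SVM is trained here by batch subgradient descent on the regularized hinge loss, as a minimal stand-in for a library SVM solver:

```python
import numpy as np

# Synthetic two-feature "regional volumes" (arbitrary units, illustrative only):
# AD-like subjects are shifted toward smaller volumes than controls.
rng = np.random.default_rng(42)
controls = rng.normal(loc=[2.0, 3.0], scale=0.15, size=(20, 2))
patients = rng.normal(loc=[1.5, 2.4], scale=0.15, size=(14, 2))
X = np.vstack([controls, patients])
X -= X.mean(axis=0)                      # center features so the bias stays small
y = np.array([-1] * 20 + [1] * 14)       # -1 = control, +1 = AD

w, b = np.zeros(2), 0.0
lam, lr = 0.01, 0.1
for _ in range(500):
    mask = y * (X @ w + b) < 1           # margin violators contribute a subgradient
    grad_w = lam * w - (y[mask, None] * X[mask]).sum(axis=0) / len(y)
    grad_b = -y[mask].sum() / len(y)
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = np.mean(np.sign(X @ w + b) == y)   # training accuracy on separable data
```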

Constrained and unconstrained Nonlinear Optimization Problems often appear in many engineering areas. In some of these cases it is not possible to use derivative-based optimization methods, because the objective function is unknown, too complex, or non-smooth; Direct Search Methods may then be the most suitable choice. An Application Programming Interface (API) including some of these methods was implemented using Java technology. This API can be accessed either by applications running on the same computer where it is installed, or remotely through a LAN or the Internet using web services. From the engineering point of view, the information needed from the API is the solution to the provided problem. From the optimization researchers' point of view, however, additional information about the iterative process is also useful, such as the number of iterations, the value of the solution at each iteration, and the stopping criteria. This paper presents the features added to the API that allow users to access the iterative-process data.
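A representative member of the Direct Search family, compass (coordinate) search, can be sketched in a few lines together with the kind of per-iteration history the API exposes; this is an illustrative Python sketch, not the API's Java implementation:

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Derivative-free compass search: poll f at +/- step along each axis,
    move to any improving point, and halve the step when no poll improves.
    Also records the per-iteration data (iteration, best value, step size)
    that an optimization researcher would want to inspect."""
    x, history = list(x0), []
    for it in range(max_iter):
        best, moved = f(x), False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x[:]
                trial[i] += delta
                val = f(trial)
                if val < best:             # accept any improving poll point
                    x, best, moved = trial, val, True
        if not moved:
            step /= 2.0                    # no improvement: refine the mesh
        history.append((it, best, step))
        if step < tol:
            break
    return x, history

# A non-smooth objective, where derivative-based methods are ruled out
f = lambda x: abs(x[0] - 1.0) + abs(x[1] + 2.0)
solution, history = compass_search(f, [0.0, 0.0])
```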

Mass spectrometry (MS) is currently the most sensitive and selective analytical technique for routine peptide and protein structure analysis. Top-down proteomics is based on tandem mass spectrometry (MS/MS) of intact proteins: multiply charged precursor ions are fragmented in the gas phase, typically by electron transfer or electron capture dissociation, to yield sequence-specific fragment ions. This approach is primarily used for the study of protein isoforms, including localization of post-translational modifications and identification of splice variants. Bottom-up proteomics is utilized for routine high-throughput protein identification and quantitation from complex biological samples. The proteins are first enzymatically digested into small (usually less than ca. 3 kDa) peptides, which are identified by MS or MS/MS, usually employing collisional activation techniques. To overcome the limitations of these approaches while combining their benefits, middle-down proteomics has recently emerged. Here, the proteins are digested into long (3-15 kDa) peptides via restricted proteolysis, followed by MS/MS analysis of the obtained digest. With advances in high-resolution MS and allied techniques, routine implementation of the middle-down approach has become possible. Herein, we present the liquid chromatography (LC)-MS/MS-based experimental design of our middle-down proteomic workflow coupled with post-LC supercharging.
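The restricted-proteolysis step of the middle-down workflow can be illustrated with a toy in-silico digestion that cleaves after a single (hypothetical) site residue and keeps fragments in the 3-15 kDa window; the sequence, enzyme choice, and the crude ~110 Da average residue mass are all illustrative approximations:

```python
def digest(sequence, site="D", avg_residue_da=110.0, lo=3000.0, hi=15000.0):
    """Toy restricted proteolysis: cleave after every `site` residue and keep
    fragments whose crude mass estimate (length * ~110 Da) falls in 3-15 kDa."""
    fragments, start = [], 0
    for i, aa in enumerate(sequence):
        if aa == site:
            fragments.append(sequence[start:i + 1])
            start = i + 1
    if start < len(sequence):
        fragments.append(sequence[start:])  # C-terminal remainder
    in_range = [f for f in fragments if lo <= len(f) * avg_residue_da <= hi]
    return fragments, in_range

# Hypothetical 102-residue "protein" with two cleavage sites; the short
# C-terminal fragment falls below the middle-down mass window.
seq = "A" * 40 + "D" + "G" * 50 + "D" + "K" * 10
all_frags, middle_down = digest(seq)
```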