425 results for Discriminative Itemsets


Relevance:

10.00%

Abstract:

The ability to distinguish regulated, large-scale mass-manufactured products from tobacco cultivated under uncontrolled conditions for illicit purposes plays a significant role in the identification of provenance. This research highlights X-ray fluorescence (XRF) and Fourier transform infrared (FTIR) spectroscopy as useful analytical techniques for the rapid identification of tobacco samples of unknown provenance. Identification of key discriminative features within each technique allowed typical characteristic profiles to be developed for each type of tobacco. XRF analysis highlighted chlorine, potassium, calcium and iron as key elemental indicators of tobacco provenance. The significant levels of chlorine seen in Snüs samples prompted attempts to visualise chlorine-containing regions and structures within the sample. Scanning electron microscopy images showed crystalline structures within the Snüs tobacco, which energy-dispersive X-ray spectroscopy qualitatively confirmed to contain chlorine. Chloride levels in Snüs samples were quantified using ion chromatography and found to range between 0.87 mg mL⁻¹ and 1.28 mg mL⁻¹. Additionally, FTIR spectra showed absorbances attributed to carbonyl stretching at 1050-1150 cm⁻¹, alkane bending at 1350-1480 cm⁻¹ and amide I stretching at 1600-1700 cm⁻¹, highlighting a spectral fingerprint region that allowed clear differentiation between types of tobacco using principal component analysis (PCA), although it was of limited use for differentiating the provenance of cigarettes from that of hand-rolled tobacco. XRF and FTIR spectroscopy yielded different information with regard to tobacco discrimination and provenance; however, both methods reduced overall analysis time and cost, indicating their usefulness as potential handheld analytical techniques in the field.
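As an illustration of how PCA separates spectral fingerprints of this kind, the sketch below runs PCA (via SVD) on synthetic stand-in spectra; the band positions, group sizes and noise level are invented for the example, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for FTIR absorbance spectra: two tobacco types,
# 10 samples each, 200 wavenumber points (all values hypothetical).
wavenumbers = np.linspace(1000, 1800, 200)
band_a = np.exp(-((wavenumbers - 1100) / 40) ** 2)  # band for type A
band_b = np.exp(-((wavenumbers - 1650) / 40) ** 2)  # band for type B
spectra = np.vstack(
    [band_a + 0.05 * rng.standard_normal(200) for _ in range(10)]
    + [band_b + 0.05 * rng.standard_normal(200) for _ in range(10)]
)

# PCA via SVD of the mean-centred data matrix.
centred = spectra - spectra.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ vt[:2].T  # scores on the first two PCs

# The two tobacco types fall on opposite sides of zero along PC1.
print(scores[:10, 0].mean() * scores[10:, 0].mean() < 0)  # → True
```

Because the between-type band difference dominates the noise, the first principal component aligns with the group separation, which is the behaviour the fingerprint-region analysis relies on.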

Relevance:

10.00%

Abstract:

Circulating low density lipoproteins (LDL) are thought to play a crucial role in the onset and development of atherosclerosis, though the detailed molecular mechanisms responsible for their biological effects remain controversial. The complexity of the biomolecules (lipids, glycans and protein) and structural features (isoforms and chemical modifications) found in LDL particles hampers a complete understanding of the mechanisms underlying their atherogenicity. For this reason, screening LDL for features discriminative of a particular pathology, in search of biomarkers, is of high importance. Three major biomolecule classes (lipids, protein and glycans) in LDL particles were screened using mass spectrometry coupled to liquid chromatography. Dual-polarity screening resulted in good lipidome coverage, identifying over 300 lipid species from 12 lipid sub-classes. Multivariate analysis was used to investigate potential discriminators in the individual lipid sub-classes for different study groups (age, gender, pathology). Additionally, the high protein sequence coverage of ApoB-100 routinely achieved (≥70%) assisted the search for protein modifications correlating with aging and pathology. The large size and complexity of the datasets required the use of chemometric methods (Partial Least Squares-Discriminant Analysis, PLS-DA) for their analysis and for the identification of ions that discriminate between study groups. The peptide profile from enzymatically digested ApoB-100 can be correlated with the high structural complexity of the lipids associated with ApoB-100 using exploratory data analysis. In addition, using targeted scanning modes, glycosylation sites carrying neutral and acidic sugar residues in ApoB-100 are also being explored.
Together or individually, knowledge of the profiles and modifications of the major biomolecules in LDL particles will contribute to an in-depth understanding, will help to map the structural features that contribute to the atherogenicity of LDL, and may allow the identification of reliable, pathology-specific biomarkers. This research was supported by a Marie Curie Intra-European Fellowship within the 7th European Community Framework Programme (IEF 255076). The work of A. Rudnitskaya was supported by the Portuguese Science and Technology Foundation, through the European Social Fund (ESF) and the "Programa Operacional Potencial Humano - POPH".
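The PLS-DA step can be illustrated with a minimal sketch: the first PLS latent-variable weights are, up to scaling, the covariances of the centred features with the dummy-coded class labels, so features with large absolute weights are candidate discriminators. The data below are invented stand-ins for lipid feature intensities, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical lipid feature intensities: two study groups,
# 20 samples each, 50 features; only feature 0 differs between groups.
X = rng.standard_normal((40, 50))
X[20:, 0] += 3.0                   # group shift in one "lipid"
y = np.repeat([0.0, 1.0], 20)      # dummy-coded class membership

# First PLS-DA weight vector (the initial NIPALS step): covariance of
# each centred feature with the centred class labels, normalised.
Xc = X - X.mean(axis=0)
yc = y - y.mean()
w = Xc.T @ yc
w /= np.linalg.norm(w)

# The feature with the largest |weight| is the planted discriminator.
print(int(np.abs(w).argmax()))     # → 0
```

A full PLS-DA model would extract further latent variables by deflation, but this first weight vector already captures the idea of ranking features by their discrimination between study groups.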

Relevance:

10.00%

Abstract:

Fully articulated hand tracking promises to enable fundamentally new interactions with virtual and augmented worlds, but the limited accuracy and efficiency of current systems has prevented widespread adoption. Today's dominant paradigm uses machine learning for initialization and recovery followed by iterative model-fitting optimization to achieve a detailed pose fit. We follow this paradigm, but make several changes to the model-fitting, namely using: (1) a more discriminative objective function; (2) a smooth-surface model that provides gradients for non-linear optimization; and (3) joint optimization over both the model pose and the correspondences between observed data points and the model surface. While each of these changes may actually increase the cost per fitting iteration, we find a compensating decrease in the number of iterations. Further, the wide basin of convergence means that fewer starting points are needed for successful model fitting. Our system runs in real-time on CPU only, which frees up the commonly over-burdened GPU for experience designers. The hand tracker is efficient enough to run on low-power devices such as tablets. We can track up to several meters from the camera to provide a large working volume for interaction, even using the noisy data from current-generation depth cameras. Quantitative assessments on standard datasets show that the new approach exceeds the state of the art in accuracy. Qualitative results take the form of live recordings of a range of interactive experiences enabled by this new approach.
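The idea of optimising over both the model and the correspondences can be shown on a much smaller problem than hand tracking. The toy sketch below fits a circle's centre (standing in for "pose") to noisy points while re-estimating each point's closest-point parameter on the circle (standing in for "correspondences"); it alternates the two updates rather than solving them jointly as the paper does, and every quantity in it is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

# Noisy observations of a unit circle centred at a "true pose".
true_centre = np.array([2.0, -1.0])
angles_true = rng.uniform(0.0, 2.0 * np.pi, 30)
data = true_centre + np.c_[np.cos(angles_true), np.sin(angles_true)]
data += 0.01 * rng.standard_normal(data.shape)

centre = np.zeros(2)  # initial pose estimate
for _ in range(50):
    # Correspondence update: closest point on the circle to each datum.
    d = data - centre
    angles = np.arctan2(d[:, 1], d[:, 0])
    model = centre + np.c_[np.cos(angles), np.sin(angles)]
    # Pose update: least-squares step on the point-to-model residuals.
    centre = centre + (data - model).mean(axis=0)

print(np.round(centre, 2))  # close to [ 2. -1.]
```

A smooth model surface matters here for the same reason it does in the paper: the closest-point residuals vary smoothly with the pose, so gradient-style updates converge from a wide basin of starting points.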

Relevance:

10.00%

Abstract:

PURPOSE: The validity of the SF-6D, a preference-based measure of health-related quality of life, is not well explored in the context of spinal cord injury (SCI). The aim of this analysis was to assess appropriate measurement properties of the SF-6D in a sample of individuals living with SCI. METHODS: Longitudinal data from the Rick Hansen Spinal Cord Injury Registry were used. Responses to the 36-item short-form health survey were transformed into SF-6D utility scores. We investigated practicality, floor and ceiling effects, and responsiveness to change. Responsiveness to change was explored using three different anchors that reflected changes in self-reported health, functional independence, and life satisfaction. Discriminative validity was assessed using ten a priori defined hypotheses, with a distinction made between 'strong' and 'weak' hypotheses. RESULTS: Three hundred and fifty-eight individuals with SCI were included in this analysis. Practicality was deemed acceptable based on a completion rate of 94%. The SF-6D showed low responsiveness in detecting important health changes over time, and differences in responsiveness were found between individuals with paraplegia and tetraplegia. All five strong hypotheses and three weak hypotheses were confirmed. CONCLUSION: The SF-6D demonstrated good practicality and discriminative validity in this sample. The failure to detect self-reported and clinically important health changes requires further consideration. The comparative performance of the SF-6D (i.e., how the SF-6D performs against other preference-based measures) is unknown in the SCI context and requires further research.
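A minimal sketch of the measurement properties named above (completion rate, floor/ceiling effects, and responsiveness as a standardised response mean); the utility scores are invented, and the scale bounds of 0.29 and 1.00 are assumed only for illustration.

```python
import numpy as np

# Invented SF-6D-style utility scores; np.nan marks a non-completer.
baseline  = np.array([0.30, 0.41, 0.55, 0.62, 0.71, 0.29, np.nan, 0.84])
follow_up = np.array([0.35, 0.40, 0.60, 0.66, 0.75, 0.30, np.nan, 0.85])

completed = ~np.isnan(baseline)
completion_rate = completed.mean()              # practicality

# Floor/ceiling effects: share of scores at the assumed scale bounds.
floor = (baseline[completed] <= 0.29).mean()
ceiling = (baseline[completed] >= 1.00).mean()

# Responsiveness: standardised response mean = mean change / SD change.
change = follow_up[completed] - baseline[completed]
srm = change.mean() / change.std(ddof=1)

print(round(completion_rate, 2), round(floor, 2), round(ceiling, 2))
# → 0.88 0.14 0.0
```

An anchor-based responsiveness analysis, as used in the study, would compute statistics like this SRM within subgroups defined by each external anchor (self-reported health, functional independence, life satisfaction).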

Relevance:

10.00%

Abstract:

Cardiovascular disease is one of the leading causes of death around the world. Resting heart rate has been shown to be a strong and independent risk marker for adverse cardiovascular events and mortality, and yet its role as a predictor of risk is somewhat overlooked in clinical practice. With the aim of highlighting its prognostic value, the role of resting heart rate as a risk marker for death and other adverse outcomes was further examined in a number of different patient populations. A systematic review of studies that previously assessed the prognostic value of resting heart rate for mortality and other adverse cardiovascular outcomes was presented. New analyses of nine clinical trials were carried out. Both the original Cox model and the extended Cox model that allows for the analysis of time-dependent covariates were used to evaluate and compare the predictive value of baseline and time-updated heart rate measurements for adverse outcomes in the CAPRICORN, EUROPA, PROSPER, PERFORM, BEAUTIFUL and SHIFT populations. Pooled individual patient meta-analyses of the CAPRICORN, EPHESUS, OPTIMAAL and VALIANT trials, and of the BEAUTIFUL and SHIFT trials, were also performed. The discrimination and calibration of the models applied were evaluated using Harrell’s C-statistic and likelihood ratio tests, respectively. Finally, following on from the systematic review, meta-analyses of the relation between baseline and time-updated heart rate and the risk of death from any cause and from cardiovascular causes were conducted. Both elevated baseline and elevated time-updated resting heart rates were found to be associated with an increased risk of mortality and other adverse cardiovascular events in all of the populations analysed. In some cases, elevated time-updated heart rate was associated with the risk of events where baseline heart rate was not.
Time-updated heart rate also contributed additional information about the risk of certain events beyond knowledge of baseline heart rate or previous heart rate measurements. Adding resting heart rate to the models in which it was found to be associated with the risk of an outcome improved both discrimination and calibration, and in general the models including time-updated heart rate along with the baseline or previous heart rate measurement had the highest (and mutually similar) C-statistics, and thus the greatest discriminative ability. The meta-analyses demonstrated that a 5 bpm higher baseline heart rate was associated with a 7.9% and an 8.0% increase in the risk of all-cause and cardiovascular death, respectively (both p < 0.001). Additionally, a 5 bpm higher time-updated heart rate (adjusted for baseline heart rate in eight of the ten studies included in the analyses) was associated with a 12.8% (p < 0.001) and a 10.9% (p < 0.001) increase in the risk of all-cause and cardiovascular death, respectively. These findings may motivate health care professionals to routinely assess resting heart rate in order to identify individuals at higher risk of adverse events. The fact that the addition of time-updated resting heart rate improved the discrimination and calibration of models for certain outcomes, even if only modestly, strengthens the case for adding it to traditional risk models. The findings are of particular importance, and have the greatest implications, for the clinical management of patients with pre-existing disease. An elevated or increasing heart rate over time could be used as a tool, potentially alongside other established risk scores, to help doctors identify patient deterioration or patients at higher risk who might benefit from more intensive monitoring or treatment re-evaluation. Further exploration of the role of continuous recording of resting heart rate, for example while patients are at home, would be informative.
In addition, investigation into the cost-effectiveness and optimal frequency of resting heart rate measurement is required. One of the most vital areas for future research is establishing an objective cut-off value that defines a high resting heart rate.
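The per-5 bpm risk increases quoted above map directly onto Cox model coefficients: if β is the log hazard ratio per 1 bpm of resting heart rate, the hazard ratio for a 5 bpm higher heart rate is exp(5β). The sketch below simply inverts the figures reported in the text; no study data are used.

```python
import math

# Log hazard ratio per 1 bpm implied by the 7.9% increase per 5 bpm.
beta_per_bpm = math.log(1.079) / 5

hr_5bpm = math.exp(5 * beta_per_bpm)
percent_increase = (hr_5bpm - 1) * 100
print(round(percent_increase, 1))        # → 7.9 (recovers the figure)

# The 12.8% per-5-bpm increase for time-updated heart rate implies
# this hazard ratio per single bpm:
beta_updated = math.log(1.128) / 5
print(round(math.exp(beta_updated), 4))  # → 1.0244
```

The same conversion explains why per-5 bpm reporting is convenient in meta-analysis: coefficients estimated per 1 bpm in individual trials can be rescaled by multiplying the log hazard ratio before pooling.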