35 results for Ultra-trace analysis
Abstract:
Sixty samples of milk, Halloumi cheese and local grazing plants (i.e. shrubs) were collected over a year from dairy farms in three different regions of Cyprus. Major and trace elements were quantified using inductively coupled plasma-atomic emission spectroscopy (ICP-AES). Milk and Halloumi cheese produced in the different geographical regions presented significant differences in the concentrations of some of the elements analysed. Principal component analysis showed grouping of samples according to the region of production for both milk and cheese samples. These findings show that the assay of elements can provide useful fingerprints for the characterisation of dairy products.
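As an illustrative sketch only (not the authors' workflow), element concentrations could be screened for regional grouping with a standard PCA; the file name, element columns and "region" column below are hypothetical assumptions.

```python
# Sketch: PCA of element concentrations to look for grouping by production region.
# The CSV file, element columns and "region" column are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

df = pd.read_csv("halloumi_elements.csv")          # one row per sample (assumed)
elements = ["Ca", "Mg", "Na", "K", "Zn", "Fe"]     # assumed element columns
X = StandardScaler().fit_transform(df[elements])   # centre and scale each element

scores = PCA(n_components=2).fit_transform(X)

# Plot PC1 vs PC2 coloured by region to inspect whether samples group geographically.
for region, labels in df.groupby("region").groups.items():
    rows = df.index.get_indexer(labels)            # positional rows for this region
    plt.scatter(scores[rows, 0], scores[rows, 1], label=region)
plt.xlabel("PC1"); plt.ylabel("PC2"); plt.legend(); plt.show()
```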
Abstract:
Structural and functional information encoded in DNA, combined with the unique properties of nanomaterials, could be used to construct novel biocomputational circuits and intelligent biomedical nanodevices. At present, however, their practical applications are still limited by low reproducibility of fabrication, modest sensitivity, or complicated handling procedures. Here, we demonstrate the construction of label-free and switchable molecular logic gates (AND, INHIBIT, and OR) that use specific conformation modulation of a guanine- and thymine-rich DNA, while the optical readout is enabled by tunable metamaterials that serve as a substrate for surface-enhanced Raman spectroscopy (MetaSERS). Our MetaSERS-based DNA logic is simple to operate, highly reproducible, and can be stimulated by ultra-low concentrations of the external inputs, enabling extremely sensitive detection of mercury ions down to 2×10⁻⁴ ppb, which is four orders of magnitude lower than the exposure limit allowed by the United States Environmental Protection Agency (EPA).
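For context, the four-orders-of-magnitude figure can be checked with a one-line calculation, assuming the reference value is the EPA maximum contaminant level for inorganic mercury in drinking water of 2 ppb:

```latex
\frac{2\ \text{ppb}}{2\times 10^{-4}\ \text{ppb}} = 10^{4}
\quad\Rightarrow\quad \text{four orders of magnitude}
```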
Abstract:
Statistics are regularly used to make some form of comparison between items of trace evidence or to deploy the exclusionary principle (Morgan and Bull, 2007) in forensic investigations. Trace evidence data are routinely the results of particle size, chemical or modal analyses and as such constitute compositional data. The issue is that compositional data, including percentages, parts per million, etc., carry only relative information. This may be problematic where a comparison of percentages and other constrained/closed data is deemed a statistically valid and appropriate way to present trace evidence in a court of law. The constant sum problem has been recognised since the seminal works of Pearson (1896) and Chayes (1960), and log-ratio techniques have since been introduced to address it (Aitchison, 1986; Pawlowsky-Glahn and Egozcue, 2001; Pawlowsky-Glahn and Buccianti, 2011; Tolosana-Delgado and van den Boogaart, 2013). Nevertheless, the fact that a constant sum destroys the potential independence of variances and covariances required for correlation and regression analysis and for empirical multivariate methods (principal component analysis, cluster analysis, discriminant analysis, canonical correlation) is all too often not acknowledged in the statistical treatment of trace evidence. Yet the need for a robust treatment of forensic trace evidence analyses is obvious. This research examines the issues and potential pitfalls for forensic investigators if the constant sum constraint is ignored in the analysis and presentation of forensic trace evidence. Forensic case studies involving particle size and mineral analyses as trace evidence are used to demonstrate a compositional data approach using a centred log-ratio (clr) transformation and multivariate statistical analyses.
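As a rough sketch of the compositional data approach referred to above, the centred log-ratio transform can be written in a few lines; the example compositions below are invented for illustration and are not taken from the case studies.

```python
# Sketch of a centred log-ratio (clr) transform ahead of multivariate analysis.
# Input rows are compositions (strictly positive parts, e.g. mineral percentages).
import numpy as np

def clr(parts: np.ndarray) -> np.ndarray:
    """Centred log-ratio: log of each part minus the log of the geometric mean."""
    logs = np.log(parts)
    return logs - logs.mean(axis=1, keepdims=True)

# Two hypothetical three-part compositions summing to 100 %.
comps = np.array([[60.0, 30.0, 10.0],
                  [45.0, 40.0, 15.0]])
clr_data = clr(comps)
# Each clr row sums to ~0 and is free of the constant-sum constraint,
# so PCA, cluster or discriminant analysis can be applied to it.
print(clr_data)
```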
Abstract:
The research presented investigates the optimal set of operational codes (opcodes) that creates a robust indicator of malicious software (malware), and also determines the program execution duration required for accurate classification of benign and malicious software. The features extracted from the dataset are opcode density histograms, captured during program execution. The classifier used is a support vector machine, configured to select the features that produce the optimal classification of malware over different program run lengths. The findings demonstrate that malware can be detected using dynamic analysis with relatively few opcodes.
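A minimal sketch of this kind of classification step is shown below, assuming opcode density histograms are already available as a feature matrix; the input files and the SelectKBest feature-selection step are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch: linear SVM over opcode-density features with univariate feature selection.
# Assumes X is (n_programs, n_opcodes), each row an opcode density histogram,
# and y holds 0 (benign) / 1 (malware) labels; the data files are hypothetical.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

X = np.load("opcode_densities.npy")   # hypothetical feature file
y = np.load("labels.npy")             # hypothetical label file

clf = Pipeline([
    ("select", SelectKBest(f_classif, k=20)),  # keep the most informative opcodes
    ("svm", SVC(kernel="linear")),
])
print(cross_val_score(clf, X, y, cv=5).mean())  # mean 5-fold accuracy
```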
Abstract:
The complexity of modern geochemical data sets is increasing in several respects (number of available samples, number of elements measured, number of matrices analysed, geological-environmental variability covered, etc.), so it is becoming increasingly necessary to apply statistical methods to elucidate their structure. This paper presents an exploratory analysis of one such complex data set, the Tellus geochemical soil survey of Northern Ireland (NI). The exploratory analysis is based on one of the most fundamental exploratory tools, principal component analysis (PCA) and its graphical representation as a biplot, in several variations: the set of elements included (only major oxides vs. all observed elements), the prior transformation applied to the data (none, a standardisation or a log-ratio transformation), and the way the covariance matrix between components is estimated (classical vs. robust estimation). Results show that a log-ratio PCA (robust or classical) of all available elements is the most powerful exploratory setting, providing the following insights: the first two processes controlling the whole geochemical variation in NI soils are peat coverage and a contrast between "mafic" and "felsic" background lithologies; peat-covered areas are detected as outliers by a robust analysis and can then be filtered out if required for further modelling; and peat coverage intensity can be quantified with the %Br in the subcomposition (Br, Rb, Ni).
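A compact sketch of the classical vs. robust covariance choice mentioned above, applied to an already clr-transformed matrix, might look as follows; the data here are random stand-ins, not the Tellus survey, and the MCD estimator is one common robust choice rather than necessarily the one used in the paper.

```python
# Sketch: classical vs. robust PCA of clr-transformed geochemical data.
# clr_data below is a random stand-in for an (n_samples, n_elements) clr matrix.
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
clr_data = rng.normal(size=(200, 10))          # stand-in for real clr data

# Classical PCA: eigendecomposition of the sample covariance matrix.
cov_classic = np.cov(clr_data, rowvar=False)

# Robust PCA: eigendecomposition of an MCD covariance estimate,
# which down-weights outlying samples (e.g. peat-covered areas).
cov_robust = MinCovDet(random_state=0).fit(clr_data).covariance_

for name, cov in [("classical", cov_classic), ("robust", cov_robust)]:
    eigvals, _ = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]
    print(name, "variance share of PC1+PC2:",
          eigvals[order][:2].sum() / eigvals.sum())
```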