5 results for general chemistry laboratory
Abstract:
Major food adulteration and contamination events occur with alarming regularity and are known to be episodic, the question being not if but when another large-scale food safety/integrity incident will occur. Indeed, the challenges of maintaining food security are now internationally recognised. The ever-increasing scale and complexity of food supply networks can leave them significantly more vulnerable to fraud and contamination, and potentially dysfunctional. This makes the task of deciding which analytical methods are most suitable to collect and analyse (bio)chemical data within complex food supply chains, at targeted points of vulnerability, that much more challenging. It is evident that those working within and associated with the food industry are seeking rapid, user-friendly methods to detect food fraud and contamination, and rapid/high-throughput screening methods for the analysis of food in general. In addition to being robust and reproducible, these methods should be portable, ideally as handheld and/or remote sensor devices that can be taken to, or positioned on/at-line at, points of vulnerability along complex food supply networks, and should require a minimum of background training to acquire information-rich data rapidly (i.e. point-and-shoot). Here we briefly discuss a range of spectrometry- and spectroscopy-based approaches, many of which are commercially available, as well as other methods currently under development. We offer a future perspective on how this range of detection methods in the growing sensor portfolio, together with developments in computational and information sciences such as predictive computing and the Internet of Things, will form systems- and technology-based approaches that significantly reduce the areas of vulnerability to food crime within food supply chains. Food fraud is a problem of systems and therefore requires systems-level solutions and thinking.
Abstract:
Peptides are receiving increasing interest as clinical therapeutics. These highly tunable molecules can be tailored for biocompatibility and biodegradability while retaining selective and potent therapeutic effects. Despite challenges regarding the up-scaling and licensing of peptide products, their vast clinical potential is reflected in the 60-plus peptide-based therapeutics already on the market, and the further 500 derivatives currently in developmental stages. Peptides are proving effective for a multitude of disease states including: type 2 diabetes (controlled using the licensed glucagon-like peptide-1 receptor agonist liraglutide); irritable bowel syndrome managed with linaclotide (currently at approval stages); acromegaly (treated with the octapeptide somatostatin analogues lanreotide and octreotide); selective or broad-spectrum microbicidal agents such as the Gram-positive-selective PTP-7 and the antifungal heliomicin; and anticancer agents including goserelin, used either as an adjuvant or for prostate and breast cancer, and the first marketed peptide-derived vaccine against prostate cancer, sipuleucel-T. Research is also focusing on improving the biostability of peptides. This is achieved through a number of mechanisms, including replacement of naturally occurring L-amino acid enantiomers with D-amino acid forms, lipidation, peptidomimetics, N-methylation, cyclization and exploitation of carrier systems. The development of self-assembling peptides is paving the way for sustained-release peptide formulations, and two such licensed examples already exist, lanreotide and octreotide. The versatility and tunability of peptide-based products are resulting in increased translation of peptide therapies; however, significant challenges remain with regard to their wider implementation. This review highlights some of the notable peptide therapeutics discovered to date and the difficulties encountered by the pharmaceutical industry in translating these molecules to the clinical setting for patient benefit, providing some possible solutions to the most challenging barriers.
Abstract:
Motivated by environmental protection concerns, monitoring of the flue gas of thermal power plants is now often mandatory to ensure that emission levels stay within safe limits. Optical gas sensing systems are increasingly employed for this purpose, with regression techniques used to relate gas optical absorption spectra to the concentrations of specific gas components of interest (NOx, SO2, etc.). Accurately predicting gas concentrations from absorption spectra remains a challenging problem due to the presence of nonlinearities in the relationships and the high-dimensional and correlated nature of the spectral data. This article proposes a generalized fuzzy linguistic model (GFLM) to address this challenge. The GFLM is made up of a series of "If-Then" fuzzy rules. The absorption spectra are the input variables in the rule antecedent, and the rule consequent is a general nonlinear polynomial function of the absorption spectra. Model parameters are estimated using least squares and gradient descent optimization algorithms. The performance of the GFLM is compared with that of other traditional prediction models, such as partial least squares, support vector machines, multilayer perceptron neural networks and radial basis function networks, on two real flue gas spectral datasets: one from a coal-fired power plant and one from a gas-fired power plant. The experimental results show that the generalized fuzzy linguistic model has good predictive ability and is competitive with alternative approaches, while having the added advantage of providing an interpretable model.
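To make the model structure described in this abstract concrete, the following is a minimal Python sketch of a fuzzy model with polynomial rule consequents in the spirit of the GFLM: Gaussian membership functions form the rule antecedents over the spectral inputs, each rule consequent is a polynomial in the inputs, and the consequent coefficients are fitted by least squares. The class name, the choice of rule centres by quantiles, and the omission of the gradient-descent refinement step are illustrative assumptions, not the authors' implementation.

# Sketch of a fuzzy model with polynomial consequents (assumed structure,
# not the paper's GFLM code). Antecedents: Gaussian memberships over the
# inputs; consequents: polynomial functions fitted by least squares.
import numpy as np

class FuzzyPolynomialModel:
    def __init__(self, n_rules=3, degree=2, width=1.0):
        self.n_rules = n_rules   # number of "If-Then" rules
        self.degree = degree     # degree of the polynomial consequent
        self.width = width       # spread of the Gaussian membership functions
        self.centres = None
        self.coef = None

    def _memberships(self, X):
        # Normalised firing strength of each rule for each sample (antecedent part).
        d2 = ((X[:, None, :] - self.centres[None, :, :]) ** 2).sum(axis=2)
        mu = np.exp(-d2 / (2 * self.width ** 2))
        return mu / mu.sum(axis=1, keepdims=True)

    def _poly_features(self, X):
        # Polynomial expansion of the inputs used in the rule consequent.
        feats = [np.ones((X.shape[0], 1))]
        for p in range(1, self.degree + 1):
            feats.append(X ** p)
        return np.hstack(feats)

    def fit(self, X, y):
        # Rule centres placed at quantiles of the data (an assumption for this
        # sketch); consequent coefficients solved jointly by least squares.
        qs = np.linspace(0.1, 0.9, self.n_rules)
        self.centres = np.quantile(X, qs, axis=0)
        w = self._memberships(X)
        phi = self._poly_features(X)
        design = np.hstack([w[:, [r]] * phi for r in range(self.n_rules)])
        self.coef, *_ = np.linalg.lstsq(design, y, rcond=None)
        return self

    def predict(self, X):
        w = self._memberships(X)
        phi = self._poly_features(X)
        design = np.hstack([w[:, [r]] * phi for r in range(self.n_rules)])
        return design @ self.coef

# Toy usage on synthetic "spectra": 50 samples, 5 wavelength channels.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = 0.5 * X[:, 0] ** 2 + X[:, 1] + rng.normal(scale=0.05, size=50)
model = FuzzyPolynomialModel(n_rules=3, degree=2).fit(X, y)
print(model.predict(X[:3]))

Because each rule pairs an interpretable membership region with an explicit polynomial, a fitted model of this form can be read rule by rule, which is the interpretability advantage the abstract attributes to the GFLM.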