924 results for LABORATORIES


Abstract:

An improved method for the detection of pressed hazelnut oil in admixtures with virgin olive oil by analysis of polar components is described. The method, which is based on SPE isolation of the polar fraction followed by RP-HPLC analysis with UV detection, can detect virgin olive oil adulterated with pressed hazelnut oil at levels as low as 5%, with good accuracy (90.0 ± 4.2% recovery of internal standard), reproducibility (4.7% RSD) and linearity (R² = 0.9982 over the 5-40% adulteration range). An international ring test of the developed method highlighted its capability: 80% of the samples were, on average, correctly identified even though no training samples were provided to the participating laboratories. However, the large variability in marker components among the pressed hazelnut oils examined prevents the use of the method for quantifying the level of adulteration.
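The figures of merit reported above (recovery of internal standard, RSD and linearity) are standard calibration statistics. A minimal sketch of how they are computed, using invented calibration data in place of the paper's marker responses:

```python
# Hypothetical figures-of-merit calculation for a calibration study:
# recovery of an internal standard, relative standard deviation (RSD)
# and linearity (R^2) over an adulteration range. All data are
# invented for illustration; the paper's marker responses are not
# reproduced here.

def mean(xs):
    return sum(xs) / len(xs)

def recovery_pct(measured, spiked):
    """Mean recovery of an internal standard, in percent."""
    return 100.0 * mean([m / s for m, s in zip(measured, spiked)])

def rsd_pct(xs):
    """Relative standard deviation (sample SD / mean), in percent."""
    m = mean(xs)
    var = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return 100.0 * var ** 0.5 / m

def r_squared(x, y):
    """Coefficient of determination for a least-squares line y ~ a + b*x."""
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Invented calibration data: marker response vs. % hazelnut oil (5-40%)
adulteration = [5, 10, 20, 30, 40]
response = [0.52, 1.01, 2.05, 2.96, 4.02]
print(round(r_squared(adulteration, response), 4))
```

A high R² close to 1 over the working range, together with recovery near 100% and a low RSD, is what qualifies such a method for quantitative use.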

Abstract:

This article reviews recent developments in the application of capillary electrophoresis (CE) to the analysis of foods and food components. CE has been applied to a number of important areas of food analysis and is fast becoming an established technique in food analytical and research laboratories. Papers published in the two years since the previous review are covered.

Abstract:

This Account provides an overview of strategies that have been reported from our laboratories for the synthesis of targets of therapeutic interest, namely carbohydrates, and prodrugs for the treatment of melanoma. These programmes have involved the development of new synthetic methodologies including the regio- and stereoselective synthesis of specific carbohydrate isomers, and new protecting group methodologies. This review provides an insight into the progress of these research themes, and suggests some applications for the targets that are currently being explored.

Abstract:

Investigation of the anatomical substructure of the medial temporal lobe has revealed a number of highly interconnected areas, which has led some to propose that the region operates as a unitary memory system. Here, however, we outline the results of a number of studies from our laboratories investigating the contributions of the rat's perirhinal cortex and postrhinal cortex to memory, concentrating particularly on their respective roles in memory for objects. By contrasting patterns of impairment and spared abilities on a number of related tasks, we suggest that perirhinal cortex and postrhinal cortex make distinctive contributions to learning and memory: for example, postrhinal cortex is important in learning about within-scene position and context. We also provide evidence that, despite the strong connectivity between these cortical regions and the hippocampus, the hippocampus, as evidenced by lesions of the fornix, has a distinct function of its own: combining information about objects, positions, and contexts.

Abstract:

Cross-contamination between cell lines is a longstanding and frequent cause of scientific misrepresentation. Estimates from national testing services indicate that up to 36% of cell lines are of a different origin or species to that claimed. To test a standard method of cell line authentication, 253 human cell lines from banks and research institutes worldwide were analyzed by short tandem repeat profiling. The short tandem repeat profile is a simple numerical code that is reproducible between laboratories, is inexpensive, and can provide an international reference standard for every cell line. If DNA profiling of cell lines is accepted and demanded internationally, scientific misrepresentation because of cross-contamination can be largely eliminated.
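The abstract describes the STR profile as a simple numerical code that can be compared between laboratories. As an illustration only (the article does not specify a matching algorithm), one common convention scores the percentage of query alleles shared with a reference profile; the loci and allele values below are invented:

```python
# Sketch of STR-profile comparison for cell line authentication.
# A profile maps each STR locus to a set of allele repeat numbers.
# The percent-match formula (shared alleles / alleles in the query,
# over loci typed in both profiles) follows a commonly used
# convention; it is an assumption on our part, not the article's
# stated method. Loci and allele values are invented.

def str_match_pct(query, reference):
    """Percent of query alleles, over loci present in both profiles,
    that also appear in the reference profile."""
    shared = 0
    total = 0
    for locus, alleles in query.items():
        if locus not in reference:
            continue
        total += len(alleles)
        shared += len(alleles & reference[locus])
    return 100.0 * shared / total if total else 0.0

reference_line = {"TH01": {7}, "D5S818": {11, 12}, "vWA": {16, 18}}
candidate = {"TH01": {7}, "D5S818": {11, 12}, "vWA": {16, 17}}
print(str_match_pct(candidate, reference_line))  # 4 of 5 alleles shared
```

A high match percentage flags the candidate as the same line (or a cross-contaminant derived from it); thresholds for declaring a match vary between testing services.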

Abstract:

Control systems theory can be a difficult discipline to learn without laboratory support. With the help of focused laboratories it becomes very interesting to the students involved. The main problem is that laboratories are not always available to students and, when they are, are often not large enough for a growing student population. With computer networks growing so fast, why not create remote control laboratories that can be used by a large number of students? Why not create remote control laboratories using the Internet? Copyright © 2001 IFAC. Keywords: Remote Control, Computer Networks, Database, Educational Aids, Laboratory Education, Communication Control Applications.
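The remote-lab idea above, students sending commands over a network to equipment (here, a simulation) hosted elsewhere, can be illustrated minimally. The plant model and JSON command set below are our own assumptions; a real remote laboratory would add a web front end, scheduling and authentication:

```python
# Minimal sketch of a remote control lab: a server-side plant (here a
# simulated discrete first-order system) driven by JSON commands of
# the kind that might arrive over a socket or HTTP. The plant model
# and command names are invented for illustration.

import json

class FirstOrderPlant:
    """Discrete first-order system x[k+1] = a*x[k] + b*u[k]."""
    def __init__(self, a=0.9, b=0.1):
        self.a, self.b, self.x = a, b, 0.0
    def step(self, u):
        self.x = self.a * self.x + self.b * u
        return self.x

def handle_request(plant, raw):
    """Dispatch one JSON command, returning a JSON reply."""
    msg = json.loads(raw)
    if msg["cmd"] == "step":
        return json.dumps({"x": plant.step(msg["u"])})
    if msg["cmd"] == "read":
        return json.dumps({"x": plant.x})
    return json.dumps({"error": "unknown command"})

plant = FirstOrderPlant()
for _ in range(50):                     # drive toward steady state
    reply = handle_request(plant, '{"cmd": "step", "u": 1.0}')
print(reply)  # x approaches the steady state b*u/(1 - a) = 1.0
```

Keeping the protocol as plain JSON commands makes the same plant usable from any networked client, which is precisely the scalability argument the abstract makes.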

Abstract:

Sensitive methods that are currently used to monitor proteolysis by plasmin in milk are limited by their high cost and lack of standardisation for quality assurance in the various dairy laboratories. In this study, four methods, trinitrobenzene sulphonic acid (TNBS), reverse phase high pressure liquid chromatography (RP-HPLC), gel electrophoresis and fluorescamine, were selected to assess their suitability for the detection of proteolysis in milk by plasmin. Commercial UHT milk was incubated with plasmin at 37 °C for one week. Clarification was achieved by isoelectric precipitation (pH 4.6 soluble extracts) or 6% (final concentration) trichloroacetic acid (TCA). The pH 4.6 and 6% TCA soluble extracts of milk showed high correlations (R² > 0.93) by the TNBS, fluorescamine and RP-HPLC methods, confirming increased proteolysis during storage. For gel electrophoresis, extensive proteolysis was confirmed by the disappearance of α- and β-casein bands on the seventh day, which was more evident at the highest plasmin concentration. This was accompanied by the appearance of α- and β-casein proteolysis products with higher intensities than on previous days, implying that more products had been formed as a result of casein breakdown. The fluorescamine method had a lower detection limit compared with the other methods, whereas gel electrophoresis was the best qualitative method for monitoring β-casein proteolysis products. Although HPLC was the most sensitive, the TNBS method is recommended for use in routine laboratory analysis on the basis of its accuracy, reliability and simplicity.

Abstract:

A dynamic, mechanistic model of enteric fermentation was used to investigate the effect of type and quality of grass forage, dry matter intake (DMI) and proportion of concentrates in dietary dry matter (DM) on variation in methane (CH4) emission from enteric fermentation in dairy cows. The model represents substrate degradation and microbial fermentation processes in the rumen and hindgut and, in particular, the effects of type of substrate fermented and of pH on the production of individual volatile fatty acids and CH4 as end-products of fermentation. Effects of type and quality of fresh and ensiled grass were evaluated by distinguishing two N fertilization rates of grassland and two stages of grass maturity. Simulation results indicated a strong impact of the amount and type of grass consumed on CH4 emission, with a maximum difference (across all forage types and all levels of DMI) of 49 and 77% in g CH4/kg fat and protein corrected milk (FCM) for diets with a proportion of concentrates in dietary DM of 0.1 and 0.4, respectively (values ranging from 10.2 to 19.5 g CH4/kg FCM). The lowest emission was established for early cut, high-fertilized grass silage (GS) and high-fertilized grass herbage (GH). The highest emission was found for late cut, low-fertilized GS. The N fertilization rate had the largest impact, followed by stage of grass maturity at harvesting and by the distinction between GH and GS. Emission expressed in g CH4/kg FCM declined on average 14% with an increase of DMI from 14 to 18 kg/day for grass forage diets with a proportion of concentrates of 0.1, and on average 29% with an increase of DMI from 14 to 23 kg/day for diets with a proportion of concentrates of 0.4. Simulation results indicated that a high proportion of concentrates in dietary DM may lead to a further reduction of CH4 emission per kg FCM, mainly as a result of a higher DMI and milk yield, in comparison to low-concentrate diets.
Simulation results were evaluated against independent data obtained at three different laboratories in indirect calorimetry trials with cows consuming mainly GH. The model predicted the average of observed values reasonably well, but systematic deviations remained between individual laboratories, and the root mean squared prediction error was a proportion of 0.12 of the observed mean. Both observed and predicted emission expressed in g CH4/kg DM intake decreased upon an increase in dietary N:organic matter (OM) ratio. The model reproduced reasonably well the variation in measured CH4 emission in cattle sheds on Dutch dairy farms and indicated that on average a fraction of 0.28 of the total emissions must have originated from manure under these circumstances.
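The evaluation statistic quoted above, root mean squared prediction error expressed as a proportion of the observed mean, is straightforward to compute. The observed and predicted CH4 values below are invented, since the calorimetry data are not reproduced here:

```python
# Sketch of the model-evaluation statistic cited in the abstract:
# root mean squared prediction error (RMSPE) divided by the mean of
# the observed values. Observed/predicted numbers are invented.

def rmspe_proportion(observed, predicted):
    """RMSPE expressed as a proportion of the observed mean."""
    n = len(observed)
    mse = sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n
    mean_obs = sum(observed) / n
    return mse ** 0.5 / mean_obs

obs = [20.1, 22.4, 18.9, 21.5]    # g CH4/kg DM intake (invented)
pred = [18.0, 24.0, 17.5, 23.8]   # model output (invented)
print(round(rmspe_proportion(obs, pred), 3))
```

A value of 0.12, as reported, means the typical prediction error is about 12% of the mean observed emission.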

Abstract:

We describe the development of a miniaturised microarray for the detection of antimicrobial resistance genes in Gram-negative bacteria. Included on the array are genes encoding resistance to aminoglycosides, trimethoprim, sulphonamides, tetracyclines and beta-lactams, including extended-spectrum beta-lactamases. Validation of the array with control strains demonstrated a 99% correlation between polymerase chain reaction and array results. There was also good correlation between phenotypic and genotypic results for a large panel of Escherichia coli and Salmonella isolates. Some differences were also seen in the number and type of resistance genes harboured by E. coli and Salmonella strains. The array provides an effective, fast and simple method for detecting resistance genes in clinical isolates that is suitable for use in diagnostic laboratories, and which in future will help in understanding the epidemiology of isolates and in detecting gene linkage in bacterial populations.
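The 99% figure quoted above is a concordance between PCR results and array calls across resistance genes. A minimal sketch of how such agreement can be computed, with invented gene names and call tables:

```python
# Sketch of a method-concordance calculation: percent of genes on
# which two detection methods (e.g. PCR vs. microarray) give the
# same present/absent call. Gene names and calls are invented.

def concordance_pct(calls_a, calls_b):
    """Percent agreement over genes typed by both methods."""
    genes = set(calls_a) & set(calls_b)
    agree = sum(calls_a[g] == calls_b[g] for g in genes)
    return 100.0 * agree / len(genes)

pcr =   {"blaTEM": True, "tetA": True, "sul1": False, "aadA": True}
array = {"blaTEM": True, "tetA": True, "sul1": False, "aadA": False}
print(concordance_pct(pcr, array))  # 3 of 4 genes agree -> 75.0
```

In the validation described above this statistic, computed over all control strains and all array targets, reached 99%.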

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capability to predict climate variability and change on both global and regional scales over the coming decades. Because of the complexity of the climate system, and because the regional manifestations of climate change appear mainly as changes in the statistics of regional weather variations, the scientific and computational requirements for reliable prediction are too enormous for any one nation to meet alone. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure.
Such facilities will make it possible to determine what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level; current limitations in computing power have placed severe constraints on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models, and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it, and will ultimately enable the climate community to provide society with climate predictions based on our best scientific knowledge and the most advanced technology.


Abstract:

A neglected critique of social science laboratories alleges that they implement phenomena different to those supposedly under investigation. The critique purports to be conceptual and so invulnerable to a technical solution. I argue that it undermines some economics designs seeking to implement features of real societies, and counsels more modesty in experimental write‐ups. It also constitutes a plausible argument that laboratory economics experiments are necessarily less demonstrative than natural scientific ones. More radical sceptical conclusions are unwarranted.

Abstract:

As weather and climate models move toward higher resolution, there is growing excitement about potential improvements in the understanding and prediction of atmospheric convection and its interaction with larger-scale phenomena. A meeting convened in January 2013 in Dartington, Devon, addressed how best to realise these improvements, specifically in a UK context but with international relevance. Specific recommendations included increased convective-scale observations, high-resolution virtual laboratories, and a system of parameterization test beds with a range of complexities. The main recommendation was to facilitate the development of physically based convective parameterizations that are scale-aware, non-local, non-equilibrium, and stochastic.

Abstract:

Neural stem cells (NSCs) are early precursors of neuronal and glial cells. NSCs are capable of generating identical progeny through virtually unlimited numbers of cell divisions (cell proliferation), producing daughter cells committed to differentiation. Nuclear factor kappa B (NF-kappaB) is an inducible, ubiquitous transcription factor also expressed in neurones, glia and neural stem cells. Recently, several pieces of evidence have been provided for a central role of NF-kappaB in the control of NSC proliferation. Here, we propose a novel mathematical model of NF-kappaB-driven proliferation of NSCs. We reconstruct the molecular pathway of activation and inactivation of NF-kappaB and its influence on cell proliferation as a system of nonlinear ordinary differential equations, and then use a combination of analytical and numerical techniques to study the model dynamics. The results obtained are illustrated by computer simulations and are, in general, in accordance with biological findings reported by several independent laboratories. The model is able both to explain and to predict experimental data. Understanding the mechanisms of NSC proliferation may provide a novel outlook for both therapeutic applications and basic research.
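The modelling approach described above, a nonlinear ODE system studied numerically, can be sketched in miniature. The two-variable system below (active NF-kappaB driving logistic growth of a proliferating population) is our own invented toy, not the published model; it only illustrates how such a system is integrated, here with a classical fourth-order Runge-Kutta scheme:

```python
# Toy nonlinear ODE system integrated with classical RK4, in the
# spirit of the modelling approach described above. The equations
# and parameters are invented for illustration; the published
# NF-kappaB model differs.

def rk4_step(f, y, t, h):
    """One classical Runge-Kutta 4th-order step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def model(t, y, s=1.0, d=0.5, r=0.8, K=100.0):
    n, c = y                      # n: active NF-kappaB, c: cell count
    dn = s - d * n                # activation minus inactivation
    dc = r * n * c * (1 - c / K)  # NF-kappaB-driven logistic growth
    return [dn, dc]

y, t, h = [0.0, 1.0], 0.0, 0.01
while t < 30.0:
    y = rk4_step(model, y, t, h)
    t += h
# n settles near s/d = 2; c saturates near the carrying capacity K
print(round(y[0], 2), round(y[1], 1))
```

In the full model, analytical techniques (steady-state and stability analysis) complement this kind of numerical integration to characterise the proliferation dynamics.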