940 results for quantification


Relevance:

10.00%

Publisher:

Abstract:

The growing interest in higher-throughput sequencing over the last decade has led to the development of new sequencing applications. This thesis concentrates on optimizing DNA library preparation for the Illumina Genome Analyzer II sequencer. The library preparation steps that were optimized include fragmentation, PCR purification and quantification. DNA fragmentation was performed with focused sonication at different concentrations and durations. Two column-based PCR purification methods, a gel matrix method and a magnetic bead-based method were compared. Quantitative PCR and chip-based gel electrophoresis were compared for DNA quantification. Magnetic bead purification was found to be the most efficient and flexible purification method. The fragmentation protocol was changed to produce longer fragments compatible with longer sequencing reads. Quantitative PCR correlates better with the cluster number and should thus be considered the default quantification method for sequencing. As a result of this study, more data have been acquired from sequencing at lower cost, and troubleshooting has become easier as quality control steps have been added to the protocol. New sequencing instruments and applications will create a demand for further optimization in the future.
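
The quantification step above hinges on converting what each instrument reports into the molar concentration that actually determines cluster density. As a minimal, hedged sketch (not the thesis protocol), the snippet below converts a mass concentration plus an average fragment length into molarity, assuming the standard approximation of 660 g/mol per double-stranded base pair; qPCR, by contrast, reports an amplifiable molar concentration directly.

```python
def library_molarity_nM(conc_ng_per_ul: float, avg_fragment_bp: float) -> float:
    """Convert a dsDNA library concentration (ng/ul) to molarity (nM).

    Assumes ~660 g/mol per base pair. Chip electrophoresis yields the mass
    concentration and size estimate used here, whereas qPCR measures the
    amplifiable molar concentration directly.
    """
    grams_per_mole = avg_fragment_bp * 660.0           # molar mass of the average fragment
    return conc_ng_per_ul * 1.0e6 / grams_per_mole     # ng/ul -> nmol/l


# Example: a 10 ng/ul library with a 400 bp average fragment length
print(f"{library_molarity_nM(10.0, 400):.1f} nM")      # ~37.9 nM
```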

Relevance:

10.00%

Publisher:

Abstract:

The Earth's climate is a highly dynamic and complex system in which atmospheric aerosols have been increasingly recognized to play a key role. Aerosol particles affect the climate through a multitude of processes, directly by absorbing and reflecting radiation and indirectly by changing the properties of clouds. Because of this complexity, quantification of the effects of aerosols remains highly uncertain. Better understanding of the effects of aerosols requires more information on aerosol chemistry. Before the determination of aerosol chemical composition by the various available analytical techniques, aerosol particles must be reliably sampled and prepared. Indeed, sampling is one of the most challenging steps in aerosol studies, since all available sampling techniques have drawbacks. In this study, novel methodologies were developed for sampling and determination of the chemical composition of atmospheric aerosols. In the particle-into-liquid sampler (PILS), aerosol particles grow in saturated water vapor and are then impacted and dissolved in liquid water. Once in water, the aerosol sample can be transported and analyzed by various off-line or on-line techniques. In this study, the PILS was modified and the sampling procedure was optimized to obtain less altered aerosol samples with good time resolution. A combination of denuders with different coatings was tested to adsorb gas-phase compounds before the PILS. Mixtures of water with alcohols were introduced to increase the solubility of aerosols. The minimum sampling time required was determined by collecting samples off-line every hour and proceeding with liquid-liquid extraction (LLE) and analysis by gas chromatography-mass spectrometry (GC-MS). The laboriousness of LLE followed by GC-MS analysis prompted an evaluation of solid-phase extraction (SPE) for the extraction of aldehydes and acids in aerosol samples. These two compound groups are thought to be key for aerosol growth. Octadecylsilica, hydrophilic-lipophilic balance (HLB), and mixed-phase anion exchange (MAX) sorbents were tested as extraction materials. MAX proved to be efficient for acids, but no tested material offered sufficient adsorption for aldehydes. Thus, PILS samples were extracted only with MAX to guarantee good results for the organic acids determined by high-performance liquid chromatography-mass spectrometry (HPLC-MS). On-line coupling of SPE with HPLC-MS is relatively easy, and here on-line coupling of PILS with HPLC-MS through the SPE trap produced some interesting data on relevant acids in atmospheric aerosol samples. A completely different approach to aerosol sampling, namely differential mobility analyzer (DMA)-assisted filter sampling, was employed in this study to provide information about the size-dependent chemical composition of aerosols and understanding of the processes driving aerosol growth from nano-sized clusters to climatically relevant particles (>40 nm). The DMA was set to sample particles with diameters of 50, 40, and 30 nm, and aerosols were collected on Teflon or quartz fiber filters. To clarify the gas-phase contribution, zero gas-phase samples were collected by switching off the DMA for alternating 15-minute periods. Gas-phase compounds were adsorbed equally well on both types of filter and were found to contribute significantly to the total compound mass. Gas-phase adsorption is especially significant during the collection of nanometer-sized aerosols and always needs to be taken into account.
Other aims of this study were to determine the oxidation products of β-caryophyllene (the major sesquiterpene in the boreal forest) in aerosol particles. Since reference compounds are needed to verify the accuracy of analytical measurements, three oxidation products of β-caryophyllene were synthesized: β-caryophyllene aldehyde, β-nocaryophyllene aldehyde, and β-caryophyllinic acid. All three were identified for the first time in ambient aerosol samples, at relatively high concentrations, and their contribution to the aerosol mass (and probably growth) was concluded to be significant. The methodological and instrumental developments presented in this work enable a fuller understanding of the processes behind biogenic aerosol formation and provide new tools for more precise determination of biosphere-atmosphere interactions.

Relevance:

10.00%

Publisher:

Abstract:

The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst, as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of the hazards associated with chemical industries. Fault tree analysis (FTA) is an established technique for hazard identification. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of release of chlorine from the storage and filling facility of a chlor-alkali plant using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and a proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been carried out to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation. (C) 2010 Elsevier B.V. All rights reserved.
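
As a hedged illustration of the quantitative side of FTA (the tree, event names and probabilities below are invented for illustration, not taken from the paper), the top-event probability can be computed by combining basic-event probabilities through AND and OR gates, assuming independent events:

```python
from functools import reduce

def and_gate(probs):
    """All inputs must occur: product of probabilities (independence assumed)."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def or_gate(probs):
    """Any input occurring is enough: 1 minus the product of complements."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Hypothetical basic events for a release scenario (illustrative values only)
valve_leak     = 1e-3
gasket_failure = 5e-4
operator_error = 2e-3
alarm_failure  = 1e-2

# Release occurs if (valve leaks OR gasket fails OR operator errs) AND the alarm fails
initiating = or_gate([valve_leak, gasket_failure, operator_error])
top_event  = and_gate([initiating, alarm_failure])
print(f"P(top event) = {top_event:.2e}")
```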

Relevance:

10.00%

Publisher:

Abstract:

Human sport doping control analysis is a complex and challenging task for anti-doping laboratories. The List of Prohibited Substances and Methods, updated annually by the World Anti-Doping Agency (WADA), consists of hundreds of chemically and pharmacologically different low and high molecular weight compounds. This poses a considerable challenge for laboratories, which must analyze them all in a limited amount of time from a limited sample aliquot. The continuous expansion of the Prohibited List obliges laboratories to keep their analytical methods updated and to evaluate newly available methodologies. In this thesis, an accurate-mass-based analysis employing liquid chromatography-time-of-flight mass spectrometry (LC-TOFMS) was developed and validated to improve the power of doping control analysis. New analytical methods were developed utilizing the high mass accuracy and high information content obtained by TOFMS to generate comprehensive and generic screening procedures. The suitability of LC-TOFMS for comprehensive screening was demonstrated for the first time in the field, with mass accuracies better than 1 mDa. Further attention was given to generic sample preparation, an essential part of screening analysis, to rationalize the whole workflow and minimize the need for several separate sample preparation methods. Utilizing both positive and negative ionization allowed the detection of almost 200 prohibited substances. Automatic data processing produced a Microsoft Excel based report highlighting the entries fulfilling the criteria of the reverse database search (retention time (RT), mass accuracy, isotope match). The quantitative performance of LC-TOFMS was demonstrated with morphine, codeine and their intact glucuronide conjugates. After a straightforward sample preparation, the compounds were analyzed directly without the need for hydrolysis, solvent transfer, evaporation or reconstitution. Hydrophilic interaction chromatography (HILIC) provided good chromatographic separation, which was critical for the morphine glucuronide isomers. A wide linear range (50-5000 ng/ml) with good precision (RSD < 10%) and accuracy (±10%) was obtained, showing performance comparable to or better than that of other methods used. In-source collision-induced dissociation (ISCID) allowed confirmation analysis with three diagnostic ions, with a median mass accuracy of 1.08 mDa and repeatable ion ratios fulfilling WADA's identification criteria. The suitability of LC-TOFMS for screening of high molecular weight doping agents was demonstrated with the plasma volume expanders (PVE) dextran and hydroxyethyl starch (HES). The specificity of the assay was improved by removing interfering matrix compounds with size exclusion chromatography (SEC). ISCID produced three characteristic ions with an excellent mean mass accuracy of 0.82 mDa at physiological concentration levels. In summary, by combining TOFMS with proper sample preparation and chromatographic separation, the technique can be utilized extensively in doping control laboratories for comprehensive screening of chemically different low and high molecular weight compounds, for quantification of threshold substances and even for confirmation. LC-TOFMS rationalized the workflow in doping control laboratories by simplifying the screening scheme, expediting reporting and minimizing analysis costs. Therefore, LC-TOFMS can be exploited widely in doping control, reducing the need for several separate analysis techniques.
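
The reverse database search mentioned above boils down to matching each measured feature against the library of prohibited substances within retention-time and mass-accuracy windows. A minimal sketch of that matching step follows; the tolerances, retention times and the two-compound library are illustrative stand-ins, not the validated method's values:

```python
# Library entries: (name, expected retention time in min, expected [M+H]+ in Da)
LIBRARY = [
    ("morphine", 2.9, 286.1438),   # illustrative RT; mass from the molecular formula
    ("codeine",  4.1, 300.1594),
]

def reverse_search(features, rt_tol_min=0.2, mass_tol_mda=2.0):
    """Return library entries matched by any measured (RT, m/z) feature.

    A hit requires |dRT| <= rt_tol_min and |dm| <= mass_tol_mda (in mDa),
    mirroring the RT + mass-accuracy criteria of a reverse database search.
    """
    hits = []
    for name, rt_lib, mz_lib in LIBRARY:
        for rt_obs, mz_obs in features:
            mass_error_mda = abs(mz_obs - mz_lib) * 1000.0
            if abs(rt_obs - rt_lib) <= rt_tol_min and mass_error_mda <= mass_tol_mda:
                hits.append((name, rt_obs, round(mass_error_mda, 2)))
                break
    return hits

measured = [(2.95, 286.1445), (7.30, 412.2101)]
print(reverse_search(measured))   # [('morphine', 2.95, 0.7)]
```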

Relevance:

10.00%

Publisher:

Abstract:

Mycotoxins are secondary metabolites of filamentous fungi. They pose a health risk to humans and animals due to their harmful biological properties and common occurrence in food and feed. Liquid chromatography/mass spectrometry (LC/MS) has gained popularity in the trace analysis of food contaminants. In this study, the applicability of the technique was evaluated in multi-residue methods for mycotoxins, aiming at the simultaneous detection of chemically diverse compounds. Methods were developed for the rapid determination of toxins produced by the fungal genera Aspergillus, Fusarium, Penicillium and Claviceps from cheese, cereal-based agar matrices and grains. Analytes were extracted from these matrices with organic solvents. Minimal sample clean-up was carried out before analysis of the mycotoxins with reversed-phase LC coupled to tandem MS (MS/MS). The methods were validated and applied to investigating mycotoxins in cheese and ergot alkaloid occurrence in Finnish grains. Additionally, the toxin production of two Fusarium species predominant in northern Europe was studied. Nine mycotoxins could be determined from cheese with the method developed. The limits of quantification (LOQs) allowed quantification at concentrations from 0.6 to 5.0 µg/kg. The recoveries ranged between 96 and 143%, and the within-day repeatability (as relative standard deviation, RSDr) between 2.3 and 12.1%. Roquefortine C and mycophenolic acid could be detected at levels of 300 up to 12000 µg/kg in the mould cheese samples analysed. A total of 29 or 31 toxins, depending on the matrix, could be analysed with the method developed for agar matrices and grains, with the LOQs ranging overall from 0.1 to 1250 µg/kg. The recoveries generally ranged between 44 and 139%, and the RSDr between 2.0 and 38%. Type A trichothecenes and beauvericin were determined from the cereal-based agar and grain cultures of F. sporotrichioides and F. langsethiae. T-2 toxin was the main metabolite, with average levels reaching 22000 µg/kg in the grain cultures after 28 days of incubation. The method developed for ten ergot alkaloids from grains allowed their quantification at levels from 0.01 to 10 µg/kg. The recoveries ranged from 51 to 139%, and the RSDr from 0.6 to 13.9%. Ergot alkaloids were measured in barley and rye at average levels of 59 and 720 µg/kg, respectively. The two most prevalent alkaloids were ergocornine and ergocristine. The LC/MS methods developed enabled rapid detection of mycotoxins in applications where several toxins co-occurred. Generally, the performance of the methods was good, allowing reliable analysis of the mycotoxins of interest with sufficiently low quantification limits. However, the variation in the validation results highlighted the challenges of optimising this type of multi-residue method. New data were obtained on the occurrence of mycotoxins in mould cheeses and of ergot alkaloids in Finnish grains. In addition, the study revealed the high mycotoxin-producing potential of two fungi common in Finnish crops. This information can be useful when risks related to fungal and mycotoxin contamination are assessed.
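
The recovery and RSDr figures quoted throughout the abstract come from replicate analyses of spiked samples. A minimal sketch of those two calculations is shown below; the replicate values and spike level are made up for illustration:

```python
from statistics import mean, stdev

def recovery_percent(measured, spiked_level):
    """Mean measured concentration as a percentage of the spiked concentration."""
    return 100.0 * mean(measured) / spiked_level

def rsd_percent(measured):
    """Relative standard deviation; RSDr when the replicates come from one day."""
    return 100.0 * stdev(measured) / mean(measured)

# Hypothetical within-day replicates of a sample spiked at 5.0 ug/kg
replicates = [4.8, 5.3, 5.1, 4.9, 5.2]
print(f"recovery = {recovery_percent(replicates, 5.0):.0f} %")   # ~101 %
print(f"RSDr     = {rsd_percent(replicates):.1f} %")             # ~4.1 %
```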

Relevance:

10.00%

Publisher:

Abstract:

In cancer diagnostics and treatment, nanoparticles can act as carriers for drugs, diagnostic agents or nucleic acid sequences. Targeting molecules can be attached to the carrier for passive or active targeting of the particles, or a radiolabel can be attached for imaging or radiotherapy. Carriers can be used to improve the physicochemical properties and bioavailability of a drug, reduce systemic side effects, prolong the drug's half-life and thereby allow less frequent dosing, and improve drug delivery to the target tissue. In this way, the efficacy of chemo- and radiotherapy and the probability of treatment success can be improved. The literature review examines the role of nanocarriers in cancer treatment. Despite decades of research, only two (Europe) or three (United States) nanoparticle formulations have been approved for the market in cancer treatment. The main problems are insufficient accumulation in the target tissue, immunogenicity and the lability of the nanoparticles. The experimental part investigates, in vitro and in mice in vivo, the two-step targeting of 99mTc-labelled, PEG-coated biotin liposomes to human ovarian adenocarcinoma cells. Targeting is performed with a biotinylated cetuximab (Erbitux®) antibody, which binds to the EGF receptors overexpressed by the cells. Two-step targeting is compared with direct and/or passive targeting. The development of more powerful imaging methods has accelerated research on targeted nanoparticles. Isotope imaging makes it possible to follow the distribution of the radiolabel in the body and to image phenomena occurring at the cellular level. The literature review also covers SPECT and PET imaging in cancer treatment and their use in drug development for imaging nanoparticles. These imaging methods stand out from other techniques in terms of resolution, sensitivity and ease of use. In the experimental part, the distribution of the 99mTc-labelled liposomes in mice was studied with a SPECT-CT instrument. The activity in the tumour, spleen and liver was quantified using the InVivoScope software and a gamma counter, and the results were compared with each other. In the in vitro experiment, two-step targeting achieved 2.7- to 3.5-fold (depending on the cell line) uptake into the cells compared with control liposomes. However, direct targeting performed better than two-step targeting in vitro. In the in vivo experiments, the liposomes distributed to the tumour more efficiently when administered i.p. than i.v. Two-step targeting achieved 1.24-fold distribution to the tumour (% ID/g tissue) compared with passively targeted liposomes. The %ID/organ was 5.9% for the targeted liposomes and 5.4% for the passively targeted liposomes, so the actual difference was small. The InVivoScope and gamma counter results did not correlate with each other. Further studies and optimisation of the method are required for targeting liposomes to tumours.
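
The biodistribution figures above (%ID/g tissue and %ID/organ) follow from normalising gamma-counter readings to the injected dose. A hedged sketch of that normalisation is given below; the counts and masses are invented, and decay correction and counter calibration are omitted for brevity:

```python
def percent_id(tissue_counts, injected_dose_counts):
    """Activity in a tissue as a percentage of the injected dose (%ID)."""
    return 100.0 * tissue_counts / injected_dose_counts

def percent_id_per_gram(tissue_counts, injected_dose_counts, tissue_mass_g):
    """%ID normalised to tissue mass (%ID/g); counts assumed decay-corrected."""
    return percent_id(tissue_counts, injected_dose_counts) / tissue_mass_g

# Hypothetical gamma-counter readings (counts per minute)
injected_cpm = 2.0e6
tumour_cpm, tumour_mass_g = 9.0e4, 0.30

print(f"%ID/organ = {percent_id(tumour_cpm, injected_cpm):.1f} %")
print(f"%ID/g     = {percent_id_per_gram(tumour_cpm, injected_cpm, tumour_mass_g):.1f} %/g")
```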

Relevance:

10.00%

Publisher:

Abstract:

Representation and quantification of uncertainty in climate change impact studies are difficult tasks. Several sources of uncertainty arise in studies of the hydrologic impacts of climate change, such as those due to the choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster-Shafer (D-S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D-S framework over the traditional probabilistic approach is that it allows the allocation of a probability mass to sets or intervals, and can hence handle both aleatory (stochastic) uncertainty and epistemic (subjective) uncertainty. This paper shows how D-S theory can be used to represent beliefs in hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measure of belief and plausibility in the results. The D-S approach has been used in this work for information synthesis using various evidence combination rules with different conflict modeling approaches. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of the n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, and are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster-Shafer structure, which represents the uncertainty associated with each of the SSFI-4 classifications. These uncertainties are then combined across GCMs and scenarios using various evidence combination rules given by D-S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in the projected frequencies of the SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D-S and Bayesian approaches are compared, and the relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and a decreasing probability of normal and wet conditions in Orissa as a result of climate change. (C) 2010 Elsevier Ltd. All rights reserved.
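
As a hedged illustration of the D-S machinery described above (the bpa values below are invented, not the paper's), the snippet combines two basic probability assignments over drought and wet classes with Dempster's rule and reads off belief and plausibility:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts: frozenset -> mass)."""
    conflict = 0.0
    combined = {}
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb            # mass assigned to disjoint hypotheses
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

def belief(m, hypothesis):
    """Total mass of subsets fully contained in the hypothesis."""
    return sum(v for s, v in m.items() if s <= hypothesis)

def plausibility(m, hypothesis):
    """Total mass of subsets that intersect the hypothesis."""
    return sum(v for s, v in m.items() if s & hypothesis)

# Illustrative bpa's over two classes: drought (D) and wet (W)
D, W = frozenset({"D"}), frozenset({"W"})
theta = D | W                                    # frame of discernment
m_gcm1 = {D: 0.6, theta: 0.4}                    # evidence from one GCM/scenario
m_gcm2 = {D: 0.5, W: 0.2, theta: 0.3}            # evidence from another

m = dempster_combine(m_gcm1, m_gcm2)
print(f"Bel(drought) = {belief(m, D):.2f}, Pl(drought) = {plausibility(m, D):.2f}")
```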

Relevance:

10.00%

Publisher:

Abstract:

The study of soil microbiota and their activities is central to the understanding of many ecosystem processes such as decomposition and nutrient cycling. The collection of microbiological data from soils generally involves several sequential steps of sampling, pretreatment and laboratory measurements, and the reliability of the results depends on reliable methods at every step. The aim of this thesis was to critically evaluate some central methods and procedures used in soil microbiological studies in order to increase our understanding of the factors that affect measurement results and to provide guidance and new approaches for the design of experiments. The thesis focuses on four major themes: 1) soil microbiological heterogeneity and sampling, 2) storage of soil samples, 3) DNA extraction from soil, and 4) quantification of specific microbial groups by the most-probable-number (MPN) procedure. Soil heterogeneity and sampling are discussed as a single theme because knowledge of spatial (horizontal and vertical) and temporal variation is crucial when designing sampling procedures. Comparison of adjacent forest, meadow and cropped field plots showed that land use has a strong impact on the degree of horizontal variation of soil enzyme activities and bacterial community structure. However, regardless of the land use, the variation of microbiological characteristics appeared to have no predictable spatial structure at 0.5-10 m. Temporal and soil depth-related patterns were studied in relation to plant growth in cropped soil. The results showed that most enzyme activities and microbial biomass have a clear decreasing trend in the top 40 cm of the soil profile and a temporal pattern during the growing season. A new procedure for sampling of soil microbiological characteristics, based on stratified sampling and pre-characterisation of samples, was developed. A practical example demonstrated the potential of the new procedure to reduce the analysis effort involved in laborious microbiological measurements without loss of precision. The investigation of storage of soil samples revealed that freezing (-20 °C) of small sample aliquots retains the activity of hydrolytic enzymes and the structure of the bacterial community in different soil matrices relatively well, whereas air-drying cannot be recommended as a storage method for soil microbiological properties due to large reductions in activity. Freezing below -70 °C was the preferred method of storage for samples with high organic matter content. Comparison of different direct DNA extraction methods showed that the cell lysis treatment has a strong impact on the molecular size of the DNA obtained and on the bacterial community structure detected. An improved MPN method for the enumeration of soil naphthalene degraders was introduced as an alternative to more complex MPN protocols or DNA-based quantification approaches. The main advantages of the new method are its simple protocol and the possibility of analysing a large number of samples and replicates simultaneously.
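
The MPN estimate referred to in theme 4) is the maximum-likelihood solution for organism density given the pattern of positive tubes across a dilution series. A minimal sketch follows (pure-Python bisection; the tube counts are illustrative and the edge cases of all-positive or all-negative series are ignored):

```python
import math

def mpn_per_ml(volumes_ml, tubes, positives, lo=1e-6, hi=1e6, iterations=200):
    """Maximum-likelihood most probable number (organisms per ml).

    volumes_ml[i] is the inoculum volume per tube at dilution i, tubes[i] the
    number of tubes and positives[i] the number of positive tubes. Solves
        sum_i p_i*v_i / (1 - exp(-lam*v_i)) = sum_i n_i*v_i
    for lam by bisection on a logarithmic scale.
    """
    total = sum(n * v for n, v in zip(tubes, volumes_ml))

    def score(lam):
        return sum(p * v / (1.0 - math.exp(-lam * v))
                   for p, v in zip(positives, volumes_ml)) - total

    for _ in range(iterations):            # score() decreases monotonically with lam
        mid = math.sqrt(lo * hi)
        if score(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Classic 5-tube series with 10, 1 and 0.1 ml inocula and 5, 3, 1 positive tubes
print(f"MPN = {mpn_per_ml([10.0, 1.0, 0.1], [5, 5, 5], [5, 3, 1]):.2f} per ml")  # ~1.1
```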

Relevance:

10.00%

Publisher:

Abstract:

People with coeliac disease have to maintain a gluten-free diet, which means excluding wheat, barley and rye prolamin proteins from their diet. Immunochemical methods are used to analyse these harmful proteins and to control the purity of gluten-free foods. In this thesis, the behaviour of prolamins in immunological gluten assays and with different prolamin-specific antibodies was examined. The immunoassays were also used to detect residual rye prolamins in sourdough systems after enzymatic hydrolysis and wheat prolamins after deamidation. The aim was to characterize the ability of the gluten analysis assays to quantify different prolamins in varying matrices in order to improve the accuracy of the assays. The prolamin groups of cereals consist of a complex mixture of proteins that vary in size and amino acid sequence. Two common characteristics distinguish prolamins from other cereal proteins: firstly, they are soluble in aqueous alcohols, and secondly, most prolamins are formed mainly from repetitive amino acid sequences containing large amounts of proline and glutamine. The diversity among prolamin proteins sets high requirements for their quantification. In the present study, prolamin contents were evaluated using enzyme-linked immunosorbent assays based on ω- and R5 antibodies. In addition, assays based on A1 and G12 antibodies were used to examine the effect of deamidation on prolamin proteins. The prolamin compositions and the cross-reactivity of the antibodies with the prolamin groups were evaluated with electrophoretic separation and Western blotting. The results of this thesis research demonstrate that the currently used gluten analysis methods are not able to accurately quantify barley prolamins, especially when hydrolysed or mixed with oats. However, more precise results can be obtained when the standard more closely matches the sample proteins, as demonstrated with barley prolamin standards. The study also revealed that all of the harmful prolamins, i.e. wheat, barley and rye prolamins, are most efficiently extracted with 40% 1-propanol containing 1% dithiothreitol at 50 °C. The extractability of barley and rye prolamins was considerably higher with 40% 1-propanol than with 60% ethanol, which is typically used for prolamin extraction. The prolamin levels of rye were lowered by 99.5% from the original levels when an enzyme-active rye-malt sourdough system was used for prolamin degradation. Such extensive degradation of rye prolamins suggests the use of sourdough as part of gluten-free baking. Deamidation increases the diversity of prolamins and improves their solubility and their ability to form structures such as emulsions and foams. Deamidation changes the protein structure, which has consequences for antibody recognition in gluten analysis. According to the results of the present work, the analysis methods were not able to quantify wheat gluten after deamidation except at very high concentrations. Consequently, deamidated gluten peptides can exist in food products and remain undetected, and thus pose a risk to people with gluten intolerance. The results of this thesis demonstrate that current gluten analysis methods cannot accurately quantify prolamins in all food matrices. New information on the prolamins of rye and barley, in addition to wheat prolamins, is also provided in this thesis, which is essential for improving gluten analysis methods so that they can more accurately quantify prolamins from harmful cereals.
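
Quantification in immunoassays of this kind rests on reading sample absorbances off a calibrant curve. As a hedged sketch of that step (illustrative standards, scipy assumed available, and not necessarily the calibration model used in the thesis), a four-parameter logistic curve can be fitted and inverted:

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: a = response at zero, d = upper plateau, c = midpoint, b = slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Illustrative gliadin standards (ng/ml) and their absorbances
conc = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
od   = np.array([0.15, 0.28, 0.52, 0.95, 1.60])

params, _ = curve_fit(four_pl, conc, od,
                      p0=[0.05, 1.5, 40.0, 2.5],
                      bounds=([0.0, 0.5, 1.0, 1.0], [0.5, 5.0, 500.0, 5.0]))

def concentration_from_od(y, a, b, c, d):
    """Invert the fitted 4PL curve to read a concentration from an absorbance."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

sample_od = 0.70
print(f"gliadin = {concentration_from_od(sample_od, *params):.1f} ng/ml")
```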

Relevance:

10.00%

Publisher:

Abstract:

Parkinson's disease (PD) is a neurodegenerative movement disorder resulting from the loss of dopaminergic (DA) neurons in the substantia nigra (SN). Possible causative treatment strategies for PD include neurotrophic factors, which protect and in some cases restore the function of dopaminergic neurons. The glial cell line-derived neurotrophic factor (GDNF) family of neurotrophic factors has to date provided the most promising candidates for the treatment of PD, demonstrating both neuroprotective and neurorestorative properties. We have investigated the role of GDNF in the rodent dopaminergic system and its possible crosstalk with other growth factors. We characterized GDNF-induced gene expression changes by DNA microarray analysis in different neuronal systems, including in vitro cultured Neuro2A cells treated with GDNF as well as midbrains from GDNF heterozygous (Hz) knockout mice. These microarray experiments resulted in the identification of GDNF-induced genes, which were also confirmed by other methods. Further analysis of the dopaminergic system of GDNF Hz mice, which show an approximately 40% reduction in GDNF levels, revealed increased intracellular dopamine concentrations and FosB/DeltaFosB expression in striatal areas. These animals did not show any significant changes in behavioural analyses of the effects of acute and repeated cocaine administration on locomotor activity, nor did they exhibit any changes in dopamine output following acute cocaine treatment. We further analysed the significance of signalling through the GDNF receptor RET in the dopaminergic system of MEN2B knock-in animals carrying constitutively active Ret. The MEN2B animals showed a robust increase in extracellular dopamine and its metabolite levels in the striatum, increased tyrosine hydroxylase (TH) and dopamine transporter (DAT) protein levels as shown by immunohistochemical staining and Western blotting, and increased Th mRNA levels in the SN. MEN2B mice had about 25% more DA neurons in the SN and also exhibited increased sensitivity to the stimulatory effects of cocaine. We also developed a semi-throughput in vitro micro-island assay for the quantification of neuronal survival and TH levels by computer-assisted methodology from limited amounts of tissue. This assay can be applied to initial screening for dopaminotrophic molecules as well as to chemical drug library screening, and it is applicable to any neuronal system for the screening of neurotrophic molecules. Since our microarray experiments revealed possible GDNF-VEGF-C crosstalk, we further concentrated on studying the neurotrophic effects of VEGF-C. We showed that VEGF-C acts as a neurotrophic molecule for DA neurons both in vitro and in vivo, although without an additive effect when used together with GDNF. A neuroprotective effect of VEGF-C in vivo was demonstrated in the rat 6-OHDA model of PD. The possible signalling mechanisms of VEGF-C in the nervous system were also investigated: infusion of VEGF-C into the rat brain induced ERK activation, but no direct activation of RET signalling was found in vitro. VEGF-C treatment of the rat striatum led to up-regulation of VEGFR-1-3, indicating that VEGF-C can regulate the expression level of its own receptors. The dopaminotrophic activity of VEGF-C in vivo was further supported by the increased vascular tissue observed in the neuroprotection experiments.

Relevance:

10.00%

Publisher:

Abstract:

This study explores the utility of polarimetric measurements for discriminating between hydrometeor types with the emphasis on (a) hail detection and discrimination of its size, (b) measurement of heavy precipitation, (c) identification and quantification of mixed-phase hydrometeors, and (d) discrimination of ice forms. In particular, we examine the specific differential phase, the backscatter differential phase, the correlation coefficient between vertically and horizontally polarized waves, and the differential reflectivity, collected from a storm at close range. Three range–height cross sections are analyzed together with complementary data from a prototype WSR-88D radar. The case is interesting because it demonstrates the complementary nature of these polarimetric measurands. Self-consistency among them allows qualitative and some quantitative discrimination between hydrometeors.
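
As a hedged sketch of how two of these measurands relate to the received signals (synthetic voltage samples, no radar calibration or noise correction), differential reflectivity and the copolar correlation coefficient can be estimated as follows:

```python
import numpy as np

def differential_reflectivity_db(zh_linear, zv_linear):
    """Z_DR = 10 log10(Z_h / Z_v), from linear-scale reflectivity factors."""
    return 10.0 * np.log10(zh_linear / zv_linear)

def copolar_correlation(vh, vv):
    """|rho_hv| estimated from complex voltage samples of the H and V channels."""
    num = np.abs(np.vdot(vh, vv))
    den = np.sqrt(np.vdot(vh, vh).real * np.vdot(vv, vv).real)
    return num / den

# Synthetic, strongly correlated H and V channel samples
rng = np.random.default_rng(0)
vh = rng.normal(size=256) + 1j * rng.normal(size=256)
vv = 0.9 * vh + 0.1 * (rng.normal(size=256) + 1j * rng.normal(size=256))

zh = np.mean(np.abs(vh) ** 2)
zv = np.mean(np.abs(vv) ** 2)
print(f"Z_DR   = {differential_reflectivity_db(zh, zv):.2f} dB")
print(f"rho_hv = {copolar_correlation(vh, vv):.3f}")
```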

Relevance:

10.00%

Publisher:

Abstract:

The formation of nanoscale liquid droplets by friction of a solid is observed in real-time. This is achieved using a newly developed in situ transmission electron microscope (TEM) triboprobe capable of applying multiple reciprocating wear cycles to a nanoscale surface. Dynamical imaging of the nanoscale cyclic rubbing of a focused-ion-beam (FIB) processed Al alloy by diamond shows that the generation of nanoscale wear particles is followed by a phase separation to form liquid Ga nanodroplets and liquid bridges. The transformation of a two-body system to a four-body solid-liquid system within the reciprocating wear track significantly alters the local dynamical friction and wear processes. Moving liquid bridges are observed in situ to play a key role at the sliding nanocontact, interacting strongly with the highly mobile nanoparticle debris. In situ imaging demonstrates that both static and moving liquid droplets exhibit asymmetric menisci due to nanoscale surface roughness. Nanodroplet kinetics are furthermore dependent on local frictional temperature, with solid-like surface nanofilaments forming on cooling. TEM nanotribology opens up new avenues for the real-time quantification of cyclic friction, wear and dynamic solid-liquid nanomechanics, which will have widespread applications in many areas of nanoscience and nanotechnology.

Relevance:

10.00%

Publisher:

Abstract:

Polycyclic aromatic hydrocarbons (PAHs) are environmental pollutants as well as well-known carcinogens. Therefore, it is important to develop an effective receptor for the detection and quantification of such molecules in solution. In view of this, a 1,3-dinaphthalimide derivative of calix[4]arene (L) has been synthesized and characterized, and its structure has been established by single-crystal XRD. In the crystal lattice, intermolecular arm-to-arm π···π overlap dominates, and thus L becomes a promising receptor for providing interactions with aromatic species in solution, which can be monitored by following the changes that occur in its fluorescence and absorption spectra. On the basis of solution studies carried out with about 17 derivatives of aromatic guest molecular systems, it may be concluded that the changes in fluorescence intensity are roughly proportional to the number of aromatic rings present, and thus to the extent of π···π interaction between the naphthalimide moieties and the aromatic portion of the guest molecule. Though the nonaromatic portion of the guest species affects the fluorescence quenching, the trend is still governed by the number of aromatic rings present. Four guest aldehydes bind to L with K_ass values of 2000-6000 M^-1, and their minimum detection limits are in the range of 8-35 µM. The crystal structure of a naphthaldehyde complex, L.2b, exhibits intermolecular arm-to-arm as well as arm-to-naphthaldehyde π···π interactions. Molecular dynamics studies of L carried out in the presence of aromatic aldehydes under vacuum as well as in acetonitrile reproduced the interactions observed in the solid state, and hence the changes observed in the fluorescence and absorption spectra are attributable to such interactions. Complex formation has also been delineated through ESI-MS studies. Thus L is a promising receptor that can recognize PAHs by providing spectral changes proportional to the aromatic conjugation of the guest and the extent of aromatic π···π interactions between L and the guest.
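
Association constants of the magnitude quoted (2000-6000 M^-1) are typically extracted from fluorescence titration data. The sketch below uses a linear Stern-Volmer treatment of the quenching (synthetic intensities, numpy assumed); the actual work may well have fitted a full 1:1 binding isotherm instead, so this is only an illustration of the principle:

```python
import numpy as np

# Synthetic titration: fluorescence of L quenched by increasing guest concentration
guest_M = np.array([0.0, 1e-4, 2e-4, 4e-4, 8e-4])    # [guest] in mol/l
F       = np.array([1.00, 0.77, 0.63, 0.45, 0.29])   # normalised emission intensity

# Stern-Volmer: F0/F = 1 + K*[Q]; the slope of (F0/F - 1) vs [Q] gives K
F0 = F[0]
K, intercept = np.polyfit(guest_M, F0 / F - 1.0, 1)
print(f"K = {K:.0f} M^-1")    # about 3000 M^-1 for these synthetic numbers
```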

Relevance:

10.00%

Publisher:

Abstract:

Bioconversion of acyclic isoprenoids using a strain of Aspergillus niger results in hydroxylated metabolites formed with regio- and stereoselectivity. The organism oxidizes the terminal allylic methyl group and the remote double bond in all the compounds tested (I-VII); however, these two activities seem to have preferential structural requirements. When an acyclic isoprenoid with a ketone functionality, such as geranylacetone, is used as the substrate, the organism also carries out asymmetric reduction of the keto group. All the metabolites formed have been purified and characterized by conventional spectroscopic methods, and quantification has been carried out by gas chromatographic analysis.

Relevance:

10.00%

Publisher:

Abstract:

The problem of quantifying the intelligence of humans and of intelligent systems has been a challenging and controversial topic. IQ tests have traditionally been used to quantify human intelligence based on the results of tests designed by psychologists. Intelligence is in general very difficult to quantify. In this paper, the authors consider a simple question-answering (Q-A) system and use it to quantify intelligence. The authors quantify intelligence as a vector with three components: a measure of the knowledge shown in asking questions, the effectiveness of the questions asked, and the correctness of deduction. The authors formalize these parameters and have conducted experiments on humans to measure them.
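
Because the paper's formal definitions of the three components are not reproduced in this abstract, the sketch below only illustrates the idea of reporting intelligence as a vector computed from a Q-A session log; the component definitions (simple ratios) are placeholder assumptions of mine, not the authors' formalization:

```python
from dataclasses import dataclass

@dataclass
class IntelligenceVector:
    """Three-component quantification of a question-answering session."""
    knowledge_in_questions: float   # how well-informed the questions were
    question_effectiveness: float   # how much the questions narrowed the answer space
    deduction_correctness: float    # fraction of deductions that were correct

def score_session(questions_asked, informative_questions,
                  candidates_eliminated, candidates_total,
                  deductions_made, correct_deductions):
    """Placeholder ratios standing in for the paper's formal parameters."""
    return IntelligenceVector(
        knowledge_in_questions=informative_questions / questions_asked,
        question_effectiveness=candidates_eliminated / candidates_total,
        deduction_correctness=correct_deductions / deductions_made,
    )

# Hypothetical session: 10 questions asked, 7 judged informative, 18 of 20
# candidate answers eliminated, 4 of 5 deductions correct
print(score_session(10, 7, 18, 20, 5, 4))
```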