902 results for Application method


Relevance:

30.00%

Publisher:

Abstract:

Studies testing the High Energy Moisture Characteristic (HEMC) technique in tropical soils are still scarce. The method makes it possible to evaluate the effects of different management systems on aggregate stability. This study used HEMC to investigate the aggregation state of an Oxisol under coffee, with Brachiaria between crop rows and surface-applied gypsum. Soil from an experimental area in the Upper São Francisco region, Minas Gerais, was sampled at depths of 0.05 and 0.20 m in the coffee rows. The treatments consisted of 0, 7, and 28 Mg ha⁻¹ of agricultural gypsum distributed on the soil surface of the coffee rows, between which Brachiaria was grown and periodically cut, and were compared with a treatment without Brachiaria between the coffee rows and without gypsum. To determine the aggregation state by the HEMC method, soil aggregates were placed in a Büchner funnel (500 mL) and wetted with a peristaltic pump fitted with a volumetric syringe. Wetting was applied at two increasing pre-set rates: slow (2 mm h⁻¹) and fast (100 mm h⁻¹). Once saturated, the aggregates were exposed to a gradually increasing tension, applied by displacing a water column from 0 to 30 cm, to obtain the moisture retention curve [M = f(Ψ)] from which the stability parameters were calculated: modal suction, volume of drainable pores (VDP), stability index (slow and fast), VDP ratio, and stability ratio. The HEMC method proved sensitive in quantifying the aggregate stability parameters and showed that, regardless of gypsum use, the soil managed with Brachiaria between the coffee rows, with regular cuts deposited along the crop rows, was less susceptible to disaggregation.
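
As a rough illustration of how such parameters can be read off a measured retention curve, here is a minimal sketch; the definitions of modal suction and VDP used below (peak of the specific water capacity, water released over the 0-30 cm range) and the synthetic curves are assumptions, not the authors' exact procedure.

```python
import numpy as np

def hemc_parameters(psi_cm, moisture):
    """Stability descriptors from one HEMC wetting run.

    psi_cm   : applied tensions (cm of water column, increasing 0-30)
    moisture : measured moisture M at each tension, M = f(psi)
    """
    # Specific water capacity dM/dpsi; its peak locates the modal suction.
    capacity = -np.gradient(moisture, psi_cm)
    modal_suction = psi_cm[np.argmax(capacity)]
    # Volume of drainable pores, taken here as water released over 0-30 cm.
    vdp = moisture[0] - moisture[-1]
    return modal_suction, vdp

# Synthetic curves standing in for measured slow (2 mm/h) and fast (100 mm/h)
# wetting runs; fast wetting slakes aggregates and shifts/reduces drainage.
psi = np.linspace(0.0, 30.0, 61)
m_slow = 0.55 - 0.20 / (1.0 + np.exp(-(psi - 12.0) / 2.0))
m_fast = 0.55 - 0.12 / (1.0 + np.exp(-(psi - 8.0) / 2.5))
ms_s, vdp_s = hemc_parameters(psi, m_slow)
ms_f, vdp_f = hemc_parameters(psi, m_fast)
ratio = vdp_f / vdp_s   # one possible "VDP ratio"; closer to 1 = more stable
```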

Relevance:

30.00%

Publisher:

Abstract:

The Organization of the Thesis. The remainder of the thesis comprises five chapters and a conclusion. The next chapter formalizes the envisioned theory into a tractable model. Section 2.2 presents a formal description of the model economy: the individual heterogeneity, the individual objective, the UI setting, the population dynamics, and the equilibrium. The welfare and efficiency criteria for qualifying various equilibrium outcomes are proposed in section 2.3. The fourth section shows how the model-generated information can be computed. Chapter 3 transposes the model of chapter 2 into conditions that enable its use in the analysis of individual labor market strategies and their implications for the labor market equilibrium. In section 3.2 the Swiss labor market data sets, stylized facts, and the UI system are presented. The third section outlines and motivates the parameterization method. In section 3.4 the model's replication ability is evaluated and some aspects of the parameter choice are discussed. Numerical solution issues are addressed in the appendix. Chapter 4 examines the determinants of search-strategic behavior in the model economy and its implications for the labor market aggregates. In section 4.2, the unemployment duration distribution is examined and related to search strategies. Section 4.3 shows how search-strategic behavior is influenced by UI eligibility, and section 4.4 how it is determined by individual heterogeneity. The composition effects generated by search strategies in labor market aggregates are examined in section 4.5. The last section evaluates the model's replication of the empirical unemployment escape frequencies reported in Sheldon [67]. Chapter 5 applies the model economy to examine the effects on the labor market equilibrium of shocks to the labor market risk structure, to the deep underlying labor market structure, and to the UI setting. Section 5.2 examines the effects of the labor market risk structure on the labor market equilibrium and on strategic behavior. The effects of alterations in the deep structural parameters of the labor market, i.e. individual preferences and production technology, are shown in section 5.3. Finally, the impacts of the UI setting on the labor market are studied in section 5.4. This section also evaluates the role of UI authority monitoring and the differences in the way changes in the replacement rate and in the UI benefit duration affect the labor market. In chapter 6 the model economy is applied in counterfactual experiments to assess several aspects of the Swiss labor market movements in the nineties. Section 6.2 examines the two equilibria characterizing the Swiss labor market in the nineties: the "growth" equilibrium with a "moderate" UI regime and the "recession" equilibrium with a more "generous" UI. Section 6.3 evaluates the isolated effects of the structural shocks, while the isolated effects of the UI reforms are analyzed in section 6.4. Particular dimensions of the UI reforms, namely the duration, replacement rate, and tax rate effects, are studied in section 6.5, while labor market equilibria without benefits are evaluated in section 6.6. In section 6.7 the structural and institutional interactions that may act as unemployment amplifiers are discussed in view of the obtained results. A welfare analysis based on individual welfare in different structural and UI settings is presented in the eighth section. Finally, the results are related to the more favorable unemployment trends after 1997.
The conclusion evaluates the features embodied in the model economy with respect to the resulting model dynamics, in order to derive lessons from the model design. The thesis ends by proposing guidelines for future improvements of the model and directions for further research.

Relevance:

30.00%

Publisher:

Abstract:

Alpha1-acid glycoprotein (AAG), or orosomucoid, was purified to homogeneity from human plasma by a two-step method combining chromatography on Cibacron Blue F3G-A immobilized on cross-linked agarose and chromatography on hydroxyapatite. The conditions for the pre-purification of AAG by chromatography on immobilized Cibacron Blue F3G-A were first optimized using different buffer systems with different pH values. The overall yield of the combined techniques was 80%, and ca. 12 mg of AAG were purified from an initial total amount of ca. 15 mg in a ca. 40 mL sample of human plasma. The method was applied to the purification of AAG samples corresponding to the three main phenotypes of the protein (F1*S/A, F1/A, and S/A) from individual human plasmas previously phenotyped for AAG. A study by isoelectric focusing with carrier ampholytes showed that the microheterogeneity of the purified F1*S/A, F1/A, and S/A AAG samples was similar to that of AAG in the corresponding plasma, suggesting that no apparent desialylation of the glycoprotein occurred during the purification steps. The method was also applied to the purification of AAG samples corresponding to rare phenotypes of the protein (F1/A*AD, S/A*X0, and F1/A*C1), and the interactions of these variants with immobilized copper(II) ions were then studied at pH 7 by chromatography on an iminodiacetate-Sepharose-Cu(II) gel. The variants encoded by the first of the two genes coding for AAG in humans (i.e. the F1 and S variants) interacted only non-specifically with the immobilized ligand, whereas those encoded by the second AAG gene (i.e. the A, AD, X0, and C1 variants) bound strongly to the immobilized Cu(II) ions. These results suggest that chromatography on an immobilized Cu(II) affinity adsorbent could help distinguish between the respective products of the two highly polymorphic genes coding for human AAG.

Relevance:

30.00%

Publisher:

Abstract:

Background. Accurate automatic segmentation of the caudate nucleus in magnetic resonance images (MRI) of the brain is of great interest in the analysis of developmental disorders. Segmentation methods based on a single atlas or on multiple atlases have been shown to localize the caudate structure suitably. However, the atlas prior information may not represent the structure of interest correctly, so a more flexible technique may be needed for accurate segmentation. Method. We present CaudateCut: a new fully automatic method for segmenting the caudate nucleus in MRI. CaudateCut combines an atlas-based segmentation strategy with the Graph Cut energy-minimization framework. We adapt the Graph Cut model to make it suitable for segmenting small, low-contrast structures such as the caudate nucleus by defining new data and boundary potentials for the energy function. In particular, we exploit intensity and geometry information, and we add supervised energies based on contextual brain structures. Furthermore, we reinforce boundary detection using a new multi-scale edgeness measure. Results. We apply the new CaudateCut method to the segmentation of the caudate nucleus in a new set of 39 pediatric attention-deficit/hyperactivity disorder (ADHD) patients and 40 control children, as well as in a public database of 18 subjects. We evaluate the quality of the segmentation using several volumetric and voxel-by-voxel measures. Our results show improved segmentation performance compared with state-of-the-art approaches, with a mean overlap of 80.75%. Moreover, we present a quantitative volumetric analysis of caudate abnormalities in pediatric ADHD whose results show strong correlation with expert manual analysis. Conclusion. CaudateCut generates segmentation results comparable to gold-standard segmentations, and it is reliable for analyzing the neuroanatomical abnormalities that differentiate healthy controls from pediatric ADHD patients.
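
The abstract summarizes rather than specifies CaudateCut's energies, so the following is only a minimal sketch of the generic Graph Cut binary labeling it builds on, using the third-party PyMaxflow library; the intensity models, the constant Potts boundary weight, and all numbers are illustrative assumptions.

```python
import numpy as np
import maxflow  # PyMaxflow (pip install PyMaxflow)

def graph_cut_segment(img, mu_fg, mu_bg, smoothness=250.0):
    """Generic binary Graph Cut on an image grid (not the CaudateCut energies).

    Data term: squared distance of each pixel to foreground/background
    intensity models. Boundary term: constant Potts penalty between
    4-connected neighbours.
    """
    g = maxflow.Graph[float]()
    nodes = g.add_grid_nodes(img.shape)
    g.add_grid_edges(nodes, smoothness)   # boundary potentials
    # t-links: a pixel pays (img - mu_fg)^2 if labeled foreground and
    # (img - mu_bg)^2 if labeled background, so it prefers the closer model.
    g.add_grid_tedges(nodes, (img - mu_fg) ** 2, (img - mu_bg) ** 2)
    g.maxflow()                           # solve the min-cut
    return g.get_grid_segments(nodes)     # True = foreground under this setup

# Toy image: a dim, low-contrast blob on a darker background, plus noise.
rng = np.random.default_rng(0)
img = np.full((64, 64), 40.0)
img[20:44, 20:44] = 90.0
img += rng.normal(0.0, 10.0, img.shape)
mask = graph_cut_segment(img, mu_fg=90.0, mu_bg=40.0)
```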

Relevance:

30.00%

Publisher:

Abstract:

A novel and simple procedure for concentrating adenoviruses from seawater samples is described. The technique entails adsorbing the viruses to pre-flocculated skimmed-milk proteins, allowing the flocs to sediment by gravity, and dissolving the separated sediment in phosphate buffer. The concentrated virus can be detected by PCR techniques following nucleic acid extraction. The method requires no specialized equipment beyond that usually available in routine public health laboratories, and its simplicity allows a large number of water samples to be processed simultaneously. The usefulness of the method was demonstrated by concentrating virus from multiple seawater samples during a survey of adenoviruses in coastal waters.

Relevance:

30.00%

Publisher:

Abstract:

Estrone is a powerful growth-inducing hormone that is present in milk, mainly in the form of fatty acid esters, at concentrations that promote growth in experimental animals. We present a method for measuring this natural hormone in foods and apply it to several common dairy products. Samples were frozen, finely powdered, and lyophilized, then extracted with trichloromethane/methanol; the dry extract was saponified with potassium hydroxide. The free estrone released was extracted with ethyl acetate and used to estimate the total estrone content by radioimmunoassay. Application of the method to dairy products showed high relative levels of total estrone (essentially acyl-estrone) in milk, in the range of 1 µM, which were halved in skimmed milk. Free estrone levels were much lower, in the nanomolar range. A large proportion of estrone esters was present in all the other dairy products, correlating fairly well with their fat content. The amount of estrone carried by milk is well within the range where its intake may elicit a physiological response in the sucklings for which the milk is intended. These growth-inducing and energy-expenditure-lowering effects may also affect humans who ingest significant amounts of dairy products.

Relevance:

30.00%

Publisher:

Abstract:

A method is proposed for estimating the absolute binding free energy of interaction between proteins and ligands. Conformational sampling of the protein-ligand complex is performed by molecular dynamics (MD) in vacuo, and the solvent effect is calculated a posteriori by solving the Poisson or Poisson-Boltzmann equation for selected frames of the trajectory. The binding free energy is written as a linear combination of the surface buried upon complexation, SASbur; the electrostatic interaction energy between the ligand and the protein, Eelec; and the difference between the solvation free energies of the complex and of the isolated ligand and protein, ΔGsolv. The method uses the buried surface to account for the non-polar contribution to the binding free energy because it is less sensitive to the details of the structure than the van der Waals interaction energy. The parameters of the method were developed on a training set of 16 HIV-1 protease-inhibitor complexes of known 3D structure, giving a correlation coefficient of 0.91 and an unsigned mean error of 0.8 kcal/mol. When applied to a set of 25 HIV-1 protease-inhibitor complexes of unknown 3D structure, the method provides a satisfactory correlation between the calculated binding free energy and the experimental pIC50 without reparameterization.
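
The abstract specifies the linear form but not the fitted coefficients; the sketch below fits ΔGbind = a·SASbur + b·Eelec + c·ΔGsolv (plus an intercept) to placeholder training data by least squares. All descriptor values and experimental energies are invented.

```python
import numpy as np

# Placeholder descriptors for a hypothetical training set of 16 complexes:
# buried surface area, protein-ligand electrostatics, solvation difference.
rng = np.random.default_rng(1)
N = 16
X = np.column_stack([
    rng.uniform(600, 1200, N),   # SASbur  (A^2)
    rng.uniform(-80, -20, N),    # Eelec   (kcal/mol)
    rng.uniform(10, 60, N),      # dGsolv  (kcal/mol)
])
dG_exp = rng.uniform(-15, -8, N)  # "experimental" binding free energies

# Fit dG = a*SASbur + b*Eelec + c*dGsolv + d by linear least squares.
A = np.column_stack([X, np.ones(N)])
coef, *_ = np.linalg.lstsq(A, dG_exp, rcond=None)
dG_pred = A @ coef
r = np.corrcoef(dG_exp, dG_pred)[0, 1]     # correlation coefficient
mue = np.abs(dG_exp - dG_pred).mean()      # unsigned mean error (kcal/mol)
```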

Relevance:

30.00%

Publisher:

Abstract:

Terrestrial laser scanning (TLS) is one of the most promising surveying techniques for rock-slope characterization and monitoring. Landslide and rockfall movements can be detected by comparing sequential scans. One of the most pressing challenges in natural hazard assessment is the combined temporal and spatial prediction of rockfall. An outdoor experiment was performed to ascertain whether the TLS instrumental error is small enough to detect precursory displacements of millimetric magnitude; it consisted of known displacements of three objects relative to a stable surface. The results show that millimetric changes cannot be detected by analyzing the unprocessed datasets. Displacement measurements improve considerably when Nearest Neighbour (NN) averaging is applied, which reduces the error (1σ) by up to a factor of 6. This technique was applied to displacements prior to the April 2007 rockfall event at Castellfollit de la Roca, Spain. The maximum precursory displacement measured was 45 mm, approximately 2.5 times the standard deviation of the model comparison, which hampers the distinction between actual displacement and instrumental error using conventional methodologies. Encouragingly, the precursory displacement was clearly detected by applying the NN averaging method. These results show that millimetric displacements prior to failure can be detected using TLS.
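
The NN averaging step is not detailed in the abstract; one plausible reading, sketched below with invented names and numbers, replaces each point's scan-to-scan difference by the mean over its k nearest neighbours, so uncorrelated instrumental noise shrinks roughly as 1/sqrt(k).

```python
import numpy as np
from scipy.spatial import cKDTree

def nn_average(points, values, k=25):
    """Replace each point's value by the mean over its k nearest neighbours.

    points : (N, 3) TLS point coordinates (m)
    values : (N,)  per-point scan-to-scan differences (m)
    """
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)    # idx: (N, k) neighbour indices
    return values[idx].mean(axis=1)

# Synthetic check: a 2 mm step buried in 5 mm (1-sigma) instrumental noise.
rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 10.0, (20000, 3))
true = np.where(pts[:, 0] > 5.0, 0.002, 0.0)
noisy = true + rng.normal(0.0, 0.005, len(pts))
smoothed = nn_average(pts, noisy)   # noise drops by ~sqrt(25) = 5x
```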

Relevance:

30.00%

Publisher:

Abstract:

A score system integrating the evolution of efficacy and tolerability over time was applied to a subpopulation of the STRATHE trial, a parallel-group, double-blind trial with random allocation to a fixed-dose combination strategy (perindopril/indapamide 2 mg/0.625 mg, with the possibility to increase the dose to 3 mg/0.935 mg and 4 mg/1.250 mg if needed, n = 118), a sequential monotherapy approach (atenolol 50 mg, followed by losartan 50 mg and amlodipine 5 mg if needed, n = 108), or a stepped-care strategy (valsartan 40 mg, followed by valsartan 80 mg and valsartan 80 mg + hydrochlorothiazide 12.5 mg if needed, n = 103). The aim was to lower blood pressure below 140/90 mmHg within a 9-month period, with possible treatment adjustments after 3 and 6 months. Only patients in whom the study protocol was strictly applied were included in this analysis. At completion of the trial, the total score averaged 13.1 ± 70.5 (mean ± SD) with the fixed-dose combination strategy, compared with -7.2 ± 81.0 with the sequential monotherapy approach and -17.5 ± 76.4 with the stepped-care strategy. In conclusion, the score system allows antihypertensive therapeutic strategies to be compared while taking efficacy and tolerability into account at the same time. In the STRATHE trial, the best results were observed with the fixed-dose combination containing low doses of an angiotensin-converting enzyme inhibitor (perindopril) and a diuretic (indapamide).

Relevance:

30.00%

Publisher:

Abstract:

This study investigates the possibility of incorporating fracture intensity and block geometry as spatially continuous parameters in GIS-based systems. For this purpose, a deterministic method was implemented to estimate block size (Bloc3D) and joint frequency (COLTOP). In addition to measuring block size, the Bloc3D method provides a 3D representation of the shape of individual blocks. The two methods were applied using field measurements (joint set orientation and spacing) performed over a large area in the Swiss Alps, characterized by a complex geology, several different rock masses, and varying degrees of metamorphism. The spatial variability of the parameters was evaluated with regard to lithology and major faults. A model incorporating these measurements and observations into a GIS to assess the risk associated with rockfalls is proposed. The analysis concludes with a discussion of the feasibility of such an application in regularly and irregularly jointed rock masses, with persistent and impersistent discontinuities.
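
Bloc3D itself is not described in this abstract; as a point of reference, a classical deterministic block-size estimate for three persistent joint sets combines the set spacings with the angles between the sets (Palmström's block-volume formula), a minimal sketch of which is:

```python
import math

def block_volume(s1, s2, s3, gamma1, gamma2, gamma3):
    """Volume (m^3) of the parallelepiped block delimited by three joint sets.

    s1..s3         : mean joint spacings of the three sets (m)
    gamma1..gamma3 : angles between the joint sets (degrees)
    """
    denom = (math.sin(math.radians(gamma1)) *
             math.sin(math.radians(gamma2)) *
             math.sin(math.radians(gamma3)))
    return s1 * s2 * s3 / denom

# Three mutually orthogonal sets spaced 0.4, 0.6 and 1.0 m -> 0.24 m^3 blocks.
print(block_volume(0.4, 0.6, 1.0, 90.0, 90.0, 90.0))
```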

Relevance:

30.00%

Publisher:

Abstract:

Single amino acid substitution is the type of protein alteration most often related to human diseases. Current studies seek primarily to distinguish neutral mutations from harmful ones, and very few methods offer an explanation of the final prediction in terms of the probable structural or functional effect on the protein. In this study, we describe the use of three novel parameters to identify experimentally verified critical residues of the TP53 protein (p53). The first two parameters use a surface clustering method to calculate the protein surface area of highly conserved regions or of regions with a high non-local atomic interaction energy (ANOLEA) score; these parameters help identify important functional regions on the surface of a protein. The third parameter uses a new method for pseudo-binding free-energy estimation to probe the importance of residue side-chains for the stability of the protein fold. A decision tree was designed to optimally combine the three parameters, and the result was compared with the functional data stored in the International Agency for Research on Cancer (IARC) TP53 mutation database. The final prediction achieved an accuracy of 70%, a Matthews correlation coefficient of 0.45, and a high specificity of 91.8%. Mutations in the 85 correctly identified important residues represented 81.7% of the total mutations recorded in the database. In addition, the method was able to correctly assign a probable functional or structural role to the residues. Such information could be critical for interpreting and predicting the effect of missense mutations, as it not only provides a fundamental explanation of the observed effect but also helps design the most appropriate laboratory experiment to verify the prediction.
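
To relate the reported figures to a confusion matrix, here is a minimal computation of accuracy, specificity, and the Matthews correlation coefficient; the counts are invented so as to approximately reproduce the reported 70%, 91.8%, and 0.45 (the paper's actual confusion matrix is not given in the abstract).

```python
import math

def classification_metrics(tp, fp, tn, fn):
    """Accuracy, specificity and MCC from binary confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    specificity = tn / (tn + fp)
    mcc = ((tp * tn - fp * fn) /
           math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
    return accuracy, specificity, mcc

# Hypothetical counts: accuracy 0.70, specificity 0.918, MCC ~ 0.44.
print(classification_metrics(tp=241, fp=41, tn=459, fn=259))
```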

Relevance:

30.00%

Publisher:

Abstract:

The construction of a second metro line (M2) through downtown Lausanne, starting in 2004, provided the opportunity to develop a methodology for microgravity surveys in a disturbed urban environment. Terrain corrections take on a special meaning in such a setting, since numerous non-geological objects of anthropogenic origin, in particular voids such as basements, perturb the gravity measurements. The civil engineering studies for the metro project supplied a large amount of cadastral information: building outlines, the planned position of the M2 tube, basement depths in its vicinity, and the geology encountered along the M2 corridor (from the lithological logs of geotechnical boreholes). The planimetry of the basements was derived from the building outlines in a GIS (Geographic Information System), while a door-to-door survey was needed to measure or estimate basement heights. An existing DEM (Digital Elevation Model) of the city of Lausanne, on a one-metre grid, was then updated with the voids represented by these basements. The gravity survey cycles were processed in Access databases, allowing better data control, faster processing, and easier retroactive terrain correction whenever the topographic surface was updated during construction. The Caroline district (between the Bessières bridge and Place de l'Ours) was chosen as the study area because, during this thesis, it spanned both the pre- and post-excavation phases of the M2 tunnel. This allowed two gravity surveys to be carried out (before excavation in summer 2005 and after excavation in summer 2007). These repeat surveys made it possible to test our model of the tunnel: by comparing the differences between the two surveys with the gravity response of the tube discretized into rectangular prisms, we were able to validate the modeling method. The modeling approach we developed allows the shape of the object under consideration to be built in detail, with the possibility of crossing geological interfaces and the topographic surface several times. This type of modeling can be applied to any linear anthropogenic structure.
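
The gravity response of a tube discretized into rectangular prisms can be computed from the classical closed-form attraction of a single prism (Nagy, 1966); the sketch below is a generic implementation with invented tunnel dimensions, not the thesis code.

```python
import numpy as np

G = 6.674e-11  # gravitational constant (m^3 kg^-1 s^-2)

def prism_gz(x, y, z, rho):
    """Vertical attraction (m/s^2) at the origin of a rectangular prism.

    x, y, z : (min, max) coordinates of the prism faces relative to the
              observation point, z positive downward with both z bounds > 0
    rho     : density contrast (kg/m^3); negative for a void such as a tunnel
    """
    gz = 0.0
    for i, xi in enumerate(x):
        for j, yj in enumerate(y):
            for k, zk in enumerate(z):
                r = np.sqrt(xi**2 + yj**2 + zk**2)
                sign = (-1.0) ** (i + j + k)
                gz += sign * (xi * np.log(yj + r)
                              + yj * np.log(xi + r)
                              - zk * np.arctan2(xi * yj, zk * r))
    return G * rho * gz

# Empty tube section 6 m wide and high, 1 m long, roof 15 m deep: roughly a
# -2 microGal contribution at the surface point directly above it.
print(prism_gz((-3, 3), (-0.5, 0.5), (15, 21), rho=-2300) * 1e8)  # microGal
```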

Relevance:

30.00%

Publisher:

Abstract:

The main objective of this work was to compare two methods for estimating the deposition of pesticide applied by aerial spraying. One hundred and fifty pieces of water-sensitive paper were distributed over an area 50 m long by 75 m wide to sample the droplets sprayed by an aircraft calibrated to apply a spray volume of 32 L/ha. The samples were analysed by a visual microscopy method using a NG 2 Porton graticule and by an image-analysis computer program. The results obtained with the visual microscopy method were: volume median diameter, 398 ± 62 µm; number median diameter, 159 ± 22 µm; droplet density, 22.5 ± 7.0 droplets/cm²; and estimated deposited volume, 22.2 ± 9.4 L/ha. The corresponding values obtained with the computer program were 402 ± 58 µm, 161 ± 32 µm, 21.9 ± 7.5 droplets/cm², and 21.9 ± 9.2 L/ha. The computer program also produced maps of the spatial distribution of droplet density and deposited spray volume over the area.
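
The two median diameters compared above are standard droplet-spectrum statistics; a minimal computation from a list of measured droplet diameters (assuming stain sizes have already been converted to droplet sizes) might look like this:

```python
import numpy as np

def nmd_vmd(diams_um):
    """Number and volume median diameters from droplet diameters (um).

    NMD: half the droplets are smaller than this diameter.
    VMD: droplets smaller than this carry half the total spray volume.
    """
    d = np.sort(np.asarray(diams_um, dtype=float))
    nmd = np.median(d)
    cum = np.cumsum(d ** 3) / np.sum(d ** 3)  # volume ~ d^3 (pi/6 cancels)
    vmd = d[np.searchsorted(cum, 0.5)]
    return nmd, vmd

# Example with a synthetic log-normal spray spectrum (~150 um number median).
rng = np.random.default_rng(2)
sample = rng.lognormal(mean=5.0, sigma=0.4, size=2000)
print(nmd_vmd(sample))
```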

Relevance:

30.00%

Publisher:

Abstract:

Purpose: Despite the fundamental role of ecosystem goods and services in sustaining human activities, there is no harmonized and internationally agreed method for including them in life cycle assessment (LCA). The main goal of this study was to develop a globally applicable and spatially resolved method for assessing land-use impacts on the erosion-regulation ecosystem service. Methods: Soil erosion depends strongly on location. Thus, unlike conventional LCA, the endpoint method was regionalized at the grid-cell level (5 arc-minutes, approximately 10 × 10 km²) to reflect the spatial conditions of each site. The spatially explicit characterization factors were not further aggregated at broader spatial scales. Results and discussion: Life cycle inventory data on topsoil and topsoil organic carbon (SOC) losses were interpreted at the endpoint level in terms of the ultimate damage to soil resources and ecosystem quality; human health damages were excluded from the assessment. The method was tested in a case study of five three-year agricultural rotations, two of them with energy crops, grown in several locations in Spain. A large variation in soil and SOC losses was recorded in the inventory step, depending on climatic and edaphic conditions, which demonstrates the importance of using a spatially explicit model and characterization factors. Conclusions and outlook: The regionalized assessment takes into account the differences in erosion-related environmental impacts caused by the great variability of soils. Taking this regionalized framework as a starting point, further research should focus on testing the applicability of the method through the complete life cycle of a product and on determining an appropriate spatial scale at which to aggregate characterization factors, in order to deal with gaps in data on the location of processes, especially in the background system. Additional research should also aim to improve the reliability of the method by quantifying and, insofar as possible, reducing uncertainty.
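
At its core, regionalized characterization multiplies each inventory flow by the characterization factor of the grid cell where it occurs; a schematic sketch with invented cell identifiers and values:

```python
# Regionalized impact score: sum over grid cells of inventory flow x local CF.
# Cell identifiers, soil losses and characterization factors are all invented.
soil_loss_t = {          # life cycle inventory: topsoil loss per 5' cell (t)
    "cell_40N_3W": 1.8,
    "cell_41N_2W": 0.4,
}
cf_soil = {              # spatially explicit characterization factors
    "cell_40N_3W": 2.6,  # damage per tonne of topsoil lost in that cell
    "cell_41N_2W": 0.9,
}
impact = sum(loss * cf_soil[cell] for cell, loss in soil_loss_t.items())
print(f"regionalized impact score: {impact:.2f}")
```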

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The annotation of protein post-translational modifications (PTMs) is an important task of UniProtKB curators and, with continuing improvements in experimental methodology, an ever greater number of articles are being published on this topic. To help curators cope with this growing body of information, we have developed a system which extracts information from the scientific literature for the most frequently annotated PTMs in UniProtKB. RESULTS: The procedure uses a pattern-matching and rule-based approach to extract sentences containing information on the type and site of modification, and also provides a ranked list of candidate proteins for the modification. For PTM extraction, precision varies from 57% to 94% and recall from 75% to 95%, according to the type of modification. The procedure was used to track new publications on PTMs and to recover potential supporting evidence for phosphorylation sites annotated on the basis of large-scale proteomics experiments. CONCLUSIONS: The information retrieval and extraction method developed in this study forms the basis of a simple tool for the manual curation of protein post-translational modifications in UniProtKB/Swiss-Prot. Our work demonstrates that even simple text-mining tools can be effectively adapted to database curation tasks, provided that a thorough understanding of the working process and requirements is first obtained. The system can be accessed at http://eagl.unige.ch/PTM/.
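
The paper's actual patterns and rules are not given in the abstract; an illustrative pattern-matching pass for one PTM type, with an invented regular expression, could look like this:

```python
import re

# Toy pattern for one PTM type: phosphorylation of Ser/Thr/Tyr at a position.
SITE_RE = re.compile(
    r"phosphorylat\w*\s+(?:of|at|on)\s+(Ser|Thr|Tyr)[-\s]?(\d+)",
    re.IGNORECASE,
)

def extract_phospho_sites(sentence):
    """Return (residue, position) pairs mentioned in one sentence."""
    return [(m.group(1), int(m.group(2))) for m in SITE_RE.finditer(sentence)]

print(extract_phospho_sites(
    "ATM mediates phosphorylation of Ser-15 and phosphorylation at Ser 20 of p53."
))
# -> [('Ser', 15), ('Ser', 20)]
```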