911 results for separation of variables
Abstract:
The purpose of this research was to study the genetic diversity and genetic relatedness of 60 grapevine genotypes from the Germplasm Bank of Embrapa Semiárido, Juazeiro, BA, Brazil. Seven previously characterized microsatellite markers were used: VVS2, VVMD5, VVMD7, VVMD27, VVMD3, ssrVrZAG79 and ssrVrZAG62. The expected heterozygosity (He) and polymorphic information content (PIC) were calculated, and cluster analysis was performed to generate a dendrogram using the UPGMA algorithm. He ranged from 81.8% to 88.1%, with a mean of 84.8%. The loci VrZAG79 and VVMD7 were the most informative, with PIC values of 87% and 86%, respectively, while VrZAG62 was the least informative, with a PIC of 80%. Cluster analysis by the UPGMA method allowed separation of the genotypes according to their genealogy and identification of possible parentage for the cultivars 'Dominga', 'Isaura', 'CG 26916', 'CG28467' and 'Roni Redi'.
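The two diversity statistics named above can be computed directly from allele frequencies. A minimal sketch follows; the frequencies used are illustrative, not values from the study.

```python
import numpy as np

def expected_heterozygosity(p):
    """He = 1 - sum(p_i^2) for allele frequencies p."""
    p = np.asarray(p, dtype=float)
    return 1.0 - np.sum(p ** 2)

def pic(p):
    """Polymorphic information content (Botstein et al. 1980):
    PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2."""
    p = np.asarray(p, dtype=float)
    s2 = np.sum(p ** 2)
    # sum_{i<j} p_i^2 p_j^2 = ((sum p^2)^2 - sum p^4) / 2
    cross = (s2 ** 2 - np.sum(p ** 4)) / 2.0
    return 1.0 - s2 - 2.0 * cross

# Hypothetical allele frequencies for one SSR locus (not from the paper)
freqs = [0.4, 0.3, 0.2, 0.1]
print(round(expected_heterozygosity(freqs), 4))  # 0.7
print(round(pic(freqs), 4))                      # 0.6454
```

The UPGMA step would then be, e.g., `scipy.cluster.hierarchy.linkage(dist, method='average')` on a pairwise genetic-distance matrix.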
Abstract:
BACKGROUND: Health-related quality of life (HRQOL) levels and their determinants in nursing home residents are unclear. The aim of this study was to investigate different HRQOL domains as a function of the degree of cognitive impairment and to explore associations between them and possible determinants of HRQOL. METHOD: Five HRQOL domains from the Minimum Data Set - Health Status Index (MDS-HSI) were investigated in a large sample of nursing home residents, stratified by cognitive performance levels derived from the Cognitive Performance Scale. We looked for large effect-size associations between clinical variables and the different HRQOL domains. RESULTS: The HRQOL domains are impaired to variable degrees but with similar profiles across cognitive performance levels. Basic activities of daily living are a major factor associated with some, but not all, HRQOL domains and vary little with the degree of cognitive impairment. LIMITATIONS: This study is limited by the general difficulties of measuring HRQOL in patients with cognitive impairment and by the reduced number of variables considered among those potentially influencing HRQOL. CONCLUSION: Not all HRQOL dimensions are linearly associated with increasing cognitive impairment in nursing home patients. Longitudinal studies are required to determine how the different HRQOL domains evolve over time in nursing home residents.
Abstract:
Nowadays, species distribution models (SDMs) are a widely used tool. Using different statistical approaches, these models reconstruct the realized niche of a species from presence data and a set of variables, often topoclimatic. Their range of uses is broad, from understanding the requirements of a single species, to designing nature reserves based on species hotspots, to modeling the impact of climate change. Most of the time these models use variables at a resolution of 50 km x 50 km or 1 km x 1 km. In some cases, however, they are used at resolutions below the kilometer scale and are then called high-resolution models (100 m x 100 m or 25 m x 25 m). Quite recently, a new kind of data has emerged enabling precision up to 1 m x 1 m and thus allowing very-high-resolution modeling. These new variables are, however, very costly and require a considerable amount of processing time, especially when they enter complex calculations such as model projections over large areas. Moreover, the importance of very-high-resolution data in SDMs has not yet been assessed and is not well understood. Some basic knowledge of what drives species presences and absences is still missing. Indeed, it is not clear whether, in mountain areas like the Alps, coarse topoclimatic gradients drive species distributions, whether fine-scale temperature or topography is more important, or whether their importance can be neglected when balanced against competition or stochasticity. In this thesis I investigated the importance of very-high-resolution data (2-5 m) in species distribution models, using very-high-resolution topographic, climatic and edaphic variables over a 2000 m elevation gradient in the Western Swiss Alps. I also investigated more local responses to these variables for a subset of species living in this area at two specific elevation belts.
During this thesis I showed that high-resolution data require very good datasets (both species data and model variables) to produce satisfactory results. Indeed, in mountain areas temperature is the most important factor driving species distributions, and it needs to be modeled at very fine resolution, rather than interpolated over large surfaces, to produce satisfactory results. Despite the intuitive idea that topography should be very important at high resolution, the results are mixed. Looking at variable importance over a large gradient, however, buffers the importance of individual variables: topographic factors were shown to be highly important at the subalpine level, but their importance decreases at lower elevations. Whereas at the montane level edaphic and land-use factors are more important, high-resolution topographic data are more important at the subalpine level. Finally, the biggest improvement in the models comes when edaphic variables are added. Adding soil variables is of high importance, and variables such as pH surpass the usual topographic predictors in importance in SDMs. To conclude, high resolution is very important in modeling but requires very good datasets. Merely increasing the resolution of the usual topoclimatic predictors is not sufficient, and the use of edaphic predictors proved fundamental to produce significantly better models. This is of primary importance, especially if these models are used to reconstruct communities or as a basis for biodiversity assessments. -- In recent years, the use of species distribution models (SDMs) has increased continuously. These models use different statistical tools to reconstruct the realized niche of a species from variables, notably climatic or topographic ones, and from presence data collected in the field.
Their applications cover many fields, from the study of a species' ecology to the reconstruction of communities or the impact of climate warming. Most of the time, these models use occurrences from global databases at a rather coarse resolution (1 km or even 50 km). Some databases nevertheless make it possible to work at high resolution, below the kilometer scale, with resolutions of 100 m x 100 m or 25 m x 25 m. Recently, a new generation of very-high-resolution data has appeared, allowing work at the meter scale. The variables that can be generated from these new data are, however, very costly and require considerable processing time. Indeed, any complex statistical computation, such as projecting species distributions over large areas, demands powerful computers and much time. Moreover, the factors governing species distributions at fine scale are still poorly known, and the importance in the models of high-resolution variables such as microtopography or temperature is not certain. Other factors, such as competition or natural stochasticity, could have an equally strong influence. My thesis work is set in this context. I sought to understand the importance of high resolution in species distribution models, whether for temperature, microtopography or edaphic variables, along a large elevation gradient in the Vaud Prealps. I also sought to understand the local impact of certain variables potentially neglected because of confounding effects along the elevation gradient.
During this thesis I was able to show that high-resolution variables, whether related to temperature or microtopography, do not by themselves bring a substantial improvement to the models. To obtain a marked improvement, it is necessary to work with richer datasets, both for the species and for the variables used. For example, the usually interpolated climate layers must be replaced by temperature layers modeled at high resolution from field data. Working along a 2000 m temperature gradient naturally makes temperature very important in the models. The importance of microtopography is negligible compared with topography at a 25 m resolution. At a more local scale, however, high-resolution topography is extremely important in the subalpine belt. At the montane belt, by contrast, variables related to soils and land use are very important. Finally, species distribution models were particularly improved by the addition of edaphic variables, mainly pH, whose importance equals or exceeds that of the topographic variables when they are added to the usual species distribution models.
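As a rough illustration of the kind of model involved, here is a minimal presence/absence SDM: a logistic regression fitted by gradient ascent on synthetic data in which temperature dominates along an elevation gradient. All numbers, including the lapse-rate model, are invented for this sketch and are not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (illustrative, not the thesis dataset): presence of an
# alpine species driven mainly by temperature along a ~2000 m gradient
n = 2000
elevation = rng.uniform(400, 2400, n)          # m a.s.l.
temperature = 12.0 - 0.0055 * elevation        # crude lapse rate, deg C
slope = rng.uniform(0.0, 40.0, n)              # microtopography proxy, deg
logit_true = 4.0 - 1.2 * temperature           # a purely temperature-driven niche
presence = rng.random(n) < 1.0 / (1.0 + np.exp(-logit_true))

# Standardize predictors and fit a minimal logistic-regression SDM
def z(v):
    return (v - v.mean()) / v.std()

X = np.column_stack([np.ones(n), z(temperature), z(slope)])
w = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.5 * X.T @ (presence - p) / n        # gradient ascent on log-likelihood

print(np.round(w, 2))  # temperature coefficient comes out strongly negative
```

The fitted coefficients recover the structure of the simulated niche: a strong negative temperature effect and a slope effect near zero, mirroring the thesis finding that fine-resolution temperature carries most of the signal.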
Abstract:
The present work describes the development of a fast and robust analytical method for the determination of 53 antibiotic residues, covering various chemical groups and some of their metabolites, in environmental matrices that are considered important sources of antibiotic pollution, namely hospital and urban wastewaters, as well as in river waters. The method is based on automated off-line solid phase extraction (SPE) followed by ultra-high-performance liquid chromatography coupled to quadrupole linear ion trap tandem mass spectrometry (UHPLC–QqLIT). For unequivocal identification and confirmation, and in order to fulfill EU guidelines, two selected reaction monitoring (SRM) transitions per compound are monitored (the most intense one is used for quantification and the second one for confirmation). Quantification of target antibiotics is performed by the internal standard approach, using one isotopically labeled compound for each chemical group, in order to correct matrix effects. The main advantages of the method are the automation and speed-up of sample preparation through reduced extraction volumes for all matrices, the fast separation of a wide spectrum of antibiotics by ultra-high-performance liquid chromatography, its sensitivity (limits of detection in the low ng/L range) and its selectivity (due to the use of tandem mass spectrometry). The inclusion of β-lactam antibiotics (penicillins and cephalosporins), which are difficult to analyze in multi-residue methods owing to their instability in water matrices, and of some antibiotic metabolites are other important benefits of the method developed. As part of the validation procedure, the method was applied to the analysis of antibiotic residues in hospital wastewaters, urban influent and effluent wastewaters, and river water samples.
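The internal-standard quantification step described above can be sketched as follows: a relative response factor (RRF) is taken from a calibration standard, then sample concentrations follow from the analyte/internal-standard area ratio. The peak areas and concentrations below are invented for illustration and do not come from the paper.

```python
def relative_response_factor(area_analyte, area_is, conc_analyte, conc_is):
    """RRF from a calibration standard: (A_a / A_is) * (C_is / C_a)."""
    return (area_analyte / area_is) * (conc_is / conc_analyte)

def quantify(area_analyte, area_is, conc_is, rrf):
    """Analyte concentration in a sample via the internal-standard approach."""
    return (area_analyte / area_is) * conc_is / rrf

# Illustrative numbers (concentrations in ng/L), not taken from the paper:
# calibration standard with equal analyte and labeled-IS concentrations
rrf = relative_response_factor(120000, 100000, 50.0, 50.0)  # -> 1.2
c = quantify(90000, 100000, 50.0, rrf)
print(round(c, 2))  # 37.5
```

Because both areas come from the same injection, matrix suppression or enhancement cancels to first order in the ratio, which is the point of the labeled internal standards.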
Abstract:
Interest in public accountability and government transparency is increasing worldwide. The literature on the determinants of transparency is evolving but is still in its early stages. So far, it has typically focused on national or regional governments while neglecting the local government level. This paper builds on the scarce knowledge available in order to examine the economic, social, and institutional determinants of local government transparency in Spain. We draw on a 2010 survey and the transparency indexes constructed by the NGO Transparency International (Spain) in order to move beyond the fiscal transparency addressed in previous work. In so doing, we broaden the analysis of transparency to the corporate, social, fiscal, contracting, and planning activities of governments. Our results on overall transparency indicate that large municipalities and left-wing local government leaders are associated with better transparency indexes, while the worst results are presented by provincial capitals, cities where tourist activity is particularly important, and local governments that enjoy an absolute majority. The analysis of the other transparency categories generally shows the consistent impact of these determinants and the need to consider a wider set of variables to capture their effect.
Abstract:
Weak acid cation exchange (WAC) resins are used in the chromatographic separation of betaine from vinasse, a by-product of the sugar industry. The ionic form of the resin determines the elution time of betaine. When a WAC resin is in hydrogen form, the retention time of betaine is longest and betaine elutes as the last component of vinasse from the chromatographic column. If the feed solution contains salts and its pH is not acidic enough to keep the resin undissociated, the ionic form of the hydrogen-form resin starts to change. Vinasse contains salts, its pH is around 5, and it also contains weak acids. To keep the metal ion content (Na/H ratio) of the resin low enough to ensure successful separation of betaine, acid has to be added either to the eluent (water) or to the vinasse. The aim of the present work was to examine by laboratory experiments which option requires less acid. The retention mechanism of betaine was also investigated by measuring the retention volumes of acetic acid and choline at different Na/H ratios of the resin. It was found that the resulting ionic form of the resin is the same regardless of whether the regeneration acid is added to the eluent or to the feed solution (vinasse). Besides the salt concentration and pH of vinasse, the concentration of weak acids in the feed also affects the resulting ionic form of the resin. The more buffering capacity vinasse has, the more acid is required to keep the resin in the desired ionic form. Vinasse was found to be quite a strong buffer solution, which means relatively high amounts of acid are required to prevent the Na/H ratio from increasing too much. It is known that the retention volume of betaine decreases significantly when the Na/H ratio increases. This is assumed to occur because the number of hydrogen bonds between the carboxylic groups of betaine and the resin decreases. The same behavior was not found with acetic acid.
Choline has the same molecular structure as betaine, but instead of a carboxylic group it has a hydroxyl group. The retention volume of choline increased as the Na/H ratio of the resin increased, because of the ion exchange reaction between the choline cation and the dissociated carboxylic groups of the resin. Since the retention behavior of choline on the resin is opposite to that of betaine, the strong affinity of betaine for the hydrogen-form WAC resin has to be based on its carboxylic group. It is probable that the quaternary ammonium group also affects the behavior of the carboxylic group of betaine, causing it to form hydrogen bonds with the carboxylic groups of the resin.
Abstract:
Ultrafiltration (UF) is already used in the pulp and paper industry, and its demand is growing because of the required reduction of raw water intake and the separation of useful compounds from process waters. In the pulp and paper industry membranes may be exposed to extreme conditions, and it is therefore important that the membrane can withstand them. In this study, extractives, hemicelluloses and lignin-type compounds were separated from wood hydrolysate in order to utilise the hemicelluloses in biofuel production. The performance of different polymeric membranes at different temperatures was studied. Samples were analysed for total organic carbon (TOC), lignin compounds (UV absorption at 280 nm) and sugars. Turbidity, conductivity and pH were also measured. The degree of membrane fouling was monitored by measuring the pure water flux before the filtration of hydrolysate and comparing it with the pure water flux afterwards. According to the results, the retention of turbidity was higher at lower temperature than when the filtrations were operated at high temperature (70 °C). Permeate flux increased with elevated process temperature. Temperature had no detrimental effect on most of the membranes used. Microdyn-Nadir regenerated cellulose (RC) membranes and GE-Osmonics thin-film membranes seemed to be applicable in the chosen process conditions. The polyethersulphone (NF-PES-10 and UH004P) and polysulphone (MPS-36) membranes were highly fouled, but they showed high retentions for different compounds.
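The two quantities monitored in such studies, solute retention and fouling via the pure-water flux, reduce to simple ratios. The numbers below are illustrative, not measurements from this work.

```python
def retention(feed_conc, permeate_conc):
    """Observed retention R = 1 - Cp / Cf (dimensionless, 0..1)."""
    return 1.0 - permeate_conc / feed_conc

def flux_recovery_ratio(pwf_after, pwf_before):
    """Pure-water-flux ratio after vs. before filtration;
    lower values indicate stronger fouling."""
    return pwf_after / pwf_before

# Illustrative values (TOC in g/L, flux in kg/(m2 h)), not from the study
print(round(retention(4.2, 0.6), 3))          # 0.857
print(round(flux_recovery_ratio(35.0, 50.0), 3))  # 0.7, i.e. 30 % flux loss
```

A flux recovery ratio near 1 after rinsing would correspond to the lightly fouled RC and thin-film membranes; the heavily fouled PES and PS membranes would sit well below 1.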
Abstract:
Hemicelluloses are among the most important natural resources containing polysaccharides. In this study, the separation and purification of hemicelluloses by membrane filtration from water extraction liquors containing wood hemicelluloses, lignin compounds and monosaccharides was investigated. The isolation of the hemicelluloses from the wood hydrolysates was performed in two steps: concentration of high-molar-mass hemicelluloses by ultrafiltration, and separation of low-molar-mass hemicelluloses from monomeric sugars using tight ultrafiltration membranes. The retained hemicelluloses were purified by diafiltration. During the filtration experiments, the permeate flux through the ultrafiltration and tight ultrafiltration membranes was relatively high, and fouling of the membranes used was relatively low. In our experiments, the retention of hemicelluloses over the two filtration steps was almost complete. The separation of monosaccharides from hemicelluloses was relatively high, and the purification of hemicelluloses by diafiltration was highly efficient. The separation of lignin from hemicelluloses was partially achieved; diafiltration showed potential to purify the retained hemicelluloses from lignin and other organics. The best separation of lignin from hemicelluloses in the first filtration step was obtained with the UC005 membrane. The GE-5 and ETNA01PP membranes showed potential to purify and separate lignin from hemicelluloses. However, the feed solution of the second filtration stage (from different ultrafiltration membranes) affected the permeate flux and the separation of various extracted compounds from the hemicelluloses. The GE-5 and ETNA01PP membranes gave efficient purification of the hemicelluloses when diafiltration was used. Separation of degraded xylan from glucomannan (the primary spruce hemicellulose) was also possible using membrane filtration. The best separation was achieved using the GE-5 membrane.
The retention of glucomannan was three times higher than xylan retention.
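As a sketch of why diafiltration purifies the retentate, the standard constant-volume diafiltration model (a textbook relation, not taken from this study) predicts how much of a solute remains after D diavolumes, given its observed retention R. A well-retained hemicellulose stays while a poorly retained lignin impurity washes out.

```python
import math

def diafiltration_remaining(diavolumes, retention):
    """Fraction of a solute remaining after constant-volume diafiltration:
    C/C0 = exp(-D * (1 - R)), with D diavolumes and observed retention R."""
    return math.exp(-diavolumes * (1.0 - retention))

# Illustrative retentions (not measured values): hemicellulose R = 0.98,
# a lignin-type impurity R = 0.2
for d in (1, 3, 5):
    print(d,
          round(diafiltration_remaining(d, 0.98), 3),   # hemicellulose kept
          round(diafiltration_remaining(d, 0.20), 3))   # impurity washed out
```

After five diavolumes the impurity is reduced to a few percent of its starting amount while the hemicellulose loss stays around 10%, which is the mechanism behind the efficient purification reported above.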
Abstract:
Public opinion surveys have become progressively incorporated into systems of official statistics. Surveys of the economic climate are usually qualitative because they collect the opinions of businesspeople and/or experts about the indicators described by a number of variables. In such cases the responses are expressed on an ordinal scale; that is, the respondents verbally report, for example, whether during a given quarter sales or new orders have increased, decreased or remained the same as in the previous quarter. These data make it possible to calculate the percentage of respondents in the total population (the results are extrapolated) who select each of the three options. The data are often presented in the form of an index calculated as the difference between the percentage of respondents who claim that a given variable has improved and the percentage of those who claim that it has deteriorated.
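The balance index described above is a one-line computation; the survey shares below are invented for illustration.

```python
def balance_index(pct_increased, pct_decreased):
    """Balance (net) indicator: share reporting an increase minus
    share reporting a decrease, in percentage points."""
    return pct_increased - pct_decreased

# Illustrative quarter: 38 % report higher sales, 45 % the same, 17 % lower
print(balance_index(38.0, 17.0))  # 21.0
```

Note that the "remained the same" share drops out of the index, which is why two quarters with very different levels of reported stability can produce the same balance.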
Abstract:
Controlling the quality variables (such as basis weight, moisture, etc.) is a vital part of making top-quality paper or board. In this thesis, an advanced data assimilation tool is applied to the quality control system (QCS) of a paper or board machine. The functionality of the QCS is based on quality observations measured with a traversing scanner that follows a zigzag path. The basic idea is the following: the measured quality variable has to be separated into its machine-direction (MD) and cross-direction (CD) variations, because the QCS operates separately in MD and CD. Traditionally this is done simply by taking one scan of the zigzag path as the CD profile and its mean value as one point of the MD trend. In this thesis, a more advanced method is introduced. The fundamental idea is to use the signal's frequency components to represent the variation in both CD and MD. To get to the frequency domain, the Fourier transform is utilized. The Fourier components are then used as the state vector in a Kalman filter, a widely used data assimilation tool for combining noisy observations with a model. The observations here are the quality measurements, and the model is given by the Fourier frequency components. By incorporating the two-dimensional Fourier transform into the Kalman filter, we obtain an advanced tool for separating the CD and MD components of the total variation or, more generally, for data assimilation. A piece of a paper roll is analyzed and the tool is applied to model the dataset. The results show that the Kalman filter algorithm is able to reconstruct the main features of the dataset from a zigzag path. Although the results were obtained from a very short sample of a paper roll, the method appears to have great potential for later use as part of a quality control system.
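A one-dimensional sketch of the idea: the Fourier coefficients of a CD profile serve as the Kalman state, and noisy point measurements from a back-and-forth scanner are the observations. The thesis uses the two-dimensional transform; all numbers here are synthetic and merely illustrate the mechanism.

```python
import numpy as np

rng = np.random.default_rng(1)

# True CD profile built from a few low-frequency Fourier components
# (an illustrative stand-in for basis-weight variation across the web)
n_pos = 64                                   # CD positions across the web
pos = np.arange(n_pos)
K = 5                                        # cosine/sine pairs kept

def basis_row(p):
    """Observation row: the value at position p is linear in the
    Fourier coefficients [a0, a1..aK, b1..bK]."""
    ang = 2 * np.pi * np.arange(1, K + 1) * p / n_pos
    return np.concatenate([[1.0], np.cos(ang), np.sin(ang)])

true_coeffs = np.zeros(2 * K + 1)
true_coeffs[0] = 80.0                        # mean basis weight, g/m2
true_coeffs[1] = 1.5                         # one cosine component
true_coeffs[K + 2] = -0.8                    # one sine component
true_profile = np.array([basis_row(p) @ true_coeffs for p in pos])

# Kalman filter: state = Fourier coefficients; the scanner visits one
# position at a time (zigzag) and returns a noisy point measurement
x = np.zeros(2 * K + 1)
x[0] = 80.0                                  # initial guess: flat sheet
P = np.eye(2 * K + 1) * 10.0
Rn = 0.25                                    # measurement noise variance
Q = 1e-6 * np.eye(2 * K + 1)                 # slow drift of the web
order = np.concatenate([pos, pos[::-1]] * 4) # back-and-forth scans
for p in order:
    y = basis_row(p) @ true_coeffs + rng.normal(0, 0.5)
    P = P + Q                                # predict (nearly static web)
    H = basis_row(p)
    S = H @ P @ H + Rn
    Kg = P @ H / S                           # Kalman gain
    x = x + Kg * (y - H @ x)                 # update state with residual
    P = P - np.outer(Kg, H @ P)

est_profile = np.array([basis_row(p) @ x for p in pos])
print(np.max(np.abs(est_profile - true_profile)))  # small reconstruction error
```

The same machinery, with a 2-D Fourier basis over (MD, CD) and scan positions advancing in both directions, separates MD and CD variation from a single zigzag trace.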
Abstract:
Cue exposure treatment (CET) consists of controlled and repeated exposure to drug-related stimuli in order to reduce cue reactivity. Virtual reality (VR) has proved to be a promising tool for exposure. However, identifying the variables that can modulate the efficacy of this technique is essential for selecting the most appropriate exposure modality. The aim of this study was to determine the relation between several individual variables and self-reported craving in smokers exposed to VR environments. Forty-six smokers were exposed to seven complex virtual environments that reproduce typical situations in which people smoke. Self-reported craving was selected as the criterion variable, and three types of variables were selected as predictors: those related to nicotine dependence, those related to anxiety and impulsivity, and those related to the sense of presence in the virtual environments. Sense of presence was the only predictor of self-reported craving in all the experimental virtual environments. Nicotine dependence variables added predictive power to the model only in the virtual breakfast at home. No relation was found between anxiety or impulsivity and self-reported craving. Virtual reality technology can be very helpful for improving CET for substance use disorders. However, the use of virtual environments makes sense only insofar as the sense of presence is high; otherwise, the effectiveness of exposure might be affected. © 2012 by the Massachusetts Institute of Technology.
Abstract:
Most ecosystems undergo substantial variation over the seasons, ranging from changes in abiotic features, such as temperature, light and precipitation, to changes in species abundance and composition. How seasonality varies along latitudinal gradients is not well known in freshwater ecosystems, despite its importance for predicting the effects of climate change and for advancing ecological understanding. Stream temperature is often well correlated with air temperature and influences many ecosystem features, such as the growth and metabolism of most aquatic organisms. We evaluated the degree of seasonality in ten river mouths along a latitudinal gradient for a set of variables ranging from air and water temperatures, to the physical and chemical properties of water, to the growth of an invasive fish species (eastern mosquitofish, Gambusia holbrooki). Our results show that although most of the variation in air temperature was explained by latitude and season, this was not the case for water features, including temperature, in lowland Mediterranean streams, which depended less on season and much more on local factors. Similarly, although there was evidence of latitude-dependent seasonality in fish growth, the relationship was nonlinear and weak, and the significant latitudinal differences in growth rates observed during winter were compensated later in the year and did not result in overall differences in size and growth. Our results suggest that although latitudinal differences in air temperature cascade through properties of freshwater ecosystems, local factors and complex interactions often override the variation of water temperature with latitude and might therefore hinder projections of species distribution models and effects of climate change.
Abstract:
This paper describes the separation of CO2 from a gas mixture containing 25% CO2, 4% O2 and 71% N2 using the pressure swing adsorption (PSA) technique. The adsorbent selected was zeolite 13X because of its high adsorption capacity for CO2 and its selectivity for CO2 over the other components of the gas mixture. The experiments were designed to identify the most important process variables and to optimize the process. It is shown that the PSA technique can be used to separate CO2 from O2 and N2, obtaining an effluent containing 2% CO2 with 99% separation efficiency.
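As a hedged illustration of the mass balance behind such figures, assume for simplicity that the O2/N2 are not adsorbed and leave entirely in the light product; the CO2 recovery then follows from the feed and light-product CO2 fractions alone. The paper's reported efficiency may rest on a different accounting (e.g. per-cycle adsorbed fraction), so this is a sketch of the balance, not a reproduction of their result.

```python
def co2_recovery(y_feed, y_light):
    """CO2 recovery for an idealized PSA step: per mole of feed, the
    inerts (1 - y_feed) all leave in a light product whose CO2 mole
    fraction is y_light; the rest of the CO2 is captured."""
    inerts = 1.0 - y_feed
    co2_light = y_light * inerts / (1.0 - y_light)  # from y_light = x / (x + inerts)
    return 1.0 - co2_light / y_feed

# Feed 25 % CO2, light product at 2 % CO2 (compositions from the abstract)
print(round(co2_recovery(0.25, 0.02), 3))  # 0.939
```

The balance shows how a 2% CO2 effluent translates into a high recovery; the exact percentage depends on how much O2/N2 co-adsorbs, which the idealized assumption here ignores.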
Abstract:
Case-crossover (CCO) is one of the most widely used designs for analyzing the health-related effects of air pollution. Nevertheless, its application and methodology in this context have not been reviewed. Objective: We conducted a systematic review of case-crossover designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction: A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis: The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects and the remainder involved the design's application. In the methodological reports, the designs that yielded the best results in simulation were the symmetric bidirectional CCO and the time-stratified CCO. Furthermore, we observed an increase over time in the use of certain CCO designs, mainly the symmetric bidirectional and time-stratified ones. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions: The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: the symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use.
Abstract:
We present the results of analyzing H$\alpha$ spectra of the radio-emitting X-ray binary LS I+61303. For the first time, the same 26.5 d radio period is clearly detected in the H$\alpha$ emission line. Moreover, the equivalent width and the peak separation of the H$\alpha$ emission line also seem to vary on a time scale of 1600 days. This points towards the $\sim4$ yr modulation detected in the radio outburst amplitude probably being a result of variations in the mass loss rate of the Be star and/or density variability in the circumstellar disk. In addition, the dependence of the peak separation on the equivalent width indicates that the LS I+61303 circumstellar disk is among the densest of Be stars.
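Period searches in unevenly sampled spectroscopic series like these are commonly done with a Lomb-Scargle periodogram. The sketch below recovers a 26.5 d period from synthetic equivalent-width-like data; the amplitudes, noise level and sampling are invented and are not the actual LS I+61303 measurements.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)

# Synthetic stand-in for an unevenly sampled Halpha equivalent-width series
# modulated on a 26.5 d period (all values illustrative)
t = np.sort(rng.uniform(0, 800, 150))            # observation epochs, days
y = 10.0 + 1.2 * np.sin(2 * np.pi * t / 26.5) + rng.normal(0, 0.3, t.size)

periods = np.linspace(5.0, 100.0, 4000)          # trial periods, days
ang_freqs = 2 * np.pi / periods                  # lombscargle wants rad/day
power = lombscargle(t, y - y.mean(), ang_freqs)

best = periods[np.argmax(power)]
print(round(best, 1))  # close to 26.5
```

With ~150 epochs over 800 days the 26.5 d modulation stands far above the noise floor; the much longer $\sim$1600 d modulation would need a correspondingly longer baseline to pin down.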