960 results for "Separating of variables"
Abstract:
The present study was carried out on two different servo systems. In the first, a servo-hydraulic system was identified and then controlled by a fuzzy gain-scheduling controller. In the second, an electromagnetic linear motor, suppression of mechanical vibration and position tracking of a reference model were studied using a neural network and an adaptive backstepping controller, respectively. The research methods are described below. Electro-Hydraulic Servo Systems (EHSS) are commonly used in industry. These systems are nonlinear in nature and their dynamic equations have several unknown parameters. System identification is a prerequisite to the analysis of a dynamic system. One of the most promising novel evolutionary algorithms for solving global optimization problems is Differential Evolution (DE). In this study, the DE algorithm is proposed for handling nonlinear constraint functions with boundary limits on the variables in order to find the best parameters of a servo-hydraulic system with a flexible load. DE offers fast convergence and accurate solutions regardless of the initial parameter values. The control of hydraulic servo systems has been the focus of intense research over the past decades. These systems are nonlinear in nature and generally difficult to control, since changing system parameters while keeping the same gains will cause overshoot or even loss of system stability. The highly nonlinear behaviour of these devices makes them ideal subjects for applying different types of sophisticated controllers. The study is concerned with second-order model-reference positioning control of a flexible-load servo-hydraulic system using fuzzy gain-scheduling. In the present research, acceleration feedback was used to compensate for the lack of damping in a hydraulic system. For comparison, a P controller with feed-forward acceleration and different gains in extension and retraction was used.
The design procedure for the controller and experimental results are discussed. The results suggest that the fuzzy gain-scheduling controller decreases the position-reference tracking error. The second part of the research was done on a Permanent-Magnet Linear Synchronous Motor (PMLSM). In this study, a recurrent neural network compensator for suppressing mechanical vibration in a PMLSM with a flexible load is studied. The linear motor is controlled by a conventional PI velocity controller, and the vibration of the flexible mechanism is suppressed using a hybrid recurrent neural network. The differential evolution strategy and the Kalman filter are used to avoid the local-minimum problem and to estimate the states of the system, respectively. The proposed control method is first designed using a nonlinear simulation model built in Matlab Simulink and then implemented on a practical test rig. The proposed method works satisfactorily and suppresses the vibration successfully. In the last part of the research, a nonlinear load control method is developed and implemented for a PMLSM with a flexible load. The purpose of the controller is to track a flexible load to the desired position reference as fast as possible and without excessive oscillation. The control method is based on an adaptive backstepping algorithm whose stability is ensured by the Lyapunov stability theorem. The states of the system needed by the controller are estimated using the Kalman filter. The proposed controller is implemented and tested in a linear motor test drive, and the responses are presented.
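The abstract above relies on Differential Evolution for parameter identification. As an illustrative sketch (not the authors' implementation), the following minimal DE/rand/1/bin routine in Python recovers the gain and time constant of a hypothetical first-order model y(t) = k·(1 − e^(−t/τ)) from noise-free samples; the toy model, bounds and all names are assumptions for illustration only:

```python
import math
import random

def differential_evolution(objective, bounds, pop_size=20, f=0.8, cr=0.9,
                           generations=200, seed=1):
    """Minimal DE/rand/1/bin: mutate with a scaled difference vector,
    cross over binomially, keep the trial vector if it is no worse."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [objective(ind) for ind in pop]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # guarantee at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < cr or j == j_rand:
                    v = pop[a][j] + f * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)  # respect the boundary limits
                else:
                    v = pop[i][j]
                trial.append(v)
            s = objective(trial)
            if s <= scores[i]:  # greedy selection
                pop[i], scores[i] = trial, s
    best_i = min(range(pop_size), key=scores.__getitem__)
    return pop[best_i], scores[best_i]

# Hypothetical identification task: recover gain k and time constant tau of a
# first-order response y(t) = k * (1 - exp(-t / tau)) from noise-free samples.
TRUE_K, TRUE_TAU = 2.0, 0.5
ts = [0.1 * i for i in range(1, 21)]
measured = [TRUE_K * (1 - math.exp(-t / TRUE_TAU)) for t in ts]

def sse(params):
    """Sum of squared errors between candidate model and 'measured' data."""
    k, tau = params
    return sum((k * (1 - math.exp(-t / tau)) - y) ** 2
               for t, y in zip(ts, measured))

best, score = differential_evolution(sse, bounds=[(0.1, 5.0), (0.1, 2.0)])
```

On this smooth two-parameter problem the greedy DE selection converges close to the true values; real servo-hydraulic identification would replace `sse` with a simulation of the full nonlinear dynamics.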
Abstract:
Background: Development of three classification trees (CT) based on the CART (Classification and Regression Trees), CHAID (Chi-Square Automatic Interaction Detection) and C4.5 methodologies for the calculation of the probability of hospital mortality; comparison of the results with the APACHE II, SAPS II and MPM II-24 scores, and with a model based on multiple logistic regression (LR). Methods: Retrospective study of 2864 patients. Random partition (70:30) into a Development Set (DS), n = 1808, and a Validation Set (VS), n = 808. Discrimination is compared with the ROC curve (AUC, 95% CI) and the percentage of correct classification (PCC, 95% CI); calibration with the calibration curve and the Standardized Mortality Ratio (SMR, 95% CI). Results: The CTs are produced with different selections of variables and decision rules: CART (5 variables and 8 decision rules), CHAID (7 variables and 15 rules) and C4.5 (6 variables and 10 rules). The common variables were: inotropic therapy, Glasgow score, age, (A-a)O2 gradient and antecedent of chronic illness. In the VS, all the models achieved acceptable discrimination with AUC above 0.7. CT: CART 0.75 (0.71-0.81), CHAID 0.76 (0.72-0.79) and C4.5 0.76 (0.73-0.80). PCC: CART 72 (69-75), CHAID 72 (69-75) and C4.5 76 (73-79). Calibration (SMR) was better in the CTs: CART 1.04 (0.95-1.31), CHAID 1.06 (0.97-1.15) and C4.5 1.08 (0.98-1.16). Conclusion: Different CT methodologies generate trees with different selections of variables and decision rules. The CTs are easy to interpret, and they stratify the risk of hospital mortality. CTs should be taken into account for classifying the prognosis of critically ill patients.
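The comparison above rests on two headline metrics: AUC for discrimination and SMR for calibration. A minimal sketch of both, assuming binary outcome labels (1 = death) and model-predicted death probabilities; this is illustrative, not the study's actual code:

```python
def auc(scores, labels):
    """Discrimination: probability that a randomly chosen positive case is
    scored higher than a randomly chosen negative case (Mann-Whitney
    formulation of the area under the ROC curve; ties count one half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def smr(labels, predicted_probs):
    """Calibration: Standardized Mortality Ratio = observed deaths divided
    by expected deaths (the sum of predicted death probabilities)."""
    return sum(labels) / sum(predicted_probs)
```

A perfectly ranking model gives AUC = 1.0, and a well-calibrated model gives an SMR near 1.0, which is how the paper's VS figures (AUC above 0.7, SMR close to 1) should be read.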
Abstract:
The objective of the present study was to analyse the correspondence between the results of a land evaluation and the actual distribution of crops. To this end, the biophysical suitability of the land was compared with different typologies of crop occurrence frequency and rotations derived from multitemporal crop maps. The research was carried out in the Flumen irrigation district (33,000 ha), located in the Ebro valley (NE Spain). The land evaluation was based on a 1:100,000 soil map, following the FAO framework, for the main crops present in the study area (alfalfa, winter cereals, maize, rice and sunflower). Three crop-frequency maps and one rotation map, derived from a time series of Landsat TM and ETM+ images for the period 1993-2000, were compared with the land-suitability maps for the different crops. The relationship between the two types of variables was analysed statistically (Pearson χ2, Cramér's V, Gamma and Somers' D). The results show a significant relationship (P=0.001) between crop location and land suitability, except for opportunistic crops such as sunflower, which was strongly influenced by subsidies during the period studied. Alfalfa-based rotations show the highest percentages (52%) of occupation of the land most suitable for agriculture in the study area. The present multitemporal approach to analysing the information offers a more realistic view than comparing a land-evaluation map with a crop map from a single date when assessing the degree of agreement between land-suitability recommendations and the crops actually grown by farmers.
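The abstract reports association tests such as Pearson's χ2 and Cramér's V between categorical variables. A minimal sketch of both statistics computed from a generic contingency table; the row/column interpretation in the comments is an assumption for illustration, not the study's actual data layout:

```python
def cramers_v(table):
    """Pearson chi-square statistic and Cramér's V for a contingency table
    given as a list of rows (e.g. rows: land-suitability class,
    columns: crop-frequency class)."""
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / n  # expected count under independence
            chi2 += (obs - exp) ** 2 / exp
    k = min(len(table), len(col_tot)) - 1  # V is normalized by min(r, c) - 1
    return chi2, (chi2 / (n * k)) ** 0.5
```

Cramér's V ranges from 0 (independence, χ2 = 0) to 1 (perfect association), which is what makes it suitable for comparing the strength of the suitability-crop relationship across crops.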
Abstract:
The goal of this work is to create a statistical model, based only on easily computable parameters of a CSP instance, to predict the runtime behaviour of the solving algorithms and thereby let us choose the best algorithm for the problem. Although it seems that the obvious choice should be MAC, experimental results obtained so far show that with large numbers of variables other algorithms perform much better, especially for hard problems in the phase transition.
Abstract:
BACKGROUND: Health-related quality of life (HRQOL) levels and their determinants in those living in nursing homes are unclear. The aim of this study was to investigate different HRQOL domains as a function of the degree of cognitive impairment and to explore associations between them and possible determinants of HRQOL. METHOD: Five HRQOL domains, measured with the Minimum Data Set - Health Status Index (MDS-HSI), were investigated in a large sample of nursing home residents stratified by cognitive performance levels derived from the Cognitive Performance Scale. We looked for large effect-size associations between clinical variables and the different HRQOL domains. RESULTS: The HRQOL domains are impaired to variable degrees but with similar profiles across cognitive performance levels. Basic activities of daily living are a major factor associated with some, but not all, HRQOL domains and vary little with the degree of cognitive impairment. LIMITATIONS: This study is limited by the general difficulties of measuring HRQOL in patients with cognitive impairment and by the reduced number of variables considered among those potentially influencing HRQOL. CONCLUSION: Not all HRQOL dimensions are linearly associated with increasing cognitive impairment in nursing home patients. Longitudinal studies are required to determine how the different HRQOL domains evolve over time in nursing home residents.
Abstract:
Nowadays, Species Distribution Models (SDMs) are a widely used tool. Using different statistical approaches, these models reconstruct the realized niche of a species from presence data and a set of variables, often topoclimatic. Their range of use is broad, from understanding the requirements of a single species, to the design of nature reserves based on species hotspots, to modeling the impact of climate change. Most of the time these models use variables at a resolution of 50 km x 50 km or 1 km x 1 km. In some cases, however, they are used at resolutions below the kilometre scale and are then called high-resolution models (100 m x 100 m or 25 m x 25 m). Quite recently a new kind of data has emerged, enabling precision up to 1 m x 1 m and thus allowing very-high-resolution modeling. However, these new variables are very costly and take considerable time to process, especially when they are used in complex computations such as model projections over large areas. Moreover, the importance of very-high-resolution data in SDMs has not yet been assessed and is not well understood. Some basic knowledge of what drives species presences and absences is still missing. Indeed, it is not clear whether, in mountain areas like the Alps, coarse topoclimatic gradients drive species distributions, whether fine-scale temperature or topography are more important, or whether their importance can be neglected when balanced against competition or stochasticity. In this thesis I investigated the importance of very-high-resolution data (2-5 m) in species distribution models, using very-high-resolution topographic, climatic and edaphic variables over a 2000 m elevation gradient in the Western Swiss Alps. I also investigated the more local responses to these variables for a subset of species living in this area at two specific elevation belts.
During this thesis I showed that high-resolution data require very good datasets (both species data and model variables) to produce satisfactory results. Indeed, in mountain areas temperature is the most important factor driving species distributions, and it needs to be modeled at very fine resolution, rather than interpolated over large surfaces, to produce satisfactory results. Despite the intuitive idea that topography should be very important at high resolution, the results are mixed. Assessing the importance of variables over a large gradient, however, dampens their apparent importance: topographic factors proved highly important at the subalpine level, but their importance decreases at lower elevations. Whereas at the montane level edaphic and land-use factors are more important, high-resolution topographic data matter more at the subalpine level. Finally, the biggest improvement in the models comes when edaphic variables are added. Adding soil variables is of high importance, and variables such as pH surpass the usual topographic variables in SDMs in terms of importance in the models. To conclude, high resolution is very important in modeling but requires very good datasets. Merely increasing the resolution of the usual topoclimatic predictors is not sufficient, and the use of edaphic predictors proved fundamental to producing significantly better models. This is of primary importance, especially if these models are used to reconstruct communities or as a basis for biodiversity assessments. -- In recent years, the use of species distribution models (SDMs) has increased continually. These models use different statistical tools to reconstruct the realized niche of a species from variables, notably climatic or topographic ones, and presence data collected in the field.
Their use covers many fields, from the study of a species' ecology to the reconstruction of communities or the impact of global warming. Most of the time, these models use occurrences drawn from global databases at a rather coarse resolution (1 km or even 50 km). Some databases nevertheless make it possible to work at high resolution, hence below the kilometre scale, with resolutions of 100 m x 100 m or 25 m x 25 m. Recently, a new generation of very-high-resolution data has appeared that allows work at the metre scale. The variables that can be generated from these new data are, however, very costly and require considerable processing time: any complex statistical computation, such as projecting species distributions over large areas, demands powerful computers and much time. Moreover, the factors governing species distributions at fine scales are still poorly known, and the importance in the models of high-resolution variables such as microtopography or temperature is not certain. Other factors, such as competition or natural stochasticity, could have an equally strong influence. My thesis work is set in this context. I sought to understand the importance of high resolution in species distribution models, whether for temperature, microtopography or edaphic variables, along a large elevation gradient in the Vaud Prealps. I also sought to understand the local impact of certain variables potentially neglected because of confounding effects along the elevation gradient.
During this thesis, I was able to show that high-resolution variables, whether related to temperature or to microtopography, bring only a modest improvement to the models. To obtain a substantial improvement, it is necessary to work with larger datasets, for both the species and the variables used. For example, the usually interpolated climate layers must be replaced by temperature layers modelled at high resolution from field data. Working along a 2000 m temperature gradient naturally makes temperature very important in the models. The importance of microtopography is negligible compared with topography at a 25 m resolution. At a more local scale, however, high-resolution topography is extremely important in the subalpine zone, whereas at the montane level variables related to soil and land use are very important. Finally, the species distribution models were particularly improved by the addition of edaphic variables, mainly pH, whose importance supplants or equals that of the topographic variables when added to the usual species distribution models.
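SDMs of the kind discussed in this thesis commonly model presence probability as a logistic response to environmental predictors. As a toy sketch under stated assumptions (a single hypothetical temperature predictor and invented presence/absence data, nothing from the thesis' actual models), a one-dimensional logistic regression fitted by gradient descent:

```python
import math

def fit_logistic(x, y, lr=0.5, steps=2000):
    """1-D logistic regression by batch gradient descent:
    P(presence) = sigmoid(w * x + b). A toy stand-in for an SDM
    response curve along an environmental gradient."""
    w = b = 0.0
    n = len(x)
    for _ in range(steps):
        gw = gb = 0.0
        for xi, yi in zip(x, y):
            p = 1 / (1 + math.exp(-(w * xi + b)))
            gw += (p - yi) * xi  # gradient of the log-loss w.r.t. w
            gb += (p - yi)       # gradient of the log-loss w.r.t. b
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

def predict(w, b, xi):
    """Predicted presence probability at predictor value xi."""
    return 1 / (1 + math.exp(-(w * xi + b)))

# Hypothetical data: a species present only above a temperature threshold.
temps = [0, 1, 2, 3, 4, 5, 6, 7]
presence = [0, 0, 0, 0, 1, 1, 1, 1]
w, b = fit_logistic(temps, presence)
```

Real SDMs use many predictors and richer model families (GLM, GAM, boosted trees), but the fitted sigmoid is the simplest version of the "response to a gradient" the thesis evaluates at different resolutions.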
Abstract:
Interest in public accountability and government transparency is increasing worldwide. The literature on the determinants of transparency is evolving but is still in its early stages. So far, it has typically focused on national or regional governments while neglecting the local government level. This paper builds on the scarce knowledge available in order to examine the economic, social, and institutional determinants of local government transparency in Spain. We draw on a 2010 survey and the transparency indexes constructed by the NGO Transparency International (Spain) in order to move beyond the fiscal transparency addressed in previous work. In so doing, we broaden the analysis of transparency to the corporate, social, fiscal, contracting, and planning activities of governments. Our results on overall transparency indicate that large municipalities and left-wing local government leaders are associated with better transparency indexes; while the worst results are presented by provincial capitals, cities where tourist activity is particularly important and local governments that enjoy an absolute majority. The analysis of other transparency categories generally shows the consistent impact of these determinants and the need to consider a wider set of variables to capture their effect.
Abstract:
Public opinion surveys have become progressively incorporated into systems of official statistics. Surveys of the economic climate are usually qualitative because they collect the opinions of businesspeople and/or experts about the long-term indicators described by a number of variables. In such cases the responses are expressed on an ordinal scale; that is, the respondents report verbally, for example, whether during a given trimester sales or new orders have increased, decreased or remained the same as in the previous trimester. These data allow calculation of the percentage of respondents in the total population (the results are extrapolated) who select each of the three options. The data are often presented as an index calculated as the difference between the percentage of respondents who report that a given variable has improved and the percentage who report that it has deteriorated.
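The index described at the end of the abstract (often called a balance or diffusion index) can be sketched in a few lines; the response labels here are hypothetical placeholders for whatever coding a real survey uses:

```python
def balance_index(responses):
    """Balance (diffusion) index: percentage of respondents reporting an
    increase minus the percentage reporting a decrease. 'same' answers
    enter the denominator but cancel out of the difference."""
    n = len(responses)
    up = 100.0 * responses.count("increased") / n
    down = 100.0 * responses.count("decreased") / n
    return up - down

# Four respondents: two report an increase, one no change, one a decrease.
idx = balance_index(["increased", "increased", "same", "decreased"])
```

The index ranges from -100 (everyone reports deterioration) to +100 (everyone reports improvement), with 0 indicating balanced opinion.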
Abstract:
Cue exposure treatment (CET) consists of controlled and repeated exposure to drug-related stimuli in order to reduce cue-reactivity. Virtual reality (VR) has proved to be a promising tool for exposure. However, identifying the variables that can modulate the efficacy of this technique is essential for selecting the most appropriate exposure modality. The aim of this study was to determine the relation between several individual variables and self-reported craving in smokers exposed to VR environments. Forty-six smokers were exposed to seven complex virtual environments that reproduce typical situations in which people smoke. Self-reported craving was selected as the criterion variable, and three types of variables were selected as predictors: those related to nicotine dependence, those related to anxiety and impulsivity, and those related to the sense of presence in the virtual environments. Sense of presence was the only predictor of self-reported craving in all the experimental virtual environments. Nicotine-dependence variables added predictive power to the model only in the virtual breakfast at home. No relation was found between anxiety or impulsivity and self-reported craving. Virtual reality technology can be very helpful for improving CET for substance use disorders. However, the use of virtual environments makes sense only insofar as the sense of presence is high; otherwise, the effectiveness of exposure might be affected. © 2012 by the Massachusetts Institute of Technology.
Abstract:
Most ecosystems undergo substantial variation over the seasons, ranging from changes in abiotic features, such as temperature, light and precipitation, to changes in species abundance and composition. How seasonality varies along latitudinal gradients is not well known in freshwater ecosystems, despite its importance for predicting the effects of climate change and for advancing ecological understanding. Stream temperature is often well correlated with air temperature and influences many ecosystem features, such as the growth and metabolism of most aquatic organisms. We evaluated the degree of seasonality in ten river mouths along a latitudinal gradient for a set of variables ranging from air and water temperatures to physical and chemical properties of water and the growth of an invasive fish species (eastern mosquitofish, Gambusia holbrooki). Our results show that although most of the variation in air temperature was explained by latitude and season, this was not the case for water features, including temperature, in lowland Mediterranean streams, which depended less on season and much more on local factors. Similarly, although there was evidence of latitude-dependent seasonality in fish growth, the relationship was nonlinear and weak, and the significant latitudinal differences in growth rates observed during winter were compensated later in the year and did not result in overall differences in size and growth. Our results suggest that although latitudinal differences in air temperature cascade through properties of freshwater ecosystems, local factors and complex interactions often override the water temperature variation with latitude and might therefore hinder projections of species distribution models and of the effects of climate change.
Abstract:
Case-crossover is one of the most widely used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective: We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of both methodology and application. Data sources and extraction: A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis: The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design's application. In the methodological reports, the designs that yielded the best results in simulation were the symmetric bidirectional CCO and the time-stratified CCO. Furthermore, we observed an increase over time in the use of certain CCO designs, mainly the symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions: The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: the symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use.
Abstract:
This paper presents and classifies the cognitive and metacognitive variables involved in the processes that students execute in problem solving. Moreover, it shows how these variables affect students' success in problem solving. The variables are classified as: Piagetian and neo-Piagetian, representational, metacognitive, and transfer of learning. Within the first group, formal reasoning ability and other neo-Piagetian factors are discussed. Within the second group, mental models and external representations are analysed. Implications for chemistry education are collected as a proposal of didactic strategies for the classroom.
Abstract:
Raw measurement data do not always immediately convey useful information, but applying statistical analysis tools to the data can improve the situation. Data analysis can offer benefits such as gaining meaningful insight from the dataset, basing critical decisions on the findings, and ruling out human bias through proper statistical treatment. In this thesis we analyze data from an industrial mineral processing plant with the aim of studying the possibility of forecasting the quality of the final product, given by one variable, with a model based on the other variables. For the study, mathematical tools such as Qlucore Omics Explorer (QOE) and Sparse Bayesian regression (SB) are used. Linear regression is then used to build a model based on a subset of variables that have the most significant weights in the SB model. The results obtained from QOE show that the variable representing the desired final product does not correlate with the other variables. For SB and linear regression, the results show that both models built on 1-day-averaged data seriously underestimate the variance of the true data, whereas the two models built on 1-month-averaged data are reliable and able to explain a larger proportion of the variability in the available data, making them suitable for prediction purposes. However, it is concluded that no single model can fit the whole available dataset well. It is therefore proposed, as future work, to build piecewise nonlinear regression models if the same dataset is used, or for the plant to provide another dataset collected in a more systematic fashion than the present data for further analysis.
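The linear-regression stage described above can be illustrated with a minimal one-predictor ordinary-least-squares fit and the R² ("proportion of variability explained") it yields. This is a generic sketch with invented data, not the thesis' actual multi-variable model:

```python
def ols_fit(x, y):
    """Simple ordinary least squares: slope and intercept minimizing the
    sum of squared residuals for a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

def r_squared(x, y, slope, intercept):
    """Proportion of the variance in y explained by the fitted line:
    1 - (residual sum of squares / total sum of squares)."""
    my = sum(y) / len(y)
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Toy data lying exactly on y = 2x + 1.
slope, intercept = ols_fit([1, 2, 3, 4], [3, 5, 7, 9])
r2 = r_squared([1, 2, 3, 4], [3, 5, 7, 9], slope, intercept)
```

The thesis' finding that 1-day-averaged models "underestimate the variance of true data" corresponds to a low R² in this framing, while the 1-month-averaged models explain a larger share of ss_tot.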
Abstract:
The first objective of this study was to identify reliable laboratory methods for predicting the effect of enzymes on the specific energy consumption and fiber properties of TMP pulp. The second was to use interactive "knowledge discovery in databases" software to find enzymes or other additives that could help reduce the energy consumption of the TMP process. The chemical composition of wood, and the enzymes that act on the main wood components, are presented in the literature part of the work. The results of previous research on energy reduction in the TMP process with enzymes are also highlighted, as are the main principles of knowledge discovery. The experimental part describes the methods, in which standard-size chips, crushed chips and fiberized spruce chips (fiberized pulp) were used. Different types of enzymatic treatment, with different dosages and treatment times, were tested in the experiments. Pectinase, endoglucanase and a mixture of enzymes were used to evaluate the reliability of the methods. The fines content and fiber length of the pulp were measured and used as evidence of the enzymes' effect. The refining method with the "Bauer" laboratory disc refiner was evaluated as not highly reliable: it could not provide high repeatability of results because of uncontrolled feeding capacity and refining consistency. The refining method with the Valley refiner had few variables and showed stable, repeatable results in energy saving. The experiments showed that efficient enzyme impregnation is probably the main target in applying enzymes for energy saving. During the work, the fiberized pulp showed high accessibility to enzymatic treatment and liquid penetration without special impregnating equipment, because fiberized pulp has a larger wood surface area and thereby a larger contact area between the enzymatic solution and the wood.
Treatment of standard-size and crushed chips without special impregnation of the enzymatic solution was evaluated as inefficient and did not show visible, repeatable decreases in energy consumption. It was therefore concluded that using fiberized pulp and the Valley refiner to measure the effectiveness of enzymes in decreasing specific energy consumption is more suitable than using normal-size or crushed chips with the "Bauer" refiner. Endoglucanase at a dosage of 5 kg/t showed about a 20% decrease in energy consumption. The mixture of enzymes at a dosage of 1.5 kg/t showed about a 15% decrease in energy consumption during refining. Pectinase at different dosages and treatment times did not show a significant effect on energy consumption. The knowledge discovery in databases results indicated a blend of xylanase, cellulase and pectinase as the most promising for energy reduction in the TMP process. Surfactants were identified as effective additives for energy saving with enzymes.
Abstract:
For the design of vertical silo walls for the storage of bulk solids to be safe and reliable, it is important to know as many variables as possible, such as the flow properties, the silo geometry and the desired flow pattern. In order to validate the theories of flow prediction and conical hopper design, the flow properties of two bulk solids were determined, Jenike's flowability theory and the Enstad and Walker theories for hopper design were analyzed, and the results were compared with those obtained experimentally in a reduced model of a semicircular-section silo. The results show that the Enstad theory for hopper design is adequate for mass flow to occur inside the silo, and that for sizing the discharge outlet Walker's theory came closer to the appropriate value than Jenike's theory, which exceeded the experimental hopper outlet by around 100%.